Abstract
To navigate our world effectively, we must combine information gathered by our various sensory systems. What role does feature-based attention play in the integration of multi-modal stimuli? In vision, feature-based attention has been shown to enhance the processing of an attended feature, such as a specific direction of motion, throughout the visual field. Do these attentional effects transfer across modalities?
Previous research has suggested that the transfer from visual motion to auditory motion processing arises at the neural level. However, evidence for the symmetric transfer, from auditory motion to visual motion processing, has proved more elusive.

We investigated whether controlled attention to auditory motion in depth enhances the processing of visual motion in depth, as measured psychophysically. Auditory motion in depth was simulated by a volume ramp over time: volume increased for motion toward the observer and decreased for motion away. The adapting stimulus was visual motion in depth, simulated by an expanding or contracting ring centered at fixation. Subjects attended to auditory motion either in the same direction as the adapting visual stimulus or in the opposite direction. After adaptation, we measured the strength of the visual motion aftereffect (MAE) with a motion-nulling paradigm. The visual MAE was larger when the adapting visual motion was in the same direction as the attended auditory motion, suggesting that attending to auditory motion in depth enhances the processing of corresponding visual motion. This transfer of motion processing across modalities could facilitate the binding of corresponding sensory information into a unified percept of a single moving object, enhancing the visual system's ability to interpret motion in depth.
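The stimulus construction described above is simple enough to sketch. The following is a minimal, hypothetical Python/NumPy illustration of the two simulations: a linear volume ramp on a tone for auditory motion in depth, and an expanding or contracting ring radius for visual motion in depth. The function names and all parameter values (sample rate, durations, amplitudes, radii) are illustrative assumptions, not values taken from the study.

    import numpy as np

    def auditory_motion_in_depth(direction, duration=1.0, fs=44100,
                                 freq=440.0, min_amp=0.2, max_amp=1.0):
        """Simulate auditory motion in depth as a volume ramp on a tone:
        rising volume for motion toward the observer, falling for motion away.
        All parameter values are illustrative, not those of the study."""
        t = np.arange(int(duration * fs)) / fs
        if direction == "toward":
            amp = np.linspace(min_amp, max_amp, t.size)
        else:  # "away"
            amp = np.linspace(max_amp, min_amp, t.size)
        return amp * np.sin(2 * np.pi * freq * t)

    def ring_radius(direction, duration=1.0, frame_rate=60,
                    min_r=0.5, max_r=5.0):
        """Simulate visual motion in depth as the frame-by-frame radius of a
        ring centered at fixation: expanding for motion toward the observer,
        contracting for motion away. Radii are in arbitrary visual-angle units."""
        n_frames = int(duration * frame_rate)
        if direction == "toward":
            return np.linspace(min_r, max_r, n_frames)
        else:  # "away"
            return np.linspace(max_r, min_r, n_frames)

    # Example: a congruent trial, with the attended auditory motion and the
    # adapting visual motion both directed toward the observer.
    audio_waveform = auditory_motion_in_depth("toward")
    ring_radii = ring_radius("toward")

An incongruent trial would pair opposite directions, e.g. auditory_motion_in_depth("away") with ring_radius("toward"), mirroring the attended-same versus attended-opposite conditions described above.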