Vision Sciences Society Annual Meeting Abstract  |   August 2009
Cross-modal transfer of motion processing from audition to vision
Author Affiliations
  • Zachary Ernst
    University of Washington
  • Geoffrey Boynton
    University of Washington
Journal of Vision August 2009, Vol.9, 719. doi:10.1167/9.8.719
Abstract

In order to best navigate our world, we must combine information gathered by our various sensory systems. What role does feature-based attention play in the integration of multi-modal stimuli? In vision, feature-based attention has been shown to enhance the processing of an attended feature, such as a specific direction of motion, throughout the visual field. Do these attentional effects transfer across modalities?

Previous research has suggested that the transfer of visual motion to auditory motion processing originates at the neural level. However, evidence for the symmetric transfer of auditory motion to visual motion processing has proved more elusive. We investigated whether directed attention to auditory motion in depth enhances the processing of visual motion in depth, as measured psychophysically. Auditory motion in depth was simulated by a ramp in volume over time that either increased for motion toward the observer or decreased for motion away. The adapting stimulus was visual motion in depth, simulated by either an expanding or contracting ring centered at fixation. Subjects attended to auditory motion either in the same direction as the adapting visual stimulus or in the opposite direction to it. After adaptation, we measured the strength of the visual motion aftereffect (MAE) using a motion-nulling paradigm. The visual MAE was larger when the visual motion was in the same direction as the attended auditory motion, suggesting that auditory motion in depth enhances the processing of a corresponding visual stimulus. This transfer of motion processing across modalities could facilitate the binding of corresponding sensory information into a unified percept of a single moving object, enhancing the ability of the visual system to interpret motion in depth.

Ernst, Z., & Boynton, G. (2009). Cross-modal transfer of motion processing from audition to vision [Abstract]. Journal of Vision, 9(8):719, 719a, http://journalofvision.org/9/8/719/, doi:10.1167/9.8.719.