October 2020
Volume 20, Issue 11
Open Access
Vision Sciences Society Annual Meeting Abstract  |   October 2020
Multisensory expectations about dynamic visual objects facilitates early sensory processing of congruent sounds
Author Affiliations
  • Andrew Marin
    University of California, San Diego
  • Viola Stoermer
    University of California, San Diego
  • Leslie Carver
    University of California, San Diego
Journal of Vision October 2020, Vol.20, 431. doi:https://doi.org/10.1167/jov.20.11.431
Andrew Marin, Viola Stoermer, Leslie Carver; Multisensory expectations about dynamic visual objects facilitates early sensory processing of congruent sounds. Journal of Vision 2020;20(11):431. doi: https://doi.org/10.1167/jov.20.11.431.

© ARVO (1962-2015); The Authors (2016-present)

Abstract

In everyday life, visual objects are often accompanied by sounds, yet little is known about how information in one sense may influence the processing of information in another sense. We examined how dynamic visual input – an object moving continuously across the visual field – influences early auditory processing of a sound that is either congruent with the object’s motion, and thus likely perceived as being part of the visual object, or incongruent with the object’s motion. We recorded EEG activity from 31 neurotypical adults who passively viewed a red ball that appeared on either the far left or right edge of the display and continuously traversed the horizontal midline to make contact with and bounce off the opposite edge. On multisensory trials, a tone accompanied the visual input at the moment the ball made contact with the opposite edge (AV-synchronous), or the sound occurred 450 ms before contact (AV-asynchronous). We also included audio-only and visual-only trials. Our main analysis focused on the auditory-evoked event-related potential (ERP) measured at frontal electrode sites and revealed reliable differences in the amplitude and latency of the N1-P2 auditory complex (all p’s < 0.003). Follow-up pairwise comparisons showed a reduced N1 amplitude for the AV-synchronous condition relative to the AV-asynchronous and audio-only conditions, and a delayed N1 latency for the AV-asynchronous condition relative to the AV-synchronous and audio-only conditions. P2-peak analyses revealed a greater amplitude for the audio-only condition relative to the AV-synchronous and AV-asynchronous conditions, and a delayed latency for the AV-asynchronous condition relative to the AV-synchronous and audio-only conditions. Overall, these results show that audio-visual synchrony elicited a faster and attenuated early auditory response relative to asynchronous or auditory-only events.
This suggests that dynamic visual stimuli can help generate expectations about the timing of auditory events, which then facilitates the processing of auditory information that matches these expectations.
