August 2012
Volume 12, Issue 9
Vision Sciences Society Annual Meeting Abstract  |   August 2012
Seeing the song: Left auditory cortex tracks auditory-visual dynamic congruence
Author Affiliations
  • Julia Mossbridge
    Department of Psychology, Northwestern University
  • Marcia Grabowecky
    Department of Psychology, Northwestern University
    Interdepartmental Neuroscience Program, Northwestern University
  • Satoru Suzuki
    Department of Psychology, Northwestern University
    Interdepartmental Neuroscience Program, Northwestern University
Journal of Vision August 2012, Vol.12, 614. doi:10.1167/12.9.614
      Julia Mossbridge, Marcia Grabowecky, Satoru Suzuki; Seeing the song: Left auditory cortex tracks auditory-visual dynamic congruence. Journal of Vision 2012;12(9):614. doi: 10.1167/12.9.614.



      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Part of the beauty of watching a ballet is the synchrony, or dynamic congruence, between what we see and what we hear. It is already known that in humans the left primary auditory cortex processes complex auditory dynamics when sounds are presented alone, but it is not clear whether the brain exploits this specialization to encode cross-modal dynamic congruence. Here we report the results of an EEG experiment in which auditory and visual stimuli sharing complex dynamics across multiple feature dimensions and scales (Beethoven’s Moonlight Sonata and the iTunes Jelly visualizer) were presented to 28 participants in a temporally congruent condition (visualizer matched the music) and an incongruent condition (visualizer delayed by ~30 seconds). Condition order was counterbalanced across participants, and only ~30% of the participants detected the difference between the conditions. We used continuous auditory steady-state responses (ASSR) combined with current-source-density (CSD) mapping to pinpoint auditory cortical activity tuned to the music. As expected from previous work, right-hemisphere ASSR power dominated in both conditions. More importantly, an interaction between hemisphere and congruence (p<0.035) revealed that when the music and visualizer were temporally incongruent, left-hemisphere steady-state power was significantly lower than in the congruent condition (p<0.007). ASSR power elicited by the same musical stimuli, recorded from the same participants while they performed two separate attentionally demanding visual tasks, did not differ from that in the congruent condition, suggesting that the drop in left-hemisphere power during the incongruent condition is not due to a difference in auditory attention but reflects the encoding of dynamic congruence between the music and the images.
These results show that left primary auditory cortex encodes auditory-visual dynamic congruence and thus contributes to the binding of concurrent auditory and visual stimuli based on complex temporal cues.
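As a hedged illustration (not part of the original abstract, whose analysis also involved CSD mapping and hemisphere-by-condition statistics), steady-state power at a known tagging frequency can be read out of an EEG trace from the FFT bin nearest that frequency. Everything below (the sampling rate, the 40 Hz tag frequency, the synthetic data, and the `assr_power` helper) is an assumption made for this sketch.

```python
import numpy as np

def assr_power(eeg, fs, f_target):
    """Estimate steady-state power at f_target (Hz) in a 1-D EEG trace.

    Hypothetical sketch only: real ASSR pipelines also involve epoching,
    artifact rejection, and source-localization steps not shown here.
    """
    n = len(eeg)
    spectrum = np.fft.rfft(eeg * np.hanning(n))   # windowed FFT
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    idx = np.argmin(np.abs(freqs - f_target))     # nearest FFT bin
    return (np.abs(spectrum[idx]) ** 2) / n       # power at that bin

# Synthetic example: an assumed 40 Hz steady-state response in noise.
fs = 500                       # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)   # 10 s of data
rng = np.random.default_rng(0)
eeg = 2.0 * np.sin(2 * np.pi * 40 * t) + rng.normal(0, 1, t.size)

p40 = assr_power(eeg, fs, 40.0)   # power at the tagged frequency
p37 = assr_power(eeg, fs, 37.0)   # power at a control frequency
```

In a design like the one described above, such a per-channel power estimate would be compared across hemispheres and congruence conditions rather than inspected in isolation.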

Meeting abstract presented at VSS 2012
