Vision Sciences Society Annual Meeting Abstract  |  May 2008
Volume 8, Issue 6
Neural mechanisms of multisensory perceptual learning
Author Affiliations
  • Robyn Kim
    Department of Psychology, University of California at Los Angeles
  • Aaron Seitz
    Department of Psychology, Boston University
  • Ladan Shams
    Department of Psychology, University of California at Los Angeles
Journal of Vision May 2008, Vol.8, 976. doi:10.1167/8.6.976
Studies of perceptual learning have traditionally focused on unisensory stimuli. However, multisensory interactions can occur at early stages of visual processing (Watkins et al., 2006, 2007) and might therefore play a role in low-level perceptual learning. Indeed, we recently demonstrated that training on a multisensory motion coherence detection task facilitates visual perceptual learning (Seitz, Kim, & Shams, 2006). Furthermore, this facilitation is not due to a general effect of attention, but rather involves processes sensitive to featural relations between the visual and auditory stimuli (Kim et al., 2007). In the current study, we investigated the neural mechanisms underlying these multisensory learning effects. We scanned the brains of six subjects using functional MRI before and after 10 days of training on the congruent audiovisual motion coherence detection task. In the scanner, subjects performed a motion discrimination task involving congruent and incongruent motion stimuli, for both the trained motion direction and the opposite direction. Comparing multisensory effects pre- and post-training, we observed robust changes in activation specific to the trained motion direction across a wide variety of brain areas, including subcortical regions (cerebellum) and "amodal" association cortices (frontal, anterior temporal, superior parietal, anterior cingulate), as well as areas traditionally regarded as sites of multisensory integration (inferior parietal lobe, superior temporal sulcus). Of particular interest, multisensory learning effects were also observed in visual and auditory cortices, which are typically considered "unisensory". While many of these brain areas have previously been implicated in multisensory processing, this study demonstrates a substantial degree of plasticity in multisensory processing systems. Furthermore, these changes at multiple processing levels may underlie the enhancement in visual learning observed with multisensory training.

Kim, R., Seitz, A., & Shams, L. (2008). Neural mechanisms of multisensory perceptual learning [Abstract]. Journal of Vision, 8(6):976, 976a, doi:10.1167/8.6.976.
