Vision Sciences Society Annual Meeting Abstract  |   June 2007
Visual perceptual learning enhanced with congruent sound
Author Affiliations
  • Robyn Kim
    Department of Psychology, University of California, Los Angeles
  • Aaron Seitz
    Department of Psychology, Boston University
  • Ladan Shams
    Department of Psychology, University of California, Los Angeles
Journal of Vision June 2007, Vol. 7, 305. https://doi.org/10.1167/7.9.305
Abstract

Numerous studies of perceptual learning have demonstrated the potential for neural plasticity in adult visual cortex; however, the effect of sensory input from other modalities on such learning has been largely neglected. Considering that the natural environment is largely multimodal, and that inputs from other modalities can affect visual processing as early as V1 (Watkins, Shams et al., 2006), multisensory interactions may play a role in perceptual learning. For example, we recently found that training with sound facilitated coherent motion detection and discrimination (Seitz, Kim & Shams, 2006). In the current study, we trained subjects over five days on a visual motion coherence detection task with visual-only, congruent audiovisual, or incongruent audiovisual stimuli. Consistent with our previous findings, when comparing performance on trials containing only visual signals, subjects trained with congruent audiovisual stimuli showed significantly greater learning than those trained with visual stimuli alone. Subjects trained with incongruent audiovisual stimuli, however, did not show such an enhancement and, in fact, did not demonstrate any significant learning. Thus, congruency between the auditory and visual stimuli modulates the effect of sound on visual learning, suggesting that the benefits of multisensory training are not merely due to increased attention or arousal during training but may result from interactions at a perceptual level.

Kim, R., Seitz, A., & Shams, L. (2007). Visual perceptual learning enhanced with congruent sound [Abstract]. Journal of Vision, 7(9):305, 305a, http://journalofvision.org/7/9/305/, doi:10.1167/7.9.305.