June 2006, Volume 6, Issue 6
Vision Sciences Society Annual Meeting Abstract
Modulation of visual perceptual learning by sounds
Author Affiliations
  • Anton L. Beer
    Vision Sciences Laboratory, Boston University, Boston, MA 02215
  • Takeo Watanabe
    Vision Sciences Laboratory, Boston University, Boston, MA 02215
Journal of Vision June 2006, Vol.6, 173. doi:https://doi.org/10.1167/6.6.173
Anton L. Beer, Takeo Watanabe; Modulation of visual perceptual learning by sounds. Journal of Vision 2006;6(6):173. doi: https://doi.org/10.1167/6.6.173.
Repetitive practice of a task leads to enhanced performance of that task. Since this so-called perceptual learning is in some cases highly specific to primitive visual features, it may involve low-level visual stages (e.g., Fahle & Poggio, 2001). Another line of research has shown that visual spatial perception is affected at so-called 'modality-specific' stages by auditory signals (e.g., Eimer, 2001; McDonald et al., 2003). The present study investigated whether and how these crossmodal interactions in spatial perception also play a role in perceptual learning. Participants identified the direction of moving dots presented at one of various peripheral locations both prior to and following crossmodal training. During the crossmodal training, two motion displays were presented to the right and left, together with a sound coming from one of the display locations. Participants' task was to detect a target sound at one of the two sound locations. One motion direction co-occurred frequently with the target sound on the relevant side. Another motion direction was more likely paired with the same sound from the irrelevant side. All other directions were equally likely paired with distractor sounds. Performance improvements in the motion direction identification task tended to be larger for the motion direction paired with the target sound on the relevant side than for other motion directions. These auditorily induced perceptual learning effects appeared to be spatially specific. The results suggest that crossmodal interactions not only affect instantaneous perception but also modulate perceptual learning.

Beer, A. L., & Watanabe, T. (2006). Modulation of visual perceptual learning by sounds [Abstract]. Journal of Vision, 6(6):173, 173a, http://journalofvision.org/6/6/173/, doi:10.1167/6.6.173.

Supported by NSF grant (BCS-9905914), NIH grant (R01EY015980-01), Human Frontier Research grant (RGP18/2004), and NSF (CELEST).
