Anton L. Beer, Takeo Watanabe; Modulation of visual perceptual learning by sounds. Journal of Vision 2006;6(6):173. doi: 10.1167/6.6.173.
Repetitive practice of a task leads to enhanced performance on that task. Because this so-called perceptual learning is in some cases highly specific to primitive visual features, it may involve low-level visual stages (e.g., Fahle & Poggio, 2001). Another line of research has shown that visual spatial perception is affected by auditory signals at so-called 'modality-specific' stages (e.g., Eimer, 2001; McDonald et al., 2003). The present study investigated whether and how these crossmodal interactions in spatial perception also play a role in perceptual learning. Participants identified the direction of moving dots presented at one of various peripheral locations both prior to and following a crossmodal training. During the crossmodal training, two motion displays were presented to the right and left together with a sound coming from one of the display locations. Participants' task was to detect a target sound at one of the two sound locations. One motion direction co-occurred frequently with the target sound at the relevant side. Another motion direction was more likely tied to the same sound from the irrelevant side. All other directions were equally likely tied to distractor sounds. Performance improvements in the motion direction identification task tended to be larger for the motion direction tied to the target sound at the relevant side than for the other motion directions. These auditorily induced perceptual learning effects appeared to be spatially specific. The results suggest that crossmodal interactions not only affect instantaneous perception but also modulate perceptual learning.