October 2020
Volume 20, Issue 11
Open Access
Vision Sciences Society Annual Meeting Abstract
The effects of cross-modal feature and location mappings on visual performance
Author Affiliations
  • Janna Wennberg
    University of California, San Diego
  • Viola S. Stoermer
    University of California, San Diego
Journal of Vision October 2020, Vol.20, 1578. doi:https://doi.org/10.1167/jov.20.11.1578

      Janna Wennberg, Viola S. Stoermer; The effects of cross-modal feature and location mappings on visual performance. Journal of Vision 2020;20(11):1578. https://doi.org/10.1167/jov.20.11.1578.

      © ARVO (1962-2015); The Authors (2016-present)

Previous research has shown that hearing a sound improves visual processing of objects appearing at the same location (Störmer, 2019). Another line of research suggests that, beyond these spatial effects, cross-modal mappings arise from natural associations between certain auditory and visual features. For example, high-pitched sounds tend to be associated with bright visual stimuli and low-pitched sounds with dark stimuli (Marks, 1987). However, it is unclear whether these cross-modal feature mappings influence visual perception in the same way spatial mappings do, i.e., by improving perception for congruent (relative to incongruent) sound-object pairs. We tested this by asking participants (N = 32) to perform a cross-modal visual discrimination task in which a bright or dark disk was briefly presented on the left or right side of a screen. Participants indicated whether they saw a small gap inside the disk and which disk type was presented (bright or dark; four-alternative forced choice). Critically, on each trial the disk was preceded by a peripheral high- or low-pitched sound from the left or right side of the screen that was predictive of neither the spatial location nor the brightness of the disk. Participants showed higher visual discrimination accuracy for both spatially congruent cues (p = 0.005, d = 0.53) and feature-congruent cues (p = 0.008, d = 0.50), with no interaction (p = 0.146). These results replicate previous findings that peripheral sounds facilitate the processing of co-localized visual stimuli. Further, they demonstrate that audiovisual feature associations, such as pitch and brightness, can have similar effects on accuracy in a visual discrimination task, at least when the visual feature (brightness) is task-relevant. To address the possibility of response bias, future research should assess whether feature cues remain facilitatory when brightness is task-irrelevant.

