September 2021
Volume 21, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract
Auditory Context Alters Visual Perception
Author Affiliations
  • Jamal Williams
    University of California, San Diego
  • Yuri Markov
    HSE University, Russia
  • Natalia Tiurina
    HSE University, Russia
  • Viola Stoermer
    University of California, San Diego
    Dartmouth College
Journal of Vision September 2021, Vol.21, 2796. doi:https://doi.org/10.1167/jov.21.9.2796
Jamal Williams, Yuri Markov, Natalia Tiurina, Viola Stoermer; Auditory Context Alters Visual Perception. Journal of Vision 2021;21(9):2796. https://doi.org/10.1167/jov.21.9.2796.



© ARVO (1962-2015); The Authors (2016-present)

Abstract

Visual inputs are often obscured, distorted, or ambiguous, and to form meaningful representations of incoming information, the visual system relies not only on the features of the object itself but also on the surrounding context. Most studies have focused on how visual context influences visual object perception, and it is less clear how concurrent auditory information about objects (the sound of a lawnmower, or the whistling of a tea kettle) influences what objects we see and how we experience them. Here, we investigate whether naturalistic sounds modulate the representation of visual objects. We used a visual discrimination task and a novel set of ambiguous object stimuli that were paired at random with related or unrelated sounds. Specifically, we created ambiguous stimuli by morphing together the features of two objects (Object A, Object B, e.g., a hammer and a seal), and presented these ambiguous morph stimuli with naturalistic sounds that were related to either Object A or Object B. Visual objects and sounds were presented simultaneously, and at the end of each trial, participants indicated what object they saw using continuous report. Overall, we found that sounds biased visual object recognition, such that the perceptual representation was pulled towards the object features that matched the sound (Exp. 1a-1b). For example, the same ambiguous hammer-seal object would appear more seal-like when paired with the sound of a seal barking, but more hammer-like when paired with the sound of a hammer striking. In a series of control experiments, we show that this effect is not driven by response bias (Exp. 2a-2b), and not due to a general effect of expectation (Exp. 3). These results indicate that visual object representations are biased by contextual auditory information, reflecting the continuous integration of auditory and visual information during real-world perception.
