September 2021
Volume 21, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract | September 2021
Visual experience modulates sensitivity to statistics of reverberation
Author Affiliations & Notes
  • Haydee Garcia-Lazaro
    Smith-Kettlewell Eye Research Institute
  • Audrey Wong-Kee-You
    Smith-Kettlewell Eye Research Institute
  • Yavin Alwis
    Smith-Kettlewell Eye Research Institute
  • Santani Teng
    Smith-Kettlewell Eye Research Institute
  • Footnotes
Acknowledgements  This work was supported by an E. Matilda Ziegler Foundation for the Blind Research Grant (to ST) and a Rachel C. Atkinson Fellowship (to AW).
Journal of Vision September 2021, Vol.21, 2926. doi:
      © ARVO (1962-2015); The Authors (2016-present)


Scene analysis is fundamental to successfully perceiving, interacting with, and navigating through the environment. In the absence of visual information, scene analysis relies largely on auditory signals such as reverberation, the aggregated acoustic reflections from multiple nearby surfaces. Sound sources and the spaces surrounding them are separably coded (Teng et al., 2017), an operation contingent on the spectral and temporal statistics of the reverberant background (Traer & McDermott, 2016). It remains unclear how these perceptual heuristics develop or how they are influenced by experience. Here we investigated whether visual experience modulates reverberant perception. The experience-dependence hypothesis predicts higher fidelity in early- and congenitally blind listeners, who depend more heavily on acoustic cues. Alternatively, the calibration hypothesis predicts that, deprived of a visual “scaffold,” blind listeners would be impaired in reverberant coding. To test these predictions, we conducted an online experiment in which sighted and blind participants listened to pairs of spoken sentences, each convolved with a reverberant impulse response (IR). The IRs were recorded from a real-world space or synthesized to match or deviate from the temporal and spectral features of that same space; manipulations included the temporal decay rate and the spectral distribution of the IRs. The task was a two-alternative forced-choice (2AFC) judgment indicating which of the two spaces was “real.” Both blind and sighted participants were highly sensitive to temporal deviations from ecologically valid reverberation and less sensitive to spectral deviations. Interestingly, while sighted listeners reliably distinguished spectrally altered reverberation above chance, preliminary results indicate markedly reduced performance in blind listeners. Some below-chance performance in these conditions suggests sensitivity to spectral alterations (cf. Voss et al., 2011), but ambiguity in assigning them to the correct category. Taken together, our results suggest that visual experience modulates representations of auditory environmental statistics.
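The stimulus manipulation described above can be sketched in a minimal form. This is not the authors' actual stimulus-generation code; it assumes, for illustration, a toy IR synthesized as exponentially decaying Gaussian noise (a simplification of the kind of generative IR model described by Traer & McDermott, 2016), with hypothetical function names, and shows how varying the temporal decay rate yields a "deviant" reverberant version of the same dry signal:

```python
import numpy as np
from scipy.signal import fftconvolve

def synth_ir(duration_s=1.0, rt60_s=0.6, fs=16000, seed=0):
    """Toy reverberant impulse response: Gaussian noise with an
    exponentially decaying envelope. rt60_s is the time for the
    envelope to fall by 60 dB (the temporal decay-rate parameter)."""
    rng = np.random.default_rng(seed)
    n = int(duration_s * fs)
    t = np.arange(n) / fs
    # A 60 dB amplitude drop over rt60_s corresponds to exp(-6.91 * t / rt60_s)
    envelope = np.exp(-6.91 * t / rt60_s)
    return rng.standard_normal(n) * envelope

def reverberate(dry, ir):
    """Convolve a dry signal with an IR (full convolution keeps the tail)."""
    return fftconvolve(dry, ir)

fs = 16000
dry = np.sin(2 * np.pi * 440 * np.arange(fs) / fs)  # 1 s, 440 Hz dry tone
ir_natural = synth_ir(rt60_s=0.6, fs=fs)            # baseline decay rate
ir_deviant = synth_ir(rt60_s=2.4, fs=fs)            # slowed decay: a temporal deviation
wet_natural = reverberate(dry, ir_natural)
wet_deviant = reverberate(dry, ir_deviant)
```

A spectral manipulation, by analogy, would filter the IR to alter its frequency distribution while leaving the decay envelope intact; pairing `wet_natural` against such altered versions is the essence of the 2AFC comparison.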

