Abstract
Scene analysis is fundamental to perceiving, interacting with, and navigating the environment. In the absence of visual information, scene analysis relies largely on auditory signals such as reverberation, the aggregate of acoustic reflections from multiple nearby surfaces. Sound sources and the spaces surrounding them are separably coded (Teng et al., 2017), an operation contingent on the spectral and temporal statistics of the reverberant background (Traer & McDermott, 2016). It remains unclear how these perceptual heuristics develop or how they are shaped by experience. Here we investigated whether visual experience modulates the perception of reverberation. The experience-dependence hypothesis predicts higher-fidelity reverberant coding in early- and congenitally blind listeners, who depend more heavily on acoustic cues. Alternatively, the calibration hypothesis predicts that, deprived of a visual “scaffold,” blind listeners would be impaired in reverberant coding. To test these predictions, we conducted an online experiment in which sighted and blind participants listened to pairs of spoken sentences, each convolved with a reverberant impulse response (IR). The IRs were either recorded in a real-world space or synthesized to match or deviate from the temporal and spectral features of that same space; manipulations targeted the temporal decay rate and the spectral distribution of the IRs. On each trial, participants made a two-alternative forced-choice (2AFC) judgment of which space was “real.” Both blind and sighted participants were highly sensitive to temporal deviations from ecologically valid reverberation and less sensitive to spectral deviations. Interestingly, while sighted listeners reliably distinguished spectrally altered reverberation at above-chance levels, preliminary results indicate markedly reduced performance in blind listeners. Below-chance performance in some of these conditions suggests sensitivity to spectral alterations (cf. Voss et al., 2011) but ambiguity in assigning them to the correct category. Taken together, our results suggest that visual experience modulates representations of auditory environmental statistics.
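For concreteness, the stimulus logic described above can be sketched roughly as follows. This is a minimal Python illustration, not the study's actual stimulus code: the file names, decay constants, exponential decay model, and mono-audio assumption are all hypothetical stand-ins for the recorded and synthesized IRs used in the experiment.

```python
# Illustrative sketch only: convolving a dry spoken sentence with a room
# impulse response (IR), and altering the IR's temporal decay rate by
# exponential reweighting. All file names and parameters are hypothetical.
import numpy as np
from scipy.io import wavfile
from scipy.signal import fftconvolve

fs, speech = wavfile.read("dry_sentence.wav")  # hypothetical anechoic recording (mono)
_, ir = wavfile.read("room_ir.wav")            # hypothetical measured room IR (mono)
speech = speech.astype(np.float64)
ir = ir.astype(np.float64)

def rescale_decay(ir, fs, tau_orig, tau_new):
    """Reweight an IR whose tail decays roughly as exp(-t / tau_orig)
    so that it decays as exp(-t / tau_new) instead (an idealized
    stand-in for a decay-rate manipulation)."""
    t = np.arange(len(ir)) / fs
    return ir * np.exp(t * (1.0 / tau_orig - 1.0 / tau_new))

# "Real" stimulus: speech convolved with the recorded IR.
real = fftconvolve(speech, ir)

# "Deviant" stimulus: the same speech convolved with a decay-altered IR
# (here, lengthening an assumed 0.5 s decay constant to 1.0 s).
deviant = fftconvolve(speech, rescale_decay(ir, fs, tau_orig=0.5, tau_new=1.0))

# Normalize to avoid clipping, then save the trial pair.
real /= np.max(np.abs(real))
deviant /= np.max(np.abs(deviant))
wavfile.write("real_trial.wav", fs, real.astype(np.float32))
wavfile.write("deviant_trial.wav", fs, deviant.astype(np.float32))
```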