Pei-Yi Ko, Carmel Levitan, Martin Banks; Detecting correlations between auditory and visual signals. Journal of Vision 2007;7(9):869. doi: 10.1167/7.9.869.
In combining information from multiple sources, the brain must determine which signals correspond. For instance, in a crowded room, there may be many people speaking at once, but the brain correctly determines which speaker's lip movements match which sound. To examine the ability to detect correlations between auditory and visual stimuli, we presented auditory-visual stimulus pairs that contained correlated and uncorrelated changes over time. The visual stimuli were modulated in size and the auditory stimuli were modulated in intensity. We used a two-interval, forced-choice procedure to measure correlation-detection thresholds. In the signal interval, the amplitude modulations contained a correlated component. In the no-signal interval, the modulations were uncorrelated. Observers indicated which of the two intervals on each trial contained the correlated modulations. To find the correlation-detection threshold, we varied the proportion of correlated and uncorrelated modulation in the signal interval. In one experiment, we varied the temporal frequency of the amplitude modulation by band-pass filtering the modulation waveforms. Correlation detection was good (threshold was ∼0.2) for temporal frequencies of 0.5–2 Hz and then deteriorated at progressively higher frequencies. This suggests that the mechanisms involved in detecting auditory-visual correlations are sluggish. In another experiment, we presented broad-band stimuli and varied the temporal lag between the auditory and visual stimuli. Correlation-detection threshold was roughly constant for lags of ±200 msec and was elevated about two-fold for lags of ±400 msec. There was no obvious asymmetry in this lag effect. Thus, the mechanisms involved in detecting auditory-visual correlations tolerate fairly substantial time offsets. In analogy to models of stereo correspondence, we developed an auditory-visual cross-correlator and found that its properties are similar to those observed experimentally.
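The stimulus construction and the cross-correlator model described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: function names, the sampling rate, the mixing rule (a weighted sum of a shared and an independent band-limited waveform, with weight standing in for the "proportion of correlated modulation"), and the lag window are all assumptions made for the example.

```python
import numpy as np

def bandpass_noise(rng, n, fs, lo, hi):
    # Gaussian noise band-pass filtered in the frequency domain
    # (analogous to the band-passed modulation waveforms in the abstract).
    x = rng.standard_normal(n)
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    spec[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spec, n)

def make_interval(rng, n, fs, rho, lo=0.5, hi=2.0):
    # Auditory and visual modulation waveforms sharing a common component
    # mixed in proportion rho; rho = 0 gives an uncorrelated (no-signal)
    # interval. The exact mixing rule is an assumption for this sketch.
    common = bandpass_noise(rng, n, fs, lo, hi)
    aud = rho * common + (1.0 - rho) * bandpass_noise(rng, n, fs, lo, hi)
    vis = rho * common + (1.0 - rho) * bandpass_noise(rng, n, fs, lo, hi)
    return aud, vis

def correlation_strength(aud, vis, max_lag):
    # Peak magnitude of the normalized cross-correlation within
    # +/- max_lag samples, mirroring a lag-tolerant cross-correlator.
    a = (aud - aud.mean()) / aud.std()
    v = (vis - vis.mean()) / vis.std()
    n = len(a)
    return max(
        abs(np.dot(a[max(0, -k):n - max(0, k)],
                   v[max(0, k):n - max(0, -k)])) / (n - abs(k))
        for k in range(-max_lag, max_lag + 1)
    )

def choose_signal_interval(pair1, pair2, max_lag=20):
    # 2IFC decision rule: report the interval whose auditory-visual
    # cross-correlation is stronger.
    c1 = correlation_strength(*pair1, max_lag)
    c2 = correlation_strength(*pair2, max_lag)
    return 0 if c1 > c2 else 1
```

Sweeping `rho` in `make_interval` and scoring `choose_signal_interval` over many simulated trials would trace out a psychometric function for the model observer, which is one way its properties could be compared with the human thresholds reported above.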