June 2007
Volume 7, Issue 9
Vision Sciences Society Annual Meeting Abstract
Detecting correlations between auditory and visual signals
Author Affiliations
  • Pei-Yi Ko
    Vision Science Program, UC Berkeley
  • Carmel Levitan
    Joint Graduate Group in Bioengineering, UCSF & UC Berkeley
  • Martin Banks
    Vision Science Program, UC Berkeley, and Joint Graduate Group in Bioengineering, UCSF & UC Berkeley
Journal of Vision June 2007, Vol. 7, 869. https://doi.org/10.1167/7.9.869
In combining information from multiple sources, the brain must determine which signals correspond. For instance, in a crowded room, there may be many people speaking at once, but the brain correctly determines which speaker's lip movements match which sound. To examine the ability to detect correlations between auditory and visual stimuli, we presented auditory-visual stimulus pairs that contained correlated and uncorrelated changes over time. The visual stimuli were modulated in size and the auditory stimuli were modulated in intensity. We used a two-interval, forced-choice procedure to measure correlation-detection thresholds. In the signal interval, the amplitude modulations contained a correlated component. In the no-signal interval, the modulations were uncorrelated. Observers indicated which of the two intervals on each trial contained the correlated modulations. To find the correlation-detection threshold, we varied the proportion of correlated and uncorrelated modulation in the signal interval. In one experiment, we varied the temporal frequency of the amplitude modulation by band-pass filtering the modulation waveforms. Correlation detection was good (threshold was ∼0.2) for temporal frequencies of 0.5–2 Hz and then deteriorated at progressively higher frequencies. This suggests that the mechanisms involved in detecting auditory-visual correlations are sluggish. In another experiment, we presented broad-band stimuli and varied the temporal lag between the auditory and visual stimuli. Correlation-detection threshold was roughly constant for lags of ±200 msec and was elevated about two-fold for lags of ±400 msec. There was no obvious asymmetry in this lag effect. Thus, the mechanisms involved in detecting auditory-visual correlations tolerate fairly substantial time offsets. By analogy to models of stereo correspondence, we developed an auditory-visual cross-correlator and found that its properties are similar to those observed experimentally.
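The abstract does not specify how the stimuli or the cross-correlator model were implemented. The sketch below is a minimal illustration under our own assumptions (the sampling rate, trial duration, mixing rule, and frequency-domain filtering are all hypothetical choices, not the authors' method): it mixes a shared band-limited component into two otherwise independent modulation waveforms in proportion p, then scans a normalized cross-correlation over a range of temporal lags.

```python
import numpy as np

rng = np.random.default_rng(0)

def bandpass_noise(n, fs, lo, hi):
    """White Gaussian noise band-pass filtered in the frequency domain."""
    spec = np.fft.rfft(rng.standard_normal(n))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spec[(freqs < lo) | (freqs > hi)] = 0.0
    x = np.fft.irfft(spec, n)
    return x / x.std()

def stimulus_pair(n, fs, lo, hi, p):
    """Auditory and visual modulation waveforms sharing a common component
    with proportion p (p = 0: fully uncorrelated; p = 1: identical)."""
    common = bandpass_noise(n, fs, lo, hi)
    a = p * common + (1 - p) * bandpass_noise(n, fs, lo, hi)
    v = p * common + (1 - p) * bandpass_noise(n, fs, lo, hi)
    return a, v

def cross_correlation(a, v, max_lag):
    """Normalized cross-correlation of the two waveforms over temporal lags
    (in samples); normalization makes the result insensitive to amplitude."""
    lags = np.arange(-max_lag, max_lag + 1)
    cc = np.array([np.corrcoef(np.roll(a, k), v)[0, 1] for k in lags])
    return lags, cc

fs, dur = 60.0, 4.0                 # hypothetical: 60 Hz sampling, 4 s trial
n = int(fs * dur)
a, v = stimulus_pair(n, fs, 0.5, 2.0, p=0.4)
lags, cc = cross_correlation(a, v, max_lag=int(0.4 * fs))
print(f"peak correlation {cc.max():.2f} at lag {lags[cc.argmax()] / fs * 1000:.0f} ms")
```

Varying p in the signal interval while holding p = 0 in the no-signal interval mimics the threshold manipulation described above; restricting lo–hi reproduces the band-pass manipulation, and shifting one waveform before correlating reproduces the lag manipulation.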

Ko, P.-Y., Levitan, C., & Banks, M. (2007). Detecting correlations between auditory and visual signals [Abstract]. Journal of Vision, 7(9):869, 869a, http://journalofvision.org/7/9/869/, doi:10.1167/7.9.869.
Funding: NIH EY014194
