September 2011, Volume 11, Issue 11
Vision Sciences Society Annual Meeting Abstract
Synchronized audio-visual transients drive efficient visual search for motion-in-depth
Author Affiliations
  • Marina Zannoli
    Laboratoire Psychologie de la Perception, Université Paris Descartes & CNRS, France
  • John Cass
    School of Psychology, University of Western Sydney, Australia
  • Pascal Mamassian
    Laboratoire Psychologie de la Perception, Université Paris Descartes & CNRS, France
  • David Alais
    School of Psychology, University of Sydney, Australia
Journal of Vision September 2011, Vol.11, 792. doi:

Marina Zannoli, John Cass, Pascal Mamassian, David Alais; Synchronized audio-visual transients drive efficient visual search for motion-in-depth. Journal of Vision 2011;11(11):792.


In natural audio-visual environments, a change in depth is usually correlated with a change in loudness. In the present study, we investigated whether correlating disparity with loudness confers a functional advantage in a visual search paradigm. To test this, we adapted the method of Van der Burg et al. (2008), who showed that non-spatial modulations of loudness can drastically improve spatial visual search for a correlated luminance modulation. In a follow-up study (2010), they varied the shape of the temporal modulation and demonstrated that transient events (square-wave modulations) are required for this search efficiency, and that sinusoidal audiovisual modulations do not support efficient search. We used dynamic random-dot stereogram displays to produce pure disparity modulations. Target and distractors were 0.35 × 0.35 deg disparity-defined squares (either 6 or 10 in total) presented on a ring at 2.5 deg eccentricity. Each square moved back and forth in depth between zero and +12 arcmin of (crossed) disparity, each at a different phase. The target's depth modulation was synchronized with an amplitude-modulated 500 Hz tone. Visual and auditory modulations were always congruent (both sine-wave or both square-wave). Four observers gave speeded responses in a discrimination task on the target. Because binocular matching processes are known to favor smooth over abrupt changes of disparity across space and time, we expected the sine-modulation condition to be at least as efficient as the square-modulation condition in supporting search. However, the results showed a significant improvement in visual search in the square condition relative to the sine condition, suggesting that transient auditory information can efficiently drive visual search in the disparity domain. In a second experiment, correlating the sound with a distractor led to longer search times, indicating that the correlation is not easily ignored.
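The two stimulus conditions described above can be sketched in Python. This is a minimal illustration, not the authors' code: the disparity range (0 to +12 arcmin crossed) and the 500 Hz carrier come from the abstract, while the modulation rate (`rate_hz`) and the sample grid are assumed free parameters that the abstract does not report.

```python
import numpy as np

def disparity_profile(t, rate_hz, shape="sine", max_arcmin=12.0):
    """Disparity over time for one square, oscillating between
    zero and +12 arcmin (crossed), as in the abstract.
    `rate_hz` is an assumed parameter (not reported in the abstract)."""
    phase = 2 * np.pi * rate_hz * t
    if shape == "sine":
        # Smooth back-and-forth motion in depth.
        carrier = 0.5 * (1.0 + np.sin(phase))
    else:
        # "square": abrupt transitions between the near and far planes.
        carrier = (np.sin(phase) > 0).astype(float)
    return max_arcmin * carrier

def am_tone(t, rate_hz, shape="sine", carrier_hz=500.0):
    """500 Hz tone whose amplitude envelope is congruent with the
    target's disparity modulation (both sine or both square)."""
    envelope = disparity_profile(t, rate_hz, shape, max_arcmin=1.0)
    return envelope * np.sin(2 * np.pi * carrier_hz * t)

# One second of each condition at an assumed 44.1 kHz sampling rate.
t = np.linspace(0.0, 1.0, 44100, endpoint=False)
sine_target = disparity_profile(t, 1.0, "sine")
square_target = disparity_profile(t, 1.0, "square")
sine_sound = am_tone(t, 1.0, "sine")
```

In the square condition the disparity signal only ever takes the near and far values, which is what makes its onsets and offsets transient; the sine condition passes through all intermediate depths.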

This research was supported by a French Ministère de l'Enseignement Supérieur et de la Recherche grant to MZ.
