July 2013
Volume 13, Issue 9
Vision Sciences Society Annual Meeting Abstract | July 2013
Sound delay in audiovisual events can signal object depth
Author Affiliations
  • Philip Jaekl
    Center for Visual Science & Dept. of Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA 14627
  • Jakob Maxwell Seidlitz
    Center for Visual Science & Dept. of Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA 14627
  • Duje Tadin
    Center for Visual Science & Dept. of Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA 14627
    Department of Ophthalmology, University of Rochester, Rochester, NY, USA 14627
Journal of Vision July 2013, Vol.13, 886. doi:https://doi.org/10.1167/13.9.886
Abstract

The slower arrival time of sound relative to light varies linearly with the distance of an audiovisual event and can potentially be used as a reliable cue to depth. Here, we aimed to determine whether sound delay in audiovisual events can influence visual judgments of distance. We hypothesized that a given visual event paired with delayed sound would be perceived as being further away than when paired with synchronous sound. Using a 2IFC task, participants judged the distance of two stereoscopically presented random dot clouds in a fixed 3D volume (250 ms duration, 1-1.5 ms ISI). In one interval the cloud was paired with a synchronous sound, while in the other the sound was delayed by 80 ms relative to the visual stimulus onset. By varying the disparity of the dots, the 'cloud' presented in the second interval was stereoscopically shifted either toward or away from the participant relative to the first cloud. Task difficulty was controlled by the percentage of dots that coherently shifted in depth, ranging from 0% (random dispersion of dots) to 100% (fully coherent shift). When the delayed sound was presented in the second interval, participants showed a bias towards perceiving the stimuli as shifting away from the observer; the opposite pattern occurred when the delayed sound was presented in the first interval. Comparing congruent and incongruent audiovisual pairings, we found a trend for participants to be more precise when sound delays were congruent with the direction of the visual depth shift. These findings reveal that sound delay in audiovisual events can be used when making distance judgments, a task previously thought to be solved only visually. Evidently, sound delays modulate visual depth perception and possibly provide a previously unknown depth cue.
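
The linear relationship between sound delay and source distance that the abstract builds on follows from the speed of sound in air (roughly 343 m/s at room temperature; light travel time is negligible over these distances). The snippet below is an illustrative sketch only, not part of the study: it converts between audiovisual delay and distance, and under this assumption the 80 ms delay used in the experiment would naturally arise from a source about 27 m away.

    # Illustrative sketch: relate audiovisual sound delay to source distance.
    # Assumes sound travels at ~343 m/s in air and light arrives effectively instantly.
    SPEED_OF_SOUND_M_PER_S = 343.0

    def sound_delay_s(distance_m: float) -> float:
        """Delay of sound relative to light for a source distance_m metres away."""
        return distance_m / SPEED_OF_SOUND_M_PER_S

    def implied_distance_m(delay_s: float) -> float:
        """Distance at which a given audiovisual delay would occur naturally."""
        return delay_s * SPEED_OF_SOUND_M_PER_S

    if __name__ == "__main__":
        print(f"10 m source -> {sound_delay_s(10.0) * 1000:.1f} ms sound delay")      # ~29.2 ms
        print(f"80 ms delay -> {implied_distance_m(0.080):.1f} m implied distance")   # ~27.4 m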

Meeting abstract presented at VSS 2013
