Abstract
For audiovisual events, sound arrives with a delay relative to light, and this delay varies linearly with event distance. We sought to determine whether this ubiquitous audiovisual asynchrony can be exploited as a multisensory cue to object distance. Specifically, we hypothesized that events with greater audiovisual delays would be perceived as farther away. To test this hypothesis, we presented visual stimuli paired with leading and lagging sound onsets. In an adjustment task, participants (N = 5) controlled the disparities of dots within two random dot clusters, presented one at a time on the left and right of the display. Participants adjusted the disparities such that increases on one side corresponded with decreases on the other. When they perceived the two clusters to be at the same distance, the relative disparity was recorded as the response variable. For each stimulus, one dot cluster was paired with a sound lead, while the other was paired with an equal sound lag. Sound asynchronies ranged from 0 to 100 ms in 20 ms increments. The results showed that, for audiovisual asynchronies greater than 20 ms, participants exhibited a significant bias toward perceiving dot clusters paired with a lagging sound as more distant. Evidently, sound delays can serve as a non-metric cue to visual distance. These findings reveal a new role for multisensory audiovisual cues in depth perception, a function commonly regarded as a unimodal, visual process.
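The linear relation between event distance and audiovisual delay follows directly from the travel times of sound and light. As a rough illustration (not part of the original abstract), the Python sketch below computes the delay for a few example distances, assuming a sound speed of roughly 343 m/s in air and treating light travel time as negligible; under these assumptions, the 0-100 ms asynchronies used in the experiment correspond to event distances of roughly 0-34 m.

```python
# Minimal sketch: why audiovisual delay scales linearly with distance.
# Light arrival is effectively instantaneous at everyday distances, so the
# asynchrony is dominated by the sound's travel time (~2.9 ms per metre).

SPEED_OF_SOUND_M_PER_S = 343.0          # assumed: dry air at ~20 degrees C
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def audiovisual_delay_ms(distance_m: float) -> float:
    """Sound-minus-light arrival delay, in milliseconds, for an event
    at the given distance from the observer."""
    sound_travel_s = distance_m / SPEED_OF_SOUND_M_PER_S
    light_travel_s = distance_m / SPEED_OF_LIGHT_M_PER_S
    return (sound_travel_s - light_travel_s) * 1000.0

if __name__ == "__main__":
    for distance in (1, 5, 10, 20, 34):
        print(f"{distance:>3} m -> {audiovisual_delay_ms(distance):6.1f} ms")
```

Running the sketch shows that a 34 m event distance yields a delay of about 99 ms, matching the upper end of the asynchrony range used in the study.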
Meeting abstract presented at VSS 2014