Kalpana Dokka, Paul R. MacNeilage, Gregory C. DeAngelis, Dora E. Angelaki; Estimating distance during self-motion: A role for visual–vestibular interactions. Journal of Vision 2011;11(13):2. doi: https://doi.org/10.1167/11.13.2.
© ARVO (1962-2015); The Authors (2016-present)
A fundamental challenge for the visual system is to extract the 3D spatial structure of the environment. When an observer translates without moving the eyes, the retinal speed of a stationary object is related to its distance by a scale factor that depends on the velocity of the observer's self-motion. Here, we aim to test whether the brain uses vestibular cues to self-motion to estimate distance to stationary surfaces in the environment. This relationship was systematically probed using a two-alternative forced-choice task in which distance perceived from monocular image motion during passive body translation was compared to distance perceived from binocular disparity while subjects were stationary. We show that perceived distance from motion depended on both observer velocity and retinal speed. For a given head speed, slower retinal speeds led to the perception of farther distances. Likewise, for a given retinal speed, slower head speeds led to the perception of nearer distances. However, these relationships were weak in some subjects and absent in others, and distance estimated from self-motion and retinal image motion was substantially compressed relative to distance estimated from binocular disparity. Overall, our findings suggest that the combination of retinal image motion and vestibular signals related to head velocity can provide a rudimentary capacity for distance estimation.
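The scale-factor relationship described above can be made concrete. As a minimal illustrative sketch (not the authors' analysis): assume pure lateral head translation at speed v with gaze perpendicular to the motion path, so a stationary point at distance d produces retinal angular speed ω ≈ v / d, and distance can be recovered as d = v / ω. The function name and parameters below are hypothetical.

```python
import math

def distance_from_motion(head_speed_m_s, retinal_speed_deg_s):
    """Estimate distance (m) to a stationary point from self-motion cues.

    Illustrative sketch only. Assumes lateral translation with gaze
    held perpendicular to the motion path, so the point's retinal
    angular speed omega (rad/s) relates to distance d and head
    speed v by omega = v / d, giving d = v / omega.
    """
    omega_rad_s = math.radians(retinal_speed_deg_s)
    return head_speed_m_s / omega_rad_s
```

Under this geometry, the two effects reported in the abstract fall out directly: holding head speed fixed, a slower retinal speed yields a larger (farther) distance estimate, and holding retinal speed fixed, a slower head speed yields a smaller (nearer) one.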