Lili Tcheang, Andrew Glennerster, Stuart J Gilson, Andrew J Parker; Systematic distortions of perceptual stability investigated using virtual reality. Journal of Vision 2003;3(9):497. doi: https://doi.org/10.1167/3.9.497.
As observers walk through a 3-D environment with their gaze fixed on a static object, their retinal image of that object changes as if the object itself were rotating. We investigated how well observers can judge whether an object is rotating when that rotation is linked to the observer's own movement. Subjects wore a head-mounted display in an immersive virtual reality environment and fixated a spherical textured object at a distance of approximately 1.5 m while walking from side to side (approximately ±1 m). On each trial, the object rotated about a vertical axis with a randomly assigned rotational gain factor in the range ±1: a gain of +1 caused the object to always face the observer; a gain of −1 caused an equal and opposite rotation; a gain of zero left the object static in world coordinates. In a forced-choice paradigm, subjects judged the sign of the rotational gain. We found significant biases in subjects' judgements when the target object was presented in isolation. These biases varied little with viewing distance, suggesting that they were caused by an underestimation of the distance walked. In a rich visual environment, subjects' judgements were more precise and biases were reduced. This was also true, in general, when we manipulated proprioceptive information by correlating the lateral translation of the target object with the observer's motion.
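The rotational-gain manipulation described above can be illustrated with a minimal sketch. The abstract does not give the exact geometry used, so the following assumes the simplest interpretation: as the observer moves laterally, the direction from the object to the observer rotates through an angle θ = atan2(lateral offset, viewing distance), and the object is rotated by gain × θ about its vertical axis. The function name and parameters are illustrative, not taken from the paper.

```python
import math

def object_rotation_deg(observer_x, distance, gain):
    """Rotation (degrees) applied to the object about its vertical axis.

    observer_x -- observer's lateral offset from straight ahead (metres)
    distance   -- viewing distance from observer's path to object (metres)
    gain       -- rotational gain factor in the range [-1, +1]:
                  +1: object rotates to always face the observer
                   0: object remains static in world coordinates
                  -1: equal and opposite rotation

    Assumed geometry (not specified in the abstract): the direction from
    the object to the observer rotates by atan2(observer_x, distance).
    """
    theta = math.degrees(math.atan2(observer_x, distance))
    return gain * theta

# Observer 1 m to the side of an object 1.5 m away: the direction to the
# observer has rotated by atan2(1.0, 1.5), roughly 33.7 degrees, so a
# gain of +1 rotates the object by that same angle.
```

With gain = 0 the function returns 0 for any observer position (the object is world-stable), which is the condition subjects had to discriminate from positive and negative gains.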