Abstract
We assessed the contribution of the pictorial cues of linear perspective, texture, and scene clutter to the perception of distance using consumer virtual reality. In sparse environments, observers tend to underestimate the distance to far objects. As additional cues are made available, distance perception is predicted to improve, as measured by a reduction in systematic bias and an increase in precision. We assessed (1) whether space is nonlinearly distorted and (2) the degree of size constancy across changes in distance. In the first task, observers adjusted the positions of two spheres, presented at eye-height, so as to divide the space between themselves and an end reference stimulus (presented at between 3 and 11 m) into three equal sections. In the second task, observers set the size of a sphere, presented at the same distances and at eye-height, to match that of a hand-held football. Each task was performed in four environments. The first contained just a ground plane with a visible horizon. The second was a corridor defined solely by perspective cues. The third was the same corridor, with added textured walls and floor. The fourth contained object clutter, in addition to perspective and texture cues. We measured accuracy by identifying systematic biases in observers’ responses, and precision as the standard deviation of those responses. While there was no evidence of nonlinear compression of space, observers tended to underestimate distance. This bias was reduced when perspective cues were available, but no further improvement was seen when texture and scene clutter were added. Similarly, observers were least precise in the sparse environment, with little improvement beyond the presence of perspective cues. These results show that linear perspective cues improve the accuracy and precision of distance estimates in virtual reality.
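For concreteness, the accuracy and precision measures can be formalized as follows; this is a minimal sketch of the definitions above, where the notation (response $r_i$, true target distance $d$, and mean response $\bar{r}$ over $n$ settings) is ours, not the paper's:

\[
\text{bias} = \frac{1}{n}\sum_{i=1}^{n}\left(r_i - d\right), \qquad
\text{precision} = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(r_i - \bar{r}\right)^2}.
\]

A bias near zero indicates accurate settings, while a smaller standard deviation indicates more precise ones.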