Abstract
The ground surface can serve as a reference frame for coding object location (Gibson, 1950; Sedgwick, 1986). Consistent with this, Sinai et al. (1998) found that humans make errors in judging distance when continuous ground-surface information is disrupted. Their study was conducted in the real world, where it is difficult to control for extraneous environmental variables, particularly those on the ground surface. Here, to obviate this problem, we used a virtual reality system (V8-HMD/Intersense/SGI) to measure absolute distance judgments under three ground conditions: (i) gap: the target was placed on the far side of a gap in the ground (2–8 m in extent, 0.5–2 m deep); (ii) texture discontinuation: the target was placed on a cobblestone texture flanked by grass texture, and vice versa; (iii) occlusion: the target was placed on the grass beyond a brick wall measuring 0.5 m (height) × 1 m (depth) × 5 m (width). In each condition, observers viewed the test target (at 5, 7, 9, or 11 m) through the HMD and judged and remembered its absolute distance. They then turned 180 deg to face a matching object on a continuous ground surface, and perceptually matched the distance of this object to the remembered distance of the test target. To reveal the impact of self-motion on distance judgment, these tests were conducted with the head tracker both on and off. Overall, we found that compared to baseline (continuous ground), observers significantly overestimated distance in the gap condition and underestimated distance in the texture-discontinuation and occlusion conditions. Interestingly, in the gap condition, observers showed significantly larger overestimation errors when self-motion feedback was present in the scene. Thus, our study in virtual reality not only confirms the real-world findings of Sinai et al. (1998), but also points to a role of self-motion in space perception.
Supported by a Sigma Xi Grant-in-Aid of Research; Grawemeyer Fellowship; SCO Research Funds