Abstract
We have shown previously that when a simulated room scales in size around a person, they remain completely unaware of the scaling, ignoring stereo and motion cues in favour of an assumption of world stability (Glennerster, Tcheang, Gilson, Fitzgibbon and Parker, 2006). Here we investigated whether this phenomenon is unique to uniform scaling of an environment or applies to the more general case in which objects can move relative to both the observer and one another. Observers were free to move in a simulated environment consisting of checkerboard-textured spheres arranged in groups of four ("quads"); there were 1 to 4 quads on different trials. In a two-interval forced-choice experiment, observers identified, with a pointing response, the sphere that had been moved towards or away from them by 2 m (matched retinal angle). There were three viewing conditions: (1) isolated spheres; (2) pairs of spheres connected by a line "dipole"; (3) line dipoles that switched between interval one and interval two to connect the other sphere pair in a quad. Thus, in all conditions the 3D coordinate information was the same across intervals, but in one condition the scene itself changed between intervals. Performance always deteriorated when more spheres were presented but was unaffected by adding dipoles (condition 2) unless they switched between intervals (condition 3), in which case performance was significantly worse. In follow-up experiments, we showed that this degraded performance was near-eliminated if the spheres were visible during movement, and that performance improved dramatically when a sphere's changing retinal angle was visible as it moved. If observers had had access to the 3D coordinates of the spheres, all of the tasks would have been trivially easy to complete. We conclude that the primary information people used to tell whether the world around them had changed was the change in the cyclopean image.
Meeting abstract presented at VSS 2016