Abstract
INTRODUCTION: In our previous experiment (J Vis. 2022; 22(14):3310), we demonstrated that people cannot reliably infer an object's distance from its visual size and position in a 2D scene viewed on a computer screen. Here we repeated the experiment with the visual stimuli projected onto a wall, fully displaying the simulated scene at natural size with its ground plane aligned with the floor. Might such a realistically scaled environment improve people's distance perception in a 2D scene?

METHOD: Participants sat in a chair with their chin on a chinrest and viewed a hallway scene projected on a wall 2 m away. They were asked to imagine they were looking into an actual hallway. Participants held a reference object (a cereal box) in their hands and compared its size to the image of a corresponding object in the scene. They adjusted either the scene object's SIZE based on its position in the scene (position-to-size task) or its POSITION based on its visual size (size-to-position task) to match the reference size. They performed the tasks either binocularly or monocularly.

RESULTS: In general, the adjusted target position was consistent with the object's size. The adjusted size, however, was consistently larger than the geometrically correct size when viewed binocularly but not monocularly.

DISCUSSION: These results suggest that participants were more accurate at judging size from position when conflicting stereo cues were removed, despite reporting that using one eye was more difficult. Stereoscopic vision conflicts with reality when viewing a 2D simulation of a 3D scene and may underlie the errors we observed when judging the size of an object. The fact that we observed errors only when determining size, and not position, supports the idea that size and distance perception rely on different mechanisms, as suggested in our previous paper (Kim & Harris, 2022, Vision 6, 25).
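Note on "geometrically correct" size: the abstract does not spell out the rendering geometry, but under a simple pinhole-projection assumption, with the projection wall at viewing distance D (here 2 m) and the target simulated at distance d > D from the observer, an object of physical size S (the cereal box) should be drawn on the wall at

\[ s_{\mathrm{wall}} = S\,\frac{D}{d}, \qquad \theta = 2\arctan\!\left(\frac{S}{2d}\right), \]

where s_wall is the required image size on the wall and θ is the visual angle the object should subtend. This is an illustrative relation only; the symbols D, d, S, and s_wall are not taken from the abstract itself.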