Abstract
Extensive work has shown that the visuo-motor system is critically sensitive to whether an object is in or out of reach, typically assessed using simple, controlled stimuli (e.g. a dot on a desk). Here, we extend this framework by examining people’s subjective reachability judgments using computer-generated pictures of everyday environments. To do so, we created 20 different indoor environments (e.g. kitchen, library) using virtual-reality software. All environments had the same physical dimensions, with an extended surface on the back wall (e.g. countertop, desk) containing a central object and surrounding objects. A series of 30 snapshot views was taken along a continuum from a full-scale scene view to a close-up view of the central object. In Experiment 1, participants were presented with a set of views and, for each view, judged whether it was in or out of reach. A single set consisted of 20 views (one from each environment, at a randomized point on the continuum), and overall we obtained 45 judgments for each view of every environment. Across all environments, the average position of 50% reachability was consistently judged to be position 10.4 (out of the 30 positions along the object-scene continuum; SD = 1.7). In Experiment 2, different participants were presented with these views but instead judged whether they would step forward, step backward, or stay in place to obtain a more comfortable view of the environment. Participants chose to step back from the close-up views and to step forward from the far views. Critically, the transition point between these options coincided with the transition point of perceived reachability (mean position = 10.5, SD = 2.4; t(19) = 1.1, n.s.). Overall, these results show that people can judge subjective reachability in pictures depicting everyday environments. Further, they suggest that the perceived reachability of a view may automatically factor into perceptual judgments of the environment.