Abstract
Visual depth perception is mostly understood as a purely visual problem, supplemented by oculomotor processes. Yet any neural spatial map must be supplied with information about how internal space scales to external distances. Here, we show that the distance people walk to reach a target calibrates the visual perception of that distance. We used virtual reality to track physical walking distances while simultaneously manipulating visual optic flow in a realistic, ecologically valid virtual environment. Participants walked toward a briefly flashed target located 2.50 m in front of them. Unbeknownst to the participants, we manipulated the optic flow during walking. As a result, participants overshot the target location in trials in which optic flow was reduced and undershot it in trials in which optic flow was increased. After each walking trial, participants visually localized an object presented in front of them. We found a serial dependence between the optic flow speed and subsequent distance judgements. This serial dependence could be driven either purely visually, i.e., by the optic flow, or by the walked travel distance. To disentangle these two factors, we conducted two follow-up experiments. In the first, participants controlled their movement via a thumbstick instead of walking. Again, travel distances were modulated by the manipulated optic flow, but visually perceived distances were not. Finally, we isolated physical walking by eliminating optic flow during walking. In this case, we observed no serial dependence in travel distances or in subsequent distance judgements. In conclusion, our data reveal that visual depth perception is embodied and is calibrated every time we walk toward a target. Linking depth perception directly to walked travel distance provides a computationally efficient means of calibration. Because the sensorimotor system constantly monitors movement performance, these signals can be exploited at the sole additional neural cost of feeding them back to visual areas.