December 2022
Volume 22, Issue 14
Open Access
Vision Sciences Society Annual Meeting Abstract
Step by step - Walking shapes visual space
Author Affiliations & Notes
  • Michael Wiesing
    Institute for Experimental Psychology, Heinrich Heine University Düsseldorf, Universitätsstr. 1, 40225 Düsseldorf, Germany
  • Eckart Zimmermann
    Institute for Experimental Psychology, Heinrich Heine University Düsseldorf, Universitätsstr. 1, 40225 Düsseldorf, Germany
  • Footnotes
    Acknowledgements  Supported by the European Research Council (project moreSense, grant agreement no. 757184).
Journal of Vision December 2022, Vol.22, 3540. doi:https://doi.org/10.1167/jov.22.14.3540
      Michael Wiesing, Eckart Zimmermann; Step by step - Walking shapes visual space. Journal of Vision 2022;22(14):3540. https://doi.org/10.1167/jov.22.14.3540.



      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Visual depth perception is mostly understood as a purely visual problem that includes oculomotor processes. Yet any neural spatial map must be provided with information about how internal space scales external distances. Here, we show that the distance people walk to reach a target calibrates the visual perception of that distance. We used virtual reality to track physical walking distances while simultaneously manipulating visual optic flow in a realistic, ecologically valid virtual environment. Participants walked toward a briefly flashed target located 2.50 m in front of them. Unbeknownst to the participants, we manipulated the optic flow during walking. As a result, participants overshot the target location in trials in which optic flow was reduced and undershot it in trials in which optic flow was increased. After each walking trial, participants visually localized an object presented in front of them. We found a serial dependence between the optic flow speed and subsequent distance judgements. This serial dependence could be driven either purely visually, i.e., by the optic flow, or by the travel distance. To disentangle these two factors, we conducted two follow-up experiments. In the first, participants controlled their movement with a thumb stick instead of walking. Again, travel distances were modulated by the manipulated optic flow, but visually perceived distances were not. Finally, we isolated physical walking by eliminating optic flow during walking; here we observed no serial dependence of travel distances or subsequent distance judgements. In conclusion, our data reveal that visual depth perception is embodied and is calibrated every time we walk toward a target. Linking depth perception directly to the walked travel distance provides a computationally efficient means of calibration: since the sensorimotor system constantly monitors movement performance, these signals can be reused at the sole extra neural cost of being fed back to visual areas.
