September 2021
Volume 21, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2021
Body posture affects the perception of visually simulated self-motion
Author Affiliations & Notes
  • Bjoern Joerges
    Center for Vision Research, York University
  • Nils Bury
    Center for Vision Research, York University
  • Meaghan McManus
    Center for Vision Research, York University
  • Robert S. Allison
    Center for Vision Research, York University
  • Michael Jenkin
    Center for Vision Research, York University
  • Laurence R. Harris
    Center for Vision Research, York University
  • Footnotes
    Acknowledgements  We acknowledge the generous support of the Canadian Space Agency (15ILSRA1-York).
Journal of Vision September 2021, Vol.21, 2301. doi:https://doi.org/10.1167/jov.21.9.2301
      Bjoern Joerges, Nils Bury, Meaghan McManus, Robert S. Allison, Michael Jenkin, Laurence R. Harris; Body posture affects the perception of visually simulated self-motion. Journal of Vision 2021;21(9):2301. https://doi.org/10.1167/jov.21.9.2301.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Perceiving one’s self-motion is a multisensory process that involves integrating visual, vestibular, and other cues. The perception of self-motion can be elicited in a stationary observer by visual cues alone (vection). In this case, optic flow compatible with self-motion may be countered by conflicting vestibular cues signalling that the body is not accelerating. Since vestibular cues are less reliable when lying down (Fernandez & Goldberg, 1976), such conflicting cues might bias the self-motion percept less when lying down than when upright. To test this hypothesis, we immersed 20 participants in a virtual-reality hallway environment and presented targets at different distances ahead of them. The targets then disappeared, and participants experienced optic flow simulating constant-acceleration, straight-ahead self-motion. They pressed a button when they felt they had reached the position of the previously viewed target. Participants also performed a task that assessed biases in distance perception: we showed them virtual boxes at different simulated distances, and on each trial they judged whether the box was taller or shorter than a reference ruler held in their hands. Perceived distance can then be inferred from biases in perceived size. They performed both tasks sitting upright and lying supine. Participants needed less optic flow to perceive that they had reached the target’s position when supine than when sitting (i.e., they perceived they had travelled further; by 4.8%, bootstrapped 95% CI = [3.5%; 6.4%], determined using linear mixed modelling). Participants also judged objects as larger (compatible with perceiving them as closer) when upright than when supine (by 2.5%, 95% CI = [0.03%; 4.6%], as above). The bias in travelled distance thus cannot be reduced to a bias in perceived distance.
These results suggest that vestibular cues impact self-motion distance perception, as they do heading judgements (MacNeilage, Banks, DeAngelis & Angelaki, 2010), even when the task could be solved with visual cues alone.
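The posture effects above were estimated with linear mixed models and bootstrapped 95% confidence intervals. As a rough, hypothetical illustration of the confidence-interval step only (this is not the authors’ analysis pipeline, and the per-participant values below are invented for the example), a percentile bootstrap over participant-level effects could look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-participant posture effects (% difference, supine vs. upright);
# illustrative values only -- not data from the study.
diffs = rng.normal(loc=4.8, scale=2.0, size=20)

def bootstrap_ci(x, n_boot=10_000, alpha=0.05, seed=1):
    """Percentile bootstrap CI for the mean of x, resampling participants."""
    rng = np.random.default_rng(seed)
    means = np.array([rng.choice(x, size=len(x), replace=True).mean()
                      for _ in range(n_boot)])
    lo, hi = np.quantile(means, [alpha / 2, 1 - alpha / 2])
    return lo, hi

lo, hi = bootstrap_ci(diffs)
print(f"mean difference = {diffs.mean():.1f}%, 95% CI = [{lo:.1f}%, {hi:.1f}%]")
```

In the study itself the intervals were obtained from linear mixed models, which additionally account for repeated measures within participants; the percentile-bootstrap idea sketched here is the same resample-and-take-quantiles logic.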
