Vision Sciences Society Annual Meeting Abstract  |   August 2023
Volume 23, Issue 9 (Open Access)
Allocentric spatial representations dominate when switching between real and virtual worlds
Author Affiliations & Notes
  • Meaghan McManus
    Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany
  • Franziska Seifert
    Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany
  • Immo Schütz
    Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany
  • Katja Fiehler
    Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany
  • Footnotes
    Acknowledgements  Funding: This work was supported by the German Research Foundation (DFG) grant FI 1567/6–1 TAO (“The active observer”) and by “The Adaptive Mind,” funded by the Excellence Program of the Hessian Ministry of Higher Education, Research, Science and the Arts.
Journal of Vision August 2023, Vol.23, 5053. doi:https://doi.org/10.1167/jov.23.9.5053
      Meaghan McManus, Franziska Seifert, Immo Schütz, Katja Fiehler; Allocentric spatial representations dominate when switching between real and virtual worlds. Journal of Vision 2023;23(9):5053. https://doi.org/10.1167/jov.23.9.5053.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

After leaving virtual reality we can be surprised to find that we are facing a different direction than expected. How well can people maintain an accurate representation of one environment while immersed in another? Previous studies in virtual reality (VR) have found that allocentric cues contribute ~50% to the reach location of a remembered object when other objects are misaligned. Here we were interested in pointing locations when the whole environment was misaligned. Participants encoded the locations of target objects in one environment, virtual or real-world, and then switched to the other environment, where the targets were not visible. Participants were asked to point to the remembered location of the targets in the first environment while immersed in the second. The virtual environment could be aligned with the real world or misaligned by ±2.6° in pitch, below perceptual threshold. We found an interaction between pitch and order (encoding in the real world with pointing in VR, or vice versa). The pitch of the virtual environment strongly influenced pointing locations while in VR but had little effect on pointing locations in the real world. There was a strong allocentric effect on pointing locations in both VR and the real world. We reran the experiment using a pitch of ±20°. We hypothesized that when the misalignment was obvious, participants would reduce their reliance on allocentric cues. However, the results were unchanged, with a strong allocentric influence on pointing in both VR and the real world. When the whole environment is misaligned, pointing locations are based primarily on the relationship between the objects in the current environment. The findings suggest that when switching between real and virtual environments, people might maintain only an allocentric representation of the previously seen environment.
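
The roughly 50% allocentric contribution mentioned above is usually framed as a weighted combination of allocentric and egocentric cues, with the environment misalignment shifting the allocentric prediction. The Python sketch below is only an illustration of that weighting scheme, not the authors' analysis pipeline; the trial count, pitch values, noise level, and the simulated observer's weight of 0.8 are hypothetical.

import numpy as np

def fit_allocentric_weight(pointing, egocentric_pred, allocentric_pred):
    # Linear cue-combination model:
    #   pointing ≈ w * allocentric_pred + (1 - w) * egocentric_pred
    # Rearranged so w can be estimated by least squares through the origin:
    #   (pointing - egocentric_pred) ≈ w * (allocentric_pred - egocentric_pred)
    x = allocentric_pred - egocentric_pred
    y = pointing - egocentric_pred
    return float(np.dot(x, y) / np.dot(x, x))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_trials = 200
    # Hypothetical remembered target elevations (degrees) from the first environment.
    target_elev = rng.uniform(-10.0, 10.0, n_trials)
    # Pitch misalignment of the second environment on each trial (e.g., ±2.6° or ±20°).
    pitch = rng.choice([-2.6, 2.6], n_trials)
    egocentric_pred = target_elev               # location relative to the observer
    allocentric_pred = target_elev + pitch      # location consistent with the pitched scene
    # Simulated observer who weights allocentric cues at 0.8, plus motor noise.
    pointing = 0.8 * allocentric_pred + 0.2 * egocentric_pred + rng.normal(0.0, 1.0, n_trials)
    w_hat = fit_allocentric_weight(pointing, egocentric_pred, allocentric_pred)
    print(f"estimated allocentric weight: {w_hat:.2f}")

Under this toy model, an estimated weight near 1 corresponds to pointing that follows the pitched scene (the pattern reported here while in VR), whereas a weight near 0 corresponds to pointing anchored to the observer's unchanged egocentric reference.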
