December 2023
Volume 23, Issue 15
Open Access
Optica Fall Vision Meeting Abstract
Poster Session I: Perspective-correct rendering for active observers
Author Affiliations
  • Phillip Guan
    Meta Reality Labs Research
  • Eric Penner
    Meta Reality Labs Research
  • Joel Hegland
    Meta Reality Labs Research
  • Benjamin Letham
    Meta
  • Douglas Lanman
    Meta Reality Labs Research
Journal of Vision December 2023, Vol.23, 32. doi:https://doi.org/10.1167/jov.23.15.32
Citation: Phillip Guan, Eric Penner, Joel Hegland, Benjamin Letham, Douglas Lanman; Poster Session I: Perspective-correct rendering for active observers. Journal of Vision 2023;23(15):32. https://doi.org/10.1167/jov.23.15.32.

© ARVO (1962-2015); The Authors (2016-present)

Abstract

Stereoscopic, head-tracked display systems can show users realistic, world-locked virtual objects and environments (i.e., rendering perspective-correct binocular images with accurate motion parallax). However, discrepancies between the rendering pipeline and the physical viewing conditions can lead to perceived instability in the rendered content, resulting in reduced immersion and, potentially, visually induced motion sickness. The precise requirements for perceptually stable world-locked rendering (WLR) are unknown because of the challenge of constructing a wide field-of-view, distortion-free display with highly accurate head and eye tracking. We present a custom-built system that satisfies these constraints and can render virtual objects over real-world references without perceivable drift. We use this platform to study acceptable errors in render camera position for WLR in augmented and virtual reality scenarios, where we find an order-of-magnitude difference in perceptual sensitivity between the two. We conclude with an analytic model that examines how apparent depth and visual direction change in response to camera displacement errors, highlighting visual direction as a potentially important consideration for WLR alongside depth errors from incorrect disparity.
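To make the geometry behind such an analysis concrete, the sketch below (Python, not the authors' code or model) works through a minimal top-down example. It assumes the headset reproduces each rendered pixel's visual direction at the eye, so when both render cameras are displaced from the true eye positions, each eye sees the target along the ray from its displaced camera to the target, anchored at the eye's actual position. The interocular distance, target location, and 5 mm lateral error are illustrative values only.

import numpy as np

def intersect_lines(o_l, d_l, o_r, d_r):
    # Intersection of the two 2-D lines o + t * d (coordinates: x = lateral, z = depth).
    ts = np.linalg.solve(np.column_stack([d_l, -d_r]), o_r - o_l)
    return o_l + ts[0] * d_l

def perceived_point(target, ipd=0.063, cam_error=(0.0, 0.0)):
    # Apparent 2-D location of `target` when both render cameras are offset by
    # `cam_error` (meters): each eye's line of sight takes its direction from the
    # displaced render camera but originates at the eye's true position.
    cam_error = np.asarray(cam_error)
    eyes = [np.array([-ipd / 2, 0.0]), np.array([+ipd / 2, 0.0])]  # true eye positions
    dirs = [target - (eye + cam_error) for eye in eyes]            # directions from displaced cameras
    return intersect_lines(eyes[0], dirs[0], eyes[1], dirs[1])

target = np.array([0.0, 1.0])                                # object 1 m straight ahead
apparent = perceived_point(target, cam_error=(0.005, 0.0))   # 5 mm lateral camera displacement
depth_error_mm = 1000 * (apparent[1] - target[1])
direction_error_deg = np.degrees(np.arctan2(apparent[0], apparent[1]))
print(f"apparent point (x, z): {apparent}")
print(f"depth error: {depth_error_mm:.2f} mm, visual direction error: {direction_error_deg:.2f} deg")

In this toy configuration, a common lateral displacement of both render cameras leaves apparent depth essentially unchanged but rotates the target's visual direction by roughly 0.3 degrees, which is one way to see why visual direction can matter for WLR alongside disparity-driven depth errors.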

Footnotes
Funding: None