August 2023
Volume 23, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2023
Retinal Slip from Self Motion Modulates the Perceptibility of Jitter in World-Locked Augmented Reality
Author Affiliations
  • Hope Lutwak
    Reality Labs, Meta Platforms Inc.
    New York University
  • T. Scott Murdison
    Reality Labs, Meta Platforms Inc.
  • Kevin W. Rio
    Reality Labs, Meta Platforms Inc.
Journal of Vision August 2023, Vol.23, 5880. doi:https://doi.org/10.1167/jov.23.9.5880
Hope Lutwak, T. Scott Murdison, Kevin W. Rio; Retinal Slip from Self Motion Modulates the Perceptibility of Jitter in World-Locked Augmented Reality. Journal of Vision 2023;23(9):5880. https://doi.org/10.1167/jov.23.9.5880.

© ARVO (1962-2015); The Authors (2016-present)
Abstract

The human visual system is highly attuned to differentiating egocentric retinal velocities from those that arise from independently moving objects. Augmented reality (AR) head-mounted displays (HMDs) challenge these mechanisms by tracking the wearer’s head motion and projecting world-locked (WL) virtual content. Spatiotemporal artifacts can occur during WL rendering in HMDs, including high-frequency random fluctuations in virtual content position (jitter). Perceptual sensitivity to AR WL jitter has been quantified for stationary observers (Wilmott et al., 2022). Importantly, that work did not account for the contributions of real-world references and imperfect retinal stabilization (e.g., pursuit, saccades, the vestibulo-ocular reflex). Here, we investigate how sensitivity to jitter varies as a function of observer movement and virtual content placement. We hypothesize (1) that as observers move their head and eyes during self motion, their sensitivity to added retinal velocity from jitter will decrease, and (2) that rendering virtual content near real-world surfaces will increase jitter sensitivity, given proximal veridical 3D cues. We measured sensitivity to added jitter on a textured cube (0.2 m side length) versus a no-jitter reference in three motion conditions (stationary, head rotation, and walking) and three virtual content placement conditions (floating, on a desk, and against a wall). Using a commercially available AR HMD with eye tracking (HoloLens 2), we asked participants to determine which cube moved more in a two-interval forced-choice task. Psychometric thresholds indicated that participants were up to 3× less sensitive to added jitter during self motion than when they were stationary. To generalize beyond distinct user motion and content placement conditions, we also analyzed eye movement data. We found that the amount of retinal slip (i.e., how much gaze drifted across the object) predicted jitter thresholds better than measured head movements alone, suggesting a gaze-driven decrease in object motion sensitivity during self motion.
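Psychometric thresholds from a two-interval forced-choice task like the one above are commonly estimated by fitting a cumulative Gaussian to the proportion of correct responses at each stimulus level. The sketch below is illustrative only: the jitter levels, response proportions, and 75%-correct criterion are our assumptions, not details reported in the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(x, mu, sigma):
    # 2IFC performance runs from 0.5 (chance) to 1.0 (perfect)
    return 0.5 + 0.5 * norm.cdf(x, loc=mu, scale=sigma)

# Hypothetical data: added jitter amplitude (arbitrary units) vs.
# proportion of trials on which the jittered cube was correctly chosen
jitter = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
p_correct = np.array([0.52, 0.60, 0.74, 0.91, 0.98])

(mu, sigma), _ = curve_fit(psychometric, jitter, p_correct, p0=[2.0, 1.0])

# With this parameterization, the 75%-correct threshold is simply mu,
# the point where the underlying cumulative Gaussian crosses 0.5
threshold = mu
```

Comparing such thresholds across the motion and placement conditions is what would reveal the up-to-3× sensitivity difference the authors report.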
