Open Access
Vision Sciences Society Annual Meeting Abstract | August 2023
Tracking Perceptual Depth with Eye Vergence Movements in Real World, Augmented Reality, and Virtual Reality Environments
Author Affiliations
  • Mohammed Safayet Arefin
    DEVCOM US Army Research Laboratory
  • J. Edward Swan II
    Mississippi State University, USA
  • Russell Cohen Hoffing
    DEVCOM US Army Research Laboratory
  • Steven M. Thurman
    DEVCOM US Army Research Laboratory
Journal of Vision August 2023, Vol. 23, 5209. https://doi.org/10.1167/jov.23.9.5209
© ARVO (1962-2015); The Authors (2016-present)

Abstract

Completing tasks in the real world, augmented reality (AR), and virtual reality (VR) requires users to integrate visual information located at different distances. Oculomotor responses such as vergence and accommodation are engaged to bring objects and visual information into sharp focus. However, due to the vergence-accommodation conflict inherent to artificial displays, it is unclear whether depth-related eye movements in AR and VR unfold as they do in the real world. Our study continuously measured eye vergence with an eye tracker to track changes in perceptual depth across visually matched real-world, AR, and VR conditions. Participants were cued to shift their gaze to targets at one of four depths (0.25 m, 0.75 m, 1.5 m, and 4.0 m) and completed an alternative forced-choice perceptual task, reporting the direction of the gap in a Landolt C stimulus. Trial order was fully counterbalanced. Four physical monitors were placed at the experimental depths to present the real-world stimuli on-screen. We custom-mounted a Pupil Labs eye tracker on a Microsoft HoloLens 2 to collect eye-tracking data and to present the stimuli in AR and VR. In the VR condition, the HoloLens 2's optics were covered with a black cloth so that no external light entered. As expected, results showed that eye vergence angle increased measurably when gaze shifted far-to-near and decreased when gaze shifted near-to-far. Importantly, all three conditions produced a similar exponential relationship between vergence angle and depth, regardless of whether targets were presented on real-world displays or on the head-mounted display.
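
The steep falloff of vergence angle with depth follows directly from the geometry of binocular fixation: for interpupillary distance IPD and fixation depth d, the vergence angle is approximately 2·atan(IPD / (2d)). As a rough illustration, the Python sketch below computes the expected angle at each of the four experimental depths; the 63 mm IPD and symmetric midline fixation are illustrative assumptions, not parameters reported in the abstract.

    import math

    IPD_M = 0.063  # assumed average interpupillary distance (meters); not reported in the study

    def vergence_angle_deg(depth_m, ipd_m=IPD_M):
        # Geometric vergence angle (degrees) for symmetric binocular
        # fixation on a target at depth_m along the midline.
        return math.degrees(2.0 * math.atan(ipd_m / (2.0 * depth_m)))

    for depth in (0.25, 0.75, 1.5, 4.0):  # target depths from the abstract
        print(f"{depth:4.2f} m -> {vergence_angle_deg(depth):5.2f} deg")

Under these assumptions the predicted angle falls from roughly 14.4° at 0.25 m to about 0.9° at 4.0 m, illustrating the rapid near-field modulation that continuous vergence tracking is well suited to capture.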
