Abstract
Completing tasks in the real world, augmented reality (AR), and virtual reality (VR) requires users to integrate visual information located at different distances. Oculomotor responses such as vergence and accommodation are engaged to bring objects and visual information into sharp focus. However, due to the vergence-accommodation conflict inherent to stereoscopic displays, it is unclear whether depth-related eye movements in AR and VR unfold as they do in the real world. Our study continuously measured eye vergence with an eye tracker to track perceptual depth changes in visually matched real-world, AR, and VR conditions. Participants were cued to shift their gaze to targets at one of four depths (0.25 m, 0.75 m, 1.5 m, and 4.0 m) and completed an alternative forced-choice perceptual task, reporting the direction of the gap in a Landolt C stimulus. Trial order was fully counterbalanced. Four physical monitors placed at the experimental depths presented the real-world stimuli on-screen. A Pupil Labs eye tracker was custom-mounted on a Microsoft HoloLens 2 to collect eye-tracking data and to present the stimuli in AR and VR. In the VR condition, the HoloLens 2's optics were covered with a black cloth so that no external light entered. As expected, results showed that the eye vergence angle increased measurably when gaze shifted far-to-near and decreased when it shifted near-to-far. Importantly, all three conditions yielded a similar exponential relationship between vergence angle and depth, regardless of whether the targets were presented on real-world displays or on a head-mounted display.
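For reference, the geometric relationship between fixation depth and vergence angle can be sketched in a few lines of Python. This is a minimal illustration, not the study's analysis pipeline; the interpupillary distance used (63 mm) is an assumed typical adult value, not a quantity reported in the abstract:

```python
import math

def vergence_angle_deg(depth_m: float, ipd_m: float = 0.063) -> float:
    """Geometric vergence angle (in degrees) for a target fixated
    straight ahead at depth_m metres, given an interpupillary
    distance of ipd_m metres. Each eye rotates inward by
    atan((ipd/2) / depth), so the total vergence angle is twice that."""
    return math.degrees(2 * math.atan((ipd_m / 2) / depth_m))

# Expected vergence angles at the four experimental depths,
# assuming a 63 mm IPD (an illustrative value, not measured data):
for depth in (0.25, 0.75, 1.5, 4.0):
    print(f"{depth:>4} m -> {vergence_angle_deg(depth):5.2f} deg")
```

Under this assumption, the four experimental depths correspond to roughly 14.4, 4.8, 2.4, and 0.9 degrees of vergence, illustrating the steep change in the near range and the flattening toward far targets that the abstract describes.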