Abstract
Technologies on the mixed reality continuum, such as virtual reality (VR), commonly produce distortions in perceived distance. One source of these distortions is the vergence-accommodation conflict, in which the eyes' accommodative state is locked to the fixed focal distance of the headset's screen while the angle at which the two eyes converge in virtual space continuously updates. The current study conceptualizes the effect of the vergence-accommodation conflict as a constant outward offset to the vergence angle of approximately 0.2°. Based on this conceptualization, a novel model was developed to predict and account for the resulting distance distortions in VR using stereoscopic viewing geometry. Leveraging this model, an inverse transformation along the observer's line of sight was applied to the rendered virtual environment to counteract the vergence offset. To test the effect of the transformation, participants performed a series of manual pointing movements on a tabletop with and without the inverse transformation. Results showed that participants increasingly undershot the targets when the inverse transformation was not applied, but were consistently more accurate when it was. These results indicate that systematically transforming the rendered virtual environment based on perceptual geometry can ameliorate distance distortions arising from the vergence-accommodation conflict. The findings of the present study could inform the design of VR applications, such as medical and surgical training, where accurate interaction with virtual objects is critical.
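
As a rough illustration of the geometry the abstract describes, the Python sketch below computes the distance percept implied by a constant vergence-angle offset and the compensating render distance along the line of sight. It is a minimal sketch, not the paper's implementation: the interpupillary distance value, all function names, and the sign convention of the offset are assumptions (the sign here is chosen so that a positive offset compresses perceived distance, consistent with the undershooting the study reports).

import numpy as np

# Assumed parameters (not taken from the paper, except the ~0.2 deg
# offset stated in the abstract).
IPD = 0.063                # interpupillary distance (m), assumed typical value
DELTA = np.radians(0.2)    # constant vergence-angle offset (rad)

def vergence_angle(d, ipd=IPD):
    # Vergence angle (rad) when fixating a point at egocentric distance d.
    return 2.0 * np.arctan(ipd / (2.0 * d))

def distance_from_vergence(theta, ipd=IPD):
    # Egocentric distance implied by a vergence angle theta (rad).
    return ipd / (2.0 * np.tan(theta / 2.0))

def perceived_distance(d, delta=DELTA):
    # Distance percept when the effective vergence angle is offset by delta.
    # Sign convention is an assumption of this sketch: delta > 0 compresses
    # perceived distance, matching the undershooting reported in the study.
    return distance_from_vergence(vergence_angle(d) + delta)

def corrected_render_distance(d, delta=DELTA):
    # Inverse transformation: distance at which to render a target intended
    # at d so that the offset-distorted percept lands back at d.
    return distance_from_vergence(vergence_angle(d) - delta)

def apply_inverse_transform(eye, point, delta=DELTA):
    # Move a scene point along the eye-to-point line of sight so that its
    # perceived distance matches the intended distance.
    eye, point = np.asarray(eye, float), np.asarray(point, float)
    ray = point - eye
    d = np.linalg.norm(ray)
    return eye + ray * (corrected_render_distance(d, delta) / d)

# Example: a tabletop target 0.6 m away is perceived roughly 2 cm closer
# under the offset; rendering it at the corrected distance restores 0.6 m.
print(perceived_distance(0.6))                              # ~0.58
print(perceived_distance(corrected_render_distance(0.6)))   # ~0.60

Because the offset is modeled as constant in angle, the correction is exactly invertible in closed form: the rendered vergence angle is simply shifted by the offset in the opposite direction, so the distorted percept lands back at the intended distance.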