Volume 24, Issue 10 | Open Access
Vision Sciences Society Annual Meeting Abstract | September 2024
Investigating finger-tapping and pupillometry as potential indicators of presence in VR
Author Affiliations
  • Sean Hinkle
    University of Central Florida
  • Shayna Soares
    University of Central Florida
  • Robert Bacon
    University of Central Florida
  • Corey Bohil
    Lawrence Technological University
Journal of Vision September 2024, Vol.24, 706. doi:https://doi.org/10.1167/jov.24.10.706
Abstract

Spatial presence in VR, the feeling of ‘being there,’ is linked to outcomes in clinical, training, education, and entertainment applications. Overreliance on survey measures has hampered progress, and few generalizable alternatives exist. The field has tested physiological, neuroimaging, and behavioral measures in search of continuous and objective indicators of presence. In two studies we evaluated finger-tapping and pupillometry as potential indicators of presence. We predicted that variance in inter-tap intervals (ITIs) and pupil size would predict presence, and that a neural-net classifier would be able to distinguish high- from low-presence conditions at the individual-subject level. In Experiment 1, participants walked the “virtual plank” while tapping to a rhythm, either at height or on the ground, to manipulate presence. Surveys confirmed that the heights manipulation affected presence (p = .04); ITI variance did not follow this pattern (p = .375). A feedforward neural-net classifier was trained on tapping and pupillometry data at the individual level. For finger-tapping, the classifier identified the condition of four-second windows of finger-position data at 77% accuracy. Pupillometry data yielded 70% accuracy, but a lighting confound weakened our conclusions. In Experiment 2, participants watched two 360-degree videos twice, with or without sound, to manipulate presence while controlling global luminance. Each video was analyzed separately. Surveys confirmed that sound increased presence for both videos (all ps < .05), but pupil-size variance did not follow this pattern. The neural-net classifier did not replicate the high accuracy of Experiment 1, reaching only 57% and 55%. Our results demonstrate that finger-tapping is a promising indicator of presence in VR and is especially sensitive when analyzed with a neural-net classifier. While results for pupillometry are mixed, we believe that pupillometry and other eye-tracking metrics merit further investigation with more refined machine-learning methods, potentially in combination with finger-tapping.
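
The abstract does not report implementation details for the classifier. The sketch below (Python) is one minimal way a per-subject feedforward classifier on four-second windows of finger-position data could be set up; the sampling rate, network architecture, choice of scikit-learn, and the synthetic traces are all assumptions for illustration, not the authors' method.

```python
# Minimal sketch of per-subject window classification (high vs. low presence).
# Assumptions: 90 Hz sampling, scikit-learn MLPClassifier as the feedforward
# network, and synthetic traces standing in for real finger-position data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

SAMPLE_RATE_HZ = 90          # assumed controller/headset sampling rate
WINDOW_SECONDS = 4           # abstract: four-second windows
WINDOW_SAMPLES = SAMPLE_RATE_HZ * WINDOW_SECONDS

def make_windows(position_trace, label):
    """Slice one subject's 1-D finger-position trace into fixed-length windows."""
    n = len(position_trace) // WINDOW_SAMPLES
    windows = position_trace[:n * WINDOW_SAMPLES].reshape(n, WINDOW_SAMPLES)
    return windows, np.full(n, label)

# Synthetic stand-in data: one subject, one trace per condition
# (0 = ground / low presence, 1 = heights / high presence).
rng = np.random.default_rng(0)
ground_trace = rng.normal(0.0, 1.0, SAMPLE_RATE_HZ * 120)
heights_trace = rng.normal(0.0, 1.2, SAMPLE_RATE_HZ * 120)

X_ground, y_ground = make_windows(ground_trace, 0)
X_heights, y_heights = make_windows(heights_trace, 1)
X = np.vstack([X_ground, X_heights])
y = np.concatenate([y_ground, y_heights])

# Hold out windows for evaluation within this subject.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("window-level accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In practice each participant would contribute their own recorded traces per condition, and accuracy would be assessed on held-out windows for that participant, analogous to the 77% (finger-tapping) and 70% (pupillometry) figures reported above.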
