Vision Sciences Society Annual Meeting Abstract | August 2023 | Open Access
vrGazeCore: an open-source package for virtual reality eye-tracking analysis
Author Affiliations
  • Thomas L. Botch
    Dartmouth College
  • Amanda J. Haskins
    Dartmouth College
  • Deepasri Prasad
    Dartmouth College
  • Jeff Mentch
    Dartmouth College
    Massachusetts Institute of Technology
  • Caroline E. Robertson
    Dartmouth College
Journal of Vision August 2023, Vol. 23(9), 5206. https://doi.org/10.1167/jov.23.9.5206
Citation: Thomas L. Botch, Amanda J. Haskins, Deepasri Prasad, Jeff Mentch, Caroline E. Robertson; vrGazeCore: an open-source package for virtual reality eye-tracking analysis. Journal of Vision 2023;23(9):5206. https://doi.org/10.1167/jov.23.9.5206.

© ARVO (1962-2015); The Authors (2016-present)
Abstract

Virtual reality (VR) is increasingly used in combination with in-headset eye-tracking to study active vision in 360° environments. However, most software currently available to process raw data from in-headset eye-trackers is expensive and closed to researcher control, placing constraints on the flexibility of analysis. Further, immersive viewing conditions add complexity to traditional eye-tracking processing pipelines, requiring, for example, that researchers rectify head- and eye-centered coordinate systems to define gaze position. Here, we present vrGazeCore, an open-source package for both individual- and group-level analysis of eye-tracking data collected in 360° VR environments. As input, users provide eye position, in normalized screen coordinates, and head position (pitch, yaw, roll). vrGazeCore converts eye position to degrees visual angle (DVA) and rectifies eye and head position to derive gaze coordinates in 360° space while simultaneously automating quality assurance of gaze position (e.g., pupil detection confidence). Our package provides multiple derivative measures of gaze position for subsequent analysis: fixation metrics, trial-averaged fixation density maps, and data visualizations. The open-source nature of the package gives users direct access to all derivatives of gaze, as well as the ability to tailor analysis parameters to their needs. Using vrGazeCore, we demonstrate high precision and accuracy of fixations across two VR eye-tracking systems: the Oculus DK2 with Pupil Labs eye-tracking (accuracy: 2.00 DVA +/- 0.38 STE; precision: 0.26 DVA +/- 0.05 STE) and the HTC Vive Eye (accuracy: 2.63 DVA +/- 0.27 STE; precision: 0.16 DVA +/- 0.08 STE). We additionally provide a template Unity experiment, equipped with functions to collect and save eye-tracking data, to enable users to easily interface with vrGazeCore. Future extensions of vrGazeCore will expand beyond panoramic scenes to environments with translational motion, promoting studies of active navigation. In sum, vrGazeCore promotes the study of active vision in immersive scenes, offering a flexible, open-source package for analyzing eye-tracking data from head-mounted VR displays.
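To make the coordinate-rectification step concrete, the Python sketch below illustrates the underlying geometry; it is not vrGazeCore's actual implementation. The function names (eye_to_dva, gaze_in_360), the symmetric field-of-view parameter fov_deg, and the yaw-pitch-roll rotation order are all illustrative assumptions, and a real pipeline would additionally handle calibration offsets, asymmetric display frusta, and per-sample quality checks such as pupil detection confidence.

```python
import numpy as np

def eye_to_dva(eye_xy, fov_deg=(100.0, 100.0)):
    """Map normalized screen coordinates ([0, 1] x [0, 1], origin at
    bottom-left) to degrees visual angle relative to screen center.

    Assumes a symmetric per-eye field of view; real headsets have
    asymmetric frusta, so fov_deg is an illustrative parameter only.
    """
    x, y = eye_xy
    half_w = np.tan(np.radians(fov_deg[0] / 2.0))
    half_h = np.tan(np.radians(fov_deg[1] / 2.0))
    # Convert to tangent-plane offsets, then back to angles (degrees).
    dva_x = np.degrees(np.arctan((2.0 * x - 1.0) * half_w))
    dva_y = np.degrees(np.arctan((2.0 * y - 1.0) * half_h))
    return dva_x, dva_y

def gaze_in_360(dva_x, dva_y, pitch, yaw, roll):
    """Rotate the head-centered gaze vector by the head pose to get a
    world-referenced direction, returned as (longitude, latitude) in
    degrees over an equirectangular (360 x 180 degree) panorama.

    All angles in degrees. The roll -> pitch -> yaw application order
    and the axis sign conventions are assumptions; engines differ.
    """
    # Head-centered unit gaze vector (x right, y up, z forward).
    gx, gy = np.radians(dva_x), np.radians(dva_y)
    v = np.array([np.sin(gx) * np.cos(gy),
                  np.sin(gy),
                  np.cos(gx) * np.cos(gy)])

    p, yw, r = np.radians([pitch, yaw, roll])
    Rz = np.array([[np.cos(r), -np.sin(r), 0],
                   [np.sin(r),  np.cos(r), 0],
                   [0,          0,         1]])   # roll about z
    Rx = np.array([[1, 0,          0],
                   [0, np.cos(p), -np.sin(p)],
                   [0, np.sin(p),  np.cos(p)]])   # pitch about x
    Ry = np.array([[ np.cos(yw), 0, np.sin(yw)],
                   [ 0,          1, 0],
                   [-np.sin(yw), 0, np.cos(yw)]]) # yaw about y
    w = Ry @ Rx @ Rz @ v

    lon = np.degrees(np.arctan2(w[0], w[2]))            # -180..180, 0 = ahead
    lat = np.degrees(np.arcsin(np.clip(w[1], -1, 1)))   # -90..90
    return lon, lat
```

In this framing, trial-averaged fixation density maps reduce to smoothed 2D histograms of the resulting (longitude, latitude) fixation samples over the equirectangular panorama.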
