Abstract
Virtual reality (VR) is increasingly used in combination with in-headset eye-tracking to study active vision in 360° environments. However, most software currently available to process raw data from in-headset eye-trackers is expensive and closed to researcher control, placing constraints on the flexibility of analysis. Further, immersive environments add complexity to traditional eye-tracking processing pipelines, requiring, for example, that researchers rectify head- and eye-centered coordinate systems to define gaze position. Here, we present vrGazeCore, an open-source package for both individual- and group-level analysis of eye-tracking data collected in 360° VR environments. As input, users provide eye position in normalized screen coordinates and head position (pitch, yaw, roll). vrGazeCore converts eye position to degrees of visual angle (DVA) and rectifies eye and head position to derive gaze coordinates in 360° space, while simultaneously automating quality assurance of gaze position (e.g., filtering by pupil-detection confidence). Our package provides multiple derivative measures of gaze position for subsequent analysis: fixation metrics, trial-averaged fixation density maps, and data visualizations. Because the package is open source, users have direct access to all derivatives of gaze and can tailor analysis parameters to their needs. Using vrGazeCore, we demonstrate high precision and accuracy of fixations across two VR eye-tracking systems: Oculus DK2 with Pupil Labs eye-tracking (accuracy: 2.00 DVA ± 0.38 STE; precision: 0.26 DVA ± 0.05 STE) and HTC Vive Eye (accuracy: 2.63 DVA ± 0.27 STE; precision: 0.16 DVA ± 0.08 STE). We additionally provide a template Unity experiment, equipped with functions to collect and save eye-tracking data, so that users can easily interface with vrGazeCore. Future extensions of vrGazeCore will expand beyond panoramic scenes to environments with translational motion, promoting studies of active navigation. In sum, vrGazeCore promotes the study of active vision in immersive scenes, offering a flexible, open-source package for analyzing eye-tracking data from head-mounted VR displays.
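To make the core rectification step concrete, the sketch below illustrates how normalized eye coordinates might be mapped to DVA and composed with head orientation to yield gaze in 360° (equirectangular) space. This is not vrGazeCore's actual API: the function names, field-of-view values, and axis conventions are illustrative assumptions, and the package's own implementation may differ.

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Assumed field of view of the headset's virtual screen, in degrees
# (illustrative value, not a vrGazeCore constant).
FOV_H, FOV_V = 90.0, 90.0

def norm_to_angles(x_norm, y_norm):
    """Map normalized screen coordinates (0-1, origin at lower left)
    to eye-in-head angles in degrees of visual angle (DVA),
    assuming a linear mapping across the field of view."""
    eye_yaw = (x_norm - 0.5) * FOV_H    # horizontal offset from screen center
    eye_pitch = (y_norm - 0.5) * FOV_V  # vertical offset from screen center
    return eye_yaw, eye_pitch

def gaze_to_world(eye_yaw, eye_pitch, head_pitch, head_yaw, head_roll):
    """Rotate the eye-in-head gaze vector by the head orientation to get
    a gaze direction in world-centered 360-degree coordinates."""
    # Unit gaze vector in the head frame (+z forward, +x right, +y up).
    ey, ep = np.radians(eye_yaw), np.radians(eye_pitch)
    gaze_head = np.array([np.sin(ey) * np.cos(ep),
                          np.sin(ep),
                          np.cos(ey) * np.cos(ep)])
    # Head orientation as intrinsic yaw-pitch-roll rotations
    # (an assumed convention; headset SDKs vary).
    r_head = Rotation.from_euler("YXZ", [head_yaw, head_pitch, head_roll],
                                 degrees=True)
    gx, gy, gz = r_head.apply(gaze_head)
    # Back to spherical angles: yaw in [0, 360), pitch in [-90, 90].
    gaze_yaw = np.degrees(np.arctan2(gx, gz)) % 360.0
    gaze_pitch = np.degrees(np.arcsin(np.clip(gy, -1.0, 1.0)))
    return gaze_yaw, gaze_pitch
```

Under these assumptions, a centered eye sample (0.5, 0.5) with a head yaw of 90° yields a gaze yaw of 90° and pitch of 0°. In practice, samples would first be screened by pupil-detection confidence, consistent with the quality-assurance step described above.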