Due to technological advances in mobile eye tracking, head-mounted virtual reality displays, and motion capture, the investigation of gaze behavior within virtual environments is increasingly feasible. However, only a handful of studies have analyzed gaze behavior recorded through a head-mounted virtual reality display in order to study perception, action, and cognition in immersive virtual environments (Diaz, Cooper, Rothkopf, & Hayhoe, 2013; Duchowski, Medlin, Cournia, Gramopadhye, et al., 2002; Duchowski, Medlin, Cournia, Murphy, et al., 2002; Iorizzo, Riley, Hayhoe, & Huxlin, 2011). This is in part because the analysis of behavior within these environments presents a number of technical challenges. We therefore present this introductory guide to the use of eye-tracking technology in virtual environments, with the aim of reducing the time and cost of startup. In addition, we present methods for both the capture and the analysis of human gaze patterns during navigation of a virtual environment.
To aid in data capture, we provide libraries that record, in real time, the image stream presented to the subject, along with visual and text-based records of the data output by the eye-tracking software and the virtual reality environment (http://sourceforge.net/p/utdvrlibraries/). This video record of the experimental process affords experimenters immediate visual inspection of behavioral data, a numerical record for automated quantitative analysis, and a means of verifying that the data have been accurately interpreted and reconstructed following quantitative analysis.
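As a rough illustration of what such a text-based record contains, the sketch below logs one timestamped row of gaze and head-pose data per rendered frame. It is not code from the libraries linked above; tracker.get_gaze_pixel() and hmd.get_head_pose() are hypothetical stand-ins for whatever interfaces a given eye tracker and VR system expose.

    import csv
    import time

    def log_session(tracker, hmd, out_path, n_frames):
        # Write one timestamped row of gaze and head-pose data per rendered
        # frame. tracker.get_gaze_pixel() and hmd.get_head_pose() are
        # hypothetical placeholders, not part of any specific API.
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["frame", "time_s", "gaze_x_px", "gaze_y_px",
                             "head_x", "head_y", "head_z",
                             "head_yaw", "head_pitch", "head_roll"])
            t0 = time.monotonic()
            for frame in range(n_frames):
                gx, gy = tracker.get_gaze_pixel()   # 2-D gaze point on screen
                pos, ori = hmd.get_head_pose()      # position (m), orientation (deg)
                writer.writerow([frame, time.monotonic() - t0,
                                 gx, gy, *pos, *ori])

Keeping the frame index in both the video and the text record makes it straightforward to align the two streams during later inspection.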
We also provide users with several mathematical tools for transforming gaze data from pixel coordinates into forms commonly used in behavioral analysis. For example, we provide equations for calculating absolute gaze angles within a world-based reference frame. In this section, we also provide equations that compensate for head-mounted displays (HMDs) whose screens have been angled away from the fronto-parallel plane, a strategy sometimes used to increase the HMD field of view. In the next section, we present methods for calculating the angular distance from gaze to objects within the virtual environment. Finally, we present methodology for the identification of fixations, pursuits, and saccadic eye movements.
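To make these transformations concrete before the formal derivations, the following sketch converts a gaze point in screen pixels into a unit gaze vector and computes the angular distance from that vector to an object. It assumes a flat, fronto-parallel screen with a known resolution and field of view (the default values are arbitrary); the angled-screen compensation derived later is deliberately omitted, and both vectors are assumed to already be expressed in a common reference frame.

    import numpy as np

    def pixel_to_gaze_vector(px, py, res=(1280, 1024), fov_deg=(60.0, 48.0)):
        # Cast a ray through the pixel under a pinhole screen model:
        # +x right, +y up, +z into the scene. The screen is assumed flat and
        # fronto-parallel; resolution and field of view are illustrative.
        w, h = res
        fov_x, fov_y = np.radians(fov_deg)
        fx = (w / 2.0) / np.tan(fov_x / 2.0)   # focal length in pixels (horizontal)
        fy = (h / 2.0) / np.tan(fov_y / 2.0)   # focal length in pixels (vertical)
        v = np.array([(px - w / 2.0) / fx, (h / 2.0 - py) / fy, 1.0])
        return v / np.linalg.norm(v)

    def angular_distance_deg(gaze_vec, eye_pos, object_pos):
        # Angle (deg) between the gaze vector and the eye-to-object vector,
        # with both expressed in the same (e.g., world-based) frame.
        to_obj = np.asarray(object_pos, dtype=float) - np.asarray(eye_pos, dtype=float)
        to_obj /= np.linalg.norm(to_obj)
        cos_a = np.clip(np.dot(gaze_vec, to_obj), -1.0, 1.0)
        return np.degrees(np.arccos(cos_a))

The eye-movement classification can likewise be illustrated with a simple velocity-threshold scheme, continuing the sketch above: inter-sample angular velocities below a low threshold are labeled fixations, velocities above a high threshold are labeled saccades, and intermediate velocities are provisionally labeled pursuit. The thresholds below are common rule-of-thumb values, not the criteria developed later in this guide, and the scheme ignores the head-movement compensation that a world-based analysis requires.

    def classify_samples(gaze_vectors, times_s, fix_max=20.0, sacc_min=80.0):
        # Label each inter-sample interval by angular velocity (deg/s):
        # below fix_max -> fixation, above sacc_min -> saccade, otherwise
        # pursuit. Thresholds are illustrative rule-of-thumb values.
        labels = []
        for i in range(1, len(times_s)):
            cos_a = np.clip(np.dot(gaze_vectors[i - 1], gaze_vectors[i]), -1.0, 1.0)
            step_deg = np.degrees(np.arccos(cos_a))
            velocity = step_deg / (times_s[i] - times_s[i - 1])
            if velocity < fix_max:
                labels.append("fixation")
            elif velocity > sacc_min:
                labels.append("saccade")
            else:
                labels.append("pursuit")
        return labels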