Abstract
Virtual Reality (VR) is gaining momentum in vision science due to its unprecedented potential to study the behavior of active participants in naturalistic yet well-controlled environments. This approach is further facilitated by the growing availability of VR devices with built-in eye tracking, which allow simultaneous recording of gaze and movement behavior. Rendering engines such as Vizard, Unity, and Unreal have made it simple to create virtual environments, but most come ill-equipped out of the box to handle the typical structure of a sensorimotor experiment, such as the looped presentation of structured behavioral 'trials' and the capture of continuous behavioral data. This leads to code duplication and re-implementation of the same components across labs and hampers open science and reproducibility efforts. Recently, some research groups have published toolboxes for the Unity platform that facilitate the creation of behavioral experiments. Based on experience from our lab, here we present a similar toolbox for the Vizard VR platform, which is widely used in perception and action research. Using our toolbox, a researcher can implement a behavioral experiment, including eye and hand movement recording, with comparatively little Python code. The experimental structure can be generated programmatically or imported from a file, and the toolbox takes care of randomizing, presenting, and recording trials. Results are saved in standardized file formats such as CSV and JSON for ease of analysis. Additionally, our toolbox allows for easy online recording and calibration of gaze and motion-tracking data using any hardware device supported by Vizard. We highlight the structure and central features of our toolbox using the example of a goal-directed reaching task in VR, and we release the toolbox code as open source.
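To illustrate the kind of experiment structure the toolbox encapsulates, the sketch below shows a minimal Vizard script with a randomized trial loop and CSV/JSON output. The viz and viztask calls are part of Vizard's standard API; everything else (the trial dictionaries, file names, and keypress response) is a hypothetical stand-in for illustration and does not reflect the toolbox's actual interface.

```python
import csv
import json
import random

import viz
import viztask


def run_experiment():
    # Hypothetical trial list; could equally be imported from a CSV or JSON file.
    trials = [{'target_pos': [-0.3, 1.2, 1.0]},
              {'target_pos': [0.3, 1.2, 1.0]}]
    random.shuffle(trials)  # randomized presentation order

    results = []
    for index, trial in enumerate(trials):
        target = viz.addChild('ball.wrl')  # Vizard sample model
        target.setPosition(trial['target_pos'])

        start = viz.tick()
        yield viztask.waitKeyDown(' ')     # stand-in for a reach response
        results.append({'trial': index,
                        'target_x': trial['target_pos'][0],
                        'rt': viz.tick() - start})
        target.remove()

    # Save results in the standardized formats mentioned above.
    with open('results.json', 'w') as f:
        json.dump(results, f, indent=2)
    with open('results.csv', 'w') as f:
        writer = csv.DictWriter(f, fieldnames=['trial', 'target_x', 'rt'])
        writer.writeheader()
        writer.writerows(results)


viz.go()
viztask.schedule(run_experiment())
```

In the toolbox itself, this boilerplate (trial randomization, the presentation loop, continuous data capture, and file output) is handled by the library, so the researcher mainly specifies trial parameters and task-specific logic.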