September 2021, Volume 21, Issue 9 (Open Access)
Vision Sciences Society Annual Meeting Abstract
A Toolbox for Perception and Action Experiments Using the Vizard VR Platform
A Toolbox for Perception and Action Experiments Using the Vizard VR Platform
Authors and Affiliations
  • Immo Schuetz, Justus Liebig University Giessen
  • Harun Karimpur, Justus Liebig University Giessen
  • Katja Fiehler, Justus Liebig University Giessen
Journal of Vision, September 2021, Vol. 21, 2010.

      Immo Schuetz, Harun Karimpur, Katja Fiehler; A Toolbox for Perception and Action Experiments Using the Vizard VR Platform. Journal of Vision 2021;21(9):2010.


Virtual Reality (VR) is gaining momentum in vision science due to its unprecedented potential to study the behavior of active participants in naturalistic but still well-controlled environments. This approach is further facilitated by the growing availability of VR devices with built-in eye tracking, allowing simultaneous recording of gaze and movement behavior. Rendering engines such as Vizard, Unity, and Unreal have made it simple to create virtual environments, but most come ill-equipped out of the box to handle the typical structure of a sensorimotor experiment, such as looped presentation of structured behavioral 'trials' and capture of continuous behavioral data. This leads to code duplication and re-implementation of the same components by multiple labs and hampers open science and reproducibility efforts. Recently, some research groups have published toolboxes for the Unity platform that facilitate the creation of behavioral experiments. Based on experience from our lab, we here present a similar toolbox for the Vizard VR platform, which is widely used in perception and action research. Using our toolbox, a researcher can implement a behavioral experiment including eye and hand movement recording with comparatively little Python code. The experimental structure can be generated programmatically or imported from a file, and the toolbox takes care of randomizing, presenting, and recording trials. Results are saved in standardized file formats such as CSV and JSON for ease of analysis. Additionally, our toolbox allows for easy online recording and calibration of gaze and motion tracking data using any hardware device supported by Vizard. We highlight the structure and central features of our toolbox using the example of a goal-directed reaching task in VR. This is accompanied by the release of the toolbox code as open source.
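The abstract does not show the toolbox's actual API, but the experiment pattern it describes (build or import a trial list, randomize it, loop over trials, save results as CSV and JSON) can be sketched in plain Python. All names below (`make_trials`, `save_results`, the `conditions` dicts) are hypothetical illustrations of this pattern, not the toolbox's real interface:

```python
# Hedged sketch of the trial-list pattern the abstract describes.
# Names and structure are hypothetical; the actual toolbox API may differ.
import csv
import json
import random


def make_trials(conditions, repetitions):
    """Build a flat, shuffled trial list from condition dicts.

    conditions: list of dicts, each describing one experimental condition
    repetitions: how many times each condition is repeated
    """
    trials = [dict(c) for c in conditions for _ in range(repetitions)]
    random.shuffle(trials)
    for i, t in enumerate(trials):
        t["trial_num"] = i  # record final presentation order
    return trials


def save_results(trials, csv_path, json_path):
    """Write per-trial results in standardized formats (CSV and JSON)."""
    fieldnames = sorted({key for t in trials for key in t})
    with open(csv_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(trials)
    with open(json_path, "w") as f:
        json.dump(trials, f, indent=2)


# Example usage: a minimal goal-directed reaching design with two
# target positions, three repetitions each.
trials = make_trials([{"target": "left"}, {"target": "right"}], repetitions=3)
for trial in trials:
    # In a real experiment, the per-trial stimulus presentation and
    # eye/hand movement recording would happen here.
    trial["response_time"] = 0.0  # placeholder result field
save_results(trials, "results.csv", "results.json")
```

The design choice of writing both CSV (for spreadsheet or R/pandas analysis) and JSON (preserving nesting and types) mirrors the standardized output formats the abstract mentions.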

