Abstract
Advances in virtual reality (VR) technology have provided a wealth of valuable new approaches to vision researchers. VR offers a critical new depth cue, active motion parallax, which presents locations in the virtual scene that behave like real locations do: They change in predictable ways as the observer moves. This contingency between observer motion and visual stimulation is essential, but it is technically challenging to achieve and makes coding VR experiments from scratch impractical. Researchers therefore typically use software such as the Unity game engine to create and edit virtual scenes. However, Unity lacks built-in tools for controlling experiments, and existing third-party add-ons require substantial scripting and coding knowledge to design even the simplest of experiments, especially for multifactorial designs. Here, we describe a new free and open-source tool called the BiomotionLab Toolkit for Unity Experiments (bmlTUX). Unlike existing tools, our toolkit provides a graphical interface for configuring factorial experimental designs and turning them into executable experiments. New experiments work out of the box and can be created with fewer than twenty lines of code. The toolkit automatically handles the combinatorics of both random and counterbalanced factors; mixed designs with within- and between-subject factors; and blocking, repetition, and randomization of trial order. A well-defined API makes it easy for users to interface their custom stimulus-generation code with the toolkit. Experiments can store multiple configurations that can be swapped via a drag-and-drop interface. At runtime, the experimenter can interactively control the flow of trials and monitor the progress of the experiment. Despite its simplicity, bmlTUX remains highly flexible and customizable, catering to both novice and advanced coders. The toolkit simplifies getting experiments up and running quickly, without the hassle of complicated scripting.
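To make the combinatorics mentioned above concrete, the sketch below shows what "automatically handling" a factorial design amounts to: fully crossing factor levels into a trial table, repeating each cell, and randomizing trial order. This is a generic Python illustration of the concept only, not the bmlTUX API (bmlTUX itself is a Unity/C# toolkit, and the function and parameter names here are invented for this example).

```python
import itertools
import random

def build_trial_table(factors, repetitions=1, shuffle=True, seed=None):
    """Fully cross the given factor levels into a list of trials.

    Generic sketch (not bmlTUX code): `factors` maps factor name -> list
    of levels; each trial is a dict mapping factor name -> chosen level.
    """
    names = list(factors)
    # Cartesian product of all factor levels = one trial per design cell.
    trials = [dict(zip(names, combo))
              for combo in itertools.product(*(factors[n] for n in names))]
    # Repeat every cell the requested number of times.
    trials = trials * repetitions
    # Optionally shuffle trial order, reproducibly if a seed is given.
    if shuffle:
        random.Random(seed).shuffle(trials)
    return trials

# Example: a 2 x 3 within-subject design, each cell run twice -> 12 trials.
table = build_trial_table(
    {"contrast": ["low", "high"], "speed": [1, 2, 3]},
    repetitions=2, seed=0)
```

Counterbalancing a between-subject factor would then amount to assigning each participant one level of that factor in rotation, rather than crossing it into every participant's trial table.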