Jacob Hadnett-Hunter, Eamonn O'Neill, Michael J. Proulx; Contributed Session II: Visual Search in Virtual Reality (VSVR): A visual search toolbox for virtual reality. Journal of Vision 2022;22(3):19. doi: https://doi.org/10.1167/jov.22.3.19.
Our understanding of human visual attention has greatly benefited from a wealth of visual search studies conducted over the past few decades. Tasking observers with searching for a specific target embedded among a set of distractors, and measuring the time they take to find it, can reveal much about the sensory and cognitive processes involved. These experiments have typically been conducted on 2D displays under tightly controlled viewing conditions. Recently, however, there have been calls within the visual attention community to explore more ecologically valid means of data collection. Virtual reality (VR) is a promising methodological tool for such research, as it offers improved visual realism and the possibility of participant interaction while retaining much of the control afforded by a computerized, monitor-presented experiment. Here we present the Visual Search in Virtual Reality (VSVR) ToolBox. VSVR is a set of functions, scripts, and assets that can be combined within a visual scripting environment in the Unity game engine to design, replicate, and extend visual search experiments in VR. We further demonstrate the utility of such a toolbox with three experiments: a replication of feature search behavior, a demonstration of wide field-of-view visual search and eccentricity effects, and a replication of depth plane as a feature for search.
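The search paradigm the abstract describes, timing how long an observer takes to locate a target among distractors as a function of display set size, can be illustrated with a minimal toy model. The sketch below is not part of the VSVR ToolBox (which is a Unity visual-scripting package); it is a hypothetical Python illustration of a feature-search display and a simple serial-inspection reaction-time model, with all names and parameter values assumed for the example.

```python
import random

def make_display(set_size, target_present=True):
    """Build a toy search display: items are (color, shape) tuples.
    Distractors are red circles; the target is a green circle,
    a single-feature ('pop-out') difference."""
    n_distractors = set_size - 1 if target_present else set_size
    items = [("red", "circle")] * n_distractors
    if target_present:
        items.append(("green", "circle"))
    random.shuffle(items)
    return items

def serial_search_rt(items, base_rt=400.0, per_item=30.0):
    """Toy serial-search model: reaction time (ms) grows linearly with
    the number of items inspected before the target is found.
    base_rt and per_item are illustrative values, not fitted data."""
    for i, item in enumerate(items):
        if item[0] == "green":
            return base_rt + per_item * (i + 1)
    # Target-absent trial: exhaustive scan of the whole display.
    return base_rt + per_item * len(items)
```

In a real feature search, of course, the target "pops out" and reaction times are largely flat across set size; a serial model like this one is what inefficient, conjunction-style search would predict. Comparing those slope patterns is exactly the kind of analysis such experiments support.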