February 2022
Volume 22, Issue 3
Open Access
Optica Fall Vision Meeting Abstract
Contributed Session II: Visual Search in Virtual Reality (VSVR): A visual search toolbox for virtual reality
Author Affiliations
  • Jacob Hadnett-Hunter
    Department of Computer Science, University of Bath, UK
  • Eamonn O'Neill
    Department of Computer Science, University of Bath, UK
  • Michael J. Proulx
    Department of Psychology, University of Bath, UK
Journal of Vision February 2022, Vol.22, 19. doi:https://doi.org/10.1167/jov.22.3.19

Our understanding of human visual attention has greatly benefited from the wealth of visual search studies conducted over the past few decades. Tasking observers with searching for a specific target embedded among a set of distractors, and measuring the time they take to find it, can reveal much about the sensory and cognitive processes involved. These experiments have typically been conducted on 2D displays under tightly controlled viewing conditions. Recently, however, there have been calls within the visual attention community to explore more ecologically valid means of data collection. Virtual reality (VR) is a promising methodological tool for such research: it offers improved visual realism and the possibility of participant interaction, while retaining much of the control afforded by a computerized, monitor-presented experiment. Here we present the Visual Search in Virtual Reality (VSVR) Toolbox. VSVR is a set of functions, scripts, and assets that can be combined within a visual scripting environment in the Unity game engine to design, replicate, and extend visual search experiments in VR. We further demonstrate the utility of the toolbox with three experiments: a replication of feature search behavior, a demonstration of wide-field-of-view visual search and eccentricity effects, and a replication of depth plane as a feature for search.
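The set-size manipulation at the heart of such experiments can be sketched independently of any particular engine. The Python snippet below is a minimal illustration of how a block of search trials is typically constructed, crossing set size with target presence and scattering items at random positions; it is not the VSVR Toolbox API, and all function and field names here are hypothetical.

```python
import random

def make_trial(set_size, target_present, field=(1.0, 1.0)):
    """Generate one search display: `set_size` items at random
    positions, with at most one item flagged as the target.
    Illustrative only -- not part of the VSVR Toolbox."""
    items = []
    for i in range(set_size):
        items.append({
            "x": random.uniform(0, field[0]),
            "y": random.uniform(0, field[1]),
            # The first generated item carries the target on
            # target-present trials; shuffling removes order cues.
            "is_target": target_present and i == 0,
        })
    random.shuffle(items)
    return items

def make_block(set_sizes=(4, 8, 16), trials_per_size=10):
    """Cross set size with target present/absent (half each),
    then shuffle trial order, as in a typical search block."""
    block = []
    for n in set_sizes:
        for t in range(trials_per_size):
            block.append(make_trial(n, target_present=(t % 2 == 0)))
    random.shuffle(block)
    return block

block = make_block()  # 3 set sizes x 10 trials = 30 trials
```

Plotting response time against set size for such a block is what distinguishes efficient (flat-slope) feature search from inefficient search, the kind of comparison the toolbox's first experiment replicates in VR.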

 Funding: None
