Vision Sciences Society Annual Meeting Abstract | September 2018 | Volume 18, Issue 10 | Open Access
Study of Visual Search in 3D Space using Virtual Reality (VR)
Author Affiliations
  • Tandra Ghose
    Department of Psychology, University of Kaiserslautern, Germany
  • Aman Mathur
    Max Planck Institute for Software Systems, Kaiserslautern, Germany
  • Rupak Majumdar
    Max Planck Institute for Software Systems, Kaiserslautern, Germany
Journal of Vision September 2018, Vol.18, 286. doi:https://doi.org/10.1167/18.10.286
Abstract

As VR gains mainstream traction and is adopted for more serious use cases such as remote monitoring and troubleshooting, a thorough study of perception on such devices becomes important. An advantage that VR has over its 2D counterpart is the large virtual space. However, it remains to be empirically determined how visual search characteristics derived from traditional 2D visual search experiments (~50 objects) scale to immersive 3D scenarios with far more objects (~1000). To study this, we designed the classic feature and conjunction search experiment in VR, modelling virtual space with a spherical coordinate system centered at the VR headset's initial position. The target was presented in one of 32 equally sized regions, blocked in 45-degree increments of radial angle and elevation. The target was a red cube embedded among 96, 480, 768, or 1024 distractors that were equally distributed across the 3D regions. Distractors were either green cubes (feature search) or red spheres and green cubes (conjunction search). The task was to find the target as quickly as possible using head and body movements. We studied the slopes of reaction times with respect to the number of distractors for each of the 32 regions. Based on data from 25 participants, the typical pattern of feature- and conjunction-search slopes was observed overall. For regions directly in front of the participants, reaction times were faster in the left than in the right visual field. Even regions behind the observer showed trends similar to regions in front. Search in atypical regions, such as near the toes or directly overhead, showed haphazard characteristics. Although these findings appear robust, we found that occlusion can be a nuisance variable in such search tasks.
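The region scheme described above can be made concrete with a short sketch. The Python snippet below is illustrative only and is not the authors' implementation: the function names, the binning convention (which azimuth corresponds to bin 0, how the ±90° elevation edge is handled), and the direction-sampling helper are assumptions. It shows one way to map a 3D point, expressed relative to the headset's initial position, to one of the 32 regions defined by 45-degree bins in radial angle (azimuth) and elevation.

```python
import numpy as np

# Illustrative sketch (not the authors' code): the abstract describes 32
# regions around the headset's initial position, formed by 45-degree bins
# in radial angle (azimuth, 8 bins over 360 deg) and elevation
# (4 bins over 180 deg).

AZIMUTH_BIN_DEG = 45    # 360 / 45 = 8 azimuth bins
ELEVATION_BIN_DEG = 45  # 180 / 45 = 4 elevation bins


def region_index(x, y, z):
    """Map a point (relative to the headset's start position)
    to one of the 32 regions, indexed 0..31."""
    azimuth = np.degrees(np.arctan2(y, x)) % 360.0               # 0..360 deg
    elevation = np.degrees(np.arctan2(z, np.hypot(x, y)))        # -90..+90 deg
    az_bin = int(azimuth // AZIMUTH_BIN_DEG)                     # 0..7
    el_bin = min(int((elevation + 90.0) // ELEVATION_BIN_DEG), 3)  # 0..3, clamp +90 edge
    return el_bin * 8 + az_bin


def sample_directions(n, rng=None):
    """Draw n unit vectors roughly uniform over the sphere; the radii and
    per-region placement rule used in the experiment are not given in the
    abstract, so they are left out here."""
    if rng is None:
        rng = np.random.default_rng()
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)
```

With this convention the index runs from 0 to 31 (8 azimuth bins × 4 elevation bins), so a distractor set of, say, 1024 items could be split into 32 per region; the abstract states only that distractors were equally distributed across regions, so the exact placement rule is an assumption.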

Meeting abstract presented at VSS 2018
