Vision Sciences Society Annual Meeting Abstract  |   October 2020
Volume 20, Issue 11
Open Access
The role of central and peripheral vision for search in VR environments
Author Affiliations & Notes
  • Erwan David
    Scene Grammar Lab, Goethe University Frankfurt
  • Julia Beitner
    Scene Grammar Lab, Goethe University Frankfurt
  • Melissa Vo
    Scene Grammar Lab, Goethe University Frankfurt
  • Footnotes
    Acknowledgements  This work was supported by SFB/TRR 26 135 project C7 to Melissa L.-H. Võ.
Journal of Vision October 2020, Vol.20, 1101. doi:https://doi.org/10.1167/jov.20.11.1101

      Erwan David, Julia Beitner, Melissa Vo; The role of central and peripheral vision for search in VR environments. Journal of Vision 2020;20(11):1101. https://doi.org/10.1167/jov.20.11.1101.

      © ARVO (1962-2015); The Authors (2016-present)

During visual search we rely on peripheral information to direct our gaze towards probable targets and use our central field of view to further analyse stimuli and make a decision. Here we investigated the roles of both foveal and peripheral vision for search in a more natural setting, where head and body movements were unrestricted and participants could make use of their peripheral field of view beyond the macula. To achieve this, we implemented a gaze-contingent paradigm within a virtual reality headset to assess the importance of central and peripheral information when an extended field of view is available (more than 80 by 80 degrees), in contrast to studies using a 2D screen that stimulates only a narrow range of retinal eccentricities. Participants searched for target objects in realistic 3D-modeled indoor scenes with gaze-contingent central or peripheral masks six degrees in radius (plus a control condition without visual interference). We simulated scotomas independently per eye and measured the contributions of head and eye movements to visual search. Results show an increase in return saccades and target refixation rates when central information was missing; conversely, when peripheral information was lacking we observed an increase in forward saccades. With regard to search efficiency, we replicated previous 2D results: guidance to the target was unaffected by a central scotoma, while a peripheral mask substantially reduced search efficiency. Contrary to results from 2D searches, central scotomas did not affect target decision time, implying a greater role for peripheral pre-processing of target identities when searching with an extended field of view. In general, we found that artificial scotomas strongly affected eye movements, while head movements were greatly reduced. Our observations demonstrate how visual attention is engaged across the entire field of view during immersive, real-world search.
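The gaze-contingent masking described above amounts to a per-point visibility test: a scene point is hidden when its angular eccentricity from the current gaze direction falls inside (central scotoma) or outside (peripheral mask) a fixed radius. The following is a minimal sketch of that test, not the authors' implementation; the function names, the 6-degree default, and the assumption that gaze and point directions are 3-vectors in a shared eye/head coordinate frame are illustrative only.

```python
import math

def angular_eccentricity(gaze_dir, point_dir):
    """Angle in degrees between the gaze direction and the direction to a
    scene point (both 3-vectors in the same eye/head coordinate frame)."""
    def norm(v):
        return math.sqrt(sum(c * c for c in v))
    dot = sum(g * p for g, p in zip(gaze_dir, point_dir))
    cos_angle = dot / (norm(gaze_dir) * norm(point_dir))
    cos_angle = max(-1.0, min(1.0, cos_angle))  # guard against rounding error
    return math.degrees(math.acos(cos_angle))

def is_masked(gaze_dir, point_dir, radius_deg=6.0, mode="central"):
    """True if the point is occluded by the gaze-contingent mask.
    "central"    -> simulated central scotoma: hide within radius_deg of gaze
    "peripheral" -> peripheral mask: hide everything beyond radius_deg
    any other    -> control condition, no visual interference"""
    ecc = angular_eccentricity(gaze_dir, point_dir)
    if mode == "central":
        return ecc <= radius_deg
    if mode == "peripheral":
        return ecc > radius_deg
    return False
```

In a binocular setup such as the one described, this test would run once per eye with that eye's own gaze vector, since the scotomas were simulated independently per eye.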

