Open Access
Vision Sciences Society Annual Meeting Abstract  |   December 2022
Visual orienting to beyond field of view targets in 3D space: Effects of cue modality, eccentricity, and distractor presence
Author Affiliations
  • Ryan Pfaffenbichler
    Warfighter Effectiveness Research Center, U.S. Air Force Academy, CO, USA
  • Andreas Garcia
    Warfighter Effectiveness Research Center, U.S. Air Force Academy, CO, USA
  • Anna Madison
    Warfighter Effectiveness Research Center, U.S. Air Force Academy, CO, USA
    DEVCOM Army Research Laboratory, Aberdeen Proving Ground, MD, USA
  • Chloe Callahan-Flintoft
    DEVCOM Army Research Laboratory, Aberdeen Proving Ground, MD, USA
  • Christian Barentine
    Warfighter Effectiveness Research Center, U.S. Air Force Academy, CO, USA
    DEVCOM Army Research Laboratory, Aberdeen Proving Ground, MD, USA
  • Anthony Ries
    Warfighter Effectiveness Research Center, U.S. Air Force Academy, CO, USA
    DEVCOM Army Research Laboratory, Aberdeen Proving Ground, MD, USA
Journal of Vision December 2022, Vol. 22, 3626. https://doi.org/10.1167/jov.22.14.3626

Citation: Ryan Pfaffenbichler, Andreas Garcia, Anna Madison, Chloe Callahan-Flintoft, Christian Barentine, Anthony Ries; Visual orienting to beyond field of view targets in 3D space: Effects of cue modality, eccentricity, and distractor presence. Journal of Vision 2022;22(14):3626. https://doi.org/10.1167/jov.22.14.3626.

Abstract

Visual search in operational environments necessitates simultaneous head and eye movements (gaze shifts) to scan large portions of the visual field where objects and people vary in spatial position. Research has shown that auditory cues are more efficient than visual cues when orienting people to targets within their immediate field of view. Much of this work, however, has been performed in a limited two-dimensional space with little application to naturalistic search behaviors in a three-dimensional, immersive environment. This experiment focused on cueing targets beyond a person’s current field of view within a 360-degree virtual reality environment and comparing performance between spatialized auditory and visual cues. Participants responded to the orientation of a Gabor target (2.6 cycles/deg) that could appear at spatial locations varying in eccentricity. Targets were presented both with and without heterogeneous distractors. The visual cue was a responsive three-dimensional mini-map at the bottom of the participant’s screen that indicated the location and elevation of the target relative to the viewer. The spatialized auditory cue (rendered using a standard head-related transfer function) was a constant-frequency tone (1040 Hz) that increased in volume as the user’s gaze moved closer to the target location. Both the visual and auditory cues provided continuous feedback to the searcher based on current gaze position. Here, we present data (n=20) evaluating cue efficiency as a function of cue modality, target eccentricity, and distractor presence by comparing the orienting response and the precision of target localization. To this end, we compare differences in gaze shifts (head movement initiation time and saccadic reaction time) and search performance (accuracy and localization time). The results suggest that auditory and visual cues have different impacts on search behavior.
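
To make the gaze-contingent auditory cue concrete, the Python sketch below shows one way a volume gain could be derived from the angle between the current gaze direction and the target direction. The linear fall-off and all function names are illustrative assumptions for exposition, not the authors’ implementation.

    # Illustrative sketch (assumed, not the authors' code): the tone's gain
    # rises toward 1 as the gaze direction approaches the target direction.
    import numpy as np

    def angular_distance(gaze_dir, target_dir):
        """Angle in degrees between two 3D direction vectors."""
        gaze = np.asarray(gaze_dir, dtype=float)
        target = np.asarray(target_dir, dtype=float)
        gaze = gaze / np.linalg.norm(gaze)
        target = target / np.linalg.norm(target)
        cos_angle = np.clip(np.dot(gaze, target), -1.0, 1.0)
        return np.degrees(np.arccos(cos_angle))

    def cue_volume(gaze_dir, target_dir, max_angle=180.0):
        """Map angular distance to a 0-1 gain: 1 when gaze is on target,
        approaching 0 when the target is directly behind the viewer."""
        angle = angular_distance(gaze_dir, target_dir)
        return 1.0 - min(angle, max_angle) / max_angle

    # Example: a target 90 degrees to the right of the current gaze direction
    print(cue_volume([0.0, 0.0, 1.0], [1.0, 0.0, 0.0]))  # -> 0.5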
