September 2015
Volume 15, Issue 12
Vision Sciences Society Annual Meeting Abstract  |   September 2015
A Novel Approach to Measuring the Useful Field of View in Simulated Real-World Environments Using Gaze Contingent Displays: The GC-UFOV.
Author Affiliations
  • Ryan Ringer
    Kansas State University, Department of Psychological Sciences
  • Zachary Throneburg
    Kansas State University, Department of Psychological Sciences
  • Tera Walton
    Kansas State University, Department of Psychological Sciences
  • Greg Erickson
    Kansas State University, Department of Psychological Sciences
  • Allison Coy
    Kansas State University, Department of Psychological Sciences
  • Jake DeHart
    Kansas State University, Department of Psychological Sciences
  • Aaron Johnson
    Concordia University, Department of Psychology
  • Arthur Kramer
    University of Illinois, Beckman Institute, Department of Psychology
  • Lester Loschky
    Kansas State University, Department of Psychological Sciences
Journal of Vision September 2015, Vol.15, 878. doi:10.1167/15.12.878
Abstract

The Useful Field of View (UFOV) task assesses attentional breadth within a single glance. The UFOV has successfully predicted changes in attention that have real-world consequences (e.g., automobile collision likelihood); however, its design prevents it from being incorporated into simulated environments (e.g., driving/flight simulators). Additionally, the UFOV task and other attentional breadth measures (e.g., the peripheral detection task) do not disentangle attention from hard-wired perceptual properties that change with retinal eccentricity (e.g., spatial resolution via cortical magnification). We therefore developed an alternative method of measuring attentional breadth using gaze-contingent (GC) displays: the GC-UFOV. Thirteen participants completed four sessions of testing to determine the effect of a secondary task (auditory N-back) on Gabor patch orientation discrimination at four retinal eccentricities (0°, 3°, 6°, and 9°). Gabor patches were occasionally presented gaze-contingently for single eye fixations while participants completed a scene recognition memory task. Gabors were size-thresholded under single-task conditions to disentangle eccentricity-dependent acuity from changes in attention occurring between single- and dual-task conditions, while N-back levels were thresholded to ensure that cognitive load was equivalent across participants. Results showed a significant decrease in orientation sensitivity in the dual-task condition, but not as a function of retinal eccentricity. We conclude that interference with executive attention produces general interference with visual attention equally across retinal eccentricity. N-back sensitivity was only minimally impaired under dual-task conditions compared to single-task trials. Picture recognition memory was no different between N-back single and dual tasks, but was significantly better in the Gabor single-task condition. Thus, the Gabor task did not interfere with processing picture information, making this method ideal for use in simulated real-world tasks. Furthermore, we propose that future adaptations of this method employ other ecologically valid sources of cognitive load (e.g., stress, traffic density) to observe their effects on attentional breadth.
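The size-thresholding step described above can be approximated with a standard adaptive staircase. The following is a minimal, hypothetical Python sketch of a 1-up/2-down staircase (which converges near 70.7% correct); the abstract does not specify the exact thresholding algorithm used in the study, so the class name, step sizes, and rule are illustrative assumptions only.

```python
class SizeStaircase:
    """Hypothetical 1-up/2-down staircase for Gabor size thresholding.

    Two consecutive correct responses shrink the stimulus (harder);
    a single error enlarges it (easier). Sizes are in degrees of
    visual angle. Parameters are illustrative, not from the study.
    """

    def __init__(self, start_size=2.0, step=0.25, min_size=0.1):
        self.size = start_size        # current Gabor diameter (deg)
        self.step = step              # fixed step size (deg)
        self.min_size = min_size      # floor to keep the stimulus visible
        self.correct_streak = 0       # consecutive correct responses

    def update(self, correct):
        """Record one trial's outcome and return the next stimulus size."""
        if correct:
            self.correct_streak += 1
            if self.correct_streak == 2:   # two correct -> smaller Gabor
                self.size = max(self.min_size, self.size - self.step)
                self.correct_streak = 0
        else:                              # one error -> larger Gabor
            self.size += self.step
            self.correct_streak = 0
        return self.size
```

In practice, such a staircase would be run per eccentricity under single-task conditions, and the converged size used for that eccentricity in the dual-task blocks, so that dual-task costs reflect attention rather than acuity.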

Meeting abstract presented at VSS 2015
