Vision Sciences Society Annual Meeting Abstract | September 2021
Journal of Vision, Volume 21, Issue 9 | Open Access
The coordination of gaze and steering behavior in drone-racing pilots during high-speed flight
Author Affiliations
  • Nathaniel V. Powell
    Rensselaer Polytechnic Institute
  • Xavier Marshall
    Rensselaer Polytechnic Institute
  • Scott T. Steinmetz
    Rensselaer Polytechnic Institute
  • Gabriel J. Diaz
    Rochester Institute of Technology
  • Oliver W. Layton
    Colby College
  • Brett R. Fajen
    Rensselaer Polytechnic Institute
Journal of Vision, September 2021, Vol. 21, 2697. https://doi.org/10.1167/jov.21.9.2697
Citation: Nathaniel V. Powell, Xavier Marshall, Scott T. Steinmetz, Gabriel J. Diaz, Oliver W. Layton, Brett R. Fajen; The coordination of gaze and steering behavior in drone-racing pilots during high-speed flight. Journal of Vision 2021;21(9):2697. https://doi.org/10.1167/jov.21.9.2697.

      Download citation file:


© ARVO (1962-2015); The Authors (2016-present)

Abstract

While locomoting through natural environments, humans coordinate their gaze and steering to efficiently sample the visual information needed to guide movement. The coordination of gaze and steering during high-speed movement has been extensively studied in the context of automobile driving. Theoretical accounts that have emerged from this work, such as the waypoint fixation hypothesis (Lappi & Mole, 2018), capture behavior during self-motion along an explicit, well-defined path over a flat, obstacle-free ground plane. However, humans are also capable of visually guiding self-motion in all three dimensions through cluttered environments that lack an explicit path, as demonstrated during drone racing. The aim of the present study was to explore the gaze and steering behavior of drone pilots as they maneuvered at high speeds through a dense forest. Subjects were instructed to fly a simulated quadcopter along a race course embedded within a forest-like virtual environment built in Unity. The environment was viewed through an HTC Vive Pro head-mounted display while gaze behavior was recorded using the Pupil Labs VR/AR eye-tracking extension. Drone position, orientation, and controller outputs were recorded via Microsoft AirSim. In the control condition, the race course was defined by an explicit path and there were no obstacles that impeded movement along it. The task in this condition was similar to steering an automobile along a winding road and allowed for the fixation of waypoints. We compared gaze and steering behavior in the control condition to conditions in which the waypoint fixation strategy was less suitable, such as when the course was defined by a series of gates rather than a path, and when obstacles (trees and overhanging branches) had to be avoided. Discussion focuses on how gaze and steering behavior are adapted to task demands during high-speed steering through cluttered environments.
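For readers unfamiliar with AirSim, the sketch below illustrates how drone pose and pilot stick inputs of the kind described above can be polled through AirSim's Python client API. This is a minimal, hypothetical example, not the authors' actual logging code: the output filename, one-minute duration, and ~50 Hz sampling rate are placeholder assumptions.

# Minimal sketch of logging quadcopter state via the AirSim Python API.
# Illustrative only: the filename, duration, and sampling rate are
# assumptions, not details taken from the study.
import csv
import time

import airsim

client = airsim.MultirotorClient()
client.confirmConnection()

with open("flight_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["t", "x", "y", "z", "qw", "qx", "qy", "qz",
                     "roll", "pitch", "yaw", "throttle"])
    t0 = time.time()
    while time.time() - t0 < 60.0:               # record for one minute
        pose = client.simGetVehiclePose()         # drone position + orientation
        rc = client.getMultirotorState().rc_data  # pilot stick inputs (valid only
                                                  # when an RC transmitter is connected)
        writer.writerow([
            time.time() - t0,
            pose.position.x_val, pose.position.y_val, pose.position.z_val,
            pose.orientation.w_val, pose.orientation.x_val,
            pose.orientation.y_val, pose.orientation.z_val,
            rc.roll, rc.pitch, rc.yaw, rc.throttle,
        ])
        time.sleep(0.02)                          # ~50 Hz sampling

In the actual experiment, such drone-state streams would also need to be time-synchronized with the Pupil Labs gaze recording; that alignment step is omitted here.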
