August 2023
Volume 23, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract
Visual control of steering through multiple waypoints
Author Affiliations & Notes
  • AJ Jansen
    Rensselaer Polytechnic Institute
  • Nathaniel Powell
    University of Texas at Austin
  • Brett Fajen
    Rensselaer Polytechnic Institute
  • Footnotes
    Acknowledgements: NSF 2218220
Journal of Vision August 2023, Vol.23, 5721. doi:https://doi.org/10.1167/jov.23.9.5721
Citation: AJ Jansen, Nathaniel Powell, Brett Fajen; Visual control of steering through multiple waypoints. Journal of Vision 2023;23(9):5721. https://doi.org/10.1167/jov.23.9.5721.

© ARVO (1962-2015); The Authors (2016-present)

Abstract

Many skilled locomotor tasks involve steering through cluttered environments at high speeds, moving smoothly from one waypoint to the next while avoiding obstacles. If actors treated each waypoint individually, they could be forced to make large trajectory adjustments, collide with obstacles, or miss waypoints altogether. Hence, performing such tasks with skill would seem to require actors to adapt their trajectory relative to the most immediate object in anticipation of future objects. To date, there have been relatively few empirical studies of such behavior, and the evidence from the studies that do exist is somewhat murky. On the one hand, it is well established that humans shift gaze to future waypoints before reaching the upcoming waypoint, and that such behavior is associated with smoother steering (Wilkie, Wann, & Allison, 2008). On the other hand, anticipatory steering behavior is not consistent across subjects, and improvements in steering smoothness are marginal. The present study was designed to address the need for a clearer understanding of how and under what conditions humans adapt their movements in anticipation of future goals. Subjects performed a simulated drone-flying task in a custom-designed, forest-like virtual environment presented on a monitor. Eye movements were tracked using a Pupil Core headset. Subjects were instructed to steer through a series of gates while the distance at which gates first became visible (i.e., the look-ahead distance) was manipulated between trials. We found that performance degraded when subjects were unable to see the next gate until after they had passed through the previous gate. The findings suggest that steering performance does indeed improve when actors can see more than one waypoint at a time. Follow-up experiments explore the conditions under which such anticipatory steering behavior is exhibited. The findings inform the development of control strategies for steering through multiple waypoints.
