Abstract
Effective locomotion often requires the ability to navigate complex environments at speed, moving smoothly through multiple waypoints while avoiding obstacles. If actors consider only one waypoint at a time, they may be forced to make jerky steering adjustments, collide with obstacles, or miss waypoints altogether. The ability to use information from beyond the most immediate waypoint could in principle allow actors to steer more smoothly and accurately. Recently, Jansen et al. (VSS 2023) and Powell et al. (submitted) found that subjects’ heading direction upon reaching a waypoint was systematically related to the relative position of the subsequent waypoint. Such findings provide evidence that humans do in fact anticipate future waypoints. However, neither experiment was designed to reveal how steering trajectories during the approach to a waypoint depend on the position and orientation of the next waypoint. Such data would provide the empirical basis for formulating possible control strategies for steering through multiple waypoints, which motivated the present study. Subjects performed a simulated drone-flying task along a straightaway in a forest-like virtual environment. The simulation was viewed on a monitor while eye movements were tracked using a Pupil Core headset. On each trial, subjects used a game controller to steer through a series of three gates: two centered on the longitudinal axis and separated by a fixed distance, and a third whose distance, angle, and orientation were manipulated between trials. After passing through Gate 1, subjects initially turned away from Gate 2 in the direction opposite Gate 3 before turning back, allowing for a smoother (albeit less direct) trajectory through the series of waypoints. Maximum lateral deviation between Gates 1 and 2 increased with the relative angle of Gate 3, but the effects of distance and orientation were weaker. These findings inform the development of models of steering through multiple waypoints.