Abstract
While steering through cluttered environments at high speeds, humans must use gaze to efficiently sample the visual information that guides locomotion. Past research on gaze and steering has focused primarily on automobile driving, yet humans can steer efficiently in conditions very different from driving, such as first-person-view drone piloting. The present experiment examined how experienced quadcopter pilots coordinate gaze and steering while performing a task that involved path following and obstacle avoidance at high speeds. Participants were instructed to fly a simulated quadcopter through a race course set in a forest, built in Unity. They viewed the environment through an HTC Vive Pro head-mounted display while gaze was recorded with a Pupil Labs VR/AR eye-tracking extension. To investigate how path information influences gaze and steering, the experiment comprised three conditions: a hoops-only condition, a hoops-and-path condition in which a path was placed below the hoops, and a path-only condition. Participants were asked to fly through each hoop and to remain as close to the center of the path as possible. In conditions with hoops, participants spent most of the time fixating distant objects visible through the center of the upcoming hoop; less time was spent fixating the hoop’s edges, and very little time was spent looking at the path. When hoops were absent, gaze shifted to the path more frequently, although participants also spent time looking at the surrounding environment. Participants first fixated each hoop between 2 and 3 seconds before reaching it and shifted gaze to the next hoop approximately 0.5 seconds after passing through the previous one. We discuss how these findings expand our understanding of how humans coordinate gaze and steering during complex locomotor tasks in visually realistic conditions.