Abstract
Successful performance of everyday activities such as driving a car relies on accurate perception of self-motion, including heading direction and speed. A small group of primate species (humans, macaques, and marmosets) can track heading direction with their eyes in the absence of any instruction and with only minimal training (Knöll et al., PNAS, 2018). Here we investigated whether this tracking behavior generalizes to a larger group of human observers and is sensitive to changes in motion signal strength.
Observers (n = 43) viewed a cloud of moving dots that expanded from or contracted toward a single point, producing perceived self-motion toward or away from the focus of expansion (FOE). The FOE location shifted over time in a random-walk fashion. Observers were asked to freely view the stimulus; eye position was recorded with an EyeLink 1000 eye tracker. In Exp. 1 (n = 19), we verified whether observers could track suprathreshold stimuli (coherence: 100%; contrast: 33%; speed: 2 m/s). In Exp. 2 (n = 24), we tested the effect of motion signal strength by manipulating coherence (6.25-100%), contrast (3.6-90%), and speed (0.75-6 m/s).
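For illustration only, the following Python sketch generates a random-walk FOE trajectory and a coherence-controlled radial dot field of the kind described above. This is not the authors' stimulus code; the frame count, walk step size, field size, dot count, and per-frame dot displacement are assumed placeholder values.

```python
# Minimal sketch (assumed parameters) of a random-walk FOE and a radial dot field.
import numpy as np

rng = np.random.default_rng(0)

n_frames = 600          # e.g. 10 s at 60 Hz (assumed)
step_deg = 0.1          # random-walk step per frame, in deg (assumed)
field_deg = 40.0        # square dot-field width/height, in deg (assumed)
n_dots = 200            # number of dots (assumed)
coherence = 1.0         # fraction of dots moving consistently with the FOE
dot_speed = 0.2         # radial dot displacement per frame, in deg (assumed)

# Random-walk trajectory of the focus of expansion (FOE), kept inside the field.
foe = np.cumsum(rng.normal(0.0, step_deg, size=(n_frames, 2)), axis=0)
foe = np.clip(foe, -field_deg / 2, field_deg / 2)

# Dot positions, updated radially away from the current FOE (forward self-motion).
dots = rng.uniform(-field_deg / 2, field_deg / 2, size=(n_dots, 2))
frames = []
for f in range(n_frames):
    radial = dots - foe[f]                      # vector from FOE to each dot
    norm = np.linalg.norm(radial, axis=1, keepdims=True) + 1e-9
    signal = rng.random(n_dots) < coherence     # coherent ("signal") dots
    dots[signal] += dot_speed * (radial[signal] / norm[signal])
    # Noise dots move in random directions.
    theta = rng.uniform(0, 2 * np.pi, size=(~signal).sum())
    dots[~signal] += dot_speed * np.c_[np.cos(theta), np.sin(theta)]
    # Re-plot dots that leave the field.
    out = np.any(np.abs(dots) > field_deg / 2, axis=1)
    dots[out] = rng.uniform(-field_deg / 2, field_deg / 2, size=(out.sum(), 2))
    frames.append(dots.copy())
```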
Results show that observers intuitively tracked changes in heading direction using a combination of saccades, fixations, and slow drifts. In both experiments, more than 80% of observers tracked the FOE, with highly correlated eye and FOE position trajectories (cross-correlation coefficient > 0.6) in response to high-signal-strength stimuli. Spatial tracking error (the positional error between gaze and the FOE) increased as motion signal strength decreased from the highest to the lowest level of coherence (32% error increase), dot contrast (24% error increase), and speed (44% error increase).
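The two summary measures reported above could be computed along the following lines. This is a sketch, not the authors' analysis code: the lag window, function names, and per-axis application are assumptions; only the 0.6 cross-correlation criterion comes from the abstract. It assumes eye and FOE positions have been resampled onto a common time base.

```python
# Illustrative computation of peak cross-correlation and spatial tracking error.
import numpy as np

def peak_xcorr(eye, foe, max_lag=60):
    """Peak Pearson correlation between two 1-D position traces over lags of
    up to max_lag samples (assumed search window)."""
    best = -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            r = np.corrcoef(eye[lag:], foe[:len(foe) - lag])[0, 1]
        else:
            r = np.corrcoef(eye[:lag], foe[-lag:])[0, 1]
        best = max(best, r)
    return best

def spatial_tracking_error(eye_xy, foe_xy):
    """Mean Euclidean distance between gaze and FOE positions (same units)."""
    return np.linalg.norm(eye_xy - foe_xy, axis=1).mean()

# Criterion from the abstract: a trajectory counts as tracked if the
# cross-correlation coefficient exceeds 0.6 (applied here to one axis).
# tracked = peak_xcorr(eye_x, foe_x) > 0.6
```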
Intuitive ocular tracking of heading direction therefore generalizes to a larger group of human observers and is finely tuned to low-level motion signals. Future work will explore whether eye movements can be used as an indicator of self-motion perception in clinical populations.