Abstract
One's instantaneous direction of self-translation (heading) and one's future trajectory (path) are two defining features of human locomotion. Using a dynamic optic-flow display in which environmental points were periodically redrawn to minimize path information, we have previously shown that humans can perceive heading without visual path information (Li, Sweet, & Stone, JOV 2006). Here we explore the use of visual path information in active heading control. The display (110° H × 94° V) simulated a vehicle traveling on a clockwise or counterclockwise circular path (yaw rate: ±4°/s) through a random-dot 3D cloud (depth range: 6–50 m) at 8 m/s under two conditions: a “static scene,” in which dots were displayed until they left the field of view and thus provided both optic-flow and path information, and a “dynamic scene,” in which dot lifetime was limited to 100 ms, thus removing path information. Five observers (3 naïve) used a joystick to steer and align the vehicle's line of sight with the true heading while the simulated vehicle's yaw orientation was perturbed by the sum of 7 harmonically unrelated (0.1 to 2.19 Hz) sinusoids. Joystick displacement generated a command proportional to the rate of change of the simulated vehicle yaw angle. Time series (90 s) of heading error and joystick displacement were Fourier analyzed and averaged across 6 trials. For all observers, overall error was similar in the two conditions (mean absolute heading error ± RMS across observers: 4.2 ± 5.9° and 4.0 ± 6.3° for static and dynamic scenes, respectively). However, model-based analysis of the frequency response (Li, Sweet, & Stone, IEEE 2006) shows a significant decrease in the lead time constant (the ratio of velocity to position gain) for static scenes. We conclude that humans can accurately control their heading from optic flow independent of path, but that path information, when available, reduces low-frequency (<0.3 Hz) drift.
Supported by: Hong Kong Research Grants Council, HKU 7471/06H.
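For readers unfamiliar with the sum-of-sinusoids method, the following is a minimal Python sketch of the frequency-response analysis the abstract describes. It is illustrative only: the 60 Hz sample rate, the exact probe frequencies, the perturbation amplitude, and the simple proportional-plus-derivative (PD) fit used to recover a lead time constant are assumptions for demonstration, not the authors' actual parameters or the model of Li, Sweet, & Stone (IEEE 2006).

```python
import numpy as np

# Illustrative assumptions: 60 Hz sampling, and probe frequencies at integer
# multiples of the base frequency 1/T, chosen so that no probe is a harmonic
# of another (spanning ~0.1-2.19 Hz, as in the abstract).
FS = 60.0                 # sample rate (Hz) -- assumption
T = 90.0                  # trial duration (s), from the abstract
N = int(FS * T)
t = np.arange(N) / FS
cycles = np.array([9, 19, 37, 67, 103, 149, 197])  # hypothetical choice
freqs = cycles / T                                 # 0.1 ... ~2.19 Hz

def sum_of_sines(t, freqs, amp=1.0, seed=0):
    """Yaw-perturbation signal: sum of sinusoids with random phases."""
    rng = np.random.default_rng(seed)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(freqs))
    return amp * sum(np.sin(2 * np.pi * f * t + p)
                     for f, p in zip(freqs, phases))

def frequency_response(error, joystick, fs, freqs):
    """Complex operator response U/E at each probe frequency via FFT bins."""
    n = len(error)
    E, U = np.fft.rfft(error), np.fft.rfft(joystick)
    bins = np.round(freqs * n / fs).astype(int)  # probes sit on exact bins
    return U[bins] / E[bins]

def lead_time_constant(H, freqs):
    """Least-squares fit of H(jw) ~ Kp + jw*Kv; returns T_lead = Kv / Kp."""
    w = 2.0 * np.pi * freqs
    Kp = H.real.mean()                     # position gain
    Kv = (H.imag * w).sum() / (w ** 2).sum()  # velocity gain
    return Kv / Kp

# Self-check with a synthetic PD "operator" (Kp = 2, Kv = 1 -> T_lead = 0.5 s):
e = sum_of_sines(t, freqs, amp=2.0)
u = 2.0 * e + 1.0 * np.gradient(e, 1.0 / FS)
print(lead_time_constant(frequency_response(e, u, FS, freqs), freqs))  # ~0.5
```

In this simple parameterization, the decrease in lead time constant reported for static scenes corresponds to a relatively larger position gain Kp, which dominates at low frequencies; this is consistent with the conclusion that path information reduces low-frequency heading drift.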