Abstract
Humans are remarkably stable and efficient walkers, even when walking is performed concurrently with a secondary task such as social interaction or cell phone use. This is impressive because, while visually guided walking relies upon fixations of the ground plane, secondary tasks often require shifts of gaze away from it. Remaining stable while walking therefore requires the intelligent, task-dependent timing of gaze shifts to and from the ground plane, yet little is known about how these shifts are timed. Recent work has demonstrated a tight relationship between the biomechanics of bipedal gait and the spatial extent of visual information needed for planning foot placement, but it remains unclear whether biomechanics similarly influence the timing of eye movements to and from the ground plane. We investigate this question using motion capture and eye tracking to record the gait kinematics and gaze adjustments of a subject walking a straight path. During locomotion, a single augmented-reality obstacle of one of three possible heights (0.15, 0.25, or 0.35 leg lengths) is projected at a randomized location along the subject's path. The 2D projection of this obstacle is dynamically updated with changes in head position so that the optical information it provides is consistent with that of a 3D obstacle, and the illusory height is reinforced through stereoscopic glasses. While walking, the subject also performs a vigilance task involving the identification of numbers within a random sequence displayed at eye height. Task demands ensure that the subject allocates visual attention to the primary and vigilance tasks with equal priority. Analyses demonstrate temporal coupling between the timing of gaze shifts and the evolution of the gait cycle.
Meeting abstract presented at VSS 2016