Abstract
Background. Previous research suggests that heading specified by optic flow and target egocentric direction are both used to control walking toward a goal. We examined whether these two cues are optimally combined in this process. Methods. In the walking task, participants (n=12) wore a head-mounted display (Oculus DK2, FOV: 100°) and walked toward a line target (width: 0.5°; retinal size did not increase with approach) placed at 8 m in (1) an empty virtual environment that provided only target egocentric direction or (2) an environment with a textured ground and ceiling that also provided dense optic flow. Participants' heading in the virtual environment was displaced ±15° from their physical walking direction (i.e., straight ahead). In the perceptual task, participants passively viewed displays of the textured ground-and-ceiling environment in which heading specified by optic flow was displaced ±15° from straight ahead. At the end of each 1 s simulated self-motion display, they judged perceived heading with a mouse-controlled probe. Results. In the empty environment, heading error remained at 15° throughout the walk toward the target. In the textured environment, heading error dropped quickly and was reduced to 7° on average by the end of each trial. In the perceptual task, with a 15° heading offset, perceived heading from optic flow showed a center bias of 3.7°. The mean heading error and SD observed in the perceptual task (heading cue) and in the walking task with the empty environment (target egocentric direction cue) successfully predicted heading error in the walking task with the textured environment (cue conflict) under the assumption of optimal cue integration. This supports the claim that heading specified by optic flow and target egocentric direction are optimally combined for the control of goal-directed locomotion.
Meeting abstract presented at VSS 2018
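The optimal cue integration assumed in the abstract is the standard maximum-likelihood scheme, in which each cue is weighted by its reliability (inverse variance). A minimal sketch follows; the cue means and SDs used in the demonstration are illustrative placeholders, not the values measured in the study.

```python
def combine_cues(mu1, sd1, mu2, sd2):
    """Reliability-weighted (maximum-likelihood) combination of two
    independent Gaussian cue estimates."""
    w1 = sd2**2 / (sd1**2 + sd2**2)   # weight on cue 1 (higher if cue 1 is more reliable)
    w2 = 1.0 - w1                      # weight on cue 2
    mu = w1 * mu1 + w2 * mu2           # combined mean
    sd = (sd1**2 * sd2**2 / (sd1**2 + sd2**2)) ** 0.5  # combined SD
    return mu, sd

# Hypothetical single-cue heading errors (deg) and SDs, for illustration only:
mu_combined, sd_combined = combine_cues(4.0, 2.0, 10.0, 4.0)
```

Note that the combined SD is always smaller than either single-cue SD, which is the signature prediction of optimal integration tested against the cue-conflict walking data.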