Abstract
Nearly everything we know about visual perception comes from tightly controlled settings in the tradition of stationary, seated laboratory experiments. Arguably, however, this traditional approach can never provide a complete account of how vision operates in ecologically valid conditions, such as during dynamic activity in immersive environments. Advances in virtual-reality (VR) technology now enable the tightly controlled presentation of immersive environments, and allow traditional psychophysical measures to be complemented with records of movement kinematics. Here we present data from two experiments showing how the accuracy and sensitivity of visual perception vary as a function of the gait cycle. Participants engaged in steady-state walking while tracking a floating target, which advanced into the foreground at a constant, comfortable walking speed. We capitalised on continuous psychophysics and position tracking to record a frame-by-frame tracking response at the presentation rate of the target stimulus. In the first experiment, participants minimised the distance between their dominant hand and the floating target, and the error between the time series of target position and tracking response was quantified over the gait cycle. We observed a sinusoidal rhythm in tracking error, which peaked at the ascending phase of the gait cycle before rapidly returning to baseline. In the second experiment, participants monitored the target for brief increases in contrast. We observed clear differences in visual sensitivity and detection accuracy over the gait cycle, with preferential phases for target detection within participants. These results illustrate how one of the most common everyday actions influences perception, and evince the utility of VR technology for broadening our understanding of visual information processing.
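The kind of analysis the abstract describes, averaging frame-by-frame tracking error within gait-cycle phase bins and characterising the resulting sinusoidal modulation, could be sketched as follows. This is an illustrative reconstruction on synthetic data under assumed parameters (bin count, noise level, modulation depth), not the authors' actual pipeline; all function names are hypothetical.

```python
import numpy as np

def phase_binned_error(phase, error, n_bins=20):
    """Average tracking error within gait-cycle phase bins (phase in [0, 1))."""
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.digitize(phase, bins) - 1          # bin index for each frame
    return np.array([error[idx == b].mean() for b in range(n_bins)])

def fit_sinusoid(binned):
    """Least-squares fit of a one-cycle sinusoid a*sin(2*pi*x) + b*cos(2*pi*x) + c
    to the binned means; returns amplitude, phase, and offset."""
    n = len(binned)
    x = (np.arange(n) + 0.5) / n                # bin centres as phase values
    X = np.column_stack([np.sin(2 * np.pi * x),
                         np.cos(2 * np.pi * x),
                         np.ones(n)])
    coef, *_ = np.linalg.lstsq(X, binned, rcond=None)
    amplitude = np.hypot(coef[0], coef[1])
    phase = np.arctan2(coef[1], coef[0])
    return amplitude, phase, coef[2]

# Synthetic demonstration: tracking error modulated sinusoidally over the gait cycle
rng = np.random.default_rng(0)
gait_phase = rng.uniform(0.0, 1.0, 5000)        # gait phase of each recorded frame
tracking_error = (1.0 + 0.3 * np.sin(2 * np.pi * gait_phase)
                  + rng.normal(0.0, 0.05, 5000))
binned = phase_binned_error(gait_phase, tracking_error)
amp, ph, offset = fit_sinusoid(binned)
```

The least-squares parameterisation with separate sine and cosine terms keeps the fit linear, so no iterative optimiser is needed to recover the amplitude and preferred phase of the modulation.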