Xin He, Jianying Bai, Min Bao, Tao Zhang, Yi Jiang; A virtual reality approach identifies flexible inhibition of motion aftereffects induced by head rotation. Journal of Vision 2019;19(10):301c. doi: https://doi.org/10.1167/19.10.301c.
As we move in space, the retinae receive motion signals from two causes: motion in the world and self-motion. Mounting evidence has shown that vestibular signals interact profoundly with visual motion processing. However, most contemporary methods lack portability and generality, and cannot take measurements during locomotion. Here we developed a virtual reality approach, combining a 3-space sensor (TSS-WL Sensor, YEI Technology, U.S.A.) with a head-mounted display (Sony HMZ-T3, 50°×28° visual angle, 1280×720 pixel resolution at 60 Hz), to quantitatively manipulate the causality between retinal motion and head rotations in the yaw plane. Using this system, we explored how self-motion affects visual motion perception, particularly the motion aftereffect (MAE). Subjects watched full-contrast gratings presented on the head-mounted display. The gratings had a spatial frequency of 0.13 cpd, and each subtended 24.7°×6.79°. The gratings drifted at the same velocity as the head rotations, with the drift direction either identical, opposite, or perpendicular to the direction of head rotation. We found that the MAE was significantly shorter when subjects' heads rotated (11.36 s) than when their heads were kept still (19.93 s) (t(10) = 4.71, p < 0.001, Cohen's d = 1.42; Experiment 1). This effect was present regardless of the drift direction of the gratings, and was also observed during passive head rotations. These findings suggest that adaptation to retinal motion is suppressed by head rotation. Because the suppression was also found during passive head movements, it must arise from visual-vestibular interactions rather than from efference copy signals. Such visual-vestibular interactions are more flexible than previously thought, since the suppression was observed even when the retinal motion direction was perpendicular to the head rotation.
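The core manipulation above, yoking the grating's drift velocity to the head's yaw velocity under the three direction conditions, could be sketched roughly as follows. This is a minimal illustration under our own assumptions (a generic yaw-rate input in deg/s and a phase-based grating renderer); the function name, parameters, and update rule are hypothetical, not the authors' implementation:

```python
def grating_phase_update(phase_deg, head_yaw_velocity_dps, dt,
                         direction="identical", spatial_freq_cpd=0.13):
    """Advance a drifting grating's phase so its retinal speed matches
    the head's yaw velocity (deg of visual angle per second).

    direction: 'identical', 'opposite', or 'perpendicular' to the
    head rotation, as in the three causality conditions.
    Returns (new_phase_deg, drift_axis).
    """
    gain = {"identical": 1.0, "opposite": -1.0, "perpendicular": 1.0}[direction]
    # phase change (deg) = speed (deg/s) * spatial freq (cyc/deg)
    #                      * 360 (deg of phase per cycle) * dt (s)
    dphase = gain * head_yaw_velocity_dps * spatial_freq_cpd * 360.0 * dt
    # In the perpendicular condition the drift axis is orthogonal to yaw.
    axis = "vertical" if direction == "perpendicular" else "horizontal"
    return (phase_deg + dphase) % 360.0, axis
```

For example, a 30 deg/s yaw in the "identical" condition advances the 0.13 cpd grating's phase by 30 × 0.13 × 360 ≈ 1404° of phase per second.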
Our work suggests that the virtual reality approach can support a wide range of applications for studying multisensory integration and interaction.
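As a quick consistency check on the reported statistics, the effect size for a paired design can be estimated from the t statistic as d = t/√n; with t(10) = 4.71 and therefore n = 11 subjects, this reproduces the reported Cohen's d (a standard conversion, assuming the test was paired; the abstract does not state the exact formula the authors used):

```python
import math

# For a paired-samples t-test, d = t / sqrt(n).
# Reported: t(10) = 4.71, so n = df + 1 = 11.
t_value = 4.71
n = 10 + 1
d = t_value / math.sqrt(n)
print(round(d, 2))  # 1.42, matching the reported effect size
```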