September 2019
Volume 19, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
A virtual reality approach identifies flexible inhibition of motion aftereffects induced by head rotation
Author Affiliations & Notes
  • Xin He
    Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
    Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
  • Jianying Bai
    Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
    Xinjiang Astronomical Observatory, Chinese Academy of Sciences, Urumqi, China
    University of Chinese Academy of Sciences, Beijing, China
  • Min Bao
    Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
    State Key Laboratory of Brain and Cognitive Science, Beijing, China
    Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
  • Tao Zhang
    State Key Laboratory of Brain and Cognitive Science, Beijing, China
    Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
  • Yi Jiang
    State Key Laboratory of Brain and Cognitive Science, Beijing, China
    Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
    Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China
Journal of Vision September 2019, Vol.19, 301c. doi:https://doi.org/10.1167/19.10.301c
Abstract

As we move through space, our retinae receive motion signals from two sources: motion of objects in the world, and our own self-motion. Mounting evidence has shown that vestibular signals interact profoundly with visual motion processing. However, most contemporary methods lack portability and generality, and cannot take measurements during locomotion. Here we developed a virtual reality approach, combining a 3-space sensor (TSS-WL Sensor, YEI Technology, U.S.A.) with a head-mounted display (Sony HMZ-T3, 50°×28° visual angle, 1280×720 pixel resolution at 60 Hz), to quantitatively manipulate the causality between retinal motion and head rotations in the yaw plane. Using this system, we explored how self-motion affects visual motion perception, particularly the motion aftereffect (MAE). Subjects viewed full-contrast gratings (spatial frequency 0.13 cpd, each subtending 24.7°×6.79°) presented on the head-mounted display. The gratings drifted at the same velocity as the head rotations, with drift directions either identical, opposite, or perpendicular to the direction of head rotation. We found that the MAE was significantly shorter (t(10) = 4.71, p < 0.001, Cohen's d = 1.42, Experiment 1) when subjects' heads rotated (11.36 s) than when their heads remained still (19.93 s). This effect was present regardless of the drift direction of the gratings, and was also observed during passive head rotations. These findings suggest that adaptation to retinal motion is suppressed by head rotations. Because the suppression was also found during passive head movements, it likely results from visual-vestibular interactions rather than from efference copy signals. Such visual-vestibular interactions are more flexible than previously thought, since the suppression was observed even when the retinal motion direction was perpendicular to the head rotations.
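The yoking of grating drift to head rotation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name and signature are hypothetical, and only the mapping (drift speed matched to yaw velocity, direction set per condition) reflects the design stated in the abstract.

```python
# Sketch of yoking grating drift to head yaw rotation (hypothetical helper).
# Per frame, yaw velocity would come from the orientation sensor; the returned
# vector gives the grating drift in the "identical", "opposite", or
# "perpendicular" condition described in the abstract.

def drift_velocity(yaw_velocity_deg_s: float, condition: str) -> tuple:
    """Return (horizontal, vertical) grating drift in deg/s for one frame."""
    speed = abs(yaw_velocity_deg_s)
    sign = 1.0 if yaw_velocity_deg_s >= 0 else -1.0
    if condition == "identical":      # drift in the direction of head rotation
        return (sign * speed, 0.0)
    if condition == "opposite":       # drift against the head rotation
        return (-sign * speed, 0.0)
    if condition == "perpendicular":  # drift orthogonal to the yaw plane
        return (0.0, speed)
    raise ValueError("unknown condition: %r" % condition)
```

On each display frame, the sensor's yaw velocity would be sampled and the returned vector used to advance the grating's phase, keeping retinal motion quantitatively tied to head motion.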
Our work suggests that this virtual reality approach can support a wide range of applications for studying multisensory integration and interaction.
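The effect size reported in the abstract is a paired-samples Cohen's d (mean of the within-subject differences divided by their standard deviation, consistent with d = t/√n for the reported t(10) = 4.71). A minimal sketch of that computation on hypothetical data, not the study's actual measurements:

```python
import math

def paired_cohens_d(a, b):
    """Cohen's d for paired samples: mean(differences) / sd(differences)."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d)

# Hypothetical MAE durations in seconds (illustration only):
still = [20.0, 18.5, 21.0, 19.0]      # head kept still
rotating = [11.0, 12.5, 10.0, 12.0]   # head rotating
d = paired_cohens_d(still, rotating)
```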
