Abstract
Distinguishing whether retinal motion is caused by self-motion or by external motion is essential for interacting effectively with the world, and the brain combines vestibular, proprioceptive, and reafferent motor signals to make this distinction. This study examines how visual and vestibular rotation signals interact in motion perception. In Study 1, seated participants wore a virtual reality headset and turned their heads right or left, or kept them stationary, while judging whether a visual stimulus translated left or right. Thresholds for this task were measured by varying the signal-to-noise ratio of the motion stimulus. When head-turn and retinal motion directions were congruent (both left or both right), thresholds did not differ from the stationary condition. However, when head-turn and retinal motion directions were incongruent (opposed), thresholds were elevated. In Study 2, the visual motion was presented at various angles between horizontal and vertical to characterize how the interaction depends on the angle between the head-turn and visual motion trajectories. Results replicated those of Study 1: opposed head-turn and retinal motion directions produced elevated thresholds, whereas congruent directions did not. There was no clear angular tuning for relative motion trajectory. These results may be explained by known visual-vestibular interactions in area MSTd, where, for rotation, only incongruently tuned visual-vestibular cells have been found. Such cells could serve to detect unwanted reafferent signals produced by head rotations and suppress them from perception, explaining the elevated thresholds for visual motion opposed to head-turn direction.