Abstract
Numerous perceptual demonstrations show that motion influences the spatial coding of object position. For example, the perceived position of a static object is shifted in the direction of motion contained within the object. We postulate that this motion-induced position shift (MIPS) results from a process of statistical inference in which position and motion estimates are derived by integrating noisy sensory inputs with the prediction of a forward model that reflects natural dynamics. The model predicts a broad range of known MIPS characteristics, including the dependence of MIPS on stimulus speed and position uncertainty and the asymptotic increase in MIPS with increasing stimulus duration. The model also predicts a novel visual illusion. To confirm this prediction, we presented translational motion (low-pass filtered white noise moving at 7.8°/s) within a stationary Gaussian envelope (σ = 0.5°). Crucially, the direction of translation changed at a constant and relatively slow rate (0.72 Hz). Subjects perceive this stimulus as moving along a circular path, with its perceived motion conspicuously lagging behind the direction of the motion within the object. To quantify this illusion, subjects adjusted the radius and phase of a circularly moving comparison disk to match the perceived motion path of the test stimulus. We found that the matching radius of motion (i.e., the perceived illusory circular motion) gradually increased from 0.6° to 1.0° as eccentricity increased from 7.9° to 22.6°. Notably, the phase of the perceived object motion lagged behind the motion in the test stimulus by almost a quarter of a cycle, increasing from 73° to 82° as eccentricity increased. These results are consistent with the behavior of a Kalman filter that integrates sensory signals over time to estimate the evolving position and motion of objects. The model provides a unifying account of perceptual interactions between motion and position signals.
Meeting abstract presented at VSS 2013
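Below is a minimal sketch of the kind of estimator the abstract describes: a Kalman filter with a constant-velocity forward model that fuses a noisy position signal (pinned at the static envelope location) with a noisy velocity signal (reporting the internal motion). The time step, noise covariances, and measurement model here are illustrative assumptions, not values from the study; only the 7.8°/s internal speed is taken from the abstract.

```python
import numpy as np

# Minimal constant-velocity Kalman filter for one spatial dimension.
# State x = [position, velocity]; both are measured with Gaussian noise.
# All parameter values below are illustrative, not taken from the study.

dt = 0.01                              # time step (s), assumed
A = np.array([[1.0, dt], [0.0, 1.0]])  # forward model: pos += vel * dt
Q = np.diag([1e-5, 1e-4])              # process noise covariance, assumed
H = np.eye(2)                          # observe both position and velocity
R = np.diag([0.25, 0.04])              # sensory (measurement) noise, assumed

rng = np.random.default_rng(0)
true_pos, true_vel = 0.0, 7.8          # static envelope; internal motion 7.8 deg/s
# For a static envelope with internal motion, the position signal stays at 0
# while the velocity signal reports the internal motion -- the conflict the
# model resolves by shifting the position estimate (the MIPS).

x = np.zeros(2)                        # posterior state estimate
P = np.eye(2)                          # posterior covariance

shifts = []
for t in range(300):
    # Predict: propagate the estimate through the forward model.
    x = A @ x
    P = A @ P @ A.T + Q
    # Noisy sensory measurements: static position, moving internal texture.
    z = np.array([true_pos + rng.normal(0.0, 0.5),
                  true_vel + rng.normal(0.0, 0.2)])
    # Update: blend prediction and measurement according to their reliabilities.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    shifts.append(x[0])

# The position estimate is pulled in the direction of the velocity signal
# and saturates with time, mirroring the asymptotic growth of the MIPS.
print(f"position shift after {len(shifts) * dt:.1f} s: {shifts[-1]:.3f} deg")
```

Under these assumptions, the position estimate drifts in the direction of the velocity signal and saturates over time, qualitatively matching the asymptotic growth of the MIPS; if the velocity input is instead rotated slowly in direction, the same temporal integration makes the estimated trajectory trace a circle whose phase lags the internal motion, as in the illusion reported above.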