Vision Sciences Society Annual Meeting Abstract | July 2013
Bayesian observer model of the motion induced position shift
Author Affiliations
  • Oh-Sang Kwon
    Center for Visual Science & Dept. of Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA 14627
  • Duje Tadin
    Center for Visual Science & Dept. of Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA 14627
    Department of Ophthalmology, University of Rochester, Rochester, NY, USA 14627
  • David Knill
    Center for Visual Science & Dept. of Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA 14627
Journal of Vision July 2013, Vol. 13, 451. doi: https://doi.org/10.1167/13.9.451
Citation: Oh-Sang Kwon, Duje Tadin, David Knill; Bayesian observer model of the motion induced position shift. Journal of Vision 2013;13(9):451. https://doi.org/10.1167/13.9.451.
Abstract

Numerous perceptual demonstrations show that motion influences the spatial coding of object position. For example, the perceived position of a static object is shifted in the direction of motion contained within the object. We postulate that this motion-induced position shift (MIPS) results from a process of statistical inference in which position and motion estimates are derived by integrating noisy sensory inputs with the prediction of a forward model that reflects natural dynamics. The model predicts a broad range of known MIPS characteristics, including MIPS' dependence on stimulus speed and position uncertainty and the asymptotic increase in MIPS with increasing stimulus duration. The model also predicts a novel visual illusion. To confirm this prediction, we presented translational motion (low-pass filtered white noise moving at 7.8°/s) within a stationary Gaussian envelope (σ = 0.5°). Crucially, the direction of translation changed at a constant and relatively slow rate (0.72 Hz). Subjects perceived this stimulus as moving along a circular path, such that its perceived motion conspicuously lagged behind the direction of the motion within the object. To quantify this illusion, subjects adjusted the radius and phase of a circularly moving comparison disk to match the perceived motion path of the test stimulus. We found that the matching radius of motion (i.e., perceived illusory rotation) gradually increased from 0.6° to 1.0° as the eccentricity increased from 7.9° to 22.6°. Notably, the phase of the perceived object motion lagged behind the motion in the test stimulus by almost a quarter of the cycle, increasing from 73° to 82° as eccentricity increased. These results are consistent with the behavior of a Kalman filter that integrates sensory signals over time to estimate the evolving position and motion of objects. The model provides a unifying account of perceptual interactions between motion and position signals.
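
The abstract characterizes the observer as a Kalman filter that combines noisy position and motion measurements with a constant-velocity forward model. The following Python sketch is an illustrative toy, not the authors' implementation: a stationary envelope supplies a noisy position measurement at 0° while the internal motion supplies a noisy velocity measurement of 7.8°/s; all noise and process parameters are arbitrary assumptions. Because the forward model integrates velocity into position, the position estimate drifts in the direction of the internal motion and then saturates, qualitatively reproducing MIPS.

# Minimal sketch of a 1-D Kalman-filter observer (illustrative parameters only).
import numpy as np

dt = 0.01                 # time step (s)
speed = 7.8               # internal motion speed (deg/s), from the abstract
sigma_pos = 1.0           # assumed position measurement noise (deg)
sigma_vel = 2.0           # assumed velocity measurement noise (deg/s)
q = 20.0                  # assumed process noise on velocity (deg/s^2)

# Constant-velocity forward model over the state x = [position, velocity]
A = np.array([[1.0, dt],
              [0.0, 1.0]])
Q = np.array([[0.0, 0.0],
              [0.0, (q * dt) ** 2]])
H = np.eye(2)             # both position and velocity are observed
R = np.diag([sigma_pos ** 2, sigma_vel ** 2])

x_hat = np.zeros(2)       # initial estimate: envelope at 0 deg, no motion
P = np.eye(2) * 10.0      # broad initial uncertainty

for step in range(int(1.0 / dt)):   # 1 s of viewing
    # Measurements: the envelope stays at 0 deg; the internal motion signals `speed`
    z = np.array([0.0 + np.random.randn() * sigma_pos,
                  speed + np.random.randn() * sigma_vel])

    # Predict with the forward model
    x_hat = A @ x_hat
    P = A @ P @ A.T + Q

    # Update with the noisy measurements
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_hat = x_hat + K @ (z - H @ x_hat)
    P = (np.eye(2) - K @ H) @ P

print(f"Estimated position after 1 s: {x_hat[0]:.2f} deg (shifted toward the motion)")

In this toy version, the prediction step pushes the position estimate forward by the estimated velocity on each frame while the stationary position measurement pulls it back, so the estimate settles at a finite offset in the motion direction, consistent with the asymptotic growth of MIPS with stimulus duration described above.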

Meeting abstract presented at VSS 2013
