October 2020
Volume 20, Issue 11
Open Access
Vision Sciences Society Annual Meeting Abstract  |   October 2020
Nature-inspired noise model accounts for a broad range of motion phenomena
Author Affiliations & Notes
  • Hyun-Jun Jeon
    Department of Human Factors Engineering, Ulsan National Institute of Science and Technology
  • Duje Tadin
    Center for Visual Science & Dept. of Brain and Cognitive Sciences, University of Rochester
    Department of Ophthalmology, University of Rochester
  • Oh-Sang Kwon
    Department of Human Factors Engineering, Ulsan National Institute of Science and Technology
  • Footnotes
    Acknowledgements  NRF-2018R1A2B6008959 to O.-S. K
Journal of Vision October 2020, Vol.20, 1033. doi:https://doi.org/10.1167/jov.20.11.1033
      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Background: According to the optimal object tracking model (Kwon, Tadin, & Knill, 2015, PNAS), the visual system integrates noisy sensory inputs with a forward model to estimate the positions and motions of moving objects. The model provides a unifying account of a wide range of visual phenomena involving the integration of motion and position signals. However, the illusory perception of flicker-defined motion (Mulligan & Stevenson, 2014, VSS) provides a counterexample: a flicker-defined stimulus appears to jump even when it moves continuously, which conflicts with the prediction of the optimal tracking model.

Model: The propagation noise of a tracking model represents the system's assumption about random changes in velocity, and it is conventionally assumed to follow a Gaussian distribution. Given that many natural movements follow fat-tailed distributions (Kleinberg, 2000, Nature), we built a model that assumes fat-tailed propagation noise and found that it predicts the jumping percept of flicker-defined motion.

Experiment: We asked participants to report the perceived jumping frequency across four object speeds (3.7, 5.5, 9.2, 11°/s), two pattern speeds (1X and 2X object speed), and two eccentricities (11, 16°) by adjusting the jumping frequency of a probe stimulus. Results show that the jumping frequency increases as (a) object speed increases, (b) relative pattern speed increases, or (c) eccentricity decreases. These results are consistent with the predictions of the fat-tailed model and rule out the possibility that the observed jumping percept is due to periodic attentional sampling. Moreover, the same model accounts for a range of other motion phenomena.

Conclusion: The fat-tailed propagation noise model accounts for a broader range of perceptual phenomena than the model based on the commonly used Gaussian noise. Evidently, the visual system assumes fat-tailed propagation noise, noise that closely mirrors the statistics of object movements observed in nature.
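The abstract does not reproduce the model itself, but the central contrast, Gaussian versus fat-tailed propagation noise, can be illustrated with a short simulation. The sketch below is not the authors' model; it uses a Student-t distribution with low degrees of freedom as one illustrative fat-tailed alternative, and the scale, degrees of freedom, and "jump" criterion are all assumptions chosen for demonstration. The point it shows is that fat-tailed noise injects occasional very large velocity changes, the kind of discrete jumps a tracker built on that assumption would expect.

```python
import numpy as np

rng = np.random.default_rng(0)

def propagation_noise(n, scale=1.0, fat_tailed=False, df=1.5):
    """Draw n samples of velocity-change (propagation) noise.

    Gaussian noise is the conventional choice; a Student-t with low
    degrees of freedom (df) stands in here as a fat-tailed alternative.
    """
    if fat_tailed:
        return scale * rng.standard_t(df, size=n)
    return scale * rng.standard_normal(n)

def jump_fraction(noise, k=5.0):
    """Fraction of steps whose magnitude exceeds k times the typical
    (median absolute) step -- a crude proxy for jump-like events."""
    return float(np.mean(np.abs(noise) > k * np.median(np.abs(noise))))

gauss = jump_fraction(propagation_noise(100_000))
fat = jump_fraction(propagation_noise(100_000, fat_tailed=True))
# Under fat-tailed noise, large jump-like velocity changes are far
# more frequent than under Gaussian noise of comparable typical size.
```

A tracker whose forward model draws propagation noise from the fat-tailed distribution assigns non-negligible probability to such large steps, so a continuously moving flickering stimulus can be most plausibly explained as a sequence of discrete jumps, consistent with the percept the abstract describes.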
