December 2022, Volume 22, Issue 14
Open Access
Vision Sciences Society Annual Meeting Abstract
Humans make non-ideal inferences about world motion
Author Affiliations & Notes
  • Tyler S Manning
    University of California, Berkeley
  • Jonathan W Pillow
    Princeton University
  • Bas Rokers
    NYU Abu Dhabi
  • Emily A Cooper
    University of California, Berkeley
  • Footnotes
    Acknowledgements  This project was funded by NIH (F32 EY032321 & T32 EY007043), NSF (Award #2041726), and Aspire (VRI20-10).
Journal of Vision December 2022, Vol.22, 4054. doi:https://doi.org/10.1167/jov.22.14.4054
Abstract

Our perception of the world is often biased in surprising ways. A classic example of such biases is the tendency to underestimate the speed of low-contrast stimuli. Bayesian ideal observer models can account for this bias by framing perception as a probabilistic process that combines uncertain sensory measurements (e.g., retinal motion signals) with learned statistics about the world (e.g., most objects are stationary or slow-moving). When making Bayesian inferences about world speed, an ideal observer should compute the posterior distribution using an appropriate combination of sensory measurements (in retinal coordinates) and a prior over world speed (in world coordinates). However, most experiments measuring speed biases do not differentiate between world and retinal speed. In fact, a model of retinal noise propagation predicts that an observer with a world-based Gaussian prior for slow speeds will exhibit greater contrast-dependent speed biases at far distances. We tested this prediction in two experiments. In Experiment 1, participants viewed two moving patterns at either a near (0.5 m) or far (1 m) distance and reported which pattern appeared faster as we modulated contrast and speed (retinal size was matched). The results were inconsistent with this model: participants showed no difference in biases across distances. However, these results may also be explained by a non-Gaussian prior. In Experiment 2, participants therefore sequentially viewed contrast-matched moving patterns at two different distances and reported which one moved faster. Irrespective of the specific prior shape, we expected increased uncertainty at the farther distance to bias speed judgments towards slower speeds compared to the nearer distance. The results were again inconsistent with this prediction. These results are at odds with Bayesian accounts of speed perception in 3D space, and are consistent with observers either underestimating the depth difference between the stimuli or disregarding distance-dependent estimation noise.
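The prediction tested in Experiment 1 follows from standard Gaussian cue-combination arithmetic. The sketch below (Python; not taken from the abstract) illustrates how a world-coordinate slow-speed prior combined with distance-scaled retinal measurement noise produces a larger contrast-dependent slowdown at the farther viewing distance. The noise levels, prior width, inverse contrast-noise relationship, and small-angle geometry are illustrative assumptions, not the authors' fitted model.

import numpy as np

def posterior_world_speed(retinal_speed_deg_s, distance_m, contrast,
                          prior_sd_world=0.2):
    """Posterior mean world speed (m/s) for a Gaussian slow-speed prior.

    The measurement is made in retinal coordinates, with noise that grows
    as contrast falls (an illustrative assumption). Under small-angle
    geometry, world speed is roughly retinal angular speed (rad/s) times
    viewing distance, so retinal noise propagates to world coordinates
    scaled by distance.
    """
    # Hypothetical contrast-dependent measurement noise in retinal coordinates (deg/s).
    sigma_retinal = 2.0 / contrast

    # Map the measurement and its noise into world coordinates (m/s).
    deg_to_rad = np.pi / 180.0
    likelihood_mean = retinal_speed_deg_s * deg_to_rad * distance_m
    likelihood_sd = sigma_retinal * deg_to_rad * distance_m

    # Gaussian prior centered on zero world speed ("most things are slow").
    # The posterior mean shrinks the likelihood mean toward zero; more noise
    # means more shrinkage, i.e., a slower percept.
    shrinkage = prior_sd_world**2 / (prior_sd_world**2 + likelihood_sd**2)
    return shrinkage * likelihood_mean

# Same retinal speed and contrast pair at a near and a far distance:
for d in (0.5, 1.0):
    hi = posterior_world_speed(8.0, d, contrast=0.8)
    lo = posterior_world_speed(8.0, d, contrast=0.1)
    print(f"distance {d} m: perceived slowdown of low- vs. high-contrast "
          f"stimulus = {100 * (1 - lo / hi):.1f}%")

With these hypothetical parameters, the relative slowdown of the low-contrast pattern is noticeably larger at 1 m than at 0.5 m, which is the distance-dependent bias that Experiment 1 looked for and did not find.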
