Abstract
When estimating distance from ocular convergence, humans make systematic errors such that perceived distance progressively underestimates true physical distance. Similarly, when estimating the shape of an object from binocular visual cues, object depth is progressively underestimated with increasing distance. Misestimates of distance are thought to be key to explaining this lack of shape constancy. Here we present a Bayesian model of distance perception from ocular convergence that predicts these biases given the assumption that the brain is trying to estimate the most likely distance to have produced the measured, noisy ocular convergence signal. We show that there is a lawful relationship between the magnitude of noise in the ocular convergence signal and the magnitude of perceptual bias (more noise results in greater bias). Furthermore, using a database of laser scans of natural objects, we generate prior probabilities of distances in the environment and show how these priors are distorted by the process of distance estimation, such that the perceptual prior based on distance estimates is not necessarily equal to the objectively measured distance prior in the world. This has important implications across multiple disciplines for defining perceptual priors based on direct statistical measurements of the environment.
Meeting abstract presented at VSS 2017
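The following is a minimal sketch of the kind of estimator the abstract describes: a Bayesian observer that infers fixation distance from a noisy vergence-angle measurement. The inter-ocular distance, the Gaussian noise levels, the exponential stand-in prior, and the posterior-mean read-out are illustrative assumptions, not values from the abstract; in the actual model the distance prior is derived from laser scans of natural scenes.

```python
import numpy as np

# Sketch of a Bayesian distance-from-vergence estimator.
# All specific values below (IPD, noise levels, prior shape, posterior-mean
# read-out) are assumptions for illustration only.

IPD = 0.065                              # assumed inter-ocular distance (m)
D_GRID = np.linspace(0.2, 20.0, 4000)    # candidate fixation distances (m)


def vergence(d):
    """Vergence angle (radians) required to fixate a point at distance d (m)."""
    return 2.0 * np.arctan(IPD / (2.0 * d))


def estimate_distance(measured_vergence, sigma, prior):
    """Posterior-mean distance given one noisy vergence measurement."""
    # Gaussian likelihood of the measurement under each candidate distance.
    likelihood = np.exp(-0.5 * ((measured_vergence - vergence(D_GRID)) / sigma) ** 2)
    posterior = likelihood * prior
    posterior /= posterior.sum()
    return np.sum(D_GRID * posterior)


# Stand-in prior favouring nearer distances (placeholder for the
# scene-statistics prior described in the abstract).
prior = np.exp(-D_GRID / 3.0)
prior /= prior.sum()

rng = np.random.default_rng(0)
for sigma_deg in (0.05, 0.2, 0.5):       # vergence noise levels (degrees)
    sigma = np.radians(sigma_deg)
    for d_true in (1.0, 4.0, 8.0):
        # Average the estimate over many noisy vergence samples.
        samples = vergence(d_true) + sigma * rng.standard_normal(500)
        d_hat = np.mean([estimate_distance(s, sigma, prior) for s in samples])
        print(f"sigma={sigma_deg:4.2f} deg  true={d_true:4.1f} m  mean estimate={d_hat:5.2f} m")
```

Under these assumptions, the estimator behaves as the abstract claims: estimates of far distances are pulled toward the prior (underestimation), and the size of that pull grows as the vergence noise increases, because a noisier likelihood constrains the posterior less and the prior dominates.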