Abstract
With a weakened binocular disparity signal for depth (e.g., when the viewing distance of a target object is increased), traditional probabilistic models of depth perception predict an increase in estimation noise. An alternative theory of deterministic depth perception, termed Intrinsic Constraint, instead predicts a decrease in the slope of the function relating distal shape to estimated depth (the gain), while estimation noise remains unchanged. Here, we investigated the relationship between the strength of the disparity signal and the estimated depth by modulating dot brightness in random-dot stereograms (RDSs). In the first experiment, participants viewed a cylindrical curved surface defined by an RDS and adjusted a two-dimensional curved probe to match the surface profile. The stimulus varied across three within-subject factors: simulated depth (six levels), orientation (horizontal or vertical), and dot brightness (bright or dim). Results indicated that a surface defined by a dimmer RDS was perceived as shallower. More importantly, the gain for the dimmer RDS also decreased while response variability remained the same. In other words, participants were less sensitive to changes in depth while retaining the same level of precision. In a set of follow-up 2IFC tasks, we again found evidence that dimmer RDSs yielded decreased estimated depths without systematic increases in just-noticeable differences. Taken together, the results from both experiments indicate that the strength of the disparity signal modulates the magnitude of the estimated depth but not the estimation noise. These findings support deterministic depth perception, in which the strength of the depth signal directly maps to the magnitude of the perceived depth. More generally, our results indicate that care should be taken when selecting RDS properties such as dot brightness, to prevent unintended biases in perceived shape.