Abstract
Binocular information is an important cue to depth, and is encoded by disparity-tuned cells in the visual cortex. The responses of these cells depend on their phase-disparity tuning and on the interocular correlation of the stimulus within their receptive field. This has allowed us to develop computational models of how binocular information can be used to estimate depth. One important stimulus manipulation in this context is the anticorrelation of stimuli between the two eyes, in which the contrast polarity of elements in one eye is reversed. This results in an inversion of the disparity tuning function of binocular neurons. It is typically assumed that this should result in a reversal of their preferred disparity, and of the direction of depth perceived in anticorrelated stimuli. The effect of anticorrelation on individual neurons will, however, depend on their position and phase tuning, and on the spatial frequency of their disparity tuning function. By modelling the disparity tuning functions of populations of neurons, we confirm that the preferred disparity does tend to be in opposite directions for correlated and anticorrelated stimuli, but with wide variation in the relationship between the two. We also show that a model of disparity-tuning functions in the human visual system predicts a reversal in the direction of depth perceived in anticorrelated stimuli, but a greatly reduced ability to discriminate depth magnitude. We further show how the influence of the second-order disparity channel predicts the perception of forward depth in some anticorrelated stimuli, such as simple Gaussian blobs and step edges. We conclude that modelling the disparity tuning properties of real cortical neurons, and how their responses are combined in the estimation of disparity, allows us to make clear quantitative predictions about the perception of depth, and about the roles of phase and position encoding of disparity in this process.
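The abstract does not give the model's equations, but the core idea can be illustrated with a minimal sketch: treat a cell's disparity tuning curve as a Gabor function of stimulus disparity, and model anticorrelation as a sign inversion of that curve. The preferred disparity of the inverted curve then depends on the cell's phase tuning and the spatial frequency of its tuning function, which is the source of the variation described above. The function name, parameter names, and values below are hypothetical choices for illustration, not the authors' implementation.

```python
import numpy as np

def gabor_tuning(d, d0=0.0, phase=0.0, freq=2.0, sigma=0.25):
    """Gabor-shaped disparity tuning curve: Gaussian envelope centred on the
    position disparity d0, with a carrier of spatial frequency `freq`
    (cycles/deg) and phase disparity `phase` (radians)."""
    envelope = np.exp(-(d - d0) ** 2 / (2 * sigma ** 2))
    carrier = np.cos(2 * np.pi * freq * (d - d0) + phase)
    return envelope * carrier

disparities = np.linspace(-1.0, 1.0, 2001)   # stimulus disparity, degrees

# Compare preferred disparities for correlated and anticorrelated stimuli
# across cells with different phase-disparity tuning.
for phase in (0.0, np.pi / 2, np.pi):
    correlated = gabor_tuning(disparities, phase=phase)
    anticorrelated = -correlated             # anticorrelation inverts the tuning curve
    pref_corr = disparities[np.argmax(correlated)]
    pref_anti = disparities[np.argmax(anticorrelated)]
    print(f"phase={phase:4.2f} rad  preferred disparity: "
          f"correlated={pref_corr:+.3f} deg, anticorrelated={pref_anti:+.3f} deg")
```

Running this sketch shows the pattern summarised in the abstract: the anticorrelated preferred disparity generally lies on the opposite side of zero from the correlated one, but the exact relationship between the two shifts with the cell's phase tuning and carrier frequency rather than being a simple sign flip.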