Abstract
In humans and other primates, stereoscopic depth perception often relies on binocular disparity information. Primary visual cortical area V1 computes absolute disparity, the difference in the retinal position of an image relative to the fovea in the left and right eyes. Cortical area V2, however, computes relative disparity (Thomas et al., 2002), the difference between the absolute disparities of two visible features in the visual field (Cumming and DeAngelis, 2001; Cumming and Parker, 1999). Psychophysical experiments have shown that absolute disparity can change across a visual scene without affecting relative disparity. Unlike absolute disparities, relative disparities can remain unaffected by vergence eye movements and by the distance of the visual stimuli from the observer. The neural computations carried out from V1 to V2 to compute relative disparity remain unknown. A neural model is proposed that illustrates how primates compute relative disparity from absolute disparity. The model describes how specific circuits within the laminar connectivity of V1 and V2 naturally compute relative disparity as a special case of a general laminar cortical design. These circuits have elsewhere been shown to play multiple roles in visual perception, including contrast gain control, selection of perceptual groupings, and attentional focusing (Grossberg, 1999). This explanation links relative disparity to other visual functions and thereby suggests new ways to psychophysically and neurobiologically test its mechanistic basis.
Supported in part by CELEST, an NSF Science of Learning Center (NSF SBE-0354378) and by the SyNAPSE program of DARPA (HR0011-09-3-0001 and HR0011-09-C-0011).