Abstract
When human observers view 1D gratings at low contrast in both eyes, increasing the contrast in one eye decreases stereoacuity. This puzzling stereocontrast paradox does not occur with 2D stimuli such as random-dot patterns, and little is known about its neural basis. Here, we report the effect of interocular contrast differences on disparity-selective neurons recorded extracellularly from V1 of awake, fixating macaques.
We presented 2D (random-dot, RDS) and 1D (line, RLS) stereograms under four contrast conditions: high contrast in both eyes (HH), low contrast in both eyes (LL), high contrast in one eye and low contrast in the other (HL), and its reverse (LH). We compared disparity modulation in each of the latter three conditions to that in HH. Modeling predicts that these should be linearly related, with the slope equal to the product of the monocular contrasts and any contrast gain terms.
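A minimal sketch of this linear prediction, in notation not used in the abstract itself (c denotes monocular contrast, g a multiplicative gain term; the exact model form is an assumption here): for a condition $X \in \{LL, LH, HL\}$,
$$ M_X(d) \approx s_X\, M_{HH}(d), \qquad s_X = \frac{c_L^{X}\, c_R^{X}}{c_L^{HH}\, c_R^{HH}}\, g_X, $$
where $M_X(d)$ is the disparity-modulated response at disparity $d$, $c_L$ and $c_R$ are the left- and right-eye contrasts, and $g_X = 1$ in the absence of contrast gain control.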
When contrast was low in both eyes, slopes were lower for RDS than for RLS [LL: mean slope = 0.14 for RDS, 0.67 for RLS; t-test, p < 0.001]. This suggests stronger contrast gain control for RLS. For RDS, slopes were systematically higher, and thus disparity tuning relatively stronger, when contrast was low in only one eye than when it was low in both, as expected in the absence of gain control [RDS: 0.14 for LL vs. 0.39 for the mean of LH and HL; p < 0.01]. For RLS, however, slopes were similar whether contrast was low in one eye or in both [RLS: 0.67 for LL vs. 0.60 for the mean of LH and HL; p > 0.05].
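As an illustration of the "no gain control" expectation under the sketch above (the actual stimulus contrasts are not stated in this abstract): with $g_X = 1$ and low and high contrasts $c_l$ and $c_h$,
$$ s_{LL} = \left(\tfrac{c_l}{c_h}\right)^{2}, \qquad s_{LH} = s_{HL} = \tfrac{c_l}{c_h}, $$
so the mixed-contrast slope should equal the square root of the LL slope. The RDS slopes roughly follow this relation ($\sqrt{0.14} \approx 0.37$ vs. the observed 0.39), whereas the near-equal RLS slopes (0.67 vs. 0.60) would require gain terms that largely offset the contrast reduction.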
The stereocontrast paradox seen in gratings may partly reflect the V1 gain control seen with RLS, which tends to keep disparity modulation amplitude similar whether contrast is low in one eye or in both. With RDS, weaker gain control in V1 produced stronger disparity modulation when contrast was low in only one eye than when it was low in both. This may explain why the stereocontrast paradox has not been reported with RDS.