Abstract
We sought to use eye movements as a readout of the selection of one of two spatially superimposed surfaces. We trained a monkey to saccade to a 2.75 deg radius circular aperture within which one or two superimposed patterns of dots, which appeared as transparent surfaces, translated to the left or right. On each trial, the monkey maintained gaze on a central fixation spot while the aperture appeared in the periphery (7 deg eccentricity). After a variable period of time, the fixation spot disappeared, and the monkey made a saccade to the aperture. Reward was delivered if gaze remained within the aperture for 200 ms. We examined how smooth eye movements during this 200 ms period varied as a function of the luminance contrast of the surfaces. With a single low-contrast surface, the eyes moved slowly in the direction of surface motion. Gain increased with contrast, saturating at 70–90% of surface velocity. The function relating gain to contrast shifted to the right when a second, lower-contrast surface was added moving in the opposite direction. Equating the contrast of the two surfaces nulled pursuit. We consider these results in the context of three models: a vector averaging model, a winner-take-all model, and a weighted vector averaging model in which the motion-selective neurons that drive the pursuit system are highly sensitive to contrast, and motion in a neuron's null direction results in a rightward shift of its contrast response function.
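As a sketch of how the weighted vector averaging model could be formalized (the Naka–Rushton form and the symbols R_max, n, c_50, and k are illustrative assumptions, not quantities reported above): smooth eye velocity is a weighted average of the two surface velocities, with each weight given by a contrast response function whose semisaturation contrast is shifted rightward by the contrast of the surface moving in the neuron's null direction.

\[
  \dot{e} \;=\; \frac{w_{1} v_{1} + w_{2} v_{2}}{w_{1} + w_{2}},
  \qquad
  w_{i} \;=\; \frac{R_{\max}\, c_{i}^{\,n}}{c_{i}^{\,n} + \bigl(c_{50} + k\, c_{j}\bigr)^{n}},
  \quad j \neq i,
\]

where \(\dot{e}\) is eye velocity, \(v_{i}\) and \(c_{i}\) are the velocity and contrast of surface \(i\), and \(k\, c_{j}\) captures the rightward shift of the contrast response function produced by the opposing surface. Under this assumed form, setting \(k = 0\) reduces the expression to a contrast-weighted vector average, whereas a winner-take-all account would instead take eye velocity to follow the surface with the larger weight.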
Supported by R01-EY13802-01.