Abstract
Visual input would be of little use if not accompanied by knowledge of eye position; indeed, it is the combination of these signals that allows the brain to localise and interact with objects meaningfully. Eye-position signals have been observed throughout visual cortex – including the primary visual area (V1) – but little is known about how well such signals represent the eye during fixation and across eye movements. We examined the static and dynamic representation of eye position in parafoveal V1 of an alert macaque by recording extracellular activity as the animal performed sequences of fixations, saccades, and smooth-pursuit eye movements. To probe population codes for eye position, we recorded from several neurons simultaneously using a chronically implanted multielectrode array. Throughout the task, neurons were stimulated by a flickering binary noise stimulus (75 Hz). Consistent with previous reports, we found that many neurons showed substantial and systematic modulation of visually evoked activity by the position of the eyes in the orbit (i.e. ‘gain fields’). We used our knowledge of these tuning functions to decode eye position from the neural data on a trial-by-trial basis, thereby allowing an assessment of the reliability of the eye-position representation. We found that the position of the eyes could be predicted to within a few degrees of visual angle, even using as few as two V1 neurons. We also found that the representation of eye position was updated rapidly after the offset of saccades (within 50–100 ms). These findings point to a highly reliable and nimble representation of eye position in primary visual cortex that could support fluid and accurate visuomotor behaviour during normal exploratory vision.
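The decoding idea described above can be illustrated with a minimal sketch. This is not the authors' analysis pipeline: it assumes, purely for illustration, that each neuron's gain field is a Gaussian function of horizontal eye position and that spiking is independent Poisson, and then recovers eye position by maximum likelihood over a grid. All parameter values and function names (`expected_rates`, `decode_ml`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical gain fields: each neuron's visually evoked rate is scaled
# by a Gaussian function of horizontal eye position (degrees).
centers = np.array([-10.0, 8.0])     # preferred eye positions (illustrative)
widths = np.array([12.0, 10.0])      # gain-field widths, deg (illustrative)
base_rates = np.array([30.0, 25.0])  # peak firing rates, spikes/s (illustrative)

def expected_rates(eye_pos):
    """Mean firing rate of each neuron at a given eye position."""
    return base_rates * np.exp(-0.5 * ((eye_pos - centers) / widths) ** 2)

def decode_ml(spike_counts, dt=0.2, grid=np.linspace(-20, 20, 401)):
    """Maximum-likelihood eye-position estimate from spike counts,
    assuming independent Poisson spiking in a window of dt seconds."""
    rates = np.array([expected_rates(x) for x in grid]) * dt  # (grid, neuron)
    log_lik = (spike_counts * np.log(rates) - rates).sum(axis=1)
    return grid[np.argmax(log_lik)]

# Simulate one trial at a true eye position of +5 deg, then decode it.
true_pos = 5.0
counts = rng.poisson(expected_rates(true_pos) * 0.2)
estimate = decode_ml(counts)
```

With only two broadly tuned neurons the single-trial estimate is noisy, which is why reading out from a simultaneously recorded population, as in the study, sharpens the estimate to within a few degrees.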
Meeting abstract presented at VSS 2012