Vision Sciences Society Annual Meeting Abstract  |   July 2013
Implications of Microscopic Eye Movements for Retinal Encoding
Author Affiliations
  • John George
    Los Alamos National Laboratory
  • Jennifer Schei
    Los Alamos National Laboratory
  • Peter Schultz
    New Mexico Consortium
  • Garrett Kenyon
    Los Alamos National Laboratory
    New Mexico Consortium
Journal of Vision July 2013, Vol.13, 1341. doi:https://doi.org/10.1167/13.9.1341
© ARVO (1962-2015); The Authors (2016-present)

Abstract

The eye is constantly moving. In addition to voluntary saccades, "fixational" eye movements (drift and tremor on the scale of individual photoreceptors) overlap the frequencies of oscillations observed in LGN and cortex. Such eye movements are implicated in the perception of fine spatial detail and other perceptual tasks. We set out to explore the mechanisms underlying the perceptual consequences of microscopic eye movements using computational models and retinal electrophysiology. We postulate that eye movements temporally modulate the visual response, and that precisely timed, spatially coherent activation of neural populations enhances the detection and learning of visual features and may encode relationships between features. We employed a model of the outer retina developed by van Hateren (expanded to a 32x32 array of photoreceptors, with electrical coupling between horizontal cells), coupled to spiking models of the inner retina and primary visual cortex, implemented in our package Petavision. We employed a range of stimuli: illuminated points, noisy Gabor gratings, and still images. Stimulus patterns were randomly displaced. Movement orthogonal to the orientation of a grating blurred its structure and reduced the contrast encoded by spike rate, ultimately obliterating spatial detail. In contrast, orthogonal movements enhanced reconstructions based on temporal covariance. This effect might enhance the detection of extended spatial features encoded by synchronized firing. In electrophysiological studies of isolated retina (tiger salamander), we simulated eye movements by jittering the visual stimulus; responses were recorded with multi-electrode arrays. As predicted, synthetic eye movements elicited a strong periodic response at the jitter frequency from individual cells in the salamander retina. Our models suggest that microscopic eye movements might enhance the representation of features in visual imagery encoded by correlation within a population of neurons. These predictions are testable by functional optical imaging and electrode-array measurements in isolated retina, or by psychophysical investigation of the detection of perceptual targets perturbed by microscopic displacements.
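The contrast between rate-based and covariance-based readouts under jitter can be illustrated with a toy simulation. The following is a minimal NumPy sketch, not the authors' Petavision model: it replaces the van Hateren outer-retina dynamics with a rectified center-surround filter and Poisson spiking, and every parameter except the 32x32 array size (jitter amplitude, gain, spontaneous rate, grating period, number of time steps) is an illustrative assumption. It shows how random displacements orthogonal to a grating blur the time-averaged rate map, while the temporal correlation structure induced by the shared jitter preserves the grating's spatial layout.

import numpy as np

rng = np.random.default_rng(0)

N = 32            # 32x32 photoreceptor array, as in the abstract
T = 400           # number of time steps (assumed)
JITTER_SD = 1.5   # jitter amplitude in pixels (assumed)
GAIN = 20.0       # spikes per unit drive (assumed)

# Vertical grating; jitter is applied horizontally, i.e. orthogonal to its
# orientation. The period (8 px) divides N, so wrap-around shifts are exact.
x = np.arange(N)
stimulus = np.tile(0.5 + 0.5 * np.sin(2 * np.pi * x / 8.0), (N, 1))

def center_surround(img):
    """Toy stand-in for outer-retina filtering: rectified difference between
    each pixel and the mean of its two horizontal neighbors."""
    surround = (np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1)) / 2.0
    return np.clip(img - surround, 0.0, None)

# Simulate responses to a randomly jittered stimulus.
rates = np.empty((T, N * N))
for t in range(T):
    dx = int(round(rng.normal(0.0, JITTER_SD)))   # random horizontal shift
    rates[t] = center_surround(np.roll(stimulus, dx, axis=1)).ravel()
spikes = rng.poisson(GAIN * rates + 0.1)          # Poisson spikes + spontaneous rate

# Readout 1: time-averaged rate map. Jitter smears the grating, so the
# contrast carried by mean spike rate drops relative to the unjittered drive.
drive0 = center_surround(stimulus)
rate_map = spikes.mean(axis=0).reshape(N, N)
print("contrast (std/mean), no jitter      : %.3f" % (drive0.std() / drive0.mean()))
print("contrast (std/mean), jittered rates : %.3f" % (rate_map.std() / rate_map.mean()))

# Readout 2: temporal correlation between one cell and every other cell.
# The shared jitter co-modulates cells lying on the same grating bar, so the
# correlation map retains the grating's spatial structure even as the rate
# map blurs -- a covariance-based readout of the kind described above.
center = (N // 2) * N + N // 2
corr_map = np.corrcoef(spikes.T)[center].reshape(N, N)
print("spatial modulation of correlation map (std): %.3f" % corr_map.std())

Under these assumptions, the contrast of the jittered rate map falls well below that of the unjittered drive, while the correlation map remains spatially modulated at the grating period, in the spirit of the covariance-based reconstruction result reported in the abstract.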

Meeting abstract presented at VSS 2013
