Abstract
It is a reasonable hypothesis that vertebrate retinae, having evolved in natural environments, are optimized to recognize and report spatial correlations in natural scenes, whose spatial amplitude spectra are roughly proportional to 1/k, where k is spatial frequency. Assuming that the activity of retinal ganglion cells reflects the amount of information they report about the image, we hypothesized that responses to images having natural phase spectra and spatial amplitude spectra proportional to 1/k^n, where n is variable, should be maximal when n = 1. To test this, we recorded action potentials from salamander retinal ganglion cells and explored the effects of changing the amplitude and/or phase spectra of the stimulus images. With stationary stimuli having random phase spectra, a natural amplitude spectrum (n = 1) evoked more spikes from ganglion cells than an “unnatural” one (n ≠ 1). With moving images, a natural phase spectrum elicited more spikes than a random phase spectrum. Moving images having natural phase spectra likewise evoked more spikes when their amplitude spectra were natural (n = 1) than when they were unnatural. These findings support our hypothesis that the retina is optimized to recognize and report the spatial correlations in natural scenes.
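For concreteness, the sketch below is a hypothetical illustration, not code from this study; it assumes NumPy, and the function names synth_random_phase and impose_amplitude are ours. It shows one standard way to construct the two stimulus classes described above: random-phase images with amplitude spectra proportional to 1/k^n, and natural-phase images whose amplitude spectra are replaced by 1/k^n.

```python
# Hypothetical sketch of the stimulus construction described in the abstract;
# assumes NumPy. Function names are ours, not from the paper.
import numpy as np

def _target_amplitude(shape, n):
    """Amplitude spectrum proportional to 1/k^n, with the DC term zeroed."""
    fy = np.fft.fftfreq(shape[0])[:, None]   # vertical spatial frequencies
    fx = np.fft.fftfreq(shape[1])[None, :]   # horizontal spatial frequencies
    k = np.hypot(fx, fy)                     # radial spatial frequency
    k[0, 0] = 1.0                            # avoid division by zero at DC
    amp = 1.0 / k**n
    amp[0, 0] = 0.0                          # zero mean luminance
    return amp

def synth_random_phase(size=256, n=1.0, seed=None):
    """Random-phase image with amplitude spectrum ~ 1/k^n (n = 1 is 'natural')."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal((size, size))  # white noise: flat amplitude, random phase
    spec = np.fft.fft2(white)
    spec *= _target_amplitude(white.shape, n) / np.abs(spec)  # impose 1/k^n, keep phase
    img = np.fft.ifft2(spec).real
    return (img - img.mean()) / img.std()      # normalize contrast

def impose_amplitude(image, n=1.0):
    """Keep an image's (e.g. natural) phase spectrum; replace its amplitude with 1/k^n."""
    spec = np.fft.fft2(image)
    spec = _target_amplitude(image.shape, n) * np.exp(1j * np.angle(spec))
    img = np.fft.ifft2(spec).real
    return (img - img.mean()) / img.std()
```

Varying n in either function yields the stimulus families whose evoked spike counts the abstract compares; under the hypothesis, spike counts should peak at n = 1 in both phase conditions.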