Abstract
Stochastic Resonance (SR) is a phenomenon of optimization by noise in nonlinear systems [1]. In essence, a weak signal that is undetectable below a certain threshold becomes detectable in the presence of an optimal amount of noise. SR has attracted great interest in biology, since noise may enhance information processing in the nervous system, in particular in the visual system [2–4]. For example, the contrast threshold of a subject decreases significantly, reaching a minimum (resonance), when Gaussian noise is added. The SR phenomenon has also been observed in binocular rivalry, when different stimuli are presented to each eye, and in depth perception. In this work, we study the effect of fluctuations in the eye's optical aberrations as a source of noise in the retinal image. To account for the temporal dynamics of the eye, the wave aberration was modeled as the sum of an average reference wavefront and noise. Series of point-spread functions were calculated from these aberrations and convolved with an object (sine-wave gratings) to simulate the retinal image. The improvement in image quality produced by adding extra defocus in the presence of higher-order aberrations may be viewed as a process of SR. Potential improvements in spatial vision due to optical noise, and the occurrence of processes mediated by SR, are investigated.
Supported by Grant FIS04-01018 (Ministerio de Educación y Ciencia, Spain).
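The simulation pipeline outlined in the abstract (wave aberration = reference wavefront + noise, PSFs computed from the aberrations, retinal image obtained by convolution with a sine-wave grating) can be illustrated with a minimal sketch. This is not the authors' code; it assumes the standard Fourier-optics relation in which the PSF is the squared modulus of the Fourier transform of the generalized pupil function, and it restricts the noise to random fluctuations of a Zernike defocus term. All numerical values (wavelength, defocus, noise amplitude, grating frequency) are illustrative, not taken from the study.

```python
import numpy as np

N = 256                      # samples across the pupil-plane grid
wavelength = 0.555e-6        # m, illustrative photopic wavelength

# Pupil-plane coordinates normalized to the pupil radius
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)
R = np.sqrt(X**2 + Y**2)
pupil = (R <= 1.0).astype(float)

def defocus_aberration(c_defocus):
    """Wave aberration (in meters) for a Zernike defocus term Z(2,0)."""
    return c_defocus * np.sqrt(3.0) * (2.0 * R**2 - 1.0)

def psf_from_wavefront(W):
    """PSF as the squared modulus of the FFT of the generalized pupil function."""
    field = pupil * np.exp(1j * 2.0 * np.pi * W / wavelength)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
    return psf / psf.sum()

def retinal_image(obj, psf):
    """Simulate the retinal image by convolving the object with the PSF (via FFT)."""
    return np.real(np.fft.ifft2(np.fft.fft2(obj) * np.fft.fft2(np.fft.ifftshift(psf))))

# Object: a sine-wave grating (cycles per image width, illustrative)
cycles = 8
grating = 0.5 + 0.5 * np.sin(2.0 * np.pi * cycles * (X + 1.0) / 2.0)

# Reference wavefront plus random defocus fluctuations ("optical noise")
rng = np.random.default_rng(0)
c_ref = 0.1e-6               # m, illustrative static defocus coefficient
noise_rms = 0.05e-6          # m, illustrative fluctuation amplitude
images = []
for _ in range(20):          # series of PSFs over time
    c = c_ref + rng.normal(0.0, noise_rms)
    psf = psf_from_wavefront(defocus_aberration(c))
    images.append(retinal_image(grating, psf))

# Time-averaged simulated retinal image; its contrast can be compared with the
# noise-free case to look for resonance-like behavior as noise_rms is varied.
mean_image = np.mean(images, axis=0)
contrast = (mean_image.max() - mean_image.min()) / (mean_image.max() + mean_image.min())
print("grating contrast of mean image:", contrast)
```

In such a sketch, sweeping `noise_rms` and plotting the resulting image contrast (or a detectability metric) against noise amplitude is the natural way to look for a resonance peak, i.e., an intermediate noise level that maximizes image quality.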