Abstract
The electroretinogram (ERG) is an objective measurement of the electrical response of retinal neurons, including photoreceptors, to light. The basic function of photoreceptors is to convert photons into neural signals. The present study compared the capability of different ERG techniques to detect a reduction in the number of photons detected by photoreceptors with that of a functional assessment of photon noise, which quantifies the number of photons detected by photoreceptors. The number of photons detected by photoreceptors was artificially reduced using a 0.6 log unit neutral density filter, which attenuated light intensity by a factor of 4. Three ERG techniques (full-field, pattern and multifocal) were used to measure photoreceptor electrical responses under a baseline condition and a reduced light intensity (i.e., neutral density filter) condition. The latency and amplitude of different ERG waves were analyzed (full-field: a-wave, b-wave, 30 Hz flicker; pattern: P50, N95; multifocal: P1 for each of the 5 rings). Photon noise was derived from two contrast sensitivity measurements under specific conditions (presence and absence of noise, 0.5 cycles per degree, 2 Hz) using a motion direction discrimination task. The capability of each measurement (photon noise and the various amplitudes and latencies of the ERG techniques) to discriminate between the baseline and reduced light intensity conditions was quantified using a ROC analysis. No ERG parameter was more effective than photon noise at discriminating between the two conditions (area under the ROC curve: 0.96 for photon noise, at most 0.84 for the various ERG parameters). We found no evidence that any ERG parameter would be more useful than the functional measurement of photon noise in detecting a reduced number of photons detected by photoreceptors.