Abstract
Retinal images are the primary source of information for visual perception, but they are subject to blur from the eyes' imperfect optics. Improving the eyes' optics should reduce blur and improve vision, but this was not the case for stereovision. When tested with three-dimensional waves under adaptive optics, human participants with naturally good optics instead performed worse. For 1 cpd waves, mean stereoacuity was 18.61 arcsec through their own optics but 33.34 arcsec with the improved optics; likewise, it was 28.22 vs 53.82 arcsec for 2 cpd waves and 46.30 vs 85.50 arcsec for 3 cpd waves. This suggests that the visual system adapts to one's own blur patterns to maximize stereovision. Further evidence came from replacing each participant's optics with someone else's, so that everyone experienced the same blur patterns. Stereoacuity was poorer in everyone compared with the person from whom the optics came (1 cpd: 53.64 arcsec [mean across replacements] vs 8.46 arcsec; 2 cpd: 101.5 vs 12.91 arcsec; 3 cpd: 182.34 vs 16.76 arcsec). Did this adaptation occur in monocular or binocular channels? We swapped the optics between the two eyes and compared performance with the condition in which each eye received its own native blur. Swapping made the blur unfamiliar to the monocular channels but not to the binocular channels, which still received the same combined blur and interocular difference. Stereoacuity was worse with the swap, indicating that adaptation had occurred in monocular channels (1 cpd: 28.4 arcsec with the native optics vs 43.4 arcsec swapped; 2 cpd: 53.6 vs 123.0 arcsec; 3 cpd: 144.5 vs 221.9 arcsec). The detriment grew with the interocular difference in blur (1 cpd: r = 0.91, p = 0.01; 2 cpd: r = 0.97, p < 0.01; 3 cpd: r = 0.91, p < 0.01), as expected, because swapping matters only if the blur patterns of the two eyes differ. These results indicate a component of adaptation to the blur patterns of the individual eyes that occurs before binocular combination.
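For illustration only, a minimal sketch of how the reported relationship between interocular blur difference and the swap-induced stereoacuity detriment could be quantified with a Pearson correlation; the variable names and data below are hypothetical and do not come from the study, and the paper's actual analysis is not reproduced here.

```python
# Hypothetical sketch: correlate the interocular difference in blur with the
# stereoacuity detriment (swapped minus native optics) for one spatial frequency.
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-participant values (not the study's data):
# interocular blur difference (arbitrary units) and detriment in arcsec.
interocular_difference = np.array([0.1, 0.2, 0.3, 0.5, 0.6, 0.8])
stereoacuity_detriment = np.array([10.0, 25.0, 40.0, 70.0, 85.0, 120.0])

# Pearson correlation, analogous in form to the r and p values reported above.
r, p = pearsonr(interocular_difference, stereoacuity_detriment)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```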