Abstract
Retinal-image blur occurs when the eye is focused at one distance and an object lies at another. Vision scientists and computer-graphics engineers often wish to create images that reproduce such depth-dependent blur, but the conventional rendering method is incorrect because it does not incorporate the human eye's optical aberrations. We developed a rendering method that, by incorporating these aberrations, creates displayed images that produce more natural retinal images. Here we concentrate on one aberration: longitudinal chromatic aberration (LCA). LCA creates different chromatic effects in the retinal image for objects farther vs. nearer than current focus. We asked whether one can drive eye focus (accommodation) by incorporating LCA into the rendering of objects meant to appear farther or nearer than current focus. Observers viewed textured planes monocularly in three conditions: 1) Real Change, in which stimulus focal distance actually changed; 2) Defocus Only, in which focal distance did not change but blur rendering did (each color primary treated the same); 3) Defocus + LCA (which we call ChromaBlur), in which focal distance did not change but blur rendering did (appropriately for each primary). In one experiment, real or simulated distance changed sinusoidally over time. Accommodative gains and phases were essentially identical for Real Change and Defocus + LCA. There was no response with Defocus Only. In another experiment, we opened the accommodative loop by having observers view through a pinhole aperture. There was no response for Real Change, but large, overshooting responses for Defocus + LCA. Finally, we measured responses in dichromats. They did not respond reliably to Defocus + LCA because of their reduced ability to sample chromatic signals. Our results show that appropriate rendering of chromatic aberration is very effective in driving accommodation, at least in color-normal observers. This motivates new techniques for blur rendering.
Meeting abstract presented at VSS 2018
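To make the per-primary idea concrete, the sketch below shows one plausible way to render "Defocus + LCA" blur: each color channel is blurred with the defocus it would experience in the eye, i.e., the simulated scene defocus plus an LCA offset for that primary. This is an illustrative approximation, not the authors' implementation; the function name chromablur, the LCA offsets in LCA_OFFSET_D, the pupil size, the pixels-per-degree value, and the use of a Gaussian kernel in place of the eye's true point-spread function are all assumptions made for the example.

```python
# Minimal sketch of LCA-aware (per-primary) defocus blur.
# Assumed, illustrative constants -- not values from the abstract.
import numpy as np
from scipy.ndimage import gaussian_filter

# Assumed relative defocus (diopters) of each display primary with respect
# to the in-focus (green) wavelength, roughly approximating the eye's LCA.
LCA_OFFSET_D = {"r": +0.3, "g": 0.0, "b": -0.8}

def chromablur(image_rgb, defocus_d, pupil_mm=4.0, px_per_deg=60.0):
    """Blur each color channel according to simulated defocus plus LCA.

    image_rgb : float array of shape (H, W, 3), values in [0, 1]
    defocus_d : simulated defocus of the object in diopters
                (positive = farther than current focus, negative = nearer)
    """
    out = np.empty_like(image_rgb)
    for i, ch in enumerate("rgb"):
        # Total defocus experienced by this primary.
        d_total = defocus_d + LCA_OFFSET_D[ch]
        # Geometric blur-circle diameter (radians) ~= pupil diameter (m) x
        # |defocus| (diopters); a Gaussian with sigma of half that diameter
        # is used as a crude stand-in for the eye's point-spread function.
        blur_diam_deg = np.degrees(abs(d_total) * pupil_mm * 1e-3)
        sigma_px = 0.5 * blur_diam_deg * px_per_deg
        out[..., i] = gaussian_filter(image_rgb[..., i], sigma=sigma_px)
    return out
```

Under these assumptions, setting all entries of LCA_OFFSET_D to zero reduces the sketch to the "Defocus Only" condition, in which every primary is blurred identically.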