Abstract
Blur occurs naturally when the eye is focused at one distance and an object is present at another distance. Vision scientists and computer-graphics (CG) engineers often wish to create display images that reproduce such depth-dependent blur, but the rendering method they conventionally use is incorrect for that purpose. That method appropriately takes into account the scene geometry, pupil size, and focus distances, but it does not take into account the optical aberrations of the person who will view the resulting display images. We developed a method that, by incorporating the viewer's optics, yields displayed images that produce retinal images close to those in natural viewing. Here we concentrate on the effects of longitudinal chromatic aberration. This aberration creates different chromatic effects in the retinal image for objects farther versus nearer than the current focus distance; our method handles this correctly. Observers viewed scenes with depth-dependent blur, presented monocularly in three conditions: 1) a plane at one physical focal distance with various amounts of blur, rendered conventionally, simulating planes at different distances; 2) a plane at one focal distance with blur rendered using our method, creating natural depth-dependent chromatic effects in the retina; 3) multiple planes at different focal distances, so that blur was created naturally in the subject's eye. We measured accommodation and perception of depth order in these conditions. Accommodation was not driven by the conventional blur-rendering method, but it was driven with equal accuracy by real and simulated changes in focal distance when the simulated changes were based on our method. We also found that depth-order judgments were random with conventional rendering of blur, but were quite accurate when the blur was created naturally in the eye or with our rendering method. Thus, creating display images that produce retinal images like those that occur naturally enables accommodation and facilitates depth perception.
Meeting abstract presented at VSS 2017
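The geometric-optics relations behind depth-dependent blur and longitudinal chromatic aberration can be summarized as follows. This is a minimal sketch using standard thin-lens approximations, not the authors' exact rendering formulation; the symbols z_f, z_o, p, and D_LCA are introduced here for illustration.

\[
\Delta D(\lambda) \;=\; \left(\frac{1}{z_f} - \frac{1}{z_o}\right) + D_{\mathrm{LCA}}(\lambda),
\qquad
\beta(\lambda) \;\approx\; p\,\lvert \Delta D(\lambda) \rvert,
\]

where \(z_f\) is the eye's focus distance and \(z_o\) the object distance (in meters), \(p\) is pupil diameter (in meters), \(\Delta D(\lambda)\) is the effective defocus in diopters at wavelength \(\lambda\), and \(\beta(\lambda)\) is the approximate angular diameter of the blur circle in radians. \(D_{\mathrm{LCA}}(\lambda)\) is the eye's chromatic difference of focus relative to the in-focus wavelength: positive at short wavelengths and negative at long wavelengths, spanning roughly 2 D across the visible spectrum. Because the term in parentheses is negative for objects nearer than focus and positive for objects farther, the chromatic term reduces short-wavelength blur for near objects and long-wavelength blur for far objects; this is the near/far chromatic difference that conventional rendering omits and that the abstract's method reproduces.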