September 2019
Volume 19, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2019
Real-time blur with chromatic aberration drives accommodation and depth perception
Author Affiliations & Notes
  • Steven A Cholewiak
    Optometry & Vision Science, University of California, Berkeley
  • Peter Shirley
    NVIDIA
  • Morgan McGuire
    NVIDIA
  • Martin S Banks
    Optometry & Vision Science, University of California, Berkeley
Journal of Vision September 2019, Vol.19, 15b. doi:https://doi.org/10.1167/19.10.15b
      Steven A Cholewiak, Peter Shirley, Morgan McGuire, Martin S Banks; Real-time blur with chromatic aberration drives accommodation and depth perception. Journal of Vision 2019;19(10):15b. https://doi.org/10.1167/19.10.15b.
Abstract

In computer-generated imagery and vision science, defocus blur is often rendered to simulate objects closer or farther than the focal plane. But depth-dependent optical effects, such as longitudinal chromatic aberration (LCA), are typically not implemented in a physically correct manner. Recent evidence has shown that incorporating LCA into rendered images produces a powerful cue for driving accommodation and depth perception. Implementing physically correct LCA, however, is computationally expensive, so practical implementations of defocus blur with LCA require approximations to run in real time. We investigated whether a real-time implementation of blur with LCA, using approximate blur kernels and a simplified treatment of occlusions, can still drive accommodation and improve perceived depth compared to conventional methods that do not incorporate LCA. We measured accommodative responses with an autorefractor while participants viewed stimuli at various real and simulated distances. For real changes, a focus-tunable lens altered the optical distance of the stimulus, producing a blurred retinal image with the observer’s natural aberrations. For simulated changes, optical distance was constant and rendered image content changed. These stimuli were generated using 1) conventional defocus blur with all color channels treated the same; 2) defocus blur and LCA with each channel treated correctly; or 3) approximate defocus and LCA using truncated 2D Gaussian blur kernels within a real-time game engine. Simulated changes in defocus with LCA (both physically accurate and real-time approximations) drove accommodation as well as real changes did. In a second experiment, participants viewed images containing two planes, one partially occluding the other, and made relative depth judgments. Incorporating physically correct rendered LCA or its real-time approximation improved depth ordering relative to conventional techniques that do not incorporate LCA.
Chromatic information is thus important for accommodation and depth perception and can be implemented in real-time applications.
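The real-time approximation described above — blurring each color channel with a truncated 2D Gaussian kernel whose width depends on that channel's defocus — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the LCA offsets, the pixels-per-diopter gain, and the function names are assumptions chosen for the example, loosely motivated by the fact that the human eye focuses short wavelengths nearer than long wavelengths.

```python
import numpy as np

def gaussian_kernel(sigma, truncate=3.0):
    """1D Gaussian kernel, truncated at `truncate` standard deviations."""
    radius = max(1, int(truncate * sigma + 0.5))
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def blur_channel(channel, sigma):
    """Separable 2D Gaussian blur of a single color channel."""
    if sigma <= 0:
        return channel.copy()
    k = gaussian_kernel(sigma)
    tmp = np.apply_along_axis(lambda row: np.convolve(row, k, mode="same"), 1, channel)
    return np.apply_along_axis(lambda col: np.convolve(col, k, mode="same"), 0, tmp)

# Illustrative chromatic defocus offsets (diopters) relative to green.
# These values are assumptions for the sketch, not measured parameters.
LCA_OFFSET_D = {"R": +0.25, "G": 0.0, "B": -0.45}

def render_lca_blur(rgb, defocus_green_d, blur_gain=4.0):
    """Blur each channel with a sigma proportional to its channel-specific
    defocus |defocus_green + LCA offset|. `blur_gain` (pixels per diopter)
    is a made-up scaling constant standing in for the display geometry."""
    out = np.empty_like(rgb)
    for i, ch in enumerate("RGB"):
        sigma = blur_gain * abs(defocus_green_d + LCA_OFFSET_D[ch])
        out[..., i] = blur_channel(rgb[..., i], sigma)
    return out
```

For an object on the simulated focal plane for green (`defocus_green_d = 0`), the green channel passes through unblurred while red and blue receive small, unequal blurs — the chromatic signature that, per the abstract, drives accommodation even in this approximate form.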

Acknowledgement: NSF Research Grant BCS-1734677, Corporate University Research, Intel Labs 