September 2024
Volume 24, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
Bringing color into focus: accommodative state varies systematically with the spectral content of light
Author Affiliations
  • Benjamin M Chin
    University of California, Berkeley
  • Martin S Banks
    University of California, Berkeley
  • Derek Nankivil
    Johnson & Johnson Vision Care
  • Austin Roorda
    University of California, Berkeley
  • Emily A Cooper
    University of California, Berkeley
Journal of Vision September 2024, Vol.24, 729. doi:https://doi.org/10.1167/jov.24.10.729
Abstract

Humans bring the visual world into focus by changing the power of the lens in their eye until the retinal image is sharp. Light in the natural environment, however, can almost never be focused perfectly because it contains multiple wavelengths that refract differently through the lens. How does the visual system determine which wavelength to bring into best focus? We compared possible strategies used to focus light containing different proportions of long and short wavelengths. Under a ‘switching’ strategy, an observer would accommodate (focus their lens) to whichever wavelength has the highest luminance. In contrast, under a ‘weighting’ strategy, the accommodative response would be a weighted sum of the luminances across visible wavelengths. We measured the dynamic accommodative responses of eight participants with an autorefractor recording at 30 Hz. On each trial, an observer viewed a three-letter word (24 arcmin per letter) against a black background on an OLED display for six seconds. Halfway through the trial, a focus-adjustable lens generated a step change in the optical distance of the stimulus, synchronized to a change in stimulus color (the proportion of long- and short-wavelength subpixels). We then fit participants’ accommodative changes with the ‘switching’ and ‘weighting’ models separately. The Akaike Information Criterion showed that for all but one subject, the likelihood of the data was greater under the ‘weighting’ model. Increasing the luminance of long wavelengths caused the eye to accommodate nearer, while increasing the luminance of short wavelengths caused it to accommodate farther. This is remarkable because it implies that people may bring into best focus wavelengths that are weak or even absent from the visual stimulus. Using these data, we aim to develop an image-computable model that can predict how the eye accommodates to the complex spectral and spatial patterns encountered during natural vision.
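The two candidate strategies described in the abstract can be sketched in a few lines. This is an illustrative toy model, not the authors' fitted implementation: the function names, the two-primary stimulus, and the dioptric focal distances assigned to each primary are placeholder assumptions; only the AIC formula (AIC = 2k − 2 ln L) is standard.

```python
# Toy sketch of the 'switching' vs. 'weighting' accommodation strategies.
# All numeric values below are hypothetical placeholders, not the study's data.

def switching_demand(lums, focal_dist):
    """Accommodate to whichever wavelength has the highest luminance."""
    peak = max(lums, key=lums.get)
    return focal_dist[peak]

def weighting_demand(lums, focal_dist):
    """Accommodate to the luminance-weighted average focal distance."""
    total = sum(lums.values())
    return sum(lums[w] * focal_dist[w] for w in lums) / total

def aic(n_params, log_likelihood):
    """Akaike Information Criterion: lower values indicate a better model."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical red-heavy stimulus: long wavelengths focus at a nearer
# dioptric distance than short wavelengths (placeholder values in diopters).
lums = {"red": 0.7, "blue": 0.3}
focal_dist = {"red": 2.5, "blue": 1.5}

print(switching_demand(lums, focal_dist))   # follows the red primary: 2.5 D
print(weighting_demand(lums, focal_dist))   # weighted blend: 2.2 D
```

Consistent with the abstract's finding, the weighting model predicts an intermediate accommodative state that shifts nearer as long-wavelength luminance increases, whereas the switching model jumps discretely to the dominant primary.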
