August 2023
Volume 23, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2023
Object-based computations for color constancy
Author Affiliations & Notes
  • Laysa Hedjar
    Justus-Liebig-Universität Gießen, Germany
  • Raquel Gil Rodríguez
    Justus-Liebig-Universität Gießen, Germany
  • Matteo Toscani
    Bournemouth University, UK
  • Dar'ya Guarnera
    Norwegian University of Science and Technology, Gjøvik, Norway
  • Giuseppe Claudio Guarnera
    Norwegian University of Science and Technology, Gjøvik, Norway
    University of York, UK
  • Karl R. Gegenfurtner
    Justus-Liebig-Universität Gießen, Germany
  • Footnotes
    Acknowledgements  Supported by ERC Advanced Grant Color 3.0 (project no. 884116)
Journal of Vision August 2023, Vol.23, 5100. doi:

Color constancy has been shown to be high in nearly natural situations. Yet studying the cues used by human observers is very difficult when dealing with real objects and illuminants. We use virtual reality to construct realistic, immersive environments that are easily manipulable in real time. We created an outdoor forest scene and an indoor office scene under five colored illuminants and selectively silenced individual cues to measure their impact on color constancy. On each trial, observers chose which of five test objects appeared most similar to an achromatic object previously shown. Objects ranged from a zero-constancy tristimulus match to a perfect reflectance match and beyond (0-133% constancy). Similar to Kraft and Brainard (1999), we investigated local context, maximum flux, and mean color as cues. To eliminate local context, we placed a constant, rose-colored leaf under each test object. For maximum flux, we ensured that the brightest object in the scene remained constant across illuminations. To preserve the mean reflected light, we either shifted the reflectances of all objects or added new objects in the color direction opposite the illumination change. With all cues present, color constancy indices (CCIs) were high for both indoor and outdoor scenes (>75%). Silencing the local context and maximum flux mechanisms lowered CCIs slightly. In line with previous findings, constancy was severely impaired when the average color was held constant by modifying object reflectances. However, when new objects were introduced instead, there was only a modest reduction, with several observers showing no impairment at all. All results were consistent across both scenes. Our results show that VR can be a valuable tool for studying color constancy, and that the computations underlying the gray-world mechanism do not simply operate on a pixel-by-pixel basis. Rather, observers seem to segment the scene and use the parts that are particularly diagnostic of the illumination.
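The abstract does not spell out how the constancy indices are computed. A common formulation in the color constancy literature expresses the observer's match as a fraction of the distance between the zero-constancy (tristimulus) match and the perfect reflectance match, which also explains how values beyond 100% (overconstancy) can arise. The sketch below illustrates that convention; the specific CIELAB coordinates are made up for illustration and are not taken from this study.

```python
import math

def constancy_index(observer_match, tristimulus_match, reflectance_match):
    """Constancy index CI = 1 - b/a, where
    b = distance from the observer's match to the perfect reflectance match,
    a = distance from the zero-constancy (tristimulus) match to the
        perfect reflectance match.
    1.0 means perfect constancy, 0.0 means none; values > 1.0 correspond
    to overconstancy (the >100% end of the abstract's 0-133% test range).
    Inputs are 3-vectors in a roughly uniform color space such as CIELAB.
    """
    a = math.dist(tristimulus_match, reflectance_match)
    b = math.dist(observer_match, reflectance_match)
    return 1.0 - b / a

# Illustrative (hypothetical) CIELAB coordinates after an illuminant shift:
tristimulus = (60.0, 20.0, 10.0)   # 0% constancy endpoint
reflectance = (60.0, 0.0, 0.0)     # 100% constancy endpoint
observer    = (60.0, 4.0, 2.0)     # observer's chosen match

print(round(constancy_index(observer, tristimulus, reflectance), 2))  # 0.8
```

Here the observer's match lies 80% of the way from the tristimulus match to the reflectance match, i.e. a CCI of 80%, comparable to the >75% indices reported above when all cues were present.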

