December 2022
Volume 22, Issue 14
Open Access
Vision Sciences Society Annual Meeting Abstract
Asymmetric matching of color and gloss across different lighting environments
Author Affiliations & Notes
  • Takuma Morimoto
    University of Giessen
    University of Oxford
  • Arash Akbarinia
    University of Giessen
  • Katherine Storrs
    University of Giessen
  • Hannah E. Smithson
    University of Oxford
  • Karl R. Gegenfurtner
    University of Giessen
  • Roland W. Fleming
    University of Giessen
  • Footnotes
Acknowledgements  The authors thank Wiebke Siedentop for assisting with data collection. TM is supported by a Sir Henry Wellcome Postdoctoral Fellowship from the Wellcome Trust (218657/Z/19/Z) and a Junior Research Fellowship from Pembroke College, University of Oxford.
Journal of Vision December 2022, Vol.22, 3279.

      Takuma Morimoto, Arash Akbarinia, Katherine Storrs, Hannah E. Smithson, Karl R. Gegenfurtner, Roland W. Fleming; Asymmetric matching of color and gloss across different lighting environments. Journal of Vision 2022;22(14):3279.

      © ARVO (1962-2015); The Authors (2016-present)

Natural lighting environments can differ dramatically depending on location, time, or weather conditions. Here we tested the degree to which humans can simultaneously judge the color and gloss of objects under diverse lighting environments. We selected 12 image-based environmental illuminations captured under different weather conditions (sunny and cloudy) and locations (indoor and outdoor) and applied two manipulations to each illumination to expand the diversity: (i) rotating the chromatic distribution by 90 degrees to generate chromatically atypical environments, and (ii) scrambling the phase in the frequency domain to make the lighting geometry unnatural. Under each of the 36 environments, we used a physics-based renderer to generate a test image from a single 3D mesh of a random everyday object that was assigned a random color and specularity. In a different lighting environment, we separately rendered a comparison image containing a bumpy object. In each trial, the test and comparison images were presented side by side on a computer screen, and participants adjusted the color (in lightness, hue, and chroma) and specularity of the comparison object until it appeared to be made of the same material as the test, shown in a different lighting environment. Results showed that hue settings were highly correlated with ground-truth values in the natural and phase-scrambled lighting conditions, but accuracy worsened in the chromatically atypical environments. Chroma and lightness constancy were generally poor, but these failures correlated with simple image statistics such as the mean chroma and mean lightness over the object region. Gloss constancy was limited, especially under diffuse lighting (e.g., cloudy environments). Constancy errors were highly consistent across participants.
These results suggest that although color and gloss constancy hold well in many situations, some properties of lighting environments, such as chromatic unfamiliarity or diffuseness, can hamper stable visual judgements of material properties.
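The two illumination manipulations described in the abstract can be sketched in code. The following is a minimal NumPy illustration, not the authors' actual pipeline: it assumes the illumination map is encoded in CIELAB with channels (L*, a*, b*), and the function names and the per-channel treatment of phase scrambling are illustrative assumptions.

```python
import numpy as np

def rotate_chromaticity(lab_img, degrees=90.0):
    """Rotate the chromatic (a*, b*) distribution of a CIELAB image.

    lab_img is assumed to have shape (H, W, 3) with channels (L*, a*, b*).
    A 90-degree rotation of the chromatic plane remaps hues (e.g. reddish
    toward yellowish), yielding a chromatically atypical illumination
    while leaving lightness untouched.
    """
    theta = np.deg2rad(degrees)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    out = lab_img.copy()
    ab = lab_img[..., 1:3].reshape(-1, 2)          # flatten chromatic plane
    out[..., 1:3] = (ab @ rot.T).reshape(lab_img.shape[:2] + (2,))
    return out

def phase_scramble(channel, rng=None):
    """Scramble the Fourier phase of a single 2D image channel.

    Keeps the amplitude spectrum but randomizes phase, destroying the
    natural geometric structure of the lighting while roughly preserving
    its spatial-frequency content.
    """
    rng = np.random.default_rng(rng)
    spectrum = np.fft.fft2(channel)
    random_phase = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, channel.shape))
    scrambled = np.fft.ifft2(np.abs(spectrum) * random_phase)
    return np.real(scrambled)
```

Taking the real part after the inverse FFT is a simplification; an exact real-valued reconstruction would require Hermitian-symmetric random phases, but for illustrating the manipulation this shortcut suffices.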

