October 2020
Volume 20, Issue 11
Open Access
Vision Sciences Society Annual Meeting Abstract  |   October 2020
Color constancy in a Virtual Reality environment
Author Affiliations
  • Raquel Gil Rodriguez
    Justus Liebig University
  • Matteo Toscani
    Justus Liebig University
  • Dar'ya Guarnera
    Norwegian University of Science and Technology
  • Giuseppe Claudio Guarnera
    Norwegian University of Science and Technology
    University of York
  • Florian Bayer
    Justus Liebig University
  • Karl Gegenfurtner
    Justus Liebig University
Journal of Vision October 2020, Vol.20, 1226. doi:https://doi.org/10.1167/jov.20.11.1226
Raquel Gil Rodriguez, Matteo Toscani, Dar'ya Guarnera, Giuseppe Claudio Guarnera, Florian Bayer, Karl Gegenfurtner; Color constancy in a Virtual Reality environment. Journal of Vision 2020;20(11):1226. https://doi.org/10.1167/jov.20.11.1226.

© ARVO (1962-2015); The Authors (2016-present)

Previous work has shown that the more immersed the observer is in a scene, and the more natural the task, the more stable color appearance is across changes in lighting. In the real world, color constancy can reach levels of near perfection. Although some experiments have used such real environments, this is extremely difficult and time-consuming, and it is often impossible to change particular aspects of the world. Recent developments in Virtual Reality technology provide an opportunity to overcome these limitations. The experiments were performed using the Unreal Engine and an HTC Vive Pro head-mounted display (HMD). We carefully calibrated the two OLED displays in the HMD, ensuring linearity and additivity of the display primaries. We used Autodesk 3ds Max to create two photorealistic virtual environments. An indoor scene showed a typical office environment with two light sources, one above and one behind the participant. An outdoor scene showed a natural landscape with the sun as the main source of light. The observers’ task (N=12) was to adjust the color of a test object until it appeared gray to them under five different illuminants. We used a Radiant X29 camera colorimeter for independent colorimetric verification of every adjustment. We found that observers adapted their settings to the different illumination conditions. In the indoor environment, observers reached high levels of constancy, comparable to previous experiments. In the outdoor scene, we surrounded the test object, a rock, either with water or with grass. The observers’ settings shifted accordingly, towards blue for the water and towards green for the grass. Our experiments show that it is entirely feasible to color-calibrate virtual environments and thus achieve full control over the physics of the scene while maintaining the highest level of ecological validity.
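The additivity test mentioned in the abstract can be sketched as follows. This is a minimal illustration only: a display is additive if the light emitted by the three channels sums linearly, so the measured XYZ tristimulus values of the red, green, and blue primaries at full drive should add up to the measured white point. The measurement values below are hypothetical placeholders, not data from the study.

```python
def additivity_error(xyz_r, xyz_g, xyz_b, xyz_white):
    """Maximum relative deviation between the channel sum and white.

    Additivity holds when XYZ_R + XYZ_G + XYZ_B ~= XYZ_white for each
    of the X, Y, Z components.
    """
    summed = [r + g + b for r, g, b in zip(xyz_r, xyz_g, xyz_b)]
    return max(abs(s - w) / w for s, w in zip(summed, xyz_white))

# Hypothetical colorimeter readings (X, Y, Z) of each primary at full
# drive and of full white -- illustrative numbers, not measured data.
xyz_r = (41.2, 21.3, 1.9)
xyz_g = (35.8, 71.5, 11.9)
xyz_b = (18.0, 7.2, 95.2)
xyz_white = (95.0, 100.0, 109.0)

err = additivity_error(xyz_r, xyz_g, xyz_b, xyz_white)
print(f"max relative additivity error: {err:.4f}")
```

In practice one would compare the error against a small tolerance (e.g. a few percent); a large deviation would indicate channel interactions that break the linear RGB-to-XYZ model assumed by colorimetric calibration.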

