Abstract
Previous work has shown that the more immersed the observer is in a scene, and the more natural the task, the more stable color appearance is across changes in lighting. In the real world, color constancy can reach levels of near perfection. Although some experiments have used such real environments, doing so is extremely difficult and time consuming, and it is most often impossible to change particular aspects of the world. Recent developments in Virtual Reality technology provide an opportunity to overcome these limitations.
The experiments were performed using the Unreal Engine and an HTC Vive Pro head-mounted display (HMD). We carefully calibrated the two OLED displays in the HMD, ensuring linearity and additivity of the display primaries. We used Autodesk 3ds Max to create two photorealistic virtual environments. An indoor scene showed a typical office environment with two light sources, one above and one behind the participant. The outdoor scene showed a natural landscape with the sun as the main source of light. The observers’ task (N=12) was to adjust the color of a test object until it appeared gray to them under five different illuminants. We used a Radiant X29 camera colorimeter for independent colorimetric verification of every adjustment.
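As an aside, the additivity check mentioned above amounts to verifying that the measured tristimulus values of full-field white equal the component-wise sum of the individually measured primaries. A minimal sketch of such a check is below; the XYZ values are illustrative placeholders, not measurements from the actual HMD, and the tolerance is an assumption.

```python
import numpy as np

# Hypothetical XYZ measurements (illustrative values, not real HMD data):
# each primary at full drive, and full-field white.
xyz_red = np.array([41.2, 21.3, 1.9])
xyz_green = np.array([35.8, 71.5, 11.9])
xyz_blue = np.array([18.0, 7.2, 95.0])
xyz_white = np.array([95.1, 100.2, 108.9])

def additivity_error(r, g, b, w):
    """Maximum relative deviation of measured white from the sum of
    the primaries. For an additive display this should be near zero."""
    predicted = r + g + b
    return np.max(np.abs(predicted - w) / w)

err = additivity_error(xyz_red, xyz_green, xyz_blue, xyz_white)
print(f"max relative additivity error: {err:.3%}")
```

A comparable per-channel check against a tolerance (here, say, 1%) can be run after any change to the rendering pipeline to confirm the calibration still holds.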
We found that observers adapted their settings to the different illumination conditions. In the indoor environment, observers reached high levels of constancy, comparable to those reported in previous experiments. In the outdoor scene, we surrounded the test object, a rock, either by water or by grass. The observers’ settings shifted accordingly, towards blue for the water and towards green for the grass.
Our experiments show that it is entirely feasible to color-calibrate virtual environments and thus achieve full control over the physics of the scene while maintaining the highest level of ecological validity.