Abstract
We used a virtual reality (VR) head-mounted display to study the relationship between environmental illumination and the perception of distance. We programmed the VR display with Unity 3D in C# and set up a three-dimensional world in which two objects could be seen at a distance under different environmental conditions (illumination, object color, etc.). The subjects' task was to align these two objects by moving one of them with a mouse. We measured the movement of the mouse, the timing, and the precision of the final alignment of the two objects, and we report differences in behavior across illumination conditions. Different illumination levels in virtual reality correspond to different levels of signal in the retina; at the same time, VR guarantees perfectly identical geometrical information and perspective conditions. Differences in the estimated depth of the two objects are reflected in the subjects' ability to align them. This difference is expected, but our original VR setup allows us, for the first time, to measure the subjects' actions and timing precisely enough for quantitative analysis. Overall, in this research we report for the first time a quantitative analysis of the effect of illumination and object color on three-dimensional perception by the visual system. Moreover, we also measured head position to evaluate and study the effort subjects made to estimate distance. This research can be useful for the fundamental understanding of how the environment influences depth perception in the visual system, and could have applications in civil engineering, automotive design, and military science.
Meeting abstract presented at VSS 2018