Abstract
An abundance of visual cues contributes to the perception of distance, in both physical and virtual environments. Here, two environments were created in virtual reality to test the impact of scene clutter on distance judgements in near-space. The two environments were (1) sparse, in which only the stimuli were visible; and (2) cluttered, in which additional objects were added to the scene. It was predicted that there would be under-constancy of distance perception in both environments, but smaller errors in the cluttered environment than in the sparse environment, owing to the additional visual cues available. In an equidistance task, 21 participants were required to match the distance of a target stimulus to that of a reference stimulus. Scene lighting, comprising both ambient and directional sources, was held consistent across environments, and performance was measured in terms of accuracy of estimates (centimetre error) and precision (standard deviation of responses). As expected, under-constancy of distance was found in both environments. However, there was no significant difference in the accuracy or precision of distance estimates between the environments. These results show that, in our environments, no additional improvement in performance was gained from the presence of scene clutter, over and above what was achieved on the basis of the geometrical and lighting cues associated with the target object and table surface.