Abstract
There is evidence that the human visual system prioritises information that appears near to the observer. For instance, orientation discrimination thresholds are lower when targets appear close than when they appear far away. This improvement in precision has been elicited using a variety of tasks in which targets were superimposed onto a 2D Ponzo illusion. If the effect reflects the increased likelihood of interacting with objects in near space, it should also be evident for stereoscopic, 3D stimuli. To assess this, a virtual reality headset was used to display stimuli containing multiple sources of depth information. In the first experiment, participants (n = 25) were asked to discriminate the relative orientation of line pairs presented at two distances in an Oculus Quest 2 headset. The retinal size of the stimuli was fixed to remove any modulation of performance due to resolution or visibility. Reaction times and discrimination thresholds were recorded using a forced-choice paradigm. Both performance measures were indistinguishable for the near and far surfaces. To determine whether this was due to the complexity of the scene, a second experiment (n = 20) was conducted with the same task in a sparse virtual environment. Here, a 2D Ponzo illusion similar to that in the original publication was used, presented at a fixed distance from observers. Despite the similarity between these stimuli and those in the 2D study, neither discrimination thresholds nor reaction times were lower for the apparently near surface. In sum, there was no evidence of a near advantage in these virtual environments. This is puzzling given that the original effect has been replicated across a variety of stimuli and tasks. One explanation for the discrepancy, and the focus of ongoing studies, is that the presence of multiple, conflicting sources of depth information interferes with the phenomenon by disrupting attentional focus.