Abstract
Previous research has revealed that attention is not uniformly distributed across 3D space. Some results indicate an egocentric attentional gradient through space: objects located closer to an observer are expected to attract attention more strongly than those located farther away. So far, experimental evidence has been limited to analyses of behavioral responses. In the present investigation, therefore, eye movements were recorded while participants performed a visual search task in virtual 3D space. Vertical line segments (12 or 24 items) were distributed across two or four depth planes. In each trial, participants judged whether a line segment was tilted (target present/absent). Participants tended to identify closer targets faster than those located farther away, irrespective of the size of the search set (12 or 24 items). Analysis of eye movements revealed that the number of fixations varied substantially across depth planes: more fixations fell on items located in the nearest depth plane, whereas beyond this plane fixations were evenly distributed. The present results therefore suggest that items of a 3D search array are more likely to be selected if they are located closer to the observer, adding further empirical evidence for the idea that visual search in (virtual) 3D space operates along an egocentric attentional gradient. This may be of particular importance when foreknowledge about the target’s depth plane is unavailable. Moreover, the results indicate that visual information beyond the fixated depth plane is not completely neglected; covert shifts of attention between depth planes may be used to integrate additional information. Accordingly, the investigation of eye movement patterns can be regarded as a useful and important tool for obtaining a deeper understanding of attentional mechanisms in (virtual) 3D space.