Abstract
Visual foraging tasks are used to understand the dynamics of visual attention in contexts with multiple targets, providing insight into search behavior in complex environments. Because multiple targets are present, aspects of attention can be studied as targets are selected one after another. It has been posited that the size of the functional viewing field (FVF), or the amount of visual information processed at a given moment, varies with search difficulty and can be indexed by eye movement metrics. In this experiment, we manipulated factors that influence search difficulty in a virtual reality (VR) foraging task while measuring eye and head movements. Participants searched for enemy targets among two friendly forces whose camouflage patterns indicated their identity. The two friendly-force camouflage patterns varied in their similarity to the enemy targets (low- and high-similarity). Targets appeared at varying inter-target spacing (5 or 10 degrees of visual angle), and participants searched under varying time pressure (blocked: 45 or 18 seconds). Target similarity, inter-target spacing, and time pressure all affected search performance and the eye movement metrics that index the size of the FVF. The results suggest that participants had a larger overall FVF in the low-similarity camouflage condition than in the high-similarity condition: in the high-similarity condition, participants made more fixations accompanied by smaller saccades, whereas in the low-similarity condition they made fewer fixations with larger saccades. Overall, these findings suggest that fluctuations in the FVF can be indexed by eye movement behavior in a complex, immersive VR visual foraging task.