Abstract
In the visual foraging task, participants search for and collect target elements while ignoring others. This allows attention guidance and visual search to be assessed in a continuous task that is less constrained than traditional visual search tasks. When foraging is performed on a tablet PC, participants can respond by pointing at (or tapping on) targets and can collect items at a high rate. However, this setup makes it challenging to record eye movements, a key indicator of attention. Until now, tablet-based visual foraging experiments have therefore relied solely on analyzing the manual selection responses. Here we present a novel method and first data from tablet-based visual foraging with eye tracking, providing insight into the interplay between manual selections and eye movements. The results highlight a close relationship between eye movements and manual selections. The coupling appears particularly tight in "simple" displays, in which targets are defined by a single feature; here, selections seem to follow each fixation in a one-to-one relationship. In displays with "conjunction" targets, and particularly within same-type selection runs, gaze and hand movements deviate more from each other. This might reflect the increased effort of coordinating hand and eye movements, in good agreement with the finding that foraging efficiency is typically impaired in conjunction conditions.