Abstract
We use object-based attention in our daily lives to selectively attend to and interact with task-relevant objects in our environment. Previous research has suggested that common neural mechanisms serve to guide eye movements and covert shifts of spatial attention, but can eye movements also be used to determine the focus of object-based attention? We investigated this question by developing a novel eye-tracking paradigm in which two objects (a face and a flower) followed pseudo-randomly generated trajectories while remaining partially overlapping. At random times, one of the two objects was briefly spatially distorted. In separate blocks, participants had to attend to the face only, the flower only, or both objects to perform a change detection task while free-viewing eye movements were monitored. We hypothesized that the selectivity of object-based attention would be reflected in the strength of the correlation between eye position and stimulus location. Using a sliding-window correlation analysis (N = 11), we found that gaze trajectories were highly correlated with the trajectory of the task-relevant stimulus (mean r = 0.7061), weakly correlated with that of the irrelevant stimulus (mean r = 0.1612), and intermediately correlated with both stimuli in the attend-both condition (mean r = 0.4278). Behavioral performance also revealed a two-object cost, with higher accuracy for single-object tracking than for the attend-both condition (mean face accuracy, 79% vs. 57%; mean flower accuracy, 66% vs. 42%). To measure how precisely gaze following indexed the attentional focus, we binned behavioral accuracy by the average gaze-stimulus correlation of each trial (10 trials/bin). This analysis revealed that trials with higher stimulus-gaze correlations showed better behavioral performance for the attended stimulus. In the attend-both condition, the degree of gaze following for each of the two stimuli was also predictive of detection accuracy (p < .05). Overall, we demonstrate a promising, novel method for evaluating the trial-by-trial focus of object-based attention using gaze tracking.
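To illustrate the kind of computation involved, a per-trial sliding-window correlation between gaze position and a stimulus trajectory could be sketched as below. This is a minimal sketch, not the study's actual analysis code: the function name, the window and step sizes, and the choice to average the x- and y-coordinate correlations are all assumptions introduced here for illustration.

```python
import numpy as np
from scipy.stats import pearsonr

def sliding_window_correlation(gaze, stim, win=120, step=10):
    """Mean Pearson correlation between gaze and stimulus position in sliding windows.

    gaze, stim : (n_samples, 2) arrays of x/y coordinates sampled at the same rate.
    win, step  : window length and hop size in samples (placeholder values).
    """
    rs = []
    for start in range(0, len(gaze) - win + 1, step):
        g = gaze[start:start + win]
        s = stim[start:start + win]
        # Correlate x with x and y with y within this window, then average the two.
        r_x, _ = pearsonr(g[:, 0], s[:, 0])
        r_y, _ = pearsonr(g[:, 1], s[:, 1])
        rs.append((r_x + r_y) / 2.0)
    return float(np.mean(rs))
```

In this scheme, the function would be applied once per trial to the face trajectory and once to the flower trajectory, yielding separate gaze-following indices for the relevant and irrelevant objects.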
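The accuracy-by-correlation binning (10 trials per bin) could likewise be sketched as follows; the helper name and its inputs (trial_r, trial_correct) are hypothetical, standing in for the per-trial mean correlations and change-detection outcomes.

```python
import numpy as np

def accuracy_by_correlation_bin(trial_r, trial_correct, trials_per_bin=10):
    """Sort trials by their mean gaze-stimulus correlation and compute
    detection accuracy within consecutive bins of `trials_per_bin` trials."""
    order = np.argsort(trial_r)
    r_sorted = np.asarray(trial_r)[order]
    correct_sorted = np.asarray(trial_correct, dtype=float)[order]
    n_bins = len(r_sorted) // trials_per_bin
    bin_r, bin_acc = [], []
    for b in range(n_bins):
        sl = slice(b * trials_per_bin, (b + 1) * trials_per_bin)
        bin_r.append(r_sorted[sl].mean())   # mean correlation in this bin
        bin_acc.append(correct_sorted[sl].mean())  # proportion correct in this bin
    return np.array(bin_r), np.array(bin_acc)
```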