August 2023, Volume 23, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract
A novel eye-tracking paradigm to investigate the focus of object-based attention
Author Affiliations & Notes
  • Lasyapriya Pidaparthi
    Vanderbilt University
  • Frank Tong
    Vanderbilt University
  • Footnotes
    Acknowledgements  NIH R01EY029278 (FT)
Journal of Vision August 2023, Vol.23, 5906. doi:
Lasyapriya Pidaparthi, Frank Tong; A novel eye-tracking paradigm to investigate the focus of object-based attention. Journal of Vision 2023;23(9):5906.

© ARVO (1962-2015); The Authors (2016-present)


We use object-based attention in daily life to selectively attend to and interact with task-relevant objects in our environment. Previous research has suggested that common neural mechanisms guide both eye movements and covert shifts of spatial attention, but might eye movements also be used to determine the focus of object-based attention? We investigated this question by developing a novel eye-tracking paradigm in which two partially overlapping objects, a face and a flower, followed pseudo-randomly generated trajectories. At random times, either object was briefly spatially distorted. In separate blocks, participants attended to the face only, the flower only, or both objects to perform a change detection task while their free-viewing eye movements were monitored. We hypothesized that the selectivity of object-based attention would be reflected in the strength of the correlation between eye position and stimulus location. Using a sliding-window correlation analysis (N = 11), we found that gaze trajectories were highly correlated with the trajectory of the task-relevant stimulus (mean r = 0.7061), weakly correlated with that of the irrelevant stimulus (mean r = 0.1612), and intermediately correlated with both stimuli in the attend-both condition (mean r = 0.4278). Behavioral performance also revealed a two-object cost, with higher accuracy for single-object tracking than for the attend-both condition (mean face accuracy, 79% vs. 57%; mean flower accuracy, 66% vs. 42%). To measure how precisely gaze following indexed the attentional focus, we binned behavioral accuracy by the average stimulus-gaze correlation of each trial (10 trials/bin). Trials with higher stimulus-gaze correlations showed better behavioral performance for the attended stimulus. In the attend-both condition, the degree of gaze following for each of the two stimuli was also predictive of detection accuracy (p < .05).
Overall, we demonstrate a promising new method for evaluating the trial-by-trial focus of object-based attention using gaze tracking.
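The sliding-window correlation analysis described above can be sketched roughly as follows. This is a minimal illustration, not the authors' actual pipeline: the window length, the non-overlapping windows, and the averaging of the x- and y-coordinate correlations are all assumptions made for the example.

```python
import numpy as np

def sliding_window_corr(gaze, stim, win=60):
    """Pearson correlation between gaze and stimulus position in sliding windows.

    gaze, stim : (T, 2) arrays of x/y coordinates sampled at the same rate.
    win        : window length in samples (assumed value, for illustration).
    Returns per-window correlations, averaging the x and y correlations.
    """
    rs = []
    for start in range(0, len(gaze) - win + 1, win):
        g = gaze[start:start + win]
        s = stim[start:start + win]
        # Correlate x with x and y with y within this window, then average.
        rx = np.corrcoef(g[:, 0], s[:, 0])[0, 1]
        ry = np.corrcoef(g[:, 1], s[:, 1])[0, 1]
        rs.append((rx + ry) / 2)
    return np.array(rs)

# Illustrative use: gaze that closely follows a random-walk stimulus
# trajectory yields high per-window correlations, as for the attended
# object in the study; gaze unrelated to a trajectory yields low ones.
rng = np.random.default_rng(0)
stim = np.cumsum(rng.normal(size=(600, 2)), axis=0)   # stimulus trajectory
gaze = stim + rng.normal(scale=0.5, size=stim.shape)  # noisy gaze following
rs = sliding_window_corr(gaze, stim)
```

Averaging the mean of `rs` across trials of a condition would give a condition-level gaze-following index analogous to the mean r values reported in the abstract.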

