September 2024
Volume 24, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2024
Eye-tracking reveals robust attentional filtering in an object-based attention task
Author Affiliations & Notes
  • Lasyapriya Pidaparthi
    Vanderbilt University
  • Frank Tong
    Vanderbilt University
  • Footnotes
    Acknowledgements  NIH R01EY029278 (FT) and NIH R01EY035157 (FT)
Journal of Vision September 2024, Vol.24, 777. doi:https://doi.org/10.1167/jov.24.10.777
Abstract

We use object-based attention in our daily lives to process task-relevant objects, and we must sometimes ignore task-irrelevant objects even when they are salient, dynamic, or situated in front of an attended object. We have previously shown the efficacy of eye tracking for predicting the focus of object-based attention when participants must attend to one of two naturalistic objects (face, flower) that follow pseudorandom, minimally correlated trajectories while remaining partially overlapping (Pidaparthi & Tong, VSS 2023). Although smooth pursuit eye movements are considered by some to be strongly stimulus-driven, here we asked how effectively attention can filter out the presence of a task-irrelevant object, as indexed by eye movements. To answer this question, we adapted our paradigm across two experiments. In Experiment 1, subjects were presented with either one or two moving objects and were instructed to respond whenever the task-relevant stimulus underwent brief spatial distortions (two conditions: attend-face, attend-flower). We then evaluated the selectivity of object-based attentional filtering by using a sliding-window correlation analysis to compare gaze trajectories with the attended stimulus trajectories. Notably, even with the overlapping sets of motion signals, pursuit eye movements were not perturbed by the irrelevant motion: observers could follow the attended object in the presence of the distractor object (mean r = 0.581) just as accurately as a single object presented alone (mean r = 0.579). In Experiment 2, we replaced the irrelevant object (e.g., the flower during attend-face trials) with a moving Gabor stimulus that underwent random bursts of drifting motion (at 4 Hz for 500 ms) and measured the extent to which this strong low-level motion signal influenced eye movements. In both experiments, we found that observers could selectively attend to the task-relevant object such that gaze following was unperturbed by extraneous motion signals, demonstrating the robustness of attentional filtering within the eye movement system.
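
To give a concrete sense of the sliding-window correlation analysis described above, the following is a minimal Python sketch that correlates a gaze trajectory with a stimulus trajectory in successive windows. It is illustrative only: the window length, sampling rate, averaging of x and y components, and the synthetic trajectories are assumptions for the example, not the authors' actual pipeline.

import numpy as np

def sliding_window_correlation(gaze, stimulus, window, step=1):
    # gaze, stimulus: arrays of shape (T, 2) holding x/y positions per frame.
    # window: window length in frames (assumed here; not specified in the abstract).
    # Returns per-window Pearson correlations, averaging the x and y coefficients.
    r_values = []
    for start in range(0, len(gaze) - window + 1, step):
        g = gaze[start:start + window]
        s = stimulus[start:start + window]
        # Correlate x with x and y with y, then average the two coefficients.
        r_x = np.corrcoef(g[:, 0], s[:, 0])[0, 1]
        r_y = np.corrcoef(g[:, 1], s[:, 1])[0, 1]
        r_values.append((r_x + r_y) / 2)
    return np.array(r_values)

# Example with synthetic data: gaze tracks the attended object's pseudorandom path,
# so its mean correlation with that path exceeds its correlation with the distractor.
rng = np.random.default_rng(0)
attended = np.cumsum(rng.standard_normal((600, 2)), axis=0)
distractor = np.cumsum(rng.standard_normal((600, 2)), axis=0)
gaze = attended + 2.0 * rng.standard_normal((600, 2))
r_att = sliding_window_correlation(gaze, attended, window=60).mean()
r_dis = sliding_window_correlation(gaze, distractor, window=60).mean()
print(f"mean r (attended) = {r_att:.3f}, mean r (distractor) = {r_dis:.3f}")

Comparing the mean correlation for the attended versus distractor trajectory, as in the last two lines, mirrors the logic of contrasting gaze following across attention conditions reported in the abstract.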
