August 2023
Volume 23, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2023
Using regularity-based temporal predictions to shift our attentional template across time during multiple-target dynamic visual search
Author Affiliations & Notes
  • Gwenllian C. Williams
    Department of Experimental Psychology, University of Oxford
    Wellcome Centre for Integrative Neuroimaging, University of Oxford
    Oxford Centre for Human Brain Activity, University of Oxford
  • Sage E. P. Boettcher
    Department of Experimental Psychology, University of Oxford
    Wellcome Centre for Integrative Neuroimaging, University of Oxford
    Oxford Centre for Human Brain Activity, University of Oxford
  • Anna C. Nobre
    Department of Experimental Psychology, University of Oxford
    Wellcome Centre for Integrative Neuroimaging, University of Oxford
    Oxford Centre for Human Brain Activity, University of Oxford
  • Footnotes
    Acknowledgements  Oxford Medical Sciences Graduate School Studentship to G.C.W (funded by EPSRC); NIHR Oxford Health BRC; Wellcome Trust Senior Investigator Award to A.C.N. (104571/Z/14/Z); James S. McDonnell Foundation (220020448); The Wellcome Centre for Integrative Neuroimaging (203139/Z/16/Z).
Journal of Vision August 2023, Vol.23, 5304. doi:https://doi.org/10.1167/jov.23.9.5304
Abstract

Attentional templates are internal representations used to guide attention towards task-relevant items. Although attentional templates are traditionally studied in the context of static visual search, previous work has highlighted the importance of temporal regularities for guiding attention in dynamic settings. However, it remains unclear whether we can use temporal regularities to change our attentional templates across time. We investigated this using a dynamic visual-search task in which participants searched for two targets that shared no features. Target and distractor stimuli appeared at different times during each trial. Targets could appear at one of two times: the early target most often appeared early in the trial, whereas the late target most often appeared later. We found a significant interaction between target type and target timing, such that participants identified a target faster when it appeared at its usual time than when it appeared at the other time. This suggests that participants used the temporal regularities to dynamically shift their attentional template towards the features of the most imminently anticipated target. Further, novel aspects of the task design enabled us to obtain continuous measures of how such attentional shifts unfolded over time. Specifically, distractors shared features with the early target, the late target, or neither target. We analysed the minimum distance of participants' gaze from different distractors during trials in which no targets appeared. Around the early-target time, gaze came closer to distractors with early-target features than to those with late-target features, suggesting that early-target features captured more attention during this period. The opposite pattern held around the late-target time.
This work highlights the flexibility of the attentional system and our ability to dynamically shift attentional priorities across time based on temporal predictions.
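The minimum-gaze-distance measure described above can be illustrated with a minimal sketch: for each distractor, take the smallest Euclidean distance between any gaze sample in an analysis window and that distractor's screen position, so that a smaller value indicates the distractor drew gaze closer. The function name, coordinate values, and data layout below are hypothetical, not the authors' actual analysis pipeline.

```python
from math import hypot

def min_gaze_distance(gaze_samples, distractor_pos):
    """Minimum Euclidean distance (in pixels) from any gaze sample
    to a distractor's screen position. Smaller values indicate gaze
    approached that distractor more closely."""
    dx_pos, dy_pos = distractor_pos
    return min(hypot(x - dx_pos, y - dy_pos) for x, y in gaze_samples)

# Hypothetical gaze trace and distractor locations (screen pixels).
gaze = [(400, 300), (420, 310), (510, 350)]
early_feature_distractor = (500, 360)   # shares features with early target
late_feature_distractor = (200, 100)    # shares features with late target

# Compare how closely gaze approached each distractor type
# within a given time window around the expected target time.
print(min_gaze_distance(gaze, early_feature_distractor))
print(min_gaze_distance(gaze, late_feature_distractor))
```

In the study's logic, this comparison would be run separately for windows around the early-target and late-target times; the distractor type with the smaller minimum distance in a window is taken as having captured more attention then.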
