Vision Sciences Society Annual Meeting Abstract  |  September 2024
Volume 24, Issue 10  |  Open Access
Recognition memory fluctuates with the floodlight of attentional state
Author Affiliations & Notes
  • Anna Corriveau
    Department of Psychology, The University of Chicago
  • Alfred Chao
    Department of Psychology, The University of Chicago
  • Megan T. deBettencourt
    Department of Psychology, The University of Chicago
    Institute for Mind and Biology, The University of Chicago
  • Monica D. Rosenberg
    Department of Psychology, The University of Chicago
    Institute for Mind and Biology, The University of Chicago
    Neuroscience Institute, The University of Chicago
  • Footnotes
    Acknowledgements  National Science Foundation BCS-2043740 (M.D.R.)
Journal of Vision September 2024, Vol.24, 666. doi:https://doi.org/10.1167/jov.24.10.666

      Anna Corriveau, Alfred Chao, Megan T. deBettencourt, Monica D. Rosenberg; Recognition memory fluctuates with the floodlight of attentional state. Journal of Vision 2024;24(10):666. https://doi.org/10.1167/jov.24.10.666.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Attentional state fluctuates across time and influences what we remember. However, it is not yet understood whether fluctuations in attention affect memory for task-relevant and task-irrelevant information similarly. One possibility is that increased attentional state heightens the roving spotlight of selective attention, resulting in better filtering of irrelevant stimuli. Alternatively, better attentional state may act like a flickering floodlight, with increased attentional capacity allowing for greater processing of irrelevant stimuli. These hypotheses make opposite predictions for the subsequent memory of irrelevant stimuli. We collected two online samples (N1=188; N2=185) in which participants viewed a stream of trial-unique stimuli (500 trials) consisting of face images superimposed on scene images and were asked to perform a category judgment on either the faces (males vs. females) or scenes (indoors vs. outdoors) by pressing one key for frequent-category images (e.g., males, 90%) and a different key for infrequent images (e.g., females, 10%). Critically, the other category (scenes or faces) was completely irrelevant for the task. Following the sustained attention task, a surprise test probed recognition memory for both relevant and irrelevant stimuli using a 4-point scale. Logistic models tested whether sustained attention measures predicted memory accuracy. Attention lapses (errors to infrequent stimuli) were preceded by established RT signatures of sustained attention, speed (b1=.640, b2=.617) and variance (b1=-.296, b2=-.223; all ps<.001). As expected, memory was better for task-relevant items (b1=.722, b2=1.37; all ps<.001). Furthermore, correct performance on infrequent trials predicted memory for both task-relevant (b1=.134, p<.001; b2=.201, p<.001) and task-irrelevant (b1=.127, p<.001; b2=.111, p=.033) stimuli in both experiments. These results support the flickering floodlight view of attentional state, such that moments of high attention improve memory of relevant and irrelevant stimuli.
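The abstract reports that logistic models were used to test whether sustained attention measures predicted later memory accuracy. The sketch below is a hypothetical illustration of that kind of trial-level analysis, not the authors' actual code; the file name and all column names (remembered, pretrial_rt, pretrial_var, relevant) are assumptions introduced only for this example.

    # A minimal sketch, assuming a trial-level table with hypothetical columns:
    #   remembered   - 1 if the item was later recognized, 0 otherwise
    #   pretrial_rt  - mean z-scored response time before the item (speed signature)
    #   pretrial_var - response-time variability in the same window (variance signature)
    #   relevant     - 1 for task-relevant items, 0 for task-irrelevant items
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("recognition_trials.csv")  # hypothetical file name

    # Logistic regression: does pre-item attentional state predict subsequent memory?
    model = smf.logit("remembered ~ pretrial_rt + pretrial_var + relevant", data=df).fit()
    print(model.summary())

Under this sketch, a negative coefficient on pretrial_rt (faster responding before the item) or on pretrial_var (less variable responding) would indicate that a better attentional state predicts better recognition; the abstract's floodlight account corresponds to such effects holding for both relevant and irrelevant items.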
