Abstract
Working memory representations bias visual experience by drawing attention to matching stimuli in the environment (i.e., "attentional capture"). This finding is reliable when a single item is held in memory, yet it is unclear how attentional capture changes under increased memory loads. Here, we characterized how two items in working memory, one in a high-priority state and the other in a low-priority state, biased visual attention. We implemented a double retro-cue task involving two consecutive memory delay periods within each trial. In the first delay, one memory item was cued as relevant for the first probe, tagging that item as high-priority and simultaneously tagging the other (uncued) memory item as low-priority. A cue before the second delay indicated which item would be tested by the final probe of the trial. During both delays, subjects performed a sequence of visual search trials that manipulated the possibility of exogenous attentional biases via the reappearance or absence of memory items. The systematic variation of these search trials allowed us to decode the identity of memory items with pattern classifiers applied to response times (RTs) from the delay (Dowd et al., 2017). High-priority items were decodable and showed persistent attentional capture throughout the search task (~12 sec). In contrast, low-priority items were not decodable, and further analyses showed that these items biased attention early but only briefly (~3 sec). During the second delay, in which the previously uncued (low-priority) item was cued as relevant on half of trials, the attentional capture effect returned. The lack of a consistent attentional bias from a low-priority representation in working memory accords with theoretical models of attentional templates (Olivers et al., 2011). Our findings demonstrate that attentional capture is a transient effect that depends on the priority and, by inference, the representational state of items held in working memory.
Meeting abstract presented at VSS 2018
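A minimal sketch of the RT-based decoding idea, not the authors' actual pipeline: it assumes a hypothetical matrix of per-trial search RTs across the delay period (rts) and the identity of the cued memory item on each trial (labels), simulates stand-in data with a small RT shift mimicking memory-driven attentional capture, and estimates cross-validated classification accuracy with scikit-learn.

    # Hypothetical illustration only; variable names and data are assumptions.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_trials, n_displays, n_items = 200, 6, 2

    # Simulated stand-in data: which item is held in memory (labels) slightly
    # shifts the pattern of RTs across the search displays within a delay.
    labels = rng.integers(0, n_items, size=n_trials)
    rts = rng.normal(650, 80, size=(n_trials, n_displays))
    rts += 25 * labels[:, None] * rng.normal(1, 0.2, size=(n_trials, n_displays))

    # Cross-validated decoding of memory-item identity from the RT pattern;
    # chance accuracy here is 1 / n_items = 0.5.
    clf = make_pipeline(StandardScaler(), LogisticRegression())
    acc = cross_val_score(clf, rts, labels, cv=5)
    print(f"Decoding accuracy: {acc.mean():.2f} +/- {acc.std():.2f}")

Above-chance accuracy in such a sketch would correspond to the reported decodability of high-priority items, whereas chance-level accuracy would correspond to the non-decodable low-priority items.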