Abstract
When searching the environment for a visual target, observers adopt an attentional template—an internal representation of the target they are searching for. An open question in attention research is where these templates are stored and how they guide attentional capture. Across five experiments, participants used long-term memory (LTM) to learn a set of objects with specific colours and were then asked to find those objects amongst new or old distractors. We varied the role of visual working memory (VWM) in this search task either by telling participants which target to search for at the start of each trial (VWM-based template) or by asking them to search for all objects on every trial (LTM-based template). In the first three experiments, we showed that, with and without invoking VWM, participants found targets faster when the targets appeared in their memorized colour, and slower when a distracting object matched that colour. It is possible, however, that these effects emerged during post-perceptual processes such as decision-making. To test this idea, we used a probe-dot detection paradigm to measure attentional effects separately from decision-making effects. This involved briefly presenting a probe at the target's location on some trials and having participants indicate whether they saw the probe. For VWM-based attentional templates, probe response times (RTs) were significantly affected by the previously learned colour association, suggesting that LTM indirectly tunes the attentional template in VWM. In contrast, when search was guided directly by LTM, the effects on search time likely reflected a post-perceptual process rather than attention. Altogether, this work clarifies the interactive roles of VWM and LTM in controlling attentional capture during visual search.