Abstract
How does past experience with an object influence how we deploy attention to it in the future? Although the guidance of attention based upon external salience and internal goals has been studied extensively, the mechanisms that underlie attentional guidance from long-term memory are not as well understood. Here we examine how visual long-term memory for objects influences their selection during visual search. In an initial encoding phase, observers learned to associate unique color and location features with each of several abstract shapes using a continuous-report visual short-term memory task. In a subsequent search phase, observers were cued to search for these shapes among cluttered arrays of novel colorful distractors. On each trial, a colorless, centrally presented version of one shape served as the cue, and the target in the array (on target-present trials) appeared in the learned color, the learned location, both, or neither. Although the color and location features associated with the shape were not relevant to the instructed task of finding the shape in whatever color and location it appeared, detection was quickest when the target appeared in both the color and the location that were consistent with visual long-term memory, and slowest when it appeared in a novel color and location. We found intermediate search latencies when only one of these features matched long-term memory, suggesting that attention is guided in an additive manner by memory for different feature dimensions. In ongoing experiments, we are exploring whether long-term memory for objects guides attention via biasing mechanisms in working memory, and whether such retrieval sets up a general priority for associated features that can even influence processing of other objects. Taken together, these investigations contribute to our understanding of how memory systems incidentally and powerfully shape how we attend to and perceive the visual world.
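To make the additivity claim concrete, one way to express the implied pattern of search latencies is the sketch below; the symbols are ours for illustration and assume that the benefits from the two feature dimensions combine linearly rather than interactively:

$$
RT_{\text{both}} < RT_{\text{color only}} \approx RT_{\text{location only}} < RT_{\text{neither}}, \qquad
(RT_{\text{neither}} - RT_{\text{both}}) \approx (RT_{\text{neither}} - RT_{\text{color only}}) + (RT_{\text{neither}} - RT_{\text{location only}})
$$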
Meeting abstract presented at VSS 2013