Sage E. P. Boettcher, Freek van Ede, Anna C. Nobre; Functional biases in attentional templates from associative memory. Journal of Vision 2020;20(13):7. https://doi.org/10.1167/jov.20.13.7.
In everyday life, attentional templates—which facilitate the perception of task-relevant sensory inputs—are often based on associations in long-term memory. We asked whether templates retrieved from memory are necessarily faithful reproductions of the encoded information, or whether associative-memory templates can be functionally adapted after retrieval in the service of current task demands. Participants learned associations between four shapes and four colored gratings, each grating with a characteristic combination of color (green or pink) and orientation (left or right tilt). On each trial, observers saw one shape followed by a grating and indicated whether the pair matched the learned shape-grating association. Across experimental blocks, we manipulated which types of nonmatch (lure) gratings were most often presented: in some blocks the lures were most likely to differ in color but not tilt, whereas in other blocks this was reversed. If participants functionally adapt the retrieved template so that information distinguishing targets from lures is prioritized, then they should overemphasize the most commonly diagnostic feature dimension within the template. We found evidence for this in the behavioral responses to the lures: participants were more accurate and faster when responding to common versus rare lures, as predicted by the functional—but not the strictly veridical—template hypothesis. This shows that templates retrieved from memory can be functionally biased to optimize task performance in a flexible, context-dependent manner.