Abstract
Memory plays an important role in orchestrating cognition. Memories across different timescales have been shown to support separate, discrete cognitive operations; for example, attentional sampling (encoding), attentional guidance, and working memory. However, during free-flowing natural behaviour, memories provide the scaffolding for these discrete operations in a continuous and interconnected way. In three virtual reality (VR) experiments, we embraced the interconnected nature of these hallmark cognitive operations. We tracked head, hand, and eye movements, as well as free-flowing interactions with the environment, while participants copied a Model display by selecting realistic objects from a Resource pool and placing them into a Workspace. Using our novel VR protocol, we segmented continuous, temporally extended behaviour into tractable sub-units: attentional sampling (encoding), attentional guidance during visual search, and working memory usage. By repeating selected arrangements within the environment (Exp1: Model, Exp2: Resource, Exp3: both) and using non-repeated/novel arrangements as a baseline, we show how different types of memories guide the interlinked processes of encoding, search, and working memory use during continuous natural behaviour. First, overall task performance increased for repeated vs novel arrangements. Next, we demonstrate that reliance on information in memory – compared to gathering information from the external environment – increased when Model arrangements were repeated. Further, search times improved for repeated Resource arrangements. We also found high performance in a subsequent recognition memory task for repeated Model and Resource arrangements, suggesting that the representations formed incidentally during the task were durable and accessible. Overall, we find that memories guide not only overall performance, but also differentially affect segmented cognitive operations during complex behaviour.
Our work provides a novel framework for investigating naturally unfolding memory-guided behaviour and sheds new light on the coordination between vision, memory, and action.