August 2023
Volume 23, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2023
Memory guidance of attentional sampling, visual search, and working memory use during natural behaviour in virtual reality.
Author Affiliations & Notes
  • Dejan Draschkow
    University of Oxford
  • Levi Kumle
    University of Oxford
  • Rhianna Watt
    University of Oxford
  • Sage Boettcher
    University of Oxford
  • Anna C. Nobre
    University of Oxford
  • Footnotes
    Acknowledgements  This research was funded by a Wellcome Trust Award (104571/Z/14/Z) and James S. McDonnell Foundation Award (220020448) to A.C.N. WIN is supported by core funding from the Wellcome Trust (203139/Z/16/Z). The work is supported by the NIHR Oxford Health Biomedical Research Centre.
Journal of Vision August 2023, Vol.23, 4993. doi:https://doi.org/10.1167/jov.23.9.4993
Abstract

Memory plays an important role in orchestrating cognition. Memories across different timescales have been shown to support separate, discrete cognitive operations; for example, attentional sampling (encoding), attentional guidance, and working memory. However, during free-flowing natural behaviour, memories provide the scaffolding for these discrete operations in a continuous and interconnected way. In three virtual reality (VR) experiments, we embraced the interconnected nature of these hallmark cognitive operations. We tracked head, hand, and eye movements as well as free-flowing interactions with the environment while participants copied a Model display by selecting realistic objects from a Resource pool and placing them into a Workspace. Using our novel VR protocol, we segmented continuous, temporally extended behaviour into tractable sub-units: attentional sampling (encoding), attentional guidance during visual search, and working memory usage. By repeating selected arrangements within the environment (Exp1: Model, Exp2: Resource, Exp3: both) and using non-repeated/novel arrangements as a baseline, we show how different types of memories guide the interlinked processes of encoding, search, and working memory use during continuous natural behaviour. First, overall task performance increased for repeated vs novel arrangements. Next, we demonstrate that reliance on information in memory – compared to gathering information from the external environment – increased when Model arrangements were repeated. Further, search times improved for repeated Resource arrangements. We also found high performance in a subsequent recognition memory task for repeated Model and Resource arrangements, suggesting that the representations formed incidentally during the task were durable and accessible. Overall, we find that memories not only guide overall task performance but also differentially affect segmented cognitive operations during complex behaviour. Our work provides a novel framework for investigating naturally unfolding memory-guided behaviour and sheds new light on the coordination between vision, memory, and action.
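A minimal sketch of how continuous behaviour of this kind could be segmented into the sub-units named above, assuming gaze samples have already been labelled by the fixated region (Model, Resource, or Workspace). The region labels, the toy sample stream, and the summary measures are illustrative assumptions, not the analysis pipeline used in the study.

```python
from itertools import groupby

# Toy gaze stream: each entry is the region fixated at one sample.
# Region names are illustrative assumptions, not the study's labels.
gaze_stream = [
    "MODEL", "MODEL", "RESOURCE", "RESOURCE", "RESOURCE",
    "WORKSPACE", "MODEL", "RESOURCE", "WORKSPACE", "WORKSPACE",
]

def segment_episodes(stream):
    """Collapse consecutive same-region samples into (region, n_samples) episodes."""
    return [(region, sum(1 for _ in run)) for region, run in groupby(stream)]

episodes = segment_episodes(gaze_stream)

# Summaries loosely mirroring the abstract's sub-units:
#   Model episodes ~ attentional sampling (encoding)
#   Resource dwell ~ attentional guidance during search
#   Workspace placements per Model visit ~ reliance on working memory
model_visits   = sum(1 for region, _ in episodes if region == "MODEL")
resource_dwell = sum(n for region, n in episodes if region == "RESOURCE")
placements     = sum(1 for region, _ in episodes if region == "WORKSPACE")

print(f"Model visits (encoding episodes): {model_visits}")
print(f"Resource dwell (search samples):  {resource_dwell}")
print(f"Placements per Model visit:       {placements / model_visits:.2f}")
```

On this toy stream, more placements per Model visit would indicate greater reliance on memory, and shorter Resource dwell would indicate more efficient search, in the spirit of the repeated-versus-novel comparisons described above.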
