Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2023
Neural encoding and dynamics of visual working memory during distraction
Author Affiliations & Notes
  • Jonas Karolis Degutis
    Charité Universitätsmedizin Berlin
    Max Planck School of Cognition
  • Joram Soch
    Charité Universitätsmedizin Berlin
    German Center for Neurodegenerative Diseases
  • Simon Weber
    Charité Universitätsmedizin Berlin
    Humboldt-Universität zu Berlin
  • John-Dylan Haynes
    Charité Universitätsmedizin Berlin
    Max Planck School of Cognition
    Humboldt-Universität zu Berlin
    Technische Universität Dresden
  • Footnotes
    Acknowledgements  We thank Rosanne Rademaker, Chaipat Chunharas, and John Serences for collecting their data and sharing it open access, without which this reanalysis would not have been possible. Funding was provided by the BMBF and the Max Planck Society.
Journal of Vision August 2023, Vol. 23, 4887. https://doi.org/10.1167/jov.23.9.4887
      Jonas Karolis Degutis, Joram Soch, Simon Weber, John-Dylan Haynes; Neural encoding and dynamics of visual working memory during distraction. Journal of Vision 2023;23(9):4887. https://doi.org/10.1167/jov.23.9.4887.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Recent studies have shown that sensory percepts and visual working memory (VWM) contents are concurrently present in the visual cortex; however, the mechanisms that dissociate them remain unclear. We reanalyzed an open-access fMRI dataset in which six participants performed two tasks: a delayed match-to-sample task whose delay period contained a noise distractor, an orientation distractor, or no distractor, and a perceptual orientation localizer task. We investigated the role of two mechanisms, mixed tuning and dynamic coding, during VWM maintenance in visual areas. First, we assessed whether there is evidence for mixed tuning in the encoding of perceptual and memory representations. Cross-generalization would indicate equivalent tuning functions between perception and memory; differences in tuning, despite maintained information, would indicate mixed tuning. We calculated tuning functions for perception and for no-distractor VWM and identified two subpopulations per region: generalizable voxels, whose perceptual and mnemonic tuning were correlated, and mixed voxels, whose tuning was not. We then used support vector regression (SVR) to probe each subpopulation's representational format of VWM and found evidence for sensory coding of memory contents in generalizable voxels and non-sensory coding in mixed-tuning voxels during the noise distractor, across several visual areas. Second, we examined whether VWM representations are dynamic: does the VWM code change across the delay? We conducted a temporal cross-decoding analysis in which we trained an SVR decoder on every timepoint and iteratively tested it on all timepoints. If decoding accuracy was lower between different timepoints than within the same timepoint, we concluded that the VWM code is dynamic at those timepoints. We observed more dynamic coding during the noise distractor in V1 and V2 than in the other delay conditions. Our results demonstrate that visual areas can adapt to incoming sensory distractors by relying on a non-perceptual memory code and can change that code dynamically over time, thus supporting a dynamic coding framework of VWM.
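
For readers who want a concrete picture of the temporal cross-decoding analysis, the following is a minimal Python sketch of the general technique: an SVR decoder is trained at each timepoint and tested at every timepoint, and off-diagonal accuracies that fall below the corresponding diagonal (within-timepoint) accuracies are read as evidence for dynamic coding. The data array `bold`, the label vector `theta_deg`, the sine/cosine decomposition of the circular orientation label, and the angular accuracy metric are all illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.svm import SVR

def angular_error(pred_sin, pred_cos, theta_rad):
    """Absolute angular error (radians) between predicted and true angles."""
    pred = np.arctan2(pred_sin, pred_cos)
    return np.abs(np.angle(np.exp(1j * (pred - theta_rad))))

def temporal_cross_decoding(bold, theta_deg, n_splits=5):
    """Train an SVR at each timepoint and test it at every timepoint.

    bold      : (n_trials, n_voxels, n_timepoints) delay-period fMRI data
    theta_deg : (n_trials,) remembered orientations in degrees (0-180)
    Returns a (train_time x test_time) matrix of mean decoding accuracy,
    expressed as pi minus the mean absolute angular error (higher = better).
    """
    n_trials, _, n_time = bold.shape
    # Map the 180-degree orientation space onto the full circle so the
    # sin/cos regression targets are well defined.
    theta_rad = np.deg2rad(theta_deg) * 2
    acc = np.zeros((n_time, n_time))
    cv = KFold(n_splits=n_splits, shuffle=True, random_state=0)
    for train_idx, test_idx in cv.split(np.arange(n_trials)):
        for t_train in range(n_time):
            # One SVR per circular component, fit at the training timepoint.
            X_train = bold[train_idx, :, t_train]
            svr_sin = SVR(kernel="linear").fit(X_train, np.sin(theta_rad[train_idx]))
            svr_cos = SVR(kernel="linear").fit(X_train, np.cos(theta_rad[train_idx]))
            for t_test in range(n_time):
                X_test = bold[test_idx, :, t_test]
                err = angular_error(svr_sin.predict(X_test),
                                    svr_cos.predict(X_test),
                                    theta_rad[test_idx])
                acc[t_train, t_test] += np.pi - err.mean()
    return acc / n_splits
```

Applied to, say, V1 voxels under the noise-distractor condition, cells where accuracy between different timepoints is reliably lower than accuracy within the same timepoint would be taken as evidence that the VWM code at those timepoints is dynamic rather than stable.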
