September 2017
Volume 17, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2017
Neural representations of spatial position recalled from long-term and short-term memory diverge across the cortical hierarchy
Author Affiliations
  • Vy Vo
    Neurosciences Graduate Department, University of California, San Diego
  • David Sutterer
    Department of Psychology, University of Chicago
    Institute for Mind and Biology, University of Chicago
  • Joshua Foster
    Department of Psychology, University of Chicago
    Institute for Mind and Biology, University of Chicago
  • Thomas Sprague
    Department of Psychology, New York University
  • John Serences
    Neurosciences Graduate Department, University of California, San Diego
    Department of Psychology, University of California, San Diego
  • Edward Awh
    Department of Psychology, University of Chicago
    Institute for Mind and Biology, University of Chicago
Journal of Vision August 2017, Vol.17, 1115. doi:10.1167/17.10.1115
© ARVO (1962-2015); The Authors (2016-present)
Abstract

Holding an item in short-term memory (STM) elicits stimulus-specific representations across sensory, parietal, and frontal cortex (Ester, Sprague, & Serences, 2015). Recent evidence suggests that retrieving items from long-term memory (LTM) also reinstates neural representations in sensory cortex (Bosch et al., 2014). However, it remains unknown how sensory representations of items retrieved from LTM and maintained in STM differ in their distribution across these areas of human cortex. Here, we directly compared neural representations of locations that were retrieved from LTM or maintained in STM, and assessed how these neural measures tracked behavioral precision. We trained subjects to associate 24 unique clip-art items with spatial positions along an isoeccentric ring (Sutterer & Awh, 2015). On subsequent training days, subjects performed both an LTM task, in which they retrieved the learned pairings, and an STM task, which required them to maintain a spatial position in STM. The precision of subjects' LTM recall plateaued after ~5 days of training, approaching STM performance. We then acquired fMRI data while subjects performed both tasks in the scanner. On separate runs, we obtained data to train an inverted encoding model (IEM) for spatial position, which allowed us to reconstruct the spatial location of each item as it was held in memory (e.g., during a stimulus-absent delay period; Sprague et al., 2014; Ester et al., 2015). We successfully reconstructed the position of the remembered stimulus during the delay period of both tasks. STM representations were more robust than LTM representations in early sensory areas, but the difference between the fidelity of STM and LTM representations decreased in later sensory areas. Overall, our data suggest that changes across the visual hierarchy in the relative representations of items stored in STM and LTM likely support the precision of LTM recall.
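For readers unfamiliar with the inverted encoding model, the following is a minimal sketch of the general two-step IEM procedure on synthetic data, not the authors' actual analysis pipeline: the channel count, raised-cosine basis shape, noise level, and decoding rule are all illustrative assumptions. The idea is to model each voxel as a weighted sum of position-tuned channels, fit the weights on training runs, and then invert the fitted model to reconstruct channel responses (and hence the remembered position) from held-out delay-period data.

```python
import numpy as np

rng = np.random.default_rng(0)
N_CHANNELS = 8  # hypothetical number of spatial channels tiling the ring

def channel_responses(angles_deg, n_channels=N_CHANNELS):
    """Idealized responses of raised-cosine channels tuned to polar angle
    (an illustrative basis; real studies vary the shape and count)."""
    centers = np.linspace(0, 360, n_channels, endpoint=False)
    d = np.abs((np.asarray(angles_deg)[:, None] - centers[None, :] + 180) % 360 - 180)
    return np.cos(np.deg2rad(d) / 2) ** 7          # shape: (n_trials, n_channels)

# --- synthetic "fMRI" data: voxels are noisy linear mixtures of channels ---
n_vox, n_train, n_test = 50, 200, 40
W_true = rng.normal(size=(n_vox, N_CHANNELS))      # ground-truth voxel weights
train_angles = rng.uniform(0, 360, n_train)
test_angles = rng.uniform(0, 360, n_test)
C_train = channel_responses(train_angles)                                   # (n_train, k)
B_train = C_train @ W_true.T + 0.5 * rng.normal(size=(n_train, n_vox))      # (n_train, v)
B_test = channel_responses(test_angles) @ W_true.T \
         + 0.5 * rng.normal(size=(n_test, n_vox))                           # (n_test, v)

# --- step 1: fit voxel weights by least squares, solving B = C @ W.T ---
W_hat, *_ = np.linalg.lstsq(C_train, B_train, rcond=None)   # (k, n_vox), i.e. W.T

# --- step 2: invert the model to recover channel responses on test trials ---
C_hat, *_ = np.linalg.lstsq(W_hat.T, B_test.T, rcond=None)  # (k, n_test)
C_hat = C_hat.T                                             # (n_test, k)

# Decode each trial as the center of the most active channel (a crude readout;
# the literature typically uses finer reconstructions over all positions).
centers = np.linspace(0, 360, N_CHANNELS, endpoint=False)
decoded = centers[np.argmax(C_hat, axis=1)]
err = np.abs((decoded - test_angles + 180) % 360 - 180)
mean_err = np.mean(err)   # mean absolute angular error; chance level is ~90 deg
```

With clean synthetic data the mean decoding error lands far below the 90-degree chance level, which is the basic signature of a successful delay-period reconstruction like those reported here.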

Meeting abstract presented at VSS 2017
