August 2014, Volume 14, Issue 10
Vision Sciences Society Annual Meeting Abstract
Fine-grained representation of visual object information retrieved from long-term memory
Author Affiliations
  • Sue-Hyun Lee
    Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health
  • Dwight Kravitz
    The Department of Psychology, The George Washington University
  • Chris Baker
    Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health
Journal of Vision August 2014, Vol.14, 167. doi:https://doi.org/10.1167/14.10.167
Abstract

Long-term memory processes allow humans to store newly learned information and recall it later. Although prior studies have suggested that short-term (or working) memory retrieval generates object-specific representations in visual cortex, it remains unclear how specific the representations recalled from long-term memory are. To test whether visual cortex as well as the hippocampus shows object-specific activation during recall of visual information from long-term memory, we performed an event-related functional magnetic resonance imaging (fMRI) experiment comprising separate perception, learning, and recall sessions. During the perception session, participants were presented with fixed pairings of 14 auditory cues (pseudowords) and object images (e.g., tenire–chair) inside the scanner. During the learning session, on a separate day outside the scanner, participants were trained for about one hour to memorize the pseudoword-object associations. Finally, one day after the learning session, participants were scanned and instructed to recall each object image in response to its paired pseudoword cue. To test the veracity of the recalled visual information, participants were asked to perform forced-choice tests and to draw detailed pictures of the object images after the retrieval scan session. Every participant performed well on the forced-choice (>90% correct) and drawing tests. We focused on two primary regions of interest: object-selective cortex and the hippocampus. Both regions were significantly activated during recall of the paired object images. Moreover, the responses of both object-selective cortex and the hippocampus could be used to decode the identity of individual remembered objects, and there was close correspondence between the representations during perception and retrieval in object-selective cortex. These results suggest that recall of visual information from long-term memory activates a fine-grained representation in both hippocampal and cortical areas.

Meeting abstract presented at VSS 2014
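
For illustration only, below is a minimal Python sketch of the kind of multivoxel pattern decoding analysis described in the abstract: within-session classification of object identity from ROI response patterns, and cross-decoding from perception-session to recall-session patterns. This is not the authors' analysis code; the voxel count, run count, simulated data, and linear SVM classifier are all assumptions made for the sketch. Only the number of pseudoword-object pairs (14) comes from the abstract.

# Hypothetical sketch of an ROI decoding analysis like the one described above.
# Voxel patterns, labels, and run structure are simulated stand-ins; the study
# itself used fMRI responses from object-selective cortex and the hippocampus.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score, LeaveOneGroupOut

rng = np.random.default_rng(0)

n_objects = 14   # 14 pseudoword-object pairs (from the abstract)
n_runs = 8       # assumed number of scan runs per session
n_voxels = 200   # assumed ROI size

# Simulate object-specific patterns plus noise, one pattern per object per run.
prototypes = rng.normal(size=(n_objects, n_voxels))

def simulate_session(noise_sd=1.0):
    X = np.vstack([prototypes + rng.normal(scale=noise_sd, size=prototypes.shape)
                   for _ in range(n_runs)])
    y = np.tile(np.arange(n_objects), n_runs)
    runs = np.repeat(np.arange(n_runs), n_objects)
    return X, y, runs

X_perc, y_perc, _ = simulate_session()                    # perception session
X_recall, y_recall, runs_recall = simulate_session(1.5)   # recall session (noisier)

# Within-session decoding: leave-one-run-out classification of object identity.
clf = LinearSVC(max_iter=10000)
acc = cross_val_score(clf, X_recall, y_recall,
                      groups=runs_recall, cv=LeaveOneGroupOut()).mean()
print(f"recall-session decoding accuracy: {acc:.2f} (chance = {1/n_objects:.2f})")

# Cross-decoding: train on perception patterns, test on recall patterns, probing
# the correspondence between perceived and remembered representations.
clf.fit(X_perc, y_perc)
print(f"perception-to-recall decoding accuracy: {clf.score(X_recall, y_recall):.2f}")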
