August 2023, Volume 23, Issue 9 (Open Access)
Vision Sciences Society Annual Meeting Abstract
Building up visual memories from sensory evidence
Author Affiliations & Notes
  • Maria Robinson
    University of California, San Diego
  • Isabella DeStefano
    University of California, San Diego
  • Edward Vul
    University of California, San Diego
  • Timothy Brady
    University of California, San Diego
  • Footnotes
    Acknowledgements  The study was supported by National Institutes of Health Grant 1F32MH127823-01 awarded to Maria M. Robinson
Journal of Vision August 2023, Vol.23, 5835. doi:https://doi.org/10.1167/jov.23.9.5835

      Maria Robinson, Isabella DeStefano, Edward Vul, Timothy Brady; Building up visual memories from sensory evidence. Journal of Vision 2023;23(9):5835. https://doi.org/10.1167/jov.23.9.5835.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Visual short-term memory is a fundamental memory structure that supports the online maintenance of visual information in the service of goal-directed action. While much work examines the nature of visual memories, relatively little work investigates how people use their underlying, complex sensory information to build memory representations and make memory-based decisions. We examined this question by comparing two variants of a signal detection model. Both models start with an assumed high-dimensional set of sensory evidence, and they differ in how this evidence is integrated to make memory decisions. Through the lens of the first signal detection model, people pool sensory evidence via summation or averaging of sensory signals; according to the alternative signal detection model, people take the maximum of the distribution of sensory signals. These two ways of integrating sensory signals naturally result in different distributions of evidence: a Gaussian versus a Gumbel distribution over sensory evidence, respectively. To distinguish them, we compared these models on their ability to jointly fit data across different numbers of alternatives in a multiple-alternative forced-choice visual memory paradigm. Across two experiments, we found evidence that people pool sensory evidence via averaging or summing to make memory-based decisions. Furthermore, our findings suggest that this pooling process is robust; it is used both when people remember simple features (color) that are presented simultaneously (p<.001; dz = .77), and when they remember complex real-world objects that are presented sequentially (p<.001; dz = .81). This work elucidates the processes that link perception and memory, and opens avenues for establishing novel linking propositions between neural and cognitive models of visual memory.
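The contrast between the two integration rules can be illustrated with a small simulation (a sketch under assumed, arbitrary sizes, not the authors' model code): averaging many noisy sensory signals yields a symmetric, Gaussian-shaped decision variable, whereas taking their maximum yields a right-skewed, Gumbel-like one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not values from the study).
n_trials = 100_000   # simulated memory decisions
n_signals = 50       # assumed dimensionality of the sensory evidence

# Each trial yields a vector of noisy sensory signals.
signals = rng.normal(size=(n_trials, n_signals))

# Model 1: pool by averaging -> a Gaussian decision variable.
pooled = signals.mean(axis=1)

# Model 2: take the maximum -> approaches a Gumbel
# (extreme-value) distribution as n_signals grows.
maxed = signals.max(axis=1)

def skew(x):
    """Sample skewness: ~0 for a Gaussian, ~1.14 for a Gumbel."""
    z = (x - x.mean()) / x.std()
    return (z ** 3).mean()

skew_pooled = skew(pooled)  # near zero: symmetric, Gaussian-like
skew_maxed = skew(maxed)    # clearly positive: Gumbel-like right skew
print(skew_pooled, skew_maxed)
```

The differing shapes of the two decision-variable distributions are what allow joint fits across set sizes in a forced-choice paradigm to tell the models apart.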
