Vision Sciences Society Annual Meeting Abstract  |  September 2024
Journal of Vision, Volume 24, Issue 10  |  Open Access
Similarity-dependent memory integration of scene images
Author Affiliations
  • Simeng Guo
    School of Psychological and Cognitive Sciences, Peking University, Beijing, China
    PKU-IDG/McGovern Institute for Brain Research, Peking University, Beijing, China
    Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China
    Key Laboratory of Machine Perception (Ministry of Education), Peking University, Beijing, China
    National Key Laboratory of General Artificial Intelligence
  • Sheng Li
    School of Psychological and Cognitive Sciences, Peking University, Beijing, China
    PKU-IDG/McGovern Institute for Brain Research, Peking University, Beijing, China
    Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China
    Key Laboratory of Machine Perception (Ministry of Education), Peking University, Beijing, China
    National Key Laboratory of General Artificial Intelligence
Journal of Vision, September 2024, Vol. 24, 301. https://doi.org/10.1167/jov.24.10.301
Abstract

People often encounter novel events that resemble their previous experiences. An intriguing question is how such similar representations interact during learning. Previous studies suggested that feature-based similarity results in systematic memory distortions. The present study examined memory integration induced by learning similar scenes. We used generative adversarial networks (GANs) to generate scene wheels from which the to-be-remembered scenes were selected. In an online experiment (n = 59), we evaluated the perceptual similarity of images from the scene wheels and selected scene pairmates (A1 and A2) with varying perceptual similarities. In three main experiments (n = 27, 27, 28), A1 and A2 were paired with different images (B1 and B2) to form competitive associations. Subjects learned these associations with explicit knowledge that scenes paired with different images were always different (even though they might look similar). Importantly, learning of the competitive associations was temporally separated ("A1-B1" preceded its competitor, "A2-B2"). Across the three experiments, we found robust attractive memory distortion of A2 towards its highly similar competitor (A1). In Experiments 1 and 2 (with increased training on A2-B2), the attraction effects were asymmetric: memory of A1 was not biased relative to A2. Interestingly, in Experiment 3, in which training on A1-B1 was increased, the asymmetry disappeared: memories of A1 and A2 were biased towards each other. Moreover, we examined the consequences of these distortions. As expected, attractive distortions decreased the discriminability between highly similar associative memories. We unified these findings within a Hebbian learning framework and suggest that (1) greater coactivation between B2 and A1, compared with the coactivation between B1 and A2, caused the asymmetric integration, and (2) balanced coactivations eliminated this asymmetry. Collectively, we showed that similarity-dependent integration of complex visual experiences can cause asymmetric memory distortion, and that the degree of the asymmetry depends on the level of coactivation during integration.
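
A minimal sketch of the coactivation account described above (our own construction for intuition, not the authors' model; the wheel positions, coactivation strengths, and learning rate are all hypothetical). It treats each remembered scene as an angle on the scene wheel and lets Hebbian coactivation with a similar competitor pull the stored angle toward that competitor: unequal coactivations reproduce the asymmetric attraction of Experiments 1 and 2, while balanced coactivations reproduce the mutual attraction of Experiment 3.

    import numpy as np

    def hebbian_drift(theta_self, theta_other, coactivation, lr=0.1, steps=10):
        """Drift a remembered scene-wheel angle toward a coactivated competitor."""
        theta = theta_self
        for _ in range(steps):
            # Signed angular difference, wrapped into (-180, 180] degrees.
            delta = (theta_other - theta + 180.0) % 360.0 - 180.0
            # Hebbian attraction: stronger coactivation, stronger pull.
            theta += lr * coactivation * delta
        return theta

    theta_A1, theta_A2 = 100.0, 130.0  # hypothetical wheel positions (degrees)

    # Experiments 1-2: B2 strongly coactivates A1, but B1 barely coactivates A2,
    # so only the memory of A2 is attracted toward A1.
    a2_mem = hebbian_drift(theta_A2, theta_A1, coactivation=0.8)
    a1_mem = hebbian_drift(theta_A1, theta_A2, coactivation=0.1)
    print(f"asymmetric: A2 -> {a2_mem:.1f}, A1 -> {a1_mem:.1f}")

    # Experiment 3: extra A1-B1 training balances the coactivations,
    # so the two memories drift toward each other.
    a2_mem = hebbian_drift(theta_A2, theta_A1, coactivation=0.5)
    a1_mem = hebbian_drift(theta_A1, theta_A2, coactivation=0.5)
    print(f"balanced:   A2 -> {a2_mem:.1f}, A1 -> {a1_mem:.1f}")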
