Journal of Vision, October 2020, Volume 20, Issue 11 (Open Access)
Vision Sciences Society Annual Meeting Abstract  |   October 2020
Global scene similarity structure predicts memory performance
Author Affiliations & Notes
  • Hayden M Schill
    Department of Psychology, University of California, San Diego
  • Timothy F Brady
    Department of Psychology, University of California, San Diego
  • Footnotes
    Acknowledgements  NSF BCS-1829434 to TFB
Journal of Vision October 2020, Vol.20, 614. doi:https://doi.org/10.1167/jov.20.11.614
Abstract

Memory for low-level features such as color and orientation can be explained by a signal-detection model that takes perceptual similarity into account (Schurgin, Wixted, & Brady, 2018). Such similarity falls off as an approximately exponential function of distance in perceptual space, similar to the extent of overlap in the corresponding neural populations. Can perceptual similarity judgments for higher-level representations such as scenes predict scene memory? In scenes, similarity judgments and memory must depend on much richer representations than simple overlap in a single neural population (e.g., categorization depends on function: Greene et al., 2016; memory depends on conceptual overlap: Konkle et al., 2010). To assess this, we created a new continuous scene-space database by extracting temporally evenly spaced frames from videos shot from drones, resulting in 100 unique categories, each containing a gradient of scene similarity. In a similarity task, we presented N=100 UCSD undergraduate participants with two images at a time and asked them to rate their similarity on a continuous scale. As expected, similarity judgments were complex and not well explained by simple low-level feature overlap, though global measures such as color histograms and histograms of oriented gradients did predict a significant amount of variance in the similarity ratings. To test whether similarity ratings predict memory confusability, we then conducted an independent memory experiment (N=200) on Prolific, in which participants viewed 100 categorically distinct images and then completed a 2-AFC memory test. We found that participants' similarity judgments explained almost 50% of the explainable variance in memory performance (p<0.0001). Thus, memory confusability is linearly predicted by independent similarity ratings. Both are themselves complex, however, depending on functional and conceptual features rather than perceptual features. This is broadly consistent with the case of simple features, as well as with theories of recognition memory that depend on similarity (e.g., Nosofsky, 1992).
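As a rough illustration of the two analyses mentioned in the abstract, the Python sketch below computes global color-histogram and HOG descriptors for a pair of scenes and estimates how much variance one measure (e.g., rated similarity) explains in another (e.g., 2-AFC memory accuracy) via a simple linear fit. This is not the authors' analysis pipeline; the descriptor settings, helper names, and placeholder arrays are assumptions for illustration only.

# Minimal sketch (assumed, not the authors' code): global image descriptors
# and a simple variance-explained measure relating similarity to memory.
import numpy as np
from skimage.io import imread
from skimage.color import rgb2gray
from skimage.feature import hog
from scipy.stats import pearsonr

def color_histogram(img, bins=8):
    # Joint RGB histogram, normalized to sum to 1 (a global color descriptor).
    h, _ = np.histogramdd(img.reshape(-1, 3), bins=bins, range=[(0, 255)] * 3)
    return (h / h.sum()).ravel()

def hog_descriptor(img):
    # Histogram of oriented gradients on the grayscale image (global shape/texture).
    # Assumes all images in the set share the same dimensions.
    return hog(rgb2gray(img), orientations=9,
               pixels_per_cell=(32, 32), cells_per_block=(2, 2))

def low_level_distances(path_a, path_b):
    # Euclidean distances between the two scenes' color and HOG descriptors.
    a, b = imread(path_a), imread(path_b)
    d_color = np.linalg.norm(color_histogram(a) - color_histogram(b))
    d_hog = np.linalg.norm(hog_descriptor(a) - hog_descriptor(b))
    return d_color, d_hog

def variance_explained(x, y):
    # R^2 of a simple linear (Pearson) relationship, e.g., mean rated similarity
    # per pair (x) against 2-AFC memory accuracy (y); both arrays are hypothetical.
    r, _ = pearsonr(x, y)
    return r ** 2

Comparing the R^2 obtained from these low-level distances against the R^2 obtained from human similarity ratings is one way to make concrete the abstract's claim that low-level feature overlap only partially accounts for rated similarity, whereas the ratings themselves predict memory confusability well.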
