September 2021, Volume 21, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract
Scene wheels: Measuring perception and memory of real-world scenes with a continuous stimulus space
Author Affiliations & Notes
  • Gaeun Son
    University of Toronto
  • Dirk B. Walther
    University of Toronto
  • Michael L. Mack
    University of Toronto
  • Footnotes
    Acknowledgements  This research was supported by Natural Sciences and Engineering Research Council (NSERC) Discovery Grants (RGPIN-2017-06753 to MLM and RGPIN-2020-04097 to DBW) and Canada Foundation for Innovation and Ontario Research Fund (36601 to MLM).
Journal of Vision September 2021, Vol.21, 2027. doi:https://doi.org/10.1167/jov.21.9.2027
      Gaeun Son, Dirk B. Walther, Michael L. Mack; Scene wheels: Measuring perception and memory of real-world scenes with a continuous stimulus space. Journal of Vision 2021;21(9):2027. https://doi.org/10.1167/jov.21.9.2027.
      © ARVO (1962-2015); The Authors (2016-present)
Abstract

Precisely characterizing mental representations of visual experiences requires careful control of experimental stimuli. Recent work leveraging such stimulus control in continuous report paradigms has led to important insights; however, these findings are constrained to simple visual properties like colour and line orientation. There remains a critical methodological barrier to characterizing perceptual and mnemonic representations of realistic visual experiences. Here, we introduce a novel method to systematically control visual properties of natural scene stimuli. Using generative adversarial networks (GANs), a state-of-the-art deep learning technique for creating highly realistic synthetic images, we generated scene wheels in which continuously changing visual properties smoothly transition between meaningful realistic scenes. To validate the efficacy of scene wheels, we conducted a memory experiment in which participants reconstructed to-be-remembered scenes from the scene wheels. Reconstruction errors for these scenes resemble error distributions observed in prior studies using simple stimulus properties. We additionally manipulated the radii of the wheels to parametrically control the similarity among the scenes in the wheels. With this manipulation, we found clear evidence that participants' memory performance systematically varied with the level of scene similarity. These findings suggest not only that our novel GAN-based approach to generating scene stimuli allows for an unprecedented level of stimulus control for complex scene stimuli, but also that the GAN's latent spaces generating our scene wheels reflect fundamental representational spaces important for human scene perception and memory.
Based on this level of control over the scene stimulus space, we expect that findings obtained with simple stimuli, such as colour wheels, will generalize to photo-realistic scenes, providing key insights into how we perceive and remember the real-world naturalistic environments that serve as the backdrop to our everyday experiences.
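The scene-wheel construction described in the abstract — a continuous ring of stimuli whose similarity is controlled by the wheel's radius — can be sketched as sampling evenly spaced points on a circle embedded in a GAN's latent space. The sketch below is an illustrative assumption, not the authors' implementation: the function name, the choice of an orthonormal plane, and the latent dimensionality are hypothetical, and a real pipeline would pass each latent vector through a scene-trained GAN generator to render the actual images.

```python
import numpy as np

def scene_wheel_latents(center, dim_a, dim_b, radius, n_points=360):
    """Sample latent vectors evenly spaced on a circle in a GAN latent space.

    center : base latent vector, shape (d,), the wheel's midpoint.
    dim_a, dim_b : orthonormal direction vectors spanning the wheel's plane.
    radius : distance of every wheel point from the center; a smaller
             radius yields more similar scenes around the wheel.
    Returns an array of shape (n_points, d), one latent per wheel angle.
    """
    angles = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    offsets = (radius * np.cos(angles)[:, None] * dim_a
               + radius * np.sin(angles)[:, None] * dim_b)
    return center[None, :] + offsets

# Hypothetical usage with a 512-dimensional latent space.
rng = np.random.default_rng(0)
d = 512
center = rng.standard_normal(d)
a = rng.standard_normal(d)
a /= np.linalg.norm(a)
b = rng.standard_normal(d)
b -= (b @ a) * a          # Gram-Schmidt: make b orthogonal to a
b /= np.linalg.norm(b)

wheel = scene_wheel_latents(center, a, b, radius=2.0)
# Every point sits at the same distance (the radius) from the center,
# so increasing the radius parametrically decreases scene similarity.
```

Because `dim_a` and `dim_b` are orthonormal, every sampled latent lies exactly `radius` away from `center`, which is what makes the radius a clean knob for scene similarity.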