September 2021 | Volume 21, Issue 9 | Open Access
Vision Sciences Society Annual Meeting Abstract
Reconstructing physical representations of block towers in visual working memory
Author Affiliations & Notes
  • Stefan Uddenberg, University of Chicago Booth School of Business
  • JunHyeok Kwak, Yale University
  • Brian Scholl, Yale University
  • Acknowledgements: This project was funded by ONR MURI #N00014-16-1-2007 awarded to BJS.
Journal of Vision September 2021, Vol.21, 2929. doi:https://doi.org/10.1167/jov.21.9.2929
Abstract

Recent studies have explored the perception of physical properties (such as mass and stability) in psychology, neuroscience, and AI, and perhaps the most popular stimulus from such studies is the block tower -- since such displays (of stacked rectilinear objects) evoke immediate visual impressions of physical (in)stability. Here we explored a maximally simple question: what properties are represented during natural viewing of such stimuli? Previous work on this question has been limited in two ways. First, such studies typically involve explicit judgments ("Which way will it fall?"), which may prompt encoding strategies that would not otherwise operate automatically. Second, such studies can typically only explore those tower properties that are systematically manipulated as explicit independent variables. Here we attempted to overcome such limitations in an especially direct way: observers viewed a briefly-flashed block tower, and then immediately *reproduced* its structure from memory -- by dragging and dropping an array of blocks (initially presented on the simulated ground plane) using a custom 3D interface. This allowed us to directly measure the success of reproductions in terms of both lower-level image properties (e.g. the blocks' colors/orientations) and higher-level physical properties (e.g. when comparing the stability of the initial towers and their reproductions). Analyses revealed two types of evidence for the visual representation of 'invisible' abstract physical properties. First, the (in)stability of the reproductions (computed, e.g., in terms of the blocks' summed displacements from their original positions, as analyzed in a physics engine with simulated gravity) could not be directly predicted by lower-level image properties (such as the blocks' initial heights or spread). Second, reproductions of unstable towers tended to be more stable, but not vice versa. This work demonstrates how physical representations in visual memory can be revealed, all without ever asking anyone anything about physics.
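
The instability measure the abstract describes (summed displacement of blocks after simulated gravity) can be illustrated concretely. Below is a minimal sketch, not the authors' implementation: it assumes a pybullet simulation, unit-mass cubic blocks, and roughly one second of simulated time, all of which are illustrative choices rather than details from the study.

```python
import numpy as np
import pybullet as p

def tower_instability(block_positions, half_extents=(0.5, 0.5, 0.5),
                      mass=1.0, steps=240):
    """Drop a block configuration into a physics engine and return the
    summed displacement of the blocks from their starting positions.
    block_positions: list of (x, y, z) block centers, with z measured
    from a ground plane at z = 0."""
    p.connect(p.DIRECT)                # headless (no GUI) simulation
    p.setGravity(0, 0, -9.8)

    # Static ground plane at z = 0
    plane = p.createCollisionShape(p.GEOM_PLANE)
    p.createMultiBody(baseMass=0, baseCollisionShapeIndex=plane)

    # One rigid box per block, placed at its observed position
    box = p.createCollisionShape(p.GEOM_BOX, halfExtents=list(half_extents))
    bodies = [p.createMultiBody(baseMass=mass,
                                baseCollisionShapeIndex=box,
                                basePosition=list(pos))
              for pos in block_positions]

    # Let gravity act (~1 s at pybullet's default 240 Hz timestep)
    for _ in range(steps):
        p.stepSimulation()

    # Instability = total distance the blocks travelled from where they began
    total = 0.0
    for body, start in zip(bodies, block_positions):
        end, _ = p.getBasePositionAndOrientation(body)
        total += float(np.linalg.norm(np.array(end) - np.array(start)))

    p.disconnect()
    return total

# Example: a two-block tower whose top block overhangs the base enough to topple.
# (Positions are hypothetical; the towers in the study were more complex.)
print(tower_instability([(0.0, 0.0, 0.5), (0.6, 0.0, 1.5)]))
```

Under a metric like this, a reproduction that scores lower than the original tower counts as "more stable," which is the direction of the asymmetry reported in the abstract.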
