September 2021
Volume 21, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract
The mental representation of materials distilled from >1.5 million similarity judgements
Author Affiliations & Notes
  • Filipp Schmidt
    Justus-Liebig-University Giessen, Giessen, Germany
    Center for Mind, Brain and Behavior (CMBB), Marburg and Giessen, Germany
  • Martin N. Hebart
    Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
  • Alexandra Schmid
    National Institutes of Health, Bethesda, MD, USA
  • Roland W. Fleming
    Justus-Liebig-University Giessen, Giessen, Germany
    Center for Mind, Brain and Behavior (CMBB), Marburg and Giessen, Germany
  • Footnotes
    Acknowledgements  This work was funded by the Max Planck Society, the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation)–project number 222641018–SFB/TRR 135 TP C1 and by the European Research Council (ERC) Consolidator Award ‘SHAPE’–project number ERC-CoG-2015-682859.
Journal of Vision September 2021, Vol.21, 1981. doi:https://doi.org/10.1167/jov.21.9.1981
      Filipp Schmidt, Martin N. Hebart, Alexandra Schmid, Roland W. Fleming; The mental representation of materials distilled from >1.5 million similarity judgements. Journal of Vision 2021;21(9):1981. https://doi.org/10.1167/jov.21.9.1981.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Our ability to visually discriminate, categorize, recognize and compare materials is crucial for many tasks. Given the enormous variety of materials and variability in their appearance, what visual information do we rely on for distinguishing and comparing them? Here, we sought to uncover the major dimensions in our mental representation of materials, using a large-scale data-driven approach. First, we identified 200 diverse material classes sampled systematically from nouns in the American English language and collected three high-quality photographs for each class. Next, we used crowdsourcing to collect >1.5 million judgments asking which of two materials randomly chosen from the set was more similar to a third reference material, where the unchosen material acted as a context to highlight the relevant dimensions shared by the other two. We described each material image as a sparse, non-negative vector in a multidimensional representational space, modeled the assumed cognitive process for predicting choices, and iteratively used the difference between predicted choice probability and actual choice to adapt the dimensions' weights. The resulting model predicted material similarity judgments in an independent test set with >90% accuracy relative to the human noise ceiling and allowed us to accurately construct the similarity matrix of all 600 images. Similar to recent findings in the visual perception of objects, individual material images can be described as a combination of a small number of material dimensions (36 in total). These dimensions are highly reproducible and interpretable, comprising material categories, color and texture attributes, and more complex (e.g., mechanical) properties.
Our computational model and resulting dimensions have broad application for studying material perception and its natural dimensions, such as predicting context-specific similarities, providing perceptual scales along which to rate new materials, testing the validity of ratings of material properties commonly used in the visual perception literature, and comparing behavioral representations to cortical representations.
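The fitting procedure described above can be sketched in miniature: each item is a sparse, non-negative embedding; the probability of choosing option a over option b as more similar to reference r is a softmax over the dot-product similarities; and the embeddings are updated from the difference between predicted and actual choices. The sketch below is an illustrative toy under invented assumptions (20 items, 5 dimensions, simulated choices), not the authors' implementation; the study used 600 images, 36 dimensions, and >1.5 million human judgments.

```python
# Toy sketch: fit a sparse, non-negative embedding from triplet similarity
# judgments ("which of a or b is more similar to reference r?").
# Scale and hyperparameters are invented for demonstration only.
import numpy as np

rng = np.random.default_rng(0)
n_items, n_dims = 20, 5  # toy scale (study: 600 images, 36 dimensions)

# Ground-truth embedding, used only to simulate choices.
true_X = rng.exponential(1.0, (n_items, n_dims))

def simulate_choice(X, r, a, b, rng):
    """Sample a choice: softmax over dot-product similarities to reference r."""
    sa, sb = X[r] @ X[a], X[r] @ X[b]
    m = max(sa, sb)  # stable softmax
    p_a = np.exp(sa - m) / (np.exp(sa - m) + np.exp(sb - m))
    return a if rng.random() < p_a else b

# Simulated training set of triplets (reference, option a, option b, choice).
triplets = []
for _ in range(5000):
    r, a, b = rng.choice(n_items, 3, replace=False)
    triplets.append((r, a, b, simulate_choice(true_X, r, a, b, rng)))

def choice_accuracy(X, triplets):
    """Fraction of triplets where the higher-similarity option was chosen."""
    hits = sum((X[r] @ X[a] > X[r] @ X[b]) == (c == a)
               for r, a, b, c in triplets)
    return hits / len(triplets)

# Fit by gradient ascent on the choice log-likelihood, with a small L1
# penalty (sparsity) and non-negativity enforced by clipping.
X = rng.random((n_items, n_dims)) * 0.1
lr, l1 = 0.05, 1e-4

for epoch in range(10):
    for r, a, b, chosen in triplets:
        sa, sb = X[r] @ X[a], X[r] @ X[b]
        m = max(sa, sb)
        p_a = np.exp(sa - m) / (np.exp(sa - m) + np.exp(sb - m))
        err = (1.0 if chosen == a else 0.0) - p_a  # actual minus predicted
        # Gradients of the log-likelihood w.r.t. the three rows involved.
        g_r, g_a, g_b = err * (X[a] - X[b]), err * X[r], -err * X[r]
        X[r] += lr * g_r
        X[a] += lr * g_a
        X[b] += lr * g_b
        for i in (r, a, b):  # L1 shrinkage, then clip to keep weights >= 0
            X[i] = np.clip(X[i] - lr * l1, 0.0, None)

print(f"choice accuracy after fitting: {choice_accuracy(X, triplets):.2f}")
```

In this toy setup the fitted embedding recovers most simulated choices well above the 50% chance level; the L1 term and the non-negativity clip are what push dimensions toward the sparse, interpretable solutions the abstract describes.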
