September 2024
Volume 24, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2024
The psychophysics of style
Author Affiliations & Notes
  • Tal Boger
    Johns Hopkins University
  • Chaz Firestone
    Johns Hopkins University
  • Acknowledgements: NSF BCS 2021053, NSF GRFP 2023351964
Journal of Vision September 2024, Vol.24, 398. doi:https://doi.org/10.1167/jov.24.10.398
Citation: Tal Boger, Chaz Firestone; The psychophysics of style. Journal of Vision 2024;24(10):398. https://doi.org/10.1167/jov.24.10.398.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Images vary not only in content, but also in style. When viewing a Monet painting, for example, we see both the scenery it depicts (lilies dotting a pond) and the manner in which it does so (broken brushstrokes, blended colors, etc.). Parsing images in this way is a remarkable perceptual achievement, akin to separating illumination and reflectance to achieve color constancy, or disentangling letter-identities from typefaces when reading. What is the nature of this process, and what are its psychophysical signatures? Here, 9 experiments reveal 3 new phenomena of style perception. (1) Style tuning. Using neural style-transfer models, we rendered natural scenes in the styles of famous artists. Then, inspired by ‘font tuning’ (wherein text is easier to read in a single typeface than multiple typefaces), we asked observers to scan arrays of images and enumerate all scenes of one type (e.g., mountains). Observers were faster and more accurate in same-style arrays than mixed-style arrays [E1–E2]. Such tuning accumulated over time [E3] and survived controls for color and luminance [E4]. (2) Style discounting. Analogous to ‘discounting the illuminant’ in color constancy, we find that vision ‘discounts’ style. Changes to a scene’s content (e.g., Monet-pond → Monet-building) were more easily detected than changes to its style (Monet-pond → Klimt-pond; E5), even when low-level image statistics predicted the opposite [E6]. (3) Style extrapolation. After viewing items in a given style (e.g., a fork and knife from one cutlery set), observers misremembered seeing additional items from that style (the spoon from that set; E7), even with low-level similarity equated across lures [E8–E9]. Such errors suggest spontaneous representation of the unseen items — as if mentally ‘rendering’ objects in newly learned styles. While we typically associate style with more qualitative approaches, our work explores how tools from vision research can illuminate its psychological basis.
