Vision Sciences Society Annual Meeting Abstract  |   September 2024
Sensor-based quantization of a color image disturbs material perception
Author Affiliations
  • Masataka Sawayama
    Graduate School of Information Science and Technology, The University of Tokyo, Japan
Journal of Vision September 2024, Vol. 24, 375. https://doi.org/10.1167/jov.24.10.375
Abstract

Materials in our daily environments undergo diverse color changes depending on environmental context. For instance, water is inherently colorless, but wetting a surface changes its color through optical interactions. Previous studies have explored the effect of color on material perception by following the object-recognition literature, i.e., by examining the effect of adding categorical colors to a grayscale image. However, unlike in object recognition, categorical colors are not always diagnostic of material changes because of this context dependence. To address this issue, the present study explores color dimensions that are diagnostic of material perception. Building on recent evidence that material perception depends on image color entropy (Sawayama et al., 2017), this study investigated the extent to which modulating image color entropy, here manipulated by color quantization in a sensor color space (e.g., RGB or LMS), affects material estimation. Specifically, the experiment leveraged a zero-shot prediction paradigm using pre-trained vision-and-language machine-learning models, with 2AFC text prompts related to material perception, such as wet/dry or glossy/matte. Visual images were drawn from the FMD (Sharan et al., 2014) and THINGS (Hebart et al., 2019) datasets. Color quantization was applied to each image with the median-cut algorithm, reducing the number of quantized colors from 128 to 2; grayscale versions of the original images were also created. Results showed that prediction probabilities were diversely distributed for the original and grayscale images across all dataset images. However, when an original image was modulated by color quantization, the distribution was biased heavily toward specific attributes, particularly dry and matte. Further experiments confirmed that color quantization has less impact on zero-shot object-recognition performance. These findings suggest that diverse material perception of an object image requires high color entropy in a color space that mixes chromatic and luminance components.
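For readers who want a concrete picture of the pipeline, the stimulus manipulation and zero-shot 2AFC readout can be sketched in a few lines of Python. The abstract does not name the specific vision-and-language model, the prompt wording, or the intermediate quantization levels between 128 and 2, so everything below is an assumption made for illustration: CLIP (openai/clip-vit-base-patch32, via Hugging Face transformers) stands in for the model, "a photo of a wet/dry surface" for the prompts, Pillow's median-cut quantizer for the quantization step, and 32 and 8 colors for the intermediate levels. Under one standard definition, the image color entropy that the quantization manipulates would be the Shannon entropy H = -Σ p_i log p_i of the quantized color histogram, which shrinks as the palette shrinks.

    # Minimal sketch, under the assumptions stated above; not the authors' code.
    import torch
    from PIL import Image
    from transformers import CLIPModel, CLIPProcessor

    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
    processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

    def median_cut_quantize(img, n_colors):
        # Median-cut palette reduction (Pillow >= 9.1 for the Quantize enum),
        # converted back to RGB so the model's preprocessor accepts it.
        return img.quantize(colors=n_colors,
                            method=Image.Quantize.MEDIANCUT).convert("RGB")

    def two_afc_probs(img, prompts):
        # Zero-shot 2AFC readout: softmax over the image-text similarity logits
        # for the two prompts.
        inputs = processor(text=prompts, images=img,
                           return_tensors="pt", padding=True)
        with torch.no_grad():
            logits = model(**inputs).logits_per_image  # shape (1, 2)
        return logits.softmax(dim=-1).squeeze(0)

    original = Image.open("example.jpg").convert("RGB")  # e.g., an FMD or THINGS image
    conditions = {"original": original,
                  "grayscale": original.convert("L").convert("RGB")}
    for k in (128, 32, 8, 2):  # levels spanning 128 down to 2 colors
        conditions[f"quantized_{k}"] = median_cut_quantize(original, k)

    wet_dry = ["a photo of a wet surface", "a photo of a dry surface"]  # hypothetical prompts
    for name, img in conditions.items():
        p_wet, p_dry = two_afc_probs(img, wet_dry)
        print(f"{name:>13}: P(wet)={p_wet:.3f}  P(dry)={p_dry:.3f}")

In this setup, the abstract's main result would appear as the probability mass collapsing onto the "dry" response (and onto "matte" with glossy/matte prompts) as the palette shrinks, while the original and grayscale conditions retain more varied responses across images.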
