August 2016
Volume 16, Issue 12
Open Access
Vision Sciences Society Annual Meeting Abstract | September 2016
Peripheral material perception
Author Affiliations
  • Shaiyan Keshvari
    Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology
  • Maarten Wijntjes
    Perceptual Intelligence lab, Industrial Design Engineering, Delft University of Technology
Journal of Vision September 2016, Vol.16, 641. doi:https://doi.org/10.1167/16.12.641
      Shaiyan Keshvari, Maarten Wijntjes; Peripheral material perception. Journal of Vision 2016;16(12):641. https://doi.org/10.1167/16.12.641.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Humans can rapidly detect and identify materials, such as a hardwood floor or a smooth leather jacket, in a visual scene. Prior research shows that visual texture plays an important role in how humans identify materials. Interestingly, recent models of peripheral vision suggest a texture-like encoding. This might mean that textures are well represented by peripheral vision, so a natural question is how well materials are perceived in the periphery, and whether such a peripheral vision model can predict this performance. We briefly presented windowed grayscale images of materials from the Flickr Materials Database (photographs of various materials at different scales and viewpoints) to 8 human subjects at 10 degrees of visual eccentricity. Subjects categorized each image (50 per category) into one of 6 material categories (fabric, foliage, leather, stone, water, and wood). To generate model predictions, we synthesized images with the same texture statistics as the originals, using Portilla-Simoncelli texture analysis-synthesis. Subjects free-viewed these synthesized images and performed the same material classification task. (Note that the full version of the model would extract statistics from multiple overlapping regions, not from a single region as here.) We found a small but significant correlation between performance in the peripheral and synthesized conditions. This provides evidence that the model can predict peripheral vision, including material perception. Each subject was well above chance in both conditions. Visual inspection of the individual stimuli suggests that images on which peripheral viewing outperformed the synthetic versions contain more large-scale shape cues, such as water droplets and individual leaves (foliage). Performance in both viewing conditions was good when images contained more small-scale texture properties, such as wooden or stone surfaces. Importantly, we examined performance on individual stimuli across subjects and found a large range in performance in both conditions, which we discuss in detail.
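The per-image comparison between the two viewing conditions can be sketched as follows. This is an illustrative reconstruction, not the authors' analysis code: the `pearson_r` helper and the per-image accuracy values are hypothetical, standing in for the fraction of the 8 subjects who classified each image correctly in each condition.

```python
# Illustrative sketch (not the authors' code): correlating per-image
# classification accuracy between the peripheral and synthesized conditions.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-image accuracies (fraction of 8 subjects correct per image)
peripheral  = [0.875, 0.500, 0.750, 0.250, 1.000, 0.625]
synthesized = [0.750, 0.375, 0.875, 0.125, 1.000, 0.500]

r = pearson_r(peripheral, synthesized)
```

A small but significant positive `r` across all stimuli would correspond to the relationship reported in the abstract.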

Meeting abstract presented at VSS 2016
