July 2013
Volume 13, Issue 9
Vision Sciences Society Annual Meeting Abstract  |   July 2013
Can you see what you feel? Tactile and visual matching of material properties of fabrics
Author Affiliations
  • Bei Xiao
    Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA
  • Xiaodan Jia
    Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA
  • Edward Adelson
    Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA
    Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, MA
Journal of Vision July 2013, Vol.13, 197. doi:https://doi.org/10.1167/13.9.197
Bei Xiao, Xiaodan Jia, Edward Adelson; Can you see what you feel? Tactile and visual matching of material properties of fabrics. Journal of Vision 2013;13(9):197. https://doi.org/10.1167/13.9.197.

© ARVO (1962-2015); The Authors (2016-present)
Abstract

Material perception is typically studied by asking for verbal judgments about properties such as gloss. But words are limited. When you shop online for clothing, the picture wordlessly conveys haptic information: by looking at it, you can (hopefully) predict how the fabric will act and feel. The ability to make such predictions is an important benefit of material perception. We selected pairs of fabrics and presented them both visually ("look but don’t touch") and tactilely ("touch but don’t look"), asking observers to decide which matched which. Since there is a right answer, this allows an objective assessment of material perception. On each trial, observers used their hands (without looking) to arrange two pieces of fabric inside a box, matching them to a photograph of the same pair of fabrics displayed on a monitor for 12 seconds. We manipulated the folding condition and the color of the fabrics in the images. In Experiment 1, the fabrics were photographed in three folding conditions: draped over a bumpy object (3D draping), hanging from two adjacent corners (3D hanging), and framed on a flat foam board (2D flat). For each folding condition, 60 pairs of photographs were used. The data show that mean matching accuracy was higher for the 3D conditions than for the 2D condition. In Experiment 2, grayscale images of the same pairs of fabrics were used in the same shape conditions. We found that removing color reduced matching accuracy, but only in the 3D conditions. Together, our results show that richer visual information, revealing both mechanical structure and texture, leads to better tactile discrimination of fabrics. Specifically, color and 3D shape information (such as draping and deformation properties) are important visual cues for predicting the tactile properties of fabrics from images.

Meeting abstract presented at VSS 2013
