Abstract
Material perception is typically studied by asking for verbal judgments about properties such as gloss. But words are limited. When you shop online for clothing, the picture wordlessly conveys haptic information: by looking at the picture you can (hopefully) predict how the fabric will act and feel. The ability to make such predictions is an important benefit of material perception. We selected a pair of fabrics and presented them both visually ("look but don’t touch") or tactilely ("touch but don’t look"), and asked the observers to decide which matched which. Since there is a right answer, this allows an objective assessment of material perception. On each trial, observers were asked to arrange two pieces of fabric with their hands (without looking) inside a box, matching them to a photograph of the same pair of fabrics displayed on a monitor for 12 seconds. We manipulated the folding conditions and the color of the fabrics in the images. In Experiment 1, the fabrics were photographed in three folding conditions: draped over a bumpy object (3D draping), hanging from two adjacent corners (3D hanging), and framed on a flat foam board (2D flat). For each folding condition, 60 pairs of photographs of the fabrics were used. The data show that mean matching accuracy was higher for the two 3D conditions than for the 2D condition. In Experiment 2, grayscale images of the same pairs of fabrics were used in the same folding conditions. We found that removing color reduced matching accuracy, but only for the 3D conditions. Together, our results show that richer visual information, revealing both mechanical structure and texture, leads to better tactile discrimination of fabrics. Specifically, color and 3D information (such as draping and deformation properties) are important visual cues for predicting the tactile properties of fabrics from images.
Meeting abstract presented at VSS 2013