Abstract
There are many circumstances in which we are unable to touch an object, for example during online shopping. This can be a serious problem when haptic qualities are an important aspect of the product experience. Here, we investigated whether this inability to touch can be alleviated by a visual substitute. In Experiment 1, we measured visuo-haptic matching of fabric samples. Nine pieces of jeans fabric were filmed in six different styles (e.g., hands pinching the cloth or a downward flutter movement). Twenty observers matched haptically presented fabrics to the visual representations: ten were shown the actual movies, and the other ten served as a control group shown stills from the movies. Results indicate that movies enrich the sensory information sufficiently to improve the prediction of haptic apparel properties, although the effect is modest. In Experiment 2, we compared haptic similarity judgments of all possible cloth pairs with visual estimations of haptic similarities. We hypothesized that haptic judgments would be closer to visual judgments based on the movies than to those based merely on movie stills. A total of 32 observers participated in this experiment: 16 based their judgments on haptic perception, 9 on the movies, and 7 on the stills. Results revealed high correlations (r ≈ 0.8–0.9) between the two visual conditions. Surprisingly, we found no correlation between the similarity judgments based on visual information and those based on haptic information. These results were fairly constant across the various movie styles. The result of Experiment 2 implies that observers perceive the haptic qualities of fabrics differently when actually touching them than through their visual substitutes. Thus, although performance improved with dynamic rather than static visual information (Experiment 1), the visual communication of haptic properties must be further improved to optimally match actual haptic sensations.
Meeting abstract presented at VSS 2016