Abstract
Haptic feedback consists of the tactile and kinesthetic information available during physical interaction with surfaces and objects. When grasping an object, haptic information specifies object size and therefore may influence grasp kinematics over multiple grasps. In real-world grasping movements, the visual size of an object exactly corresponds to its haptically specified size, confounding the relative influences of these two modalities on grasp performance. As a result, accurate grasp performance in repeated grasping tasks cannot be attributed entirely to veridical encoding of visual size. In these experiments, we independently controlled the visual and haptic information presented during repeated grasping of vertically oriented rectangular bars. Visual bar sizes were presented using a computer display and an oblique mirror such that the bars appeared at eye height 40 cm in front of the seated observer. Haptic size information was provided by physical bars presented at the same 3D locations as the visual bars. Vision of the hand and of the physical setup was occluded, making it impossible to detect size mismatches. In Experiment 1, three physical bars were grasped in a random order in three blocks of 30 grasps. In each block, the visual bars appeared smaller (-4 mm), larger (+4 mm), or equal in size relative to the physical bars. Maximum grip aperture (MGA) scaling matched the physical size differences in all three blocks, whereas we found no modulation due to visual size variation. To further investigate the time course of visuo-haptic integration, we ran a follow-up experiment examining how MGA scaling develops over 14 repeated grasps of bars with seven different visuo-haptic size combinations. Our results suggest that visual information initially dominates in determining the size of the maximum grip aperture, but haptic information gradually overrides visual information in fewer than ten trials and enables physically accurate grasp performance.
Meeting abstract presented at VSS 2015