Abstract
Multi-sensory feedback about the shape of an object provides information about its physical reality. This study examined how visual cues are integrated with simple haptic shape perception during three-dimensional angle discrimination. Participants explored a pair of depth-rotated planes joined to form a concave angle. A force-feedback haptic device (Phantom Omni) let them touch the virtual planes: its articulated arm moved freely until the cursor reached a virtual plane, at which point the arm resisted further motion while still allowing the stylus to slide along the surface. Angles ranged from 45 to 135 degrees in 5-degree increments. In a two-alternative forced-choice task, participants judged whether each angle was greater or less than a right angle; the right angle was chosen for its familiarity in everyday life. Two kinds of two-dimensional visual cues were manipulated: a cursor location cue and a plane displacement cue. On a CRT display, the cursor location cue indicated the motion of the articulated arm, and the plane displacement cue consisted of a main vertical line and additional lines indicating the location of the angle. The data were fitted with a logistic function; the discrimination threshold was estimated at 86 degrees with the location cue and 101 degrees without it, whereas the displacement cue was ineffective. This angle difference was robust across a series of experiments with a variety of visual cues, and is attributed to the integration of the 3D haptic plane with the 2D visual cue.
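The threshold estimation described above — fitting two-alternative forced-choice responses with a logistic function and reading off the angle at which "greater" responses reach 50% — can be sketched as follows. This is a minimal illustration using synthetic data, not the study's actual responses; the trial counts, slope, and seed are assumptions for the example.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(angle, threshold, slope):
    """Probability of responding 'greater than a right angle'."""
    return 1.0 / (1.0 + np.exp(-(angle - threshold) / slope))

# Angles from 45 to 135 degrees in 5-degree increments, as in the study.
angles = np.arange(45, 140, 5)

# Hypothetical observer: threshold near 86 deg (the reported value with
# the location cue); 40 trials per angle is an assumed number.
true_threshold, true_slope, n_trials = 86.0, 8.0, 40
rng = np.random.default_rng(0)
p_greater = rng.binomial(n_trials, logistic(angles, true_threshold, true_slope)) / n_trials

# Fit the logistic function; the threshold is the 50% point of the curve.
(threshold, slope), _ = curve_fit(logistic, angles, p_greater, p0=[90.0, 10.0])
print(f"estimated discrimination threshold: {threshold:.1f} deg")
```

With real data, the same fit would be run separately for each cue condition, and the difference between fitted thresholds (e.g. 86 vs. 101 degrees) quantifies the effect of the visual cue.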