Sergei Gepshtein, Martin S. Banks, Carmel A. Levitan; How sight and touch are combined depends on viewing geometry. Journal of Vision 2002;2(7):399. doi: 10.1167/2.7.399.
© ARVO (1962-2015); The Authors (2016-present)
We studied how the brain combines information from sight and touch in size perception. Contrary to earlier views that sight always dominates touch, recent evidence shows that the more visual information is corrupted by external noise, the more perceived size depends on touch. We asked whether touch information affects size perception when size is hard to measure visually. In a 2-IFC procedure, each interval contained two planes. Observers indicated the interval containing the more widely separated planes. Visual stimuli were depicted with stereograms, and touch (haptic) stimuli with force-feedback devices (PHANToMs). The planes were presented in three orientations: (1) perpendicular to the line of sight, so the separation had to be judged from disparity-specified depth alone; (2) parallel to the line of sight, so the separation could be judged from 2-D separation; and (3) 45 deg relative to the line of sight. Observers made size discriminations for these orientations with vision alone (V task), touch alone (T task), and with vision and touch together (VT task). In the VT task, the visual and haptic sizes were the same in one interval and differed in the other. We found that the just-discriminable separation in the V task depended on the planes' orientation: it was significantly higher when they were perpendicular as opposed to parallel to the line of sight. Performance in the T task did not depend on orientation. Because the visual estimates varied with orientation and the touch estimates did not, optimal use of the two sources of information predicts visual dominance in the parallel orientation and more touch influence in the perpendicular orientation. Results in the VT task were very consistent with this prediction. Thus, the brain combines information from sight and touch in a way that depends on stimulus orientation. This is evidence that the weights given to sight and touch vary according to the relative reliabilities of the two information sources.
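The "optimal use of the two sources" mentioned above is commonly modeled as reliability-weighted (maximum-likelihood) cue combination: each cue's weight is its inverse variance divided by the sum of the inverse variances. The sketch below illustrates that model only; the function name, the separation values, and the variance values are illustrative assumptions, not data from this study.

```python
def combine_cues(est_v, var_v, est_t, var_t):
    """Reliability-weighted (maximum-likelihood) combination of two cues.

    Each cue's weight is its reliability (inverse variance) normalized by
    the total reliability. The combined variance is never larger than the
    smaller of the two single-cue variances.
    """
    r_v, r_t = 1.0 / var_v, 1.0 / var_t      # reliabilities
    w_v = r_v / (r_v + r_t)                  # visual weight
    w_t = r_t / (r_v + r_t)                  # haptic weight
    est = w_v * est_v + w_t * est_t          # combined size estimate
    var = 1.0 / (r_v + r_t)                  # combined variance
    return est, var, w_v

# Planes parallel to the line of sight: vision is reliable, so it dominates.
# (Hypothetical numbers: visual estimate 50 mm, variance 1; haptic 52 mm, variance 4.)
est_par, var_par, w_par = combine_cues(50.0, 1.0, 52.0, 4.0)

# Planes perpendicular: disparity-specified size is noisier, so touch gains weight.
# (Hypothetical: visual variance inflated to 9; haptic unchanged.)
est_perp, var_perp, w_perp = combine_cues(50.0, 9.0, 52.0, 4.0)
```

On these hypothetical numbers, the visual weight drops from 0.8 (parallel) to about 0.31 (perpendicular), mirroring the shift toward touch influence that the VT results show.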