Walter Gerbino, Joanna Jarmolowska, Carlo Fantoni; Visuo-Haptic 3D Interpolation Shapes Amodally Completed Angles. Journal of Vision 2016;16(12):1195. doi: 10.1167/16.12.1195.
According to action-perception coupling, extraretinal signals can disambiguate optic information, as in the contribution of head movements to the visual interpolation of twisted surfaces (Fantoni, Gerbino, Milani, Domini, JoV 2014). Going beyond approaches that exploit only the geometry of optic fragments, we focus here on the integration of self-generated haptic and optic information in 3D amodal completion. Does the precise shape of interpolated 3D surfaces depend on haptic information? We asked observers to reach and touch the nearest (visually amodal) portion of a convex 90° random-dot dihedron centrally occluded by a horizontal bar, glowing in the dark at eye height, with its virtual edge in the 500-550 mm depth range along the sagittal axis. As soon as the observer moved her hand, our HighRes Virtual Reality environment generated the 3D structure and a "magic finger", marked by a light point that provided visual feedback of the index tip. The finger could pass through the occluder and reach the amodally completed surface, conveying tactile feedback. Using four coplanar gauge probes and four randomly interleaved staircases, we measured the location of the (visually amodal) surface while the observer was (modally) touching it, in three visual feedback conditions: normal (coincident with the actual finger position); 70 mm farther away; 70 mm closer. Relative to the normal feedback, the farther feedback molded the interpolated surface towards the good-continuation solution, as if the surface were attracted by the objective index position (not the visually experienced one), while the closer feedback molded the interpolated surface farther away, in the direction of the minimal-path solution. This is the first evidence of visuo-haptic 3D interpolation: when generating amodal surfaces, our "shape modeler" includes modal haptic information (derived from hand proprioception), going beyond geometric constraints defined by the co-circularity of optic fragments.
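The measurement procedure — randomly interleaved adaptive staircases converging on the perceived depth of the interpolated surface — can be sketched in code. This is a minimal illustration of a generic 1-up/1-down interleaved-staircase design, not the authors' implementation; the step size, starting depths, reversal criterion, and the simulated observer are all illustrative assumptions.

```python
import random

class Staircase:
    """One 1-up/1-down staircase over probe depth along the sagittal axis (mm)."""

    def __init__(self, start_mm, step_mm, n_reversals=8):
        self.level = start_mm
        self.step = step_mm
        self.n_reversals = n_reversals
        self.reversals = []
        self.last_direction = 0

    @property
    def done(self):
        return len(self.reversals) >= self.n_reversals

    def update(self, probe_seen_nearer):
        # Move the probe away if judged nearer than the surface, and vice versa;
        # a change of direction counts as a reversal.
        direction = 1 if probe_seen_nearer else -1
        if self.last_direction and direction != self.last_direction:
            self.reversals.append(self.level)
        self.last_direction = direction
        self.level += direction * self.step

    def estimate(self):
        # Point of subjective equality: mean of the reversal levels.
        return sum(self.reversals) / len(self.reversals)

def run_interleaved(staircases, respond):
    """On each trial, pick one unfinished staircase at random and update it."""
    while any(not s.done for s in staircases):
        s = random.choice([s for s in staircases if not s.done])
        s.update(respond(s.level))
    return [s.estimate() for s in staircases]

# Simulated observer whose perceived surface depth is 525 mm (hypothetical value).
random.seed(1)
estimates = run_interleaved(
    [Staircase(start, 5.0) for start in (480, 500, 550, 570)],
    respond=lambda level_mm: level_mm < 525,
)
```

Interleaving the four staircases prevents the observer from tracking any single adaptive track, while their separate estimates can be averaged (or compared across gauge-probe positions) to localize the interpolated surface.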
Meeting abstract presented at VSS 2016