Vision Sciences Society Annual Meeting Abstract  |  September 2016
Journal of Vision, Volume 16, Issue 12  |  Open Access
Visuo-Haptic 3D Interpolation Shapes Amodally Completed Angles
Author Affiliations
  • Walter Gerbino
    Psychology Unit, Department of Life Sciences, University of Trieste
  • Joanna Jarmolowska
    Psychology Unit, Department of Life Sciences, University of Trieste
  • Carlo Fantoni
    Psychology Unit, Department of Life Sciences, University of Trieste
Journal of Vision September 2016, Vol.16, 1195. doi:10.1167/16.12.1195
© ARVO (1962–2015); The Authors (2016–present)
Abstract

According to action-perception coupling, extraretinal signals can disambiguate optic information, as in the contribution of head movements to the visual interpolation of twisted surfaces (Fantoni, Gerbino, Milani, Domini, JoV 2014). Going beyond approaches that exploit only the geometry of optic fragments, we focus here on the integration of self-generated haptic and optic information in 3D amodal completion. Does the precise shape of interpolated 3D surfaces depend on haptic information? We asked observers to reach and touch the nearest (visually amodal) portion of a convex 90° random-dot dihedron, glowing in the dark at eye height and centrally occluded by a horizontal bar, with its virtual edge in the 500–550 mm depth range along the sagittal axis. As soon as the observer moved her hand, our HighRes Virtual Reality environment generated the 3D structure and a "magic finger", marked by a light point that provided visual feedback of the index fingertip. The finger could go through the occluder and reach the amodally completed surface, conveying tactile feedback. Using four coplanar gauge probes and four randomly interleaved staircases, we measured the location of the (visually amodal) surface while the observer was (modally) touching it, in three visual feedback conditions: normal (coincident with the actual finger position); 70 mm farther away; 70 mm closer. Relative to the normal feedback, the farther feedback molded the interpolated surface towards the good-continuation solution, as if it were attracted by the objective index position (not the visually experienced one), while the closer feedback molded the interpolated surface farther away, in the direction of the minimal-path solution. This is the first evidence of visuo-haptic 3D interpolation: when generating amodal surfaces, our "shape modeler" includes modal haptic information (derived from hand proprioception), going beyond geometric constraints defined by the co-circularity of optic fragments.

Meeting abstract presented at VSS 2016
