August 2023
Volume 23, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2023
Multisensory grasping relies on individual finger positions and their joint relationship
Author Affiliations & Notes
  • Ivan Camponogara
    New York University Abu Dhabi
  • Robert Volcic
    New York University Abu Dhabi
  • Footnotes
    Acknowledgements  This work was partially supported by the NYUAD Center for Artificial Intelligence and Robotics, funded by Tamkeen under the NYUAD Research Institute Award CG010.
Journal of Vision August 2023, Vol. 23, 4720. doi: https://doi.org/10.1167/jov.23.9.4720
Abstract

Visuo-haptically guided actions (e.g., grasping a handheld object) are more accurate than actions guided by vision or haptics alone. This multisensory advantage originates from the additional haptic positional information provided by the hand holding the object. However, it is still unclear whether grasping relies on the fingers' average position, which provides the overall object location, or on the individual finger positions, which provide information about the object's sides. Here we contrasted these hypotheses by introducing visuo-haptic size incongruencies. We varied the size of the held lower part of an object (30, 40, or 50 mm) while the grasped upper part remained constant in size (40 mm). We then compared Visuo-Haptic (VH) grasping with two additional conditions in which participants (n = 25) could either only see the object (Visual, V) or only hold its lower part (Haptic, H). We found a modulation of grip aperture in both H and VH but not in V, with haptics in VH accounting for 25% of these changes. This suggests that the multisensory advantage is not based on overall object location alone. In a second experiment (n = 27), we contrasted the separate and joint contributions of the individual fingers holding the object. We compared VH with two additional conditions in which only the index finger or only the thumb was contacting the back or the front side of the object, respectively. Grip aperture was similar across conditions, but it was modulated by the size of the held part only when both fingers contacted the object (VH condition). In contrast, when the index finger and thumb contacted the object separately, grasping movements were shifted toward each finger. Our results suggest that multisensory grasping relies on individual finger positions and their joint relationship, which together provide information about the positions of the object's sides and their mutual distance.
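The reported 25% haptic contribution to grip-aperture changes can be pictured with a simple linear cue-weighting sketch. This is an illustrative assumption, not the authors' analysis: the function name, the linear form, and the fixed weight of 0.25 are hypothetical, chosen only so that the predicted aperture changes by 25% of the change in the held part's size.

```python
def predicted_aperture(visual_size_mm, haptic_size_mm, w_haptic=0.25):
    """Hypothetical linear cue-weighting: the grip aperture is a
    weighted mix of the visually specified size of the grasped part
    and the haptically sensed size of the held part. With
    w_haptic = 0.25, a 10 mm change in the held part shifts the
    predicted aperture by 2.5 mm, i.e., 25% of the change."""
    return (1 - w_haptic) * visual_size_mm + w_haptic * haptic_size_mm

# The grasped upper part is always 40 mm; the held lower part varies.
for held in (30, 40, 50):
    print(held, predicted_aperture(40, held))
# → 30 37.5
# → 40 40.0
# → 50 42.5
```

Under this toy model, a purely visual condition corresponds to w_haptic = 0, which predicts the absence of modulation observed in the V condition.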
