Vision Sciences Society Annual Meeting Abstract | May 2008
The brain integrates visual and haptic information from different spatial locations when using a tool
Author Affiliations
  • Chie Takahashi
    School of Psychology, Bangor University, UK
  • Jörn Diedrichsen
    School of Psychology, Bangor University, UK
  • Simon Watt
    School of Psychology, Bangor University, UK
Journal of Vision, May 2008, Vol. 8(6):1060. doi: https://doi.org/10.1167/8.6.1060
Abstract

Several studies suggest that size estimates from visual and haptic modalities are combined in a statistically optimal fashion (Ernst & Banks, 2002). Recently, Gepshtein et al. (2005) showed that optimal integration occurs only if the two signals originate from the same perceived spatial location. Here we show that when using a tool to explore an object haptically, optimal integration is largely restored despite spatial separation between haptic and visual input. Our method was similar to Gepshtein et al.'s. We used a two-interval forced-choice task to measure discrimination thresholds for judging the distance between two planes. We first measured thresholds in visual- and haptic-alone conditions, in order to predict performance when both cues were available. We then measured thresholds when both cues were available, at spatial offsets between 0 and 100 mm, in (i) a no-tool condition, and (ii) a with-tool condition, in which subjects grasped the stimulus with visually defined “sticks” attached to the finger and thumb. Tool length varied with spatial offset so that the ends of the tool always aligned with the visual stimulus. The tool was extinguished before contact with the planes, so the information available was identical in no-tool and with-tool conditions. In the no-tool condition we replicated Gepshtein et al.'s result. With zero offset, two-cue thresholds were as predicted by optimal cue integration, increasing to single-cue levels at 100 mm offset. In the with-tool condition, thresholds were significantly lower than single-cue performance at all spatial offsets and were close, though not equal, to the prediction of optimal cue integration. We conclude that the amount of integration of visual and haptic information is not determined by the spatial separation of the hand and visual object, but by a more sophisticated mapping process that can take into account the dynamics and geometry of tools.
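To make the "predicted performance" explicit: under the optimal (maximum-likelihood) integration model of Ernst & Banks (2002), the predicted two-cue threshold follows directly from the single-cue thresholds, with each cue weighted by its relative reliability. The sketch below illustrates that standard prediction; the numeric threshold values are hypothetical examples, not data from this study.

```python
import numpy as np

def predicted_combined_threshold(sigma_v, sigma_h):
    """Predicted visual-haptic discrimination threshold under optimal
    (maximum-likelihood) cue integration (Ernst & Banks, 2002).
    sigma_v, sigma_h: single-cue thresholds, proportional to the standard
    deviations of the visual and haptic size estimates."""
    return np.sqrt((sigma_v**2 * sigma_h**2) / (sigma_v**2 + sigma_h**2))

def cue_weights(sigma_v, sigma_h):
    """Reliability-based weights on the visual and haptic estimates."""
    w_v = sigma_h**2 / (sigma_v**2 + sigma_h**2)
    return w_v, 1.0 - w_v

# Hypothetical single-cue thresholds (mm), for illustration only.
sigma_v, sigma_h = 4.0, 6.0
print(predicted_combined_threshold(sigma_v, sigma_h))  # ~3.33 mm, below either single cue
print(cue_weights(sigma_v, sigma_h))                   # (~0.69 visual, ~0.31 haptic)
```

Because the predicted combined threshold is always at or below the better single-cue threshold, two-cue thresholds that fall to this level (as at zero offset, and approximately so in the with-tool condition) are the signature of near-optimal integration, whereas thresholds at single-cue levels indicate that the signals were not combined.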

Takahashi, C., Diedrichsen, J., & Watt, S. (2008). The brain integrates visual and haptic information from different spatial locations when using a tool [Abstract]. Journal of Vision, 8(6):1060, 1060a, http://journalofvision.org/8/6/1060/, doi:10.1167/8.6.1060.