August 2009
Volume 9, Issue 8
Vision Sciences Society Annual Meeting Abstract  |   August 2009
Visual-haptic integration during pointing movements
Author Affiliations
  • Sascha Serwe
    Department for General and Experimental Psychology, JLU Giessen, Germany
  • Julia Trommershäuser
    Department for General and Experimental Psychology, JLU Giessen, Germany
  • Konrad Körding
    Rehabilitation Institute of Chicago, Northwestern University, Chicago, IL, USA
Journal of Vision August 2009, Vol.9, 707. doi:https://doi.org/10.1167/9.8.707
© ARVO (1962-2015); The Authors (2016-present)
Abstract

Many perceptual cue combination studies have shown that humans integrate information across modalities, as well as within a modality, in a manner statistically close to optimal. Here we asked whether the same rules hold in the context of movement planning tasks. We tested this in a pointing task where information about the target location was provided by haptic, visual, or combined visual-haptic feedback during the pointing movement. Visual information was provided by briefly flashing three dots sampled from a Gaussian around the target position with a standard deviation of 4 cm. Haptic information was provided by pushing the index finger upwards using a PHANToM haptic interface; the strength of the force pulse (1 N to 3.5 N) indicated the target position. We measured the distance from the hit point to the target location, and subjects earned money for minimizing this distance. We could well account for these data after extending the common maximum-a-posteriori (MAP) model for cue combination by adding a term that compensates for motor noise. Our model assumes that subjects select a target point by optimally combining a prior and all available sensory information. In addition, motor noise is present in both unimodal and bimodal trials and cannot be reduced further. The model parameters were fitted for all conditions simultaneously on a trial-by-trial basis. The model accurately predicts visual and haptic weights as well as subjects' performance. To test whether synchronicity influences the way the nervous system combines cues, we also analyzed situations in which visual and haptic information was presented with temporal disparity. We find that for our sensorimotor task temporal disparity had no effect. Sensorimotor learning appears to converge to the same near-optimal rules for cue combination that are used by perception and to make use of all the available information.
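The core of the MAP cue-combination model described above is precision weighting: each Gaussian information source (prior, visual cue, haptic cue) contributes to the combined estimate in proportion to its reliability (inverse variance). The sketch below is not the authors' code; it is a minimal illustration of that standard computation, with made-up example numbers (only the 4 cm visual noise comes from the abstract).

```python
def map_estimate(means, sigmas):
    """Precision-weighted combination of Gaussian information sources.

    means  : list of source means (e.g. prior, visual, haptic)
    sigmas : matching list of standard deviations
    Returns the MAP estimate and the posterior standard deviation.
    """
    # Reliability of each source is its precision, 1 / sigma^2.
    precisions = [1.0 / s**2 for s in sigmas]
    total = sum(precisions)
    # MAP estimate for Gaussians: precision-weighted mean.
    estimate = sum(m * p for m, p in zip(means, precisions)) / total
    # Posterior variance is the inverse of the summed precisions.
    return estimate, total ** -0.5

# Illustrative numbers only: a broad prior (sigma 20 cm), a visual cue
# with 4 cm noise as in the abstract, and a hypothetical haptic cue.
est, sigma = map_estimate(means=[10.0, 9.0, 8.0], sigmas=[20.0, 4.0, 2.0])
```

Note that the combined posterior is always narrower than the most reliable single cue, which is the signature of near-optimal integration that the bimodal trials test for; the full model in the abstract additionally adds irreducible motor noise on top of this sensory posterior.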

Serwe, S., Trommershäuser, J., & Körding, K. (2009). Visual-haptic integration during pointing movements [Abstract]. Journal of Vision, 9(8):707, 707a, http://journalofvision.org/9/8/707/, doi:10.1167/9.8.707.