Vision Sciences Society Annual Meeting Abstract | September 2011
Optimal visual and proprioceptive cue integration in motion perception
Author Affiliations
  • Bo Hu
    Center for Visual Science, University of Rochester, USA
  • Grayson Sipe
    Neuroscience Graduate Program, University of Rochester, USA
  • David Knill
    Center for Visual Science, University of Rochester, USA
Journal of Vision September 2011, Vol.11, 785. doi:https://doi.org/10.1167/11.11.785
Abstract

When we move our hands, both visual and proprioceptive inputs provide information about the motion. In a two-part study, we show that subjects integrate these two modalities in a Bayesian-optimal way. In the first part, we measured the reliabilities of subjects' estimates of movement direction using only proprioceptive or only visual motion information. In the proprioceptive condition, a robot arm moved a manipulandum held by subjects 15 cm back and forth along linear trajectories whose directions were sampled uniformly from −35 to −25 and from 55 to 65 degrees relative to the midsagittal plane. Subjects judged whether the motion direction presented in a second interval was clockwise relative to that of the first interval. In the visual condition, subjects made similar judgments of spatiotemporally correlated noise patterns moving with the same velocity profiles as the robot in the proprioceptive condition. We modulated the reliability of the visual information by using two different signal-to-noise ratios (SNRs) in the visual stimuli. In the second part, the robot moved subjects' hands behind a mirror while they viewed similar visual patterns, spatially co-aligned with the manipulandum held by subjects. The visual motion either matched that of the robot or deviated in direction by ±10 degrees. Subjects adjusted a dial to indicate their perceived motion direction. Subjects completed two counterbalanced sessions of proprioceptive and visual discrimination trials and two sessions of the visual-proprioceptive adjustment task. The reliability of each modality was computed from psychometric functions fitted to the proprioceptive-only and vision-only discrimination data. Relative cue weights were estimated by regressing subjects' direction judgments in the last two sessions against the directions suggested by each cue. Subjects gave slightly more weight to vision in the high visual SNR condition and relied less on the visual cue in the low SNR condition, as predicted by the threshold data.
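To make the analysis concrete, the sketch below (Python, not the authors' code) illustrates the first step: fitting a cumulative Gaussian psychometric function to single-cue discrimination data and taking each modality's reliability as the inverse of its squared discrimination threshold. The function names and data values are hypothetical, for illustration only.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def cumulative_gaussian(delta, mu, sigma):
    """P('second interval clockwise') as a function of the
    direction difference between intervals (degrees)."""
    return norm.cdf(delta, loc=mu, scale=sigma)

def fit_reliability(direction_diffs, prop_clockwise):
    """Fit the psychometric function; sigma indexes the discrimination
    threshold, so reliability = 1 / sigma**2."""
    (mu, sigma), _ = curve_fit(cumulative_gaussian, direction_diffs,
                               prop_clockwise, p0=[0.0, 5.0])
    return 1.0 / sigma**2

# Hypothetical single-cue data: direction differences (degrees) and the
# proportion of 'clockwise' responses at each difference.
diffs = np.array([-8.0, -4.0, -2.0, 0.0, 2.0, 4.0, 8.0])
p_visual = np.array([0.02, 0.16, 0.31, 0.50, 0.69, 0.84, 0.98])
p_proprio = np.array([0.10, 0.27, 0.39, 0.50, 0.61, 0.73, 0.90])

r_visual = fit_reliability(diffs, p_visual)     # vision-only sessions
r_proprio = fit_reliability(diffs, p_proprio)   # proprioceptive-only sessions
```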

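A second sketch, under the same assumptions, covers the model comparison: the Bayesian-optimal prediction weights each cue in proportion to its reliability, and the empirical weights are estimated by regressing perceived direction on the directions suggested by each cue in the ±10 degree conflict trials. All numbers below are simulated, not the study's data.

```python
import numpy as np

# Hypothetical reliabilities (1 / sigma**2) carried over from the
# single-cue psychometric fits.
r_visual, r_proprio = 1 / 4.0**2, 1 / 6.5**2

# Bayesian-optimal prediction: weights proportional to cue reliabilities.
w_visual_pred = r_visual / (r_visual + r_proprio)

# Empirical weights: regress perceived direction on the directions
# suggested by each cue. On conflict trials the visual direction equals
# the hand direction plus 0 or +/-10 degrees.
rng = np.random.default_rng(0)
theta_hand = rng.uniform(-35.0, -25.0, size=200)
theta_vis = theta_hand + rng.choice([-10.0, 0.0, 10.0], size=200)
# Simulated dial settings consistent with the predicted weight, plus noise.
responses = (w_visual_pred * theta_vis + (1 - w_visual_pred) * theta_hand
             + rng.normal(0.0, 2.0, size=200))

X = np.column_stack([theta_vis, theta_hand])
w_visual_emp, w_hand_emp = np.linalg.lstsq(X, responses, rcond=None)[0]
print(f"predicted visual weight {w_visual_pred:.2f}, "
      f"empirical {w_visual_emp:.2f}")
```

Under the optimal model, the regression weights should sum to roughly one and track the reliability-based prediction; this is the comparison the abstract reports across the two visual SNR conditions.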