December 2022
Volume 22, Issue 14
Open Access
Vision Sciences Society Annual Meeting Abstract
Sensory and motor sources of delay in visuomotor tracking: a model for continuous psychophysics
Author Affiliations
  • Joshua Ryu
    Stanford University
  • Justin Gardner
    Stanford University
Journal of Vision December 2022, Vol. 22, 4396.
      Joshua Ryu, Justin Gardner; Sensory and motor sources of delay in visuomotor tracking: a model for continuous psychophysics. Journal of Vision 2022;22(14):4396.

Human visual tracking behavior, such as moving a mouse cursor to follow a visual target, shows systematic changes with target visibility. In particular, with a less visible stimulus, humans integrate the stimulus for longer and their behavior is more delayed. We hypothesized that these changes in tracking behavior reflect a tradeoff between sensory demands (longer integration to reduce uncertainty) and motor demands (increasing movement costs for accuracy). We tested this hypothesis through model comparison on human tracking data.

Subjects performed a continuous psychophysics task, tracking a Gaussian blob undergoing Brownian motion with a mouse cursor while we manipulated the stimulus signal-to-noise ratio (SNR). We cross-correlated the stimulus velocities with the cursor velocities and found that the kernels have a stereotypical delayed, gamma-like shape: a 100–200 ms delay, followed by a peak at 300–500 ms and a slow return to baseline. Furthermore, replicating earlier results (Bonnen et al., 2015), both the delay to the peak and the width of the kernel increased systematically as the stimulus SNR decreased.

Using the human tracking cross-correlations as a benchmark, we compared the behavior of two ideal observer models: one with an optimal control module that includes a control cost, and one without. Both models included a visuoperceptual module implementing a Kalman filter, a Bayesian model that integrates uncertain visual information using an internal dynamics model. The full model additionally included an optimal control module that imposed a linear cost on movements. The Kalman filter alone showed an exponential decay in its cross-correlation kernel, with time constants that increased with sensory uncertainty. The full model, in contrast, produced a gamma-like kernel whose delay and width also increased systematically with stimulus uncertainty.
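The velocity cross-correlation analysis described above can be sketched as follows. This is a minimal illustration, not the authors' analysis code: the function name, the normalization choices, and the lag convention (positive lags mean the cursor follows the stimulus) are assumptions made here for clarity.

```python
import numpy as np

def tracking_kernel(stim_pos, cursor_pos, dt, max_lag_s=1.0):
    """Cross-correlate stimulus velocity with cursor velocity.

    Returns (lags_in_seconds, correlation), where the correlation at a
    positive lag measures how strongly cursor velocity echoes stimulus
    velocity that occurred `lag` seconds earlier.
    """
    # Differentiate positions to get velocities.
    v_stim = np.diff(stim_pos) / dt
    v_cur = np.diff(cursor_pos) / dt

    # Z-score so the result is a correlation rather than a raw covariance.
    v_stim = (v_stim - v_stim.mean()) / v_stim.std()
    v_cur = (v_cur - v_cur.mean()) / v_cur.std()

    n = len(v_stim)
    max_lag = int(max_lag_s / dt)
    lag_samples = np.arange(max_lag + 1)

    # Correlation at lag k: stimulus velocity at time t vs cursor velocity
    # at time t + k (cursor lagging the stimulus).
    ccg = np.array([np.mean(v_stim[: n - k] * v_cur[k:]) for k in lag_samples])
    return lag_samples * dt, ccg
```

Applied to a simulated tracker that simply copies a Brownian target with a fixed delay, the kernel peaks at that delay; on human data one would instead expect the delayed, gamma-like shape reported in the abstract.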
Model comparison suggests that the dynamics of human tracking behavior arise not just from sensory constraints but from a combination of sensory and motor constraints.
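The Kalman-filter-only prediction mentioned above can be illustrated with a toy 1D example, assuming (as a simplification introduced here, not stated in the abstract) a random-walk target observed with additive Gaussian noise. Its impulse response to observations is an exponential, k(1-k)^t, whose time constant grows as observation noise increases — matching the exponential-decay kernels the abstract attributes to the filter alone.

```python
import numpy as np

def steady_state_gain(q, r):
    """Steady-state Kalman gain for a 1D random walk.

    q: process-noise variance (target diffusion), r: observation-noise
    variance. Found by iterating the discrete Riccati recursion.
    """
    p = q
    for _ in range(1000):
        p_pred = p + q                 # predict: variance grows by q
        k = p_pred / (p_pred + r)      # gain
        p = (1.0 - k) * p_pred         # posterior variance
    return k

def kalman_track(obs, k):
    """Run a steady-state Kalman filter over a sequence of observations."""
    est = np.zeros_like(obs)
    x = 0.0
    for t, y in enumerate(obs):
        # Random-walk prediction leaves x unchanged; then correct toward y.
        x = x + k * (y - x)
        est[t] = x
    return est
```

Feeding a unit impulse through `kalman_track` yields the kernel k(1-k)^t directly: noisier observations give a smaller gain k, hence a slower exponential decay, i.e. a longer integration time constant — but no delayed peak. Reproducing the gamma-like human kernel requires the additional motor-cost (optimal control) module described in the abstract.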

