Abstract
Human visual tracking behavior, such as moving a mouse cursor to follow a visual target, changes systematically with target visibility. In particular, with a less visible stimulus, humans integrate the stimulus over a longer window and their behavior is more delayed. We hypothesized that these changes in tracking behavior reflect a tradeoff between sensory demands (longer integration to reduce uncertainty) and motor demands (the movement costs of tracking accurately). We tested this hypothesis through model comparison on human tracking data. Subjects performed a continuous psychophysics task, tracking a Gaussian blob undergoing Brownian motion with a mouse cursor, while we manipulated the stimulus signal-to-noise ratio (SNR). We cross-correlated the stimulus velocities with the cursor velocities and found that the resulting kernels had a stereotyped, delayed, gamma-like shape: a 100–200 ms delay, followed by a peak at 300–500 ms and a slow return to baseline. Furthermore, replicating earlier results (Bonnen et al., 2015), both the delay to the peak and the width of the kernel increased systematically as the stimulus SNR decreased. Using the human tracking cross-correlations as a benchmark, we compared the behavior of two ideal observer models: one with an optimal control stage that includes a movement cost and one without. Both models included a visuoperceptual module implementing a Kalman filter, a Bayesian model that integrates uncertain visual information using an internal dynamics model. The full model additionally included an optimal control module that imposed a linear cost on movements. The Kalman filter alone produced an exponentially decaying cross-correlation kernel, with a time constant that increased with sensory uncertainty. The full model, in contrast, produced a gamma-like kernel whose delay and width also increased systematically with stimulus uncertainty. Model comparison thus suggests that the dynamics of human tracking behavior arise not from sensory constraints alone but from a combination of sensory and motor constraints.
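To make the model comparison concrete, below is a minimal Python sketch, not the authors' implementation: the function names (simulate, xcorr_kernel) and parameters (q for target step variance, r for observation noise, lam for the movement-cost weight) are illustrative choices. It simulates scalar, discrete-time tracking of a Brownian target by (a) a Kalman-filter-only observer and (b) a Kalman filter followed by a simple proportional controller standing in for an optimal controller with a movement cost, and then estimates the velocity cross-correlation kernel described above.

```python
import numpy as np

rng = np.random.default_rng(0)


def simulate(n_steps=20000, q=1.0, r=4.0, lam=None):
    """Simulate tracking of a Brownian target (illustrative parameters).

    q   : variance of the target's random-walk steps
    r   : observation noise variance (lower stimulus SNR -> larger r)
    lam : if None, the cursor follows the Kalman estimate directly;
          otherwise a proportional controller with gain 1/(1 + lam)
          moves the cursor toward the estimate (a crude stand-in for
          an optimal controller with a movement cost).
    """
    # Steady-state Kalman gain for the scalar random-walk model.
    p = q
    for _ in range(200):
        p_pred = p + q
        k = p_pred / (p_pred + r)
        p = (1.0 - k) * p_pred

    x = 0.0        # true target position
    x_hat = 0.0    # posterior mean (Kalman estimate)
    c = 0.0        # cursor position
    target, cursor = np.empty(n_steps), np.empty(n_steps)
    for t in range(n_steps):
        x += rng.normal(0.0, np.sqrt(q))        # Brownian target motion
        y = x + rng.normal(0.0, np.sqrt(r))     # noisy observation of the blob
        x_hat = x_hat + k * (y - x_hat)         # Kalman update
        if lam is None:
            c = x_hat                           # pure perceptual model
        else:
            c += (x_hat - c) / (1.0 + lam)      # costly-movement controller
        target[t], cursor[t] = x, c
    return target, cursor


def xcorr_kernel(target, cursor, max_lag=60):
    """Cross-correlate target velocity with cursor velocity at lags >= 0."""
    v_t = np.diff(target)
    v_c = np.diff(cursor)
    v_t = (v_t - v_t.mean()) / v_t.std()
    v_c = (v_c - v_c.mean()) / v_c.std()
    n = len(v_t)
    return np.array([np.dot(v_t[: n - lag], v_c[lag:]) / (n - lag)
                     for lag in range(max_lag)])


# Kalman-filter-only observer: kernel decays from lag zero.
k_exp = xcorr_kernel(*simulate(r=4.0, lam=None))
# Kalman filter plus movement cost: delayed, gamma-like kernel.
k_gamma = xcorr_kernel(*simulate(r=4.0, lam=3.0))
```

In this simplified setup, the Kalman-only cursor gives a kernel that peaks at lag zero and decays exponentially, the costly-movement cursor gives a delayed, peaked kernel, and increasing the observation noise r (i.e., lowering the stimulus SNR) widens both, qualitatively mirroring the pattern summarized in the abstract.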