Abstract
Bonnen et al. (2015) proposed a Kalman filter model for estimating the positional sensitivity of visual perception from a continuous motion tracking task. The current study tested this model with stimuli defined by different types of visual information (i.e., luminance contrast vs. color contrast) and with two different tracking tasks (i.e., hand tracking vs. eye pursuit). On each trial, subjects (N = 20) viewed for 60 seconds a Gabor patch that moved randomly on the screen and was defined against the gray background by either luminance contrast (3 levels) or color contrast (L-M and S axes in DKL color space, 3 levels each); they were asked either to track the patch with a stylus or to fixate on it. The tracking data from each trial were fitted with a Kalman filter model, using Gibbs sampling to estimate the posterior distribution of perceptual noise. Each condition was run twice to test the reliability of the method. We also measured position thresholds with a traditional discrimination task using an adaptive procedure (60 trials per level). In the hand tracking task, we observed moderate-to-high correlations (r = .62–.69) between the estimated posterior means of perceptual noise and the fitted thresholds across subjects and contrast levels for all three stimulus types. In addition, the estimates from the hand tracking task were highly reliable (r = .986–.994) between the two runs of the same condition. In contrast, estimates from the eye pursuit task showed little correlation (r < .14) with the thresholds from the traditional procedure, possibly because the oculomotor system is insensitive to small, irregular movements of the target as long as it remains within the foveal area. We conclude that the spatial sensitivity of visual perception can be estimated effectively from continuous motion tracking by hand, but not by eye.
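As background for the modelling referred to above, the equations below give a minimal sketch of a Kalman-filter tracking observer of the kind introduced by Bonnen et al. (2015); the notation, the random-walk target assumption, and the Gaussian noise terms are illustrative assumptions and not necessarily the exact formulation fitted in this study.

\[
x_t = x_{t-1} + w_t, \quad w_t \sim \mathcal{N}(0, \sigma_w^2) \qquad \text{(random-walk target position)}
\]
\[
y_t = x_t + v_t, \quad v_t \sim \mathcal{N}(0, \sigma_v^2) \qquad \text{(noisy perceptual observation)}
\]
\[
\hat{x}_t = \hat{x}_{t-1} + k\,(y_t - \hat{x}_{t-1}) \qquad \text{(observer's position estimate with steady-state gain } k\text{)}
\]

In this class of model, the steady-state gain k is determined by the ratio of target motion variance to observation noise variance, so the lag and smoothing of the tracking response carry information about the perceptual noise \(\sigma_v\), the quantity whose posterior is estimated here and compared with the discrimination thresholds.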