Alexis D. J. Makin, Tushar Chauhan; Memory-guided tracking through physical space and feature space. Journal of Vision 2014;14(13):10. doi: https://doi.org/10.1167/14.13.10.
People can estimate the current position of an occluded moving target. This is called motion extrapolation, and it has been suggested that performance in such tasks is mediated by the smooth-pursuit system. Experiment 1 contrasted a standard position-extrapolation task with a novel number-extrapolation task. In the position-extrapolation task, participants saw a horizontally moving target become occluded, then responded when they thought the target had reached the end of the occluder. This stimulus can be tracked with pursuit eye movements. In the number-extrapolation task, participants saw a rapid on-screen countdown that disappeared before reaching zero, and responded when they thought the hidden counter would have reached zero. Although this stimulus cannot be tracked with the eyes, performance was comparable on both tasks, and response times on the two tasks were correlated. Experiments 2 and 3 extended these findings to extrapolation through color space as well as number space, and Experiment 4 found modest evidence for similarities between color and number extrapolation. Although more research is needed, we propose that a common rate controller guides extrapolation through both physical space and feature space. This controller functions like the velocity-store module of the smooth-pursuit system, but with a broader role than previously envisaged.
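To make the proposed common computation concrete: both tasks can be reduced to estimating when a hidden quantity, changing at a constant rate, reaches its endpoint. The sketch below is purely illustrative and is not from the paper; the function name and all parameter values are hypothetical, chosen only to show that position extrapolation and number extrapolation share the same remaining-quantity-over-rate structure.

```python
# Illustrative sketch only (not the authors' model). A single rate-controller
# computation covers both tasks: time-to-endpoint = remaining quantity / rate.

def extrapolated_response_time(remaining, rate):
    """Time (s) until a hidden quantity reaches its endpoint,
    given the remaining quantity and its constant rate of change."""
    return remaining / rate

# Position task (hypothetical values): target occluded 6 deg of visual angle
# before the occluder's far edge, moving at 12 deg/s.
position_rt = extrapolated_response_time(remaining=6.0, rate=12.0)

# Number task (hypothetical values): countdown vanishes at 20,
# decrementing at 40 units/s.
number_rt = extrapolated_response_time(remaining=20.0, rate=40.0)

print(position_rt, number_rt)  # identical under these assumed values: 0.5 0.5
```

Under this reading, the correlated response times across tasks would reflect shared noise in a single rate estimate rather than shared oculomotor tracking.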