Abstract
The ability to extrapolate the motion of objects that move in straight paths at a fixed velocity after they pass behind an occluding surface has been shown to be poor, but to improve when the surface has landmarks (Pylyshyn & Cohen, ARVO 1999).i Here we extend this study by having observers tap on the screen to indicate where they believed the moving object was when it was occluded vs. when it was visible, and when there were landmarks along the route. We also recorded eye movements to investigate whether gaze plays a role in imagined motion-tracking. Method. Forty participants tracked a square moving left-to-right on a display screen, and twenty tracked the square moving right-to-left (~20 5-second trials for 2 occlusion and 2 landmark conditions). Subjects indicated the position of the tracked (but hidden) object with a finger tap when signaled by a randomly timed sound probe. Results. Localization is most accurate when the object is always visible, but when it is occluded, gaze and touch localizations undershoot the actual target position regardless of movement direction or landmark presence. The response lag is greater for gaze, except when the probe occurs early (<1.9 s) or when subjects are familiar with the motion path (e.g., when a block of occlusion trials precedes the non-occlusion trials). We will discuss how this lag bias may reflect the coding of object position in the eye-movement system and may guide imagined localization and tracking accuracy. Furthermore, we will describe an unexpected finding,ii suggesting that our eyes may serve as a "place-holder" that maintains the position of a tracked but non-visible object.
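For concreteness, the following sketch (in Python) shows one way the undershoot measure could be computed from an indicated (tap or gaze) position and the probe time, under the constant-velocity extrapolation described above. All names and numbers here (start_x, velocity_px_per_s, the probe time, the tapped position) are illustrative assumptions, not parameters or data from the study.

    # Hypothetical sketch: signed localization error for a target moving at
    # fixed velocity along a straight horizontal path.

    def extrapolated_x(start_x, velocity_px_per_s, t_s):
        """Position the target actually occupies t_s seconds after motion onset."""
        return start_x + velocity_px_per_s * t_s

    def undershoot(indicated_x, actual_x, velocity_px_per_s):
        """Signed error along the direction of motion: positive values mean the
        response lags behind the target (undershoot); negative values mean it
        is ahead of the target (overshoot)."""
        direction = 1.0 if velocity_px_per_s >= 0 else -1.0
        return (actual_x - indicated_x) * direction

    if __name__ == "__main__":
        # Illustrative numbers only: the target starts at x = 100 px, moves at
        # 150 px/s rightward, and the sound probe occurs 2.4 s after motion onset.
        actual = extrapolated_x(start_x=100.0, velocity_px_per_s=150.0, t_s=2.4)
        tapped = 410.0  # where the observer tapped when probed (hypothetical)
        print(f"actual target position: {actual:.0f} px")                 # 460 px
        print(f"undershoot: {undershoot(tapped, actual, 150.0):.0f} px")  # 50 px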
i Pylyshyn, Z. W., & Cohen, J. (May 1999). Imagined extrapolation of uniform motion is not continuous. Paper presented at the Annual Conference of the Association for Research in Vision and Ophthalmology, Ft. Lauderdale, FL.
ii Immediately following the sound probe, the eyes tend to remain in approximately the same position until the response is made.
Meeting abstract presented at VSS 2012