Martin Stritzke, Julia Trommershauser; Guidance of eye movements by vision and hand. Journal of Vision 2005;5(8):593. doi: 10.1167/5.8.593.
© ARVO (1962-2015); The Authors (2016-present)
Vision is needed to guide the hand towards a visually specified target. In everyday situations, however, hitting or missing the target with the hand leads to different consequences than hitting or missing it with the eye. Here we asked how eye and hand interact in a task in which movements of the hand, but not of the eye, lead to monetary consequences for the movement planner. In a series of four experiments, we measured eye movements during a video-game-like pointing task. In the first three experiments, subjects were instructed to rapidly touch a target region (green) on a screen while trying not to hit a nearby penalty region (red). Each target hit yielded a gain of points; each penalty hit incurred a loss of points. Late responses were penalized. In the first experiment, the penalty was a filled red disk and the target a hollow green circle, presented on a grey background. In the second experiment, the penalty was hollow and the target was filled. In the third experiment, the stimulus configuration was the same as in the first experiment, but it disappeared as soon as the pointing movement was initiated. Penalty value, overlap of the circles, and stimulus locations were varied. In a control experiment (experiment 4), subjects performed a visual judgement, indicating whether the target was on the right or the left side of the penalty. Four subjects completed all experiments. In experiments 1, 2, and 4, subjects made two saccades on average; this number was slightly lower in the third experiment. In all experiments, the majority of landing points of the first saccade fell within the region of the filled circle, i.e., the more salient stimulus. The landing point of the second saccade shifted closer towards the touch point of the finger. In most trials, the second saccade was concluded before the finger hit the screen. We conclude that eye movements during pointing tasks are guided by both the visual properties of the stimulus and the hand.