Abstract
When we reach for an object, our eyes typically move toward it as well. This coordinated pattern of eye and hand movements directed to a common object is known as eye-hand coordination. We investigated interactions between eye and reaching movements in two types of behavioral tasks: 1) motor control, in which human subjects performed a memory-guided reaching task. A target was briefly presented (~100 ms) at one of 13 positions on a half circle (from −90° to 90°, in 15° steps). Behavioral results indicated a systematic bias in both eye and hand movements that correlated with reach angle. 2) motor learning, in which subjects received scaled visual feedback of their reaching endpoints. This feedback was scaled up or down as a function of reach angle. First, we applied a single scaled feedback at one location. Reaching movements to that location adapted according to the instruction, and the learning effect generalized to all other positions. Second, we applied two different scaled feedbacks at two locations separated by 90°, 60°, or 30°. An adaptation interference effect was found at these two locations, and the learned responses generalized to other positions differently depending on the separation angle. Our data indicate that humans can learn a relationship between two spatial instructions and generalize the learning effect across 2-D space.
Acknowledgement: CAS QYZDB-SSW-SMC019