Abstract
Observers experience affordance-specific biases in visual processing for objects within the hands' grasping space. Although these biases presumably reflect the visual system's adaptive sensitivity to behavioral contexts, the extent to which long-term evolutionary pressures versus short-term learning experiences tune visual cognition to facilitate action remains unclear. I investigated the hypothesis that altered vision near the hands results from experience-driven plasticity. Participants performed global-motion-detection and global-form-perception tasks under hands-near and hands-far conditions. In the hands-near condition, participants adopted a posture in which the backs of their hands faced the display, a posture placing stimuli outside of the hands' typical grasping space. In the hands-far condition, participants rested their hands in their laps. Participants then completed a brief training session in which they repeatedly grasped and moved an object by securing it between the backs of their hands, essentially learning a new power-grasp affordance. Following this training session, participants again performed the motion-detection and form-perception tasks under the hands-near and hands-far conditions. Prior to training, performance on both tasks was similar in the hands-near and hands-far conditions. Following training, however, participants' performance on the motion-detection task improved substantially when stimuli were viewed near the backs of the hands, whereas performance in all other conditions remained relatively unchanged. These results point to the mechanism responsible for altered vision near the hands, suggesting that these biases are plastic and facilitate processing of visual information relevant to learned grasp affordances: training with an unconventional power grasp enhanced temporal, but not spatial, sensitivity for stimuli viewed near the backs of the hands.
Meeting abstract presented at VSS 2016