Jody C. Culham; Neuroimaging reveals the human neural representations for visually guided grasping of real objects and pictures. Journal of Vision 2017;17(10):383. doi: https://doi.org/10.1167/17.10.383.
© ARVO (1962-2015); The Authors (2016-present)
Neuroimaging, particularly functional magnetic resonance imaging (fMRI), has revealed many human brain areas that are involved in the processing of visual information for the planning and guidance of actions. One area of particular interest is the anterior intraparietal sulcus (aIPS), which is thought to play a key role in processing information about object shape for the visual control of grasping. However, much fMRI research has relied on artificial stimuli, such as two-dimensional photos, and artificial actions, such as pantomimed grasping. Recent fMRI studies from our lab have used representational similarity analysis on the patterns of fMRI activation from brain areas such as aIPS to infer neural coding in participants performing real actions upon real objects. This research has revealed which visual features of the object (particularly elongation) and which aspects of the grasp (including the number of digits and the precision required) are coded in aIPS and other regions. Moreover, this work has suggested that these neural representations are affected by the realness of the object, particularly during grasping. Taken together, these results highlight the value of using more ecological paradigms to study sensorimotor control.
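As a rough illustration of the representational similarity analysis mentioned above: a condition-by-voxel matrix of activation patterns is converted into a representational dissimilarity matrix (RDM), which is then compared against a model RDM by rank correlation. This is a minimal sketch, not the authors' actual pipeline; the synthetic data, condition count, and voxel count are illustrative assumptions.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(patterns):
    """Condensed RDM: 1 - Pearson correlation between each pair of
    condition patterns. patterns has shape (n_conditions, n_voxels)."""
    return pdist(patterns, metric="correlation")

# Synthetic stand-ins for real data (assumed sizes: 6 conditions, 50 voxels),
# e.g. conditions could be object shapes crossed with grasp types.
rng = np.random.default_rng(0)
neural_patterns = rng.standard_normal((6, 50))  # e.g. voxel betas from aIPS
model_patterns = rng.standard_normal((6, 50))   # e.g. a feature-based model

neural_rdm = rdm(neural_patterns)
model_rdm = rdm(model_patterns)

# Spearman correlation between the two condensed RDMs quantifies how well
# the model's similarity structure matches the neural one.
rho, p = spearmanr(neural_rdm, model_rdm)
```

In practice the model RDM would encode hypothesized feature distinctions (e.g. elongated vs. non-elongated objects, precision vs. whole-hand grasps) rather than random patterns.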
Meeting abstract presented at VSS 2017