Journal of Vision, September 2018, Volume 18, Issue 10 (Open Access)
Vision Sciences Society Annual Meeting Abstract
Decoding typical (but not atypical) actions with real tools from both dorsal and ventral visual stream regions
Author Affiliations
  • Ethan Knights
    School of Psychology, University of East Anglia, Norwich, UK
  • Fraser Smith
    School of Psychology, University of East Anglia, Norwich, UK
  • Courtney Mansfield
    School of Psychology, University of East Anglia, Norwich, UK
  • Diana Tonin
    School of Psychology, University of East Anglia, Norwich, UK
  • Holly Weaver
    School of Psychology, University of East Anglia, Norwich, UK
  • Jenna Green
    Department of Radiology, Norfolk and Norwich University Hospitals NHS Foundation Trust, Norwich, UK
  • Janak Saada
    Department of Radiology, Norfolk and Norwich University Hospitals NHS Foundation Trust, Norwich, UK
  • Stephanie Rossit
    School of Psychology, University of East Anglia, Norwich, UK
Journal of Vision September 2018, Vol.18, 180. doi:10.1167/18.10.180
      Ethan Knights, Fraser Smith, Courtney Mansfield, Diana Tonin, Holly Weaver, Jenna Green, Janak Saada, Stephanie Rossit; Decoding typical (but not atypical) actions with real tools from both dorsal and ventral visual stream regions. Journal of Vision 2018;18(10):180. doi: 10.1167/18.10.180.
      © ARVO (1962-2015); The Authors (2016-present)
Abstract

Tools are manipulable objects that, unlike other objects in the world (e.g., buildings), are tightly linked to highly predictable action procedures. Neuroimaging has revealed a left-lateralized network of dorsal and ventral visual stream regions for tool use and tool knowledge, but the exact role of these regions remains unclear. Moreover, studies involving actual hand actions with real tools are rare, as most research to date has used proxies for tool use such as visual stimuli (e.g., pictures) or action simulation (e.g., pantomime). Here we used functional magnetic resonance imaging (fMRI) and multi-voxel pattern analysis (MVPA) to investigate whether the human brain represents actual object-specific functional grasps with real 3D tools. Specifically, we tested whether patterns of brain activity differ depending on whether a grasp is consistent or inconsistent with how a tool is typically grasped for use (e.g., grasping a knife by the handle rather than by its serrated edge). In a block-design fMRI paradigm, 19 participants grasped the left or right sides of 3D-printed tools (kitchen utensils) and non-tool objects (bar-shaped objects) with the right hand in open loop. Importantly, and unknown to participants, varying the movement direction (right/left) meant that tool grasps were performed in either a typical (by the handle) or atypical (by the business end) manner. In addition, separate functional localizer runs were obtained for each participant to define regions of interest. MVPA showed that typical vs. atypical grasping could be decoded with significantly higher accuracy for tools than for non-tools in hand-selective regions of the lateral occipitotemporal cortex and intraparietal sulcus. None of the body-selective, tool-selective or object-selective areas discriminated typical vs. atypical grasps better for tools than for non-tools.
These results indicate that dorsal and ventral hand-selective regions contain representations of how to appropriately interact with tools, and that these representations are evoked even when they are irrelevant to task performance.

Meeting abstract presented at VSS 2018
