Volume 18, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2018
Decoding the electrophysiological dynamics of visual-to-motor transformations during grasp planning and execution
Author Affiliations
  • Lin Guo
    University of Toronto Scarborough
  • Adrian Nestor
    University of Toronto Scarborough
  • Dan Nemrodov
    University of Toronto Scarborough
  • Matthias Niemeier
    University of Toronto Scarborough; The Centre for Vision Research, York University
Journal of Vision September 2018, Vol.18, 71. doi:https://doi.org/10.1167/18.10.71

      Lin Guo, Adrian Nestor, Dan Nemrodov, Matthias Niemeier; Decoding the electrophysiological dynamics of visual-to-motor transformations during grasp planning and execution. Journal of Vision 2018;18(10):71. https://doi.org/10.1167/18.10.71.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

The time course of visuomotor transformations underlying human grasp actions remains largely unclear. For instance, the informativeness of electroencephalography (EEG) data is limited both by the anatomy of visuomotor cortex and by the traditional emphasis on univariate effects in EEG investigations. Here, we applied classification techniques to spatiotemporal EEG patterns to characterize the electrophysiological dynamics of visuomotor processes during grasp planning and execution. To this end, we recorded from 64 channels while participants used their right dominant hand to grasp 3D objects that varied in shape (two kinds) and texture (two kinds), using two different grasp orientations. Each trial encompassed three relevant events: a 200 ms Preview of the object followed by a variable delay in darkness, a Go period during which the object re-appeared to indicate that participants should move to grasp it, and a Movement onset period. After aligning event-related potentials (ERPs) with each event, we attempted to classify visual object features (i.e., different object shapes or different textures) based on all channels across ~10 ms temporal windows. Our results show that texture classification was poor throughout, whereas shape classification was robust, peaking at ~100 ms after Preview onset and slowly declining during the period of darkness. The Go period showed a similar shape-classification curve, but around Movement onset the curve remained close to chance level. Interestingly, classification of shape combined with grasp orientation, while roughly similar during the Preview and Go periods, ramped up before Movement onset, and a similar curve was found for texture-by-grasp classification. Finally, grasp orientation regardless of object features was successfully decoded during the Preview, yet it ramped up to higher levels during the Go period and prior to Movement onset. These results reveal the progression from visual to visuomotor and motor representations over the course of planning and executing grasp movements, as reflected in the dynamics of electrophysiological signals.

Meeting abstract presented at VSS 2018
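
For readers unfamiliar with this kind of analysis, the snippet below is a minimal, hypothetical sketch of time-resolved EEG decoding of the sort the abstract describes: a separate linear classifier is trained at each time point of the epoch to distinguish two object shapes from the pattern across all 64 channels, yielding a classification-performance time course. It is not the authors' analysis code; it assumes an MNE-Python/scikit-learn workflow, uses simulated data in place of the recorded epochs, and classifies single time points rather than the ~10 ms windows used in the study.

    # Hypothetical sketch (not the authors' code): time-resolved decoding of
    # object shape from 64-channel EEG patterns, using simulated stand-in data.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import LogisticRegression
    from mne.decoding import SlidingEstimator, cross_val_multiscore

    rng = np.random.default_rng(0)
    n_trials, n_channels, n_times = 200, 64, 120          # e.g. samples spanning the epoch
    X = rng.standard_normal((n_trials, n_channels, n_times))  # trials x channels x time
    y = rng.integers(0, 2, n_trials)                            # shape A vs. shape B labels

    # One classifier is fit independently at every time point, producing a
    # decoding time course analogous to the classification curves reported.
    clf = make_pipeline(StandardScaler(), LogisticRegression(solver="liblinear"))
    time_decoder = SlidingEstimator(clf, scoring="roc_auc", n_jobs=1)
    scores = cross_val_multiscore(time_decoder, X, y, cv=5)    # shape: (folds, time points)

    print("mean AUC per time point:", scores.mean(axis=0).round(2))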
