Vision Sciences Society Annual Meeting Abstract  |   August 2017
Biophysically plausible neural model for the interaction between visual and motor representations of action
Author Affiliations
  • Mohammad Hovaidi Ardestani
    Section for Computational Sensomotorics, Department of Cognitive Neurology, Hertie Institute for Clinical Brain Research, and Centre for Integrative Neuroscience, University Clinic Tübingen, D-72076 Tübingen, Germany
    IMPRS for Cognitive and Systems Neuroscience, University of Tübingen, Germany
  • Martin Giese
    Section for Computational Sensomotorics, Department of Cognitive Neurology, Hertie Institute for Clinical Brain Research, and Centre for Integrative Neuroscience, University Clinic Tübingen, D-72076 Tübingen, Germany
Journal of Vision August 2017, Vol.17, 1167. doi:https://doi.org/10.1167/17.10.1167
Abstract

INTRODUCTION: Action perception and action execution are intrinsically linked in the human brain. Experiments show that concurrent motor execution influences the visual perception of actions, an interaction mediated by action-selective neurons in premotor and parietal cortex. We have developed a model based on biophysically realistic spiking neurons that accounts for such interactions.

METHODS: Our model represents different motor actions by mutually coupled neural fields. One field represents the perceived action (vision field), the other the associated motor program (motor field). Both consist of coupled ensembles of Exponential Integrate-and-Fire neurons (Brette et al., 2005) and stabilize travelling local solutions (activity peaks) that either follow the stimulus pattern in the vision field or propagate autonomously after a 'go-signal' in the motor field. The two fields are coupled by interaction kernels that stabilize solutions with synchronously propagating pulses in both fields, and representations of different actions inhibit each other. We used the model to reproduce the results of several experiments on action-perception coupling and mirror neurons.

RESULTS: Consistent with experimental data, the architecture provides a unifying account of the spatial and temporal tuning of action-perception coupling (Christensen et al., 2011) and of the influence of action perception on the variability of execution (Kilner et al., 2003). The model reproduces the behavior of neural population vector trajectories of mirror neurons in premotor cortex (Caggiano et al., 2016). Duplicating the model architecture makes it possible to reproduce the spontaneous synchronization of two observers who see each other executing periodic body movements (Schmidt et al., 1990).

CONCLUSION: The proposed model reproduces, with a single parameter set, a variety of quite different experiments that address the interaction between action vision and action execution. Since the model uses physiologically plausible circuits, it also makes a variety of predictions at the single-cell level.
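The core ingredients described in the METHODS section (two ring-shaped neural fields of exponential integrate-and-fire neurons, asymmetric lateral kernels that support travelling activity peaks, and excitatory cross-field coupling) can be illustrated with a small simulation. The sketch below is not the authors' implementation; the network size, kernel shapes, stimulus, and all parameter values are illustrative assumptions and would need tuning to reproduce the dynamics reported in the abstract.

```python
# Minimal sketch of two coupled neural fields of exponential integrate-and-fire
# (EIF) neurons: a "vision" field driven by a moving stimulus and a "motor"
# field coupled to it. All parameter values are illustrative assumptions.
import numpy as np

# --- network layout --------------------------------------------------------
N = 100                                               # neurons per field, on a ring
phi = np.linspace(0, 2 * np.pi, N, endpoint=False)    # "action phase" of each neuron

def ring_kernel(shift, width, amp):
    """Gaussian connection kernel on the ring; a forward shift makes the
    kernel asymmetric, which supports travelling activity peaks."""
    d = np.angle(np.exp(1j * (phi[:, None] - phi[None, :] - shift)))  # wrapped distance
    return amp * np.exp(-d ** 2 / (2 * width ** 2))

W_rec   = ring_kernel(0.3, 0.4, 60.0) - 15.0   # within-field: local excitation, global inhibition
W_cross = ring_kernel(0.0, 0.4, 20.0)          # between fields: phase-aligned excitation

# --- EIF neuron and synapse parameters (illustrative values) ---------------
C, gL, EL   = 200e-12, 10e-9, -70e-3   # capacitance (F), leak conductance (S), leak reversal (V)
VT, DT      = -50e-3, 2e-3             # spike threshold (V) and slope factor (V)
Vreset, Vpk = -60e-3, 0e-3             # reset potential and numerical spike cutoff (V)
tau_syn     = 10e-3                    # synaptic trace time constant (s)
w_scale     = 1e-12                    # converts dimensionless weights to amperes
dt, T       = 1e-4, 1.0                # Euler time step and simulated time (s)

def eif_step(v, I):
    """One Euler step of the EIF membrane equation."""
    dv = (-gL * (v - EL) + gL * DT * np.exp((v - VT) / DT) + I) * dt / C
    return v + dv

# --- simulation: a stimulus bump travelling around the vision field --------
v_vis = np.full(N, EL); v_mot = np.full(N, EL)   # membrane potentials
s_vis = np.zeros(N);    s_mot = np.zeros(N)      # synaptic activation traces
spikes_vis = np.zeros(N)                         # spike counts in the vision field

for step in range(int(T / dt)):
    t = step * dt
    # Gaussian input bump that moves around the vision ring (one cycle per second)
    d_stim = np.angle(np.exp(1j * (phi - 2 * np.pi * t)))
    I_stim = 400e-12 * np.exp(-d_stim ** 2 / 0.1)

    I_vis = w_scale * (W_rec @ s_vis + W_cross @ s_mot) + I_stim
    I_mot = w_scale * (W_rec @ s_mot + W_cross @ s_vis)
    v_vis, v_mot = eif_step(v_vis, I_vis), eif_step(v_mot, I_mot)

    spk_vis, spk_mot = v_vis > Vpk, v_mot > Vpk   # detect spikes, then reset
    v_vis[spk_vis] = Vreset
    v_mot[spk_mot] = Vreset
    spikes_vis += spk_vis
    s_vis += -s_vis * dt / tau_syn + spk_vis      # exponential synaptic traces
    s_mot += -s_mot * dt / tau_syn + spk_mot

print("vision-field spike counts (the active region should follow the stimulus):")
print(spikes_vis.astype(int))
```

The forward-shifted recurrent kernel is one standard way to obtain a travelling activity peak in a neural field, while the symmetric cross-field kernel pulls the peaks in the two fields toward a common phase, which is the mechanism the abstract invokes for visuo-motor synchronization.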

Meeting abstract presented at VSS 2017
