August 2010
Volume 10, Issue 7
Vision Sciences Society Annual Meeting Abstract  |   August 2010
A dynamical neural model of motion integration
Author Affiliations
  • Émilien Tlapale
    NeuroMathComp, INRIA Sophia Antipolis, France
  • Guillaume S. Masson
    DyVA, INCM, UMR 6193, CNRS & Université de la Méditerranée, France
  • Pierre Kornprobst
    NeuroMathComp, INRIA Sophia Antipolis, France
Journal of Vision August 2010, Vol.10, 843. doi:https://doi.org/10.1167/10.7.843
Abstract

We propose a dynamical model of 2D motion integration in which the diffusion of motion information is modulated by luminance information. The model incorporates feedforward, feedback, and inhibitory lateral connections, and is inspired by the neural architecture and dynamics of motion-processing cortical areas in the primate (V1, V2, and MT). The first aspect of our contribution is a new anisotropic integration model in which motion diffusion, carried by recurrent connectivity between cortical areas working at different spatial scales, is gated by the luminance distribution in the image. This simple model offers a competitive alternative to less parsimonious models based on a large set of cortical layers implementing specific form or motion feature detectors. A second aspect, often ignored by 2D motion integration models, is that the biological computation of global motion is highly dynamical: when presented with simple line, plaid, or barber-pole stimuli, the perceived direction reported by human observers, as well as the response of motion-sensitive neurons, shifts over time. We demonstrate that the proposed approach produces results compatible with several psychophysical experiments, concerning not only the resulting global motion percept but also the oculomotor dynamics. Our model can also explain several properties of MT neurons regarding the dynamics of selective motion integration, a fundamental property of object motion disambiguation and segmentation. As a whole, we present an improved motion integration model that is numerically tractable and reproduces key aspects of cortical motion integration in primates.
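The core mechanism described above, diffusion of motion information that is blocked at luminance boundaries, can be sketched as follows. This is an illustrative reimplementation in NumPy, not the authors' code: the Gaussian gating function, the explicit Euler scheme, and all parameter values (`sigma`, `dt`, `steps`) are assumptions chosen for clarity, not taken from the paper.

```python
import numpy as np

def luminance_gate(I, sigma=0.1):
    """Per-pixel gates in [0, 1] computed from luminance gradients.
    The gate is close to 0 across strong luminance edges, so diffusion
    stops at region boundaries (the gating idea described in the abstract)."""
    gy, gx = np.gradient(I)
    gate_x = np.exp(-(gx / sigma) ** 2)  # small across vertical luminance edges
    gate_y = np.exp(-(gy / sigma) ** 2)  # small across horizontal luminance edges
    return gate_x, gate_y

def diffuse_motion(v, I, steps=100, dt=0.2, sigma=0.1):
    """Anisotropically diffuse a motion field v (H x W x 2) over an image I.
    Motion information spreads within regions of similar luminance but is
    gated (blocked) where the luminance gradient is large."""
    gate_x, gate_y = luminance_gate(I, sigma)
    v = v.copy()
    for _ in range(steps):
        for c in range(2):  # diffuse each motion component independently
            vc = v[..., c]
            # gated nearest-neighbour differences, zero-flux at the borders
            dxp = np.zeros_like(vc); dxp[:, :-1] = vc[:, 1:] - vc[:, :-1]
            dxm = np.zeros_like(vc); dxm[:, 1:] = vc[:, :-1] - vc[:, 1:]
            dyp = np.zeros_like(vc); dyp[:-1, :] = vc[1:, :] - vc[:-1, :]
            dym = np.zeros_like(vc); dym[1:, :] = vc[:-1, :] - vc[1:, :]
            vc += dt * (gate_x * (dxp + dxm) + gate_y * (dyp + dym))
    return v
```

On a uniform-luminance image the motion field simply smooths out, whereas a sharp luminance edge keeps the two sides decoupled, which is the intuition behind using image structure rather than dedicated form-feature detectors to constrain motion integration. The full model additionally couples areas at different spatial scales through feedforward and feedback connections, which this single-scale sketch omits.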

Tlapale, É., Masson, G. S., & Kornprobst, P. (2010). A dynamical neural model of motion integration [Abstract]. Journal of Vision, 10(7):843, 843a, http://www.journalofvision.org/content/10/7/843, doi:10.1167/10.7.843.
Footnotes
 This research has received funding from the European Community's Seventh Framework Programme under grant agreement N°215866, project SEARISE, and from the Région Provence-Alpes-Côte d'Azur. GSM was supported by the CNRS, the European Community (FACETS, IST-FET, VIth Framework, N°025213), and the Agence Nationale de la Recherche (ANR, NATSTATS).