Vision Sciences Society Annual Meeting Abstract  |  July 2013
Journal of Vision, Volume 13, Issue 9
Towards a biologically-inspired vision system for the control of locomotion in complex environments
Author Affiliations
  • Stephane Bonneaud
    Brown University, USA
  • William H. Warren
    Brown University, USA
  • Kerwin Olfers
    Leiden University, Netherlands
  • Gerrit Irwin
    Brown University, USA
  • Thomas Serre
    Brown University, USA
Journal of Vision July 2013, Vol.13, 753. doi:https://doi.org/10.1167/13.9.753
Abstract

We investigate the perceptual principles responsible for the visual control of human locomotor behaviors by modeling the neural mechanisms of human vision. We have built an empirically grounded perception and action system by coupling a biologically plausible computational model of motion processing in the dorsal stream of the visual cortex (Jhuang et al., ICCV, 2007) to a cognitively valid locomotor control model (Warren & Fajen, In Fuchs & Jirsa, Coordination, 2008). Here we report initial results of the integration of the two models and show that the coupled system accounts for human data on steering to a goal while avoiding obstacles (Warren & Fajen, In Fuchs & Jirsa, Coordination, 2008). We further report on a recent extension of the vision model to include the detection of motion boundaries, binocular disparity, and shape and color information in addition to motion energy. We simulate an agent moving in complex virtual environments using a perception-action loop that comprises the vision model, the locomotor control model, and the environment. Visual cues are successfully used as input to the locomotion model to modulate the activation of control laws, leading the agent to avoid obstacles while steering to its goal. As the agent travels through its environment, the virtual optic flow from the agent's perspective changes, which in turn modifies the detected visual cues. The agent is thus coupled to its environment through information and action, and locomotor trajectories are emergent (Warren, Psychological Review, 2006). We use different degrees of realism and complexity in the virtual scenes, from a few to many objects and from simple to complex textures, to demonstrate the validity of the approach.
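The control side of such a perception-action loop can be sketched in a few lines. The following is a minimal, illustrative Python simulation in the spirit of the Warren-Fajen steering dynamics: heading is attracted toward the goal and repelled by obstacles, with strengths that decay with distance and angle. It is not the authors' implementation; the parameter values are illustrative, and the "visual cues" (goal and obstacle bearings and distances) are computed geometrically here rather than by the neural vision model described in the abstract.

```python
import math

def wrap(angle):
    """Wrap an angle to [-pi, pi)."""
    return (angle + math.pi) % (2.0 * math.pi) - math.pi

def steering_step(x, y, heading, dheading, goal, obstacles, dt=0.05,
                  b=3.25, k_g=7.5, c1=0.4, c2=0.4,
                  k_o=198.0, c3=6.5, c4=0.8, speed=1.0):
    """One Euler step of a simplified goal/obstacle steering dynamics.

    The agent moves at constant speed; only its heading has dynamics.
    All parameter values are illustrative defaults, not fitted ones.
    """
    gx, gy = goal
    psi_g = math.atan2(gy - y, gx - x)      # bearing of the goal
    d_g = math.hypot(gx - x, gy - y)        # distance to the goal
    err_g = wrap(heading - psi_g)
    # Goal attracts heading; attraction grows as distance shrinks.
    acc = -b * dheading - k_g * err_g * (math.exp(-c1 * d_g) + c2)
    for ox, oy in obstacles:
        psi_o = math.atan2(oy - y, ox - x)
        d_o = math.hypot(ox - x, oy - y)
        err_o = wrap(heading - psi_o)
        # Obstacles repel heading, decaying with angle and distance.
        acc += k_o * err_o * math.exp(-c3 * abs(err_o)) * math.exp(-c4 * d_o)
    dheading += acc * dt
    heading += dheading * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading, dheading

# Usage: agent at the origin, goal straight ahead, one obstacle near the path.
state = (0.0, 0.0, 0.0, 0.0)            # x, y, heading, heading rate
goal, obstacles = (10.0, 0.0), [(5.0, 0.2)]
for _ in range(400):
    state = steering_step(*state, goal, obstacles)
```

In the full system of the abstract, the geometric bearing and distance terms would be replaced by cues detected by the vision model from the agent's optic flow, so that the trajectory emerges from the closed information-action loop rather than from privileged access to scene geometry.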

Meeting abstract presented at VSS 2013
