September 2015
Volume 15, Issue 12
Vision Sciences Society Annual Meeting Abstract
The neural basis of intuitive physical reasoning
Author Affiliations
  • Jason Fischer
    Department of Brain and Cognitive Sciences and McGovern Institute for Brain Research, Massachusetts Institute of Technology
  • Nancy Kanwisher
    Department of Brain and Cognitive Sciences and McGovern Institute for Brain Research, Massachusetts Institute of Technology
Journal of Vision September 2015, Vol.15, 518. doi:
Jason Fischer, Nancy Kanwisher; The neural basis of intuitive physical reasoning. Journal of Vision 2015;15(12):518.

© ARVO (1962-2015); The Authors (2016-present)

Visual scene understanding entails not just determining which people and objects are present in which locations, but also grasping the causal structure of a scene. We “see” that a table supports an object, that a stack of dishes is unstable and may fall, that a squash ball is on a trajectory to ricochet off the wall and head in our direction. Physical reasoning is ubiquitous in daily life. Yet despite the rich literature on the development of physical intuitions during childhood, and evidence that adults can predict in detail how physical events will unfold, little is known about the neural mechanisms that allow us to perceive physical events in a scene and predict what will happen next. Here we sought to identify the brain regions recruited by seeing and reasoning about physical events, and to test how reliably such regions are engaged across a variety of stimuli and tasks that vary in physical content. In a series of fMRI experiments, we uncovered a network of brain regions in parietal and premotor cortices that is engaged both by observing physical events and predicting their future outcomes. Responses in this network are modulated by task (performing a physical prediction as opposed to a non-physical visual judgment on identical stimuli), but are also reliably elicited by passively viewing physical events unfold, such as objects rolling, falling, or colliding. Control experiments demonstrated that the pattern of responses in this network cannot be explained by task difficulty, non-physical spatial processing, or the recruitment of domain-general prediction mechanisms. Collectively, our experiments reveal a set of brain regions that may support our ability to perceive and understand physical events from visual input, and lay the groundwork for discovering the neural processes by which we predict how objects in the real world will behave.

Meeting abstract presented at VSS 2015

