September 2016, Volume 16, Issue 12
Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2016
Neural coding of navigational affordances in visual scenes
Author Affiliations
  • Michael Bonner
    Department of Psychology, University of Pennsylvania
  • Jack Ryan
    Department of Psychology, University of Pennsylvania
  • Russell Epstein
    Department of Psychology, University of Pennsylvania
Journal of Vision September 2016, Vol.16, 569. doi:https://doi.org/10.1167/16.12.569
© ARVO (1962-2015); The Authors (2016-present)
Abstract

An essential component of visually guided navigation is the ability to perceive features of the environment that afford or constrain movement. For example, in indoor environments, walls limit one's potential routes, while passageways facilitate movement. Here we examine the cortical mechanisms that encode such navigational features. In two fMRI experiments we test the hypothesis that scene-selective cortical regions represent the navigational affordances of local space. In the first study, subjects viewed images of artificially rendered rooms that had identical geometry as defined by their walls, but varied in the number and position of open passageways leading out of them. The layout of these passageways defined the principal navigational affordances in each scene. We used multivoxel pattern analysis to identify representations of navigational layout that were invariant to other visual features, including surface textures and visual clutter. This analysis revealed that the occipital place area (OPA), a scene-selective region near the transverse occipital sulcus, contained fine-grained representations of navigational layout that could be decoded across variations in visual appearance even though the local geometry was the same for all scenes. Given our tightly controlled, artificial stimuli, an important question was whether these conclusions would generalize to more complex, naturalistic scenes. We addressed this in the second study by using images of real-world indoor environments. To identify navigational affordances, we asked independent raters to indicate the paths they would take to walk through each scene. We then used representational similarity analysis of fMRI data obtained during an orthogonal category-judgment task to identify brain regions that encode these navigational affordances.
Once again we found representations of navigational layout in the OPA, demonstrating that this effect generalizes to naturalistic scenes with heterogeneous visual and semantic properties. These findings indicate that the OPA supports a critical aspect of scene perception—the representation of navigational space.
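The representational similarity analysis described above can be sketched in code. The snippet below is a minimal illustration, not the authors' actual pipeline: the binary affordance vectors, the simulated multivoxel patterns, and the choice of dissimilarity measures are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stimuli: 8 scenes, each described by which of 3 possible
# passageways (left, center, right) is open -- a binary affordance vector.
affordances = np.array([
    [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 0],
    [1, 0, 1], [0, 1, 1], [1, 1, 1], [0, 0, 0],
])
n = len(affordances)

# Model RDM: pairwise dissimilarity of the affordance vectors
# (here, Hamming distance: number of passageways that differ).
model_rdm = np.array([[np.sum(a != b) for b in affordances]
                      for a in affordances])

# Simulated "neural" patterns for an ROI (e.g., the OPA): the affordance
# signal embedded in noisy 50-voxel responses.
weights = rng.standard_normal((3, 50))
patterns = affordances @ weights + 0.5 * rng.standard_normal((n, 50))

# Neural RDM: correlation distance between scene response patterns.
neural_rdm = 1.0 - np.corrcoef(patterns)

# Compare the two RDMs on their lower triangles. Pearson correlation is
# used here for simplicity; rank correlation is also common in RSA.
tri = np.tril_indices(n, k=-1)
rho = np.corrcoef(model_rdm[tri], neural_rdm[tri])[0, 1]
print(f"model-neural RDM correlation: rho = {rho:.2f}")
```

In practice this correlation would be computed per subject within each candidate region (or searchlight) and tested against zero at the group level; a reliably positive correlation is the signature of affordance coding that the second study looks for.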

Meeting abstract presented at VSS 2016
