Michael Bonner, Jack Ryan, Russell Epstein; Neural coding of navigational affordances in visual scenes. Journal of Vision 2016;16(12):569. doi: https://doi.org/10.1167/16.12.569.
An essential component of visually guided navigation is the ability to perceive features of the environment that afford or constrain movement. For example, in indoor environments, walls limit one's potential routes, while passageways facilitate movement. Here we examine the cortical mechanisms that encode such navigational features. In two fMRI experiments we test the hypothesis that scene-selective cortex represents the navigational affordances of local space. In the first study, subjects viewed images of artificially rendered rooms that had identical geometry, as defined by their walls, but varied in the number and position of open passageways leading out of them. The layout of these passageways defined the principal navigational affordances in each scene. We used multivoxel pattern analysis to identify representations of navigational layout that were invariant to other visual features, including surface textures and visual clutter. This analysis revealed that the occipital place area (OPA), a scene-selective region near the transverse occipital sulcus, contained fine-grained representations of navigational layout that could be decoded across variations in visual appearance, even though the local wall geometry was the same for all scenes. Given our tightly controlled, artificial stimuli, an important question was whether these conclusions would generalize to more complex, naturalistic scenes. We addressed this in the second study by using images of real-world indoor environments. To identify navigational affordances, we asked independent raters to indicate the paths they would take to walk through each scene. We then used representational similarity analysis of fMRI data obtained during an orthogonal category-judgment task to identify brain regions that encode these navigational affordances.
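The cross-decoding logic of the first experiment can be illustrated with a minimal sketch: a classifier trained to discriminate passageway layouts from voxel patterns under one surface texture is tested on patterns from a different texture, so above-chance accuracy implies a layout code invariant to appearance. All data below are synthetic, and the classifier choice (logistic regression) is an assumption for illustration, not the authors' exact pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "voxel patterns": 3 passageway layouts x 2 texture sets,
# 20 trials each, 50 voxels. The layout signal is shared across textures;
# each texture set adds its own appearance-related offset.
n_layouts, n_trials, n_vox = 3, 20, 50
layout_means = rng.normal(0, 1, (n_layouts, n_vox))

def simulate(texture_offset):
    X, y = [], []
    for lay in range(n_layouts):
        X.append(layout_means[lay] + texture_offset
                 + rng.normal(0, 1.0, (n_trials, n_vox)))
        y += [lay] * n_trials
    return np.vstack(X), np.array(y)

X_a, y_a = simulate(rng.normal(0, 0.5, n_vox))  # texture set A
X_b, y_b = simulate(rng.normal(0, 0.5, n_vox))  # texture set B

# Train on texture set A, test on texture set B: accuracy above
# chance (1/3) indicates a texture-invariant layout representation.
clf = LogisticRegression(max_iter=1000).fit(X_a, y_a)
acc = clf.score(X_b, y_b)
print(f"cross-texture decoding accuracy: {acc:.2f} (chance = {1/n_layouts:.2f})")
```

In a real analysis the two halves would be fMRI response patterns from a region of interest such as the OPA, and accuracy would be assessed against a permutation-based chance distribution rather than the nominal 1/3.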
Once again we found representations of navigational layout in the OPA, demonstrating that this effect generalizes to naturalistic scenes with heterogeneous visual and semantic properties. These findings indicate that the OPA supports a critical aspect of scene perception: the representation of navigable space.
Meeting abstract presented at VSS 2016