Abstract
As we navigate through the world, the visual scene in front of us is linked seamlessly to the broader environment. Where in the brain is this contextual information represented? Here, we tested where in the brain activity during perception of a scene view is modulated by the degree of spatial context associated with that view in memory. Participants (N=17) studied 20 real-world scenes in head-mounted VR under three study conditions with varying amounts of spatial context: Image (45° of a panorama), Panorama (270° of a panorama), and Street (a navigable environment of three contiguous panoramas). Using fMRI, we then compared neural responses when participants perceived (Experiment 1) or recalled (Experiment 2) discrete fields of view from each place. We tested which brain regions are modulated by the degree of spatial context associated with a visual scene, focusing on the Scene Perception Areas (SPAs: PPA, OPA, and MPA) and the Place Memory Areas (PMAs; Steel et al., 2021). As predicted, all SPAs were robustly activated during scene perception (all p<0.001), but their activity was not modulated by the degree of spatial context associated with a scene in memory (all p>0.4). In contrast, the PMAs were significantly modulated by spatial context: scenes with greater spatial context evoked greater PMA activity (all p<0.001). The same pattern of results was present during recall (Experiment 2). Intriguingly, spatial context did not modulate hippocampal activity during recall (all p>0.6), and, importantly, activity in control regions (V1 and FFA) was not affected by spatial context in either experiment (all p>0.05). Together, these results show that the PMAs are uniquely sensitive to the amount of spatial context associated with a real-world scene, suggesting that they may provide spatial context to the SPAs to facilitate visually guided behavior.