Abstract
The visual system has distinct networks supporting scene processing and object processing. However, in typical experience we perceive a continuum of views between these extremes. How do visual brain responses change as a function of the depicted scale of the space in view? To investigate this question, we created 20 indoor environments (e.g. kitchen, library) using virtual-reality software. All environments had the same physical dimensions, with an extended surface along the back wall (e.g. countertop, desk) containing both central and surrounding objects. For each environment, a series of 15 snapshots was rendered to smoothly sample between a close-up view of the central object on this surface and a far-scale view of the full environment, using logarithmically spaced steps. Brain responses to each position along the continuum were measured using functional magnetic resonance imaging in 12 participants. Within independently localized scene and object regions, we found evidence for parametric responses: activation changed smoothly as a function of the spatial scale of the depicted view. In a whole-brain analysis, we found mixed evidence for a smoothly mapped continuum across the cortex: some participants showed intermediate regions with preferences for intermediate spatial scales, while others did not. Interestingly, this contrasted with our localizer runs, in which photographs of intermediate-scale spaces consistently activated some regions more than both object and full-scale scene views; however, these regions were activated equally by all positions along the object-scene continuum in the rendered images. The most consistent finding is that all participants showed extensive cortical territory along the ventral visual stream with either a monotonic increase or a monotonic decrease in activation as the depicted scale of space was parametrically varied from objects to scenes.
More work is needed to understand the individual differences in the mapping of this continuum onto the cortex.