Abstract
Our natural environment is composed of a rich tapestry of sights and sounds. How is this multimodal information from a real-world setting processed in the brain to form a representation of a particular environment? A number of studies have shown that, in the visual domain, several properties related to scene categories are identified and processed in scene-selective areas of visual cortex. Moving beyond purely sensory representations of scene categories, here we look for a more abstract, conceptual representation of scenes that transcends sensory modalities. To this end, we had participants look at scene images or listen to scene sounds while their neural activity was recorded in an MRI scanner. Using multi-voxel pattern analysis, we were able to decode scene categories not only in sensory cortex (image categories in visual cortex and sound categories in auditory cortex) but also in prefrontal cortex, which is known to engage in high-level cognitive functions. Furthermore, the scene representation in the middle and inferior frontal gyri generalized across sensory modalities, as shown by successful cross-decoding of scene categories between images and sounds. Finally, we compared the error patterns of the neural decoder to those of human categorization from a separate behavioral experiment. We found significant agreement between behavioral errors and the errors of the neural decoder in the inferior and middle frontal gyri, which shows that the information humans use for categorical judgments is represented in these regions. These results indicate that there exists a conceptual level of scene representation in prefrontal cortex, which reflects human behavior and does not rely on any one sensory modality. To our knowledge, this is the first time that such a cross-modal conceptual representation of real-world scenes has been measured explicitly.
Meeting abstract presented at VSS 2016
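To illustrate the two key analyses mentioned above (cross-modal decoding and the comparison of decoder errors with behavioral errors), the sketch below shows how such an analysis could be set up with scikit-learn. It is a minimal illustration, not the authors' actual pipeline: the synthetic data, the number of categories and voxels, the linear support vector classifier, and the behavioral confusion matrix are all assumptions made for the example.

```python
# Minimal sketch (not the authors' pipeline): cross-modal MVPA decoding and
# comparison of decoder vs. behavioral confusion matrices.
# All data here are synthetic; array shapes, category labels, and the linear
# SVM are illustrative assumptions.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import confusion_matrix
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_categories = 4          # hypothetical scene categories
n_trials = 40             # trials per modality
n_voxels = 200            # voxels in a region of interest

# Synthetic voxel patterns: image-evoked (training) and sound-evoked (test).
labels_img = rng.integers(0, n_categories, n_trials)
labels_snd = rng.integers(0, n_categories, n_trials)
X_img = rng.normal(size=(n_trials, n_voxels)) + labels_img[:, None] * 0.3
X_snd = rng.normal(size=(n_trials, n_voxels)) + labels_snd[:, None] * 0.3

# Cross-modal decoding: train on image trials, test on sound trials.
clf = LinearSVC(C=1.0, max_iter=10000)
clf.fit(X_img, labels_img)
pred_snd = clf.predict(X_snd)
print("cross-modal accuracy:", np.mean(pred_snd == labels_snd))

# Decoder confusion matrix (rows: true category, columns: predicted).
cm_decoder = confusion_matrix(labels_snd, pred_snd, normalize="true")

# Hypothetical behavioral confusion matrix from a separate experiment.
cm_behavior = np.full((n_categories, n_categories), 0.1)
np.fill_diagonal(cm_behavior, 0.7)

# Compare error patterns: correlate off-diagonal entries of the two matrices.
off_diag = ~np.eye(n_categories, dtype=bool)
rho, p = spearmanr(cm_decoder[off_diag], cm_behavior[off_diag])
print(f"error-pattern agreement: rho={rho:.2f}, p={p:.3f}")
```

Training in one modality and testing in the other is what makes the decoding "cross-modal": above-chance generalization implies that the classifier relies on category information shared between image- and sound-evoked patterns. Restricting the confusion-matrix comparison to off-diagonal cells focuses the test on the pattern of errors rather than overall accuracy.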