Abstract
We investigated whether contextual auditory information is contained in the neural activity patterns of visual cortex in a category-specific manner. While blindfolded, subjects were presented with three types of natural sounds: a forest scene (animate/non-human), a talking crowd (animate/human), and traffic noise (inanimate). We used multivariate pattern analysis (linear support vector machines) to classify the three sounds from BOLD activity patterns in early visual cortex, as identified with retinotopic mapping. Preliminary results show above-chance classification in visual areas V2 and V3. This suggests that contextual information from the auditory modality shapes neural activity patterns in early visual cortex in a category-specific manner, even in the absence of visual stimulation.
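The decoding approach described above (linear SVM classification of sound category from voxel activity patterns, evaluated with cross-validation) can be sketched as follows. All details here are illustrative assumptions, not the study's actual pipeline: trial counts, voxel counts, and the scikit-learn setup are placeholders for the real BOLD pattern estimates from retinotopically defined V2/V3.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical dimensions: 20 trials per sound category, 200 voxels.
n_per_class, n_voxels = 20, 200
labels = np.repeat([0, 1, 2], n_per_class)  # forest, crowd, traffic

# Simulate category-specific voxel patterns: independent noise plus a
# class-dependent mean shift (a stand-in for real BOLD estimates).
means = rng.normal(0, 1, size=(3, n_voxels))
X = rng.normal(0, 1, size=(labels.size, n_voxels)) + 0.8 * means[labels]

# Linear SVM with 5-fold cross-validation; chance level is 1/3.
clf = LinearSVC(C=1.0, max_iter=10000)
scores = cross_val_score(clf, X, labels, cv=5)
print(f"mean accuracy: {scores.mean():.2f} (chance = 0.33)")
```

In a real analysis the cross-validation folds would typically respect scanning runs (leave-one-run-out) rather than arbitrary splits, so that trials from the same run never appear in both training and test sets.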