Abstract
Early visual cortex receives a large number of feedback connections from several other cortical areas. Auditory cortex also sends feedback to early visual cortex, but the content of this flow of information has not yet been fully explored. Vetter, Smith & Muckli (2014, Current Biology) found that feedback from auditory cortex is sufficient to produce distinguishable neural activity in early visual cortex when participants listen to different natural sounds in the absence of visual stimulation. The current study focused on the information content of this auditory feedback to visual cortex. We presented a sample of 36 natural sounds to 20 blindfolded participants while acquiring functional MRI data. The sounds were selected according to a hierarchy of semantic categories: animate sounds, divided into human and animal sounds and further into specific species or types of sound, and likewise for inanimate sounds. The boundaries of V1, V2 and V3 were drawn using individual retinotopic mapping. We analysed the fMRI activity patterns evoked by these sounds in each early visual region using Multivoxel Pattern Analysis (MVPA). The classifier could distinguish sounds belonging to different semantic categories on the basis of activity patterns in V1, V2 and V3. In particular, animate sounds were generally better distinguished than inanimate sounds. Thus, auditory feedback to early visual cortex seems to follow some categorical distinctions but not others. Our results show that auditory feedback relays certain categorical information to areas once believed to be exclusively specialised for vision. We hypothesise that early visual cortex may use this categorical information from sounds to predict visual input, enhance visual perception, or resolve visual ambiguities.