Abstract
Early visual cortical neurons receive non-feedforward input from lateral and top-down connections (Muckli & Petro, 2013). Auditory input to early visual cortex has been shown to contain contextual information about complex natural sounds (Vetter, Smith, & Muckli, 2014). To date, contextual auditory information in early visual cortex has been examined only in the absence of visual input (i.e., subjects were blindfolded); the representation of contextual auditory information in visual cortex during concurrent visual stimulation therefore remains unknown. Using functional brain imaging and multivoxel pattern analysis, we investigated whether auditory information can be discriminated in early visual areas during an eyes-open fixation paradigm in which subjects were independently stimulated with complex auditory and visual scenes. We examined similarities between auditory and visual stimuli in eccentricity-mapped V1, V2, and V3 by comparing contextually matched top-down auditory input with feedforward visual input. Lastly, we compared top-down auditory input to V1, V2, and V3 with top-down visual input by presenting visual scene stimuli with the lower-right quadrant occluded. We find that contextual auditory information can be discriminated in the periphery of early visual areas, in line with previous research (Vetter, Smith, & Muckli, 2014). We also report contextual similarity between sound-evoked activity and visual feedback in occluded visual areas. We suggest that top-down expectations are shared between modalities and contain abstract contextual information. Such cross-modal information could facilitate spatiotemporal expectations by amplifying and disamplifying feedforward input based on context (Phillips et al., 2015).
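The abstract names multivoxel pattern analysis (MVPA) as the decoding method but does not spell it out. As an illustration only, here is a minimal sketch of the general technique, classifying stimulus categories from voxel response patterns with a cross-validated linear classifier; the data, array sizes, and category count below are hypothetical placeholders, not the study's actual pipeline or parameters.

```python
# Illustrative MVPA sketch (hypothetical data; not the study's pipeline).
# Assumes preprocessed voxel patterns: one row per trial, one column per
# voxel within an ROI (e.g., the periphery of eccentricity-mapped V1),
# with a label per trial giving the auditory scene category.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(0)
n_trials, n_voxels = 120, 200                    # hypothetical sizes
X = rng.standard_normal((n_trials, n_voxels))    # stand-in voxel patterns
y = rng.integers(0, 3, size=n_trials)            # 3 hypothetical scene categories

# Cross-validated linear classifier: reliably above-chance accuracy would
# indicate that the ROI's activity patterns discriminate the sound categories.
clf = make_pipeline(StandardScaler(), LinearSVC())
scores = cross_val_score(clf, X, y, cv=StratifiedKFold(n_splits=5))
print(f"decoding accuracy: {scores.mean():.2f} (chance = 0.33)")
```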
Meeting abstract presented at VSS 2016