Abstract
Moving objects in a cluttered scene are often detected by multisensory cues. We investigated the cortical activations associated with coherent visual motion perception in the presence of a stationary or moving sound source. Subjects (n = 12) judged 5-s episodes of random-dot motion containing either no (0%), sparse (3%) or abundant (16%) coherent visual direction information. Simultaneous auditory noise was presented via MR-compatible headphones with an in-phase moving, out-of-phase moving or stationary sound source (simulated with a generic head-related transfer function). In a 4AFC response paradigm, subjects judged whether visual coherent motion was present, and if so, whether the auditory sound source was moving in-phase with the visual motion, was moving out-of-phase or was not moving. All subjects achieved threshold-level performance at the 3% visual coherence level, and the false alarm rate remained below 20%. T2*-weighted images were acquired on a 1.5 T Siemens Sonata with an eight-channel phased-array head coil, and fixation was monitored with the MR-Eyetracker (CRS Ltd). To minimize interference from the noise created by the gradient system (headphone damping: 80 dB), a sparse-imaging, whole-brain (36 slices) design was employed with TR = 3.4 s and inter-acquisition breaks of 11.6 s. An SPM2 fixed-effects analysis revealed significant BOLD clusters in extrastriate and association visual cortex that increased in magnitude with visual coherence level. Auditory motion activated an extended region (> 1000 voxels) of the STG, exhibiting a right-hemispheric preponderance. Combined audio-visual motion (contrast: in-phase > static sound) led to significant activations in the supramarginal gyrus and STG, and the resulting effect size was larger for the in-phase than for the out-of-phase condition. Our findings indicate that the lateral parietal and superior temporal cortices underlie our ability to integrate audio-visual motion cues.
Supported by: DFG, SPP1107