Abstract
Recent work in neuroimaging has highlighted that visual and auditory evidence integration can be tracked by the centroparietal positivity (CPP; O'Connell et al., 2012, Nature Neuroscience), an ERP component that echoes the ramping activity of neurons in the parietal cortex. While the CPP has been observed for visual and auditory signals in separate tasks, it remains unclear whether this component also reflects the integration of multisensory signals. Fourteen participants monitored a continuous stream of visual random dot motion and auditory tone clouds. The random dot motion consisted of 200 small dots displayed within a 7 dva circular aperture and repositioned every 50 ms. The tone clouds consisted of 10 simultaneous 50 ms pure tones drawn uniformly from a 6-octave range (220 to 14080 Hz) with a resolution of 12 semitones per octave. Participants had to detect the onset of a change from incoherent noise to a coherent signal: upward-moving dots (unisensory visual), rising tone sequences (unisensory auditory), or simultaneous changes in both modalities (bimodal redundant). While participants performed the task, continuous EEG was recorded from 64 electrodes. The CPP was observed over centroparietal electrodes for unisensory visual, unisensory auditory, and bimodal redundant changes. Its slope predicted reaction times in all three conditions and, consistent with previous observations, it reached a stereotyped amplitude level just before response execution. Additionally, the visual scalp projection of the CPP could be distinguished from its auditory equivalent by linear discriminant and single-trial analyses, implying that visual evidence integration can be separated from its auditory counterpart. Altogether, our results suggest that the CPP extends to bimodal decision-making, and that its distinct visual and auditory signatures can reveal modality-specific dynamics of evidence integration.
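For illustration, the tone-cloud parameterization described above (10 simultaneous 50 ms pure tones drawn from a 6-octave range starting at 220 Hz, at 12 semitones per octave) can be sketched as follows. This is a minimal sketch based only on the parameters stated in the abstract, not the authors' actual stimulus code; all function and variable names are hypothetical.

```python
import random

# Parameters taken from the abstract; everything else is illustrative.
BASE_HZ = 220.0            # lowest frequency of the 6-octave range
OCTAVES = 6                # 220 Hz * 2**6 = 14080 Hz at the top of the range
SEMITONES_PER_OCTAVE = 12  # frequency resolution of the grid
TONES_PER_CLOUD = 10       # simultaneous pure tones per 50 ms chord

def tone_cloud_frequencies(rng=random):
    """Return the frequencies (Hz) of one 50 ms tone-cloud chord."""
    # Semitone grid spanning 6 octaves: 220 * 2**(k/12) for k = 0..72.
    steps = OCTAVES * SEMITONES_PER_OCTAVE
    grid = [BASE_HZ * 2 ** (k / SEMITONES_PER_OCTAVE) for k in range(steps + 1)]
    # Draw 10 tones uniformly (with replacement) from the grid.
    return [rng.choice(grid) for _ in range(TONES_PER_CLOUD)]
```

A "rising" auditory signal could then be approximated by biasing successive 50 ms draws toward progressively higher regions of this grid, analogous to increasing the proportion of upward-moving dots in the visual stream.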