Abstract
Is the conscious perception of seeing a flash, hearing a sound or feeling a touch associated with one common core-activity pattern in the brain? Here, I present novel magnetoencephalography (MEG) data that reveal such supramodal neural correlates of conscious perception. On each trial, different visual, auditory or tactile stimuli were presented at individual perceptual thresholds, such that about half of the stimuli were consciously detected, while the other half were missed. Four different stimuli per modality were used (i.e. different Gabor patches, sound frequencies, stimulated fingers) in order to subsequently leverage representational similarity analysis (RSA) for differentiating modality-specific sensory processing from supramodal conscious experiences, which are similar across modalities. As expected, there was stronger evoked MEG activity for detected vs. missed stimuli during sensory processing (<0.5 s) in the respective sensory cortices. Moreover, consistent with previous work, there was stronger alpha-frequency band power (8-13 Hz) for missed vs. detected trials in the pre-stimulus period and in a later time window after stimulus onset (>0.5 s) for all three modalities. Critically, the RSA distinguished activity patterns related to modality-specific sensory processing shortly after stimulus onset (<0.5 s) from later supramodal conscious processing (>0.5 s). Overall, these findings suggest a three-stage model for conscious multisensory experiences, involving pre-stimulus alpha oscillations, modality-specific sensory processing upon stimulus onset and then later supramodal conscious perception. This temporal processing cascade may serve the integration and updating of pre-stimulus brain states, presumably reflecting top-down predictions about upcoming sensory events, with subsequent conscious experiences irrespective of the specific sensory modality.
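To make the RSA logic concrete, the following is a minimal sketch (not the study's actual pipeline) of how a representational dissimilarity matrix (RDM) over the 12 conditions (4 stimuli x 3 modalities) could be compared against a "modality-specific" model RDM. All data, sizes and variable names here are illustrative stand-ins; in the study, the patterns would come from MEG sensor data per condition and time window.

```python
# Hedged RSA sketch with simulated data; none of these arrays reflect
# the actual MEG recordings described in the abstract.
import numpy as np

rng = np.random.default_rng(0)
n_conditions, n_sensors = 12, 50   # 4 stimuli x 3 modalities (assumed sizes)
patterns = rng.normal(size=(n_conditions, n_sensors))

# Neural RDM: pairwise dissimilarity = 1 - Pearson correlation.
rdm = 1.0 - np.corrcoef(patterns)

# Model RDM for modality-specific coding: conditions from the same
# modality (blocks of 4) are similar, different modalities dissimilar.
modality = np.repeat(np.arange(3), 4)
model_rdm = (modality[:, None] != modality[None, :]).astype(float)

# Compare neural and model RDMs over the upper-triangle entries
# (rank correlation is common in RSA; plain Pearson is used here).
iu = np.triu_indices(n_conditions, k=1)
fit = float(np.corrcoef(rdm[iu], model_rdm[iu])[0, 1])
print(fit)
```

A supramodal model RDM (e.g. detected vs. missed, regardless of modality) would be built analogously, and the two model fits tracked over time to separate early sensory from later supramodal stages.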