Abstract
Multisensory integration is fundamentally a problem of redundancy exploitation, in which the brain combines corresponding information from different senses to obtain faster and more reliable perceptual estimates. To achieve this, the brain must solve the correspondence problem, that is, it must continuously monitor the senses to infer which signals contain redundant (i.e., correlated) information that should be integrated. Over the last few years, several psychophysical studies have demonstrated the fundamental role of temporal correlation in the integration and temporal processing of visual and acoustic stimuli. However, it remains an open question whether the same principles operate across other pairs of modalities and whether the detection of spatial correlation is also necessary for integration. To answer this question, we used state-of-the-art virtual reality and hand-tracking technology to deliver visual and tactile stimuli to the palm of the hand. On each trial, the stimuli, consisting of a train of four white spheres and four vibrotactile bursts, were delivered at random spatial positions and with random timing, with different seeds used to generate the visual and tactile streams. Participants observed the stimuli and reported whether or not the two modalities appeared to share a common cause. Reverse correlation analyses were then performed on the spatiotemporal cross-correlation of the visuotactile stimuli. Results demonstrated that participants indeed relied on spatiotemporal correlation to perform the causality judgment task: only stimuli with high correlation at near-zero spatial and temporal offsets were consistently judged as sharing a common cause. These results are consistent with the behavior of a population of biologically plausible multisensory correlation detectors, whose architecture closely resembles the Hassenstein-Reichardt detector originally proposed for motion vision. Taken together, these results support the view that correlation detection is indeed the canonical computation for multisensory integration, and that it operates across modalities in both time and space.
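To illustrate the logic of the analysis described above, the sketch below shows a minimal reverse-correlation recipe of the kind the abstract refers to: each trial's visual and tactile streams are cross-correlated over spatial and temporal offsets, and the resulting correlograms are averaged separately for "common cause" and "independent" reports. The function names, array shapes, and lag ranges are illustrative assumptions, not the authors' actual analysis pipeline.

```python
# Minimal sketch (assumed names and parameters): reverse correlation on the
# spatiotemporal cross-correlation of visuotactile stimulus pairs.
import numpy as np

def spatiotemporal_xcorr(vis, tac, max_dt, max_dx):
    """Cross-correlate a visual and a tactile stimulus map.

    vis, tac : 2D arrays of shape (n_positions, n_timepoints) holding the
               stimulus energy at each location and time step.
    Returns an array of shape (2*max_dx+1, 2*max_dt+1) with the correlation
    at each spatial (rows) and temporal (columns) offset.
    """
    v = (vis - vis.mean()) / (vis.std() + 1e-12)
    t = (tac - tac.mean()) / (tac.std() + 1e-12)
    out = np.zeros((2 * max_dx + 1, 2 * max_dt + 1))
    for i, dx in enumerate(range(-max_dx, max_dx + 1)):
        for j, dt in enumerate(range(-max_dt, max_dt + 1)):
            # Shift the tactile map by the spatial and temporal offset,
            # then take the mean product with the visual map.
            shifted = np.roll(np.roll(t, dx, axis=0), dt, axis=1)
            out[i, j] = np.mean(v * shifted)
    return out

def classification_image(stimuli, responses, max_dt=10, max_dx=3):
    """Classic reverse-correlation step: average the cross-correlograms of
    trials judged 'common cause' and subtract those judged 'independent'.

    stimuli   : list of (vis, tac) array pairs, one per trial.
    responses : list of bools, True = 'common cause' report.
    """
    maps = np.array([spatiotemporal_xcorr(v, t, max_dt, max_dx)
                     for v, t in stimuli])
    responses = np.asarray(responses, dtype=bool)
    return maps[responses].mean(axis=0) - maps[~responses].mean(axis=0)
```

Under the pattern of results reported in the abstract, such a classification image would peak at near-zero spatial and temporal offsets, which is the signature expected from Reichardt-style correlation detectors operating across modalities.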