Abstract
To interact effectively with their surroundings, animals are equipped with multiple sensory systems that redundantly perceive their body and the world around them. Over the last decade, a number of studies have demonstrated that, by selectively combining related information across the continuous stream of sensory inputs, the nervous system can integrate multisensory cues in a statistically optimal fashion. However, the fundamental question of how the brain selects which signals to combine, that is, how it solves the correspondence problem, still lacks a satisfactory explanation. Here we propose and validate a biologically plausible, multi-purpose model for multisensory correlation detection. The model solves the correspondence problem, performs optimal cue integration, and detects both correlations and lags across multiple sensory signals. It closely replicates human performance in a variety of novel and previously published psychophysical tasks. The present results provide, for the first time, a single general explanation for a number of key aspects of multisensory perception, such as optimal cue integration, the breakdown of integration, and the detection of correlation, lag, and synchrony across the senses.
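The abstract does not describe the model's internal architecture, so the following Python snippet is only an illustrative sketch, not the authors' model: it shows how correlation and temporal lag between two noisy sensory signals could in principle be estimated by sliding one signal against the other. The function name `correlation_and_lag`, the signals `audio` and `visual`, and the parameter `max_lag` are all hypothetical.

```python
import numpy as np

def correlation_and_lag(signal_a, signal_b, max_lag):
    """Estimate correlation and lag between two equal-length signals.

    Returns (best_correlation, best_lag); a positive lag means that
    signal_b lags signal_a by that many samples. Purely illustrative.
    """
    best_corr, best_lag = -np.inf, 0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = signal_a[:len(signal_a) - lag], signal_b[lag:]
        else:
            a, b = signal_a[-lag:], signal_b[:lag]
        if len(a) < 2:
            continue
        corr = np.corrcoef(a, b)[0, 1]  # normalized correlation at this lag
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return best_corr, best_lag

# Example: a "visual" signal that is a delayed, noisy copy of an "auditory" one
rng = np.random.default_rng(0)
audio = rng.standard_normal(200)
visual = np.roll(audio, 5) + 0.2 * rng.standard_normal(200)  # delayed by 5 samples
print(correlation_and_lag(audio, visual, max_lag=20))  # high correlation, lag near 5
```

In this toy example the estimator recovers both a high correlation and the imposed delay, which is the kind of joint correlation-and-lag readout the abstract attributes to the proposed model.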
Meeting abstract presented at VSS 2015