Abstract
The brain's synthesis of information across visual and auditory modalities confers substantive benefits in detecting, localizing, and identifying salient environmental events. When encountering brief, discrete cross-modal cues, the brain appears to use all of the available sensory information, and the ensuing multisensory detection and localization decisions conform to statistically optimal models of multisensory integration. The aim of the present study was to determine whether and how this optimality extends to the processing of continuous, patterned stimuli needed to inform an ongoing behavioral task. Human participants (n=20) manually tracked the fluctuations of patterned visual (an expanding/contracting annulus) or auditory (an FM tone) stimuli in the presence of significant signal contamination. Cues were tested both individually and in combination. The frequency of oscillation in one or both modalities shifted subtly on some trials, requiring participants to detect the change(s) and shift their behavioral plan. In addition to baseline tracking performance, we analyzed the probability and speed with which participants detected changes in the underlying pattern, began shifting to the new pattern, and re-established stable tracking. These performance metrics were compared across the unisensory and multisensory test conditions. The largest multisensory enhancements were observed in change detection and in the speed of shifting to the new pattern. For most (but not all) participants, these enhancements approximated the predictions of the statistically optimal model. Thus, the statistical optimality of multisensory processing appears to apply not only to brief, static cues but also in continuous-time contexts.
Meeting abstract presented at VSS 2016
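The abstract does not specify the form of the statistically optimal model being tested; a common formalization in the cue-combination literature is the maximum-likelihood (reliability-weighted) integration model (e.g., Ernst & Banks, 2002), in which each cue is weighted by its inverse variance and the combined estimate has lower variance than either unisensory estimate. The sketch below illustrates that prediction under this assumption; the variable names (sigma_v, sigma_a) and the example standard deviations are hypothetical and not taken from the study.

```python
import numpy as np

def mle_combination(sigma_v, sigma_a):
    """Maximum-likelihood (reliability-weighted) cue combination.

    Given the standard deviations of the unisensory visual and
    auditory estimates, return the optimal cue weights and the
    predicted standard deviation of the combined estimate.
    Note: an illustrative sketch, not the study's analysis code.
    """
    rel_v = 1.0 / sigma_v**2  # reliability = inverse variance
    rel_a = 1.0 / sigma_a**2
    w_v = rel_v / (rel_v + rel_a)  # weight on the visual cue
    w_a = rel_a / (rel_v + rel_a)  # weight on the auditory cue
    # Predicted multisensory variance is below both unisensory variances
    sigma_va = np.sqrt((sigma_v**2 * sigma_a**2) / (sigma_v**2 + sigma_a**2))
    return w_v, w_a, sigma_va

# Hypothetical unisensory noise levels (arbitrary units)
w_v, w_a, sigma_va = mle_combination(sigma_v=2.0, sigma_a=3.0)
print(f"w_V = {w_v:.2f}, w_A = {w_a:.2f}, predicted SD = {sigma_va:.2f}")
# -> w_V = 0.69, w_A = 0.31, predicted SD = 1.66 (less than 2.0 or 3.0)
```

Under this model, a participant's multisensory performance is "optimal" if the observed precision (or, by analogy, change-detection speed) matches the prediction computed from that participant's own unisensory performance, which is the sense in which most participants in the study approximated the model.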