Abstract
Sensory stimuli introduce varying degrees of uncertainty, and it is crucial to accurately estimate and utilize this sensory uncertainty for appropriate perceptual decision making. Previous research has examined the estimation of uncertainty in both low-level multisensory cue combination and metacognitive estimation of confidence. However, it remains unclear whether these two forms of uncertainty estimation share the same computations. To address this question, we used a well-established method to induce a dissociation between confidence and accuracy by manipulating energy levels in a random-dot kinematogram. Subjects (N = 99) completed a direction discrimination task for visual stimuli with low vs. high overall motion energy. We found that the high-energy stimuli led to higher confidence but lower accuracy in a visual-only task. Importantly, we also investigated the impact of these visual stimuli on auditory motion perception in a separate task, where the visual stimuli were irrelevant to the auditory task. The results showed that both the high- and low-energy visual stimuli influenced auditory judgments, presumably through automatic low-level mechanisms. Critically, the high-energy visual stimuli had a stronger influence on auditory judgments than the low-energy visual stimuli. This effect mirrored the confidence differences, but ran contrary to the accuracy differences, between the high- and low-energy stimuli in the visual-only task. These effects were captured by a simple computational model that assumes that common computations underlie confidence reports and multisensory cue combination. Our results reveal a deep link between automatic sensory processing and metacognitive confidence reports, and suggest that vastly different stages of perceptual decision making rely on common computational principles.