Vision Sciences Society Annual Meeting Abstract  |  September 2024
Volume 24, Issue 10  |  Open Access
Common computations in automatic cue combination and metacognitive confidence reports 
Author Affiliations & Notes
  • Yi Gao
    Georgia Institute of Technology
  • Kai Xue
    Georgia Institute of Technology
  • Brian Odegaard
    University of Florida
  • Dobromir Rahnev
    Georgia Institute of Technology
  • Footnotes
    Acknowledgements  We thank Minzhi Wang for his help with data collection. This work was supported by the National Institutes of Health (award: R01MH119189) and the Office of Naval Research (award: N00014-20-1-2622).
Journal of Vision September 2024, Vol. 24, 313. doi: https://doi.org/10.1167/jov.24.10.313
Abstract

Sensory stimuli introduce varying degrees of uncertainty, and accurately estimating and using this sensory uncertainty is crucial for appropriate perceptual decision making. Previous research has examined uncertainty estimation in both low-level multisensory cue combination and metacognitive confidence judgments. However, it remains unclear whether these two forms of uncertainty estimation share the same computations. To address this question, we used a well-established method of dissociating confidence from accuracy by manipulating energy levels in a random-dot kinematogram. Subjects (N = 99) completed a direction discrimination task on visual stimuli with low vs. high overall motion energy. We found that the high-energy stimuli led to higher confidence but lower accuracy in a visual-only task. Importantly, we also investigated the impact of these visual stimuli on auditory motion perception in a separate task in which the visual stimuli were irrelevant to the auditory judgment. Both the high- and low-energy visual stimuli influenced auditory judgments, presumably through automatic low-level mechanisms. Critically, the high-energy visual stimuli influenced auditory judgments more strongly than the low-energy visual stimuli. This effect was in line with the confidence differences, but contrary to the accuracy differences, between the high- and low-energy stimuli in the visual-only task. These effects were captured by a simple computational model that assumes that common computations underlie confidence reports and multisensory cue combination. Our results reveal a deep link between automatic sensory processing and metacognitive confidence reports, and suggest that vastly different stages of perceptual decision making rely on common computational principles.
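The abstract does not spell out the model's equations, so the sketch below is only a plausible reconstruction: it assumes a standard reliability-weighted (Bayesian) cue-combination scheme in which a single internal estimate of visual uncertainty drives both the weight given to the visual cue in the audiovisual task and the confidence report in the visual-only task. All function names, the inverse mapping from estimated uncertainty to confidence, and the specific parameter values are illustrative assumptions, not the authors' model.

```python
import numpy as np

def cue_combination_weight(sigma_vis, sigma_aud):
    """Reliability-weighted (Bayesian) weight on the visual cue.

    A cue's reliability is its inverse variance; the optimal combined
    estimate weights each cue by its relative reliability.
    """
    r_vis, r_aud = 1.0 / sigma_vis**2, 1.0 / sigma_aud**2
    return r_vis / (r_vis + r_aud)

def combined_estimate(x_vis, x_aud, sigma_vis, sigma_aud):
    """Combined audiovisual motion estimate under the same weighting."""
    w = cue_combination_weight(sigma_vis, sigma_aud)
    return w * x_vis + (1 - w) * x_aud

def confidence(sigma_vis_estimated):
    """Illustrative confidence read-out: lower estimated visual
    uncertainty -> higher confidence (squashed into [0, 1])."""
    return 1.0 / (1.0 + sigma_vis_estimated)

# Key assumption linking the two tasks: the observer misestimates the
# uncertainty of high-energy stimuli as low. All values are hypothetical.
sigma_true = {"low_energy": 1.0, "high_energy": 1.5}  # actual noise (accuracy)
sigma_est  = {"low_energy": 1.0, "high_energy": 0.7}  # estimated noise
sigma_aud = 1.2                                       # auditory cue noise

for cond in ("low_energy", "high_energy"):
    w = cue_combination_weight(sigma_est[cond], sigma_aud)
    print(f"{cond}: visual weight = {w:.2f}, "
          f"confidence = {confidence(sigma_est[cond]):.2f}")
# With these numbers, the high-energy condition yields both a larger
# visual weight in cue combination and a higher confidence report, even
# though its true noise (sigma_true) is larger, i.e., accuracy is lower.
```

Under this shared-computation account, the stronger pull of the high-energy visual stimuli on auditory judgments and their higher confidence reports both follow from the same underestimated visual uncertainty, which is the dissociation the abstract describes.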
