September 2021, Volume 21, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract
EEG signature of evidence integration suggests distinct visual and auditory representation in multisensory context
Author Affiliations
  • Thomas Schaffhauser
    Laboratoire des Systèmes Perceptifs, Département d'études cognitives, École Normale Supérieure - PSL University, CNRS, 75005 Paris, France
  • Yves Boubenec
  • Pascal Mamassian
Journal of Vision September 2021, Vol. 21, 2627. https://doi.org/10.1167/jov.21.9.2627

Abstract

Recent neuroimaging work has highlighted that visual and auditory evidence integration can be tracked by the centroparietal positivity (CPP; O'Connell et al., 2012, Nature Neuroscience), an ERP component that echoes the ramping activity of neurons in parietal cortex. While the CPP has been observed for visual and auditory signals in separate tasks, it is unclear whether this component also reflects the integration of multisensory signals. To address this, 14 participants monitored a continuous stream of visual random-dot motion and auditory tone clouds. The random-dot motion consisted of 200 small dots displayed within a 7 dva circular aperture and repositioned every 50 ms. The tone clouds consisted of 10 simultaneous 50 ms pure tones drawn uniformly from a 6-octave range (220 to 14080 Hz) with a resolution of 12 semitones per octave. Participants had to detect the onset of a change from incoherent noise to either coherently upward-moving dots (unisensory visual), rising tone sequences (unisensory auditory), or simultaneous changes in both modalities (bimodal redundant). While participants performed the task, continuous EEG was recorded from 64 electrodes. The CPP was observed over centroparietal electrodes for unisensory visual, unisensory auditory, and bimodal redundant changes. Its slope predicted reaction times in all three conditions and, consistent with previous observations, it reached a stereotyped amplitude level leading up to response execution. Additionally, linear discriminant and single-trial analyses distinguished the visual scalp projection of the CPP from its auditory equivalent, implying that visual evidence integration can be separated from auditory evidence integration. Altogether, our results suggest that the CPP extends to bimodal decision-making, and that its distinct visual and auditory signatures can reveal idiosyncratic dynamics of evidence integration in each modality.
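
For concreteness, here is a minimal Python/NumPy sketch (not part of the original abstract) of the tone-cloud frequency grid implied by the stated parameters: 6 octaves above 220 Hz at 12 semitones per octave gives 73 candidate frequencies between 220 and 14080 Hz, from which 10 tones are drawn for each 50 ms chord. Uniform sampling with replacement and the helper name incoherent_chord are assumptions made for illustration; the abstract does not specify how individual chords or the rising target sequences were generated.

import numpy as np

# Illustrative sketch of the tone-cloud frequency grid described above.
# Sampling with replacement is an assumption; the abstract only states
# that tones are drawn uniformly from the 6-octave range.
F_MIN_HZ = 220.0           # lowest pure-tone frequency
N_OCTAVES = 6              # 220 Hz ... 14080 Hz
SEMITONES_PER_OCTAVE = 12  # frequency resolution
TONES_PER_CHORD = 10       # simultaneous 50 ms pure tones

# 6 octaves x 12 semitones + endpoint -> 73 candidate frequencies
steps = np.arange(N_OCTAVES * SEMITONES_PER_OCTAVE + 1)
freq_grid_hz = F_MIN_HZ * 2.0 ** (steps / SEMITONES_PER_OCTAVE)

def incoherent_chord(rng: np.random.Generator) -> np.ndarray:
    """Frequencies of one 50 ms chord of the incoherent tone cloud."""
    return rng.choice(freq_grid_hz, size=TONES_PER_CHORD, replace=True)

rng = np.random.default_rng(1)
print(freq_grid_hz[0], freq_grid_hz[-1])   # 220.0 14080.0
print(np.round(incoherent_chord(rng), 1))  # one random 10-tone chord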
