Open Access
Vision Sciences Society Annual Meeting Abstract | December 2022
EEG evoked activity suggests amodal evidence integration in multisensory decision-making
Author Affiliations
  • Thomas Schaffhauser
    CNRS & Ecole Normale Supérieure, Paris, France
  • Alain De Cheveigné
    CNRS & Ecole Normale Supérieure, Paris, France
  • Yves Boubenec
    CNRS & Ecole Normale Supérieure, Paris, France
  • Pascal Mamassian
    CNRS & Ecole Normale Supérieure, Paris, France
Journal of Vision December 2022, Vol.22, 3963. doi:https://doi.org/10.1167/jov.22.14.3963
Abstract

Recent work in neuroimaging has revealed neural signatures of evidence integration (O’Connell et al., 2012, Nat Neuro; Philiastides et al., 2014, J Neuro) that reflect the ramping activity of neurons in the parietal cortex. While these experiments focused on unisensory visual and auditory perceptual decision-making, it remains unclear to what extent the neural correlates of multisensory evidence integration are shared with their unisensory counterparts. To address this issue, we designed a change detection paradigm in which twenty-one participants monitored a continuous stream of visual random dot motion and auditory tone clouds. The random dot motion was displayed within a circular aperture and consisted of 200 small dots repositioned every 50 ms. The tone clouds consisted of 10 simultaneous 50 ms pure tones drawn from a range of 6 octaves (220 to 14,080 Hz) with a resolution of 12 semitones per octave. In this continuous bimodal stream, participants had to detect unisensory changes (a change from incoherent noise to a coherent pattern of upward-moving dots or rising tone sequences) or bimodal changes (simultaneous auditory and visual changes in coherence) while continuous EEG was acquired from 64 scalp electrodes. EEG activity was denoised with spatial filtering techniques to isolate the components that most reproducibly captured neural activity evoked by stimulus change onset (de Cheveigné & Simon, 2008, J Neuro Methods). Evoked EEG activity discriminated between visual and auditory target stimuli, highlighting separable encoding of visual and auditory coherence changes. Further analyses revealed a component that ramped up before participants' responses, echoing evidence accumulation, and that appeared to be common to unisensory (visual, auditory) and redundant audio-visual changes. These results point to a single amodal accumulator that integrates evidence from each sensory modality in isolation or from a combined bimodal signal.
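
To make the stimulus description concrete, the following minimal sketch (not the authors' code) generates an incoherent tone-cloud stream with the stated parameters: 50 ms chords of 10 simultaneous pure tones drawn from a 6-octave grid (220 to 14,080 Hz) at 12 semitones per octave. The sampling rate, the absence of onset/offset ramps, and the omission of the coherent "rising" manipulation are simplifying assumptions.

    import numpy as np

    FS = 44100              # audio sampling rate in Hz (assumed; not stated in the abstract)
    CHORD_DUR = 0.050       # 50 ms per chord
    N_TONES = 10            # simultaneous pure tones per chord
    F_MIN = 220.0           # lowest grid frequency in Hz
    N_STEPS = 6 * 12        # 6 octaves at 12 semitones per octave

    # Frequency grid: 220 Hz to 14,080 Hz in semitone steps (73 values)
    freq_grid = F_MIN * 2.0 ** (np.arange(N_STEPS + 1) / 12.0)

    def make_chord(rng):
        """Return one 50 ms chord of 10 tones drawn at random from the grid."""
        freqs = rng.choice(freq_grid, size=N_TONES, replace=False)
        t = np.arange(int(FS * CHORD_DUR)) / FS
        chord = np.sin(2 * np.pi * freqs[:, None] * t[None, :]).sum(axis=0)
        return chord / N_TONES          # scale to avoid clipping

    rng = np.random.default_rng(0)
    stimulus = np.concatenate([make_chord(rng) for _ in range(20)])   # 1 s of tone cloud

A coherent "rising tone sequence" would replace some fraction of the random draws with tones that step upward from one chord to the next; that fraction would play the role of coherence.

The spatial-filtering step can be illustrated in the same spirit. The sketch below follows the general logic of denoising by spatial filtering (de Cheveigné & Simon, 2008): find linear combinations of the 64 channels that maximize the ratio of trial-averaged (evoked) power to total power via a generalized eigendecomposition. The array shapes and the regularization term are assumptions, and this is not the authors' actual pipeline.

    import numpy as np
    from scipy.linalg import eigh

    def dss_evoked_filters(epochs):
        """Spatial filters favouring trial-reproducible evoked activity.

        epochs: array of shape (n_trials, n_channels, n_samples), e.g. EEG
        segments time-locked to stimulus change onset. Returns filters
        (columns) and their evoked-to-total power ratios, sorted descending.
        """
        n_trials, n_ch, n_samp = epochs.shape
        X = epochs.transpose(1, 0, 2).reshape(n_ch, -1)   # channels x (trials * samples)
        C0 = X @ X.T / X.shape[1]                         # covariance of single-trial data
        avg = epochs.mean(axis=0)                         # trial average (the evoked "bias")
        C1 = avg @ avg.T / n_samp                         # covariance of the trial average
        reg = 1e-9 * np.trace(C0) / n_ch * np.eye(n_ch)   # small ridge for numerical stability
        evals, evecs = eigh(C1, C0 + reg)                 # generalized eigendecomposition
        order = np.argsort(evals)[::-1]
        return evecs[:, order], evals[order]

Projecting the epochs onto the leading filters keeps the components most reproducibly evoked across trials, which is the property the abstract describes for its denoised EEG components.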
