Vision Sciences Society Annual Meeting Abstract  |   August 2023
Audiovisual multisensory Event Related Potentials using the McGurk effect as a stimulation paradigm
Author Affiliations
  • Jonathon Toft-Nielsen
    JÖRVEC Corp
    Intelligent Hearing Systems Corp
    University of Miami, Department of Biomedical Engineering
  • Rafael Delgado
    JÖRVEC Corp
    Intelligent Hearing Systems Corp
    University of Miami, Department of Biomedical Engineering
  • Özcan Özdamar
    University of Miami, Department of Biomedical Engineering
Journal of Vision August 2023, Vol.23, 5322. doi:https://doi.org/10.1167/jov.23.9.5322
Abstract

The purpose of this study was to assess how audiovisual (AV) multisensory stimuli are integrated in speech perception using behavioral and neurological measures. In particular, the McGurk effect is a robust phenomenon that occurs when a listener is presented with conflicting auditory and visual cues of a person speaking, resulting in an incorrect auditory perception of the speech token that corresponds to the visual stimulus. This effect demonstrates the influence of vision on hearing in the perception of speech. For the experiments, the McGurk effect was generated using combinations of audio and video stimuli corresponding to /fa/ and /ba/ speech tokens and presented to the subjects using the Duet Evoked Potential System with a Video Controller Module (Intelligent Hearing Systems, Miami, FL). The system allows the precise synchronization and mixing of audio and video stimuli required to generate the McGurk effect and to record evoked potentials. Behavioral measures were conducted to determine which auditory token was perceived by the subjects in combination with matching (congruous) or conflicting (incongruous) videos. Behavioral results indicated that the video presented affects the perceived auditory token, confirming the McGurk effect in the subject population. An odd-ball paradigm (85% common to 15% odd) was used to record Event Related Potentials (ERPs) corresponding to auditory-only, vision-only, and matching or conflicting AV multisensory stimulation. P300 response amplitudes and latencies were measured. As expected, the auditory-only and vision-only stimulation generated corresponding ERPs to the change from common to odd presentations of each stimulus. Similarly, the combined AV presentation with conflicting audio and video also generated a robust ERP response. The specific contributions of the perceived auditory and visual stimulus components to the overall ERP responses are analyzed and compared with other studies.
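
As a rough illustration of the paradigm described above, the following Python sketch draws an 85%/15% odd-ball trial sequence of congruent and incongruent (McGurk) AV tokens and extracts a P300 peak amplitude and latency from an averaged waveform. The condition labels, the constraint against back-to-back deviants, and the 250-500 ms search window are illustrative assumptions, not details taken from the study or the Duet system.

import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical condition labels: the "standard" is a congruent audio/visual
# /ba/ pairing; the "deviant" is an incongruent McGurk pairing
# (audio /ba/ with visual /fa/).
STANDARD = "AV_congruent_ba"
DEVIANT = "AV_incongruent_ba_audio_fa_video"

def oddball_sequence(n_trials: int, p_deviant: float = 0.15) -> list[str]:
    """Draw an 85%/15% odd-ball trial sequence.
    Assumption: deviants are not allowed to occur back to back."""
    seq = []
    for _ in range(n_trials):
        is_deviant = rng.random() < p_deviant and (not seq or seq[-1] != DEVIANT)
        seq.append(DEVIANT if is_deviant else STANDARD)
    return seq

def p300_peak(erp_uv: np.ndarray, times_ms: np.ndarray,
              window_ms: tuple = (250.0, 500.0)) -> tuple[float, float]:
    """Return (amplitude_uV, latency_ms) of the largest positive deflection
    in an assumed 250-500 ms P300 search window of an averaged ERP."""
    mask = (times_ms >= window_ms[0]) & (times_ms <= window_ms[1])
    idx = np.argmax(erp_uv[mask])
    return float(erp_uv[mask][idx]), float(times_ms[mask][idx])

# Example usage with synthetic data (toy P300-like waveform peaking at 350 ms).
trials = oddball_sequence(200)
times = np.linspace(-100, 800, 901)                  # 1 ms steps, -100..800 ms
erp = 5.0 * np.exp(-((times - 350.0) / 60.0) ** 2)
print(trials[:10])
print(p300_peak(erp, times))                         # -> (5.0, 350.0)

In practice the deviant probability, token assignments, and measurement window would be set to match the actual recording protocol; the sketch only shows the structure of an odd-ball sequence and a peak-picking measurement.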
