September 2018
Volume 18, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract | September 2018
Eye Movements During Face Viewing Predict Individual Differences in Noisy Audiovisual Speech Perception
Author Affiliations
  • Johannes Rennig
    Department of Neurosurgery and Core for Advanced MRI, Baylor College of Medicine, Houston, TX, USA
  • Kira Wegner-Clemens
    Department of Neurosurgery and Core for Advanced MRI, Baylor College of Medicine, Houston, TX, USA
  • Michael Beauchamp
    Department of Neurosurgery and Core for Advanced MRI, Baylor College of Medicine, Houston, TX, USA
Journal of Vision September 2018, Vol.18, 938. doi:https://doi.org/10.1167/18.10.938

Johannes Rennig, Kira Wegner-Clemens, Michael Beauchamp; Eye Movements During Face Viewing Predict Individual Differences in Noisy Audiovisual Speech Perception. Journal of Vision 2018;18(10):938. https://doi.org/10.1167/18.10.938.

© ARVO (1962-2015); The Authors (2016-present)

Abstract

Humans use visual speech information from a talker's mouth movements to complement auditory information from the talker's voice. Recently, we discovered individual differences in eye movements during viewing of talking faces: some observers mainly fixate the mouth of the talker, while others mainly fixate the eyes. In 34 participants, we tested the hypothesis that mouth-lookers make better use of visual speech. In experiment 1, participants viewed clear audiovisual syllables. A median split of the eye-tracking data was used to classify participants as mouth-lookers (81% of trial time spent fixating the mouth) or eye-lookers (45%). In experiment 2, participants repeated noisy auditory sentences presented alone or paired with visual speech. An ANOVA on the number of words accurately repeated showed main effects of condition (higher accuracy for audiovisual than auditory speech, F=234, p=10^-15) and group (higher accuracy for mouth-lookers, F=5, p=0.03). Critically, there was a significant interaction, with mouth-lookers showing a greater improvement in accuracy when visual speech was presented (F=7, p=0.01). Given the higher acuity of foveal vision, fixating the talker's mouth might be expected to provide more visual speech information. To assess this possibility, we examined the eye movements made by the participants during experiment 2. Both mouth-lookers and eye-lookers almost exclusively fixated the mouth (94% vs. 92% mouth-fixation time, p=0.53), consistent with previous demonstrations that noisy auditory speech drives mouth fixation. The propensity to fixate the mouth of the talker even when it is not necessary (during perception of clear audiovisual speech) is thus linked to improved perception under noisy conditions, in which mouth movements are critical for understanding speech. We speculate that although all humans have extensive experience with talking faces, the additional time that mouth-lookers spend examining the mouth leads to greater expertise in extracting visual speech features.
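
The two analysis steps described above lend themselves to a short sketch: a median split on mouth-fixation time defines the groups (experiment 1), and a 2 (condition, within subjects) x 2 (group, between subjects) mixed ANOVA tests for the interaction (experiment 2). The following is a minimal Python sketch on simulated data using the pingouin statistics package; the data values, variable names, and choice of package are illustrative assumptions, not the authors' actual code or data.

```python
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(0)
n = 34  # number of participants, as in the study

# Experiment 1 (simulated): proportion of trial time spent fixating the
# talker's mouth while viewing clear audiovisual syllables.
mouth_prop = rng.uniform(0.2, 0.95, size=n)

# Median split classifies participants as mouth-lookers vs. eye-lookers.
group = np.where(mouth_prop >= np.median(mouth_prop),
                 "mouth-looker", "eye-looker")

# Experiment 2 (simulated): proportion of words correctly repeated for
# noisy sentences, auditory-only vs. audiovisual. The larger audiovisual
# benefit for mouth-lookers builds in the reported interaction.
acc_auditory = rng.normal(0.45, 0.08, size=n)
av_benefit = np.where(group == "mouth-looker", 0.30, 0.20)
acc_audiovisual = acc_auditory + av_benefit + rng.normal(0.0, 0.05, size=n)

# Long-format table: one row per participant per condition.
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n), 2),
    "group": np.repeat(group, 2),
    "condition": np.tile(["auditory", "audiovisual"], n),
    "accuracy": np.column_stack([acc_auditory, acc_audiovisual]).ravel(),
})

# 2 (condition, within-subject) x 2 (group, between-subject) mixed ANOVA.
aov = pg.mixed_anova(data=df, dv="accuracy", within="condition",
                     subject="subject", between="group")
print(aov[["Source", "F", "p-unc"]])
```

On real data, mouth_prop would come from the experiment 1 eye tracking and accuracy from the sentence-repetition scores; the Interaction row of the resulting table corresponds to the F=7, p=0.01 effect reported above.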

Meeting abstract presented at VSS 2018
