Vision Sciences Society Annual Meeting Abstract | August 2014
Angry faces reduce sensitivity for auditory-visual temporal asynchrony
Author Affiliations
  • L Jacob Zweig
    Department of Psychology, Northwestern University
  • David Brang
    Department of Psychology, Northwestern University
  • Satoru Suzuki
    Department of Psychology, Northwestern University
  • Marcia Grabowecky
    Department of Psychology, Northwestern University
Journal of Vision August 2014, Vol.14, 436. doi:https://doi.org/10.1167/14.10.436
Abstract

Perception of multisensory events, such as a person speaking, relies on binding information from distinct sensory modalities into a unitary percept. A temporal window of integration for multisensory events allows flexibility to account for latency differences arising from both variable physical transmission rates through the environment and neural transmission rates within the brain (Shelton, 2010). Previous research has shown that the width of the temporal window is influenced by factors including attention, spatial disparity, and stimulus complexity (e.g., Spence & Parise, 2010). The extent to which the temporal window of integration for speech is influenced by emotion, however, remains unknown. In the present study, we demonstrate that an angry expression reduces temporal sensitivity for detecting auditory-visual asynchrony in speech perception. The auditory and visual streams of a video of a person uttering syllables were presented to participants at varying time delays using the method of constant stimuli. For each auditory stream, the accompanying visual stream was manipulated to assume a happy, neutral, or angry facial expression. Participants made unspeeded temporal order judgments indicating whether the auditory or visual stream occurred first. Facial expression did not influence the point of subjective simultaneity (PSS), suggesting that facial expression either does not influence the perception of speech onset or does so equally for the visual and auditory modalities. An angry expression, by contrast, significantly increased the just noticeable difference (JND), indicating reduced sensitivity for detecting temporal asynchronies between auditory and visual speech streams. This result suggests that emotion processing influences the perception of audiovisual synchrony.
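The PSS and JND are standard summary statistics of temporal order judgment data. As a minimal illustration (not the authors' analysis code), the sketch below fits a cumulative Gaussian psychometric function to the proportion of "visual first" responses across stimulus onset asynchronies (SOAs); the PSS is the fitted mean, and, under one common convention, the JND is half the 25%-75% interquartile range. All data values and parameter choices here are illustrative assumptions.

    # Hypothetical sketch: estimating PSS and JND from temporal order
    # judgment data by fitting a cumulative Gaussian psychometric function.
    # Data values below are made up for illustration.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    # SOAs in ms (negative = audio leads) and the proportion of
    # "visual first" responses observed at each SOA.
    soa_ms = np.array([-300.0, -200.0, -100.0, -50.0, 0.0,
                       50.0, 100.0, 200.0, 300.0])
    p_visual_first = np.array([0.05, 0.10, 0.25, 0.40, 0.55,
                               0.70, 0.85, 0.95, 0.98])

    def cum_gauss(soa, mu, sigma):
        # Cumulative Gaussian psychometric function.
        return norm.cdf(soa, loc=mu, scale=sigma)

    (mu, sigma), _ = curve_fit(cum_gauss, soa_ms, p_visual_first,
                               p0=[0.0, 100.0])

    pss = mu                      # SOA at which both orders are reported equally often
    jnd = sigma * norm.ppf(0.75)  # half the 25%-75% interquartile range

    print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")

A wider JND under this convention corresponds directly to a shallower psychometric slope, which is the sense in which the abstract's angry-face condition reflects reduced temporal sensitivity.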

Meeting abstract presented at VSS 2014
