September 2019
Volume 19, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract | September 2019
Time-resolved discrimination of audiovisual expressions of emotion in children with and without autism
Author Affiliations & Notes
  • Kirsty Ainsworth
    Perceptual Neuroscience Laboratory for Autism and Development (PNLab), McGill University, Montréal, QC, Canada
  • Federica Falagiarda
    Institute of Research in Psychology (IPSY) & Institute of Neuroscience (IoNS), University of Louvain, Louvain, Belgium
  • Olivier Collignon
    Institute of Research in Psychology (IPSY) & Institute of Neuroscience (IoNS), University of Louvain, Louvain, Belgium
    Centre for Mind/Brain Studies, University of Trento, Trento, Italy
  • Armando Bertone
    Perceptual Neuroscience Laboratory for Autism and Development (PNLab), McGill University, Montréal, QC, Canada
Journal of Vision September 2019, Vol.19, 20a. doi:https://doi.org/10.1167/19.10.20a
Citation: Kirsty Ainsworth, Federica Falagiarda, Olivier Collignon, Armando Bertone; Time-resolved discrimination of audiovisual expressions of emotion in children with and without autism. Journal of Vision 2019;19(10):20a. https://doi.org/10.1167/19.10.20a.

Abstract

Atypical sensory perception is now recognized as one of the key characteristics of autism (APA, 2013), and research suggests that disrupted multisensory integration (MSI) may underlie the sensory behaviours observed in this population (Feldman et al., 2018). Moreover, the integration of social information (such as faces and facial expressions of emotion) has been shown to be particularly anomalous in individuals with autism (Charbonneau et al., 2013). A novel gating paradigm was used to assess the discrimination of emotional expressions (anger, fear, happiness, and sadness) communicated by voices, faces, or multisensory stimuli presented through 10 incremental gates of ~33 ms each (Falagiarda and Collignon, 2018). Thirty-two children with autism (Mage = 12.13, SD = 3.45) and 56 typical controls (Mage = 11.91, SD = 3.41) responded to the stimuli in a four-alternative forced-choice task, and thresholds were extracted from logistic psychometric curves. An ANOVA revealed a significant main effect of modality (F = 79.34, p < 0.001): bimodal stimuli yielded the lowest thresholds, followed by visual and then auditory stimuli (all ps < 0.001, Bonferroni corrected). A main effect of emotion was also observed (F = 5.27, p = 0.01), with fear detected the fastest. The model also revealed a main effect of group (F = 5.92, p = 0.02), with higher thresholds in children with autism. Our study is the first to characterize time-resolved discrimination of audiovisual expressions of emotion in children with and without autism. The results demonstrate that, in children, faces elicit earlier detection of emotion than voices, and that bimodal stimulation is superior to unimodal stimulation. Further, children with autism required additional, accumulated sensory information for reliable emotion discrimination, an effect that generalized across senses and across emotions. The relationship between emotion discrimination performance and sensory-related behaviours at different periods of development will also be explored.
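To make the threshold-extraction step concrete, the sketch below fits a logistic psychometric function to per-gate accuracies from a four-alternative forced-choice gating task and reads off the duration threshold. This is a minimal illustration in Python, not the authors' analysis code; the per-gate accuracy values, the fixed 0.25 guess rate, the lapse-rate term, and all parameter names are assumptions.

# Minimal sketch of threshold extraction from a gating paradigm (hypothetical data).
import numpy as np
from scipy.optimize import curve_fit

CHANCE = 0.25  # guess rate assumed for a 4-alternative forced-choice task

def psychometric(x, alpha, beta, lapse):
    """Logistic psychometric function with a fixed 4AFC guess rate.

    alpha : threshold (stimulus duration at the curve's midpoint)
    beta  : slope parameter
    lapse : lapse rate (upper-asymptote shortfall)
    """
    return CHANCE + (1.0 - CHANCE - lapse) / (1.0 + np.exp(-(x - alpha) / beta))

# Hypothetical per-gate data: 10 gates of ~33 ms each and the proportion of
# correct emotion identifications at each cumulative stimulus duration.
gate_durations = np.arange(1, 11) * 33.0
accuracy = np.array([0.26, 0.30, 0.38, 0.52, 0.64,
                     0.75, 0.84, 0.90, 0.93, 0.95])

# Fit the curve and extract the threshold (alpha), in ms of accumulated stimulus.
params, _ = curve_fit(
    psychometric, gate_durations, accuracy,
    p0=[150.0, 30.0, 0.02],
    bounds=([0.0, 1.0, 0.0], [330.0, 200.0, 0.2]),
)
alpha, beta, lapse = params
print(f"Estimated threshold: {alpha:.1f} ms of accumulated stimulus")

In this sketch, a lower fitted threshold corresponds to earlier (faster) discrimination; group, modality, and emotion effects like those reported above would then be tested on the per-participant thresholds.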
