Abstract
Atypical sensory perception is now recognized as a key characteristic of autism (APA, 2013), and research suggests that disrupted multisensory integration (MSI) may underlie the sensory behaviours observed in this population (Feldman et al., 2018). Moreover, the integration of social information (such as faces and facial expressions of emotion) has been shown to be particularly anomalous in individuals with autism (Charbonneau et al., 2013). A novel gating paradigm was used to assess the discrimination of emotional expressions (anger, fear, happiness and sadness) conveyed by voices, faces or multisensory stimuli presented through 10 incremental gates of ~33 ms each (Falagiarda and Collignon, 2018). Thirty-two children with autism (Mage = 12.13 years, SD = 3.45) and 56 typically developing controls (Mage = 11.91 years, SD = 3.41) responded to the stimuli in a four-alternative forced-choice task. Discrimination thresholds were extracted from logistic psychometric curves. An ANOVA revealed a significant effect of modality (F = 79.34, p < 0.001): bimodal stimuli were discriminated earliest, followed by visual and then auditory stimuli (all ps < 0.001, Bonferroni corrected). A main effect of emotion was also observed (F = 5.27, p = 0.01), with fear discriminated fastest. The model further revealed a main effect of group (F = 5.92, p = 0.02), with higher thresholds in children with autism. Our study is the first to characterize time-resolved discrimination of audiovisual expressions of emotion in children with and without autism. The results demonstrate that, in children, faces elicit earlier discrimination of emotion than voices, while bimodal stimulation outperforms either unimodal condition. They further indicate that children with autism need additional accumulated sensory information for reliable emotion discrimination, a need that generalizes across senses and emotions. The relationship between emotion discrimination performance and sensory-related behaviours at different periods of development will also be explored.
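As a purely illustrative sketch of the threshold-extraction step described above (not the authors' actual analysis code), one might fit a logistic psychometric function to proportion-correct data across the 10 gates and take its inflection point as the discrimination threshold. The function names, starting values, and bounds below are assumptions for demonstration; the chance level of 0.25 follows from the four-alternative forced-choice design.

```python
import numpy as np
from scipy.optimize import curve_fit

# Logistic psychometric function: accuracy rises from chance (0.25 in a
# 4-alternative forced-choice task) toward 1.0 as more gates are presented.
def logistic(gate, x0, k, chance=0.25):
    return chance + (1.0 - chance) / (1.0 + np.exp(-k * (gate - x0)))

def fit_threshold(gates, prop_correct):
    """Fit the curve and return x0, the gate at which accuracy crosses the
    midpoint between chance and ceiling (the curve's inflection point)."""
    popt, _ = curve_fit(logistic, gates, prop_correct,
                        p0=[5.0, 1.0],                     # assumed start values
                        bounds=([1.0, 0.01], [10.0, 10.0]))
    return popt[0]  # threshold in gate units (each gate ~33 ms of stimulus)

# Example with simulated (noiseless) accuracy over the 10 gates
gates = np.arange(1, 11)
acc = logistic(gates, x0=4.5, k=1.2)
print(round(fit_threshold(gates, acc), 2))
```

A threshold expressed in gate units can be converted to milliseconds by multiplying by the gate duration (~33 ms), which is how higher thresholds translate into a need for more accumulated sensory information.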