October 2020
Volume 20, Issue 11
Open Access
Vision Sciences Society Annual Meeting Abstract  |   October 2020
Face movements temporally decouple the transmission of emotion category and intensity information
Author Affiliations & Notes
  • Chaona Chen
    School of Psychology, University of Glasgow, Scotland, UK
  • Daniel S. Messinger
    Department of Psychology, University of Miami, USA
  • Yaocong Duan
    School of Psychology, University of Glasgow, Scotland, UK
  • Robin A.A. Ince
    Institute of Neuroscience and Psychology, University of Glasgow, Scotland, UK
  • Oliver G.B. Garrod
    Institute of Neuroscience and Psychology, University of Glasgow, Scotland, UK
  • Philippe G. Schyns
    School of Psychology, University of Glasgow, Scotland, UK
    Institute of Neuroscience and Psychology, University of Glasgow, Scotland, UK
  • Rachael E. Jack
    School of Psychology, University of Glasgow, Scotland, UK
    Institute of Neuroscience and Psychology, University of Glasgow, Scotland, UK
  • Footnotes
    Acknowledgements  The European Research Council (FACESYNTAX; 75858), the Economic and Social Research Council (ES/K001973/1 and ES/K00607X/1), and the British Academy (SG113332) supported this work.
Journal of Vision October 2020, Vol.20, 686. doi: https://doi.org/10.1167/jov.20.11.686
      Chaona Chen, Daniel S. Messinger, Yaocong Duan, Robin A.A. Ince, Oliver G.B. Garrod, Philippe G. Schyns, Rachael E. Jack; Face movements temporally decouple the transmission of emotion category and intensity information. Journal of Vision 2020;20(11):686. doi: https://doi.org/10.1167/jov.20.11.686.

      © ARVO (1962-2015); The Authors (2016-present)
Abstract

Identifying the intensity of other people’s emotions is critical for effective social interaction because it substantially influences adaptive responses. Although humans regularly use facial expressions to communicate variations in emotional intensity, it remains unknown which specific face movements convey this fundamental social information. Here, we address this knowledge gap using a data-driven approach that combines a novel face movement generator, reverse correlation, and subjective perception to model the specific face movements that drive perceptions of high emotional intensity in the six classic emotions – ‘happy,’ ‘surprise,’ ‘fear,’ ‘disgust,’ ‘anger’ and ‘sad.’ On each experimental trial, we generated a random facial animation by randomly sampling a set of individual face movements called Action Units (AUs) and assigning random temporal dynamics (e.g., acceleration, peak latency) to each AU. Each of 60 participants (Western; 31 female; mean age = 22 years) completed 2,400 such trials, categorizing each facial animation according to one of the six emotions and rating its intensity on a 5-point scale (‘very weak’ to ‘very strong’). Following the experiment, we measured the relationship between the face movements presented on each trial and the participants’ emotion and intensity responses using an information-theoretic analysis based on mutual information. Our results revealed that a specific subset of face movements conveys emotional intensity across the six emotions: expansion face movements (e.g., Mouth Stretch, AU27) signal intensity in both positive and negative emotions, whereas contraction face movements (e.g., Nose Wrinkler, AU9) do so exclusively in negative emotions. We further observed this pattern in a broader set of 18 complex emotions (e.g., shame, embarrassment, excitement), thereby validating our results.
Here, we identify for the first time the specific face movements that convey emotional intensity, and show that this fundamental social information is signaled using a latent expressive pattern of expansion and contraction face movements.
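The mutual-information analysis described above can be illustrated with a minimal sketch. This is not the authors' actual pipeline: the data are simulated, and `au27` (whether Mouth Stretch was present on a trial) and `rating` (the 5-point intensity response) are hypothetical stand-ins for the trial variables. It shows how discrete mutual information quantifies whether an Action Unit's presence is informative about intensity ratings.

```python
# Hedged sketch of a reverse-correlation MI analysis (simulated data).
import math
import random
from collections import Counter

def mutual_information(xs, ys):
    """Discrete mutual information (in bits) between two paired label sequences."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))   # joint counts
    px = Counter(xs)             # marginal counts for X
    py = Counter(ys)             # marginal counts for Y
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) / (p(x) p(y)) simplifies to c * n / (px[x] * py[y])
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi

random.seed(0)
# Simulate 2,400 trials: AU27 present on roughly half of them, with
# (artificially) higher intensity ratings when it is present.
au27 = [random.random() < 0.5 for _ in range(2400)]
rating = [random.choice([4, 5] if present else [1, 2, 3]) for present in au27]

print(round(mutual_information(au27, rating), 3))  # clearly above 0: AU27 carries intensity information
```

In the simulation, ratings perfectly separate present from absent trials, so the MI approaches the ~1 bit of entropy in AU27's presence; for an uninformative AU the MI would be near zero.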
