September 2021
Volume 21, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract
Social Trait Facial Expressions Comprise Latent Affective Facial Signals
Author Affiliations & Notes
  • Laura B. Hensel
    University of Glasgow
  • Oliver G. B. Garrod
    University of Glasgow
  • Philippe G. Schyns
    University of Glasgow
  • Rachael E. Jack
    University of Glasgow
  • Footnotes
    Acknowledgements  Supported by [ES/P000681/1; ES/K001973/1; ES/K00607X/1]; British Academy [SG113332]; ERC [FACESYNTAX; 75858]; Wellcome Trust [107802]; Multidisciplinary University Research Initiative/Engineering and Physical Sciences Research Council [USA, UK; 172046-01]
Journal of Vision September 2021, Vol.21, 1988. doi:https://doi.org/10.1167/jov.21.9.1988
      Laura B. Hensel, Oliver G. B. Garrod, Philippe G. Schyns, Rachael E. Jack; Social Trait Facial Expressions Comprise Latent Affective Facial Signals. Journal of Vision 2021;21(9):1988. doi: https://doi.org/10.1167/jov.21.9.1988.

Abstract

Theories of social perception posit that the perception of social traits from faces, such as trustworthiness and dominance, is an overgeneralization of emotion perception (e.g., Montepare & Dobish, 2003), suggesting a latent affective signaling structure. Here, we test this hypothesis using a data-driven approach to model dynamic facial expressions of four key social traits – trustworthiness, warmth, dominance, and competence – and six classic emotions – happiness, surprise, fear, disgust, anger, and sadness – and to compare their face movement patterns. On each experimental trial, we generated a random facial animation comprising a random subset of individual dynamic face movements (Action Units, AUs; Ekman & Friesen, 1978; see Yu et al., 2012 for details). Sixty participants (white Western, 31 women; mean age = 22 ± 1.71 years) each categorized 2,400 such facial animations (sex-balanced, same-ethnicity) according to the six emotions (see Jack et al., 2014). A separate participant group (five white Western participants, three women; mean age = 24.0 ± 5.2 years) rated 2,400 facial animations on each of the four social traits on a 7-point scale (e.g., 'extremely dominant' to 'extremely submissive') in separate tasks. We then built per-participant dynamic facial expression models of each emotion and social trait by measuring the statistical relationship between the face movements presented on each trial and the participant's responses, using Monte Carlo simulations and a one-tailed test (p < .05). Analysis of the resulting facial expression models revealed systematic commonalities between social traits and emotions, in line with the overgeneralization hypothesis. Specifically, positive social traits such as trustworthiness and warmth share Lip Corner Puller with positive emotions such as happiness, while negative social traits such as dominance share Nose Wrinkler and Upper Lip Raiser with negative emotions such as anger and disgust.
Our results enhance understanding of social face perception by revealing the latent expressive patterns across social traits and emotions.
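The per-participant modeling step described above – relating which AUs were active on each trial to the participant's responses via Monte Carlo simulations and a one-tailed test – can be sketched roughly as follows. This is a minimal illustration on synthetic data; the AU count, the association measure, and the number of simulations are assumptions for illustration, not the authors' exact procedure.

```python
# Hypothetical sketch: for each Action Unit (AU), test whether its presence on
# a trial predicts the participant's response, using a Monte Carlo permutation
# null and a one-tailed test at p < .05. All specifics here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

n_trials, n_aus = 2400, 42                          # 2,400 trials per participant; AU count assumed
aus = rng.integers(0, 2, size=(n_trials, n_aus))    # which AUs were active on each trial
responses = rng.integers(0, 2, size=n_trials)       # 1 = trial judged as the target emotion/trait pole

def au_association(aus, responses):
    """Observed association per AU: P(AU active | positive response) - P(AU active)."""
    return aus[responses == 1].mean(axis=0) - aus.mean(axis=0)

observed = au_association(aus, responses)

# Monte Carlo null: shuffle responses to break any AU-response relationship
n_sims = 1000
null = np.empty((n_sims, n_aus))
for i in range(n_sims):
    null[i] = au_association(aus, rng.permutation(responses))

# One-tailed p-value: fraction of null simulations matching or exceeding the observed association
p = (null >= observed).mean(axis=0)
significant_aus = np.flatnonzero(p < .05)           # AUs entering this participant's model
```

With real data, the AUs surviving this threshold for, say, a participant's "trustworthy" ratings would constitute that participant's dynamic facial expression model of trustworthiness, and the models can then be compared across traits and emotions.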
