September 2011
Volume 11, Issue 11
Vision Sciences Society Annual Meeting Abstract  |   September 2011
Dynamic Cultural Representations of Facial Expressions of Emotion are not Universal
Author Affiliations
  • Rachael Jack
    Institute of Neuroscience and Psychology (INP), University of Glasgow, United Kingdom, G12 8QB
    Centre for Cognitive Neuroimaging (CCNi), University of Glasgow, United Kingdom, G12 8QB
  • Oliver Garrod
    Institute of Neuroscience and Psychology (INP), University of Glasgow, United Kingdom, G12 8QB
    Centre for Cognitive Neuroimaging (CCNi), University of Glasgow, United Kingdom, G12 8QB
  • Hui Yu
    Institute of Neuroscience and Psychology (INP), University of Glasgow, United Kingdom, G12 8QB
    Centre for Cognitive Neuroimaging (CCNi), University of Glasgow, United Kingdom, G12 8QB
  • Roberto Caldara
    Department of Psychology, University of Fribourg, Switzerland
  • Philippe Schyns
    Institute of Neuroscience and Psychology (INP), University of Glasgow, United Kingdom, G12 8QB
    Centre for Cognitive Neuroimaging (CCNi), University of Glasgow, United Kingdom, G12 8QB
Journal of Vision September 2011, Vol.11, 563. doi:10.1167/11.11.563
© ARVO (1962-2015); The Authors (2016-present)
Abstract

Six ‘universal’ facial expressions – ‘Happy,’ ‘Surprise,’ ‘Fear,’ ‘Disgust,’ ‘Anger,’ and ‘Sadness’ – are defined by specific, static patterns of facial muscle activation (Facial Action Coding System codes, FACS). However, systematic differences in facial expression recognition between Western Caucasians (WC) and East Asians (EA) question the notion of universality, raising a new question: How do different cultures represent facial expressions? Here, we derived culture-specific models of facial expressions using state-of-the-art 4D imaging (dynamics of 3D face shape and texture) combined with reverse correlation techniques. Specifically, we modelled 41 core Action Units (AUs, groups of facial muscles) from certified FACS coders and parameterized each using 6 temporal parameters (peak amplitude; peak latency; onset latency; offset latency; acceleration; deceleration). The 41 AUs and their parameters formed the basis of a pseudo-random generative model of expressive signals. On each trial, we pseudo-randomly selected parametric values for each AU, producing an expressive facial animation (see Figure S1 in Supplementary Material). Ten WC and 10 EA naïve observers each categorized 9,600 such animations according to the 6 emotion categories listed above and rated the perceived intensity of the emotion (see Figure S1 in Supplementary Material). We then reverse correlated the dynamic properties of the AUs with the emotion categories they elicited, producing “dynamic classification models” (i.e., expected 4D face information) per emotion and observer. Analyses of the models reveal clear cultural contrasts in (a) the presence/absence of specific AUs predicting the reported EA miscategorizations and (b) radically different temporal dynamics of emotional expression whereby EA observers expect “smoother” emotional displays with lower acceleration and amplitude (see link in Supplementary Material for example videos).
For the first time, we reveal cultural diversity in the dynamic signals representing each basic emotion, demonstrating that the complexities of emotion cannot adequately be reduced to a single set of static ‘universal’ signals.
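The reverse-correlation logic described in the abstract can be illustrated with a minimal sketch: pseudo-random AU parameter values are generated per trial, an observer labels each resulting animation, and the "dynamic classification model" for an emotion is the average of the AU parameters over the trials assigned to that category. All names, dimensions, sampling distributions, and the simulated responses below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

N_AUS = 41        # core Action Units (per the abstract)
N_PARAMS = 6      # peak amplitude; peak/onset/offset latency; accel.; decel.
N_TRIALS = 9600   # animations categorized per observer (per the abstract)
EMOTIONS = ["happy", "surprise", "fear", "disgust", "anger", "sadness"]

# Each trial: pseudo-randomly selected parameter values for every AU.
# Uniform sampling here is an assumption for illustration only.
stimuli = rng.uniform(0.0, 1.0, size=(N_TRIALS, N_AUS, N_PARAMS))

# Simulated observer responses: one emotion label per trial (random here;
# in the actual experiment these come from a human observer).
responses = rng.integers(0, len(EMOTIONS), size=N_TRIALS)

# Reverse correlation: average the AU parameters of all trials the
# observer assigned to each emotion category, yielding one expected
# dynamic profile (41 AUs x 6 temporal parameters) per emotion.
models = {
    emo: stimuli[responses == i].mean(axis=0)
    for i, emo in enumerate(EMOTIONS)
}

print(models["happy"].shape)  # (41, 6)
```

With real observer data, per-emotion models computed this way can then be compared across cultural groups, e.g. by testing which AUs are present or absent and how their temporal parameters differ.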

The Economic and Social Research Council and Medical Research Council (ESRC/MRC-060-25-0010). 