September 2021
Volume 21, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2021
Psychophysical decoding of 4D dynamic spontaneous facial emotions.
Author Affiliations & Notes
  • Adelaide L. Burt
    Swinburne University of Technology
  • David P. Crewther
    Menzies Health Institute, Queensland
  • Footnotes
    Acknowledgements  Swinburne University of Technology
Journal of Vision September 2021, Vol.21, 1841. doi:https://doi.org/10.1167/jov.21.9.1841
Abstract

Emotion recognition studies have chiefly displayed posed facial expressions as two-dimensional stimuli: static images, photographs, and video recordings. Recently, facial stimuli have been reproduced as 4D spatiotemporal patterns. Thus, spontaneous, highly realistic face stimuli are now available, in which 3D volume and surface models produce dynamic movements across time (BU-4D-S; Zhang et al., 2013; 2014). The aim of the present study was to investigate emotion recognition of 4D spontaneous, dynamic expressions. Thirteen healthy adult participants performed a four-alternative forced-choice (4AFC) task, classifying facial stimuli presented in a standard upright or inverted position as happy, angry, fearful, or disgusted expressions. A generalized linear mixed model was fitted to emotion recognition accuracy and response times, with fixed factors of emotion and position and a random factor of individual subjects. The resulting model demonstrated that emotion recognition accuracy was significantly higher for happy expressions (83.93%; M ≈ 50 ms) than for disgust (61%; M = 57 ms, p < .001), anger (32.58%; M = 74 ms, p < .001), or fear (39.14%; M = 65 ms, p < .001), with correspondingly faster reaction times. In the inverted position, reaction times showed no significant differences, and only inverted disgust expressions elicited a significant decrease in accuracy. Coding with the Facial Action Coding System (FACS; Ekman & Friesen, 1978) demonstrated that the happy expression stimuli presented significantly increased activations overall compared to the other emotions, specifically in the mouth region. Consistent with prior meta-analyses that likewise describe a psychophysical happiness-superiority effect for facial emotions (Calvo et al., 2016), we substantiate that this effect extends to 4D dynamic spontaneous expressions, where superior detection of happy expressions can be linked to increased smiling behaviors, that is, greater mouth-region activation.
We conclude that 4D dynamic and spontaneous facial stimuli are critical to understanding how we naturalistically detect emotional expressions.
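The per-emotion accuracy and mean reaction times reported above come from scoring a 4AFC task before model fitting. A minimal sketch of that scoring step is shown below; the trial records and values are hypothetical illustrations, not the study's data, and the actual analysis fitted a generalized linear mixed model in statistical software.

```python
from collections import defaultdict

# Hypothetical trial records: (true emotion, participant response, RT in ms).
# Illustrative values only; not the study's data.
trials = [
    ("happy", "happy", 48), ("happy", "happy", 52),
    ("disgust", "disgust", 57), ("disgust", "fear", 60),
    ("anger", "fear", 74), ("anger", "anger", 75),
    ("fear", "fear", 65), ("fear", "anger", 66),
]

def score_4afc(trials):
    """Compute per-emotion (accuracy, mean RT) for a 4AFC emotion task."""
    hits = defaultdict(int)      # correct responses per true emotion
    counts = defaultdict(int)    # total trials per true emotion
    rt_sum = defaultdict(float)  # summed reaction times per true emotion
    for truth, response, rt in trials:
        counts[truth] += 1
        rt_sum[truth] += rt
        if response == truth:
            hits[truth] += 1
    return {
        emo: (hits[emo] / counts[emo], rt_sum[emo] / counts[emo])
        for emo in counts
    }

print(score_4afc(trials))
```

These per-condition summaries would then enter the mixed model as the accuracy and response-time outcomes, with emotion and position as fixed factors and subject as a random factor.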
