September 2024
Volume 24, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2024
Third Social Pathway computes Dynamic Action Unit Features for Emotion Decision Behavior
Author Affiliations & Notes
  • Yuening Yan
    University of Glasgow
  • Jiayu Zhan
    Peking University
  • Oliver Garrod
    University of Glasgow
  • Robin A.A. Ince
    University of Glasgow
  • Rachael Jack
    University of Glasgow
  • Philippe Schyns
    University of Glasgow
  • Footnotes
    Acknowledgements  This work was funded by the Wellcome Trust (Senior Investigator Award, UK; 107802) and the Multidisciplinary University Research Initiative/Engineering and Physical Sciences Research Council (USA, UK; 172046-01), awarded to P.G.S.; and the Wellcome Trust [214120/Z/18/Z], awarded to R.A.A.I.
Journal of Vision September 2024, Vol. 24, 354. https://doi.org/10.1167/jov.24.10.354
© ARVO (1962-2015); The Authors (2016-present)
Abstract

Faces convey stable identity via static 3D shape and complexion features, and transient emotions via dynamic movement features (i.e., Action Units, AUs). Using a transparent generative Virtual Human (VH), we studied how brain pathways dynamically compute (i.e., represent, communicate, and integrate) AUs and 3D identity features for emotion decisions. In a behavioral task, the generative VH presented randomly parametrized AUs applied to 2,400 random 3D identities. This produced a different animation per trial, which each participant (N=10) categorized as one of six emotions (happy, surprise, fear, disgust, anger, sad). From each participant's responses, we modelled the AUs causing their perception of each emotion. In a subsequent neuroimaging session, each participant categorized their own emotion models applied to 8 new identities while we randomly varied each AU's amplitude and concurrently measured MEG. Using information-theoretic analyses, we traced where and when MEG source amplitudes represent each AU, and how sources then integrate AUs for decisions. We compared these representations to covarying but decision-irrelevant 3D face identities. Our results replicate across all participants (p<0.05, FWER-corrected): (1) the Social Pathway (Occipital Cortex to Superior Temporal Gyrus) directly represents AUs with time lags, with no ventral-pathway involvement; (2) AUs represented early are maintained until the Superior Temporal Gyrus (STG) integrates them with later AUs. In contrast, emotion-irrelevant 3D identities are reduced early, within the Occipital Cortex (OC). In summary, we show that the third "Social" brain pathway (not the dorsal pathway) dynamically represents facial Action Units with time lags that are resorbed by the time they reach STG, where they are integrated for emotion decision behavior, while the irrelevant 3D face identity is not represented beyond OC.
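The information-theoretic tracing described above relates each randomly varied AU amplitude to MEG source amplitudes, source by source and timepoint by timepoint. As an illustration only (the synthetic data, the binned plug-in estimator, and all variable names below are assumptions for exposition, not the authors' actual pipeline), a minimal mutual-information sketch:

```python
import numpy as np

def binned_mutual_info(x, y, bins=8):
    """Plug-in mutual information estimate (in bits) from a 2D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()            # joint distribution P(x, y)
    px = pxy.sum(axis=1, keepdims=True)  # marginal P(x)
    py = pxy.sum(axis=0, keepdims=True)  # marginal P(y)
    nz = pxy > 0                         # skip empty cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
n_trials = 2000
au_amplitude = rng.uniform(0.0, 1.0, n_trials)    # randomly varied AU amplitude per trial
meg_source = au_amplitude + 0.3 * rng.standard_normal(n_trials)  # source tracking the AU
control_source = rng.standard_normal(n_trials)    # source unrelated to the AU

mi_related = binned_mutual_info(au_amplitude, meg_source)
mi_control = binned_mutual_info(au_amplitude, control_source)
print(f"MI(AU; tracking source) = {mi_related:.2f} bits")
print(f"MI(AU; control source)  = {mi_control:.2f} bits")
```

In the actual study, per-source, per-timepoint information estimates would be thresholded with permutation-based FWER correction (p<0.05); the toy comparison here only illustrates that a source tracking an AU carries measurably more information about it than one that does not.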
