September 2019, Volume 19, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
The Neural Underpinning of Abstracting Emotion from Facial Expressions
Author Affiliations & Notes
  • Yi-Chen Kuo
    Department of Psychology and Center for Research in Cognitive Sciences, National Chung Cheng University, Chiayi, Taiwan
  • Ya-Yun Chen
    Department of Psychology and Center for Research in Cognitive Sciences, National Chung Cheng University, Chiayi, Taiwan
  • Gary C.-W. Shyi
    Department of Psychology and Center for Research in Cognitive Sciences, National Chung Cheng University, Chiayi, Taiwan
    Advanced Institute of Manufacturing with High-tech Innovations, National Chung Cheng University, Chiayi, Taiwan
Journal of Vision September 2019, Vol.19, 182. doi:https://doi.org/10.1167/19.10.182
      Yi-Chen Kuo, Ya-Yun Chen, Gary C.-W. Shyi; The Neural Underpinning of Abstracting Emotion from Facial Expressions. Journal of Vision 2019;19(10):182. https://doi.org/10.1167/19.10.182.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Abstraction bridges perception and cognition, minimizing information storage while speeding up processing. In our previous study (Kuo, Shyi, & Chen, 2017, VSS), we found behavioral evidence suggesting that people apply an abstraction strategy of image-label conversion (ILC) when asked to judge others’ facial expressions. Some neuroimaging studies have revealed brain regions related to labeling, and others have demonstrated areas underlying the processing of facial expressions. However, the cortical representations underlying facial expression labeling have not been probed directly. In the present study, we compared cortical activations across three conditions, BaseFace, BaseLabel, and FaceCue, to investigate the brain regions involved in the process of ILC. In the BaseFace condition, participants matched facial expressions of the same identity. In the BaseLabel condition, they had to choose, from a pair of affective labels, the one that matched a previously displayed facial expression, which presumably would prompt them to apply the ILC strategy. Finally, in the FaceCue condition, participants matched two faces of different identities exhibiting the same expression, which may involve applying the ILC twice. Contrasts of brain activation among the three conditions showed that the right ventrolateral prefrontal cortex (rVLPFC), left fusiform gyrus (lFFG), dorsolateral prefrontal cortex (DLPFC), and anterior cingulate cortex (ACC) were significantly activated when participants applied the ILC. Moreover, the ROI analyses showed that (a) activation in the rVLPFC was negatively correlated with accuracy in the conditions where the ILC was used, (b) activation in the right DLPFC (rDLPFC) was negatively correlated with accuracy in the BaseLabel condition under the BaseLabel > BaseFace contrast, and (c) activation of the lFFG negatively predicted performance in both the FaceCue and BaseLabel conditions. Taken together, these findings highlight a network of cortical representations that may underpin the abstraction of emotion from facial expressions.
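For readers who want a concrete picture of the ROI analyses summarized above, the following is a minimal sketch, in Python, of the kind of correlation computed between per-participant ROI activation and behavioral accuracy. It is illustrative only: it assumes contrast beta estimates have already been extracted from the fMRI data, and all variable names and numbers are hypothetical placeholders, not the study's data.

import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-participant contrast estimates (betas) for the rVLPFC ROI
# and the corresponding behavioral accuracy in a condition where the ILC is used.
rvlpfc_betas = np.array([0.82, 1.10, 0.45, 0.95, 0.60, 1.30, 0.75, 0.50])
accuracy = np.array([0.88, 0.79, 0.95, 0.83, 0.91, 0.74, 0.86, 0.93])

# Pearson correlation between ROI activation and accuracy; a significant
# negative r would mirror the reported pattern of greater rVLPFC activation
# accompanying lower accuracy.
r, p = pearsonr(rvlpfc_betas, accuracy)
print(f"rVLPFC activation vs. accuracy: r = {r:.2f}, p = {p:.3f}")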
