December 2022
Volume 22, Issue 14
Open Access
Vision Sciences Society Annual Meeting Abstract
Factors Contributing to Facial Emotion Recognition Ability
Author Affiliations
  • Margaret Wise
    Naval Submarine Medical Research Laboratory
  • Krystina Diaz
    Naval Submarine Medical Research Laboratory
  • Sylvia Guillory
    Naval Submarine Medical Research Laboratory
  • Jeffrey Bolkhovsky
    Naval Submarine Medical Research Laboratory
  • Chad Peltier
Journal of Vision December 2022, Vol.22, 4325.
Margaret Wise, Krystina Diaz, Sylvia Guillory, Jeffrey Bolkhovsky, Chad Peltier; Factors Contributing to Facial Emotion Recognition Ability. Journal of Vision 2022;22(14):4325.

© ARVO (1962-2015); The Authors (2016-present)

Accurate facial emotion recognition is a critical component of social cognition and functioning, especially in occupations involving high-stakes social interactions (e.g., police officers, military personnel). Despite research suggesting the presence of demographic biases (e.g., race and sex) in the recognition of positive facial emotions, few studies have examined these demographic features as mediators of emotion recognition performance for negative emotions. Twenty-four participants (12 female; mean age = 31.04, SD = 10.82) performed a computer-based Emotion Recognition Task (ERT) as part of a larger cognitive test battery, classifying the emotions portrayed in 40 different images of facial expressions from the NimStim Facial Expressions Set. Each image depicted a model of a given race (White, Black, or Asian) and sex (male or female) making an expression conveying one of five states: happiness, sadness, anger, fear, or no emotion. The order of image presentation was randomized. Although no significant differences in classification performance were detected between stimulus racial groups (p = .40) or sexes (p = .98), differences in classification accuracy across stimulus emotions were observed. Specifically, fear was classified less accurately than anger, happiness, and sadness (p < .05). Fearful faces were the greatest source of error (accounting for approximately 44% of total errors), most often mistaken for sadness (46%) or for no emotion (32%). Fearful faces also generally took longer to recognize (p < .05). Sixty-one percent of the errors in identifying sad faces came from categorizing them as angry. The increased frequency with which participants misinterpreted fearful expressions as sad or no emotion, and sad expressions as angry, could have concerning societal implications. In particular, during encounters that do not allow ample processing time before an individual responds behaviorally, negative or neutral emotions such as fearful, neutral, or sad expressions may be perceived as threatening.

