Vision Sciences Society Annual Meeting Abstract  |  September 2016
Journal of Vision, Volume 16, Issue 12 (Open Access)
Color changes in facial expressions of emotion are consistent within emotion and differential between emotions
Author Affiliations
  • Aleix Martinez
    The Ohio State University
  • C. Fabian Benitez-Quiroz
    The Ohio State University
  • Pamela Pallett
    The Ohio State University
  • Angela Brown
    The Ohio State University
  • Delwin Lindsey
    The Ohio State University
Journal of Vision September 2016, Vol.16, 1383. doi:https://doi.org/10.1167/16.12.1383
Citation: Aleix Martinez, C. Fabian Benitez-Quiroz, Pamela Pallett, Angela Brown, Delwin Lindsey; Color changes in facial expressions of emotion are consistent within emotion and differential between emotions. Journal of Vision 2016;16(12):1383. https://doi.org/10.1167/16.12.1383.

© ARVO (1962-2015); The Authors (2016-present)

Abstract

Human faces are colorful, and specific local changes in luminance and chromaticity occur when a person expresses emotion. The relatively small amount of hair on our faces and the three types of cone cells in our retina allow us to perceive the skin color changes that accompany the expression of emotion (Changizi, et al., 2006). Despite the importance of color vision to humans, there is a surprising gap in our understanding of the role of color vision in the perception of facial expressions of emotion. Here, we examine the hypothesis that specific modulations of facial chromaticity and luminance are correlated with the expression of each emotion category, and that these color changes differ between emotion categories. We analyzed the images of more than 184 individuals, who displayed twenty-one distinct basic and compound facial expressions of emotion. For each individual, we computed the chromatic and luminance changes from their neutral expression to the apices of the 21 emotions in MacLeod-Boynton (1979) space. Machine learning algorithms then identified color changes that were consistently used in the expression of a particular emotion and were different from those seen in other emotions. A ten-fold cross-validation analysis demonstrated that the identified color patterns can discriminate the images of the 21 emotion categories with about 52% accuracy (chance: 4.7%). These color features also provided information independent from that of Gabor filters and shape features, which are the image features most typically used in models of the visual analysis of facial expressions of emotion. We also examined the roles of the different color components associated with each emotion category, which provided information on the possible role of low-level chromatic and achromatic channels in the subsequent neural processing of the face images of different emotions.
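The classification scheme described above — per-face color-change features, a classifier trained to separate the 21 emotion categories, and ten-fold cross-validation scored against chance (1/21 ≈ 4.8%) — can be illustrated with a small sketch. This is not the authors' pipeline: the feature layout (three facial regions × three channels: MacLeod-Boynton l = L/(L+M), s = S/(L+M), and luminance), the synthetic data, and the nearest-centroid classifier are all illustrative assumptions standing in for the unspecified machine learning algorithms.

```python
import random

random.seed(0)

# Hypothetical feature layout (not the authors' actual features): each face is
# summarized by its neutral-to-apex color change in 3 facial regions x 3
# channels (MacLeod-Boynton l, s, and luminance), i.e. a 9-dimensional vector.
N_CLASSES = 21            # basic + compound emotion categories
DIM = 9
SAMPLES_PER_CLASS = 40
NOISE = 0.6

# Assume each emotion category has a characteristic color-change pattern;
# individual faces are noisy samples around that prototype (synthetic data).
prototypes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(N_CLASSES)]
data = []
for label, proto in enumerate(prototypes):
    for _ in range(SAMPLES_PER_CLASS):
        data.append(([m + random.gauss(0, NOISE) for m in proto], label))
random.shuffle(data)

def nearest_centroid_accuracy(train, test):
    # Fit: mean color-change vector per emotion; predict by nearest centroid.
    sums = {}
    for x, y in train:
        s, n = sums.get(y, ([0.0] * DIM, 0))
        sums[y] = ([a + b for a, b in zip(s, x)], n + 1)
    centroids = {y: [v / n for v in s] for y, (s, n) in sums.items()}
    correct = 0
    for x, y in test:
        pred = min(centroids,
                   key=lambda c: sum((a - b) ** 2
                                     for a, b in zip(x, centroids[c])))
        correct += (pred == y)
    return correct / len(test)

# Ten-fold cross-validation, as in the abstract.
k = 10
fold = len(data) // k
accs = []
for i in range(k):
    test = data[i * fold:(i + 1) * fold]
    train = data[:i * fold] + data[(i + 1) * fold:]
    accs.append(nearest_centroid_accuracy(train, test))

chance = 1 / N_CLASSES    # ~4.8% for 21 categories
mean_acc = sum(accs) / k
print(f"mean CV accuracy: {mean_acc:.2f} (chance: {chance:.3f})")
```

With consistent within-category patterns that differ between categories, the cross-validated accuracy lands well above chance; if the color changes carried no category information, it would hover near 1/21, which is the comparison the abstract's 52%-vs-4.7% result makes.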

Meeting abstract presented at VSS 2016
