Abstract
Human faces are colorful, and specific local changes in luminance and chromaticity occur when a person expresses emotion. The relatively small amount of hair on our faces and the three types of cone cells in our retinas allow us to perceive the skin color changes that accompany the expression of emotion (Changizi et al., 2006). Despite the importance of color vision to humans, there is a surprising gap in our understanding of its role in the perception of facial expressions of emotion. Here, we examine the hypothesis that specific modulations of facial chromaticity and luminance are correlated with the expression of each emotion category, and that these color changes differ between emotion categories. We analyzed images of more than 184 individuals displaying 21 distinct basic and compound facial expressions of emotion. For each individual, we computed the chromatic and luminance changes from the neutral expression to the apex of each of the 21 expressions in MacLeod-Boynton (1979) space. Machine learning algorithms then identified color changes that were consistently used in the expression of a particular emotion and differed from those seen in other emotions. A ten-fold cross-validation analysis demonstrated that the identified color patterns can discriminate the images of the 21 emotion categories with about 52% accuracy (chance: 4.7%). These color features also provided information independent of that carried by Gabor filters and shape features, the image features most typically used in models of the visual analysis of facial expressions of emotion. We also examined the role of the different color components associated with each emotion category, which provided information on the possible contribution of low-level chromatic and achromatic channels to the subsequent neural processing of the face images of different emotions.
Meeting abstract presented at VSS 2016
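The abstract does not specify how the color-change features were extracted or which classifier was used. The Python sketch below is only an illustration, under stated assumptions, of the kind of pipeline described: per-region MacLeod-Boynton luminance and chromaticity changes from a neutral image to an apex image, scored with ten-fold cross-validation. The LMS cone-excitation inputs, the region parcellation, and the linear SVM are illustrative placeholders, not the authors' method.

```python
# Illustrative sketch (not the authors' code): MacLeod-Boynton color-change
# features and a 10-fold cross-validation estimate of classification accuracy.
# Assumes face images have already been converted to LMS cone excitations and
# parcelled into local facial regions; the linear SVM is a stand-in classifier.
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC


def macleod_boynton(lms):
    """Map LMS cone excitations (..., 3) to MacLeod-Boynton (1979) coordinates.

    Returns luminance (L + M), l = L / (L + M), and s = S / (L + M).
    """
    L, M, S = lms[..., 0], lms[..., 1], lms[..., 2]
    lum = L + M
    return lum, L / lum, S / lum


def color_change_features(lms_neutral, lms_apex):
    """Per-region change in luminance and chromaticity from neutral to apex.

    Both inputs have shape (n_regions, 3); the output concatenates the three
    per-region difference vectors into a single feature vector.
    """
    lum0, l0, s0 = macleod_boynton(lms_neutral)
    lum1, l1, s1 = macleod_boynton(lms_apex)
    return np.concatenate([lum1 - lum0, l1 - l0, s1 - s0])


def cv_accuracy(X, y, n_folds=10, seed=0):
    """Mean 10-fold cross-validated accuracy of a linear classifier.

    X holds one feature vector per (subject, expression) image and y the 21
    category labels; with balanced classes, chance is 1/21, about 4.7%.
    """
    clf = make_pipeline(StandardScaler(), LinearSVC())
    cv = StratifiedKFold(n_splits=n_folds, shuffle=True, random_state=seed)
    return cross_val_score(clf, X, y, cv=cv).mean()
```

In this sketch the chromatic (l, s) and achromatic (L + M) components are kept as separate blocks of the feature vector, so the contribution of each component to classification can be inspected by retraining on one block at a time, in the spirit of the component analysis mentioned in the abstract.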