Abstract
Facial expressions of emotion in humans are believed to be produced by contracting one's facial muscles, generally called action units. Yet, the surface of the face is also supplied by a large network of blood vessels. Blood flow variations in these vessels yield visible color changes on the face. Here, we study the hypothesis that these visible facial colors allow emotion to be successfully transmitted and visually interpreted even in the absence of facial muscle activation. To study this hypothesis, we address the following questions. Are observable facial colors consistent within and differential between emotion categories? Are observable facial colors consistent within and differential between positive and negative emotions (valence)? And does the human visual system use these facial colors to decode emotion categories and valence from faces? These questions have, to the authors' knowledge, never been assessed, yet they suggest the existence of an important, unexplored mechanism for the production of facial expressions of emotion by a sender and their visual interpretation by an observer. The results of our studies provide the first evidence in favor of our hypothesis. Specifically, we use a machine learning algorithm to identify the most descriptive color features associated with each emotion category. This allows us to change the color of neutral face images (i.e., faces without any facial muscle movement) to match the color of specific emotions. Showing these images to human subjects demonstrates that people perceive the correct emotion category (and valence) on the face even in the absence of any muscle movement. We also demonstrate that this color signal is independent of that provided by facial muscle movements. These results support a revised model of the perception of facial expressions of emotion in which facial color is an effective mechanism to visually transmit and decode emotion.
Meeting abstract presented at VSS 2018
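The recoloring step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' method: the emotion-specific color shifts below are invented placeholder values (the actual features were learned by a machine learning algorithm), and the sketch simply assumes the face is represented in an opponent-like chromatic space where luminance, which carries shape and muscle-movement cues, is stored separately and left untouched.

```python
import numpy as np

# Hypothetical per-emotion chromatic shifts (red-green, yellow-blue).
# Illustrative values only; NOT the learned color features from the study.
EMOTION_COLOR_SHIFTS = {
    "happy":   np.array([+4.0, +2.0]),
    "disgust": np.array([-3.0, +1.5]),
}

def recolor_neutral_face(face_ab, emotion, strength=1.0):
    """Shift the chromatic channels of a neutral face toward an
    emotion's color signature.

    face_ab: H x W x 2 array of chromatic channels (a*-, b*-like).
    Luminance is assumed to be stored elsewhere and is not modified,
    so no muscle-movement (shape/shading) information changes.
    """
    shift = EMOTION_COLOR_SHIFTS[emotion] * strength
    return face_ab + shift  # broadcast the 2-vector over all pixels

# Usage: recolor a (toy) neutral face toward the "happy" signature.
neutral = np.zeros((4, 4, 2))          # stand-in for a neutral face's chroma
happy_face = recolor_neutral_face(neutral, "happy")
```

Because only the chromatic channels are altered, a stimulus built this way isolates the color signal from the muscle-movement signal, which is the manipulation the abstract's perceptual experiments rely on.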