Abstract
Four experiments suggest a continuous representation of emotions in a norm-based face space (sometimes called a mean-based face space), in which faces are represented as deviations from a mean, or norm, face. In particular, we show that the vertical distance between the eyes, nose and mouth of an individual face is correlated with the perception of anger and sadness in face images displaying a neutral expression. When the vertical distance between the eyes and mouth is made larger than that of the average face (i.e., the average distance within a population), the perception of sadness increases (experiment 1). A decrease of the same distance below that of the average face increases the perception of anger (experiment 1). The perception of anger/sadness does not appear on the opposite side of the mean face (experiment 2): for example, faces whose eye–mouth distance is decreased but remains larger than that of the mean face are not perceived as angrier. The effect is also clear in “face sketches” in which the eyes have been reduced to dots, and the mouth, brows and nose to single lines (experiment 3). In experiment 3, the perception of sadness and anger was almost as strong as in the previous experiments, where real face images were used. Furthermore, in experiments 1–3, the perception of anger/sadness increased proportionally with the feature distance, suggesting a continuous representation in a norm-based face space. A simple inversion of our stimuli disrupts this effect (experiment 4), suggesting that the characteristics modeled by this face space correspond to second-order configural cues.
Supported in part by the National Institutes of Health and the National Science Foundation.