Abstract
Body postures and movements convey important information about affective states. Much existing work has characterized the perception of emotion from bodies and point-light motion, often using rather qualitative or heuristic methods. Recent advances in machine learning and computer animation have opened new possibilities for well-controlled studies of the emotional signals conveyed by the human body and of their visual perception. Moreover, almost no quantitative work exists on the features that underlie the perception of emotions conveyed by the body during interactive behavior. Combining motion capture with a mood-induction paradigm, we systematically studied the expression and perception of emotions in interactive and non-interactive movements. Using machine learning methods together with psychophysical experiments, we identified the kinematic features that characterize emotional movements and investigated how these features drive the visual perception of emotions from the human body.