Abstract
Affect recognition and communication are critical to everyday life. Most emotion research suggests that facial expressions are directly correlated with one's emotional state; however, emerging data show that facial expressions may depend on context. This study used the Basel Face Database and genetic algorithms to create subjective representations of facial expressions within context. Twelve participants generated individualized faces that appeared consistent with standardized contextual scenarios. The scenarios were drawn from Howard Schatz's book “Actors Acting” and were independently rated on a 0-4 scale for 13 primary emotions (Amusement, Anger, Awe, Contempt, Disgust, Embarrassment, Fear, Happiness, Interest, Pride, Sadness, Shame, Surprise). For each primary emotion, the 12 remaining (secondary) emotions were either maximized (condition 1) or minimized (condition 2), yielding 26 scenarios. After completing these 26 scenarios, participants completed a third condition using only the 13 emotion words. For each scenario, 12 faces were presented in each of 6 generations; faces the participant judged successful were randomly recombined by a genetic algorithm to create 6 of the faces in each successive generation. Face coefficients were compared between conditions with ANOVA, which revealed a significant difference (p < .05) between conditions for every emotion except Sadness. These results reveal substantial variability in individuals' perceptions of facial expressions and demonstrate that context plays a vital role in the perception of affect expressed on faces. These findings have important implications for future research elucidating individual differences in typical and atypical emotion representation in a context-sensitive way.
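To make the evolutionary procedure concrete, the sketch below outlines the generation loop the abstract describes: 12 faces per generation, 6 generations per scenario, with 6 faces of each new generation produced by randomly recombining the faces a participant selected. This is a minimal illustration, not the study's implementation: the coefficient dimensionality (N_COEFFS), the uniform-crossover operator, how the remaining 6 slots are filled, and the select_fn stand-in for the participant are all assumptions.

```python
import random

POP_SIZE = 12        # faces shown per generation (from the abstract)
N_GENERATIONS = 6    # generations per scenario (from the abstract)
N_OFFSPRING = 6      # faces per generation produced by recombination
N_COEFFS = 50        # length of a face's coefficient vector (assumed)


def random_face():
    """A face as a vector of face-model coefficients (representation assumed)."""
    return [random.gauss(0.0, 1.0) for _ in range(N_COEFFS)]


def crossover(parent_a, parent_b):
    """Uniform crossover: each coefficient is drawn from a random parent."""
    return [random.choice(pair) for pair in zip(parent_a, parent_b)]


def evolve_faces(select_fn):
    """Run one scenario: 6 generations of 12 faces.

    `select_fn` stands in for the participant and returns the faces
    they judged consistent with the scenario (the 'successful' faces).
    """
    population = [random_face() for _ in range(POP_SIZE)]
    for _ in range(N_GENERATIONS - 1):
        selected = select_fn(population)
        # Crossover needs at least two parents; pad with a random face
        # if the participant selected fewer (handling assumed).
        while len(selected) < 2:
            selected = selected + [random_face()]
        # 6 of the next generation's faces are random recombinations of
        # the selected faces; the remaining slots are refilled with fresh
        # random faces (how these slots were filled is an assumption).
        offspring = [crossover(*random.sample(selected, 2))
                     for _ in range(N_OFFSPRING)]
        fillers = [random_face() for _ in range(POP_SIZE - N_OFFSPRING)]
        population = offspring + fillers
    return population


# Usage with a dummy "participant" that picks three faces at random:
final_generation = evolve_faces(lambda pop: random.sample(pop, 3))
print(len(final_generation), "faces in the final generation")
```

The final generation's coefficient vectors are what would then be compared across the three conditions with ANOVA.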