Alison Campbell, James Tanaka; Individual differences in antisocial and prosocial traits predict perception of dynamic expression. Journal of Vision 2015;15(12):1374. doi: 10.1167/15.12.1374.
Successful everyday social interactions are mediated by the accurate perception of dynamic facial expressions. For example, the decoding of distress cues has been shown to inhibit antisocial behaviour while simultaneously eliciting empathetic responses. The importance of this recognition-behaviour connection has been revealed by psychiatric research demonstrating that disorders characterized by interpersonal deficits are associated with impairments in processing facial affect. This area of research has generated speculation that the neurocognitive mechanisms specific to social behaviour are also pivotally involved in expression recognition, but it is unknown whether this relationship also holds in nonclinical populations. To explore this question, we examined how individual differences in antisocial traits relate to the recognition of facial affect. Antisocial behaviour was assessed using the Inventory of Callous-Unemotional Traits (ICU; Kimonis et al., 2008), a scale that identifies a subgroup of antisocial individuals who are more likely than other antisocial individuals to show deficits in processing emotional stimuli. Perceptual sensitivity to facial affect was probed using a Dynamic Expression Recognition Task (DERT; Deriso et al., 2012), in which participants were shown 75 ms, 150 ms, or 225 ms reveals of a dynamic face morph progressing from a neutral expression to sadness, happiness, anger, fear, surprise, or disgust. The main finding was that participants who scored high in callous-unemotional traits were significantly less accurate in expression recognition than participants who scored low on the ICU. This result supports the hypothesis of a common mechanism underlying affect recognition and antisocial behaviour across clinical and nonclinical groups, while also highlighting the diagnostic potential of facial affect processing.
Meeting abstract presented at VSS 2015