Vision Sciences Society Annual Meeting Abstract | September 2016
Volume 16, Issue 12 | Open Access
You not me: others' emotional facial expressions capture attention automatically – but only for empathic people.
Author Affiliations
  • Christian Wallraven
    Brain and Cognitive Engineering, Korea University
  • June Kang
    Department of Biomedical Science, Korea University
Journal of Vision September 2016, Vol.16, 500. doi:https://doi.org/10.1167/16.12.500
Abstract

Facial expressions are processed effortlessly and quickly, allowing us to assess a person's mood and emotion. In addition, emotional facial expressions are known to preferentially attract attention for visual processing. This preferential attention, however, may be modulated by individual differences in traits: in particular, we speculated that highly empathic people may process emotional expressions more efficiently than non-empathic people. To study preferential attention in facial expression processing, we used the attentional-blink paradigm, in which identification of a first target (T1) transiently impairs detection of a second target (T2) during rapid serial visual presentation of a stimulus stream. If emotional expressions are processed preferentially by empathic people, then the impairment for T2 stimuli should be smaller for highly empathic than for non-empathic people. Crucially, however, this effect should occur only for the faces of others, not for one's own face. To test this, we recruited 100 participants and split them into low- (N=34), medium- (N=47), and high-empathy (N=19) groups based on self-reported levels of emotional empathy. Other-face stimuli consisted of happy, sad, and neutral expressions from the Korean Facial Expressions of Emotion resource; in addition, we recorded and validated the same three expressions from each participant for use as own-face stimuli. A standard attentional-blink paradigm was implemented in Psychtoolbox-3, with neutral faces as T1 stimuli and emotional faces as T2 stimuli. In accordance with our hypothesis, the amount of impairment correlated significantly with the self-reported empathy score for the other-face condition but not for the own-face condition. Overall, the high-empathy group showed significantly less impairment for emotional other-face stimuli than the two other groups. These results clearly show that emotional expressions preferentially capture the attention of empathic people. Our findings also provide support for a previously untested component of the Perception-Action Model of empathy, which posits automatic, preferential processing of emotionally charged stimuli.

Meeting abstract presented at VSS 2016
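
For readers unfamiliar with the paradigm, the sketch below illustrates the trial structure and the impairment measure described in the abstract. The study itself was implemented in Psychtoolbox-3 (MATLAB); this is only a minimal pure-Python illustration, and all concrete values (SOA, lags, stream length) as well as the helper names (build_trial, blink_magnitude) are assumptions made for illustration, not parameters or code from the study.

"""Illustrative sketch of an RSVP attentional-blink trial structure.

The values below (SOA, lags, stream length) are assumed for illustration and
are not parameters reported in the abstract.
"""

import random
from dataclasses import dataclass

SOA_MS = 100          # assumed stimulus-onset asynchrony per RSVP item
SHORT_LAG = 2         # T2 appears 2 items after T1 (inside the blink window)
LONG_LAG = 8          # T2 appears 8 items after T1 (outside the blink window)
STREAM_LENGTH = 15    # assumed number of items per RSVP stream


@dataclass
class Trial:
    stream: list        # ordered RSVP items
    t1_index: int       # position of the neutral-face T1
    t2_index: int       # position of the emotional-face T2
    lag: int            # T2 position minus T1 position


def build_trial(t2_emotion: str, lag: int, rng: random.Random) -> Trial:
    """Build one RSVP stream with a neutral T1 followed by an emotional T2."""
    t1_index = rng.randint(3, 5)               # T1 appears early in the stream
    t2_index = t1_index + lag
    stream = ["distractor"] * STREAM_LENGTH
    stream[t1_index] = "neutral_face_T1"
    stream[t2_index] = f"{t2_emotion}_face_T2"
    return Trial(stream, t1_index, t2_index, lag)


def blink_magnitude(acc_short_lag: float, acc_long_lag: float) -> float:
    """T2|T1 impairment: how much T2 detection drops at the short lag.

    A smaller value means less of an attentional blink, i.e. more efficient
    processing of the emotional T2 face.
    """
    return acc_long_lag - acc_short_lag


if __name__ == "__main__":
    rng = random.Random(0)
    trial = build_trial("happy", SHORT_LAG, rng)
    print(f"T1 at item {trial.t1_index}, T2 at item {trial.t2_index} "
          f"({trial.lag * SOA_MS} ms after T1)")
    # Hypothetical accuracies, only to show the direction of the measure:
    print("impairment =", blink_magnitude(acc_short_lag=0.55, acc_long_lag=0.85))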
