September 2024
Volume 24, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract | September 2024
Eye Movement Modulates the Face Inversion Effect in Emotion Recognition
Author Affiliations
  • Angeline Yang
    UC Berkeley
  • Yueyuan Zheng
    University of Hong Kong
  • Janet Hsiao
    University of Hong Kong
  • Susana Chung
    UC Berkeley
Journal of Vision September 2024, Vol. 24, 1395. https://doi.org/10.1167/jov.24.10.1395
Citation: Angeline Yang, Yueyuan Zheng, Janet Hsiao, Susana Chung; Eye Movement Modulates the Face Inversion Effect in Emotion Recognition. Journal of Vision 2024;24(10):1395. https://doi.org/10.1167/jov.24.10.1395.

© ARVO (1962-2015); The Authors (2016-present)

Abstract

The Face Inversion Effect (FIE), a greater reduction in recognition performance for inverted than upright faces relative to objects, suggests that, unlike objects, faces are processed holistically – a process disrupted by inversion. Although many studies have investigated the FIE in identity recognition, findings on its effect in emotion recognition have been mixed. We aim to clarify how inversion affects the recognition of different emotions and to uncover mechanisms of holistic processing by linking behavioral performance with eye movements. Participants (n=40, White, 30 females, M=21.45 years) completed an expression recognition task for anger, fear, happiness, and sadness with upright and inverted faces while their eye movements were tracked (400 trials per participant). The same face stimuli were presented in random order, with gender, emotion, and orientation counterbalanced across blocks. As expected, participants were generally worse at identifying the emotions of inverted faces. However, while recognition of sadness and anger declined with inversion, recognition of happiness and fear was unaffected. Using a data-driven, machine-learning-based approach, Eye Movement analysis with Hidden Markov Models (EMHMM), we discovered two representative eye movement patterns adopted by participants during the emotion recognition task: an eyes-focused pattern and a nose-focused pattern. Consistent with the literature on diagnostic face regions for identifying different emotions, for upright faces participants’ eye movement patterns were more eyes-focused for anger, fear, and sadness recognition and more nose-focused for happiness recognition. Interestingly, for inverted faces, participants’ eye movement patterns were more eyes-focused only for fear recognition, while anger, happiness, and sadness recognition were more nose-focused. We thus show that face scanning patterns for different emotions were influenced by face orientation (upright versus inverted), and that the FIE and the disruption of holistic processing in emotion recognition are modulated by adherence to scanning patterns that extract the most diagnostic information for identifying the expression.
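
For readers unfamiliar with the general idea behind EMHMM, the sketch below shows one simplified way to model individual fixation sequences with Gaussian hidden Markov models and then group participants into representative scanning patterns. It is an assumption-laden illustration, not the authors’ pipeline: it uses the Python libraries hmmlearn and scikit-learn with synthetic fixation coordinates, and it replaces EMHMM’s variational hierarchical clustering of whole HMMs with k-means on the learned state means.

```python
# Illustrative sketch only: model each participant's fixation sequences with a
# Gaussian HMM and group participants into representative scanning patterns.
# The authors' EMHMM toolbox (MATLAB) clusters whole HMMs with variational
# hierarchical EM; hmmlearn + k-means on HMM state means are simplified
# stand-ins here, and the fixation data below are synthetic.

import numpy as np
from hmmlearn.hmm import GaussianHMM
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
N_PARTICIPANTS, N_TRIALS, FIX_PER_TRIAL, N_STATES = 8, 20, 6, 2
EYE_CENTER = np.array([256.0, 180.0])    # hypothetical image coordinates (pixels)
NOSE_CENTER = np.array([256.0, 260.0])

def synthetic_trial(p_eyes):
    """Fake (x, y) fixations: each fixation lands near the eyes with probability p_eyes."""
    centers = np.where(rng.random((FIX_PER_TRIAL, 1)) < p_eyes, EYE_CENTER, NOSE_CENTER)
    return centers + rng.normal(scale=15.0, size=(FIX_PER_TRIAL, 2))

participant_features = []
for p in range(N_PARTICIPANTS):
    # First half of the simulated participants are eyes-focused, second half nose-focused.
    p_eyes = 0.8 if p < N_PARTICIPANTS // 2 else 0.2
    trials = [synthetic_trial(p_eyes) for _ in range(N_TRIALS)]
    X = np.vstack(trials)                # all fixations, concatenated
    lengths = [len(t) for t in trials]   # one entry per trial

    # Fit a per-participant HMM whose hidden states act as data-driven ROIs.
    hmm = GaussianHMM(n_components=N_STATES, covariance_type="full",
                      n_iter=100, random_state=0)
    hmm.fit(X, lengths)

    # Summarize the participant by the learned state means (sorted vertically
    # so that features are comparable across participants).
    means = hmm.means_[np.argsort(hmm.means_[:, 1])]
    participant_features.append(means.ravel())

# Cluster participants into two representative patterns, analogous to the
# eyes-focused vs. nose-focused groups described in the abstract.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    np.array(participant_features))
print("Cluster assignment per participant:", labels)
```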
