Vision Sciences Society Annual Meeting Abstract | July 2013
Emotion recognition (sometimes) depends on horizontal orientations
Author Affiliations
  • Carol Huynh
    Center for Visual and Cognitive Neuroscience, Department of Psychology, North Dakota State University
  • Benjamin Balas
    Center for Visual and Cognitive Neuroscience, Department of Psychology, North Dakota State University
Journal of Vision July 2013, Vol. 13, 589. https://doi.org/10.1167/13.9.589
Abstract

Face recognition depends critically on horizontal orientations (Dakin, 2009). Here, we asked whether facial emotion recognition also exhibits this dependency. We measured observers’ performance at classifying happy and sad faces that were filtered to include predominantly horizontal information, predominantly vertical information, or both. In addition, we used picture-plane rotation (0 or 90 degrees) to dissociate image-based orientation energy from object-based orientation. We recruited participants to complete two emotion recognition experiments using orientation-filtered faces. In Experiment 1 (N=17), we measured the speed of correct emotion categorization using genuine happy/sad faces. In Experiment 2 (N=21), we used posed emotions to control for confounding features in genuine emotions (open/closed mouths). In both tasks, participants viewed stimuli in a fully randomized order for 2000 ms each and classified facial emotion as quickly and accurately as possible. Picture-plane orientation varied across experimental blocks, and filter orientation, emotion, and open/closed mouth position (Experiment 2) were randomized within blocks. The results of Experiment 1 revealed main effects of emotion (p<0.001) and filter orientation (p<0.001), with longer response latencies to sad faces and vertically-filtered faces. We also obtained an interaction between emotion and filter orientation (p<0.01); vertical filtering affected only sad faces. In Experiment 2, we replicated the main effects of emotion and filter orientation, and observed main effects of mouth position (open < closed, p<0.001) and picture-plane orientation (p<0.001). Critically, we observed a three-way interaction between emotion, mouth position, and orientation filter (p<0.001) such that the disadvantage for vertically-filtered faces disappeared for open-mouthed happy faces. We conclude that emotion recognition does depend on horizontal orientations, but this varies according to emotion category and specific pose. Furthermore, the lack of any interaction between image and filter orientation suggests that raw orientation (as computed by V1) does not dictate performance, but filter orientation relative to the face does.
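The orientation-filtering manipulation described in the abstract can be illustrated with a short Fourier-domain sketch. This is a minimal illustration only, not the authors' stimulus-generation code: the hard orientation cutoff, the 20-degree bandwidth, the synthetic input image, and all function names below are assumptions introduced for the example.

    # Minimal sketch of orientation filtering in the Fourier domain:
    # keep spectral energy corresponding to horizontal or vertical image structure.
    import numpy as np

    def orientation_filter(image, center_deg, bandwidth_deg=20.0):
        """Retain frequency components supporting image structure oriented
        within +/- bandwidth_deg of center_deg (0 = horizontal, 90 = vertical)."""
        h, w = image.shape
        fy = np.fft.fftfreq(h)[:, None]   # vertical spatial frequencies
        fx = np.fft.fftfreq(w)[None, :]   # horizontal spatial frequencies
        # Orientation of each frequency component, in degrees (0..180).
        theta = np.degrees(np.arctan2(fy, fx)) % 180.0
        # Horizontal image structure (e.g., eyes, mouth) carries its energy
        # along the vertical frequency axis, so shift the target by 90 degrees.
        target = (center_deg + 90.0) % 180.0
        diff = np.minimum(np.abs(theta - target), 180.0 - np.abs(theta - target))
        mask = (diff <= bandwidth_deg).astype(float)
        mask[0, 0] = 1.0                  # always keep the DC (mean luminance) term
        spectrum = np.fft.fft2(image)
        return np.real(np.fft.ifft2(spectrum * mask))

    # Example with a synthetic 256x256 image standing in for a face photograph.
    img = np.random.rand(256, 256)
    horizontal_only = orientation_filter(img, center_deg=0)   # mostly horizontal info
    vertical_only = orientation_filter(img, center_deg=90)    # mostly vertical info

Rotating the filtered image by 90 degrees in the picture plane, as in the experiments, then dissociates the orientation of the retained energy in image coordinates from its orientation relative to the face.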

Meeting abstract presented at VSS 2013
