Vision Sciences Society Annual Meeting Abstract  |   August 2010
Visual information extraction for static and dynamic facial expression of emotions: an eye-tracking experiment
Author Affiliations
  • Cynthia Roy
    Université de Montréal, Department of Psychology
  • Caroline Blais
    Université de Montréal, Department of Psychology
  • Daniel Fiset
    Université de Montréal, Department of Psychology
  • Frédéric Gosselin
    Université de Montréal, Department of Psychology
Journal of Vision August 2010, Vol. 10, 531. https://doi.org/10.1167/10.7.531
Abstract

Human faces convey a great deal of information for human social interactions. Within this wealth of information, rapid and accurate inferences about what others think or feel play a crucial role in tuning our behaviors. Most studies aimed at identifying the visual processes and strategies underlying facial emotion recognition have used static stimuli. However, a growing body of evidence indicates that recognizing facial emotion in the real world involves motion (e.g., Kamachi et al., 2001; Ambadar, Schooler, & Cohn, 2005). The goal of the present study was to compare eye movements during the recognition of facial expressions of emotion in static and dynamic stimuli. We used the stimuli from the STOIC database (Roy et al., submitted; the database includes static and dynamic facial expressions of the six basic emotion categories–fear, happiness, sadness, disgust, anger, and surprise–plus pain and neutral). Twenty participants each completed 320 trials (4 blocks of 80 stimuli, each block containing either static or dynamic stimuli). After each 500-ms stimulus, participants had to identify the displayed emotion. Participants wore the EyeLink II head-mounted eye tracker while viewing the photos or videos showing the different emotions. Participants were more accurate with dynamic than with static stimuli (83% vs. 78%). Average fixation maps were computed for each emotion and stimulus condition using correct trials only. Eye movements clearly differed for the static and dynamic stimuli: for the dynamic faces, participants' gaze remained close to the center of the face, whereas, for the static faces, their gaze rapidly spread outward. This was true for all the facial expressions tested. We will argue that the more extensive eye movements observed with static faces result from a ventral-stream compensation strategy due to the relative lack of information useful to the dorsal stream.
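The abstract does not specify how the average fixation maps were computed. As an illustration only, a duration-weighted, smoothed fixation map of the kind reported here could be built roughly as in the Python sketch below; the data layout, image size, smoothing width, and all names are assumptions for the example, not the authors' actual analysis pipeline.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def fixation_map(trials, width=256, height=256, sigma=10.0):
        """Build an average fixation map from a set of trials.

        Each trial is assumed to be a list of (x, y, duration) fixations
        in pixel coordinates, as exported from the eye tracker.
        """
        heat = np.zeros((height, width))
        for fixations in trials:
            for x, y, duration in fixations:
                # Accumulate duration-weighted fixations inside the image
                if 0 <= x < width and 0 <= y < height:
                    heat[int(y), int(x)] += duration
        # Smooth the point fixations into a continuous density map
        heat = gaussian_filter(heat, sigma=sigma)
        # Normalize to [0, 1] so maps are comparable across conditions
        return heat / heat.max() if heat.max() > 0 else heat

    # Hypothetical usage: one map per (emotion, condition) cell,
    # computed from correct trials only, as in the abstract:
    # maps[("fear", "dynamic")] = fixation_map(correct_trials["fear", "dynamic"])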

Roy, C., Blais, C., Fiset, D., & Gosselin, F. (2010). Visual information extraction for static and dynamic facial expression of emotions: an eye-tracking experiment [Abstract]. Journal of Vision, 10(7):531, 531a, http://www.journalofvision.org/content/10/7/531, doi:10.1167/10.7.531.
Footnotes
 NSERC.