Abstract
Human faces convey a great deal of information in social interactions. Within this wealth of information, rapid and accurate inferences about what others think or feel play a crucial role in tuning our behavior. Most studies aimed at identifying the visual processes and strategies underlying facial emotion recognition have used static stimuli. However, a growing body of evidence indicates that recognizing facial emotion in the real world involves motion (e.g., Kamachi et al., 2001; Ambadar, Schooler, & Cohn, 2005). The goal of the present study was to compare eye movements during the recognition of facial expressions of emotion in static and dynamic stimuli. We used stimuli from the STOIC database (Roy et al., submitted), which includes static and dynamic facial expressions of the six basic emotion categories (fear, happiness, sadness, disgust, anger, and surprise), plus pain and neutral expressions. Twenty participants each completed 320 trials (4 blocks of 80 trials, each block containing either static or dynamic stimuli). After each 500-ms stimulus, participants had to identify the displayed emotion. Participants wore the EyeLink II head-mounted eye tracker while viewing the photos or videos. Participants were more accurate with dynamic than with static stimuli (83% vs. 78%). Average fixation maps were computed for each emotion and stimulus condition using correct trials only. Eye movements clearly differed between static and dynamic stimuli: for dynamic faces, participants' gaze remained close to the center of the face, whereas for static faces their gaze rapidly spread outward. This was true for all the facial expressions tested. We will argue that the more extensive eye movements observed with static faces reflect a ventral-stream compensation strategy due to the relative lack of information useful to the dorsal stream.
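To make the fixation-map analysis concrete, the sketch below shows one plausible way to compute a duration-weighted average fixation map from correct trials. The data format (per-trial lists of (x, y, duration) fixations), the image size, and the Gaussian smoothing width are assumptions made for illustration; the abstract does not describe the actual analysis pipeline.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def average_fixation_map(trials, image_shape=(256, 256), sigma=10.0):
    """Build a duration-weighted average fixation map.

    trials:      list of per-trial lists of (x, y, duration) fixation
                 tuples, correct trials only (hypothetical format).
    image_shape: (height, width) of the stimulus in pixels (assumed).
    sigma:       Gaussian smoothing in pixels, roughly approximating
                 the extent of foveal sampling (assumed value).
    """
    fmap = np.zeros(image_shape)
    for fixations in trials:
        for x, y, duration in fixations:
            xi, yi = int(round(x)), int(round(y))
            # Accumulate fixation duration at each fixated pixel,
            # ignoring fixations that fall outside the image.
            if 0 <= yi < image_shape[0] and 0 <= xi < image_shape[1]:
                fmap[yi, xi] += duration
    # Smooth the point accumulations into a continuous map and
    # normalize so maps are comparable across conditions.
    fmap = gaussian_filter(fmap, sigma=sigma)
    total = fmap.sum()
    return fmap / total if total > 0 else fmap
```

Under this scheme, one map would be computed per emotion x stimulus-type cell (e.g., static fear, dynamic fear), and the static and dynamic maps could then be contrasted to quantify how far gaze spreads from the center of the face.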