Vision Sciences Society Annual Meeting Abstract  |  June 2007
Journal of Vision, Volume 7, Issue 9
A dynamic facial expression database
Author Affiliations
  • Sylvain Roy
    Département de Psychologie, Université de Montréal
  • Cynthia Roy
    Département de Psychologie, Université de Montréal
  • Isabelle Fortin
    Département de Psychologie, Université de Montréal
  • Catherine Éthier-Majcher
    Département de Psychologie, Université de Montréal
  • Pascal Belin
    Département de Psychologie, Université de Montréal
  • Frédéric Gosselin
    Département de Psychologie, Université de Montréal
Journal of Vision June 2007, Vol.7, 944. doi:https://doi.org/10.1167/7.9.944
© ARVO (1962-2015); The Authors (2016-present)

Facial expressions provide crucial information for adaptive behaviors, since they help us make inferences about what others are thinking and feeling. To date, most studies that have investigated the perception of facial expressions have used static displays (Ekman & Friesen, 1975). Such stimuli underestimate the importance of motion, or the dynamic changes that occur in a face, in emotion recognition (Ambadar, Schooler, & Cohn, 2005). The few studies of dynamic facial expressions have used stimuli that present methodological limitations, which we sought to remedy. In particular, most video databases currently in use have not been empirically validated. For our freely available database, we recruited a total of 34 actors to express various emotions. A total of 1,088 grayscale video clips (34 actors * 4 exemplars * 8 expressions) were created. The clips include all basic emotions (happiness, fear, surprise, disgust, sadness, anger) as well as pain and neutral expressions. These videos were spatially aligned frame by frame on the average coordinates of the eyes and nose (i.e., the clips contain only facial movements), and the luminance was calibrated to allow linear manipulation. All clips contain 15 frames (30 Hz), beginning on the last neutral frame. We empirically validated these stimuli by having participants rate the intensity of the emotions in all stimuli on continuous scales. The video database was adjusted to reflect a confusability matrix with satisfactory d′ values. We will discuss the main characteristics of the selected clips.
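The preprocessing described above (frame-by-frame spatial alignment on average eye/nose coordinates, plus linear luminance calibration) can be sketched roughly as follows. This is a minimal illustration, not the authors' actual procedure: the landmark format, the function name `align_and_calibrate`, the use of a simple integer translation, and the target mean luminance are all assumptions introduced here for clarity.

```python
import numpy as np

def align_and_calibrate(frames, landmarks, target_mean=128.0):
    """Align grayscale frames on the clip's average landmark position
    and linearly shift luminance to a common mean.

    frames    : (T, H, W) array of grayscale frames
    landmarks : (T, 3, 2) array of (row, col) coordinates for the two
                eyes and the nose in each frame (hypothetical format)
    """
    frames = frames.astype(float)
    # Clip-wide average landmark position (the alignment target).
    avg = landmarks.reshape(-1, 2).mean(axis=0)
    aligned = np.empty_like(frames)
    for t, frame in enumerate(frames):
        # Integer translation moving this frame's landmark centroid
        # onto the clip-wide average position (a crude stand-in for
        # whatever registration the authors actually used).
        dr, dc = np.round(avg - landmarks[t].mean(axis=0)).astype(int)
        aligned[t] = np.roll(np.roll(frame, dr, axis=0), dc, axis=1)
    # Linear luminance calibration: shift the whole clip to a common
    # mean so later intensity manipulations can be applied linearly.
    aligned += target_mean - aligned.mean()
    return np.clip(aligned, 0, 255)
```

A real pipeline would use subpixel warping (e.g., an affine transform) rather than `np.roll`, but the sketch shows why aligning on average landmark coordinates leaves only facial movement in the clips: rigid head translation is factored out before any analysis.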

Roy, S., Roy, C., Fortin, I., Ethier-Majcher, C., Belin, P., & Gosselin, F. (2007). A dynamic facial expression database [Abstract]. Journal of Vision, 7(9):944, 944a, http://journalofvision.org/7/9/944/, doi:10.1167/7.9.944.
