Vision Sciences Society Annual Meeting Abstract  |  August 2010
Reverse correlation in temporal FACS space reveals diagnostic information during dynamic emotional expression classification
Author Affiliations
  • Oliver Garrod
    Department of Psychology, Centre for Cognitive Neuroimaging, University of Glasgow
  • Hui Yu
    Department of Psychology, Centre for Cognitive Neuroimaging, University of Glasgow
  • Martin Breidt
    Department of Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics
  • Cristobal Curio
    Department of Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics
  • Philippe Schyns
    Department of Psychology, Centre for Cognitive Neuroimaging, University of Glasgow
Journal of Vision August 2010, Vol. 10, 700. https://doi.org/10.1167/10.7.700
Abstract

Reverse correlation experiments have previously revealed the locations of facial features crucial for recognizing different emotional expressions, and related these features to brain electrophysiological activity [SchynsEtal07]. However, in social perception we expect the generation and encoding of communicative signals to share a common framework in the brain [SeyfarthCheney03], and neither ‘Bubbles’ [GosselinSchyns03] nor white-noise-based manipulations effectively target the input features underlying facial expression generation: the combined activation of sets of facial muscles over time. [CurioEtal06] propose a motion-retargeting method that controls the appearance of facial expression stimuli via a linear 3D Morphable Model [BlanzVetter99] composed of recorded Action Units (AUs). Each AU represents the surface deformation of the face given the full activation of a particular muscle or muscle group taken from the FACS system [EkmanFriesen79]. The set of weighted linear combinations of AUs is hypothesised to form a generative model of the actor's typical facial movements.
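
The linear AU model amounts to adding weighted AU deformation fields to a neutral face mesh. The sketch below illustrates this combination only; the array names, mesh size, and AU count are illustrative assumptions, not details of the [CurioEtal06] model.

```python
import numpy as np

# Minimal sketch of a linear AU-based morphable model. All names and sizes
# below are assumptions for illustration, not the original implementation.

V = 5000                                    # number of mesh vertices (assumed)
K = 17                                      # number of recorded Action Units (assumed)

neutral = np.zeros((V, 3))                  # neutral face geometry (x, y, z per vertex)
au_deformations = np.random.randn(K, V, 3)  # per-AU surface deformation at full activation

def synthesize_face(au_weights):
    """Add the weighted linear combination of AU deformations to the neutral face."""
    au_weights = np.asarray(au_weights)     # shape (K,), each weight in [0, 1]
    return neutral + np.tensordot(au_weights, au_deformations, axes=1)

# Example: 30% activation of AU index 0, 80% of AU index 4.
w = np.zeros(K)
w[0], w[4] = 0.3, 0.8
face = synthesize_face(w)                   # shape (V, 3)
```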

Here we report the outcome of a facial emotion reverse correlation experiment with one such generative AU model, over a space of temporally parameterized AU weights. On each trial, between 1 and 5 AUs are randomly selected, and a random timecourse is generated for each selected AU according to six temporal parameters (see supplementary figure). The observer rates the stimulus for each of the six ‘universal emotions’ on a continuous confidence scale from 0 to 1. From these ratings, optimal AU timecourses (those whose temporal parameters maximize the expected rating for a given expression) are derived per expression and AU, and then fed as weights into the AU model to reveal the feature dynamics associated with the expression. This method extends Bubbles and reverse correlation techniques to a relevant input space, one that makes explicit hypotheses about the temporal structure of diagnostic information.
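To make the stimulus-generation step concrete, the following sketch draws one trial's random AU timecourses. The abstract specifies six temporal parameters per timecourse but details them only in the supplementary figure; the particular parameters used here (onset time, peak time, offset time, peak amplitude, and rise/fall curvatures) are assumptions for illustration.

```python
import numpy as np

# Hedged sketch of random AU timecourse generation for one trial.
# The six parameters below are assumed, not taken from the original study.

rng = np.random.default_rng(0)

def random_timecourse(n_frames=30):
    onset  = rng.uniform(0.0, 0.3)   # fraction of the clip before movement starts
    peak_t = rng.uniform(0.4, 0.7)   # time of maximum activation
    offset = rng.uniform(0.8, 1.0)   # time activation returns to zero
    amp    = rng.uniform(0.2, 1.0)   # peak AU weight
    accel  = rng.uniform(0.5, 2.0)   # curvature of the rising segment
    decel  = rng.uniform(0.5, 2.0)   # curvature of the falling segment

    t = np.linspace(0.0, 1.0, n_frames)
    w = np.zeros(n_frames)
    rising = (t >= onset) & (t < peak_t)
    falling = (t >= peak_t) & (t <= offset)
    w[rising] = amp * ((t[rising] - onset) / (peak_t - onset)) ** accel
    w[falling] = amp * (1.0 - (t[falling] - peak_t) / (offset - peak_t)) ** decel
    return w                         # per-frame weight for one AU

# On each trial, between 1 and 5 AUs receive independent random timecourses:
n_aus = rng.integers(1, 6)
trial_timecourses = [random_timecourse() for _ in range(n_aus)]
```

Each per-frame weight vector then drives the corresponding AU of the generative model, so a trial is an animation of the face under these random muscle-activation dynamics.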

Garrod, O., Yu, H., Breidt, M., Curio, C., & Schyns, P. (2010). Reverse correlation in temporal FACS space reveals diagnostic information during dynamic emotional expression classification [Abstract]. Journal of Vision, 10(7):700, 700a, http://www.journalofvision.org/content/10/7/700, doi:10.1167/10.7.700.
Footnotes
 The Economic and Social Research Council and Medical Research Council (ESRC/RES-060-25-0010).