June 2007
Volume 7, Issue 9
Vision Sciences Society Annual Meeting Abstract  |   June 2007
The N170 Marks the End of the process — Dynamics of Occipito-temporal integration of facial features across spatial frequency bands to categorize facial expressions of emotion
Author Affiliations
  • Philippe G. Schyns
    Centre for Cognitive Neuroimaging (CCNi), Department of Psychology, University of Glasgow
  • Lucy Petro
    Centre for Cognitive Neuroimaging (CCNi), Department of Psychology, University of Glasgow
  • Marie Smith
    Centre for Cognitive Neuroimaging (CCNi), Department of Psychology, University of Glasgow
Journal of Vision June 2007, Vol.7, 996. https://doi.org/10.1167/7.9.996
© ARVO (1962-2015); The Authors (2016-present)
Abstract

For the first time, we reveal the time course of integration of Spatial Frequency (SF) facial features from the brain activity of observers who categorized Ekman's six basic expressions of emotion (i.e., happy, surprised, fearful, angry, disgusted, sad). In the experiment, three observers saw 21,000 sparse versions of expressive faces. Their task was to categorize them while we recorded their EEG. Original stimuli were 70 FACS-coded images of 5 males and 5 females, each displaying one of the 6 basic expressions plus neutral. We used Bubbles to synthesize each sparse face by randomly sampling facial information from 5 one-octave non-overlapping SF bands (Gosselin & Schyns, 2001). Online calibration of sampling density ensured 75% accuracy per expression. Using classification image techniques, we reveal the combination of SF features that each observer's brain requires to produce correct categorization behavior (e.g., the mouth for happy, two eyes for fear). With the same techniques applied to the EEG (measured on face-sensitive occipito-temporal electrodes P7 and P8), we reveal the SF features that the brain processes over the time course of the N170. Then, we relate the SF features required for behavior with those integrated over the N170 time course. We show, in 42 independent instances (3 observers × 7 expressions × 2 electrodes), that the slopes of the N170 (reflecting phase onset and amplitude) fit with the slopes of a function that integrates SF featural information over time. In all instances, the maximum of the N170 coincides (with a precision of 4 ms) with the plateau of the information integration function. Thus, the N170 marks the end point of a process that integrates SF features in the 50 ms preceding the N170 peak. The characteristics of the N170 curves (latency, amplitude and width) depend on the nature of the SF features integrated.
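The classification-image logic behind Bubbles can be illustrated with a minimal sketch. This is not the authors' code: the trial loop, the binary feature masks, and the toy observer (which responds correctly only when a single hypothetical diagnostic feature, e.g. the mouth for "happy", is revealed) are all illustrative assumptions. The key operation is the one the abstract relies on: subtracting the mean sampling mask on incorrect trials from the mean mask on correct trials, so that diagnostic features receive large positive weights.

```python
import random

def run_trial(mask, diagnostic):
    """Toy observer: the trial counts as correct only when the random
    mask reveals the (hypothetical) diagnostic feature."""
    return mask[diagnostic] == 1

def classification_image(n_features, diagnostic, n_trials=10000, seed=0):
    """Mean sampling mask on correct trials minus mean mask on
    incorrect trials; diagnostic features stand out positively."""
    rng = random.Random(seed)
    correct_sum = [0] * n_features
    incorrect_sum = [0] * n_features
    n_correct = n_incorrect = 0
    for _ in range(n_trials):
        # Random sparse sampling: each feature is shown or withheld.
        mask = [rng.randint(0, 1) for _ in range(n_features)]
        if run_trial(mask, diagnostic):
            n_correct += 1
            correct_sum = [s + m for s, m in zip(correct_sum, mask)]
        else:
            n_incorrect += 1
            incorrect_sum = [s + m for s, m in zip(incorrect_sum, mask)]
    return [c / n_correct - i / n_incorrect
            for c, i in zip(correct_sum, incorrect_sum)]

ci = classification_image(n_features=10, diagnostic=3)
# ci[3] is large (the diagnostic feature is always sampled on correct
# trials, never on incorrect ones); the other entries hover near zero.
```

In the actual study the "features" are Gaussian apertures within each of the 5 SF bands rather than abstract indices, and the same contrast is applied to single-trial EEG amplitudes instead of accuracy to recover the features the brain processes over the N170.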

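The central claim, that the N170 peak coincides (within 4 ms) with the plateau of the information-integration function, can be sketched with synthetic curves. The logistic integration function, the Gaussian N170 shape, and all timing parameters below are assumptions for illustration, not the recorded data; only the comparison itself (ERP extremum vs. integration plateau) mirrors the analysis described above.

```python
import math

def integration_function(t, plateau_ms=170.0, rise_ms=50.0):
    """Cumulative SF information: rises over the ~50 ms preceding the
    plateau, then saturates (an illustrative logistic ramp)."""
    return 1.0 / (1.0 + math.exp(-(t - (plateau_ms - rise_ms / 2)) / (rise_ms / 8)))

def n170(t, peak_ms=170.0, width_ms=30.0):
    """Toy N170: a negative Gaussian deflection peaking at peak_ms."""
    return -math.exp(-((t - peak_ms) ** 2) / (2 * (width_ms / 2.355) ** 2))

ts = [float(i) for i in range(0, 301)]            # 0..300 ms, 1 ms steps
erp = [n170(t) for t in ts]
info = [integration_function(t) for t in ts]

# N170 maximum = most negative ERP sample.
peak_t = ts[min(range(len(ts)), key=lambda i: erp[i])]
# Plateau onset = first time the integration function saturates.
plateau_t = next(t for t, v in zip(ts, info) if v >= 0.99)

# With these toy parameters the two markers fall within a few ms of
# each other, echoing the reported 4 ms precision.
```

Changing the rise time or peak latency of the toy curves shifts both markers together, which is the pattern the abstract reports across its 42 observer × expression × electrode instances.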
Schyns, P. G., Petro, L., & Smith, M. (2007). The N170 Marks the End of the process — Dynamics of Occipito-temporal integration of facial features across spatial frequency bands to categorize facial expressions of emotion [Abstract]. Journal of Vision, 7(9):996, 996a, http://journalofvision.org/7/9/996/, doi:10.1167/7.9.996.