September 2018
Volume 18, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2018
An objective signature of emotional expressions and context integration within a single glance: evidence from electroencephalographic frequency-tagging
Author Affiliations
  • Stéphanie MATT
    Laboratoire INTERPSY – 2LPN (EA4432) - Université de Lorraine (France)
  • Joan LIU-SHUANG
    Institute of Research in Psychological Science, Institute of Neuroscience, University of Louvain (Belgium)
  • Louis MAILLARD
    CRAN (UMR 7039 CNRS) - CHU de Nancy - Université de Lorraine (France)
  • Joëlle LIGHEZZOLO-ALNOT
    Laboratoire INTERPSY (EA4432) – Université de Lorraine (France)
  • Bruno ROSSION
    Institute of Research in Psychological Science, Institute of Neuroscience, University of Louvain (Belgium)
  • Stéphanie CAHAREL
    Laboratoire INTERPSY – 2LPN (EA4432) - Université de Lorraine (France)
Journal of Vision September 2018, Vol.18, 908. doi:10.1167/18.10.908
Citation: Stéphanie MATT, Joan LIU-SHUANG, Louis MAILLARD, Joëlle LIGHEZZOLO-ALNOT, Bruno ROSSION, Stéphanie CAHAREL; An objective signature of emotional expressions and context integration within a single glance: evidence from electroencephalographic frequency-tagging. Journal of Vision 2018;18(10):908. doi: 10.1167/18.10.908.



© ARVO (1962-2015); The Authors (2016-present)

Abstract

The ability to quickly and accurately extract someone's emotional state from their face is crucial for social interaction. Over the last decades, the processing of emotional expressions has been studied mainly with isolated faces. However, at the behavioral level, contextual information often leads to radical changes in the categorization of facial expressions, and the underlying mechanisms are not well understood (Aviezer et al., 2017, Current Opinion in Psychology, 17, 47–54; Barrett et al., 2011, Current Directions in Psychological Science, 20, 286–290). Here we examined the impact of emotional visual scenes on the perception of emotional expressions within a single glance by means of fast periodic visual stimulation (FPVS). We recorded 128-channel EEG while participants viewed 60-s sequences in a dual frequency-tagging paradigm (Boremanse et al., 2013, Journal of Vision, 13(11):6, 1–18). Faces and scenes were presented simultaneously, each stimulus set flickering at a specific frequency (f1 = 4.61 Hz and f2 = 5.99 Hz; frequencies were counterbalanced across stimuli). Each sequence displayed different faces with the same emotional expression (disgust, fear, or joy) within either positive- or negative-valence visual scenes. Periodic EEG responses at the image presentation frequencies (4.61 Hz and 5.99 Hz) captured general visual processing of the emotional faces and scenes, while intermodulation components (e.g., f2 − f1 = 5.99 − 4.61 = 1.38 Hz) captured the integration of the emotional expressions with their context. At the group level, emotional expressions elicited right-lateralized occipito-temporal electrophysiological responses that were stronger for negative-valence expressions (especially disgust). Similarly, negative scenes elicited stronger neural responses than positive scenes over the medial occipital region.
Finally, and critically, we observed intermodulation components that were prominent over right occipito-temporal sites and showed increased response amplitude for negative scenes, thereby providing an objective demonstration of the perceptual integration of emotional facial expressions with their emotional context.
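The logic of the intermodulation measure can be sketched numerically. The following is a minimal illustration, not the authors' analysis pipeline: it simulates two inputs tagged at f1 and f2, adds a multiplicative (nonlinear, integrative) interaction term, and shows via the Fourier spectrum that only the interaction produces energy at f2 − f1. The sampling rate and interaction weight are arbitrary assumptions chosen for the demonstration.

```python
import numpy as np

# Assumed parameters for illustration only
fs = 512.0           # sampling rate in Hz (arbitrary assumption)
dur = 60.0           # 60-s sequence, as in the paradigm
t = np.arange(0, dur, 1 / fs)
f1, f2 = 4.61, 5.99  # face and scene stimulation frequencies (Hz)

face = np.sin(2 * np.pi * f1 * t)
scene = np.sin(2 * np.pi * f2 * t)

# A purely linear response contains only f1 and f2. A multiplicative term
# models an integrative stage: sin(a)*sin(b) contains cos(a-b) and cos(a+b),
# i.e. intermodulation components at f2 - f1 and f2 + f1.
signal = face + scene + 0.5 * face * scene

# Amplitude spectrum (normalized FFT of the real signal)
spectrum = np.abs(np.fft.rfft(signal)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

def amp(f):
    """Amplitude at the frequency bin nearest to f."""
    return spectrum[np.argmin(np.abs(freqs - f))]

print(f"f1 = {f1} Hz:        {amp(f1):.3f}")
print(f"f2 = {f2} Hz:        {amp(f2):.3f}")
print(f"f2 - f1 = {f2 - f1:.2f} Hz: {amp(f2 - f1):.3f}")
```

Dropping the `0.5 * face * scene` term leaves the peaks at f1 and f2 intact but removes the peak at 1.38 Hz, which is why a response at the intermodulation frequency is taken as an objective index of integration rather than of either input alone.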

Meeting abstract presented at VSS 2018
