September 2017
Volume 17, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2017
Representational similarity analysis of EEG and fMRI responses to face identities and emotional expressions
Author Affiliations
  • Kaisu Ölander
    Department of Psychology and Logopedics, University of Helsinki, Helsinki, Finland
    Aalto NeuroImaging, Aalto University, Espoo, Finland
  • Ilkka Muukkonen
    Department of Psychology and Logopedics, University of Helsinki, Helsinki, Finland
  • Jussi Numminen
    Helsinki Medical Imaging Center, Töölö Hospital, University of Helsinki, Helsinki, Finland
  • Viljami Salmela
    Department of Psychology and Logopedics, University of Helsinki, Helsinki, Finland
    Aalto NeuroImaging, Aalto University, Espoo, Finland
Journal of Vision August 2017, Vol. 17, 271. https://doi.org/10.1167/17.10.271
Abstract

It is now widely accepted that the cortical network for processing facial information comprises several areas that handle different aspects of faces, from low-level features to person-related knowledge. We applied multivariate representational similarity analysis (RSA) to investigate how parametric variation of facial expressions and identities affects temporal (EEG) and spatial (fMRI) patterns of brain activity. As stimuli, we used faces with neutral, happy, fearful, and angry expressions, together with morphed (50%) versions of the expressions, and four core identities (two male, two female) together with all combinations of these identities morphed (33% and 67%) toward each other. In total, we had 112 different faces (7 expressions for each of 16 identities). Representational dissimilarity matrices (RDMs) were calculated for each time point of the event-related potentials (ERPs) from EEG, and for each voxel in the fMRI data using a searchlight approach. Low-level stimulus model RDMs were based on the spatial frequency (SF) spectrum of the whole face, the region around the eyes, or the region around the mouth. Additional model RDMs were based on the emotional expressions, the identities, and the interactions between these factors. The ERP RDMs correlated with the SF models between 220-460 ms, with the identity model between 270-420 ms and 670-1000 ms, and with the emotion models at 180 ms, between 230-500 ms, and at 600 ms. There was also an interaction between emotion type and identity. In fMRI, activity patterns related to expressions were found in early visual areas (V1-V3), the lateral occipital complex (LOC), the occipital face area (OFA), the fusiform face area (FFA), the posterior superior temporal sulcus (pSTS), and left middle frontal regions; identity-related patterns were found only in frontal areas. Distinct distributions of positive and negative correlations across face-selective areas suggest a different type of processing in the LOC, OFA, and FFA compared with other regions of the face network.
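The core RSA comparison described above — building a neural RDM per time point (EEG) or per searchlight sphere (fMRI) and correlating it with model RDMs — can be sketched in a few lines. This is a minimal illustration with simulated stand-in data, not the authors' pipeline; the array sizes, the correlation-distance metric, and the categorical "emotion" model are assumptions chosen for clarity (the study's SF-spectrum and identity models would be constructed analogously).

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical stand-in data: responses to the 112 face stimuli
# (7 expressions x 16 identities), e.g. EEG channel amplitudes at one
# time point, or voxel values within one searchlight sphere.
n_stimuli, n_features = 112, 64
responses = rng.standard_normal((n_stimuli, n_features))

# Neural RDM: pairwise correlation distance between stimulus patterns,
# kept in condensed (upper-triangle) form.
neural_rdm = pdist(responses, metric="correlation")

# Model RDM: 0 if two stimuli share the same expression, 1 otherwise
# (a simple categorical emotion model; identity and SF-spectrum models
# would be built the same way from their respective features).
expression = np.repeat(np.arange(7), 16)  # assumed expression label per stimulus
model_rdm = pdist(expression[:, None], metric="hamming")

# Compare neural and model RDMs with a rank-order (Spearman)
# correlation, as is common in RSA.
rho, p = spearmanr(neural_rdm, model_rdm)
print(f"model-neural RDM correlation: rho = {rho:.3f}")
```

Repeating this comparison at every ERP time point yields the correlation time courses reported above (e.g. SF models between 220-460 ms), and repeating it for every searchlight center yields the spatial maps of positive and negative correlations across the face network.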

Meeting abstract presented at VSS 2017
