Abstract
Previous EEG and fMRI studies have shown that the processing of facial information comprises several temporal phases and that several brain regions are involved in the analysis of facial expressions and identities. We used multivariate representational similarity analysis (RSA) to combine data from separately measured EEG and fMRI experiments in order to investigate the spatiotemporal dynamics of face processing. The face stimuli involved neutral, happy, angry and fearful expressions, as well as morphed (50%) versions of the expressions. The stimuli contained four different identities, two male and two female, as well as morphs (33% and 67%) between the identities. In total, we had 112 images (7 expressions × 16 identities). Event-related potentials (ERPs) were calculated from the EEG. The differences between the ERPs elicited by the facial manipulations were quantified with representational dissimilarity matrices (RDMs). The ERP RDMs were calculated from consecutive 10 ms time windows from −100 to 1000 ms relative to stimulus onset. With searchlight RSA, the fMRI RDM within each searchlight was correlated with all ERP RDMs. This resulted in a series of correlation maps showing the spatiotemporal dynamics of face processing. Distinct temporal profiles (correlations between fMRI and ERP RDMs) were found in occipital, parietal, temporal and frontal face regions. At 110 ms, correlations peaked in the lateral occipital complex (LOC) and occipital face area (OFA). At 180 ms, correlations peaked in early visual areas (V1-V3), the OFA and the fusiform face area (FFA). In the FFA, another peak was found at 230 ms. Temporal areas showed two peaks, at 250 ms and 400 ms. Finally, in parietal and frontal regions, correlation peaks were found between 400 and 600 ms. The results show that RSA can be applied to combine ERP and fMRI data, and reveal precise spatiotemporal dynamics within the face processing network.
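The EEG-fMRI fusion step can be sketched as follows. This is a minimal illustration with synthetic data, not the authors' code: the condition count (112) and window count follow the abstract, while the channel and voxel counts, the distance metric, and the use of Spearman correlation are assumptions for the sketch.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_cond = 112      # 7 expressions x 16 identities
n_windows = 110   # consecutive 10 ms windows spanning -100 to 1000 ms

# Synthetic stand-ins for the measured data (hypothetical sizes).
erp = rng.standard_normal((n_windows, n_cond, 64))   # windows x conditions x channels
fmri = rng.standard_normal((n_cond, 200))            # conditions x searchlight voxels

# One RDM per ERP time window: pairwise condition dissimilarities,
# stored as the vectorized upper triangle (112*111/2 = 6216 pairs).
erp_rdms = np.array([pdist(erp[t], metric="correlation") for t in range(n_windows)])

# fMRI RDM for a single searchlight (in the study this is repeated
# for every searchlight across the brain).
fmri_rdm = pdist(fmri, metric="correlation")

# Correlating the searchlight's fMRI RDM with every ERP RDM yields
# that location's temporal profile.
profile = np.array([spearmanr(fmri_rdm, erp_rdms[t])[0] for t in range(n_windows)])
print(profile.shape)
```

Repeating the last step over all searchlights gives one correlation map per time window, i.e., the series of maps described above.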
Meeting abstract presented at VSS 2017