Abstract
When the brain visually detects a face, a well-defined distributed network of brain areas is rapidly engaged to represent the combination of visual, social, and affective information inherent to faces. It has traditionally been difficult to simultaneously elucidate the spatial and temporal dynamics that underlie face representations, because available recording techniques cannot adequately measure transient, fine-grained differences in individual face feature dimensions. To address this, we recorded intracranial electroencephalography (iEEG) from 29 epilepsy patients (~3300 total electrode contacts) who completed a gender discrimination task with face stimuli from the Radboud Faces Database. Each patient viewed repetitions of 14 unique identities (7 female), each displaying 5 facial expressions and 3 gaze directions. We adopt an elastic net regularized regression approach to perform whole-montage classification of face identity, expression, gaze, and gender over time, which allows us to identify precise cortical sources of face information from individual electrode contacts. First, this analysis shows differing latencies at which above-chance classification of exemplar-level information first emerges for different face dimensions: identity and expression at ~140 ms, and gender at ~190 ms. Second, we quantify the spatial representation of these face dimensions throughout temporal cortex, demonstrating that more medial electrode contacts contribute to identity decoding, more lateral electrode contacts contribute to expression decoding, and a large, diffuse set of contacts contributes to gender decoding. Additionally, we find distinct contributions of event-related potential (ERP) and high-frequency broadband signal components to these spatial profiles. Finally, we identify a robust posterior-anterior gradient throughout ventral temporal cortex along which face identity information emerges over time. The results here highlight the heterogeneity of cortical areas that represent individual face dimensions across time, the unique profiles of iEEG signal components during face processing, and the utility of data-driven investigations of whole-montage human iEEG signals.
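To make the decoding approach concrete, the following is a minimal sketch of time-resolved, whole-montage classification with an elastic net penalty. It is not the authors' implementation: the data layout, scikit-learn pipeline, hyperparameters (l1_ratio, C, fold count), and function name time_resolved_decoding are illustrative assumptions.

```python
# Illustrative sketch only: time-resolved elastic net decoding of a face dimension
# (e.g., gender) from whole-montage iEEG. All names and settings are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def time_resolved_decoding(X, y, n_splits=5):
    """Classify a face dimension at each time sample.

    X : array, shape (n_trials, n_contacts, n_times) -- iEEG features
    y : array, shape (n_trials,)                      -- label per trial
    Returns the mean cross-validated accuracy at each time sample.
    """
    n_trials, n_contacts, n_times = X.shape
    cv = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=0)
    clf = make_pipeline(
        StandardScaler(),
        LogisticRegression(penalty="elasticnet", solver="saga",
                           l1_ratio=0.5, C=1.0, max_iter=5000),
    )
    scores = np.empty(n_times)
    for t in range(n_times):
        # All contacts enter the model at once ("whole-montage"); the elastic net
        # penalty shrinks uninformative contacts' weights toward zero, so nonzero
        # coefficients indicate which contacts carry face information at time t.
        scores[t] = cross_val_score(clf, X[:, :, t], y, cv=cv).mean()
    return scores
```

Under these assumptions, the per-time-sample accuracies trace when above-chance classification emerges, and the fitted coefficients localize which contacts contribute to decoding at each latency.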