September 2019, Volume 19, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
Distinct spatiotemporal profiles for identity, expression, gender, and gaze information during face perception from intracranial EEG recordings
Author Affiliations & Notes
  • Brett B Bankson
    Laboratory of Cognitive Neurodynamics, Department of Neurological Surgery, University of Pittsburgh
    Cognitive Program, Department of Psychology, University of Pittsburgh
  • Michael J Ward
    Laboratory of Cognitive Neurodynamics, Department of Neurological Surgery, University of Pittsburgh
  • R. Mark Richardson
    Brain Modulation Lab, Department of Neurological Surgery, University of Pittsburgh
  • Avniel S Ghuman
    Laboratory of Cognitive Neurodynamics, Department of Neurological Surgery, University of Pittsburgh
Journal of Vision September 2019, Vol.19, 55. doi:https://doi.org/10.1167/19.10.55
Abstract

When the brain visually detects a face, a well-defined distributed network of brain areas is rapidly engaged to represent the combination of visual, social, and affective information inherent to faces. It has traditionally been difficult to elucidate the spatial and temporal dynamics underlying face representations simultaneously, because recording techniques have not been able to adequately measure transient, fine-grained differences in individual face feature dimensions. To address this, we recorded intracranial electroencephalography (iEEG) from 29 epilepsy patients (~3300 total electrode contacts) who completed a gender discrimination task with face stimuli from the Radboud Faces Database. Each patient viewed repetitions of 14 unique identities (7 female), each displaying 5 facial expressions and 3 gaze directions. We adopted an elastic net regularized regression approach to perform whole-montage classification of face identity, expression, gaze, and gender over time, allowing us to identify precise cortical sources of face information from individual electrode contacts. First, this analysis shows differing latencies at which above-chance classification of exemplar-level information initially emerges for the various face dimensions: identity and expression at ~140 ms, and gender at ~190 ms. Second, we quantify the spatial representation of these face dimensions throughout temporal cortex to demonstrate that more medial electrode contacts contribute to identity decoding, more lateral electrode contacts contribute to expression decoding, and a large, diffuse set of contacts contributes to gender decoding. Additionally, we find distinct contributions of ERP and high-frequency broadband signal components to these spatial profiles. Finally, we identify a robust posterior-anterior gradient throughout ventral temporal cortex along which face identity information emerges over time. These results highlight the heterogeneity of cortical areas that represent individual face dimensions across time, the unique profiles of iEEG signal components during face processing, and the utility of data-driven investigations over whole-montage human iEEG signals.
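To make the analysis approach concrete, the minimal Python sketch below shows one way a time-resolved, whole-montage elastic net classifier could be set up with scikit-learn: features are all electrode contacts, and a separate cross-validated classifier is fit at each time sample. The data shapes, labels, and regularization settings are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch of time-resolved, whole-montage decoding with an
# elastic net penalty. Data shapes and labels are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical iEEG data: trials x electrode contacts x time samples
n_trials, n_contacts, n_times = 200, 100, 50
X = rng.standard_normal((n_trials, n_contacts, n_times))
y = rng.integers(0, 2, n_trials)  # e.g. face gender labels (0/1)

# Elastic net regularized logistic regression (the saga solver supports it)
clf = make_pipeline(
    StandardScaler(),
    LogisticRegression(penalty="elasticnet", solver="saga",
                       l1_ratio=0.5, C=1.0, max_iter=5000),
)

# Decode at every time point separately, using all contacts as features
accuracy = np.empty(n_times)
for t in range(n_times):
    scores = cross_val_score(clf, X[:, :, t], y, cv=5)
    accuracy[t] = scores.mean()

# The earliest time point at which accuracy reliably exceeds chance (0.5 here)
# would correspond to the onset latencies reported in the abstract.
print(accuracy.round(2))
```

Because the elastic net yields sparse, electrode-wise weights, the nonzero coefficients at each time point can in principle be mapped back onto contact locations, which is the kind of spatial profiling of identity, expression, and gender information the abstract describes.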
