October 2020
Volume 20, Issue 11
Open Access
Vision Sciences Society Annual Meeting Abstract  |   October 2020
Localising the information processing neural sources underlying the N170 event related potential
Author Affiliations
  • Y Duan
    Institute of Neuroscience and Psychology, University of Glasgow
  • J Gross
    Institute of Neuroscience and Psychology, University of Glasgow
    Institute for Biomagnetism and Biosignalanalysis, University of Muenster, Germany
  • RAA Ince
    Institute of Neuroscience and Psychology, University of Glasgow
  • PG Schyns
    Institute of Neuroscience and Psychology, University of Glasgow
    School of Psychology, College of Science and Engineering, University of Glasgow
Journal of Vision October 2020, Vol.20, 1786. doi:https://doi.org/10.1167/jov.20.11.1786
      Y Duan, J Gross, RAA Ince, PG Schyns; Localising the information processing neural sources underlying the N170 event related potential. Journal of Vision 2020;20(11):1786. doi: https://doi.org/10.1167/jov.20.11.1786.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

The N170 event-related potential (ERP) observed in EEG has been related to the representation and processing of faces, and has also been linked to expertise in other object categorisation tasks. To explicitly investigate the neural sources of the specific stimulus information sensitivity of the EEG N170, we asked participants to perform four different categorisation tasks on the same set of stimulus images (FigA). The four 2-AFC tasks were: 1. happy vs. neutral central face; 2. male vs. female central face; 3. male vs. female pedestrian; 4. normal car vs. SUV. On each trial, stimulus information was randomly sampled with Bubbles while categorisation responses, MEG and EEG were simultaneously recorded. Our goal is to co-localise the specific stimulus information sensitivity of the N170 ERP, for both faces and objects, with the concurrently recorded source-localised MEG signal. Mutual Information (MI) between stimulus samples and responses (FigB) reveals the specific image information that must be represented in the brain for participants to perform the task. We extracted stimulus features by summing bubble masks within task-specific regions of interest. We determined the stimulus sensitivity of the EEG at sensor level and of the MEG at source level by calculating MI between the stimulus feature and the recorded signals. We then quantified the common trial-by-trial representation between the two modalities using redundancy: Red(sensor EEG; source MEG; stimulus feature). Focussing on the face expression task, the eyes are represented in sensor-level EEG signals around 200 ms, during the second half of the N170 (FigC). Redundant information in the concurrent source MEG appears in occipital cortex and the fusiform gyrus. These results demonstrate the potential of information-theoretic methods to relate specific stimulus information processing between different imaging modalities on a trial-by-trial basis, and thereby to gain insight into the underlying neural information processing mechanisms.
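The redundancy analysis described above can be illustrated with a minimal sketch. The following is not the authors' code: it estimates MI with simple equal-population binning and computes redundancy as the classic co-information Red = I(eeg; feat) + I(meg; feat) − I((eeg, meg); feat); the published analysis likely used more robust estimators (e.g. Gaussian-copula MI), and all variable names and the simulated data are illustrative assumptions.

```python
# Hedged sketch, NOT the authors' pipeline: binned MI and co-information
# redundancy between two neural signals and a stimulus feature.
import numpy as np

def discretize(x, n_bins=4):
    """Map a continuous variable to equal-population integer bins."""
    ranks = np.argsort(np.argsort(x))          # 0..n-1 rank of each sample
    return (ranks * n_bins) // len(x)

def mi_discrete(x, y):
    """Mutual information (bits) between two discrete integer variables."""
    joint = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)      # marginal P(x)
    py = joint.sum(axis=0, keepdims=True)      # marginal P(y)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

def redundancy(eeg, meg, feat, n_bins=4):
    """Co-information Red(eeg; meg; feat): positive when the two signals
    carry overlapping information about the stimulus feature."""
    e, m, f = (discretize(v, n_bins) for v in (eeg, meg, feat))
    em = e * n_bins + m                        # joint (eeg, meg) variable
    return mi_discrete(e, f) + mi_discrete(m, f) - mi_discrete(em, f)

# Toy data: both "signals" track the same feature (e.g. a summed bubble
# mask over the eye region), so redundancy should come out positive.
rng = np.random.default_rng(0)
feat = rng.standard_normal(2000)
eeg = feat + 0.5 * rng.standard_normal(2000)
meg = feat + 0.5 * rng.standard_normal(2000)
print(redundancy(eeg, meg, feat))
```

Note that co-information conflates redundancy with synergy (it can go negative when the two signals jointly carry information neither has alone), which is why partial-information-style decompositions are sometimes preferred; the sketch keeps the simpler definition for clarity.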
