Abstract
The N170 event-related potential (ERP) observed in EEG has been related to the representation and processing of faces, and has also been linked to expertise in other object-categorisation tasks. To investigate the neural sources of the specific stimulus-information sensitivity of the EEG N170, we asked participants to perform four different categorisation tasks on the same set of stimulus images (FigA). The four 2-AFC tasks were: 1. happy vs. neutral central face; 2. male vs. female central face; 3. male vs. female pedestrian; 4. normal car vs. SUV. In each trial, the stimulus information was randomly sampled with Bubbles while categorisation responses, MEG and EEG were recorded simultaneously. Our goal is to co-localise the specific stimulus-information sensitivity of the N170 ERP, for both faces and objects, with the concurrently recorded source-localised MEG signal.
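The Bubbles technique samples stimulus information by revealing the image only through randomly placed Gaussian apertures on each trial. A minimal sketch of one such mask is below; the aperture count, image size, and Gaussian width are illustrative assumptions, not the parameters used in this study.

```python
import numpy as np

def bubble_mask(shape, n_bubbles, sigma, rng):
    """Random 'Bubbles' mask: a sum of Gaussian apertures at random
    pixel locations, clipped to [0, 1]. Multiplying the stimulus image
    by this mask reveals only the sampled regions on that trial."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = np.zeros(shape)
    for _ in range(n_bubbles):
        cy, cx = rng.integers(0, h), rng.integers(0, w)
        mask += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
    return np.clip(mask, 0.0, 1.0)

# Example: one trial's mask for a 256x256 image (hypothetical parameters)
rng = np.random.default_rng(0)
mask = bubble_mask((256, 256), n_bubbles=10, sigma=12, rng=rng)
```

Summing such masks within task-specific regions of interest (as described below) yields a scalar per-trial stimulus feature, e.g. "how visible were the eyes on this trial".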
Mutual Information (MI) between stimulus samples and responses (FigB) shows the specific image information that must be represented in the brain as participants perform the task. We extracted stimulus features by summing bubble masks within task-specific regions of interest. We determined the stimulus sensitivity of the EEG at sensor level and of the MEG at source level by calculating MI between the stimulus feature and the recorded signals. We then quantified the common trial-by-trial representation across the two modalities using redundancy: Red(sensor EEG; source MEG; stimulus feature). Focusing on the facial-expression task, the eyes are represented in sensor-level EEG signals around 200 ms, during the second half of the N170 (FigC). Redundant information in the concurrent source-level MEG appears in occipital cortex and the fusiform gyrus.
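The MI and redundancy quantities above can be illustrated with a simplified sketch: a plug-in histogram MI estimator over discretised variables, and the Williams–Beer minimum-MI definition of redundancy. These are assumptions for illustration only; the study's actual estimators (e.g. for continuous EEG/MEG signals) are not specified in this abstract.

```python
import numpy as np

def mi_discrete(x, y):
    """Plug-in mutual information (bits) between two discrete arrays,
    estimated from their joint histogram."""
    xs, x_idx = np.unique(x, return_inverse=True)
    ys, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (x_idx, y_idx), 1.0)
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # marginal P(x)
    py = joint.sum(axis=0, keepdims=True)   # marginal P(y)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

def redundancy_min(feature, eeg, meg):
    """Williams-Beer style redundancy (hypothetical stand-in for
    Red(sensor EEG; source MEG; stimulus feature)): the minimum of
    the MI each signal carries about the stimulus feature."""
    return min(mi_discrete(feature, eeg), mi_discrete(feature, meg))

# Toy check: a binary feature perfectly encoded by both signals
feature = np.array([0, 1] * 50)
red = redundancy_min(feature, feature.copy(), feature.copy())
```

In practice the trial-by-trial EEG and MEG amplitudes would first be discretised (e.g. binned), and the redundancy measure ensures both signals carry the *same* stimulus information, not merely equal amounts of it.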
These results demonstrate the potential of information-theoretic methods to relate specific stimulus-information processing between different imaging modalities on a trial-by-trial basis, and thereby to gain insight into the underlying neural information-processing mechanisms.