Abstract
Being able to read social information is vital for an individual. The face provides a wealth of social cues, in particular emotional expressions. To address the question of how the brain discriminates emotional faces, we recorded the electroencephalogram (EEG) of 18 participants during a fast periodic oddball paradigm, which provides an objective, implicit and robustly quantifiable measure of visual discrimination. The same face with a neutral expression was presented at a rate of 5.88 Hz throughout an 80-s sequence. Every fifth stimulus, the same face displaying an emotional expression of fear, disgust or happiness (in different sequences) was presented, resulting in sequences of the form NNNNFNNNNFNNNNF (e.g., a neutral-fear oddball sequence). The response at the 1.18 Hz oddball frequency (5.88 Hz/5) and its harmonics (e.g., 2f = 2.36 Hz) was used to measure emotional face discrimination. This emotional face discrimination response was observed bilaterally over occipito-temporal sites. Furthermore, inverting the faces significantly reduced the oddball-frequency response over the occipito-temporal regions, suggesting that it reflects high-level processes related to the emotional faces. The response to happy faces was characterised by a more dorsal distribution than the responses to fearful and disgusted faces, and the response to disgusted faces was characterised by a more anterior scalp topography than the response to fearful faces. An additional analysis confirmed these topographical differences and hinted at partly distinct neural generators. A complementary time-domain analysis revealed several components discriminating neutral from emotional faces, and an additional experiment comparing the mode of stimulus presentation (sine wave vs. square wave) suggested that three of these components peaked at 120 ms (positive), 170 ms (negative) and 250 ms (positive) after stimulus onset. These observations provide new insights into the temporal dynamics of facial expression processing and show that the fast periodic oddball paradigm can be successfully employed to address the processes underlying social perception.
Meeting abstract presented at VSS 2015
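For illustration, the sketch below shows how a frequency-tagged oddball response of this kind is typically quantified: the amplitude spectrum of a whole stimulation sequence is computed, and the amplitude at the 1.18 Hz oddball frequency and its harmonics is baseline-corrected against neighbouring frequency bins. This is a minimal Python sketch under assumed parameters (sampling rate, placeholder signal, number of neighbouring bins, harmonic range), not the analysis pipeline used in the study.

```python
import numpy as np

# Illustrative sketch only (not the authors' pipeline): quantify a periodic
# oddball response from the EEG amplitude spectrum of one channel.
# The sampling rate, placeholder signal and baseline-correction parameters
# below are assumptions made for the example.

fs = 512                       # assumed sampling rate (Hz)
duration = 80                  # sequence duration from the abstract (s)
base_freq = 5.88               # base stimulation frequency (Hz)
oddball_freq = base_freq / 5   # 1.176 Hz, reported as 1.18 Hz

eeg = np.random.randn(fs * duration)   # placeholder for one occipito-temporal channel

# Amplitude spectrum; 80 s of data yields a 0.0125 Hz resolution, so the
# stimulation frequencies fall on (or very near) exact FFT bins.
freqs = np.fft.rfftfreq(eeg.size, d=1 / fs)
amp = 2 * np.abs(np.fft.rfft(eeg)) / eeg.size

def corrected_amp(target, n_neighbours=20, skip=1):
    """Amplitude at the target frequency minus the mean amplitude of
    surrounding bins (skipping bins immediately adjacent to the target)."""
    idx = int(np.argmin(np.abs(freqs - target)))
    neighbours = np.r_[idx - skip - n_neighbours: idx - skip,
                       idx + skip + 1: idx + skip + 1 + n_neighbours]
    return amp[idx] - amp[neighbours].mean()

# Oddball response summed over its first harmonics (e.g., 2f = 2.36 Hz),
# excluding harmonics that coincide with the base frequency (every 5th).
oddball_response = sum(corrected_amp(h * oddball_freq)
                       for h in range(1, 7) if h % 5 != 0)
base_response = corrected_amp(base_freq)

print(f"oddball response: {oddball_response:.3f}  base response: {base_response:.3f}")
```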