Abstract
Facial expressions are an important part of the non-verbal communication used in everyday life. The N170 is widely regarded as a face-sensitive potential and has been linked to the structural encoding of faces; however, it remains debated whether the N170 is modulated by facial expressions of emotion. We investigated how attention to facial features affects the early stages of emotion perception during an implicit emotion processing task. ERPs were recorded in response to presentations of fearful, joyful, or neutral facial expressions while fixation was restricted, using an eye tracker, to the left eye, right eye, nose, or mouth. Participants’ task was to discriminate the gender of the face. Enhanced N170 amplitudes and longer latencies were found when participants fixated the left or right eye compared with the nose or mouth, irrespective of emotion. Importantly, the N170 was not modulated by emotion. These results support the view that the N170 component is not sensitive to facial expression in an implicit emotion task; in contrast, it is modulated by which facial feature is fixated. As the eyes have been shown to be the diagnostic feature for correctly categorizing face gender, it could be that attention to the diagnostic feature is what drove the N170 emotion modulations reported in previous studies that did not control for fixation. This idea is currently being tested using an explicit emotion task with the same stimuli.
Meeting abstract presented at VSS 2013