Abstract
The neural representations of facial expression are not well defined. We used an adaptation paradigm to determine how a variety of images affected perception of expression in a subsequently viewed series of ambiguous target faces that were derived by morphing between two facial expressions. When subjects viewed one of the emotional faces that had been used to generate the morphed series, there was a substantial shift towards perceiving the emotion opposite to the adapting stimulus in the ambiguous face. However, this aftereffect was not due to low-level image adaptation, as an equally strong aftereffect was generated when a different image of the same expression from the same individual was used as the adapting stimulus. A weaker but significant aftereffect was still generated when the adapting stimulus was the same expression on the face of a different person, regardless of gender. Non-face visual, auditory, or verbal representations of emotion did not generate a similar aftereffect. The results suggest that adaptation affects at least two neural representations of emotion: one that is specific to the individual involved (but not to the specific image), and one that represents emotion across different facial identities. The identity-independent aftereffect suggests the existence of a generalizable visual semantic representation of facial expression in the human visual system.
This work was supported by NIH grant 1R01 MH069898 and CIHR grant 77615. CJF was supported by a Michael Smith Foundation for Health Research Junior Graduate Studentship. JJSB was supported by a Canada Research Chair and a Michael Smith Foundation for Health Research Senior Scholarship.