Abstract
Holistic face processing in humans has been studied with regard to identity and emotional facial expressions (Calder et al., 2000). However, prior work on holistic processing of facial expressions has used the partial composite paradigm, which is known to exhibit bias effects (Richler et al., 2011). The complete composite paradigm (Gauthier & Bukach, 2007) provides a bias-free way to quantitatively measure the interaction between a subject's or model's decision on whether a cued face half remains the same across a pair of composite faces and the presence or absence of a change in the uncued half; this interaction, a congruency effect, is indicative of holistic processing. We performed a series of simulations of the complete composite paradigm with EMPATH, a neurocomputational model of facial expression recognition, in order to predict human behavior in this task. The version of EMPATH that we used has a layer of Gabor filtering corresponding to V1, followed by a layer of Principal Components Analysis corresponding to category-specific areas of IT, and finally a perceptron with a hidden layer of units corresponding to PFC (Dailey et al., 2002). As expected under the complete composite paradigm, we observed a reduction in holistic facial expression processing effects when the face halves were misaligned relative to when they were aligned. Our model also performed qualitatively similarly to the participants in recent experiments on facial recognition in humans (Tanaka et al., 2011), both in accuracy and in a measure of confidence that we defined for comparison with the participants' reaction times. Using this unbiased test of holistic processing, our results suggest that facial expression processing is holistic.
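The three-stage pipeline named in the abstract (Gabor filtering for V1, Principal Components Analysis for IT, and a one-hidden-layer perceptron for PFC) can be sketched as below. This is a minimal illustrative sketch only, not the authors' implementation: all layer sizes, filter parameters, and the toy random "face" data are assumptions made for the example.

```python
# Minimal sketch of an EMPATH-like pipeline (after Dailey et al., 2002).
# All sizes and parameters here are illustrative assumptions, not the
# original model's settings.
import numpy as np

rng = np.random.default_rng(0)

def gabor_kernel(size, theta, freq=0.2, sigma=3.0):
    """One Gabor filter: a sinusoid windowed by a Gaussian (V1-like unit)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * freq * xr)

def gabor_layer(img, n_orient=4, size=9):
    """Apply Gabor filters at several orientations over a coarse patch grid."""
    feats = []
    for k in range(n_orient):
        g = gabor_kernel(size, theta=k * np.pi / n_orient)
        for i in range(0, img.shape[0] - size, size):
            for j in range(0, img.shape[1] - size, size):
                feats.append(np.sum(img[i:i + size, j:j + size] * g))
    return np.array(feats)

def pca_fit(X, k):
    """PCA via SVD on centered data; returns the mean and top-k components."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

# Toy stand-in data: 60 random 36x36 "faces", 6 expression classes.
n, side, n_classes = 60, 36, 6
images = rng.normal(size=(n, side, side))

G = np.stack([gabor_layer(im) for im in images])   # V1-like stage
mu, comps = pca_fit(G, k=10)                       # IT-like stage
Z = (G - mu) @ comps.T                             # low-dimensional face code

# PFC-like stage: one-hidden-layer perceptron (forward pass only; weights
# here are random, whereas the model would be trained on expression labels).
W1 = rng.normal(scale=0.1, size=(Z.shape[1], 20))
W2 = rng.normal(scale=0.1, size=(20, n_classes))
hidden = np.tanh(Z @ W1)
scores = hidden @ W2                               # one score per expression

print(Z.shape, scores.shape)  # prints (60, 10) (60, 6)
```

In a composite-paradigm simulation, the model's "same/different" decision on a cued half could then be read off from these expression scores for aligned versus misaligned composites.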
Meeting abstract presented at VSS 2013