Abstract
In general, observers can reliably distinguish between real faces and computer-generated (CG) faces. Previous results suggest that observers rely heavily on eye appearance to categorize faces as real or artificial. We examined how eye appearance affects real/CG face categorization by using contrast negation to selectively disrupt local feature appearance. Negation severely disrupts face processing, so we expected that real/CG categorization would be impaired following the negation of critical features. Critically, we generated "contrast chimeras" (images in which the contrast polarity of the eyes differs from that of the rest of the face) to determine how task performance was affected by eye appearance relative to the rest of the face. In two experiments, we used a simple discrimination task to measure real/CG categorization. Our stimuli consisted of photographs of real men and women and CG faces created from the original photographs. In both experiments, participants viewed a real face and a CG face sequentially on each trial and were asked to identify which image depicted a real person. In Experiment 1 (N=16), observers completed this task using the original images and fully contrast-negated faces. In Experiment 2 (N=21), participants viewed positive and negative images, as well as contrast chimeras depicting either negative eyes in a positive face or vice versa. In all cases, image presentation time was 500 ms with a 750 ms ISI. We observed a main effect of contrast polarity on performance in Experiment 1 (p<0.05), demonstrating that real/CG categorization is disrupted by negation. In Experiment 2, we observed no significant effect of eye polarity on accuracy (p=0.14), but found that negating the rest of the face significantly lowered accuracy (p=0.02). We conclude that real/CG face categorization may depend less on eye appearance than prior reports suggest, and that the appearance of the skin may be a highly diagnostic cue.
Meeting abstract presented at VSS 2014
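The contrast manipulations described above are straightforward pixel operations. The sketch below (in Python) illustrates one way full negation and a contrast chimera could be constructed, assuming 8-bit grayscale images and a hand-drawn binary eye-region mask; the filenames, mask format, and compositing procedure are illustrative assumptions, not details taken from the abstract.

```python
# Minimal sketch of contrast negation and chimera construction,
# assuming 8-bit grayscale face images and a binary eye-region mask.
# File names and the masking procedure are hypothetical.
import numpy as np
from PIL import Image

def negate(img: np.ndarray) -> np.ndarray:
    """Reverse contrast polarity: light pixels become dark and vice versa."""
    return 255 - img

def make_chimera(face: np.ndarray, eye_mask: np.ndarray,
                 negative_eyes: bool) -> np.ndarray:
    """Combine positive and negative versions of the same face.

    eye_mask is a boolean array that is True inside the eye regions.
    negative_eyes=True  -> negated eyes in an otherwise positive face.
    negative_eyes=False -> positive eyes in an otherwise negated face.
    """
    pos, neg = face, negate(face)
    base, eyes = (pos, neg) if negative_eyes else (neg, pos)
    return np.where(eye_mask, eyes, base)

face = np.asarray(Image.open("face.png").convert("L"))
mask = np.asarray(Image.open("eye_mask.png").convert("L")) > 127

Image.fromarray(make_chimera(face, mask, negative_eyes=True)).save("chimera_negative_eyes.png")
```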