Abstract
To study how the visual system computes the 3D shape of faces from shading information, we manipulated the illumination conditions on 3D scanned face models and measured how face discrimination changes with lighting direction. To dissociate the surface albedo and illumination components of face images, we used a symmetry algorithm to separate the symmetric and asymmetric components of face images in both low and high spatial frequency bands. Stimuli were hybrid male/female faces with different combinations of symmetric and asymmetric spatial content. We verified that the perceived depth of a face was proportional to the degree of asymmetric low spatial frequency (shading) information in the face. The symmetric component was morphed from a male face to a female one. The asymmetric shading component was manipulated by changing the lighting direction from 0° (frontal) to 60° (side). On each trial, the observer's task was to determine whether the test image was male or female. The proportion of "female" responses increased with the proportion of the female component in the morph. Faces with asymmetric "male" shading were more easily judged as male than those with "female" shading, and vice versa. This shading effect increased with lighting direction. In contrast, the low spatial frequency symmetric information had little, if any, effect. The perceived depth of a face increased with shading information but not with symmetric information. Together, these results suggest that (1) the shading information carried by asymmetric low spatial frequencies dramatically affects both perceived face identity and the perceived depth of the facial structure; and (2) this effect increases as the lighting direction shifts to the side. Thus, our results provide evidence that face processing has a strong 3D component.
NSC 96-2413-H-002-006-MY3.
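The symmetric/asymmetric decomposition described in the abstract can be illustrated with a minimal sketch. The paper's exact symmetry algorithm, filter type, and frequency cutoff are not specified here, so the mirror-based split and the Gaussian low-pass band separation below (including the `sigma` parameter) are illustrative assumptions only.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def split_symmetry_bands(face, sigma=8.0):
    """Split a face image (aligned so the facial midline is the central
    column) into symmetric and asymmetric components, each in a low and
    a high spatial frequency band.

    Illustrative sketch: the actual algorithm and filter parameters used
    in the study may differ.
    """
    mirrored = face[:, ::-1]                 # reflect about the vertical midline
    symmetric = (face + mirrored) / 2.0      # even (bilaterally symmetric) component
    asymmetric = (face - mirrored) / 2.0     # odd (asymmetric) component, carries shading cues

    def to_bands(img):
        low = gaussian_filter(img, sigma)    # low spatial frequency band (assumed Gaussian low-pass)
        high = img - low                     # high spatial frequency residual
        return low, high

    sym_low, sym_high = to_bands(symmetric)
    asym_low, asym_high = to_bands(asymmetric)
    return sym_low, sym_high, asym_low, asym_high

# A hybrid stimulus could then recombine bands across source images, e.g.
# keep the symmetric content of a morphed face while swapping in the
# asymmetric low-frequency (shading) band of a face lit from the side:
# hybrid = sym_low_morph + sym_high_morph + asym_low_side_lit + asym_high_morph
```

Because reflection and linear filtering commute, filtering the symmetric and asymmetric components separately gives the same bands as symmetrizing the filtered image, so the order of the two steps in such a sketch does not matter.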