Abstract
Humans naturally perceive the similarity between different objects and are especially sensitive to facial similarity; indeed, it has been suggested that individuals seek partners with similar facial attributes. The goal of this work is to place facial similarity on a solid computational footing by developing algorithms that map measured facial features into metric spaces that conform to human notions of facial similarity.
While there is a vast literature devoted to facial recognition, judging facial similarity is a subtler and more difficult problem that presents numerous challenges. One challenge is obtaining a large number of reliable facial similarity judgments from which to learn an appropriate metric space for faces. We compare the efficiency of several similarity rating techniques using both information-theoretic analysis and synthetic simulations based on subject data, and we use these analyses to identify the most promising rating method. Another challenge is developing objective approaches for measuring facial similarity and acquiring datasets that can supplement the similarity information obtained from human ratings. For this purpose, we utilize a variety of datasets, including facial morphs and sets of images of the same person under different poses and lighting conditions. Finally, we outline a computational framework that automatically maps measurements of faces into a new space that conforms to human notions of facial similarity. Our system exhibits a surprising ability to generalize similarity information learned from different datasets, and the resulting perceptual maps show significant improvement over baseline techniques in predicting human similarity judgments.
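To make the idea of mapping facial features into a similarity-conforming metric space concrete, the following is a minimal sketch (not the paper's actual method) of learning a linear perceptual map from triplet-style human judgments. All feature dimensions, triplet data, and hyperparameters below are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

n_faces, n_features, n_embed = 200, 50, 10
X = rng.normal(size=(n_faces, n_features))   # measured facial features (placeholder)

# Each triplet (a, p, n) encodes a human judgment:
# face a was rated more similar to face p than to face n.
triplets = [(rng.integers(n_faces), rng.integers(n_faces), rng.integers(n_faces))
            for _ in range(1000)]

W = rng.normal(scale=0.1, size=(n_features, n_embed))  # linear perceptual map
lr, margin = 0.01, 1.0

for epoch in range(50):
    for a, p, n in triplets:
        za, zp, zn = X[a] @ W, X[p] @ W, X[n] @ W
        d_pos = np.sum((za - zp) ** 2)       # distance to the "similar" face
        d_neg = np.sum((za - zn) ** 2)       # distance to the "dissimilar" face
        if d_pos - d_neg + margin > 0:       # hinge: update only violated triplets
            # gradient of the hinge loss with respect to W
            grad = (2 * np.outer(X[a] - X[p], za - zp)
                    - 2 * np.outer(X[a] - X[n], za - zn))
            W -= lr * grad

def perceptual_distance(i, j):
    """Distance in the learned space, used as a proxy for perceived similarity."""
    return np.linalg.norm(X[i] @ W - X[j] @ W)
```

In this sketch, distances in the transformed space stand in for human similarity judgments; the framework described in the paper plays an analogous role but is learned from the rating and morph datasets discussed above.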