Abstract
We explore methods for finding perceptually similar faces by combining unsupervised ‘eigenface’ techniques with facial similarity ratings obtained from human subjects. We use a set of 1050 face images for which we have obtained the locations of facial features as well as the facial outline. The images are from the internet and exhibit uncontrolled lighting, viewpoint, and resolution. We segment and warp the faces to register the features and extract PCA coefficients from both the warping and the appearance of the faces. Previous authors (e.g. Dailey, Cottrell, Busey, 1999) have observed performance gains on face recognition tasks by utilizing a psychological space obtained from human similarity ratings. We explore whether psychological space may be modeled as a Euclidean space by constructing a mapping function from PCA space to a space that conforms to psychometric similarity judgments. The map is computed in two steps. First, we create an affinity matrix for a training set of 750 images. Similarity trees of 35 images are constructed using a greedy algorithm in which the two most similar faces are repeatedly joined together until every image has been placed in a tree (Rhodes, 1988). Each tree provides a fully ranked 35×35 matrix of distances, which is used to populate the affinity matrix. Next, we explore various cost functions and optimize the parameters of the mapping to minimize the disparities between the similarity judgments made by subjects and the Euclidean distances of the PCA representations. The performance of the map is measured on a test set of 300 images rated using the similarity tree algorithm. Our performance metric considers the ∼25 images closest to a target image: it rewards retrievals that subjects consider close to the target and penalizes retrievals that are far from it. The mapping yields performance increases of up to 20% over Euclidean PCA, indicating that mapping to a psychometric space can improve retrieval performance.
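As an illustration of the tree-building step, the sketch below is a minimal, hypothetical rendering of the greedy joining procedure: starting from singleton clusters over 35 images, the two most similar clusters are repeatedly merged, and the merge order yields a ranked 35×35 distance matrix. The random similarity matrix, average-linkage merge rule, and function names are assumptions made for illustration; in the experiments the similarities come from human raters.

```python
# Minimal sketch (not the authors' code) of a greedy similarity tree:
# repeatedly join the two most similar clusters of faces until all images
# sit in one tree, then read ranked pairwise distances off the merge order.
import numpy as np

def build_similarity_tree(sim):
    """sim: (n, n) symmetric similarity matrix (here a stand-in for human ratings).
    Returns an (n, n) matrix of 'tree distances': the merge step at which two
    images were first joined, giving a fully ranked distance matrix."""
    n = sim.shape[0]
    clusters = [{i} for i in range(n)]            # start with singleton clusters
    dist = np.zeros((n, n))
    step = 0
    while len(clusters) > 1:
        # find the pair of clusters with the highest average similarity
        best, best_sim = None, -np.inf
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                s = np.mean([sim[i, j] for i in clusters[a] for j in clusters[b]])
                if s > best_sim:
                    best, best_sim = (a, b), s
        a, b = best
        step += 1
        # images joined later are treated as farther apart
        for i in clusters[a]:
            for j in clusters[b]:
                dist[i, j] = dist[j, i] = step
        clusters[a] |= clusters[b]
        del clusters[b]
    return dist

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 35                                        # one tree of 35 images, as in the abstract
    sim = rng.random((n, n)); sim = (sim + sim.T) / 2   # symmetric placeholder ratings
    d = build_similarity_tree(sim)
    print(d.shape)                                # (35, 35) ranked distance matrix
```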
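The second step can be sketched in the same spirit. The code below is an illustrative example, not the authors' implementation: it fits a linear map W on the PCA coefficients by gradient descent on a squared-error "stress" cost between mapped Euclidean distances and the psychometric distances in the affinity matrix. The linear form of the map, the specific cost, and the optimizer settings are all assumptions; the abstract states that several cost functions were explored.

```python
# Hedged sketch of fitting a map from PCA space toward psychometric space:
# minimize sum over pairs of (||W x_i - W x_j|| - D_ij)^2 by gradient descent.
import numpy as np

def fit_map(X, D, n_iter=500, lr=1e-3):
    """X: (n, k) PCA coefficients; D: (n, n) target psychometric distances.
    Returns a (k, k) linear map W."""
    n, k = X.shape
    W = np.eye(k)
    iu = np.triu_indices(n, 1)                    # unordered pairs i < j
    for _ in range(n_iter):
        Y = X @ W.T                               # mapped coefficients
        diff = Y[:, None, :] - Y[None, :, :]      # (n, n, k) pairwise differences
        dist = np.linalg.norm(diff, axis=-1) + 1e-9
        err = dist - D                            # disparity to psychometric distances
        coef = err / dist
        G = np.zeros_like(W)                      # gradient of the stress cost w.r.t. W
        for i, j in zip(*iu):
            # d||W(x_i - x_j)|| / dW = (W(x_i - x_j)) (x_i - x_j)^T / ||W(x_i - x_j)||
            G += 2 * coef[i, j] * np.outer(diff[i, j], X[i] - X[j])
        W -= lr * G / len(iu[0])
    return W
```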