June 2006, Volume 6, Issue 6
Vision Sciences Society Annual Meeting Abstract
Explaining human facial attractiveness judgements
Author Affiliations
  • Philip Bronstad
    Brandeis University
  • Judith H. Langlois
    The University of Texas at Austin
  • R Russell
    Harvard University
Journal of Vision June 2006, Vol.6, 1070. doi:10.1167/6.6.1070

People from different cultural and economic backgrounds agree about which faces are attractive, suggesting that humans share an attractiveness “template.” Much research in the last two decades has focused on the composition of this template, such as whether it is primarily sensitive to facial averageness, symmetry, or sexual dimorphism. However, we have little understanding of how different aspects of appearance codetermine perceptions of attractiveness. Using a simple neural network model, we can reconstruct an attractiveness template from images of faces and attractiveness judgements of those images. The model solves the difficult problem of replicating and predicting human attractiveness judgements to images of faces. It decomposes judgements into simpler image-based factors that are sensitive to aspects of facial appearance. These factors are strikingly similar to averageness, symmetry, and sexual dimorphism.
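The abstract does not specify the model's architecture, so the following is only a minimal illustrative sketch of the general idea: a small neural network with a narrow hidden layer, trained to map image-derived feature vectors to attractiveness ratings, where the few hidden units play the role of simpler image-based factors. All dimensions, data, and names here are assumptions for illustration, not the authors' actual model.

```python
# Hypothetical sketch (not the authors' model): a one-hidden-layer network
# mapping image-based feature vectors to attractiveness ratings.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 200 "faces", each a 50-dim feature vector, with
# ratings generated from a hidden linear rule plus noise.
X = rng.normal(size=(200, 50))
true_w = rng.normal(size=50)
y = X @ true_w + 0.1 * rng.normal(size=200)

# Three hidden units, echoing the idea that judgements decompose into a
# few factors (e.g. averageness, symmetry, sexual dimorphism).
W1 = rng.normal(scale=0.1, size=(50, 3))
W2 = rng.normal(scale=0.1, size=3)

def forward(X):
    h = np.tanh(X @ W1)   # factor activations
    return h @ W2         # predicted attractiveness rating

initial_mse = float(np.mean((forward(X) - y) ** 2))

# Plain gradient descent on mean squared error.
lr = 0.01
for _ in range(500):
    h = np.tanh(X @ W1)
    err = h @ W2 - y
    grad_W2 = h.T @ err / len(y)
    grad_h = np.outer(err, W2) * (1 - h ** 2)  # backprop through tanh
    grad_W1 = X.T @ grad_h / len(y)
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

final_mse = float(np.mean((forward(X) - y) ** 2))
```

After training, the hidden-unit activations can be inspected as candidate image-based factors; in the actual study these were found to resemble averageness, symmetry, and sexual dimorphism.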

Bronstad, P., Langlois, J. H., & Russell, R. (2006). Explaining human facial attractiveness judgements [Abstract]. Journal of Vision, 6(6):1070, 1070a, http://journalofvision.org/6/6/1070/, doi:10.1167/6.6.1070.
