Vision Sciences Society Annual Meeting Abstract  |  September 2024
Journal of Vision, Volume 24, Issue 10 (Open Access)
The multidimensional representation of facial attributes.
Author Affiliations & Notes
  • Jessica Taubert
    The University of Queensland, QLD, Australia
    The National Institute of Mental Health, MD, United States
  • Shruti Japee
    The National Institute of Mental Health, MD, United States
  • Amanda Robinson
    The University of Queensland, QLD, Australia
  • Houqiu Long
    The University of Queensland, QLD, Australia
  • Tijl Grootswagers
    Western Sydney University, NSW, Australia
  • Charles Zheng
    The National Institute of Mental Health, MD, United States
  • Francisco Pereira
    The National Institute of Mental Health, MD, United States
  • Chris Baker
    The National Institute of Mental Health, MD, United States
  • Footnotes
    Acknowledgements  This research was supported by the Intramural Research Program of the National Institute of Mental Health (ZIAMH002909 to CIB) and the Australian Research Council (FT200100843 to JT).
Journal of Vision, September 2024, Vol. 24, 662. https://doi.org/10.1167/jov.24.10.662
Citation

Jessica Taubert, Shruti Japee, Amanda Robinson, Houqiu Long, Tijl Grootswagers, Charles Zheng, Francisco Pereira, Chris Baker; The multidimensional representation of facial attributes. Journal of Vision 2024;24(10):662. https://doi.org/10.1167/jov.24.10.662.

© ARVO (1962-2015); The Authors (2016-present)
Abstract

As primates, our social behaviour is shaped by our ability to read the faces of the people around us. Our current understanding of the neural processes governing ‘face reading’ comes primarily from studies that have focused on the recognition of facial expressions. However, these studies have often used staged facial expressions, potentially disconnecting facial morphology from genuine emotion and circumstance. A reliance on staged stimuli might therefore be obscuring our understanding of how faces are perceived and recognised during everyday life. Here, our goal was to identify the core dimensions underlying the mental representation of expressive facial stimuli using a data-driven approach. In two behavioural experiments (Experiment 1, N = 940; Experiment 2, N = 489), we used an odd-one-out task to measure perceived dissimilarity within two sets of faces: 900 highly variable, naturalistic, expressive stimuli from the Wild Faces Database (Long, Peluso, et al., 2023, Sci Rep, 13: 5383) and 670 highly controlled, staged stimuli from the NimStim database (Tottenham, Tanaka, et al., 2009, Psychiatry Res, 168: 3). Using Representational Similarity Analysis, we mapped the representations of the Wild and NimStim faces separately and compared these representations to behavioural and computational models. We also employed the state-of-the-art VICE model (Muttenthaler, Zheng, et al., 2022, Adv Neural Inf Process Syst) to uncover the dimensions that best explained behaviour towards each face set. Collectively, these results indicate that the representation of the Wild Faces was best characterised by perceived social categories, such as gender, and by emotional valence. By comparison, facial expression category explained more of the perceived dissimilarity among the NimStim faces than among the Wild Faces. These findings underscore the importance of stimulus selection in visual cognition research and suggest that, under naturalistic circumstances, humans spontaneously use information about both social category and expression to evaluate faces.
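For readers unfamiliar with the method, the following is a minimal sketch of how an odd-one-out task can yield a representational dissimilarity matrix (RDM) that is then compared to a candidate model RDM, as in Representational Similarity Analysis. All stimulus counts, trial data, and the model RDM below are simulated placeholders for illustration; this is not the authors' code or data.

    # Sketch: estimating an RDM from odd-one-out triplet judgments and
    # comparing it to a model RDM. All data here are simulated.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)
    n_faces = 20                      # illustrative; far fewer than the 900 Wild Faces

    # Each trial shows three stimuli; one is chosen as the odd one out.
    triplets = rng.choice(n_faces, size=(5000, 3))
    # Keep only triplets with three distinct stimuli.
    distinct = np.all(np.diff(np.sort(triplets, axis=1), axis=1) > 0, axis=1)
    triplets = triplets[distinct]
    # Simulated choices: a random member of each triplet is the odd one out.
    odd_one_out = triplets[np.arange(len(triplets)), rng.integers(0, 3, len(triplets))]

    together = np.zeros((n_faces, n_faces))  # times a pair was NOT split by the choice
    shown = np.zeros((n_faces, n_faces))     # times a pair appeared in the same trial

    for trial, odd in zip(triplets, odd_one_out):
        for i in range(3):
            for j in range(i + 1, 3):
                a, b = trial[i], trial[j]
                shown[a, b] += 1
                shown[b, a] += 1
                if odd not in (a, b):        # this pair was judged most similar
                    together[a, b] += 1
                    together[b, a] += 1

    # Dissimilarity = 1 - P(pair judged most similar | pair shown together).
    with np.errstate(invalid="ignore", divide="ignore"):
        rdm = 1.0 - together / shown
    np.fill_diagonal(rdm, 0.0)

    # Compare the behavioural RDM to a (here random, symmetric) model RDM
    # using a rank correlation over the off-diagonal entries.
    model_rdm = rng.random((n_faces, n_faces))
    model_rdm = (model_rdm + model_rdm.T) / 2
    iu = np.triu_indices(n_faces, k=1)
    rho, p = spearmanr(rdm[iu], model_rdm[iu], nan_policy="omit")
    print(f"Spearman rho = {rho:.3f} (p = {p:.3g})")

In practice, each candidate model (for example, a coding of expression category or of a perceived social category) yields its own RDM, and the rank correlation quantifies how well that model accounts for the behavioural dissimilarities.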
