December 2022, Volume 22, Issue 14
Open Access
Vision Sciences Society Annual Meeting Abstract
Multiracial Reading the Mind in the Eyes Test (MRMET): validation of a stimulus-diverse and norm-referenced version of a classic measure
Author Affiliations
  • Jeremy Wilmer
    Wellesley College
  • Heesu Kim
    Wellesley College
  • Jasmine Kaduthodil
    Wellesley College
  • Laura Germine
    Wellesley College
  • Sarah Cohan
    Wellesley College
  • Brian Spitzer
    Wellesley College
  • Roger Strong
    Wellesley College
Journal of Vision, December 2022, Vol. 22, 3205. https://doi.org/10.1167/jov.22.14.3205
Abstract

Do racially homogeneous stimuli facilitate scientific control, and thus validity of measurement? Here, as a case in point, we ask whether a multiracial cognitive assessment that uses a diverse set of stimuli maintains psychometric qualities as good as, if not better than, those of an existing Eurocentric measure. The existing measure is the Reading the Mind in the Eyes Test (RMET) (Baron-Cohen et al., 2001), a clinically significant neuropsychiatric paradigm that has been used to assess facial expression reading, theory of mind, and social cognition. The original measure, however, lacked racially inclusive stimuli, among other limitations. To rectify these limitations, we created and digitally administered a Multiracial version of the RMET (MRMET) that is reliable, validated, stimulus-diverse, norm-referenced, and free for research use. We show, with a series of sizable datasets (Ns ranging from 1,000 to 12,000), that the MRMET is on par with, or better than, the RMET across a variety of psychometric indices. Moreover, the reliable signal captured by the two tests is statistically indistinguishable, providing evidence of full interchangeability. Given the diversity of the populations that neuropsychology aims to survey, we introduce the Multiracial RMET as a high-quality, inclusive alternative to the RMET that is conducive to unsupervised digital administration across a diverse array of populations. With the MRMET as a key example, we suggest that multiracial cognitive assessments using diverse stimuli can be as good as, if not better than, Eurocentric measures.
