September 2021, Volume 21, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract
Investigating coloration as an emotion expressive cue for social robots
Author Affiliations
  • Christopher Thorstenson
    University of Wisconsin-Madison
  • Karen Schloss
    University of Wisconsin-Madison
Journal of Vision September 2021, Vol.21, 2826. doi:
      Christopher Thorstenson, Karen Schloss; Investigating coloration as an emotion expressive cue for social robots. Journal of Vision 2021;21(9):2826. doi:

      © ARVO (1962-2015); The Authors (2016-present)

Social robots are artificial social agents designed to interact with humans across settings, including education, recreation, and healthcare. To be effective, social robots need to communicate social information, such as emotions, to humans. Color cues are one promising way to convey emotions because people have rich color-emotion associations that can be leveraged to influence judgments of artificial emotions. In the current work, we generated images of two social robot models (one that resembles a human face, and one nonanthropomorphic disk that resembles an Alexa) rendered in different colors. Participants indicated how much the color elicited judgments of emotions (anger, disgust, happiness, fear, sadness, surprise, valence, arousal). Robot colors were sampled across CIELAB color space. Color varied within subjects and robot model varied between subjects (25 colors x 8 emotions x 2 repetitions = 400 trials). Results for some emotions (e.g., anger) were largely consistent with previous research on human facial color-emotion associations (Thorstenson et al., 2018). For example, both face and disk robots that were redder (a*; B=10.98, p<.001) and yellower (b*; B=8.09, p<.001) were judged as angrier. However, results for other emotions differed between face and disk robots in systematic ways: emotion judgments for face robots tended to be driven by differences along the color-opponent dimensions (a* or b*), whereas judgments for disk robots tended to be driven by differences along chroma (C*). For example, participants judged face robots as sadder as they increased in blueness (b*, B=-13.06, p<.001), whereas they judged disk robots as sadder as they decreased in chroma (C*, B=-19.86, p<.001), with no effect of blueness (b*, B=-.26, p=.93). These differences can be understood in terms of different emotion-inference processes serving different ecological functions. Understanding how people evaluate robot emotions from coloration will help develop social robots with robust emotion-expressive capabilities, facilitating meaningful human-robot social interactions.
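The distinction the abstract draws between the color-opponent dimensions (a*, b*) and chroma (C*) follows from standard CIE colorimetry: C* is the radial distance of a color from the neutral gray axis in the a*-b* plane, so a color can change in a* or b* (hue direction) without changing C*, and vice versa. A minimal sketch of that relationship (the sample values below are illustrative, not stimuli from the study):

```python
import math

def cielab_chroma(a_star: float, b_star: float) -> float:
    """Chroma C*ab: radial distance from the neutral axis in the a*-b* plane."""
    return math.hypot(a_star, b_star)

def cielab_hue_degrees(a_star: float, b_star: float) -> float:
    """Hue angle h_ab in degrees (0 ~ red direction, 90 ~ yellow, 270 ~ blue)."""
    return math.degrees(math.atan2(b_star, a_star)) % 360.0

# Illustrative (hypothetical) color sample in the red-yellow quadrant:
a, b = 40.0, 30.0
print(cielab_chroma(a, b))                  # 50.0
print(round(cielab_hue_degrees(a, b), 1))   # 36.9

# A bluish sample (negative b*) with the SAME chroma as above, showing that
# hue-direction differences (a*, b*) are independent of chroma (C*):
print(cielab_chroma(0.0, -50.0))            # 50.0
```

This independence is what lets the study separate the two kinds of effects: a sadness judgment driven by blueness tracks b*, while one driven by desaturation tracks C* regardless of hue direction.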

