August 2023
Volume 23, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2023
Prediction of preference judgments of face images using facial expressions and EEG signals
Author Affiliations
  • Satoshi Shioiri
    Tohoku University
  • Hikaru Nagata
    Max Planck Institute for Intelligent Systems, Tübingen
  • Yoshiyuki Sato
    William James Center for Research, ISPA-Instituto Universitario, Lisbon, Portugal
  • Yasuhiro Hatori
    University of Liverpool
Journal of Vision August 2023, Vol.23, 5063. doi:

      Satoshi Shioiri, Hikaru Nagata, Yoshiyuki Sato, Yasuhiro Hatori; Prediction of preference judgments of face images using facial expressions and EEG signals. Journal of Vision 2023;23(9):5063.


      © ARVO (1962-2015); The Authors (2016-present)


Facial expressions reflect mental processes, so face images can be used to predict them. Since neural activity also underlies mental processes, combining EEG signals with facial expressions is likely to predict mental processes better than either modality alone. The purpose of this study is to predict preference judgments of face images, as one such mental process, from facial expressions and EEG signals, which would help create new methods for investigating mental processes. We conducted experiments to assess preference judgments of face images, during which the participants' EEG signals and face images were recorded. In Exp. 1, participants compared two images and chose the preferable one (relative judgment) for pairs with different pre-estimated preference levels. In Exp. 2, participants reported absolute levels of preference (absolute judgment) on a scale from 1 to 7. Facial features were analyzed with the open-source software OpenFace, and EEG signals were analyzed separately for the δ, θ, α, β, and γ bands at each of eight electrodes. Using a machine learning method, LightGBM, we classified trials with larger pre-estimated differences in preference level (top 25%) versus those with smaller differences (bottom 25%) in Exp. 1, and trials with higher preference (ratings 6-7) versus those with lower preference (ratings 1-2) in Exp. 2. Classification performance was above chance in both experiments: the area under the receiver operating characteristic curve (AUC-ROC) was 0.65 for Exp. 1 and 0.66 for Exp. 2. The analysis shows that the major contributing factors differ between the two cases: facial features dominate the prediction of relative judgments, whereas EEG signals also contribute to absolute judgments. These results suggest that combining facial expressions and EEG signals can help create new methods for investigating mental processes.
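The classification pipeline described above can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the study's analysis code: the data are synthetic, scikit-learn's GradientBoostingClassifier stands in for LightGBM, and the feature counts (OpenFace outputs, 5 bands × 8 electrodes) are assumptions.

```python
# Hedged sketch (not the authors' code): binary classification of preference
# trials from combined facial-expression and EEG band-power features,
# evaluated by AUC-ROC as in the abstract. GradientBoostingClassifier is a
# stand-in for LightGBM; all feature counts and data are illustrative.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_trials = 400
n_face = 17        # e.g. OpenFace action-unit intensities (assumed count)
n_eeg = 5 * 8      # 5 bands (delta..gamma) x 8 electrodes

# Synthetic data: facial features carry a stronger class signal than EEG,
# loosely mimicking the abstract's finding for relative judgments.
y = rng.integers(0, 2, n_trials)
face = rng.normal(size=(n_trials, n_face)) + 0.6 * y[:, None]
eeg = rng.normal(size=(n_trials, n_eeg)) + 0.15 * y[:, None]
X = np.hstack([face, eeg])

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"AUC-ROC: {auc:.2f}")

# Rough per-modality contribution via impurity-based feature importances,
# analogous to asking whether face or EEG features drive the prediction.
imp = clf.feature_importances_
print("face importance:", imp[:n_face].sum(),
      "eeg importance:", imp[n_face:].sum())
```

Comparing the summed importances of the two feature groups is one simple way to ask, as the abstract does, whether facial or EEG features dominate a given judgment type.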

