September 2017
Volume 17, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2017
Processing of congruent and incongruent facial expressions during listening to music: an eye-tracking study
Author Affiliations
  • Kari Kallinen
    Finnish Defense Research Agency
Journal of Vision August 2017, Vol.17, 199. doi:
      © ARVO (1962-2015); The Authors (2016-present)

Introduction: Studies have shown that (a) multimodal emotional experience may be heightened when music and pictures are combined and (b) music influences ratings of visual stimuli. However, there is a scarcity of studies examining the potential moderating effects of music on looking at images. In the present paper we report the results of an eye-tracking study on congruent and incongruent emotional music (joyful, sad, and angry) and facial expressions (happy and sad). We expected that facial expressions congruent with the music would attract more attention than incongruent faces. In addition, we expected that angry music (which had no corresponding face images) would elicit the highest eye-movement activity between the facial expressions, as subjects search for a corresponding facial expression.

Methods: Five men and five women aged 33-64 years (M = 46.9) took part in the experiment. Their task was to listen to three pieces of music (a priori sad, joyful, and angry) and at the same time look at facial expressions (sad and happy) presented on the screen. Eye movements were tracked with an Ergoneers Dikablis eye tracker during listening to the music and watching the facial expressions.

Results: As expected, with joyful and sad music the congruent faces (i.e., happy faces for joyful music and sad faces for sad music) elicited more attention in terms of AOI attention ratio and total glance time than the incongruent faces (AOI attention ratio: Ms = 53.2% vs. 36.7%, p = .002; total glance time: Ms = 12.9 vs. 8.88 seconds, p = .002). For music expressing anger, the preliminary analysis showed no effects.

Conclusion: The results provide new information about the interactive effects of emotional music and facial expressions. Knowledge about the effects of music on image processing and about the interaction between music and images is important and useful, among other things, in the context of (multi)media design and presentation.
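The two gaze measures reported above can be illustrated with a minimal sketch: for each area of interest (AOI, e.g. one face image on the screen), total glance time is the summed duration of gaze samples falling inside that AOI, and the AOI attention ratio is that time divided by the total viewing time. The sample data, AOI labels, and 50 Hz sampling rate below are invented for illustration and do not come from the study.

```python
# Hypothetical sketch of the two gaze metrics named in the abstract:
# total glance time per AOI and AOI attention ratio. Sample data,
# AOI names, and sampling rate are assumptions, not study values.

SAMPLE_DT = 0.02  # assumed eye-tracker sampling interval (50 Hz), seconds


def glance_metrics(samples):
    """samples: list of AOI labels per gaze sample (None = gaze off all AOIs).

    Returns {aoi: (total_glance_time_s, attention_ratio)}.
    """
    total_time = len(samples) * SAMPLE_DT
    metrics = {}
    for aoi in set(s for s in samples if s is not None):
        aoi_time = sum(SAMPLE_DT for s in samples if s == aoi)
        metrics[aoi] = (aoi_time, aoi_time / total_time)
    return metrics


# Toy trial: 6 samples on the happy face, 3 on the sad face, 1 off-screen.
samples = ["happy"] * 6 + ["sad"] * 3 + [None]
print(glance_metrics(samples))
```

In this toy trial the happy-face AOI accumulates 0.12 s of glance time and an attention ratio of 0.6, mirroring the kind of congruent-versus-incongruent comparison reported in the Results.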

Meeting abstract presented at VSS 2017

