August 2023
Volume 23, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract
Musically induced microvalences in high-level visual processing of everyday scenes
Author Affiliations
  • Elizabeth Galbo
    Fordham University
  • Nathan Lincoln-DeCusatis
    Fordham University
  • Elissa M. Aminoff
    Fordham University
Journal of Vision August 2023, Vol. 23, 5223.
      Elizabeth Galbo, Nathan Lincoln-DeCusatis, Elissa M. Aminoff; Musically induced microvalences in high-level visual processing of everyday scenes. Journal of Vision 2023;23(9):5223.

      © ARVO (1962-2015); The Authors (2016-present)

Our understanding of a scene, although typically described only within the visual domain, can be influenced by other modalities. Here, we examine the link between visual and auditory cognitive processing, or cross-modal processing, using an affective priming paradigm. Current theories do not typically incorporate scene affect into models of scene understanding, yet we explore how the affect associated with a scene can be modulated by music. In the current experiment, participants (N = 39) rated both how much they enjoyed musical excerpts and images of everyday, neutral scenes. A novel musical stimulus dataset was created for the present study to ensure that the musical examples carried no semantic associations and that any observed effects could be attributed to their affective influence. This dataset comprised sixty-four original miniature piano compositions whose features were controlled along six binary dimensions. Participants listened to a brief musical excerpt, reported an affect rating ranging from "really dislike" to "really like," and then viewed and rated a neutral scene (e.g., a dining room). Scene affect ratings differed significantly depending on whether participants had heard music they liked or disliked, at both the participant level and the individual-scene level. These results imply that auditory processing plays a role in scene understanding. Not only were participants' scene ratings modulated by the affect of the musical stimuli, but the same scene was rated more positively or negatively depending on the affect of the preceding musical example. Cross-modal processing occurs between music and scene perception, and our results demonstrate how one can affect the perception of the other.

