August 2023, Volume 23, Issue 9 (Open Access)
Vision Sciences Society Annual Meeting Abstract
Search efficiency scales with semantic relatedness in audiovisual contexts
Author Affiliations
  • Kira Wegner-Clemens
    George Washington University
  • George Malcolm
    University of East Anglia
  • Sarah Shomstein
    George Washington University
Journal of Vision August 2023, Vol.23, 5772. doi:https://doi.org/10.1167/jov.23.9.5772
      Kira Wegner-Clemens, George Malcolm, Sarah Shomstein; Search efficiency scales with semantic relatedness in audiovisual contexts. Journal of Vision 2023;23(9):5772. https://doi.org/10.1167/jov.23.9.5772.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Semantic information strongly influences attention in visual scenes. However, its role in audiovisual contexts remains unclear. Recent studies demonstrated that a task-irrelevant sound improves search for a matching visual target (e.g., hearing a bark allows participants to find an image of a dog more quickly). However, the extent to which a non-matching but semantically related sound modulates attentional selection remains an open question. To elucidate the role of semantic processing beyond exact matches in audiovisual search, 109 participants searched for visual targets in an image array while a task-irrelevant sound played simultaneously. The task-irrelevant sound could either exactly match the visual target or be one of 9 other sounds that varied in their relatedness to the visual target, as defined by a database of crossmodal semantic judgements (the Sight Sound Semantics database). As semantic relatedness between sounds and target images increases, search times decrease (r=-0.27; p=0.009). In prior studies, only sounds that exactly matched images improved search performance, which may reflect a prioritization of stimuli that co-occur in time and space (e.g., when you hear a meow, you typically also see a cat, so all cat representations are prioritized). Our results show a graded effect of semantic relatedness: sounds can modulate the prioritization of images even outside of exact co-occurrence in time and space, suggesting a more robust influence of semantic information across modalities than previously thought.
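The reported result is a Pearson correlation between crossmodal relatedness and search time. A minimal sketch of that kind of analysis is below; the relatedness ratings and reaction times are invented for illustration and are not the study's data:

```python
# Pearson correlation between semantic relatedness ratings and search
# reaction times. All numbers below are hypothetical, NOT the study's data.

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length sequences."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Hypothetical mean relatedness rating (0 = unrelated, 1 = exact match)
# for ten sound conditions, and a hypothetical mean search time (ms) per
# condition. Higher relatedness pairs with faster search.
relatedness = [1.0, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1]
search_ms = [690, 705, 700, 720, 735, 730, 750, 760, 755, 780]

r = pearson_r(relatedness, search_ms)
print(f"r = {r:.2f}")  # negative r: faster search with higher relatedness
```

In practice such an analysis would typically use `scipy.stats.pearsonr` to obtain the p-value alongside r; the hand-rolled version above just makes the computation explicit.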
