September 2024
Volume 24, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2024
Semantic relationships between sounds and images modulate attention even when the stimuli are task-irrelevant
Author Affiliations & Notes
  • Kira Wegner-Clemens
    George Washington University
  • Dwight J Kravitz
    George Washington University
  • Sarah Shomstein
    George Washington University
  • Footnotes
    Acknowledgements  NIH F31EY034030 to KWC; NSF BCS 1921415 to SSS; NSF BCS 2022572 to SSS & DJK
Journal of Vision September 2024, Vol.24, 1454. doi:https://doi.org/10.1167/jov.24.10.1454
Abstract

Semantic information plays an important but poorly understood role in guiding attention in naturalistic scenes. Semantic relationships among visual objects have been shown to modulate attentional priority, even in tasks where object identity is irrelevant. In an audiovisual context, semantically related sounds can improve search speeds for visual targets, with the benefit scaling with the degree of semantic relatedness. However, prior research has focused almost exclusively on targets defined by their identity, meaning the visual semantic information was task-relevant. Thus, it is unclear whether crossmodal semantic relationships influence attention only when they are task-relevant, or whether those relationships play a more general role in attentional selection. In the present study, we investigate whether an audiovisual semantic benefit exists when both the image's and the sound's semantic information are task-irrelevant. Participants were presented with two images and a sound, then subsequently presented with two Gabor patches at the image locations and asked to identify whether the target Gabor was slanted clockwise or counterclockwise. On valid trials, when the sound matched the image where the target Gabor subsequently appeared, participants responded significantly faster than on invalid trials, when the sound matched the image at the location where a distractor appeared. The size of the validity benefit was modulated by the degree of semantic relationship between the sound and the other image. In a mixed-effects model with relatedness and validity as fixed effects and subject, target location, and rotation direction as random effects, there was a significant interaction between semantic relatedness and validity, such that stronger semantic relatedness between the unmatched image and the sound resulted in a smaller validity effect.
These results show that crossmodal semantic relationships guide attention even when task-irrelevant, suggesting that semantic relatedness plays a general role in guiding attentional selection.
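The analysis described above can be sketched in Python with statsmodels on synthetic data. This is a hypothetical illustration, not the authors' code: the variable names, effect sizes, and trial counts are invented, and `mixedlm` here fits only a by-subject random intercept rather than the full random-effects structure (subject, target location, and rotation direction) reported in the abstract, which would typically be fit with crossed random effects in a package such as lme4.

```python
# Hypothetical sketch of the relatedness x validity mixed-effects analysis.
# All data below are synthetic; effect sizes and trial counts are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for s in range(20):                            # 20 simulated subjects
    subj_offset = rng.normal(0, 30)            # by-subject random intercept (ms)
    for _ in range(40):                        # 40 trials per subject
        validity = int(rng.integers(0, 2))     # 1 = sound matches the target's image
        relatedness = rng.uniform(0, 1)        # sound-to-unmatched-image relatedness
        rt = (600 + subj_offset
              - 25 * validity                  # valid trials are faster overall
              + 20 * validity * relatedness    # benefit shrinks as relatedness grows
              + rng.normal(0, 40))             # trial-level noise
        rows.append((f"s{s}", validity, relatedness, rt))
df = pd.DataFrame(rows, columns=["subject", "validity", "relatedness", "rt"])

# Fixed effects: validity, relatedness, and their interaction;
# random effect: by-subject intercept (a simplification of the full model).
model = smf.mixedlm("rt ~ validity * relatedness", df, groups=df["subject"])
result = model.fit()
print(result.params["validity:relatedness"])
```

The key term is the `validity:relatedness` coefficient: a positive estimate here corresponds to the reported pattern, where the validity benefit (faster valid-trial responses) is reduced when the sound is more semantically related to the unmatched image.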
