Abstract
Semantic information is an important feature guiding attention in real-world environments. However, research into semantic guidance of attention has been limited by the difficulty of quantifying semantic relatedness, particularly between stimuli in different sensory modalities. To address this, we first created a constrained audiovisual stimulus set and derived similarity ratings between items within three categories (animals, instruments, household items). A total of 140 participants judged which of two images was more similar to a given sound, and which of two sounds was more similar to a given image. From these judgments we derived semantic relatedness values for pairs of images and sounds. Using these values in a separate experiment, we probed semantic guidance of attention in audiovisual contexts in a more nuanced manner than previously possible: rather than treating stimuli categorically (semantically related or not), the continuous relatedness values allow semantic guidance to be tested along a graded scale. The quantified judgments are available as a database for the wider research community to use as a measure of semantic relatedness in cognitive psychology experiments, enabling more robust studies of semantic influences in audiovisual environments.