Abstract
In natural circumstances, the visual objects we search for may be accompanied by characteristic sounds. The hiding cat may meow, or the keys in the cluttered drawer may jingle when moved. Do these characteristic sounds facilitate visual localization of objects even if the sound carries no location information? To answer this question, we used a visual-search paradigm in which participants searched for a target named on each trial. Each search display contained a target object (e.g., a cat) and three distractor objects (e.g., a car, a mosquito, and a wine glass). The display was presented simultaneously with a sound that was either associated with the target object (e.g., “meow”), associated with a distractor object (e.g., “clink”), or unrelated to any object in the search array (e.g., “jingle”). We used 20 objects and their associated sounds, counterbalanced across the target, distractor, and unrelated conditions. Participants responded to the location of the target by pressing a key that corresponded to the quadrant in which the target appeared. Characteristic sounds of target objects significantly speeded their visual localization; for example, the “meow” sound significantly speeded visual localization of the cat. Search was slower in the other conditions regardless of whether the concurrent sound was associated with a distractor or was unrelated to any object in the display; for example, visually localizing a cat was equally slow whether participants heard a “clink” or a “jingle”. There was no evidence of a speed-accuracy trade-off. These results suggest that sounds associated with targets facilitate visual search, but that there is little cost to hearing a sound associated with an object that is not the focus of the current search. We thus demonstrated object-based auditory-visual facilitation, going beyond location-based interactions. Our data add to the growing body of evidence that visual processes are enhanced by consistent information from other sensory modalities.
Supported by NIH Grant EY14110 to SS.