Abstract
The brain derives information from several sensory modalities to enhance the speed and accuracy with which it detects objects and events and selects appropriate responses. There is mounting evidence that perceptual experiences that may appear to be modality specific are also influenced by activity from other sensory modalities, even in the absence of awareness of this interaction. In a series of speeded classification tasks, we found natural spontaneous mappings between the pitch of sounds and the visual features of vertical location, size and spatial frequency. The facilitatory effects we observed during congruent bimodal stimulation hint at the automatic nature of these crossmodal interactions. In the present studies we examined the role of attention in the interaction between crossmodal features. Participants performed a speeded classification or search task of low or high load while ignoring irrelevant stimuli in a different modality. In both paradigms we found that irrelevant crossmodal distractors were processed regardless of the difficulty of the classification or search task. Congruency between the visual stimulus and the irrelevant auditory stimulus had an equal effect in the low and high load conditions. A third experiment tested divided rather than selective attention, requiring participants to compare stimuli in both modalities and respond to the visual-auditory compound. Here too, the congruency effect was no larger when attention was divided across both modalities than when it was focused on one. These findings offer converging evidence that the interaction between corresponding audio-visual features does not depend on attention.