Vision Sciences Society Annual Meeting Abstract | June 2007
Journal of Vision, Volume 7, Issue 9
Dynamics of crossmodal interactions between corresponding auditory and visual features
Author Affiliations
  • Karla Evans
    Princeton University
  • Anne Treisman
    Princeton University
Journal of Vision June 2007, Vol.7, 865. doi:https://doi.org/10.1167/7.9.865
      © ARVO (1962-2015); The Authors (2016-present)

Objects and events in the environment typically produce correlated input to several sensory modalities at once. There is mounting evidence that perceptual experiences that appear to be modality-specific are nevertheless influenced by activity in other sensory modalities, even when the observer is unaware of the interaction. Using both behavioral and electrophysiological measures, we explored crossmodal interactions between non-speech auditory and visual stimuli on the basis of content correspondence between the stimuli. Across a series of psychophysical paradigms, we found spontaneous mappings between the pitch of sounds (high or low) and the visual features of vertical location, size, and spatial frequency: high pitch combined with a high visual position, small size, or high spatial frequency yielded better performance than the opposite pairings.
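The congruency manipulation described above can be made concrete with a small sketch. All names, feature codings, and the helper function here are hypothetical illustrations of the stated mapping (high pitch with high position, small size, or high spatial frequency), not the authors' actual stimulus-generation code.

```python
from itertools import product

# Hypothetical coding of the pitch-to-visual-feature correspondence:
# the first listed level of each visual feature is the one that
# corresponds to high pitch.
PITCH = ("high", "low")
VISUAL = {
    "position": ("high", "low"),
    "size": ("small", "large"),            # small size pairs with high pitch
    "spatial_frequency": ("high", "low"),
}

def is_congruent(pitch, feature, value):
    """A pairing is congruent when the pitch level matches the
    corresponding level of the visual feature."""
    high_pitch_value = VISUAL[feature][0]
    return (pitch == "high") == (value == high_pitch_value)

# Enumerate every pitch x feature x value pairing with its congruency label
pairs = [
    (pitch, feat, val, is_congruent(pitch, feat, val))
    for pitch, feat in product(PITCH, VISUAL)
    for val in VISUAL[feat]
]
# e.g. ("high", "size", "small", True) is a congruent pairing
```

Such a labeling would let congruent and incongruent trials be counterbalanced and compared within one design.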

An EEG study using high-density mapping explored the time course and scalp topography of this crossmodal interaction between pitch and spatial position in the vertical plane. Event-related potentials (ERPs) were recorded from 128 scalp electrodes while participants performed a one-back recognition task with unimodal auditory or visual stimuli or with combined bimodal stimuli. Interaction effects were assessed by comparing the response to combined stimulation with the algebraic sum of the responses to the constituent auditory and visual stimuli presented alone. The bimodal components were then also compared for congruent and incongruent pairings. Spatiotemporal analysis of the ERPs revealed several audio-visual interaction components that were temporally, spatially, and functionally distinct. The perceptual gains observed behaviorally correlated with amplification of the neuronal response in the participating sensory-specific and nonspecific cortices. Amplification of the bimodal response relative to the summed unimodal responses occurred within the time course of sensory processing, peaking at 120 and 190 msec poststimulus. A third component, peaking at 270 msec over parieto-occipital sites, was additionally modulated by the congruency between the two features.
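The additive-model comparison described above (bimodal response versus the algebraic sum of the unimodal responses) can be sketched as follows. The data here are simulated noise purely for illustration; array sizes, epoch length, and variable names are assumptions, not the study's actual pipeline.

```python
import numpy as np

# Simulated trial data: trials x channels x time samples
# (the study recorded from 128 scalp electrodes; epoch length is assumed)
rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 40, 128, 300

# Trial-averaged ERPs for auditory-only, visual-only, and bimodal conditions
erp_a  = rng.normal(size=(n_trials, n_channels, n_times)).mean(axis=0)
erp_v  = rng.normal(size=(n_trials, n_channels, n_times)).mean(axis=0)
erp_av = rng.normal(size=(n_trials, n_channels, n_times)).mean(axis=0)

# Additive model: if audition and vision were processed independently,
# the bimodal ERP should equal the sum of the unimodal ERPs.
# Any residual indexes an audio-visual interaction.
interaction = erp_av - (erp_a + erp_v)       # channels x time

# Latency of the strongest interaction anywhere on the scalp
peak_time_idx = np.abs(interaction).max(axis=0).argmax()
```

Real analyses would add baseline correction, statistical thresholding across the channel-by-time array, and separate contrasts for congruent versus incongruent bimodal pairings.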

Evans, K., & Treisman, A. (2007). Dynamics of crossmodal interactions between corresponding auditory and visual features [Abstract]. Journal of Vision, 7(9):865, 865a, http://journalofvision.org/7/9/865/, doi:10.1167/7.9.865.
