Vision Sciences Society Annual Meeting Abstract  |   September 2011
Perceptual averaging by eye and ear: Computing visual and auditory summary statistics from multimodal stimuli
Alice R. Albrecht, Brian J. Scholl, Marvin M. Chun
Dept of Psychology, Yale University
Journal of Vision September 2011, Vol.11, 1210.
      © ARVO (1962-2015); The Authors (2016-present)

Beyond perceiving the features of individual objects, we also have the intriguing ability to efficiently perceive average values of collections of objects across various dimensions – e.g. the average size of a sequence of discs presented one at a time. Over what features can perceptual averaging occur? Work to date has been limited to visual properties, but perceptual experience is intrinsically multimodal. To find out how perceptual averaging operates in multimodal environments, we explored three questions. First, we asked how well observers can average an auditory feature over time: the changing pitch of a single tone. Not only was auditory averaging robust, but it was more efficient than visual averaging (of the changing size of a disc over time) when the magnitudes of the changes were equated. Second, we asked how averaging in each modality was influenced by concomitant congruent vs. incongruent changes in the other (task-irrelevant) modality, again combining sizes and pitches. Here we observed a clear and intriguing dissociation. Incongruent visual information hindered auditory averaging, as might be predicted from a simple model of generalized perceptual magnitudes. However, congruent auditory information hindered visual averaging – perhaps due to a Doppler effect induced by the perception of a disc moving in depth. When modalities are readily separable, observers may be able to attend to whichever modality they choose; but when modalities are readily bound into a cohesive whole, vision may dominate. Finally, we asked about the ability to average both pitch and size simultaneously, and we found very little cost for averaging in either modality when subjects did not know until the end of a trial which average they had to report. These results collectively illustrate that perceptual averaging can span different sensory modalities, and they illustrate how vision and audition can both cooperate and compete for resources.

Supported by an NSF Graduate Research Fellowship.
