Vision Sciences Society Annual Meeting Abstract | August 2012
Audiovisual action priming: meaning, time, and signal strength
Author Affiliations
  • James Thomas
    Rutgers University - Newark
  • Maggie Shiffrar
    Rutgers University - Newark
Journal of Vision August 2012, Vol. 12, 615. https://doi.org/10.1167/12.9.615
Abstract

The integration of multisensory information depends upon spatial and temporal coincidence, signal strength, and semantics (Meredith & Stein, 1983; Welch & Warren, 1986). Sounds also aid visual detection. Visual sensitivity to human actions improves when the actions are paired with sounds that are both semantically and temporally congruent (e.g., Arrighi et al., 2009). Two studies investigated the roles of meaning, timing, and visual signal strength in visual sensitivity to human actions. Experiment 1 tested whether temporal synchrony is necessary for meaningful sounds to impact visual sensitivity. Participants performed a point-light walker detection task with sounds that were meaningful (footsteps) or neutral (tones) and either synchronous or asynchronous with the point-light footfalls. Results revealed a main effect of sounds, no effect of synchrony, and no interaction. Sensitivity with both temporally coincident and temporally random footsteps was significantly greater than sensitivity with either temporally coincident or temporally random tones. This suggests that audiovisual action priming occurs at the level of meaning and that sounds can enhance visual sensitivity in the absence of temporal coincidence (e.g., Schneider et al., 2008). Experiment 2 investigated whether signal strength moderates the effect of meaningful sounds on the priming of visual actions, as predicted by the multisensory rule of inverse effectiveness (IE) (e.g., Collignon et al., 2008). Participants detected a point-light walker in masks of varying density that rendered detection more or less difficult. Results revealed a main effect of sounds, but no interaction between sounds and mask density; footsteps improved sensitivity across all levels of mask density. However, when the data were analyzed according to walker detection accuracy in silent displays (Thomas & Shiffrar, 2011), an interaction emerged, such that footstep sounds improved sensitivity for visually difficult movies. These results agree with the IE rule: when an action is difficult to perceive, a related sound can facilitate its visual detection.

Meeting abstract presented at VSS 2012
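
The abstract reports visual "sensitivity" but does not specify how it was quantified. In yes/no point-light walker detection tasks this is commonly signal-detection d', computed from hit and false-alarm rates. The Python sketch below is a minimal illustration under that assumption; the function, the log-linear correction, and the trial counts are hypothetical and are not taken from the authors' analysis.

    from scipy.stats import norm

    def d_prime(hits, misses, false_alarms, correct_rejections):
        """Signal-detection sensitivity (d') for a yes/no detection task.

        A log-linear correction (add 0.5 to each count) keeps hit and
        false-alarm rates away from 0 and 1, which would otherwise
        produce infinite z-scores.
        """
        hit_rate = (hits + 0.5) / (hits + misses + 1.0)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # Hypothetical counts for one observer in two sound conditions; a higher
    # d' with footsteps than with tones would correspond to the sensitivity
    # benefit described in the abstract.
    print(d_prime(hits=40, misses=10, false_alarms=8, correct_rejections=42))   # footsteps
    print(d_prime(hits=33, misses=17, false_alarms=11, correct_rejections=39))  # tones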
