Vision Sciences Society Annual Meeting Abstract  |   August 2010
Multimodal integration of the auditory and visual signals in dyadic point-light interactions
Author Affiliations
  • Lukasz Piwek
    University of Glasgow, Department of Psychology, Glasgow, UK
  • Karin Petrini
    University of Glasgow, Department of Psychology, Glasgow, UK
  • Frank Pollick
    University of Glasgow, Department of Psychology, Glasgow, UK
Journal of Vision August 2010, Vol.10, 788. doi:https://doi.org/10.1167/10.7.788
Abstract

Multimodal aspects of non-verbal communication have thus far been examined using displays of a solitary character (e.g., the face and voice, or body and sound, of a single actor). We extend this investigation to more socially complex dyadic displays, using point-light displays combined with speech sounds that preserve only prosodic information. Two actors were recorded approaching each other with three different intentions: negative, positive and neutral. The actors' movement was recorded with a Vicon motion capture system. The speech was recorded simultaneously and subsequently low-pass filtered to obtain an audio signal that retained prosodic information but was not intelligible as speech. In Experiment 1, displays were presented bimodally (audiovisual) and unimodally (audio-only and visual-only) to examine whether the bimodal audiovisual condition facilitates perception of the original social intention relative to the unimodal conditions. In Experiment 2, congruent displays (visual and audio signals from the same actor and intention) and incongruent displays (visual and audio signals from different actors and intentions) were used to explore how social perception changes when the sensory signals give discordant information. The results supported previous findings obtained with solitary characters: the visual signal dominates over the auditory signal, although auditory information can influence the visual signal when the intentions conveyed by the two modalities are discordant. The results also showed that this dominance of the visual over the auditory signal is significant only when the interaction between the characters is perceived as socially meaningful, i.e., when positive or negative intentions are present.
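The low-pass filtering step described above can be illustrated with a minimal sketch. The abstract does not specify the filter type or cutoff frequency, so the 4th-order Butterworth filter and ~400 Hz cutoff below are assumptions chosen for illustration (a cutoff in this range typically preserves the pitch and intensity contour carrying prosody while removing the higher-frequency content needed for speech intelligibility); the file names are hypothetical.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, filtfilt

def low_pass_speech(in_path, out_path, cutoff_hz=400.0, order=4):
    """Low-pass filter a WAV file so that prosody (pitch, rhythm,
    intensity contour) is preserved while the words become unintelligible."""
    fs, audio = wavfile.read(in_path)          # sample rate and samples
    audio = audio.astype(np.float64)
    if audio.ndim > 1:                         # mix down to mono if needed
        audio = audio.mean(axis=1)
    b, a = butter(order, cutoff_hz, btype="low", fs=fs)
    filtered = filtfilt(b, a, audio)           # zero-phase filtering
    filtered /= (np.max(np.abs(filtered)) + 1e-12)
    wavfile.write(out_path, fs, (filtered * 32767).astype(np.int16))

# Example with hypothetical file names:
# low_pass_speech("actor1_negative.wav", "actor1_negative_prosody.wav")
```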

Piwek, L., Petrini, K., & Pollick, F. (2010). Multimodal integration of the auditory and visual signals in dyadic point-light interactions [Abstract]. Journal of Vision, 10(7):788, 788a, http://www.journalofvision.org/content/10/7/788, doi:10.1167/10.7.788.