October 2020
Volume 20, Issue 11
Open Access
Vision Sciences Society Annual Meeting Abstract
Auditory and visual information affect social event segmentation differently
Author Affiliations & Notes
  • Francesca Capozzi
    McGill University
  • Nida Latif
    McGill University
  • Emma Ponath
    McGill University
  • Jelena Ristic
    McGill University
  • Footnotes
    Acknowledgements  Social Sciences and Humanities Research Council of Canada (SSHRC); Natural Sciences and Engineering Research Council of Canada (NSERC); William Dawson Chairs Fund
Journal of Vision October 2020, Vol.20, 420. doi:https://doi.org/10.1167/jov.20.11.420

      Francesca Capozzi, Nida Latif, Emma Ponath, Jelena Ristic; Auditory and visual information affect social event segmentation differently. Journal of Vision 2020;20(11):420. doi: https://doi.org/10.1167/jov.20.11.420.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Humans spontaneously parse dynamic environmental content into social and nonsocial events. Although social segmentation is thought to reflect a perceptual grouping process, the role of different perceptual modalities in this ability remains unclear. Here we tested how auditory and visual social information, in isolation and in conjunction, influenced social segmentation. Participants viewed a video clip depicting a dyadic social interaction. In separate groups, they first viewed the clip with only auditory or only visual information and then with both modalities. In each condition, participants marked social and nonsocial events in separate blocks by pressing a keyboard key. Results indicated both overlapping and unique social and nonsocial events. Replicating past data, analysis of response agreement and variability revealed that social events were recognized with higher agreement and lower variability than nonsocial events, especially when both auditory and visual information was available. Agreement was lowest and variability highest when participants segmented nonsocial events using auditory information only; agreement was highest and variability lowest when participants segmented social events using combined auditory and visual information after having segmented the same video using visual information only. Thus, visual social information appears to facilitate social segmentation, with auditory and visual information differently influencing the ability to parse socio-interactive environmental content into social and nonsocial events.
