Abstract
Humans spontaneously parse dynamic environmental content into social and nonsocial events. Although social segmentation is thought to reflect a perceptual grouping process, the role of different perceptual modalities in this ability remains unclear. Here we tested how auditory and visual social information, in isolation and in conjunction, influenced social segmentation. Participants viewed a video clip depicting a dyadic social interaction. In separate groups, they first viewed the clip with only auditory or only visual information and then with both modalities. In each condition, participants were asked to mark social and nonsocial events in separate blocks by pressing a keyboard key. Results indicated both overlapping and unique social and nonsocial events. Replicating past findings, analysis of response agreement and variability revealed that social events were recognized with higher agreement and lower variability than nonsocial events, especially when both auditory and visual information was available. The lowest agreement and highest variability occurred when participants segmented nonsocial events using auditory information only, whereas the highest agreement and lowest variability occurred when participants segmented social events using both auditory and visual information after they had segmented the same video using visual information only. Thus, visual social information appears to have a facilitatory effect on social segmentation, with auditory and visual information differently influencing the ability to parse socio-interactive environmental content into social and nonsocial events.