Abstract
People often perceive social and nonsocial events simultaneously. What types of environmental information determine those event boundaries? Here we tested the role of face cues. Participants viewed a video clip depicting a social interaction between two individuals. The actors' faces were either visible or blurred. In separate, counterbalanced blocks, observers manually marked the boundaries of social and nonsocial events in both visibility conditions. During the task, their eye movements were recorded with a high-speed remote eye tracker. Key-press data indicated overlapping social and nonsocial event boundaries in both visibility conditions. Eye-tracking data revealed that extracting information from the actors' faces supported both social and nonsocial event segmentation: participants looked more frequently at the actors' faces than at their bodies when the faces were visible. When faces were blurred, however, participants looked equally often at the actors' faces and bodies. Thus, information conveyed by faces appears to be an important factor in parsing socially interactive environmental content into both social and nonsocial events.
Meeting abstract presented at VSS 2018