Vision Sciences Society Annual Meeting Abstract | December 2022
Journal of Vision, Volume 22, Issue 14 | Open Access
The role of motion in the neural representation of social interactions
Author Affiliations & Notes
  • Kami Koldewyn
    Bangor University
  • Julia Landsiedel
    Bangor University
  • Katie Daughters
    University of Essex
  • Paul E. Downing
    Bangor University
  • Footnotes
    Acknowledgements: This work was funded by a European Research Council (ERC) Starting Grant (Grant ID #716974, ‘Becoming Social’).
Journal of Vision December 2022, Vol. 22, 4011. https://doi.org/10.1167/jov.22.14.4011
Citation: Kami Koldewyn, Julia Landsiedel, Katie Daughters, Paul E. Downing; The role of motion in the neural representation of social interactions. Journal of Vision 2022;22(14):4011. https://doi.org/10.1167/jov.22.14.4011.

© ARVO (1962-2015); The Authors (2016-present)
Abstract

Humans are inherently social, with dedicated brain regions sensitive to social cues such as faces, bodies, and biological motion. More recently, research has begun to investigate how the brain responds to more complex social scenes. This work has identified the posterior superior temporal sulcus (pSTS) as a key region for processing dynamic social interactions. Findings from static vs. dynamic paradigms differ, however, suggesting that the extrastriate body area (EBA), but not the pSTS, is central to processing simple static dyadic interactions. Despite an upsurge in work investigating social interactions both behaviourally and neurally, the crucial role of motion in interaction perception has not yet been investigated directly. Here, 23 participants viewed videos, image sequences, scrambled image sequences, and static images of dyadic social interactions or non-interactive independent actions. Both bilateral pSTS and left EBA showed sensitivity to motion and to interactive content. Indeed, both regions showed higher responses to interactions than to independent actions in videos and intact sequences, but not in the other conditions. While both regions show this “dynamic-specific” interaction sensitivity, it is seen most strongly in pSTS. Contrary to our expectations, EBA was not sensitive to interactive content in static pictures. Intriguingly, exploratory multivariate regression analyses suggest that, for bilateral pSTS, both interaction selectivity and body selectivity (measured in separate localisers), but not motion sensitivity, drive interaction selectivity for videos in our main task. In contrast, interaction selectivity in EBA appears to be driven primarily by body selectivity. More work is needed to understand the different roles EBA may play in processing interactions conveyed by prototypical static dyads (facing dyads) as opposed to our complex and varied dynamic displays. Altogether, these findings support the existence of a third visual stream for dynamic social scene perception, in which EBA may play a supporting role and pSTS a central one.
