Vision Sciences Society Annual Meeting Abstract  |   August 2023
Journal of Vision, Volume 23, Issue 9
Open Access
Shared and individual thresholds for social signal detection
Author Affiliations & Notes
  • Rekha S. Varrier
    Dartmouth College
  • Alison H. Sasaki
    Dartmouth College
  • Tory G. Benson
    Dartmouth College
  • Ashna J. Kumar
    Dartmouth College
  • Jordan M. Selesnick
    Dartmouth College
  • Emily S. Finn
    Dartmouth College
  • Footnotes
    Acknowledgements  This project was supported by a NARSAD Young Investigator Award (grant no. 28392) from the Brain & Behavior Research Foundation (E.S.F.), a Neukom CompX Faculty Grant from the Neukom Institute for Computational Science at Dartmouth College (E.S.F.), and by grant number R01MH129648 from NIMH (E.S.F.).
Journal of Vision August 2023, Vol.23, 4820. doi:

      Rekha S. Varrier, Alison H. Sasaki, Tory G. Benson, Ashna J. Kumar, Jordan M. Selesnick, Emily S. Finn; Shared and individual thresholds for social signal detection. Journal of Vision 2023;23(9):4820.


      © ARVO (1962-2015); The Authors (2016-present)

To the human eye, social cues are omnipresent. The recently proposed third visual stream highlights the role of motion in detecting social signals (Pitcher and Ungerleider, 2021, TICS). Parameters such as the directness of a chase (“chase subtlety”; Gao et al., 2009, Cognitive Psychology) and speed can be vital cues to the presence and nature of social interactions. We hypothesized that social perception is driven by both shared and individual thresholds on these motion parameters – leading to group-level similarities and individual differences in percepts, respectively. We studied two types of dynamic interactions involving circles (“agents”) – one agent pursuing the other (Study 1), and two agents intermittently coming into contact for a friendly or aggressive interaction (Study 2) – by systematically varying the chase subtlety (Study 1) and the speed at which agents charged toward each other (Study 2). To control for baseline perceptual biases, we also included additional videos of two agents wandering randomly. Using continuous scales, individuals indicated how social an interaction was (Study 1; three batches, total n=608) or what type of interaction they perceived on a play–fight spectrum (Study 2; n=199). Participants also completed trait questionnaires. Social perception decreased with increasing chase subtlety in all three batches of Study 1 (LMEM; b<–0.25; p<1e-12). In Study 2, interactions were perceived as more aggressive as charging speed increased (LMEM; b=0.23; p<1e-43). Individuals with stronger beliefs that the world is an interactive place perceived the videos of randomly wandering agents as more social (Spearman r>.16; p<.02 in all three batches). Lower social ratings may also be associated with greater social skill deficits (Spearman r<–.13, p<.06).
Our work shows that strong group-level social percepts (things that “everyone” sees) can result from varying simple motion parameters, and that individual differences are influenced by our beliefs about the world and general social skills.
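The abstract reports linear mixed-effects models (LMEMs) relating a motion parameter (e.g., chase subtlety) to social ratings, with participants as grouping units. The sketch below is not the authors' analysis code; it is a minimal illustration on simulated data of how such a model could be fit with `statsmodels`. All variable names, effect sizes, and noise levels are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulate 50 hypothetical participants rating chase videos at 7 subtlety levels.
n_subj = 50
subtleties = np.linspace(0, 60, 7)  # chase subtlety in degrees (illustrative range)
rows = []
for s in range(n_subj):
    baseline = 0.6 + rng.normal(0, 0.1)  # per-participant random intercept
    for deg in subtleties:
        # Assumed ground truth: ratings fall as subtlety increases.
        rating = baseline - 0.005 * deg + rng.normal(0, 0.05)
        rows.append({"subject": s, "subtlety": deg, "rating": rating})
df = pd.DataFrame(rows)

# LMEM: fixed effect of subtlety, random intercept per participant.
model = smf.mixedlm("rating ~ subtlety", df, groups=df["subject"]).fit()
print(model.params["subtlety"])  # fixed-effect slope; negative here by construction
```

A negative fitted slope in this setup mirrors the direction of the reported Study 1 effect (higher subtlety, lower perceived socialness); per-participant random intercepts absorb baseline rating differences across individuals.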

