August 2023, Volume 23, Issue 9 (Open Access)
Vision Sciences Society Annual Meeting Abstract
Autistic group differences in social attention are magnified by real-world perceptual and linguistic features
Author Affiliations
  • Amanda J Haskins, Dartmouth College
  • Jeff Mentch, Harvard University
  • Thomas L. Botch, Dartmouth College
  • Brenda D. Garcia, Dartmouth College
  • Alexandra L. Burrows, Dartmouth College
  • Caroline E. Robertson, Dartmouth College
Journal of Vision August 2023, Vol. 23, 5870. https://doi.org/10.1167/jov.23.9.5870
Abstract

Autism is characterized by differences in remarkably distinct functional domains, including sensory processing, social cognition, and language. How are autistic traits across these domains related, particularly in real-world environments? Here, we sought to understand how a core characteristic of the autistic phenotype—reduced social attention—is impacted by both perceptual and linguistic cues in real-world environments, using eye-tracking in VR. Adult participants (N = 40; 19 autistic) freely viewed 360° scenes in three conditions (static photosphere, dynamic videosphere, multisensory videosphere) that systematically increased perceptual load while holding visual semantic information constant. On each trial, participants’ gaze was measured via in-headset eye-tracking. To quantify participants’ social attention, we generated a continuous model of social information for each scene using a novel computational language modeling approach. We then modeled the degree to which participants’ social attention was modulated by perceptual load, as well as by language cues in the multisensory environment. Across all participants, social attention significantly increased across conditions (p < 0.001), demonstrating that as perceptual load increased, participants attended more selectively to social information at the expense of other information sources. Importantly, group differences in social attention also emerged with perceptual load (p = 0.02): autistic participants displayed reduced social attention only in the higher-load conditions. Crucially, this pattern was specific to the social domain: we did not observe the same pattern toward nonsocial semantic information (p > 0.05). Finally, in the multisensory condition, we found that reduced social attention in autistic relative to non-autistic participants specifically preceded words of high social relevance (p = 0.006) in the conversational environment. Our results suggest that autistic group differences in social attention are not a static phenotypic signature. Instead, we provide evidence that these group differences are modulated by increasing perceptual load and by the evolving stream of linguistic cues from social agents in real-world environments.
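The abstract describes the analysis only at a high level. The sketch below illustrates, under stated assumptions, one way a language-model-derived social-relevance map and a gaze-weighted social attention score could be computed. It is not the authors' implementation: the `embed` placeholder, the seed-word list, and all function names are hypothetical, and the embedding is stubbed with random vectors so the example runs standalone (a real text-embedding model would be substituted in practice).

```python
# Illustrative sketch (not the authors' code): score labeled scene regions for
# social relevance with a text-embedding model, then compute each participant's
# social attention as a gaze-weighted average of those scores.

import numpy as np

rng = np.random.default_rng(0)
_fake_vocab: dict[str, np.ndarray] = {}

def embed(text: str) -> np.ndarray:
    """Placeholder embedding: replace with a real language-model encoder."""
    if text not in _fake_vocab:
        _fake_vocab[text] = rng.normal(size=300)
    return _fake_vocab[text]

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical seed words defining a "social" direction in semantic space.
SOCIAL_SEEDS = ["person", "face", "talking", "gesture", "conversation"]
seed_vecs = [embed(w) for w in SOCIAL_SEEDS]

def social_relevance(region_label: str) -> float:
    """Continuous social-information score for one labeled scene region."""
    v = embed(region_label)
    return float(np.mean([cosine(v, s) for s in seed_vecs]))

def social_attention(gaze_durations: dict[str, float]) -> float:
    """Gaze-weighted social relevance for one participant on one scene.

    gaze_durations maps region label -> total dwell time in seconds.
    """
    labels = list(gaze_durations)
    weights = np.array([gaze_durations[l] for l in labels], dtype=float)
    scores = np.array([social_relevance(l) for l in labels])
    return float(np.average(scores, weights=weights))

# Example: one participant's dwell times on regions of a 360° scene.
print(social_attention({"two people chatting": 4.2,
                        "kitchen counter": 1.1,
                        "window": 0.7}))
```

Per-participant, per-condition scores of this kind could then be entered into a mixed-effects model with group and condition as predictors to test the load-dependent group difference reported above.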
