December 2022
Volume 22, Issue 14
Open Access
Vision Sciences Society Annual Meeting Abstract
Investigating the temporal dynamics of visual categorization in the human brain using fast periodic visual stimulation
Author Affiliations & Notes
  • Xiaoqian Yan
    Department of Psychology, Stanford University
    Wu Tsai Neurosciences Institute, Stanford University
  • Yulan Diana Chen
    Department of Psychology, Stanford University
    Wu Tsai Neurosciences Institute, Stanford University
  • Anthony M. Norcia
    Department of Psychology, Stanford University
    Wu Tsai Neurosciences Institute, Stanford University
  • Kalanit Grill-Spector
    Department of Psychology, Stanford University
    Wu Tsai Neurosciences Institute, Stanford University
    Neurosciences Program, Stanford University
  • Footnotes
    Acknowledgements  This research was funded by a Stanford Wu Tsai Neurodevelopment Big Ideas grant and the Stanford Human-Centered Artificial Intelligence Institute.
Journal of Vision December 2022, Vol.22, 4295. doi:https://doi.org/10.1167/jov.22.14.4295
Abstract

Fast periodic visual stimulation produces category-specific responses in short recordings (Liu-Shuang et al., 2014), making it suitable for studying developmental and clinical populations (de Heering & Rossion, 2015). Pattern analyses of conventional event-related potentials have shown that information about visual categories such as faces and bodies can be decoded from patterns of electrical/magnetic activity across the scalp within a few hundred milliseconds (Carlson et al., 2013; Cichy et al., 2014; Kaneshiro et al., 2015). Here we developed a novel paradigm combining these two approaches to measure the temporal dynamics of categorical responses to faces, cars, corridors, limbs, and words. We recorded high-density electroencephalography (EEG) from nineteen participants who viewed rapid streams of natural images presented at 4.286 Hz. In each 70-s run, four successive images were drawn randomly from four categories, and every 5th image (0.86 Hz) was drawn randomly from the fifth category. Participants completed 12 runs, with all categories appearing in all conditions. With as little as 140 s of visual stimulation per category, we observed significant categorical responses to faces, corridors, limbs, and words in the EEG spectrum (at 0.86 Hz and its harmonics) over posterior regions of the scalp. Further, EEG waveforms to these four categories had distinctive spatio-temporal dynamics: using a winner-take-all classifier applied in 33-ms sliding time windows, we successfully decoded from each participant's distributed EEG responses across the scalp which category they were viewing within 200 ms of stimulus onset. Interestingly, the timing of successful decoding varied across categories: decoding of faces and words was earliest (onset latency < 150 ms, peak ~200 ms), followed by corridors (onset ~180 ms, peak ~220 ms) and then limbs (onset ~200 ms, peak ~340 ms).
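
The oddball stimulation schedule described above can be sketched as follows. This is a minimal illustration of the timing logic only; all names, exemplar counts, and structure are assumptions, not the authors' experiment code.

```python
# Sketch of a fast periodic visual stimulation (FPVS) schedule:
# images at 4.286 Hz, with every 5th image drawn from an oddball category.
# Illustrative assumptions throughout; not the authors' implementation.
import random

BASE_HZ = 4.286                       # image presentation rate
ODDBALL_EVERY = 5                     # every 5th image is the oddball category
ODDBALL_HZ = BASE_HZ / ODDBALL_EVERY  # ~0.857 Hz, reported as 0.86 Hz
RUN_SECONDS = 70                      # one run lasts 70 s (~300 images)

def build_run(base_images, oddball_images, seed=0):
    """Return one run's image sequence: an oddball exemplar at every
    5th position, randomly drawn base-category exemplars elsewhere."""
    rng = random.Random(seed)
    n_images = int(BASE_HZ * RUN_SECONDS)
    seq = []
    for i in range(n_images):
        if (i + 1) % ODDBALL_EVERY == 0:
            seq.append(rng.choice(oddball_images))
        else:
            seq.append(rng.choice(base_images))
    return seq

# Hypothetical exemplar names, e.g. faces as the oddball among cars.
seq = build_run([f"car_{k}" for k in range(30)],
                [f"face_{k}" for k in range(30)])
```

Because the oddball recurs exactly every 5th image, any category-selective response is tagged at 0.86 Hz and its harmonics in the EEG spectrum, separate from the 4.286 Hz general visual response.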
Together, this study opens an exciting new avenue for fast measurements of the spatio-temporal dynamics of visual category information in the brain from EEG data of individual participants, including patient and child populations.
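
The windowed winner-take-all decoding described above can be sketched as follows. Array shapes, the window length in samples, and all names are illustrative assumptions, not the authors' analysis code; the idea is simply to correlate a held-out scalp pattern within a short time window against each category's mean training pattern and pick the best match.

```python
# Minimal sketch of winner-take-all decoding of distributed EEG responses
# in a sliding time window (33 ms in the study; here given in samples).
# Illustrative assumptions throughout; not the authors' implementation.
import numpy as np

def wta_decode(train, labels, test, t0, wlen):
    """train: (n_trials, n_channels, n_times) training responses;
    labels: (n_trials,) category labels; test: (n_channels, n_times).
    Correlate the test pattern in window [t0, t0+wlen) with each
    category's mean training pattern; return the winning label."""
    win = slice(t0, t0 + wlen)
    best_label, best_r = None, -np.inf
    for lab in np.unique(labels):
        template = train[labels == lab][:, :, win].mean(axis=0).ravel()
        r = np.corrcoef(template, test[:, win].ravel())[0, 1]
        if r > best_r:
            best_r, best_label = r, lab
    return best_label

# Synthetic check with two made-up category patterns plus noise.
rng = np.random.default_rng(0)
A = rng.normal(size=(8, 20))   # hypothetical category-0 scalp pattern
B = rng.normal(size=(8, 20))   # hypothetical category-1 scalp pattern
train = np.stack([A + 0.1 * rng.normal(size=A.shape) for _ in range(5)]
                 + [B + 0.1 * rng.normal(size=B.shape) for _ in range(5)])
labels = np.array([0] * 5 + [1] * 5)
```

Sliding this window across time and noting when decoding first exceeds chance yields the per-category onset and peak latencies reported in the abstract.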
