September 2024, Volume 24, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
Evidence for efficient inverse graphics in the human brain using large-scale ECoG data
Author Affiliations & Notes
  • Daniel Calbick
    Yale University
  • Hakan Yilmaz
    Yale University
  • Greg McCarthy
    Yale University
  • Ilker Yildirim
    Yale University
  • Footnotes
    Acknowledgements: Yale Center for Research Computing
Journal of Vision September 2024, Vol.24, 1502. doi:https://doi.org/10.1167/jov.24.10.1502
Abstract

The efficient inverse graphics (EIG) hypothesis states that, within the initial few hundred milliseconds of visual processing, the brain unfolds a cascade of computations that reverse generative models of how scenes form and project to images, thereby mapping retinal inputs to three-dimensional scene percepts. Here we ask: Is EIG a computational motif of face processing shared between non-human primates and the human brain? In two key domains of vision, the perception of faces and bodies, EIG models have been shown to closely recapitulate processing hierarchies in macaque inferotemporal cortex. These studies used representational similarity analysis (RSA) and high spatio-temporal resolution recordings of single neurons, enabling effective model evaluation. To test EIG in the human brain, we turned to a large-scale electrocorticography (ECoG) dataset from epilepsy patients (N=60). All patients passively viewed a common screening task with faces and other stimulus categories. In addition, each patient viewed a stimulus set consisting only of faces, with different subsets of patients seeing one of several distinct stimulus sets. Analyzing these data is challenging because of the idiosyncratic distribution of electrodes across patients and the differences across stimulus sets. We address these challenges with a data-driven pipeline for the functional and anatomical identification and registration of electrodes in ECoG: we exploit the shared screening task to identify face-informative electrodes in each patient and register electrodes to regions of interest (ROIs; e.g., the fusiform face area). Using RSA, we find that the similarity structure of EIG on a given stimulus set correlates significantly with the empirical similarity matrices computed from the stimulus-evoked time-frequency patterns of electrodes in individual patients. We confirm the reliability of this correlation with a permutation test based on random shuffling of the data. Finally, EIG explains the time-frequency patterns especially well between 200 and 400 ms after stimulus onset. These results suggest that EIG is a shared motif of visual computation across non-human primates and humans.
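
The core analysis described above pairs RSA with a shuffling-based significance test. The sketch below is a minimal, illustrative Python implementation of that general approach, not the authors' pipeline: the names `model_features`, `ecog_tf_patterns`, and `rsa_with_permutation` are hypothetical placeholders, and details such as the dissimilarity metric (1 - Pearson correlation) and the number of permutations are assumptions.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

# Illustrative RSA sketch (hypothetical variable names, not the authors' code):
#   model_features:   (n_stimuli, n_model_units) activations of an EIG model stage
#   ecog_tf_patterns: (n_stimuli, n_features) stimulus-evoked time-frequency
#                     patterns pooled over a patient's face-informative electrodes

def rdm(features):
    """Representational dissimilarity matrix (condensed form, 1 - Pearson r between rows)."""
    return pdist(features, metric="correlation")

def rsa_with_permutation(model_features, ecog_tf_patterns, n_perm=10000, seed=0):
    """Spearman correlation between model and ECoG RDMs, with a null distribution
    built by randomly shuffling stimulus labels of the model features."""
    rng = np.random.default_rng(seed)
    model_rdm = rdm(model_features)
    data_rdm = rdm(ecog_tf_patterns)
    observed, _ = spearmanr(model_rdm, data_rdm)

    n_stim = model_features.shape[0]
    null = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(n_stim)  # shuffle which stimulus goes with which row
        null[i], _ = spearmanr(rdm(model_features[perm]), data_rdm)
    p_value = (np.sum(null >= observed) + 1) / (n_perm + 1)
    return observed, p_value
```

In this framing, the model RDM would be computed from EIG activations on a given face stimulus set and the data RDM from the same patient's stimulus-evoked time-frequency responses, with the permutation p-value indicating whether the observed model-brain correlation exceeds chance.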
