September 2019, Volume 19, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
Neural Dynamics of Category Representations Across Space and Time in the Ventral Visual Cortex
Author Affiliations & Notes
  • Yalda Mohsenzadeh
    Computer Science and Artificial Intelligence Lab, MIT
  • Caitlin Mullin
    Computer Science and Artificial Intelligence Lab, MIT
  • Benjamin Lahner
    Computer Science and Artificial Intelligence Lab, MIT
  • Aude Oliva
    Computer Science and Artificial Intelligence Lab, MIT
Journal of Vision September 2019, Vol.19, 60b. doi:https://doi.org/10.1167/19.10.60b
Abstract

Previous neuroscience work has established a functional organization across the ventral visual stream in which distinct categories are processed in different brain areas, e.g., the parahippocampal cortex (PHC) for places, the object-selective lateral occipital complex (LOC) for objects, and the fusiform gyrus for faces and animates. However, how the categorical representations in these specialized brain regions unfold over time remains an open question. Here we extended the MEG-fMRI fusion approach of Cichy et al. (2014, 2016) to reveal the category-related spatiotemporal neural dynamics of vision in the human brain. We collected fMRI and MEG data while participants (N=15) viewed 156 natural images from four categories: objects, scenes, faces, and animates (people and animals). We created theoretical model representational dissimilarity matrices (RDMs) encoding category membership (higher similarity within a category than between categories). Using representational similarity analysis, for each category we correlated spatially resolved fMRI RDMs with temporally resolved MEG RDMs while partialling out the other three category models. We then localized category-related neural activity in the fMRI data with a searchlight analysis, comparing each category model RDM with the searchlight fMRI RDMs. Finally, we filtered the partial fMRI-MEG representational correlations with the spatially localized category content, yielding a spatiotemporal map per category while controlling for the other categories. Scene and object category processing started at ~80 ms in early visual cortex and reached medially to PHC at ~110 ms and laterally to LOC at ~100 ms, respectively, both peaking at ~135 ms. Body category content emerged in LOC at ~90 ms and extended to the fusiform gyrus at ~110 ms. The fusiform gyrus showed significant face category processing starting at 70 ms and peaking at ~140 ms. Together, our results reveal the spatiotemporal dynamics of category representations in the ventral stream at millimeter and millisecond resolution.
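The category-specific fusion step described above can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes NumPy/SciPy, binary within-category model RDMs, placeholder data of arbitrary shape, and a rank-then-residualize implementation of the partial Spearman correlation; all variable names are hypothetical.

import numpy as np
from scipy.stats import rankdata, pearsonr

n_images = 156  # stimuli: objects, scenes, faces, animates (people and animals)

def model_rdm(labels, category):
    """Binary model RDM: dissimilarity 0 within the category of interest, 1 otherwise."""
    member = np.asarray(labels) == category
    rdm = 1.0 - np.outer(member, member)
    np.fill_diagonal(rdm, 0.0)
    return rdm

def upper(rdm):
    """Vectorize the upper triangle of an RDM, excluding the diagonal."""
    i, j = np.triu_indices(rdm.shape[0], k=1)
    return rdm[i, j]

def partial_spearman(x, y, covariates):
    """Spearman correlation of x and y with the covariate RDMs partialled out:
    rank-transform, regress the covariates out of both, correlate the residuals."""
    xr, yr = rankdata(x), rankdata(y)
    C = np.column_stack([rankdata(c) for c in covariates] + [np.ones_like(xr)])
    def resid(v):
        beta, *_ = np.linalg.lstsq(C, v, rcond=None)
        return v - C @ beta
    return pearsonr(resid(xr), resid(yr))[0]

# Placeholder inputs: one searchlight fMRI RDM and a time series of MEG RDMs.
labels = np.random.choice(["object", "scene", "face", "animate"], n_images)
fmri_rdm = np.random.rand(n_images, n_images)        # one searchlight location
meg_rdms = np.random.rand(200, n_images, n_images)   # e.g., 200 time points

models = {c: upper(model_rdm(labels, c)) for c in ["object", "scene", "face", "animate"]}
target = "scene"
covariates = [models[c] for c in models if c != target]  # other three category models

# Category-specific fusion time course: correlate the fMRI RDM with each
# time-resolved MEG RDM while partialling out the other three category models.
fusion = np.array([
    partial_spearman(upper(fmri_rdm), upper(meg_rdms[t]), covariates)
    for t in range(meg_rdms.shape[0])
])

In the full analysis, this partial correlation would be computed for every searchlight location and time point and then filtered by the searchlight-based localization of each category, yielding the per-category spatiotemporal maps reported in the abstract.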

Acknowledgement: This research was funded by NSF grant 1532591 (Neural and Cognitive Systems) and by the Vannevar Bush Faculty Fellowship program, funded by ONR grant N00014-16-1-3116. The experiments were conducted at the Athinoula A. Martinos Imaging Center at the McGovern Institute for Brain Research, Massachusetts Institute of Technology.