Vision Sciences Society Annual Meeting Abstract  |   August 2012
High-level neural similarity predicts perceptual competition during encoding of different object categories
Author Affiliations
  • Michael Cohen
    Harvard University
  • Talia Konkle
    University of Trento, Italy
  • Juliana Rhee
    Harvard University
  • Ken Nakayama
    Harvard University
  • George Alvarez
    Harvard University
Journal of Vision August 2012, Vol.12, 1269. doi:https://doi.org/10.1167/12.9.1269
Abstract

Faces, scenes, objects, and bodies evoke distinct but overlapping neural activation patterns in the ventral stream when presented in isolation. If these stimulus categories are presented simultaneously in the visual field, how do they compete for perceptual resources? Is the degree of competition between stimulus categories predicted by the similarity of their individual neural responses?

Participants’ task was to detect changes between two successively presented displays, each containing four items. The items either all came from the same category (e.g., four faces) or from a mixture of two categories (e.g., two faces and two scenes). For each category pairing, we compared change detection performance on same-category and mixed-category trials. Overall, performance was significantly better for mixed-category than for same-category trials, with the size of the effect (Cohen’s d) depending on the pair of categories tested (F = faces, S = scenes, B = bodies, O = objects): F/S=1.24; B/S=1.18; B/F=.97; B/O=.93; F/O=.40; O/S=-.13.
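As a rough illustration of this behavioral comparison, the sketch below computes Cohen’s d for one category pairing from per-participant accuracies. The accuracy values and the pooled-standard-deviation formula are assumptions for illustration only; the abstract does not specify the exact effect-size computation (e.g., pooled versus paired).

```python
# Illustrative sketch only -- not the authors' analysis code.
# Cohen's d for the mixed-category vs. same-category change-detection comparison,
# using a pooled standard deviation (the exact formula used in the study is assumed).
import numpy as np

def cohens_d(x, y):
    """Cohen's d for two samples, based on the pooled standard deviation."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2)
    return (np.mean(x) - np.mean(y)) / np.sqrt(pooled_var)

# Hypothetical per-participant accuracies for one pairing (e.g., faces/scenes)
mixed_acc = np.array([0.82, 0.79, 0.88, 0.75, 0.84])
same_acc = np.array([0.70, 0.68, 0.77, 0.66, 0.73])

print(cohens_d(mixed_acc, same_acc))  # positive d = mixed-category advantage
```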

Next, we used a blocked fMRI design to obtain activation patterns for each stimulus category for images presented in isolation. Neural similarity was calculated as the average Euclidean distance in beta weights across voxels within independently localized regions of interest (the ventral stream excluding V1/V2 and category-selective voxels; category-selective voxels; areas V1/V2; and dorsolateral prefrontal cortex, DLPFC). We observed a significant (p<.05) correlation between the degree of perceptual competition found in the behavioral experiment and neural similarity in the ventral stream (R2=.50) and within category-selective regions (e.g., FFA, PPA; R2=.47), but no significant correlation in low-level regions (V1/V2, R2=.03) or in a non-visual region (DLPFC, R2=.005).
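A hypothetical sketch of this neural-similarity analysis, not the authors’ pipeline: it takes one beta-weight pattern per category within an ROI (random placeholders below), computes the Euclidean distance between each pair of category patterns, and relates those distances to the behavioral effect sizes reported above. The voxel count, ROI definition, and any averaging across runs or subjects are assumptions.

```python
# Hypothetical sketch -- beta-weight patterns, voxel count, and averaging steps are assumed.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
categories = ["faces", "scenes", "bodies", "objects"]
# Placeholder beta-weight patterns: one vector of voxel betas per category within an ROI
betas = {c: rng.normal(size=500) for c in categories}

def pattern_distance(a, b):
    """Euclidean distance between two voxel-wise beta-weight patterns."""
    return np.linalg.norm(betas[a] - betas[b])

# Category pairs in the same order as the behavioral effect sizes from the abstract
pairs = [("faces", "scenes"), ("bodies", "scenes"), ("bodies", "faces"),
         ("bodies", "objects"), ("faces", "objects"), ("objects", "scenes")]
behavioral_d = np.array([1.24, 1.18, 0.97, 0.93, 0.40, -0.13])

distances = np.array([pattern_distance(a, b) for a, b in pairs])
r, p = pearsonr(distances, behavioral_d)
print(f"R^2 = {r**2:.2f}, p = {p:.3f}")
```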

These results show that perceptual competition depends on the categories being encoded and is predicted by the neural similarity of those categories in high-level cortex. They suggest that the ability to represent multiple objects concurrently is limited by the extent to which representing those objects depends on separate underlying neural resources.

Meeting abstract presented at VSS 2012
