August 2009
Volume 9, Issue 8
Vision Sciences Society Annual Meeting Abstract  |   August 2009
Relating neural object representations to perceptual judgments with representational similarity analysis
Author Affiliations
  • Marieke Mur
    Section on Functional Imaging Methods, Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA and Dept of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, The Netherlands
  • Mirjam Meys
    Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, The Netherlands
  • Jerzy Bodurka
    Functional Magnetic Resonance Imaging Facility, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA
  • Peter Bandettini
    Section on Functional Imaging Methods, Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA, and Functional Magnetic Resonance Imaging Facility, National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA
  • Nikolaus Kriegeskorte
    Section on Functional Imaging Methods, Laboratory of Brain and Cognition, National Institute of Mental Health, Bethesda, MD, USA
Journal of Vision August 2009, Vol.9, 780. doi:https://doi.org/10.1167/9.8.780
Abstract

Human inferior temporal cortex (hIT) has been shown to be involved in the representation of visual objects. Recent studies have begun to investigate the relationship between perceived object similarity and similarity of response patterns in hIT. These studies often used a small set of (novel) stimuli from a few a priori defined categories. Here, we use a stimulus set consisting of 96 object images from a wide range of object categories including faces, body parts, animals, places, and artificial objects. We compare the neural and perceptual similarity structure of these 96 object images using representational similarity analysis.

We performed BOLD fMRI measurements at high resolution (voxel size 1.95 × 1.95 × 2 mm³). Activity in response to 96 different object photos was measured in four subjects. hIT was defined at a range of sizes by selecting the most visually responsive voxels, based on independent data. The neural similarity structure of hIT response patterns was constructed by computing the dissimilarity (1 − correlation) between each pair of object activity patterns. Subjects reported perceptual similarity by arranging the 96 object images in 2D.
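The pairwise-dissimilarity step can be sketched as follows. This is a minimal illustration, not the study's analysis code: the pattern matrix here is random, and the array shapes (96 objects × an assumed number of hIT voxels) are placeholders.

```python
import numpy as np

# Hypothetical data: one activity pattern per object image.
# Rows = 96 objects, columns = hIT voxels (500 is an arbitrary choice;
# random values stand in for real fMRI response patterns).
rng = np.random.default_rng(0)
patterns = rng.standard_normal((96, 500))

# Representational dissimilarity matrix (RDM): 1 - Pearson correlation
# between each pair of object activity patterns. np.corrcoef treats
# each row as one variable, so this yields a 96 x 96 matrix.
rdm = 1.0 - np.corrcoef(patterns)
```

The resulting `rdm` is symmetric with a zero diagonal (every pattern is perfectly correlated with itself); each off-diagonal entry is the correlation distance between one pair of objects.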

The neural and perceptual similarity structures were significantly correlated (r = 0.46, p &lt; 0.0001). This indicates that objects that are perceived as similar tend to elicit similar response patterns in hIT. In addition, both structures showed a categorical organization, with the main clusters being animate and inanimate objects. These findings suggest that, for a wide range of real-world objects, similarity of neural object representations in hIT reflects perceived object similarity.
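Comparing two similarity structures amounts to correlating their RDMs over the object pairs. A minimal sketch, under the assumption that both RDMs are symmetric 96 × 96 matrices: only the upper triangle is used, so each pair is counted once and the zero diagonal is excluded. The function name and the random example matrices are illustrative, not from the study (which also assessed significance, e.g. against the p &lt; 0.0001 threshold reported above).

```python
import numpy as np

def rdm_correlation(rdm_a, rdm_b):
    """Pearson correlation between the upper triangles of two RDMs."""
    iu = np.triu_indices(rdm_a.shape[0], k=1)  # off-diagonal pairs, once each
    return np.corrcoef(rdm_a[iu], rdm_b[iu])[0, 1]

# Random stand-ins for the neural and perceptual RDMs of 96 objects.
rng = np.random.default_rng(0)
neural_rdm = 1.0 - np.corrcoef(rng.standard_normal((96, 500)))
perceptual_rdm = 1.0 - np.corrcoef(rng.standard_normal((96, 500)))

r = rdm_correlation(neural_rdm, perceptual_rdm)
```

An RDM correlated with itself gives r = 1; two unrelated random RDMs give a value near zero.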

Mur, M., Meys, M., Bodurka, J., Bandettini, P., &amp; Kriegeskorte, N. (2009). Relating neural object representations to perceptual judgments with representational similarity analysis [Abstract]. Journal of Vision, 9(8):780, 780a, http://journalofvision.org/9/8/780/, doi:10.1167/9.8.780.
Footnotes
 This research was supported by the Intramural Research Program of the NIH, NIMH.