Abstract
Recent fMRI studies have shown that basic-level object categories can be distinguished by the differential patterns of activity that they evoke in human ventral temporal cortex (Haxby et al., 2001). Here, we investigated whether distributed patterns of fMRI activity can differentiate objects at the subordinate level, and tested whether these distributed representations reflect the coding of local, low-level features or position-invariant, high-level features. We measured fMRI activity while subjects viewed images of 1 of 8 bird species (e.g., seagull, penguin). Birds were presented at random orientations, either at fixation (Exp 1) or in the upper- and lower-left visual field (Exp 2). Correlational analyses were used to evaluate whether the different bird species could be reliably classified by comparing activity patterns on test trials to those evoked by each species on training trials. When training and test stimuli were both presented at fixation, birds were correctly classified on 70% of trials (chance = 50%) based on activity patterns in ventral temporal cortex. However, activity patterns in retinotopic visual cortex were equally effective at subordinate-level discriminations (73% correct), suggesting that local, low-level feature information alone might entirely account for successful classification performance. In Exp 2, activity patterns in ventral temporal areas were effective at discriminating between different bird species irrespective of whether the test and training stimuli were presented in the same location or different locations (67% and 62% correct, respectively). In contrast, activity patterns in retinotopic cortex failed to generalize across changes in location, yielding chance levels of discrimination performance. Our results demonstrate that ventral temporal areas contain flexible, position-invariant information that effectively discriminates the subtle differences between subordinate-level objects.
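The correlational classification scheme described above (in the style of Haxby et al., 2001) can be illustrated with a minimal sketch: a held-out test pattern is assigned to whichever training condition's mean activity pattern it correlates with most strongly. The voxel patterns, species names, and noise level below are illustrative assumptions, not the study's actual data.

```python
import numpy as np

def correlation_classify(test_pattern, train_patterns):
    """Assign a test activity pattern to the training condition whose
    mean pattern it correlates with most strongly (Pearson r)."""
    labels = list(train_patterns)
    rs = [np.corrcoef(test_pattern, train_patterns[lab])[0, 1]
          for lab in labels]
    return labels[int(np.argmax(rs))]

# Toy example: two "species" with distinct 6-voxel mean patterns
# (hypothetical values, for illustration only).
rng = np.random.default_rng(0)
train = {"seagull": np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0]),
         "penguin": np.array([0.0, 1.0, 0.0, 1.0, 0.0, 1.0])}

# A noisy "seagull" test trial is still classified correctly.
test = train["seagull"] + rng.normal(0.0, 0.3, 6)
print(correlation_classify(test, train))
```

Classification accuracy is then estimated as the proportion of such test trials assigned to the correct condition, with 50% as the chance level for a two-way comparison.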
Supported by NIH grants R01-EY14202 and P50-MH62196 to FT