Abstract
Humans can flexibly select locations, features, or objects in a scene for prioritized processing. Although it is relatively straightforward to manipulate location- and feature-based attention, it is difficult to isolate object-based selection. A critical problem is that objects are always composed of features; thus, most studies of object-based selection can be explained in terms of selection of a combination of locations and features. Here we aimed to demonstrate a pure case of object-based selection. Subjects viewed two superimposed gratings that continuously changed in color, orientation, and spatial frequency. On each trial, each grating started with a fixed set of features and then evolved continuously along a fixed trajectory in feature space. Over the course of a trial (6 s), the two gratings traversed exactly the same values in each feature dimension and were thus completely equivalent in terms of visual features. Subjects were cued at the beginning of each trial to attend to one or the other grating in order to detect a brief, threshold-level luminance increment. We first trained subjects to criterion on the behavioral task and then measured their brain activity with fMRI during the task. We found a network of occipital and frontoparietal areas active during the tracking task. The mean fMRI response amplitude did not differ between the two attention conditions (attend grating 1 vs. attend grating 2) in any area. However, using multi-voxel pattern classification, we were able to decode the attended grating in a set of frontoparietal areas, including the intraparietal sulcus (IPS), frontal eye field (FEF), and inferior frontal junction (IFJ). Early visual areas exhibited weaker decoding accuracies. Thus, a perceptually varying object can be represented by patterned neural activity in frontoparietal areas. We suggest that these areas encode attentional priority for high-level objects independent of their locations and features.
Meeting abstract presented at VSS 2013