Abstract
Our visual environments contain multiple sources of information relevant to categorization. These information sources are usually operationalized as component “features” within a comprehensive conceptual space, and scene categories can be differentiated according to many perceptual and conceptual features. However, the task being performed informs which features may be preferentially processed (Schyns, 1998). The goal of the current study was to assess how changing task demands influence feature usage over time, as indexed by event-related potentials (ERPs). Participants viewed repeated presentations of 28 different scene images while making cued judgments about 1) relative orientation biases, 2) the presence of various objects, or 3) the functions afforded by the scenes. The images were selected to create maximally different representational dissimilarity matrices (RDMs) based on those three features. The stimuli (subtending 18.5 degrees of visual angle) were presented for 500 ms each, and brain activity was recorded via 128-channel EEG. Our analysis followed an encoding approach: for each task, we used the orientation, object, and function RDMs to predict the ERP activity of each electrode in a sliding 40 ms time window. The results revealed that each RDM predicted unique ERP variance at specific time intervals (orientation at ~95 ms; objects and functions at 150–175 ms) and with different scalp topographies. The orientation RDM predicted the same occipital scalp topography regardless of task, while the object and function RDMs predicted different central and parietal scalp topographies across tasks. Together, these findings suggest that the processing of low-level visual features such as orientation is less subject to changing task demands, whereas task demands strongly influence the processing of high-level features such as functions and objects over time and across the scalp. These results provide insight into the types of visual processes that seem to be encapsulated from higher-level cognitive processes (Firestone & Scholl, 2016).
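To make the encoding approach concrete, the following is a minimal sketch of one plausible implementation, assuming an RSA-style regression in which the vectorized model RDMs jointly predict a neural RDM computed per electrode in each 40 ms window. The synthetic data, the 1 kHz sampling rate, the 10 ms step size, the absolute-amplitude-difference neural RDM, and all variable names are illustrative assumptions, not the authors' actual pipeline.

```python
# Sketch of a sliding-window RDM encoding analysis (illustrative assumptions
# throughout: synthetic data, 1 kHz sampling, 10 ms step, least-squares fit).
import numpy as np

n_images, n_electrodes, n_samples = 28, 128, 500  # 500 ms epochs at 1 kHz (assumed)
rng = np.random.default_rng(0)

def random_rdm(n):
    """Stand-in for a model RDM: symmetric, zero-diagonal dissimilarity matrix."""
    m = rng.random((n, n))
    m = (m + m.T) / 2
    np.fill_diagonal(m, 0.0)
    return m

model_rdms = {name: random_rdm(n_images)
              for name in ("orientation", "object", "function")}

# Synthetic ERPs: images x electrodes x time samples
erp = rng.standard_normal((n_images, n_electrodes, n_samples))

# Design matrix: intercept plus the upper triangle of each model RDM
tri = np.triu_indices(n_images, k=1)
X = np.column_stack([np.ones(len(tri[0]))]
                    + [rdm[tri] for rdm in model_rdms.values()])

win, step = 40, 10  # 40 ms window from the abstract; 10 ms step is assumed
starts = range(0, n_samples - win + 1, step)
r2 = np.zeros((n_electrodes, len(starts)))

for w, start in enumerate(starts):
    # Mean amplitude per image in this window (images x electrodes)
    window_mean = erp[:, :, start:start + win].mean(axis=2)
    for e in range(n_electrodes):
        # Neural RDM for this electrode: pairwise absolute amplitude differences
        neural = np.abs(window_mean[:, e, None] - window_mean[None, :, e])[tri]
        beta, *_ = np.linalg.lstsq(X, neural, rcond=None)
        resid = neural - X @ beta
        ss_res = resid @ resid
        ss_tot = (neural - neural.mean()) @ (neural - neural.mean())
        r2[e, w] = 1 - ss_res / ss_tot  # variance explained per electrode/window
```

The unique variance attributed to each RDM, as reported above, would then follow from variance partitioning, for example by comparing the R² of this full model against reduced models that omit one RDM at a time.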
Acknowledgement: National Science Foundation (1736394) and James S. McDonnell Foundation (220020439) grants to BCH, National Science Foundation (1736274) grant to MRG.