Martin Eimer; The time course of feature-based and object-based control of visual attention. Journal of Vision 2015;15(12):1394. doi: 10.1167/15.12.1394.
Many models of attentional control in vision assume that the allocation of attention is initially guided by independent representations of task-relevant visual features, and that the integration of these features into bound objects occurs at a later stage that follows their feature-based selection. This presentation will report results from recent event-related brain potential (ERP) experiments that measured online electrophysiological markers of attentional object selection to dissociate feature-based and object-based stages of selective attention in real time. These studies demonstrate the existence of an early stage of attentional object selection that is controlled by local feature-specific signals. During this stage, attention is allocated in parallel and independently to visual objects with target-matching features, irrespective of whether another target-matching object is simultaneously present elsewhere. From around 250 ms after stimulus onset, information is integrated across feature dimensions, and attentional processing becomes object-based. This transition from feature-based to object-based attentional control is found not only in tasks where target objects are defined by a combination of simple features (such as colour and form), but also when one of the two target attributes is defined at the categorical level (letter versus digit). Overall, the results of these studies demonstrate that feature-based and object-based stages of attentional selectivity in vision can be dissociated in real time.