Abstract
Perceptual attentional effects have typically been characterized in terms of changes in sensitivity or response time. Here we examine observers' decisional behavior under conditions thought to introduce attentional effects. We take advantage of a series of studies measuring changes in sensitivity (d') and decision criterion (zFA) between single and dual tasks (standard visual and auditory detection and discrimination tasks) to reveal an unexpected relationship between decisional behavior and sensitivity. The data show that while observers adopt a quasi-optimal decision criterion (in the Signal Detection Theory sense) in single tasks, they depart from it in dual tasks, showing criteria convergence. In the extreme case, observers use a unique criterion (uc), in accordance with a model whereby decisions are based on a unique internal representation. Depending on the nature of the task and of the stimuli used, uc occurs in experiments showing no sensitivity drop, whereas separate criteria are used in experiments showing a sensitivity drop. The criteria ratio (zFA2/zFA1, where 1 and 2 refer to the less and more salient targets in a pair) in the dual task was found to be highly correlated with the d' reduction between the single and dual conditions (explaining more than 90% of the variance in the data). This correlation is accounted for by a model positing that observers always use a uc in dual tasks (Sagi & Gorea, VSS 2004) and that the observed departures from it reflect an unequal increase in the internal noise associated with the two targets. According to the model, the less salient stimulus yields a larger internal noise increment than the more salient one. Hence, both sensitivity losses and departures from optimality in dual tasks appear to be determined by the same process and can be used interchangeably as indices of attentional dispersal. This is the first demonstration of an attentional link between sensitivity and decisional impairments in dual tasks.
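For readers less familiar with the Signal Detection Theory quantities used above, the sketch below shows how d' and the criterion measure zFA are derived from hit and false-alarm rates, and how the criteria ratio zFA2/zFA1 is formed for a pair of targets. The hit/false-alarm rates are hypothetical illustrative values, not data from the studies summarized here.

```python
from statistics import NormalDist

def z(p):
    """Inverse standard-normal CDF (probit transform)."""
    return NormalDist().inv_cdf(p)

def sdt_indices(hit_rate, fa_rate):
    """Return sensitivity d' and the criterion measure zFA
    from a condition's hit and false-alarm rates."""
    d_prime = z(hit_rate) - z(fa_rate)
    zfa = z(fa_rate)
    return d_prime, zfa

# Hypothetical rates for two targets of unequal salience in a dual task
d1, zfa1 = sdt_indices(0.84, 0.16)  # 1 = less salient target
d2, zfa2 = sdt_indices(0.93, 0.07)  # 2 = more salient target

# Criteria ratio as defined in the abstract (zFA2/zFA1)
ratio = zfa2 / zfa1
```

Under the unique-criterion (uc) account, the two zFA values would coincide and the ratio would approach 1; departures from 1 index the unequal internal-noise increments described by the model.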