Abstract
When navigating our environment, we encounter various cues that may help us orient attention. Which cues we attend to, and under what conditions we decide to use them, may prove beneficial or detrimental to our goals. Moreover, individuals differ in their use of cues; for instance, it has been shown that individuals sometimes ignore cues that validly predict the target location. To explain this failure to use presumably beneficial information, it has been proposed that people weigh the temporal costs of processing the cue against the temporal costs of not using it (the "Least Costs Hypothesis," or LCH; Pauszek & Gibson, 2018). That is, if the reaction time (RT) slowing due to cue processing exceeds the RT benefit of using the cue, observers will ignore the cue. Previous work supporting the LCH has focused on group-averaged data, but it remains unknown whether individuals vary in their choice of least costs. To address this, we examined individual differences in cue usage. We employed two different cueing methods: an arrow cue and a verbal cue, each of which predicted the target location in a 4-item search display with 70% validity. Cue usage was calculated by subtracting valid-trial RTs from invalid-trial RTs; cue usage and overall RTs were then compared across individuals. According to the LCH, individual differences in cue usage should reflect individual variation in the relative costs of processing vs. ignoring cues. However, results showed that increased cue usage corresponded with slower overall RTs, suggesting that greater cue usage had detrimental effects on overall performance, thus departing from the LCH. Taken together, these results indicate that while the LCH accounts to some degree for overall usage of valid spatial cues, individuals vary in the degree to which they optimize performance via cue usage.
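As an illustrative aside, the sketch below shows how the cue-usage measure and the LCH cost comparison described above could be computed from trial RTs. The example data and all parameter names (processing_cost, benefit_valid, cost_invalid, validity) are hypothetical and are not taken from the present study or from Pauszek & Gibson (2018); this is a minimal sketch of the underlying arithmetic, not the authors' analysis code.

```python
import statistics

# Hypothetical per-trial RTs in seconds. "Valid" trials: the cue correctly
# predicted the target location (70% of trials); "invalid" trials: it did not.
valid_rts = [0.52, 0.49, 0.55, 0.51, 0.50, 0.48, 0.53]
invalid_rts = [0.63, 0.66, 0.61]

# Cue usage, as defined in the abstract: invalid-trial RT minus valid-trial RT.
# Larger positive values imply stronger reliance on the cue.
cue_usage = statistics.mean(invalid_rts) - statistics.mean(valid_rts)
print(f"cue usage: {cue_usage * 1000:.0f} ms")

# LCH-style trade-off (sketch): use the cue only when its expected RT benefit
# exceeds the fixed RT cost of processing it; otherwise ignore it.
def should_use_cue(processing_cost, benefit_valid, cost_invalid, validity=0.7):
    expected_benefit = validity * benefit_valid - (1 - validity) * cost_invalid
    return expected_benefit > processing_cost

print(should_use_cue(processing_cost=0.03, benefit_valid=0.08, cost_invalid=0.05))
```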