Abstract
We present a new computational model of visual search that builds on prior theoretical work by Buetti and Lleras emphasizing the contributions of parallel peripheral processing to visual search performance. The model uses concurrent parallel (distributed attention) and serial (focused attention) evaluative processes to inspect items in a visual display. Search items are assigned random priorities for attentional selection. These priorities immediately begin to decay and are refreshed based on feature similarity to the search template. Items are stochastically selected for focused attention according to Luce's choice axiom defined over their priorities. Each selected item is matched against the search template and either accepted as the target or rejected as a distractor. During this serial process, the priorities of the remaining search items are updated in parallel, in proportion to their proximity to fixation. The resulting model successfully simulates the logarithmic search slopes typically found in human data when target-distractor similarity is medium to low (e.g., Buetti et al., 2016; Wang et al., 2017). It also produces linear search slopes when target-distractor similarity is high. We present simulations of these and other classic visual search phenomena, such as the difference between feature and conjunction search, as well as search asymmetries.
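To make the selection cycle described above concrete, the following is a minimal sketch (not the authors' implementation) of one plausible reading of the mechanism: random initial priorities decay over time, are refreshed in proportion to similarity and proximity to fixation, and items are chosen for focused attention via Luce's choice rule until the template-matching item is accepted. All parameter names and values (decay_rate, refresh_gain, proximity) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_items = 8
priorities = rng.uniform(0.1, 1.0, n_items)    # random starting priorities
similarity = rng.uniform(0.0, 1.0, n_items)    # feature similarity to the search template
proximity = rng.uniform(0.5, 1.0, n_items)     # closeness to fixation (assumed scaling)
decay_rate, refresh_gain = 0.9, 0.1            # illustrative constants, not from the paper

target = int(np.argmax(similarity))            # treat the most template-like item as the target
rejected = np.zeros(n_items, dtype=bool)
n_inspections = 0

while True:
    # Luce's choice rule over the priorities of items not yet rejected
    weights = np.where(rejected, 0.0, priorities)
    chosen = rng.choice(n_items, p=weights / weights.sum())
    n_inspections += 1
    if chosen == target:
        break                                  # matches the template: accept as the target
    rejected[chosen] = True                    # mismatch: reject as a distractor
    # Parallel update during the serial check: decay plus proximity-weighted refresh
    priorities = priorities * decay_rate + refresh_gain * proximity * similarity

print(f"Target found after {n_inspections} inspections")
```

Under this reading, the number of inspections before acceptance is the quantity that would drive reaction-time predictions; how decay, refresh, and proximity weighting are actually parameterized is specified in the model itself, not in this sketch.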