Abstract
Objects do not exist in isolation; they exist in relation to each other. Recognizing that a printed Journal of Vision article is below a desk lamp is thought to require a focal shift of attention between objects (Franconeri et al., 2012). Relatedly, target object pairs produce highly inefficient search slopes when distractors differ only in their relative arrangement (X above Y vs. X below Y), suggesting a serial process with no attentional guidance (Logan, 1994). Rather than searching for an article below a lamp, observers might search for the features of an article and/or a lamp and then check their relative arrangement. We tested these hypotheses in two experiments that recorded eye movements and used previewed random pairs of trial-unique real-world objects. Experiment 1 asked participants to search for object pairs that differed from distractors only in their relative arrangement, preventing target-feature guidance from individual objects; guidance would instead require a holistic representation that captures the relative arrangement. Experiment 2 asked participants to search for target pairs among heterogeneous distractor pairs, which allows target-feature guidance from individual objects. The target's relative arrangement was swapped on half of the trials (indicating target absence). Above-chance search guidance in Experiment 1, and significantly more guidance for matched than for swapped targets in Experiment 2, argue for small but significant search guidance based on the relative arrangement of objects (operationalized as the proportion of trials on which the target pair was fixated first and the number of distractors fixated before the target; all p < .05). We conclude that the computation of relative object arrangement may not be strictly serial, and that observers can extract and use a relative-object-arrangement feature to guide search.
Future work will explore related features that generate small amounts of guidance in other highly inefficient searches.