Abstract
Most studies of attention use experimental designs with short trial structures and simple two-dimensional displays. These studies have demonstrated that search can be made more efficient by attending to items that match a target along categorical and feature dimensions. However, real-world visual search unfolds across space over extended periods of time, such as when shopping for items from a list. It is therefore unclear whether the principles learned from lab studies extend to naturalistic search. In the present study, participants first explored a large (4,216 sq m) IKEA-like furniture store in virtual reality (VR) prior to a surprise visual search task in which they “shopped” for items in the store. In this shopping task, participants sequentially searched for 10 pieces of furniture, each located in a display room containing objects of the same type (e.g., lamps, couches, beds). Fixations were measured via eye tracking. The primary question of interest was how target-matching category features (e.g., small, round, yellow, lamp) guide attention and looking during navigation toward, and then within, the target’s display room. Results (N = 42) showed that, prior to entering the target’s display room, fixations were distributed across objects with target-matching features. Once the target’s display room was entered, however, search proceeded hierarchically: guidance was dominated first by a highly diagnostic feature (size or shape), followed by a secondary diagnostic feature, and then by color. Feature diagnosticity depended on the objects themselves and the organization of the room. Notably, in contrast to lab studies, color was not a prioritized feature for any target. These findings demonstrate how categorical and feature-based attentional guidance operate hierarchically to improve search efficiency across large-scale, naturalistic environments.