Abstract
In addition to low-level features, objects in our environment readily elicit high-level content, such as meaning (semantic information). Both low-level (e.g., color, shape) and high-level (e.g., semantic) features of objects have been shown to influence attentional allocation. Most evidence for semantic guidance of attention, however, has been garnered from tasks in which semantic information is task-relevant. Whether the semantic information of task-irrelevant objects guides attentional allocation remains an open question. It is hypothesized that, given its availability and strength of representation, semantic information may bias attention even when it has no predictive value for the task, that is, when it is task-irrelevant. Specifically, when multiple task-irrelevant objects are presented, attention is guided, or prioritized, to the subset of objects that are semantically related, creating a grouping-like effect. If this grouping is driven by semantic relatedness, then performance should depend on the size of the group, decreasing as the number of related objects within the group increases. In the present study, four objects were presented on a screen, equidistant from a central fixation point. The number of semantically related objects varied from none to all four. After a fixed interval, a target appeared at random on one of the four objects; its location was not predicted by semantic relatedness. Individuals’ perceptions of semantic relatedness, and their effect on semantic grouping, were assessed through multidimensional scaling (MDS; Kriegeskorte & Mur, 2012). It was observed that as the number of semantically related objects increased, accuracy for identifying a target presented on one of the semantically related objects decreased. This result suggests that attentional grouping is modulated by high-level semantic information, even for task-irrelevant objects.
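To illustrate the kind of MDS analysis referenced above (Kriegeskorte & Mur, 2012), the sketch below embeds a small set of objects in two dimensions from a pairwise dissimilarity matrix, so that related objects cluster together. The object names, rating values, and use of scikit-learn's MDS are illustrative assumptions, not the study's actual stimuli or analysis pipeline.

```python
# Minimal sketch (not the authors' code): quantifying perceived semantic
# relatedness with multidimensional scaling (MDS).
import numpy as np
from sklearn.manifold import MDS

# Hypothetical stimuli: three semantically related objects and one unrelated.
objects = ["fork", "spoon", "plate", "bicycle"]

# Hypothetical pairwise dissimilarity ratings (0 = identical, 1 = unrelated),
# e.g., averaged across participants; symmetric matrix with a zero diagonal.
dissimilarity = np.array([
    [0.0, 0.2, 0.3, 0.9],
    [0.2, 0.0, 0.3, 0.9],
    [0.3, 0.3, 0.0, 0.8],
    [0.9, 0.9, 0.8, 0.0],
])

# Embed the objects in 2-D so inter-point distances approximate the ratings.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)

for name, (x, y) in zip(objects, coords):
    print(f"{name}: ({x:.2f}, {y:.2f})")
# Semantically related objects (fork, spoon, plate) should cluster together,
# while the unrelated object (bicycle) should fall farther away.
```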
Acknowledgement: NSF BCS-1534823 to S.S.