Abstract
Visual perception is driven by expectations regarding familiar concepts and relations in the real world. For instance, trees are expected to be big and apples to be small. Such expectations help to guide behavior and cognition by generating predictions that boost the efficiency of visual processing. When judging which of two objects is visually smaller on the screen, reaction time (RT) is longer when the visual size of the items is incongruent with their real-world size (e.g., a small tree next to a big apple) than when it is congruent, even though real-world size is task-irrelevant (Konkle & Oliva, 2012). Here, we replicated this familiar-size Stroop effect and additionally assessed the influence of retinal position (Experiment 1) and semantic relatedness among objects (Experiment 2). In both experiments, two objects were presented on the same horizontal plane, to the left and right of a central fixation point. In Experiment 1, both objects were positioned either centrally or peripherally. We found Stroop effects of similar magnitude at both positions. Further, while there was no difference in the number of errors, RTs were overall longer for peripheral positions. This delay in RT could reflect reduced recognizability of the objects under peripheral viewing and/or the increased distance between the objects in the periphery. The rationale for Experiment 2 was based on research showing that semantically related objects are recognized better than unrelated objects (e.g., toaster and refrigerator vs. toaster and shower). However, we found no evidence that manipulating semantic relatedness affects the Stroop effect. In sum, our findings indicate that objects’ real-world size is processed automatically not only in central but also in peripheral vision, and regardless of whether neighboring objects are semantically related.