Tim Lauer, Verena Willenbockel, Melissa Vo; Do inverted scenes modulate semantic object processing? Behavioral and electrophysiological insights. Journal of Vision 2017;17(10):1347. doi: https://doi.org/10.1167/17.10.1347.
Objects presented in semantically related contexts are recognized better than objects in unrelated surroundings. What information in a scene yields this facilitative consistency effect? Here, we investigated whether seeing scenes in an unfamiliar orientation (rotated 180 degrees in the picture plane) modulates the semantic processing of superimposed objects. We paired indoor and outdoor scenes with either an indoor or an outdoor thumbnail object to manipulate semantic consistency. We then presented either consistent or inconsistent upright thumbnail objects on three types of backgrounds: upright scenes, inverted scenes, and scrambled scenes (control condition). In Experiment 1, stimuli were grayscale and the critical object was presented for 56 ms, centered on the background image and followed by a dynamic mask. Participants were instructed to name the object, while the background image was task-irrelevant. On upright scenes, consistent objects were named more accurately than inconsistent objects. Inverted scenes showed only a non-significant trend in the same direction, whereas the control condition showed no difference. In Experiment 2, participants saw the same stimuli in color for 2000 ms each and completed a repetition detection task involving both objects and backgrounds. We recorded event-related potentials (ERPs) as a possibly more sensitive measure, focusing on the N400 component, which was originally linked to semantic access in language processing and, more recently, to scene perception as well. If inverted scenes indeed modulate semantic object processing, inconsistent objects should elicit more negative ERPs than consistent objects. While inconsistent versus consistent objects on upright scenes triggered a frontal negativity in the N400 time window (350-600 ms), neither inverted scenes nor control images elicited differential ERPs in this time window.
Together, our behavioral and electrophysiological data suggest that scene inversion, which preserves low-level image properties (except for phase), strongly limits contextual influences on semantic object processing.
Meeting abstract presented at VSS 2017