Clara Colombatto, Yi-Chia Chen, Brian Scholl; Gazing to look vs. gazing to think: Gaze cueing is modulated by the perception of others' external vs. internal attention. Journal of Vision 2021;21(9):2800. doi: https://doi.org/10.1167/jov.21.9.2800.
What we see depends on where we look, and where we look is often influenced by where others are looking. In particular, when we see another person turn to look in a new direction, we automatically follow their gaze and attend in the same direction -- a phenomenon known as gaze cueing. This reflexive reorienting is adaptive, since people usually shift their gaze to *look* toward the objects or locations they are attending to. But not always: Sometimes people shift their gaze to *think*, as when they look up and away while retrieving information from memory or solving a difficult problem. Such gazes are not directed at any particular external location, but rather signal disengagement from the external world to aid internal focus. Is gaze cueing sophisticated enough to be sensitive to others' (external vs. internal) focus of attention? To find out, we had observers view videos of an actress who is initially looking forward. She is then asked a question, and before responding she looks upward and to the side. The questions themselves concerned either an external stimulus ("Who painted that piece of art on the wall over there?") or an internal memory ("Who painted that piece of art we saw in the museum?"). Despite using identical videos (differing only in their audio tracks), gazes preceded by the 'external' (vs. 'internal') questions elicited far stronger gaze cueing, as measured by the ability to identify a briefly flashed symbol in the direction of the gaze. This effect replicated in multiple samples, and with multiple pairs of 'external' vs. 'internal' questions. This shows how gaze cueing is surprisingly 'smart', and is not simply a brute reflex triggered by others' eye and head movements. And perhaps more importantly, it demonstrates how perception constructs a rich and flexible model of others' attentional states.