Abstract
How do we attend to faces in realistic encounters? Is it, for example, true that we tend to look at somebody's eyes? Most work on face perception has relied on static face presentations, raising the question of whether previous findings actually scale to reality. An intermediate step towards real-world face perception is to use dynamic displays of faces. Here, we monitored participants' eye movements while they watched videos featuring close-ups of pedestrians engaged in interviews. Dynamic interest areas were used to measure fixation distributions on moving face regions, including the eyes, nose, and mouth. Additionally, fixation distributions were analyzed as a function of events such as speech, head movement, or gaze direction. Contrary to previous findings with static displays, we observed no general preference to fixate the eyes. Rather, gaze was adjusted moment to moment to the dynamics of the face: when a depicted face was speaking, participants looked more at the mouth, whereas the eyes were preferentially fixated when the face engaged with the viewers by looking straight into the camera. Further, when two faces were present and one looked at the other, viewers followed the observed gaze from one face to the other. Thus, especially in dynamic displays, observed gaze direction seems to promote gaze following. Interestingly, when a face moved quickly, participants tended to look more at the nose than at any other face region. We interpret this “nose tracking” as a strategy of adopting a centered viewing position to optimally track and monitor moving faces. All in all, these findings provide evidence for the wealth of moment-to-moment adjustments of gaze control that become necessary when viewing dynamic faces. Since human interaction relies heavily on understanding the information conveyed by facial movements, it is of key interest to learn more about gaze dynamics during the viewing of dynamic faces.