Abstract
The human face is central to our social interactions, as it allows us to form judgements about identities, intentions, and emotions. Recent studies have shown that, while gazing at faces, each of us has a particular eye-scanning pattern that is highly stable over time. Although variables such as culture or personality have been shown to modulate gaze behaviour, we still do not know what shapes these idiosyncrasies. Here we demonstrate that the genders of both the participant (gazer) and the person being observed (actor) strongly influence gaze patterns during face exploration. We make use of the largest set of eye-tracking data recorded from participants watching videos of another person (405 participants, 58 nationalities). Using advanced data-mining techniques, we build a data-driven model that encapsulates the highly dynamic and individualistic dimensions of a participant's gaze behaviour. Female gazers follow a much more exploratory scanning strategy than males, who focus on the eye region. Moreover, female gazers watching female actresses look more at the eye on the left side. This result has strong implications for every field using gaze-based models, including the diagnosis of disorders such as autism, where a substantial sex-ratio difference exists. We also show that a classifier trained solely on eye-tracking data can identify the gender of both the gazer and the actor with very high accuracy. The finding that a gender fingerprint can be extracted from how people observe others reveals widespread, stereotypical face-scanning behaviours that fundamentally differ between men and women.
Meeting abstract presented at VSS 2016