Gregory J. Zelinsky, Yifan Peng, Dimitris Samaras; Eye can read your mind: Decoding gaze fixations to reveal categorical search targets. Journal of Vision 2013;13(14):10. doi: https://doi.org/10.1167/13.14.10.
Is it possible to infer a person's goal by decoding their fixations on objects? Two groups of participants categorically searched for either a teddy bear or a butterfly among random-category distractors, each rated as high, medium, or low in similarity to the target classes. Target-similar objects were preferentially fixated in both search tasks, demonstrating that looking behavior carries information about the target category. Different participants then viewed the searchers' scanpaths, superimposed over the target-absent displays, and attempted to decode the target category (bear/butterfly). Bear searchers were classified perfectly; butterfly searchers were classified at 77%. Bear and butterfly Support Vector Machine (SVM) classifiers were also used to decode the same preferentially fixated objects and yielded highly comparable classification rates. We conclude that information about a person's search goal exists in fixation behavior, and that this information can be behaviorally decoded to reveal a search target—essentially reading a person's mind by analyzing their fixations.
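The SVM decoding step described above can be sketched in outline. The following is an illustrative mock-up, not the authors' pipeline: the feature construction (dwell-time-like vectors per searcher) and all data here are invented assumptions, and only the general technique (a linear SVM separating bear searchers from butterfly searchers) reflects the abstract.

```python
# Illustrative sketch: a linear SVM classifying which target category
# (bear vs. butterfly) a searcher was looking for, from SYNTHETIC
# "fixation feature" vectors. The feature design is an assumption,
# not the authors' actual method.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic data: 40 searchers x 10 features (e.g., imagined dwell
# times on high/medium/low target-similar objects). The two groups
# get slightly shifted feature means so the classes are separable.
X_bear = rng.normal(loc=0.0, scale=1.0, size=(20, 10))
X_butterfly = rng.normal(loc=0.8, scale=1.0, size=(20, 10))
X = np.vstack([X_bear, X_butterfly])
y = np.array([0] * 20 + [1] * 20)  # 0 = bear searcher, 1 = butterfly searcher

clf = SVC(kernel="linear")
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```

With real data, each searcher's feature vector would instead summarize which distractor objects they preferentially fixated, and cross-validated accuracy would quantify how much target-category information those fixations carry.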