Abstract
It has long been known that the visual task greatly influences eye movement patterns. Perhaps the best demonstration of this is the celebrated study of Yarbus, showing that different eye movement scanpaths emerge depending on the given visual task. The forward Yarbus process, i.e., the effect of the visual task on the eye movement pattern, has been investigated for various tasks. In this work, we have developed an inverse Yarbus process whereby we can infer the visual task from measurements of a viewer's eye movements made while the task is being executed. To do so, we first need to track the allocation of attention, since different tasks entail attending to different locations in an image; tracking attention therefore leads us to task inference. Eye position alone does not tell the whole story when it comes to tracking attention: while there is a well-known, strong link between eye movements and attention, the attentional focus is nevertheless frequently well away from the current eye position. Eye tracking methods may be appropriate when the subject is carrying out a task that requires foveation, but they are of little use (and even counterproductive) when the subject is engaged in tasks requiring peripheral vigilance. The model we have developed for attention tracking uses Hidden Markov Models (HMMs), in which covert (and overt) attention is represented by the hidden states of task-dependent HMMs. Fixation locations thus correspond to the observations of an HMM; they were used to train the task-dependent models (via the Baum-Welch algorithm), which in turn allowed us to evaluate the likelihood of observing an eye trajectory given a task (via the forward algorithm). Given this likelihood term, we were able to apply Bayesian inference and recognize the ongoing task from the eye movements of subjects performing a number of simple visual tasks.
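Concretely, writing $O = o_1, \dots, o_T$ for the observed fixation sequence and $t_1, \dots, t_K$ for the candidate tasks, the recognition step amounts to Bayes' rule (a standard restatement of the inference described above, not an equation taken from the text):
\[
P(t_k \mid O) \;=\; \frac{P(O \mid t_k)\, P(t_k)}{\sum_{j=1}^{K} P(O \mid t_j)\, P(t_j)},
\]
where $P(O \mid t_k)$ is the forward-algorithm likelihood of the trajectory under the $k$-th task-dependent HMM, and the task with the highest posterior is reported.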
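As a rough sketch of how such a pipeline could be implemented (assuming the hmmlearn library with Gaussian emissions over 2-D fixation coordinates; the library, the emission model, and every identifier below are illustrative assumptions, not the implementation described above):

```python
import numpy as np
from hmmlearn import hmm  # assumed library; provides Baum-Welch (fit) and forward likelihood (score)

def train_task_models(fixations_by_task, n_states=5):
    """Train one HMM per task. fixations_by_task maps a task name to an
    (N, 2) array of fixation (x, y) coordinates (hypothetical data layout)."""
    models = {}
    for task, X in fixations_by_task.items():
        m = hmm.GaussianHMM(n_components=n_states, covariance_type="full", n_iter=100)
        m.fit(X)  # Baum-Welch (EM) estimation of transition and emission parameters
        models[task] = m
    return models

def infer_task(models, trajectory, priors):
    """Bayes' rule in log space: log P(task | O) = log P(O | task) + log P(task) + const.
    model.score computes log P(O | task) with the forward algorithm."""
    log_post = {task: m.score(trajectory) + np.log(priors[task])
                for task, m in models.items()}
    return max(log_post, key=log_post.get)
```

With a uniform prior over tasks, the arg-max over posteriors reduces to picking the task whose HMM assigns the highest forward likelihood to the observed trajectory.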