Abstract
In recent years there has been a shift from desktop-based eye tracking to mobile eye tracking to facilitate experimentation in naturalistic environments. One substantial challenge of mobile eye tracking is inadvertent movement of the eye tracking camera with respect to the head, or “slippage”. Studies have shown that slippage substantially reduces gaze accuracy. Slippage often accumulates over long recording sessions and consequently constrains the duration and type of activities that can be recorded with mobile eye tracking. One solution is to perform gaze calibration multiple times across a session. However, this is cumbersome, and slippage between calibrations can still reduce gaze accuracy. In this study, we compensate for slippage by using a deep neural network to track the eyelids across frames of the eye video and using them to estimate the gradual drift of the eye with respect to the eye camera. We track the eyelids only in frames with a given eye width, under the assumption that, absent slippage, the eyelid position for a fixed eye width should not change over time. We use five keypoints along these selected eyelid outlines to calculate the horizontal and vertical slippage of the eye camera with respect to the first eye image after calibration. Using rigid body transformations to account for slippage, we correct the pupil estimates before using them to estimate gaze. We tested this method on data collected for the Visual Experience Database, in which participants wore a PupilCore head-mounted eye tracker augmented with world and stereo-depth cameras. The front-heavy design of this rig commonly resulted in slippage. Our method reduces gaze error by 72%, even in sessions with substantial slippage, suggesting that it can be a viable tool to correct slippage and improve gaze tracking accuracy for long recordings of naturalistic behaviors.
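The correction described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names are hypothetical, and the rigid body transformation is reduced to the horizontal and vertical translation the abstract describes, estimated as the mean displacement of the five eyelid keypoints relative to the first post-calibration frame.

```python
import numpy as np

def estimate_slippage(ref_keypoints, cur_keypoints):
    """Estimate horizontal and vertical camera slippage as the mean
    translation of the five eyelid keypoints relative to the reference
    (first post-calibration) frame. Both inputs are (5, 2) arrays of
    (x, y) image coordinates from frames with the same eye width."""
    return np.mean(np.asarray(cur_keypoints) - np.asarray(ref_keypoints), axis=0)

def correct_pupil(pupil_xy, slippage):
    """Shift a pupil-center estimate back by the estimated slippage
    before it is passed on to gaze estimation."""
    return np.asarray(pupil_xy, dtype=float) - slippage
```

For example, if every eyelid keypoint has drifted by (2, 1) pixels since calibration, `estimate_slippage` returns `[2, 1]`, and subtracting that offset restores the pupil estimate to the calibrated camera frame.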