December 2022
Volume 22, Issue 14
Open Access
Vision Sciences Society Annual Meeting Abstract
Slippage Correction in Mobile Head Mounted Eye-tracking Systems
Author Affiliations & Notes
  • Arnab Biswas
    University of Nevada, Reno
  • Kamran Binaee
    University of Nevada, Reno
  • Mark D. Lescroart
    University of Nevada, Reno
  • Footnotes
    Acknowledgements  NSF EPSCoR RII Track2 #1920896 to M.G., M.D.L., P.M., B.B.
Journal of Vision December 2022, Vol.22, 4487. doi:https://doi.org/10.1167/jov.22.14.4487
Arnab Biswas, Kamran Binaee, Mark D. Lescroart; Slippage Correction in Mobile Head Mounted Eye-tracking Systems. Journal of Vision 2022;22(14):4487. https://doi.org/10.1167/jov.22.14.4487.

Abstract

In recent years there has been a shift from desktop-based eye tracking to mobile eye tracking to facilitate experimentation in naturalistic environments. One substantial challenge of mobile eye tracking is inadvertent movement of the eye-tracking camera with respect to the head, or “slippage”. Studies have shown that slippage substantially reduces gaze accuracy. Slippage often accumulates over long recording sessions and consequently constrains the duration and type of activities that can be recorded with mobile eye tracking. One solution is to perform gaze calibration multiple times across a session; however, this is cumbersome, and slippage between calibrations can still reduce gaze accuracy. In this study, to compensate for slippage, we use a deep neural network to track the eyelids across frames of the eye video and use them to estimate the gradual drift of the eye with respect to the eye camera. We track the eyelids for frames with a given eye width, under the assumption that if there is no slippage, the position of the eyelids should not change across time for a fixed eye width. We use five keypoints along these selected eyelid outlines to calculate the horizontal and vertical slippage of the eye camera with respect to the first eye image after calibration. Using rigid body transformations to account for slippage, we correct the pupil estimates before using them to estimate gaze. We tested this method on data collected for the Visual Experience Database, in which participants wore a Pupil Core head-mounted eye tracker augmented with world and stereo-depth cameras. The front-heavy design of this rig commonly resulted in slippage. Our method reduces gaze error by 72%, even in sessions with substantial slippage, suggesting that it can be a viable tool for correcting slippage and improving gaze tracking accuracy in long recordings of naturalistic behaviors.
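The sketch below is a minimal illustration of the correction step described above, not the authors' implementation. It assumes that five corresponding eyelid keypoints are available both in a reference eye image taken just after calibration and in a later, possibly slipped frame; it then fits a 2D rigid transform (Kabsch/Procrustes) between the two keypoint sets and applies the inverse transform to a pupil estimate. The function names and the purely translational example values are illustrative assumptions.

    import numpy as np

    def estimate_rigid_transform(ref_pts, cur_pts):
        """Fit a 2D rigid transform (rotation R, translation t) that maps
        reference eyelid keypoints onto current-frame keypoints (Kabsch).
        ref_pts, cur_pts: (5, 2) arrays of corresponding keypoints."""
        ref_c = ref_pts.mean(axis=0)
        cur_c = cur_pts.mean(axis=0)
        H = (ref_pts - ref_c).T @ (cur_pts - cur_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:      # guard against a reflection solution
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = cur_c - R @ ref_c
        return R, t

    def correct_pupil(pupil_xy, ref_keypoints, cur_keypoints):
        """Map a pupil estimate from the current (slipped) eye-camera frame
        back into the reference frame taken just after calibration."""
        R, t = estimate_rigid_transform(ref_keypoints, cur_keypoints)
        # Invert the estimated slippage transform and apply it to the pupil.
        return R.T @ (np.asarray(pupil_xy) - t)

    # Hypothetical example: five eyelid keypoints in the reference frame and
    # in a current frame whose eye camera has slipped by a few pixels.
    ref_kp = np.array([[50, 80], [80, 70], [110, 65], [140, 70], [170, 80]], float)
    cur_kp = ref_kp + np.array([4.0, 6.0])          # purely translational slippage
    print(correct_pupil([115.0, 95.0], ref_kp, cur_kp))  # ~[111., 89.]

In the purely translational case above, R is numerically the identity and the correction simply subtracts the estimated drift from the pupil coordinates. A complete pipeline would additionally require the eyelid keypoint detector and the fixed-eye-width frame selection described in the abstract, which are not sketched here.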
