September 2017
Volume 17, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2017
Gaze-in-World Movement Classification for Unconstrained Head Motion during Natural Tasks
Author Affiliations
  • Rakshit Kothari
    Chester F. Carlson Center for Imaging Sciences, RIT
  • Kamran Binaee
    Chester F. Carlson Center for Imaging Sciences, RIT
  • Reynold Bailey
    B. Thomas Golisano College of Computing and Information Sciences, RIT
  • Christopher Kanan
    Chester F. Carlson Center for Imaging Sciences, RIT
  • Gabriel Diaz
    Chester F. Carlson Center for Imaging Sciences, RIT
  • Jeff Pelz
    Chester F. Carlson Center for Imaging Sciences, RIT
Journal of Vision August 2017, Vol.17, 1156. doi:10.1167/17.10.1156
Abstract

Accurate classification of eye movements is an integral component of many psychophysical experiments. Most event classifiers are restricted to the analysis of eye-in-head (EIHv) vectors while the head is fixed, and detect only two classes of gaze events: fixations and saccades. Classifying eye movements as vestibulo-ocular reflex (VOR) or smooth pursuit (SP) requires compensating the gaze signal for a person's head movements. As the complexity of naturalistic behavioral paradigms increases, so does the need for gaze classification under head-free motion that can detect these more complex eye movements. We propose the use of a supervised machine learning (ML) classifier for the automated analysis of head-free gaze signals. As a preliminary test, we used an SVM classifier to label a 60 Hz test dataset from a virtual-reality ball-catching task into periods of fixation, saccade, and pursuit. This preliminary test achieved 94% accuracy in a sample-to-sample comparison against an artificially generated set of gaze movements reflective of measured gaze statistics, and visual inspection shows very high correspondence to subjective analysis of the gaze signal. This promising methodology has since been extended to the analysis of data collected during natural tasks. Five subjects' head and gaze movements were recorded using an IMU and a 120 Hz SMI Wireless ETG2 as they performed four different natural tasks for 10 minutes each. This yielded 120 minutes of angular-velocity data for the gaze-in-world vector (GIWv) during natural tasks. Four experts manually annotated a subset of the data as periods of fixation, saccade, smooth pursuit, and blink. Inter-labeler reliability was measured to generate a confidence metric and a cost function for misclassifications. We summarize the statistics of natural eye movements and compare several ML-based time-series classification schemes.
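The head-compensation step described above, rotating the eye-in-head vector by the head orientation to obtain the gaze-in-world vector and then differentiating to get angular velocity, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function names and the rotation-matrix representation of the IMU output are assumptions:

```python
import numpy as np

def giw_vector(eih_vec, head_rotation):
    """Rotate an eye-in-head (EIHv) unit vector into world coordinates
    using the head orientation, here assumed to be a 3x3 rotation
    matrix derived from the IMU signal."""
    return head_rotation @ eih_vec

def angular_velocity(v0, v1, dt):
    """Angular speed (deg/s) between two successive gaze-in-world
    unit vectors sampled dt seconds apart."""
    cos_theta = np.clip(np.dot(v0, v1), -1.0, 1.0)
    return np.degrees(np.arccos(cos_theta)) / dt
```

At a 120 Hz sampling rate, a 1-degree rotation between successive samples corresponds to a GIWv angular velocity of 120 deg/s.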
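As an illustration of the kind of supervised classifier described above, the sketch below trains an SVM on per-sample windows of angular velocity. The feature choice, class statistics, and all training data here are invented for demonstration only and do not reflect the study's dataset or labels:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical training data: 5-sample windows of GIWv angular
# velocity (deg/s); labels 0 = fixation, 1 = saccade, 2 = pursuit.
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(2.0, 1.0, (100, 5)),     # fixation: low velocity
    rng.normal(300.0, 50.0, (100, 5)),  # saccade: high velocity
    rng.normal(20.0, 5.0, (100, 5)),    # pursuit: moderate velocity
])
y = np.repeat([0, 1, 2], 100)

# Standardize features, then fit an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
```

A real pipeline would of course derive its features from recorded EIHv and IMU signals and its labels from the expert annotations; this sketch only shows the classifier interface.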

Meeting abstract presented at VSS 2017
