October 2020
Volume 20, Issue 11
Open Access
Vision Sciences Society Annual Meeting Abstract  |   October 2020
Look where you go: Humans intuitively track heading direction changes with their eyes
Author Affiliations & Notes
  • Hiu Mei Chow
    University of British Columbia, Vancouver, Canada
  • Jonas Knöll
    Institute of Animal Welfare and Animal Husbandry, Friedrich-Loeffler-Institut, Greifswald, Germany
  • Matthew Madsen
    University of British Columbia, Vancouver, Canada
  • Miriam Spering
    University of British Columbia, Vancouver, Canada
  • Footnotes
    Acknowledgements  This work was supported by a Wall Solutions Grant funded by the UBC Peter Wall Institute for Advanced Studies and a Research Trainee Award funded by the Michael Smith Foundation for Health Research.
Journal of Vision October 2020, Vol.20, 443. doi:https://doi.org/10.1167/jov.20.11.443
      Hiu Mei Chow, Jonas Knöll, Matthew Madsen, Miriam Spering; Look where you go: Humans intuitively track heading direction changes with their eyes. Journal of Vision 2020;20(11):443. doi: https://doi.org/10.1167/jov.20.11.443.

      © ARVO (1962-2015); The Authors (2016-present)

Successful performance of daily activities such as driving a car relies on accurate perception of self-motion, including heading direction and speed. Humans, macaques, and marmosets can track heading direction with their eyes in the absence of any instruction and with only minimal training, but this has so far been shown only in small groups of observers (Knöll et al., PNAS, 2018). Here we investigated whether this tracking behavior generalizes to a larger group of human observers and is sensitive to changes in motion signal strength. Observers (n=43) viewed a cloud of moving dots that appeared to emanate from or converge toward a single point, the focus of expansion (FOE), resulting in perceived self-motion toward or away from it. The FOE location shifted over time following a random walk. Observers were asked to view the stimulus freely; eye position was recorded using an EyeLink 1000 eye tracker. In Exp. 1 (n=19), we verified that observers could track suprathreshold stimuli (coherence: 100%; contrast: 33%; speed: 2 m/s). In Exp. 2 (n=24), we tested the effect of motion signal strength by manipulating coherence (6.25-100%), contrast (3.6-90%), and speed (0.75-6 m/s). Results show that observers intuitively tracked heading direction changes using a combination of saccades, fixation, and slow drift. In both experiments, more than 80% of observers tracked the FOE with highly correlated position trajectories (cross-correlation coefficient > 0.6) in response to high-signal-strength stimuli. Spatial tracking error (the positional error between eye and FOE) increased as motion signal strength decreased from the highest to the lowest level of coherence (32% error increase), dot contrast (24% error increase), and speed (44% error increase). Intuitive ocular tracking of heading direction thus generalizes to a larger group of human observers and is finely tuned to low-level motion signals. Future work will explore whether eye movements can serve as an indicator of self-motion perception in clinical populations.
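The two outcome measures described above (the cross-correlation between eye and FOE position trajectories, and the spatial tracking error) can be illustrated with a minimal sketch. This is not the authors' analysis code: the simulated traces, sampling rate, random-walk step size, noise level, and 200 ms tracking delay are all hypothetical assumptions chosen only to show how the metrics are computed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D horizontal position traces (arbitrary units), 60 s at 100 Hz.
n = 6000
foe = np.cumsum(rng.normal(0.0, 0.05, n))        # FOE location follows a random walk
lag = 20                                          # assumed 200 ms oculomotor delay
eye = np.roll(foe, lag) + rng.normal(0.0, 0.2, n) # noisy, delayed tracking of the FOE
eye[:lag] = foe[0]                                # remove wrap-around from np.roll

def xcorr_peak(a, b, max_lag):
    """Peak normalized cross-correlation of two traces within +/- max_lag samples."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    best = -1.0
    for k in range(-max_lag, max_lag + 1):
        if k >= 0:
            r = np.mean(a[k:] * b[:len(b) - k]) if k > 0 else np.mean(a * b)
        else:
            r = np.mean(a[:k] * b[-k:])
        best = max(best, r)
    return best

# Tracking quality: peak cross-correlation coefficient (abstract's criterion: > 0.6).
r = xcorr_peak(eye, foe, max_lag=50)

# Spatial tracking error: mean absolute positional error between eye and FOE.
tracking_error = np.mean(np.abs(eye - foe))

print(f"peak cross-correlation: {r:.2f}, mean tracking error: {tracking_error:.2f}")
```

Under these assumptions the simulated observer would be classified as a tracker, since the delayed-but-faithful eye trace yields a peak cross-correlation well above the 0.6 criterion; weakening the simulated motion signal (e.g., raising the noise term) inflates the tracking error in the same direction as the coherence, contrast, and speed effects reported above.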

