September 2016
Volume 16, Issue 12
Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2016
The Influence of Biomechanics on Visual Attention while Walking
Author Affiliations
  • Rakshit Kothari
    Chester F. Carlson Center for Imaging Science, Rochester Institute of Technology
  • Gabriel Diaz
    Chester F. Carlson Center for Imaging Science, Rochester Institute of Technology
  • Kamran Binaee
    Chester F. Carlson Center for Imaging Science, Rochester Institute of Technology
  • Reynold Bailey
    Department of Computer Science, Rochester Institute of Technology
  • Jonathan Matthis
    Center for Perceptual Systems, University of Texas at Austin
Journal of Vision September 2016, Vol.16, 1362. doi:
Rakshit Kothari, Gabriel Diaz, Kamran Binaee, Reynold Bailey, Jonathan Matthis; The Influence of Biomechanics on Visual Attention while Walking. Journal of Vision 2016;16(12):1362.

Humans are remarkably stable and efficient walkers. This is true even when walking is performed simultaneously with a secondary task, such as social interaction or use of a cell phone. This is impressive because, while visually guided walking relies upon fixations to the ground plane, secondary tasks often require shifts of gaze away from the ground plane. Thus, remaining stable while walking requires the intelligent timing of task-dependent gaze shifts to and from the ground plane. However, little is known about how these shifts are timed. Recent investigation has demonstrated a tight relationship between the biomechanics of bipedal gait and the spatial extent of visual information needed for the planning of foot placement. It remains unclear whether biomechanics similarly influence the timing of eye movements to and from the ground plane. We investigate this question using motion capture and eye tracking to record the gait kinematics and gaze adjustments of a subject walking a straight path. During locomotion, a single augmented-reality obstacle of one of three possible heights (0.15, 0.25, or 0.35 leg lengths) is projected at a randomized location along the subject's path. The 2D projection of this obstacle is dynamically updated with changes in head position so that the optical information it provides is consistent with that of a 3D obstacle. The illusory height is reinforced through the use of stereoscopic glasses. While walking, the subject also performs a vigilance task involving the identification of numbers within a random sequence displayed at eye height. Task demands ensure that the subject allocates visual attention to the primary and vigilance tasks with equal priority. Analyses demonstrate temporal coupling between the timing of gaze shifts and the evolution of the gait cycle.
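The head-anchored updating described above amounts to re-intersecting, on every frame, the ray from the tracked eye position through each virtual obstacle vertex with the floor plane: drawing each vertex at that floor intersection keeps the 2D projection perspectively consistent with a 3D obstacle. A minimal sketch of that geometry, assuming a z-up coordinate frame with the floor at z = 0 (the function name and frame convention are illustrative, not taken from the authors' apparatus):

```python
import numpy as np

def floor_projection(eye, vertex):
    """Intersect the ray from the eye through a virtual 3D vertex
    with the floor plane z = 0. This gives the floor point where a
    downward-facing projector must draw so that, from the current
    head position, the vertex appears at its virtual height."""
    eye = np.asarray(eye, dtype=float)
    vertex = np.asarray(vertex, dtype=float)
    # Ray: P(t) = eye + t * (vertex - eye); solve P(t).z = 0.
    # Requires the eye to be above the vertex (eye[2] > vertex[2]).
    t = eye[2] / (eye[2] - vertex[2])
    return eye + t * (vertex - eye)
```

For example, a vertex already on the floor projects to itself, while a raised vertex projects to a floor point farther from the observer than its ground footprint, which is exactly the cue that makes the flat image read as a volume; stereoscopic glasses then reinforce the illusory height.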

Meeting abstract presented at VSS 2016

