July 2013
Volume 13, Issue 9
Vision Sciences Society Annual Meeting Abstract  |   July 2013
Shared neural sensory signals for eye-hand coordination in humans
Author Affiliations
  • Li Li
    Department of Psychology, The University of Hong Kong, Pokfulam, Hong Kong SAR
  • Diederick C. Niehorster
    Department of Psychology, The University of Hong Kong, Pokfulam, Hong Kong SAR
  • Dorion Liston
    Human-Systems Integration Division, NASA Ames Research Center, Moffett Field, CA, USA; San Jose State University, San Jose, CA, USA
  • Wilfred W.F. Siu
    Department of Psychology, The University of Hong Kong, Pokfulam, Hong Kong SAR
  • Lee Stone
    Human-Systems Integration Division, NASA Ames Research Center, Moffett Field, CA, USA
Journal of Vision July 2013, Vol.13, 1205. doi:10.1167/13.9.1205
Li Li, Diederick C. Niehorster, Dorion Liston, Wilfred W.F. Siu, Lee Stone; Shared neural sensory signals for eye-hand coordination in humans. Journal of Vision 2013;13(9):1205. doi: 10.1167/13.9.1205.

© ARVO (1962-2015); The Authors (2016-present)

Abstract

Eye-hand coordination is a central research topic in neuroscience and cognitive psychology. There has long been a debate on whether eye-hand coordination is limited by shared neural sensory signals that drive both the eye and hand sensorimotor systems, or by independent signals later in sensorimotor pathways. In this study, we addressed this question by examining the correlation between the tracking errors in the ocular and manual responses. Two experimental conditions were tested: in the eye-hand condition, as participants used pursuit eye movements to track the movement of a Gaussian target (σ=0.6°) on a computer display (40°H x 30°V) as its horizontal position was perturbed by the sum of seven harmonically-unrelated sinusoids (0.1-2.19 Hz), they also used a high-precision mouse to control the horizontal position of a second vertically-offset Gaussian cursor (8° below) to align it with the pursuit target. In the eye-alone condition, the target and cursor positions previously recorded in the eye-hand condition were played back, and participants were instructed to use only pursuit eye movements to track the movement of the target. Prior to computing the correlation between the tracking errors in the ocular and manual responses, for each 90-s trial, we subtracted best-fitting linear tracking responses at the seven input perturbation frequencies to remove direct correlation with the visual stimulus. Across 13 participants, trial-by-trial examination revealed that the correlation between the residual noises (i.e., tracking errors) in the ocular and manual responses in the eye-hand condition (mean r=0.20; range: 0.08-0.34) was highly significant (p<0.0001, Pearson’s R), and in all but seven out of 78 cases (13 subjects x 6 trials) higher than the spurious correlation in the eye-alone condition when ocular pursuit was not accompanied by simultaneous manual tracking. We conclude that common neural visual motion signals drive both the eye and hand sensorimotor systems.

Meeting abstract presented at VSS 2013
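The analysis described in the abstract can be sketched in code: regress each response trace on sinusoids at the seven perturbation frequencies, subtract that best-fitting linear (stimulus-driven) component, then correlate the residual noises of the ocular and manual traces. The sketch below is not the authors' code; the sampling rate, the five intermediate frequencies (only 0.1 and 2.19 Hz are given in the abstract), and the simulated data are illustrative assumptions, with a shared noise term injected so the residual correlation is nonzero by construction.

```python
import numpy as np

def remove_stimulus_response(signal, freqs_hz, fs):
    """Subtract the best-fitting (least-squares) sinusoids at the given
    perturbation frequencies, leaving only the residual noise."""
    t = np.arange(len(signal)) / fs
    # Design matrix: sine and cosine at each input frequency, plus an offset term.
    X = np.column_stack(
        [np.sin(2 * np.pi * f * t) for f in freqs_hz]
        + [np.cos(2 * np.pi * f * t) for f in freqs_hz]
        + [np.ones_like(t)]
    )
    coefs, *_ = np.linalg.lstsq(X, signal, rcond=None)
    return signal - X @ coefs

fs = 240.0  # assumed sampling rate (Hz); not stated in the abstract
# Only the 0.1 and 2.19 Hz endpoints are from the abstract; the rest are examples.
freqs = [0.1, 0.24, 0.41, 0.74, 1.11, 1.57, 2.19]

# Simulate one 90-s trial: both effectors track the stimulus and share a
# common noise source, plus independent noise in each.
rng = np.random.default_rng(0)
t = np.arange(int(90 * fs)) / fs
stimulus = sum(np.sin(2 * np.pi * f * t) for f in freqs)
shared_noise = rng.standard_normal(t.size)
eye = stimulus + shared_noise + 0.5 * rng.standard_normal(t.size)
hand = 0.8 * stimulus + shared_noise + 0.5 * rng.standard_normal(t.size)

eye_resid = remove_stimulus_response(eye, freqs, fs)
hand_resid = remove_stimulus_response(hand, freqs, fs)
r = np.corrcoef(eye_resid, hand_resid)[0, 1]
print(f"residual eye-hand correlation r = {r:.2f}")
```

Because the stimulus-driven components are fitted and removed from both traces, any remaining correlation reflects only shared noise, which is the quantity the study compares between the eye-hand and eye-alone conditions.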
