September 2018
Volume 18, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2018
Psychophysiology of Visual-Motor Learning during a Simulated Marksmanship Task in Immersive Virtual Reality

Author Affiliations
  • Lawrence Appelbaum
    Department of Psychiatry, Duke University School of Medicine
  • Jillian Clements
    Department of Electrical and Computer Engineering, Duke University
  • Elayna Kirsch
    Department of Psychiatry, Duke University School of Medicine
  • Hrishikesh Rao
    Department of Biomedical Engineering, Pratt School of Engineering, Duke University
  • Nicholas Potter
    Athletic Department, Duke University
  • Regis Kopper
    Department of Mechanical Engineering and Materials Science, Duke University
  • Marc Sommer
    Department of Biomedical Engineering, Pratt School of Engineering, Duke University
Journal of Vision September 2018, Vol.18, 432. doi:https://doi.org/10.1167/18.10.432
Abstract

The ability to coordinate visual information with motor output is essential to a great number of endeavors. In particular, activities such as sports, surgery, and law enforcement rely on efficient reciprocal interactions between visual perception and motor control, allowing individuals to execute precision movements in time-limited, stressful situations. Immersive virtual reality (VR) systems offer flexible control of an interactive environment, along with precise position tracking of realistic movements that can be used in conjunction with neurophysiological monitoring techniques, such as electroencephalography (EEG), to record neural activity as users perform complex tasks. As such, the fusion of immersive VR, kinematic tracking, and EEG offers a powerful testbed for naturalistic neuroscience research. In this study, we combine these elements to investigate the cognitive and neural mechanisms that underlie motor skill learning during a multi-day simulated marksmanship training protocol conducted with 20 participants. On each of 3 days, participants performed 8 blocks of 60 trials in which a simulated clay pigeon was launched from behind a trap house. Participants attempted to shoot the moving target with a firearm game controller, receiving immediate positional feedback and a running score after each shot. Over the 3 days of practice, shot accuracy and precision improved significantly and reaction times became significantly faster. The temporal cascade of target launch-locked psychophysiological responses began with significant visual evoked potentials (VEPs, ~120-180 ms), followed by eye movements (measured by EOG, ~190 ms), then hand (~200 ms) and head (~290 ms) movements. Furthermore, greater amplitudes and earlier latencies of the VEPs elicited contralateral to the target trajectory both correlated with better shooting performance, as measured by reaction times and accuracy. These findings therefore point towards a naturalistic neuroscience approach that can be used to characterize learning and identify neural markers predictive of marksmanship performance.
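
To make the launch-locked analysis concrete, the following is a minimal sketch, not the authors' actual pipeline, of how one might epoch a single EEG channel around target-launch events, measure VEP peak amplitude and latency in the ~120-180 ms window mentioned above, and correlate those measures with per-trial reaction times. The sampling rate, epoch window, function names (epoch, vep_peak), and the placeholder data are all assumptions for illustration; only the 120-180 ms VEP window comes from the abstract.

import numpy as np
from scipy.stats import pearsonr

FS = 500                      # sampling rate in Hz (assumed)
PRE_S, POST_S = 0.2, 0.6      # epoch window around target launch, in seconds (assumed)
VEP_WIN = (0.120, 0.180)      # VEP search window from the abstract, in seconds

def epoch(eeg, launch_samples, fs=FS, pre=PRE_S, post=POST_S):
    """Cut launch-locked epochs from a continuous single-channel EEG trace."""
    pre_n, post_n = int(pre * fs), int(post * fs)
    trials = [eeg[s - pre_n:s + post_n] for s in launch_samples]
    epochs = np.vstack(trials)                      # trials x samples
    baseline = epochs[:, :pre_n].mean(axis=1, keepdims=True)
    return epochs - baseline                        # baseline-correct each trial

def vep_peak(epochs, fs=FS, pre=PRE_S, win=VEP_WIN):
    """Per-trial peak amplitude and latency of the VEP inside the search window."""
    t = np.arange(epochs.shape[1]) / fs - pre       # time axis in seconds, 0 = launch
    mask = (t >= win[0]) & (t <= win[1])
    seg = epochs[:, mask]
    amp = seg.max(axis=1)                           # peak amplitude per trial
    lat = t[mask][seg.argmax(axis=1)]               # peak latency per trial (s)
    return amp, lat

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    eeg = rng.normal(size=60_000)                   # placeholder continuous EEG
    launches = np.sort(rng.integers(500, 59_000, size=60))   # placeholder launch samples
    rts = rng.normal(0.45, 0.05, size=60)           # placeholder reaction times (s)

    amp, lat = vep_peak(epoch(eeg, launches))
    r_amp, p_amp = pearsonr(amp, rts)               # amplitude vs. reaction time
    r_lat, p_lat = pearsonr(lat, rts)               # latency vs. reaction time
    print(f"amplitude-RT r={r_amp:.2f} (p={p_amp:.3f}), "
          f"latency-RT r={r_lat:.2f} (p={p_lat:.3f})")

In practice such an analysis would use multi-channel, artifact-cleaned EEG and hemisphere-specific channels contralateral to the target trajectory; the sketch only shows the shape of the epoching and correlation steps.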

Meeting abstract presented at VSS 2018
