August 2023, Volume 23, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract
Decoding Features of Real-world Navigation from the Human Hippocampus
Author Affiliations & Notes
  • Kathryn N. Graves
    Department of Psychology, Yale University
  • Ariadne Letrou
    Department of Psychology, Yale University
  • Tyler E. Gray
    Department of Neurology, Yale University
  • Imran H. Quraishi
    Department of Neurology, Yale University
  • Nicholas B. Turk-Browne
    Department of Psychology, Yale University
    Wu Tsai Institute, Yale University
  • Footnotes
Acknowledgements  This work was supported by National Institutes of Health (NIH) Grant R01MH069456 (to N. B. Turk-Browne), NIH Grant 1F99NS125835-01 (to K. N. Graves), and Swebilius Foundation Grants (to I. H. Quraishi and to K. N. Graves).
Journal of Vision August 2023, Vol.23, 4753. doi:https://doi.org/10.1167/jov.23.9.4753
      Kathryn N. Graves, Ariadne Letrou, Tyler E. Gray, Imran H. Quraishi, Nicholas B. Turk-Browne; Decoding Features of Real-world Navigation from the Human Hippocampus. Journal of Vision 2023;23(9):4753. https://doi.org/10.1167/jov.23.9.4753.
Abstract

Neuronal recordings during animal navigation and virtual human navigation have revealed specialized spatial coding properties in the hippocampus for representing an organism's location, speed, and direction. However, it remains unclear how these representations operate during human navigation in the real world and how they manifest at the level of local field potentials (LFPs). Here we report evidence of stable real-world navigation coding in the human hippocampus. Epilepsy patients chronically implanted with a Responsive Neurostimulation (RNS) device walked back and forth along a linear track while hippocampal LFP data were recorded from two or four channels. Position along the track was measured continuously using an optical head tracker, from which we could label neural data with instantaneous heading direction, speed, and location. We trained three models for each channel and patient to predict these labels from hippocampal LFPs over time: a binary classifier to decode which of the two heading directions the patient was moving in, and two regression models to predict continuous estimates of speed and location, respectively. These models predicted direction, speed, and location more accurately than chance. The channels that represented these variables overlapped significantly, some coding for two and others for all three. Finally, we investigated the stability of these representations in a subset of repeatedly tested patients by training models on their initial session and testing on subsequent sessions. These across-session models significantly predicted speed and location, but not direction, and the channels representing speed overlapped across sessions. These results demonstrate that complex features of human movement through space are represented in neuronal population activity in the hippocampus.
The RNS platform we developed enables precise electrophysiological measurements in the hippocampus during real-world activities, opening new possibilities for investigating neural mechanisms of navigation, memory, and other hippocampal-dependent behaviors in ecologically valid tasks and settings.
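The decoding approach described in the abstract — a binary classifier for heading direction and regression models for speed and location, each evaluated against chance — can be sketched as follows. This is a hypothetical illustration, not the authors' analysis code: the feature representation (e.g., band-power summaries of LFP epochs), the specific models (logistic regression and ridge regression), and the synthetic data are all assumptions for the sake of a runnable example.

```python
# Hypothetical sketch of the decoding analysis described in the abstract.
# Assumptions (not from the source): LFP epochs are summarized as a small
# feature vector per time window; logistic regression decodes the binary
# heading direction; ridge regression predicts continuous speed and location;
# cross-validated scores are compared against chance levels.
import numpy as np
from sklearn.linear_model import LogisticRegression, Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_epochs, n_features = 400, 8  # e.g., 2 channels x 4 frequency bands (assumed)
X = rng.normal(size=(n_epochs, n_features))

# Synthetic labels weakly coupled to the features, standing in for
# heading direction (binary), speed, and track location (continuous).
direction = (X[:, 0] + 0.5 * rng.normal(size=n_epochs) > 0).astype(int)
speed = X[:, 1] + 0.3 * rng.normal(size=n_epochs)
location = X[:, 2] + 0.3 * rng.normal(size=n_epochs)

# Binary classifier for heading direction (chance accuracy = 0.5).
dir_acc = cross_val_score(LogisticRegression(), X, direction, cv=5).mean()

# Regression models for speed and location (chance R^2 is near 0).
speed_r2 = cross_val_score(Ridge(), X, speed, cv=5, scoring="r2").mean()
loc_r2 = cross_val_score(Ridge(), X, location, cv=5, scoring="r2").mean()

print(f"direction accuracy: {dir_acc:.2f}")
print(f"speed R^2: {speed_r2:.2f}, location R^2: {loc_r2:.2f}")
```

The across-session analysis in the abstract would correspond to fitting these models on one session's (X, y) pairs and scoring them on a later session's data instead of cross-validating within a single session.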
