Journal of Vision, December 2022, Volume 22, Issue 14
Open Access
Vision Sciences Society Annual Meeting Abstract
Coding of head direction in the human visual system during dynamic navigation
Author Affiliations & Notes
  • Zhengang Lu
    University of Pennsylvania
  • Joshua B. Julian
    Princeton University
  • Russell A. Epstein
    University of Pennsylvania
  • Footnotes
    Acknowledgements: This research was supported by NIH grant R01EY022350.
Journal of Vision, December 2022, Vol. 22, 4277. https://doi.org/10.1167/jov.22.14.4277
Abstract

Head direction cells, which code facing direction in the environment, have been extensively studied in freely moving rodents. fMRI studies have identified heading codes in humans as well; however, these are typically assessed during tasks that involve spatial memory recall in response to static stimuli. A recent neuroimaging study found head direction codes in visual, retrosplenial, parahippocampal, and medial temporal cortices during dynamic navigation in a small circular arena (Nau et al., 2020). However, it is unclear whether head direction codes observed in a small-scale, simple environment extend to more realistic, large-scale spaces, or whether they are tolerant to changes in visual cues. Here we address these issues using voxel-wise encoding models of fMRI data obtained during dynamic navigation in two complex virtual environments. Fifteen participants performed a “taxi-cab” task in two large virtual-reality cities (201 × 120 virtual meters) that had identical spatial layouts and buildings but different surface textures on the buildings and roads. We modeled participants’ virtual head direction using circular Gaussian functions (width = 8°) centered on preferred head directions sampled at 8° intervals across the full 360° range. Using model weights estimated from data in one version of the city, we predicted fMRI responses in the other version (cross-city validation). The head direction encoding model significantly predicted voxel activity in early visual cortex (EVC) and retrosplenial cortex (RSC), with significantly predictive voxels in every participant. The presence of head direction codes in EVC and RSC suggests that the visual system can encode heading information that is invariant to appearance changes of the environment during dynamic, visually guided navigation.
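The abstract does not include analysis code, but the channel-encoding approach it describes can be sketched concisely. Below is a minimal, hypothetical Python/NumPy illustration, not the authors' actual pipeline: circular Gaussian head-direction channels of 8° width tiled at 8° intervals, a per-voxel linear fit in one city, and cross-city validation in the other. The ridge penalty, the random stand-in data, and all variable names are assumptions for illustration; a real analysis would also convolve the channel regressors with a hemodynamic response function and use preprocessed BOLD time series.

    import numpy as np

    def circular_gaussian_design(theta_deg, centers_deg, width_deg=8.0):
        """Channel responses for a head-direction time series.

        theta_deg: (T,) headings in degrees; centers_deg: (K,) preferred
        directions; width_deg: Gaussian tuning width (sigma) in degrees.
        """
        d = np.abs(theta_deg[:, None] - centers_deg[None, :])
        d = np.minimum(d, 360.0 - d)       # wrap angular distance to [0, 180]
        return np.exp(-0.5 * (d / width_deg) ** 2)

    # 45 channels tiling 0-360 deg at 8 deg spacing, as described in the abstract
    centers = np.arange(0.0, 360.0, 8.0)

    # Hypothetical stand-ins for heading traces and voxel responses (T x V);
    # in practice these would come from the navigation logs and the fMRI data
    rng = np.random.default_rng(0)
    theta_a, theta_b = rng.uniform(0, 360, 500), rng.uniform(0, 360, 500)
    Y_a, Y_b = rng.standard_normal((500, 100)), rng.standard_normal((500, 100))

    X_a = circular_gaussian_design(theta_a, centers)   # train: city version A
    X_b = circular_gaussian_design(theta_b, centers)   # test: city version B

    # Per-voxel ridge regression (assumed estimator): W = (X'X + lam*I)^-1 X'Y
    lam = 1.0
    W = np.linalg.solve(X_a.T @ X_a + lam * np.eye(len(centers)), X_a.T @ Y_a)

    # Cross-city validation: correlate predicted and observed response per voxel
    Y_hat = X_b @ W
    r = np.array([np.corrcoef(Y_hat[:, v], Y_b[:, v])[0, 1]
                  for v in range(Y_b.shape[1])])

In a sketch like this, voxels whose cross-city prediction accuracy r exceeds a permutation- or noise-based threshold would be counted as carrying a texture-invariant head direction code, since the two cities share layout and geometry but differ in surface appearance.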
