September 2019
Volume 19, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
Encoding of spatial working memory in virtual reality in the primate prefrontal cortex
Author Affiliations & Notes
  • Megan Roussy
    University of Western Ontario
    Robarts Research Institute
  • Rogelio Luna
    University of Western Ontario
    Robarts Research Institute
  • Lena Palaniyappan
    University of Western Ontario
    Robarts Research Institute
  • Julio C. Martinez-Trujillo
    University of Western Ontario
    Robarts Research Institute
Journal of Vision September 2019, Vol.19, 204b. doi:https://doi.org/10.1167/19.10.204b
Citation: Megan Roussy, Rogelio Luna, Lena Palaniyappan, Julio C. Martinez-Trujillo; Encoding of spatial working memory in virtual reality in the primate prefrontal cortex. Journal of Vision 2019;19(10):204b. https://doi.org/10.1167/19.10.204b.

© ARVO (1962-2015); The Authors (2016-present)
Abstract

Spatial working memory (WM) allows us to briefly remember and manipulate spatial information. Traditionally, spatial WM is tested in non-human primates using an oculomotor delayed response (ODR) task that requires eye fixation away from the cue to be remembered. Using this task, many studies have shown that neurons in the primate lateral prefrontal cortex (LPFC) encode WM representations. One caveat of this highly controlled approach is that it departs from natural behavior: one would typically make eye movements when remembering locations. It remains unclear whether neurons in the LPFC encode spatial WM during natural behavior, that is, in the presence of distracting information and eye movements. To address this issue, we created a novel virtual reality (VR) spatial WM task that incorporates complex 3D stimuli and does not constrain eye movements. During a trial, a visual cue is presented in a circular virtual arena at 1 of 9 locations for 3 seconds; it then disappears, and the animal must remember its location through a 2-second delay period. Navigation in the environment is then enabled, and the animal uses a joystick to travel to the cued location. We implanted two 10×10 Utah arrays in LPFC area 8A of 2 rhesus macaques, one ventral and one dorsal to the principal sulcus. Both animals performed the task correctly (average hit rate: Monkey B = 86%; Monkey T = 67%). Neurons in both arrays were selective for target location during the delay period (ventral = 39%, N = 1799; dorsal = 47%, N = 1725). We used a linear classifier with cross-validation to decode the remembered location from neural activity on a single-trial basis. Decoding accuracy was ~50% (chance = 12%) and was not explained by target-location-specific patterns of eye movement. These findings show that LPFC neurons encode spatial WM during virtual reality tasks despite distracting information and unconstrained eye movements.
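As a concrete illustration of the decoding step, the sketch below shows one common way to run cross-validated, single-trial linear decoding of the remembered location; it is not the authors' code. The data matrix `rates`, the label vector `target_loc`, the Poisson placeholder data, and the choice of logistic regression as the linear classifier are all assumptions made for the example.

```python
# Minimal sketch of single-trial decoding of the remembered target
# location with a linear classifier and cross-validation. All data
# here are synthetic placeholders; in the real analysis, `rates`
# would hold delay-period spike counts from the Utah arrays
# (trials x neurons) and `target_loc` the cued location (0-8).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_neurons, n_locations = 360, 200, 9

# Synthetic spike counts and labels (hypothetical stand-ins).
rates = rng.poisson(lam=5.0, size=(n_trials, n_neurons)).astype(float)
target_loc = rng.integers(0, n_locations, size=n_trials)

# z-score each neuron's rate, then fit a linear classifier; score by
# 5-fold cross-validation so every trial is decoded while held out.
decoder = make_pipeline(StandardScaler(),
                        LogisticRegression(max_iter=1000))
scores = cross_val_score(decoder, rates, target_loc, cv=5)

print(f"decoding accuracy: {scores.mean():.1%} "
      f"(chance ~ {1 / n_locations:.1%} for 9 balanced locations)")
```

On the synthetic data above, accuracy sits at chance; with real delay-period activity, accuracy well above chance, as the abstract reports (~50% against a 12% chance estimate), indicates that LPFC population activity carries the remembered location on single trials.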

Acknowledgement: CIHR, NSERC