Abstract
Spatial working memory (WM) allows us to briefly remember and manipulate spatial information. Traditionally, spatial WM is tested in non-human primates using an oculomotor delayed response (ODR) task that requires eye fixation away from the cue to be remembered. Using this task, myriad studies have shown that neurons in the primate lateral prefrontal cortex (LPFC) encode WM representations. One caveat of this highly controlled approach is that it departs from natural behavior – one would typically make eye movements when remembering locations. It currently remains unclear whether neurons in the LPFC encode spatial WM during natural behavior, that is, in the presence of distracting information and eye movements. To address this issue, we created a novel virtual reality (VR) spatial WM task which incorporates complex 3D stimuli and does not constrain eye movements. During a trial, a visual cue is presented to the animal in a circular arena at 1 of 9 locations for 3 seconds; the cue then disappears, and the animal is required to remember its location during a 2-second delay period. Navigation in the environment is then enabled, and the animal navigates to the cued location using a joystick. We implanted two 10×10 Utah arrays in LPFC area 8A of 2 rhesus macaques (ventral and dorsal to the principal sulcus). Both animals correctly performed the task (average hit rate: MonkeyB = 86%; MonkeyT = 67%). Neurons were selective for target location during the delay period in both arrays (ventral = 39%, N = 1799; dorsal = 47%, N = 1725). We used a linear classifier with cross-validation to decode remembered locations on a single-trial basis from neural activity. Decoding accuracy was ~50% (chance = 12%) and was not explained by target-location-specific patterns of eye movements. These findings show that LPFC neurons encode spatial WM during virtual reality tasks regardless of distracting information and eye movements.
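The single-trial decoding analysis described above can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the synthetic data, the feature layout (trials × neurons firing-rate matrix), the least-squares one-vs-rest classifier, and the fold count are all assumptions chosen to keep the example self-contained; any cross-validated linear classifier (e.g. SVM or LDA) would play the same role.

```python
import numpy as np

def kfold_linear_decode(X, y, n_classes, k=5, seed=0):
    """k-fold cross-validated linear decoding accuracy.

    X: (n_trials, n_neurons) delay-period firing rates.
    y: (n_trials,) integer location labels in [0, n_classes).
    Illustrative least-squares one-vs-rest linear classifier.
    """
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    correct = 0
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        # One-hot targets; append a bias column to the features.
        Xtr = np.column_stack([X[train], np.ones(len(train))])
        Ytr = np.eye(n_classes)[y[train]]
        W, *_ = np.linalg.lstsq(Xtr, Ytr, rcond=None)
        Xte = np.column_stack([X[test], np.ones(len(test))])
        pred = np.argmax(Xte @ W, axis=1)
        correct += np.sum(pred == y[test])
    return correct / len(y)

# Synthetic example: 9 "remembered locations", location-tuned pseudo-neurons.
rng = np.random.default_rng(1)
n_trials, n_neurons, n_classes = 360, 50, 9
y = rng.integers(0, n_classes, n_trials)
tuning = rng.normal(0, 1, (n_classes, n_neurons))       # per-location mean rates
X = tuning[y] + rng.normal(0, 1.0, (n_trials, n_neurons))  # trial-to-trial noise
acc = kfold_linear_decode(X, y, n_classes)
print(f"decoding accuracy: {acc:.2f} (chance ~ {1/n_classes:.2f})")
```

Because each fold's classifier is fit only on held-out-excluded trials, accuracy well above 1/9 on the test folds indicates genuine location information in the population activity rather than overfitting.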
Acknowledgements: CIHR, NSERC