September 2021
Volume 21, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2021
Dissociation between eye position and working memory signals during virtual reality tasks in the primate lateral prefrontal cortex
Author Affiliations
  • Megan Roussy
    University of Western Ontario
    Robarts Research Institute
  • Rogelio Luna
    University of Western Ontario
    Robarts Research Institute
  • Benjamin Corrigan
    University of Western Ontario
    Robarts Research Institute
  • Adam Sachs
    University of Ottawa
  • Lena Palaniyappan
    University of Western Ontario
    Robarts Research Institute
  • Julio Martinez-Trujillo
    University of Western Ontario
    Robarts Research Institute
Journal of Vision September 2021, Vol. 21, 2118. https://doi.org/10.1167/jov.21.9.2118
Abstract

Neurons in the primate lateral prefrontal cortex (LPFC) maintain working memory (WM) representations of space. However, a proportion of LPFC neurons also encode signals related to eye position. Potential interference between eye-related signals and WM representations has prompted strict control of eye position in traditional WM tasks. It is therefore unclear how unrestrained eye position affects performance of a WM task and task-related LPFC activity. To explore this, we trained two rhesus monkeys on a spatial WM task set in a naturalistic virtual environment. During task trials, a target was presented at 1 of 9 locations in the environment. The target then disappeared during a two-second delay epoch, after which the animals were required to navigate to the cued target location using a joystick. Animals were permitted free visual exploration throughout the task. We recorded neuronal activity using two 96-channel Utah arrays implanted in LPFC area 8ad/v. Even with unrestrained eye position, animals spent only 3.6% of total fixation time during the delay period looking at the target location. The duration for which animals looked at the target location did not influence trial outcome (Kruskal–Wallis, p=0.151). We tested whether neuronal population activity during fixations could predict eye position on targets. Classifiers using neuronal population activity during fixation periods were unable to decode eye position above chance (t-test, p=0.646). Moreover, we calculated the proportion of neurons tuned for saccade landing position in different reference frames. Only 2% of neurons were tuned for both target location and saccades in the retinocentric frame, and 3% were tuned for target location and saccades in the spatiocentric frame. These results indicate that in a virtual environment, unrestricted eye position does not diminish performance on a spatial WM task. Results suggest a dissociation between eye position and WM signals within LPFC.
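The population-decoding analysis described above can be sketched as a cross-validated classification problem: predict which of the 9 locations the eyes fixated from a trials × neurons matrix of spike counts, then compare accuracy against chance (1/9). The sketch below is illustrative only, using synthetic data and a linear SVM; the variable names, classifier choice, and cross-validation scheme are assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch of decoding fixated target location from LPFC
# population activity. Synthetic spike counts carry no location
# information, mimicking the reported null result (decoding at chance).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

n_trials, n_neurons, n_locations = 180, 96, 9  # one 96-channel Utah array
# Fixated location on each trial (labels to decode).
labels = rng.integers(0, n_locations, size=n_trials)
# Spike counts per neuron during the fixation window (synthetic).
spike_counts = rng.poisson(lam=5.0, size=(n_trials, n_neurons))

# 5-fold cross-validated decoding accuracy vs. chance level.
scores = cross_val_score(LinearSVC(dual=False), spike_counts, labels, cv=5)
chance = 1.0 / n_locations
print(f"mean decoding accuracy: {scores.mean():.3f} (chance = {chance:.3f})")
```

In practice, significance against chance would be assessed per session (e.g. comparing cross-validated accuracies to a shuffle-based null distribution) rather than against the nominal 1/9 alone.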
