Abstract
As technology and scientific knowledge have advanced, it has become both possible and necessary to run experiments in virtual environments, and the eye movements used to explore these environments need to be characterized. We trained two rhesus macaques to use a joystick to navigate a virtual environment during a complex learning task and a foraging task. We analyzed gaze-on-screen behavior and eye movement behaviors: saccades, fixations, and smooth pursuits. We also analyzed saccade kinematics across the tasks and within periods of the Learning task. We found that gaze on screen, as a function of the proportion of a trial, changed depending on whether a target was currently present in the environment. The median gaze-on-screen proportion was 47% of a trial when the subject was simply navigating; when rewarded targets were present, the median rose to 80% and 91% for the Foraging and Learning tasks, respectively. For saccade kinematics, we calculated the main sequence by matching saccades on start location (< 5 dva) and direction (< 10°) within 3 dva amplitude bins. We ran repeated-measures ANOVAs to test for differences and fit a non-linear model to estimate changes in the main sequence. We found no effect of static versus dynamic phases of stimuli in the virtual environment. We did find that saccades were 7% faster when rewarded objects were on the screen, and that the different levels of difficulty in our task did not alter the main sequence. There is likely an arousal change between simple virtual navigation and navigation towards a rewarded target, when the subject is more engaged in the task.
Meeting abstract presented at VSS 2017
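The abstract mentions fitting a non-linear model to the main sequence (peak saccade velocity as a function of amplitude) but does not specify the model form. A common choice in the eye-movement literature is a saturating exponential, V(A) = Vmax · (1 − e^(−A/C)). The sketch below illustrates that model with hypothetical parameter values and a simple grid search in place of the authors' actual fitting procedure, which is not described here:

```python
import math

def main_sequence(amplitude, v_max, c):
    # Saturating-exponential main-sequence model: peak velocity
    # rises with saccade amplitude and asymptotes at v_max.
    return v_max * (1.0 - math.exp(-amplitude / c))

# Hypothetical saccade amplitudes (dva) and noiseless peak velocities
# generated from assumed parameters, for illustration only.
amplitudes = [a * 0.5 for a in range(1, 41)]  # 0.5 to 20 dva
true_v_max, true_c = 500.0, 6.0
velocities = [main_sequence(a, true_v_max, true_c) for a in amplitudes]

def fit_grid(amps, vels):
    # Coarse grid search for (v_max, c) minimizing squared error;
    # a stand-in for a proper non-linear least-squares fit.
    best_sse, best_v, best_c = float("inf"), None, None
    for v_max in range(300, 701, 5):
        for c10 in range(20, 121):  # c from 2.0 to 12.0 in 0.1 steps
            c = c10 / 10.0
            sse = sum((v - main_sequence(a, v_max, c)) ** 2
                      for a, v in zip(amps, vels))
            if sse < best_sse:
                best_sse, best_v, best_c = sse, float(v_max), c
    return best_v, best_c

v_max_hat, c_hat = fit_grid(amplitudes, velocities)
```

On this noiseless synthetic data the grid search recovers the generating parameters exactly, since they lie on the grid; with real saccade data one would fit each amplitude-matched bin separately and compare the estimated parameters across conditions.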