September 2019
Volume 19, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract | September 2019
Costs of attentional set-shifting during dynamic foraging, controlled by a novel Unity3D-based integrative experimental toolkit
Author Affiliations & Notes
  • Marcus R Watson
    Department of Biology, Centre for Vision Research, York University
  • Benjamin Voloh
    Department of Psychology, Vanderbilt University
  • Christopher Thomas
    Department of Psychology, Vanderbilt University
  • Asif Hasan
    Department of Electrical Engineering and Computer Science, Vanderbilt University
  • Thilo Womelsdorf
    Department of Biology, Centre for Vision Research, York University
    Department of Psychology, Vanderbilt University
    Department of Electrical Engineering and Computer Science, Vanderbilt University
Journal of Vision September 2019, Vol.19, 47d. doi:https://doi.org/10.1167/19.10.47d
© ARVO (1962-2015); The Authors (2016-present)
Abstract

Introduction: There is an increasing demand for experiments in which participants are presented with realistic stimuli, complex tasks, and meaningful actions. The Unified Suite for Experiments (USE) is a complete hardware and software suite for the design and control of dynamic, game-like behavioral neuroscience experiments, with support for human, nonhuman, and AI agents. We present USE along with an example feature-based learning experiment coded in the suite.

Methods: USE extends the game engine Unity3D with a hierarchical, modular, state-based architecture that supports tasks of any complexity. The suite's hardware, based around an Arduino Mega2560 board, governs communication between the experimental computer and any other experimental hardware. Participants in our task had their eyes tracked as they navigated via joystick through a virtual arena, choosing between two objects on each trial, only one of which was rewarded. Objects were composed of multiple features, each with two possible values. Each context, signaled by the pattern of the floor, had a single rewarded feature value (e.g. red objects might be rewarded on a grass floor, pyramidal objects on a marble one).

Results: USE's hardware enables the synchronization of all data streams with precision and accuracy well under 1 ms. Gaze was classified into behaviors (e.g. fixations and saccades) that displayed appropriate characteristics (e.g. velocities and magnitudes) and proved ecologically meaningful when re-presented over task videos. Rule learning was all-or-nothing, moving from chance to near-perfect performance within one or two trials. Participants displayed standard set-switching effects, including worse performance when the context differed from the previous trial and when rules involved an extra-dimensional shift from the previous block.

Conclusions: USE enables the creation and temporally precise reconstruction of highly complex tasks in dynamic environments.
Our example task shows that costs associated with attentional set-switching generalize to such dynamic tasks.
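The hierarchical, modular, state-based control described in the Methods can be illustrated with a minimal sketch. USE itself is written in C# for Unity3D, so the Python below is not the suite's actual API; all class and state names are hypothetical, and it shows only the general pattern of nesting sub-machines (trials inside blocks) built from states with update and termination hooks.

```python
# Hypothetical sketch of a hierarchical state-based experiment controller.
# Names are illustrative; USE's real implementation is C#/Unity3D.

class State:
    """A named state with an update hook and a termination check."""
    def __init__(self, name, update=None, done=lambda: True):
        self.name = name
        self.update = update or (lambda: None)
        self.done = done  # returns True when the state should end

class StateMachine:
    """Runs child states in sequence; can itself be nested as a child."""
    def __init__(self, name, children):
        self.name = name
        self.children = children

    def run(self):
        for child in self.children:
            if isinstance(child, StateMachine):
                child.run()  # hierarchy: a child may be a whole sub-machine
            else:
                while not child.done():
                    child.update()

# A trial is a sub-machine of states; a block is a sub-machine of trials.
choices = []
trial = StateMachine("trial", [
    State("fixation"),
    State("choice",
          update=lambda: choices.append("red"),
          done=lambda: len(choices) > 0),
    State("feedback"),
])
block = StateMachine("block", [trial])
block.run()
```

Because sub-machines nest uniformly, the same pattern scales from a single stimulus presentation up to blocks, sessions, and whole experiments, which is the sense in which such an architecture "supports tasks of any complexity."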
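The fixation/saccade labeling mentioned in the Results is commonly done with a velocity-threshold (I-VT) classifier; the sketch below shows that general technique, not the study's actual pipeline. The 30 deg/s threshold, the 500 Hz sample rate, and the toy gaze trace are assumptions for illustration only.

```python
# Illustrative velocity-threshold (I-VT) gaze classifier.
# Threshold and sample data are assumptions, not values from the study.

def classify_gaze(angles_deg, timestamps_s, saccade_threshold=30.0):
    """Label each inter-sample interval 'saccade' if angular velocity
    (deg/s) exceeds the threshold, otherwise 'fixation'."""
    labels = []
    for i in range(1, len(angles_deg)):
        dt = timestamps_s[i] - timestamps_s[i - 1]
        velocity = abs(angles_deg[i] - angles_deg[i - 1]) / dt
        labels.append("saccade" if velocity > saccade_threshold
                      else "fixation")
    return labels

# 500 Hz samples: steady gaze, a rapid 5-degree shift, then steady again.
angles = [0.0, 0.01, 0.02, 5.0, 5.01, 5.02]
times = [i / 500.0 for i in range(len(angles))]
print(classify_gaze(angles, times))
# ['fixation', 'fixation', 'saccade', 'fixation', 'fixation']
```

Event labels produced this way can then be checked for the "appropriate characteristics" the abstract mentions, such as saccade velocity and amplitude distributions.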

Acknowledgement: Supported by Grant MOP 102482 from the Canadian Institutes of Health Research (TW) and by the Natural Sciences and Engineering Research Council of Canada Brain in Action CREATE-IRTG program (MRW and TW).