September 2017
Volume 17, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract | August 2017
Real-time experimental control with graphical user interface (REC-GUI) for vision research
Author Affiliations
  • Ari Rosenberg
    Department of Neuroscience, School of Medicine and Public Health, University of Wisconsin-Madison, Madison, WI, USA
  • Byounghoon Kim
    Department of Neuroscience, School of Medicine and Public Health, University of Wisconsin-Madison, Madison, WI, USA
  • Shobha Kenchappa
    Department of Neuroscience, School of Medicine and Public Health, University of Wisconsin-Madison, Madison, WI, USA
  • Ting-Yu Chang
    Department of Neuroscience, School of Medicine and Public Health, University of Wisconsin-Madison, Madison, WI, USA
Journal of Vision August 2017, Vol. 17, 153. https://doi.org/10.1167/17.10.153

Ari Rosenberg, Byounghoon Kim, Shobha Kenchappa, Ting-Yu Chang; Real-time experimental control with graphical user interface (REC-GUI) for vision research. Journal of Vision 2017;17(10):153. https://doi.org/10.1167/17.10.153.

© ARVO (1962-2015); The Authors (2016-present)
Abstract

Vision science studies often combine behavioral control, stimulus rendering/presentation, and precisely timed measurements of electrophysiological and/or behavioral responses. The constraints imposed by these requirements can make it challenging to jointly satisfy all of the design specifications for an experimental control system. Because precise knowledge of the temporal relationships between behavioral and neuronal data is fundamental to understanding brain function, we are spearheading the development of an open-source, flexible software suite for implementing behavioral control, high-precision control of stimulus presentation, and electrophysiological recordings. The state-of-the-art system is being developed to meet highly demanding specifications (e.g., rendering geometrically correct stereoscopic images with large depth variations, binocular stimulus presentation at 240 Hz, and real-time enforcement of behavior such as binocular eye and head positions), making it ideally suited for a broad range of vision studies. The Real-Time Experimental Control with Graphical User Interface (REC-GUI) consists of three major components: (i) an experimenter control panel (Python), (ii) scripts for rendering 2D or 3D visual stimuli (MATLAB/Octave with PsychToolbox), and (iii) data acquisition components, including eye/head monitoring (search coil: Crist Instruments; optical: EyeLink, SR-Research Inc.) and high-density neural recording (Scout Processor, Ripple Inc.). Because rendering and presenting complex visual stimuli such as 3D stereoscopic images can impose computational loads heavy enough to disrupt display synchronization, the system divides stimulus rendering/presentation and behavioral control between different processors. All processors communicate over a network in real time using the User Datagram Protocol (UDP) to minimize communication delays (average 767 ± 260 µsec over a gigabit network switch). Because the system is modular, all components can be readily substituted to accommodate different software, hardware, and data acquisition systems; for example, the MATLAB-based stimulus rendering/presentation can be replaced with C code. We will soon make all of the MATLAB/Octave and Python scripts available for customization and collaborative development.
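To make the communication scheme concrete, the sketch below shows one way a Python control process could exchange UDP messages with the stimulus machine and measure the round-trip time, in the spirit of the latency figure reported above. This is a minimal illustration, not the REC-GUI code itself; the IP address, port numbers, and message format are hypothetical assumptions.

    # Minimal sketch (not the published REC-GUI code): the control panel sends a
    # trial command over UDP and waits for an acknowledgement from the stimulus
    # machine. Address, ports, and message format are illustrative assumptions.
    import socket
    import time

    STIM_ADDR = ("192.168.0.2", 5000)   # assumed address/port of the stimulus PC
    CTRL_PORT = 5001                    # assumed port the control panel listens on

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", CTRL_PORT))
    sock.settimeout(0.005)              # 5 ms timeout; round trips should be sub-millisecond on a local switch

    def send_command(cmd):
        """Send a command string and return (reply, round-trip time in seconds)."""
        t0 = time.perf_counter()
        sock.sendto(cmd.encode(), STIM_ADDR)
        reply, _ = sock.recvfrom(1024)  # e.g., b"ACK start_trial 12"
        return reply.decode(), time.perf_counter() - t0

    reply, rtt = send_command("start_trial 12")
    print("reply:", reply, "round trip: %.0f microseconds" % (rtt * 1e6))

    # --- stimulus-machine side (run on the rendering PC), echoing each command ---
    # srv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # srv.bind(("", 5000))
    # while True:
    #     msg, addr = srv.recvfrom(1024)
    #     # ... update or present the stimulus for this command ...
    #     srv.sendto(b"ACK " + msg, addr)

UDP is a natural fit for this kind of division of labor because it avoids TCP's connection management and retransmission overhead, keeping per-message latency low and predictable on a dedicated local switch.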

Meeting abstract presented at VSS 2017
