In neurophysiological and visual psychophysics experiments, it is often desirable to modify the stimulus according to the eye movements performed by the subject. With recent improvements in computational power and video hardware, personal computers with very fast CRTs provide a flexible and affordable platform for eye movement contingent display (EMCD) experiments. Unfortunately, real-time control of the stimulus, that is, guaranteeing an upper bound on the delay between the subject's eye movements and the update of the stimulus on the display, is challenged by the characteristics of the operating systems commonly available on PCs, such as Microsoft Windows or Apple Mac OS, which do not allow precise temporal control of events. We have recently developed an integrated software and hardware system to perform real-time EMCD on a Windows PC. The system connects to the PC either through the parallel port or through the PCI bus and acts as an interface between the eye tracker and the video board of the PC. Built around a digital signal processor with analog and digital interfaces, the system samples both eye movement data and subject responses, performs real-time data analysis, and communicates with the graphics card of the host PC via the CPU. It can be easily programmed in C++, using a specifically designed environment, to detect the conditions that trigger image manipulations. A wide range of stimuli can be generated, modified, and displayed within a delay of less than two frames at refresh rates up to 200 Hz. We analyze the system's performance in a range of EMCD experiments. To compare its performance with that of a more traditional device, we consider retinal stabilization experiments, in which the effects of eye movements are eliminated by shifting the image on the display so that it follows the eye. We compare the quality of the resulting retinal stabilization to that obtained with a stimulus deflector directly coupled to a Dual Purkinje Image eye tracker.
This material is based upon work supported in part by the National Science Foundation under grant No. EIA-0130851.
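As a minimal illustration of the retinal stabilization principle mentioned above, the per-frame image shift can be computed as the eye's displacement from a reference gaze position, so that the stimulus moves with the eye and remains (approximately) fixed on the retina. This sketch is illustrative only and is not the system's actual DSP code; the function name, the pixel units, and the `gain` parameter (nominally 1.0 for full stabilization) are assumptions.

```python
def stabilization_offset(gaze_x, gaze_y, ref_x, ref_y, gain=1.0):
    """Return the image shift (in pixels) that cancels the retinal motion
    caused by an eye displacement from the reference position (ref_x, ref_y).

    gain=1.0 yields full stabilization; intermediate gains would attenuate
    rather than eliminate retinal image motion.
    """
    dx = gain * (gaze_x - ref_x)
    dy = gain * (gaze_y - ref_y)
    return dx, dy

# Example: the eye has moved 10 px right and 5 px up from the reference,
# so the image is shifted by the same displacement to follow it.
print(stabilization_offset(110.0, 95.0, 100.0, 100.0))  # → (10.0, -5.0)
```

In a real EMCD loop, this offset would be recomputed from each new eye-position sample and applied to the stimulus before the next video frame, keeping the update within the sub-two-frame delay bound reported above.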