Abstract
Co-registration of eye-tracking data with EEG recordings is a rapidly growing approach in vision research and neuroimaging, but it has historically required complicated, non-standardized procedures. This poster presents a free, open-source, operating-system-independent framework for conducting co-registration experiments. OpenSesame was used for experimental control and stimulus presentation, together with the PyGaze and PyNetstation plugins. PyGaze provides uniform control of eye-tracking systems from multiple manufacturers; here it was tested with a Tobii TX300 eye-tracking system. PyNetstation controlled the Net Station EEG recording software and its event markers; most other EEG systems can receive stimulus triggers through a simple serial- or parallel-port command within OpenSesame. To evaluate whether this equipment supports a setup in which one could conduct co-registration analyses, participants completed a facial Stroop task designed to probe attention. Each trial presented a face showing a happy or sad expression together with the word "happy" or "sad" onscreen. When the face appeared centrally, the word was overlaid on its forehead; when the face appeared on the left or right side of the screen, the word appeared on the opposite side. Participants pressed one button when the face showed a happy expression and a different button when it showed a sad expression. Visual fixations were detected in real time during the experiment, and their onsets and locations were flagged in the EEG data. The data were analyzed as event-related potentials (ERPs) and eye-fixation-related potentials (EFRPs) to test for the presence of the N170, a component commonly regarded as face-specific. The EFRP analysis detected the N170 more reliably than the ERP analysis, validating the utility of the proposed framework.
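As a rough illustration of how such a pipeline fits together (not the authors' actual code), a minimal sketch of the real-time fixation-flagging step is given below. It uses PyGaze's eye-tracker abstraction and the egi module on which PyNetstation builds; the host address, event code, and table keys are illustrative assumptions.

# Minimal sketch: detect a fixation with PyGaze and flag its onset and
# location in the Net Station EEG record. Host IP, event code, and table
# keys are assumed for illustration.
from pygaze.display import Display
from pygaze.eyetracker import EyeTracker
import egi.simple as egi  # module underlying the PyNetstation plugin

disp = Display()
tracker = EyeTracker(disp)        # e.g. a Tobii TX300, per PyGaze config

ns = egi.Netstation()
ns.connect('10.10.10.42', 55513)  # Net Station host address (assumed)
ns.BeginSession()
ns.sync()                         # align experiment and EEG clocks
ns.StartRecording()

tracker.start_recording()

# Block until a fixation begins; PyGaze returns its time and position.
t, (x, y) = tracker.wait_for_fixation_start()

# Net Station event codes and table keys are 4-character strings.
ns.send_event('fixa', label='fixation onset',
              timestamp=egi.ms_localtime(),
              table={'fixx': int(x), 'fixy': int(y)})

tracker.stop_recording()
ns.StopRecording()
ns.EndSession()
ns.disconnect()
disp.close()

In practice, such a fixation loop would run inside the OpenSesame trial sequence, with the PyNetstation plugin handling connection, synchronization, and marker transmission; the flagged onsets and locations are what permit the EFRP analysis described above.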
Meeting abstract presented at VSS 2016