Abstract
Creating visual experiments often requires programming expertise, especially when precise control over stimulus presentation is necessary. Coding visual experiments can be time-consuming, imprecise, and error-prone for those who know how to program, and impossible for those who don’t. These issues not only impede psychophysical research but also make it difficult to openly share an experiment or to reuse it in future studies. AutoExperiment is a program built with Psychtoolbox in MATLAB that reads a spreadsheet containing stimulus, timing, and response information and runs the experiment exactly as specified. The spreadsheet format is simple and easy to understand, so no coding experience is required to create an experiment. AutoExperiment can handle any number of simultaneous stimuli as well as combinations of images, videos, and audio. It can be used to run behavioral tasks at a computer or in an fMRI scanner with minimal to no changes. Easy-to-read diagnostic information is automatically provided to verify that no problems occurred. With this approach, AutoExperiment presents a two-fold solution to the problems discussed above. First, AutoExperiment abstracts the coding and implementation away from the design of the experiment. Second, AutoExperiment provides a flexible but precise framework for creating and running visual experiments that can be easily shared. We show that our program can be used to easily create precise visual experiments across a wide range of paradigms, including match-to-sample, judgment tasks, and visual search. Although not all experiments can be written in this spreadsheet format, AutoExperiment can greatly speed up the creation of experiments, allow non-programmers to create precise visual experiments, provide easy verification that no visual or timing issues arose during the experiment, and make visual experiments more accessible and shareable.
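To make the spreadsheet-driven idea concrete, the sketch below parses a hypothetical trial table of the kind the abstract describes, with one row per event giving a stimulus, its timing, and the allowed responses. The column names and file names here are illustrative assumptions, not AutoExperiment's actual schema, and the sketch is in Python purely for readability.

```python
import csv
import io

# Hypothetical trial spreadsheet in the spirit of the format described above;
# the column names (trial, stimulus_file, onset_s, duration_s, response_keys)
# are illustrative, not the tool's actual schema.
SPEC = """trial,stimulus_file,onset_s,duration_s,response_keys
1,face1.png,0.0,0.5,f j
2,house1.png,2.0,0.5,f j
3,tone1.wav,4.0,1.0,space
"""

def load_trials(text):
    """Parse the spreadsheet into a list of trial dicts with typed fields."""
    trials = []
    for row in csv.DictReader(io.StringIO(text)):
        trials.append({
            "trial": int(row["trial"]),
            "stimulus_file": row["stimulus_file"],
            "onset_s": float(row["onset_s"]),
            "duration_s": float(row["duration_s"]),
            "response_keys": row["response_keys"].split(),
        })
    return trials

trials = load_trials(SPEC)
# A runner would now present each stimulus at its onset for its duration
# and collect only the listed response keys.
print(len(trials), trials[0]["stimulus_file"])
```

Because every trial is fully specified as data rather than code, the same table can be inspected, shared, or rerun without touching the presentation software.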
Acknowledgement: NSF Graduate Research Fellowship