Abstract
In recent years, web-based experiments with visual stimuli have become increasingly common as researchers have used online paradigms to collect data quickly from large samples. However, few open-access tools exist for conducting rigorous visual psychophysics on the internet. Here, we present new tools for doing vision science in web browsers, along with sample experiments and results that demonstrate their viability. These tools include several methods for estimating psychophysical threshold parameters that run entirely in JavaScript/CSS/HTML, including the PEST (Parameter Estimation by Sequential Testing) adaptive staircase procedure and the Confidence Signal Detection model (Yi & Merfeld, 2016), which leverages confidence judgments to estimate thresholds from a small number of trials. We also present the first open-access random-dot kinematogram that runs entirely in web browsers and includes parameters for customizing coherence level, aperture shape, dot size, and other features. Our initial experiments on human motion perception demonstrate three important findings: (1) with our tools, motion threshold parameters estimated from online subjects are comparable to those estimated in controlled laboratory environments; (2) our web-based implementation of these newer methods yields faster threshold estimation than traditional methods; (3) online participants are far more demographically diverse than the samples typical of university laboratory studies. We have also developed new paradigms for testing peripheral color perception online; results show that observers often overestimate how saturated parafoveal stimuli truly are. Finally, we will discuss results from recent investigations of differences between foveal and parafoveal motion perception. Together, these experiments demonstrate that, despite some sacrifice of experimental control, rigorous web-based psychophysics is feasible, and our initial results provide promising evidence to motivate further development of online tools for vision science.
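For concreteness, the sketch below shows how a PEST-style staircase can be written in plain JavaScript so that it runs entirely in the browser. It implements only the core rules of the procedure (a Wald-style sequential test to decide when to change the stimulus level, halving the step size on reversals, and doubling after repeated same-direction steps) and omits the finer exceptions of the full Taylor and Creelman (1967) rule set; all names here (PestStaircase, waldW, and so on) are illustrative and do not correspond to the actual tools described above.

```javascript
// A simplified sketch of a PEST adaptive staircase in plain JavaScript.
// This is not the abstract's actual implementation; names and defaults
// are hypothetical, and the full PEST rule set has exceptions omitted here.
class PestStaircase {
  constructor({
    startLevel = 0.5,     // initial stimulus intensity (e.g., motion coherence)
    startStep = 0.16,     // initial step size
    minStep = 0.01,       // stop once the step size falls below this
    targetP = 0.75,       // tracked proportion correct
    waldW = 1.0,          // deviation limit W for the Wald sequential test
    minLevel = 0, maxLevel = 1,
  } = {}) {
    Object.assign(this, { minStep, targetP, waldW, minLevel, maxLevel });
    this.level = startLevel;
    this.step = startStep;
    this.direction = 0;        // -1 = last change decreased level, +1 = increased
    this.stepsInDirection = 0; // consecutive steps in the current direction
    this.nTrials = 0;          // trials at the current level
    this.nCorrect = 0;         // correct responses at the current level
    this.done = false;         // when true, this.level is the threshold estimate
  }

  // Record one trial's response; returns the level to test next.
  update(correct) {
    this.nTrials += 1;
    if (correct) this.nCorrect += 1;

    // Wald sequential test: change level only when the correct count
    // strays more than W from its expectation at the target probability.
    const expected = this.targetP * this.nTrials;
    let move = 0;
    if (this.nCorrect >= expected + this.waldW) move = -1;      // too easy
    else if (this.nCorrect <= expected - this.waldW) move = +1; // too hard
    if (move === 0) return this.level; // keep testing at this level

    if (this.direction === 0) {
      this.direction = move;           // very first change of level
      this.stepsInDirection = 1;
    } else if (move === this.direction) {
      // Same direction: double the step from the third step on
      // (full PEST applies a subtler rule here, simplified for brevity).
      this.stepsInDirection += 1;
      if (this.stepsInDirection >= 3) this.step *= 2;
    } else {
      this.step /= 2;                  // reversal: halve the step size
      this.stepsInDirection = 1;
      this.direction = move;
    }

    if (this.step < this.minStep) this.done = true; // exit criterion

    this.level = Math.min(this.maxLevel,
                 Math.max(this.minLevel, this.level + move * this.step));
    this.nTrials = 0;  // restart the Wald count at the new level
    this.nCorrect = 0;
    return this.level;
  }
}
```

In use, an experiment loop would call update(correct) after each response and stop once done is true, taking the final level as the threshold estimate.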
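Similarly, a browser-based random-dot kinematogram reduces to drawing dots on an HTML canvas and, on each animation frame, moving a coherence-weighted subset of them in the signal direction. The sketch below assumes a circular aperture, a dark canvas background, and per-frame reassignment of signal versus noise dots (one of several standard RDK noise rules); runRDK and its parameters are hypothetical and are not the API of the tool described above.

```javascript
// A minimal sketch of a browser random-dot kinematogram (RDK).
// Parameter names are illustrative; they mirror, but are not,
// the configurable tool described in the abstract.
function runRDK(canvas, {
  nDots = 100,
  dotRadius = 2,        // dot size in pixels
  coherence = 0.5,      // proportion of dots moving in the signal direction
  direction = 0,        // signal direction in radians (0 = rightward)
  speed = 2,            // dot displacement in pixels per frame
  apertureRadius = 150, // circular aperture radius in pixels
} = {}) {
  const ctx = canvas.getContext('2d');
  const cx = canvas.width / 2, cy = canvas.height / 2;

  // Place a dot uniformly at random inside the circular aperture.
  const randomDot = () => {
    const r = apertureRadius * Math.sqrt(Math.random());
    const a = 2 * Math.PI * Math.random();
    return { x: r * Math.cos(a), y: r * Math.sin(a) };
  };
  const dots = Array.from({ length: nDots }, randomDot);

  function frame() {
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    ctx.fillStyle = 'white'; // assumes a dark canvas background
    for (const d of dots) {
      // Each frame, a dot is a signal dot with probability `coherence`;
      // signal dots share one direction, noise dots move randomly.
      const theta = Math.random() < coherence
        ? direction
        : 2 * Math.PI * Math.random();
      d.x += speed * Math.cos(theta);
      d.y += speed * Math.sin(theta);
      // Re-place dots that exit the aperture.
      if (Math.hypot(d.x, d.y) > apertureRadius) Object.assign(d, randomDot());
      ctx.beginPath();
      ctx.arc(cx + d.x, cy + d.y, dotRadius, 0, 2 * Math.PI);
      ctx.fill();
    }
    requestAnimationFrame(frame); // a real trial would stop after a set duration
  }
  requestAnimationFrame(frame);
}
```

Calling, for example, runRDK(canvas, { coherence: 0.3, direction: Math.PI }) would then render leftward motion at 30% coherence, and the same parameters could be yoked to a staircase like the one sketched above.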
Meeting abstract presented at VSS 2018