September 2018, Volume 18, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
Visual psychophysics on the web: open-access tools, experiments, and results using online platforms
Author Affiliations
  • Sivananda Rajananda
    University of California-Riverside, Department of Bioengineering; University of California-Los Angeles, Department of Psychology
  • Megan A.K. Peters
    University of California-Riverside, Department of Bioengineering
  • Hakwan Lau
    University of California-Los Angeles, Department of Psychology; University of California-Los Angeles, Brain Research Institute
  • Brian Odegaard
    University of California-Los Angeles, Department of Psychology
Journal of Vision September 2018, Vol.18, 299.
      Sivananda Rajananda, Megan A.K. Peters, Hakwan Lau, Brian Odegaard; Visual psychophysics on the web: open-access tools, experiments, and results using online platforms. Journal of Vision 2018;18(10):299.

In the last several years, web-based experiments with visual stimuli have become increasingly common as researchers have utilized online paradigms to facilitate fast data collection with large samples. However, few open-access tools exist for conducting rigorous visual psychophysical studies on the internet. Here, we present new tools to enable vision science in web browsers, as well as sample experiments and results which demonstrate their viability. These tools include several methods to estimate psychophysical threshold parameters that run entirely in JavaScript/CSS/HTML, including the PEST adaptive staircase procedure and the Confidence Signal Detection model (Yi & Merfeld, 2016), which leverages confidence judgments to estimate thresholds with a small number of trials. We also present the first open-access random-dot kinematogram which runs entirely in web browsers and includes parameters to customize coherence levels, aperture shape, dot size, and other features. Our initial experiments on human motion perception demonstrate three important findings: (1) with our tools, motion threshold parameters estimated from online subjects are comparable to those estimated in controlled laboratory environments; (2) our web-based implementation of new methods facilitates faster threshold estimation than traditional methods; (3) data from online subjects indicate that these participants are far more demographically diverse than those typically recruited in university laboratory studies. We have also developed new paradigms for testing peripheral color perception online, and results show that observers often overestimate how saturated parafoveal visual stimuli truly are. Finally, we will discuss results from recent investigations of differences between foveal and parafoveal motion perception.
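The core of a browser-based random-dot kinematogram like the one described above can be sketched in plain JavaScript. The snippet below is an illustrative reconstruction, not the authors' actual implementation: the names (`createRDK`, `coherence`, `apertureRadius`, etc.) are hypothetical. A `coherence` fraction of dots ("signal" dots) step in a shared direction on each frame, while the remaining "noise" dots step in random directions, all confined to a circular aperture.

```javascript
// Illustrative sketch of a random-dot kinematogram updater.
// All names and parameters here are hypothetical, not the abstract's API.
function createRDK({ nDots, coherence, direction, speed, apertureRadius }) {
  // Place a dot uniformly at random inside the circular aperture.
  function randomPosition() {
    const r = apertureRadius * Math.sqrt(Math.random());
    const theta = 2 * Math.PI * Math.random();
    return { x: r * Math.cos(theta), y: r * Math.sin(theta) };
  }

  const dots = Array.from({ length: nDots }, () => ({
    ...randomPosition(),
    // A `coherence` fraction of dots are "signal" dots that move in the
    // shared coherent direction; the rest move in random directions.
    signal: Math.random() < coherence,
  }));

  // Advance every dot by one frame and return the updated array.
  function step() {
    for (const dot of dots) {
      const angle = dot.signal ? direction : 2 * Math.PI * Math.random();
      dot.x += speed * Math.cos(angle);
      dot.y += speed * Math.sin(angle);
      // Dots that drift outside the aperture are re-placed inside it.
      if (Math.hypot(dot.x, dot.y) > apertureRadius) {
        Object.assign(dot, randomPosition());
      }
    }
    return dots;
  }

  return { dots, step };
}
```

In a real experiment each `step()` would be driven by `requestAnimationFrame` and the dots drawn to a canvas; a staircase procedure such as PEST would then adjust `coherence` across trials based on the observer's responses.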
Together, these experiments demonstrate that despite sacrificing a degree of experimental control, rigorous web-based psychophysics is quite possible, as our initial results provide promising evidence to motivate future development of online tools for visual science.

Meeting abstract presented at VSS 2018

