Abstract
Contrast sensitivity is an important feature of functional vision, but traditional psychometric assessment methods require too many trials to estimate a complete contrast sensitivity function across the full range of spatial frequencies relevant to normal human vision. To overcome this challenge, Quick CSF (qCSF), a Bayesian adaptive procedure for estimating an observer's contrast sensitivity function (Lesmes, Lu, Baek, & Albright, 2010), assumes a four-parameter model of the contrast sensitivity function (Watson & Ahumada, 2005). The parametric nature of this model allows for more rapid evaluation through Bayesian inference: stimulus parameters of contrast and spatial frequency are adaptively selected based on previous responses. As few as 25–50 trials yield a usable broad sensitivity metric across the frequency range, and with 100–300 trials, contrast sensitivity function estimates reach precision comparable to traditional laboratory CSF measurements (Lesmes et al., 2010). We present an open-source implementation of the Quick CSF method, written as a standard Python package. The software operates as a typical full-screen desktop application, presenting a 2AFC detection task. Many settings are configurable, including stimulus size, orientation, eccentricity, color, and display time. Alternatively, the software can be used as a library to generate stimulus contrast and spatial frequency values and to calculate the parameters of the contrast sensitivity function estimate, allowing the qCSF method to be easily integrated with new or existing software projects. The open-source nature of our qCSF implementation makes it accessible to any researchers or clinicians interested in using it in their work.
Acknowledgement: Research reported in this abstract was partially supported by the Cognitive and Neurobiological Approaches to Plasticity (CNAP) Center of Biomedical Research Excellence (COBRE) of the National Institutes of Health under grant number P20GM113109.