Abstract
An equivalent noise experiment was conducted to investigate the effect of spatial frequency on contrast sensitivity. Under the linear amplifier model, performance can be accounted for by the efficiency of the mechanism responsible for detecting the target (relative to an ideal observer) and the variance of its internal noise. Previous studies have found conflicting results as to whether efficiency varies with spatial frequency, or whether the threshold differences are due entirely to changes in internal noise variance. These experiments have frequently used broadband noise, which has the disadvantage of also activating non-target mechanisms. This leads to additional threshold elevation due to cross-channel masking (through the contrast gain pool), which introduces a confound in experiments where the relationship between the noise and target spectra is not constant. Baker & Meese [2012, Journal of Vision, 12(10):20, 1-12] proposed a novel noise masking method in which the noise is simply a contrast-jittered version of the target. This injects the noise directly into the target mechanism, minimising contrast gain pool effects. In this study, observers detected a horizontal log-Gabor target at five spatial frequencies (0.25–4 c/deg) in three types of noise: broadband (2D white), tuned to the target channel (2D noise filtered to have the same power spectrum as the target), and tuned to the target mechanism (contrast jitter). For each noise type, the fitted internal noise variance parameter increased with spatial frequency. In 2D white noise, the fitted efficiency parameter increased with spatial frequency from 17% to 55%. In 2D filtered noise and contrast jitter noise, efficiency was flat across spatial frequency at 59% and 88%, respectively. By tuning our noise to the target mechanism at each spatial frequency, we show that efficiency is constant and that the decline of the contrast sensitivity function arises solely from increasing internal noise.
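For context, the linear amplifier model referred to above is commonly stated as a linear relationship between threshold contrast energy and external noise power; the expression below is a standard textbook formulation under that assumption, and the symbols (E_t, N_ext, N_eq, η) are illustrative rather than the exact parameterisation used in this study.

\[
  E_t = \frac{N_{\mathrm{eq}} + N_{\mathrm{ext}}}{\eta}
\]

Here E_t is the target's contrast energy at detection threshold, N_ext is the power spectral density of the external noise, N_eq is the equivalent input (internal) noise, and η is the calculation efficiency relative to the ideal observer. In this form, N_eq sets the position of the knee of the threshold-versus-noise function and η the slope of its rising limb, which is why the two parameters can be estimated separately from thresholds measured across a range of external noise levels.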
Meeting abstract presented at VSS 2014