Abstract
Previous research has found differences in sensitivity to increments and decrements, likely reflecting differences between the ON and OFF channels. In the current study, we performed two experiments to investigate the difference between achromatic increments (A+) and decrements (A-) using 1.5° square stimuli. In the first experiment, we measured perceptual scales for A+ and A- separately using Maximum Likelihood Difference Scaling (MLDS). In the second experiment, we measured pedestal discrimination thresholds with the same stimuli under different pedestal conditions at multiple pedestal levels. To relate the two experiments, we used a model in which pedestal discrimination thresholds are inversely proportional to the derivative of the perceptual scale. Both sets of results show large asymmetries between A+ and A-. The perceptual scale of A+ follows a Naka-Rushton curve for most, but not all, of our five observers, while that of A- follows a cubic function for all observers. Correspondingly, discrimination thresholds increase monotonically with suprathreshold A+ pedestal contrast, whereas thresholds for A- first increase with pedestal contrast at low contrasts and then decrease as pedestal contrast increases further. The individual differences we observe may be accounted for by differences in the relationship between contrast and the limiting noise: for some observers the noise is constant, while for others it may grow with contrast. Our findings generally agree with Whittle’s earlier studies (Whittle, 1986, 1992), which also found a strong asymmetry between A+ and A-. This asymmetry limits the psychophysical utility of stimuli that contain equal energy at both polarities, such as gratings and flicker. Our results suggest that, under some conditions, detection of bipolar stimuli is actually driven by the decrement portions of the stimulus.
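For concreteness, the linking model can be sketched as follows, in notation of our own choosing (the symbols $\Psi$, $C$, and $\Delta C$ do not appear in the abstract itself). Assuming a smooth perceptual scale $\Psi(C)$ estimated by MLDS, the model states that the discrimination threshold at pedestal contrast $C$ satisfies
$$
\Delta C(C) \;\propto\; \frac{1}{\Psi'(C)}, \qquad \Psi'(C) = \frac{d\Psi}{dC},
$$
so that steep regions of the scale predict small discrimination thresholds and shallow regions predict large ones. In the same notation, the standard Naka-Rushton form reported for the A+ scale is
$$
\Psi(C) \;=\; \Psi_{\max}\,\frac{C^{n}}{C^{n} + C_{50}^{n}},
$$
where $C_{50}$ is the semi-saturation contrast and $n$ controls the steepness of the curve.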