Abstract
Sensory inputs are noisy and under-constrained, and it has been suggested that people perform probabilistic inference over these inputs in order to infer the most likely state of the world. But does probabilistic inference occur at the lowest levels of sensory processing, and to what extent does it affect our percepts? We explored this question by asking whether inference occurs in a visual signal detection (SD) task. In the traditional SD model (Green & Swets, 1966), sensitivity (d′) to a signal is unaffected by whether people infer the parameters of the signal and no-signal distributions. According to more recent probability-matching models, however, in which people respond in proportion to the inferred probability of a signal, beliefs about signal discriminability should affect d′; we can therefore use d′ to probe whether people perform inference in an SD task. Here, we presented subjects with an SD paradigm in which we manipulated their beliefs while keeping sensory inputs constant. Specifically, whenever they reported low confidence, subjects in the ‘signal-present’ group were told that the signal was present, regardless of whether the stimulus had actually been presented, whereas subjects in the ‘signal-absent’ group were told that the signal was absent. If individuals do not perform inference over the parameters of the signal and no-signal distributions, then d′ should not change for either group. If they do, sensitivity should be significantly worse for the ‘signal-present’ group than for the ‘signal-absent’ group, because ‘signal-present’ feedback on low-confidence trials implies a weaker, less discriminable signal. Consistent with the latter prediction, we found a significantly lower d′ (p < 0.03) for the ‘signal-present’ group (d′ = 1.61) than for the ‘signal-absent’ group (d′ = 1.95). Our results suggest that people learn the statistics of the SD distributions (i.e., the signal mean and noise) in real time using top-down information, that this learning is well predicted by an ideal Bayesian observer model, and that this inference immediately affects sensitivity.
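To make the probability-matching logic concrete, the following is a minimal simulation sketch (ours, not the authors' model or analysis code). A probability-matching observer computes the posterior probability of a signal under its *believed* discriminability and responds ‘signal’ with that probability; sensitivity is then measured in the standard way as d′ = Φ⁻¹(hit rate) − Φ⁻¹(false-alarm rate). The true d′, the believed d′ values, and the trial count are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def measured_dprime(hit_rate, fa_rate):
    """Standard SDT sensitivity: d' = z(hit rate) - z(false-alarm rate)."""
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

def simulate(true_d, believed_d, n_trials=200_000):
    """Probability-matching observer (illustrative; not the paper's model).

    Each trial draws x ~ N(0, 1) on no-signal trials or N(true_d, 1) on
    signal trials. The observer computes P(signal | x) under its *believed*
    d' (equal priors) and responds 'signal' with that probability.
    """
    signal = rng.random(n_trials) < 0.5                  # 50% signal trials
    x = rng.normal(loc=np.where(signal, true_d, 0.0), scale=1.0)

    # Posterior via the log-likelihood ratio: LLR = believed_d * (x - believed_d / 2)
    p_signal = 1.0 / (1.0 + np.exp(-believed_d * (x - believed_d / 2)))
    respond = rng.random(n_trials) < p_signal            # match, don't maximize

    return measured_dprime(respond[signal].mean(), respond[~signal].mean())

true_d = 2.0
for believed_d in (2.0, 1.0):  # accurate vs. degraded belief about discriminability
    print(f"believed d' = {believed_d:.1f} -> measured d' = {simulate(true_d, believed_d):.2f}")
```

In this sketch, even an accurate belief (believed d′ = true d′) yields a measured d′ below the true d′ because probability matching adds response stochasticity, and the measured d′ falls further when the observer comes to believe the signal is less discriminable, which is the direction of the group difference reported above.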