Abstract
Most models of perceptual decision making assume that sensory evidence is accumulated until a decision threshold is reached, even when this evidence is dominated by noise. One important question is whether the brain performs the same or different computations to produce choices under high and low visibility. Here we specifically investigated how the strength of sensory evidence shapes the way perceptual choices are made in the human brain. Subjects were required to judge the direction of motion of dynamic random dot patterns of varying motion coherence while their brain activity was measured with fMRI. We used multivoxel pattern analysis to decode choices from local patterns of brain activity under different coherence levels. We found a positive relationship between motion coherence and decoding accuracy for perceptual choices in early visual cortex, suggesting that stimulus representations increasingly contribute to perceptual choice with increasing sensory evidence. Interestingly, we also found a negative relationship in posterior parietal brain regions, with the highest decoding accuracies for low levels of motion coherence. This could indicate that different mechanisms contribute to perceptual choices for motion under high and low sensory evidence. These results are discussed in light of current models and previous experimental results on the neural underpinnings of perceptual decision making.
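To make the decoding analysis concrete, the sketch below illustrates the general idea of choice decoding from regional voxel patterns, separately for each motion-coherence level; it is not the authors' actual pipeline. It assumes a scikit-learn linear classifier with cross-validation, and all data, variable names, and parameter values are hypothetical placeholders (the synthetic data are constructed so that decoding accuracy rises with coherence, mimicking the pattern reported for early visual cortex).

```python
# Minimal, hypothetical sketch of ROI-based choice decoding with
# cross-validation, run separately for each motion-coherence level.
# Synthetic data stand in for real single-trial fMRI voxel patterns.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

n_trials, n_voxels = 240, 150               # trials per coherence level, voxels in the ROI
coherence_levels = [0.0, 0.1, 0.25, 0.5]    # example motion-coherence levels

for coh in coherence_levels:
    # Hypothetical single-trial patterns (trials x voxels) and choices (0 = left, 1 = right).
    # In a real analysis these would be trial-wise response estimates from the ROI.
    choices = rng.integers(0, 2, size=n_trials)
    choice_pattern = rng.normal(size=n_voxels)                   # fixed choice-related pattern
    signal = np.outer(choices - 0.5, choice_pattern) * coh       # scale signal with coherence
    patterns = signal + rng.normal(size=(n_trials, n_voxels))    # add trial-by-trial noise

    # Linear classifier trained to predict the choice from the voxel pattern,
    # evaluated with 5-fold cross-validation.
    clf = make_pipeline(StandardScaler(), LinearSVC(dual=True, max_iter=5000))
    acc = cross_val_score(clf, patterns, choices, cv=5).mean()
    print(f"coherence {coh:.2f}: mean decoding accuracy = {acc:.2f}")
```

Comparing the resulting accuracies across coherence levels within a region is the kind of relationship (positive or negative) the abstract refers to.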
This work was funded by the German Research Foundation (DFG Grant HA 5336/1-1), the Bernstein Computational Neuroscience Program of the German Federal Ministry of Education and Research (BMBF Grant 01GQ0411), the Excellence Initiative of the German Federal Ministry of Education and Research (DFG Grant GSC86/1-2009), and the German National Merit Foundation.