Martin N. Hebart, Tobias H. Donner, John-Dylan Haynes; Decoding perceptual choices for motion stimuli of varying coherence. Journal of Vision 2011;11(11):768. doi: https://doi.org/10.1167/11.11.768.
Most models of perceptual decision making assume that sensory evidence is accumulated until a decision threshold is reached, even when this evidence is dominated by noise. One important question is whether the brain performs the same or different computations to produce choices under high and low visibility. Here we investigated how the strength of sensory evidence shapes the formation of perceptual choices in the human brain. Subjects judged the direction of motion of dynamic random dot patterns of varying motion coherence while their brain activity was measured with fMRI. We used multivoxel pattern analysis to decode choices from local patterns of brain activity at different coherence levels. We found a positive relationship between motion coherence and decoding accuracy for perceptual choices in early visual cortex, suggesting that stimulus representations contribute increasingly to perceptual choice as sensory evidence grows. Interestingly, we also found a negative relationship in posterior parietal brain regions, with the highest decoding accuracies at low levels of motion coherence. This could indicate that different mechanisms contribute to perceptual choices for motion under high and low sensory evidence. These results are discussed in light of current models and previous experimental results on the neural underpinnings of perceptual decision making.
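The decoding logic described above can be illustrated with a toy simulation. The sketch below is not the authors' analysis pipeline; it is a minimal stand-in that assumes a simple setup: two-choice trials, synthetic voxel patterns in which a direction-selective template is scaled by a "signal strength" parameter (loosely analogous to motion coherence in a sensory region), and a leave-one-out nearest-centroid classifier in place of the study's actual MVPA classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

def decode_accuracy(signal_strength, n_trials=200, n_voxels=50):
    """Simulate two-choice voxel patterns and decode the choice with a
    leave-one-out nearest-centroid classifier (an illustrative stand-in
    for the multivoxel pattern analysis used in the study)."""
    labels = rng.integers(0, 2, n_trials)       # choice: 0 = left, 1 = right
    template = rng.normal(0.0, 1.0, n_voxels)   # direction-selective pattern
    # Each trial = signed template scaled by signal strength, plus noise.
    X = (np.outer(2 * labels - 1, template) * signal_strength
         + rng.normal(0.0, 1.0, (n_trials, n_voxels)))
    correct = 0
    for i in range(n_trials):
        mask = np.arange(n_trials) != i         # leave trial i out
        c0 = X[mask & (labels == 0)].mean(axis=0)
        c1 = X[mask & (labels == 1)].mean(axis=0)
        pred = int(np.linalg.norm(X[i] - c1) < np.linalg.norm(X[i] - c0))
        correct += pred == labels[i]
    return correct / n_trials

# Decoding accuracy rises with pattern signal strength, mirroring the
# positive coherence-accuracy relationship reported for early visual cortex.
for s in (0.0, 0.1, 0.3):
    print(f"signal {s:.1f}: accuracy {decode_accuracy(s):.2f}")
```

At zero signal the classifier hovers around chance (0.5), and accuracy climbs toward ceiling as the template's contribution grows relative to the noise. The negative parietal relationship reported in the abstract is not captured by this feedforward toy model, which is precisely why the authors suggest a distinct mechanism there.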