Abstract
When deciding between two noisy sensory states (e.g., is motion to the left or to the right?), single-unit recording studies show that neural activity in early visual areas tracks the quality of the sensory information, whereas activity in parietal and frontal cortices more closely tracks the categorical decision process. Here we used fMRI to track the computation of a simple perceptual decision in human observers given a variable amount of sensory evidence. Observers viewed a display containing two moving-dot apertures, one in each of the upper quadrants. On each 12-s trial, a variable percentage (0%, 50%, or 100%) of the dots in each aperture moved coherently at either 45° or 135°. Occasionally the dots in one aperture slowed, which cued observers to press one of two buttons indicating the currently perceived direction of motion. A multivariate pattern classification analysis of the fMRI signal was used to estimate the degree of neuronal selectivity in a given visual area for each perceptual state. The pattern of activation in nearly all visual areas discriminated the direction of motion in high-coherence displays. However, the pattern of activation in some mid- and high-level areas also discriminated the perceived direction of motion even in the absence of sensory evidence (0% coherent motion), suggesting that neural activity within these regions more closely tracks the perceptual state of the observer.
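For readers unfamiliar with this analysis approach, the following is a minimal sketch of the kind of multivariate pattern classification described above, not the authors' actual pipeline: trial-by-voxel response patterns from a region of interest are fed to a linear classifier that decodes the perceived direction of motion (45° vs. 135°), with cross-validation across scanning runs. All data shapes, labels, and the scikit-learn classifier choice here are illustrative assumptions.

```python
# Hypothetical sketch: decode perceived motion direction (45 vs 135 deg)
# from voxel response patterns, using a linear classifier with
# leave-one-run-out cross-validation. Data are random placeholders.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score, LeaveOneGroupOut

rng = np.random.default_rng(0)

n_trials, n_voxels = 120, 200                   # assumed dimensions, illustration only
X = rng.standard_normal((n_trials, n_voxels))   # trial-by-voxel response patterns
y = rng.integers(0, 2, n_trials)                # 0 = 45 deg, 1 = 135 deg (perceived direction)
runs = np.repeat(np.arange(10), 12)             # scanning-run labels for cross-validation

clf = LinearSVC(C=1.0)
acc = cross_val_score(clf, X, y, cv=LeaveOneGroupOut(), groups=runs)

# Above-chance mean accuracy would indicate that the voxel pattern carries
# information about the perceived direction of motion; running the same
# analysis separately on 0%, 50%, and 100% coherence trials asks whether
# that information tracks the stimulus or the observer's perceptual state.
print(f"Mean decoding accuracy: {acc.mean():.2f}")
```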
Supported by F32EY01726 (JS) and EY12925 (GMB)