Abstract
Neurophysiological and fMRI experiments have probed how the brain makes decisions. The speed and accuracy of perceptual decisions covary with the certainty in the input image, and are related to the rate of evidence accumulation in parietal and frontal cortical “decision neurons” (Shadlen and Newsome, 2001; Roitman and Shadlen, 2002; Heekeren et al., 2004). Some modelers claim that perception is a form of Bayesian inference, which estimates an optimal interpretation of image data given priors and likelihoods. The Bayesian approach, however, uses classical statistical methods that do not explain the neocortical mechanisms that make these decisions. An emerging laminar model of visual cortex quantitatively simulates dynamic properties of decision-making during form and motion perception. This LAMINART model embodies a new type of computation that goes beyond Bayesian methods. Model simulations illustrate how visual cortex may exhibit fast feedforward processing when visual data are unambiguous, and slower feedback processing to reduce uncertainty when several perceptual interpretations are possible. A tradeoff between certainty and processing speed is demonstrated, leading to a new account of speed/accuracy tradeoff data. These model cortical circuits embody a self-organizing system that does not need to compute the stationary probability distributions of Bayesian models while making real-time decisions. Instead, inhibitory shunting interactions within prescribed cortical layers exhibit self-normalizing properties that, together with long-range cooperative interactions, enable the cortex to carry out real-time probabilistic decision-making in which the amplitude and spatiotemporal distribution of cell activities covary with the certainty of the network's decision.
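As a minimal illustration of the self-normalizing property attributed to shunting inhibition, the sketch below computes the equilibrium of a standard shunting on-center off-surround network. The decay rate A, activity ceiling B, and the two example input patterns are illustrative assumptions, not parameters of the LAMINART model itself; the point is only that equilibrium activities report input ratios while total activity stays bounded, so the activity pattern covaries with how ambiguous the evidence is.

```python
import numpy as np

def shunting_equilibrium(I, A=1.0, B=1.0):
    """Equilibrium of a shunting on-center off-surround network:
        dx_i/dt = -A*x_i + (B - x_i)*I_i - x_i * sum_{j != i} I_j
    Setting dx_i/dt = 0 gives x_i = B*I_i / (A + sum_j I_j):
    activities are self-normalized, reflecting input ratios,
    and their total is bounded above by B."""
    I = np.asarray(I, dtype=float)
    return B * I / (A + I.sum())

# Ambiguous evidence (two nearly equal alternatives) vs. unambiguous evidence.
ambiguous   = shunting_equilibrium([1.0, 0.9])   # activities nearly equal -> low certainty
unambiguous = shunting_equilibrium([1.9, 0.1])   # one activity dominates  -> high certainty

print(ambiguous,   ambiguous.sum())    # ~[0.345 0.310], total ~0.655
print(unambiguous, unambiguous.sum())  # ~[0.633 0.033], total ~0.667
```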
Support contributed by: NIH, NSF and ONR