Kestas Kveraga, Jasmine Boshyan, Moshe Bar; Magnocellular contributions to top-down facilitation of object recognition. Journal of Vision 2006;6(6):813. doi: 10.1167/6.6.813.
Most research on object recognition has focused on hierarchical, bottom-up processing along the ventral visual stream. However, emerging evidence suggests that top-down processes play a key role in facilitating object recognition. A recently proposed model of such top-down facilitation posits that a coarse, low spatial frequency image of a stimulus is rapidly projected to the orbitofrontal cortex (OFC), where it activates a prediction about potential object matches (Bar, 2003). A subsequent top-down projection from the OFC to the inferior temporal cortex narrows the object search-space by biasing the bottom-up process toward the most likely representations. We predicted that these facilitatory projections would rely on the magnocellular (M) pathway, which is known to convey low spatial frequency information, rather than on the complementary parvocellular (P) pathway. We studied humans with fMRI, using stimuli designed to engage the M or the P pathway selectively. The P stimuli were isoluminant, chromatic line drawings; the M stimuli were luminance-defined, low-contrast drawings. We hypothesized that despite the greater visibility of the P stimuli, the M stimuli would be recognized faster and would activate the OFC more strongly. Conversely, the P stimuli would require more processing in the inferior temporal cortex because of the lack of top-down facilitation. Our results support these predictions: the M stimuli yielded shorter recognition times and greater activity in the OFC than the P stimuli. These findings lend strong support to the top-down facilitation model by showing that magnocellular projections play a critical role in triggering top-down facilitation of object recognition.