Abstract
We are remarkably accurate at recognizing objects in “noisy” viewing conditions, such as dim illumination, partial occlusion, or rain and snow. To accomplish this, the visual system must somehow extract the coherent visual structure embedded in noisy images to achieve a more noise-invariant representation of object form. We investigated the neural basis of this denoising process by scanning subjects as they viewed line drawings of faces, houses, chairs, and shoes in the presence of varying levels of pixel-based noise (0%, 50%, or 75% noise). Pattern classification decoding techniques were used to infer the categories of the viewed objects from fMRI activity patterns, allowing us to determine how well individual areas along the visual hierarchy could distinguish object categories at each noise level. Although all visual areas could discriminate the object category of noise-free images with accuracy well above chance (55–65% correct; chance, 25%), high levels of noise severely disrupted the performance of early visual areas (V1–V3). In contrast, higher visual areas showed greater robustness to visual noise, with lateral occipital and ventral object-selective regions demonstrating almost complete noise invariance. For all visual areas, classifiers trained on cortical responses to either noisy or noise-free images were equally accurate at identifying the category of noisy test images. This implies that although noisy and noise-free images of the same object category produced, on average, similar patterns of activation, the patterns evoked by noisy images of different categories were less well separated than those produced by noise-free images. In summary, the results suggest that invariance to visual noise emerges gradually along the cortical hierarchy, with a fully “denoised” image representation found in higher object-sensitive areas of the visual system.
This form of noise invariance may reflect the operation of long-range contour integration mechanisms in higher visual areas.
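The decoding logic described above can be illustrated with a minimal toy sketch. This is not the study's actual analysis pipeline; it uses synthetic voxel patterns (Gaussian category "templates" plus measurement noise, with voxel counts and run counts chosen arbitrarily) and a correlation-based nearest-template classifier under leave-one-run-out cross-validation, one common multivoxel pattern analysis scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

n_voxels, n_runs = 100, 8
categories = ["face", "house", "chair", "shoe"]

# Hypothetical spatial pattern each category evokes across an ROI's voxels
# (a stand-in for real fMRI responses).
templates = {c: rng.normal(size=n_voxels) for c in categories}

def simulate_run(noise_sd):
    """One noisy activation pattern per category for a single scan run."""
    return {c: templates[c] + rng.normal(scale=noise_sd, size=n_voxels)
            for c in categories}

runs = [simulate_run(noise_sd=1.0) for _ in range(n_runs)]

def decode_accuracy(runs):
    """Leave-one-run-out cross-validated decoding accuracy.

    Training runs are averaged into per-category mean patterns; each
    held-out pattern is assigned to the category whose mean pattern it
    correlates with most strongly.
    """
    correct = total = 0
    for test_idx, test_run in enumerate(runs):
        train = [r for i, r in enumerate(runs) if i != test_idx]
        means = {c: np.mean([r[c] for r in train], axis=0)
                 for c in categories}
        for true_cat, pattern in test_run.items():
            corrs = {c: np.corrcoef(pattern, means[c])[0, 1]
                     for c in categories}
            pred = max(corrs, key=corrs.get)
            correct += (pred == true_cat)
            total += 1
    return correct / total

print(decode_accuracy(runs))
```

With four categories, chance performance is 25%; raising `noise_sd` in `simulate_run` mimics how heavier image noise degrades decodability when the noise disrupts the category-specific pattern structure.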