Abstract
We can remember images we have seen, even after viewing thousands, and we remember those images in considerable visual detail. How our brains manage to store visually detailed image memories at vast capacity is not well understood. Pattern separation proposals suggest that the visual representations of images reflected in high-level visual cortex are too overlapping to support the visual fidelity of memory; instead, computations in the hippocampus are thought to enhance the visual fidelity of memory by separating the representations of visually similar images before visual memory storage. To investigate this class of proposals, we compared visual memory representations in the hippocampus (HC) and in inferotemporal cortex (ITC), measured as one rhesus monkey performed an adapted Mnemonic Similarity Task. In this task, the monkey judged whether images were novel or repeated in the presence of visually similar lure images. We found behavioral evidence for sharpening: memory performance was sharpened relative to the benchmark expected from ITC visual representations. To investigate the neural correlates of these behavioral effects, we measured neural predictions of behavior using linear decoders optimized to extract memory information from the population responses in each brain area. We found that by the end of the 500 ms image viewing period, behavioral sharpening was reflected not only in HC but also in ITC. Moreover, sharpened visual memory representations emerged considerably earlier in ITC (230 ms after stimulus onset) than in HC (350 ms after stimulus onset). These results suggest that the analogs of memory sharpening behavior observed in ITC cannot be accounted for by computation in HC, and they imply that cortical computation contributes to shaping the visual detail of visual memory.