Abstract
Visual mental imagery and visual perception have been shown to share a hierarchical, topologically organized structure of neural representation. However, although many studies have demonstrated that mental imagery involves top-down modulation of perceptual processes, we have limited knowledge about how internally generated mental imagery is integrated with external stimulus input during perception. Here we used a dataset from a previous fMRI study (Horikawa & Kamitani, 2017), which included a visual perception experiment and a mental imagery experiment with human participants. We trained two types of voxel-wise encoding models, one based on Gabor features of the stimuli and one based on activity patterns in higher visual areas, to predict activity in early visual areas (V1-V3) during perception, and then evaluated the performance of these models during mental imagery. Our results showed that during perception, activity in early visual areas could be independently predicted from the Gabor features and from the activity of higher visual areas, and that these predictive relationships were shared with mental imagery. We further found that early visual areas exhibited both a Gabor-specific and a non-Gabor-specific neural response pattern to stimuli, which paralleled a representation of stimulus-relevance information. These findings provide insight into the mechanisms by which the brain integrates external and internal information.
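For readers unfamiliar with voxel-wise encoding models, the sketch below illustrates the general approach rather than the authors' implementation: a regularized linear regression mapping Gabor features to early-visual voxel responses, fit on perception trials and evaluated on imagery trials. All array names, shapes, and the choice of scikit-learn Ridge are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Ridge
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Hypothetical data: rows are trials, columns are Gabor features / V1-V3 voxels.
gabor_train = rng.standard_normal((200, 50))   # perception trials x Gabor features
v1v3_train = rng.standard_normal((200, 300))   # perception trials x early-visual voxels
gabor_test = rng.standard_normal((40, 50))     # imagery trials x Gabor features
v1v3_test = rng.standard_normal((40, 300))     # imagery trials x early-visual voxels

# Fit one regularized linear model per voxel (Ridge supports multi-output targets).
model = Ridge(alpha=1.0)
model.fit(gabor_train, v1v3_train)

# Evaluate on imagery data: correlate predicted and measured activity per voxel.
pred = model.predict(gabor_test)
voxel_r = np.array([pearsonr(pred[:, v], v1v3_test[:, v])[0]
                    for v in range(v1v3_test.shape[1])])
print(f"median voxel-wise prediction accuracy (r): {np.median(voxel_r):.3f}")
```

A second encoding model of the same form, with higher-visual-area activity patterns in place of the Gabor feature matrix, would correspond to the other model type described in the abstract.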