Abstract
Although the gist of a scene can be extracted during a single glance, detailed visual memory encoding is thought to depend on the serial selection of more specific scene information via eye movements. It remains unknown, however, how aspects of complex real-world scenes are accumulated and stored in memory with each fixation. Exploiting the excellent temporal resolution of the EEG, we investigated how memory representations are enriched with information extracted from successive fixations. We co-registered EEG and eye movements while thirty participants either actively explored the scenes or passively fixated at their centers. We then examined oscillations and potentials, both fixation- and stimulus-onset related, as a function of whether a scene was remembered or forgotten in a recognition test 24 hours later. A regression-based deconvolution modeling approach was employed to remove signal distortions caused by overlapping EEG potentials. In the active viewing condition, we found a subsequent memory effect aligned to the initial presentation of the scene, with a greater positivity at mid-frontal and parietal electrode sites 340-450 ms after image onset for subsequently remembered scenes. Moreover, stronger alpha (7-12 Hz) synchronization was found for fixations on remembered as opposed to forgotten scenes, corroborating the role of an online short-term memory mechanism in maintaining information within a scene. Lastly, in the low-beta band (15-20 Hz), we found a transition from relative synchronization to relative desynchronization from the first to the fourth fixation on a scene, suggesting that successive fixations enrich long-term memory representations for the entire scene. Thus, our results show how the content of memory is updated with each successive fixation. They also emphasize the role of visual short-term memory in the construction of robust long-term memory representations during visual exploration.
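The regression-based deconvolution mentioned above can be illustrated with a minimal sketch: a time-expanded (finite impulse response) design matrix gives each event type one predictor per post-event latency, so ordinary least squares can separate responses that overlap in the continuous signal. Everything below (signals, event counts, kernel shapes, window length) is an illustrative assumption for a toy simulation, not the authors' actual analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 100          # assumed sampling rate (Hz)
n_samples = 2000  # length of the simulated continuous "EEG"
win = 50          # model responses up to 0.5 s after each event

# Two event types whose responses overlap in time
# (think: stimulus onsets vs. fixation onsets).
onsets_a = rng.choice(np.arange(0, n_samples - win), size=15, replace=False)
onsets_b = rng.choice(np.arange(0, n_samples - win), size=15, replace=False)

# Ground-truth evoked responses (hypothetical kernels).
t = np.arange(win) / fs
kernel_a = np.sin(2 * np.pi * 3 * t) * np.exp(-t * 6)
kernel_b = np.cos(2 * np.pi * 5 * t) * np.exp(-t * 8)

# Simulate the continuous signal as the sum of overlapping
# responses plus noise.
eeg = np.zeros(n_samples)
for o in onsets_a:
    eeg[o:o + win] += kernel_a
for o in onsets_b:
    eeg[o:o + win] += kernel_b
eeg += 0.1 * rng.standard_normal(n_samples)

# Time-expanded design matrix: one column per event type per
# latency, so temporal overlap between events is modeled explicitly.
X = np.zeros((n_samples, 2 * win))
for lag in range(win):
    for o in onsets_a:
        X[o + lag, lag] = 1.0
    for o in onsets_b:
        X[o + lag, win + lag] = 1.0

# Ordinary least squares recovers overlap-corrected responses.
beta = np.linalg.lstsq(X, eeg, rcond=None)[0]
est_a, est_b = beta[:win], beta[win:]

# The estimates should correlate highly with the true kernels,
# even though naive averaging over the raw signal would mix them.
print(np.corrcoef(est_a, kernel_a)[0, 1],
      np.corrcoef(est_b, kernel_b)[0, 1])
```

The key point of the approach is that each sample of the continuous recording is modeled as a linear superposition of latency-specific regressors from all nearby events, so the least-squares solution attributes overlapping activity to the correct event type rather than smearing it into the averaged potential.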