Abstract
Understanding how neurons in the visual system support visual perception requires deep sampling of neural responses across a wide array of visual stimuli. Part of this challenge has been met by a recent large-scale 7T fMRI dataset, the Natural Scenes Dataset (NSD), which provides extensive high-resolution spatial sampling of brain activity in eight observers while they view complex natural scenes. Here, we present NSD-EEG, a large-scale electroencephalography (EEG) dataset that provides a detailed temporal characterisation of brain activity, complementing the spatial characterisation provided by NSD and thereby completing the characterisation of visual processing in the human brain. For this dataset, we optimised data quality by selecting eight participants from a larger pool on the basis of empirical signal-to-noise metrics and by recording with a high-density (164-channel) EEG system inside a Faraday cage. NSD images were shown for 250 ms each, followed by a variable interstimulus interval of 750-1000 ms. Each participant viewed 10,000 images 10 times, with a subset of 1,000 images (common across participants) repeated 30 times. Preliminary analyses reveal remarkably consistent event-related potentials (ERPs) for each stimulus, with high inter-trial reliability even at the rapid pace of one stimulus per second (maximum Pearson r = 0.8, p < 0.001). In addition, split-half representational dissimilarity matrices show strong reliability (maximum Spearman rho = 0.4, p < 0.001), further affirming the robustness of the data. We plan to publicly release the NSD-EEG dataset in the near future, alongside an extensive battery of complementary behavioural and psychophysical data. In combination with NSD, this will enable a comprehensive examination of neural responses to complex natural scenes in both space and time. Altogether, it will support the ongoing effort to use machine learning, artificial intelligence, and other computational methods to characterise and understand the neural mechanisms of vision.
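To illustrate the two reliability measures reported above, the sketch below computes a per-image split-half ERP correlation (Pearson) and a split-half RDM correlation (Spearman). It is a minimal Python/NumPy/SciPy illustration run on placeholder data; the array name `erps`, its shape, the even/odd repetition split, and the correlation-distance RDM are assumptions made for illustration, not the analysis pipeline used for NSD-EEG.

```python
# Minimal sketch of the two reliability metrics described in the abstract.
# Assumes hypothetical preprocessed data `erps` with shape
# (n_images, n_repetitions, n_channels, n_timepoints); all names, shapes,
# and the even/odd split are illustrative assumptions.
import numpy as np
from scipy.stats import pearsonr, spearmanr
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
n_images, n_reps, n_channels, n_times = 50, 10, 164, 250
erps = rng.standard_normal((n_images, n_reps, n_channels, n_times))  # placeholder data

# 1) Split-half (inter-trial) ERP reliability per image: correlate the ERP
#    averaged over odd repetitions with the ERP averaged over even repetitions.
odd_avg = erps[:, 1::2].mean(axis=1)   # (n_images, n_channels, n_times)
even_avg = erps[:, 0::2].mean(axis=1)
erp_reliability = np.array([
    pearsonr(odd_avg[i].ravel(), even_avg[i].ravel())[0]
    for i in range(n_images)
])
print("max split-half ERP Pearson r:", erp_reliability.max())

# 2) Split-half RDM reliability: build one representational dissimilarity
#    matrix (RDM) per half (correlation distance between image responses),
#    then compare the two RDMs' condensed upper triangles with Spearman's rho.
def rdm(responses):
    # responses: (n_images, n_features) -> condensed vector of pairwise
    # correlation distances (1 - Pearson r) between images.
    return pdist(responses, metric="correlation")

rdm_odd = rdm(odd_avg.reshape(n_images, -1))
rdm_even = rdm(even_avg.reshape(n_images, -1))
rdm_reliability, _ = spearmanr(rdm_odd, rdm_even)
print("split-half RDM Spearman rho:", rdm_reliability)
```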