Abstract
Scene processing occurs rapidly to create a coherent visual representation of our external environment over time. Historically, electrophysiological correlates of scene processing have been studied in experiments using static stimuli presented for discrete durations while participants maintain a fixed eye position. Gaps remain in generalizing these findings to real-world conditions, where eye movements are made to select new visual information and where, although the environment itself remains stable, changes in our position in space drive dynamic visual stimulation. Co-recording of eye movements and electroencephalography (EEG) makes it possible to use fixations as time-locking events in the EEG recording under free-viewing conditions. The resulting fixation-related potential (FRP) provides a neural snapshot with which to study visual processing under more naturalistic conditions. The current experiment aimed to explore the influence of scene statistics, specifically spatial frequency and luminance, on the early visual components evoked by fixations in a dynamic, continuous task. We present co-recorded eye movement and EEG data from a virtual navigation and visual search task in which spatial frequency and pixel-wise RGB luminance were calculated within a 5 deg patch centered on fixation. As part of our FRP estimation process, we used independent component analysis to remove ocular artifacts (Dimigen, 2020) and deconvolution modeling to control for overlapping neural activity and nonlinear covariates (Ehinger & Dimigen, 2021). The results suggest that early visual components of the FRP are sensitive to luminance and spatial frequency scene statistics around fixation, separate from saccade-driven amplitude modulation.
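To make the patch-based scene statistics concrete, the sketch below (Python/NumPy) shows one way such quantities could be computed per fixation. It is a minimal illustration, not the paper's exact pipeline: the pixels-per-degree value, the Rec. 709 luminance weights, and the amplitude-weighted spatial-frequency summary are all assumptions introduced here for clarity.

```python
# Illustrative sketch: per-fixation scene statistics in a 5 deg patch.
# PIX_PER_DEG, the luminance weights, and the single-number spatial
# frequency summary are assumptions, not the authors' exact method.
import numpy as np

PIX_PER_DEG = 40          # assumed display scale (pixels per degree)
PATCH_DEG = 5.0           # patch size from the abstract

def fixation_patch_stats(frame_rgb: np.ndarray, fx: int, fy: int):
    """Mean luminance and an amplitude-weighted mean spatial frequency
    for a square patch centered on fixation (fx, fy), in pixel coords."""
    half = int(PATCH_DEG * PIX_PER_DEG / 2)
    patch = frame_rgb[max(fy - half, 0):fy + half,
                      max(fx - half, 0):fx + half].astype(float)

    # Pixel-wise luminance from RGB (Rec. 709 weights; an assumption).
    lum = (0.2126 * patch[..., 0] +
           0.7152 * patch[..., 1] +
           0.0722 * patch[..., 2])
    mean_lum = lum.mean()

    # 2D amplitude spectrum of the demeaned luminance patch.
    amp = np.abs(np.fft.fftshift(np.fft.fft2(lum - mean_lum)))
    ny, nx = lum.shape
    fy_ax = np.fft.fftshift(np.fft.fftfreq(ny)) * PIX_PER_DEG  # cycles/deg
    fx_ax = np.fft.fftshift(np.fft.fftfreq(nx)) * PIX_PER_DEG
    radial = np.hypot(*np.meshgrid(fy_ax, fx_ax, indexing="ij"))

    # Amplitude-weighted mean frequency as one scalar summary of
    # spatial frequency content (epsilon guards a uniform patch).
    mean_sf = (radial * amp).sum() / (amp.sum() + 1e-12)
    return mean_lum, mean_sf
```

Per-fixation covariates of this kind could then enter a deconvolution model such as that of Ehinger and Dimigen (2021), alongside saccade parameters, to separate scene-statistic effects from saccade-driven amplitude modulation.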