Abstract
Effective visual search requires an orchestrated sequence of steps: sampling the environment, directing attention to one part of it, comparing what we see with what we are looking for and, if necessary, deciding where to move our eyes next. Recent developments in our ability to co-register brain scalp potentials (EEG) during free eye movements have made it possible to investigate brain responses related to fixations (fixation-related potentials; fERPs), including the identification of sensory and cognitive local ERP components linked to individual fixations (e.g. Ossandon et al., 2010; Kamienkowski et al., 2012; Kaunitz et al., 2014). However, little is known about how local information from individual fixations is integrated globally to support visual search. Given the links between low-frequency oscillations and integrative processes in fixed-gaze paradigms (e.g. Donner and Siegel, 2011; Bauer et al., 2014), we hypothesized that signatures of the global integration of information over the course of the task would be reflected in changes in low-frequency oscillations. Here, we performed an EEG and eye-tracking co-registration experiment in which participants searched for a target face in natural images of crowds. We successfully obtained local fERPs associated with the classical fixed-gaze ERP components (P1, N170/VPP, P3), and showed that changes in low-frequency power indexed the accumulation of evidence over the course of the task, thus supporting our experimental hypothesis. Finally, we show how our findings lead to a data-driven integrative framework, including a role for occipital theta oscillations in visual attention and for reduced alpha power in expectancy, which can serve as a starting point for elucidating how complex mental processes are implemented in natural viewing.