Abstract
The human brain can generate information predictions that facilitate visual processing. We used a cued left vs. right Gabor patch discrimination task to trace the dynamics of visual information predictions. On each trial, three observers categorized low (LSF) and high (HSF) spatial frequency Gabor patches displayed at one of three orientations (-15, 0, +15 deg.) in one of two locations (left vs. right visual field) while we measured their brain's magnetoencephalographic (MEG) activity. In a two-stage cueing design, a visual cue (a dot on the screen) predicted the visual field location of the incoming Gabor, followed by an auditory cue (220 Hz predicting LSF; 1760 Hz predicting HSF; 880 Hz as an uncued control) that predicted its spatial frequency (SuppFig. 1-A, B). Observers reported the spatial frequency of the Gabor with a key press. As expected, valid cueing reduced reaction times in all observers (p < 0.001; observer 1: 562 vs. 662 ms; observer 2: 390 vs. 515 ms; observer 3: 454 vs. 505 ms; SuppFig. 2).
We traced the prediction dynamics in the 1 s that followed the spatial and auditory cues by computing the mutual information between each cue and the corresponding responses of 12,773 MEG voxels sampled every 2 ms. Spatial cueing induced an early (123–136 ms post spatial cue) contralateral representation of the dot cues in occipital cortex that extended to frontal regions (189–246 ms) and returned as a prediction in occipital cortex (326–363 ms, SuppFig. 3 and 4). Auditory cueing revealed a similar early representation in auditory cortex (80–91 ms post auditory cue), followed by an extension to frontal regions (156–207 ms) and a prediction back into occipital cortex (334–359 ms, SuppFig. 4).
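The per-voxel, per-timepoint analysis above can be sketched as follows. This is a minimal illustration, not the study's actual pipeline: it uses a plug-in binned estimator of mutual information between a discrete cue and a continuous response, on synthetic data (the function name, bin count, and toy "voxel" are assumptions for illustration only).

```python
import numpy as np

def mutual_information(cue, response, n_bins=4):
    """Plug-in estimate of MI (in bits) between a discrete cue and a
    continuous response, discretized into equiprobable bins."""
    # Interior quantile edges give roughly equal-occupancy response bins
    edges = np.quantile(response, np.linspace(0, 1, n_bins + 1)[1:-1])
    r_binned = np.digitize(response, edges)
    # Joint probability table over (cue value, response bin)
    cues = np.unique(cue)
    joint = np.zeros((len(cues), n_bins))
    for i, c in enumerate(cues):
        for b in range(n_bins):
            joint[i, b] = np.mean((cue == c) & (r_binned == b))
    px = joint.sum(axis=1, keepdims=True)   # cue marginal
    py = joint.sum(axis=0, keepdims=True)   # response marginal
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# Toy demo: one "voxel" whose amplitude is modulated by the cue
rng = np.random.default_rng(0)
n_trials = 500
cue = rng.integers(0, 2, n_trials)                  # left vs. right cue
voxel = cue * 0.8 + rng.normal(0, 1, n_trials)      # cue-modulated response
print(mutual_information(cue, voxel))               # higher for informative voxels
```

In the study, such a measure would be evaluated independently at every voxel and every 2 ms timepoint, yielding the spatiotemporal maps of cue information summarized above; estimator choice and bias correction matter in practice and are not addressed in this sketch.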
Our results indicate that the early sensory processing of a cue, whether visual or auditory, first propagates to frontal areas, from which we traced a backward flow into occipital cortex that speeds up perceptual decisions.