Vision Sciences Society Annual Meeting Abstract  |   October 2020
Open Access
Representation of information prediction in the brain
Author Affiliations
  • Yuening Yan
    Institute of Neuroscience and Psychology, University of Glasgow
  • Jiayu Zhan
    Institute of Neuroscience and Psychology, University of Glasgow
  • Robin Ince
    Institute of Neuroscience and Psychology, University of Glasgow
  • Philippe Schyns
    Institute of Neuroscience and Psychology, University of Glasgow
Journal of Vision October 2020, Vol.20, 1044. doi:https://doi.org/10.1167/jov.20.11.1044
Citation: Yuening Yan, Jiayu Zhan, Robin Ince, Philippe Schyns; Representation of information prediction in the brain. Journal of Vision 2020;20(11):1044. https://doi.org/10.1167/jov.20.11.1044.

© ARVO (1962-2015); The Authors (2016-present)
Abstract

The human brain can generate information predictions that facilitate visual processing. We used a cued left vs. right Gabor patch orientation task to trace the dynamics of visual information predictions. On each trial, three observers categorized low (LSF) and high (HSF) spatial frequency Gabor patches displayed at one of three orientations (-15, 0, +15 deg.) and at one of two locations (right vs. left visual field) while we measured their magnetoencephalographic (MEG) activity. In a two-stage cueing design, a visual cue (a dot on the screen) predicted the visual field location of the incoming Gabor, followed by an auditory cue (220 Hz, LSF; 1760 Hz, HSF; 880 Hz, uncued control) that predicted its spatial frequency (SuppFig. 1-A, B). Observers indicated the spatial frequency of the Gabor with a key press. As expected, valid cueing reduced reaction times in all observers (p<0.001; observer 1: 562 vs. 662 ms; observer 2: 390 vs. 515 ms; observer 3: 454 vs. 505 ms; SuppFig. 2). We traced the prediction dynamics in the 1 s that followed the spatial and auditory cues by computing the mutual information between the cue and the responses of 12,773 MEG voxels sampled every 2 ms. Spatial cueing induced an early (123-136 ms post spatial cue) contralateral representation of the dot cue in occipital cortex that extended to frontal regions (189-246 ms) and returned as a prediction in occipital cortex (326-363 ms; SuppFig. 3 and 4). Auditory cueing revealed a similar early representation in auditory cortex (80-91 ms post auditory cue), followed by an extension to frontal regions (156-207 ms) and a prediction back into occipital cortex (334-359 ms; SuppFig. 4). Our results indicate that the early sensory processing of a cue (visual or auditory) propagates first to frontal regions, from which we traced a backward flow into occipital cortex that speeds up perceptual decisions.
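The abstract does not name the mutual information estimator used. Below is a minimal, hypothetical Python sketch of one per-voxel, per-time-sample MI computation between a discrete cue and a continuous MEG response, using a simplified Gaussian-copula estimate in the spirit of Ince et al. (2017, Human Brain Mapping); all function names and the simulated data are illustrative assumptions rather than the authors' pipeline, and bias correction and permutation-based significance testing are omitted.

import numpy as np
from scipy.stats import norm, rankdata

def copnorm(x):
    # Copula transform: map each row's ranks onto a standard normal.
    ranks = np.apply_along_axis(rankdata, -1, x)
    return norm.ppf(ranks / (x.shape[-1] + 1))

def gaussian_entropy(x):
    # Differential entropy (nats) of d x n data under a Gaussian model.
    d = x.shape[0]
    cov = np.atleast_2d(np.cov(x))
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (logdet + d * np.log(2 * np.pi * np.e))

def mi_model_gd(x, y):
    # MI (bits) between continuous responses x (d x trials) and a discrete
    # cue y, computed as H(X) - sum_c p(c) H(X | c) under a Gaussian model.
    x = np.atleast_2d(x)
    h_marginal = gaussian_entropy(x)
    h_conditional = sum((y == c).mean() * gaussian_entropy(x[:, y == c])
                        for c in np.unique(y))
    return (h_marginal - h_conditional) / np.log(2)

# Illustrative use: 200 trials of one voxel at one 2 ms time sample;
# cue is left (0) vs. right (1), with a simulated amplitude effect.
rng = np.random.default_rng(0)
cue = rng.integers(0, 2, size=200)
voxel = rng.normal(size=200) + 0.8 * cue
print(mi_model_gd(copnorm(voxel[None, :]), cue))  # MI in bits

In a full analysis of this kind, the estimate would be repeated for each of the 12,773 voxels at each 2 ms sample, yielding the MI time courses summarized in the abstract.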
