September 2019
Volume 19, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
Low-frequency oscillations track the contents of visual perception and mental imagery
Author Affiliations & Notes
  • Siying Xie
    Department of Education and Psychology, Freie Universität Berlin, Berlin, Germany
  • Daniel Kaiser
    Department of Education and Psychology, Freie Universität Berlin, Berlin, Germany
  • Polina Iamshchinina
    Department of Education and Psychology, Freie Universität Berlin, Berlin, Germany
  • Radoslaw Cichy
    Department of Education and Psychology, Freie Universität Berlin, Berlin, Germany
    Berlin School of Mind and Brain, Humboldt-Universität Berlin, Berlin, Germany
    Bernstein Center for Computational Neuroscience Berlin, Berlin, Germany
Journal of Vision September 2019, Vol.19, 171c. doi:https://doi.org/10.1167/19.10.171c
Citation: Siying Xie, Daniel Kaiser, Polina Iamshchinina, Radoslaw Cichy; Low-frequency oscillations track the contents of visual perception and mental imagery. Journal of Vision 2019;19(10):171c. https://doi.org/10.1167/19.10.171c.

© ARVO (1962-2015); The Authors (2016-present)
Abstract

Mental imagery of objects is phenomenologically similar to veridical perception. Consistent with this phenomenological similarity, fMRI studies have shown that visual perception and imagery of objects share neural representations. However, the temporal dynamics with which these representations emerge remain elusive. To investigate this, we performed an EEG experiment with three conditions: participants either saw one of 12 everyday objects (visual condition) or heard the corresponding words while being instructed to imagine the object (mental imagery condition) or not (auditory-only condition). We performed multivariate classification on oscillatory responses to reveal the time courses of perception and imagery. We conducted two key analyses. First, using time- and frequency-resolved classification, we found that in all three conditions object representations emerged rapidly (from around 110 ms) in oscillatory components at 5 Hz and 30 Hz. Comparing these representations across conditions revealed higher classification accuracy in the imagery than in the auditory-only condition at low frequencies (theta and alpha range), indexing additional imagery-specific processing. Second, using time-generalization analysis, we found that imagery and visual perception share content-specific representations in the theta and alpha frequencies, which emerge in imagery at around 1000 ms and in visual perception from 400 ms onwards. Altogether, our results indicate that low-frequency oscillations track the contents of visual perception and imagery in a shared neural format, suggesting that mental imagery is supported by an activation of oscillatory mechanisms also recruited during visual perception.
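The time-generalization logic described above (train a classifier at one time point, test it at all others; above-chance off-diagonal cells indicate representations shared across time) can be illustrated with a minimal sketch. This is a hypothetical toy example on simulated data, not the authors' pipeline: the trial counts, channel count, and the injected "signal" window are illustrative assumptions.

```python
# Toy sketch of time-generalization decoding on simulated EEG-like data.
# All parameters (80 trials, 16 channels, 20 time points, signal from
# time point 10 onward) are illustrative assumptions, not study values.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 80, 16, 20
y = np.repeat([0, 1], n_trials // 2)           # two object classes
X = rng.standard_normal((n_trials, n_channels, n_times))
X[y == 1, :, 10:] += 0.8                       # class-specific signal in a late window

# gen[t_train, t_test] = accuracy of a classifier trained at t_train, tested at t_test
gen = np.zeros((n_times, n_times))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for train_idx, test_idx in cv.split(X[:, :, 0], y):
    for t_train in range(n_times):
        clf = LogisticRegression(max_iter=1000)
        clf.fit(X[train_idx, :, t_train], y[train_idx])
        for t_test in range(n_times):
            gen[t_train, t_test] += clf.score(X[test_idx, :, t_test], y[test_idx])
gen /= cv.get_n_splits()

# Diagonal cells show time-resolved decoding; off-diagonal above-chance cells
# show that the learned pattern generalizes across time points.
print(f"early-early accuracy: {gen[2, 2]:.2f}, late-late accuracy: {gen[15, 15]:.2f}")
```

In the study, the analogous matrix was computed across conditions (train on imagery, test on perception) and within frequency bands, so that shared theta/alpha representations appear as above-chance cross-condition generalization.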

Acknowledgement: This work was supported by a DFG Emmy Noether Grant (CI-241/1-1) and a Chinese Scholarship Council Award (201706750004).