September 2018, Volume 18, Issue 10 | Open Access
Vision Sciences Society Annual Meeting Abstract
Human V4 Activity Patterns Predict Behavioral Performance in Imagery of Object Color
Author Affiliations
  • Michael Bannert
Werner Reichardt Centre for Integrative Neuroscience, University of Tübingen; Bernstein Center for Computational Neuroscience
  • Andreas Bartels
Werner Reichardt Centre for Integrative Neuroscience, University of Tübingen; Bernstein Center for Computational Neuroscience
Journal of Vision September 2018, Vol.18, 871. doi:10.1167/18.10.871

Michael Bannert, Andreas Bartels; Human V4 Activity Patterns Predict Behavioral Performance in Imagery of Object Color. Journal of Vision 2018;18(10):871. doi: 10.1167/18.10.871.

© ARVO (1962-2015); The Authors (2016-present)
Abstract

Among the multitude of elements making up visual experience, color stands out in that it can specify both subjective experience and objective properties of the outside world. Whereas most neuroimaging research on human color vision has focused on external stimulation, the present study addressed this duality by investigating how externally elicited color vision is linked to subjective color experience induced by object imagery. We recorded fMRI activity while showing our participants abstract color stimuli that were either red, green, or yellow in half of the runs ("real-color runs") and asked them to produce mental images of colored objects corresponding to the same three categories in the remaining half ("imagery runs"). To make sure that participants were engaged in visual imagery, they performed a 1-back same/different color judgment task on the imagined objects. We trained color classifiers using MVPA to distinguish between fMRI responses to the three color stimuli and cross-validated them on data from real-color or imagery runs. Although real-color percepts could be predicted from all retinotopically mapped visual areas, only color decoders trained on hV4 responses could additionally predict the color category of an object that was being imagined. This suggests that sensory-driven and self-induced colors share a common neural code in hV4. Using a hierarchical drift diffusion model, we furthermore demonstrated that the decoding accuracy in hV4 was predictive of performance in the color judgment task on a trial-by-trial basis. The commonality between neural representations of perceived and imagined object color, in combination with the behavioral modeling evidence, hence identifies area hV4 as a "perceptual bridge" linking externally triggered color vision with color in self-generated object imagery.
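The cross-decoding logic described above — train a color classifier on responses from real-color runs, then test it on responses from imagery runs — can be sketched as follows. This is an illustrative simulation, not the authors' analysis pipeline: the voxel patterns are synthetic, and the classifier choice, voxel count, trial counts, and noise levels are all assumptions made for the example.

```python
# Illustrative sketch of cross-condition decoding (perception -> imagery).
# All data here are simulated; parameters are hypothetical, not from the study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_voxels = 50                 # hypothetical number of voxels in the ROI (e.g., hV4)
n_trials_per_class = 40       # hypothetical trials per color category
colors = ["red", "green", "yellow"]

# A shared class-specific pattern, mimicking the idea of a common neural
# code for perceived and imagined color in the same region.
prototypes = {c: rng.normal(0.0, 1.0, n_voxels) for c in colors}

def simulate_runs(noise_sd):
    """Simulate trial-wise voxel patterns: class prototype + Gaussian noise."""
    X, y = [], []
    for c in colors:
        X.append(prototypes[c] + rng.normal(0.0, noise_sd,
                                            (n_trials_per_class, n_voxels)))
        y += [c] * n_trials_per_class
    return np.vstack(X), np.array(y)

X_real, y_real = simulate_runs(noise_sd=1.0)   # "real-color" runs
X_imag, y_imag = simulate_runs(noise_sd=2.0)   # "imagery" runs (assumed noisier)

# Train on sensory-driven responses, test on self-generated (imagery) responses.
clf = LogisticRegression(max_iter=1000).fit(X_real, y_real)
acc = accuracy_score(y_imag, clf.predict(X_imag))
print(f"cross-decoding accuracy: {acc:.2f} (chance = {1/len(colors):.2f})")
```

In this toy setup, above-chance accuracy on the imagery runs plays the role of the study's key observation: that a decoder trained on externally driven color responses generalizes to imagined color, consistent with a shared neural code.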

Meeting abstract presented at VSS 2018
