August 2023
Volume 23, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2023
A common neural code for representing imagined and inferred tastes
Author Affiliations & Notes
  • Jason Avery
    National Institute of Mental Health
  • Madeline Carrington
    Neuroscience Graduate Group, University of Pennsylvania, Philadelphia, PA, USA
  • Alex Martin
    Bioengineering Graduate Group, University of Pennsylvania, Philadelphia, PA, USA
  • Footnotes
    Acknowledgements  This study was supported by the Intramural Research Program of the NIMH/NIH: ZIA MH002588
Journal of Vision August 2023, Vol.23, 5613. doi:https://doi.org/10.1167/jov.23.9.5613
      Jason Avery, Madeline Carrington, Alex Martin; A common neural code for representing imagined and inferred tastes. Journal of Vision 2023;23(9):5613. https://doi.org/10.1167/jov.23.9.5613.

      © ARVO (1962-2015); The Authors (2016-present)
Abstract

Inferences about the taste of foods are a key aspect of our everyday experience of food choice. Our previous studies have shown that the responses to directly experienced tastes, as well as to food pictures, can be classified according to their taste category (e.g., sweet, salty, sour) within the gustatory dorsal mid-insular cortex. These findings suggest that this region represents not only taste experience but also the inferred taste of food pictures. However, the patterns of activity elicited by pictures and their associated tastes were found to be unrelated, suggesting either that the response to images reflects some other property correlated with the taste of the depicted foods, or that pictures and tastes activate different neuronal populations in this region. To explore this question further, we examined subjects as they explicitly imagined different tastes during ultra-high-resolution 7-Tesla functional magnetic resonance imaging. During scanning, healthy participants imagined basic tastes (sugar, salt, lemon juice). During separate scanning runs, subjects viewed pictures of a variety of sweet, salty, and sour foods, as well as non-food objects, while performing an object picture repetition-detection task. Using searchlight-based multivoxel pattern analysis, we were able to reliably classify both imagined tastes and the taste category of food pictures within the dorsal mid-insula and ventral occipito-temporal cortex. Importantly, within the dorsal mid-insula specifically, we were able to cross-classify the taste category of food pictures by training our model on imagined basic tastes (and vice versa). These findings suggest that the evoked response to food pictures within this region does indeed contain information about the inferred taste properties of the depicted foods, and they provide further evidence for the multimodal nature of this region of insular cortex.
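The key analysis described above is cross-classification (cross-decoding): a classifier trained on patterns from one condition (imagined tastes) is tested on patterns from another (viewed food pictures), so above-chance accuracy implies a shared neural code across conditions. The toy sketch below illustrates the logic only; it is not the authors' pipeline. The simulated voxel patterns, the nearest-centroid classifier, and all parameter values are illustrative assumptions, standing in for the searchlight MVPA applied to real fMRI data.

```python
# Illustrative sketch (not the study's actual code): cross-classification
# with a nearest-centroid classifier on simulated voxel patterns.
# A fixed per-category "neural code" is shared by both conditions, so a
# classifier trained on one condition generalizes to the other.
import math
import random

random.seed(0)
N_VOXELS = 20
CATEGORIES = ["sweet", "salty", "sour"]

# Hypothetical shared code: one mean pattern per taste category.
code = {c: [random.gauss(0, 1) for _ in range(N_VOXELS)] for c in CATEGORIES}

def simulate(noise_sd, n_trials=10):
    """Simulate (pattern, label) pairs: category mean plus Gaussian noise."""
    data = []
    for c in CATEGORIES:
        for _ in range(n_trials):
            data.append(([m + random.gauss(0, noise_sd) for m in code[c]], c))
    return data

def centroids(train):
    """Mean pattern per category, estimated from the training condition."""
    out = {}
    for c in CATEGORIES:
        pats = [p for p, lab in train if lab == c]
        out[c] = [sum(v) / len(v) for v in zip(*pats)]
    return out

def classify(pattern, cents):
    """Assign the label of the nearest centroid (Euclidean distance)."""
    return min(cents, key=lambda c: math.dist(pattern, cents[c]))

imagined = simulate(noise_sd=0.5)  # "imagined taste" runs
pictures = simulate(noise_sd=0.5)  # "food picture" runs

# Train on imagined tastes, test on food pictures (the reverse works too).
cents = centroids(imagined)
acc = sum(classify(p, cents) == lab for p, lab in pictures) / len(pictures)
print(f"cross-classification accuracy: {acc:.2f}")  # chance level is 1/3
```

Because the simulated conditions share the same category means, cross-decoding accuracy lands well above the 1/3 chance level; if the two conditions used unrelated patterns, it would not, which is the inferential logic behind the cross-classification result reported in the abstract.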
