September 2017
Volume 17, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract | August 2017
The effects of different types of human-object interactions on the ventral occipitotemporal cortex
Author Affiliations
  • Huichao Yang
    School of Brain and Cognitive Sciences, National Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University
  • Chenxi He
    School of Brain and Cognitive Sciences, National Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University
  • Xiaoying Wang
    School of Brain and Cognitive Sciences, National Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University
  • Zaizhu Han
    School of Brain and Cognitive Sciences, National Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University
  • Yanchao Bi
    School of Brain and Cognitive Sciences, National Key Laboratory of Cognitive Neuroscience and Learning & IDG/McGovern Institute for Brain Research, Beijing Normal University
Journal of Vision August 2017, Vol.17, 1236. doi:10.1167/17.10.1236
© ARVO (1962-2015); The Authors (2016-present)
Abstract

The human ventral occipitotemporal cortex (VOTC) contains clusters that are preferentially activated by different domains of objects. A currently prevailing hypothesis is that VOTC functionality is driven by its large-scale connections, which support humans' different interactions with various object domains: manipulation for small artifacts, navigation for scenes, and social interaction for conspecific entities. To test this hypothesis, we trained participants to associate meaningless graphs with these three types of human-object interactions by watching cartoons. BOLD fMRI responses were collected in three sessions: graph viewing (one-back task), cartoon watching for the three types of interactions with the graphs, and graph viewing again (one-back task). The following results were obtained: 1) watching cartoons of human figures performing the three types of interactions with the same graphs elicited whole-brain activation patterns that generally corresponded to the domain-specific regions for artifacts, scenes, and faces/humans, respectively; 2) within VOTC, the lateral occipitotemporal cortex (LOTC) was selectively activated when watching manipulation cartoons, and the parahippocampal place area (PPA) when watching navigation cartoons; 3) the strength of functional connectivity between the LOTC and another manipulation-activated region (the supramarginal gyrus) during manipulation learning significantly predicted both how strongly LOTC activity during graph viewing changed with training and how well participants memorized the graph-manipulation associations. No such effects were observed for the PPA. In summary, establishing a mapping between a visual object and a type of manipulation, supported by functional coupling between the LOTC and the supramarginal gyrus, induced domain-specific activation changes for visual perception in the LOTC. Social and navigation interactions, however, had no such effect on VOTC activation.
These findings highlight the role of human-object interaction, and of functional coupling among relevant networks, in the LOTC's domain-specific activation for small artifacts.

Meeting abstract presented at VSS 2017

×
×

This PDF is available to Subscribers Only

Sign in or purchase a subscription to access this content. ×

You must be signed into an individual account to use this feature.

×