September 2017
Volume 17, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2017
Mutual transfer between visual and auditory temporal interval learning supports a central clock in temporal processing
Author Affiliations
  • Shu-chen Guan
    School of Psychological and Cognitive Sciences, IDG-McGovern Institute for Brain Sciences, and Peking-Tsinghua Center for Life Sciences, Peking University, Beijing
  • Ying-Zi Xiong
    School of Psychological and Cognitive Sciences, IDG-McGovern Institute for Brain Sciences, and Peking-Tsinghua Center for Life Sciences, Peking University, Beijing
  • Cong Yu
    School of Psychological and Cognitive Sciences, IDG-McGovern Institute for Brain Sciences, and Peking-Tsinghua Center for Life Sciences, Peking University, Beijing
Journal of Vision August 2017, Vol.17, 518. doi:https://doi.org/10.1167/17.10.518
      Shu-chen Guan, Ying-Zi Xiong, Cong Yu; Mutual transfer between visual and auditory temporal interval learning supports a central clock in temporal processing. Journal of Vision 2017;17(10):518. https://doi.org/10.1167/17.10.518.



      © ARVO (1962-2015); The Authors (2016-present)

Abstract

Temporal perceptual learning shows specificity to the trained visual or auditory modality, or asymmetric partial transfer from audition to vision but not vice versa. These findings have been interpreted as evidence for distributed, rather than central, temporal processing (Ivry & Schlerf, 2008). In visual perceptual learning, location and orientation specificity can be eliminated with double training, indicating that learning specificity may not be a reliable basis for inferring the mechanisms of perceptual learning (e.g., Xiao et al., 2008). Here we investigated whether double training can also eliminate the modality specificity of temporal learning. We first replicated the asymmetric partial transfer between auditory and visual learning of a temporal-interval discrimination (TID) task with standard training. The 100-ms standard interval was marked by a pair of auditory beeps or visual gratings. Subjects practiced either the auditory or the visual TID task for five sessions. Visual TID learning had no impact on auditory TID performance (p=0.65), whereas auditory TID learning improved visual TID performance (p=0.005), although not as much as direct visual TID learning did (p=0.028). However, learning transfer was complete with double training. When visual TID learning was paired with an auditory frequency discrimination task at the same 100-ms interval, auditory TID performance improved as much as with direct auditory training (p=0.051), indicating complete cross-modal learning transfer. Similarly, when auditory TID learning was paired with a visual contrast discrimination task at the same 100-ms interval, visual TID performance improved as much as with direct visual training (p=0.95), again indicating complete cross-modal learning transfer. In both cases, practicing auditory frequency discrimination or visual contrast discrimination alone had no significant impact on TID performance.
Our results demonstrate mutual and nearly complete transfer of TID learning between the visual and auditory modalities, consistent with a central temporal processing mechanism shared across modalities.
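The TID task described above can be illustrated with a minimal simulation sketch. This is not the authors' procedure or analysis; it is a hypothetical two-interval forced-choice model assuming scalar-timing noise (perceptual noise whose standard deviation grows with the interval, governed by an assumed Weber fraction), with the function names and parameters invented for illustration:

```python
import random

def tid_trial(standard_ms=100.0, delta_ms=20.0, weber=0.1, rng=random):
    """Simulate one two-interval TID trial (hypothetical model).

    Each interval's perceived duration is the physical duration plus
    Gaussian noise whose SD scales with duration (scalar timing).
    Returns True if the observer correctly picks the longer interval.
    """
    comparison_ms = standard_ms + delta_ms
    perceived_std = rng.gauss(standard_ms, weber * standard_ms)
    perceived_cmp = rng.gauss(comparison_ms, weber * comparison_ms)
    return perceived_cmp > perceived_std

def percent_correct(n_trials=2000, **kwargs):
    """Proportion correct over many simulated trials (fixed seed)."""
    rng = random.Random(0)
    hits = sum(tid_trial(rng=rng, **kwargs) for _ in range(n_trials))
    return hits / n_trials
```

Under this toy model, accuracy rises with the interval difference (e.g., `percent_correct(delta_ms=40)` exceeds `percent_correct(delta_ms=5)`), and a discrimination threshold could be read off as the `delta_ms` yielding a criterion accuracy such as 75% correct.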

Meeting abstract presented at VSS 2017
