July 2013
Volume 13, Issue 9
Vision Sciences Society Annual Meeting Abstract | July 2013
Comparisons of temporal frequency limits for cross-attribute binding tasks in vision and audition
Author Affiliations
  • Shoko Kanaya
    The University of Tokyo, Graduate School of Humanities and Sociology
    National Institute of Advanced Industrial Science and Technology (AIST)
  • Waka Fujisaki
    National Institute of Advanced Industrial Science and Technology (AIST)
  • Shin'ya Nishida
    NTT Communication Science Laboratories, NTT Corporation
  • Shigeto Furukawa
    NTT Communication Science Laboratories, NTT Corporation
  • Kazuhiko Yokosawa
    The University of Tokyo, Graduate School of Humanities and Sociology
Journal of Vision July 2013, Vol.13, 885. doi:10.1167/13.9.885
Abstract

The speed of temporal binding of sensory signals, processed in parallel, can be psychophysically estimated from a critical temporal frequency beyond which observers cannot discriminate the phase relationship between two oscillating stimulus sequences. Fujisaki and Nishida (2010) showed that the temporal limit for visual cross-attribute binding tasks, as well as cross-modal binding tasks, is about 2.5 Hz regardless of attribute combination. Last year, we examined the temporal limits of two auditory cross-attribute binding tasks, and found that the limit of one condition was significantly higher than 2.5 Hz (Kanaya et al., 2012, VSS). However, this experiment did not completely exclude sensory cues produced by peripheral interactions of two auditory sequences. The present study therefore measured the temporal binding limits within and across three auditory attributes (frequency (FREQ) and amplitude (AMP) of a pure tone, and fundamental frequency (F0) of a band-limited pulse train) using stimulus parameters carefully selected to eliminate signal interactions within peripheral channels. The same participants also performed a visual binding task (color-orientation). Results showed that the temporal limits for auditory within-attribute binding tasks were 3.9 (FREQ-FREQ), 5.4 (AMP-AMP) and 3.4 Hz (F0-F0). The limits for auditory cross-attribute binding tasks were 4.0 (FREQ-AMP), 3.6 (FREQ-F0) and 3.3 Hz (AMP-F0), whereas the limit of the visual cross-attribute binding task remained close to 2.5 Hz. Therefore, even under conditions excluding peripheral interactions, the temporal limit obtained with auditory cross-attribute binding tasks can be higher than that of vision.
Our findings are consistent with the hypothesis that, while cross-modal and visual cross-attribute binding tasks reflect a high-level attribute-independent binding mechanism, auditory cross-attribute binding tasks, at least those we have tested, can also reflect sensory processing stages earlier than the high-level binding mechanism because neural processing for different auditory attributes is not segregated as clearly as that for different visual attributes.

Meeting abstract presented at VSS 2013
