Laura K. Suttle, Nicholas B. Turk-Browne; Visual benefits from auditory statistical learning: The case of reading. Journal of Vision 2011;11(11):838. doi: https://doi.org/10.1167/11.11.838.
Statistical learning (SL) may be important for language acquisition, as revealed by enhanced auditory word recognition following learning. However, language extends beyond audition, being fundamentally dependent on vision in numerous ways, including reading, writing, gestures, and sign language. To what extent does auditory SL in speech streams influence visual language processing? Previous research on audiovisual SL has found a lack of transfer between modalities for underlying grammars, emphasizing that SL operates in a stimulus-specific manner. Here, we test the transfer of stimulus-specific SL, where stimuli are spoken during familiarization and then written during test. Observers were familiarized with a five-minute auditory syllable stream that, without their knowledge, was constructed from four tri-syllabic words. There were no pauses or prosodic cues to indicate boundaries between words, and thus any resulting word knowledge reflects SL. Learning was tested using a speeded reading task in which observers read text aloud from a computer screen and pressed a key when finished. They completed six test trials (order counterbalanced across observers), each containing a string of 24 written syllables, with two trials from each of three conditions: words from familiarization (same-words), syllables from familiarization grouped into new words (different-words), and syllables not heard during familiarization grouped into new words (different-syllables). Reading times per syllable were significantly faster for same-words vs. different-words (a 6.6% boost in reading speed). Given that syllable familiarity was equated, this provides clear evidence that auditory SL affects reading. This difference reflects a benefit for same-words rather than a cost for different-words (which violated the learned structure), as supported by a directional test showing that same-words were also read faster than different-syllables.
In sum, these results demonstrate that auditory SL can transfer to the visual domain, and reveal a new implicit benefit of SL for a core task in our everyday experience.
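The familiarization stream described above has a simple generative structure: four tri-syllabic words concatenated in pseudo-random order with no acoustic boundary cues. A minimal sketch of such a stream generator is below; the syllables shown are hypothetical (the abstract does not list the actual stimuli), and the no-immediate-repetition constraint is a common convention in SL paradigms, assumed here rather than stated in the abstract.

```python
import random

# Hypothetical tri-syllabic words; the real syllables are not given in the abstract.
WORDS = [
    ("bu", "pa", "do"),
    ("ti", "go", "la"),
    ("ro", "ki", "se"),
    ("mu", "ne", "fa"),
]

def make_stream(n_words, seed=0):
    """Concatenate randomly ordered words into a continuous syllable stream.

    No pauses or markers separate words, so word boundaries are signaled
    only by transitional probabilities between syllables. A word is never
    immediately repeated (an assumed, conventional constraint).
    """
    rng = random.Random(seed)
    stream, prev = [], None
    for _ in range(n_words):
        word = rng.choice([w for w in WORDS if w != prev])
        stream.extend(word)
        prev = word
    return stream

# e.g. an 8-word (24-syllable) stream, matching the length of one test string
stream = make_stream(8)
```

Within each word the syllable-to-syllable transitional probability is 1.0, whereas across word boundaries it drops to roughly 1/3 (one of the three other words may follow), which is the statistical cue learners are presumed to exploit.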