Matthew Schneps, Chen Chen, Marc Pomplun, Jiahui Wang, Anne Crosby, Kevin Kent; Re-Inventing Reading: Rapid multi-channel processing of language accelerates reading. Journal of Vision 2016;16(12):462. doi: https://doi.org/10.1167/16.12.462.
The prevailing methods used for reading have been honed through centuries of social engineering to be extraordinarily efficient. And yet most people read at speeds far below the known neurological limits for language processing (Vagharchakian, Dehaene-Lambertz, Pallier, & Dehaene, 2012, J. Neurosci.), using methods inherited from pen-and-ink technologies that are fast becoming obsolete. In this study we investigate whether the neurological cap on the speed of reading can be relaxed by invoking parallel multimodal pathways for language, using concurrent visual and auditory presentations of text. Here, 40 college students with and without dyslexia used software that forcibly accelerated the visual presentation of text, which was concurrently rendered in tandem using compressed auditory text-to-speech. Observing speed and comprehension, we found that reading with this accelerated multimodal presentation was superior to reading accelerated text in either modality separately, controlling for comprehension. Importantly, when the accelerated reading methods were compared with traditional reading on paper, the traditional paper-based approach was found to be the least effective overall. The accelerated methods were also effective as an assistive technology: people with dyslexia read faster using multimodal accelerated methods than typical readers did using paper. Findings here suggest that in future evolutions, using technologies readily available today, parallel pathways for reading can be exploited to make reading substantially more efficient and inclusive than is possible with traditional paper-based methods.
Meeting abstract presented at VSS 2016