Joo-Hyun Song, Hee Yeon Im, Patrick Bédard; Encoding attentional-states during visuomotor adaptation. Journal of Vision 2015;15(12):50. doi: 10.1167/15.12.50.
We recently showed that visuomotor adaptation acquired under attentional distraction is better recalled under a similar level of distraction than under no distraction. This paradoxical effect suggests that attentional state (e.g., divided or undivided) is encoded as an internal context during visuomotor learning and must be reinstated for successful recall (Song & Bédard, in press). To investigate whether there is a critical temporal window for encoding attentional state in visuomotor memory, we manipulated whether participants performed a secondary attention-demanding task concurrently during the early or the late phase of visuomotor learning. Recall performance was enhanced when the attentional states at recall and during the early phase of visuomotor learning were consistent. However, performance reverted to untrained levels when tested under the attentional state of the late learning phase. This suggests that attentional state is primarily encoded during the early phase of learning, before motor errors decrease and reach an asymptote. Furthermore, we demonstrate that when divided and undivided attentional states were mixed during visuomotor adaptation, only divided attention was encoded as an internal cue for memory retrieval. Therefore, a single attentional state appears to be integrated with visuomotor memory primarily while motor error reduction is still in progress during learning.
Meeting abstract presented at VSS 2015