Bhavin Sheth, Alma Tijiboy; On utilizing eye movements to inform and guide subsequent thought. Journal of Vision 2017;17(10):528. doi: https://doi.org/10.1167/17.10.528.
Eye movements are a window into ongoing unconscious cognition. Here, we ask (i) whether eye scan patterns provide a sneak peek into future thought and overt response, and (ii) whether experimental manipulation can alter subsequent thought.

(i) Can a subject's eye movements prior to response reliably predict that response? We asked subjects to think of a number, then predicted their choice from their eye scan data. Subjects (n=50) were shown a number line (with no numbers) for 300 ms, followed by a blank screen and an oral prompt: "Think of number 1, 2, or 3 and say it out loud". Features for pattern classification were extracted from the eye scan data over the periods before (predictive) and after the number choices were presented, but before the subject's overt response. Classifier test performance (random forest; leave-one-out cross-validation, or LOOCV) was 92% (46/50) correct. Predictive features alone correctly predicted subject choice in 84% of cases. Thus, an individual's eye scan data foretells their response in an abstract number-choice task.

(ii) Can we drive subject eye position and/or spatial attention so as to reliably alter subsequent thought? We yoked subjects' attention to a region of space with an attention-grabbing cue and asked whether this systematically influenced the number ("1" or "2") they later thought of. Prior to the trial, subjects were shown a video on the left (n=20) or right (n=20) of the screen, or no video at all (n=20), and chose "1" 90%, 5%, and 65% of the time, respectively. Thus, an external stimulus reliably altered the number one will think of.

Our findings demonstrate that eye movements both inform and guide subsequent thought. We are currently exploring whether the temporal ordering of choice presentation imposes a spatial mapping or whether an innate spatial mapping onto physical space is required.
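The leave-one-out cross-validation (LOOCV) protocol named above can be sketched as follows. This is a minimal illustration only: the study used a random forest on eye-scan features, whereas here a simple nearest-centroid classifier and a toy one-dimensional "feature" stand in, purely to show how LOOCV yields one held-out prediction per subject. All variable names and the synthetic data are assumptions, not the authors' actual features or results.

```python
# Illustrative sketch of LOOCV (not the study's actual pipeline or data).
import random

random.seed(0)
n_subjects = 50                                   # 50 subjects, as in the abstract
labels = [random.choice([1, 2, 3]) for _ in range(n_subjects)]
# Toy feature: the chosen number plus noise, mimicking a choice-related signal.
features = [y + random.gauss(0, 0.4) for y in labels]

def nearest_centroid_predict(train_x, train_y, x):
    """Predict the label whose mean training feature is closest to x.

    (Stand-in for the random forest used in the study.)
    """
    centroids = {}
    for lbl in set(train_y):
        vals = [xi for xi, yi in zip(train_x, train_y) if yi == lbl]
        centroids[lbl] = sum(vals) / len(vals)
    return min(centroids, key=lambda lbl: abs(centroids[lbl] - x))

# LOOCV: hold out each subject in turn, train on the rest, predict the held-out one.
correct = 0
for i in range(n_subjects):
    train_x = features[:i] + features[i + 1:]
    train_y = labels[:i] + labels[i + 1:]
    if nearest_centroid_predict(train_x, train_y, features[i]) == labels[i]:
        correct += 1

print(f"LOOCV accuracy: {correct}/{n_subjects}")
```

Because every subject serves once as the test case, LOOCV gives an out-of-sample prediction for each of the 50 subjects, which is why the abstract can report accuracy as a simple fraction (46/50).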
Meeting abstract presented at VSS 2017