Journal of Vision
September 2021, Volume 21, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract | September 2021
From the lab into the wild: studying cognition with mobile EEG and augmented reality
Author Affiliations & Notes
  • Alexandra Krugliak
    University of Cambridge
  • Alex Clarke
    University of Cambridge
  • Footnotes
    Acknowledgements: This work was supported by a Royal Society and Wellcome Trust Sir Henry Dale Fellowship to AC (211200/Z/18/Z).
Journal of Vision September 2021, Vol.21, 1908. doi:https://doi.org/10.1167/jov.21.9.1908
Citation: Alexandra Krugliak, Alex Clarke; From the lab into the wild: studying cognition with mobile EEG and augmented reality. Journal of Vision 2021;21(9):1908. https://doi.org/10.1167/jov.21.9.1908.

© ARVO (1962-2015); The Authors (2016-present)
Abstract

Our visual environment impacts multiple aspects of cognition, including perception, memory, and attention. However, to understand the neural underpinnings of cognition, studies traditionally remove or control the external environment. As a result, we have limited understanding of neurocognitive processes beyond the lab. Here, to bridge the gap between studying cognition in the classic lab setting and in real-world environments, we used mobile EEG (mEEG) and augmented reality (AR), which allows us to maintain some control over perception by placing virtual objects into the real world. We first aimed to validate our AR and mEEG setup using a well-characterised cognitive response: the face inversion effect. Participants viewed upright and inverted faces in three settings: (1) a classic lab-based setting, (2) walking through an indoor environment while viewing photos of faces, and (3) walking through an indoor environment while viewing virtual faces. Results show greater low-frequency activity for inverted compared to upright faces in all three settings, demonstrating that cognitively relevant signals can be extracted from mEEG and AR paradigms. Further, we present a potential route for exploring the link between dynamic environments and EEG. Low- and mid-level visual features were extracted from head-mounted videos, recorded while walking through an outdoor environment, using a computational model of visual cortex, and the features were then statistically related to the continuous EEG. As expected, posterior electrodes correlated with the video features, highlighting a framework for relating continuous perceptual input to continuous neural activity. Together, this research helps pave the way to exploring neurocognitive processes in real-world environments.
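As a worked illustration of the condition contrast above, the hedged Python sketch below compares low-frequency EEG power between inverted and upright face trials using Welch power spectral density estimates on synthetic data. The 1-7 Hz band, the trial dimensions, and all names here are illustrative assumptions; the abstract does not specify the band or the estimator used.

    # Hypothetical sketch: per-electrode low-frequency power contrast,
    # inverted vs. upright face trials. Band limits and Welch PSD are
    # assumptions; synthetic data stand in for the recorded mEEG.
    import numpy as np
    from scipy.signal import welch

    def band_power(trials, sfreq, fmin=1.0, fmax=7.0):
        """trials: (n_trials, n_channels, n_samples) array.
        Returns mean power in [fmin, fmax] Hz per trial and channel."""
        freqs, psd = welch(trials, fs=sfreq,
                           nperseg=min(256, trials.shape[-1]), axis=-1)
        band = (freqs >= fmin) & (freqs <= fmax)
        return psd[..., band].mean(axis=-1)   # (n_trials, n_channels)

    # Synthetic example: 40 trials x 32 channels x 2 s at 250 Hz per condition.
    rng = np.random.default_rng(1)
    sfreq = 250
    upright = rng.standard_normal((40, 32, 2 * sfreq))
    inverted = rng.standard_normal((40, 32, 2 * sfreq))
    diff = (band_power(inverted, sfreq).mean(axis=0)
            - band_power(upright, sfreq).mean(axis=0))
    print(diff.shape)  # (32,): per-electrode inverted-minus-upright power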
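The continuous analysis can be pictured the same way. The abstract does not name the computational model of visual cortex or the statistical linkage, so the sketch below substitutes simple low-level image statistics (mean luminance, RMS contrast, gradient energy) for the model's features and a per-electrode Pearson correlation for the statistics; every function and variable name here is hypothetical.

    # Hypothetical sketch: relate frame-wise visual features from a
    # head-mounted video to continuous EEG, one correlation per electrode.
    import numpy as np
    from scipy.signal import resample
    from scipy.stats import pearsonr

    def frame_features(frames):
        """frames: (n_frames, H, W) grayscale video. Low-level features only."""
        lum = frames.mean(axis=(1, 2))                 # mean luminance
        contrast = frames.std(axis=(1, 2))             # RMS contrast
        gy, gx = np.gradient(frames, axis=(1, 2))      # spatial gradients
        edges = np.sqrt(gx**2 + gy**2).mean(axis=(1, 2))  # gradient energy
        return np.stack([lum, contrast, edges], axis=1)   # (n_frames, 3)

    def feature_eeg_correlation(eeg, frames):
        """eeg: (n_channels, n_samples); frames: (n_frames, H, W).
        Downsamples EEG to the video frame count, then correlates each
        feature time course with each electrode's signal."""
        feats = frame_features(frames)
        eeg_ds = resample(eeg, feats.shape[0], axis=1)  # align to frame rate
        r = np.zeros((eeg.shape[0], feats.shape[1]))
        for ch in range(eeg.shape[0]):
            for f in range(feats.shape[1]):
                r[ch, f], _ = pearsonr(eeg_ds[ch], feats[:, f])
        return r   # (n_channels, n_features) correlation map

    # Synthetic example: 30 s of 32-channel EEG at 250 Hz, 30 fps video.
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((32, 250 * 30))
    frames = rng.random((30 * 30, 64, 64))
    print(feature_eeg_correlation(eeg, frames).shape)  # (32, 3)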
