September 2024
Volume 24, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract
Visual motion extrapolation of moving objects drives real-time temporal re-alignment across hierarchical neural position representations
Author Affiliations & Notes
  • William Turner
    Queensland University of Technology
    The University of Melbourne
  • Charlie Sexton
    The University of Melbourne
  • Philippa Johnson
    The University of Melbourne
    Leiden University
  • Ella Wilson
    The University of Melbourne
  • Hinze Hogendoorn
    Queensland University of Technology
    The University of Melbourne
  • Footnotes
    Acknowledgements  This work was supported by Australian Research Council Grants FT200100246, DP220101166, and DP180102268 awarded to HH, as well as a QUT ECRIS grant and Unimelb DSH Research Support Scheme grant awarded to WT.
Journal of Vision September 2024, Vol.24, 488. doi:https://doi.org/10.1167/jov.24.10.488
      William Turner, Charlie Sexton, Philippa Johnson, Ella Wilson, Hinze Hogendoorn; Visual motion extrapolation of moving objects drives real-time temporal re-alignment across hierarchical neural position representations. Journal of Vision 2024;24(10):488. https://doi.org/10.1167/jov.24.10.488.

      © ARVO (1962-2015); The Authors (2016-present)

Abstract

There is mounting evidence that the brain uses predictive mechanisms to encode the position of moving objects. Recent theoretical work has shown that for hierarchical networks (e.g., the visual system) to accurately represent the position of moving objects under neural delays, predictions must be generated along both forward and backward pathways. An important consequence of this is the alignment of position representations across network layers. However, empirical evidence of such alignment occurring in the human brain has been lacking. In this study, we investigated the temporal dynamics of object-position representations over human visual cortex during smooth motion processing. Participants (N = 18, 2 sessions) viewed a stimulus moving along a circular trajectory, while EEG was recorded. Using multi-class linear discriminant analysis (LDA) classification, we constructed high-resolution probabilistic maps of the stimulus’ location over time. These revealed clear evidence of ‘representational overshoot’ following the unexpected disappearance or reversal of the stimulus, indicative of predictive position encoding. Importantly, examining neural dynamics immediately following motion onset, we found evidence of rapid temporal re-alignment occurring between distinct object-position representations. We show that a similar temporal shift emerges spontaneously at all layers of a simulated neural network via spike-timing-dependent plasticity, providing a simple account of this predictive effect. Ultimately, this study sheds light on the neural encoding of moving-object position, and constitutes the first empirical evidence of predictive temporal re-alignment occurring in the human brain.
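The intuition behind the spike-timing-dependent plasticity (STDP) account can be illustrated with a toy simulation. This is a hedged sketch, not the authors' model: the architecture (nine position-tuned inputs converging on one output unit), the pairwise exponential STDP rule, and all parameter values are illustrative assumptions chosen only to show how a forward temporal shift can emerge from motion exposure.

```python
import numpy as np

# Toy STDP sketch (illustrative assumptions throughout; not the authors'
# simulation). Nine position-tuned input units project onto one output
# unit tuned to the centre position. A rightward-moving stimulus
# activates the inputs sequentially on each sweep.
n_inputs = 9
centre = n_inputs // 2             # output unit's preferred position
w = np.full(n_inputs, 0.5)         # input -> output weights, initially uniform
A_plus, A_minus, tau = 0.02, 0.02, 2.0   # assumed STDP parameters

for _ in range(100):               # repeated rightward sweeps
    pre_times = np.arange(n_inputs, dtype=float)  # input i spikes at t = i
    post_time = pre_times[centre] + 0.5           # output spikes just after its input
    dt = post_time - pre_times                    # dt > 0: pre fired before post
    dw = np.where(dt > 0,
                  A_plus * np.exp(-dt / tau),     # potentiate trailing (earlier) inputs
                  -A_minus * np.exp(dt / tau))    # depress leading (later) inputs
    w = np.clip(w + dw, 0.0, 1.0)

# The weight centroid shifts behind the motion direction, so the output
# unit is driven while the stimulus is still approaching its preferred
# position, i.e. its position representation activates earlier in time.
centroid = (w * np.arange(n_inputs)).sum() / w.sum()
print(f"centroid shift: {centroid - centre:.2f} positions")
```

Because potentiation favours inputs that fire just before the output and depression penalises those that fire just after, the same shift would arise at every layer of a hierarchy trained this way, consistent with the spontaneous multi-layer re-alignment described above.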
