Vision Sciences Society Annual Meeting Abstract  |   August 2023
The role of motor and auditory predictive cues in modulating neural processing of predicted visual stimuli
Author Affiliations & Notes
  • Batel Buaron
    Tel Aviv University
  • Roy Mukamel
    Tel Aviv University
  • Footnotes
    Acknowledgements: This research was supported by the Israel Science Foundation (grant No. 2392/19 to R.M.).
Journal of Vision August 2023, Vol.23, 5302. doi:https://doi.org/10.1167/jov.23.9.5302
Abstract

Performance of goal-directed actions requires integration of motor commands with their expected sensory outcomes. A prominent theory suggests that predictions of an action's sensory outcome (‘efference copies’) are sent to relevant sensory regions and modulate their neural state, resulting in differential processing of the reafferent sensory signal. However, predictive signals are not unique to actions and have also been found for non-motor sources. Whether motor and non-motor predictive signals share common neural mechanisms remains an open question. Our previous study showed that neural activity in visual regions depends on the hand (Right/Left) used to trigger identical visual stimuli. This phenomenon provides a handle for comparing the mechanisms underlying motor and sensory predictions, by testing whether sensory predictive cues also modulate processing in visual cortex in a laterality-dependent manner. To this end, we used multi-voxel pattern analysis to classify fMRI activity patterns evoked by identical visual stimuli according to the laterality of the preceding cue: either Right/Left button-presses or tones delivered to the Right/Left ear. Preliminary results (n=5) suggest that activity in visual cortex evoked by identical visual stimuli was modulated in a hand-dependent manner, similar to our previous findings. In addition, activity in visual cortex was also modulated in an ear-dependent manner (cue presented to the Right/Left ear). Interestingly, we see little overlap between the cortical patches separating Right/Left motor cues and those separating Right/Left auditory cues. This pattern of results suggests that lateralized representation of cues in visual cortex is common to both motor and auditory prediction mechanisms, although the anatomical distribution of these representations differs. We will further examine this by performing a cross-decoding analysis between the two modalities (e.g., training a classifier to separate between hands and testing it on separating between ears based on signals in visual cortex). These results will help shape models of predictive mechanisms in visual processing.
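
The planned cross-decoding analysis can be illustrated with a short, hypothetical sketch. The code below is not the authors' pipeline: it assumes that trial-wise visual-cortex activity patterns have already been extracted into arrays, substitutes random data as a stand-in, and uses scikit-learn's LogisticRegression as an illustrative classifier choice.

    # Minimal cross-decoding sketch (illustrative only; not the authors' actual pipeline).
    # Assumes trial-wise activity patterns from visual-cortex voxels have already been
    # extracted. Array names, sizes, and the classifier are assumptions for illustration.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Hypothetical data: n_trials x n_voxels patterns evoked by identical visual stimuli,
    # labeled by the laterality of the preceding cue (0 = Left, 1 = Right).
    n_trials, n_voxels = 80, 500
    X_motor = rng.standard_normal((n_trials, n_voxels))     # stimuli cued by button-presses
    y_motor = rng.integers(0, 2, n_trials)                  # Left/Right hand labels
    X_auditory = rng.standard_normal((n_trials, n_voxels))  # stimuli cued by monaural tones
    y_auditory = rng.integers(0, 2, n_trials)               # Left/Right ear labels

    # Train a linear classifier to separate Right vs. Left hand cues...
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    clf.fit(X_motor, y_motor)

    # ...then test it on separating Right vs. Left ear cues (cross-decoding).
    # Above-chance accuracy would point to a shared lateralized representation across
    # motor and auditory cues; chance-level cross-decoding despite successful
    # within-modality decoding would point to distinct representations.
    cross_acc = clf.score(X_auditory, y_auditory)
    print(f"Cross-decoding accuracy (train: hands, test: ears): {cross_acc:.2f}")

With random stand-in data the printed accuracy will hover around chance (0.5); the logic of the train-on-one-modality, test-on-the-other comparison is the point of the sketch.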
