September 2023
Volume 23, Issue 11
Open Access
Optica Fall Vision Meeting Abstract
Invited Session I: Vision restoration: Deep learning-based stimulus optimization for prosthetic vision
Author Affiliations
  • Michael Beyeler
    University of California, Santa Barbara
Journal of Vision September 2023, Vol.23, 4. doi:https://doi.org/10.1167/jov.23.11.4
      © ARVO (1962-2015); The Authors (2016-present)
Abstract

Visual neuroprostheses are emerging as a promising technology to restore a rudimentary form of vision to people living with incurable blindness. However, phosphenes elicited by current devices often appear artificial and distorted. Although current computational models can predict the neural or perceptual response to an electrical stimulus, an optimal stimulation strategy needs to solve the inverse problem: what is the required stimulus to produce a desired response? Here we frame this as an end-to-end optimization problem, where a deep neural network encoder is trained to invert a psychophysically validated phosphene model that predicts phosphene appearance as a function of stimulus amplitude, frequency, and pulse duration. As a proof of concept, we show that our strategy can produce high-fidelity, patient-specific stimuli representing handwritten digits and segmented images of everyday objects that drastically outperform conventional encoding strategies by relying on smaller stimulus amplitudes at the expense of higher frequencies and longer pulse durations. Overall, this work is an important first step towards improving visual outcomes in visual prosthesis users across a wide range of stimuli.
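The end-to-end strategy described above can be sketched in code: a neural network encoder maps a target image to per-electrode stimulus parameters (amplitude, frequency, pulse duration), a fixed differentiable forward model predicts the resulting percept, and the encoder is trained so the percept matches the target. The toy phosphene model, network shapes, and all names below are illustrative assumptions for this sketch, not the authors' implementation, which inverts a psychophysically validated, patient-specific phosphene model.

```python
# Hedged sketch of end-to-end stimulus optimization (assumed PyTorch setup).
import torch
import torch.nn as nn

torch.manual_seed(0)


class PhospheneModel(nn.Module):
    """Toy differentiable stand-in for a phosphene model: maps per-electrode
    stimulus parameters (amplitude, frequency, pulse duration) to a percept."""

    def __init__(self, n_electrodes, img_size):
        super().__init__()
        # Fixed (non-trainable) random projection standing in for the
        # patient-specific electrode-to-percept mapping; a buffer, so it
        # receives no gradient updates.
        self.register_buffer(
            "proj", torch.randn(n_electrodes * 3, img_size * img_size) * 0.1
        )
        self.img_size = img_size

    def forward(self, stim):  # stim: (batch, n_electrodes, 3)
        flat = stim.flatten(1)
        out = torch.sigmoid(flat @ self.proj)
        return out.view(-1, 1, self.img_size, self.img_size)


class Encoder(nn.Module):
    """Deep encoder: target image -> stimulus parameters per electrode."""

    def __init__(self, n_electrodes, img_size):
        super().__init__()
        self.n_electrodes = n_electrodes
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(img_size * img_size, 128),
            nn.ReLU(),
            nn.Linear(128, n_electrodes * 3),
            nn.Sigmoid(),  # keep stimulus parameters in [0, 1]
        )

    def forward(self, img):
        return self.net(img).view(-1, self.n_electrodes, 3)


n_electrodes, img_size = 60, 8
phosphene = PhospheneModel(n_electrodes, img_size)  # frozen forward model
encoder = Encoder(n_electrodes, img_size)           # only the encoder is trained
opt = torch.optim.Adam(encoder.parameters(), lr=1e-2)

# Stand-in targets for handwritten digits / segmented object images.
targets = torch.rand(16, 1, img_size, img_size)

losses = []
for _ in range(200):
    opt.zero_grad()
    percept = phosphene(encoder(targets))  # predicted percept for each target
    loss = nn.functional.mse_loss(percept, targets)  # end-to-end reconstruction loss
    loss.backward()  # gradients flow through the frozen phosphene model
    opt.step()
    losses.append(loss.item())

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Because the forward model is differentiable, the reconstruction loss trains the encoder directly, so no paired (image, stimulus) dataset is needed; the encoder learns to solve the inverse problem implicitly.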

Footnotes
 Funding: NIH R00-EY029329