Abstract
The field of electronic retinal prostheses is moving quickly, with three varieties of retinal prostheses approved for commercial use in patients and several others in development. However, data from implanted patients make it clear that current technologies do not restore natural vision: Interactions between the electronics and the underlying neurophysiology result in significant spatiotemporal distortions of the perceptual experience (Fine and Boynton 2015). These distortions include temporal fading, spatial streaks, and inter-electrode interactions.
Here we describe a linear-nonlinear cascade model, developed using a variety of patient behavioral data, that predicts the perceptual experience of epiretinal prosthesis patients ('virtual patients'). These virtual patients can be used to provide realistic simulations of prosthetic vision for patients, doctors, and regulatory and reimbursement agencies. Virtual patients can also be used to improve prosthetic design, including optimizing electrode configurations and stimulation protocols. For example, we show how virtual patients can be used to generate training images for a machine learning algorithm that improves Snellen letter acuity.
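To make the modeling approach concrete, the sketch below shows the generic structure of a linear-nonlinear cascade: a stimulus is passed through a linear temporal filter and then a static nonlinearity that maps the filtered signal to perceived brightness. This is a minimal illustration of the cascade architecture only; the filter shape, parameters (`tau`, `slope`, `shift`), and toy pulse stimulus are assumptions for demonstration, not the fitted model described in the paper.

```python
import numpy as np

def ln_cascade(stimulus, dt=0.01, tau=0.1, slope=10.0, shift=0.5):
    """Minimal linear-nonlinear cascade (illustrative parameters).

    Linear stage: convolution with a unit-area exponential low-pass filter.
    Nonlinear stage: sigmoidal static nonlinearity mapping to (0, 1) brightness.
    """
    t = np.arange(0, 5 * tau, dt)
    kernel = np.exp(-t / tau)
    kernel /= kernel.sum()  # normalize so a sustained unit input saturates at 1
    linear = np.convolve(stimulus, kernel)[:len(stimulus)]
    return 1.0 / (1.0 + np.exp(-slope * (linear - shift)))

# Toy stimulus: a single sustained pulse
stim = np.zeros(200)
stim[10:60] = 1.0
brightness = ln_cascade(stim)
```

In an actual fitted model, the linear filter and nonlinearity would be estimated from patient behavioral data (e.g., threshold and brightness-rating measurements) rather than chosen by hand.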