Abstract
Introduction: The field of electronic retinal prostheses is moving quickly, with three varieties of retinal prostheses approved for commercial use in patients and several others in development. However, data from implanted patients make it clear that current technologies do not restore natural vision: Interactions between the electronics and the underlying neurophysiology result in significant spatiotemporal distortions of the perceptual experience (Fine & Boynton, 2015). Here we describe a linear-nonlinear cascade model, developed using a variety of patient data describing the brightness and shape of phosphenes elicited by stimulating a single electrode, that aims to predict the perceptual experience of epiretinal prosthesis patients. We tested whether this model could predict data from an independent set of behavioral measures examining spatiotemporal interactions across multiple electrodes.

Methods and Results: Behavioral data were collected from two Argus I epiretinal prosthesis (Second Sight Medical Products Inc.) patients, on 15 different electrode pairs with 800, 1600, or 2400 micron center-to-center separation. Subjects compared the perceived brightness of a standard stimulus (synchronous pulse trains presented across both electrodes) to the perceived brightness of a test stimulus (pulse trains across the electrode pair phase-shifted by 0.075, 0.375, 1.8, or 9 ms). A staircase procedure was used to determine the current amplitude necessary for each phase-shifted test stimulus to match the brightness of the standard. The model closely reproduced the patient psychophysical data: Specifically, the model captured spatiotemporal interactions that vary between suppression, independence, and summation depending on whether current fields overlapped and/or fell on the same ganglion axon pathway.
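The essential structure of a linear-nonlinear cascade of the kind described above can be illustrated with a minimal sketch: a leaky temporal integrator (linear stage) followed by a static sigmoid (nonlinear stage), with the peak of the output taken as predicted phosphene brightness. This is a toy illustration with hypothetical parameter values, not the actual pulse2percept implementation; all function names and constants here are assumptions for demonstration.

```python
import math

def pulse_train(n, dt, amp, freq, pulse_dur, phase_shift=0.0):
    """Rectangular pulse train sampled at time step dt (single phase,
    for simplicity)."""
    period = 1.0 / freq
    return [amp if 0.0 <= (i * dt - phase_shift) % period < pulse_dur else 0.0
            for i in range(n)]

def ln_brightness(stim, dt, tau=0.01, slope=3.0, shift=1.0):
    """Linear-nonlinear cascade: leaky temporal integration (linear
    stage) followed by a static sigmoid (nonlinear stage). The peak
    of the output is taken as predicted brightness."""
    y, peak = 0.0, 0.0
    for s in stim:
        y += dt / tau * (s - y)  # forward-Euler leaky integrator
        b = 1.0 / (1.0 + math.exp(-slope * (y - shift)))
        peak = max(peak, b)
    return peak

dt, n = 1e-4, 5000  # 0.1 ms steps, 500 ms stimulus

def two_electrode_brightness(phase_shift):
    """Sum the current from two electrodes (linear summation assumed),
    one pulse train phase-shifted relative to the other."""
    a = pulse_train(n, dt, amp=2.0, freq=20, pulse_dur=0.45e-3)
    b = pulse_train(n, dt, amp=2.0, freq=20, pulse_dur=0.45e-3,
                    phase_shift=phase_shift)
    return ln_brightness([x + y for x, y in zip(a, b)], dt)

b_sync = two_electrode_brightness(0.0)     # synchronous standard
b_shifted = two_electrode_brightness(9e-3)  # 9 ms phase shift
```

In this sketch, synchronous pulses sum within the integration window and so appear brighter than widely phase-shifted ones, which is the summation regime described above; the suppression and independence regimes depend on current-field overlap and axon-pathway geometry not captured by this one-dimensional toy.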
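The brightness-matching staircase described above can be sketched as a simple 1-up/1-down procedure: lower the test amplitude after each "brighter than the standard" response, raise it after each "dimmer" response, and estimate the point of subjective equality from the final reversals. The observer model, step size, and reversal count below are hypothetical placeholders, not the parameters used in the actual experiment.

```python
import random

def staircase_match(brighter_than_standard, start_amp=2.0, step=0.2,
                    n_reversals=8):
    """1-up/1-down staircase converging on the test amplitude that
    matches the standard's brightness (point of subjective equality,
    estimated as the mean amplitude over the recorded reversals)."""
    amp, last_dir, reversals = start_amp, 0, []
    while len(reversals) < n_reversals:
        direction = -1 if brighter_than_standard(amp) else +1
        if last_dir and direction != last_dir:
            reversals.append(amp)  # response flipped: record a reversal
        amp = max(0.0, amp + direction * step)
        last_dir = direction
    return sum(reversals) / len(reversals)

# Hypothetical simulated observer: the test looks brighter than the
# standard whenever its amplitude exceeds 2.5, plus response noise.
random.seed(0)
def observer(amp):
    return amp + random.gauss(0.0, 0.05) > 2.5

pse = staircase_match(observer)  # converges near 2.5
```

A 1-up/1-down rule tracks the 50% point of the psychometric function, which is the appropriate target for a brightness match (as opposed to a detection threshold, where asymmetric rules such as 3-down/1-up are typical).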
Conclusions: Simulations such as these provide insight into the perceptual experience of retinal prosthesis patients, can guide current and future technology development, and can offer regulatory bodies guidance on which visual tests are appropriate for evaluating prosthetic performance. (http://github.com/uwescience/pulse2percept)
Meeting abstract presented at VSS 2017