Abstract
When the human eye is exposed to short-wavelength light in the near-ultraviolet region, the light causes the lens to fluoresce, producing a widespread glare effect on the retina. This glare may interfere with normal vision, especially at low ambient illumination. This study characterized the relationship between fluorescence-induced glare from an ultraviolet laser and laser irradiance. An equivalent veiling luminance technique was used to estimate the luminance of the fluorescence-induced glare from an ultraviolet laser operating at 364 nm. First, threshold vs. intensity (TVI) relationships were determined for a Landolt ring target with a critical detail of 0.5° against background luminances from 1 cd·m⁻² to 100 cd·m⁻². Threshold measurements were then made for the same target against a lens fluorescence-induced glare field. The laser exposures ranged from 0.6 mW·cm⁻² to 60 mW·cm⁻² at the cornea and were 5 s in duration. The angle between the laser beam axis and the visual task was 10°. Initial results indicate that the glare luminance, estimated from the TVI curves, varies non-linearly with laser irradiance, with more veiling glare than expected at higher irradiances. These non-linearities are thought to be related to the equivalent luminance estimation technique rather than to any physical mechanism. Nevertheless, exposure to a near-ultraviolet laser at safe exposure levels can induce a veiling glare intense enough to impair contrast acuity significantly.
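As an illustrative sketch of the equivalent veiling luminance estimate referred to above (not part of the original abstract, and assuming the glare acts as an additive uniform background): let T(L) denote the TVI function, L_b the background luminance, T_g the threshold measured with the glare field present, and L_eq the equivalent veiling luminance; these symbols are introduced here for illustration only.

\[
T\!\left(L_b + L_{\mathrm{eq}}\right) = T_g
\quad\Longrightarrow\quad
L_{\mathrm{eq}} = T^{-1}\!\left(T_g\right) - L_b
\]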