Anthony LoPrete, Arthur Shapiro; Luminance captures equiluminance in three-dimensional motion. Journal of Vision 2019;19(15):49. doi: https://doi.org/10.1167/19.15.49.
© ARVO (1962-2015); The Authors (2016-present)
Virtual reality (VR) and augmented reality (AR) environments depend on binocular cues to create the perception of depth. However, the experience of depth can also be influenced by local adaptation, contrast, and other cues. Improving VR and AR environments therefore requires new techniques for efficiently measuring and understanding the interaction between color contrast and depth. To this end, we have created a novel display, the Pulfrich Helix, in which two columns of equally spaced dots move horizontally back and forth (see https://bit.ly/2O5cTrr). The dots create two types of motion: (1) when a darkening filter is placed over one eye, the dots appear to form a three-dimensional helix spiraling around the central Y axis; (2) a low-frequency motion that tracks the intersection of the two columns as the dots move vertically and horizontally. Observers typically do not see the low-frequency motion; however, when the display is blurred, the helical motion disappears and observers instead see motion streaming up or down the screen. Here, we report that the vertical motion also overrides the perception of helical motion when all the dots are equiluminant with the background. Furthermore, the helical motion dramatically reappears when just one of the dots differs in luminance from the background. We measured the number and contrast of dots required to produce the perceptual change from 3D helix to vertical streaming. The results indicate that even very small luminance contrast signals capture the equiluminant information and turn two-dimensional equiluminant motion into a three-dimensional percept.
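The classic Pulfrich geometry underlying the helix percept can be sketched in a few lines. This is an illustrative model, not the authors' stimulus code: a darkening filter over one eye delays that eye's signal by some interval Δt, so a dot moving horizontally at velocity v acquires an effective binocular disparity of roughly v·Δt, which the visual system interprets as depth. All parameter values below (amplitude, frequency, delay) are assumptions chosen for illustration.

```python
import math

def dot_x(t, amplitude=1.0, freq=0.5):
    """Horizontal position (deg) of a dot oscillating back and forth."""
    return amplitude * math.sin(2 * math.pi * freq * t)

def dot_velocity(t, amplitude=1.0, freq=0.5):
    """Horizontal velocity (deg/s): time derivative of dot_x."""
    return amplitude * 2 * math.pi * freq * math.cos(2 * math.pi * freq * t)

def pulfrich_disparity(t, delay_s=0.01, **kw):
    """Effective disparity (deg) induced by an interocular delay delay_s:
    disparity ~= horizontal velocity * delay."""
    return dot_velocity(t, **kw) * delay_s
```

Because disparity scales with speed, it peaks mid-sweep and vanishes at the turnaround points, so each dot's perceived position traces a closed loop in depth per cycle; a vertically offset stack of such dots yields the spiraling helix described above.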