Abstract
Psychophysical studies have established that the diplopia threshold (the binocular disparity at which fusion is lost) depends on both the magnitude of disparity and the lateral distance between the object and fixation. The ratio of these two values, the disparity gradient, and its relation to diplopia have been widely studied using simple isolated line or dot patterns. Typically, observers report diplopia at gradients greater than 1, with limits varying as a function of factors such as viewing duration and stimulus size. Very large relative disparities are common in real-world environments and, particularly in cluttered spaces, disparity gradients are often many times higher than 1. However, unless we are looking at high-contrast lines or edges, we rarely perceive diplopia. To understand this apparent inconsistency, we assessed diplopia thresholds in complex 3D environments displayed in a virtual reality headset. Using Blender, we rendered realistic 3D tree structures. Each structure consisted of multiple branches, with a central triad of branches used to measure thresholds. Across trials, the disparity of the central target branch was varied relative to its neighbouring reference branches according to a method of constant stimuli. A fixation dot was continuously visible and positioned randomly on one of the reference branches. On each trial, observers indicated whether the central branch appeared single or double. In one condition there were no explicit violations of the disparity gradient limit; in another, a diagonal branch positioned behind the triad created an extreme gradient. Psychometric functions were fit to individual data to determine 50% thresholds. We found no significant difference in thresholds across the two conditions. Our results challenge assumptions regarding the linkage between fusion and disparity gradients.
Further, these data support the hypothesis that, in the real world, the visual system capitalizes on the spatial continuity of objects and surfaces to reduce the perception of diplopia.