Abstract
Previously, using a relative depth judgment task in virtual reality, we reported that textures play an important role in extracting depth information. We found that the disparity sensitivity function measured with different textures has a bandpass shape, peaking at 1-3 cpd for both the target and its surround. However, luminance differences at the edges of the target may also be a contributing factor. Here, we isolate the individual contributions of edges vs. textures. Wearing a VR headset, subjects (n=5) identified which of three adjacent square targets appeared closer to them. The targets were placed at a virtual distance of 10 ft, with a surrounding background at 20 ft. The targets (3x3°, 1 s presentation time) and background carried either a bandpass-filtered noise texture (0.3-4.5 cpd) or no texture (uniform gray). Two conditions were tested: no edges, or edges only (center of the texture missing), rendered with a transparency map using the Three.js JavaScript 3D rendering library. The visibility of the targets from center to edge followed either an increasing or a decreasing sigmoidal function. Consistent with our previous results, background texture affected depth judgments. With an untextured background, the disparity sensitivity function obtained in the no-edges condition retained its bandpass shape, peaking at 3 cpd. In contrast, the edges-only condition was independent of spatial frequency (SF), appearing flat except at the highest SF (paired t-test, p=0.03). When the background contained a texture, depth sensitivity was poorer when its SF matched that of the target. With a no-texture target against a textured background, sensitivity was substantially reduced, and the presence of edges had a significant effect only in the lower SF range, < 1 cpd (Wilcoxon test, p<0.01). Surprisingly, any advantage of edges was absent at the middle and high SFs. Edge and texture information both contribute to depth judgments, but in different ways.
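
The edge manipulation described above (target visibility rising or falling sigmoidally from center to edge, applied via a Three.js transparency map) could be sketched as follows. This is an illustrative sketch only, not the study's code: the alpha-map resolution, the sigmoid steepness and midpoint, and the `noiseTexture` input are all assumed for demonstration.

```javascript
// Sketch: sigmoidal alpha map applied to a textured plane in Three.js.
// Assumed parameters: 256x256 map, sigmoid steepness k=20, midpoint r0=0.5.
import * as THREE from 'three';

// Build an alpha map whose opacity follows a logistic function of radial
// distance r from the center. `increasing=true` hides the center and keeps
// the edges ("edges only"); `increasing=false` keeps the center and fades
// the edges ("no edges").
function makeSigmoidAlphaMap(size = 256, increasing = true, k = 20, r0 = 0.5) {
  const data = new Uint8Array(size * size * 4);
  for (let y = 0; y < size; y++) {
    for (let x = 0; x < size; x++) {
      const dx = x / size - 0.5;
      const dy = y / size - 0.5;
      const r = Math.sqrt(dx * dx + dy * dy) / 0.5; // 0 at center, ~1 at edge
      let a = 1 / (1 + Math.exp(-k * (r - r0)));    // sigmoid in r
      if (!increasing) a = 1 - a;                   // flip for the "no edges" case
      const v = Math.round(255 * a);
      const i = 4 * (y * size + x);
      data[i] = data[i + 1] = data[i + 2] = v;      // alphaMap samples the green channel
      data[i + 3] = 255;
    }
  }
  const tex = new THREE.DataTexture(data, size, size, THREE.RGBAFormat);
  tex.needsUpdate = true;
  return tex;
}

// Apply to a target plane carrying the bandpass noise texture.
// `noiseTexture` (the 0.3-4.5 cpd filtered noise) is assumed to exist already.
const material = new THREE.MeshBasicMaterial({
  map: noiseTexture,
  alphaMap: makeSigmoidAlphaMap(256, true), // edges-only variant
  transparent: true,
});
const target = new THREE.Mesh(new THREE.PlaneGeometry(1, 1), material);
```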