Abstract
Previous research has shown that self-produced head motions towards and away from a target provide effective information about egocentric depth. We investigated depth perception of a remote target viewed via a camera attached to a robotic arm. Participants viewed the expansion and contraction of the target's image on a monitor as the camera was moved towards and away from the target with a consistent sinusoidal velocity profile; the participants did not control this motion. The task was to report the distance from the remote camera to the target. Eight participants completed three experimental sessions in a fixed order. The first session used familiar objects as stimuli (a tennis ball, a soda can, a compact disc, etc.), allowing participants to relate the images' expansion rates to the distances of objects of known size. The second and third sessions used plain white squares whose image sizes were controlled so that image size alone could not specify distance. Visual feedback about error was given throughout the first and second sessions; no feedback was given in the third session. In all three sessions, the actual distance of the target was a significant predictor of perceived distance. The results indicate that radial outflow produced by an oscillating camera can serve as effective information about depth in a remote environment, and that, after initial training, this information can be used without feedback. These results have implications for the design of systems that aid depth perception in teleoperation.
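The optical relation underlying the radial-outflow claim can be sketched as follows. This is a minimal sketch under a small-angle approximation, not the authors' analysis, and the symbols s, d, θ, d_0, A, and ω are our own notation, not taken from the abstract. A target of physical size s at camera distance d(t) subtends a visual angle θ(t) ≈ s/d(t); differentiating shows that the relative expansion rate of the image depends only on the camera's relative motion, so a known camera velocity makes distance recoverable even when s is unknown:

```latex
% Minimal sketch of the optics behind radial outflow as depth information
% (our notation; small-angle approximation, not taken from the paper).
\[
  \theta(t) \approx \frac{s}{d(t)}
  \quad\Longrightarrow\quad
  \frac{\dot{\theta}(t)}{\theta(t)} = -\frac{\dot{d}(t)}{d(t)}
  \quad\Longrightarrow\quad
  d(t) = -\,\dot{d}(t)\,\frac{\theta(t)}{\dot{\theta}(t)}.
\]
% For the sinusoidal camera motion described above (hypothetical
% parameterization: mean distance d_0, amplitude A, angular frequency
% \omega), the camera velocity is known to the system:
\[
  d(t) = d_0 + A\sin(\omega t), \qquad \dot{d}(t) = A\omega\cos(\omega t),
\]
% so d(t) is in principle recoverable from the observed expansion and
% contraction of the image alone, independent of the target's size s.
```

This independence from s is consistent with the design of the second and third sessions, where static image size was uninformative yet distance remained recoverable from the expansion rate.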
This research was supported by DARPA under Contract No. N66001-03-C-8043 and by the National Science Foundation under Grant No. SES-0353698.