The experiment was performed in an immersive virtual reality apartment. A top view of the apartment is shown in
Figure 1. The apartment consisted of two rooms (a bedroom and a living room, each approximately 3.6 meters wide by 4.5 meters long) connected by a corridor. A TV on the wall in the center of the corridor presented an image of the target object for each trial.
target object for each trial. The apartment was decorated to look natural (see left side of
Figure 2) with objects and furniture chosen from a database of 3D models. Doors connected the corridor to each room; they slid upward to open as participants approached, so that the rooms could not be seen from the corridor. The apartment's overall size (including walls) was approximately 6.4 by 7.2 meters, and its virtual dimensions corresponded to the real space that was tracked.
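The proximity-triggered doors can be sketched as a per-frame distance check against the viewer's position. The following Vizard-style Python fragment is a minimal illustration, not the experiment's code; the model files, door location, trigger radius, and slide speed are all assumptions.

```python
import viz
import vizact

viz.go()

apartment = viz.addChild('apartment.osgb')  # hypothetical apartment model
door = viz.addChild('door.osgb')            # hypothetical sliding-door model
DOOR_POS = [2.0, 0.0, 3.0]                  # assumed door location (meters)
door.setPosition(DOOR_POS)

TRIGGER_RADIUS = 1.0   # assumed distance (m) at which the door starts to open
DOOR_HEIGHT = 2.2      # assumed height (m) the door slides up
SLIDE_SPEED = 1.5      # assumed opening/closing speed (m/s)

door_open = 0.0  # current vertical offset of the door

def update_door():
    """Slide the door up when the viewer is near, back down otherwise."""
    global door_open
    dt = viz.getFrameElapsed()
    head = viz.MainView.getPosition()
    dx = head[0] - DOOR_POS[0]
    dz = head[2] - DOOR_POS[2]
    near = (dx * dx + dz * dz) ** 0.5 < TRIGGER_RADIUS
    target = DOOR_HEIGHT if near else 0.0
    step = SLIDE_SPEED * dt
    door_open = min(door_open + step, target) if door_open < target \
        else max(door_open - step, target)
    door.setPosition([DOOR_POS[0], door_open, DOOR_POS[2]])

vizact.onupdate(0, update_door)  # run once per rendered frame
```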
The apartment was created in FloorPlan 3D V11 (IMSI) and rendered in Vizard 4 (WorldViz, LLC). The visual display was delivered via an nVisor SX111 (NVIS) wide-field-of-view HMD. The nVisor SX111 contains a pair of SXGA displays (1280 × 1024 pixels per eye), each covering a field of view of 76° horizontally × 64° vertically; the partially overlapping eye images yield a combined field of view of 102° horizontal × 64° vertical. According to the manufacturer, the helmet weighs 1.3 kg. The stereo image was generated by Vizard running on a standard Windows computer.

A ViewPoint monocular infrared video eye-tracker (Arrington Research, Scottsdale, AZ, USA), recording at 60 Hz with an accuracy of about 1°, was integrated into the helmet and monitored the left eye.

Because the laboratory was relocated halfway through the project, head position was recorded with two different systems. For six participants, six LED markers attached to the HMD were tracked by a PhaseSpace motion tracking system (PhaseSpace, San Leandro, CA, USA). For the remaining seven participants, head position was monitored with a wide-area ceiling-mounted HiBall Motion Tracking System (3rd Tech, Chapel Hill, NC, USA) sampling at 600 Hz. Head position in the environment was used to update the stereo view at the 60 Hz refresh rate of the HMD and was also recorded for later analysis. The latency between a head movement and the corresponding update of the visual display was about 50 to 75 ms.
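In Vizard, coupling a head tracker to the rendered stereo view amounts to linking the tracker's pose to the main viewpoint and, for later analysis, logging that pose each frame. The sketch below is illustrative only: 'headtracker.dle' is a placeholder plugin name, not the actual interface of the PhaseSpace or HiBall systems used in the experiment.

```python
import viz
import vizact

viz.go()

# Placeholder tracker handle; the experiment used a PhaseSpace LED
# tracker for six participants and a HiBall ceiling tracker for the
# rest. 'headtracker.dle' is an assumed plugin name for illustration.
tracker = viz.add('headtracker.dle')

# Drive the stereo viewpoint from the tracker so that each rendered
# frame (60 Hz, the HMD refresh rate) uses the latest head pose.
viz.link(tracker, viz.MainView)

log = open('head_position.txt', 'w')

def record_head():
    """Log time-stamped head position once per frame for later analysis."""
    x, y, z = viz.MainView.getPosition()
    log.write('%.4f %.3f %.3f %.3f\n' % (viz.tick(), x, y, z))

vizact.onupdate(0, record_head)
```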
A Wii Remote (Nintendo, Kyoto, Japan) was also given to participants so that they could indicate with a button press when the target object had been found. Views of the helmet, the eye-tracker, and one of the authors wearing the portable system are shown in
Figure 2. During the experiment, the video records of the eye and scene cameras were combined into a custom QuickTime digital format. Head position, eye position, and Wiimote button presses, as well as the state of the virtual simulation of the room and objects, were saved as synchronized metadata on each video frame. Synchronization occurred at the sampling rate of the eye-tracker (60 Hz).
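The per-frame synchronization can be pictured as a single 60 Hz callback that samples every data stream at the eye-tracker's rate and writes one record per video frame. The sketch below is a rough stand-in for the actual QuickTime metadata pipeline: the plugin names and the getGaze/isButtonDown accessors are assumptions, not documented interfaces.

```python
import viz
import vizact

viz.go()

EYE_RATE = 60.0  # Hz; the eye tracker's sampling rate drives the logger

# Hypothetical device handles; the real hardware interfaces are not
# specified in the text, so the plugin names below are placeholders.
eye_tracker = viz.add('eyetracker.dle')
wiimote = viz.add('wiimote.dle')

log = open('trial_log.txt', 'w')
log.write('t\thead_x\thead_y\thead_z\tgaze_x\tgaze_y\tbutton\n')

def log_frame():
    """Write one synchronized record per eye-tracker sample (60 Hz)."""
    hx, hy, hz = viz.MainView.getPosition()
    gx, gy = eye_tracker.getGaze()        # assumed accessor
    pressed = wiimote.isButtonDown('A')   # assumed accessor
    log.write('%.4f\t%.3f\t%.3f\t%.3f\t%.3f\t%.3f\t%d\n'
              % (viz.tick(), hx, hy, hz, gx, gy, int(pressed)))

vizact.ontimer(1.0 / EYE_RATE, log_frame)
```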