Abstract
Measurement of eye movements has revealed rapid development of memory for object locations in 3D immersive environments. To examine the nature of that representation, and to test whether memory is coded with respect to the 3D coordinates of the room, we recorded head position while participants performed a visual search task in an immersive virtual reality apartment. The apartment had two rooms connected by a corridor. Participants searched the apartment for a series of geometric target objects. Some target objects were always placed at the same location (stable objects), while others appeared at a new location on each trial (random objects). We analyzed whether body movements showed changes that reflected memory for target location. In each trial we calculated how far the participant's trajectory deviated from a straight path to the target object, as well as the change in head orientation from the moment the room was entered to the moment the target was reached. We found that the average deviation from the straight path was larger and more variable for random target objects than for stable objects (0.47 vs. 0.31 m). In addition, the point of maximum deviation from the straight path occurred earlier for random objects than for stable objects (at 42% vs. 52% of the total trajectory). On room entry, lateral head deviation from the room center was already larger for stable objects than for random objects (18° vs. 10°). Thus, for random objects participants moved toward the center of the room until the target was located, whereas for stable objects they were more likely to follow a straight trajectory from first entry. We conclude that memory for target location is coded with respect to room coordinates and is revealed by body orientation at first entry. The visually guided component of search appears to be relatively unimportant or to occur very quickly upon entry.
Meeting abstract presented at VSS 2015
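The trajectory measures described above amount to the maximum perpendicular distance between the walked path and the straight line from the room entry point to the target, the point along the path at which that maximum occurs, and the angle between head orientation and the direction of the room center at entry. The abstract does not specify the exact computation, so the following Python sketch is only an illustration under assumed floor-plane (2D) head positions sampled over a trial; the function names, the yaw convention, and the choice to express the maximum-deviation point as a fraction of walked path length are assumptions, not the authors' method.

```python
import numpy as np

def path_deviation_metrics(trajectory, target):
    """Deviation of a walked trajectory from the straight entry-to-target path.

    trajectory : (N, 2) array of horizontal head positions (meters),
                 from room entry (row 0) to target arrival (row N-1).
    target     : (2,) horizontal position of the target object (meters).

    Returns the maximum perpendicular deviation (meters) and the fraction of
    the total walked path length at which that maximum occurs (assumed metric).
    """
    trajectory = np.asarray(trajectory, dtype=float)
    entry = trajectory[0]

    # Unit vector along the straight entry-to-target line.
    line = np.asarray(target, dtype=float) - entry
    line_unit = line / np.linalg.norm(line)

    # Perpendicular distance of every trajectory sample from that line.
    rel = trajectory - entry
    along = rel @ line_unit                      # progress along the line
    perp = rel - np.outer(along, line_unit)      # component off the line
    deviation = np.linalg.norm(perp, axis=1)

    # Cumulative walked distance, used to locate the maximum deviation
    # as a percentage of the total trajectory.
    step = np.linalg.norm(np.diff(trajectory, axis=0), axis=1)
    walked = np.concatenate([[0.0], np.cumsum(step)])

    i_max = int(np.argmax(deviation))
    return deviation[i_max], walked[i_max] / walked[-1]

def lateral_head_angle(head_yaw_deg, head_pos, room_center):
    """Unsigned angle (degrees) between the head's facing direction (yaw) and
    the direction from the head to the room center, e.g. at room entry."""
    to_center = np.asarray(room_center, dtype=float) - np.asarray(head_pos, dtype=float)
    center_yaw = np.degrees(np.arctan2(to_center[1], to_center[0]))
    diff = (head_yaw_deg - center_yaw + 180.0) % 360.0 - 180.0
    return abs(diff)
```

Applied per trial, such metrics would let the stable and random conditions be compared directly, e.g. by averaging the returned maximum deviations and entry angles within each condition.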