Abstract
Human navigation depends heavily on scene perception as an input. But does scene perception also depend on navigation? Specifically, if you cannot navigate through a scene that is otherwise visually navigable, does this affect the way you perceive the scene? This study examined how navigation experience influences visual scene perception. Participants first navigated a virtual reality (VR) outdoor environment and performed an object detection task at a particular location. In half of the environments, after performing the object detection task, they could walk through the scene to return to their starting point. Critically, in the other half, they could not walk through the scene even though it was visually navigable, as if an invisible wall blocked their path; in this case, they had to turn back and choose another route to the starting point. Participants navigated each environment four times to associate the navigation experience with the scenes. After the VR navigation phase, they performed a same/different task, judging whether two scenes were visually identical. Stimuli were scenes captured from the VR environments. Among the ‘different’ trials, in which visually different scenes were presented, half showed a pair of scenes that differed in navigability as experienced in VR, and the other half showed a pair that matched in navigability. Critically, this same/different task involved no navigation component. Participants had longer response times and higher error rates when responding ‘different’ to pairs of visually different scenes that shared the same navigability. These results suggest that past navigation experience with scenes affects the representational distances underlying scene perception.