Abstract
Distances are often misperceived in virtual environments. While a brief period of interaction through physical walking can improve distance perception, other body-based interactions, such as reaching, do not. Research in the real world suggests that action planning affects perceptual processing by biasing the cognitive system toward response-related dimensions, thereby facilitating their perception, but the role of this action-perception interplay in virtual space is far from fully understood. To contribute to this area of research, this study investigates the perception of an object in a virtual environment by testing the performance of 15 healthy participants (males = 3; mean age = 28.4 ± 3.62 years; all right-handed) on a size judgment task. Participants were instructed to interact with the virtual object under one of three conditions (grasping, reaching, or no hand movement) using a virtual copy of their real hand. They were asked to estimate the object's size by adjusting the dimensions of a comparison object both before and after the interaction phase. The interaction phase was preceded by a walking simulation in which participants approached the target, which was positioned far away from them. The results show that, overall, size estimation errors decrease after the interaction phase (β = -0.33812 mm, SE = 0.14486, t = -2.334, p < 0.05). However, the no hand-movement condition leads to significantly smaller errors than the grasping condition (β = -0.78053 mm, SE = 0.25181, t = -3.100, p < 0.01). These findings suggest that interacting with the object through hand movements, specifically grasping, which is known to facilitate the detection of size-related features in the real world, does not improve the perception of the object's size in this virtual reality context.
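The reporting pattern (β, SE, t, p per term) is characteristic of a linear mixed-effects model. As a minimal sketch only, assuming random intercepts per participant and fixed effects for phase (pre/post) and interaction condition, such an analysis could be run in Python with statsmodels as below; the abstract does not name the software, model formula, or variable coding, so all column names and the data file here are hypothetical.

```python
# Hedged sketch of a linear mixed-effects analysis consistent with the
# reported beta/SE/t statistics. Column names, factor coding, and the
# random-effects structure are assumptions for illustration, not the
# authors' actual pipeline.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per size judgment.
# - participant: subject ID (grouping factor for random intercepts)
# - phase:       "pre" or "post" (before/after the interaction phase)
# - condition:   "grasping", "reaching", or "no_movement"
# - error_mm:    signed size estimation error in millimetres
df = pd.read_csv("size_judgments.csv")  # hypothetical file

# Fixed effects for phase and condition (and their interaction),
# with random intercepts per participant.
model = smf.mixedlm(
    "error_mm ~ phase * condition",
    data=df,
    groups=df["participant"],
)
result = model.fit()
print(result.summary())  # coefficient table: estimates, SEs, test statistics, p-values
```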