Abstract
Despite the multisensory nature of human perception, applications involving virtual humans are typically limited to visual stimulation and speech. We conducted an experiment investigating the effects of combined visual, auditory, and/or vibrotactile stimuli on a participant's sense of social presence with a virtual human (VH). In an immersive virtual environment presented via a head-mounted display, participants were exposed to a VH walking toward them and pacing back and forth within their social space. Participants were randomly assigned to one of three conditions: participants in the "Sound" condition (N=11) received spatialized auditory feedback of the VH's footsteps striking the ground; participants in the "Vibration" condition (N=10) received additional vibrotactile feedback from the VH's footsteps via a haptic platform; and participants in the "Mute" condition (N=11) received neither sound nor vibrotactile feedback. We measured presence and social presence via questionnaires. We also analyzed participants' head-movement data for backing-away behaviors when the VH invaded their personal space, as well as their view direction relative to the VH's face. Our results show that social presence and backing-away distance were significantly higher in the Vibration condition than in the Sound condition, and that presence was significantly lower in the Mute condition than in the other two conditions. Vibrotactile feedback of the VH's footsteps, when accompanied by sound, increased social presence in both subjective self-reports and behavioral responses, compared to vision and sound alone. Participants who experienced both footstep sounds and vibrations exhibited greater avoidance behavior toward the VH, e.g., they avoided looking directly at the VH's face and moved their heads backward farther when the VH invaded their personal space.
Meeting abstract presented at VSS 2017