Abstract
We present a large-scale dataset of human body and eye movements recorded in 3D, interactive, and ecologically valid household VR environments. Our simulation platform, BEHAVIOR, includes 100 essential daily activities, such as cleaning and cooking, selected according to the American Time Use Survey. The underlying simulator (iGibson 2.0) provides photo-realistic, fully interactive 3D scenes reconstructed from real homes and supports physically realistic articulated objects. The dataset contains 500 trials (5 per activity) collected from 8 subjects in 15 virtual scenes featuring 244 object types. The total recording length is 18 hours, with individual trials ranging from 6 seconds to 11 minutes. We provide rich annotations for the visual stimuli, including depth, surface normals, optical flow, object segmentation, and object poses. Additionally, we extract abstract state information, such as object relations, represented in a symbolic logic language (e.g., sliced(objectA), onTop(objectB, objectC)). The goal of each activity is encoded in this logic representation, which facilitates interpreting low-level behaviors as evidence of high-level reasoning toward task completion. Our initial results show that this dataset poses several challenges for modeling attentional control: 1) the distribution of gaze indicates strong top-down task modulation of attention, as 34% of gaze is directed at goal objects; 2) the 3D scenes, with diverse and visually rich objects, challenge even the most advanced saliency models; 3) the long-horizon activities involve both navigation and object manipulation, engaging diverse cognitive abilities such as physical scene understanding, visual search, and eye-hand coordination. We propose to tackle these challenges by reconciling bottom-up sensory inputs with top-down task signals and by leveraging state-of-the-art machine learning models. Together with the dataset, we open-source our simulation environment and tools: researchers can modify existing activities or create their own by adding new objects and designing new household environments. We hope this makes BEHAVIOR an appealing experimental platform for VR research.
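To make the goal encoding concrete, the following is a minimal sketch (in Python; this is not the dataset's actual API or its logic definition format) of how an activity goal expressed as a conjunction of ground logical literals can be checked against a symbolic scene state. The predicate names follow the examples above; the object identifiers and the goal_satisfied helper are hypothetical.

    from typing import Set, Tuple

    # A ground literal: a predicate name followed by object instance ids,
    # e.g., ("sliced", "apple_1") or ("onTop", "knife_1", "counter_1").
    Literal = Tuple[str, ...]

    def goal_satisfied(state: Set[Literal], goal: Set[Literal]) -> bool:
        """A goal (conjunction of ground literals) holds when every
        goal literal is present in the current symbolic state."""
        return goal.issubset(state)

    # Hypothetical symbolic state extracted from one simulator frame.
    state = {("sliced", "apple_1"), ("onTop", "knife_1", "counter_1")}
    goal = {("sliced", "apple_1"), ("onTop", "knife_1", "counter_1")}
    print(goal_satisfied(state, goal))  # True

Under this kind of representation, recording when each goal literal first becomes true during a trial provides one way to align the gaze and body-movement streams with task progress.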