Abstract
Much of human behavior takes place in near space: whether for work or for leisure, we spend our days acting upon the reachable environment, using our hands to manipulate objects in the service of a task such as typing an email, chopping vegetables, or pouring a cup of coffee. While classic frameworks in vision do not draw a sharp distinction between reachable-scale and navigable-scale scenes, we have found that views of these reachspaces dissociate from navigable-scale scenes in both perceptual and neural measures (Josephs & Konkle, 2019; Josephs & Konkle, VSS, 2019). Here, we introduce a large-scale database of reachspace images to facilitate research on the perception of reachable environments. To sample reachspaces, we compiled a list of hobbies, chores, and professions, and used online searches to find images of the reachable environments people view when engaged in those activities. We additionally considered common locations, generated lists of activities performed in those spaces, and collected the corresponding images. We restricted the set to images with an appropriate perspective, viewing angle, and distance, reflecting the view of a person performing a task in that space. This process has yielded 175 reachspace categories, each with 20 or more images (N = 4,267 images), and more categories are continuously being added. The sampled categories were chosen to reflect broad cultural variability (e.g., Go boards as well as chessboards), a broad sampling of occupations (e.g., sound mixers, office workers, carpenters), and a broad sampling of activities within each location (e.g., kitchens support chopping vegetables, washing dishes, decorating cakes, mixing ingredients, etc.). Rich, high-quality image sets exist for objects and scenes, and here we introduce one for views of the space in between. Overall, this database will provide an important resource for researching the visual cognitive processes that operate over reachable views of the environment.