Abstract
Photographic images of objects and scenes are widely used as stimuli in studies of memory and perception, in both behavioral and neuroimaging paradigms. Many repositories of color photographs of objects and scenes are publicly available, offering a range of valuable features such as standardized photographic composition (e.g., viewing and illumination angle), large numbers of exemplars in specific sub-categories (e.g., animate/inanimate; indoor/outdoor; everyday objects; faces), or standardized visual features (e.g., Greebles; stimuli normalized for low-level image properties). However, most of these sets provide no quantitative data on the subjective relations among images within a set from the perspective of a human observer, i.e., their perceptual and semantic similarity. This information is valuable because stimulus similarity influences many cognitive processes. The aim of the present study was to create a database of object and scene color photographs that includes both visual and semantic similarity ratings among images within well-defined sub-categories of objects and scenes. We used Amazon’s Mechanical Turk to collect subjective visual and semantic similarity ratings for 240 color photographs in four sub-categories (60 animate objects, 60 inanimate objects, 60 indoor scenes, and 60 outdoor scenes). Next, we applied multidimensional scaling (MDS) to construct a visual and a semantic similarity space for each sub-category, and used automated clustering methods to provide similarity-based groupings of stimuli within the sub-categories. The stimulus set, similarity ratings, and methods for analyzing and grouping the stimuli by similarity will be made publicly available.
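To make the MDS and clustering steps concrete, the following Python sketch shows one way such an analysis could be run on averaged pairwise similarity ratings. The rating scale, matrix size, variable names, and cluster count are illustrative assumptions for this sketch, not the authors' actual pipeline.

    # Minimal sketch: MDS similarity space plus hierarchical clustering,
    # assuming pairwise similarity ratings (1 = least similar, 7 = most
    # similar) averaged across raters into a symmetric 60 x 60 matrix
    # for one sub-category. All names and parameters are illustrative.
    import numpy as np
    from sklearn.manifold import MDS
    from scipy.spatial.distance import squareform
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)

    # Stand-in for averaged ratings; replace with real data.
    sim = rng.uniform(1.0, 7.0, size=(60, 60))
    sim = (sim + sim.T) / 2.0          # enforce symmetry
    np.fill_diagonal(sim, 7.0)         # each image is identical to itself

    # Convert similarities to dissimilarities (higher = less similar).
    dissim = sim.max() - sim
    np.fill_diagonal(dissim, 0.0)

    # Two-dimensional similarity space via MDS on the precomputed
    # dissimilarity matrix.
    mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
    coords = mds.fit_transform(dissim)  # shape (60, 2)

    # Average-linkage hierarchical clustering on the same dissimilarities,
    # cut into an (arbitrarily chosen) four groups.
    Z = linkage(squareform(dissim, checks=False), method="average")
    labels = fcluster(Z, t=4, criterion="maxclust")

    print(coords[:3])   # MDS coordinates of the first three images
    print(labels[:10])  # cluster assignments of the first ten images

Separate runs of this kind, one on visual ratings and one on semantic ratings, would yield the per-sub-category visual and semantic similarity spaces and groupings described above.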