Abstract
People make rapid visual judgments of the properties of surfaces they are going to walk on or touch. How do they do this when the interactions of illumination geometry with 3-D material structure and object shape result in images that inverse optics cannot resolve? A possibly effective strategy would be to use heuristics based on information that can be gleaned rapidly from retinal images. By using perceptual scaling of a large sample of images, combined with correspondence and canonical correlation analyses, we discovered that material properties, such as roughness, thickness, and undulations, are characterized by specific scales of shape from shading. Using movies, we demonstrate that observers' percepts of these 3-D qualities vary continuously as a function of the relative energy in corresponding 2-D frequency bands. We also show that judgments of roughness, thickness, and undulations are predictably altered by adaptation to dynamic noise at the corresponding scales.
We exploit the link between the perceived visual properties of a material and specific bands of its spatial-frequency spectrum, or octaves of a wavelet decomposition, to introduce appearance bending, a set of image-based manipulation operators such as thicken, inflate, and roughen. The resulting editing interface produces plausible results at interactive rates, even for drastic manipulations of volumetric objects. We demonstrate the effectiveness of our method on a database of bidirectional texture functions for a variety of materials.
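The core idea of manipulating the relative energy in a 2-D frequency band can be illustrated with a generic bandpass-scaling sketch. This is a minimal, hypothetical example (the paper's actual appearance-bending operators are not specified here): it isolates a radial spatial-frequency band with an FFT, multiplies that band by a gain, and inverts the transform.

```python
import numpy as np

def scale_frequency_band(image, low, high, gain):
    """Scale the energy of one radial spatial-frequency band of a 2-D image.

    low, high: band edges in cycles per image; gain: multiplier for the band.
    A generic bandpass manipulation, not the paper's exact operators.
    """
    h, w = image.shape
    fy = np.fft.fftfreq(h) * h                      # vertical frequency, cycles/image
    fx = np.fft.fftfreq(w) * w                      # horizontal frequency, cycles/image
    radius = np.sqrt(fy[:, None] ** 2 + fx[None, :] ** 2)
    band = (radius >= low) & (radius < high)        # symmetric radial band mask
    spectrum = np.fft.fft2(image)
    spectrum[band] *= gain                          # boost or attenuate the band
    return np.real(np.fft.ifft2(spectrum))          # result stays real for real input

# Example: amplify mid frequencies of a synthetic "shaded surface" image,
# analogous to a roughen-style manipulation at a chosen scale.
rng = np.random.default_rng(0)
img = rng.standard_normal((128, 128))
out = scale_frequency_band(img, low=8, high=32, gain=2.0)
```

Because the radial mask depends only on frequency magnitude, it is conjugate-symmetric, so a real input yields a (numerically) real output; a gain of 1.0 leaves the image unchanged.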
The scale of local 3-D structure, inferred from shading cues, is thus critical in perceiving and useful in modifying material properties.