In recent years, the study of surface appearance has been greatly facilitated by computer graphics and electronic display technologies that enable experiments in which images of surfaces with complex geometric, reflectance and illumination properties can be rendered and displayed with great fidelity. However, a significant limitation of current methods is that these images are typically presented statically or in pre-calculated motion sequences that are viewed passively by experimental observers. Under real-world conditions, observers seeking to understand surface properties often engage in complex behaviors that involve active manipulation and dynamic viewpoint changes. To support these kinds of interactions, we have developed a novel display system that enables both active manipulation and dynamic viewing of computer graphics simulations. The system is based on an off-the-shelf laptop computer that contains an accelerometer and a webcam as standard components. Through custom software that integrates these devices, we are able to actively sense the 3D orientation of the laptop's display and dynamically track the observer's viewpoint. We use this information to drive a physically-based illumination-map rendering algorithm that produces an accurately oriented and realistically shaded view of a surface on the laptop's display. The user experience is akin to holding a physical surface in one's hands and being able to actively tilt it and observe it from different directions to see the changing patterns and properties of environmental surface reflections. The system provides a powerful research tool that allows anyone with a compatible laptop computer to use more natural modes of interaction in their surface appearance studies.
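The abstract does not specify the implementation details; as a minimal sketch of the pipeline it describes, the Python fragment below (using NumPy; all function names, axis conventions, and numeric values are illustrative assumptions, not the authors' code) shows how a gravity reading from the accelerometer could be converted into a display tilt, how a webcam-derived eye position could define a view matrix, and how a mirror-reflection direction could be used to index an illumination map.

```python
import numpy as np

def tilt_from_accelerometer(accel):
    """Estimate display pitch and roll from a static gravity reading.

    `accel` is a hypothetical 3-vector (ax, ay, az) from the laptop's
    accelerometer, assumed to measure only gravity while the device is
    held roughly still. Axis conventions are illustrative.
    """
    ax, ay, az = np.asarray(accel, dtype=float) / np.linalg.norm(accel)
    pitch = np.arcsin(np.clip(-ax, -1.0, 1.0))  # tilt about the screen's horizontal axis
    roll = np.arctan2(ay, az)                   # tilt about the screen's depth axis
    return pitch, roll

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Right-handed view matrix from a (hypothetical) webcam-tracked eye position."""
    eye = np.asarray(eye, dtype=float)
    f = np.asarray(target, dtype=float) - eye
    f /= np.linalg.norm(f)
    s = np.cross(f, np.asarray(up, dtype=float))
    s /= np.linalg.norm(s)
    u = np.cross(s, f)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = view[:3, :3] @ -eye
    return view

def reflection_direction(view_dir, normal):
    """Mirror-reflection vector used to sample an illumination (environment) map."""
    v = np.asarray(view_dir, dtype=float)
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)
    return v - 2.0 * np.dot(v, n) * n

# Example: a slightly tilted display viewed from an off-axis eye position.
pitch, roll = tilt_from_accelerometer([0.17, 0.0, 0.98])
view = look_at(eye=[0.1, 0.05, 0.5], target=[0.0, 0.0, 0.0])
r = reflection_direction([0.0, 0.0, -1.0], [np.sin(pitch), 0.0, np.cos(pitch)])
print(pitch, roll, r)
```

In such a pipeline, the accelerometer-derived tilt would orient the simulated surface, the tracked eye position would set the view matrix, and the reflection direction would drive the environment-map lookup each frame; the actual system may combine these steps differently.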
This work was supported by the National Science Foundation and the Eastman Kodak Company.