Abstract
The perceived visual flow of features on a 3D object provides cues about the underlying shape and motion. Likewise, imagery of a dynamic virtual object projected onto a static featureless physical object can be used to simulate shape or motion. If the shape of the virtual object differs geometrically from that of the physical object, the visual flow of features moving across the surface will be distorted, conveying incorrect shape cues. Mitigating these incorrect shape cues supports synthetic animatronics: simulating physical motion or deformation on geometrically static display surfaces. To achieve this, we define two sets of feature flow curves that represent the visual flow of a set of features over the course of an animation for a specific viewpoint: one set for the flow of features corresponding to the correct perception of the virtual object, and a second for the flow of those features as distorted by the display surface. These feature flow curves provide a basis for a perceptual error measure at single time steps (e.g., visual angular error) and for identifying temporal flow patterns that might give perceptual shape cues for the underlying display surface (e.g., a sharp trajectory change indicative of a fold or edge). We then dynamically alter the virtual imagery on the physical surface to reduce perceptual error by diminishing the visibility of specific features (and thus the resulting visual flow). This is achieved by contrast reduction or low-pass filtering proportional to the aggregate error across a set of viewpoints. We have observed that this dynamic filtering of the virtual imagery reduces the unwanted perception of the underlying surface while maintaining feature salience in areas of geometric similarity, upholding the overall perception of the desired virtual shape and motion.
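The abstract names the per-time-step measure only as a visual angular error; a plausible formalization (our assumption, including all symbols below) is the angle subtended at a viewpoint $o_v$ between a feature's ideal position on the virtual object, $p_{\mathrm{virt}}(f,t)$, and its displayed position on the physical surface, $p_{\mathrm{disp}}(f,t)$:

\[
\epsilon_v(f,t) = \arccos\!\left( \frac{\bigl(p_{\mathrm{virt}}(f,t)-o_v\bigr)\cdot\bigl(p_{\mathrm{disp}}(f,t)-o_v\bigr)}{\bigl\lVert p_{\mathrm{virt}}(f,t)-o_v\bigr\rVert \,\bigl\lVert p_{\mathrm{disp}}(f,t)-o_v\bigr\rVert} \right),
\qquad
E(f,t) = \frac{1}{|V|}\sum_{v\in V}\epsilon_v(f,t),
\]

where $E(f,t)$ is one simple choice for the aggregate error across the viewpoint set $V$; the abstract does not specify the aggregation scheme.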
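As a minimal sketch of the filtering step, assuming the aggregate error has been resampled into a normalized per-pixel map in the projected image's space (the resampling, normalization, function name, and parameter values below are illustrative assumptions, not the authors' implementation), error-proportional low-pass filtering and contrast reduction might look like:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def attenuate_features(image, error, max_sigma=5.0, min_contrast=0.2):
    """Diminish feature visibility in proportion to perceptual error.

    image : (H, W) float array in [0, 1], the virtual imagery to project.
    error : (H, W) float array in [0, 1], aggregate angular error across
            viewpoints, normalized (the normalization scheme is assumed;
            the abstract does not specify one).
    """
    # Low-pass filter the whole frame once, then blend toward it where the
    # error is high, so sharp features (and their visual flow) fade in
    # regions where the display surface would distort them.
    blurred = gaussian_filter(image, sigma=max_sigma)
    low_passed = (1.0 - error) * image + error * blurred

    # Contrast reduction toward the local mean, again scaled by error, so
    # remaining features lose salience rather than vanish abruptly.
    local_mean = gaussian_filter(low_passed, sigma=2.0 * max_sigma)
    contrast = 1.0 - (1.0 - min_contrast) * error
    return local_mean + contrast * (low_passed - local_mean)
```

Blending toward a blurred frame suppresses the high-frequency features whose distorted flow would betray the physical surface, while contrast reduction toward the local mean preserves structure in regions where the virtual and physical geometries roughly agree.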
Meeting abstract presented at VSS 2017