Computer graphics has long benefited from an understanding of human vision. Basic perceptual phenomena such as trichromacy, opponent colors, and color just-noticeable differences (JNDs) have enabled the design of graphics hardware and software. More complex stimuli such as human movements are also important for graphics but are not as well understood.
We present an investigation of the mapping between the parameters a computer uses to generate animated movements (different gaits of a walking figure), subjects' descriptions of those movements, and their judgements of the movements' similarity.
Experiment One examined the classification of gaits using pairs of opposite movement-description terms. We found that the focus of attention varied among subjects, but that similar stimulus characteristics were salient in determining the classification of gaits; classification was broadly consistent across most of our subjects and could be reduced to three or four principal components.
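The reduction described above can be illustrated with a small sketch. The data, dimensions, and scale names here are invented for illustration; the abstract does not specify them. Principal components are obtained from the singular value decomposition of the centered rating matrix:

```python
# Hypothetical sketch: reducing bipolar-scale gait ratings to a few
# principal components, as described in Experiment One. All data invented.
import numpy as np

rng = np.random.default_rng(0)
# e.g. 10 subjects x 8 gaits, each rated on 12 bipolar scales (smooth-jerky, ...)
ratings = rng.normal(size=(80, 12))

# Center the ratings; the right singular vectors of the centered matrix
# are the principal components, and the squared singular values give
# the variance explained by each component.
X = ratings - ratings.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s**2 / np.sum(s**2)

# Keep the first k components, as the abstract suggests k = 3 or 4 suffices.
k = 3
scores = X @ Vt[:k].T   # each rating projected onto the k components
print(scores.shape)     # (80, 3)
```

With real rating data, the first three or four entries of `explained` would account for most of the variance if the abstract's finding holds.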
Experiment Two explored the metric properties of motion-similarity judgements by asking subjects to compare a limited range of movements that were unlikely to span boundaries between linguistic descriptors but were nevertheless perceptually distinct. We conclude that similarity judgements do not satisfy all of the properties of a metric, but that their evaluation was similar across subjects.
Observed intersubject differences suggest that animation systems should be customizable not only to users' preferences, but to their perceptual abilities and movement categories as well. If our findings are correct, this customization might be achieved by adjusting the relative weights given to common perceptual cues, without adding new cues or substantially modifying their nature.
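The kind of customization predicted above can be sketched as a weighted distance over a shared set of perceptual cues, where only the per-user weights change. The cue names and values below are hypothetical illustrations, not measurements from the experiments:

```python
# Hypothetical sketch of per-user cue reweighting: the cue set is fixed,
# and customization changes only the weights. All values are invented.
import numpy as np

def dissimilarity(cues_a, cues_b, weights):
    """Weighted Euclidean distance between cue vectors; smaller = more similar."""
    return float(np.sqrt(np.sum(weights * (cues_a - cues_b) ** 2)))

# Invented cue values (e.g. stride length, bounce, arm swing) for two gaits.
gait_a = np.array([0.8, 0.3, 0.5])
gait_b = np.array([0.6, 0.4, 0.9])

# Two users weight the same cues differently; no new cues are introduced.
user1_weights = np.array([1.0, 1.0, 1.0])
user2_weights = np.array([2.0, 0.5, 0.25])

print(dissimilarity(gait_a, gait_b, user1_weights))
print(dissimilarity(gait_a, gait_b, user2_weights))
```

Under this sketch, the two users can rank the same pair of gaits as more or less similar purely because of their weight vectors, which is the behavior the intersubject differences would call for.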
Acknowledgements: We gratefully acknowledge the support of the Natural Sciences and Engineering Research Council of Canada “Interactive Computer Graphics” research grant to K.S. Booth.