In Experiment 1a, we found viewpoint-dependent object recognition for objects rotated around the pitch axis. This effect is best described by a quadratic curve that is approximately symmetric around 120° of rotation. We also found that in our sequential matching task, only the match condition produced viewpoint-dependent behavior, whereas mismatch trials appeared unaffected by viewpoint. Detecting a mismatch might rely more on the analysis of global, viewpoint-invariant features, whereas matching might depend more on the analysis of local, viewpoint-dependent features (e.g., Jolicoeur, 1990a); for example, deciding that a shape is not a car might require less viewpoint-dependent information than identifying the shape as a chair.
In Experiment 1b, we replicated our results from Experiment 1a. Grayscaling the images appeared to make the overall task slightly more difficult while still producing similarly viewpoint-dependent behavior. Although some studies report mirror-confusion effects for rotations around 180° (e.g., Gregory & McCloskey, 2010), we did not encounter this phenomenon in our study. In our case, rotating around the pitch axis produced views such as “upside-down, from behind,” which are not typical of the images that usually produce mirror confusions. The canonical (0°) and noncanonical (120°) viewpoints we used in Experiment 2 were those that produced the best and worst recognition performance, respectively, as derived from the average accuracies obtained in Experiments 1a and 1b.