It has long been known that discrimination of oblique
orientations is less precise than discrimination of cardinal (horizontal or vertical) orientations (Appelle,
1972; Heeley & Timney,
1988). Similarly for motion, observers are more precise at fine direction discrimination (i.e., reporting whether a stimulus is clockwise or anticlockwise of a reference direction) for motions near the cardinal directions (0°, 90°, 180°, or 270°) than they are for oblique directions (Ball & Sekuler,
1979,
1980; Dakin, Mareschal, & Bex,
2005a,
2005b; Gros, Blake, & Hiris,
1998; Heeley & Buchanan-Smith,
1992; Krukowski, Pirog, Beutter, Brooks, & Stone,
1998). Interestingly, this oblique effect applies only to discrimination performance and not to detection (at least in dot patterns; Gros et al.,
1998). In plaids, it is pattern motion, and not component motion, that determines the oblique effect (Heeley & Buchanan-Smith,
1992). Generally, the oblique effect is thought to result from low-level tuning properties of orientation-sensitive neurons, with oblique orientations and motion directions being underrepresented relative to cardinal directions (Li, Peterson, & Freeman,
2003; McMahon & MacLeod,
2003). It has been suggested that this uneven distribution of neural sensitivities reflects the statistical properties of the natural environment, which show a similar bias toward vertical and horizontal orientations (Essock, Haun, & Kim,
2009; Keil & Cristobal,
2000) and a consequent underrepresentation of the obliques. However, an oblique effect for direction discrimination with natural images has yet to be established.
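As an aside on the task itself: the fine direction-discrimination judgment described above (reporting whether a stimulus moves clockwise or anticlockwise of a reference direction) amounts to computing the sign of an angular difference, with care taken to wrap around 360°. The sketch below is purely illustrative (the function names and the 2° offsets are our own, not drawn from any of the cited studies) and assumes the convention that angles increase anticlockwise:

```python
def signed_angle_diff(test_deg, ref_deg):
    """Signed angular difference in degrees, wrapped to (-180, 180]."""
    d = (test_deg - ref_deg) % 360.0
    return d - 360.0 if d > 180.0 else d

def is_clockwise(test_deg, ref_deg):
    """True if the test direction lies clockwise of the reference.

    Under the assumed convention (angles increase anticlockwise),
    a negative signed difference means clockwise.
    """
    return signed_angle_diff(test_deg, ref_deg) < 0

# A 2-degree offset from a cardinal reference (90 deg, "up")
# versus the same offset from an oblique reference (45 deg):
print(is_clockwise(88.0, 90.0))   # True  (clockwise of vertical)
print(is_clockwise(47.0, 45.0))   # False (anticlockwise of oblique)
```

The oblique effect is the finding that observers need a larger offset to reach the same accuracy when the reference is oblique (e.g., 45°) than when it is cardinal (e.g., 90°), even though the computation the observer must perform is formally identical in both cases.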