Abstract
How the brain constructs a coherent representation of the environment from noisy and discrete visual input remains poorly understood. Here we explored whether awareness of the stimulus plays a role in the integration of local features into a representation of global shape. Participants were primed with a shape defined either by position or orientation cues, and performed a shape discrimination task on a subsequently presented probe shape. Crucially, the probe could be defined by either the same or different cues as the prime, which allowed us to distinguish priming by local features from priming by global shape. We found a robust priming benefit for visible primes: response times were faster when the probe and prime were the same shape, regardless of the defining cue. However, rendering the prime invisible uncovered a dissociation: position-defined primes produced only local priming, whereas orientation-defined primes produced only global priming. This suggests that the brain extrapolates global shape from orientation cues in the absence of stimulus awareness, but does not integrate invisible position information. In further control experiments, we tested the effects of invisible priming on the processing of local elements without a global interpretation.