Abstract
Ensemble perception plays a fundamental role in how our visual system represents complex scenes. It is commonly studied by briefly presenting sets of stimuli to participants and having them report the average across the feature dimension of interest (e.g., orientation). In real scenes, however, multiple feature sets are often present at the same time, and their inputs to the visual system change continuously, due to variability in the environment or to the observer's eye and head movements. Here, we developed a new task to test how participants track feature summaries continuously and how irrelevant distractor features affect the precision and time course of ensemble estimates. During 45-second tracking trials, participants viewed a set of oriented lines that continuously changed orientation, and concurrently rotated a joystick to reproduce the average orientation of those lines. Following previous work using continuous psychophysics (Bonnen et al., 2015), we computed the cross-correlation between the mean target orientation and the response time series, with the peak amplitude of the resulting cross-correlogram reflecting perceptual sensitivity and the peak latency reflecting processing time. We first validated that our novel task produces results that parallel findings from traditional ensemble tasks: the precision of orientation estimates increased when more items were present (p=0.003) and decreased with higher variability of the orientation set (p<0.001). Next, we examined the impact of distractors on tracking performance by adding differently colored distractor lines. We found lower precision (p=0.001) and a temporal cost of ~100 ms (p=0.025) when distractors were present.
Together, these results suggest that the presence of distractors in ensemble processing impairs and delays the extraction of relevant feature summaries, and demonstrate the utility of continuous tasks by revealing a temporal cost that may not be captured by traditional reaction time measures.
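To make the analysis concrete, the cross-correlogram described above can be sketched as follows. This is an illustrative implementation only, not the authors' code: it assumes the stimulus and response are sampled on a common time grid with spacing `dt`, z-scores both series, and reads sensitivity off the peak amplitude and processing time off the peak lag.

```python
import numpy as np

def cross_correlogram(stimulus, response, dt):
    """Cross-correlate a stimulus time series with a response time series.

    Returns the full correlogram, its lag axis (seconds), the peak
    amplitude (a sensitivity index), and the peak lag (processing time).
    Positive lags mean the response trails the stimulus.
    """
    # z-score both series so the correlogram is scale-free
    s = (stimulus - stimulus.mean()) / stimulus.std()
    r = (response - response.mean()) / response.std()
    # normalized cross-correlation at every lag
    xcorr = np.correlate(r, s, mode="full") / len(s)
    lags = np.arange(-len(s) + 1, len(s)) * dt
    peak = np.argmax(xcorr)
    return xcorr, lags, xcorr[peak], lags[peak]

# Example with a smoothed noise "stimulus" and a response that lags it
rng = np.random.default_rng(0)
stim = np.convolve(rng.standard_normal(2000), np.ones(50) / 50, mode="same")
resp = np.roll(stim, 20)  # response trails the stimulus by 20 samples
xc, lags, amp, latency = cross_correlogram(stim, resp, dt=0.01)
# latency recovers the imposed ~0.2 s lag; amp is near 1 for this noise-free copy
```

In practice the response series would be the joystick angle and the stimulus series the mean target orientation on each frame; circular feature dimensions such as orientation may additionally require angle unwrapping before correlating.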