Abstract
Problem. Visual segmentation comprises two mechanisms: rapid base grouping of simple features by pre-configured circuits, and slower but more flexible incremental grouping that dynamically enhances feature representations (Roelfsema, Ann. Rev. Neurosci., 2006). How can neural circuits for low- and intermediate-level vision learn such mechanisms? Deep learning provides a framework for developing neuroscience-related models in which inductive biases constrain model computations (Richards et al., Nat. Neurosci., 2019). Here, we investigate the task of contour tracing, in which visual items are incrementally grouped for decision-making in a reinforcement learning scenario (Roelfsema, Lamme & Spekreijse, Nature, 1998), under architectural constraints inspired by cortical columns.

Method. We propose a multi-layer, recurrent, convolutional network in which each layer is composed of retinotopically organized cortical columns. These columns are modeled as pairs of interacting excitatory-inhibitory (E-I) neurons in discrete feature channels. E-I pair activations are computed from input filtering, lateral integration, activity normalization over neuronal pools, and top-down modulatory feedback (see the illustrative sketch below). Reinforcement signals are provided during training, from which the network learns rewarding actions over sequential stimulus presentations, analogous to the corresponding psychophysical experiments in monkeys: the task is to fixate and hold gaze on a fixation cue and finally to make a saccade to target items that are tagged via learned neural tracing.

Results and Conclusion. The network successfully solves incremental grouping tasks after learning to act in accordance with the different phases of contour target presentation. Activity spreads along the target line, enhancing its neural representation relative to a distractor. Inspection of the model reveals that maintaining fixation is mediated by E-I interactions, whereas the tracing operation that constitutes incremental grouping is implemented mainly by feedforward and recurrent excitatory interactions.
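The following is a minimal, illustrative sketch (in PyTorch) of how one recurrent update of such an E-I columnar layer could be organized. It is not the authors' implementation: the module names, kernel sizes, and the exact forms of the divisive normalization and top-down modulation are assumptions made only to make the described computations concrete.

import torch
import torch.nn as nn
import torch.nn.functional as F

class EIColumnLayer(nn.Module):
    """One layer of retinotopic columns, each an interacting E-I pair per feature channel."""
    def __init__(self, in_ch, feat_ch, lateral_ksize=7, pool_ksize=5):
        super().__init__()
        # Feedforward input filtering into discrete feature channels.
        self.ff = nn.Conv2d(in_ch, feat_ch, kernel_size=3, padding=1)
        # Lateral integration among columns of the same layer, separately for E and I drives.
        self.lat_e = nn.Conv2d(feat_ch, feat_ch, lateral_ksize, padding=lateral_ksize // 2)
        self.lat_i = nn.Conv2d(feat_ch, feat_ch, lateral_ksize, padding=lateral_ksize // 2)
        self.pool_ksize = pool_ksize

    def forward(self, x, e_prev, i_prev, topdown=None):
        # Excitatory drive: feedforward filtering plus recurrent lateral excitation.
        drive_e = self.ff(x) + self.lat_e(e_prev)
        # Top-down feedback modeled as a multiplicative enhancement of the excitatory drive
        # (one possible form of "modulatory" feedback; an assumption of this sketch).
        if topdown is not None:
            drive_e = drive_e * (1.0 + topdown)
        # Inhibitory drive collected laterally from excitatory activity.
        drive_i = self.lat_i(e_prev)
        # Activity normalization over a local neuronal pool (divisive normalization),
        # combined with inhibition from the paired I unit.
        pool = F.avg_pool2d(F.relu(drive_e), self.pool_ksize, stride=1,
                            padding=self.pool_ksize // 2)
        e_next = F.relu(drive_e) / (1.0 + pool + F.relu(i_prev))
        i_next = F.relu(drive_i)
        return e_next, i_next

In a full model along these lines, several such layers would be stacked and unrolled over discrete time steps, each layer receiving its top-down input from the layer above, and a policy read-out trained from the reinforcement signals would map the resulting (enhanced) activity to fixation and saccade decisions.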