Abstract
How is attention control learned? Most neuro-cognitive models avoid asking this question, focusing instead on how the prioritization and selection functions of attention affect neural and behavioral responses. Recently, we introduced ATTNet, an image-computable deep network that combines behavioral, neural, and machine-learning perspectives into a working model of the broad ATTention Network. ATTNet also has coarse biological plausibility: it is inspired by biased-competition theory, trained using deep reinforcement learning, and has a foveated retina. Through the application of reward during search, ATTNet learns to shift its attention to locations having features of a rewarded object category. We tested ATTNet in two search tasks, one for a “microwave oven” target and the other for a “clock” target, using images of kitchen scenes (Microsoft COCO) that depicted either both a microwave and a clock (target present) or neither object (target absent). This design perfectly controls the visual input; any difference in the model’s behavior could therefore only be due to target-specific applications of reward. Like the eye movements of our behavioral participants (n=60), who searched the same scenes for the same target categories, ATTNet preferentially fixated clocks but not microwaves when previously rewarded for clocks, and preferentially fixated microwaves but not clocks when previously rewarded for microwaves. Analysis of target-absent search behavior also revealed clear scene-context effects: both ATTNet and participants looked at locations along walls when searching for a clock, and at locations along countertops when searching for a microwave. We therefore suggest a computational answer to one fundamental question: the simple pursuit of reward causes not only the prioritization of space in terms of expected reward, but also the use of these reward signals to control the shifting of what the literature has come to know as spatial attention.
Acknowledgement: This work was funded by NSF IIS award 1763981.