Vision Sciences Society Annual Meeting Abstract  |   August 2010
How objects and spatial attention interact: Prefrontal-parietal interactions determine attention switching costs and their individual differences
Author Affiliations
  • Nicholas C. Foley
    Department of Cognitive and Neural Systems, Boston University
    Center for Adaptive Systems, Boston University
    Center of Excellence for Learning in Education, Science and Technology, Boston University
  • Stephen Grossberg
    Department of Cognitive and Neural Systems, Boston University
    Center for Adaptive Systems, Boston University
    Center of Excellence for Learning in Education, Science and Technology, Boston University
  • Ennio Mingolla
    Department of Cognitive and Neural Systems, Boston University
    Center for Adaptive Systems, Boston University
    Center of Excellence for Learning in Education, Science and Technology, Boston University
Journal of Vision August 2010, Vol.10, 214. doi:10.1167/10.7.214
Abstract

How are spatial and object attention coordinated to achieve rapid object learning and recognition during eye movement search? How do prefrontal priming and parietal spatial mechanisms interact to determine the reaction time costs of intra-object attention shifts, inter-object attention shifts, and shifts between visible objects and covertly cued locations, and how do these mechanisms account for individual differences (Brown and Denny, 2007; Roggeveen et al., 2009)? The current work builds on the ARTSCAN model (Fazl, Grossberg and Mingolla, 2009) of how spatial attention in the Where cortical stream coordinates stable, view-invariant object category learning in the What cortical stream under free viewing conditions. Our model explains psychological data about covert attention switching and multifocal attention without eye movements. The model predicts that ‘attentional shrouds’ (Tyler and Kontsevich, 1995) are formed when surface representations in cortical area V4 resonate with spatial attention in posterior parietal cortex (PPC) and prefrontal cortex (PFC), while shrouds compete among themselves for dominance. Winning shrouds support view-invariant object category learning, and active surface-shroud resonances support conscious surface perception. In the present model, visual inputs are transformed by simulated cortical magnification and then separated into left and right hemifield representations, consistent with both anatomical and behavioral evidence of independent attention resources in the left and right visual hemifields (Alvarez and Cavanagh, 2005). Activity levels of filled-in surface representations are modulated by attention from shroud representations in PPC and PFC, consistent with V4 neuronal data (Reynolds and Desimone, 2004). Attentive competition between multiple objects is simulated in variations of the two-object cueing paradigm of Egly, Driver and Rafal (1994).
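The competition among attentional shrouds described above can be illustrated with a minimal numerical sketch. The code below is NOT the ARTSCAN model's actual equations; it is a simplified shunting on-center off-surround network with a faster-than-linear signal function, the classic setting in which such networks exhibit winner-take-all behavior (Grossberg, 1973). All parameter values and the function name are illustrative assumptions.

```python
import numpy as np

def shroud_competition(x0, steps=4000, dt=0.01, decay=0.1, upper=1.0):
    """Winner-take-all competition among candidate attentional shrouds,
    sketched as a shunting on-center off-surround network with the
    faster-than-linear signal function f(x) = x^2:

        dx_i/dt = -decay*x_i + (upper - x_i)*f(x_i) - x_i * sum_{j != i} f(x_j)

    Illustrative simplification only; not the actual ARTSCAN equations.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        f = x ** 2                      # faster-than-linear signal
        total = f.sum()
        dx = (-decay * x                # passive decay
              + (upper - x) * f        # shunted self-excitation
              - x * (total - f))       # shunted inhibition from rival shrouds
        x += dt * dx                    # forward-Euler integration
    return x

# Three surfaces initially attract spatial attention; the strongest
# shroud wins the competition and the others are suppressed toward zero.
acts = shroud_competition([0.5, 0.45, 0.2])
winner = int(np.argmax(acts))  # index 0
```

In this sketch the winning shroud's activity is sustained near its shunting equilibrium while the losing shrouds decay, which is the dynamic the abstract invokes when it says winning shrouds support view-invariant category learning while competing shrouds are suppressed.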

Foley, N. C., Grossberg, S., & Mingolla, E. (2010). How objects and spatial attention interact: Prefrontal-parietal interactions determine attention switching costs and their individual differences [Abstract]. Journal of Vision, 10(7):214, 214a, http://www.journalofvision.org/content/10/7/214, doi:10.1167/10.7.214.
Footnotes
 Supported in part by CELEST, an NSF Science of Learning Center (SBE-0354378), the SyNAPSE program of DARPA (HR001109-03-0001, HR001-09-C-0011), the National Science Foundation (BCS-0235398), and the Office of Naval Research (N00014-01-1-0624).