October 2020
Volume 20, Issue 11
Open Access
Vision Sciences Society Annual Meeting Abstract  |   October 2020
Finding the Explanatory Limits of the eSTST Model of the Attentional Blink
Author Affiliations
  • Shekoofeh Hedayati Zafarghandi
    Pennsylvania State University
  • Brad Wyble
    Pennsylvania State University
  • Natalie Russo
    Syracuse University
Journal of Vision October 2020, Vol.20, 1315. doi:https://doi.org/10.1167/jov.20.11.1315
      © ARVO (1962-2015); The Authors (2016-present)


The Attentional Blink (AB) phenomenon (i.e., missing a second target presented 100-500 ms after a first target; Raymond et al., 1992) has been studied to understand the temporal dynamics of attentional deployment. Alongside behavioral experiments, computational models of the AB have been developed to provide computationally and neurally plausible accounts of its underlying mechanisms. However, manipulating experimental parameters (stimulus type, presentation duration, etc.) gives rise to AB patterns that are quantitatively and qualitatively distinct. How can we know the explanatory limits of a model across different data sets? To assess how well a given model can simulate a set of AB data, and to gain insight into how the model's parameters map onto the data, we designed an automated algorithm that searches parameter space for the best-fitting set of parameters. We used the episodic simultaneous type, serial token (eSTST) model of the AB (Wyble, Bowman, & Nieuwenstein, 2009) and chose three parameters that modulate the shape of a simulated AB curve. The search algorithm was based on Markov Chain Monte Carlo (MCMC) and fit both quantitative features (i.e., mean squared error) and qualitative features (i.e., features vital to the theory of the AB, such as AB depth and lag-1 sparing) of two AB data sets demonstrating different patterns. The parameter search revealed that the eSTST model was unable to account for data containing no lag-1 sparing. Furthermore, because the chosen parameters of the eSTST model correspond to distinct constructs of visual working memory, the algorithm was informative about how these parameters mapped onto each data set. This mapping, in turn, allowed for inferences about the role of these constructs in mediating the AB.
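The fitting procedure described above can be sketched in miniature. The snippet below is a hypothetical illustration, not the authors' actual code: `simulate_ab_curve` is a toy stand-in for the eSTST model (the real model is a detailed neural simulation), and the three parameters, the qualitative lag-1 sparing check, and the Metropolis acceptance temperature are all assumptions made for the sake of the example. What it does show is the general structure of an MCMC search that combines a quantitative term (mean squared error) with a qualitative penalty (mismatch in lag-1 sparing), as the abstract describes.

```python
import math
import random

def simulate_ab_curve(params, lags=range(1, 9)):
    """Toy stand-in for the eSTST model (hypothetical, for illustration).

    Maps three free parameters to a simulated accuracy-by-lag curve
    whose blink depth, recovery rate, and lag-1 sparing they modulate.
    """
    depth, recovery, sparing = params
    curve = []
    for lag in lags:
        acc = 1.0 - depth * math.exp(-(lag - 1) / max(recovery, 1e-6))
        if lag == 1:
            acc += sparing  # lag-1 sparing boosts the first lag
        curve.append(min(max(acc, 0.0), 1.0))
    return curve

def loss(params, observed):
    """Combined quantitative + qualitative fit, as in the abstract."""
    simulated = simulate_ab_curve(params)
    # Quantitative term: mean squared error between the two curves.
    mse = sum((s - o) ** 2 for s, o in zip(simulated, observed)) / len(observed)
    # Qualitative term (assumed form): penalize the fit if the model and
    # the data disagree on whether lag-1 sparing is present, operationalized
    # here as lag-1 accuracy exceeding lag-2 accuracy.
    qual = 0.0
    if (observed[0] > observed[1]) != (simulated[0] > simulated[1]):
        qual = 0.1
    return mse + qual

def mcmc_search(observed, n_iter=5000, step=0.05, temperature=0.01, seed=0):
    """Metropolis-style MCMC search over the three model parameters."""
    rng = random.Random(seed)
    current = [0.5, 2.0, 0.1]  # arbitrary starting point
    current_loss = loss(current, observed)
    best, best_loss = list(current), current_loss
    for _ in range(n_iter):
        # Gaussian random-walk proposal around the current parameters.
        proposal = [p + rng.gauss(0, step) for p in current]
        prop_loss = loss(proposal, observed)
        # Accept improvements always; accept worse fits with a probability
        # that decays with how much worse they are (Metropolis rule).
        if (prop_loss < current_loss
                or rng.random() < math.exp((current_loss - prop_loss) / temperature)):
            current, current_loss = proposal, prop_loss
            if current_loss < best_loss:
                best, best_loss = list(current), current_loss
    return best, best_loss
```

Because the qualitative penalty is part of the loss, a parameter region that minimizes MSE but predicts sparing when the data show none is disfavored, which is how a search like this can expose a model's explanatory limits: if no parameter setting removes the penalty, the model structurally cannot produce that data pattern.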

