Journal of Vision
September 2024
Volume 24, Issue 10
Open Access
Vision Sciences Society Annual Meeting Abstract  |   September 2024
Hierarchical Bayesian Augmented Hebbian Reweighting Model of Perceptual Learning
Author Affiliations & Notes
  • Zhong-Lin Lu
    NYU Shanghai
    New York University
  • Shanglin Yang
    NYU Shanghai
  • Barbara Dosher
    University of California, Irvine
  • Footnotes
    Acknowledgements  The research was supported by NEI grant EY017491.
Journal of Vision September 2024, Vol.24, 325. doi:https://doi.org/10.1167/jov.24.10.325
Abstract

The Augmented Hebbian Reweighting Model (AHRM; Petrov et al., 2005) has successfully modeled various phenomena in perceptual learning. Fitting the AHRM to data presents a significant challenge because, as a sequential learning model, it must be simulated with trial-by-trial updates to generate performance predictions, and estimation of the AHRM parameters is generally done using hierarchical grid-search methods. In this study, we introduce three modeling techniques to facilitate AHRM fitting: a Hierarchical Bayesian AHRM (HBAHRM) that incorporates population, subject, and test levels to estimate the joint posterior distribution of hyperparameters and parameters while considering covariance within and between subjects; vectorization with PyTensor to drastically speed up simulations involving multi-dimensional arrays; and pre-computation of the AHRM likelihood function. We fit the data from Petrov et al. (2005), which investigated perceptual learning in an orientation identification task with 13 subjects in two external noise orientation contexts. The HBAHRM provided significantly better fits to the data than the Bayesian Inference Procedure, which inferred AHRM parameters for each subject independently. At the population level, the HBAHRM generated fits with an R² of 0.852 and an RMSE of 0.031 (in d′ units). In a simulation study, the HBAHRM exhibited excellent parameter recovery and fit the simulated data with an R² of 0.982 and an RMSE of 0.010 (in d′ units). Additionally, the HBAHRM made excellent predictions of the performance of a new simulated observer given no data, 300 trials of data (all in one context), or 2700 trials of data (300 in one context and 2400 in the other). The HBAHRM and the new modeling techniques can be readily applied to analyze data from a wide range of perceptual learning experiments and to predict the performance of new observers with no or limited data.
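The vectorization strategy mentioned above can be illustrated with a minimal NumPy sketch. This is not the actual AHRM or its PyTensor implementation; the array sizes, learning rule, and all variable names here are assumptions chosen only to show the idea: many candidate parameter sets are pushed through the same trial sequence simultaneously, so the unavoidable trial-by-trial loop runs over broadcast array operations rather than one scalar simulation per parameter set.

```python
import numpy as np

rng = np.random.default_rng(0)

n_params = 1000   # candidate parameter sets evaluated in parallel (assumed size)
n_units = 35      # hypothetical number of representation units
n_trials = 300    # length of the simulated trial sequence

# One hypothetical learning rate per candidate parameter set, shape (n_params, 1)
eta = rng.uniform(0.001, 0.01, size=(n_params, 1))

# Decision weights for every candidate parameter set at once: (n_params, n_units)
w = np.zeros((n_params, n_units))

for t in range(n_trials):
    a = rng.normal(size=n_units)        # unit activations on this trial (shared stimulus)
    o = w @ a                           # decision variable for all parameter sets, shape (n_params,)
    f = np.where(o > 0, 1.0, -1.0)      # toy feedback / teaching signal per parameter set
    # Hebbian-style update broadcast across all parameter sets in one operation:
    # (n_params, 1) * (n_params, 1) * (1, n_units) -> (n_params, n_units)
    w += eta * f[:, None] * a[None, :]
```

The sequential dependence across trials remains (each update uses the current weights), but the cost per trial is a handful of vectorized array operations covering all 1000 candidate simulations, which is the kind of speedup the abstract attributes to PyTensor vectorization over multi-dimensional arrays.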
