August 2023
Volume 23, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2023
Pupillometric signature of implicit learning
Author Affiliations & Notes
  • Paola Binda
    University of Pisa
  • Chiara Terzo
    University of Florence
  • Marco Turi
    University of Salento
    Fondazione Stella Maris Mediterraneo
  • David C. Burr
    University of Florence
  • Footnotes
    Acknowledgements  European Research Council (ERC): European Union’s Horizon 2020 research and innovation program, grant n. 801715 (PUPILTRAITS) and n. 832813 (GenPercept). Italian Ministry of University and Research: PRIN2017 program (grant n. 2017HMH8FA and n. 2017SBCPZY), FARE-2 (grant SMILY) and PNRR THE 8.9.1
Journal of Vision August 2023, Vol.23, 5232. doi:
Paola Binda, Chiara Terzo, Marco Turi, David C. Burr; Pupillometric signature of implicit learning. Journal of Vision 2023;23(9):5232.

© ARVO (1962-2015); The Authors (2016-present)

Far from being a mere reflection of ambient light, the diameter of our eye pupils has been shown to track the contents of visual perception, the direction of attention, and the occurrence of unexpected sensory events. Here we show that changes in pupil size provide a reliable index of implicit learning, reflecting statistical structures even when they are neither consciously perceived nor within the focus of attention. We used a frequency-tagging temporal segmentation paradigm (Schwiedrzik and Sudmann, J. Neurosci. 2020), where sequences of visual images (refreshed at 2 Hz) are displayed either in random order or in pairs, with odd-trial images reliably predicting even-trial images (pairs cycling at 1 Hz). Stimuli were either two-digit numbers or arrays of lines: in the paired-images condition for arrays, the only information predicting even from odd trials was the numerosity of the array, as the arrangement and orientation of the lines were always randomly resampled on every trial. For both digits and arrays of lines, pupil diameter in N=8 observers oscillated at 1 Hz in the paired-images condition, tracking the statistical structure of the stimulus sequence. For the arrays, the oscillation emerged only when numerosity varied in steps larger than the discrimination threshold (suggesting a potential technique to measure numerosity acuity). Participants were never asked to consciously discriminate the paired sequences and were unaware of the difference between the paired and random conditions. The 1-Hz oscillation remained strong even when attention was directed to an irrelevant feature (the orientation of the lines, which was never predictive from odd to even trials). In summary, we extracted a pupillometric signature of neural prediction in paired images, providing a novel, objective, and seamless way to quantify the automatic and implicit structuring of sensory flow into meaningful units.
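The logic of the frequency-tagging analysis can be illustrated with a minimal sketch: if pairs cycle at 1 Hz, the paired condition should carry excess spectral power at 1 Hz in the pupil trace relative to the random condition. The sketch below is not the authors' analysis pipeline; the sampling rate, recording duration, and synthetic traces are illustrative assumptions, and the power estimate is a plain discrete Fourier transform evaluated at the tag frequency.

```python
import numpy as np

def power_at_frequency(trace, fs, freq):
    """Spectral power of a (mean-subtracted) signal at the bin nearest `freq`."""
    n = len(trace)
    spectrum = np.fft.rfft(trace - np.mean(trace))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    idx = np.argmin(np.abs(freqs - freq))  # bin closest to the tag frequency
    return np.abs(spectrum[idx]) ** 2 / n

# Synthetic pupil traces (illustrative values, not the study's data):
fs = 100.0                       # assumed eye-tracker sampling rate (Hz)
t = np.arange(0, 60, 1.0 / fs)   # 60 s of recording -> 1/60 Hz resolution
rng = np.random.default_rng(0)

# Paired condition: a 1 Hz oscillation buried in noise.
paired = 0.3 * np.sin(2 * np.pi * 1.0 * t) + rng.normal(0, 0.5, t.size)
# Random condition: noise only, no 1 Hz structure.
random_seq = rng.normal(0, 0.5, t.size)

print(power_at_frequency(paired, fs, 1.0) >
      power_at_frequency(random_seq, fs, 1.0))  # → True
```

Because the recording length is an integer number of 1-s cycles, the 1 Hz tag falls exactly on a Fourier bin, so no windowing is needed for this illustration; real pupil data would typically also be low-pass filtered and blink-corrected first.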

