August 2023
Volume 23, Issue 9
Open Access
Vision Sciences Society Annual Meeting Abstract  |   August 2023
How many perceptual categories do observers experience during visual multistability?
Author Affiliations & Notes
  • Jan Skerswetat
    Northeastern University, USA
  • Peter J. Bex
    Northeastern University, USA
  • Footnotes
    Acknowledgements  Supported by NIH grant R01EY029713. InFoRM (Indicate-Follow-Replay Me) is disclosed as a patent held by Northeastern University, Boston, USA. Both authors are founders and shareholders of the company PerZeption Inc. (USA).
Journal of Vision August 2023, Vol.23, 4822. doi:https://doi.org/10.1167/jov.23.9.4822
Abstract

Multistable perception, e.g. binocular rivalry, is a phenomenon widely used in visual neuroscience. Classic methods use experimenter-determined perceptual categories to track when and for how long principal categories (exclusive and mixed percepts) are seen. Unfortunately, these methods bias observers toward experimenter-defined categories that may not correspond with an observer’s experience; they also do not generate continuous data and do not record gradual changes within mixed percepts. We recently published the InFoRM (Indicate-Follow-Replay Me) method, which measures multistability near-continuously, captures gradual between- and within-percept changes, and generates introspection maps. Here, we introduce an a priori method to estimate how many perceptual states people perceive during physical and perceptual rivalry. Twenty-eight participants performed eight 1-min trials for each of three contrast conditions while viewing obliquely oriented sinusoidal gratings whose spatial composition changed over time either physically or perceptually. Participants were trained to indicate continuously, via joystick tilting, six perceptual categories chosen based on reports in the literature. We applied k-means unsupervised machine learning to partition clusters of 2D joystick data for each trial (3600 data points/trial), participant, and contrast condition, and compared the results between perceptual-rivalry and physical-replay data. Averaged across trials, participants, and conditions, six clusters explained 96.41% ± 0.29σ and 96.42% ± 0.34σ of the data for physical and perceptual rivalry, respectively. We then performed silhouette value analysis and, for each observer, fitted polynomial functions to silhouette value as a function of cluster number to determine the minimum cluster separation. Averaged across trials, observers, and conditions, 9 clusters (range across observers: 2-10) yielded minimum silhouette values for both perceptual rivalry and physical replay, with an intra-observer agreement of 12/28.
InFoRM’s novel approach allows autonomous inter- and intra-individual classification and counting of perceptual clusters during multistable perception.
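The cluster-counting step described in the abstract (k-means partitioning followed by silhouette analysis across candidate cluster numbers) can be sketched as below. This is an illustrative example on synthetic 2D data using scikit-learn, not the authors' actual pipeline; the data, parameter choices, and variable names are all hypothetical, and it follows the standard convention that a higher mean silhouette value indicates better cluster separation.

```python
# Illustrative sketch (not the authors' code): estimate how many clusters
# best describe 2D joystick-like data via k-means + silhouette analysis.
# Synthetic data below stands in for the 3600 samples/trial in the study.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Three well-separated synthetic "percept" clusters in joystick (x, y) space.
centers = np.array([[-1.0, -1.0], [1.0, -1.0], [0.0, 1.0]])
data = np.vstack([c + 0.1 * rng.standard_normal((100, 2)) for c in centers])

# Fit k-means for a range of candidate cluster counts and score each
# partition by its mean silhouette value.
scores = {}
for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(data)
    scores[k] = silhouette_score(data, labels)

# The candidate with the highest mean silhouette value is the best-separated
# partition; here it should recover the three generating clusters.
best_k = max(scores, key=scores.get)
print(best_k)
```

In practice this per-observer, per-condition loop would be run over the recorded joystick traces, and the resulting silhouette-versus-k curves fitted (e.g. with polynomials, as in the abstract) before comparing rivalry and replay conditions.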
