Vision Sciences Society Annual Meeting Abstract | August 2023
Using convolutional neural networks to relate external sensory features to internal decisional evidence
Author Affiliations & Notes
  • Marshall Green
    Georgia Institute of Technology
  • Mingjia Hu
    Indiana University
  • Rachel Denison
    Boston University
  • Dobromir Rahnev
    Georgia Institute of Technology
  • Footnotes
    Acknowledgements: This work was supported by the National Institutes of Health (awards: R01MH119189 and R21MH122825) and the Office of Naval Research (award: N00014-20-1-2622).
Journal of Vision August 2023, Vol. 23, 5685. https://doi.org/10.1167/jov.23.9.5685

Marshall Green, Mingjia Hu, Rachel Denison, Dobromir Rahnev; Using convolutional neural networks to relate external sensory features to internal decisional evidence. Journal of Vision 2023;23(9):5685. https://doi.org/10.1167/jov.23.9.5685.

© ARVO (1962-2015); The Authors (2016-present)

Abstract

Perceptual decision-making is the process of making a judgment about the identity of a stimulus based on the available sensory information. All theories of perceptual decision-making postulate that external sensory information is transformed into the internal evidence that is used to guide behavior. However, the nature of this external-to-internal transformation remains unknown. In two experiments, we examined how a particular external stimulus feature – orientation – is transformed into internal evidence. In Experiment 1, subjects (N=12) judged whether Gabor patches of different orientations were tilted clockwise or counterclockwise from 45 degrees. The results demonstrated that increasing the orientation offset of a high-contrast Gabor in fine-scale increments from 0.4 to 2.4 degrees resulted in a linear increase in sensitivity (d’), suggesting a linear transformation from orientation to internal evidence strength. However, in Experiment 2, we altered the task such that subjects (N=11) judged the orientation offset of a noisy, low-contrast Gabor in coarse-scale increments from 7 to 42 degrees away from vertical. In this experiment, we found a very different relationship, with orientation offset having little effect on sensitivity, suggesting a highly non-linear relationship between orientation and internal evidence strength. These behavioral results show that a given sensory feature may not have a one-to-one mapping with the internal representation of evidence across different tasks. We then investigated whether a convolutional neural network (CNN) can reproduce our external-to-internal mapping results. We found that a CNN trained on orientation discrimination reproduced the observed pattern of results – fine-scale increments in orientation offset were linearly transformed into internal evidence, but coarse-scale increments in orientation offset had little influence on internal evidence. These results begin to reveal how external sensory information is mapped onto internal decisional evidence and demonstrate that CNNs can serve as a theory-testing platform for this critical external-to-internal transformation.
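
To make the link between internal evidence strength and sensitivity concrete, here is a minimal signal detection theory sketch. It is not taken from the study: the Gaussian evidence model, the proportionality between evidence and orientation offset, and all parameter values are illustrative assumptions. It shows how a hypothesized internal evidence strength that grows linearly with offset yields the kind of linear increase in d’ reported for Experiment 1.

```python
# Illustrative simulation only: the linear evidence model, noise level, and
# trial counts are assumptions, not parameters from the study.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def dprime(hit_rate, fa_rate, n_trials):
    """Sensitivity d' = z(hit rate) - z(false-alarm rate), with rates
    clipped away from 0 and 1 to keep the z-transform finite."""
    lo, hi = 1 / (2 * n_trials), 1 - 1 / (2 * n_trials)
    return norm.ppf(np.clip(hit_rate, lo, hi)) - norm.ppf(np.clip(fa_rate, lo, hi))

offsets = np.array([0.4, 0.8, 1.2, 1.6, 2.0, 2.4])  # degrees from 45 (Experiment 1 range)
n_trials = 2000

for offset in offsets:
    # Hypothetical internal evidence: Gaussian with mean proportional to the offset.
    evidence_cw = rng.normal(+offset, 1.0, n_trials)   # clockwise trials
    evidence_ccw = rng.normal(-offset, 1.0, n_trials)  # counterclockwise trials
    hit_rate = np.mean(evidence_cw > 0)   # "clockwise" responses on clockwise trials
    fa_rate = np.mean(evidence_ccw > 0)   # "clockwise" responses on counterclockwise trials
    print(f"offset {offset:.1f} deg -> d' = {dprime(hit_rate, fa_rate, n_trials):.2f}")
```

Under these assumptions d’ grows roughly in proportion to the orientation offset. The Experiment 2 pattern described in the abstract would instead correspond to a transformation in which internal evidence strength saturates, so d’ stays roughly flat as the offset increases.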
