Vision Sciences Society Annual Meeting Abstract  |   July 2013
Publication and verification bias in vision science
Author Affiliations
  • Gregory Francis
    Psychological Sciences, Purdue University
Journal of Vision July 2013, Vol.13, 441. doi:https://doi.org/10.1167/13.9.441

      Gregory Francis; Publication and verification bias in vision science. Journal of Vision 2013;13(9):441. doi: https://doi.org/10.1167/13.9.441.

      © ARVO (1962-2015); The Authors (2016-present)

With cases of fraud, unbelievable discoveries (such as people being influenced by future events), key experimental findings that fail to replicate, and evidence that researchers use questionable research practices to produce significant findings, psychological science faces serious questions about whether the field can be trusted to produce valid scientific work. In an effort to reinstate the integrity of reported findings in psychological science, many researchers emphasize the importance of empirical replication. Such an approach relies on the belief that true phenomena can be successfully demonstrated in well-designed experiments. Indeed, the ability to reliably reproduce an experimental outcome is widely considered the gold standard of scientific investigation. Unfortunately, this view is incorrect, and misunderstandings about replication contribute to the conflicts in psychological science. Because experimental effects in psychology are measured by statistics, there should almost always be some variability in the reported outcomes. An absence of such variability actually indicates that experimental replications are invalid, perhaps because of a publication bias that suppresses contrary findings or because of a verification bias that uses improper methods to favor a desired experimental result. Such experimental results should be considered "too good to be true." Vision science is not immune to these issues, and overly successful replication rates are easily identified within the field; some representative examples will be presented. Although not a cure-all for scientific investigations, a return to the principles of psychophysical methods would greatly alleviate these kinds of biases and help ensure confidence in the validity of scientific reports.
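The "too good to be true" logic above can be sketched numerically. The abstract does not give the author's exact test, but a simplified illustration (the specific power values and the helper function below are assumptions for demonstration) is that if each experiment in a set has statistical power below 1, the probability that *every* experiment reaches significance is the product of the powers, which shrinks rapidly as the set grows:

```python
from math import prod

def excess_success_probability(powers):
    """Probability that all experiments in a set reach significance,
    assuming independence, given each experiment's estimated power.
    A very small value suggests the unbroken run of reported successes
    is 'too good to be true' (publication or verification bias)."""
    return prod(powers)

# Hypothetical example: ten reported experiments, each with an
# estimated power of 0.5. The chance that all ten succeed is
# 0.5 ** 10, i.e. less than one in a thousand.
p_all = excess_success_probability([0.5] * 10)
```

Under these illustrative numbers, a paper reporting ten out of ten significant outcomes with such modest power would itself be an improbable event, which is the sense in which an absence of failed replications signals bias rather than robustness.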

Meeting abstract presented at VSS 2013

