Preregistration (science)

From HandWiki

Preregistration is the practice of registering the hypotheses, methods, and/or analyses of a scientific study before it is conducted.[1][2] Clinical trial registration is similar, although it may not require the registration of a study's analysis protocol. Registered reports go further, adding the peer review and in-principle acceptance of a study protocol prior to data collection.[3] Preregistration assists in the identification and/or reduction of a variety of potentially problematic research practices, including p-hacking, publication bias, data dredging, inappropriate forms of post hoc analysis, and (relatedly) HARKing. It has recently gained prominence in the open science community as a potential solution to some of the issues thought to underlie the replication crisis.[1] However, critics have argued that it may not be necessary when other open science practices, such as registered reports, are implemented.[4]

Types

Standard preregistration

In the standard preregistration format, researchers prepare a research protocol document prior to conducting their research. Ideally, this document indicates the research hypotheses, sampling procedure, sample size, research design, testing conditions, stimuli, measures, data coding and aggregation method, criteria for data exclusions, and statistical analyses, including potential variations on those analyses. This preregistration document is then posted on a publicly available website such as the Open Science Framework or AsPredicted. The preregistered study is then conducted, and a report of the study and its results is submitted for publication together with access to the (anonymised) preregistration document. This preregistration approach allows peer reviewers and subsequent readers to cross-reference the preregistration document with the published research article in order to identify (a) any “exploratory” tests that were not included in the preregistration document and (b) any suppressed tests that were included in the preregistered protocol but excluded from the final research report.

Registered reports

The registered report format requires authors to submit a description of the study methods and analyses prior to data collection. Once the method and analysis plan is vetted through Stage 1 peer review, publication of the findings is provisionally guaranteed. The associated study is then conducted, and the research report is submitted to Stage 2 peer review. Stage 2 peer review confirms that the actual research methods are consistent with the preregistered protocol and that quality thresholds are met (e.g., manipulation checks confirm the validity of the experimental manipulation). Studies that pass Stage 2 peer review are then published regardless of whether the results are confirming or disconfirming, significant or nonsignificant.

Hence, both preregistration and registered reports involve creating a time-stamped non-modifiable public record of the study and analysis plan before the data is collected. However, the study and analysis plan is only subjected to a formal peer review before data collection in the case of registered reports.[citation needed]

Specialised preregistration

Preregistration can be used in relation to a variety of different research designs and methods, including:

  • Quantitative research in psychology (Bosnjak et al., 2021)[5]
  • Qualitative research (Haven & Van Grootel, 2019)[6]
  • Preexisting data (Mertens & Krypotos, 2019; Weston et al., 2019; van den Akker et al., 2021)[7][8][9]
  • Single case designs (Johnson & Cook, 2019)[10]
  • Electroencephalogram research (Paul et al., 2021)[11]
  • Experience sampling (Kirtley et al., 2019)[12]
  • Exploratory research (Dirnagl, 2020)[13]
  • Animal research (Bert et al., 2019)[14]

Clinical trial registration

Clinical trial registration is the practice of documenting clinical trials before they are performed in a clinical trials registry so as to combat publication bias and selective reporting.[15] Registration of clinical trials is required in some countries and is increasingly being standardized.[16] Some top medical journals will only publish the results of trials that have been pre-registered.[17]

A clinical trials registry is a platform which catalogs registered clinical trials. ClinicalTrials.gov, run by the United States National Library of Medicine (NLM), was the first online registry for clinical trials, and remains the largest and most widely used. In addition to combating bias, clinical trial registries serve to increase transparency and access to clinical trials for the public. Clinical trials registries are often searchable (e.g. by disease/indication, drug, location, etc.). Trials are registered by the pharmaceutical, biotech or medical device company (sponsor), by the hospital or foundation which is sponsoring the study, or by another organization, such as a contract research organization (CRO) which is running the study.

There has been a push from governments and international organizations, especially since 2005, to make clinical trial information more widely available and to standardize registries and processes of registering. The World Health Organization is working toward "achieving consensus on both the minimal and the optimal operating standards for trial registration".[18]

Creation and development

For many years, scientists and others have worried about reporting biases such that negative or null results from initiated clinical trials may be less likely to be published than positive results, thus skewing the literature and our understanding of how well interventions work.[19] This worry has been international and written about for over 50 years.[20] One of the proposals to address this potential bias was a comprehensive register of initiated clinical trials that would inform the public which trials had been started.[21] The ethical issues seemed to interest the public most: trialists (including those with potential commercial gain) benefited from the people who enrolled in their trials, but were not required to "give back" by telling the public what they had learned.

Systematic reviewers, who summarize what is known from clinical trials, were particularly concerned by this double standard. If the published literature is skewed, then the results of a systematic review are also likely to be skewed, possibly favoring the test intervention even though the accumulated data, were they all made public, would not support this conclusion.

ClinicalTrials.gov was originally developed largely as a result of breast cancer consumer lobbying, which led to authorizing language in the FDA Modernization Act of 1997 (Food and Drug Administration Modernization Act of 1997. Pub L No. 105-115, §113 Stat 2296), but the law provided neither funding nor a mechanism of enforcement. In addition, the law required that ClinicalTrials.gov only include trials of serious and life-threatening diseases.

Then, two events occurred in 2004 that increased public awareness of the problems of reporting bias. First, the then-New York State Attorney General Eliot Spitzer sued GlaxoSmithKline (GSK) because they had failed to reveal results from trials showing that certain antidepressants might be harmful.[22]

Shortly thereafter, the International Committee of Medical Journal Editors (ICMJE) announced that their journals would not publish reports of trials unless they had been registered. The ICMJE action was probably the most important motivator for trial registration, as investigators wanted to preserve the option of publishing their results in prestigious journals.

In 2007, the Food and Drug Administration Amendments Act of 2007 (FDAAA; Public Law No. 110-85) clarified the requirements for registration and also set penalties for non-compliance.

International participation

The International Committee of Medical Journal Editors (ICMJE) decided that from July 1, 2005, no trials would be considered for publication unless they were included in a clinical trials registry.[23][24] The World Health Organization began the push for clinical trial registration with the initiation of the International Clinical Trials Registry Platform. There has also been action from the pharmaceutical industry, which released plans to make clinical trial data more transparent and publicly available. The revised Declaration of Helsinki, released in October 2008, states that "Every clinical trial must be registered in a publicly accessible database before recruitment of the first subject."[25][26]

The World Health Organization maintains an international registry portal at http://apps.who.int/trialsearch/.[27] WHO states that the international registry's mission is "to ensure that a complete view of research is accessible to all those involved in health care decision making. This will improve research transparency and will ultimately strengthen the validity and value of the scientific evidence base."[28]

Since 2007, the International Committee of Medical Journal Editors (ICMJE) has accepted all primary registries in the WHO network in addition to ClinicalTrials.gov. Since 2014, registration in registries other than ClinicalTrials.gov has increased across study designs.[29]

Reporting compliance

Various studies have measured the extent to which various trials are in compliance with the reporting standards of their registry.[30][31][32][33][34]

Overview of clinical trial registries

Worldwide, there is a growing number of registries. A 2013 study[35] identified the following top five registries (numbers updated as of August 2013):

1. ClinicalTrials.gov 150,551
2. EU register 21,060
3. Japan registries network (JPRN) 12,728
4. ISRCTN 11,794
5. Australia and New Zealand (ANZCTR) 8,216

Overview of preclinical study registries

Similar to clinical research, preregistration can help to improve the transparency and quality of research data in preclinical research.[36][37] In contrast to clinical research, where preregistration is largely mandatory, it is still new in preclinical research. A large part of preclinical and basic biomedical research relies on animal experiments. The non-publication of results from animal experiments not only distorts the state of research by reinforcing publication bias but also represents an ethical issue.[38][39] Preregistration is discussed as a measure that could counteract this problem. The following registries are suited to the preregistration of preclinical studies:

1. Animalstudyregistry.org
2. AsPredicted
3. OSF Registry
4. Preclinicaltrials.eu

Journal support

Over 200 journals offer a registered reports option (Centre for Open Science, 2019),[40] and the number of journals that are adopting registered reports is approximately doubling each year (Chambers et al., 2019).[41]

Psychological Science has encouraged the preregistration of studies and the reporting of effect sizes and confidence intervals.[42] The editor-in-chief also noted that the editorial staff will ask for replication of surprising findings from small-sample studies before allowing the manuscripts to be published.

Nature Human Behaviour has adopted the registered report format, as it “shift[s] the emphasis from the results of research to the questions that guide the research and the methods used to answer them”.[43]

European Journal of Personality defines this format: “In a registered report, authors create a study proposal that includes theoretical and empirical background, research questions/hypotheses, and pilot data (if available). Upon submission, this proposal will then be reviewed prior to data collection, and if accepted, the paper resulting from this peer-reviewed procedure will be published, regardless of the study outcomes.”[44]

Only a very small proportion of academic journals in psychology and neuroscience explicitly state in their aims and scope or instructions to authors that they welcome submissions of replication studies.[45][46] This does little to encourage the reporting, or even the attempting, of replication studies.

Overall, the number of participating journals is increasing, as indicated by the Center for Open Science, which maintains a list of journals encouraging the submission of registered reports.[47]

Rationale

Several articles have outlined the rationale for preregistration (e.g., Lakens, 2019; Nosek et al., 2018; Wagenmakers et al., 2012).[48][49][1] As Rubin (2020, Table 1) summarized, preregistration helps to identify and/or curtail the following issues:

  1. Poorly planned hypotheses and tests
  2. HARKing: undisclosed hypothesizing after the results are known
  3. The suppression of a priori hypotheses that yield null or disconfirming results
  4. Deviations from planned analyses
  5. Lack of clarity between confirmatory and exploratory analyses
  6. Undisclosed multiple testing
  7. Forking paths, in which researchers make decisions about which tests to conduct based on information from their sample
  8. p-hacking: continuing data analysis until a significant p value is obtained
  9. Optional stopping: repeating the same test at different stages of data collection until a significant result is obtained
  10. Invalid use of p values, because p values lose their meaning in exploratory analyses
  11. Researchers’ biases, including the confirmation bias and hindsight bias
  12. Selective reporting of results: “cherry-picking” specific supportive results and suppressing non-supportive results
  13. Unclear test severity, preventing the identification of hypotheses that have a low probability of being confirmed when they are false
  14. Unreported null findings
  15. Publication bias: unpublished null findings, resulting in the file drawer problem
  16. Potentially low replicability, ostensibly due to the use of questionable research practices (e.g., HARKing, p-hacking, optional stopping)
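
Some of the statistical issues above, such as optional stopping (item 9), can be made concrete with a small simulation. The sketch below is illustrative only: the sample sizes, peeking schedule, and null effect are arbitrary choices, and the z-test assumes a known standard deviation for simplicity. It shows that repeatedly testing accumulating null data and stopping at the first significant result inflates the false-positive rate well above the nominal 5%.

```python
import random
from statistics import NormalDist, fmean

def z_test_p(sample):
    # two-sided one-sample z-test of mean = 0, with SD assumed known (= 1)
    z = fmean(sample) * len(sample) ** 0.5
    return 2 * (1 - NormalDist().cdf(abs(z)))

def peeking_trial(rng, start_n=20, step=10, max_n=100, alpha=0.05):
    # data are drawn under the null (true mean 0); the "researcher"
    # tests after every batch and stops as soon as p < alpha
    data = [rng.gauss(0, 1) for _ in range(start_n)]
    while True:
        if z_test_p(data) < alpha:
            return True                     # a false positive
        if len(data) >= max_n:
            return False
        data.extend(rng.gauss(0, 1) for _ in range(step))

rng = random.Random(1)
n_trials = 2000
rate = sum(peeking_trial(rng) for _ in range(n_trials)) / n_trials
print(f"false-positive rate with peeking: {rate:.3f}")  # well above the nominal 0.05
```

Allowing more peeks drives the rate higher still, which is why preregistered stopping rules (or formally corrected sequential designs) matter.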

Identifying issues such as these via preregistration helps to improve "the interpretability and credibility of research findings" (Nosek et al., 2018, p. 2605).[1] However, Rubin (2020) argued that only some of these issues are problematic and only under some conditions.[4] He also argued that, when they are problematic, preregistration is not necessary to identify these issues. Instead, they can be identified via (a) clear rationales for current hypotheses and analytical approaches, (b) public access to research data, materials, and code, and (c) demonstrations of the robustness of research conclusions to alternative interpretations and analytical approaches.

Criticisms

Proponents of preregistration have argued that it is "a method to increase the credibility of published results" (Nosek & Lakens, 2014), that it "makes your science better by increasing the credibility of your results" (Centre for Open Science), and that it "improves the interpretability and credibility of research findings" (Nosek et al., 2018, p. 2605).[1][50] This argument assumes that non-preregistered exploratory analyses are less "credible" and/or "interpretable" than preregistered confirmatory analyses because they may involve "circular reasoning" in which post hoc hypotheses are based on the observed data (Nosek et al., 2018, p. 2600).[1] However, critics have argued that preregistration is not necessary to identify circular reasoning during exploratory analyses (Rubin, 2020). Circular reasoning can be identified by analysing the reasoning per se without needing to know whether that reasoning was preregistered. Critics have also noted that the idea that preregistration improves research credibility may deter researchers from undertaking non-preregistered exploratory analyses (Coffman & Niederle, 2015; see also Collins et al., 2021, Study 1).[51][52] In response, preregistration advocates have stressed that exploratory analyses are permitted in preregistered studies, and that the results of these analyses retain some value vis-a-vis hypothesis generation rather than hypothesis testing. 
Preregistration merely makes the distinction between confirmatory and exploratory research clearer (Nosek et al., 2018; Nosek & Lakens, 2014; Wagenmakers et al., 2012).[1][48][50] Hence, although preregistration is supposed to reduce researcher degrees of freedom during the data analysis stage, it is also supposed to be “a plan, not a prison” (Dehaven, 2017).[53] However, critics counterargue that if preregistration is only supposed to be a plan, not a prison, then researchers should feel free to deviate from that plan and undertake exploratory analyses without fearing accusations of low research credibility due to circular reasoning or inappropriate research practices such as p-hacking and unreported multiple testing that inflates familywise error rates (e.g., Navarro, 2020).[54] Again, they have pointed out that preregistration is not necessary to address such concerns. For example, concerns about p-hacking and unreported multiple testing can be addressed if researchers engage in other open science practices, such as (a) open data and research materials and (b) robustness or multiverse analyses (Rubin, 2020; Steegen et al., 2016; for several other approaches, see Srivastava, 2018).[4][55][56] Finally, and more fundamentally, critics have argued that the distinction between confirmatory and exploratory analyses is unclear and/or irrelevant (Devezer et al., 2020; Rubin, 2020; Szollosi & Donkin, 2019),[57][4][58] and that concerns about inflated familywise error rates are unjustified when those error rates refer to abstract, atheoretical studywise hypotheses that are not being tested (Rubin, 2020, 2021; Szollosi et al., 2020).[4][59][60]
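
The robustness and multiverse analyses that critics propose as an alternative can also be sketched briefly. The example below is a hypothetical illustration, not any published analysis: it generates simulated data and reports the same test under several defensible outlier-exclusion cutoffs, making this researcher degree of freedom transparent rather than hiding a single favorable choice.

```python
import random
from statistics import NormalDist, fmean

def two_sided_p(sample):
    # one-sample z-test of mean = 0 (SD assumed known, = 1)
    z = fmean(sample) * len(sample) ** 0.5
    return 2 * (1 - NormalDist().cdf(abs(z)))

rng = random.Random(7)
# hypothetical standardized scores with a small true effect (mean 0.1)
data = [rng.gauss(0.1, 1) for _ in range(150)]

# one researcher degree of freedom: the outlier-exclusion cutoff.
# a multiverse analysis reports the result under every defensible choice.
results = {}
for cutoff in (None, 3.0, 2.5, 2.0, 1.5):
    kept = data if cutoff is None else [x for x in data if abs(x) <= cutoff]
    results[cutoff] = two_sided_p(kept)
    print(f"cutoff={cutoff}: n={len(kept)}, p={results[cutoff]:.3f}")
```

Reporting the whole grid of p-values, rather than one cherry-picked cell, is what distinguishes a multiverse analysis from selective reporting.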

There are also concerns about the practical implementation of preregistration. Many preregistered protocols leave plenty of room for p-hacking (Bakker et al., 2020; Heirene et al., 2021; Ikeda et al., 2019; Singh et al., 2021; Van den Akker et al., 2023),[61][62][63][64][65] and researchers rarely follow the exact research methods and analyses that they preregister (Abrams et al., 2020; Claesen et al., 2019; Heirene et al., 2021; see also Boghdadly et al., 2018; Singh et al., 2021; Sun et al., 2019).[66][67][68][69][63][64] For example, preregistered studies appear to be of higher quality than non-preregistered studies only in that they more often include a power analysis and have larger sample sizes; otherwise, they do not seem to prevent p-hacking and HARKing, as both the proportion of positive results and the effect sizes are similar between preregistered and non-preregistered studies (Van den Akker et al., 2023).[65] In addition, a survey of 27 preregistered studies found that researchers deviated from their preregistered plans in every case (Claesen et al., 2019).[67] The most frequent deviations concerned the planned sample size, exclusion criteria, and statistical model. Hence, what were intended as preregistered confirmatory tests ended up as unplanned exploratory tests. Again, preregistration advocates argue that deviations from preregistered plans are acceptable as long as they are reported transparently and justified. They also point out that even vague preregistrations help to reduce researcher degrees of freedom and make any residual flexibility transparent (Simmons et al., 2021, p. 180).[70] However, critics have argued that it is not useful to identify or justify deviations from preregistered plans when those plans do not reflect high-quality theory and research practice.
As Rubin (2020) explained, “we should be more interested in the rationale for the current method and analyses than in the rationale for historical changes that have led up to the current method and analyses” (pp. 378–379).[4] In addition, preregistering a study requires careful deliberation about the study's hypotheses, research design, and statistical analyses, which depends on the use of preregistration templates that provide detailed guidance on what to include and why (Bowman et al., 2016; Haven & Van Grootel, 2019; Van den Akker et al., 2021).[71][72][73] Many preregistration templates stress the importance of a power analysis but not the importance of justifying why the chosen methodology was used.

Finally, some commentators have argued that, under some circumstances, preregistration may actually harm science by providing a false sense of credibility to research studies and analyses (Devezer et al., 2020; McPhetres, 2020; Pham & Oh, 2020; Szollosi et al., 2020).[57][74][59][75] Consistent with this view, there is some evidence that researchers view registered reports as being more credible than standard reports on a range of dimensions (Soderberg et al., 2020; see also Field et al., 2020 for inconclusive evidence),[76][77] although it is unclear whether this represents a "false" sense of credibility due to pre-existing positive community attitudes about preregistration or a genuine causal effect of registered reports on quality of research.

See also

References

  1. 1.0 1.1 1.2 1.3 1.4 1.5 1.6 Nosek, B. A.; Ebersole, C. R.; DeHaven, A. C.; Mellor, D. T. (2018). "The preregistration revolution". Proceedings of the National Academy of Sciences 115 (11): 2600–2606. doi:10.1073/pnas.1708274114. PMID 29531091. Bibcode2018PNAS..115.2600N. 
  2. Parsons, Sam; Azevedo, Flávio; Elsherif, Mahmoud M.; Guay, Samuel; Shahim, Owen N.; Govaart, Gisela H.; Norris, Emma; O’Mahony, Aoife et al. (2022-02-21). "A community-sourced glossary of open scholarship terms" (in en). Nature Human Behaviour 6 (3): 312–318. doi:10.1038/s41562-021-01269-4. ISSN 2397-3374. PMID 35190714. https://www.nature.com/articles/s41562-021-01269-4. 
  3. "Registered Replication Reports". Association for Psychological Science. http://www.psychologicalscience.org/index.php/replication. 
  4. 4.0 4.1 4.2 4.3 4.4 4.5 Rubin, M. (2020). "Does preregistration improve the credibility of research findings?". The Quantitative Methods for Psychology 16 (4): 376–390. doi:10.20982/tqmp.16.4.p376. 
  5. Bosnjak, M.; Fiebach, C. J.; Mellor, D.; Mueller, S.; O’Connor, D. B.; Oswald, F. L.; Sokol-Chang, R. I. (2021). "A template for preregistration of quantitative research in psychology: Report of the Joint Psychological Societies Preregistration Task Force". The American Psychologist 77 (4): 602–615. doi:10.31234/osf.io/d7m5r. PMID 34807636. https://doi.org/10.31234/osf.io/d7m5r. 
  6. Haven, T. L.; Van Grootel, D. L. (2019). "Preregistering qualitative research.". Accountability in Research 26 (3): 229–244. doi:10.1080/08989621.2019.1580147. PMID 30741570. 
  7. Mertens, G.; Krypotos, A. M. (2019). "Preregistration of analyses of preexisting data.". Psychologica Belgica 59 (1): 338–352. doi:10.5334/pb.493. PMID 31497308. 
  8. Weston, S. J.; Ritchie, S. J.; Rohrer, J. M. (2019). "Recommendations for increasing the transparency of analysis of preexisting data sets". Advances in Methods and Practices in Psychological Science 2 (3): 214–227. doi:10.1177/2515245919848684. PMID 32190814. 
  9. Akker, Olmo R. van den; Weston, Sara; Campbell, Lorne; Chopik, Bill; Damian, Rodica; Davis-Kean, Pamela; Hall, Andrew; Kosie, Jessica et al. (2021-11-09). "Preregistration of secondary data analysis: A template and tutorial" (in en). Meta-Psychology 5. doi:10.15626/MP.2020.2625. ISSN 2003-2714. https://open.lnu.se/index.php/metapsychology/article/view/2625. 
  10. Johnson, A. H.; Cook, B. G. (2019). "Preregistration in single-case design research.". Exceptional Children 86 (1): 95–112. doi:10.1177/0014402919868529. 
  11. Paul, M.; Govaart, G. H.; Schettino, A. (2021). "Making ERP research more transparent: Guidelines for preregistration". International Journal of Psychophysiology 164: 52–63. doi:10.31234/osf.io/4tgve. PMID 33676957. 
  12. Kirtley, O. J.; Lafit, G.; Achterhof, R.; Hiekkaranta, A. P.; Myin-Germeys, I. (2019). "Making the black box transparent: A template and tutorial for (pre-)registration of studies using experience sampling methods (ESM)". PsyArXiv. doi:10.31234/osf.io/seyq7. 
  13. Dirnagl, U. (2020). "Preregistration of exploratory research: Learning from the golden age of discovery.". PLOS Biol 18 (3): e3000690. doi:10.1371/journal.pbio.3000690. PMID 32214315. 
  14. Bert, Bettina; Heinl, Céline; Chmielewska, Justyna; Schwarz, Franziska; Grune, Barbara; Hensel, Andreas; Greiner, Matthias; Schönfelder, Gilbert (2019-10-15). "Refining animal research: The Animal Study Registry" (in en). PLOS Biology 17 (10): e3000463. doi:10.1371/journal.pbio.3000463. ISSN 1545-7885. PMID 31613875. 
  15. "International Clinical Trials Registry Platform (ICTRP)". https://www.who.int/ictrp/trial_reg/en/. 
  16. "WHO | Working Group on Best Practice for Clinical Trials Registers (BPG)". https://www.who.int/ictrp/network/bpg/en/. 
  17. Barrett, Stephen (13 September 2004). "Major Journals Press for Clinical Trial Registration". https://www.quackwatch.org/06ResearchProjects/journals.html. 
  18. "WHO - Working Group on Best Practice for Clinical Trials Registers (BPG)". https://www.who.int/ictrp/network/bpg/en/index.html. 
  19. Dickersin, K; Rennie, D (2003). "Registering clinical trials". JAMA 290 (4): 516–523. doi:10.1001/jama.290.4.516. PMID 12876095. 
  20. Sterling, TD (1959). "Publication decisions and their possible effects on inferences drawn from tests of significances – or vice versa". J Am Stat Assoc 54 (285): 30–34. doi:10.1080/01621459.1959.10501497. 
  21. International Collaborative Group on Clinical Trial Registries (1993). "Position paper and consensus recommendations on clinical trial registries. Ad Hoc Working Party of the International Collaborative Group on Clinical Trials Registries". Clin Trials Metaanal 28 (4–5): 255–266. PMID 10146333. 
  22. Dickersin, K; Rennie, D (2012). "The evolution of trial registries and their use to assess the clinical trial enterprise". JAMA 307 (17): 1861–4. doi:10.1001/jama.2012.4230. PMID 22550202. 
  23. SANCTR. "SANCTR > Home". http://www.sanctr.gov.za/. 
  24. "ICMJE: Frequently Asked Questions about Clinical Trials Registration". http://www.icmje.org/faq_clinical.html. 
  25. "WMA Declaration of Helsinki - Ethical Principles for Medical Research Involving Human Subjects". http://www.wma.net/en/30publications/10policies/b3/index.html. 
  26. "ANZCTR". http://www.anzctr.org.au/Default.aspx. 
  27. Gülmezoglu, AM; Pang, T; Horton, R; Dickersin, K (2005). "WHO facilitates international collaboration in setting standards for clinical trial registration". Lancet 365 (9474): 1829–1831. doi:10.1016/s0140-6736(05)66589-0. PMID 15924966. http://www.thelancet.com/journals/lancet/article/PIIS0140-6736(05)66589-0/abstract. 
  28. "International Clinical Trials Registry Platform (ICTRP)". https://www.who.int/ictrp/en/. 
  29. Banno, M; Tsujimoto, Y; Kataoka, Y (2019). "Studies registered in non-ClinicalTrials.gov accounted for an increasing proportion of protocol registrations in medical research". Journal of Clinical Epidemiology 116: 106–113. doi:10.1016/j.jclinepi.2019.09.005. PMID 31521723. 
  30. Anderson, Monique L.; Chiswell, Karen; Peterson, Eric D.; Tasneem, Asba; Topping, James; Califf, Robert M. (12 March 2015). "Compliance with Results Reporting at ClinicalTrials.gov". New England Journal of Medicine 372 (11): 1031–1039. doi:10.1056/NEJMsa1409364. PMID 25760355. 
  31. DeVito, Nicholas J; Bacon, Seb; Goldacre, Ben (February 2020). "Compliance with legal requirement to report clinical trial results on ClinicalTrials.gov: a cohort study". The Lancet 395 (10221): 361–369. doi:10.1016/S0140-6736(19)33220-9. PMID 31958402. https://ora.ox.ac.uk/objects/uuid:0427c19e-a95d-4caf-978d-90f2393f8b84. 
  32. Pullar, T; Kumar, S; Feely, M (October 1989). "Compliance in clinical trials.". Annals of the Rheumatic Diseases 48 (10): 871–5. doi:10.1136/ard.48.10.871. PMID 2684057. 
  33. Miller, Jennifer E; Korn, David; Ross, Joseph S (12 November 2015). "Clinical trial registration, reporting, publication and FDAAA compliance: a cross-sectional analysis and ranking of new drugs approved by the FDA in 2012". BMJ Open 5 (11): e009758. doi:10.1136/bmjopen-2015-009758. PMID 26563214. 
  34. Miseta, Ed (9 January 2018). "As ClinicalTrialsgov Turns 10 Will We See Compliance Improve". https://www.clinicalleader.com/doc/as-clinicaltrials-gov-turns-will-we-see-compliance-improve-0001. 
  35. Huser, V.; Cimino, J. J. (2013). "Evaluating adherence to the International Committee of Medical Journal Editors' policy of mandatory, timely clinical trial registration". Journal of the American Medical Informatics Association 20 (e1): e169–74. doi:10.1136/amiajnl-2012-001501. PMID 23396544. 
  36. Wieschowski, Susanne; Silva, Diego S.; Strech, Daniel (2016-11-10). "Animal Study Registries: Results from a Stakeholder Analysis on Potential Strengths, Weaknesses, Facilitators, and Barriers" (in en). PLOS Biology 14 (11): e2000391. doi:10.1371/journal.pbio.2000391. ISSN 1545-7885. PMID 27832101. 
  37. Kimmelman, Jonathan; Anderson, James A. (June 2012). "Should preclinical studies be registered?" (in en). Nature Biotechnology 30 (6): 488–489. doi:10.1038/nbt.2261. ISSN 1546-1696. PMID 22678379. 
  38. Wieschowski, Susanne; Biernot, Svenja; Deutsch, Susanne; Glage, Silke; Bleich, André; Tolba, René; Strech, Daniel (2019-11-26). "Publication rates in animal research. Extent and characteristics of published and non-published animal studies followed up at two German university medical centres" (in en). PLOS ONE 14 (11): e0223758. doi:10.1371/journal.pone.0223758. ISSN 1932-6203. PMID 31770377. Bibcode2019PLoSO..1423758W. 
  39. Naald, Mira van der; Wenker, Steven; Doevendans, Pieter A.; Wever, Kimberley E.; Chamuleau, Steven A. J. (2020-08-01). "Publication rate in preclinical research: a plea for preregistration" (in en). BMJ Open Science 4 (1): e100051. doi:10.1136/bmjos-2019-100051. ISSN 2398-8703. PMID 35047690. PMC 8647586. https://openscience.bmj.com/content/4/1/e100051. 
  40. Centre for Open Science. "Registered Reports: Peer review before results are known to align scientific values and practices". https://cos.io/rr/. 
  41. Chambers, C. D.; Forstmann, B.; Pruszynski, J. A. (2019). "Science in flux: Registered Reports and beyond at the European Journal of Neuroscience". European Journal of Neuroscience 49 (1): 4–5. doi:10.1111/ejn.14319. PMID 30584679. 
  42. Lindsay, D. Stephen (2015-11-09). "Replication in Psychological Science". Psychological Science 26 (12): 1827–32. doi:10.1177/0956797615616374. ISSN 0956-7976. PMID 26553013. 
  43. Mellor, D. (2017). "Promoting reproducibility with registered reports". Nature Human Behaviour 1: 0034. doi:10.1038/s41562-016-0034. 
  44. "Streamlined review and registered reports soon to be official at EJP". 6 February 2018. https://www.ejp-blog.com/blog/2017/2/3/streamlined-review-and-registered-reports-coming-soon. 
  45. Yeung, Andy W. K. (2017). "Do Neuroscience Journals Accept Replications? A Survey of Literature" (in en). Frontiers in Human Neuroscience 11: 468. doi:10.3389/fnhum.2017.00468. ISSN 1662-5161. PMID 28979201. 
  46. Martin, G. N.; Clarke, Richard M. (2017). "Are Psychology Journals Anti-replication? A Snapshot of Editorial Practices" (in en). Frontiers in Psychology 8: 523. doi:10.3389/fpsyg.2017.00523. ISSN 1664-1078. PMID 28443044. 
  47. "Registered Reports Overview". Center for Open Science. https://cos.io/rr/#journals. 
  48. Wagenmakers, E. J.; Wetzels, R.; Borsboom, D.; van der Maas, H. L.; Kievit, R. A. (2012). "An agenda for purely confirmatory research". Perspectives on Psychological Science 7 (6): 632–638. doi:10.1177/1745691612463078. PMID 26168122. 
  49. Lakens, D. (2019). "The value of preregistration for psychological science: A conceptual analysis". Japanese Psychological Review 62 (3): 221–230. http://team1mile.com/sjpr62-3/wp-content/uploads/2020/03/Lakens_JPR623221-230.pdf. 
  50. Nosek, B. A.; Lakens, D. (2014). "Registered reports: A method to increase the credibility of published results". Social Psychology 45 (3): 137–141. doi:10.1027/1864-9335/a000192. 
  51. Coffman, L. C.; Niederle, M. (2015). "Pre-analysis plans have limited upside, especially where replications are feasible". Journal of Economic Perspectives 29 (3): 81–98. doi:10.1257/jep.29.3.81. 
  52. Collins, H. K.; Whillans, A. V.; John, L. K. (2021). "Joy and rigor in behavioral science". Organizational Behavior and Human Decision Processes 164: 179–191. doi:10.1016/j.obhdp.2021.03.002. 
  53. Dehaven, A. "Preregistration: A plan, not a prison". https://www.cos.io/blog/preregistration-plan-not-prison. 
  54. Navarro, D. (2020). Paths in strange spaces: A comment on preregistration. doi:10.31234/osf.io/wxn58. https://psyarxiv.com/wxn58/download. 
  55. Steegen, S.; Tuerlinckx, F.; Gelman, A.; Vanpaemel, W. (2016). "Increasing transparency through a multiverse analysis". Perspectives on Psychological Science 11 (5): 702–712. doi:10.1177/1745691616658637. PMID 27694465. 
  56. Srivastava, S. (2018). "Sound inference in complicated research: A multi-strategy approach". PsyArXiv. doi:10.31234/osf.io/bwr48. 
  57. Devezer, B.; Navarro, D. J.; Vandekerckhove, J.; Buzbas, E. O. (2020). "The case for formal methodology in scientific reform". bioRxiv: 2020.04.26.048306. doi:10.1101/2020.04.26.048306. https://www.biorxiv.org/content/biorxiv/early/2020/11/22/2020.04.26.048306.full.pdf. 
  58. Szollosi, A.; Donkin, C. (2019). Arrested theory development: The misguided distinction between exploratory and confirmatory research. doi:10.31234/osf.io/suzej. https://psyarxiv.com/suzej/download. 
  59. Szollosi, A.; Kellen, D.; Navarro, D. J.; Shiffrin, R.; van Rooij, I.; Van Zandt, T.; Donkin, C. (2020). "Is preregistration worthwhile?". Trends in Cognitive Sciences 24 (2): 94–95. doi:10.1016/j.tics.2019.11.009. PMID 31892461. 
  60. Rubin, Mark (2021). "When to adjust alpha during multiple testing: A consideration of disjunction, conjunction, and individual testing". Synthese 199 (3–4): 10969–11000. doi:10.1007/s11229-021-03276-4. 
  61. Bakker, M.; Veldkamp, C. L. S.; van Assen, M. A. L. M.; Crompvoets, E. A. V.; Ong, H. H.; Nosek, B.; Soderberg, C. K.; Mellor, D. et al. (2020). "Ensuring the quality and specificity of preregistrations". PLOS Biol 18 (12): e3000937. doi:10.1371/journal.pbio.3000937. PMID 33296358. 
  62. Ikeda, A.; Xu, H.; Fuji, N.; Zhu, S.; Yamada, Y. (2019). "Questionable research practices following pre-registration". Japanese Psychological Review 62 (3): 281–295. https://www.jstage.jst.go.jp/article/sjpr/62/3/62_281/_pdf. 
  63. Template:Cite medRxiv
  64. Heirene, R.; LaPlante, D.; Louderback, E. R.; Keen, B.; Bakker, M.; Serafimovska, A.; Gainsbury, S. M. "Preregistration specificity & adherence: A review of preregistered gambling studies & cross-disciplinary comparison". PsyArXiv. https://psyarxiv.com/nj4es. Retrieved 17 July 2021. 
  65. van den Akker, Olmo R.; van Assen, Marcel A. L. M.; Bakker, Marjan; Elsherif, Mahmoud; Wong, Tsz Keung; Wicherts, Jelte M. (2023-11-10). "Preregistration in practice: A comparison of preregistered and non-preregistered studies in psychology" (in en). Behavior Research Methods. doi:10.3758/s13428-023-02277-0. ISSN 1554-3528. PMID 37950113.  This article incorporates text from this source, which is available under the CC BY 4.0 license.
  66. Abrams, E.; Libgober, J.; List, J. A. (2020). "Research registries: Facts, myths, and possible improvements". NBER Working Papers 27250. http://s3.amazonaws.com/fieldexperiments-papers2/papers/00703.pdf. 
  67. Claesen, A.; Gomes, S.; Tuerlinckx, F.; Vanpaemel, W. (2021). "Preregistration: Comparing dream to reality". Royal Society Open Science 8 (10). doi:10.31234/osf.io/d8wex. PMID 34729209. 
  68. El-Boghdadly, K.; Wiles, M. D.; Atton, S.; Bailey, C. R. (2018). "Adherence to guidance on registration of randomised controlled trials published in Anaesthesia". Anaesthesia 73 (5): 556–563. doi:10.1111/anae.14103. PMID 29292498. 
  69. Sun, L. W.; Lee, D. J.; Collins, J. A.; Carll, T. C.; Ramahi, K.; Sandy, S. J.; Unteriner, J. G.; Weinberg, D. V. (2019). "Assessment of consistency between peer-reviewed publications and clinical trial registries". JAMA Ophthalmology 137 (5): 552–556. doi:10.1001/jamaophthalmol.2019.0312. PMID 30946427. 
  70. Simmons, J. P.; Nelson, L. D.; Simonsohn, U. (2021). "Pre-registration is a game changer. But, like random assignment, it is neither necessary nor sufficient for credible science". Journal of Consumer Psychology 31 (1): 177–180. doi:10.1002/jcpy.1207. 
  71. Bowman, Sara D.; Dehaven, Alexander Carl; Errington, Timothy M.; Hardwicke, Tom Elis; Mellor, David Thomas; Nosek, Brian A.; Soderberg, Courtney K. OSF. doi:10.31222/osf.io/epgjd. https://osf.io/epgjd/. Retrieved 2023-11-12. 
  72. Haven, Tamarinde L.; Van Grootel, Leonie (2019-04-03). "Preregistering qualitative research" (in en). Accountability in Research 26 (3): 229–244. doi:10.1080/08989621.2019.1580147. ISSN 0898-9621. PMID 30741570. 
  73. van den Akker, Olmo R.; Weston, Sara; Campbell, Lorne; Chopik, Bill; Damian, Rodica; Davis-Kean, Pamela; Hall, Andrew; Kosie, Jessica et al. (2021-11-09). "Preregistration of secondary data analysis: A template and tutorial" (in en). Meta-Psychology 5. doi:10.15626/MP.2020.2625. ISSN 2003-2714. https://open.lnu.se/index.php/metapsychology/article/view/2625. 
  74. McPhetres, J. (2020). What should a preregistration contain?. doi:10.31234/osf.io/cj5mh. 
  75. Pham, M. T.; Oh, T. T. (2020). "Preregistration is neither sufficient nor necessary for good science". Journal of Consumer Psychology 31: 163–176. doi:10.1002/jcpy.1209. 
  76. Field, S. M.; Wagenmakers, E. J.; Kiers, H. A.; Hoekstra, R.; Ernst, A. F.; van Ravenzwaaij, D. (2020). "The effect of preregistration on trust in empirical research findings: Results of a registered report". Royal Society Open Science 7 (4): 181351. doi:10.1098/rsos.181351. PMID 32431853. Bibcode: 2020RSOS....781351F. 
  77. Soderberg, C. K.; Errington, T. M.; Schiavone, S. R.; Bottesini, J.; Singleton Thorn, F.; Vazire, S.; Esterling, K. M.; Nosek, B. A. (2020). Research quality of registered reports compared to the standard publishing model. doi:10.31222/osf.io/7x9vy. https://osf.io/preprints/metaarxiv/7x9vy. 

External links