Systematic review

Systematic reviews are a type of literature review that uses systematic methods to collect secondary data, critically appraise research studies, and synthesize findings qualitatively or quantitatively.[1] Systematic reviews formulate research questions that are broad or narrow in scope, and identify and synthesize studies that directly relate to the systematic review question.[2] They are designed to provide a complete, exhaustive summary of current evidence relevant to a research question. For example, systematic reviews of randomized controlled trials are key to the practice of evidence-based medicine,[3] and a review of existing studies is often quicker and cheaper than embarking on a new study.

While systematic reviews are often applied in the biomedical or healthcare context, they can be used in other areas where an assessment of a precisely defined subject would be helpful.[4] Systematic reviews may examine clinical tests, public health interventions, environmental interventions,[5] social interventions, adverse effects, and economic evaluations.[6]

An understanding of systematic reviews and how to implement them in practice is highly recommended for professionals involved in the delivery of health care, public health, and public policy.[7]

Characteristics

A systematic review aims to provide a complete, exhaustive summary of current literature relevant to a research question. The first step in conducting a systematic review is to create a structured question to guide the review.[8] The second step is a thorough search of the literature for relevant papers. The Methodology section of a systematic review lists all of the databases and citation indexes that were searched, such as Web of Science, Embase, and PubMed, as well as any individual journals that were searched. Searching multiple databases in this way is a hallmark of the systematic approach, and more recently there has been increased recognition of the importance of using different search technologies, with artificial-intelligence-based tools gaining recognition. The titles and abstracts of identified articles are checked against pre-determined criteria for eligibility and relevance to form an inclusion set, which relates back to the research problem. Each included study may then be given an objective assessment of methodological quality, preferably using methods conforming to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement (the current guideline)[9] or the high quality standards of Cochrane.[10]

Systematic reviews often, but not always, use statistical techniques (meta-analysis) to combine the results of eligible studies, or at least score the levels of evidence according to the methodology used. An additional rater may be consulted to resolve any scoring differences between raters.[4] Systematic reviews are most often applied in the biomedical or healthcare context, but the method can be applied in any field of research. Groups such as the Campbell Collaboration promote the use of systematic reviews in policy-making beyond healthcare.
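Where two raters screen studies independently, their agreement (before any tie-breaking by an additional rater) is commonly quantified with Cohen's kappa, a chance-corrected agreement statistic. The sketch below is illustrative only: the screening decisions are made up, and the source does not prescribe kappa specifically.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: proportion of items both raters coded the same
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal frequencies
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum(ca[k] * cb[k] for k in ca) / n**2
    return (po - pe) / (1 - pe)

# Hypothetical include/exclude decisions on ten abstracts
a = ["in", "in", "out", "out", "in", "out", "in", "out", "out", "in"]
b = ["in", "out", "out", "out", "in", "out", "in", "out", "in", "in"]
print(round(cohens_kappa(a, b), 3))
```

A kappa near 1 indicates near-perfect agreement, while values near 0 suggest the raters agree little more than chance would predict; disagreements are then resolved by discussion or a third rater, as described above.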

A systematic review uses an objective and transparent approach for research synthesis, with the aim of minimizing bias. While many systematic reviews are based on an explicit quantitative meta-analysis of available data, there are also qualitative reviews which adhere to standards for gathering, analyzing and reporting evidence.[11] The EPPI-Centre has been influential in developing methods for combining both qualitative and quantitative research in systematic reviews.[12] The PRISMA statement[13] suggests a standardized way to ensure a transparent and complete reporting of systematic reviews, and is now required for this kind of research by more than 170 medical journals worldwide.[14]

Developments in systematic reviews during the 21st century included realist reviews and the meta-narrative approach, both of which addressed problems of methods and heterogeneity existing on some subjects.[15][16]

Scoping reviews

Scoping reviews are distinct from systematic reviews in several important ways. A scoping review searches for concepts, mapping the language that surrounds them and adjusting the search method iteratively.[17] It is often a preliminary stage before a systematic review, which 'scopes' out an area of inquiry and maps its language and key concepts. Because a scoping review should still be conducted systematically (the method is repeatable), some academic publishers categorize scoping reviews as a kind of 'systematic review', which may cause confusion. Scoping reviews are helpful when a systematic synthesis of research findings is not possible, for example when there are no published clinical trials in the area of inquiry. They are also a helpful method when an area of inquiry is very broad, for example exploring how the public are involved in all stages of systematic reviews.[18] There is still a lack of clarity in defining the exact method of a scoping review, as it is an iterative process and still relatively new; there have been a number of attempts to improve the standardisation of the method.[19][20][21][22][23] PROSPERO does not permit the submission of protocols for scoping reviews,[24] although some journals will publish them.[18]

Stages

The main stages of a systematic review are:

  1. Defining a question and agreeing on an objective method.[25] It is considered best practice to publish the protocol of the systematic review before starting, both to help avoid unplanned duplication and to enable comparison of the reported review methods with what was planned in the protocol.[26]
  2. A search for relevant data from research that matches certain criteria. For example, only selecting research that is good quality and answers the defined question.[25] Contacting a trained information professional or librarian can improve the quality of the systematic review.[27]
  3. 'Extraction' of relevant data. This can include how the research was done (often called the method or 'intervention'), who participated in the research (including how many people), how it was paid for (for example funding sources) and what happened (the outcomes).[25]
  4. Assess the quality of the data by judging it against criteria identified at the first stage.[25] This can include assessing the quality (or certainty) of evidence, using criteria such as GRADE.[28]
  5. Analyse and combine the data (using statistical methods) to give an overall result from all of the data. This combination can be visualised using a blobbogram (also called a forest plot), in which the diamond represents the combined result of all the included data.[25] Because this combined result draws on more sources than any single data set, it is considered more reliable and better evidence: the more data there is, the more confident we can be of the conclusions.[25]
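The combination in stage 5 can be illustrated with a fixed-effect (inverse-variance) pooled estimate, one of the simpler techniques behind the forest plot's diamond. This is a minimal sketch with made-up effect sizes, not a substitute for established meta-analysis software:

```python
import math

def pooled_effect(effects, std_errors):
    """Fixed-effect (inverse-variance) pooling of study effect sizes.

    Each study is weighted by the inverse of its variance, so larger,
    more precise studies contribute more to the combined estimate --
    the 'diamond' at the bottom of a forest plot.
    """
    weights = [1.0 / se**2 for se in std_errors]
    total = sum(weights)
    pooled = sum(w * e for w, e in zip(weights, effects)) / total
    se_pooled = math.sqrt(1.0 / total)
    # 95% confidence interval via the normal approximation
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, se_pooled, ci

# Three hypothetical studies: effect estimates (e.g. log odds ratios)
# and their standard errors
effects = [0.30, 0.10, 0.25]
ses = [0.10, 0.20, 0.15]
est, se, (lo, hi) = pooled_effect(effects, ses)
print(f"pooled = {est:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

Note that the pooled standard error is smaller than that of any single study, which is the sense in which combining more data yields more confident conclusions.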

Once these stages are complete, the review may be published, disseminated and translated into practice after being adopted as evidence.

Living systematic reviews

Living systematic reviews are a relatively new kind of high-quality, semi-automated, up-to-date online summary of research, updated as new research becomes available.[29] The essential difference between a living systematic review and a conventional systematic review is the publication format: living systematic reviews are 'dynamic, persistent, online-only evidence summaries, which are updated rapidly and frequently'.[30]

Research fields

Medicine and biology

History

In 1753, James Lind published A Treatise on the Scurvy, a study which he described in the subtitle as A Critical and Chronological View of What has been Published on the Subject. It can be regarded as the first systematic review.[31] Lind comments, "It became requisite to exhibit a full and impartial view of what had hitherto been published on the scurvy," and, "before the subject could be set in a clear and proper light, it was necessary to remove a great deal of rubbish."[32] The first proper analysis of different sources was a report by Karl Pearson in 1904.[33] Pearson analysed the effectiveness of typhoid vaccine across five studies of immunity and six studies of mortality. What struck him was the irregularity of the data. He wrote:

Assuming that the inoculation is not more than a temporary inconvenience, it would seem to be possible to call for volunteers... [and] only to inoculate every second volunteer... with a view to ascertaining whether any inoculation is likely to prove useful... In other words, the 'experiment' might demonstrate that this first step to a reasonably effective prevention was not a false one.[34]

This is considered the first analysis of clinical trials using what later became known as meta-analysis.[35]

In 1925, Ronald Fisher introduced a statistical (probability-based) method for such analysis in his book Statistical Methods for Research Workers.[33] William Gemmell Cochran, a colleague of Fisher, improved the method in 1937.[36] With Frank Yates, Cochran reported an analysis of agricultural data in 1938, "The analysis of groups of experiments", using the new method.[37]

In 1974, Archie Cochrane and colleagues published an analysis of the use of aspirin in the prevention of heart attack (myocardial infarction).[38] This became one of the most influential systematic analyses of randomized controlled trials.[35] In 1979, Cochrane wrote, 'It is surely a great criticism of our profession that we have not organised a critical summary, by specialty or subspecialty, adapted periodically, of all relevant randomised controlled trials'.[39] Critical appraisal and synthesis of research findings in a systematic way emerged in the 1970s, after Gene V. Glass introduced the term 'meta-analysis' in 1976. According to Glass, meta-analysis is "the statistical analysis of a large collection of analysis results from individual studies for the purpose of integrating the findings".[40][41][42] Early syntheses were conducted in broad areas of public policy and social interventions, with systematic research synthesis later applied to medicine and health. The Cochrane Collaboration was founded in 1993, building on the work of Iain Chalmers and colleagues in the area of pregnancy and childbirth.[43]

Contemporary

Named after Archie Cochrane, Cochrane is a group of over 37,000 specialists in healthcare who systematically review randomised trials of the effects of prevention, treatments and rehabilitation as well as health systems interventions. When appropriate, they also include the results of other types of research. Cochrane Reviews are published in The Cochrane Database of Systematic Reviews section of the Cochrane Library. The 2015 impact factor for The Cochrane Database of Systematic Reviews was 6.103, and it was ranked 12th in the “Medicine, General & Internal” category.[44] There are six types of Cochrane Review:[45][46][47][48]

  1. Intervention reviews assess the benefits and harms of interventions used in healthcare and health policy.
  2. Diagnostic test accuracy reviews assess how well a diagnostic test performs in diagnosing and detecting a particular disease.
  3. Methodology reviews address issues relevant to how systematic reviews and clinical trials are conducted and reported.
  4. Qualitative reviews synthesize qualitative and quantitative evidence to address questions on aspects other than effectiveness.[11]
  5. Prognosis reviews address the probable course or future outcome(s) of people with a health problem.
  6. Overviews of systematic reviews (OoRs) are a newer type of study that compiles evidence from multiple systematic reviews into a single accessible document, serving as a friendly front end to the Cochrane Collaboration for healthcare decision-making.

The Cochrane Collaboration provides a handbook for systematic reviewers of interventions which "provides guidance to authors for the preparation of Cochrane Intervention reviews."[10] The Cochrane Handbook outlines eight general steps for preparing a systematic review:[10]

  1. Defining the review question(s) and developing criteria for including studies
  2. Searching for studies
  3. Selecting studies and collecting data
  4. Assessing risk of bias in included studies
  5. Analysing data and undertaking meta-analyses
  6. Addressing reporting biases
  7. Presenting results and "summary of findings" tables
  8. Interpreting results and drawing conclusions

The Cochrane Handbook forms the basis of two sets of standards for the conduct and reporting of Cochrane Intervention Reviews (MECIR; Methodological Expectations of Cochrane Intervention Reviews).[49]

The Cochrane Library is a collection of databases in medicine and other health care specialties provided by Cochrane and other organizations. At its core are Cochrane Reviews, a database of systematic reviews and meta-analyses which summarize and interpret the results of medical research. It was originally published by Update Software and is now published by the shareholder-owned publisher John Wiley & Sons, Ltd. as part of the Wiley Online Library.

Authors must pay an additional fee for their review to be fully open access.[50] Cochrane has an annual income of US$10 million.[51]

Social sciences

The quasi-standard for systematic review in the social sciences is based on the procedures proposed by the Campbell Collaboration, one of a number of groups promoting evidence-based policy in the social sciences. The Campbell Collaboration "helps people make well-informed decisions by preparing, maintaining and disseminating systematic reviews in education, crime and justice, social welfare and international development."[52] It is a sister initiative of Cochrane. The Campbell Collaboration was created in 2000; the inaugural meeting in Philadelphia, USA, attracted 85 participants from 13 countries.[53]

Business and economics

Because research fields outside the natural sciences differ in character, the aforementioned methodological steps cannot easily be applied in business research. Early attempts to transfer the procedures from medicine to business research were made by Tranfield et al. (2003).[54] A step-by-step approach was later developed by Durach et al.: drawing on experience in their own discipline, these authors adapted the methodological steps into a standard procedure for conducting systematic literature reviews in business and economics.[55]

Strengths and weaknesses

While systematic reviews are regarded as the strongest form of medical evidence, a review of 300 studies found that not all systematic reviews were equally reliable, and that their reporting can be improved by a universally agreed upon set of standards and guidelines.[56] A further study by the same group found that of 100 systematic reviews monitored, 7% needed updating at the time of publication, another 4% within a year, and another 11% within 2 years; this figure was higher in rapidly changing fields of medicine, especially cardiovascular medicine.[57] A 2003 study suggested that extending searches beyond major databases, perhaps into grey literature, would increase the effectiveness of reviews.[58]

Roberts and colleagues highlighted the problems with systematic reviews, particularly those conducted by Cochrane, noting that published reviews are often biased, out of date and excessively long.[59] They criticized Cochrane reviews as not being sufficiently critical in the selection of trials and including too many of low quality. They proposed several solutions, including limiting studies in meta-analyses and reviews to registered clinical trials, requiring that original data be made available for statistical checking, paying greater attention to sample size estimates, and eliminating dependence on only published data.

Some of these difficulties were noted early on, as described by Altman: "much poor research arises because researchers feel compelled for career reasons to carry out research that they are ill equipped to perform, and nobody stops them."[60] Methodological limitations of meta-analysis have also been noted.[61] There is still no standardized method for researchers to follow, and the optimal methodological steps have yet to be agreed.[7] Another concern is that the methods used to conduct a systematic review are sometimes changed once researchers see the available trials they are going to include.[62] Bloggers have described retractions of systematic reviews, and of published reports of studies included in published systematic reviews.[63][64][65]

Systematic reviews are increasingly prevalent in other fields, such as international development research.[66] Subsequently, a number of donors – most notably the UK Department for International Development (DFID) and AusAid – are focusing more attention and resources on testing the appropriateness of systematic reviews in assessing the impacts of development and humanitarian interventions.[66]

Limited reporting of clinical trials

The 'All Trials' campaign highlights that around half of clinical trials have never reported results, and works to improve reporting.[67] This lack of reporting has serious implications for research, including systematic reviews, since only published trial data can be synthesized. In addition, positive trials have been found to be twice as likely to be published as those with negative results.[68] At present, it is legal for for-profit companies to conduct clinical trials and not publish the results.[69] For example, in the past 10 years 8.7 million patients have taken part in trials whose results were never published.[69] These factors make a significant publication bias likely, with only 'positive' or perceived favorable results being published. A recent systematic review of industry sponsorship and research outcomes concluded that 'sponsorship of drug and device studies by the manufacturing company leads to more favorable efficacy results and conclusions than sponsorship by other sources', and that there is an industry bias that cannot be explained by standard 'risk of bias' assessments.[70]

Systematic reviews built on literature with such a bias may amplify the effect, although the flaw lies in the reporting of research, not in the systematic review process.

Review tools

A 2019 publication identified 15 systematic review tools and ranked them by the number of 'critical features' required to perform a systematic review. The top-ranked tools included:[71]

  • DistillerSR: a paid web application
  • Swift Active Screener: a paid web application
  • Covidence: a paid web application and Cochrane technology platform
  • Rayyan: a free web application
  • Sysrev: a free web application

References

  1. "Cochrane Update. 'Scoping the scope' of a cochrane review". Journal of Public Health 33 (1): 147–50. March 2011. doi:10.1093/pubmed/fdr015. PMID 21345890. 
  2. "systematic review". http://getitglossary.org/term/systematic+review. 
  3. "What is EBM?". Centre for Evidence Based Medicine. 2009-11-20. http://www.cebm.net/index.aspx?o=1914. 
  4. 4.0 4.1 Ader, Herman J.; Mellenbergh, Gideon J.; Hand, David J. (2008). "Methodological quality". Advising on Research Methods: A consultant's companion. Johannes van Kessel Publishing. ISBN 978-90-79418-02-2. https://books.google.com/books?id=LCnOj4ZFyjkC&printsec=frontcover&hl=fr#v=onepage&q=%22Methodological%20quality%22&f=false. 
  5. Bilotta, Gary S.; Milner, Alice M.; Boyd, Ian (2014). "On the use of systematic reviews to inform environmental policies". Environmental Science & Policy 42: 67–77. doi:10.1016/j.envsci.2014.05.010. 
  6. Systematic reviews: CRD's guidance for undertaking reviews in health care.. York: University of York, Centre for Reviews and Dissemination. 2008. ISBN 978-1-900640-47-3. https://www.york.ac.uk/media/crd/Systematic_Reviews.pdf. Retrieved 17 June 2011. 
  7. 7.0 7.1 Giang, Hoang Thi Nam; Ahmed, Ali Mahmoud; Fala, Reem Yousry; Khattab, Mohamed Magdy; Othman, Mona Hassan Ahmed; Abdelrahman, Sara Attia Mahmoud; Thao, Le Phuong; Gabl, Ahmed Elsaid Abd Elsamie et al. (26 July 2019). "Methodological steps used by authors of systematic reviews and meta-analyses of clinical trials: a cross-sectional study". BMC Medical Research Methodology 19 (1): 164. doi:10.1186/s12874-019-0780-2. PMID 31349805. 
  8. Finding What Works in Health Care: Standards for Systematic Reviews.. 2011. doi:10.17226/13059. ISBN 978-0-309-16425-2. 
  9. "PRISMA". Prisma-statement.org. http://www.prisma-statement.org/. 
  10. 10.0 10.1 10.2 Cochrane Handbook for Systematic Reviews of Interventions (Version 5.1 ed.). The Cochrane Collaboration. March 2011. http://handbook.cochrane.org. 
  11. 11.0 11.1 "Qualitative synthesis and systematic review in health professions education". Medical Education 47 (3): 252–60. March 2013. doi:10.1111/medu.12092. PMID 23398011. 
  12. "Integrating qualitative research with trials in systematic reviews". BMJ 328 (7446): 1010–2. April 2004. doi:10.1136/bmj.328.7446.1010. PMID 15105329. 
  13. "The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration". PLoS Medicine 6 (7): e1000100. July 2009. doi:10.1371/journal.pmed.1000100. PMID 19621070. 
  14. "PRISMA Endorsers". http://www.prisma-statement.org/Endorsement/PRISMAEndorsers.aspx. 
  15. "Realist review--a new method of systematic review designed for complex policy interventions". Journal of Health Services Research & Policy 10 Suppl 1: 21–34. July 2005. doi:10.1258/1355819054308530. PMID 16053581. 
  16. "Tensions and paradoxes in electronic patient record research: a systematic literature review using the meta-narrative method". The Milbank Quarterly 87 (4): 729–88. December 2009. doi:10.1111/j.1468-0009.2009.00578.x. PMID 20021585. 
  17. Arksey, Hilary; O'Malley, Lisa (2005). "Scoping studies: Towards a methodological framework". International Journal of Social Research Methodology 8: 19–32. doi:10.1080/1364557032000119616. http://eprints.whiterose.ac.uk/1618/1/Scopingstudies.pdf. 
  18. 18.0 18.1 "Stakeholder involvement in systematic reviews: a protocol for a systematic review of methods, outcomes and effects". Research Involvement and Engagement 3 (1): 9. 2017-04-21. doi:10.1186/s40900-017-0060-4. PMID 29062534. 
  19. "Scoping studies: advancing the methodology". Implementation Science 5 (1): 69. September 2010. doi:10.1186/1748-5908-5-69. PMID 20854677. 
  20. "Guidance for conducting systematic scoping reviews". International Journal of Evidence-Based Healthcare 13 (3): 141–6. September 2015. doi:10.1097/XEB.0000000000000050. PMID 26134548. 
  21. "Scoping reviews: time for clarity in definition, methods, and reporting". Journal of Clinical Epidemiology 67 (12): 1291–4. December 2014. doi:10.1016/j.jclinepi.2014.03.013. PMID 25034198. 
  22. Arksey, Hilary; O'Malley, Lisa (2005-02-01). "Scoping studies: towards a methodological framework". International Journal of Social Research Methodology 8 (1): 19–32. doi:10.1080/1364557032000119616. http://eprints.whiterose.ac.uk/1618/1/Scopingstudies.pdf. 
  23. "PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation | The EQUATOR Network" (in en-US). http://www.equator-network.org/reporting-guidelines/prisma-scr/. 
  24. "PROSPERO". Centre for Reviews and Dissemination. University of York. https://www.crd.york.ac.uk/prospero/#aboutregpage. 
  25. 25.0 25.1 25.2 25.3 25.4 25.5 "Animated Storyboard: What Are Systematic Reviews?". Cochrane Consumers and Communication. https://cccrg.cochrane.org/animated-storyboard-what-are-systematic-reviews. 
  26. "PRISMA". Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). http://www.prisma-statement.org/Protocols/Registration. 
  27. "Librarian co-authors correlated with higher quality reported search strategies in general internal medicine systematic reviews". Journal of Clinical Epidemiology 68 (6): 617–26. June 2015. doi:10.1016/j.jclinepi.2014.11.025. PMID 25766056. 
  28. "GRADE home". http://www.gradeworkinggroup.org/. 
  29. "Systematic review automation technologies". Systematic Reviews 3 (1): 74. July 2014. doi:10.1186/2046-4053-3-74. PMID 25005128. 
  30. "Living systematic reviews: an emerging opportunity to narrow the evidence-practice gap". PLoS Medicine 11 (2): e1001603. February 2014. doi:10.1371/journal.pmed.1001603. PMID 24558353. 
  31. Bartholomew, M (2002). "James Lind's Treatise of the Scurvy (1753)". Postgraduate Medical Journal 78 (925): 695–696. doi:10.1136/pmj.78.925.695. PMID 12496338. 
  32. Clarke, Mike; Chalmers, Iain (2018). "Reflections on the history of systematic reviews". BMJ Evidence-Based Medicine 23 (4): 121–122. doi:10.1136/bmjebm-2018-110968. PMID 29921711. 
  33. 33.0 33.1 Gurevitch, Jessica; Koricheva, Julia; Nakagawa, Shinichi; Stewart, Gavin (2018). "Meta-analysis and the science of research synthesis". Nature 555 (7695): 175–182. doi:10.1038/nature25753. PMID 29517004. 
  34. Pearson, K (1904). "Report on Certain Enteric Fever Inoculation Statistics". BMJ 2 (2288): 1243–1246. doi:10.1136/bmj.2.2288.1243. PMID 20761760. 
  35. 35.0 35.1 O'Rourke, K. (2007). "An historical perspective on meta-analysis: dealing quantitatively with varying study results". Journal of the Royal Society of Medicine 100 (12): 579–582. doi:10.1258/jrsm.100.12.579. PMID 18065712. 
  36. Cochran, W. G. (1937). "Problems Arising in the Analysis of a Series of Similar Experiments". Supplement to the Journal of the Royal Statistical Society 4 (1): 102-118. doi:10.2307/2984123. 
  37. Yates, F.; Cochran, W. G. (1938). "The analysis of groups of experiments". The Journal of Agricultural Science 28 (4): 556–580. doi:10.1017/S0021859600050978. 
  38. Elwood, P. C.; Cochrane, A. L.; Burr, M. L.; Sweetnam, P. M.; Williams, G.; Welsby, E.; Hughes, S. J.; Renton, R. (1974). "A Randomized Controlled Trial of Acetyl Salicyclic Acid in the Secondary Prevention of Mortality from Myocardial Infarction". BMJ 1 (5905): 436–440. doi:10.1136/bmj.1.5905.436. PMID 4593555. 
  39. "1.1.2 A brief history of Cochrane". https://community.cochrane.org/handbook-sri/chapter-1-introduction/11-cochrane/112-brief-history-cochrane. 
  40. Glass, Gene V (1976). "Primary, Secondary, and Meta-Analysis of Research". Educational Researcher 5 (10): 3–8. doi:10.3102/0013189X005010003. 
  41. Glass, Gene V.; Smith, Mary Lee (1978). Meta-Analysis of Research on the Relationship of Class-Size and Achievement. The Class Size and Instruction Project. Washington, D.C.: Distributed by ERIC Clearinghouse. https://catalogue.nla.gov.au/Record/5363514. 
  42. "History of Systematic Reviews". Evidence for Policy and Practice Information and Co-ordinating Centre (EPPI-Centre). https://eppi.ioe.ac.uk/cms/Resources/EvidenceInformedPolicyandPractice/HistoryofSystematicReviews/tabid/68/Default.aspx. 
  43. "1.1.2 A brief history of Cochrane". https://community.cochrane.org/handbook-sri/chapter-1-introduction/11-cochrane/112-brief-history-cochrane. 
  44. The Cochrane Library. 2015 impact factor. Cochrane Database of Systematic Reviews (CDSR) Retrieved 2016-07-20.
  45. Review Manager (RevMan) [Computer program]. Version 5.2. Copenhagen: The Nordic Cochrane Centre, The Cochrane Collaboration, 2012.
  46. The Cochrane Library
  47. "Overview of systematic reviews - a new type of study. Part II". Sao Paulo Medical Journal 133 (3): 206–17. 2015. doi:10.1590/1516-3180.2013.8150015. PMID 25388685. 
  48. "Overview of systematic reviews - a new type of study: part I: why and for whom?". Sao Paulo Medical Journal 130 (6): 398–404. 2012. doi:10.1590/S1516-31802012000600007. PMID 23338737. 
  49. "Methodological Expectations of Cochrane Intervention Reviews (MECIR)". Cochrane. http://www.editorial-unit.cochrane.org/mecir. 
  50. "Open access options for the Cochrane Database of Systematic Reviews". Cochrane. https://www.cochranelibrary.com/about/open-access. 
  51. "Has Cochrane lost its way?". BMJ 364: k5302. January 2019. doi:10.1136/bmj.k5302. 
  52. "About Us". The Campbell Collaboration. http://www.campbellcollaboration.org/about_us/index.php. 
  53. "History - Campbell". http://www.campbellcollaboration.org/history/explore/background. 
  54. "Towards a methodology for developing evidence-informed management knowledge by means of systematic review". British Journal of Management 14 (3): 207–222. 2003. doi:10.1111/1467-8551.00375. 
  55. "A New Paradigm for Systematic Literature Reviews in Supply Chain Management". Journal of Supply Chain Management 53 (4): 67–85. 2017. doi:10.1111/jscm.12145. 
  56. "Epidemiology and reporting characteristics of systematic reviews". PLoS Medicine 4 (3): e78. March 2007. doi:10.1371/journal.pmed.0040078. PMID 17388659. 
  57. "How quickly do systematic reviews go out of date? A survival analysis". Annals of Internal Medicine 147 (4): 224–33. August 2007. doi:10.7326/0003-4819-147-4-200708210-00179. PMID 17638714. 
  58. "Beyond Medline: reducing bias through extended systematic review search". International Journal of Technology Assessment in Health Care 19 (1): 168–78. 2003. doi:10.1017/S0266462303000163. PMID 12701949. 
  59. "The knowledge system underpinning healthcare is not fit for purpose and must change". BMJ 350: h2463. June 2015. doi:10.1136/bmj.h2463. PMID 26041754. http://researchonline.lshtm.ac.uk/2210686/1/bmj.h2463.full.pdf. 
  60. "The scandal of poor medical research". BMJ 308 (6924): 283–4. January 1994. doi:10.1136/bmj.308.6924.283. PMID 8124111. 
  61. "Meta-analysis/Shmeta-analysis". American Journal of Epidemiology 140 (9): 771–8. November 1994. doi:10.1093/oxfordjournals.aje.a117324. PMID 7977286. 
  62. "Bias due to selective inclusion and reporting of outcomes and analyses in systematic reviews of randomised trials of healthcare interventions". The Cochrane Database of Systematic Reviews (10): MR000035. October 2014. doi:10.1002/14651858.MR000035.pub2. PMID 25271098. 
  63. Roberts, Ian. "Retraction Of Scientific Papers For Fraud Or Bias Is Just The Tip Of The Iceberg". http://www.iflscience.com/editors-blog/retraction-scientific-papers-fraud-or-bias-just-tip-iceberg. 
  64. Ferguson, Cat (2015-04-02). "Retraction and republication for Lancet Resp Med tracheostomy paper". Retraction Watch. http://retractionwatch.com/2015/04/02/retraction-and-republication-for-lancet-resp-med-tracheostomy-paper/. 
  65. Ferguson, Cat (2015-03-26). "BioMed Central retracting 43 papers for fake peer review". Retraction Watch. http://retractionwatch.com/2015/03/26/biomed-central-retracting-43-papers-for-fake-peer-review/. 
  66. 66.0 66.1 Hagen-Zanker, Jessica; Duvendack, Maren; Mallett, Richard; Slater, Rachel; Carpenter, Samuel; Tromme, Mathieu (January 2012). "Making systematic reviews work for international development research". Overseas Development Institute. http://www.odi.org.uk/resources/details.asp?id=6260&title=systematic-review-slrc-international-development-research-methods. 
  67. "Half of all clinical trials have never reported results". 20 August 2015. http://www.alltrials.net/news/half-of-all-trials-unreported/. 
  68. Song, F.; Parekh, S.; Hooper, L.; Loke, Y. K.; Ryder, J.; Sutton, A. J.; Hing, C.; Kwok, C. S. et al. (February 2010). "Dissemination and publication of research findings: an updated review of related biases". Health Technology Assessment 14 (8): iii, ix–xi, 1–193. doi:10.3310/hta14080. PMID 20181324. 
  69. 69.0 69.1 Iacobucci, Gareth (2016-11-04). "Nearly half of all trials run by major sponsors in past decade are unpublished". BMJ 355: i5955. doi:10.1136/bmj.i5955. PMID 27815253. 
  70. Lundh, Andreas; Lexchin, Joel; Mintzes, Barbara; Schroll, Jeppe B; Bero, Lisa (2017-02-16). "Industry sponsorship and research outcome". Cochrane Database of Systematic Reviews 2: MR000033. doi:10.1002/14651858.MR000033.pub3. PMID 28207928. https://portal.findresearcher.sdu.dk/da/publications/9a5fd1d6-2b2e-4f81-8014-e03cf1977f89. 
  71. van der Mierden, Stevie (2019). "Software tools for literature screening in systematic reviews in biomedical research". ALTEX 36 (3): 508–517. doi:10.14573/altex.1902131. PMID 31113000. 

External links

https://en.wikipedia.org/wiki/Systematic review was the original source.