
Systematic Literature Reviews (SLRs) are the gold standard in evidence synthesis, occupying the pinnacle of the evidence pyramid.[1] Their trustworthiness is paramount, as SLRs frequently form the foundation of evidence-based guidelines and consensus statements.[2] SLRs differ from narrative reviews in that they aim to provide a comprehensive, unbiased summary of all relevant research on a specific question, weighing the evidence both for and against the topic of interest. SLRs also help identify gaps in current knowledge, thereby giving direction to future research efforts. It is therefore essential that the methods employed in conducting an SLR are robust, authentic, and reliable so that the resulting evidence is trustworthy.[1,2]
Poorly conducted or reported SLRs can have wide-ranging negative effects. Despite their objective of providing an evidence-based synthesis, SLRs at times do not meet the rigorous standards expected of them. Critics argue that SLRs of poor methodological quality or with a high degree of bias contribute to research waste and can be misleading or serve conflicted interests.[3] Many poor-quality reviews continue to be published, even though clear guidance has long been available.[4]
In an era of perverse academic incentives, the publication of redundant, overlapping, unreliable, or poor-quality SLRs continues unabated. Research has identified several recurring problems. Redundancy is one: multiple SLRs often cover the same topic and reach similar conclusions. Methodological flaws are also prevalent, such as inadequate search strategies, incomplete data extraction, and poor statistical analyses. Biased conclusions are another problem, with exaggerated or misleading interpretations of results. Poor reporting compounds these issues, with inadequate disclosure of methods, conflicts of interest, or funding sources. These flaws have been highlighted in various studies, yet their impact has not been adequately addressed; consolidating these findings is crucial to understanding the scale of the problem and pushing for improvements in SLR quality.[5,6] A study conducted in 2023 found that between 2000 and November 2022, at least 485 articles documented issues with published SLRs, ranging from editorials highlighting concerns over specific reviews to rigorous analyses of problems affecting hundreds or thousands of reviews.[7]
To ensure that SLRs achieve their potential as reliable sources of evidence, specific measures must be implemented and rigorous standards maintained. SLRs should aim to include all relevant studies; problems arise when relevant studies are missed or ignored, which can compromise the review’s validity. Such omissions can stem from overly stringent inclusion criteria, exclusion of grey literature, insufficient or outdated literature searches, and language restrictions. Appropriate methods must also be used to ensure the methodological soundness of the review, since errors in its conduct or a lack of expertise can jeopardize its internal validity. Contributing issues include data extraction errors, flawed risk-of-bias assessments, limited quality assessment, and failure to incorporate risk of bias into the conclusions.[7]
To ensure the reproducibility of SLRs, it is essential to report their methods in sufficient detail. Poor reporting quality or inaccessible methods can hinder the ability of others to replicate the review’s findings, which is particularly problematic when reviews are used to inform important decisions. To address this, review authors should adhere to reporting guidelines such as PRISMA 2020 (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) and register their review protocols in registries such as PROSPERO.[7,8]
SLRs can become outdated over time owing to the rapid pace of scientific research: new studies are constantly being published and may challenge or modify the findings of existing reviews, and the context in which an SLR was conducted can change, rendering its findings less relevant or applicable. Living SLRs are a dynamic approach to evidence synthesis that addresses this limitation: they are continuously updated as new research emerges, ensuring that the conclusions and recommendations of the review remain relevant and accurate over time.[8]
Tools such as AMSTAR 2 (Assessment of Multiple Systematic Reviews) and ROBIS (Risk Of Bias In Systematic reviews) have been developed to assess the methodological quality and risk of bias of SLRs that have already been published. Such tools evaluate published SLRs in terms of internal validity, risk of bias, and overall methodological quality. Further, double-checking extracted data and consulting statistical experts help ensure the consistency of results and validate findings, increasing confidence in the review’s conclusions. Results should be interpreted with careful consideration of quality, risk of bias, and certainty of the evidence, and any limitations or gaps in the evidence base should be acknowledged. Additionally, disclosing potential conflicts of interest and managing researcher bias are critical to ensuring that SLR conclusions are not unduly influenced by conflicted parties and that the review’s findings can be trusted by stakeholders.[8-12]
In conclusion, the reliability of SLRs is crucial in guiding healthcare and policy decisions. As the volume of research continues to expand, ensuring the integrity and rigor of systematic reviews is more important than ever. By following best practices, reporting transparently, and applying methodological frameworks properly, the scientific community can safeguard the credibility of SLRs.
References:
- Murad MH, Asi N, Alsawas M, Alahdab F. New evidence pyramid. Evid Based Med. 2016 Aug;21(4):125–7.
- Uttley L, Quintana DS, Montgomery P, Carroll C, Page MJ, Falzon L, Sutton A, Moher D. The problems with systematic reviews: a living systematic review. Journal of Clinical Epidemiology. 2023 Apr 1;156:30-41.
- McSharry J. What health evidence can we trust when we need it most? Cochrane News [Internet]. Available from: https://www.cochrane.org/news/what-health-evidence-can-we-trust-when-we-need-it-most
- Ioannidis JP. The mass production of redundant, misleading, and conflicted systematic reviews and meta‐analyses. The Milbank Quarterly. 2016 Sep;94(3):485-514.
- Uttley L, Montgomery P. The influence of the team in conducting a systematic review. Systematic reviews. 2017 Dec;6:1-4.
- Chapelle C, Ollier E, Bonjean P, Locher C, Zufferey PJ, Cucherat M, Laporte S. Replication of systematic reviews: is it to the benefit or detriment of methodological quality? Journal of Clinical Epidemiology. 2023 Aug 28.
- Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, Shamseer L, Tetzlaff JM, Akl EA, Brennan SE, Chou R. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021 Mar 29;372.
- Shea BJ, Reeves BC, Wells G, Thuku M, Hamel C, Moran J, Moher D, Tugwell P, Welch V, Kristjansson E, Henry DA. AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ. 2017 Sep 21;358.
- University of Bristol. ROBIS tool [Internet]. Available from: https://www.bristol.ac.uk/population-health-sciences/projects/robis/robis-tool/
- Dang A, Chidirala S, Veeranki P, Vallish BN. A Critical Overview of Systematic Reviews of Chemotherapy for Advanced and Locally Advanced Pancreatic Cancer using both AMSTAR2 and ROBIS as Quality Assessment Tools. Rev Recent Clin Trials. 2021;16(2):180-192.
- Uttley L, Quintana DS, Montgomery P, Carroll C, Page MJ, Falzon L, Sutton A, Moher D. The problems with systematic reviews: a living systematic review. Journal of Clinical Epidemiology. 2023 Apr 1;156:30-41.
- Pussegoda K, Turner L, Garritty C, Mayhew A, Skidmore B, Stevens A, Boutron I, Sarkis-Onofre R, Bjerre LM, Hróbjartsson A, Altman DG. Systematic review adherence to methodological or reporting quality. Systematic reviews. 2017 Dec;6:1-4.

