Clinical Librarian, Washington University School of Medicine, St. Louis, Missouri
Objectives: PRISMA has been widely adopted as a standard for reporting systematic review findings. Item eight on the PRISMA checklist requires authors to “present [a] full electronic search strategy for at least one database…such that it could be repeated.” Our study aims to evaluate the reproducibility of search strategies reported in systematic reviews that state they followed PRISMA guidelines.
Methods: A search was executed in Ovid MEDLINE to find systematic reviews that included the term “PRISMA” in the title or abstract. Results were limited to English-language articles published 2010-2019, retrieving 4815 citations. Two hundred citations were selected for testing the inter-rater reliability (IRR) of our search reproducibility assessment tool, which consists of seven questions that determine whether a reproducible search strategy is present in a systematic review. IRR testing was conducted with four reviewers.
A sub-analysis of a subset of the 200 citations used for inter-rater reliability testing was conducted to determine how many of the articles included reproducible search strategies. These preliminary results also helped determine whether the larger study should proceed.
Results: Fleiss' kappa showed good to excellent agreement on each of the seven items of the search reproducibility assessment tool (0.772 to 0.900), indicating that the tool can reliably be used to rate the reproducibility of searches in the remaining 4615 citations. Of the 200 citations used for IRR testing, 125 had coding agreement from three of the four reviewers on all seven questions and were used for a sub-analysis. Of these 125, only 52 (41%) included an electronic search strategy and only 32 (25%) included a reproducible one. These findings indicate that the larger study is warranted.
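Fleiss' kappa extends Cohen's kappa to agreement among more than two raters. As an illustration of the statistic used above (a minimal pure-Python sketch with hypothetical rating data, not the study's actual analysis code):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for a subjects-by-categories table of rating counts.

    counts[i][j] = number of raters who assigned subject i to category j;
    every row must sum to the same number of raters n.
    """
    N = len(counts)        # number of subjects rated
    n = sum(counts[0])     # raters per subject
    k = len(counts[0])     # number of categories

    # Mean per-subject agreement: P_i = (sum_j n_ij^2 - n) / (n(n-1))
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts
    ) / N

    # Expected chance agreement from the marginal category proportions
    p = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    Pe_bar = sum(pj * pj for pj in p)

    return (P_bar - Pe_bar) / (1 - Pe_bar)


# Hypothetical example: four raters answer one yes/no checklist question
# for five articles; counts[i] = [# "yes", # "no"] for article i.
ratings = [[4, 0], [3, 1], [4, 0], [0, 4], [4, 0]]
print(round(fleiss_kappa(ratings), 3))  # 0.733
```

Values in the 0.61-0.80 range are conventionally read as substantial agreement, which is how per-item kappas such as those reported above are typically interpreted.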
Conclusions: Though the larger research project is not yet complete, our initial findings suggest that many systematic reviews reporting use of PRISMA standards do not actually include reproducible search strategies. By developing an assessment tool and methodically coding the articles, we can begin to document that authors are not adhering to this PRISMA reporting checklist item. Systematic review authors may thus be contributing to the broader problem of irreproducibility in science, and the quality of a systematic review is harder to assess when adequate information about its search methods is not reported.