By Michael Haederle

Reproducing Knowledge

Finding a Better Way to Conduct Systematic Reviews

In the health sciences, systematic reviews play a key role in illuminating a subject, combining comprehensive literature searches with in-depth analysis to provide the best available information on that topic.

But these reviews are only as good as their information-gathering methodology – and it turns out that their quality is not what it should be.

“We see a lot of people who are doing systematic reviews,” says Melissa Rethlefsen, who recently started as executive director of The University of New Mexico Health Sciences Library & Informatics Center.

Often, “people don’t accurately report what they’ve done, so you can’t tell what database they’ve accessed,” she says. “They use very poor methodology. You can’t actually tell what they did, so you can’t assess it for yourself.”

Systematic reviews are sometimes described as a kind of “evidence synthesis,” Rethlefsen says. They not only shape medical and scientific decision-making, but they also guide policymakers.

In a new paper published online in the journal Systematic Reviews, Rethlefsen and colleagues in the U.S., U.K., Canada, Germany and Australia propose guidelines to help ensure that reviewers follow well-defined steps to ensure that their searches are accurate and reproducible.

“We realized that there was no consensus on what actually constituted a reproducible search or reproducible methods,” Rethlefsen says. “We want to counteract what we’re seeing as major problems with this whole body of literature. We are really trying to improve the quality of those reviews.”

Rethlefsen and her colleagues focused on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement, crafted in 2009 as a first step toward addressing this problem.

The PRISMA checklist includes three sections specifically pertaining to literature searches, but the team found that they were worded so broadly that many researchers following the guidelines omitted key steps, she says.

Literature searches are complicated by the fact that references to previously published research may be scattered across multiple databases, each with a unique search and indexing protocol, Rethlefsen says.

“What we’re seeing is people are doing these kinds of literature reviews who haven’t been trained and don’t understand the nuances of how all these information systems work,” she says.

Rethlefsen and her colleagues are proposing a 16-item checklist dubbed PRISMA-S as a step toward making the review process more accurate and transparent. It was drawn up with an eye toward a 2020 revision of the main PRISMA Statement, due to be published soon.

“There was a huge pent-up demand for it,” she says of their new checklist, which has already had more than 5,000 views since it was posted in late January.

Another way to enhance the quality of systematic reviews is to enlist the services of an information specialist or librarian to help craft literature searches – but that option isn’t available to everyone. “It can be cost prohibitive because not every team necessarily has a librarian accessible to them,” Rethlefsen says.

Meanwhile, the PRISMA-S checklist could easily be applied to other disciplines, such as the environmental and social sciences, she says. “It’s going to be part of our process moving forward, trying to work with researchers in those domains to help them understand.”
