Practice Perfect - PRESENT Podiatry

Reliability of Medical Research: Are We In Trouble?

Jarrod Shapiro

As an academic podiatrist teaching at one of our colleges and running a residency program, I spend a reasonable amount of time participating in and thinking about journal clubs and medical research. I typically participate in three to four journal clubs per month, reviewing somewhere around six to ten articles over the month. Some of my faculty partners hate journal clubs, and I don’t blame them. These sessions can sometimes drag on, especially if we delve deeply into the biostatistics. Luckily for me, I have a couple of strong academic partners for that part.

Journal Clubs Uncover the Truth About Clinical Research Articles

I’ve always felt journal clubs are an important part of the educational process for our students and residents, and also a key method to remain current and active in an ever-changing field. They also serve as a key component of evidence-based practice. Given my feelings on this topic, I find it especially sad that at least 95% of the time, the articles we review turn out to be junk.


At least 95% of the time, the articles we review turn out to be junk.


It’s incredibly rare that our journal clubs come out at the end saying, “Wow, this article is really going to change how I practice podiatric medicine and surgery.” Sad, right?

Is It Us?

One might think this is unique to podiatry, that somehow our profession is particularly bad at performing research. Not even close. Take a look at the orthopedic literature, for example. There’s no shortage of junk research there. Even the Journal of Bone and Joint Surgery has published research with poor methodology. Better than some of our journals, I’ll concede, but still not great. One thing they do is publish a level of evidence for each study. I won’t argue the merits of that idea (it’s a good one), but there are too many times when the stated evidence level doesn’t match the study. In cases like this, they’re just putting lipstick on a pig. It may have pretty lips, but it’s still a pig.


Even the Journal of Bone and Joint Surgery has published research with poor methodology. Publishing an article with a poor evidence level is just putting lipstick on a pig. It may have pretty lips, but it’s still a pig.


Let me now disabuse you of the idea that this is just a podiatry-ortho problem. In April 2015, the Wellcome Trust held a symposium in London on just this subject, examining the “reproducibility and reliability of biomedical research”.1 The participants stated that possibly as much as half of the scientific literature may be untrue.1 Is there any evidence for this?


Possibly as much as half of the scientific literature may be untrue.1


It All Starts with the Abstract

Let’s start with the abstract. Pitkin and colleagues2 looked at 44 abstracts from each of six medical journals (the big ones – JAMA, NEJM, Lancet, Annals of Internal Medicine, the Canadian Medical Association Journal, and the British Medical Journal) published over a 15-month period. Abstracts were considered inconsistent if the information in the abstract was either different from or absent in the actual study.

And Ends with the Citations

If the abstract is the beginning, what about the end? How accurate are the citations? Luo and colleagues analyzed 249 references and 408 quotations from 25 articles published in five foot and ankle surgery journals.3 They found a citation error rate of 41% (103 errors out of 249 references) and a quotation error rate of 20% (80 errors out of 408 quotations). Not even the references are accurate!
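For readers who want to check the arithmetic, the reported rates fall straight out of the raw counts. This little sketch is my own illustration, not something from the paper; the 95% confidence intervals use a simple normal approximation that the authors did not report:

```python
import math

def error_rate(errors, total):
    """Return (proportion, 95% CI lower bound, 95% CI upper bound)."""
    p = errors / total
    # Normal-approximation margin of error for a proportion
    margin = 1.96 * math.sqrt(p * (1 - p) / total)
    return p, p - margin, p + margin

# Counts reported by Luo and colleagues
cite_p, cite_lo, cite_hi = error_rate(103, 249)    # citation errors
quote_p, quote_lo, quote_hi = error_rate(80, 408)  # quotation errors

print(f"Citation error rate: {cite_p:.0%} (95% CI {cite_lo:.0%}-{cite_hi:.0%})")
print(f"Quotation error rate: {quote_p:.0%} (95% CI {quote_lo:.0%}-{quote_hi:.0%})")
```

The point of the interval is that even allowing for sampling variability, the citation error rate stays uncomfortably high.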

Now, I can give researchers some leeway on this. As someone who does educational writing and has written some book chapters, I can tell you that references can be unwieldy, with lots of chances to make errors. On the other hand, these are professional journals for which peer review is an important part of the process.

And the Study Design Isn’t so Great Either

How about the middle? If the beginning and the end contain inaccurate information, what about the actual research? For the sake of time (and my own sanity), let’s take one representative part of the methods of research studies: the statistics. Parsons and colleagues surveyed 100 orthopedic journal articles using a validated survey model.4 They found significant failings in study design, statistical analysis, and the presentation of results:

In 17% of studies the conclusions were not justified by the results.
In 39% of studies the researchers used the wrong statistical methods.
In 17% a different analysis could have made a difference in the overall conclusions.

It’s the Best We’ve Got - Make it Better

Are you surprised by these results? I’m not. I can tell you that almost every journal article we review in our journal clubs has methodological or reporting errors. It makes one question the basis for our practice of medicine. I haven’t even brought up the concept of repeating studies. How many studies actually repeat a prior study to see if it was accurate? Very few. Some have suggested creating a research subspecialty of confirmatory research, which is a great idea.


So, what do we do with all this? We shouldn’t just give up and say, “That’s how it is.” Instead, we should all demand more from our researchers. We need to create clear methodological protocols for research to limit the amount of variability. If our journals publish peer-reviewed articles that are inaccurate, we need to write in to those journals and complain. Finally, until all of this inaccuracy changes, we each need to be highly skeptical of what we read. Don’t simply believe something because it is written. Use your mind and your own judgment to determine whether a study has valid results and whether you should use those results to change your clinical practice. Until that blue-sky day of perfect research arrives (don’t hold your breath), remain, as I do, a skeptic.


Use your mind and your own judgment to determine if a study has valid results and if you should use those results to change your clinical practice.


Best wishes.
Jarrod Shapiro, DPM
PRESENT Practice Perfect Editor
[email protected]
References
  1. Horton R. Offline: What is medicine’s 5-sigma? The Lancet. 2015;385:1380.
  2. Pitkin RM, Branagan MA, Burmeister LF. Accuracy of data in abstracts of published research articles. JAMA. 1999 Mar 24-31;281(12):1110-1111.
  3. Luo M, Li CC, Molina D 4th, et al. Accuracy of citation and quotation in foot and ankle surgery journals. Foot Ankle Int. 2013 Jul;34(7):949-955.
  4. Parsons NR, Price CL, Hiskens R, et al. An evaluation of the quality of statistical design and analysis of published medical research: results from a systematic survey of general orthopaedic journals. BMC Medical Research Methodology. 2012;12(60):1-9.
