Sunday, October 22, 2006

There's a study for everything

Like its religious counterparts the Bible and the Koran, both notorious for providing great interpretive flexibility in their guidance of moral human behavior, the medical literature also, while providing a framework for best medical practice, leaves plenty of space for, well, the art of medicine. The Bible and the Koran are well known for passages that have been used to rationalize slavery, violence, and hatred, even while their overriding themes teach against these practices. In the medical literature, similarly, if you are willing to dig a little bit (and extrapolate to a reasonable extent), there is in fact a study for everything. If you want to prove that eating red clover slows down aging, there's a study to back you up. Conversely, if your point is to show that red clover is actually toxic, well, you'll find that too. What's important is not the details of the particular study, but your ability to spout it off quickly and confidently. Because evidence (the noblest form of which is the randomized controlled trial) occupies such an exalted place in medicine, beyond the reproach of common practice, logic, physiology, sensibility, or any other possible objections of the mere human mind, one good study in your pocket is all you need to justify pretty much anything you wish to do. In fact, the more obscure your study, the better, as nobody will be able to challenge a study that they haven't read.

Pharmaceutical companies and representatives are well acquainted with the power and the flexibility of studies. It's a simple numbers game, actually: if aspirin and Plavix, for example, work equally well in stroke prevention, and you run the study comparing them twenty times, requiring 95% confidence for a positive conclusion, on average one out of those twenty studies will show a difference purely by chance. Eureka! You can discard the other nineteen studies, since, as already mentioned, all you really need is one.
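The numbers game above is easy to demonstrate. Here is a toy simulation (not from the post — the sample size, stroke rate, and choice of test are all made up for illustration): two "drugs" with the identical true stroke rate are compared twenty times with a two-proportion z-test at the 95% confidence level, and on average about one of the twenty runs crosses the significance threshold anyway.

```python
import random
import math

random.seed(1)

def false_positive(n=1000, rate=0.05, alpha_z=1.96):
    """Compare two 'drugs' with the SAME true stroke rate using a
    two-proportion z-test; return True if we 'detect' a difference.
    All parameters here are hypothetical, chosen for illustration."""
    a = sum(random.random() < rate for _ in range(n))  # strokes on drug A
    b = sum(random.random() < rate for _ in range(n))  # strokes on drug B
    p_pool = (a + b) / (2 * n)
    se = math.sqrt(2 * p_pool * (1 - p_pool) / n)
    if se == 0:
        return False
    z = (a - b) / (n * se)
    return abs(z) > alpha_z  # two-sided test at ~95% confidence

# Run the "same study" twenty times. Even though the drugs are
# identical by construction, roughly one run in twenty will declare
# a winner — the study you keep.
hits = sum(false_positive() for _ in range(20))
print(f"{hits} of 20 identical comparisons reached 'significance'")
```

Over many repetitions the false-positive rate converges to about 5%, which is exactly what "95% confidence" means: the threshold is a rate of being fooled, not a guarantee against it.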

Medicine does have a defense against the capriciousness of individual studies, though; it's called the meta-analysis. In a meta-analysis, you collect all of the studies you can find on a subject, pool them together, and determine the overall result. Unfortunately, even meta-analyses don't always agree with each other. Case in point: my review of Pseudomonas double coverage last month. A meta-analysis of 17 studies by Safdar et al. in 2004 showed that antibiotic double coverage of Pseudomonas bacteremias decreases mortality by half, while a meta-analysis of 64 studies by Paul et al. in 2006 showed no benefit to double coverage. What we need is a meta-meta-analysis. A supercalifragilistic-expialidocious-alysis.
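For the curious, the "pool them together" step usually means weighting each study by its precision. The sketch below shows the basic fixed-effect (inverse-variance) approach with entirely made-up numbers; it is not a reconstruction of the Safdar or Paul analyses, which used their own methods and data.

```python
import math

# Hypothetical per-study results: (log relative risk, standard error).
# Invented for illustration only — NOT data from Safdar et al. 2004
# or Paul et al. 2006.
studies = [(-0.40, 0.30), (0.10, 0.25), (-0.15, 0.20), (0.05, 0.35)]

# Fixed-effect pooling: each study is weighted by 1/se^2, so precise
# (large, low-variance) studies dominate the pooled estimate.
weights = [1 / se**2 for _, se in studies]
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

# 95% confidence interval for the pooled log relative risk.
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled RR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(lo):.2f}-{math.exp(hi):.2f})")
```

Notice that with these toy inputs the pooled confidence interval straddles a relative risk of 1.0 — individually suggestive studies can wash out to "no clear benefit" once pooled, which is precisely how two meta-analyses drawing on different study sets can land on opposite conclusions.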

Of course, there aren't any studies when it comes to doing things that actually make sense. The study to determine whether parachutes reduce mortality in skydiving, for example, has never been done. Crazy ideas, like making residents go home after 30-hour shifts instead of finishing out their work days post-call—those are the only ones that need studies. For that reason, the next time that somebody quotes a study to you, be advised that whatever it is they're proposing, it's probably nuts. After all, if you need a study to justify something, it means it already failed the test of common sense.