Evidence Based Medicine
Are Doctors Just Playing Hunches?
Nobody pretends medicine is easy, but if there’s one thing we ought to be able to rely on, it’s that the doctors looking out for us are doing more than playing hunches. We take certain medicines because they work, right? We go into the operating room for certain procedures because they’ll make us well, don’t we? Well, maybe. More and more, however, doctors are making the unnerving case that no matter how reliable a drug or other treatment appears to be, too often there’s simply little hard evidence that it makes a long-term difference in a person’s quality of life or survival. Obviously, drugs are tested rigorously to show that they are safe and effective before they are approved by regulators in the U.S. and other developed countries. But a clinical study is not the real world, and just because a drug leads to a statistically significant improvement in, say, cholesterol levels doesn’t guarantee that the desired effect, a healthier heart and a longer life, will follow.
Often your doctor is left to make prescription decisions based at least in part on faith, bias or even an educated guess. That ought to be enough to spook even the least jumpy patient, but the fact is, recognizing just what a roll of the dice medicine can be may be a good thing. Increasingly, doctors seeking to provide their patients with the best possible care are exploring what is known as evidence-based medicine: a hard, cold, empirical look at what works, what doesn’t and how to distinguish between the two. It’s not enough to prove that a particular blood test or CT scan really spots cancer, for example. You also need to know whether early detection of that cancer makes a difference in your ability to respond to treatment, or whether it merely means that you would die at the same point but learn about your illness earlier than you would have without the test.
Evidence-based medicine, which uses volumes of studies and show-me skepticism to answer such questions, is now being taught, with varying degrees of success, at every medical school in North America. It has been extraordinarily successful in shooting down some of the most cherished beliefs in health care, like the idea that long-term hormone-replacement therapy would help prevent heart disease in women. And it has clearly saved lives. Many doctors used to give anti-arrhythmia drugs to everyone who experienced irregular heartbeats after a heart attack because severely irregular beats could rapidly prove fatal. But then came the results of a randomized trial showing that patients with only mildly irregular heartbeats were more likely to die if given the anti-arrhythmia medication than their untreated counterparts were. Doctors now prescribe more judiciously, though treatment still saves lives in the case of severe arrhythmias.
Advocates believe that evidence-based medicine can go much further, reducing the reliance on expert opinion and overturning the flawed assumptions and even financial incentives that underlie many decisions. “This is a whole way of looking at the world,” says Dr. Gordon Guyatt of McMaster University in Hamilton, Ont., who coined the term and is a pioneer of the evidence-based movement. But is such certainty possible, or even desirable? Medicine, after all, is a personalized service, one built around the uniqueness of each patient and the skilled physician’s ability to design care accordingly. “I’m worried about training a generation of physicians who don’t have the other skills they need for the optimal practice of medicine,” says Dr. Mark Tonelli, a pulmonary-care specialist at the University of Washington in Seattle. “They can read the scientific literature, understand the statistics, but they don’t understand how that should influence their treatment of the individual in front of them.”
What’s more, some insurance companies have been very aggressive in using evidence-based arguments to deny payment for untested treatments, a circular problem, because how do you create the evidence the insurers demand unless you test the untested? Whatever the merits of evidence-based medicine, it got off to a rocky start. When Guyatt began championing it back in the 1990s, he called it “scientific medicine,” but he learned quickly that if you want to start a revolution, it helps to pick the right slogan. Many of his colleagues were outraged by the implied insult to their expertise. So he quickly went with “evidence-based,” and tempers cooled. Guyatt’s ideas complemented the work of the Cochrane Collaboration, an international network of researchers, physicians and others that was founded in 1993 to systematically gather and evaluate the knowledge found in medical research.
The organization aggregates all published scientific studies on a particular treatment question to get a sense of the field. Then reviewers carefully consider the design of the research to determine just how strong the evidence is. One of their most famous reports was a 2005 finding based on 139 studies showing that there was “no credible evidence” that the vaccine against measles, mumps and rubella was involved in the development of either autism or Crohn’s disease. Guyatt and another doctor, David Sackett, wanted to go a step further by making sure doctors used the evidence that was collected and ranked. Many physicians began doing just that, but there have been a few nasty surprises. Consider the case of Dr. Daniel Merenstein, a family-medicine physician trained in evidence-based practice. In 1999 Merenstein examined a healthy 53-year-old man who showed no signs of prostate cancer.
As he had been taught, Merenstein explained to his patient that there are advantages and disadvantages to having a blood test for prostate-specific antigen (PSA). The test can lead to early detection of prostate cancer but also to unnecessary biopsies and even treatment, with all its attendant risks of impotence and incontinence, for a cancer that might have grown so slowly that it didn’t need immediate attention. And for aggressive prostate cancers, there is little evidence that early detection makes a difference in whether treatment can save your life. As a result, the patient did not get a PSA test. Unfortunately, several years later, the patient was found to have a very aggressive and incurable prostate cancer. He sued Merenstein for not ordering a PSA test, and a jury agreed, despite the lack of evidence that it would have made a difference. Most doctors in the plaintiff’s state, the lawyers showed, would have ignored the debate and simply ordered the test. Although Merenstein was found not liable, the residency program that trained him in evidence-based practice was, to the tune of $1 million.
Even champions of evidence-based practice acknowledge that the approach has limits. “Some things can’t be tested in randomized trials, and some things are so obvious, they don’t need it,” says Dr. Paul Glasziou, director of the Center for Evidence-Based Medicine in Oxford, England. There have never been randomized trials to show that giving electrical shocks to a heart that has stopped beating saves more lives than doing nothing, for example. Similarly, giving antibiotics to treat pneumonia has never been rigorously tested from a scientific point of view. It’s clear to everyone, however, that if you want to survive a bout of bacterial pneumonia, antibiotics are your best bet, and nobody would want to go into cardiac arrest without a crash cart handy. “Where randomized trials are most important is where you’re trying to affect a long-term condition, like stroke or cancer,” Glasziou says.
Finally, the very definition of evidence-based medicine is something of a moving target. Physicians who encouraged their female patients to take hormone-replacement therapy to prevent heart problems later on were practicing a kind of evidence-based medicine, since the best available evidence at the time, observational studies and the like, suggested a benefit. Of course, when a randomized controlled trial showed otherwise, the advice changed. Even at that, the case is not entirely closed. Some researchers now believe there may be a window of opportunity right around the years of menopause during which hormone-replacement therapy could help the heart. Proving that would, naturally, require another study. All the same, few people deny that the trend in medicine is increasingly to be guided, if not governed, by the data, an idea that is spreading to other fields as well.
Evidence-based practice is now being taught in nursing, general education and even philanthropy, thanks to the influence of the Bill and Melinda Gates Foundation, a results-based group if ever there was one. You could see even the political fights over global warming as the birth pangs of the new practice of evidence-based policy. But it is in medicine that the practice will have the most emotional impact. All patients would probably benefit if their doctors were abreast of the latest data, but none would benefit from being reduced to one of those statistical points. “You have to be able to take a good history and do a physical examination,” Guyatt says. “And you have to have an understanding of patients’ values and preferences.” As much as some physicians might wish it otherwise, there is still as much art to medicine as there is science.