But is such certainty possible--or even desirable? Medicine, after all, is a personalized service, one built around the uniqueness of each patient and the skilled physician's ability to design care accordingly. "I'm worried about training a generation of physicians who don't have the other skills they need for the optimal practice of medicine," says Dr. Mark Tonelli, a pulmonary-care specialist at the University of Washington in Seattle. "They can read the scientific literature, understand the statistics, but they don't understand how that should influence their treatment of the individual in front of them." What's more, some insurance companies have been very aggressive in using evidence-based arguments to deny payment for untested treatments--a circular problem, because how do you create the evidence the insurers demand unless you test the untested?
Whatever the merits of evidence-based medicine, it got off to a rocky start. When Guyatt began championing it back in the 1990s, he called it "scientific medicine," but he soon learned that if you want to start a revolution, it helps to pick the right slogan. Many of his colleagues were outraged by the implied insult to their expertise. So he switched to "evidence-based," and tempers cooled.
Guyatt's ideas complemented the work of the Cochrane Collaboration, an international network of researchers, physicians and others that was founded in 1993 to systematically gather and evaluate the knowledge found in medical research. The organization aggregates all published scientific studies on a particular treatment question to get a sense of the field. Then reviewers carefully consider the design of the research to determine just how strong the evidence is. One of its most famous reports was a 2005 finding, based on 139 studies, that there was "no credible evidence" the vaccine against measles, mumps and rubella was involved in the development of either autism or Crohn's disease.
Guyatt and another doctor, David Sackett, wanted to go a step further by making sure doctors used the evidence that was collected and ranked. Many physicians began doing just that, but there have been a few nasty surprises.
Consider the case of Dr. Daniel Merenstein, a family-medicine physician trained in evidence-based practice. In 1999 Merenstein examined a healthy 53-year-old man who showed no signs of prostate cancer. As he had been taught, Merenstein explained to his patient that there are advantages and disadvantages to having a blood test for prostate-specific antigen (PSA). The test can lead to early detection of prostate cancer but also to unnecessary biopsies and even treatment--with all its attendant risks of impotence and incontinence--for a cancer that might have grown so slowly that it didn't need immediate attention. And for aggressive prostate cancers, there is little evidence that early detection makes a difference in whether treatment can save your life. After weighing the options, the patient did not get a PSA test.