It’s been clear for some time that consumers don’t tend to make use of health care quality reports even when they’re available. If anything, consumer reliance on such reporting is dropping. According to the Kaiser Family Foundation:
Fewer U.S. residents use Web sites that rate health services when selecting insurance plans, hospitals or physicians, according to state and national studies…
A survey released in October by the Kaiser Family Foundation found that fewer than 15% of U.S. residents used quality ratings services to help them make decisions about health insurance plans, hospitals or physicians, compared with about 20% of people who said they had used comparative quality ratings in 2004 and 2006. Most people said they never have seen or used comparative quality information services, the survey found…
Bryan Liang, executive director of the Institute of Health Law Studies at California Western School of Law, said, “The basic problem of these kinds of ranking systems is that patients do not choose on the basis of scores,” adding, “They choose on the basis of personal familiarity and experience with the health care entity or provider.”
It sure sounds like consumers need more education to act on the information that’s out there. The CMS website Hospital Compare has even been running an advertising campaign to encourage consumers to make use of such data.
But maybe consumers are rational to ignore quality ratings, at least for hospitals. An article in the current Health Affairs (Choosing The Best Hospital: The Limitations of Public Quality Reporting) reveals why this may be so.
The authors identified five public reporting services (Hospital Compare, HealthGrades, Leapfrog Group, US News and World Report, and Massachusetts Quality and Cost Council) and used them to compare various Boston-area hospitals on four non-emergent conditions: community-acquired pneumonia, total hip replacement, percutaneous coronary intervention (PCI), and coronary artery bypass graft (CABG).
The services provided wildly inconsistent rankings, even on the same measures. For example:
…the two hospitals ranked first for CABG by at least one service were also ranked fourth and last by another… Conversely, the two hospitals that were ranked last for CABG by at least one service were ranked first or second by another.
Even more damning, the ratings don’t reveal any meaningful differences among the hospitals.
Most rating services did not perform statistical tests, but when they did, all nine hospitals were indistinguishable. In fact, only one hospital (out of 71) in the state had cardiac mortality that was statistically better than the mean.
It’s actually even worse than that. The scores for each hospital represent an average across physicians and cases. Who’s really able to say what the quality will be for a given patient with a given doctor?
Considering the state of the art, it’s no wonder people rely on personal experience, relationships, and anecdotes when choosing a hospital.
By the way, my preferred way to choose a hospital or physician is to speak to the fellows, or, better yet, have a family member who’s a doctor do so. Fellows are in the best position to see and understand what really goes on, and they are still young and idealistic enough to level with you about it. I admit this is not a practical route for most people, but that doesn’t stop me from recommending it.