Making patients have "skin in the game"

Original Reporting | By James Lardner

April 20, 2011 — The House of Representatives’ approval of a deficit-slashing plan for 2012 and beyond may not have been everyone’s idea of “our generation’s defining moment,” as budget committee chairman Paul Ryan proclaimed it. Beyond any question, however, that vote was a breakthrough for the “skin in the game” school of health care policy.

There may be a host of consumer products for which Americans commonly yearn, but, as Shannon Brownlee pointed out, “I don’t think anyone says, ‘Oh goody, I get to be in the hospital; oh goody, I get to be in the ICU; oh yippee, I get to have open-heart surgery.’”

For decades, advocates and experts affiliated with such free-market think tanks as the Heritage Foundation, the Cato Institute, and the Hoover Institution have criticized the system of “third party payment” that, in their view, over-insulates Americans (those fortunate enough to have insurance, at any rate) from the cost of their health care decisions. With more of our personal dollars at stake — more “skin in the game” — we will learn to make smarter health care choices, the argument goes, and providers will respond by competing and innovating harder. That kind of approach, according to the theory, will lead to a system characterized by better care — or at least equally good care — at lower net cost.

Although it was Republicans who voted last week to turn Medicare into a privatized, insurance-shopping voucher arrangement, the skin-in-the-game theory also commands wide support among centrist and conservative Democrats. Ironically, the theory’s ascent has occurred in the face of groundbreaking research on how Americans actually make health care decisions — research that appears to punch big holes in the skin-in-the-game hypothesis.


Patients cutting back on preventive care

A Rand Corporation study, described in the March 2011 issue of the American Journal of Managed Care, compared enrollees in traditional health insurance plans with enrollees in so-called “consumer directed” health plans. The latter plans were characterized by high deductibles and tax-free or employer-supported health savings accounts — a combination of features championed by skin-in-the-game theorists as a way of turning Americans into more discerning and engaged “consumers” of health care.

Rand found that people in the consumer-directed plans were more likely to economize on doctor visits and medicines that would cost them money out of pocket. But in a result that one of the lead researchers, Neeraj Sood, characterized as “not ideal,” patients also cut back on preventive care, even though such care was fully covered under all the plans in the study.

In a phone interview with Remapping Debate, Sood posited two likely explanations. Some people, he said, may simply not have realized that preventive care was exempt from their plans’ deductibles. But even for those who did, “a lot of preventive care is initiated when you actually see a doctor,” Sood pointed out, “and if you have high cost-sharing you’re probably seeing your doctor less often and therefore there is less opportunity to start conversations about preventive care.”

Advocates of vouchers, health savings accounts, and other market-oriented policies often point to another Rand study, dating from the 1970s and early 1980s, as proof that more cost-sharing leads to less care at no sacrifice of health. But even that study raised questions about the skin-in-the-game model, according to Shannon Brownlee, an instructor at the Dartmouth Institute for Health Policy and Clinical Practice and author of “Overtreated: Why Too Much Medicine Is Making Us Sicker and Poorer.”

“Yes, when people had more skin in the game they were more judicious about spending money on health care,” Brownlee said in a phone interview. “But they didn’t do it in a rational way. They were just as likely to forego care that they needed as care that they didn’t need. So they weren’t really very prudent consumers of health care — they were simply more worried about spending money on health care.”


Higher co-pays, more hospital time

Because the latest Rand study tracked its subjects for only one year, it is impossible to know whether short-term neglect of worthwhile care led to long-term health problems. That, however, is the pattern strongly suggested by another recent study, involving enrollees in private Medicare Advantage plans that have experienced sharp increases in co-pays and deductibles. Among this group, a research team led by Amal Trivedi, an assistant professor of community health at Brown University, found fewer doctor visits but, at the same time, more and longer hospital stays. The study, “Increased Ambulatory Care Copayments and Hospitalizations among the Elderly,” was published in the New England Journal of Medicine last year.

“For every one hundred people exposed to a doubling of co-pays, there were 20 fewer outpatient visits, two additional hospital admissions, and 13 more inpatient days,” Trivedi said. The increased hospital spending, he added, was sufficient to “dwarf” the savings on ambulatory care.