The good news is that we know more about the economics of health care than we did when Clinton tried and failed to remake the system. There’s now a large body of evidence on what works and what doesn’t work in health care, and it’s not hard to see how to make dramatic improvements in US practice. As we’ll see, the evidence clearly shows that the key problem with the US health care system is its fragmentation. A history of failed attempts to introduce universal health insurance has left us with a system in which the government pays directly or indirectly for more than half of the nation’s health care, but the actual delivery both of insurance and of care is undertaken by a crazy quilt of private insurers, for-profit hospitals, and other players who add cost without adding value. A Canadian-style single-payer system, in which the government directly provides insurance, would almost surely be both cheaper and more effective than what we now have. And we could do even better if we learned from “integrated” systems, like the Veterans Administration, that directly provide some health care as well as medical insurance.

The bad news is that Washington currently seems incapable of accepting what the evidence on health care says. In particular, the Bush administration is under the influence of both industry lobbyists, especially those representing the drug companies, and a free-market ideology that is wholly inappropriate to health care issues. As a result, it seems determined to pursue policies that will increase the fragmentation of our system and swell the ranks of the uninsured.

Before we talk about reform, however, let’s talk about the current state of the US health care system. Let us begin by asking a seemingly naive question: What’s wrong with spending ever more on health care?

1.

Is health care spending a problem?

In 1960 the United States spent only 5.2 percent of GDP on health care. By 2004 that number had risen to 16 percent. At this point America spends more on health care than it does on food. But what’s wrong with that?

The starting point for any discussion of rising health care costs has to be the realization that these rising costs are, in an important sense, a sign of progress. Here’s how the Congressional Budget Office puts it, in the latest edition of its annual publication The Long-Term Budget Outlook:

Growth in health care spending has outstripped economic growth regardless of the source of its funding…. The major factor associated with that growth has been the development and increasing use of new medical technology…. In the health care field, unlike in many sectors of the economy, technological advances have generally raised costs rather than lowered them.

Notice the three points in that quote. First, health care spending is rising rapidly “regardless of the source of its funding.” Translation: although much health care is paid for by the government, this isn’t a simple case of runaway government spending, because private spending is rising at a comparably fast clip. “Comparing common benefits,” says the Kaiser Family Foundation,

changes in Medicare spending in the last three decades has largely tracked the growth rate in private health insurance premiums. Typically, Medicare increases have been lower than those of private health insurance.

Second, “new medical technology” is the major factor in rising spending: we spend more on medicine because there’s more that medicine can do. Third, in medical care, “technological advances have generally raised costs rather than lowered them”: although new technology surely produces cost savings in medicine, as elsewhere, the additional spending that takes place as a result of the expansion of medical possibilities outweighs those savings.

So far, this sounds like a happy story. We’ve found new ways to help people, and are spending more to take advantage of the opportunity. Why not view rising medical spending, like rising spending on, say, home entertainment systems, simply as a rational response to expanded choice? We would suggest two answers.

The first is that the US health care system is extremely inefficient, and this inefficiency becomes more costly as the health care sector becomes a larger fraction of the economy. Suppose, for example, that we believe that 30 percent of US health care spending is wasted, and always has been. In 1960, when health care was only 5.2 percent of GDP, that meant waste equal to only 1.5 percent of GDP. Now that the share of health care in the economy has more than tripled, so has the waste.
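To spell out the arithmetic (the 30 percent waste rate is, again, just an illustrative assumption, held constant across both years):

```latex
% Illustrative waste arithmetic, assuming a constant 30 percent waste rate
\[
\text{1960: } 0.30 \times 5.2\%\ \text{of GDP} \approx 1.5\%\ \text{of GDP}
\qquad
\text{2004: } 0.30 \times 16\%\ \text{of GDP} = 4.8\%\ \text{of GDP}
\]
```

Because the assumed waste rate never changes, the waste simply tracks the sector’s share of the economy: triple the share, and you triple the waste.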

This inefficiency is a bad thing in itself. What makes it literally fatal to thousands of Americans each year is that the inefficiency of our health care system exacerbates a second problem: our health care system often makes irrational choices, and rising costs exacerbate those irrationalities. Specifically, American health care tends to divide the population into insiders and outsiders. Insiders, who have good insurance, receive everything modern medicine can provide, no matter how expensive. Outsiders, who have poor insurance or none at all, receive very little. To take just one example, one study found that among Americans diagnosed with colorectal cancer, those without insurance were 70 percent more likely than those with insurance to die over the next three years.


In response to new medical technology, the system spends even more on insiders. But it compensates for higher spending on insiders, in part, by consigning more people to outsider status—robbing Peter of basic care in order to pay for Paul’s state-of-the-art treatment. Thus we have the cruel paradox that medical progress is bad for many Americans’ health.

This description of our health care problems may sound abstract. But we can make it concrete by looking at the crisis now afflicting employer-based health insurance.

2.

The unraveling of employer-based insurance

In 2003 only 16 percent of health care spending consisted of out-of-pocket expenditures by consumers. The rest was paid for by insurance, public or private. As we’ll see, this heavy reliance on insurance disturbs some economists, who believe that doctors and patients fail to make rational decisions about spending because third parties bear the costs of medical treatment. But it’s no use wishing that health care were sold like ordinary consumer goods, with individuals paying out of pocket for what they need. By its very nature, most health spending must be covered by insurance.

The reason is simple: in any given year, most people have small medical bills, while a few people have very large bills. In 2003, health spending roughly followed the “80–20 rule”: 20 percent of the population accounted for 80 percent of expenses. Half the population had virtually no medical expenses; a mere 1 percent of the population accounted for 22 percent of expenses.

Here’s how Henry Aaron and his coauthors summarize the implication of these numbers in their book Can We Say No?: “Most health costs are incurred by a small proportion of the population whose expenses greatly exceed plausible limits on out-of-pocket spending.” In other words, if people had to pay for medical care the way they pay for groceries, they would have to forgo most of what modern medicine has to offer, because they would quickly run out of funds in the face of medical emergencies.

So the only way modern medical care can be made available to anyone other than the very rich is through health insurance. Yet it’s very difficult for the private sector to provide such insurance, because health insurance suffers from a particularly acute case of a well-known economic problem known as adverse selection. Here’s how it works: imagine an insurer who offered policies to anyone, with the annual premium set to cover the average person’s health care expenses, plus the administrative costs of running the insurance company. Who would sign up? The answer, unfortunately, is that the insurer’s customers wouldn’t be a representative sample of the population. Healthy people, with little reason to expect high medical bills, would probably shun policies priced to reflect the average person’s health costs. On the other hand, unhealthy people would find the policies very attractive.

You can see where this is going. The insurance company would quickly find that because its clientele was tilted toward those with high medical costs, its actual costs per customer were much higher than those of the average member of the population. So it would have to raise premiums to cover those higher costs. However, this would disproportionately drive off its healthier customers, leaving it with an even less healthy customer base, requiring a further rise in premiums, and so on.
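The logic of that spiral can be made concrete with a small, stylized simulation. Everything in the sketch below is a made-up assumption chosen for illustration, including the skewed distribution of expected costs, the administrative load, and the simple rule for who keeps buying coverage; none of the numbers come from the studies cited in this essay.

```python
# A stylized sketch of the adverse-selection spiral described above.
# All numbers are hypothetical illustrations, not real data.

import random

random.seed(0)

# Hypothetical population of 10,000 people with skewed expected annual medical
# costs, so that a small minority accounts for most of the spending.
population = [random.paretovariate(1.5) * 500 for _ in range(10_000)]

ADMIN_LOAD = 1.10    # premiums must also cover administrative costs
RISK_PREMIUM = 1.5   # assume coverage is worth up to 1.5x expected costs to a buyer

pool = population[:] # at first, everyone is offered the policy at one common price
for year in range(1, 6):
    # The insurer prices to the current pool's average cost.
    premium = ADMIN_LOAD * sum(pool) / len(pool)
    # The healthiest customers, for whom the premium now exceeds what coverage
    # is worth to them, drop out; the sicker ones stay.
    pool = [cost for cost in pool if RISK_PREMIUM * cost >= premium]
    share = 100 * len(pool) / len(population)
    print(f"Year {year}: premium ${premium:,.0f}, {share:.1f}% of population still insured")
```

Run it and the premium roughly doubles each round while the insured share of the population collapses toward the sickest and costliest few, which is exactly the dynamic described above.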

Insurance companies deal with these problems, to some extent, by carefully screening applicants to identify those with a high risk of needing expensive treatment, and either rejecting such applicants or charging them higher premiums. But such screening is itself expensive. Furthermore, it tends to screen out exactly those who most need insurance.

Most advanced countries have dealt with the defects of private health insurance in a straightforward way, by making health insurance a government service. Through Medicare, the United States has in effect done the same thing for its seniors. We also have Medicaid, a means-tested program that provides health insurance to many of the poor and near poor. But nonelderly, nonpoor Americans are on their own. In practice, only a tiny fraction of nonelderly Americans (5.3 percent in 2003) buy private insurance for themselves. The rest of those not covered by Medicare or Medicaid get insurance, if at all, through their employers.

Employer-based insurance is a peculiarly American institution. As Julius Richmond and Rashi Fein tell us in The Health Care Mess, the dominant role of such insurance is the result of historical accident rather than deliberate policy. World War II caused a labor shortage, but employers were subject to controls that prevented them from attracting workers by offering higher wages. Health benefits, however, weren’t controlled, and so became a way for employers to compete for workers. Once employers began offering medical benefits, they also realized that it was a form of compensation workers valued highly because it protected them from risk. Moreover, the tax law favored employer-based insurance, because employers’ contributions weren’t considered part of workers’ taxable income. Today, the value of the tax subsidy for employer-based insurance is estimated at around $150 billion a year.


Employer-based insurance has historically offered a partial solution to the problem of adverse selection. In principle, adverse selection can still occur even if health insurance comes with a job rather than as a stand-alone policy. This would occur if workers with health problems flocked to companies that offered health insurance, while healthy workers took jobs at companies that didn’t offer insurance and offered higher wages instead. But until recently health insurance was a sufficiently small consideration in job choice that large corporations offering good health benefits, like General Motors, could safely assume that the health status of their employees was representative of the population at large and that adverse selection wasn’t inflating the cost of health insurance.

In 2004, according to census estimates, 63.1 percent of Americans under sixty-five received health insurance through their employers or family members’ employers. Given the inherent difficulties of providing health insurance through the private sector, that’s an impressive number. But it left more than a third of nonelderly Americans out of the system. Moreover, the number of outsiders is growing: the share of nonelderly Americans with employment-based health insurance was 67.7 percent as recently as 2000. And this trend seems certain to continue, even accelerate, because the whole system of employer-based health care is under severe strain.

We can identify several reasons for that strain, but mainly it comes down to the issue of costs. Providing health insurance looked like a good way for employers to reward their employees when it was a small part of the pay package. Today, however, the annual cost of coverage for a family of four is estimated by the Kaiser Family Foundation at more than $10,000. To put that in perspective, it’s roughly what a worker earning the minimum wage and working full time makes in a year. It’s more than half the annual earnings of the average Wal-Mart employee.
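That comparison is easy to check, assuming the then-current federal minimum wage of $5.15 an hour and a 2,000-hour work year:

```latex
% Full-time, full-year earnings at the 2006 federal minimum wage of 5.15 dollars an hour
\[
\$5.15\ \text{per hour} \times 2{,}000\ \text{hours} = \$10{,}300
\]
```

That is almost exactly the Kaiser estimate of the annual premium for family coverage.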

Health care costs at current levels override the incentives that have historically supported employer-based health insurance. Now that health costs loom so large, companies that provide generous benefits are in effect paying some of their workers much more than the going wage—or, more to the point, more than competitors pay similar workers. Inevitably, this creates pressure to reduce or eliminate health benefits. And companies that can’t cut benefits enough to stay competitive—such as GM—find their very existence at risk.

Rising health costs have also ended the ability of employer-based insurance plans to avoid the problem of adverse selection. Anecdotal evidence suggests that workers who know they have health problems actively seek out jobs with companies that still offer generous benefits. On the other side, employers are starting to make hiring decisions based on likely health costs. For example, an internal Wal-Mart memo, reported by The New York Times in October, suggested adding tasks requiring physical exertion to jobs that don’t really require it as a way to screen out individuals with potential health risks.

So rising health care costs are undermining the institution of employer-based coverage. We’d suggest that the drop in the number of insured so far only hints at the scale of the problem: we may well be seeing the whole institution unraveling.

Notice that this unraveling is the byproduct of what should be a good thing: advances in medical technology, which lead doctors to spend more on their patients. This leads to higher insurance costs, which causes employers to stop providing health coverage. The result is that many people are thrown into the world of the uninsured, where even basic care is often hard to get. As we said, we rob Peter of basic care in order to provide Paul with state-of-the-art treatment.

Fortunately, some of the adverse consequences of the decline in employer-based coverage have been muted by a crucial government program, Medicaid. But Medicaid is facing its own pressures.

3.

Medicaid and Medicare

The US health care system is more privatized than that of any other advanced country, but nearly half of total health care spending nonetheless comes from the government. Most of this government spending is accounted for by two great social insurance programs, Medicare and Medicaid. Although Medicare gets most of the public attention, let’s focus first on Medicaid, which is a far more important program than most middle-class Americans realize.

In The Health Care Mess Richmond and Fein tell us that Medicaid, like employer-based health insurance, came into existence through a sort of historical accident. As Lyndon Johnson made his big push to create Medicare, the American Medical Association, in a last-ditch effort to block so-called “socialized medicine” (actually only the insurance is socialized; the medical care is provided by the private sector), began disparaging Johnson’s plan by claiming that it would do nothing to help the truly needy. In a masterful piece of political jujitsu, Johnson responded by adding a second program, Medicaid, targeted specifically at helping the poor and near poor.

Today, Medicaid is a crucial part of the American safety net. In 2004 Medicaid covered almost as many people as its senior partner, Medicare—37.5 million versus 39.7 million.

Medicaid has grown rapidly in recent years because it has been picking up the slack from the unraveling system of employer-based insurance. Between 2000 and 2004 the number of Americans covered by Medicaid rose by a remarkable eight million. Over the same period the ranks of the uninsured rose by six million. So without the growth of Medicaid, the uninsured population would have exploded, and we’d be facing a severe crisis in medical care.

But Medicaid, even as it becomes increasingly essential to tens of millions of Americans, is also becoming increasingly vulnerable to political attack. To some extent this reflects the political weakness of any means-tested program serving the poor and near poor. As the British welfare scholar Richard Titmuss said, “Programs for the poor are poor programs.” Unlike Medicare’s clients—the feared senior lobby—Medicaid recipients aren’t a potent political constituency: they are, on average, poor and poorly educated, with low voter participation. As a result, funding for Medicaid depends on politicians’ sense of decency, always a fragile foundation for policy.

The complex structure of Medicaid also makes it vulnerable. Unlike Medicare, which is a purely federal program, Medicaid is a federal-state matching program, in which states provide on average about 40 percent of the funds. Since state governments, unlike the federal government, can’t engage in open-ended deficit financing, this dependence on state funds exposes Medicaid to pressure whenever state budgets are hard-pressed. And state budgets are hard-pressed these days for a variety of reasons, not least the rapidly rising cost of Medicaid itself.

The result is that, like employer-based health insurance, Medicaid faces a possible unraveling in the face of rising health costs. An example of how that unraveling might take place is South Carolina’s request for a waiver of federal rules to allow it to restructure the state’s Medicaid program into a system of private accounts. We’ll discuss later in this essay the strange persistence, in the teeth of all available evidence, of the belief that the private sector can provide health insurance more efficiently than the government. The main point for now is that South Carolina’s proposed reform would seriously weaken the medical safety net: recipients would be given a voucher to purchase health insurance, but many would find the voucher inadequate, and would end up being denied care. And if South Carolina gets its waiver, other states will probably follow its lead.

Medicare’s situation is very different. Unlike employer-based insurance or Medicaid, Medicare faces no imminent threat of large cuts. Although the federal government is deep in deficit, it’s not currently having any difficulty borrowing, largely from abroad, to cover the gap. Also, the political constituency behind Medicare remains extremely powerful. Yet federal deficits can’t go on forever; even the US government must eventually find a way to pay its bills. And the long-term outlook for federal finances is dire, mainly because of Medicare and Medicaid.

The chart in figure 1 illustrates the centrality of health care costs to America’s long-term budget problems. The chart shows the Congressional Budget Office’s baseline projection of spending over the next twenty-five years on the three big entitlement programs, Social Security, Medicare, and Medicaid, measured as a percentage of GDP. Not long ago advocates of Social Security privatization tried to use projections like this one to foster a sense of crisis about the retirement system. As was pointed out last year in these pages,1 however, there is no program called Socialsecuritymedicareandmedicaid. In fact, as the chart shows, Social Security, whose costs will rise solely because of the aging of the population, represents only a small part of the problem. Most of the problem comes from the two health care programs, whose spending is rising mainly because of the general rise in medical costs.

To be fair, there is a demographic component to Medicare and Medicaid spending too—Medicare because it only serves Americans over sixty-five, Medicaid because the elderly, although a minority of the program’s beneficiaries, account for most of its spending. Still, the principal factor in both programs’ rising costs is what the CBO calls “excess cost growth”—the persistent tendency of health care spending per beneficiary to grow faster than per capita income, owing to advancing medical technology. Without this excess cost growth, the CBO estimates that entitlement spending would rise by only 3.7 percent of GDP over the next twenty-five years. That’s a significant rise, but not overwhelming, and could be addressed with moderate tax increases and possibly benefit cuts. But because of excess cost growth the projected rise in spending is a crushing burden—about 10 percent of GDP over the next twenty-five years, and even more thereafter.
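To see why “excess cost growth” dominates these projections, consider a stylized compounding exercise. The growth rates in the sketch below are illustrative assumptions chosen for simplicity; they are not the CBO’s actual figures.

```python
# Stylized illustration: if health spending per capita grows even a couple of
# percentage points faster than GDP per capita, the health share of GDP
# balloons over a few decades. The rates below are illustrative assumptions.

def health_share(initial_share, gdp_growth, excess_growth, years):
    """Health spending as a share of GDP after `years` of compounding."""
    health_growth = gdp_growth + excess_growth
    return initial_share * ((1 + health_growth) / (1 + gdp_growth)) ** years

start = 0.16  # health care was about 16 percent of GDP in 2004
for excess in (0.0, 0.01, 0.025):
    share = health_share(start, gdp_growth=0.02, excess_growth=excess, years=25)
    print(f"Excess cost growth of {excess:.1%}: health share after 25 years = {share:.0%}")
```

With no excess growth the health share of GDP simply holds steady as the economy grows; a sustained gap of one to two and a half percentage points pushes it from 16 percent of GDP to roughly 20 to 29 percent within twenty-five years, and the gap keeps compounding after that.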

Rising health care spending, then, is driving a triple crisis. The fastest-moving piece of that crisis is the unraveling of employer-based coverage. There’s a gradually building crisis in Medicaid. And there’s a long-term federal budget crisis driven mainly by rising health care spending.

So what are we going to do about health care?

4.

The “consumer-directed” diversion

As we pointed out at the beginning of this essay, one of the two big reasons to be concerned about rising spending on health care is that as the health care sector grows, its inefficiency becomes increasingly important. And almost everyone agrees that the US health care system is extremely inefficient. But there are wide disagreements about the nature of that inefficiency. And the analysts who have the ear of the Bush administration are committed, for ideological reasons, to a view that is clearly wrong.

We’ve already alluded to the underlying view behind the Bush administration’s health care proposals: it’s the view that insurance leads people to consume too much health care. The 2004 Economic Report of the President, which devoted a chapter to health care, illustrated the alleged problem with a parable about the clothing industry:

Suppose, for example, that an individual could purchase a clothing insurance policy with a “coinsurance” rate of 20 percent, meaning that after paying the insurance premium, the holder of the insurance policy would have to pay only 20 cents on the dollar for all clothing purchases. An individual with such a policy would be expected to spend substantially more on clothes—due to larger quantity and higher quality purchases—with the 80 percent discount than he would at the full price…. The clothing insurance example suggests an inherent inefficiency in the use of insurance to pay for things that have little intrinsic risk or uncertainty.

The report then asserts that “inefficiencies of this sort are pervasive in the US health care system”—although, tellingly, it fails to match the parable about clothing with any real examples from health care.

The view that Americans consume too much health care because insurers pay the bills leads to what is currently being called the “consumer-directed” approach to health care reform. The virtues of such an approach are the theme of John Cogan, Glenn Hubbard, and Daniel Kessler’s Healthy, Wealthy, and Wise. The main idea is that people should pay more of their medical expenses out of pocket. And the way to reduce public reliance on insurance, reformers from the right wing believe, is to remove the tax advantages that currently favor health insurance over out-of-pocket spending. Indeed, last year Bush’s tax reform commission proposed taxing some employment-based health benefits. The administration, recognizing how politically explosive such a move would be, rejected the proposal. Instead of raising taxes on health insurance, the administration has decided to cut taxes on out-of-pocket spending.

Cogan, Hubbard, and Kessler call for making all out-of-pocket medical spending tax-deductible, although tax experts from both parties say that this would present an enforcement nightmare. (Douglas Holtz-Eakin, the former head of the Congressional Budget Office, put it this way: “If you want to have a personal relationship with the IRS do that [i.e., make all medical spending tax deductible] because we are going to have to investigate everybody’s home to see if their running shoes are a medical expense.”) The administration’s proposals so far are more limited, focusing on an expanded system of tax-advantaged health savings accounts. Individuals can shelter part of their income from taxes by depositing it in such accounts, then withdraw money from these accounts to pay medical bills.

What’s wrong with consumer-directed health care? One immediate disadvantage is that health savings accounts, whatever their ostensible goals, are yet another tax break for the wealthy, who have already been showered with tax breaks under Bush. The right to pay medical expenses with pre-tax income is worth a lot to high-income individuals who face a marginal income tax rate of 35 percent, but little or nothing to lower-income Americans who face a marginal tax rate of 10 percent or less, and lack the ability to place the maximum allowed amount in their savings accounts.
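The arithmetic behind that comparison is simple. Take a hypothetical $1,000 of medical spending paid with pre-tax dollars:

```latex
% Tax saved on 1,000 dollars of medical spending paid with pre-tax income
\[
0.35 \times \$1{,}000 = \$350 \quad \text{(35 percent bracket)}
\qquad
0.10 \times \$1{,}000 = \$100 \quad \text{(10 percent bracket)}
\]
```

The same deduction is worth three and a half times as much to the high earner, before even asking who can afford to fund such an account in the first place.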

A deeper disadvantage is that such accounts tend to undermine employment-based health care, because they encourage adverse selection: health savings accounts are attractive to healthier individuals, who will be tempted to opt out of company plans, leaving less healthy individuals behind.

Yet another problem with consumer-directed care is that the evidence says that people don’t, in fact, make wise decisions when paying for medical care out of pocket. A classic study by the Rand Corporation found that when people pay medical expenses themselves rather than relying on insurance, they do cut back on their consumption of health care—but that they cut back on valuable as well as questionable medical procedures, showing no ability to set sensible priorities.

But perhaps the biggest objection to consumer-directed health reform is that its advocates have misdiagnosed the problem. They believe that Americans have too much health insurance; the 2004 Economic Report of the President condemned the fact that insurance currently pays for “many events that have little uncertainty, such as routine dental care, annual medical exams, and vaccinations,” and for “relatively low-expense items, such as an office visit to the doctor for a sore throat.” The implication is that health costs are too high because people who don’t pay their own medical bills consume too much routine dental care and are too ready to visit the doctor about a sore throat. And that argument is all wrong. Excessive consumption of routine care, or small-expense items, can’t be a major source of health care inefficiency, because such items don’t account for a major share of medical costs.

Remember the 80–20 rule: the great bulk of medical expenses are accounted for by a small number of people requiring very expensive treatment. When you think of the problem of health care costs, you shouldn’t envision visits to the family physician to talk about a sore throat; you should think about coronary bypass operations, dialysis, and chemotherapy. Nobody is proposing a consumer-directed health care plan that would force individuals to pay a large share of extreme medical expenses, such as the costs of chemotherapy, out of pocket. And that means that consumer-directed health care can’t promote savings on the treatments that account for most of what we spend on health care.

The administration’s plans for consumer-directed health care, then, are a diversion from meaningful health care reform, and will actually worsen our health care problems. In fact, some reformers privately hope that George W. Bush manages to get his health care plans passed, because they believe those plans will hasten the collapse of employment-based coverage and pave the way for real reform. (The suffering along the way would be huge.)

But what would real reform look like?

5.

Single-payer and beyond

How do we know that the US health care system is highly inefficient? An important part of the evidence takes the form of international comparisons. Table 1 compares US health care with the systems of three other advanced countries. It’s clear from the table that the United States has achieved something remarkable. We spend far more on health care than other advanced countries—almost twice as much per capita as France, almost two and a half times as much as Britain. Yet we do considerably worse even than the British on basic measures of health performance, such as life expectancy and infant mortality.

One might argue that the US health care system actually provides better care than foreign systems, but that the effects of this superior care are more than offset by unhealthy US lifestyles. Ezra Klein of The American Prospect calls this the “well-we-eat-more-cheeseburgers” argument. But a variety of evidence refutes this argument. The data in Table 1 show that the United States does not stand out in the quantity of care, as measured by such indicators as the number of physicians, nurses, and hospital beds per capita. Nor does the US stand out in terms of the quality of care: a recent study published in Health Affairs that compared quality of care across advanced countries found no US advantage. On the contrary, “the United States often stands out for inefficient care and errors and is an outlier on access/cost barriers.”2 That is, our health care system makes more mistakes than those of other countries, and is unique in denying necessary care to people who lack insurance and can’t pay cash. The frequent claim that the United States pays high medical prices to avoid long waiting lists for care also fails to hold up in the face of the evidence: there are long waiting lists for elective surgery in some non-US systems, but not all, and the procedures for which these waiting lists exist account for only 3 percent of US health care spending.3

So why does US health care cost so much? Part of the answer is that doctors, like other highly skilled workers, are paid much more in the United States than in other advanced countries. But the main source of high US costs is probably the unique degree to which the US system relies on private rather than public health insurance, reflected in the uniquely high US share of private spending in total health care expenditure.

Over the years since the failure of the Clinton health plan, a great deal of evidence has accumulated on the relative merits of private and public health insurance. As far as we have been able to ascertain, all of that evidence indicates that public insurance of the kind available in several European countries and others such as Taiwan achieves equal or better results at much lower cost. This conclusion applies to comparisons within the United States as well as across countries. For example, a study conducted by researchers at the Urban Institute found that

per capita spending for an adult Medicaid beneficiary in poor health would rise from $9,615 to $14,785 if the person were insured privately and received services consistent with private utilization levels and private provider payment rates.4

The cost advantage of public health insurance appears to arise from two main sources. The first is lower administrative costs. Private insurers spend large sums fighting adverse selection, trying to identify and screen out high-cost customers. Systems such as Medicare, which covers every American sixty-five or older, or the Canadian single-payer system, which covers everyone, avoid these costs. In 2003 Medicare spent less than 2 percent of its resources on administration, while private insurance companies spent more than 13 percent.

At the same time, the fragmentation of a system that relies largely on private insurance leads both to administrative complexity because of differences in coverage among individuals and to what is, in effect, a zero-sum struggle between different players in the system, each trying to stick others with the bill. Many estimates suggest that the paperwork imposed on health care providers by the fragmentation of the US system costs several times as much as the direct costs borne by the insurers.

The second source of savings in a system of public health insurance is the ability to bargain with suppliers, especially drug companies, for lower prices. Residents of the United States notoriously pay much higher prices for prescription drugs than residents of other advanced countries, including Canada. What is less known is that both Medicaid and, to an even greater extent, the Veterans’ Administration, get discounts similar to or greater than those received by the Canadian health system.

We’re talking about large cost savings. Indeed, the available evidence suggests that if the United States were to replace its current complex mix of health insurance systems with standardized, universal coverage, the savings would be so large that we could cover all those currently uninsured, yet end up spending less overall. That’s what happened in Taiwan, which adopted a single-payer system in 1995: the percentage of the population with health insurance soared from 57 percent to 97 percent, yet health care costs actually grew more slowly than one would have predicted from trends before the change in system.

If US politicians could be persuaded of the advantages of a public health insurance system, the next step would be to convince them of the virtues, in at least some cases, of honest-to-God socialized medicine, in which government employees provide the care as well as the money. Exhibit A for the advantages of government provision is the Veterans’ Administration, which runs its own hospitals and clinics, and provides some of the best-quality health care in America at far lower cost than the private sector. How does the VA do it? It turns out that there are many advantages to having a single health care organization provide individuals with what amounts to lifetime care. For example, the VA has taken the lead in introducing electronic medical records, which it can do far more easily than a private hospital chain because its patients stay with it for decades. The VA also invests heavily and systematically in preventive care, because unlike private health care providers it can expect to realize financial benefits from measures that keep its clients out of the hospital.

In summary, then, the obvious way to make the US health care system more efficient is to make it more like the systems of other advanced countries, and more like the most efficient parts of our own system. That means a shift from private insurance to public insurance, and greater government involvement in the provision of health care—if not publicly run hospitals and clinics, at least a much larger government role in creating integrated record-keeping and quality control. Such a system would probably allow individuals to purchase additional medical care, as they can in Britain (although not in Canada). But the core of the system would be government insurance—“Medicare for all,” as Ted Kennedy puts it.

Unfortunately, the US political system seems unready to do what is both obvious and humane. The 2003 legislation that added drug coverage to Medicare illustrates some of the political difficulties. Although it’s rarely described this way, Medicare is a single-payer system covering many of the health costs of older Americans. (Canada’s universal single-payer system is, in fact, also called Medicare.) And it has some though not all the advantages of broader single-payer systems, notably low administrative costs.

But in adding a drug benefit to Medicare, the Bush administration and its allies in Congress were driven both by a desire to appease the insurance and pharmaceutical lobbies and by an ideology that insists on the superiority of the private sector even when the public sector has demonstrably lower costs. So they devised a plan that works very differently from traditional Medicare. In fact, Medicare Part D, the drug benefit, isn’t a program in which the government provides drug insurance. It’s a program in which private insurance companies receive subsidies to offer insurance—and seniors aren’t allowed to deal directly with Medicare.

The insertion of private intermediaries into the program has several unfortunate consequences. First, as millions of seniors have discovered, it makes the system extremely complex and obscure. It’s virtually impossible for most people to figure out which of the many drug plans now on offer is best. This complexity, coupled with the Katrina-like obliviousness of administration officials to a widely predicted disaster, also led to the program’s catastrophic initial failure to manage the problem of “dual eligibles,” i.e., older Medicaid recipients whose drug coverage was supposed to be transferred to Medicare. When the program started up in January, hundreds of thousands of these dual eligibles found that they had fallen through the cracks, that their old coverage had been canceled but their new coverage had not been put into effect.

Second, the private intermediaries add substantial administrative costs to the program. It’s reasonably certain that if seniors had been offered the choice of receiving a straightforward drug benefit directly from Medicare, the vast majority would have chosen to pass up the private drug plans, which wouldn’t have been able to offer comparable benefits because of their administrative expenses. But the drug bill avoided that embarrassing outcome by denying seniors that choice.

Finally, by fragmenting the purchase of drugs among many private plans, the administration denied Medicare the ability to bargain for lower prices from the drug companies. And the legislation, reflecting pressures from those companies, included a provision specifically prohibiting Medicare from intervening to help the private plans get lower prices.

In short, ideology and interest groups led the Bush administration to set up a new, costly Medicare benefit in such a way as to systematically forfeit all the advantages of public health insurance.

6.

Beyond reform: How much health care should we have?

Imagine, for a moment, that some future US administration were to push through a fundamental reform of health care that covered all the uninsured, replaced private insurance with a single-payer system, and took heed of the VA’s lessons about the advantages of integrated health care. Would our health care problems be solved?

No. Although real reform would bring great improvement in our situation, continuing technological progress in health care still poses a deep dilemma: How much of what we can do should we do?

The medical profession, understandably, has a bias toward doing whatever will bring medical benefit. If that means performing an expensive surgical procedure on an elderly patient who probably has only a few years to live, so be it. But as medical technology advances, it becomes possible to spend ever larger sums on medically useful care. Indeed, at some point it will become possible to spend the entire GDP on health care. Obviously, we won’t do this. But how will we make choices about what not to do?

In a classic 1984 book, Painful Prescription: Rationing Hospital Care, Henry Aaron and William Schwartz studied the medical choices made by the British system, which has long operated under tight budget limits that force it to make hard choices in a way that US medical care does not. Can We Say No? is an update of that work. It’s a valuable survey of the real medical issues involved in British rationing, and gives a taste of the dilemmas the US system will eventually face.

The operative word, however, is “eventually.” Reading Can We Say No?, one might come away with the impression that the problem of how to ration care is the central issue in current health care policy. This impression is reinforced by Aaron and his co-authors’ decision to compare the US system only with that of Britain, which spends far less on health care than other advanced countries, and correspondingly is forced to do a lot of rationing. A comparison with, say, France, which spends far less than the United States but considerably more than Britain, would give a very different impression: in many respects France consumes more, not less, health care than the United States, but it can do so at lower cost because our system is so inefficient.

The result of Aaron et al.’s single-minded focus on the problem of rationing is a somewhat skewed perspective on current policy issues. Most notably, they argue that the reason we need universal health coverage is that a universal system can ration care in a way that private insurance can’t. This seems to miss the two main immediate arguments for universal care—that it would cover those now uninsured, and that it would be cheaper than our current system. A national health care system will also be better at rationing when the time comes, but that hardly seems like the prime argument for adopting such a system today.

Our Princeton colleague Uwe Reinhardt, a leading economic expert on health care, put it this way: our focus right now should be on eliminating the gross inefficiencies we know exist in the US health care system. If we do that, we will be able to cover the uninsured while spending less than we do now. Only then should we address the issue of what not to do; that’s tomorrow’s issue, not today’s.

7.

Can we fix health care?

Health policy experts know a lot more about the economics of health care now than they did when Bill Clinton tried to remake the US health care system. And there’s overwhelming evidence that the United States could get better health care at lower cost if we were willing to put that knowledge into practice. But the political obstacles remain daunting.

A mere shift of power from Republicans to Democrats would not, in itself, be enough to give us sensible health care reform. While Democrats would have written a less perverse drug bill, it’s not clear that they are ready to embrace a single-payer system. Even liberal economists and scholars at progressive think tanks tend to shy away from proposing a straightforward system of national health insurance. Instead, they propose fairly complex compromise plans. Typically, such plans try to achieve universal coverage by requiring everyone to buy health insurance, the way everyone is forced to buy car insurance, and deal with those who can’t afford to purchase insurance through a system of subsidies. Proponents of such plans make a few arguments for their superiority to a single-payer system, mainly the (dubious) claim that single-payer would reduce medical innovation. But the main reason for not proposing single-payer is political fear: reformers believe that private insurers are too powerful to cut out of the loop, and that a single-payer plan would be too easily demonized by business and political propagandists as “big government.”

These are the same political calculations that led Bill Clinton to reject a single-payer system in 1993, even though his advisers believed that a single-payer system would be the least expensive way to provide universal coverage. Instead, he proposed a complex plan designed to preserve a role for private health insurers. But the plan backfired. The insurers opposed it anyway, most famously with their “Harry and Louise” ads. And the plan’s complexity left the public baffled.

We believe that the compromise plans being proposed by the cautious reformers would run into the same political problems, and that it would be politically smarter as well as economically superior to go for broke: to propose a straightforward single-payer system, and try to sell voters on the huge advantages such a system would bring. But this would mean taking on the drug and insurance companies rather than trying to co-opt them, and even progressive policy wonks, let alone Democratic politicians, still seem too timid to do that.

So what will really happen to American health care? Many people in this field believe that in the end America will end up with national health insurance, and perhaps with a lot of direct government provision of health care, simply because nothing else works. But things may have to get much worse before reality can break through the combination of powerful interest groups and free-market ideology.

—February 22, 2006
