What Is Living and What Is Dead in Social Democracy?

It was social democracy that bound the middle classes to liberal institutions in the wake of World War II (I use “middle class” here in the European sense). They received in many cases the same welfare assistance and services as the poor: free education, cheap or free medical treatment, public pensions, and the like. In consequence, the European middle class found itself by the 1960s with far greater disposable incomes than ever before, with so many of life’s necessities prepaid in tax. And thus the very class that had been so exposed to fear and insecurity in the interwar years was now tightly woven into the postwar democratic consensus.

By the late 1970s, however, such considerations were increasingly neglected. Starting with the tax and employment reforms of the Thatcher-Reagan years, and followed in short order by deregulation of the financial sector, inequality has once again become an issue in Western society. After notably diminishing from the 1910s through the 1960s, income inequality has steadily grown over the course of the past three decades.

In the US today, the “Gini coefficient”—a measure of the distance separating rich and poor—is comparable to that of China.1 When we consider that China is a developing country where huge gaps will inevitably open up between the wealthy few and the impoverished many, the fact that here in the US we have a similar inequality coefficient says much about how far we have fallen behind our earlier aspirations.
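For readers unfamiliar with the measure, the Gini coefficient can be computed directly from a list of incomes: it is the average absolute gap between every pair of incomes, scaled by twice the mean, so that 0 means perfect equality and 1 means a single person holds everything. A minimal sketch (the income figures below are invented purely for illustration):

```python
def gini(incomes):
    """Gini coefficient: mean absolute difference between all pairs
    of incomes, divided by twice the mean income.
    0 = perfect equality, 1 = maximal inequality."""
    n = len(incomes)
    mean = sum(incomes) / n
    # Sum of |x - y| over all ordered pairs of incomes.
    diff_sum = sum(abs(x - y) for x in incomes for y in incomes)
    return diff_sum / (2 * n * n * mean)

# A perfectly equal society scores 0; concentrating income raises the score.
print(gini([40, 40, 40, 40]))    # → 0.0
print(gini([10, 20, 60, 160]))   # → 0.49
```

The second, highly unequal toy distribution lands near 0.49, in the neighborhood of the figures commonly reported for the US and China.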

Consider the 1996 “Personal Responsibility and Work Opportunity Act” (a more Orwellian title would be hard to conceive), the Clinton-era legislation that sought to gut welfare provision here in the US. The terms of this act should put us in mind of another act, passed in England nearly two centuries ago: the New Poor Law of 1834. The provisions of the New Poor Law are familiar to us, thanks to Charles Dickens’s depiction of its workings in Oliver Twist. When Noah Claypole famously sneers at little Oliver, calling him “Work’us” (“Workhouse”), he is implying, for 1838, precisely what we convey today when we speak disparagingly of “welfare queens.”

The New Poor Law was an outrage, forcing the indigent and the unemployed to choose between work at any wage, however low, and the humiliation of the workhouse. Here and in most other forms of nineteenth-century public assistance (still thought of and described as “charity”), the level of aid and support was calibrated so as to be less appealing than the worst available alternative. This system drew on classical economic theories that denied the very possibility of unemployment in an efficient market: if wages fell low enough and there was no attractive alternative to work, everyone would find a job.

For the next 150 years, reformers strove to replace such demeaning practices. In due course, the New Poor Law and its foreign analogues were succeeded by the public provision of assistance as a matter of right. Workless citizens were no longer deemed any the less deserving for that; they were not penalized for their condition nor were implicit aspersions cast upon their good standing as members of society. More than anything else, the welfare states of the mid-twentieth century established the profound impropriety of defining civic status as a function of economic participation.

In the contemporary United States, at a time of growing unemployment, a jobless man or woman is not a full member of the community. In order to receive even the exiguous welfare payments available, they must first have sought and, where applicable, accepted employment at whatever wage is on offer, however low the pay and distasteful the work. Only then are they entitled to the consideration and assistance of their fellow citizens.

Why do so few of us condemn such “reforms”—enacted under a Democratic president? Why are we so unmoved by the stigma attaching to their victims? Far from questioning this reversion to the practices of early industrial capitalism, we have adapted all too well and in consensual silence—in revealing contrast to an earlier generation. But then, as Tolstoy reminds us, there are “no conditions of life to which a man cannot get accustomed, especially if he sees them accepted by everyone around him.”

This “disposition to admire, and almost to worship, the rich and the powerful, and to despise, or, at least, to neglect persons of poor and mean condition…is…the great and most universal cause of the corruption of our moral sentiments.” Those are not my words. They were written by Adam Smith, who regarded the likelihood that we would come to admire wealth and despise poverty, admire success and scorn failure, as the greatest risk facing us in the commercial society whose advent he predicted. It is now upon us.

The most revealing instance of the kind of problem we face comes in a form that may strike many of you as a mere technicality: the process of privatization. In the last thirty years, a cult of privatization has mesmerized Western (and many non-Western) governments. Why? The shortest response is that, in an age of budgetary constraints, privatization appears to save money. If the state owns an inefficient public program or an expensive public service—a waterworks, a car factory, a railway—it seeks to offload it onto private buyers.

The sale duly earns money for the state. Meanwhile, by entering the private sector, the service or operation in question becomes more efficient thanks to the working of the profit motive. Everyone benefits: the service improves, the state rids itself of an inappropriate and poorly managed responsibility, investors profit, and the public sector makes a one-time gain from the sale.

So much for the theory. The practice is very different. What we have been watching these past decades is the steady shifting of public responsibility onto the private sector to no discernible collective advantage. In the first place, privatization is inefficient. Most of the things that governments have seen fit to pass into the private sector were operating at a loss: whether they were railway companies, coal mines, postal services, or energy utilities, they cost more to provide and maintain than they could ever hope to attract in revenue.

For just this reason, such public goods were inherently unattractive to private buyers unless offered at a steep discount. But when the state sells cheap, the public takes a loss. It has been calculated that, in the course of the Thatcher-era UK privatizations, the deliberately low price at which long-standing public assets were marketed to the private sector resulted in a net transfer of £14 billion from the taxpaying public to stockholders and other investors.

To this loss should be added a further £3 billion in fees to the banks that transacted the privatizations. Thus the state in effect paid the private sector some £17 billion ($30 billion) to facilitate the sale of assets for which there would otherwise have been no takers. These are significant sums of money—approximating the endowment of Harvard University, for example, or the annual gross domestic product of Paraguay or Bosnia-Herzegovina.2 This can hardly be construed as an efficient use of public resources.

In the second place, there arises the question of moral hazard. The only reason that private investors are willing to purchase apparently inefficient public goods is that the state eliminates or reduces their exposure to risk. In the case of the London Underground, for example, the purchasing companies were assured that whatever happened they would be protected against serious loss—thereby undermining the classic economic case for privatization: that the profit motive encourages efficiency. The “hazard” in question is that the private sector, under such privileged conditions, will prove at least as inefficient as its public counterpart—while creaming off such profits as are to be made and charging losses to the state.

The third and perhaps most telling case against privatization is this. There can be no doubt that many of the goods and services that the state seeks to divest have been badly run: incompetently managed, underinvested, etc. Nevertheless, however badly run, postal services, railway networks, retirement homes, prisons, and other provisions targeted for privatization remain the responsibility of the public authorities. Even after they are sold, they cannot be left entirely to the vagaries of the market. They are inherently the sort of activity that someone has to regulate.

This semiprivate, semipublic disposition of essentially collective responsibilities returns us to a very old story indeed. If your tax returns are audited in the US today, although it is the government that has decided to investigate you, the investigation itself will very likely be conducted by a private company. The latter has contracted to perform the service on the state’s behalf, in much the same way that private agents have contracted with Washington to provide security, transportation, and technical know-how (at a profit) in Iraq and elsewhere. In a similar way, the British government today contracts with private entrepreneurs to provide residential care services for the elderly—a responsibility once controlled by the state.

Governments, in short, farm out their responsibilities to private firms that claim to administer them more cheaply and better than the state can itself. In the eighteenth century this was called tax farming. Early modern governments often lacked the means to collect taxes and thus invited bids from private individuals to undertake the task. The highest bidder would get the job, and was free—once he had paid the agreed sum—to collect whatever he could and retain the proceeds. The government thus took a discount on its anticipated tax revenue, in return for cash up front.

After the fall of the monarchy in France, it was widely conceded that tax farming was grotesquely inefficient. In the first place, it discredits the state, represented in the popular mind by a grasping private profiteer. Secondly, it generates considerably less revenue than an efficiently administered system of government collection, if only because of the profit margin accruing to the private collector. And thirdly, you get disgruntled taxpayers.

In the US today, we have a discredited state and inadequate public resources. Interestingly, we do not have disgruntled taxpayers—or, at least, they are usually disgruntled for the wrong reasons. Nevertheless, the problem we have created for ourselves is essentially comparable to that which faced the ancien régime.

As in the eighteenth century, so today: by eviscerating the state’s responsibilities and capacities, we have diminished its public standing. The outcome is “gated communities,” in every sense of the word: subsections of society that fondly suppose themselves functionally independent of the collectivity and its public servants. If we deal uniquely or overwhelmingly with private agencies, then over time we dilute our relationship with a public sector for which we have no apparent use. It doesn’t much matter whether the private sector does the same things better or worse, at higher or lower cost. In either event, we have diminished our allegiance to the state and lost something vital that we ought to share—and in many cases used to share—with our fellow citizens.

  1. See “High Gini Is Loosed Upon Asia,” The Economist, August 11, 2007.

  2. See Massimo Florio, The Great Divestiture: Evaluating the Welfare Impact of the British Privatizations, 1979–1997 (MIT Press, 2004), p. 163. For Harvard, see “Harvard Endowment Posts Solid Positive Return,” Harvard Gazette, September 12, 2008. For the GDP of Paraguay or Bosnia-Herzegovina, see www.cia.gov/library/publications/the-world-factbook/geos/xx.html.
