When the numbers of homeless people began to increase in the early 1980s, their advocates often blamed the housing market for what was happening. In those years, the homeless were mostly poor single adults, many of whom had traditionally lived in “single room occupancy” (SRO) hotels and rooming houses. Since these SROs had been disappearing, it seemed natural to suppose that many of their former tenants were now homeless. Later in the decade, when the number of homeless families increased, many people who worked with the homeless blamed the growing shortage of cheap apartments—a shortage that they often attributed to cutbacks in federal housing programs.

Of course there are other explanations for increased homelessness. In an earlier article1 I argued that changes in the way we treated the mentally ill, declining demand for unskilled workers, declining rates of marriage among young mothers, declining welfare benefits, and the crack epidemic were among the causes. Here, however, I shall concentrate on the housing market, giving particular attention to the ways that federal and local governments have reshaped the choices available to the very poor, and how we could do better in the future.


During the winter of 1958 a young University of Chicago demographer named Donald Bogue conducted an unusually careful survey of Chicago’s major skid-row neighborhoods.2 He concluded that just over one hundred men typically spent the night in public places, either walking around or dozing in all-night restaurants and movie theaters, while almost a thousand slept in free shelters. (At that time shelters were mostly run by evangelists eager to save lost souls and were known as “missions.”) Using today’s definitions, therefore, about 1,100 people were homeless in Chicago on an average winter night.

In the winter of 1986 Peter Rossi, the author of Down and Out in America, made another early morning survey of Chicago’s shelters and public places. By then about 500 adults were spending the night in public places, while another 1,600 adults and 300 children were in shelters on a typical night. Overall, the city’s homeless population had roughly doubled.

When Bogue did his survey, however, Chicago’s 1,100-odd homeless men made up only a small minority of its skid-row population, most of whom lived in what were politely known as cubicle hotels. These were not SROs in the 1990s sense of the word. They housed their patrons in windowless five- by seven-foot rooms furnished with a bed, a chair, and a bare light bulb. The cubicles were separated by wooden walls and ventilated through wire mesh near the ceiling and floor. They were noisy, usually verminous, and frequently smelled of urine, vomit, or both. Because of the wire mesh and the smells, their patrons called them cage hotels.

Nonetheless, almost all skid-row residents preferred a cubicle in one of these hotels to a bed in a shelter. A cubicle of one’s own provided more privacy and security than a shelter, which housed residents in large dormitory rooms. Cubicle hotels also treated their residents like paying guests rather than charity cases: they let them come and go as they pleased, received their mail, and made no effort to save their souls, improve their character, or regulate their drinking. As a result, Chicago’s cubicle hotels housed about eight times as many people as its missions in 1958.

By 1986 all but two of these hotels were gone, and the two that survived apparently had fewer than six hundred beds. Similar changes had occurred in almost every other major American city. It seems natural to suppose that some of the people who would once have lived in very cheap hotels now live in shelters or in the streets and in doorways.

New Homeless and Old, by Charles Hoch and Robert Slayton, describes how entrepreneurs and politicians destroyed most of Chicago’s cubicle hotels during the 1970s and early 1980s. Most of these hotels were located just west of the Loop. During the 1960s developers began urging the city to use its powers of eminent domain to clear the neighborhood, so they could build upscale housing for well-to-do young people working in the Loop. The city eventually accepted this proposal, and the last cubicle hotels in this part of the city were torn down in the early 1980s.

Because redevelopment eliminated so many cheap rooms so quickly, a temporary shortage was inevitable. But why weren’t they replaced? Virtually everything in Chicago is constantly being torn down, but almost everything profitable reappears elsewhere. Chicago had many empty warehouses that could have been turned into cubicle hotels. That did not happen—not because there was no demand for cheap beds but because the Chicago building code no longer permits construction of such accommodations.

Because new competition was illegal, the handful of cubicle hotels that escaped the bulldozer were apparently able to raise their prices much faster than the general level of inflation. Chicago’s cubicle hotels charged between 50 and 90 cents a night in 1958. By 1992 Chicago had only one such hotel with a listed telephone (the “Wilson Men’s Club Hotel”), and it charged $7.50 a night (or $162 a month). In New York, where similar places are called lodging houses, prices are now comparable. Renting a cubicle thus seems to have become at least eight times as expensive over the past thirty years, even though the Consumer Price Index and the cost of construction rose by a factor of only about four.
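The comparison can be made explicit with a quick arithmetic check; all the figures below come from the text itself, including the fourfold rise in prices, which is the article’s own estimate rather than an independent calculation:

```python
# Arithmetic check of the cubicle-price comparison above.
# All figures come from the text; nothing here is independent data.
price_1958 = 0.90        # dollars per night, top of the 50-90 cent range
price_1992 = 7.50        # dollars per night, Wilson Men's Club Hotel
inflation_factor = 4.0   # approximate rise in the CPI, per the text

nominal_rise = price_1992 / price_1958       # how much the sticker price rose
real_rise = nominal_rise / inflation_factor  # the rise net of inflation

print(f"nominal rise: {nominal_rise:.1f}x, real rise: {real_rise:.1f}x")
```

Even measured against the top of the 1958 price range, cubicles became more than eight times as expensive in nominal terms, roughly a doubling in real terms; measured against the 50-cent bottom of the range, the nominal rise is fifteenfold.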


Rising prices not only put rented rooms beyond the reach of some people who had once been able to afford them but encouraged others to spend their limited funds differently. In 1958, a cubicle cost less than a six-pack of beer, making privacy cheaper than oblivion. By 1992 a six-pack cost less than half what a cubicle cost, making oblivion cheaper than privacy. The same pattern obtains if we compare the price of a cubicle to the price of cocaine. These changes surely encouraged some people to use free shelters and spend their money on alcohol or crack.

While the destruction of skid-row flophouses certainly did something to spread homelessness, the connection was not as simple as most accounts suggest. Because incomes rose steadily between 1945 and 1974, demand for cheap rooms declined. Run-down flophouses had trouble finding tenants, and many were pulled down to make way for more profitable enterprises. Because the supply of very cheap rooms exceeded demand, politicians found that regulations prohibiting the creation of new flophouses could make the sponsor look virtuous without creating much hardship.

The postwar boom ended in 1974 with the oil embargo. After that, real incomes became stagnant for all but the richest fifth of the population. Starting in 1979, the number of very poor people began to grow. The number of unmarried working-age men with personal incomes below $2,500 climbed from 1.6 million in 1979 to 2.6 million a decade later.3 Among women, the number went from 2.1 to 2.9 million. (Throughout this article, incomes and rents are in 1989 dollars.)

People with such low incomes could seldom afford to rent rooms with windows, closets, or plumbing even in the 1960s. If they could not stay with others or wanted to live alone, a flophouse was their only choice. But by 1979 the old flophouses were almost all gone. The 1980 Census found only 28,000 people living in “low cost transient quarters” that charged less than $4 a night, which seems to have been roughly the upper limit for a cubicle hotel at the time. Thus when more people began looking for cheap beds, they had nowhere to turn except shelters.

Fifteen years have passed since this change began to take place, but many people concerned about the homeless still have not grasped its implications. During the mid–twentieth-century economic boom, which lasted from 1942 to 1974, politicians and much of the public came to see old-fashioned flophouses as substandard and made building them illegal. Now that the boom has long been over and extreme poverty has risen among single adults, demand for such places has risen again. But no one is yet willing to bring back the more permissive code requirements of an earlier and poorer era. Instead of turning back the clock and letting private entrepreneurs try to house the very poor, or going further and subsidizing such entrepreneurs, city governments have opened thousands of free shelters.

I do not mean to suggest that private entrepreneurs could house all of today’s homeless if we gave them a free hand. Big cities have always needed free shelters, and their clientele has always increased when extreme poverty increased, as it did in the 1980s. But restrictive building codes exacerbate the problem. While some of today’s homeless are too poor to rent any space of their own, some could afford cubicles if they were still available at their traditional price.

In 1987, when Martha Burt surveyed homeless people who used soup kitchens or shelters in big cities, two fifths of the adults not accompanied by children reported monthly incomes above $100, and a quarter reported incomes above $200. Had cubicle prices risen at the same rate as other prices between 1958 and 1987, they would have been $40 to $80 a month. A large minority of the homeless might well have rented a cubicle if one had been available at that price in 1987.

Even in their heyday, however, flophouses sheltered only a small proportion of the very poor. Most unmarried adults lived with someone else, because sharing space cost less than living alone. This remains true today. In 1989, renting a room from a non-relative cost about 20 percent less than renting one in a hotel or a rooming house. As a result, the number of unmarried, working-age adults renting space in someone else’s home was five times the number in hotels and rooming houses.


People with tight budgets could cut their costs even further if they rented space together. Hotels and rooming houses typically charged $283 a month for a single room in 1989, while one-bedroom apartments averaged $392 a month, two-bedroom apartments averaged $474, and three-bedroom apartments averaged $494. Thus if three hotel or rooming house residents had moved into a three-bedroom apartment, they would typically have cut their monthly rent from $283 to about $165 apiece—a 42 percent saving. They would also have gotten a kitchen, a living room, and a bathroom shared with only two other people—amenities that are by no means universal in hotels or rooming houses.
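A minimal sketch of the rent-sharing arithmetic, using the 1989 figures cited above:

```python
# Rent-sharing arithmetic from the paragraph above (1989 dollars).
single_room = 283        # monthly rent for a hotel or rooming-house room
three_bedroom = 494      # monthly rent for a three-bedroom apartment

per_person = three_bedroom / 3           # each of three housemates' share
saving = 1 - per_person / single_room    # fraction saved versus a single room

print(f"${per_person:.0f} apiece, a {saving:.0%} saving")
```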

But while living with others is economical, finding housemates is far from easy, and getting along with them often turns out to be extremely difficult. As a result, some people need to live alone, even if that means paying more rent or settling for worse living conditions. Because living alone is expensive, those who need a room of their own are particularly likely to become homeless. The homeless are mostly unmarried working-age adults with incomes below $5,000. Census surveys show that only a seventh of this group lived alone in 1987. Yet Martha Burt reports in Over the Edge that a third of all homeless adults—some 100,000 to 150,000 people—had lived alone just before becoming homeless. These loners are probably the group least likely to use the shelters created to house the homeless during the 1980s. For some—especially the mentally ill—the lack of privacy in a dormitory shelter is almost unbearable, not to mention the threats of theft and violence. Many prefer abandoned buildings, subways, parks, or even doorways. This is one big reason why roughly half the homeless are not in shelters on any given night.


By the late 1980s the number of homeless families with children was rising even faster than the number without children. Many people blamed this rise in family homelessness on a growing shortage of cheap apartments. Two of the most informative studies of homelessness—Peter Rossi’s Down and Out in America and Martha Burt’s Over the Edge—endorse this argument, although neither suggests that changes in the housing market were the only cause of increased homelessness.

Two Washington-based advocacy groups, the Center on Budget and Policy Priorities and the Low Income Housing Information Service, have been particularly influential in promoting the view that housing costs rose faster than poorer tenants’ incomes during the 1970s and 1980s. Their position is summarized in A Place to Call Home: The Low Income Housing Crisis Continues by Edward Lazere, Paul Leonard, Cushing Dolbeare, and Barry Zigas.

Lazere and his colleagues contrast the number of low-income tenants (those with family incomes below $10,000 in 1989 dollars) with the number of low-rent units (those in which rent and utilities cost less than $250 a month in 1989 dollars). As Table 1 indicates, the number of low-rent housing units was essentially stable from 1973 to 1989, while the number of low-income tenants rose sharply.

Table 1

Lazere and his colleagues argue that this change made it harder for low-income tenants to find housing they could afford.

But if demand for cheap housing had really grown while the supply remained constant, low-rent units nationwide should have had longer waiting lists and lower vacancy rates in 1989 than in 1973. Table 1 shows the opposite trend. Vacancy rates in unsubsidized low-rent units were slightly higher in 1989 than in either 1979 or 1973. This was true even in big cities. Taken at face value this trend suggests that demand for low-rent housing did not increase after all.

Other evidence points in the same direction. Landlords abandoned thousands of low-rent buildings in urban areas during the 1970s and 1980s. Except in cities with rent control, such as New York, landlords do not abandon buildings with waiting lists. If such a building is losing money, the landlord simply raises the rent. Landlords only abandon buildings when no one is willing to pay them a profitable rent. In cities with rent control, landlords may not be able to push up rents to an economic level even if they have plenty of would-be tenants. Unrealistic rent ceilings are especially likely in cities that have an unusually high proportion of well-to-do tenants, as New York does, because local rent-control boards in such cities are especially likely to put tenants’ interests ahead of landlords’ interests. In New York, which has had rent control longer than any other major city, low rent ceilings may well have contributed to abandonment.4

Cheap housing remains vacant for two main reasons. Sometimes the building itself is so run down that nobody wants to live in it. In other cases it is in such a bad neighborhood that no one wants to live there. These considerations can sometimes interact to produce widespread abandonment. Few tenants want an expensive apartment in a very dangerous neighborhood. So when a building inspector insists that a landlord spend a lot of money improving a building in a bad neighborhood so it will conform to the code, the landlord may think he will not recover his costs. At that point abandonment often becomes inevitable.

Despite stagnating incomes, many tenants have apparently been prepared to pay more rent in order to get better housing in a safer neighborhood. As a result, the Census Bureau’s housing surveys show that low-income tenants had somewhat more bedrooms, more bathrooms, more air conditioners, better heating systems, and better-equipped kitchens in 1989 than in 1973. Tenants also reported fewer burglaries.

As a result of these improvements in accommodations, the proportion of low-income tenants living in low-rent units fell during the 1970s (see Table 1). But this trend slowed after 1979, largely because a growing proportion of the nation’s low-rent housing was subsidized by the government. Most tenants in these new subsidized units were very poor. By 1989, 32 percent of all low-income tenants were receiving a means-tested government subsidy, compared to 21 percent in 1979. These tenants’ rent was limited by law to 30 percent of their reported income. The average subsidy cost the government about $350 a month. Most of the money came from the Department of Housing and Urban Development (HUD).

Nonetheless, A Place to Call Home opens its discussion of Washington’s role in the low-income housing crisis by claiming that “federal housing programs were cut sharply in the 1980s.” The authors report that “appropriations for HUD’s subsidized housing programs fell from a peak of $32.2 billion in fiscal year 1978 to $11.7 billion in fiscal year 1991.” Similar statistics crop up in most other liberal criticisms of federal housing policy.5 The statistics are correct, but they are also misleading.

When HUD subsidizes either a housing unit or a family, it makes a long-term commitment. HUD’s “net budget authority,” commonly known as its appropriation, is the estimated long-term cost of all the new commitments it is authorized to make in a given year. When HUD actually makes good on these commitments, which may not happen for years, the money is called an “outlay.” Appropriations are thus a set of promises; outlays redeem these promises.

HUD made a great many promises under Presidents Ford and Carter. Between 1977 and 1981, for example, it committed itself to 1.4 million new units of subsidized rental housing.6 Many of these projects took years to complete, so the number of low-income tenants getting federal subsidies would have risen during Reagan’s first term even if Congress had never authorized another penny of new federal spending.

If the Reagan administration had had its way, new appropriations for low-income housing programs would have stopped after 1981. But Reagan did not get his way. Instead, Congress quietly authorized another 800,000 units of federally subsidized low-income rental housing between 1982 and 1989. Furthermore, when the commitments that HUD had made during the 1970s began to run out in 1989, Congress renewed them all. By 1992, 4.7 million tenants were getting federal subsidies, compared to 3.0 million in 1981.

Table 2 traces the budgetary consequences of these policy changes.

Table 2

Net budget authority fell dramatically during the 1980s, just as Lazere and his colleagues say. But since the promises made under Ford and Carter were being carried out and some new promises were being made even under Reagan and Bush, actual outlays for low-income housing, measured in constant dollars, rose from $9 billion in 1980 to $18 billion in 1992. Lazere and his colleagues mention this increase only in passing.

That low-income housing subsidies grew during the 1980s was one of Washington’s best kept secrets. The Congressional Democrats responsible for this growth never claimed credit for it. They evidently believed that the best way to keep housing subsidies growing was to claim they were shrinking. Like Lazere and his co-authors, Democratic legislators emphasized the decline in HUD’s net budget authority, not the increase in its actual outlays.

Attacking Reagan and Bush for cutting low-income housing programs had obvious political advantages. For one thing, it helped Democrats blame the spread of homelessness on Republicans. Had the Democrats tried to claim credit for expanding the government’s low-income housing programs during the 1980s, the tables would have been turned. Then Republicans would have cited homelessness as evidence that federal housing programs did not work.

The conspiracy of silence about increased federal housing outlays probably helped keep them growing, but it was still a mixed blessing for the poor. As long as everyone thinks outlays are being cut, there will never be a public debate about whether they are being distributed in sensible ways. By 1992 the federal government spent more on its low-income housing programs than on Aid to Families with Dependent Children (AFDC), its most controversial welfare program. Yet only a third of all low-income households—and only a quarter of all AFDC recipients—got subsidies. Those lucky enough to get anything typically got a lot. For a typical AFDC recipient, getting a housing subsidy was equivalent to a second monthly AFDC check. But while HUD gave some people a great deal of help, most low-income households got nothing whatever, and some became homeless as a result. Had Washington tried to run AFDC this way, giving a third of all eligible recipients roughly what they needed and giving the rest nothing, the system would have been reformed years ago.


The spread of homelessness during the early 1980s had many causes: declining job opportunities for unskilled men, political restrictions on the creation of new flophouses, the increase in single motherhood, the erosion of the purchasing power of welfare recipients, and continuing changes in state mental-health policies. But while these factors largely explain what happened in the early 1980s, they cannot explain the continuing increase in homelessness during the late 1980s. When the demand for labor increased in the late 1980s, homelessness should have declined. Instead, it kept growing. The invention of crack, which I discussed in my earlier article, certainly contributed to this surprising growth. But crack cannot explain the entire increase, especially among families with children. We therefore have to consider the possibility that improvements in the shelter system also contributed to the spread of homelessness.7

When a community opens a shelter, it usually seeks to house people who are currently sleeping in bus stations, parks, doorways, and dumpsters. (Shelters for battered women are an obvious exception.) But new shelters also attract some people who would otherwise be living with friends or relatives in conventional housing. Many of these guests are unwelcome, and most of them dislike depending on someone else’s generosity. Some of them leave or are thrown out even when there is no local shelter to take them in. If a shelter is available, they are almost surely more likely to move out and less likely to return. The better the shelter is, the more such people will use it.

Shelters lure even more people out of conventional housing when moving to a shelter improves their chances of getting permanently subsidized housing. About two million single-parent families currently live in someone else’s home. Most would prefer living in a place of their own, but they can only afford one if they get some kind of subsidy. Since there are not enough federally subsidized units for everyone who passes the means test, there is always a waiting list. Federal law gives priority to certain kinds of applicants, including the homeless. If living in a shelter dramatically reduces the waiting time for subsidized housing, as it does in New York and some other cities, some families will move into a shelter so as to get a place of their own sooner.

In New York during the 1980s, the Koch administration housed most homeless families in welfare hotels and forced them to wait well over a year for permanent housing. These hotels were nasty and often dangerous places, so only families in unusually difficult situations moved into them, and many moved out within a few weeks. Those who remained eventually got the handful of subsidized units the city had set aside for the homeless.

This rationing system caused a great deal of misery, but it ensured that the city’s meager supply of new subsidized units for the very poor went to some of the most desperate families. To the courts and the press, however, the long wait was not a rationing system but evidence of bureaucratic callousness and incompetence. By the time Koch left office in 1989, the city was under court order to move the homeless out of welfare hotels, and the Bush administration was trying to bar the use of federal funds to house welfare recipients in hotels.

Once David Dinkins became mayor, the city cut the waiting period for permanent housing by allocating a larger proportion of its subsidized housing to homeless families. At first the number of families in welfare hotels fell. But once waiting time diminished, more families began entering the shelter system, and some said that they had moved into shelters in order to get permanently subsidized housing. Before long, the waiting period lengthened again, and the number of families in welfare hotels climbed.8

Yet despite intense controversy about New York City’s treatment of homeless families, the city never had a coherent public discussion about how it should distribute its subsidized housing. According to officials responsible for dealing with the homeless during the Koch years, the New York City Housing Authority (NYCHA) did not want any homeless families in municipally owned housing. This housing had mostly been built with federal money, but NYCHA is now supposed to collect enough rent to cover its operating costs. Since rents cannot exceed 30 percent of a tenant’s income, NYCHA prefers tenants with steady jobs to welfare recipients. And because rents in the private sector are so high in New York, NYCHA has no difficulty finding the tenants it wants. These tenants share NYCHA’s distaste for the homeless, and they are prepared to make trouble for a mayor who sends them unwanted neighbors. So while Mayor Koch sometimes forced NYCHA to take a few homeless families, the number was always small.

New York had more choice about how it distributed the new housing subsidies that Washington was making available during the 1980s. This was mostly “Section 8” money, earmarked for tenants in private housing. Section 8 is extraordinarily complex, but its basic goal is to make up the difference between what a low-income tenant can “afford” (still defined as 30 percent of their income) and the market value of the tenant’s apartment. New York used some of this money to move families from shelters to private housing. But as far as I could discover, there was no public debate about how the city’s Section 8 money should be divided between the homeless and other claimants. Indeed, when I tried to determine how the money had in fact been divided, I could not find anyone who knew the answer.

The fact that some people moved into shelters during the 1980s in order to get a permanent housing subsidy does not mean, of course, that anyone preferred a shelter to a place of their own. But among the poorest of the poor, those were seldom the choices. For the very poor, the choice was usually among different kinds of homelessness: living in a shelter, living on the streets, or living in someone else’s home. When a city opens more shelters or makes existing shelters more attractive, more people will move to them, and official counts of the homeless, which ignore people living in other people’s houses, will increase.


While new shelters and soup kitchens helped offset some of the worst consequences of extreme poverty during the 1980s, hardly anyone thinks they are a satisfactory long-term solution. Ideally, every society should offer useful work to anyone able to do it, pay them a living wage, and let them find their own housing. But that is not easy in contemporary America. Most of the homeless have characteristics that make them the last hired and first fired. Many could find steady jobs if unemployment stayed below 4 percent for a protracted period, but that has not happened since World War II. Moreover, economists of all political persuasions now agree that unemployment rates below 4 percent would probably push up the rate of inflation to politically unacceptable levels.

Better education and job training could help some of the homeless, but in a competitive labor market someone has to be the last hired and first fired. Training schemes can rearrange the queue for jobs, but they cannot eliminate it. That means we must try to make life at the end of the queue more endurable instead of just helping people change places. When people cannot find steady jobs, they can seldom afford to link their self-respect to their work. Some leave the labor market entirely. Others work at casual jobs when they need money, but only show up when it suits them and take no pride in doing the work well. The side effects of such a life often include depression, alcoholism, drug addiction, rage, and violence.

So far as I know, all societies with unregulated labor markets have this problem to some degree. To reduce its severity, either the government or powerful trade unions must intervene in the labor market to help those whose services are least in demand. European experience—and American experience in the middle third of this century—shows that such egalitarian interventions can be compatible with relatively rapid economic growth. But when politics are dominated by money and television image-making, the rich find it easy to make unfettered competition seem both natural and desirable. Once that happens, support for egalitarian labor market policies dries up. Since this situation is unlikely to change soon, we need stopgap measures to improve housing and job opportunities for those who lose out under the present system.

When homelessness among single adults became a political issue in the first half of the 1980s, most communities responded by opening more congregate shelters. These shelters were useful, but they had several limitations. Because they housed large numbers of strangers together in dormitory rooms, they could not easily accommodate people who were unruly, drunk, crazy, or deeply reluctant to live in places that lacked all privacy and were often dangerous. Many of these people therefore continued to sleep in public places. In the few communities where shelters had to admit everyone, of which New York is the best-known example, the shelters had to hire a good many security people to keep order. The Cuomo Commission reports, for example, that New York City shelters spent an average of $1,500 a month for each shelter resident in 1991. Yet despite this staggering outlay, the city’s congregate shelters were so disagreeable and dangerous that thousands of single adults preferred the streets.

Congregate shelters also have another big drawback. Because they almost all evict their residents early in the morning, they do nothing to reduce daytime homelessness. Few communities offer shelter residents either useful work or anything else to do during the day. Since the homeless cannot legally loiter on private property, they congregate in parks, train stations, subways, and other public spaces from which the police cannot evict them. This upsets the prosperous classes, who then become angry with the homeless.

Recognizing the limitations of congregate shelters, a number of nonprofit groups have spent the past decade trying to build affordable SROs for single adults. These places offer more privacy than most college dormitories, which usually require several students to share a room. SRO rooms usually have a window, a closet, a chair, a bed, and a table. Unlike most college dormitories, most also have some plumbing and cooking equipment. Because they are relatively attractive, demand is enormous. Because they meet middle-class standards, their costs are high. Building enough SRO rooms to accommodate every eligible applicant would therefore be very expensive.

The Clinton administration estimates that seven million people were homeless at some point between 1985 and 1990. Of these, roughly five million were single adults. If all these persons had been offered an SRO room in which the rent was limited to 30 percent of their income, as it is in most subsidized housing, the great majority would probably have accepted the offer. Some would eventually have moved out in order to live with a lover, and some would eventually have earned enough to afford more attractive quarters. But a large proportion of those who moved in would have remained until they died. If two or three million poor single adults were now getting such housing, and if each room cost the taxpayer $200 a month, the bill would be around $5 billion a year. If we allow for the likelihood that more single adults would have become homeless if homelessness had been the path to a permanently subsidized room of one’s own, the bill might be more like $10 billion. That is not a lot in a $6 trillion economy, but it is far more than most politicians think voters are willing to spend.
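The cost estimate above can be reproduced directly; the two- to three-million tenant range and the $200 monthly cost per room are the article’s own assumptions, not independent figures:

```python
# Back-of-the-envelope SRO cost estimate from the paragraph above.
monthly_cost = 200                        # taxpayer cost per room, dollars/month
for tenants in (2_000_000, 3_000_000):    # the article's assumed range of takers
    annual = tenants * monthly_cost * 12  # dollars per year
    print(f"{tenants / 1e6:.0f} million tenants: ${annual / 1e9:.1f} billion a year")
```

The low end of the range comes to $4.8 billion a year, consistent with the “around $5 billion” figure in the text; the high end comes to $7.2 billion.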

New York City probably spends more per capita on the homeless than any other major American city. It has the most vocal and best organized advocates for the homeless in the country. Yet the administration of the newly elected mayor, Rudolph Giuliani, has proposed to cut from the budget the $89 million the city had planned to spend over the next three years on a loan program to support SROs. Elsewhere, the prospects are far worse.

If plans to build large numbers of SROs could get strong political support, they would obviously be desirable; but it seems extremely doubtful that they will be built in anything like adequate quantity. If our goal is to give everyone some private space from which they can exclude strangers, to which they can escape when life overwhelms them, and in which they can leave their possessions when they venture outdoors, the only feasible way to do it in the foreseeable future is to build cubicle hotels. Many Americans will balk at this approach, on the grounds that no society should expect people to live in such conditions. Rooms without windows strike many people, including me, as particularly grim. Nonetheless, when cubicles were widely available, almost everyone preferred them to congregate shelters, despite the fact that shelters were free.

Certainly efforts should be made to construct cubicle hotels that are as decent as possible. But because they would, at best, be far less appealing than today’s SROs, their residents would be less likely to remain in them indefinitely. If we offered every single adult in the country a cubicle and set the rent at 30 percent of the occupant’s income, I doubt that more than a million people would take up the offer at any one time, and the number might be far smaller. Cubicle hotels would provoke opposition in some neighborhoods, but they would also cost far less to build and maintain than regular SRO rooms. A program of this kind could probably get by on $2 billion a year, assuming code requirements were no higher than those we set for cubicle hotels earlier in the century. Such a program would sharply reduce both the number of people sleeping in public places and the number wandering the streets during the day.
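The implied cost per occupant of that $2 billion program is easy to work out from the figures above (the one-million ceiling on take-up is the article's own upper bound, used here only for illustration):

```python
# Implied per-cubicle cost of the $2 billion program sketched above,
# assuming at most one million occupants at any one time.
TOTAL_BUDGET = 2_000_000_000   # dollars per year
OCCUPANTS = 1_000_000          # upper bound on take-up given in the text

per_person_year = TOTAL_BUDGET / OCCUPANTS
per_person_night = per_person_year / 365

print(f"${per_person_year:,.0f} per person per year")   # -> $2,000
print(f"${per_person_night:.2f} per night")             # -> $5.48
```

A nightly cost in the $5 range is in line with the $8-a-night cubicles discussed below, once the 30 percent rent contribution is counted.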

If the government is to offer the homeless privacy, however, current politics requires that it ask something in return, namely work. The best way to do this, I think, would be to organize local day-labor markets under public auspices. Everyone who wanted work would show up at an early hour. If no private employer hired them, they would be entitled to public employment, cleaning up parks or neighborhoods or public buildings, or doing whatever work the community needed done. In return, they would get vouchers for a cubicle hotel and three meals, plus a small sum for spending money. Assuming that the cubicles were $8 a night, that the meals were worth another $8, and that we provided bus tokens and a couple of dollars in cash, four hours of work should entitle anyone to room and board for the day. Those who wanted better accommodations, better food, or more cash should be able to work longer and get more. By paying largely in vouchers rather than cash, we could limit demand to the homeless and reassure taxpayers that their money was not going for drugs or alcohol.
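The value of the proposed work-for-shelter package, and the hourly wage it implies, follow directly from the figures above. In the sketch below, "a couple of dollars in cash" is taken as $2 to $3 purely for illustration, and the bus tokens are priced at zero:

```python
# Value of the daily room-and-board package proposed above.
CUBICLE = 8.0                  # one night's lodging, dollars
MEALS = 8.0                    # three meals, dollars
cash_low, cash_high = 2.0, 3.0 # "a couple of dollars" (illustrative range)
HOURS = 4                      # work required for the package

package_low = CUBICLE + MEALS + cash_low
package_high = CUBICLE + MEALS + cash_high

print(f"daily package: ${package_low:.0f}-{package_high:.0f}")
print(f"implied wage: ${package_low / HOURS:.2f}-{package_high / HOURS:.2f}/hour")
# -> daily package: $18-19
# -> implied wage: $4.50-4.75/hour
```

For comparison, the federal minimum wage at the time this article appeared was $4.25 an hour, so the implied wage is roughly at that level.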

It is not clear how many of the homeless would be willing to work four hours a day for a cubicle and three cheap meals. But even if most of them refused such an offer, that would not be an argument against making it. Society has no obligation to support people who refuse to accept its most elementary rules, as some of the homeless do. But it does have an obligation to help those who are willing to work, as a significant proportion of the homeless are. Such people deserve something better than a congregate shelter or a doorway.

The difficult question is how much a society can require of those who seek work. Must workers be sober? Must they actually apply themselves to the task at hand? Can they be fired? If workers can be fired, should the standards be those a private employer would use? If the public sector is to use the same standards as the private sector, does that mean the standards private firms use when unemployment is 3 percent or when it is 8 percent?

My own view is that a public day-labor market should ask as much as the private one asks in normal times, and that it should offer workers who perform unusually well a chance at either better jobs with somewhat higher wages or at technical training. With luck, a public labor market could then provide some people with a path to steady employment in the private sector. Some loafing on the job is obviously inevitable. It occurs everywhere. But if a public day-labor market tolerates flagrant malingerers, others will soon follow their example, little useful work will get done, and the voters will soon weary of the whole charade.

This strategy will not solve the problems of the severely disabled, especially the mentally ill. They often need special housing arrangements adapted to their distinctive problems and needs, and of course they must be adequately supported if they cannot work. To finance such housing we need to take the budgetary implications of deinstitutionalization seriously, cutting state hospital budgets and redirecting this money to housing vouchers for the mentally ill. We also need to convert a significant proportion of the disability benefits we now give the mentally ill from cash to housing vouchers. When that is done we should make it illegal to release patients from hospitals until they have a place to live.

We also need a different strategy for helping homeless families with children, who now make up something like a sixth of the total homeless population. These families’ problems demonstrate the failure of the present welfare system. At present, Aid to Families with Dependent Children hardly ever covers a single mother’s living expenses. In order to keep her family together, an unskilled single mother needs to supplement her AFDC check. In theory, she is supposed to report any extra income to the local authorities, who cut her welfare check by an equivalent amount. But since that would leave her unable to support her family, she usually ignores this rule. Some welfare mothers supplement their monthly AFDC check by doing piecework or other jobs that are “off the books.” Others get money from their relatives, from their boyfriends, or from the fathers of their children. If they cannot make ends meet in this way, they often move in with somebody else. If that is not possible, and if they cannot get into subsidized housing, they have to break up their family or move to a shelter. In many cases, a shelter is their last hope of keeping their family together.

The best way to solve these families’ problems would be to raise AFDC benefits. But most voters oppose this solution because they want to make single motherhood less attractive, not more so. If schemes for helping single mothers are to be made more politically acceptable, they must be part of a larger program with a broader constituency. The Earned Income Tax Credit, which helps all low-wage workers with children, is such a program. National health insurance could be another. But these are programs for the working poor. They will not help many homeless mothers, because these are women whom hardly anyone wants to hire, even at the minimum wage.

One politically workable solution might be to redistribute HUD’s low-income housing budget. At present, HUD provides quite generous benefits—sometimes as much as $800 a month—to a large minority of low-income families but nothing to the majority. If HUD limited the value of any one family’s housing subsidy to, say, $250 a month, it could help many more poor families. That might reduce family homelessness significantly without raising HUD’s budget.
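The redistribution arithmetic here can be made explicit. Using the two figures given in the text, each family now receiving the most generous subsidy could, under the cap, be replaced by several families at the same cost:

```python
# How far the same HUD dollars would stretch under the proposed cap.
CURRENT_SUBSIDY = 800   # dollars/month, the text's example of a generous subsidy
CAPPED_SUBSIDY = 250    # dollars/month, the proposed ceiling

families_per_current_grant = CURRENT_SUBSIDY / CAPPED_SUBSIDY
print(f"{families_per_current_grant:.1f} families helped for the cost of one")
# -> 3.2 families helped for the cost of one
```

This is only the upper end of the effect, since most existing subsidies fall somewhere between the cap and the $800 figure.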

Such changes in housing policy will not work miracles. Almost everyone who deals with the homeless now agrees that they need help with alcohol and drug addiction, depression, schizophrenia, and many other ills. Some would also benefit from additional schooling and job training. Many influential groups, including the Cuomo Commission, have argued that the best way to deal with these problems is with intensive social services. Unfortunately, such services are always expensive and often ineffective. If there is to be any hope of significantly increasing political support for such programs, we need to invent more reliable devices for eliminating services that do not work. One strategy might be to give the homeless vouchers for such services, letting them decide what is helpful. Then if a service got no customers, the government would pay nothing. Another possibility is to contract for services with public or private agencies that are paid by results.

Our dilemma in dealing with the homeless, both as individuals and as a society, is to reconcile the claims of compassion and prudence. Anyone who ponders this dilemma will soon find that they need to understand the homeless as well as the housing market. For that purpose there is no better guide than Elliot Liebow. Liebow’s 1967 study of poor black men in Washington, DC, Tally’s Corner, is the finest ethnographic account of urban poverty I know of. In 1984, after almost a quarter century of working for the federal government, Liebow discovered that he had cancer, retired, and started working in a shelter for homeless women outside Washington. Tell Them Who I Am recounts the shelter residents’ stories, starting before they arrived at the shelter and often continuing after they returned to conventional housing.

At the end of the book, Liebow quotes a woman whom he calls Shirley. “I’m fifty-three years old,” she says, “I failed at two marriages and I failed at every job I ever had. Is that any reason I have to live on the street?” I doubt that any government program will solve this woman’s marital or employment problems. But we can keep her off the street. Because we can, we should.

(This is the second part of a two-part article.)

This Issue

May 12, 1994