The digital revolution of the 1990s seemed to mark a definitive break with the manufacturing economy that had thrived in the United States since the late nineteenth century. With the pervasive use of information technology (IT) by banks, insurance companies, hospitals, clinics, even warehouses and retail stores, the era of industrial mass production in the United States faded into the past. Also redundant were the blue-collar workers who had manned the old assembly lines. With 80 percent of the American workforce now employed in white-collar service industries, economists assumed that there was no longer any need for a large industrial proletariat with limited skills, passively taking orders from above.
In the years since the long economic boom of the 1990s came to an end in 2000–2001, there has been growing evidence that this view of recent economic history is flawed. In fact, the findings of the three books under review here, along with much recent research, suggest that methods of production based on top-down standardization and tight control of work are as influential in the digital economy as they were in the industrial economy. Drawing upon the virtually unlimited powers of computers to monitor the activities of employees and their use of information, these methods have simply been readapted for the white-collar workplace.
What is striking is how they have been used in ways that put skilled workers in many professions at a disadvantage. In an economy more and more populated by “knowledge workers”—people who work primarily with information, for which they develop special skills and expertise—one would expect the productivity, or output per person, and real income of employees to move upward together, as an increasingly skilled workforce benefits from its own improved efficiency. But since 1995, the year when the “new economy” based on information technology began to take off, incomes have not kept up with productivity, and during the past five years the two have spectacularly diverged. Between 1995 and 2006, the growth of employee productivity exceeded the growth of employee real wages by 340 percent. Between 2001 and 2006, the first six years of George W. Bush’s presidency, this gap widened alarmingly to 779 percent.1
The gap helps explain why Wal-Mart casts such a long shadow over the US economy. Wal-Mart has demonstrated the effectiveness of applying industrial principles to the retail economy. It does so by combining an intensive use of information technology, a rapid growth of employee productivity, and a harsh, often punitive work regime that keeps even the most productive workers off balance and their wages at poverty levels. Studies have shown, for example, that the productivity of Wal-Mart employees has been as much as 41 percent higher than that of the company’s competitors, yet shop floor workers are paid far less than at other discount stores. According to researchers at the Union of Food and Commercial Workers in Washington, the average hourly wage of UFCW members at the unionized Safeway, Albertsons, and Kroger supermarkets in California is $12.71; the comparable figure for Wal-Mart…