Hurry Up Please It’s Time

Investigating the Impact of the Year 2000 Problem

by the Special Committee on the Year 2000 Technology Problem, United States Senate, 105th Congress

The Simple Living Journal's Y2K Preparation Guide: 110 Ways to Create a Sustainable Life
Crisis or Not

by Janet Luhrs and Cris Evatt
Simple Living Press, 87 pp., $12.00 (paper)

Y2K: The Millennium Bug

by Don L. Tiggre
Xlibris Press, 360 pp., $18.00 (paper)

Y2K: The Millennium Crisis

by Bradford Morgan
Hara Publishing, 631 pp., $12.95 (paper)

Get Rich with Y2K: How to Cash in on the Financial Crisis in the Year 2000

by Steven L. Porter
Piccadilly Books, 158 pp., $24.00 (paper)
Y2K specter; drawing by David Levine


The Year 2000 computer problem originated in the 1950s and 1960s, when programmers decided to use two rather than four digits to represent a year. The date of Apollo 11’s launch to the moon, for instance, was registered in NASA programs as 07/16/69, rather than 07/16/1969. It was obvious to programmers that they needed to reserve two digits each for the day and month of a date—07 rather than just 7 for July. But there seemed every reason to economize on the year.
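The trouble with the convention can be made concrete in a few lines. The sketch below is hypothetical, not drawn from any actual legacy system; it simply shows what happens when a program does arithmetic on two-digit years across the century boundary:

```python
# A hypothetical sketch of 1960s-style date arithmetic on
# two-digit years (not code from any actual legacy system).

def years_elapsed(start_yy: int, end_yy: int) -> int:
    """Naive elapsed-years calculation on two-digit years."""
    return end_yy - start_yy

# Someone hired in '69, measured in '99: the answer is right.
assert years_elapsed(69, 99) == 30

# The same person measured in '00, i.e., the year 2000: the
# program now reports negative sixty-nine years of service.
assert years_elapsed(69, 0) == -69
```

Every program that compared, sorted, or subtracted dates inherited some version of this flaw; the "69" that once meant 1969 becomes indistinguishable from 2069.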

One reason was that computer memory was then so expensive. Random-access memory, the sort a computer uses to store the instructions and data for the program it is actively running, cost roughly one million times as much, per unit of storage, in 1960 as it does today. One economist has contended that, even if all the current end-of-the-century reprogramming problems had been foreseen, sticking with two digits in the 1950s would still have been rational because of the compounded value of the early savings in memory cost. But less purely rational considerations of habit and convenience were also important. Before the era of early UNIVAC-style computers, businesses and government agencies had fed data into tabulating machines by using punch cards. The standard card had room for only eighty columns of data, a punch in each column indicating a digit or letter, so it was sensible to confine the year to two columns rather than four. The early computer programmers merely continued this practice. Such abbreviation also fit the way people usually talk and think. They speak of the Roaring Twenties, the Gay Nineties, the Spirit of Seventy-Six, and don’t bother with the century.

Once the two-digit standard was established, it naturally persisted, even as the price of memory fell. The great headache of the computer industry is “backwards compatibility,” or dealing with “legacy code.” It would be easy for companies to introduce faster computers, and more effective programs, far more frequently than they do—if they didn’t have to worry about compatibility with all the computers and programs that already exist. Inserting “19” at the beginning of each year would be a trivial matter for, say, word-processing documents, since a change in one memo does not necessarily affect the ability to understand the next. But for database programs, like those used to manage a payroll or keep track of airline reservations, the size of each data field—the number of spaces allotted to store each name, date, or other unit of information—is part of the basic structure of the program. Changing a field from two digits to four can be a major undertaking, with ripple effects on other programs and databases that must also be changed to remain compatible. Programmers who could work with a more or less free hand, like the creators of the Macintosh in the early 1980s, could build in four-digit dates from the beginning (though not…
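The field-size problem can be sketched as well. The record layout below is invented for illustration (no real payroll system is being quoted); it shows why a date field's width is structural, so that widening it from six characters to eight shifts the position of every field after it, in every program that reads the records:

```python
# A hypothetical fixed-width payroll record, 1960s database
# style: every field lives at a fixed byte offset. The layout
# and field names here are illustrative assumptions.
#   name: 20 chars | hire date: 6 chars (MMDDYY) | salary: 8 chars
RECORD_LAYOUT = [("name", 20), ("hire_date", 6), ("salary", 8)]

def parse(record: str, layout):
    """Slice a fixed-width record into named fields by offset."""
    fields, pos = {}, 0
    for name, width in layout:
        fields[name] = record[pos:pos + width].strip()
        pos += width
    return fields

rec = "DOE JOHN".ljust(20) + "071669" + " 1200.00"
parsed = parse(rec, RECORD_LAYOUT)
assert parsed["hire_date"] == "071669"
assert parsed["salary"] == "1200.00"

# Widening hire_date to MMDDYYYY moves salary's offset from
# byte 26 to byte 28 -- so every other program that reads these
# records must be changed in step, or it reads garbage.
```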
