Hurry Up Please It’s Time

Investigating the Impact of the Year 2000 Problem

by the United States Senate Special Committee on the Year 2000 Technology Problem

The Simple Living Journal’s Y2K Preparation Guide: 110 Ways to Create a Sustainable Life—Crisis or Not

by Janet Luhrs and Cris Evatt
Simple Living Press, 87 pp., $12.00 (paper)

Y2K: The Millennium Bug

by Don L. Tiggre
Xlibris Press, 360 pp., $18.00 (paper)

Y2K: The Millennium Crisis

by Bradford Morgan
Hara Publishing, 631 pp., $12.95 (paper)

Get Rich with Y2K: How to Cash in on the Financial Crisis in the Year 2000

by Steven L. Porter
Piccadilly Books, 158 pp., $24.00 (paper)

1.

The Year 2000 computer problem originated in the 1950s and 1960s, when programmers decided to use two rather than four digits to represent a year. The date of Apollo 11’s launch to the moon, for instance, was registered in NASA programs as 07/16/69, rather than 07/16/1969. It was obvious to programmers that they needed to reserve two digits each for the day and month of a date—07 rather than just 7 for July. But there seemed every reason to economize on the year.

One reason was that computer memory was then so expensive. Random-access memory, the sort a computer uses to store the instructions and data for the program it is actively running, cost roughly one million times as much, per unit of storage, in 1960 as it does today. One economist has contended that, even if all the current end-of-the-century reprogramming problems had been foreseen, sticking with two digits in the 1950s would still have been rational because of the compounded value of the early savings in memory cost. But less purely rational considerations of habit and convenience were also important. Before the era of early UNIVAC-style computers, businesses and government agencies had fed data into tabulating machines by using punch cards. The standard card had room for only eighty columns of data, a punch in each column indicating a digit or letter, so it was sensible to confine the year to two columns rather than four. The early computer programmers merely continued this practice. Such abbreviation also fit the way people usually talk and think. They speak of the Roaring Twenties, the Gay Nineties, the Spirit of Seventy-Six, and don’t bother with the century.

Once the two-digit standard was established, it naturally persisted, even as the price of memory fell. The great headache of the computer industry is “backwards compatibility,” or dealing with “legacy code.” It would be easy for companies to introduce faster computers, and more effective programs, far more frequently than they do—if they didn’t have to worry about compatibility with all the computers and programs that already exist. Inserting “19” at the beginning of each year would be a trivial matter for, say, word-processing documents, since a change in one memo does not necessarily affect the ability to understand the next. But for database programs, like those used to manage a payroll or keep track of airline reservations, the size of each data field—the number of spaces allotted to store each name, date, or other unit of information—is part of the basic structure of the program. Changing a field from two digits to four can be a major undertaking, with ripple effects on other programs and databases that must also be changed to remain compatible. Programmers who could work with a more or less free hand, like the creators of the Macintosh in the early 1980s, could build in four-digit dates from the beginning (though not all did—for reasons of habit, two-digit dates turned up in brand-new programs well into the 1990s). But a typical company whose business was humming along fine with two-digit dates had every reason not to tamper with its programs, until the end of the century came in sight.

As that time approached, programmers had to reckon with a variety of difficulties that collectively make up the “Y2K problem” or “Millennium Bug.” Some highly computerized functions are not date-sensitive, and therefore can roll into the next century unchanged. VCRs, alarm clocks, automatic coffee makers, and other small appliances are controlled by computer chips—but they have no way of monitoring the real date or time. This is why they so often show a flashing “12:00,” and why they should not be affected on January 1. Modern cars contain hundreds of computer chips controlling the engine and brakes and calculating when maintenance is next due. But usually they reckon time by counting the number of days or hours the car is running, rather than referring to an absolute calendar. They know to signal “Servicing Due” when the odometer reaches 30,000 miles, but they generally don’t know when a new year has started. The same is true of even the most advanced airplanes, which are full of computerized chips but whose systems generally keep track of accumulated operating time rather than the actual hour, day, month, and year.

Programs that do refer to an absolute calendar could well have problems the first time they carry out a comparison operation next year. Comparing one value to another, and then taking different actions depending on which value is larger, is one of the fundamental elements of any computer program. The comparisons will be flawed next year in any system using two-digit dates, because events next year—year 00—will seem to come before events this year, year 99.

Most often this will produce nonsense results that the program will be unable to interpret. For instance, in calculating whether a credit card is still valid or a bank loan is due, the computer subtracts the issue date from the current date to see how many years have passed. If an issue date in year 95 is subtracted from a current date in year 00 to calculate elapsed time, the result will be a negative number (-95 years). Depending on how a program is written, a negative value for elapsed time could either produce a cascade of erroneous results or simply stop the program. In either case, the computer won’t do what the user intends. There are endless variations on the ways by which two-digit years can stop programs or make them report false data, all of them arising from the fact that events in the future appear to be in the past.
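
The arithmetic is easy to see in miniature. The sketch below (in Python, purely illustrative and not drawn from any particular payroll or credit card system) shows how a program that stores only two-digit years computes a negative elapsed time, and sorts the year 00 ahead of 99, once the calendar rolls over.

```python
# Purely illustrative: how two-digit years misbehave after the rollover.

def years_elapsed(issue_year: int, current_year: int) -> int:
    """Elapsed years as a two-digit-only program would compute them."""
    return current_year - issue_year

print(years_elapsed(95, 99))  # 4: a card issued in "95", checked in "99"
print(years_elapsed(95, 0))   # -95: the same card checked in "00"

# Comparisons go wrong the same way: year 00 appears to precede year 99,
# so an event next year seems to lie in the past.
print(0 < 99)                # True
print(sorted(["99", "00"]))  # ['00', '99']
```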

One special complication that adds to the Y2K problem involves leap year calculations. Years evenly divisible by four are, of course, leap years. Surprisingly enough, years divisible by 100 are not generally leap years—1800 and 1900 had only 365 days apiece. (This adjustment is necessary to reconcile the modern Gregorian calendar to the actual timing of the earth’s annual orbits around the sun.) But years divisible by 400 are leap years (a further calendar adjustment). So the year 2000, evenly divisible by 400, will have 366 days. If a computer, reading 2000 as “00,” assumes that the year is 1900, it will expect it to have only 365 days, and might therefore refuse to accept next February 29 or December 31 (the 366th day of 2000) as legitimate values.
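
A compact way to see the leap-year wrinkle is the Gregorian rule itself, sketched below in Python (again only as an illustration): a program that silently reads “00” as 1900 applies the century exception and concludes, wrongly, that the year has 365 days.

```python
# Illustrative only: the Gregorian leap-year rule.

def is_leap(year: int) -> bool:
    """Divisible by 4, except centuries, except every 400 years."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap(1900))  # False: 1900 had 365 days
print(is_leap(2000))  # True: 2000 has 366 days, including February 29

# A system that interprets "00" as 1900 would therefore reject
# February 29, 2000 as an impossible date.
import datetime
print(datetime.date(2000, 2, 29))  # a valid date
# datetime.date(1900, 2, 29) raises ValueError: day is out of range for month
```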

Conceptually, the solution to the Y2K problem is straightforward: find all two-digit date fields, convert them to four digits, and insert “19” or “20,” as appropriate. In practical terms this is a huge and tedious undertaking, since date fields can be buried in so many obscure ways in so many kinds of programs. Nearly every major software company has posted information about how to begin Y2K repair efforts. A glance at these gives an idea of the magnitude of the task.1
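
What “as appropriate” means in practice is itself a judgment call. One common repair technique uses a pivot year to decide which century a two-digit value belongs to; the sketch below (in Python, with a pivot of 50 chosen purely for illustration) shows the idea.

```python
# Illustrative sketch of "windowing": expanding a two-digit year to four digits.

PIVOT = 50  # an assumption for this example; real systems choose their own pivot

def expand_year(two_digit: int, pivot: int = PIVOT) -> int:
    """Two-digit values below the pivot are read as 20xx, the rest as 19xx."""
    return 2000 + two_digit if two_digit < pivot else 1900 + two_digit

print(expand_year(69))  # 1969, the year of Apollo 11's launch
print(expand_year(99))  # 1999
print(expand_year(0))   # 2000
```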

And beyond the effort to debug programs themselves, where the main obstacle is limited time, is the near impossibility of determining how many embedded chips might be coded in a way that causes problems next year. Worldwide, something like 50 billion microchips are now built into medical devices, industrial plant controls, transportation and navigation equipment, heating and cooling systems, and virtually every other underpinning of modern life. Some small fraction—most estimates say 1 to 2 percent—of devices and systems were built with internal calendar functions that may produce errors in the next century. Chips in the most vital systems, those controlling power generation, emergency room care, aviation, and so on, have largely been tested and repaired. But no one will know how many other chips are vulnerable until they begin failing.

Everyone who has written or spoken about the problem agrees that some computer systems will malfunction next year, simply because there is too much code in too many places to find or correct in time. What is striking is the very wide difference in expectations about how serious the consequences of failure will be.

The first striking aspect of this debate is the almost ostentatious nonchalance of the technician class. Those who are deepest into the computer culture seem most relaxed about what will happen on January 1. In six months of inquiry, I have not found a software executive or professional programmer who is willing to express serious concerns about the consequences of Y2K. All admit the inevitability of some disruption; all dismiss the possibility of widespread or significant breakdowns. When asked where he placed himself on a ten-point scale of concern, Charles Simonyi, a much-respected programmer at Microsoft, said, “I am at Zero. It will be like the ‘Skylab is falling’ scare.” The head of Y2K preparations at a major software company said, “I am doing nothing special, personally, to prepare for serious bad things to happen on January 1.” Eric Schmidt, CEO of the software company Novell, compared the hubbub over Y2K to that over Monica Lewinsky: “Everyone will be very excited about it, but in the end it won’t matter very much.” Experts have been wrong about their fields before, and it is possible that the professionals’ confidence will look like hubris a few months from now. But two main reasons seem to lie behind this outlook.

One is that programmers in particular and the software industry in general, although they usually miss deadlines, believe that they can meet them when they have to. A truism often cited in Y2K discussions is that most software development projects take far longer than promised. Therefore, however close the experts say they’ll be to correcting Y2K problems, the reality of next January 1 would seem likely to be worse.2 The programmers reply that there is an enormous difference between the schedule for a normal commercial product, such as the latest release of Lotus’s popular Notes software, and that for completing Y2K work. If an ordinary program reaches the market a month or even a year later than planned, it’s a nuisance but not a disaster. The new “R5” version of Notes was late—but so was its main competition, the latest Office software from Microsoft. Often the delay reflects a deliberate tradeoff, to allow new features to be designed and fitted in.

The deadline for Y2K work, in contrast, is entirely real, and programmers approach it with an emergency room mentality of fixing the truly urgent problems now and leaving niceties for later. Normal software development is like book publishing, in that deadlines are targets rather than imperatives, whereas Y2K repair work is like the evening TV news. If you extrapolated from missed deadlines in the book business, you’d think TV news could never get out on time—but it does, because the deadlines are real.

Beyond asserting that their colleagues can get the most important work done in time, programmers go on to claim that the software industry has already done so. The Y2K transition, they say, won’t happen like a thunderclap next New Year’s Eve, as many disaster scenarios assume. Indeed, it is already well underway, through the countless revised computer programs that must deal with dates in the future—and so far the transition has been surprisingly smooth.

  1.

    The checklist Microsoft offers for setting up a Y2K test program, for example, tells companies to begin by asking themselves these questions about their data systems:

    In general there are some areas to consider when testing the Year 2000:

    When testing a platform [i.e., a computer system] verify its handling of BIOS [Basic Input-Output System, the most elementary unit of computer instructions] errors related to the year 2000. Verify year display in each decade of 2000-2099. If it knows time zones, verify operation when UTC [Universal Time Coordinated, the fancy name for Greenwich Mean Time] is a different century than local time (both + and - from UTC).

    Test an application with 2-digit date entry and 2-digit date display both before 2000 and after. And when the entry/display is a date in the other century, both before and after.

    Verify entry/display of 4-digit dates.

    Verify sorting of dates where some are before 2000, some after.

    Verify day-of-week before 2/29/00, on 2/29, and after, on 2/1/01.

    Does your program do any “year ahead” calculations? Testing in 1999 may be critical….
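
    To make the checklist concrete, here is a minimal sketch (in Python, not part of Microsoft’s document) of the kind of checks it calls for: day-of-week behavior around February 29, 2000, and sorting of dates that straddle the century.

    ```python
    # Illustrative only: checks of the sort the Microsoft checklist describes.
    import datetime

    # Day-of-week before, on, and after the leap day.
    assert datetime.date(2000, 2, 28).strftime("%A") == "Monday"
    assert datetime.date(2000, 2, 29).strftime("%A") == "Tuesday"
    assert datetime.date(2000, 3, 1).strftime("%A") == "Wednesday"

    # Sorting dates where some fall before 2000 and some after.
    print(sorted(["1999-12-31", "2000-01-01"]))  # correct order with 4-digit years
    print(sorted(["99-12-31", "00-01-01"]))      # "00" sorts first with 2-digit years
    ```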

  2.

    For instance, a website called Y2KtimeBomb.com proposed this question for John Koskinen, director of the President’s Council on Year 2000 Conversion: “Eighty-four percent of all technology projects are finished late or not at all. Y2K is the largest technology project in history and it has a fixed deadline. Why is the government trying to convince us it is about to pull off the greatest technical miracle in history?”

    This is similar to the argument that a shortage of programmers familiar with old mainframe computer languages, especially COBOL, would place an absolute ceiling on the available manpower to fix the problem. What’s rarely mentioned is that a bright twenty-two-year-old schooled in modern computer languages can also learn COBOL if necessary. The popular For Dummies series even has a volume called COBOL for Dummies.
