Gates: How Microsoft’s Mogul Reinvented an Industry and Made Himself the Richest Man in America
Game Over: How Nintendo Zapped an American Industry, Captured Your Dollars, and Enslaved Your Children
Big Blues: The Unmaking of IBM
The “computer revolution” of the last twenty years or so is often discussed as if it were a single huge phenomenon. But it has involved many separate technical and business trends moving in different directions at different speeds. The technical change that has had the biggest impact on daily life has been the phenomenal advance in semiconductor chips over the past fifteen years. The chips, tiny collections of electronic circuits, fall into two large categories: memory chips (which store information) and processors (which carry out instructions about what to do with the information). The central-processing chip that controls one of today’s typical personal computers operates about one hundred times as fast as the chip supplied with the original IBM personal computer in 1981. The memory chips in today’s typical personal computers can store five hundred times as much information as the original IBM PC did, for about the same price.1
These huge advances in a computer’s speed and capacity have in turn made possible far more sophisticated programs than computers could previously run. Twenty years ago, all interaction between a computer and its human user had to be carried out in computer language, which ranges from “assembly language code,” consisting entirely of arcane abbreviations, to programming languages like BASIC or Pascal, whose command structures use recognizable English words (“IF/THEN/ELSE,” “GOTO,” and so on). It took a great leap to move from such languages to the “graphical interface,” which makes it possible to control the computer by manipulating simple iconic drawings on the screen with a hand-held “mouse.” Such “graphical” software required much more memory and much faster processing speed than early machines possessed.
Business empires have sprung up (and in many cases crashed down) in the last decade, as technology and market tastes have been constantly changing. For example, Motorola and, especially, Intel dominate the processing-chip business. Motorola makes the main processing chips used in Macintosh computers; Intel makes the chips for most other personal computers. These two companies have prospered through a combination of technical excellence and business aggressiveness, fighting many legal wars to defend their chip designs against imitators.
Almost every niche of the computer industry contains its own idiosyncratic business dramas. Fifteen years ago, most of the information processed on a personal computer had to be stored on a “floppy disk,” which worked very slowly and had limited space. The development of small hard-disk drives, which store far more data inside the computer and retrieve it far more quickly, created a whole new industry in disk production. Since then, companies specializing in hard-disk drives, such as Shugart, Core, Seagate, Maxtor, and Conner, have surged to profitability and momentary prominence within the industry only to fall back when new competitors and new technologies emerged. The market for display screens, by contrast, has been dominated through the years by big, familiar companies like Sony, Zenith, and NEC (along with a few newcomers, like the Japanese firm Nanao). A dozen years ago Compaq started producing the first “IBM-compatible” personal computers, machines that competed successfully with IBM by being engineered to accommodate IBM software and accessories but at a lower price and with greater speed. Compaq’s original innovation was a portable computer, much smaller and lighter than the IBM PC, but the company has since made several dramatic changes in business strategy in order to survive. Other companies that seemed like promising clone-makers a decade ago have vanished altogether.
During this same period software firms have also gone through radical ups and downs. In the early 1980s, the most popular word-processing program was WordStar. Then two computer scientists at Brigham Young University formed a company called Satellite Software and introduced their own word-processing program, WordPerfect. By the late 1980s WordPerfect was the dominant word processor in the world, and WordStar had virtually disappeared (along with the Wang corporation, which previously had led the market for word processors in large corporations but which went into bankruptcy in 1992). But late last year WordPerfect replaced its president, announced it would lay off one sixth of its work force, and displayed other signs of corporate distress. Nearly fifteen years ago, a program called VisiCalc dramatically shaped the growth of the computer industry. VisiCalc introduced the concept of a “spreadsheet”—a grid of dynamically linked numbers and formulas, which allowed users to see how changes in each variable, from mortgage-interest rates to monthly sales estimates, would affect the final result. The VisiCalc spreadsheet gave many business officials and bankers their first clear idea of how small computers might be useful to them.
This in turn helped to create a mass market for the first widely available personal computer, the Apple II, which had been introduced in 1977. In retrospect, the spreadsheet also helped to bring about the merger and takeover binge of the 1980s. With spreadsheets, analysts could quickly crank out calculations of how share price, interest rate, asset valuation, and other factors would affect takeover bids. People interested in takeovers have always made such calculations, of course, but without spreadsheets the process would have been too slow and cumbersome to permit bidding wars like those described in Barbarians at the Gate by Bryan Burrough and John Helyar. Yet VisiCalc, for all its historic effect, has now practically disappeared, having been displaced in the mid-1980s by Lotus’s spreadsheet “1-2-3”—which in its turn has largely been surpassed by newer programs.
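The “dynamically linked” grid that made the spreadsheet so powerful can be sketched in a few lines of modern code. This is an illustrative toy only, and assumes nothing about how VisiCalc or 1-2-3 were actually implemented: cells hold either plain values or formulas, and a formula is recomputed from the current inputs whenever its cell is read.

```python
# Toy spreadsheet: cells are either values or formulas (functions of the
# sheet). Reading a formula cell recomputes it from current inputs, so
# changing one variable changes every result that depends on it.

class Sheet:
    def __init__(self):
        self.cells = {}  # cell name -> value, or callable taking the sheet

    def set(self, name, value):
        self.cells[name] = value

    def get(self, name):
        v = self.cells[name]
        return v(self) if callable(v) else v

sheet = Sheet()
sheet.set("price", 100.0)    # hypothetical share price
sheet.set("shares", 1000)    # hypothetical number of shares
# "bid" is dynamically linked to the two input cells above.
sheet.set("bid", lambda s: s.get("price") * s.get("shares"))

print(sheet.get("bid"))      # 100000.0
sheet.set("price", 120.0)    # change one variable...
print(sheet.get("bid"))      # ...and the linked result becomes 120000.0
```

Changing the “price” cell and watching “bid” recompute is exactly the what-if calculation that takeover analysts ran over and over.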
Of these many dramas, two are most frequently discussed in the computer world today. One involves the long decline of IBM. Twenty-five years ago it dominated its business more thoroughly than any other firm in any other field. So complete was its mastery of the technology, marketing, and standards for the computer business that it spent much of the 1970s fighting an anti-trust suit the US Justice Department filed to give IBM’s competitors a chance to survive. Since the mid-1980s IBM has lost more than $70 billion of stock valuation and has eliminated 200,000 jobs. Efforts to explain this decline have become the business press’s counterpart to analyzing the fall of the Roman Empire, with no Gibbon yet at hand.
The other major drama is the rise of Microsoft, the giant of the software business in the United States and worldwide. Microsoft’s story is usually paired with IBM’s because they offer such striking contrasts. Microsoft was a tiny firm at the beginning of the 1980s; by the early 1990s its stock valuation exceeded that of IBM. Microsoft, based in Redmond, Washington, outside Seattle, has a cocky, swaggering corporate reputation and bears the stamp of its thirty-eight-year-old founder, Bill Gates. (IBM, based in Armonk, New York, has a longstanding reputation for conformity and stodginess and has difficulty overcoming it.) Microsoft got its crucial break with the crumbs from IBM’s table, a contract in 1980 to provide software for the first IBM personal computer. The result was DOS (Disk Operating System), the set of instructions used to operate nearly all IBM-compatible personal computers. Microsoft’s copyright control of this crucial system is the foundation of its spectacular successes. In the last decade Microsoft has sold at least 60 million copies of its operating-system software, which accounts for about 80 percent of all such software sold in the world. Through the mid-1980s Microsoft was IBM’s partner in software development; late in the decade it split with IBM, and the two companies now battle for the power to set standards for the personal computer industry.
Writing about business has begun to catch up with these developments. The books under review cover many different aspects of high-tech business competition. The common message that emerges is that for all the skill and determination that have gone into creating the new industrial empires, blind luck has often been decisive.
Randall Stross is a California writer whose previous book, Bulls in the China Shop, described the misadventures of foreign firms in China. The story he tells in Steve Jobs and the NeXT Big Thing is the second chapter of one of the computer industry’s most familiar and important tales.
The first chapter of this story involves the efforts of Steve Jobs and Steve Wozniak, two young Californians, to create the first commercially successful personal computer, which they called the Apple. Computers are generally classified into three categories. “Mainframe” computers are the huge machines used by airlines, banks, the federal government, and other large organizations. IBM is the traditional power in this field. “Minicomputers” are smaller machines still designed for institutional use, for example in universities. Digital Equipment, or DEC, has been a leading minicomputer maker. Apple was the first popular “personal” computer—that is, a computer designed to be used by one person, rather than shared by many users in a network or a central data-processing site. These machines were also known for a while as “microcomputers,” referring to the “microprocessor” chips that controlled them.
The Apple was not in a strict sense the first personal computer. The true pioneer in this field was the Altair 8800, released in February of 1975, which had no keyboard and accepted coded instructions through a bank of toggle switches. Early in 1976, Jobs and Wozniak, then both in their early twenties, founded Apple Computer; the following year they released the Apple II, which had a keyboard and could be connected to an external monitor and disk drive, like a modern machine. The appearance of the VisiCalc program heightened demand for the Apple, and by 1980, just before the appearance of the IBM personal computer, Apple was perhaps the most admired small computer company in Silicon Valley, and Jobs was its leader and symbol.
Things were never the same for Apple after the IBM PC appeared. The two companies—huge, cautious IBM and young, fast-growing Apple—started out with similar strategies but ended up in far different circumstances by the middle of the 1980s. Apple hoped that independent software companies would write programs and games that would run on the Apple II computer. The more software the computer could run, the more attractive it would be to purchasers. When IBM entered the market it naturally hoped that its system, which was completely incompatible with Apple’s, would instead become the standard, attracting more support from software companies and more customers. While both companies welcomed software that would run on their respective computers, only Apple was vigorous in using lawsuits and other means to fight off companies that tried to produce imitations of the computer itself. IBM watched passively as an industry of “IBM-compatible” clones grew up.
IBM’s approach succeeded in establishing an industry standard: more than 85 percent of personal computers sold worldwide are based on the IBM design. (Such machines are now referred to simply as “PC-compatible.”) Unfortunately from IBM’s point of view, it makes only about a quarter of these machines. While computers based on the Apple design, mainly the Macintosh, are much less popular overall, Apple Computer makes them all.
The first IBM PC came with “16K of RAM,” or slightly over 16,000 bytes of random-access memory, each byte being equal approximately to one digit or character. It is not unusual for computers today to come with “8MB of RAM,” slightly over eight million bytes, which is 500 times as much as in the first IBM PC.↩
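The footnote’s arithmetic can be checked directly, using the standard convention that “K” means 1,024 bytes and “MB” means 1,024 × 1,024 bytes:

```python
# Verify the footnote: "16K" vs. "8MB" of RAM.
# 1K = 1,024 bytes; 1MB = 1,024 * 1,024 bytes.
ibm_pc_ram = 16 * 1024          # 16,384 bytes: "slightly over 16,000"
typical_ram = 8 * 1024 * 1024   # 8,388,608 bytes: "slightly over eight million"
ratio = typical_ram / ibm_pc_ram
print(ratio)                    # 512.0, roughly the "500 times" cited
```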