The Computer Wars

Gates: How Microsoft's Mogul Reinvented an Industry and Made Himself the Richest Man in America

by Stephen Manes and Paul Andrews
Touchstone/Simon and Schuster, 541 pp., $14.00 (paper)

Game Over: How Nintendo Zapped an American Industry, Captured Your Dollars, and Enslaved Your Children

by David Sheff
Random House, 445 pp., $25.00

Big Blues: The Unmaking of IBM

by Paul Carroll
Crown, 375 pp., $24.00
Steve Jobs; drawing by David Levine


The “computer revolution” of the last twenty years or so is often discussed as if it were a single huge phenomenon. But it has involved many separate technical and business trends moving in different directions at different speeds. The technical change that has had the biggest impact on daily life has been the phenomenal advancement in semiconductor chips over the past fifteen years. The chips, tiny collections of electronic circuits, fall into two large categories: memory chips (which store information) and processors (which carry out instructions about what to do with the information). The central-processing chip that controls one of today’s typical personal computers operates about one hundred times as fast as the chip supplied with the original IBM personal computer in 1981. The memory chips on today’s typical personal computers can store five hundred times as much information as the original IBM PC did, for about the same price.1

These huge advances in a computer’s speed and capacity have in turn made possible far more sophisticated programs than computers could previously run. Twenty years ago, all interaction between a computer and its human user had to be carried out in computer language, which ranges from “assembly language code,” consisting entirely of arcane abbreviations, to programming languages like BASIC or Pascal whose command structures use recognizable English words (“IF/THEN/ELSE,” “GOTO,” and so on). It took a great leap to move from such languages to the “graphical interface,” which makes it possible to control the computer by manipulating simple iconic drawings on the screen with a hand-held “mouse.” Such “graphical” software required much more memory and much faster processing speed than early machines possessed.
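The keyword-driven command structures the review cites can be suggested with a short sketch, written here in modern Python rather than the BASIC or Pascal of the era; the function and variable names are invented purely for illustration:

```python
# Textual, keyword-driven control flow of the kind early users had to write:
# the program branches on IF/ELSE conditions, with no icons or mouse involved.
# (classify_temperature and its thresholds are invented for illustration.)

def classify_temperature(degrees):
    # IF/THEN/ELSE logic, the command structure the review mentions
    if degrees > 30:
        return "hot"
    elif degrees > 15:
        return "mild"
    else:
        return "cold"

for reading in (35, 20, 5):
    print(reading, classify_temperature(reading))
```

Every interaction with such a program happens through typed text and explicit branching rules; the graphical interface replaced this with direct manipulation of on-screen objects.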

Business empires have sprung up (and in many cases crashed down) in the last decade, as technology and market tastes have been constantly changing. For example, Motorola and, especially, Intel dominate the processing-chip business. Motorola makes the main processing chips used in Macintosh computers; Intel makes the chips for most other personal computers. These two companies have prospered through a combination of technical excellence and business aggressiveness, fighting many legal wars to defend their chip designs against imitators.

Almost every niche of the computer industry contains its own idiosyncratic business dramas. Fifteen years ago, most of the information processed on a personal computer had to be stored on a “floppy disk,” which worked very slowly and had limited space. The development of small hard-disk drives, which store far more data inside the computer and retrieve it far more quickly, created a whole new industry in disk production. Since then, companies specializing in hard-disk drives, such as Shugart, Core, Seagate, Maxtor, and Conner, have surged to profitability and momentary prominence within the industry, only to fall back when new competitors and new technologies emerged. The market for display screens, by contrast, has been dominated through the years by big, familiar companies like Sony, Zenith, and NEC (along with a…
