
How the Computers Exploded

The boldest of Turing’s ideas was that of a universal machine: one that, if fed the code number of any special-purpose Turing machine, would perfectly mimic its behavior. For instance, if a universal Turing machine were fed the code number of the Turing machine that performed addition, the universal machine would temporarily turn into an adding machine. In effect, the “hardware” of a special-purpose machine could be translated into “software” (the machine’s code number) and then entered like data into the universal machine, where it would run as a program. That is exactly what happens when your laptop, which is a physical embodiment of Turing’s universal machine, runs a word-processing program, or when your smartphone runs an app.

As a byproduct of solving a problem in pure logic, Turing had originated the idea of the stored-program computer. When Turing subsequently arrived at Princeton as a graduate student, von Neumann made his acquaintance. “He knew all about Turing’s work,” said a codirector of the computer project. “The whole relation of the serial computer, tape and all that sort of thing, I think was very clear—that was Turing.” Von Neumann and Turing were virtual opposites in character and appearance: the older man a portly, well-attired, and clubbable sybarite who relished wielding power and influence; the younger one a shy, slovenly, dreamy ascetic (and homosexual), fond of intellectual puzzles, mechanical tinkering, and long-distance running. Yet the two shared a knack for getting to the logical essence of things. After Turing completed his Ph.D. in 1938, von Neumann offered him a salaried job as his assistant at the institute; but with war seemingly imminent, Turing decided to return to England instead.

“The history of digital computing,” Dyson writes, “can be divided into an Old Testament whose prophets, led by Leibniz, supplied the logic, and a New Testament whose prophets, led by von Neumann, built the machines. Alan Turing arrived in between.” It was from Turing that von Neumann drew the insight that a computer is essentially a logic machine—an insight that enabled him to see how to overcome the limitations of ENIAC and realize the ideal of a universal computer. With the war over, von Neumann was free to build such a machine. And the leadership of the Institute for Advanced Study, fearful of losing von Neumann to Harvard or IBM, obliged him with the authorization and preliminary funding.

There was widespread horror among the institute’s fellows at the prospect of such a machine taking shape in their midst. The pure mathematicians tended to frown on tools other than blackboard and chalk, and the humanists saw the project as mathematical imperialism at their expense. “Mathematicians in our wing? Over my dead body! and yours?” a paleographer cabled the institute’s director. (It didn’t help that fellows already had to share their crowded space with remnants of the old League of Nations that had been given refuge at the institute during the war.) The subsequent influx of engineers raised hackles among both the mathematicians and the humanists. “We were doing things with our hands and building dirty old equipment. That wasn’t the Institute,” one engineer on the computer project recalled.

Dyson’s account of the long struggle to bring MANIAC to life is full of fascinating and sometimes comical detail, involving stolen tea-service sugar, imploding tubes, and a computer cooling system that was farcically prone to ice over in the Princeton humidity. The author is unfailingly lucid in explaining just how the computer’s most basic function—translating bits of information from structure (memory) to sequence (code)—was given electronic embodiment. Von Neumann himself had little interest in the minutiae of the computer’s physical implementation; “he would have made a lousy engineer,” said one of his collaborators. But he recruited a resourceful team, led by chief engineer Julian Bigelow, and he proved a shrewd manager. “Von Neumann had one piece of advice for us,” Bigelow recalled: “not to originate anything.” By limiting the engineers to what was strictly needed to realize his logical architecture, von Neumann saw to it that MANIAC would be available in time to do the calculations critical for the hydrogen bomb.

The possibility of such a “Super bomb”—one that would, in effect, bring a small sun into existence without the gravity that keeps the sun from flying apart—had been foreseen as early as 1942. If a hydrogen bomb could be made to work, it would be a thousand times as powerful as the bombs that destroyed Hiroshima and Nagasaki. Robert Oppenheimer, who had led the Los Alamos project that produced those bombs, initially opposed the development of a hydrogen bomb on the grounds that its “psychological effect” would be “adverse to our interest.” Other physicists, like Enrico Fermi and Isidor Rabi, were more categorical in their opposition, calling the bomb “necessarily an evil thing considered in any light.” But von Neumann, who feared that another world war was imminent, was enamored of the hydrogen bomb. “I think that there should never have been any hesitation,” he wrote in 1950, after President Truman decided to proceed with its development.

Perhaps the fiercest advocate of the hydrogen bomb was the Hungarian-born physicist Edward Teller, who, backed by von Neumann and the military, came up with an initial design. But Teller’s calculations were faulty; his prototype would have been a dud. This was first noticed by Stanislaw Ulam, a brilliant Polish-born mathematician (elder brother to the Sovietologist Adam Ulam) and one of the most appealing characters in Dyson’s book. Having shown that the Teller scheme was a nonstarter, Ulam produced, in his typically absent-minded fashion, a workable alternative. “I found him at home at noon staring intensely out of a window with a very strange expression on his face,” Ulam’s wife recalled. “I can never forget his faraway look as peering unseeing in the garden, he said in a thin voice—I can still hear it—‘I found a way to make it work.’”

Now Oppenheimer—who had been named director of the Institute for Advanced Study after leaving Los Alamos—was won over. What became known as the Teller-Ulam design for the H-bomb was, Oppenheimer said, “technically so sweet” that “one had to at least make the thing.” And so, despite strong opposition on humanitarian grounds among many at the institute (who suspected what was going on from the armed guards stationed by a safe near Oppenheimer’s office), the newly operational computer was pressed into service. The thermonuclear calculations kept it busy for sixty straight days, around the clock, in the summer of 1951. MANIAC did its job perfectly. Late the next year, “Ivy Mike” exploded in the South Pacific, and Elugelab island was removed from the map.

Shortly afterward, von Neumann had a rendezvous with Ulam on a bench in Central Park, where he probably informed Ulam firsthand of the secret detonation. But then (judging from subsequent letters) their chat turned from the destruction of life to its creation, in the form of digitally engineered self-reproducing organisms. Five months later, the discovery of the structure of DNA was announced by Francis Crick and James Watson, and the digital basis of heredity became apparent. Soon MANIAC was being given over to problems in mathematical biology and the evolution of stars. Having delivered its thermonuclear calculations, it became an instrument for the acquisition of pure scientific knowledge, in keeping with the purpose of the institute where it was created.

But in 1954 President Eisenhower named von Neumann to the Atomic Energy Commission, and with his departure the institute’s computer culture went into decline. Two years later, the fifty-two-year-old von Neumann lay dying of bone cancer in Walter Reed Army Hospital, disconcerting his family by converting to Catholicism near the end. (His daughter believed that von Neumann, an inventor of game theory, must have had Pascal’s wager in mind.) “When von Neumann tragically died, the snobs took their revenge and got rid of the computing project root and branch,” the author’s father, Freeman Dyson, later commented, adding that “the demise of our computer group was a disaster not only for Princeton but for science as a whole.” At exactly midnight on July 15, 1958, MANIAC was shut down for the last time. Its corpse now reposes in the Smithsonian Institution in Washington.

Was the computer conceived in sin? The deal von Neumann made with the devil proved less diabolical than expected, the author observes: “It was the computers that exploded, not the bombs.” Dyson’s telling of the subsequent evolution of the digital universe is brisk and occasionally hair-raising, as when he visits Google headquarters in California and is told by an engineer there that the point of Google’s book-scanning project is to allow smart machines to read the books, not people.

What is most interesting, though, is how von Neumann’s vision of the digital future has been superseded by Turing’s. Instead of a few large machines handling the world’s demand for high-speed computing, as von Neumann envisaged, a seeming infinity of much smaller devices, including the billions of microprocessors in cell phones, have coalesced into what Dyson calls “a collective, metazoan organism whose physical manifestation changes from one instant to the next.” And the progenitor of this virtual computing organism is Turing’s universal machine.

So it is fitting that Dyson titled his book Turing’s Cathedral. The true dawn of the digital universe came not in the 1950s, when von Neumann’s machine started running thermonuclear calculations. Rather, it was in 1936, when the young Turing, lying down in a meadow during one of his habitual long-distance runs, conceived of his abstract machine as a means of solving a problem in pure logic. Like von Neumann, Turing was to play an important behind-the-scenes role in World War II. Working as a code-breaker for his nation at Bletchley Park, he deployed his computational ideas to crack the Nazi “Enigma” code, an achievement that helped save Britain from defeat in 1941 and reversed the tide of the war.

But Turing’s wartime heroism remained a state secret well beyond his suicide in 1954, two years after he had been convicted of “gross indecency” for a consensual homosexual affair and sentenced to chemical castration. In 2009, British Prime Minister Gordon Brown issued a formal apology, on behalf of “all those who live freely thanks to Alan’s work,” for the “inhumane” treatment Turing received. “We’re sorry, you deserved so much better,” he said. Turing’s imaginary machine did more against tyranny than von Neumann’s real one ever did.
