The digital universe came into existence, physically speaking, late in 1950, in Princeton, New Jersey, at the end of Olden Lane. That was when and where the first genuine computer—a high-speed, stored-program, all-purpose digital-reckoning device—stirred into action. It had been wired together, largely out of military surplus components, in a one-story cement-block building that the Institute for Advanced Study had constructed for the purpose. The new machine was dubbed MANIAC, an acronym of “mathematical and numerical integrator and computer.”
And what was MANIAC used for, once it was up and running? Its first job was to do the calculations necessary to engineer the prototype of the hydrogen bomb. Those calculations were successful. On the morning of November 1, 1952, the bomb they made possible, nicknamed “Ivy Mike,” was secretly detonated over a South Pacific island called Elugelab. The blast vaporized the entire island, along with 80 million tons of coral. One of the air force planes sent in to sample the mushroom cloud—reported to be “like the inside of a red-hot furnace”—spun out of control and crashed into the sea; the pilot’s body was never found. A marine biologist on the scene recalled that a week after the H-bomb test he was still finding terns with their feathers blackened and scorched, and fish whose “skin was missing from a side as if they had been dropped in a hot pan.”
The computer, one might well conclude, was conceived in sin. Its birth helped ratchet up, by several orders of magnitude, the destructive force available to the superpowers during the cold war. And the man most responsible for the creation of that first computer, John von Neumann, was himself among the most ardent of the cold warriors, an advocate of a preemptive military attack on the Soviet Union, and one of the models for the film character Dr. Strangelove. As George Dyson writes in his superb new history, Turing’s Cathedral, “The digital universe and the hydrogen bomb were brought into existence at the same time.” Von Neumann had seemingly made a deal with the devil: “The scientists would get the computers, and the military would get the bombs.”
The story of the first computer project and how it begat today’s digital universe has been told before, but no one has told it with such precision and narrative sweep as Dyson. The author, a noted historian of science, came to the task with two great advantages. First, as a visiting scholar at the Institute for Advanced Study in 2002–2003, he got access to records of the computer project that in some cases had not seen the light of day since the late 1940s. This makes for a wealth of quasi-novelistic detail in his book, ranging from the institute’s cafeteria menu one evening in 1946 (“Creamed …