Stephen Hawking, King’s College, Cambridge, 1987; photograph by David Gamble

The world’s first scientist-celebrity, Isaac Newton, was entombed in Westminster Abbey with high ceremony, alongside statesmen and royalty, under a monument ornately carved in white-and-gray marble, bearing a fulsome inscription in Latin: “Mortals rejoice that there has existed so great an ornament of the human race!” His fame had spread far across the European continent. In France the young Voltaire lionized him: “He is our Christopher Columbus. He has led us to a new world.” An outpouring of verse filled the popular gazettes (“Nature her self to him resigns the Field/From him her Secrets are no more conceal’d”). Medals bearing his likeness were struck in silver and bronze.

When Stephen Hawking died, in 2018, the abbey buried his ashes a few yards away from Newton’s grave. Besides family and luminaries, the interment ceremony drew so many mourners that a thousand were chosen by lottery and 26,000 turned away. “His name will live in the annals of science,” said Martin Rees, Britain’s astronomer royal. “Nobody else since Einstein has done more to deepen our understanding of space and time.” Hawking’s own words—generated by a speech synthesizer he used after a degenerative disorder took away his ability to speak—echoed from loudspeakers placed around the church: “I am very aware of the preciousness of time. Seize the moment. Act now. I have spent my life travelling across the universe inside my mind.” Far away in Spain, a radio telescope beamed the same words into the heavens. Benedict Cumberbatch, who had portrayed Hawking in a television movie, read from the Wisdom of Solomon: “For it is he who gave me unerring knowledge of what exists.” The abbey choir sang. Now the gift shop sells a postcard of his gravestone.

Hawking was no Newton. He said so himself. At a White House event in 1998, First Lady Hillary Clinton read a question from the Internet: “How does it feel to be compared to Einstein and Newton?” He replied, “I think to compare me to Newton and Einstein is media hype.” Then again, as Charles Seife demonstrates in Hawking Hawking, he “worked very hard to cultivate” these comparisons. He played for the White House audience a clip of his cameo appearance on Star Trek: The Next Generation, in which he banters with holographic representations of Newton and Einstein over a game of poker. Einstein calls his bet: “All the quantum fluctuations in the universe will not change the cards in your hand.” But Hawking is the winner in this game, as Seife recounts: “‘Wrong again, Albert,’ Hawking gloats, a huge grin on his face, as a motorized arm reveals his hand. Four of a kind.”

Hawking occupied Newton’s professorship at the University of Cambridge, the Lucasian Chair of Mathematics, and he often said he felt “a strong sense of identity” with Galileo, having been born exactly three hundred years after Galileo’s death. The jacket copy on his popular books claimed that he was “regarded as one of the most brilliant theoretical physicists since Einstein,” which stretched the truth.

His colleagues’ view of Hawking might be assessed, crudely, from a Physics World survey of 250 physicists in 1999—among contemporary physicists they ranked Richard Feynman first and listed Hawking in a large group tied at the bottom, with a single vote each—or from the fact that he never won a Nobel Prize. “I am not among the people who think he is a successor to Einstein,” the physicist and author Jeremy Bernstein wrote in 1992. “I could name, without much effort, 20 physicists since the 1920’s who have contributed more to our science than he has.” The public saw something different: “the world’s smartest man,” as Seife says several times.

A previous biography, by Michael White and John Gribbin, hagiographic and breathless, declared that Hawking “has perhaps done more than anyone to push back the boundaries of our understanding of the Universe”:

This man, weighing no more than ninety pounds and completely paralyzed, speechless, and unable to lift his head should it fall forward, has been proclaimed “Einstein’s heir,” “the greatest genius of the late twentieth century,” “the finest mind alive,” and even, by one journalist, “Master of the Universe.”

Behind the image was a fair bit of bunkum. The scientist made himself into a “commercial product,” Seife writes, and his name became a brand, “something to monetize.” After the enormous success of his first popular book, A Brief History of Time (1988), his later books—including A Briefer History of Time, My Brief History, and Brief Answers to the Big Questions—were mainly cobbled together by graduate students and ghostwriters. His disability was a feature of his celebrity, and he exploited it. He lent his name and his computer-synthesized voice to commercials for Intel and British Telecom and television series including Genius by Stephen Hawking, Into the Universe with Stephen Hawking, Brave New World with Stephen Hawking, and even Stem Cell Universe with Stephen Hawking. (“I have spent my life exploring the mysteries of the cosmos, but there’s another universe that fascinates me. The one hidden inside our bodies.”) Because he spoke so little and with such difficulty, his utterances seemed to come from an oracle.


Although his scientific pronouncements made headlines, they were frequently wrong, and he did little real work in physics during his last three decades. “Hawking’s scientific expertise often had little to do with what he opined about,” Seife writes:

His years of studying general relativity and particle physics didn’t give him any special insight into the nature of artificial intelligence, the need for space travel, or the desirability of contacting alien species, for example. Yet he was more than happy to speak at length on these subjects, as if he were representing scientific consensus.

By necessity—his ability to communicate having been so severely impaired—he became a chronic self-plagiarizer, recycling his aphorisms. At least one graduate student accused him of publishing her work under his name. He associated himself with the kind of business tycoon who likes cultivating scientists, including the English billionaire Richard Branson, the Russian oligarch Yuri Milner, and Jeffrey Epstein, later exposed as a pedophile and sex trafficker. (Seife notes, “There is no evidence that Hawking did anything improper or, unlike a number of other scientists, had any sort of contact with Epstein after the accusations became public.”)

Joyce Carol Oates once proposed the term “pathography” for the subspecies of biography meant to deflate and demean its subject. Contrarian and ruthless, the author topples a giant from a pedestal or exposes a false wizard behind a curtain. Pathography is “hagiography’s diminished and often prurient twin.”

Seife is not committing pathography. He aims to find the human lost inside the myth, so he must first chip away a gaudy shell. Ultimately he reveals

an important scientist whose importance is almost universally misunderstood; a person who suffered deeply and also caused deep suffering; a celebrity scientist who broke the mold of his forebears and fundamentally changed the concept of a scientific celebrity.

Stephen Hawking was not a fraud or a pretender; his early work in the fast-changing field of cosmology, particularly on black holes, was brilliant and influential. He was struck down in 1963, at the age of twenty-one, by amyotrophic lateral sclerosis, known in Britain as motor neurone disease, and told that he had only two or three years to live. This stubborn and independent soul could soon do nothing without assistance. His life became excruciatingly difficult and expensive to maintain. Throughout the next five decades, death always seemed near.

Hawking enjoyed the spotlight, and he needed the money that flowed from his celebrity to pay for his care and life support, but the celebrity eclipsed the scientist. “The authentic Hawking, the man who had devoted his life to physics, and who had a passion to be understood not just by his peers but also by the public, is barely visible behind the image,” Seife writes. “It’s a vexing, almost paradoxical situation.” There’s a scientific story to tell, and most of Hawking’s later life served to conceal it.

The entities we know as black holes had no name until 1967, when the physicist John Archibald Wheeler popularized the term at a lecture in New York City. A black hole is formed, Wheeler said, when a collapsing star, “like the Cheshire cat, fades from view. One leaves behind only its grin, the other, only its gravitational attraction…. Light and particles…go down the black hole only to add to its mass and increase its gravitational attraction.” He spoke as though black holes were real, but at that point they were only a theoretical possibility, implied by Einstein’s equations. No such thing had yet been detected.

Hawking was a graduate student at Cambridge in those years, newly married to a younger student of medieval literature named Jane Wilde. He had begun to show symptoms of his disease and received his grim prognosis, but he was working toward a doctoral dissertation in cosmology, the science of the whole universe and its origins. Powerful new instruments were extending astronomers’ reach, but finding a suitable subject was difficult. The catalyst came in 1965, when Roger Penrose of the University of London published a mathematical proof showing that gravitational collapse must create, at its center, the monstrosity known as a “spacetime singularity.” A singularity is a point at which finite, calculable quantities are replaced by infinities, such as infinite density or infinite curvature. Either “Einstein’s equations are violated” at a singularity, Penrose said, or “the concept of space-time loses its meaning.” (He added that these possibilities were interrelated, “the distinction being partly one of attitude of mind.”)
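A concrete picture of such a breakdown, drawn from the simplest exact black hole solution (the spherically symmetric Schwarzschild metric, a much narrower setting than Penrose’s argument, which needs no symmetry at all): a scalar measure of curvature grows without bound at the center,

K = R_{abcd} R^{abcd} = \frac{48\, G^2 M^2}{c^4 r^6} \;\to\; \infty \quad \text{as } r \to 0.

The apparent misbehavior at the Schwarzschild radius, r = 2GM/c^2, turns out to be an artifact of the coordinates; the genuine singularity, where the equations cease to apply, sits at r = 0. Penrose’s proof showed that such a point must form once collapse has gone far enough (technically, once a “trapped surface” exists), with no assumption of perfect symmetry.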


Hawking had an idea that even now seems breathtakingly audacious. He took the topological techniques Penrose had used and applied them to the entire universe. He reversed the direction of time, proving that, just as a collapsing star ends as a singularity, an expanding universe must begin with one. As astronomers were discovering the faint microwave radiation left over from the Big Bang, Hawking found a way to describe its origin mathematically. He could say with some justice that he had found the beginning of time.
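A toy version of the logic, using the standard uniformly expanding models rather than the topological machinery Hawking actually employed: run the expansion backward and the density of matter grows without bound,

\rho \propto \frac{1}{a^3}, \qquad a(t) \propto t^{2/3} \quad \Longrightarrow \quad \rho \propto \frac{1}{t^2} \;\to\; \infty \ \text{as } t \to 0,

where a(t) is the scale factor measuring the size of the universe. One could object that this divergence is an artifact of assuming perfect uniformity; the force of the Penrose-Hawking theorems was that, under broad physical conditions, no amount of lumpiness or asymmetry avoids the initial singularity.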

Hawking’s singularity theorem became the capstone of his Ph.D. thesis, and he began a productive collaboration with Penrose. In trying to better understand the geometry of black holes, they focused on the event horizon, the surface beyond which nothing, including light, can escape the pull of gravity. The event horizon has no physical form; it is a notional place. Hawking subtly redefined it, thinking of it as a boundary between two realms—an interruption in causality. Events on the inside can have no effect on the outside, for all eternity. This led him to a crucial result about the geometry of black holes, known as Hawking’s area theorem: the total area of event horizons can never decrease.
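To see what the theorem constrains, take the simplest case of a non-rotating, uncharged black hole, whose horizon area is fixed entirely by its mass:

A = 4\pi r_s^2 = \frac{16\pi G^2 M^2}{c^4}.

If two such holes, each of mass M, merge, the theorem requires the final horizon area to be at least the sum of the two initial areas, so the final mass must be at least \sqrt{2}\,M; at most a fraction 1 - 1/\sqrt{2}, about 29 percent, of the original mass-energy can escape as radiation. This back-of-envelope bound is offered only as an illustration of the theorem’s reach, not as a statement about any observed merger.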

Then, as the science around black holes was beginning to firm up, he made the discovery that stands today as his most important. While most of cosmology was built on a foundation of Einsteinian relativity, Hawking was also considering the role of quantum mechanics, which governs behavior at the smallest scales. At the quantum level, he realized, black holes might radiate energy after all. “The uncertainty about energy and time built into quantum mechanics ensures that on the tiniest scales and for the shortest times, particles are constantly winking in and out of existence,” Seife notes. “These vacuum fluctuations are the reason why the quantum theoretic picture of space is frothy and constantly churning, rather than smooth and continuous as relativity would have it.”
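The “uncertainty about energy and time” is the standard heuristic inequality

\Delta E \, \Delta t \gtrsim \frac{\hbar}{2},

which permits a particle-antiparticle pair of energy \Delta E to flicker into existence for a time of order \hbar/\Delta E before annihilating. In the crude textbook picture (a simplification, not Hawking’s actual calculation), one member of a pair created near the event horizon falls in while the other escapes, carrying positive energy away at the black hole’s expense.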

At first Hawking himself didn’t believe that anything could escape a black hole’s gravity, and he harshly criticized a younger physicist, Jacob Bekenstein, who had gotten out ahead of him by arguing that black holes should have entropy and temperature—thermodynamic quantities associated with radiation. But careful calculation left no doubt. In the region of the event horizon, fluctuations in the vacuum would lead to a stream of particles away from the black hole.

Ultimately, the black hole would not only disappear; it would explode. “Black holes are not eternal,” Hawking declared. “They evaporate away at an increasing rate until they vanish in a gigantic explosion.” The particle-antiparticle pairs that form near a black hole are now known as Hawking pairs, and the radiation is called Hawking radiation. Just a few years before, Hawking had said it was ridiculous for Bekenstein to speak of the temperature of a black hole. Now he had written a formula for it: T = ħc³/8πGMk. It is inscribed on his grave in Westminster Abbey.
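The formula makes the temperature inversely proportional to the mass, which is why no such radiation has ever been observed from an astrophysical black hole. As an illustration, for a black hole of one solar mass, about 2 \times 10^{30} kilograms,

T = \frac{\hbar c^3}{8\pi G M k} \approx 6 \times 10^{-8}\ \mathrm{K},

some sixty billionths of a degree above absolute zero, far colder than the cosmic microwave background it sits in. Only a black hole vastly smaller than any star could evaporate, and finally explode, within the age of the universe.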

Seife is the author of six previous books about science, including an excellent “biography” of the number zero. Perhaps inspired by Hawking’s own taste for time reversal, he has arranged this biography in reverse chronological order, beginning with death and burial and ending with fragmentary scenes from childhood. Harold Pinter made this trick work on stage in his 1978 play Betrayal, the story of a seven-year love affair, but telling Hawking’s story backward creates challenges for the reader, especially because it means reversing the scientific story as well. Results precede their causes, sometimes bewilderingly. We don’t get to Hawking radiation and the Hawking area theorem till near the end, after reading about controversies for which they are precursors.

Hawking himself grows backward, like Benjamin Button in the story by F. Scott Fitzgerald and movie by David Fincher. His second wife, Elaine Mason, divorces him in 2006, abuses him—or so the British tabloids claimed—in 2004, marries him in 1995, moves in with him in 1990, and becomes his nurse in the 1980s. He turns fifty, “a ripe old age for a theoretical physicist or a mathematician,” long before we arrive at his creative period.

Meanwhile, we take in the brutal progress of his illness. He considered suicide and had no way to accomplish it. The difficulty of pulling air into his lungs led in 1985 to a tracheostomy and the insertion of a breathing tube that required constant alertness by nurses: “The physicist was almost entirely locked in. Unable to write, unable to talk, barely able to move, and entirely dependent on people whom he couldn’t communicate with.” Then he obtains a custom-made voice synthesizer, which he operates by twitching a facial muscle. Backward into the past: he signs his name for the last time in 1979, loses the ability to walk in 1970, and stops writing or typing in 1966. Finally it all begins, chillingly, with “a bit of clumsiness, an unexpected fall here and there, and a bit of trouble tying shoelaces.”

The metamorphosis at the book’s heart is, when it comes, dramatic. By now we are well acquainted with “a scientist superstar the likes of which hadn’t been seen since Einstein.” Only then do we discover the “well-respected, if not terribly well-known, physicist…laboring in semi-obscurity.” The turning point was the publication of A Brief History of Time in 1988. Four years earlier Hawking had sold a hundred-page manuscript of the book to Peter Guzzardi at Bantam. Guzzardi tells Seife frankly that the “material” was “dry and uneven,” “written with all kinds of assumptions about the reader already understanding a lot of physics and astrophysics, and in some cases, written a bit like a middle-school level.” Guzzardi hoped to sell Hawking’s personal story and secretly hired John Gribbin (who later cowrote Hawking’s biography) to help. Several of Hawking’s graduate students also joined the project, writing, reorganizing, and rewriting. It was not a history of time, of course. It was a history of cosmology, from Aristotle’s concentric spheres to the present-day model of the Big Bang and an expanding universe, plus some speculation about theories to come—especially “grand unified theories” that might reduce all physics to a single scheme. Bantam initially printed 40,000 copies. At least 10 million have been sold.

Martin Gardner reviewed the book for The New York Review and found it “stimulating,” though he noted some errors. Newsweek put Hawking and his book on the cover. A Stephen Hawking fan club appeared in Chicago and sold thousands of T-shirts. In New York magazine, David Blum went on the attack:

How did a book as complex and incomprehensible as A Brief History of Time end up as the likely best-selling nonfiction book of the year?

The simple answer, I believe, is that Bantam Books…knew that the only way to guarantee a best-seller in this case was to exploit the illness of Stephen Hawking to promote his book—in a way that is at best irrelevant and at worst shameful.

With Hawking’s participation, the director Errol Morris made A Brief History of Time into a successful movie, much more about the person than the science. (Seife describes it as a clever kind of fakery, relying heavily on a library of clips to extract “the greatest possible dynamism out of a mostly immobile human being.”) The BBC broadcast a documentary called Master of the Universe. In 1990 Hawking did a Playboy interview in which he said that he was now “cutting down” on interviews “and getting back to research.” That was not, strictly speaking, the truth.

It is seldom appreciated how truly disabling Hawking’s condition was for a working physicist. There is a notion—and Hawking encouraged this, wittingly or unwittingly—that scientific theories and discoveries emerge full-grown like Athena from the head of Zeus; that scientists need their bodies only to communicate their work after the fact. More typically, the work of a physicist is in the writing. By all accounts, Hawking had an extraordinary memory and strong visual intuition; even so, he admitted in 1999 that “it is difficult to handle complicated equations in my head.” Seife writes:

Hawking had become less and less able to work out the details himself…. Since he’d lost the use of his hands in the early 1970s, he couldn’t do mathematics in the same way everyone else did. He couldn’t write complicated formulae, doodle diagrams, or even store a transient thought in an efficient way. That made it hard for him to manipulate a lot of the necessary mathematical formalisms or to build the equations.

He continued to publish papers, with the help of graduate students, but his output came more in the form of attention-grabbing predictions, pronouncements, and conjectures. He was famous for making wagers. In 1975 he bet the Caltech physicist Kip Thorne that a newly discovered X-ray source in the constellation Cygnus would not contain a black hole. It did, and he paid up (a subscription to Penthouse magazine) fifteen years later. He bet that the hypothetical Higgs boson would never be found; it was, with great fanfare, in 2012. He particularly enjoyed speculating about time travel. He proposed a “chronology protection conjecture,” arguing that the laws of physics must forbid space-time loops that might allow travel into the past. “There is also strong experimental evidence in favor of the conjecture from the fact that we have not been invaded by hordes of tourists from the future,” he wrote.

Of all the controversies that gained currency from Hawking’s participation, the most important, continuing across four decades, was the black hole information paradox. It is simply stated. According to the equations of relativity, information that travels past the event horizon into a black hole is forever lost—erased, as far as any outside observer is concerned. On the other hand, according to quantum mechanics, information—the sum total of everything that can be known about a quantum object—is a sacred quantity that must be preserved over time. This principle is built into the wave function, the basic tool for calculating everything that can be calculated in quantum mechanics. The wave function carries information step by step into the future, or, with time reversed, back to the past. This preserves causality and enables reliable computation. If information can be lost, the gears might slip.
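In quantum mechanics this conservation of information is the statement that time evolution is unitary; in textbook notation (not Seife’s),

|\psi(t)\rangle = U(t)\,|\psi(0)\rangle, \qquad U^\dagger U = \mathbb{1}, \qquad \text{so} \qquad |\psi(0)\rangle = U^\dagger(t)\,|\psi(t)\rangle.

Because the evolution can always be undone, the state of the world at any moment determines the past and the future completely; nothing is ever erased, only scrambled. A black hole that swallowed information and then evaporated into featureless thermal radiation would break that rule.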

It was Hawking radiation that created the crisis. As long as black holes were stable and permanent, they might be keeping the information locked up inside. But Hawking radiation carries no information, so when a black hole finally shrinks away and explodes, the information has vanished.

“Hawking initiated an insurgence against a fair and reasonable universe,” the Columbia University astrophysicist Janna Levin writes in her recent book Black Hole Survival Guide. “Quantum mechanics is fatally flawed if information genuinely can disappear, as opposed to merely hide.” The problem roiled theoretical physics, pitting relativists against particle theorists. In the course of time, Hawking managed to take both sides. He made another much-publicized bet in 1997, this time with John Preskill of the California Institute of Technology, that black holes do destroy information. At one point he speculated that information might leak into a baby universe branching off from ours. Then, in 2004, he baldly announced, “I have solved the black hole information paradox and I want to talk about it.”

He drew a large audience to a conference on relativity in Dublin, where he reversed his stance, conceded his bet, and gave Preskill his winnings (a baseball encyclopedia). Information is preserved after all, he said. He gave a rough sense of his mathematical approach but no persuasive physical explanation. “If you jump into a black hole,” he said, “your mass energy will be returned to our universe, but in a mangled form, which contains the information about what you were like, but in an unrecognizable state.” This left the audience mystified. Even Preskill was not sure he had really won his bet. Nor were most physicists persuaded by Hawking’s formal paper, published more than a year later. The question remains unsettled.

We can see, though, why he cared so much. The tension between relativity and quantum theory continues to loom over modern physics. In the effort to unify all the known forces, gravity remains an outlier. It seems impossible to reconcile the essential mathematical formalisms. If you’re an optimist, the conflict is an opportunity. “Hawking had spent his life,” Seife says,

seeking out boundaries where theories clash and break down—at the edge of a black hole, at the birth of the universe, because it is those boundaries that give scientists fleeting clues to truths even more profound than the ones we already know.

At first these boundary regions spurred Hawking to original and productive discoveries; later, though, the clash of theories led him to emphasize the dream of unification known, often with capital letters, as the theory of everything.

Hawking promoted the theory of everything with a vengeance. He made it part of his brand. It was the title of the 2014 biopic in which Eddie Redmayne played Hawking. The much-quoted ending of A Brief History of Time raised the prospect of a complete theory—a final theory: “It would be the ultimate triumph of human reason—for then we would know the mind of God.” At the 1998 White House event, Hawking told the assembled dignitaries:

We shall have to rely on mathematical beauty and consistency to find the ultimate Theory of Everything. Nevertheless I am confident we will discover it by the end of the 21st century and probably much sooner. I would take a bet at 50-50 odds that it will be within twenty years starting now.

He would have lost that one, too. It was hubris—but it sold, and it is part of his legacy. He showed younger colleagues how to chase grand theories and best-selling books. Hawking is not the only physicist guilty of hawking.

The theory of everything is a false idol. Why should the universe, which grows more gloriously complex the more we see, be reducible to one set of equations and formulae? The point of science is not the holy grail but the quest—the searching and the asking. Let us hope there will never be a final theory. As Philip Anderson, the Princeton physicist and Nobel laureate, once told the science writer John Horgan, “You never understand everything. When one understands everything, one has gone crazy.”