
The Crisis of Big Science

Superconducting Super Collider Laboratory/Photo Researchers
Construction of an underground shaft for the Superconducting Super Collider in Texas. The SSC was supposed to be the largest particle accelerator in the world, but its funding was canceled by Congress in 1993.

Last year physicists commemorated the centennial of the discovery of the atomic nucleus. In experiments carried out in Ernest Rutherford’s laboratory at Manchester in 1911, a beam of electrically charged particles from the radioactive decay of radium was directed at a thin gold foil. It was generally believed at the time that the mass of an atom was spread out evenly, like a pudding. In that case, the heavy charged particles from radium should have passed through the gold foil with very little deflection. To Rutherford’s surprise, some of these particles bounced nearly straight back from the foil, showing that they were being repelled by something small and heavy within gold atoms. Rutherford identified this as the nucleus of the atom, around which electrons revolve like planets around the sun.

This was great science, but not what one would call big science. Rutherford’s experimental team consisted of one postdoc and one undergraduate. Their work was supported by a grant of just £70 from the Royal Society of London. The most expensive thing used in the experiment was the sample of radium, but Rutherford did not have to pay for it—the radium was on loan from the Austrian Academy of Sciences.

Nuclear physics soon got bigger. The electrically charged particles from radium in Rutherford’s experiment did not have enough energy to penetrate the electrical repulsion of the gold nucleus and get into the nucleus itself. To break into nuclei and learn what they are, physicists in the 1930s invented cyclotrons and other machines that would accelerate charged particles to higher energies. The late Maurice Goldhaber, former director of Brookhaven Laboratory, once reminisced:

The first to disintegrate a nucleus was Rutherford, and there is a picture of him holding the apparatus in his lap. I then always remember the later picture when one of the famous cyclotrons was built at Berkeley, and all of the people were sitting in the lap of the cyclotron.


After World War II, new accelerators were built, but now with a different purpose. In observations of cosmic rays, physicists had found a few varieties of elementary particles different from any that exist in ordinary atoms. To study this new kind of matter, it was necessary to create these particles artificially in large numbers. For this physicists had to accelerate beams of ordinary particles like protons—the nuclei of hydrogen atoms—to higher energy, so that when the protons hit atoms in a stationary target their energy could be transmuted into the masses of particles of new types. It was not a matter of setting records for the highest-energy accelerators, or even of collecting more and more exotic species of particles, like orchids. The point of building these accelerators was, by creating new kinds of matter, to learn the laws of nature that govern all forms of matter. Though many physicists preferred small-scale experiments in the style of Rutherford, the logic of discovery forced physics to become big.

In 1959 I joined the Radiation Laboratory at Berkeley as a postdoc. Berkeley then had the world’s most powerful accelerator, the Bevatron, which occupied the whole of a large building in the hills above the campus. The Bevatron had been built specifically to accelerate protons to energies high enough to create antiprotons, and to no one’s surprise antiprotons were created. What was surprising was that hundreds of types of new, highly unstable particles were also created. There were so many of these new types of particles that they could hardly all be elementary, and we began to doubt whether we even knew what was meant by a particle being elementary. It was all very confusing, and exciting.

After a decade of work at the Bevatron, it became clear that to make sense of what was being discovered, a new generation of higher-energy accelerators would be needed. These new accelerators would be too big to fit into a laboratory in the Berkeley hills. Many of them would also be too big as institutions to be run by any single university. But if this was a crisis for Berkeley, it wasn’t a crisis for physics. New accelerators were built, at Fermilab outside Chicago, at CERN near Geneva, and at other laboratories in the US and Europe. They were too large to fit into buildings, but had now become features of the landscape. The new accelerator at Fermilab was four miles in circumference, and was accompanied by a herd of bison, grazing on the restored Illinois prairie.

By the mid-1970s the work of experimentalists at these laboratories, and of theorists using the data that were gathered, had led us to a comprehensive and now well-verified theory of particles and forces, called the Standard Model. In this theory, there are several kinds of elementary particles. There are strongly interacting quarks, which make up the protons and neutrons inside atomic nuclei as well as most of the new particles discovered in the 1950s and 1960s. There are more weakly interacting particles called leptons, of which the prototype is the electron.

There are also “force carrier” particles that move between quarks and leptons to produce various forces. These include (1) photons, the particles of light responsible for electromagnetic forces; (2) closely related particles called W and Z bosons that are responsible for the weak nuclear forces that allow quarks or leptons of one species to change into a different species—for instance, allowing negatively charged “down quarks” to turn into positively charged “up quarks” when carbon-14 decays into nitrogen-14 (it is this gradual decay that enables carbon dating); and (3) massless gluons that produce the strong nuclear forces that hold quarks together inside protons and neutrons.

Successful as the Standard Model has been, it is clearly not the end of the story. For one thing, the masses of the quarks and leptons in this theory have so far had to be derived from experiment, rather than deduced from some fundamental principle. We have been looking at the list of these masses for decades now, feeling that we ought to understand them, but without making any sense of them. It has been as if we were trying to read an inscription in a forgotten language, like Linear A. Also, some important things are not included in the Standard Model, such as gravitation and the dark matter that astronomers tell us makes up five sixths of the matter of the universe.

So now we are waiting for results from a new accelerator at CERN that we hope will let us make the next step beyond the Standard Model. This is the Large Hadron Collider, or LHC. It is an underground ring seventeen miles in circumference crossing the border between Switzerland and France. In it two beams of protons are accelerated in opposite directions to energies that will eventually reach 7 TeV in each beam, that is, about 7,500 times the energy in the mass of a proton. The beams are made to collide at several stations around the ring, where detectors with the mass of World War II cruisers sort out the various particles created in these collisions.
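The factor of 7,500 can be checked against the proton’s rest energy of about 0.938 GeV (a standard figure, not given in the text):

```latex
\frac{E_{\text{beam}}}{m_p c^2} \;\approx\; \frac{7\ \text{TeV}}{0.938\ \text{GeV}}
\;=\; \frac{7000\ \text{GeV}}{0.938\ \text{GeV}} \;\approx\; 7{,}500
```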

Some of the new things to be discovered at the LHC have long been expected. The part of the Standard Model that unites the weak and electromagnetic forces, presented in 1967–1968, is based on an exact symmetry between these forces. The W and Z particles that carry the weak nuclear forces and the photons that carry electromagnetic forces all appear in the equations of the theory as massless particles. But while photons really are massless, the W and Z are actually quite heavy. Therefore, it was necessary to suppose that this symmetry between the electromagnetic and weak interactions is “broken”—that is, though an exact property of the equations of the theory, it is not apparent in observed particles and forces.

The original and still the simplest theory of how the electroweak symmetry is broken, the one proposed in 1967–1968, involves four new fields that pervade the universe. A bundle of the energy of one of these fields would show up in nature as a massive, unstable, electrically neutral particle that came to be called the Higgs boson.1 All the properties of the Higgs boson except its mass are predicted by the 1967–1968 electroweak theory, but so far the particle has not been observed. This is why the LHC is looking for the Higgs—if found, it would confirm the simplest version of the electroweak theory. In December 2011 two groups reported hints that the Higgs boson has been created at the LHC, with a mass 133 times the mass of the proton, and signs of a Higgs boson with this mass have since then turned up in an analysis of older data from Fermilab. We will know by the end of 2012 whether the Higgs boson has really been seen.
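In energy units, a mass of 133 proton masses corresponds to the value the two groups reported in December 2011 (again taking the proton’s rest energy as 0.938 GeV, a standard figure not given in the text):

```latex
m_H c^2 \;\approx\; 133 \times 0.938\ \text{GeV} \;\approx\; 125\ \text{GeV}
```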

The discovery of the Higgs boson would be a gratifying verification of present theory, but it would not point the way to a more comprehensive future theory. We can hope, as was the case with the Bevatron, that the most exciting thing to be discovered at the LHC will be something quite unexpected. Whatever it is, it’s hard to see how it could take us all the way to a final theory, including gravitation. So in the next decade, physicists are probably going to ask their governments for support for whatever new and more powerful accelerator we then think will be needed.


That is going to be a very hard sell. My pessimism comes partly from my experience in the 1980s and 1990s in trying to get funding for another large accelerator.

In the early 1980s the US began plans for the Superconducting Super Collider, or SSC, which would accelerate protons to 20 TeV, three times the maximum energy that will be available at the CERN Large Hadron Collider. After a decade of work, the design was completed, a site was selected in Texas, land bought, and construction begun on a tunnel and on magnets to steer the protons.

Then in 1992 the House of Representatives canceled funding for the SSC. Funding was restored by a House–Senate conference committee, but the next year the same thing happened, and this time the House would not go along with the recommendation of the conference committee. After the expenditure of almost two billion dollars and thousands of man-years, the SSC was dead.

One thing that killed the SSC was an undeserved reputation for over-spending. There was even nonsense in the press about spending on potted plants for the corridors of the administration building. Projected costs did increase, but the main reason was that, year by year, Congress never supplied sufficient funds to keep to the planned rate of spending. This stretched out the time and hence the cost to complete the project. Even so, the SSC met all technical challenges, and could have been completed for about what has been spent on the LHC, and completed a decade earlier.

Spending for the SSC had become a target for a new class of congressmen elected in 1992. They were eager to show that they could cut what they saw as Texas pork, and they didn’t feel that much was at stake. The cold war was over, and discoveries at the SSC were not going to produce anything of immediate practical importance. Physicists can point to technological spin-offs from high-energy physics, ranging from synchrotron radiation to the World Wide Web. For promoting invention, big science in this sense is the technological equivalent of war, and it doesn’t kill anyone. But spin-offs can’t be promised in advance.

Science Source
Ernest Rutherford holding the apparatus he used to disintegrate a nitrogen nucleus, circa 1917
  1.

    In his recent book, The Infinity Puzzle (Basic Books, 2011), Frank Close points out that a mistake of mine was in part responsible for the term “Higgs boson.” In my 1967 paper on the unification of weak and electromagnetic forces, I cited 1964 work by Peter Higgs and two other sets of theorists. This was because they had all explored the mathematics of symmetry-breaking in general theories with force-carrying particles, though they did not apply it to weak and electromagnetic forces. As known since 1961, a typical consequence of theories of symmetry-breaking is the appearance of new particles, as a sort of debris. A specific particle of this general class was predicted in my 1967 paper; this is the Higgs boson now being sought at the LHC.

    As to my responsibility for the name “Higgs boson,” because of a mistake in reading the dates on these three earlier papers, I thought that the earliest was the one by Higgs, so in my 1967 paper I cited Higgs first, and have done so since then. Other physicists apparently have followed my lead. But as Close points out, the earliest paper of the three I cited was actually the one by Robert Brout and François Englert. In extenuation of my mistake, I should note that Higgs and Brout and Englert did their work independently and at about the same time, as also did the third group (Gerald Guralnik, C.R. Hagen, and Tom Kibble). But the name “Higgs boson” seems to have stuck. 

  2.

    I have written more about this in “ The Missions of Astronomy,” The New York Review, October 22, 2009. 

  3.

    This article is based on the inaugural lecture in the series “On the Shoulders of Giants” of the World Science Festival in New York on June 4, 2011, and on a plenary lecture at the meeting of the American Astronomical Society in Austin on January 9, 2012. 
