Early this year, as part of the $92 million “Data to Decisions” program run by the Defense Advanced Research Projects Agency (DARPA), the Office of Naval Research began evaluating computer programs designed to sift through masses of information stored, traded, and trafficked over the Internet that, when put together, might predict social unrest, terrorist attacks, and other events of interest to the military. Blog posts, e-mail, Twitter feeds, weather reports, agricultural trends, photos, economic data, news reports, demographics—each might be a piece of an emergent portrait if only there existed a suitable, algorithmic way to connect them.
DARPA, of course, is where the Internet was created, back in the late 1960s, back when it was called ARPA and the new technology that allowed packets of information to be sent from one computer to another was called the ARPANET. In 1967, when the ARPANET was first conceived, computers were big, expensive, slow (by today’s standards), and resided primarily in universities and research institutions; neither Moore’s law—that processing power doubles every two years—nor the microprocessor, which was just being developed, had yet delivered personal computers to home, school, and office desktops.
Two decades later, a young British computer scientist at the European Organization for Nuclear Research named Tim Berners-Lee was looking for a way to enable CERN scientists scattered all over the world to share and link documents. When he conceived of the World Wide Web in 1989, about 86 million households had personal computers, though only a small percentage were online. Built on the open architecture of the ARPANET, which allowed discrete networks to communicate with one another, the World Wide Web soon became a way for others outside of CERN, and outside of academia altogether, to share information, making the Web bigger and more intricate with an ever-increasing number of nodes and links. By 1994, when the World Wide Web had grown to ten million users, “traffic was equivalent to shipping the entire collected works of Shakespeare every second.”
1994 was a seminal year in the life of the Internet. In a sense, it’s the year the Internet came alive, animated by the widespread adoption of the first graphical browser, Mosaic. Before the advent of Mosaic—and later Internet Explorer, Safari, Firefox, and Chrome, to name a few—information shared on the Internet was delivered in lines of visually dull, undistinguished, essentially static text. Mosaic made all those lines of text more accessible, adding integrated graphics and clickable links, opening up the Internet to the average, non-geeky user, not simply as an observer but as an active, creative participant. “Mosaic’s charming appearance encourages users to load their own documents onto the Net, including color photos, sound bites, video clips, and hypertext ‘links’ to other documents,” Gary Wolf wrote in Wired that year.
By following the links—click, and the linked document appears—you can travel through the…