Paul Sakuma/AP Images

Apple CEO Tim Cook announcing the iPhone 4S at Apple headquarters, Cupertino, California, October 4, 2011

Three days after Apple’s new iPhone, the 4S, went on sale, the company announced that over four million devices had been sold—a record. It’s safe to say that most of these sales were not made to people who were drawn to the phone’s new and improved physical form, since the design of the iPhone 4S was pretty much a reiteration of last year’s model, the iPhone 4, which sold a record 1.7 million in its first three days. And while some of those four million sales might have been made because of the newer phone’s faster processing speed and improved camera, these were probably not major draws either, since other phones on the market already offered both features.

The real magnet was Apple’s inclusion of Siri, a natural language–processing, artificial intelligence–driven “personal assistant” that interprets and executes a user’s verbal commands like “Make me a reservation for four at the best tapas bar in Dallas” and answers questions about the weather (“Do I need an umbrella?”) and, essentially, allows one to interact with the Internet without typing a word. Most major news outlets breathlessly live-blogged the iPhone 4S launch as if it actually mattered to the world at large, and Siri demo videos are now all over the Internet.

The day after the iPhone 4S was launched, Apple’s founder and resident seer, Steve Jobs, died. One of the most popular Jobs quotes circulating in the days after his death was one that he attributed to hockey great Wayne Gretzky: “A good hockey player plays where the puck is. A great hockey player plays where the puck is going to be.” After three days of record iPhone 4S sales, there’s no better example of playing to where the puck is going to be than Siri. There are other “personal assistant” smart phone apps available. Indeed, before Apple removed it from its App Store, Siri was one of them. But who knew that consumers wanted Siri baked into their phone, and into Apple’s servers, which store all previous “conversations,” so that Siri gets more and more familiar with its “boss” all the time? Steve Jobs, obviously.

Playing to where the puck is going to be is, of course, a proxy for anticipating and then apprehending the future. At a conference at the MIT Media Lab in October sponsored by Technology Review, engineers, scientists, academics, entrepreneurs, investors, students, and corporate spokespeople were engaged in the journal’s annual attempt both to anticipate where the puck will land and to push it there. Call it “Optimism Fest,” though the official name of the event is EmTech, which is short for emerging technologies.

There is no way not to feel hopeful, for instance, listening to Ely Sachs, who gave up his MIT professorship to run a start-up called 1366 Technologies, talk about how his new method of producing photovoltaic silicon wafers—devices that use solar radiation to generate electricity—will make solar power competitive with coal in a few years. There is no way not to feel encouraged, learning that engineers have come up with ways to make more capacious batteries so that electric cars will be more workable. And there is no way not to feel sanguine when the storied computer pioneer Bill Joy explains how Kleiner Perkins, the venture capital firm where he now works on the “clean-tech” energy portfolio, is investing in cellulose-based biofuels, not just because they have a reasonably good chance of making his company lots of money, but because they have the potential to move us away from CO2-emitting fossil fuels.

In this crowd, the world’s most pressing problems, like climate change, famine, and disease, are seen as opportunities—and, even better, lucrative opportunities—and what could generate more optimism than that? As Greg Sorensen, the CEO of Siemens, pointed out, “Unless products can save money, meaning make somebody some money somewhere, innovations don’t make it across the desert from the idea stage into actually helping people.”

No one is going to argue with the goal of innovation for the common good, and in the EmTech lecture halls and anterooms, the common good was taken for granted. It was implicit in Verizon’s pioneering smart phone software that enables television viewers to buy products they are seeing on their screens with a single press of a button on their phones, without ever rising from their couches, because who doesn’t think that’s a better way of doing business? It was embedded in the development of Ford’s new cloud-connected cars, which are essentially smart phones on wheels that, among other things, collect data about one’s driving habits so that car owners can share that data with friends. And it seemed to be motivating Fujitsu’s engineers, who have designed remote biosensors that can track your every heartbeat and show you and others your stress levels in real time.

Putting aside the issue of whether going from seeing to wanting to buying with only the slightest movement of a single finger advances the human condition, or whether the person in the next cubicle really cares how stressed you are, or if your friends actually want to know how often you exceed the speed limit, should we just assume that all this personal information being generated and collected won’t be used against us by insurers, or employers, or lawyers, or marketers, or the government? This is not the sort of question that typically gets raised in forums such as EmTech, even as the biosensor guys are suggesting how great it would be if your boss could boast that his division has not only the highest sales record, but the lowest heartbeats per minute. Obviously every product, even the most benign, can become a force for evil, so does that mean it should not come to market? If I had a new iPhone 4S, I’d ask Siri—which, by the way, was incubated at the Pentagon’s Defense Advanced Research Projects Agency (DARPA), alongside Predator drones and driverless combat vehicles, and where the seeds of Apple’s original Macintosh computer were sown.

It’s not clear where, in the process of innovation, questions of ethics arise, or if the process is so solipsistic and self-referential that the answers are largely beside the point. This may be why, when the Nobel laureates Joseph Stiglitz and John Sulston, along with a group of British scientists and ethicists, issued The Manchester Manifesto two years ago to address the question “Who owns science?,” it caused such a stir. (Sir John, it will be remembered, is the man who, after completing a working draft of the human genome, released it into the public domain, arguing strongly against patenting genes, an argument that did not carry the day.)

“Rewriting the life code is the next industrial revolution,” the venture capitalist and Harvard Business School professor Juan Enriquez told the EmTech audience; and when Andrew Phillips, the head of biological computation at Microsoft, explained how his team is developing computer code to insert modified DNA into cells, it was clear that the revolution was underway, come what may. Meanwhile, over in Boston, on the other side of the Charles River, people without jobs were camped out at Occupy Boston, part of the Occupy Wall Street protest, in what participants, supporters, and critics alike were calling a revolution, too. Here were two competing visions of how to get to a more humane future, one advanced by business interests and money, the other seeking to reduce corporate power and rebalance the distribution of wealth.

Before he died, Steve Jobs had both his own DNA and the DNA of his cancerous tumor sequenced through a collaboration of scientists at Stanford, Harvard, MIT, and Johns Hopkins. As reported in The New York Times, Jobs then told his biographer, Walter Isaacson, that he was “either going to be one of the first ‘to outrun a cancer like this’ or among the last ‘to die from it.’” The sequencing cost Jobs $100,000. At the EmTech conference, Jonathan Rothberg—who has been called the Steve Jobs of biotech because his company, Ion Torrent, makes the personal computer equivalent of a gene-sequencing machine (though it is open source and actively encourages others to build on its architecture, unlike Jobs’s Apple)—talked about the need to democratize genomics. “Next year we’ll be able to sequence genomes for under $1,000,” he said, “which is the magic number.”

Money and magic numbers couldn’t save Steve Jobs, and according to Rothberg, “it will probably be twenty years before we understand cancer the way we understand HIV.” If that happens, what Jobs and his medical team were trying to accomplish—tailoring treatment to the individual attributes of both the patient and his particular disease—will become commonplace. Once again, Jobs was playing where the puck was going to be, only this time he got there too soon. It remains to be seen if, twenty years hence, we will have the kind of society and the kind of health care system that lets the rest of us follow.