
Who Was Steve Jobs?

Steve Jobs

by Walter Isaacson
Simon and Schuster, 630 pp., $35.00
Anthony Dickson/AFP/Getty Images
A man wearing a mask of Steve Jobs pretending to present new iPads outside an Apple store in Hong Kong, in a protest against conditions at Foxconn factories in China where Apple products are made, May 2011

Within hours of the death of Apple cofounder and longtime CEO Steve Jobs, people began to show up at Apple stores with flowers, candles, and messages of bereavement and gratitude, turning the company’s retail establishments into shrines. It was an oddly fitting tribute to the man who started Apple in his parents’ garage in 1976 and built it up to become, as of last August, the world’s most valuable corporation, one with more cash in its vault than the US Treasury. Where better to lay a wreath than in front of places that were themselves built as shrines to Apple products, and whose glass staircases and Florentine gray stone floors so perfectly articulated Jobs’s “maximum statement through minimalism” aesthetic? And why not publicly mourn the man who had given us our coolest stuff: the iPod, the iPhone, the iPad, and computers that were easy to use and delightful to look at?

According to Martin Lindstrom, a branding expert writing in The New York Times just a week before Jobs’s death, when people hear the ring of their iPhones it activates the insular cortex of the brain, the place where we typically register affection and love. If that’s true, then the syllogism—I love everything about my iPhone; Steve Jobs made this iPhone; therefore, I love Steve Jobs—however faulty, makes a certain kind of emotional sense and suggests why so many people were touched by his death in more than a superficial way.

By the time he died in October at age fifty-six, Jobs was as much an icon as the Apple logo, the iPod, or the original Macintosh computer itself. Known for his casual jeans-and-black-turtleneck look, Jobs had branded himself and, by extension, his company. The message was simple: I’m not a suit, and we don’t make products for suits—suits being shorthand for buttoned-up, submissive conformists. America loves its business heroes—just a few years ago, books about General Electric’s former chief executive Jack Welch, the investor Warren Buffett, and Chrysler’s Lee Iacocca topped best-seller lists—and it also celebrates its iconoclasts, people who buck the system and make it on their own terms. As Walter Isaacson’s perfectly timed biography makes clear, Jobs aspired to be both, living as if there were no contradiction between the corporate and the countercultural, and this, along with the sexy hardware and innovative software he bequeathed us, is at the root of the public’s fascination with him.

Isaacson’s biography—which is nothing if not a comprehensive catalog of Jobs’s steps and missteps as he went from being an arrogant, mediocre engineer who had been relegated to the night shift at Atari because of his poor hygiene to one of the most celebrated people in the world, widely credited with revolutionizing the businesses of personal computing, digital publishing, animated movies, tablet computing, music distribution, and cellular phones—is distinguished from previous books about Jobs by the author’s relationship with his subject. This is a book that Jobs solicited in 2004, approaching Isaacson not long after being diagnosed with cancer, and asking him to write his biography so his kids would know him, he said, after he was gone. Jobs pledged his complete support, which included full access to himself and his family with no interference or overt editorial control.

As was widely reported in the press in the run-up to the book’s release, Isaacson, who was unaware of Jobs’s diagnosis at the time, demurred. Five years later, after Jobs’s second medical leave from Apple, at the urging of both Jobs and his wife, Isaacson took on the project. Two years, forty interviews with the primary subject, and six hundred pages later, it was rushed into print just two weeks after Jobs’s death and a month before its scheduled publication date, which itself had been moved up from March 2012. Upon Jobs’s death, preorders for the book jumped a record 54,000 percent.

When Isaacson asked Jobs why he wanted him to be the one to write his authorized biography, Jobs told him it was because he was “good at getting people to talk.” Isaacson, who had been both the managing editor of Time and the head of CNN, professed to be pleasantly surprised, maybe because two of his earlier, well-received popular biographies were of men who could only speak from the grave: Benjamin Franklin and Albert Einstein. More likely, Jobs, who considered himself special, sought out Isaacson because he saw himself as on a par with Franklin and Einstein. By the time he was finished with the book, Isaacson seemed to think so as well. “So was Mr. Jobs smart?” Isaacson wrote in a coda to the book, published in The New York Times days after it had come out. “Not conventionally. Instead, he was a genius.”

While the whole “who’s a genius” debate is, in general, fraught and unwinnable, since genius itself is always going to be ill-defined, in the case of Steve Jobs it is even more fraught and even more unwinnable. In part, this is because the tech world, where most of us reside simply by owning cell phones and using computers, is not unlike the sports world or the political world: it likes a good rivalry. If, years ago, it was Microsoft versus Apple, these days it’s Apple versus Android, with supporters of one platform calling supporters of the other platform names (“fanboys” is a popular slur) and disparaging their intelligence, among other things. Call Steve Jobs a genius and you’ll hear (loudly) from Apple detractors. Question his genius, and you’ll be roundly attacked by his claque. While there is something endearing about the passions stirred, they suggest the limitations of writing a book about a contemporary figure and making claims for his place among the great men and women in history. Even though Isaacson has written what appears to be a scrupulously fair chronicle of Jobs’s work life, he is in no better position than any of us to know where, in the annals of innovation, that life will end up.

The other reason nominating Jobs to genius status is complicated has to do with the collaborative nature of corporate invention and the muddiness of technological authorship. Jobs did not invent the personal computer—personal computers predate the Apple I, which he did not in any case design. He didn’t invent the graphical interface (the icons we click on when we’re using our computers, for example)—that came from engineers at Xerox. He didn’t invent computer animation—he bought into a company that, almost as an afterthought, housed the most creative digital animation pioneers in the world. He didn’t invent the cell phone, or even the smart phone; the first ones in circulation came from IBM and then Nokia. He didn’t invent tablet computers; Alan Kay conceived the Dynabook in the late 1960s. He didn’t invent the portable MP3 player; the Listen Up Player won the innovations award at the 1997 Consumer Electronics Show, four years before Jobs introduced the iPod.

What Jobs did instead was to see how each of these products could be made better, or more user-friendly, or more beautiful, or more useful, or more cutting-edge (quite literally: a YouTube video making the rounds shows the sleek new MacBook Air being used to slice an apple). A few years ago, in a profile in Salon, Scott Rosenberg called Jobs a digital “auteur,” and that description seems just right.

The template for Jobs’s career was cast in 1975, months before he and his friends set up shop in his parents’ garage in Los Altos, California, near Palo Alto. Jobs, who had dropped out of Reed College and moved back home, was hanging around with his high school buddy Steve Wozniak. Wozniak, a shy, socially awkward engineer at Hewlett-Packard, was drawn to a group of phone hackers and do-it-yourself engineers who called themselves the Homebrew Computer Club. It was at a club meeting that Wozniak saw an Altair, the first personal computer built from a kit, and he had the insight that it might be possible to use a microprocessor to make a stand-alone desktop computer. “This whole vision of a personal computer just popped into my head,” Isaacson quotes Wozniak as saying. “That night, I started to sketch out on paper what would later become known as the Apple I.”

Then he built it, using scrounged-up parts, soldering them onto a motherboard at his desk at night after work, and writing the code that would link keyboard, disk drive, processor, and monitor. Months later he flipped the switch and it worked. “It was Sunday, June 29, 1975,” Isaacson writes, “a milestone for the personal computer. ‘It was the first time in history,’ Wozniak later said, ‘anyone had typed a character on a keyboard and seen it show up on their own computer’s screen right in front of them.’”

Wozniak’s impulse was to give away his computer design for free. He subscribed to Homebrew’s hacker ethos of sharing, so this seemed the right thing to do. His friend Steve Jobs, however, instantly grasped the commercial possibilities of Wozniak’s creation and, after much cajoling, convinced Wozniak not to hand out blueprints of the computer’s architecture; he wanted them to market printed circuit boards instead. They got to work in the Jobses’ garage, assembling the boards by hand. As Wozniak recalls it, “It never crossed my mind to sell computers. It was Steve who said, ‘Let’s hold them in the air and sell a few.’” It was also Jobs who pushed his friend to sell the circuit boards at a steep markup—Wozniak wanted to sell them at cost—and it was Jobs’s idea to form a business partnership that would take over the ownership of Wozniak’s design and parlay it into a consumer product. Within a month, they were in the black. Jobs was twenty-one years old. He hadn’t invented the Apple computer; he had invented Apple Computer. In so doing, he set in motion a pattern that would be repeated throughout his career: seeing, with preternatural clarity, the commercial implications and value of someone else’s work.

After the Apple I, whose innovations were all in its circuitry, Wozniak went to work on the Apple II, which promised to be a more powerful machine. Jobs, however, recognized that its true power would only be realized if personal computing moved beyond the province of hobbyists like the Homebrew crowd. To do that, he believed, the computer needed to be attractive, unintimidating, and simple to use. This was Jobs’s fundamental insight, and it is what has distinguished every Apple product brought to market since and what defines the Apple brand. For the Apple II itself, Jobs envisioned a molded plastic case that would house the whole computer—everything but the monitor. For inspiration he looked to the Cuisinart food processors he saw at Macy’s. With those in mind he hired a fabricator from the Homebrew Computer Club to make a prototype, and an engineer from Atari to invent a new kind of power supply, one that ensured that the computer would not need a built-in fan; fans were noisy and inelegant.

Thus began Jobs’s obsession with packaging and design. Isaacson reports that when it came time to manufacture the Apple II case, Jobs rejected all two thousand shades of beige in the Pantone company’s palette. None was quite right, and if he hadn’t been stopped, he would have demanded a 2001st. Still, by any measure, the Apple II was a major triumph. Over sixteen years and numerous iterations, nearly six million were sold.

Despite the huge success of the Apple II, it turned out to be the out-of-town tryout for the computer that would become the company’s big show, the Apple Macintosh. (Nearly three decades later, the words “Apple” and “Mac” have become interchangeable in popular conversation.) Jobs jumped aboard the project when it was well under way, after being booted from a group at Apple working on a computer named Lisa that its developers, who had come over from tie-and-jacket Hewlett-Packard, envisioned selling to businesses and large institutions. The Mac, by contrast, with a target price of $1,000, was meant to appeal to the masses. It would be the Volks-computer. By the time it was released in 1984, the price had more than doubled, in part to cover the heavy investment Jobs had made in promoting it, making it one of the more expensive personal computers on the market.

The Lisa and the Mac shared one critical trait: they were built around a novel user interface developed by the engineers at Xerox’s Palo Alto Research Center (PARC) that manipulated every pixel on the screen using a process they’d pioneered called bit-mapping. The geeky typed commands of text-based operating systems like DOS, which had been necessary to get a computer to do anything at all, were gone. Now there could be color and fonts and pictures. Using the metaphor of a desktop, the PARC engineers placed small, graphical representations of documents and file folders on the screen, and they built an external, hand-operated pointing device—called a mouse, since that was sort of what it looked like—to navigate the desktop by pointing and clicking to open and close documents and perform other functions.

When Steve Jobs saw all this demonstrated at the PARC labs, he instantly recognized that PARC’s bit-mapped, graphical interface was the key to making a computer user’s experience nontechnical, simple, fun, and intuitive. Do this, he knew, and it would realign the personal computer universe. “It was like a veil being lifted from my eyes,” Jobs told Isaacson. “I could see what the future of computing was destined to be.”
