In mid-October 2014, about a year into his tenure as director of the Federal Bureau of Investigation, James Comey gave a speech at the Brookings Institution warning of the dangers ahead as tech companies increasingly encrypted their products. “What it means is this,” he said. “Those charged with protecting …
People are drawn to magic. Steve Jobs knew this, and it was one reason why he insisted on secrecy until the moment of unveiling. But Jobs’s obsession with secrecy went beyond his desire to preserve the “a-ha!” moment.
Early this year, a robot in Switzerland purchased ten tablets of the illegal drug MDMA, better known as “ecstasy,” from an online marketplace and had them delivered through the postal service to a gallery in St. Gallen where the robot was then set up. The drug buy was part of …
Long before Elon Musk parlayed the $165 million he made from PayPal into the more than $11 billion that underwrites Musk Industries today, he was thinking ahead, envisioning a world that merged science with science fiction, a real world that he, the hero of this story, would bring to fruition.
Banking, logistics, surgery, and medical recordkeeping are just a few of the occupations that have already been given over to machines. Manufacturing, which has long been hospitable to mechanization and automation, is becoming more so as the cost of industrial robots drops, especially in relation to the cost of human labor. Algorithms are writing most corporate reports, analyzing intelligence data for the NSA and CIA, reading mammograms, grading tests, and sniffing out plagiarism. Computers fly planes, compose music, and pick which pop songs should be recorded based on which chord progressions and riffs were hits in the past.
In a July 2 report on the NSA’s warrantless surveillance of non-US citizens, the bipartisan Privacy and Civil Liberties Oversight Board mostly found that the program is working as it was supposed to. But three days later, The Washington Post revealed that the program is monitoring American citizens, and that the documents scooped up include baby pictures, love letters, and messages between attorneys and their clients.
Glenn Greenwald is indignant, self-righteous, and self-aggrandizing—but so what? It’s a red herring, just as focusing on Edward Snowden—who is he, where is he—is a distraction. The matter at hand is not their story; as long as this is a democracy, it has to be ours.
On November 14, six weeks after the failed launch of HealthCare.gov, the federal government’s insurance marketplace website and the public face of the endlessly contested Affordable Care Act, President Obama offered a mea culpa that was anything but. True, he did repeat “that’s on me” and “that’s on us,” but he also said this: “I was not informed directly that the website would not be working the way it was supposed to.” And so we are left to wonder how it was that he did not make it a point to be informed.
By now, the presence and reach of the Internet is felt in ways unimaginable twenty-five or ten or even five years ago: in education with “massive open online courses,” in publishing with electronic books, in journalism with the migration from print to digital, in medicine with electronic record-keeping, in political organizing and political protest, in transportation, in music, in real estate, in the dissemination of ideas, in pornography, in romance, in friendship, in criticism, in much else as well, with consequences beyond calculation.
The human–canine bond is inherently unequal. Like it or not, it is a power relationship. Yet at the same time, we love our dogs. We feel sure that they know us and like us, although we never know just how well; we sense they adapt to our moods, but we also know that they very naturally may be more interested in dogs they meet on the street; we believe we can count on them to be absolutely loyal companions, something we may not be able to say about most people we know. Maybe more than at any other time in history, we love our dogs as we love one another, and sometimes even more than that.
Hacking can, and often does, improve products. It exposes vulnerabilities, supplies innovations, and demonstrates both what is possible and what consumers want. Still, hacking has a dark side, one that has eclipsed its playful, sporty, creative side, especially in the popular imagination, and with good reason. Hacking has become the preferred tool for a certain kind of thief, one who lifts money from electronic bank accounts and sells personal information, particularly as it relates to credit cards and passwords, in a thriving international Internet underground. Hacking has also become a method used for extortion, public humiliation, business disruption, intellectual property theft, espionage, and, possibly, war.
It was Jobs’s idea to form a business partnership that would take over the ownership of Steve Wozniak’s design and parlay it into a consumer product. Within a month, they were in the black. Jobs was twenty-two years old. He hadn’t invented the Apple computer, he had invented Apple Computer. In so doing, he set in motion a pattern that would be repeated throughout his career: seeing, with preternatural clarity, the commercial implications and value of someone else’s work.
Three days after Apple’s new iPhone, the 4S, went on sale, the company announced that over four million devices had been sold—a record. It’s safe to say that most of these sales were not made to people who were drawn to the phone’s new and improved physical form, since the …
The day after the iPhone 4S was launched, Apple’s founder and resident seer, Steve Jobs, died. One of the most popular Jobs quotes circulating in the days after his death was one that he attributed to hockey great Wayne Gretzky: “A good hockey player plays where the puck is. A great hockey player plays where the puck is going to be.” After three days of record iPhone 4S sales, there’s no better example of playing to where the puck is going to be than Siri. There are other “personal assistant” smart phone apps available. Indeed, before Apple removed it from its App Store, Siri was one of them. But who knew that consumers wanted Siri baked into their phone, and into Apple’s servers, which store all previous “conversations,” so that Siri gets more and more familiar with its “boss” all the time? Steve Jobs, obviously.
Playing to where the puck is going to be is, of course, a proxy for anticipating and then apprehending the future. At a conference at the MIT Media Lab last week sponsored by Technology Review, engineers, scientists, academics, entrepreneurs, investors, students, and corporate spokespeople were engaged in the journal’s annual attempt both to anticipate where the puck will land and, at the same time, to push it there.
Last week, when Apple’s Steve Jobs took to the stage during the company’s Worldwide Developers Conference and grandly announced its new iCloud service, he was putting the Apple logo on something most Internet users have relied on eclectically for years. Gmail, Dropbox, Netflix, Hotmail, Flickr, Box.net, and Spotify, to name a few popular services, all rely on cloud computing, where data—documents, music, photos, and movies—are stored on shared servers in large data centers, rather than on your puny, personal hard drive. The benefits of cloud computing are obvious: one is not limited by the size of that drive, nor restricted to viewing that material on a single device. Once it is in “the cloud,” the only thing standing between you and your stuff is a (fast) Internet connection.
A Google search “curates” the Internet. It’s not just the large number of search variables, or the intervention of marketers, that shapes the information we’re shown by bringing certain pages to our attention while others fall far enough down in the rankings to be kept out of view. As Eli Pariser documents in his chilling book The Filter Bubble: What the Internet Is Hiding from You, since December 2009, Google has aimed to contour every search to fit the profile of the person making the query. The search process, in other words, has become “personalized,” which is to say that instead of being universal, it is idiosyncratic and oddly peremptory.
Now that the memoir has become, in large measure, the literary equivalent of reality TV, with publishers trotting out an endless parade of writers eager to reveal their abusive, alcohol-soaked, neglected, mutilated, starved, duped, prostituted, dental-surgery-without-anesthesia (oops—that was “fiction”) lives, it is bracing—almost shocking—to read Allen Shawn’s scrupulous Twin.
I once owned a black car that my husband insisted was green, even though the bill of sale said “onyx.” Then one day about 50,000 miles in, and just for a minute, as the light hit the car in a certain way, I saw what he must have been seeing, …
Sebastian Junger and Tim Hetherington, journalists with extensive war-reporting résumés, began following a group of American combat soldiers during their fifteen-month deployment in Afghanistan in 2007 and 2008. The film they have made, Restrepo, is everything The Hurt Locker is not: authentic, unsentimental, modest, nuanced. Of all the films that have come out of our ongoing wars in Afghanistan and Iraq, it is closest to The War Tapes, an unflinching documentary assembled from a year’s worth of footage taken on the ground by a handful of New Hampshire National Guardsmen in the early years of the Iraq war. But unlike that film, which was shot, for the most part, with helmet- and dash-mounted cameras that showed, explicitly, the experience of individual soldiers, Restrepo takes a small step to the side, widening the lens to take in the whole group, the young, eager, ripped professional soldiers we pay to enforce our foreign policy and do our bidding at the end of a gun.
You don’t have to be a technophobe to dismiss out of hand the idea of reading on a machine. Maybe it is muscle memory, but there is something deeply satisfying about a “real” book, whose binding you can crack and fold as you move from beginning to end. E-books, by contrast, are ephemeral. Yes, you can carry thousands of them in your pocket, but what do you have to show for it? Then, one day, you find yourself housebound, and Wolf Hall has just won the Booker Prize, and you download a sample onto your iPhone, and just like with a book you are pulled into the story, and your resistance disappears. You press the “buy” button—it’s so easy!—and that is how it starts.
Not long after the iPad went on sale in early April, the Illinois Institute of Technology announced that it would be providing each member of next fall’s freshman class with one of the new Apple devices. School officials said that the iPad would allow students to take notes, check email, and read books. Which books they had in mind is not precisely clear except for this: they are not likely to be textbooks.
One of the defining features of social media, if not the defining feature, is its participatory nature. Anyone, everyone, is a content producer. Anyone, everyone, is a critic. And, for the most part, everyone’s voice registers at the same volume. Your take on the new Michael Jackson movie, and my take, and the take of the fifteen-year-old boy down the street are given equal weight. True, there are some sites, like The New York Times and Amazon, that let readers rate or recommend other people’s musings, rants, and insights, but even so, all the comments are put “out there” with little or no intercession. This works really well for consumer products, where the average user, whose experience is actual and authentic, is typically a more reliable guide than a professional tester, though manufacturers have figured out how to game the system by mobilizing armies of average-joe posters to shill their products. Still, if 328 people have something to say about a piece of software or a robotic vacuum cleaner you’re interested in, you are going to get a very good sense of whether these products will meet your needs.
This past July, a little over a year after the United Nations Security Council finally declared rape a crime of war, the parents of Taraneh Mousavi, a twenty-eight-year-old beautician from Tehran, received a call from an anonymous stranger. The young woman had been missing for weeks, ever since she’d attended …
Sue Halpern and Nicholas Kristof have been engaged in an exchange about microfinance, following her recent NYR review of his new book (co-authored with Sheryl WuDunn), Half the Sky: Turning Oppression into Opportunity for Women Worldwide. The first part of their conversation can be found here. The next installment appears below.
In the November 19 issue of The New York Review, Sue Halpern wrote about Nicholas Kristof and Sheryl WuDunn’s new book, Half the Sky: Turning Oppression into Opportunity for Women Worldwide. Her piece describes the systematic abuse of women documented by Kristof and WuDunn throughout the world, and the considerable success of microfinance programs—pioneered by the Nobel Prize-winning economist Muhammad Yunus, whose book is also included in Halpern’s review—in countering this problem by helping poor women gain economic power. Following is an exchange between Halpern and Kristof about the spread of microfinance and some of the criticisms that have emerged about it.
If Warren Buffett were not the second-richest person in the world, his other well-known attributes—an Omaha address, a five-bedroom house where he’s lived for fifty-two years, an annual $100,000 salary, and a phone he answers himself—would be unremarkable. But we have certain expectations for the fabulously wealthy, encouraged in no …
In late October, about a week before the presidential election, a young man named Kris Goldsmith, who had served two tours of duty in Iraq, was forced to leave a street fair in Bellmore, Long Island, where he was distributing literature from the group Iraq Veterans Against the War. In …