In mid-October 2014, about a year into his tenure as director of the Federal Bureau of Investigation, James Comey gave a speech at the Brookings Institution warning of the dangers ahead as tech companies increasingly encrypted their products. “What it means is this,” he said. “Those charged with protecting our people aren’t always able to access the evidence we need to prosecute crime and prevent terrorism, even with legal authority.” He called this loss of access “going dark.”

In that speech, Comey singled out Apple, which had recently introduced new security features to its iPhone operating system that automatically encrypted all of the data stored on the device. E-mails, texts, photographs, videos, contacts, and location data were now out of reach of anyone without its password, Apple and law enforcement included. “Apple argues,” Comey told the group,

that its users can back-up and store much of their data in “the cloud” and that the FBI can still access that data with lawful authority. But uploading to the cloud doesn’t include all of the stored data on a bad guy’s phone, which has the potential to create a black hole for law enforcement.

And if the bad guys don’t back up their phones routinely, or if they opt out of uploading to the cloud, the data will only be found on the encrypted devices themselves. And it is people most worried about what’s on the phone who will be most likely to avoid the cloud and to make sure that law enforcement cannot access incriminating data.

Encryption isn’t just a technical feature; it’s a marketing pitch. But it will have very serious consequences for law enforcement and national security agencies at all levels. Sophisticated criminals will come to count on these means of evading detection. It’s the equivalent of a closet that can’t be opened. A safe that can’t be cracked. And my question is, at what cost?

We were soon to find out. Thirteen months after that Brookings speech, in the first days of December 2015, a county health inspector named Syed Rizwan Farook and his wife Tashfeen Malik opened fire on a holiday gathering of Farook’s colleagues in San Bernardino, California, killing fourteen and wounding twenty-two, while claiming allegiance to the Islamic State. These were terrorists who appeared to materialize out of nowhere, and though they were killed in a shoot-out with police, federal investigators were concerned that they might have had assistance from others, either in the United States or in other countries, or in that borderless, vaporous place we’ve come to call “cyberspace.” A raid on the terrorists’ home netted personal computers and a significant cache of guns and explosives. As luck would have it, investigators also recovered Farook’s employer-issued iPhone. That phone, though, used Apple’s new, encrypted operating system, which protects the user’s password, and the Feds had no idea what combination of numbers Farook used to unlock it.

There are ten thousand possible combinations for a four-digit passcode using the digits 0–9, and a supercomputer could run through all of them in less time than it will take you to read this sentence. In the language of hackers, that is known as a “brute-force” attack, and that is how the FBI proposed to unlock the recovered iPhone. But it couldn’t. The new operating system also employed a passcode-protection feature that not only capped at ten the number of attempts to guess the passcode but also imposed increasingly long delays between successive failed attempts. Then, if the tenth try failed, the phone would wipe itself clean of all the data stored on it (if the user had enabled this option). And so, just as Comey had predicted, Apple’s security features had left the FBI in the dark.
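
To make the arithmetic concrete, here is a minimal Python sketch, with an invented passcode and illustrative delay figures rather than Apple’s actual values, of why brute force succeeds almost instantly against an unthrottled four-digit passcode, and why an attempt limit and escalating delays defeat it:

```python
import itertools
import time

SECRET = "7294"  # hypothetical passcode for the demo; only the phone knows the real one

def brute_force(check):
    """Enumerate all 10,000 four-digit passcodes, 0000 through 9999."""
    n = 0
    for n, digits in enumerate(itertools.product("0123456789", repeat=4), start=1):
        guess = "".join(digits)
        if check(guess):
            return guess, n
    return None, n

start = time.perf_counter()
found, attempts = brute_force(lambda g: g == SECRET)
print(f"cracked {found!r} on attempt {attempts} in "
      f"{time.perf_counter() - start:.4f} seconds")

# What defeats this is policy, not math. Escalating delays between failed
# tries (illustrative figures below, not Apple's actual schedule) stretch
# the attack out, and a wipe-after-ten-failures rule ends it almost at once.
delays = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]  # seconds per failed try
print(f"ten throttled guesses alone would cost {sum(delays) / 3600:.1f} hours")
```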

As anyone who owns an Apple phone or computer knows, the company urges customers to back up data to the remote servers of its iCloud storage system and makes it simple for this to happen automatically. Farook’s phone had been backed up on iCloud until six weeks before the attack, and when the FBI requested that file, Apple turned it over. This was possible because, though iCloud accounts are encrypted, Apple holds the keys to them; the iPhone itself, by contrast, is protected by the user’s passcode, which Apple does not hold. If the FBI had not intentionally (and without consulting Apple) reset Farook’s iCloud password, another data backup to iCloud might have been possible. Instead, the FBI locked itself out, and once it did, it demanded that Apple find a workaround.

The battle that ensued between the FBI and Apple took place during the first three months of 2016 both in the US District Court for the Central District of California and in the court of public opinion. Once Apple said it did not have the capacity to unlock Farook’s phone (and in fact had intentionally not given itself that capacity), the Justice Department, at the behest of the FBI, invoked an eighteenth-century statute called the All Writs Act of 1789 to compel it to do so anyway. What the FBI wanted was for Apple to create a new tool, a computer program that would override the security features embedded in the iPhone operating system and so enable investigators to perform a brute-force attack on the phone until the correct password was discovered.

In its response to the government, Apple argued that even if this were possible—and the company was not convinced it was—building a security bypass, even ostensibly for this single phone, would create the conditions for weakened encryption more generally. This, Apple said, would imperil users’ personal information and in some cases their actual lives. “Once created, the technique could be used over and over again, on any number of devices,” Apple CEO Tim Cook wrote in a public “Letter to Our Customers” in mid-February.

In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks—from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

Cook went on, “We can find no precedent for an American company being forced to expose its customers to a greater risk of attack.”

A few days later, FBI Director Comey wrote his own open letter, published on a blog called Lawfare, which is affiliated with the Brookings Institution. “The San Bernardino litigation isn’t about trying to set a precedent or send any kind of message. It is about the victims and justice,” he wrote.

Fourteen people were slaughtered and many more had their lives and bodies ruined. We owe them a thorough and professional investigation under law. That’s what this is. The American people should expect nothing less from the FBI.

The particular legal issue is actually quite narrow. The relief we seek is limited and its value increasingly obsolete because the technology continues to evolve. We simply want the chance, with a search warrant, to try to guess the terrorist’s passcode without the phone essentially self-destructing and without it taking a decade to guess correctly. That’s it.

Comey’s appeal to the emotions sounded reasonable, even unimpeachable: this was the phone of terrorists, after all; innocent people had been killed; of course it was incumbent upon Apple to assist the government. Apple was resistant, Comey suggested in that letter, because the tech company “sells stuff.”

In its “Motion to Compel,” the government was more blunt:

Apple appears to object based on a combination of: a perceived negative impact on its reputation and marketing strategy were it to provide the ordered assistance to the government.

In other words, the real reason Apple wasn’t going to assist the FBI was that doing so was bad for business. The implication was that Apple was putting profits before people, and polling data showed that a slim majority of Americans thought so too. It looked as though the FBI director had found the perfect case to bring his argument about the dangers of strong encryption to the people.

Those same polls, though, showed that Americans were also deeply concerned about the integrity and safety of their e-mails, texts, photos, and personal information. And Comey was right: as he told his Brookings audience, encryption was “a marketing pitch.” After the data breaches of major retailers and the federal government, after revelations of intelligence agencies’ digital dragnets, after hackers broke into and held for ransom the medical files of a large hospital system in Los Angeles, and with millions of mobile phones stolen every year, businesses, individuals, and even law enforcement agencies have been eager for more secure devices. In their amicus brief in support of Apple, the Electronic Privacy Information Center and eight other consumer privacy organizations noted that “cell phone theft is one of the top priorities for law enforcement in most US cities” and that “state and federal law enforcement agencies have committed significant resources to promoting security features on cell phones that protect victims and consumers.”

Soon after Apple announced the iPhone’s security upgrade in 2014, Google and Microsoft added similar features to their operating systems. As Edward Lucas points out in Cyberphobia: Identity, Trust, Security and the Internet:

Companies that treat our security seriously will gain business and flourish. More importantly, those that fail to do so will suffer. They will lose business, pay higher insurance premiums, see their stock price fall—and face civil and criminal prosecution.

Ironically, it was because Apple was taking user security seriously that it found itself in federal court after the San Bernardino attack.

On its face, the fight between the FBI and Apple appeared to be a contest between the demands of national security (protecting society from terrorists and malicious foreign agents, who threaten us all) and the need for individual security (protecting us, one by one, from those who would expose our secrets, steal and sell our personal data, or take over our networked devices in order to extort money). As the constitutional scholar Laura Donohue points out in her new, richly informative book, The Future of Foreign Intelligence: Privacy and Surveillance in a Digital Age, this tension—between the competing security claims of the government and those of its citizens—is what animated the Founders’ decision to write the Fourth Amendment into the Bill of Rights. That amendment, which grants citizens the right to be secure “in their persons, houses, papers and effects, against unreasonable searches and seizures,” was inspired by the Crown’s egregious use of general warrants and writs of assistance to invade the homes and property of colonists.

Apple can’t claim Fourth Amendment protections for the iPhone in question because its owner, the San Bernardino County Department of Public Health, was eager to have its contents searched. But the company’s general argument—that government attempts to weaken encryption imperil everyone’s security—implies, at least, Fourth Amendment concerns. These are crucial to the support Apple received from individuals and groups as diverse as the Cato Institute and Black Lives Matter. (Apparently, this line of reasoning has not moved United States senators Dianne Feinstein (D-CA) and Richard Burr (R-NC), authors of the Compliance With Court Orders Act of 2016, which would, if passed, require companies to assist law enforcement when asked to do so.)

The Patriot Act, and other legislation written in direct response to September 11, consolidated state power, and though civil libertarians objected, the public, for the most part, did not oppose provisions that allowed individuals to be spied upon. In 2004, a few years after the Patriot Act was enacted, only 26 percent of respondents to a CNN/USA Today/Gallup poll believed it went too far in restricting civil liberties. That changed following the Edward Snowden disclosures in June 2013.

After the government’s overreach was exposed by Snowden and by the media reports on his revelations, many Americans understood that there was a secret court, the Foreign Intelligence Surveillance Court (FISC), whose rubber-stamp approvals of spying on foreigners were resulting in the widespread surveillance of law-abiding Americans. A Pew poll conducted in 2014 found that nearly three quarters of all respondents said that they did not want to give up privacy and freedom for the sake of safety. As Donohue writes:

It would be difficult to imagine a better example of a general warrant, than the one order, issued by FISC, authorizing the collection of international Internet and telephone content. It names approximately 90,000 targets, in the process monitoring millions of Americans’ communications. The order is not premised upon prior suspicion of wrongdoing. It does not indicate a particular place to be searched. The program is so massive that the government admits that it is impossible to state the number of citizens whose e-mails, telephone conversations, visual communications, and private documents are being monitored.

Another revelation from the Snowden material was the extent to which Silicon Valley tech firms were assisting the NSA in its digital dragnet operations. One document, released to The Guardian, claimed that the agency was receiving data, including e-mails, photographs, videos, calls, and chats, “directly from the servers” of all the major United States Internet companies, Google, Apple, Yahoo, and Microsoft among them. Another leaked document indicated that millions of American tax dollars were being spent to reimburse these companies for their compliance.*

When called to account for their collusion, the tech companies hedged, saying either that they were unaware of the NSA’s data collection program or that they were merely responding to legal warrants, or both. But no matter what they said, the damage to their credibility was significant. Apple’s move to encrypt its phones was both a way to reassure customers that it was not a partner of American intelligence agencies and a way to make it largely impossible for the company to cooperate with those agencies, at least on the newer models of its phones.

Apple’s lawyers took issue with the use of the All Writs Act to force the company to write code that would unlock Farook’s iPhone. Joined by many other Silicon Valley firms, as well as cryptographers, civil rights lawyers, privacy experts, and human rights advocates, Apple petitioned the court to have the order vacated. Their claims and arguments, taken together, offer as robust a defense of strong encryption as has yet been made. Just as James Comey’s warning at Brookings took on new significance once there was a real case, with real victims and a real locked phone that might yield clues, so that same reality, and the urgency it created, provoked a vigorous pushback against the government’s effort to require Apple’s assistance.

Zeid Ra’ad Al Hussein, the United Nations high commissioner for human rights, argued:

The authorities risk unlocking a Pandora’s Box that could have extremely damaging implications for the human rights of many millions of people, including their physical and financial security.


What he meant, as lawyers with the Stanford Center for Internet and Society wrote in their amicus brief, was that “governments of other countries where Apple sells its devices will want the same treatment Apple will have given to the FBI.” And if Americans supposed this would be an issue only in countries run by repressive regimes, a letter sent by members of Black Lives Matter to Sherri Pym, the magistrate in the case, brought the issue closer to home:

One need only look to the days of J. Edgar Hoover and wiretapping of Rev. Martin Luther King, Jr. to recognize the FBI has not always respected the right to privacy for groups it did not agree with…. And many of us, as civil rights advocates, have become targets of government surveillance for no reason beyond our advocacy or provision of social services for the underrepresented.

They went on:

We urge you to consider the dire implications for free speech and civil liberties if the FBI is permitted to force Apple to create technology to serve its investigatory purposes. The FBI’s historically questionable surveillance procedures do not bode well for setting a precedent that allows the agency universal access to private smartphone data.

The Pandora’s Box to which the UN high commissioner referred was not fanciful. In its “Motion to Vacate the Court Order,” Apple pointed out that though the FBI was insisting that it was asking only for a single phone to be unlocked, once it had been cracked, other requests would be forthcoming:

The government says: “Just this once” and “Just this phone.” But the government knows those statements are not true; indeed the government has filed multiple other applications for similar orders, some of which are pending in other courts. And as news of this Court’s order broke last week, state and local officials publicly declared their intent to use the proposed operating system to open hundreds of other seized devices—in cases having nothing to do with terrorism. If this order is permitted to stand, it will only be a matter of days before some other prosecutor, in some other important case, before some other judge, seeks a similar order using this case as precedent.

Indeed, in the midst of the dispute, and not long after Apple filed this motion, Cyrus Vance Jr., the New York district attorney, made a public display of the 175 iPhones his office needed to have unlocked in order to search for evidence in its criminal investigations. As Richard Clarke, the former national coordinator for security, infrastructure protection, and counter-terrorism, told NPR’s David Greene: “You really have to understand that…[the FBI is] not as interested in solving the problem as they are in getting a legal precedent.” This precedent might extend not only to Apple having to assist law enforcement in additional cases, but to other tech firms being ordered to turn their consumer devices—televisions, webcams, and cars, for instance—into surveillance tools.

Perhaps the most interesting argument put forward by Apple and many of its advocates concerned what they perceived to be the government’s assault on the company’s First and Fifth Amendment rights. Apple products are designed to run software that, after testing and approval by the company’s engineers, bears its digital signature—think of it as Apple’s imprimatur. By insisting that Apple write a program to override its security features, a program to which the company has fundamental objections, and then requiring those lines of code to bear the Apple signature, the FBI appeared to be “compelling” the company’s speech, thus violating First Amendment protections.
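
Mechanically, the signing step at issue can be sketched in a few lines. The following is a toy illustration using Ed25519 keys from the Python “cryptography” package; Apple’s actual scheme differs in its details, but the principle is the same: a device runs code only when the vendor’s signature checks out.

```python
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

vendor_key = Ed25519PrivateKey.generate()  # held secret by the vendor
device_trusts = vendor_key.public_key()    # baked into every device

firmware = b"...compiled operating-system image..."
signature = vendor_key.sign(firmware)      # the vendor's "imprimatur"

def device_will_run(image: bytes, sig: bytes) -> bool:
    """A device boots an image only if the vendor's signature verifies."""
    try:
        device_trusts.verify(sig, image)
        return True
    except InvalidSignature:
        return False

print(device_will_run(firmware, signature))                 # True
print(device_will_run(firmware + b" tampered", signature))  # False
```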

Moreover, Apple’s lawyers argued that requiring a company with “an extraordinarily attenuated connection to the crime to do the government’s bidding…violates Apple’s substantive due process right to be free from ‘arbitrary deprivation of [its] liberty by government.’” Since at least 2001, the courts have considered computer code to be speech. The question here was whether making Apple write something contrary to its beliefs was or was not constitutional.

That question, and all the others raised by this case, were left unresolved when, quite suddenly, on March 28, a week before both parties were due back in court, the government announced that with the help of an unnamed third party it had successfully unlocked Farook’s phone and no longer needed Apple’s assistance. According to a Washington Post report on April 12, “professional hackers” were paid a “one-time flat fee” to help unlock the phone.

The FBI did not specify whom it paid, or how much, though Comey said on April 21 that the fee exceeded his total salary for the next seven years—some $1.3 million—but that “it was worth it.” Nor would the bureau tell Apple what the underlying vulnerability was or how the hack worked. “Disclosing a vulnerability can mean that we forego [sic] an opportunity to collect crucial intelligence that could thwart a terrorist attack,” Michael Daniel, the White House cybersecurity coordinator, observed in 2014. He was explaining why the government might decide not to report such flaws to manufacturers, even though withholding them could compromise public safety in other ways.

Tech companies are constantly on the lookout for holes in their products. So are cybercriminals. A study by IBM released last year counted 1.5 million cyberattacks worldwide, a number that is only expected to rise, with more and more attacks directed at mobile phones; many of these attacks came from criminal syndicates. Spies and other intelligence professionals are also keen to discover unguarded points of entry in both software and hardware, especially what are called “zero-day” vulnerabilities—flaws that haven’t so far been detected and patched. As Adam Segal points out in The Hacked World Order, knowledge of such zero-day flaws is remarkably valuable, often being sold for hundreds of thousands of dollars on the black market. It appears that a zero-day is what the FBI bought to gain entry to Farook’s iPhone. Apple and hackers alike are now scrambling to figure out where the flaw lies; until it is found and patched, iPhone 5c users remain sitting ducks for cybercriminals and for future surveillance by the FBI and other agencies.

In his appeal to the audience at the Brookings Institution, James Comey said that “the challenge to law enforcement and national security officials is markedly worse with recent default encryption settings and encrypted devices and networks—all designed to increase security and privacy.” And he is right. Encryption makes networked devices more secure. While lessening or removing these protections might make the work of FBI agents easier, it will afford a great many bad guys the same accommodations.

What the government fails to acknowledge is that even if mobile phones had no security features, people like Malik and Farook, who would use them to do evil, could go deeper into the Web, where encryption software is easily available. Actually, they wouldn’t have to go very deep at all. In early April, WhatsApp, an Internet messaging service with about a billion users, announced that it had added end-to-end encryption to its product. Pictures, text messages, videos—anything—sent via WhatsApp, on any platform, using any hardware, will be inaccessible to everyone except the sender and the recipient.
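
The underlying idea can be sketched briefly. The code below is a toy illustration using X25519 key agreement and AES-GCM from the Python “cryptography” package; WhatsApp’s actual Signal protocol adds authentication, ratcheting, and forward secrecy. The point is that the two endpoints derive a shared key that never travels over the network, so whoever relays the message sees only ciphertext:

```python
# Requires the third-party "cryptography" package (pip install cryptography).
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party keeps a private key and publishes only the public half.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

def session_key(my_priv, their_pub):
    """Derive a shared 256-bit AES key via Diffie-Hellman key agreement."""
    shared = my_priv.exchange(their_pub)
    return HKDF(algorithm=hashes.SHA256(), length=32,
                salt=None, info=b"e2e-demo").derive(shared)

# Both endpoints compute the same key; it never crosses the network.
key_a = session_key(alice_priv, bob_priv.public_key())
key_b = session_key(bob_priv, alice_priv.public_key())
assert key_a == key_b

nonce = os.urandom(12)
ciphertext = AESGCM(key_a).encrypt(nonce, b"meet at noon", None)
# A relaying server sees only `nonce` and `ciphertext`, neither of
# which reveals the plaintext without the endpoints' shared key.
print(AESGCM(key_b).decrypt(nonce, ciphertext, None))  # b'meet at noon'
```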

Encryption is a deterrent. It can’t defeat cybercrime and it won’t stop digital spying. There are too many portals leading into our online lives, and as more of our personal data migrate to the cloud, and as more of our routine activities are conducted online, and as more of our things—refrigerators, toasters, music players, thermostats, and on and on, as well as critical infrastructure—are connected to the Internet, these openings will leave us increasingly at risk. Fred Kaplan writes in Dark Territory about an attempt by the Department of Homeland Security in 2004 to develop a government-wide intrusion detection system called Einstein that failed because “the largest super-computer would have had a hard time monitoring the traffic in and out of four thousand entryways to the Internet….”

Obviously there is significantly more traffic now. Despite its initial failure, the DHS continued to put money into Einstein, taking it through numerous iterations. Whether it would have been strong enough to fend off the theft last year of 21.5 million personnel files housed in the computers of the Office of Personnel Management will never be known: OPM failed to install it. One reason Apple built security features into its phones is that most people, on their own, are remarkably lax about using protection. Apparently this includes government bureaucrats, too.

In its response to Apple’s refusal to write code to crack Farook’s phone, the government wrote that

Apple and its amici try to alarm this Court with issues of network security, encryption, back doors, and privacy, invoking larger debates before Congress and in the news media. That is a diversion. Apple desperately wants—desperately needs—this case not to be “about one isolated iPhone.”

On April 8, two weeks after abandoning its attempt to force Apple to unlock—only—that isolated iPhone, the government, as Apple had predicted, was back at the company’s door, asking a court to compel it to unlock the phone of a drug dealer in Brooklyn. That case was resolved when an unnamed source provided the password. In the meantime, according to The New York Times, Apple has resisted a dozen other requests by the government to unlock its phones.