Who’s Right In Apple’s Fight with the FBI?
A U.S. magistrate judge has ordered Apple to help the FBI break into an iPhone used by one of the gunmen in the mass shooting in San Bernardino, Calif. (AP Photo/Carolyn Kaster)
The legal standoff playing out between Apple and the FBI has reignited the debate over privacy and national security.
If you haven’t followed the fight, here are the highlights: In December, the FBI seized the iPhone of one of the two suspects behind the shooting in San Bernardino, Calif., an attack that left 14 people dead. However, encryption technology is blocking the government from accessing the phone’s contents. A federal magistrate judge has ordered Apple to write a custom version of its software that would help investigators unlock the phone, but CEO Tim Cook is balking.
“We feel we must speak up in the face of what we see as an overreach by the U.S. government,” wrote Cook in an open letter to customers. “Ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.”
The stance has its roots in the backlash Silicon Valley faced in the wake of the Snowden revelations. But could valuable intelligence be locked on that phone? One of the shooters, after all, had previously expressed support for ISIS.
For more on the case’s implications for privacy and national security, we invited two experts to join us for a debate. James Andrew Lewis, director of the strategic technologies program at the Center for Strategic and International Studies, believes Apple should comply with the order. Nate Cardozo, a staff attorney for the Electronic Frontier Foundation, disagrees. Here’s what they had to say:
Nate Cardozo: Earlier this week, a U.S. federal magistrate judge issued an unprecedented order to Apple, wholly adopting the FBI’s position without engaging in any independent legal analysis. And that’s a shame, because the FBI didn’t apply for this order as an end in itself; it did so to create precedent.
In isolation, the order might seem reasonable. The FBI says it needs access to a suspect’s phone, and it appears that Apple is capable of creating a custom software package, a literal master key, that would let anyone break into an iPhone in their possession. The government has assured the court that it wants access only to that one iPhone, and that Apple’s master key would be programmed to unlock just that one device. “They are simply asking for something that would have an impact on this one device,” echoed White House Press Secretary Josh Earnest.
But that explanation is hogwash. It’s not about this one iPhone, this one attack, or this one investigation. The FBI is asking the court to create a rule going forward that would permit it to obtain orders in future cases requiring companies to create backdoors in anything the FBI feels it needs. This is not a power that the FBI has ever had in the physical world; to the best of our knowledge, no court ever ordered Brinks to make a master key to every safe.
The government chose to have this fight on this particular case, with these particular facts, very carefully. The crime at issue was absolutely heinous and the nation’s sympathies are rightly with the victims and their families. The phone at issue is the suspect’s work phone, owned by the county which has, of course, consented to the FBI’s plan. The FBI was confident that Apple would be technically able to provide it with the master key.
But if the legal theory the government is using here has the power to force Apple to subvert its security infrastructure in this case, there is no obvious limiting principle. Once that master key is ordered created, we can be certain that our government will ask for it again and again, for other phones, from other companies. Nothing would prevent the FBI from turning this power — potentially in secret — against any software or device that has the audacity to offer strong security. What’s worse, once the FBI has the authority to force American companies to subvert the security of their own products, those companies will be unable to resist demands from other governments. Apple has successfully resisted Chinese and Russian demands for a backdoor only because it can argue that it won’t build one for any government. If the FBI wins here, we’ll all lose.
No American court has ever ordered anything like this before. This is not a door we want opened.
James Lewis: This isn’t really a discussion about encryption. It’s a discussion about whether people trust their government. If you don’t trust the government, and many don’t, then you’ll like Apple’s narrow-minded refusal to help.
Distrust of government is growing around the world, not just in the United States. But in the U.S., that distrust is amplified by a pervasive and worrisome narrative about corrupt officials and agencies run amok, as if “House of Cards” were reality rather than a screenwriters’ fantasy.
But there’s a fundamental flaw in the logic of those with such deep distrust of government. If government is made weaker, people will not be safer, nor will they have more freedom. In fact, countries with weak governments are places where the average citizen has fewer rights and less safety.
If Apple had been smart, it would have quietly cooperated with the FBI’s request. But Apple has a problem, and that problem is distrust in the global market for American products brought on by the Snowden revelations. Apple must show its global customer base that American agencies do not have easy access to their data, and the company’s position over the San Bernardino case is intended mainly to reassure those markets.
Frankly, I don’t think this will work. Besides, it is not just the FBI that wants in. China, France, India, Russia and the U.K. all want recoverable encryption and are putting in place laws that require it. What happens in the U.S. may ultimately be irrelevant. The tide is turning against Apple.
There is a way to provide strong encryption that is “law enforcement friendly” that doesn’t involve any back door, but the encryption debate has been too trivial to get there yet.
In this case, the San Bernardino Department of Public Health (it’s a work phone; the department owns it) has consented to the FBI’s search and has asked Apple to unlock the device. Apple has the technical means to gain access, and the court order requests a technical solution for this specific device. The best solution would have been for Apple to help out and take the credit. That opportunity is gone. Apple should comply with the warrant. It won’t hurt their foreign market, and it might make the rest of us a little safer.
Nate Cardozo: First, a correction on the technology involved here: there is no such thing as “law enforcement friendly” strong encryption. It just doesn’t exist. There is no way to keep our data secure from identity thieves, criminals, spies, corporate espionage, stalkers, or any of the myriad bad actors who want to get into our devices while at the same time giving American law enforcement the access it’s demanding.
It used to be that when an iPhone was stolen, any run-of-the-mill criminal could break into the phone and read the owner’s secrets. Apple responded to that threat by hardening the devices it sells. It would be great if it was possible for Apple to keep us secure from the bad guys, while at the same time letting the good guys in when they need access. But there simply is no compromise position.
Don’t take my word for it. Last summer, an all-star group of experts published a paper titled “Keys Under Doormats” that utterly refuted the possibility that a so-called “exceptional access” regime, such as the one you’ve endorsed, could keep us secure. The group, which included the inventors of modern cryptography, computer scientists, mathematicians, and engineers from MIT, Columbia, Microsoft, Penn, Johns Hopkins, SRI, Worcester Polytechnic and Harvard, concluded that what the FBI is asking for is “unworkable in practice … and would undo progress on security at a time when Internet vulnerabilities are causing extreme economic harm.”
This “debate” over strong encryption is eerily similar to the “debate” over climate change. On one side are entrenched political elements, dogmatically advocating for their position. On the other is a literally unanimous chorus of scientists telling politicians that they’re wrong. So far as I know, not a single cryptographer, mathematician, or computer scientist has published anything contesting the conclusion of the “Keys Under Doormats” paper. It is simply impossible to do what you’re asking Apple to do without endangering us all.
As you say, we live in a world where repressive governments like China and Russia are salivating at the prospect of “recoverable encryption.” But is that any reason for our government to force an American company to hand those governments the tools of repression? I’m of the firm opinion that we, as a society, should not stoop to that level.
I fully agree with you — it would be great if Apple could comply with the FBI’s request without endangering ordinary Americans. But that’s a fantasy.
James Lewis: Let’s get a little context. This isn’t about privacy. You don’t have any privacy. There are more than a dozen tracking programs on any website you visit. Companies take your data and commercialize it. This is why companies want you to log in before using a service or buying from their app store — so they can associate your actions with a profile they’ve collected about you and will sell. You have as much privacy as a goldfish in a bowl. It’s fair to ask what the privacy watchdogs were protecting while all this happened. Talk about the dogs that didn’t bark. Big companies, big intelligence agencies and the occasional random hacker group all have access to personal data — it’s for sale in online black markets.
Apple is doing this to protect its foreign markets, but refusing a court order will only slow the damage. Most countries use communications surveillance for domestic security (and to spy on tourists), and most citizens of foreign countries don’t object to surveillance by their own governments. They object to surveillance by the American government and by giant American Internet companies, including Apple. When Angela Merkel said she didn’t want to be a “data colony” of the United States, she wasn’t talking about the FBI.
This would have happened even without Snowden. The rest of the world wants an Internet that meets its preferences, not those of Silicon Valley. You’ve seen a whole string of actions — the European Court of Justice’s decision invalidating the 2000 Safe Harbor agreement, the “right to be forgotten” requirements imposed on Google, laws requiring data localization — all in reaction to the privacy-pillaging Internet business model. Most countries have, or are moving toward, a requirement that encryption be recoverable when a court order is served. It will be interesting to see how Apple responds when it gets a similar request from the Chinese government.
There is real risk. There has been a major terrorist incident attempted against the U.S. every year since 2001. Apple isn’t protecting us from these things, nor does ISIS care about your privacy. My guess is that life will get harder for American tech companies if they refuse to comply.
A final note. This is a law enforcement problem. The FBI needs a clean chain of custody so that evidence can be used in court. It’s not an intelligence problem. Yes, this kind of encryption makes the work of intelligence agencies like Russia’s FSB or the NSA harder and more expensive, but not impossible. If a spy agency wants in, they will get in. The tricks an intelligence agency uses to break into your phone are not the tricks that produce court-worthy evidence, however.
There is a way to let people use strong encryption that can only be accessed with permission, either from the owner or from a court — products like those used by Google. Their encryption is very difficult to crack, but Google can recover the plain text — they need it for advertising and data analytics. This recoverable encryption is what companies use and what most people want.
The Apple case is the third time I’ve seen this movie. In the early 1990s, there was a fight to make phone companies build surveillance into their switches. The result was the Communications Assistance for Law Enforcement Act. In the late 1990s, in the crypto wars (I was deeply involved), the U.S. decided that it was better for Americans to have access to strong encryption to protect themselves online. That’s still the right decision, but it is no longer 1999, and the delusion that war had ended and every country would become a democracy doesn’t describe the world we live in today. Every time technology changes, the law has to change with it. The Internet is changing, the danger to public safety has changed, and encryption policy needs to change with it.
Nate Cardozo: Privacy nihilism is seductive, but deeply misguided. Privacy is not dead, and only those who wish to kill it claim otherwise.
As you well know, privacy advocates, including me and my colleagues at the Electronic Frontier Foundation, are fighting just as hard against corporate data collection as we are against illegal government surveillance. We developed a browser plug-in called Privacy Badger that blocks trackers based on observed behavior, so we don’t need to rely on a hard-to-maintain blacklist. In December of last year, I filed a formal complaint against Google with the Federal Trade Commission, detailing how the company misleads the public by illegally tracking students’ classroom behavior despite repeated claims to the contrary. And those are just two examples of how we’re fighting back against the surveillance of ordinary people worldwide. Finally, to point out the obvious: while Google and Facebook are insidious trackers of our online behavior, they can’t throw you in jail. They’re not even legally permitted to turn over the content of your communications to anyone without a warrant based on probable cause.
Privacy is a prerequisite to democracy. That you have nothing to hide is irrelevant; it’s not about you, or about me. Just as freedom of speech benefits those with nothing to say, privacy benefits those with nothing to hide. Without privacy, social change is impossible. The civil rights movement, the LGBT rights movement, and essentially every other agent of progress depended just as much on privacy as on the freedoms of speech and assembly.
Our nation was founded on the premise of limited government. Before the Revolution, agents of the Crown used so-called general warrants as authority to conduct untargeted sweeps for the terrorists of their day, the men who would become our Founding Fathers. As a nation, we agreed that would never happen again and the Fourth Amendment was designed to limit the power of law enforcement.
What the FBI is asking for in this case isn’t quite analogous to a general warrant, but it’s one small step removed. Privacy is not dead, but if the FBI’s legal argument wins the day in San Bernardino, the government will gain a vast new power to compel companies to deliberately weaken the tools that ordinary, law-abiding citizens use.
You point out that it will be interesting to see how Apple responds when they get a similar request from the Chinese government. Indeed it will. But that’s the point. We’re not China and the FBI needs to stop trying to build a police state. You say you still support strong encryption, but you’re advocating in favor of a legal regime that’s trying desperately to ban it.
The Framers of the Constitution and the Bill of Rights were big fans of encryption (Jefferson himself invented a number of strong ciphers), of anonymous speech (think of the Federalist Papers), and, of course, of liberty. Ordering Apple to create a master key would be a betrayal of the values our nation was founded on.
James Lewis: To recap: Apple has been ordered by a court to help the FBI gain access to content on a phone used by jihadists who carried out a mass shooting. The owner of the phone (it’s a work phone) has given permission. Apple may already have the ability to do this. The request would apply only to this phone, not to all Apple products, since the technique requires physical access (i.e., you have to possess the phone). After refusing initial FBI requests for assistance, Apple was served a court order and has refused to comply with it.
Apple’s actions occur in a period of heightened threats of jihadist actions against U.S. citizens and the citizens of other nations, and at a time of widespread global outrage over NSA surveillance and the lax privacy practices of leading technology companies — most of whom are American. Apple is trying to distance itself from these concerns by taking a stance against the FBI.
This is not a good story. Let’s not pretend that there is something noble about this refusal. The motives are commercial.
The FBI can be histrionic in its efforts to sway public opinion on encryption, but in this case, the government has been measured in its actions. Privacy advocates, who have objected to every move to accommodate technology to law enforcement’s needs for the last 30 years, are displeased with the FBI’s requests.
The current tendency in American politics is to go to extremes and to make up facts (like encryption “backdoors”). The encryption debate requires balance and objectivity, however. We need to balance concerns over privacy with concerns over public safety — neither should predominate. We need a factual basis for decisions on this balance — and that includes understanding what other countries want to do on encryption, how the technology actually works, and how little privacy people now have online.
We have three questions to think about: How do we resuscitate privacy in this country without stifling innovation or security? How do we keep Americans safe when any plan that doesn’t involve magical thinking will require lawful access to communications (with congressional and court oversight)? How do we build international agreement on data flows and lawful access when there is so much distrust, warranted or not, of both American agencies and companies? None of these questions is easy, and the Apple case hasn’t helped move us toward a serious solution.
I understand that Apple is worried about slowing growth, but this case should not have been a problem. Saying yes to the FBI would not create risk to privacy and might reduce risk to citizens. The same document that says Americans are protected from “unreasonable” searches also makes clear that it is Congress and the courts that decide what is reasonable. Apple has received a reasonable request from a court for assistance. It should comply.