The U.S. versus Apple

Who doesn’t like computer security? Who doesn’t want more of it? Government agencies, retailers, restaurant chains, banks, credit card companies, insurance companies, and the entertainment industry have all been hacked. They all want more computer security, at least for themselves. Individuals want more of it, too, especially on their mobile phones. Credit card numbers and bank account numbers are likely to be stored there. So are the names and phone numbers of friends, and possibly of contacts we’d like to keep secret. Within our mobile phones lie the records of our financial and private lives. It’s disturbing to realize that mobile phones are so easy to lose, so easy to steal. One careless moment and our identity is in the wrong hands.

There is one group of people that wants less security on mobile phones: law enforcement agencies. They dislike password-protected data. Worse yet is password-protected encrypted data. Discovering the mobile phone of a criminal is like striking the mother lode; finding that its contents are protected and encrypted is like coming upon a blocked mine shaft. iPhones are the most popular mobile phones among people from all walks of life, including criminals, and they come with strong security features. That’s why law enforcement agencies are particularly eager to enlist the expertise of Apple Inc., the manufacturer, in disabling iPhone security.

Federal agencies have asked for Apple’s help in “unlocking” (breaking into) iPhones in 13 criminal cases. Apple doesn’t want to, so these agencies have commanded Apple’s help by convincing judges to issue writs (court orders). The authority for these writs is the 1789 All Writs Act. It gives a judge the power “to order a third party to provide non-burdensome technical assistance to law enforcement officials.” The key word is “non-burdensome.” Apparently, law enforcement doesn’t think it would be burdensome for Apple to devise elaborate security features for the millions of iPhones it sells and then undo these same features whenever one of them is involved in a criminal case. Apple might as well spin off a new company that is permanently on contract to federal crime labs.

Last Monday, a federal magistrate judge in New York said “no dice” to the Drug Enforcement Administration. The All Writs Act does not justify “imposing on Apple the obligation to assist the government’s investigation against its will.” So now we know that this statute cannot compel Apple to unlock the iPhone of a drug dealer, but what about a more serious crime—like terrorism?

This brings us to the case of Syed Farook, who, with his wife, killed 14 people and injured 22 others at a party last December in San Bernardino, CA. The police and FBI subsequently searched the couple’s Redlands townhouse and found guns, ammo, pipe bombs, bomb-making equipment, computers with missing hard drives, and smashed mobile phones. The FBI Lab was able to retrieve the contents of all but one of the phones, an iPhone 5C that belonged to Farook. In this case too, the FBI got a court order to compel Apple’s assistance. Arguments for both sides will again be heard by a federal magistrate judge. Will the judge rule that the All Writs Act applies in this case, one of much greater consequence than drug dealing?

In a legal brief submitted to the court, Apple asked that the order be vacated. They asserted it overstepped the scope of the All Writs Act and violated Apple’s constitutional rights. Specifically, they cited a 1996 federal court ruling that “computer code is protected speech under the First Amendment.” Theodore Olson, Apple’s lead counsel, went so far as to say that the U.S. was unwittingly empowering a cyberattack on millions of Apple’s users. Outside parties who support either the FBI or Apple have also filed legal briefs. Microsoft, Google, Twitter, and Facebook weighed in on Apple’s side. On March 10, the FBI’s counsel will respond to the pro-Apple arguments. On March 15, Apple will offer its final reply to the FBI’s case. A week later, both parties will argue in District Court before the judge, who is expected to rule shortly afterward. Whatever the ruling, there is no doubt the loser will appeal it.

I see two possible outcomes, one that will fail in time and one that will hold indefinitely. The bound-to-fail outcome is one in which either the Supreme Court or Congress draws a line: “Law enforcement agencies may compel the use of technical expertise in cases of national security, meaning that criteria X, Y, and Z have been met.” Most likely the Supreme Court will punt, saying that it cannot compel Apple in the absence of new law that addresses the issue. Then we’ll have the spectacle of Congress trying to spell out the criteria. If a law is passed and signed by the president, it will no doubt be challenged in court, and we’re back to square one. Round and round we’ll go until the whole enterprise collapses.

We’ll get a durable outcome if the Supreme Court concludes:

  • Apple’s rights would indeed be violated if something it made—something useful and demanded by the marketplace—could be unmade by court order.
  • No matter what precautions Apple took to keep its solution a secret, its details would eventually be discovered. Criminals would know a solution existed and would focus their energy on learning it. All it would take is one key software engineer who was passed over for promotion and then offered a princely sum to tell what he knew.
  • Law enforcement already has a plethora of analytic tools to use in fighting crime. No doubt new, non-invasive ones will come along. Therefore, on balance, the advantages of protecting our identities and private thoughts outweigh the advantages of adding a decryption tool to law enforcement’s arsenal.

The threat of terrorism is forcing us to make difficult choices about our right to privacy. In their zeal to protect us, government agencies will not stop eroding this right—unless we push back.