Design Wars: The FBI, Apple and hundreds of millions of phones

By Deirdre K. Mulligan and Nick Doty, UC Berkeley, School of Information | Also posted to the Berkeley Blog

After forum- and fact-shopping and charting a course via the closed processes of district courts, the FBI has homed in on the case of the San Bernardino terrorist who killed 14 people, injured 22 and left an encrypted iPhone behind. The agency hopes the highly emotional and political nature of the case will provide a winning formula for establishing a legal precedent to compel electronic device manufacturers to help police by breaking into devices they’ve sold to the public.

The phone’s owner (the San Bernardino County Health Department) has given the government permission to break into the phone; the communications and information at issue belong to a deceased mass murderer; the assistance required, while substantial by Apple’s estimate, is not oppressive; the hack being requested is a software downgrade that enables a brute-force attack on the crypto — an attack on the implementation rather than a direct disabling of encryption altogether; and the act under investigation is heinous.

But let’s not lose sight of the extraordinary nature of the power the government is asking the court to confer.

Over the last 25 years, Congress developed a detailed statutory framework to address law enforcement access to electronic communications, and the technical design and assistance obligations of service providers who carry and store them for the public. That framework has sought to maintain law enforcement’s ability to access evidence, detailed a limited set of responsibilities for various service providers, and filled gaps in privacy protection left by the U.S. Supreme Court’s interpretation of the Fourth Amendment.

This structure, composed of the 1986 Electronic Communications Privacy Act and the 1994 Communications Assistance for Law Enforcement Act, should limit the FBI’s use of the All Writs Act to force Apple to write a special software downgrade to facilitate a brute-force attack on the phone’s encryption and access the phone’s contents.
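To see why stripping out the software protections matters, consider some back-of-the-envelope arithmetic. Apple’s security documentation has described the passcode key derivation as calibrated to take roughly 80 milliseconds per guess in hardware; treat that figure as an assumption for this sketch. With the retry limits, escalating delays and auto-erase option removed — which is what the requested downgrade would do — an electronic brute force of a numeric PIN becomes a matter of hours or less:

```python
# Rough sketch: worst-case time to brute-force a numeric iPhone PIN once
# software retry limits are removed, leaving only the hardware key-derivation
# delay. The 0.08 s/guess figure is an assumption drawn from Apple's
# published ~80 ms key-derivation calibration.

def brute_force_hours(pin_digits, seconds_per_guess=0.08):
    """Worst-case hours to try every possible PIN of the given length."""
    combinations = 10 ** pin_digits
    return combinations * seconds_per_guess / 3600

for digits in (4, 6):
    print(f"{digits}-digit PIN: {brute_force_hours(digits):.1f} hours worst case")
```

Under these assumptions a 4-digit PIN falls in well under an hour, and even a 6-digit PIN in about a day — which is why the protections the FBI wants disabled, not the cipher itself, are what stand between the attacker and the data.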

As we argue in a brief filed with the court today, the FBI’s effort to require Apple to develop a breach of iPhone security in the San Bernardino case is an end run around the legislative branch. While the FBI attempts to ensure that law enforcement’s need for data trumps other compelling social values, including cybersecurity, privacy, and innovation, legislators and engineers are pursuing alternative outcomes.

A legal primer

The Communications Assistance for Law Enforcement Act, passed in 1994, essentially requires telecommunications carriers to make their networks wire-tappable, ensuring law enforcement can intercept communications when authorized by law. Importantly, CALEA’s design and related assistance requirements apply only to telecommunications common carriers and prohibit the government from dictating design; alternative versions of the law that would have extended these requirements to service providers such as Apple were debated and rejected by Congress.

The second statute of interest is the 1986 Electronic Communications Privacy Act. ECPA governs the conditions and process for law enforcement access to stored records such as subscriber information, transactional data, and communications from electronic communication service providers and remote communication service providers.

Apple has assisted the government in obtaining records related to the San Bernardino iPhone stored on Apple’s servers. That is the extent of Apple’s obligation. ECPA does not require service providers like Apple to help government get access to information on devices or equipment owned by an individual, regardless of whether they sold the device to that individual.

A ruling that the All Writs Act can be used to force Apple to retroactively redesign an iPhone it sold to ensure FBI access to data an individual chose to encrypt would inappropriately upend a carefully constructed policy designed to address privacy, law enforcement, and other values.

If the AWA is read to give a court authority to order this relief because the statute does not expressly prohibit it, it would allow law enforcement to bypass the public policy process on an issue of immense importance to citizens, technologists, human rights activists, regulators and industry.

Make no mistake, we are in the midst of what we call the Design Wars, and those wars are about policy priorities which ought to be established through full and open legislative debate.

[Image: Design Wars: The FBI Strikes Back. Design by Molly Mahar (UC Berkeley); background image from NASA.]

Unlike an exception in a law, which requires a standard to be met by someone in the right role (for example, law enforcement) and ideally a court process to invoke it (a warrant or other order approved by a court), a vulnerability in a system lets in anyone who can find it — no standard, no process, no paper required: come one, come all. For these reasons, former government officials differ about whether the trade-off is worth it.

Former National Security Agency and CIA Director Michael Hayden has recognized that, on balance, America is “more secure with end-to-end unbreakable encryption.” This view is shared by former NSA Director Mike McConnell, former Department of Homeland Security Secretary Michael Chertoff and former U.S. Deputy Defense Secretary William Lynn, who recently wrote, “the greater public good is a secure communications infrastructure protected by ubiquitous encryption at the device, server and enterprise level without building in means for government monitoring.”

This is a big public policy question with compelling benefits and risks on both sides. It’s a conversation that should occur in Congress. If the FBI can require product redesigns of their choosing through the All Writs Act, it risks subverting this process and sidestepping a public conversation about how to prioritize values – defensive security, access to evidence, privacy, etc. – in tech policy.

Technical complications

Much of the public debate has focused on how many phones will be affected by the order to design and deploy a modified and less secure version of the iOS operating system. The FBI claims interest in a single phone. Apple claims that the backdoor would endanger hundreds of millions of iPhone users.

“In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.” — Apple

Legal precedent is certainly an important question; if the All Writs Act can compel Apple to design and deploy software in this case, would Apple also have to do so for the other 13 devices covered by other federal court orders? Or the 175 devices of interest to the Manhattan District Attorney? Will the Act only require assistance where the government possesses the phone? Or can it be used to push malicious software updates to a device to proactively collect data? What should Apple’s response be when this case is cited by governments of other countries (including China) to compel disabling the PIN entry limits or other security features of an activist’s iPhone?

But the danger of a backdoor exists separately from legal precedents. What if the custom, insecure operating system were to fall into the wrong hands? Apple notes in their motion that it would be a highly-prized asset, sought by hackers, organized crime syndicates and repressive regimes around the world. Developing such software would endanger the security and privacy of all iPhone users in a way that couldn’t be fixed by any software update.

To the FBI’s credit, the conditions of the court order try to limit the risk of this dangerous software falling into the wrong hands: customizing the software to run only on the San Bernardino phone, and unlocking the phone on the Apple campus without releasing the custom insecure software to the FBI.

However, security practitioners more than anyone recognize the limits of well-intentioned methods such as these. They believe in defense in depth, as advocated by the National Security Agency. Rather than relying on a single protective wall or security measure, good security anticipates bugs and mitigates attacks by building security throughout all parts of a system.

Could the design of the custom-insecure operating system limit its applicability and danger if inadvertently released? Apple engineers would certainly try, and much of the expense of developing the software would be the extensive testing necessary to reduce those dangers. But no large-scale piece of software is ever written bug-free.

And what is the likely response of rational companies faced with hundreds or thousands of requests to unlock secure devices they’ve sold to the public? Sure, Apple may be financially capable of creating boutique code to unlock every individual phone law enforcement wants access to, or at least many of them. But other companies may build in backdoors to accommodate law enforcement access with minimal impact to the business bottom line.

The result? An end run around a legislative process that has, to date, remained unconvinced that backdoors are good national policy — and decreased security for all users.

What next?

Beyond the courtroom, Congress has jumped back into the fray with a hearing in the House Judiciary Committee: The Encryption Tightrope: Balancing Americans’ Security and Privacy.

But the discussion will also include software and hardware engineers. As technical designers see the discretion of law used (or abused) to access communications or undermine security, they will seek technical methods to enforce security in ways increasingly difficult to reverse or undermine.

To take a piece of recent history, revelations during 2013 of the NSA’s mass surveillance of online communications and sabotage of security standards led to organizational and architectural responses from the technical community. The Internet Engineering Task Force concluded that pervasive monitoring is an attack on privacy and one that must be mitigated in the development of the basic standards that define the Internet. A flurry of activity has led to increased encryption of online communications.

As we discussed with the LA Times last week, we expect to see more encryption in cloud services; using a design pattern of exclusively user-managed keys, service providers may build storage and processing services where they are unable to decrypt content for law enforcement and where hackers will be unable to review the data even after breaching a company’s security.
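The architectural point in the user-managed-keys pattern is simply *where* the key lives: it is derived on the client and never transmitted, so the provider holds only ciphertext it cannot open, for law enforcement or anyone else. A toy sketch of the pattern (the names are ours, and the XOR keystream below is an illustration only — real systems use vetted authenticated ciphers, not hand-rolled ones):

```python
# Toy sketch of the "exclusively user-managed keys" design pattern: the
# client derives an encryption key from a passphrase it never sends, so
# the service provider stores only (salt, nonce, ciphertext) and cannot
# decrypt the content. The SHA-256 XOR keystream is for illustration
# only; production systems use vetted AEAD ciphers such as AES-GCM.

import hashlib
import os

def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    # Slow, salted key derivation on the client; the key never leaves it.
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Counter-mode keystream built from SHA-256 (illustration only).
    stream = b""
    counter = 0
    while len(stream) < length:
        stream += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return stream[:length]

def encrypt(key: bytes, plaintext: bytes):
    nonce = os.urandom(16)
    ks = _keystream(key, nonce, len(plaintext))
    return nonce, bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    ks = _keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks))

# Client side: only salt, nonce and ciphertext are ever uploaded.
salt = os.urandom(16)
key = derive_key(b"correct horse battery staple", salt)
nonce, blob = encrypt(key, b"private note")
assert decrypt(key, nonce, blob) == b"private note"
```

A provider built this way cannot comply with an order to decrypt, because it never possessed the key — which is exactly the property that makes the design pattern a policy response, not just an engineering one.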

Likewise, look for more work in academia and in industry on: reproducible builds, certificate transparency, homomorphic encryption, trusted platform modules, end-to-end encryption and other technical capabilities that allow for providing services with guarantees of privacy and security from tampering, whether by a hacker, a national intelligence agency or, via court order, the service provider itself.

The next battle in these Design Wars, even after the outcome of the Apple v. FBI case, will be whether the legal process tries to frustrate these technical efforts to provide enhanced security and privacy to people around the globe.


