This week a federal judge ordered Apple to help the FBI unlock a work-issued iPhone used by a gunman in December’s deadly San Bernardino, California, shootings. But Apple opposed the ruling, objecting to building what it calls a “backdoor.” In a message posted on the company’s website, CEO Tim Cook said that while the company believes the FBI’s intentions are good, this demand is an “unprecedented step” that “threatens the security of our customers.” The case is raising the stakes in a complex and long-running standoff between government and Silicon Valley over access to encrypted data—one that involves balancing consumers’ digital privacy with the authorities’ ability to investigate crime or terrorism.
Andrea Matwyshyn, professor in the School of Law, studies technology innovation and its legal implications. We asked her to examine how this case might play out, with a particular focus on its potential impact on future federal investigations, the average iPhone user’s privacy, and the global market.
In a message to customers, Apple CEO Tim Cook wrote that while the government may argue that using this backdoor would be limited to this case, “there is no way to guarantee such control.” He added that the government could extend this “breach of privacy” to demand that Apple create surveillance technology and other software to access users’ phones without their knowledge. Are these fears justified, and if a so-called “master key” is created, how broad could its usage be by the government?
Certainly, any ruling arising from this ex parte demand by the Department of Justice would create precedent that other courts would regard as instructive for similar law enforcement demands in the future. It’s fair to worry about a slippery slope arising from the outcome of this dispute: If private-sector entities can be conscripted as unwilling agents of law enforcement and are no longer free to design their own products to meet the demands of their customers, our innovation-driven economy will be permanently damaged.
To date, Congress has been unwilling to legislate on the point of requiring technology companies to alter their product designs to include law enforcement “backdoors,” despite aggressive lobbying efforts from law enforcement. This ex parte order against Apple potentially represents law enforcement’s attempt to circumvent this congressional impasse by achieving a functionally similar result incrementally through the courts.
What could this case mean for average iPhone users and their privacy? Should they be concerned?
Yes, they should be concerned. Leading technology companies such as Apple and Google have been progressively increasing the security built into their products for consumer protection reasons. Similarly, the Federal Trade Commission and European data protection authorities view encryption as a key tool to combat identity theft. The FBI’s demands, if granted, would damage these consumer protection efforts. As leading computer security experts have explained, creating a new “master key”—whether to alter user-designated settings or to break a device’s encryption—would weaken existing security measures and place consumers at increased, and avoidable, risk. Particularly in light of the many high-profile data breaches suffered by federal agencies in the past five years, it is also, unfortunately, entirely possible that law enforcement would lose control of any such master key, giving criminals far easier access to millions of consumer devices for nefarious purposes.
It’s fair to worry about a slippery slope arising from the outcome of this dispute.
— Law professor Andrea Matwyshyn
Sen. Ron Wyden of Oregon, a leading legislator on privacy and tech issues, told The Guardian that “This move by the FBI could snowball around the world.” What are the broader implications of this case for the technology sector and beyond the U.S.?
Demands such as these fuel international privacy concerns about U.S. law enforcement overreach, concerns that arose partially from the Edward Snowden revelations regarding the National Security Agency’s collection of data on our European allies, their heads of state, and their citizens. Because privacy is viewed more expansively in other countries as a dignitary interest and a human right, consumers there might avoid purchasing U.S. technology products that come with backdoors. U.S. technology companies’ global market share would drop, as would their share prices and revenues. The crown-jewel sector of our national economy would be harmed.
Cook and supporters of Apple’s stance have pointed, in part, to what they call an overreach of government authority. Do you see ways, whether they are new laws or other avenues, to help federal prosecutors navigate our evolving digital landscape and investigate cases?
The FBI’s arguments that new technologies are frustrating some traditional forms of evidence-gathering, while true, are not new. With each prior generation of technology, law enforcement has successfully adapted to the challenges of evidence-gathering presented by changed circumstances. For example, burner phones undoubtedly presented challenges for law enforcement in the 1990s, yet law enforcement adapted successfully. Law enforcement is now challenged by product models that are more protective of privacy and security and that help consumers defend themselves against identity theft and phone theft; these reflect compelling consumer protection concerns.
The bigger-picture answer here again likely lies in a “personnel is policy” approach, just as it has in the past: Law enforcement needs to hire staff with the requisite technical skills and offer additional technical training to agents. Compelling cooperation from technology companies is an inefficient approach, even assuming it is technologically possible. It is not cost-effective, particularly when time is of the essence in an investigation, for law enforcement to seek court orders, externalize its operating expenses onto the private sector on a case-by-case basis, and ask companies to “break” their more innovative and consumer-protective product designs.
The FBI’s arguments that new technologies are frustrating some traditional forms of evidence-gathering, while true, are not new. With each prior generation of technology, law enforcement has successfully adapted to the challenges of evidence-gathering presented by changed circumstances.
— Law professor Andrea Matwyshyn
Apple’s refusal to comply sets up a major showdown with the federal government. And there seems to be some debate as to the appropriate case law, with some citing the All Writs Act of 1789 and others the Communications Assistance for Law Enforcement Act of 1994. What are the next legal steps, and do you think Apple or the government will ultimately prevail?
In this case, law enforcement asked the court to compel Apple to disable particular user-calibrated deletion features on an older iPhone, features that may frustrate the FBI’s attempts to break into the phone. The court instead offered Apple several paths to providing “reasonable technical assistance”: supply a technical tool, direct the FBI to other resources that could accomplish its desired task, or show cause for its inability to comply. A close reading of the order suggests the court is aware that the FBI’s request may not be technologically possible given the product’s design, and the court stopped short of demanding that Apple accomplish the technologically impossible. The court also contemplated that the government would bear the costs of compliance, instructing Apple to “advise the government of the reasonable cost of providing this service.” Apple will now undoubtedly present evidence to the court regarding technological feasibility as it contests the order.
The next legal step is Apple’s argument before the magistrate judge regarding the technical feasibility of law enforcement’s demands, likely followed by an appeal to the district court. I expect neither side to back down. Stay tuned.