The first session was moderated by Fred Cate and featured opening remarks from three panelists: Chris Inglis, Distinguished Visiting Professor in Cyber Security Studies, U.S. Naval Academy, and former deputy director of the National Security Agency (NSA); Patrick Ball, director of research at the Human Rights Data Analysis Group (HRDAG); and James Baker, general counsel for the Federal Bureau of Investigation (FBI). The speakers shared their perceptions of the encryption landscape today based on their diverse experiences. In particular,
- Inglis emphasized that any use of cryptography and potential systems for exceptional access must be based on well-defined goals and uses.
- Ball championed cryptography’s importance in protecting human rights internationally and in the United States and offered a cautionary message about the threat of government abuses.
- Baker underscored the FBI’s role as a servant of the American people and emphasized that while the increasing use of encryption has made investigations more difficult, the FBI does not intend to impose on the population any particular solution.
Following a brief introductory statement from each of the panelists, Cate moderated a general discussion covering many aspects of the debate over giving government actors exceptional access to plaintext from encrypted data.1 Participants considered the differences when thinking about the problem from an international, federal, and state or local perspective; differing levels of trust in government and various perceptions of government’s motivations for exceptional access; and the difference between the possible and the practical in terms of both technical capabilities and the decision-making framework.
1 Throughout this summary, as at the workshop, the term “plaintext” refers to data in its unencrypted form, “encryption” refers to the use of cryptographic algorithms to encode data so that it cannot be read by an unintended recipient, and “exceptional access” refers to situations where the government or another authorized party “needs and can obtain” the plaintext of encrypted data. (National Research Council, Cryptography’s Role in Securing the Information Society (K.W. Dam and H.S. Lin, eds.), National Academy Press, Washington, D.C., 1996, p. 80.)
Chris Inglis retired from the Department of Defense in 2014 after more than 41 years of federal service, including 28 years at the NSA and seven and a half years as its senior civilian and deputy director. Noting that his views are influenced by his time at the NSA but that he does not speak for the NSA, Inglis began by emphasizing that when designing encryption frameworks, one must consider how they will be used. Any system comprises both technology and doctrine (he defined the latter as the set of appropriate purposes for and uses of technology). That context, he said, must be a part of any discussion about access. He said the goals of encryption are best defined before the technology is built, instead of starting with the technology and then “bending it” to suit a particular application. Rather than being drawn into an unproductive conversation that pits individual expectations for privacy, security, or other aims against collective security objectives supported by exceptional government access, he advised drawing lessons from the marketplace: different companies have made starkly different choices about how their technology will be used and then designed their technology to suit those purposes. Apple, for example, could be read to support ubiquitous encryption with no possibility of exceptional access, even by the company itself. On the other hand, access to plaintext is vital to Google’s business prospects, and it has built exceptional access into communications streams under controlled, transparent conditions. Although those are two very different choices, they offer examples of how technology can be built to accommodate specific goals that are decided up front.
Inglis concluded his remarks by stating that there has never been a perfect encryption system; even when the math is correct, implementation is never perfect; as a result, encryption will never meet all expectations. This further underscores the importance of doctrine, which he finds as important as the encryption itself. In closing, Inglis expressed confidence that his fellow workshop participants would be able to create systems to meet whatever set of requirements were decided upon. While recognizing that the goal of the workshop is not to achieve consensus on what those expectations are, he said that setting out such groundwork is necessary for a truly thoughtful discussion about the merits of any particular implementation.
Patrick Ball, director of research at the Human Rights Data Analysis Group, has spent 25 years conducting data analysis on human rights violations committed by governments around the world. He also acknowledged colleagues he had surveyed for information and experiences to inform his remarks; these included colleagues from various organizations, among them the Electronic Frontier Foundation, Access Now, Human Rights Watch, the Center for Media Justice, and the Genentech Initiative.
Ball presented an overview of the technology landscape based on a 2016 survey by the Berkman Center2 describing a range of encryption products, including file encryption; full-disk encryption; e-mail, messaging, and voice encryption; virtual private network solutions; and tools for digital signatures. Overall, one-third of these products were made in the United States, while the rest were made abroad. About two-thirds were commercial products and one-third were free and open source. Many, said Ball, were software libraries (rather than stand-alone encryption tools), which provide building blocks that make it “fairly easy” to build encryption into a software application. As an illustration, Ball described his own experience with the development of Martus, a free, open-source, secure tool designed to enable human rights workers to safely collect, encrypt, and store information that a government could view as threatening. Martus is one of many open-source encryption products available for human rights groups to use to protect activists from government violence and repression. In the context of the wide availability of encryption systems, Ball emphasized that it is impossible to prevent people or governments from using them. If a government “back door” were created for legal encryption schemes, smart criminals and terrorists would only be more incentivized to build and use illegal encryption systems that do not contain the back door. If the FBI were to break or otherwise gain access to legal encryption schemes, in his view, that would only provide access to information encrypted by less sophisticated criminals, as well as to information from the law-abiding world.
Ball pointed to numerous groups around the world who use cryptography to shield their efforts from governments that might oppose them. Examples include journalists’ associations in Egypt, Uganda, Rwanda, and Nicaragua; lesbian, gay, bisexual, and transgender groups in Jordan, Serbia, Morocco, and central and southern Africa; democracy activists in Ethiopia, Turkey, and Kyrgyzstan; human rights activists across east Africa,
Cambodia, Tunisia, and Latin America; and environmental activists in India and Ecuador. The threats such groups fear most, he said, are their own local governments’ military and police. He also noted that many of the encryption tools human rights groups use were funded by grants from the U.S. Department of State, underscoring the department’s view that U.S. interests are served by supporting independent nongovernmental organizations that hold corrupt, often brutal, governments publicly accountable for their actions.
Ball then briefly described examples from his own experiences using encryption in the course of human rights investigations abroad. In Guatemala, where Ball worked on and off for more than 20 years, human rights workers collected and analyzed information about the genocidal violence carried out against the Ixil people by José Efraín Rios Montt, Guatemala’s president in 1982 and 1983, which involved the murder of some 100,000 civilians over 18 months. The evidence collected by Ball and others, stored in databases that were encrypted nightly to protect them against theft and use by government actors, was ultimately used in the 2013 trial in which General Rios Montt was convicted of genocide and crimes against humanity. Ball noted that documents uncovered in the course of the investigation revealed that in the 1980s the Guatemalan National Police regularly exchanged information with the FBI, primarily about training, narcotics investigations, and suspects. For the National Police, “suspects” included human rights activists, student and labor leaders, and dissident professors, although the nature of the information provided to the National Police by the FBI is not known.
Iraq, where several groups use Martus to protect data collected about human rights violations, provides another illustrative example.
Ball has shared technology and training with Yezidi human rights groups to help them securely document violence against their community, which Ball described as “one of the strongest prima facie cases of genocide I know of in recent years.” He noted that those collecting these data are concerned both about potential assassinations of witnesses and human rights workers by Daesh3 and seizure of their data by agents of the Iraqi state for propaganda purposes.
Ball emphasized that encryption is necessary not only for human rights groups working abroad. He noted that government surveillance is also a significant concern for human rights and civil liberties groups working in the United States. He provided the example of Human Rights Watch, which uses secure telephony for cell phone calls where possible, end-to-end encrypted video conferencing, and two types of commercial encryption for internal e-mail. He also noted that journalists in the United States often use SecureDrop, an open-source “whistleblower submission system,” to communicate securely with anonymous sources. This is essential, Ball believes, because a journalist’s freedom to pursue stories can sometimes be at odds with government wishes, even in the United States. Ball pointed to moments in American history in which the government conducted surveillance of its people, including civil rights and antiwar activists such as Martin Luther King, Jr., Malcolm X, and Muhammad Ali in the 1960s and individuals critical of U.S. policy in Central America in the 1980s. Today, the Black Lives Matter movement has drawn attention to disproportionate police violence committed against people of color. Ball noted that, while some of these police actions may be necessary, legal, or justified, the FBI does not record this information accurately or comprehensively. In his view, the FBI focuses its messages on terrorists and kidnapping scenarios while ignoring “far more numerous acts by government agents that affect everyday Americans.” The responsibility for investigating police homicides can fall to journalists and civil society, whose efforts can also be protected by strong encryption.4 Ball expressed his belief that mechanisms for exceptional access by government would increase the amount of government surveillance without making any of us any safer, and that the downsides of this increased surveillance would be experienced most acutely by vulnerable populations.
He pointed to an argument articulated by Black Lives Matter activist Malkia Cyril that encryption is necessary for civil and human rights to prosper because it protects the democratic right to organize for change. According to Ball, Cyril also pointed to evidence that police surveillance has already been directed at nonviolent movements for police reform via scraping of social media accounts and seizure of smartphones from arrested activists. Ball said that evidence of the extrajudicial use of cell phone surveillance devices such as StingRay for metadata tracking suggests that this type of surveillance
3 Daesh is the Arabic-language acronym for the terrorist group also known as the Islamic State, the Islamic State of Iraq and the Levant (ISIL), and the Islamic State of Iraq and Syria (ISIS).
4 In the workshop, “strong encryption” was generally used to mean encryption schemes without exceptional access mechanisms. However, it was sometimes used to mean encryption believed to be strong based on current theory, known attack methods, and estimates of computing capacity available to an attacker. The latter definition, depending on one’s views of the risks associated with exceptional access, might or might not include encryption schemes that provide exceptional access.
is a significant problem. Ball noted that the examples he highlighted were selected because they involve America and its allies. Encryption is also crucial to human rights work in countries like Syria, Russia, Iran, China, and Venezuela, but he said examples illustrating America’s history of illegal surveillance of its own citizens and of alliances with perpetrators of war crimes and genocide should “give us pause” about giving the U.S. government exceptional access to encrypted data. Recognizing that the government has argued and will continue to argue that encrypted data without exceptional access impedes legitimate investigations against “heinous perpetrators,” he urged the audience to focus on the big picture, arguing that exceptional access would help the FBI solve only a small number of cases compared with the great benefits of strong encryption for civil society globally. In addition, he posited that if given a “golden key,” the FBI could use the key, or intelligence gathered using it, as a bargaining chip in discussions with other countries’ law enforcement agencies, thus undermining the security of human rights workers abroad in order to support what he described as the FBI’s short-term interests.
From his years at HRDAG, Ball has seen firsthand how encryption has saved the lives of people working to protect human rights. He expressed strong doubts that it would be wise to give any government an exceptional access key and argued that doing so would significantly degrade security for everyone. He suggested that the primary goal of cybersecurity should be, first and foremost, to protect people by providing tools that ensure strong electronic security against repressive governments, cybercriminals, and terrorists.
James Baker, general counsel for the FBI, opened his remarks with an acknowledgement that the law enforcement community needs help from the various other communities represented at the workshop to ensure that America achieves its objectives while staying true to its values. In response to Ball’s comments, he noted that his direct experience with the FBI, which has spanned 25 years, has given him a very different view from Ball’s regarding the bureau’s ethic and values. While openly recognizing that the FBI has made mistakes, both in the past and the present, he said the way Ball described the FBI shows “an organization with which I am not familiar . . . it’s just not how we operate.” The rest of Baker’s talk focused on three main points: (1) strong encryption has benefits, (2) strong encryption has costs, and (3) the bureau does not seek to impose a specific solution but rather looks to the people to determine what tools are appropriate for it to use in carrying out its mission. Baker asserted that the FBI supports strong encryption and recognizes the tremendous benefits it offers, domestically and internationally, for cybersecurity, civil society, freedom of expression, and freedom of association. He also acknowledged that “the encryption genie is out of the bottle” and said that the FBI’s challenge is to figure out how to function in the context of this new reality.
While agreeing with Ball’s remarks about the value of encryption for protecting oppressed groups, Baker pointed out that the use of encryption also comes with costs. In particular, he focused on what the FBI views as costs to public safety: the impediments that the use of encryption creates for the law enforcement, national security, and intelligence communities. Under its mission to protect Americans and uphold the Constitution, the FBI must conduct investigations against foreign and domestic threats and enforce the criminal laws of the United States. In support of these activities, the bureau uses a variety of investigative tools, including interviews, human sources, subpoenas, and surveillance and searches, which involve physical places and people as well as data and electronic devices. As strong encryption grows more ubiquitous in the United States and around the world, the FBI’s ability to access stored data and the content of real-time communications, also called “data in motion,”5 for its investigations is affected. Although agents can and do try to work around these roadblocks by using other investigative methods, Baker said there are costs to this. For example, such alternative strategies may slow down investigations, lead to larger resource requirements, or yield less complete information than would otherwise have been obtained. Baker emphasized that the FBI is not trying to impose a particular solution, and he pointed out that the workshop would not have been convened if the bureau had had a solution. He explicitly noted that the FBI is not demanding a magic key for access, nor does it necessarily think that such a strategy would be the right solution.
5 This term was used by workshop participants to refer to data in transit on a communications network and was used in contrast to “data at rest,” or data that is stored on a mobile phone, laptop, server, or other computer device.
Baker said that the aim of the bureau is to communicate to the public the costs of encryption and to stimulate debate about the proper balance of equities, such as privacy, free expression, cybersecurity, innovation, competitiveness of American companies, and public safety. He suggested that it is not up to the FBI or companies to decide on the proper balance of equities; rather, it is up to the American people to decide this. He said that the FBI will use whatever tools the people make available to pursue its mission of protecting the American people and upholding the Constitution. He suggested that there are many questions that the country needs to answer: What tools do we want law enforcement to have? How should we balance all of the different values at play? Are there technical solutions to make such a balance possible? He noted the possibility that such solutions may not be available, in which case we will have to accept the costs in one area or another.
Baker concluded with a reminder that the costs of encryption are not hypothetical: Real people will be affected by the ability of law enforcement to investigate and prevent a variety of criminal threats.
Panelists and other invited speakers engaged in an open discussion of the encryption landscape, moderated by Fred Cate.
Cate opened the discussion by asking Baker to clarify his statements about the costs that encryption has for investigations. Baker first stated that in addition to the FBI and other federal agencies, there are 18,000 police forces in the United States. Each agency has different resources and different experiences with encryption. One impediment that commonly arises for state and local law enforcement is the inability to access data at rest on smartphones. When a phone is encrypted, with the contents accessible only by entering the passcode created by its owner, law enforcement has no technical tools to access the data it contains. Baker described this challenge as particularly acute for state and local authorities, adding that the FBI also faces it. Although he did not have specific numbers for the impact of this across the country, he said that the Manhattan District Attorney’s office has 175 to 200 phones that fall into that category right now, which, after extrapolation, suggests that it is a significant issue affecting a large number of investigations across the country.
Baker also pointed to the ways the inability to track data in motion can also affect national security and intelligence activities. ISIL actors in Iraq and Syria, for example, typically initiate contact with U.S. residents online via social media channels that the FBI can track because they are unencrypted. As these relationships develop, however, ISIL actors instruct their contacts to switch over to communication channels that are strongly encrypted. Once that happens, the FBI can access only the metadata, not the actual content of messages. Baker said this situation is particularly problematic in today’s terrorism climate, in which U.S. residents self-radicalize over a very short period of time. In the past, groups like Al Qaeda had a much more structured process for recruiting U.S.-based agents that involved face-to-face interaction and foreign travel, but today, U.S. residents are carrying out terrorist attacks after only online (and often encrypted) communications.
Later in the discussion, Marc Donner, engineering site director at Uber, asked how much technological capability the FBI has in this realm. James Burrell, deputy assistant director of the FBI, said that the bureau’s overall budget for operational technology was on the order of several hundred million dollars. He noted that the FBI actually develops new technology specifically for its operations, rather than simply adapting existing technology, and that this capability must be developed to support operations in both national security and law enforcement contexts. With a finite budget and a limited number of highly trained personnel to support its multiple goals, Burrell said, the bureau must prioritize requirements across the spectrum of its responsibilities.
Richard Littlehale, assistant special agent in charge of the Tennessee Bureau of Investigation’s (TBI’s) Technical Services Unit, spoke from the perspective of state law enforcement, which he said faces a different set of problems from those faced by federal law enforcement. Although state and local law enforcement can request help from the FBI to some extent, it is often limited by its operational technology capacity and by the fact that many local law enforcement agencies have no operational technology division. Given the growing ubiquity of encryption,
local agencies are encountering technological hurdles in case after case, and often it is the victim’s phone, not the perpetrator’s, that they seek to access. In such cases, he said, “were they to have a voice, perhaps they would want law enforcement to have access to their communications or their data.” This means that encryption has a real, human cost, both internationally and domestically, but this is a cost TBI is still struggling to quantify. He suggested that some sort of relative harms analysis could help to elucidate these issues to support informed decision making about how to balance these costs.
Baker built on this point, noting that it is particularly hard to quantify the cost of encryption hurdles when this cost involves evidence that is not collected or a wiretap that is not pursued. At the FBI, if a device or communication is inaccessible, agents will not waste resources pursuing a warrant or wiretap, knowing that the data gained would be encrypted anyway. Unfortunately, it is hard to quantify the number of times this occurs, or what is lost in such situations.
Cate asked the panelists to comment on the argument that while today there are more data than ever that could potentially be available for government investigations, fewer and fewer data are accessible for those purposes. Baker agreed that there are more data and noted that the FBI uses every legal tool it has to obtain and analyze metadata, which can offer hints and illuminate networks of people who may be working together. However, he said metadata alone cannot give the FBI a full picture; often, the actual content is necessary to understand people’s intent, capabilities, and plans. In addition, the deluge of metadata creates a data management problem, requiring agents to sort through a lot of noise to get to the pieces that can be informative.
Ball added his thoughts that governments have been able to use metadata to follow activists’ communications. Indeed, from the viewpoint of social movements, metadata can be “as big a problem as content analysis, and one that’s much more difficult . . . for them to protect themselves against.” He said that Black Lives Matter is a movement in which activists are concerned that their lawful activities are being surveilled using metadata.
Asked to clarify how his remarks relate to encryption and not just surveillance in general, Ball stated that introducing layers of encryption could better protect social media-based communication from surveillance. Such capabilities have not yet been introduced, and he fears they may never be, either because doing so is technically difficult or because companies might be concerned that such a design would become compromised or illegal in the near future. Ball pointed to 2016 draft legislation by Senators Richard Burr and Dianne Feinstein,6 which would have required technology companies to decrypt data at a court’s request, as a worrisome harbinger of movement in this direction.
Inglis expressed his agreement with Baker’s assessment of the FBI and its values and also agreed with Ball’s arguments about the importance of encryption and other security tools for protecting individuals in the context of human rights. He then responded to the question of whether exceptional access is fundamentally different from other tools that the government has at its disposal. He noted that, even if exceptional access is not available, a variety of other tools are, and one could ask whether the government should be trusted with any of those. Many of the powers we entrust to the government—such as the power to tax, the power to indict, and the power to imprison—can be used for good and noble purposes, but also have the potential for misuse. He suggested that exceptional access was no different in this regard.
Later the discussion returned to this topic, and Ball urged the audience to question the assumption that government exceptional access is a good thing. Given his direct experience with governments that actively suppress their citizens’ human rights, he believes that these are exactly the kind of governments that most want exceptional access; as examples, he pointed to North Korea and Cuba as governments that have made cryptography illegal, and to China
6 Dianne Feinstein, U.S. Senator, “Intelligence Committee Leaders Release Discussion Draft of Encryption Bill,” press release, April 13, 2016, http://www.feinstein.senate.gov/public/index.cfm/2016/4/intelligence-committee-leaders-release-discussion-draft-of-encryption-legislation.
and Russia as countries that have pursued government access to encrypted information. It is important to consider how many people can be helped by government exceptional access versus how many people can be harmed, a balance he believes skews strongly in the direction of greater harm under conditions of greater government access.
As for the FBI’s ethic and activities, Ball expressed appreciation for Baker’s clarification but countered that the FBI “looks very different from outside than it does from inside,” particularly from the viewpoint of those involved in movements for social change. He suggested that surveillance, even of metadata, can have a chilling effect that is “in a palpable way harming our democracy.”
Inglis said that because encryption knows no national boundaries, a national solution would be insufficient and what is needed is a solution that works in an international context. Such a solution would be possible only when agreement on common foundational values can be reached, which he believes is possible even across international boundaries.
Kevin Bankston, director of the Open Technology Institute at the New America Foundation, tied this idea into Baker’s remarks about encryption being widely used by entities like ISIS. According to research conducted by the New America Foundation,7 out of the nine encrypted messaging apps ISIS recommends for its agents, eight are not U.S.-made, or are open source, or are both, which makes it hard for the U.S. government to control them. As this survey and the Schneier et al. survey8 show, Bankston said, even if the U.S. government were able to gain exceptional access to systems created by U.S.-based companies, it would not fully solve the problem.
Inglis argued that it would not make sense for the United States to seek a solution that only works domestically, first because governments that share common value systems need to pursue common goals, and second, because companies—which exist in a global marketplace—need to have a common set of rules to adhere to across the jurisdictions in which they operate. Rather, in his view, we would be obligated to try to seek solutions in an international context, akin to the efforts undertaken in the context of nuclear proliferation, chemical weapons, and other arenas. Baker concurred that a solution has to work within a global context and affirmed Bankston’s suggestion that the widespread global availability of encryption tools is one of the hardest problems to grapple with.
Bankston expressed his doubts that a global solution is realistic, arguing that an attempt to squelch end-to-end encryption globally would be “akin to trying to win the War on Drugs if drugs required no ingredients, could be endlessly and easily replicated, and could be smuggled over the Internet.”
Inglis countered that in other ways the War on Drugs is not such a bad analogy, because drugs have both benign and harmful purposes, and because although it is perhaps impossible to come up with a perfect solution, we still try. Rather than a stark choice between supporting either individual interests or collective national security, Inglis suggested that we should seek a solution that supports both aims in a way that is better, even if not perfect.
Noting that it is likely easier to regulate what happens on devices that are physically inside the United States, Baker said the potential business repercussions of such a move would also need to be considered, given that American companies must exist in a global marketplace.
Orin Kerr, the Fred C. Stevenson Research Professor of Law at the George Washington University Law School and workshop planning committee member, suggested that different solutions—and perhaps different levels of international coordination—might be warranted when dealing with encryption for national security purposes, as opposed to in the context of law enforcement, pointing out that there are three main contexts for government exceptional access: intelligence gathering for national security, federal criminal investigations, and state and local
7 K. Bankston, R. Schulman, and J. Laperruque, 2015, “An Illustrative Inventory of Widely-Available Encryption Applications,” New America’s Open Technology Institute, https://static.newamerica.org/attachments/12155-the-crypto-cat-is-out-of-the-bag/Crypto_Cat_Jan.0bea192f15424c9fa4859f78f1ad6b12.pdf.
8 B. Schneier, K. Seidel, and S. Vijayakumar, 2016, “A Worldwide Survey of Encryption Products,” Berkman Center Research Publication No. 2016-2, available at Social Science Research Network, http://ssrn.com/abstract=2731160.
criminal investigations. He asked whether members of Congress, when considering policy options, should think about different solutions or contexts for different purposes. Inglis observed that certain aspects of a potential international effort in this arena are likely to work much better in the context of law enforcement than in the context of national security or national intelligence, where countries inherently have different needs and goals.
Inglis went on to say that, at the national level, law enforcement and intelligence agencies have different standpoints on encryption, owing to their different missions, and citizens might feel more comfortable granting certain authorities to one than to another. The FBI and the NSA might, for instance, be placed under different obligations to disclose vulnerabilities they discover and use in investigations.
For example, he suggested that if the NSA uncovered a way to break into indigenous encrypted information channels from a rogue nation that is pursuing a nuclear weapon, few would expect the NSA to make that vulnerability known to such an adversary. But if the FBI uncovers a flaw in a system during an investigation, its obligations to both individual rights and collective security could well lead us to expect that the FBI would make such a vulnerability known so that it could be fixed.
Daniel Kahn Gillmor, senior staff technologist at the American Civil Liberties Union’s Speech, Privacy, and Technology Project, suggested that, with few exceptions (one being Mujahedeen Secrets), there is essentially no widely or intensively used encryption system that could be considered indigenous to a particular organization. If all players are using the same operating systems and software frameworks, flaws—whether in individual components or the overall system—would be common across all systems. As a result, if a government agency exploits a vulnerability when targeting an adversary, those same systems would also be vulnerable for the people that the government is supposed to be protecting, he argued. Moreover, he added, by exploiting the vulnerability, the government may, in effect, hand over the exploit to the adversary.
Inglis contended that even if the mathematics underlying the system is the same, what matters is the instantiation of the mathematics-based protocol in the system, arguing that it is conceivable that a government agency could uncover a vulnerability unique to a certain organization, and in such a case, it might not be obligated to disclose the vulnerability.
Butler Lampson, technical fellow at Microsoft and adjunct professor of computer science and electrical engineering at the Massachusetts Institute of Technology, suggested that studying the history of government and individual privacy could aid the debate about exceptional access in the context of encryption. For example, because of that history, today we have rules that guide when the government can enter a home or wiretap a phone.
Weighing in on this point, Inglis said that the Fourth Amendment (protection from unreasonable search and seizure) settled the matter, recognizing that there are times when the government is allowed to encroach on individual rights and expectations of privacy given probable cause. Today’s challenge is how to deal with encryption in this context. FBI director James Comey, in Inglis’s view, has no choice but to ask for the tools he needs to carry out the FBI’s mission. The history of technology, Inglis said, is that it does not present any perfect solutions—only workable ones.
Baker concurred, stating his view that the Fourth Amendment gives law enforcement the authority, after demonstrating probable cause and procuring a warrant, to access information on, for example, an iPhone found at a crime scene. The issue in the case of encryption is that, even with a warrant, the FBI today might not actually be able to access content that it is legally authorized to read, because the communication is encrypted. When the courts grant a warrant or wiretap, we can assume this is lawful behavior, and the government should be allowed past the encryption, Baker said. Building on this point, Lampson reasoned that the law does not protect the contents of his house from search under a government warrant and suggested that the contents of his phone are not fundamentally any different, a conclusion with which Inglis expressed agreement.
Ball drew from history a different perspective, pointing to instances where government powers have been abused. Rather than these events being viewed as anomalies, he said, it is reasonable to assume that similar abuses will continue to be perpetrated today and in the future.
Andrew Sherman, head of security practice at Eden Technologies, a New York City-based information technology consultancy, pointed to other lessons that might be drawn from the analog world. For example, he said, although the police may have the right to search a particular house, that does not mean they should have a master key to access all houses at any time. In the context of radio transmitters, those wary of eavesdropping might remove the batteries from their device—or place it inside a Faraday cage to block the signal—to avoid being tracked. In the case of protecting data that are stored or communicated, the solution people turn to is encryption, and although it might complicate things for investigators, it is not going away.
Susan Landau, professor of cybersecurity studies at Worcester Polytechnic Institute, suggested that today’s technology raises new questions that are not addressed by wiretap laws developed in the past. For example, content and metadata today go well beyond the sender, receiver, and time of the call, and phones contain a great deal of information that a user might not even be aware of.
Eric Rescorla, a fellow in the office of the chief technology officer at Mozilla, pointed out that the term “possible” is being heavily used to talk about exceptional access and suggested that the more important question is really about what is “practical.” Instead of debating whether something is theoretically possible, we should ask whether an exceptional access mechanism could actually be built and operated in a plausible way. To that end, he posed three broader technology questions to probe what might make a particular scheme practical or not: (1) How would such a mechanism actually interface with existing or new products to provide exceptional access, and how would that mechanism affect the security of those products? (2) Is it likely that such a mechanism could be evaded? (3) Is it practical to build an exceptional access mechanism that effectively allows access only to authorized users?
Joseph Lorenzo Hall, chief technologist at the Center for Democracy and Technology, cautioned that not every desired capability is, in fact, practical to build, because the technical strategy for achieving that capability may also harm the system in which it is being deployed. He suggested that a positive outcome of these discussions might be to enable communication about the limitations of our technical capabilities to nontechnical individuals.
Inglis framed the issue as one of differing goals. Individuals set their own agendas, but governments provide collective security. A key question, he asserted, is who aligns those aims. Practically speaking, if these decisions are left to market forces, companies will arrive at many different answers, with implications potentially akin to the disastrous results seen when nations largely leave their financial systems to market forces. Instead, he argued, society must choose the kind of security it wants. Later, Baker concurred with the suggestion that taking no action would essentially mean leaving the question to market forces.
Inglis cautioned against another potential “default” scenario, in which the private sector essentially says to the government “you’re going to have to hack me.” In his view, that would not be a particularly beneficial relationship for the government, the private sector, or the citizens. He suggested instead that if the government gains exceptional access to encrypted information, it should be done overtly and transparently, so that the public can understand how the access mechanisms work and the purposes for which they can be applied. Baker later expanded on this point, stating that while the FBI can hack into devices and can invest in further building up such capabilities, this process is slow, expensive, and fragile, because vulnerabilities that are identified are soon fixed. Nor does this approach scale to gathering information from many different devices, as is often necessary when dealing with a network of people.
Inglis expressed his belief that this debate is important enough to make it worth going beyond what seems practical and exploring the realm of what is possible. Looking to the architectures already in use by Google and cloud computing services, which essentially give a user exceptional access if they lose their key, or give a company exceptional access for business purposes, he posited that such systems might represent an approach that has proven secure enough (while providing some forms of exceptional access) that people are willing to buy into it in large numbers.
Ball then commented that quantifying the costs of having unbreakable encryption versus the costs of requiring lawful access is both a hard and interesting problem. He stated that the costs of a world in which all secrets are known by law enforcement will be counterfactuals—the movements that did not organize or the protests against unjust policies that did not mobilize. He said that these costs are difficult to quantify.
Ball noted that “everyone thinks they’re the good guy,” warning that believing one is working toward a good and higher purpose can at times lead people to choose means that are unsavory because they seem justified by the ends.
In his final remarks, Baker echoed the sentiment that it is worth reaching for the possible, likening the current state to a “pre-iPad” frame of mind in which we cannot imagine a different framework simply because we have not thought of it yet. Suggesting that there could be a solution that has not yet presented itself, he said that perhaps what we should seek is some way to “think different.”