In the workshop’s third session, participants drilled deeper into the technical options for creating exceptional access mechanisms, discussed how these options might impact cybersecurity, and explored the motivations of and constraints on the government, industry, and society more broadly.
The session began with a presentation by Matt Blaze, associate professor of computer and information science at the University of Pennsylvania. He explored the historical evolution of technological solutions for exceptional access and presented an overview of the challenges faced today. The presentation was followed by a wide-ranging discussion moderated by Fred Chang and Susan Landau.
Blaze began his presentation with a look at the history of cryptography and exceptional access, starting with wiretapping. Originally, wiretaps involved law enforcement getting access to the target’s phone line and connecting a device capable of capturing the analog audio and telephone network signaling being sent over the line—a relatively straightforward proposition. Eventually, as telephone systems became digital, the Communications Assistance for Law Enforcement Act was enacted to ensure that carriers would provide law enforcement with the necessary interface to their networks as well.
Blaze called the period from 1992 to 2000 “Crypto War I,” the time during which consumers, the government, and the telecommunications industry grappled with dramatic shifts in the encryption landscape. Blaze pegs the start of this “war” to AT&T’s 1992 release of the TSD-3600, a secure telephone for consumers akin to the secure phones that had previously been used only by the government. The TSD-3600 used a Diffie-Hellman key exchange to set up encryption keys and the 56-bit Data Encryption Standard (DES) to encrypt the digitized audio.
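The key setup Blaze described can be sketched with a toy Diffie-Hellman exchange. The parameters below are illustrative only; real deployments use standardized groups with moduli of 2048 bits or more.

```python
# Toy Diffie-Hellman key agreement, as used by the TSD-3600 to establish
# a session key. The small modulus here is for illustration only; it is
# far too small to be secure.
import secrets

p = 4294967291  # illustrative small prime modulus (2**32 - 5); NOT secure
g = 5           # generator

a = secrets.randbelow(p - 2) + 1  # caller's private exponent
b = secrets.randbelow(p - 2) + 1  # callee's private exponent

A = pow(g, a, p)  # caller sends A over the line
B = pow(g, b, p)  # callee sends B over the line

shared_caller = pow(B, a, p)  # caller computes (g**b)**a mod p
shared_callee = pow(A, b, p)  # callee computes (g**a)**b mod p

assert shared_caller == shared_callee  # both sides derive the same key
```

An eavesdropper on the line sees only p, g, A, and B; recovering the shared secret from those values is the discrete logarithm problem, which is believed to be intractable for properly sized groups.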
Although the telephone was expensive and not widely sold, the U.S. government nonetheless feared that technology like the TSD-3600 would spell the end of wiretapping and requested that AT&T make changes to accommodate lawful wiretaps on individuals who might use the device. The solution proposed was the Clipper Chip, which was designed by the National Security Agency (NSA) to be a drop-in replacement for the TSD-3600’s data encryption chip. Using a stronger, classified algorithm called Skipjack, the Clipper Chip provided 80 bits of security as opposed to DES’s 56. It also introduced key escrow, a feature that allowed the government to recover plaintext from a call if a TSD-3600 was used by a person who was the subject of an approved wiretap. AT&T incorporated the Clipper Chip into all TSD-3600 phones and even recalled already-sold TSD-3600 phones that lacked the chip, Blaze recounted.
The Clipper Chip was a controversial proposal and had technical problems, Blaze said. The first problem was that protocol failures in the chip’s design made it easy to bypass the key escrow feature; Blaze discovered some of these failures himself while working for AT&T. Although bypassability was likely a fundamental problem with the design, the protocol failures found by Blaze were comparatively minor and could be patched. Ultimately, a series of larger problems led to the Clipper Chip’s demise by the dawn of the 21st century. One key problem, in Blaze’s view, was that the Clipper Chip turned an inherently inexpensive technology for software-implemented cryptography into a hardware-based model that was expensive and difficult to integrate into the system. The Clipper Chip was built on a particular structure for performing encryption and required specific, tamper-resistant hardware and a classified algorithm that could not be made public. In addition, said Blaze, it had a central key database, which, although split between two government agencies, ultimately “amplified the risks extraordinarily by basically creating an end-to-end encryption system that was not, in fact, end-to-end.” This ultimately led to a system that was expensive, riskier, and easily bypassed.
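The split-database arrangement Blaze criticized can be illustrated in miniature: the device key is divided into two shares, one per escrow agency, such that neither share alone reveals anything about the key, but the two together reconstruct it. The sketch below uses a simple XOR split; the actual Clipper escrow scheme was more elaborate.

```python
# Toy two-agency key escrow via XOR secret splitting. Neither share
# alone reveals the device key; XORing both shares recovers it.
# (Illustrative only; not the actual Clipper escrow protocol.)
import secrets

def split_key(device_key: bytes) -> tuple[bytes, bytes]:
    share_a = secrets.token_bytes(len(device_key))               # agency A's share
    share_b = bytes(x ^ y for x, y in zip(device_key, share_a))  # agency B's share
    return share_a, share_b

def recover_key(share_a: bytes, share_b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(share_a, share_b))

key = secrets.token_bytes(10)    # an 80-bit device key, as in Clipper
a, b = split_key(key)
assert recover_key(a, b) == key  # both shares together recover the key
```

The split reduces the damage from compromising one agency, but, as Blaze noted, the combined system still concentrates every device's key material in government hands, creating the "end-to-end encryption system that was not, in fact, end-to-end."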
By 2000, Blaze said, the country had come to the conclusion that the use of encryption was “really too important to slow down” by requiring a key escrow system for government access, and “unfettered cryptography” was allowed to blossom.
Blaze noted that an unfortunate side effect of the focus on the Clipper Chip and key escrow systems during the 1990s was that insufficient attention was paid to the security standards being built into the communications and information infrastructure then in use, which many view as not having been sufficiently robust to support the sensitive applications that depended on it, such as commerce, finance, and national security. “We are paying that price to this day,” Blaze said.
Blaze then turned to the current period, which he described as a time of both “Crypto War II” and a “cybersecurity crisis.” The first term stems from the renewed debate over encryption spurred by comments from law enforcement and national security officials about the challenges posed by ubiquitous encryption, exemplified by an October 16, 2014, statement from Federal Bureau of Investigation (FBI) director James Comey:
Unfortunately, the law hasn’t kept pace with technology, and this disconnect has created a significant public safety problem. We call it “Going Dark,” and what it means is this: Those charged with protecting our people aren’t always able to access the evidence we need to prosecute crime and prevent terrorism even with lawful authority. We have the legal authority to intercept and access communications and information pursuant to court order, but we often lack the technological ability to do so.1
At the same time, a daily barrage of data breaches and other cybersecurity incidents is affecting our critical systems, said Blaze, offering as an example the enormous 2015 breach at the U.S. Office of Personnel Management. The United States depends on the same platforms produced for the broad consumer market—operating systems, smartphones, and the like—for its critical infrastructure, financial systems, economy, and national security as it does for personal or even frivolous activities. Blaze stated that the adversaries perpetrating these breaches are not only ordinary criminals but also nation-states, and that the problem of securing all of this critical infrastructure is getting steadily worse rather than better.
1 From remarks of FBI director James Comey at the Brookings Institution on October 16, 2014, “Going Dark: Are Technology, Privacy, and Public Safety on a Collision Course?,” https://www.fbi.gov/news/speeches/going-dark-are-technology-privacy-and-public-safety-on-a-collision-course.
Noting that the challenge of building reliable systems at scale has plagued computer science since the field’s inception, Blaze said that, in general, we still “don’t know how to fix this.” However, he pointed to two approaches as providing partial solutions: reducing the number of components in a system and using cryptography. One tried-and-true approach, he said, is to make systems as small and simple as possible in order to reduce the footprint of the vulnerabilities. Despite the merits of this approach, it has not proven practical in a world in which the endless thirst for capabilities leads to systems that are ever larger and more complex. New features are added to systems too quickly to allow them to be properly understood and to be made more robust.
The second approach is cryptography. By encrypting the data used in large swaths of a system’s components, we can reduce the “attack surface” and confine vulnerabilities to a smaller number of components. End-to-end encryption is considered the best practice in this area because it allows only the sender and receiver access to plaintext in the case of communication and allows only the person storing the data access in the case of data storage, thus reducing the opportunities for compromise to a bare minimum.
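The attack-surface argument can be made concrete with a toy end-to-end path in which a relay server only ever handles ciphertext, so compromising the server exposes no plaintext. The keystream cipher below is a stand-in for a real authenticated scheme such as AES-GCM; the construction, not the cipher, is the point.

```python
# Toy end-to-end path: the relay stores and forwards only ciphertext,
# so its compromise reveals nothing about the message content.
# (The SHA-256 keystream cipher is a didactic stand-in for a real AEAD.)
import hashlib, itertools, secrets

def keystream(key: bytes):
    # Derive an unbounded keystream by hashing key || counter.
    for counter in itertools.count():
        yield from hashlib.sha256(key + counter.to_bytes(8, "big")).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; applying it twice recovers the input.
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

key = secrets.token_bytes(32)  # held only by sender and receiver

class Relay:
    def __init__(self):
        self.stored = []
    def forward(self, blob: bytes) -> bytes:
        self.stored.append(blob)  # the relay retains only opaque bytes
        return blob

relay = Relay()
ct = xor_cipher(key, b"wire transfer #4711")
delivered = relay.forward(ct)                      # relay never sees plaintext
assert xor_cipher(key, delivered) == b"wire transfer #4711"
```

Only the two endpoints hold the key, so every intermediate component drops out of the set of places where plaintext can be stolen.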
Looking forward, Blaze identified two crucial problems that need to be addressed and balanced. The first is that the U.S. computing infrastructure is in “terrible shape” and “getting worse,” and that any actions taken should at a minimum do no harm. The second is that encryption and similar technologies are, may be, or will be making lawful access to communications more difficult. The challenge, he said, is to address both issues without worsening the first.
Blaze then turned to the concept of exceptional access, which he defined as a mechanism that provides access to plaintext that is not inherent to the requirements of the application itself. In his view, exceptional access, by nature, makes a system more complex and makes it impossible to provide end-to-end encryption because it essentially introduces an additional end point.
He pointed to two critical considerations relevant to the policy and technical feasibility of an exceptional access system. The first, “Can we trust the system if it works properly?” is a policy question. The second, “Can we trust the system to work properly?” is a technical question and is the focus of the current workshop.
Citing the report of which he is a coauthor, Keys Under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications,2 Blaze articulated some of the ways exceptional access is at odds with the need to better protect infrastructure. He said exceptional access necessarily makes cryptography weaker, increases the difficulty of integrating cryptography securely into applications and systems, creates operational vulnerabilities, and, in many cases, can be easily bypassed. Blaze pointed out that encryption systems can fail even without the additional complications of exceptional access. The most common reasons for such failures, Blaze said, have to do with the deployment of an encryption tool for a specific application: the tool may fail to meet some security requirement of the application, a usability issue may undermine it, or there may be problems with the application itself or the platform on which it runs. Problems with the engineering of the encryption tool, including implementation of the encryption algorithms, are also fairly common, he said. Less common, although still possible, are failures in the encryption algorithms or protocols themselves.
Although the algorithms and protocols are perhaps the easiest aspect of creating an encryption system that works, there are still weaknesses. Despite a great deal of mathematical work in this field, there is still no general theory of cryptography, Blaze posited. He called this “one of the dirty secrets of cryptography.” Furthermore, occasionally it turns out that the assumptions underlying algorithms aren’t entirely correct, he said. Protocols also can turn out to have weaknesses. As an example, he pointed to the protocol failures in the Clipper Chip, an encryption system designed by perhaps the world’s best cryptographers at the time.
The biggest challenges to implementing exceptional access, Blaze said, would likely be related to design and software engineering, which are complicated by access requirements. As a result, he surmised, a requirement for
2 H. Abelson, R. Anderson, S.M. Bellovin, J. Benaloh, M. Blaze, W. Diffie, J. Gilmore, et al., 2015, Keys Under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications, Technical Report MIT-CSAIL-TR-2015-026, Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory, July 6.
exceptional access may lead many vendors to reduce their focus on security or forego encryption altogether due to the increased difficulty and expense of building secure encryption systems that also allow for exceptional access. As a result, he said, there’s a danger that “we won’t see the deployment of encryption in places where we really, really need to have it used.”
Exceptional access also raises several operational challenges. Whether keys are kept in a single centralized escrow system or in multiple systems, the result is collections of secrets “that have to be guarded essentially in perpetuity,” Blaze noted. Moreover, the complexity of operational infrastructure needed to add exceptional access may well dwarf all the other operational requirements of the application. These collections of secrets would have enormous value for adversaries, creating high-value targets. Maintaining and securing such systems, he said, would be both extremely difficult and extremely expensive.
Allowing that exceptional access can be built into certain applications under certain circumstances at an acceptable cost (and indeed this has already been done in some cases), Blaze said that the risks of implementing exceptional access in a more generalized sense are enormous and the consequences going forward would be “unbounded and unpredictable.” Moreover, Blaze observed, in the context of a fast-moving technology industry involving numerous smaller actors, it is likely exceedingly difficult to implement exceptional access generally, particularly in the nonenterprise context. And given an evolving cast of adversaries that increasingly includes nation-states seeking to do serious harm to the security and economy of the United States, any failures in this realm are likely to be both high impact and expensive.
Fred Chang and Susan Landau moderated a broader discussion of the technical options and security risks related to exceptional access to encrypted information and communications. The discussion is broken into two sections: The first focuses on context and major issues and the second on potential solutions. The order in which the contributions are presented here is not the order in which they were discussed at the workshop.
Chang inquired as to whether, instead of pitting cybersecurity and exceptional access against each other as opposing choices, it would be reasonable to frame the issue as a series of engineering trade-offs. Blaze agreed with that conceptualization in principle but noted that as the technological landscape grows more complex and the stakes become higher, it becomes increasingly challenging to truly understand what is being traded or sacrificed. A related weakness, he added, is that we have poor metrics for evaluating the security of our systems.
Blaze described the difficulty of creating a secure exceptional-access system as “unprecedented.” He noted that because the problems are subtle and there is no specific proposed system on the table to evaluate, it is hard to quantify exactly what the risks are. At a fundamental level, such a system would likely require (1) a secret held somewhere, (2) a mechanism for communicating that secret, and (3) a mechanism for guarding that secret effectively in perpetuity. Compounding the complexity of these basic requirements is the fact that apps and software can be used anywhere—and by anyone—around the globe.
Noting improvements in cryptography and key management systems, Chang asked whether any recent technological breakthroughs might make exceptional access more feasible. Although he acknowledged incremental progress in cryptography and its algorithms, Blaze said that because the range of vulnerabilities, or “problem space,” has also expanded, the ultimate result is a mixed bag in terms of how much progress has been made relative to the actual problems.
Daniel Kahn Gillmor spoke of the trade-off between the (perhaps limited) benefits of an exceptional access system to law enforcement and the increased vulnerability to which such a system would expose everyone. Although exceptional access might enhance law enforcement’s ability to catch the “dumb criminals” (those without the resources or knowledge necessary to procure end-to-end encryption tools), he predicted that many others would rapidly adopt
a less vulnerable encryption system, even if it were not available by default. Later, Patrick Ball reiterated this point, suggesting that sophisticated users will be able to circumvent exceptional access through what would amount to a form of steganography, by hiding secure encryption inside a layer of encryption with exceptional access. Until the outer layer is decrypted, an adversary or government would not even know that the material inside had been encrypted. Such a scheme, Ball said, would not be difficult for savvy users to implement; as a result, the only targets whose communications can be made visible through exceptional access schemes are “dumb criminals” and law-abiding citizens.
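Ball’s layering scheme can be sketched as follows: the inner ciphertext, produced under a key only the communicants hold, is wrapped in an outer layer whose key is escrowed. An authority that recovers the escrowed outer key obtains only the inner ciphertext, which is indistinguishable from any other data. The cipher below is a toy; the double-wrapping construction is what matters.

```python
# Toy illustration of Ball's layering: strong inner encryption wrapped
# inside an outer layer whose key is escrowed. Recovering the escrowed
# outer key yields only random-looking inner ciphertext, not plaintext.
# (The SHA-256 keystream cipher is illustrative, not a real design.)
import hashlib, itertools, secrets

def keystream(key: bytes):
    for counter in itertools.count():
        yield from hashlib.sha256(key + counter.to_bytes(8, "big")).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

inner_key = secrets.token_bytes(32)  # known only to the communicants
outer_key = secrets.token_bytes(32)  # escrowed with the authority

msg = b"meet at dawn"
wire = xor_cipher(outer_key, xor_cipher(inner_key, msg))  # double-wrapped

# The authority, holding only the escrowed outer key, strips one layer
# and recovers just the inner ciphertext -- not the message.
seen_by_authority = xor_cipher(outer_key, wire)
assert seen_by_authority != msg

# The intended recipient holds both keys and strips both layers.
assert xor_cipher(inner_key, seen_by_authority) == msg
```

Because well-designed ciphertext looks like random bytes, the authority cannot even demonstrate that a second layer of encryption is present, which is why Ball characterized the approach as a form of steganography.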
Gillmor emphasized that the costs of deploying exceptional access are borne by everyone. Even worse, in his view, we would be stuck with any downsides for years to come. He pointed to an analogous deployment in the 1990s: to comply with U.S. regulations on the export of cryptography technologies, export-grade cipher suites were introduced into the Transport Layer Security and Secure Sockets Layer family of protocols, and the resulting weaknesses are still being exploited. We are still paying the cost, Gillmor said, “even though we decided years ago that that was not a good cost to pay.”
Blaze noted some additional trade-offs. If, for example, a system is built with a centralized repository that application developers would make use of, this concentrates one part of the risk in a “fairly well hardened environment,” but at the same time creates a “big central fat target” that may or may not be adequately secured. If the central repository fails, it fails catastrophically. Given the complexity of the requirements associated with exceptional access, he said it is quite likely that engineers designing one aspect of a system to meet one requirement could create a structural weakness somewhere else, perhaps without knowing it. Blaze went on to caution that the right set of requirements needs to be considered: It may be easy to meet the requirement that keys can be recovered, but that does not ensure that only the right people are recovering keys.
Landau raised two other trade-offs. First, if a smartphone’s data are more easily accessible, it undermines the ability of the phone to act as a secure authenticator. The fact that everyone carries their phone with them—and we all notice when it’s gone—makes it especially useful, she said, and losing this capability would impose a cost. Second, since the mid-1990s, the U.S. Department of Defense (DOD) has been using more and more off-the-shelf, commercial equipment, in part because of the speed of innovation in Silicon Valley. As a result, she observed, it is important to remember that DOD has a strong interest not just in the ability to break into encrypted devices or communication streams, but also in the ability to keep them secure.
Kevin Bankston explored how exceptional access mechanisms could create vulnerabilities that might be exploited. Bud Tribble agreed that operationalizing exceptional access at scale would indeed raise the risk that adversaries could tap into the decryption capability, pointing to that concern as one of the reasons Apple decided to implement end-to-end encryption. A requirement to maintain an exceptional access mechanism, he said, would create a persistent target for an adversary to exploit, a fundamentally riskier situation than the current system in which Apple can catch and patch security vulnerabilities as they become known.
Another issue to consider is how exceptional access would play out internationally—say, if a foreign government wanted to tap into a communications stream encrypted in a way that enables FBI access to the plaintext. Ball said that based on his experience working with human rights groups at risk of government persecution, faced with such a request, he “would be delighted to say ‘I don’t have the keys.’”
Finally, in a broader sense, there are trade-offs involved in action versus inaction relating to the alternative outcomes that might occur under different circumstances. Butler Lampson suggested that, in the absence of a credible technical solution from the technology community, legislators might come up with their own solution. At several points throughout the workshop, other participants pondered what might be sacrificed if the quest for a “perfect” solution is allowed to block the pursuit of workable, albeit imperfect, solutions.
Participants discussed key concerns and challenges facing law enforcement related to data encryption.
Noting that companies will generally produce data that they have access to in response to law enforcement legal demands, Richard Littlehale asked why some companies are considered to provide adequate security even if they retain the ability to access encrypted data, while others argue that a total lack of access by anyone but the individual user is the only way to ensure adequate security.
At several points, Littlehale suggested that law enforcement is not seeking a perfect system of total access, but is instead looking to improve the current situation, where the increasing ubiquity of encryption for devices and communications is rapidly constraining the types of evidence accessible to law enforcement. In response to the point that technologically sophisticated criminals will likely continue to find ways to protect their data from law enforcement, Littlehale urged attendees not to underestimate the amount of human misery caused by criminals who are either “dumb” or “smart but lazy.”
Littlehale underscored the fact that these issues are currently affecting real cases. If it is determined that exceptional access is either not technically feasible or not a policy the country wishes to pursue, what then is law enforcement expected to do to fulfill its mission? Blaze built on this point, noting that law enforcement and the FBI need viable alternatives if exceptional access does not work out. He predicted that a “Plan B” would involve more investment in lawful hacking and other forensics capabilities. This idea is discussed in greater depth below (see the section “Part 2: Exploring Solutions”).
Joseph Lorenzo Hall suggested it will be important to measure the needs of law enforcement. Such insights would give society a more rational way to understand what is lost if more security is pursued at the expense of catching fewer perpetrators, or if we sacrifice security to achieve law enforcement goals. Landau commented that participants in the annual Workshop on the Economics of Information Security are pursuing research in this vein, but that it is often difficult to obtain the data needed to truly answer the question.
Littlehale agreed that these impacts are difficult to quantify. Rarely can law enforcement point to encrypted data as an absolute or insurmountable block to an investigation, he said; the typical impacts are more nuanced: cases take longer to investigate, victims endure extra suffering as a result, or resources are pulled from other cases. On the flip side, he said, it is appropriate to recognize the value of feeling secure, the feeling that one’s privacy is protected.
Littlehale noted also that certain types of evidence previously available to law enforcement in the analog world are no longer available in the digital world. For example, under a physical search warrant, a police officer might once have patted down a suspect and found an address book. Now, a suspect’s address book is likely to be stored on a smartphone, and its content is more and more often encrypted. In essence, he said, this means law enforcement is being expected to execute its mission—which includes not only prosecuting the guilty but exonerating the innocent—with potentially less evidence.
Ball offered a counterargument: today’s digital world offers a great deal more information to law enforcement than was previously available, through means such as location tracking, metadata, unencrypted e-mails, and plaintext calls made through traditional cellular networks. Although we may be “going dark” in some places, we are going “brilliantly bright” in others, he argued, suggesting that abandoning our devices altogether would take a lot more evidence out of the reach of law enforcement than using fully encrypted devices would.
Addressing in a broader sense the aims of law enforcement, Littlehale noted that concerns about the potential for someone to steal an escrow key or abuse exceptional access to the detriment of human rights are also concerns of the law enforcement community. Such actions, he noted, would be crimes, and thus law enforcement has an interest in preventing and prosecuting them.
Participants considered the market environment that has driven certain decisions by industry players, as well as how companies might respond to an exceptional access requirement.
From a practical standpoint, Blaze suggested that imposing an exceptional access system that is too complex or difficult to implement might drive many companies (with the possible exception of Apple, he conceded) to simply abandon encryption. Such an outcome would result in much less security for everyone.
Tribble explained that Apple’s approach to implementing encryption has been a response to two trends: (1) an increasing level of threat to its customers and (2) the fact that consumers are keeping an increasing amount of important information on their phones. Given these trends, Apple opted to pursue end-to-end encryption, an approach, Tribble noted, that the technical community over the past 50 years has determined to be the best practice.
Tribble went on to acknowledge that when Apple does not have the keys to unlock something law enforcement needs access to, that can lead to a legal order not being enforceable, which, he conceded, is not a good thing. However, in the same way that it is hard to measure precisely how this impacts the efforts of law enforcement, it is hard to quantify the risks and benefits to customers whose data are protected (or not) with end-to-end encryption.
Noting that Apple recently introduced a feature that allows people to download their medical record to their phones, Tribble emphasized that these trends—rising threats and an increased amount of sensitive information on phones—are going to steepen in the coming years. Echoing a sentiment expressed earlier in the workshop by Baker, Tribble said the solutions to these problems are not for one company or one government agency to decide, but for society as a whole.
Landau moved the conversation toward the Internet of Things (IoT) to explore how the proliferation of network-connected devices affects the encryption landscape or exceptional access options. Although the IoT may seem trivial—for example, it may be unlikely that a network-connected toaster would reveal vital information—she pointed to the U.S. intelligence community’s 2016 worldwide threat assessment that suggested “future intelligence services might use the IoT for identification, surveillance, monitoring, location tracking, and targeting for recruitment, or to gain access to network or user credentials.”3
Eric Rescorla and Hall began by clarifying that many such devices do not or cannot practically use encryption. Hall remarked that some IoT devices take 10 seconds or more to perform a single encryption operation, and that those that do use encryption likely rely on the same TLS protocol that is widely used for Internet communications today. However, there are efforts to come up with lightweight cryptographic capabilities for such devices, said Hall. Andrew Sherman noted that many IoT devices involve constant watching or listening in private environments, which, depending on your viewpoint, could represent a tremendous opportunity or a tremendous risk.
James Burrell pointed to two main areas of concern from the law enforcement perspective: cybersecurity and forensics. Because the IoT introduces new devices into networks, a cyberattacker has many new points to attack. The compromise of a network-connected coffeemaker, for example, might not be useful on its own, but could be very valuable for an adversary if it allows access to other pieces of the network, he said. Burrell said that in the context of forensics, law enforcement would generally gain the most from being able to access data or communications from IoT devices at the point of data aggregation, where all of these devices are transmitting data to a cloud service or to a local hub, because direct physical access would otherwise be required. Later, Landau and Brian LaMacchia noted that in the absence of practical homomorphic encryption, data at the point of aggregation would need to be available as plaintext.
Littlehale offered a slightly different take, suggesting that physical devices themselves can prove useful to law enforcement. For example, in a homicide investigation, data stored from a sensing device could be accessed to discern when a person was or was not present in the home at a particular time. This type of information, he said, is already being used by law enforcement, although current uses typically involve devices that have a memory but no network connection.
Rescorla observed that the other speakers seemed to be saying that government would not need exceptional access to communications among IoT devices, because it would be able to extract needed data from the aggregation points. Littlehale responded that in the current environment, such access is likely not necessary, but it is unclear what the future may hold.
From a technical perspective, LaMacchia noted that for some portion of these devices—particularly ones meant to be built into houses, such as light switches—a major concern is the limited capability for security updates. Many low-end devices, he said, are based on a “rip and replace” model for updating, requiring new hardware for updates and not just new software. As a result, such devices are unlikely to be using up-to-date security. The ultimate result of this, he suggested, is likely to be a solution in which the more vulnerable pieces of a network are
3 J.R. Clapper, “Worldwide Threat Assessment of the US Intelligence Community,” Statement for the Record, Senate Select Committee on Intelligence, February 9, 2016, p. 1, http://www.intelligence.senate.gov/sites/default/files/wwt2016.pdf.
walled off with a series of internal firewalls so they do not compromise the more secure portions of the network. If exceptional access is something society decides it wants, LaMacchia said, it may not actually be possible to add that capability to devices that cannot be updated.
Building on this point, Donner pointed to modern cars as an example of a relatively mature instance of the IoT that has already revealed how weak security, mismanagement at the configuration level, and a lack of updatability can lead to significant problems. He suggested that the same essential problems are likely to happen “on a larger scale and on a faster time line” as the IoT expands. Stephen Checkoway of the University of Illinois, Chicago, noted that he was one of the people who had hacked into cars in 2010 to demonstrate some of these issues. He said that although the security story then was “kind of a disaster,” things have improved somewhat, and manufacturers are developing the capability for over-the-air security updates.
Rescorla noted that because IoT devices often are actuators capable of causing things to happen in the physical world, it is especially important that any exceptional access mechanism not pose a threat to authentication and integrity. For instance, an attacker who gained write access to the control channel for such a device might be able to cause physical harm, such as a fire.
Prompted by a question from Chang about system-to-system traffic, Blaze noted that as the IoT expands, in general, these devices will reuse building blocks from other computing systems. Already, that has meant that entire operating systems once intended for a desktop computer are being incorporated into embedded devices, a trend Blaze expects to continue.
Noting that it is impossible to accurately predict how the IoT will evolve in the coming years, Blaze cautioned that the decisions made today about security infrastructure are going to have profound effects on the security of future platforms. Because decisions made now with respect to smartphones, for example, are likely to apply to a great diversity of IoT devices down the road, Blaze said that what we are doing today is immensely important.
Looking forward, participants brainstormed the types of requirements that might be considered when designing an exceptional access system. They also explored the feasibility and security implications of several specific technological and operational options.
Several participants expressed frustration over a lack of clarity on what exactly an exceptional access system would need to do to meet the government’s expressed needs. Chang challenged attendees to brainstorm a set of possible requirements.
Matt Green, assistant professor in the Department of Computer Science at Johns Hopkins University, proposed a first key set of questions about what form a mandate for access might take:
- Would the government mandate exceptional access or only encourage companies to adopt it?
- Would the government issue a high-level requirement and let companies figure out how to implement it, or would it identify a specific implementation for everyone to adopt?
Burrell suggested that, at the end of the day, law enforcement really wants access to plaintext, and he noted that the vast diversity of systems in use makes it difficult to specify exactly how this would be implemented.
A second key set of questions, Green said, relates to responsibility:
- Who is responsible when things go wrong?
- What happens when escrow keys are stolen or someone finds a vulnerability?
Picking up on this point later, Ball suggested that law enforcement would need to be held accountable for any failures that result from a key escrow system.
Turning to the more technical aspects of the question, Green suggested that abuse detection should be considered an important requirement. As Gillmor mentioned, it is valuable to be able to detect a breach and stop further damage, even if you can’t always prevent the breach in the first place.
A related question is whether government actors would need only prospective capabilities—for example, the ability to decrypt future communications under a wiretap warrant—or also retrospective capabilities, such as the ability to retrieve past messages sent by a suspect in a terrorist attack. Green said it is unlikely that an exceptional access system could both provide retrospective capabilities and detect abuse, although abuse detection may be possible in a system that supports only prospective decryption. Such a system would require a design in which a master key can be sent to a target’s phone, causing all subsequent communications to be encrypted in a different way that allows exceptional access.
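The prospective-only design Green described could be sketched structurally as follows. This is a minimal illustration, not a real protocol: the class names, message fields, and the XOR-based stand-in for key wrapping are all hypothetical, chosen only to show why the escrow field's presence makes this kind of access observable.

```python
import os
import hashlib
from dataclasses import dataclass
from typing import Optional

def wrap(key: bytes, wrapping_key: bytes) -> bytes:
    # Stand-in for a real key-wrap algorithm (e.g., AES-KW): XOR with a
    # hash-derived pad. Applying wrap() twice with the same key unwraps.
    pad = hashlib.sha256(wrapping_key).digest()[: len(key)]
    return bytes(a ^ b for a, b in zip(key, pad))

@dataclass
class Device:
    escrow_key: Optional[bytes] = None  # set only after a lawful directive

    def receive_escrow_directive(self, escrow_key: bytes) -> None:
        # A court-authorized "master key" arrives; all *subsequent*
        # traffic will additionally be wrapped for the escrow authority.
        self.escrow_key = escrow_key

    def send(self, plaintext: bytes) -> dict:
        session_key = os.urandom(32)
        msg = {"ciphertext": b"...", "recipient_wrapped_key": b"..."}
        if self.escrow_key is not None:
            # This extra field is visible in the message format, which is
            # the hook that makes prospective access auditable.
            msg["escrow_wrapped_key"] = wrap(session_key, self.escrow_key)
        return msg

d = Device()
before = d.send(b"hello")              # no escrow field before the directive
d.receive_escrow_directive(os.urandom(32))
after = d.send(b"hello again")         # escrow field present afterward
```

The point of the sketch is structural: because the escrowed key travels alongside the message, a user or auditor inspecting traffic can in principle notice when escrow has been activated, which is what distinguishes this prospective design from a retrospective one.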
To this point, Littlehale said retrospective exceptional access is not currently the highest priority for state and local law enforcement. Most companies have business reasons to access their own historical data and will respond to legal demands for whatever they possess. Although it might become a problem in the future if companies were to change their practices, he said it is not currently a major impediment. He emphasized that a much bigger problem is the devices and communication systems that are designed such that even the manufacturer cannot access their stored data or communication streams. Prompted by a follow-up question from Green, Littlehale later added that if the market were to move in a direction where cloud backups are also encrypted under user control, law enforcement would consider the ability to decrypt that information a far higher priority.
Picking up on this point later, Bankston suggested that approaches like that proposed in the Feinstein-Burr bill, which would require entities with the ability to encrypt to also retain the capability to decrypt, would essentially outlaw perfect forward secrecy. With this technology, now coming into widespread use, communications are encrypted with ephemeral keys that are discarded after use; an adversary who obtained the keys for one TLS session would still lack the keys for every other session, and an adversary who gained access to a device would be unable to decrypt communications recorded in the past.
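The mechanism behind perfect forward secrecy can be illustrated with a toy ephemeral Diffie-Hellman exchange. The parameters below are deliberately tiny and illustrative only (real TLS uses standardized 2048-bit-plus groups or elliptic curves); the point is that each session draws fresh secrets that are thrown away, so compromising one session key reveals nothing about any other session.

```python
import secrets

# Toy finite-field Diffie-Hellman parameters -- far too small for real use.
P = 0xFFFFFFFB  # a 32-bit prime modulus, illustrative only
G = 5           # illustrative group element

def ephemeral_session_key() -> int:
    """Each side draws a fresh ephemeral secret for this session only."""
    a = secrets.randbelow(P - 2) + 1  # client's ephemeral secret
    b = secrets.randbelow(P - 2) + 1  # server's ephemeral secret
    A = pow(G, a, P)                  # client's public value
    B = pow(G, b, P)                  # server's public value
    shared_client = pow(B, a, P)      # client computes g^(ab)
    shared_server = pow(A, b, P)      # server computes the same value
    assert shared_client == shared_server
    # The ephemeral secrets a and b go out of scope here and are never
    # stored, so the session key cannot be re-derived later.
    return shared_client

# Two sessions derive independent keys: recording traffic and later
# seizing a device does not yield the keys for past sessions.
k1 = ephemeral_session_key()
k2 = ephemeral_session_key()
```

A decrypt-on-demand mandate of the kind Bankston described would effectively require retaining the ephemeral secrets (or the derived keys), which is exactly what this construction is designed to avoid.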
Burrell said that in his view, law enforcement wants both prospective and retrospective capabilities. In a communications system with perfect forward secrecy, one can still wait for the data in motion to become stored data when received, and access it then. Burrell also raised the point that timing is a consideration. On a mobile device, for example, how long are the data effectively stored, and how does that compare to how long the data are of value to law enforcement? Green replied that, ultimately, decisions about encryption and access must be made before the data are created. Under the assumption that anyone could become a suspect at some point in the future, if access to their data is desired, then that decision would have to be applied uniformly to everyone. This, in his view, would undoubtedly affect the security of the system.
Rescorla raised the question of whether it is a requirement that one must be able to detect when someone is not using escrowed encryption, arguing that determined adversaries—especially in the counterterrorism context—will use non-escrowed encryption and that it is not practical to detect when adversaries do so.
Littlehale noted that there may well be things law enforcement desires that are either not technically feasible or beyond what the public is comfortable entrusting to law enforcement. It may be that only incremental changes are possible. Given that law enforcement faces real barriers now and will likely face more of them in the future, he suggested that the most important thing is for these barriers to be understood and appreciated so that society can make its decisions in an informed way. Audience member Anand Gupta, from Harvard University and Palantir Technologies, suggested it would be useful, in the case of a large number of sometimes conflicting desires and requirements, to prioritize their relative importance.
Lampson discussed his ideas for a system design that would allow exceptional access while providing reasonable security. He began by pointing out that although exceptional access necessarily makes a system less secure, it is quite possible that the reduction in security could be kept to a minimum such that the level of security could remain acceptable. He asserted that a k out of n model could provide such a solution. In this type of system, the encryption key is itself encrypted with a set of “sealing keys,” for which a set of matched “unsealing keys” is created and given to trusted escrow agents. The escrowed encryption keys are broken into a number of pieces, n, and a certain number of them, k, must be combined in order to unlock the encryption.
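The splitting step of such a scheme is commonly realized with Shamir secret sharing: the key becomes the constant term of a random degree-(k-1) polynomial over a prime field, each escrow agent receives one point on the polynomial, and any k points reconstruct the key by Lagrange interpolation. The following is a minimal sketch of that idea, not production escrow code:

```python
import secrets

PRIME = 2**127 - 1  # a Mersenne prime field; the secret must be below this

def split(secret: int, k: int, n: int) -> list:
    """Split `secret` into n shares such that any k of them reconstruct it."""
    # Random polynomial f with f(0) = secret and degree k - 1.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(k - 1)]
    def f(x: int) -> int:
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares: list) -> int:
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = secrets.randbelow(PRIME)          # the escrowed encryption key
shares = split(key, k=3, n=5)           # five escrow agents, any three suffice
assert reconstruct(shares[:3]) == key   # three agents can unlock it
assert reconstruct(shares[1:4]) == key  # any three work
```

Fewer than k shares reveal nothing about the key in an information-theoretic sense, which is why the scheme's weaknesses lie not in the mathematics but in the configuration and the trustworthiness of the agents, as Lampson goes on to note.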
Lampson identified two key weaknesses in such a system: First, one must determine what the sealing keys are—a configuration problem—and second, one must trust the escrow agents, which requires that technical bugs be eliminated from the escrow agent computer systems and, ultimately, involves trusting the people and institutions that operate each escrow agent. While acknowledging that such a scheme undoubtedly could be compromised, Lampson argued that such risks pale in comparison to the overall context and scale of existing security problems. He said the main counterargument he can envision relates to the policy issues articulated by Ball—namely, that the gain from an exceptional access system would not be worth the associated risks. From a practical standpoint, he added that it seems certain that countries like China and Russia are going to mandate exceptional access, including for American-made products sold in those countries.
Green commented that it is unlikely that such a system could be properly verified. Although certain types of programs are now formally verified in a research setting, he said that software engineers who would actually be building such a system “just don’t speak that language.” As a result, he said, any system built would likely have the same kinds of vulnerabilities that have been seen before. And while Green believed that it might be possible to find some number, k, of trustworthy organizations to serve as escrow agents, somewhere down the line, the systems they use to hold the escrowed keys will almost certainly share some common software. It is those pieces of software that Green believes would be the point where breaches occur and compromise the system.
Reiterating a previous point, Blaze reminded attendees that any such key escrow scheme would have to be robust against an attack by a nation-state, which is a significant risk, particularly for “important users” such as components of the national infrastructure that are critical for the U.S. economy and national security. In addition, unlike a system designed to protect seldom-used information, such as nuclear launch codes, for example, the exceptional access systems for the purposes being envisioned would have to be used every time there is a wiretap approval—perhaps on the order of once per hour or once per 15 minutes—yet provide security that is robust enough to deflect an adversary that is perhaps as sophisticated as the NSA. Landau reported that the number of federal and state wiretaps issued in 2014 was reported by the U.S. courts to be 3,554.4
In this vein, Rescorla noted that in addition to accommodating frequent access, it might be necessary to allow a large number of jurisdictions to have independent access. Lampson and Landau suggested that the mechanics of the institutions responsible for running the escrow agencies would require careful consideration and that keys would not be handed out at the level of every police department.
Landau wondered whether it matters if the escrow system uses off-the-shelf or specially developed communication software. While acknowledging that both approaches have downsides, Lampson reiterated that the relative risk of this system compared to what we have now is still minor. The larger threat, he emphasized, would be use of flawed methods to configure the sealing keys, which could put the sealing keys in the hands of adversaries. Green concurred with this point, citing as an example a configuration problem that resulted in a significant security vulnerability in Juniper network devices. Although the configuration challenge is significant, he said he believed it might be solvable, though he was not confident of this.
Sherman weighed in to urge caution, citing as a fundamental risk the increased complexity that a k out of n approach adds to a system: “We were all brought up to think that complexity is the enemy of security,” he said. Pointing to largely offline, paper-based identification number storage systems of certain financial institutions as an example of a design that is compromised only rarely, he noted that any system in which we must store both data and keys online is, by nature, going to be more vulnerable than a system for which this information is not accessible via the Internet. LaMacchia expanded on this point, noting that “what we know about security for high-value assets is: We want stuff offline.” Indeed, he said, some of Microsoft’s cryptographic assets that are protecting high-value items involve keys that have both online and offline aspects. However, given the operational requirements currently envisioned for an exceptional access system, he said that such a system would likely need to be online.
4 U.S. Courts, Wiretap Report 2014, last updated on December 31, 2014, http://www.uscourts.gov/statistics-reports/wiretap-report-2014.
Participants discussed how the k out of n approach might be tested experimentally and how its security risks could be better understood by examining existing systems that use a similar scheme. Lampson suggested one approach would be to develop the system and then offer a $5 million prize to anyone who can break it. Donner and Rescorla pointed to the signing of the Domain Name System’s (DNS’s) root server credentials as an example of a k out of n system. A key difference in the DNS case is that the keys are not stored in network-accessible systems, and the people holding portions of the key must gather physically. This approach is very expensive, making it impractical as a template for an exceptional access system that must be used frequently.
Rescorla described his main concerns about the k out of n scheme as relating to management of the system once it is built, although he also noted that the code itself would be difficult to write, and there would likely be many defects in implementation. “We are nowhere near having the ability to have a complete system which would have high confidence and security from the ground up,” he said.
Gillmor pointed out that the use of k out of n schemes for signing authorities like the DNSSEC (Domain Name System Security Extensions) root zone signing keys, as distinct from decryption keys that would be necessary for an exceptional access mechanism, provides an opportunity for detecting a compromise, because the signatures must be distributed to an intended victim for theft of a signing key to be useful. The signing keys themselves can be kept secret, but victims who store or publish copies of the signatures that they receive could eventually detect that the signing key was misused, because they can see an artifact of its use. On the other hand, if an encryption key is compromised, it can be used in secret to decrypt captured ciphertext without raising any flags. Another important consideration, Gillmor added, is timing and maintainability. Many encryption systems protect data that need to be kept secret for an undetermined amount of time. A k out of n system, or any other exceptional access mechanism for such information, must remain secure and function properly over the sensitive lifetime of the data encrypted using it. This is a significantly harder problem than seeking a short-term or revocable solution.
One final problem raised by LaMacchia is that securing any single environment is difficult, and in the k out of n scheme, one must secure n systems in parallel and maintain the security of all the other systems, even if one should fail. This increases the scale of the problem, and LaMacchia cautioned about underestimating the difficulties of physically securing that many components.
Turning to other potential options, Littlehale questioned whether requiring physical access to the device in question would make a difference. Lampson suggested the security risks could be reduced by keeping the escrowed keys, sealed by the keys of the various escrow authorities, on the device rather than sending them to the authorities. Then an attacker would need both the device and the cooperation of k escrow authorities to release the escrowed key, rather than just the latter. In addition, this approach no longer depends on being able to reliably deliver the sealed key over the Internet.
Returning to this topic later, Bankston pointed out that while it may on the surface seem like requiring physical access to the device would help to contain security vulnerabilities, it is feasible in some countries for the government to seize a large number of phones, thus allowing those governments to steal encryption keys on a large scale. Lampson countered that a government like that of China would be more apt to mandate exceptional access than to gain access by physically seizing millions of phones. This sparked further debate concerning how a company like Apple might respond if mandated to provide exceptional access to the U.S. and/or the Chinese government. Bankston held that if the United States mandates such access, China will most certainly do so as well.
With regard to the need for law enforcement to access information on encrypted devices, Blaze said the question becomes whether the cost of creating the infrastructure necessary for access would exceed the cost of developing other methods law enforcement could use to gain access—for example, reverse-engineering the hardware. In this way, it becomes an economic question as well as a security question, and the economic component would be felt especially acutely by local law enforcement bodies.
The discussion turned to the feasibility of segmenting exceptional access, but by user instead of by sector or technological layer, as had been discussed in Session 2.
Littlehale questioned whether every smartphone must be engineered to withstand an attack from a nation-state adversary. Several participants observed that the idea of creating “more secure” and “less secure” hardware or software is not generally a viewpoint supported by industry. LaMacchia responded that as a Microsoft cryptographer he works under an explicit directive to give the strongest protection to all users and assume a nation-state level threat for all products. Looking toward the long term, Microsoft implements this by adopting approaches like perfect forward secrecy. Rescorla concurred, characterizing the notion of providing good security for some users and bad security for others as “extremely unpleasant” from the perspective of a software or hardware manufacturer. If users have to go to extra effort to get strong security, the vast majority will not bother and will find themselves with weak protection.
LaMacchia added that it cannot be assumed that a device sold to an American user will only be operated in the United States. High-value targets, such as company CEOs and indeed anyone doing business internationally, may need strong encryption to defend against nation-state adversaries when traveling to other countries. Bankston later built on this point, noting that often the point of the attack is not to access an individual’s information but to use the compromise of an individual’s system as a platform from which to attack the valuable information of others. In that sense, he said, it is reasonable to think about hardening the entire “ecosystem” against adversaries such as China or Russia.
Bolstering the case that all users deserve strong encryption, Sherman pointed out that many strong cyber adversaries come not from the realm of nation-states but from technically sophisticated criminal networks seeking to steal people’s money. The targets of such attacks are not limited to high-profile or necessarily high-value individuals or organizations.
Raising another potential downside of segmented encryption, Blaze noted that the current technology industry is built on the idea of general-purpose platforms that are controlled by easily loaded software. Somehow restricting this system by implementing architectures that would disallow user-supplied encryption could have serious economic ramifications. Tribble clarified this point by emphasizing that a motivated person could always find a way to load software implementing such encryption onto a device. In addition, he said it is likely unwise to try to segment exceptional access based on a distinction between hardware and software, particularly in the context of regulation, because the hardware-software boundary tends to shift over time.
At several points throughout the workshop, participants considered what role the government’s own technical expertise and resources should play in addressing the challenge of encryption—for example, through lawful hacking. They also considered how the technology industry and its security solutions bolster the security of the government itself as well as other ways to improve the government’s access to information in the absence of an exceptional access mechanism.
Donner asked whether there are sufficient skills and technical resources in the law enforcement community to make use of this approach. Having the requisite skills in house can ensure that devices are handled correctly in circumstances such as those that transpired in San Bernardino. Although such an approach will not always provide access to all the data from sophisticated criminals, it could give law enforcement more capabilities, at least against less sophisticated criminals.
Littlehale suggested that a regime based on massively up-staffed technical capabilities in the law enforcement and national security communities, even as industry continues to strengthen security, would create a situation where the United States is simultaneously devoting significant resources to improving security, on the one hand, and finding ways to compromise it, on the other. Moreover, if the government spends $5 million to find a vulnerability it can use, is it expected to then share it with the software creator so that it can be patched, only to require the government to spend another $5 million to find a vulnerability in the patch? Blaze responded that in some ways, building up the government’s hacking capabilities will be feasible and indeed necessary, but not universally. Given that adversaries will in any case seek to bypass exceptional access mechanisms, lawful hacking will inevitably be needed even if there is a mechanism for exceptional access. If exceptional access is not given, he said, it certainly would require much greater investment by law enforcement.
Burrell weighed in on this matter, noting that lawful hacking is extremely complex because of the “vast universe” of devices and systems law enforcement would need to develop the capability to hack, and because those devices and systems have increasingly short life spans. In addition, the costs would include not only the direct cost of the experts working on this issue but also the opportunity cost of pulling those experts away from other jobs and tasks. Given all of this and the fact that law enforcement would be expected to reveal the vulnerabilities uncovered in the course of investigations to support cybersecurity aims, Burrell concluded, “I just don’t even see how that’s even a feasible option.”
Tribble suggested that better communication channels between companies and law enforcement could reveal currently unexploited resources that law enforcement could tap into even in the absence of an exceptional access system. Landau offered an anecdote to illustrate this point: A law enforcement organization complained that iMessage metadata sought for an investigation could not be accessed through communications providers like Verizon or AT&T. Instead, such information is held by Apple, where law enforcement never expected it to be. Given how rapidly the technology evolves, Tribble acknowledged that it is hard for law enforcement to keep up with what is actually available, and doing so is not something that law enforcement can be expected to do on its own. Cooperation from industry is important to building government capacity in this area, he suggested.
With regard to how government might take better advantage of available information, Burrell noted that just because more data or metadata are available does not mean they are accessible or necessarily useful to law enforcement. Sometimes, for example, they may be available but cannot be processed in a way that meets legal requirements for privacy or transparency. Sometimes those holding the data do not have a full understanding of what they are holding, Burrell said, and metadata are not standardized, so law enforcement does not always have a good idea of what it will receive when it asks for metadata. Moreover, just having metadata does not necessarily mean one has actionable information; sometimes there is a great deal of noise to sort through to find the useful pieces. Another key issue is timeliness: Burrell noted that it will not be workable for the government to spend 6 months hacking into a system if the information is needed in a matter of hours.
Landau pointed to the USA Freedom Act (P.L. 114-23), which caused the NSA to shift from collecting metadata to requesting metadata from individual communications firms. Burrell replied that the FBI does not have the same ability as the NSA to manage big data because it does not have the same business need for that capability. As a result, the company that holds the data or processes the communication would need to engineer special tools to transfer data to the FBI in a way that satisfies the FBI’s legal requirements. He agreed with Landau’s suggestion that requiring companies holding data to be able to provide them to law enforcement in a timely fashion would be an improvement over the current situation.
Littlehale added that law enforcement must know where data exist in order to request the data. If officers cannot get into a device, it would be impossible for them to find out if there is perhaps an unencrypted app or communications channel from which they could request data from the service provider. If the police were to try to get around this by prospectively sending a legal request for information, for example, to Google, they would be accused of fishing, he said. Pressing on this point, Landau asked whether law enforcement might make a request under 18 U.S. Code § 2703 (required disclosure of customer communications or records) to find out which communications came from the device. Littlehale said there are some technical challenges with such a request, some of which might be addressed by IPv6. Although it is possible in some cases to attribute activity to a particular device, in others it is not.
A related challenge, Littlehale said, is that law enforcement does not always receive instant responses from service providers. One potential avenue to explore is whether it would be possible to speed up these responses through automation. Another option for law enforcement is to open better communication channels with service providers to help its investigators understand which categories of data are available that they might not otherwise know about.
Even in the absence of any actual exceptional access mandate or policy, Landau noted that the government’s posture toward end-to-end encryption could be affecting the technological landscape and wondered if it would be possible to work toward a different “default.” For example, it is possible that government advocacy for exceptional access actually pushes the market more in the direction of user-controlled encryption.
Donner pointed out that the underlying driver stems in part from the fact that the Internet is constructed out of private components and open protocols—infrastructure the government does not control—and as such, the government is limited in its ability to protect citizens from bad actors operating online. If the government cannot adequately protect the network, then consumers have no choice but to protect themselves, and that in turn drives the trend toward end-to-end encryption.
Building on this point, Sherman pointed to the disclosures made by Edward Snowden as instrumental in raising interest in user-controlled encryption. However, given that government is asking for what it perceives as necessary to do its job, it is difficult to come up with anything the government could do to help the situation. Perhaps, he suggested, what is most needed is a more open and honest discussion between the government and its citizens. Littlehale concurred, noting that it would be useful to establish a less adversarial way of communicating about the issues. He reiterated the concern that a crisis situation can lead to hastily drawn-up legislation that does not necessarily achieve any of the desired goals.