The workshop concluded with a period of open discussion, giving speakers and participants a chance to emphasize what they considered the most essential issues and revisit concerns raised earlier in the day. This chapter, organized into thematic areas, describes the content of the final discussion, highlighting some of the broader themes that emerged throughout the workshop.
Opening the discussion, Paul Kocher, Cryptography Research Division, Rambus, Inc., said that the day’s presentations had “turned what I thought was a hard problem into a really hard problem.” Agility is necessary, but imperfect, he summarized. What can we do to advance agility in a way that does not cause as many problems as it solves?
Although we can never predict exactly what the future will hold, two main themes emerged throughout the workshop: (1) the potential threat of quantum computing and (2) the near certainty that vulnerabilities will eventually be discovered in any security system, even without quantum computing.
Kocher expanded on the difficulties of devising strategies to address as-yet-unknown problems. Some challenges, such as specific ciphers failing, are easier to solve, while others, such as quantum computing, inevitable software bugs, or the need to change root keys managed by someone else, are much more difficult. The “unknown unknowns” present particular challenges: What could happen? What is the probability that it will happen?
Sara Brody, Simply Secure, noted that because certain types of vulnerabilities will inevitably keep occurring, a network for sharing information and resources when a bug or other problem is noticed would be useful. Bob Blakley, CitiGroup, Inc., pointed out that some sectors, such as financial services, have created Information Sharing and Analysis Centers to do just that.
Participants explored the implications of a catastrophic security failure—such as one caused by quantum computing—and what might be done now to prevent or prepare for such an event. Peter Swire, Georgia Institute of Technology, suggested that institutions will be critical to forging a path forward during and after such an event, and he posited that addressing a disaster will require not only technical solutions and expertise but also communication with society at large and the identification of critical leaders across sectors.
Blakley brought up a common practice in the corporate security world: using tabletop or paper exercises to test out new ideas or predict their consequences. Perhaps this approach could be applied to help organizations envision the problems they may face by simulating a scenario of cryptographic systems breaking. Anita Allen, University of Pennsylvania, agreed that tabletop exercises would be useful in designing for transition and, even more importantly, in designing for disaster. In the context of disaster, she added that it is also important to plan responses to breaches or damages that threaten cultural heritage. Butler Lampson, Microsoft Corporation, suggested that live exercises, though more expensive than tabletop exercises, can be more illustrative. Several participants noted that organizations often do both tabletop and live exercises, learning different lessons from each.
Participants explored technical approaches that could help mitigate agility challenges and improve the prospects of future agility.
Kocher raised a fundamental tension: because we do not know what future protocols and implementations will look like, we cannot build or test them now. Compounding this is the extremely fast pace at which companies develop and deploy products today, leaving little time for thorough testing. Identifying testing as an area with room for improvement, he suggested that perhaps, as part of ensuring compliance with standards, designers could use standard test suites or place more emphasis on verifying that implementations interoperate with other systems.
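The standard test suites Kocher mentions typically take the form of known-answer tests (KATs), in which an implementation is run against published input/output vectors. A minimal sketch in Python, using the well-known published SHA-256 vectors for the empty string and “abc” (the harness itself, `run_kat`, is illustrative, not drawn from any particular suite):

```python
import hashlib

# Known-answer test vectors for SHA-256 (standard published values).
KAT_VECTORS = [
    (b"", "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"),
    (b"abc", "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"),
]

def run_kat(digest_fn, vectors):
    """Return a list of (input, passed) results, one per known-answer vector."""
    results = []
    for message, expected_hex in vectors:
        actual = digest_fn(message).hexdigest()
        results.append((message, actual == expected_hex))
    return results

# An implementation that diverges from the published vectors on any input
# fails the suite, regardless of how it behaves elsewhere.
assert all(passed for _, passed in run_kat(hashlib.sha256, KAT_VECTORS))
```

The same structure scales to ciphers and signature schemes; as Kocher notes below, the harder gap is that no comparable shared suites exist for whole protocols.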
Later, Swire suggested that all of the interactions within today’s dense networks create issues that cannot be identified with tests that focus on updating one system interacting at one level. Given that it is so difficult to test an individual system (and get results that are reflective of the real-world context), he asked if a new approach is warranted.
Kocher pointed out that good test suites are available for ciphers but not for protocols. Our current disaster-planning mechanisms rely on these protocols, he said, yet there is no ready way to confirm that they will work when they are needed. Protocols are not inherently hard to test; the problem is that companies have no incentive to test them. Perhaps one solution could be to build a shared protocol testing center that would benefit the entire ecosystem of electronic communications: systems, products, and users. John Manferdelli, Google, Inc., suggested the notion of a “safe harbor” in which demonstrating some specific level of upgradeability and transparency would be considered due diligence. If crafted well, such an approach could help bound the expectations for testing and upgrades in ways useful to designers and manufacturers.
Migration—moving existing systems from one cryptographic system or approach to another—is an additional key issue. Kocher identified four separate elements of migration: designing better cryptography, implementing cryptography for maximum interoperability, turning off old cryptography, and fully removing the old cryptography. Each of these elements presents unique challenges.
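Two of the elements Kocher names, turning off old cryptography and fully removing it, are often handled as lifecycle states in an algorithm registry. The sketch below is illustrative only; the algorithm names and states are assumptions, not drawn from any particular standard:

```python
from enum import Enum

class AlgState(Enum):
    ACTIVE = "active"          # preferred for new sessions
    DEPRECATED = "deprecated"  # accepted for compatibility, no longer offered
    DISABLED = "disabled"      # code still present, but refused at runtime
    # Fully removed algorithms simply no longer appear in the registry.

# Hypothetical registry mapping algorithm name -> lifecycle state.
REGISTRY = {
    "aes-256-gcm": AlgState.ACTIVE,
    "3des-cbc": AlgState.DEPRECATED,
    "rc4": AlgState.DISABLED,
}

def may_negotiate(name: str) -> bool:
    """True if the algorithm may still be used when a peer proposes it."""
    state = REGISTRY.get(name)  # None for unknown or removed algorithms
    return state in (AlgState.ACTIVE, AlgState.DEPRECATED)
```

The hardest element Kocher identifies, fully removing old cryptography, corresponds to deleting the implementation itself rather than flipping a registry state; a system that only ever reaches “disabled” still carries the old code as attack surface.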
Blakley used a medical metaphor to introduce what he called a radical proposal: while liver transplants are possible because the liver is a discrete organ, spinal transplants are not because the spine is so interconnected with the rest of the body. He wondered if cryptography could be made more like a discrete organ by constraining it in ways that would allow engineers to separate cryptographic agility from systems updates. Kocher pointed out that hardware would also need to be addressed. Manferdelli noted that as a designer, he was sympathetic to the goal of streamlining systems, but he expressed concern about the potential for this approach to sacrifice aspects of the cryptography itself and perhaps undermine security.
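Blakley’s “discrete organ” idea resembles the pluggable-provider pattern used in some cryptographic libraries: application code depends only on an abstract interface, so the cryptographic module can be replaced without a full systems update. A minimal Python sketch, where the `Authenticator` interface and class names are hypothetical:

```python
from typing import Protocol
import hashlib
import hmac

class Authenticator(Protocol):
    """The narrow seam between the application and its cryptography."""
    def tag(self, key: bytes, message: bytes) -> bytes: ...

class HmacSha256:
    """One concrete provider; a successor algorithm would be a drop-in peer."""
    def tag(self, key: bytes, message: bytes) -> bytes:
        return hmac.new(key, message, hashlib.sha256).digest()

def send(auth: Authenticator, key: bytes, message: bytes) -> tuple[bytes, bytes]:
    # The application never names an algorithm; replacing HmacSha256 with a
    # successor provider touches only the provider, not this code.
    return message, auth.tag(key, message)
```

Under this pattern, migrating to a new algorithm means adding a provider class and changing which one is constructed, a far smaller change than a systems upgrade. Kocher’s caveat still applies: hardware bound to a specific algorithm cannot be swapped this way.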
Participants noted that agility requires making tough choices, and those with the greatest expertise should be helping others make informed decisions. “When the experts cannot decide among themselves so you pass the difficult problem off to a non-expert, that is not really fixing the problem,” Kocher asserted. Transparency and governance are also areas in which experts should be making or more actively informing choices, he said.
Steven Bellovin, Columbia University, noted that all stakeholders, including researchers, standards bodies, engineers, and developers, have a role in solving agility problems, but their specific roles depend on which problem is being examined. Software updates are a vendor-specific problem, he suggested. Systems administration could benefit from more academic research, but thousands of details and consequences, as opposed to unifying principles, make this area “messy” and therefore not very popular among academic researchers. Standards bodies, he suggested, also need to increase their involvement. Pointing to examples in which things have gone wrong despite the involvement of standards bodies like the Internet Engineering Task Force, Bellovin asserted that standards bodies and academia need to feel a greater sense of responsibility or ownership for standards and upgrades. “Academia has to look at this even when it is not nice and neat and academic, and this does not happen,” he said.
With regard to a problem mentioned by nearly every speaker at the workshop—obsolescence and legacy cryptography—David Vladeck, Georgetown University, proposed looking for legal solutions, pointing to warranties as a potential analog from which to draw.
Participants noted that cryptography and agility are in fact global issues, and different nations have drastically different philosophies on privacy, security, and government access to data and communications. Swire noted the prospect of increased regulation by countries including China and Russia, though he expressed doubts that the United States would move very far in that direction because doing so would threaten its own technology industry. The more general point—that there will likely be significant variation among nations in approaches to cryptography and technology more broadly—is something he noted will increase the complexity of cryptographic agility approaches.
Several participants expressed concern about the impact of cryptographic agility on human rights. Deirdre Mulligan, University of California, Berkeley, said that some governments will restrict the use of encryption by citizens. Mulligan and Allen expanded on that point, explaining that encryption and anonymity are enablers of other human rights, including freedom of expression, freedom of speech, and limits on government surveillance. Mulligan pointed out that David Kaye, the United Nations’ Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, recently concluded that restrictions on cryptography as an enabler of freedom of expression must be consistent with human rights laws, provided for by law, imposed only for legitimate reasons, and compliant with strict standards. Kaye further concluded that any proposal to restrict encryption should be subject to public comment, follow legislative processes, and be subject to strong procedural and judicial safeguards. These same concerns for human rights should apply, Mulligan suggested, to decisions made not only by governments but also by companies like Google, Facebook, and Microsoft. “I think that there is a real need to be really sensitive to the role of agility and thinking about human rights,” Mulligan asserted. “I think it is a really important part of the conversation.” Building on this point, Allen suggested that rather than discussing anonymity or encryption as human rights themselves, it is perhaps more useful to focus on their role in enabling other human rights, including freedom of expression, freedom of speech, and limits on government surveillance.
Tadayoshi Kohno, University of Washington, added that user demographics should also be a part of the human rights discussion. All users should benefit equally from agility and upgrade mechanisms, not just those who can afford to buy (and rapidly replace) high-end devices.
Noting that failures are inevitable, Steven Lipner, independent consultant, underscored that everyone would stand to benefit from agility mechanisms to deal with vulnerabilities, whether they come from quantum computing, algorithm errors, or merely from the passage of time. One step forward, he suggested, is funding research into these areas. Another is for large organizations to devote more engineering expertise to prevent and prepare for failure.
William Sanders, University of Illinois, Urbana-Champaign, pointed out that the workshop discussions had touched on many aspects of agility—keys, algorithm replacement, full software upgrades, and more—and suggested that this wide-ranging field may benefit from a closer look at what, exactly, is desirable in agility. The answers to such questions could help inform an agility framework and determine a goal for “how agile” our systems should be.
Lampson suggested that an important aspect of the agility context has been overlooked: computing is increasingly service based. While updates to hardware or software matter, so do cloud-based services such as Microsoft’s Office 365 or Google’s Chrome. He expressed concern that many of the agility ideas being explored are “rooted in computing practices that are rapidly disappearing.” Building on this point, Mulligan noted that the Internet of Things, products without update channels, and other such issues also raise important questions that go beyond cryptographic agility itself. She referred back to Brody’s suggestion that examining the behaviors and motivations of developers and users could provide useful tools for addressing these broader issues.