Suggested Citation:"1 Context." National Academies of Sciences, Engineering, and Medicine. 2017. Cryptographic Agility and Interoperability: Proceedings of a Workshop. Washington, DC: The National Academies Press. doi: 10.17226/24636.

1

Context

The first two speakers, Bob Blakley and Paul Kocher, addressed contexts for cryptographic agility, including the likelihood that cryptographic systems fail as well as the potential sources, causes, and consequences of such failure. Kocher also described lessons learned from real-world experiences with cryptography.

CRYPTOGRAPHY: IF AND WHEN IT BREAKS

Bob Blakley, Citigroup, Inc.

Bob Blakley, global head of information security innovation at Citigroup, Inc., emphasized that cryptographic agility is a significant information security problem that needs to be tackled. Noting that most cryptographic systems have eventually failed in use, he described deployed cryptographic systems as prime examples of technologies that are “almost always ultimately doomed to fail.” He pointed out that only one cryptographic system, the Data Encryption Standard (DES), was considered as strong at the time of its retirement as it was on the day of its release.

Blakley noted that a clever mathematician could, in principle, crack any cryptographic system (with the exception of a one-time pad) and compromise significant portions of the current cryptographic infrastructure. In addition to being vulnerable to breakthroughs in traditional cryptanalysis, many public-key cryptography systems would also be vulnerable to attack by a quantum computer.

Exploring this issue further, he discussed why vulnerabilities in cryptographic systems are a significant potential challenge. Most of the time, attacks against the underlying cryptographic components of cryptographic systems result in immediate vulnerabilities—mathematical problems that weaken the system and can only be fixed by replacing the entire cryptographic system. Blakley estimated that completely replacing a widely deployed system would take at least 3 years—assuming that there is already a well-tested replacement system ready for deployment. The timeline would expand to 10 years or more, he said, if a new cryptographic approach had to be developed from scratch starting at the time of the discovery of the vulnerability in the old system. As a result, he said, “It is important to look a long way ahead if you begin to suspect that there might be a problem.” Although attacks occur suddenly, the necessary repairs can take years.

Replacing a system completely takes a long time because it requires an enormous amount of work. Blakley described some of the necessary steps: A new cipher would have to be invented, put through a battery of tests, and standardized for wide use. Then it would have to be implemented in a variety of cryptographic libraries; hardware would need to be updated to support it; and then it would have to be deployed in actual products, purchased, and put into use. Meanwhile, the old, broken systems would have to be retired so no vulnerabilities would be left behind.

Unfortunately, no one can predict when a cryptographic system will break, but given the years it would take to roll out new cryptographic systems, the need for improved agility is apparent. The federal government has begun thinking about this challenge, in particular the potential need for quantum-resistant cryptography. Blakley noted that the National Security Agency recently issued a document encouraging developers to plan for quantum-safe cryptographic systems.1 The National Institute of Standards and Technology also recently announced an effort to select and standardize quantum-safe cryptographic algorithms.2

Blakley said experts put the timeline for developing new cryptographic systems, including standardization and deployment, at roughly 10 years. However, the pace is clearly influenced by perceived urgency, which varies depending on how far off one thinks quantum computing is. Blakley pointed out that smaller quantum computers already exist, comprising on the order of 5 qubits, but he suggested the timing of the development of a quantum computer capable of breaking something like RSA 2048, an extremely strong encryption system, “is a good deal more uncertain.”

___________________

1 Fuzzy, “NSA Switches to Quantum-Resistant Cryptography,” February 8, 2016, https://www.deepdotweb.com/2016/02/08/nsa-switches-to-quantum-resistant-cryptography/.

2 National Institute of Standards and Technology, “NIST Kicks Off Effort to Defend Encrypted Data from Quantum Computer Threat,” last update September 21, 2016, http://www.nist.gov/itl/csd/nist-kicks-off-effort-todefend-encrypted-data-from-quantum-computer-threat.cfm.

It is also conceivable that a quantum computer containing more than a few qubits will turn out to be too challenging to develop. Blakley pointed to a recently published comparison of optimistic and pessimistic hypotheses in which the author, a mathematician, concludes that quantum computing on a practical scale is unlikely to be possible for physical reasons.3 Industry giants Microsoft, Google, and IBM, on the other hand, are optimistic; all are working on building a quantum computer, and they predict success within 4 to 10 years. Blakley noted Google’s claim that a company called D-Wave has built a special-purpose quantum computer, although it is unclear whether its speed comes from quantum mechanics or just extremely fast digital computing.

What is at stake if current cryptographic systems are rendered ineffective?

As a systems security expert, Blakley said he is less focused on the potential benefits of quantum computing and more concerned about what it could break. It is a common belief that widely deployed public-key cryptographic systems would be “rendered totally useless” in the context of quantum computing, he said. Symmetric-key cryptography (which is based on shared secrets) would also be vulnerable to the incredible speed with which quantum computers could break encryption codes designed to resist slower traditional computers. Blakley observed that this vulnerability could be mitigated by increasing symmetric-key cryptosystem key lengths by a factor of at least two; cryptosystem designers now seem to consider such key expansion more feasible than it once was.
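The arithmetic behind this mitigation can be sketched briefly. The quantum speedup usually invoked in this context is Grover's algorithm (an assumption on our part; the speakers did not name a specific attack), which searches an n-bit keyspace in roughly 2^(n/2) steps instead of 2^n:

```python
# Illustrative sketch, not from the workshop: Grover's algorithm (assumed here
# as the relevant quantum attack) searches an n-bit keyspace in about 2^(n/2)
# steps, effectively halving a symmetric key's security level.
def effective_security_bits(key_bits: int, quantum: bool = False) -> int:
    """Classical brute force costs ~2^key_bits; a Grover-style attack ~2^(key_bits/2)."""
    return key_bits // 2 if quantum else key_bits

# Doubling the key length restores the original security margin:
assert effective_security_bits(128, quantum=True) == 64    # 128-bit key vs. quantum attacker
assert effective_security_bits(256, quantum=True) == 128   # 256-bit key vs. quantum attacker
```

This is why doubling key lengths, rather than abandoning symmetric cryptography, is considered a sufficient hedge against the quantum threat Blakley describes.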

What is at stake if current cryptographic systems are rendered ineffective? Public-key systems, server authentication infrastructure, personal computer updates, digital signature systems, and, most worrisome to Blakley, our system integrity infrastructure (i.e., code signing) would all be compromised. All of these systems, which play crucial roles in global finance, personal privacy, and national security, depend on the integrity of public-key cryptographic systems.

More concretely, banking and e-commerce protocols are among the software systems that rely heavily on public-key distribution systems and, like other public key–based systems, would be at risk if current cryptographic systems were broken by a quantum computer. Given the level of disruption, and the fact that trillions of dollars are transacted using banking and e-commerce protocols protected by public-key cryptography, Blakley surmised that “it does not seem farfetched that an economically rational attacker would be willing to spend a billion dollars to build a quantum computer.”

___________________

3 G. Kalai, 2016, “The Quantum Computer Puzzle,” Notices of the American Mathematical Society 63(5): 508-516.

Even if quantum computers turn out to be impossible to build, Blakley asserted that it is plausible to believe that there will be future “mathematical land mines” that could weaken public-key cryptographic systems without any help from a quantum computer.

What is the solution if current cryptographic systems break? Whatever it is, Blakley said, waiting 10 years is too long for comfort. A new cryptographic system should be deployment ready and easy to implement and interoperate with—which, Blakley reiterated, is a monumental task.

Blakley closed by underscoring the risks and consequences of cryptographic failure. Noting that in the original Greek, the word “apocalypse” meant “the day that all secrets will be revealed,” he said the end of today’s cryptographic systems would be a literal apocalypse—an “instantaneous, simultaneous global failure” of all information security. He emphasized that it is very difficult to calculate the risk of such an event. Widespread simultaneous cryptographic failure is a “black swan” problem: Because it has never happened, there is no way to calculate when it might happen. Blakley also likened it to an asteroid hitting Earth: While it may not be very likely to happen, that does not mean that one should not plan for it. Finally, Blakley summarized the message of the Sherlock Holmes story “The Adventure of the Dancing Men”: “What the mind of one man can invent, another can discover.” Because people can conceive of quantum computing and other forms of code-breaking, he said, “I think it would pay for us to be worried about the discovery of something that would vitiate all of our public-key infrastructure and to start planning for that.” He concluded by expressing his hope that, by the year 2020, mechanisms would be developed to more readily allow changes in cryptographic infrastructure as circumstances warrant.

Secure communication involves the math, the code, and the trust we put in institutions.

Peter Swire, Georgia Institute of Technology, opened the discussion by asking Blakley to speculate on the probability of a significant break of widely deployed cryptography. Reiterating his belief that it is impossible to quantify the risk, Blakley said he prefers to plan for the worst-case scenario, which in his view means assuming that a quantum computer will be operational by the end of 2020. Fred Schneider, Cornell University, commented that whether or not quantum computers become a reality, cryptographic systems will still break, and cryptographic agility will still be required.

Anita Allen, University of Pennsylvania Law School, questioned whether it might be prudent to more precisely define the term “cryptographic systems.” Blakley suggested a need for a deeper discussion of this topic but noted his belief that it would be helpful to expand beyond cryptographic systems as formally defined when thinking about secure communication. Butler Lampson, Microsoft Corporation, pointed to three levels involved in secure communication: the math, the code, and the trust we put in institutions to authenticate our communication, each of which is subject to compromise. These three levels and their differences create confusion in the cryptography discussion, he said.

Lynette Millett, National Academies, noted that although the well-known Y2K bug once seemed like a huge problem requiring significant engineering effort, it was successfully resolved in time to avoid major issues. She asked whether there are lessons for policy makers and engineers that could be drawn from that experience and how the effort required to solve that comparatively simple problem compares to what would be needed for effective cryptographic agility. Blakley noted that companies like IBM began working on the Y2K issue in the early 1980s, suggesting that much time and work were required to fix it.

Paul Kocher, Cryptography Research Division, Rambus, Inc., brought up another ramification of quantum computing or the discovery of another significant weakness: its potential ability to retroactively cause cryptographic breaks and decrypt any encrypted material that has been saved. That is, if data have been encrypted and stored under a particular cryptographic system, and that system is broken, all of those data are now vulnerable to exposure. Some data, such as that pertaining to national security and the identity of spies, require long-term protection. The potential vulnerability of this information, he noted, increases the urgency of making cryptographic systems quantum safe.

LESSONS LEARNED FROM REAL-WORLD CRYPTOGRAPHY

Paul Kocher, Cryptography Research Division, Rambus, Inc.

Paul Kocher is the president and chief scientist at Rambus’ Cryptography Research Division. His presentation focused on the questions, challenges, and lessons learned from his time designing systems for use in the field.

Kocher cautioned, however, that while creating and implementing agility mechanisms brings benefits, it can also open up new risks and cause other complications. He expressed mixed feelings about the trade-offs of developing agile cryptographic systems, noting that his experience has left him “very cynical, but also optimistic.” Beyond the threat of quantum computing discussed by Blakley, Kocher emphasized that there are many other reasons to want security mechanisms to be agile. In addition to technical aspects, he noted that agility also has legal, moral, and policy implications, and powerful economic forces can influence whether agility mechanisms succeed or fail.

Kocher shared three cryptography case studies from his work to illustrate both positive and negative outcomes: improving the Secure Sockets Layer/Transport Layer Security (SSL/TLS) protocol, combating piracy of pay television, and securing copyrighted material on Blu-ray discs.

The Secure Sockets Layer/Transport Layer Security Protocol

In 1996, Kocher was asked by Netscape to help fix the poorly designed SSLv2 protocol, which urgently needed to be upgraded or replaced. The task was enormous and full of uncertainties: What were the technical requirements? How quickly could a replacement be prepared? Could the new solution support the current cryptographic algorithms (known as “backward compatibility”)?

Kocher described how it quickly became clear that there would be no “Goldilocks” solution. Each algorithm under consideration worried different parties. Netscape had signed an agreement with the Department of Defense that required the use of FORTEZZA, a cryptographic system specially created for classified information but with a key escrow mechanism that would be unacceptable for commercial use. Triple DES was slow and had problems with its modes of operation. Rivest Cipher 4 (RC4) was in widespread use at the time but had not been adequately reviewed. The RSA public-key cryptosystem was constrained by patents. In short, there was no single “good and reliable” algorithm to which he could turn.

Ultimately, the team was unable to select any single combination of algorithms that functioned as well as it wanted; the problem was just too large for its time. RSA keys were too small to be effective over the long term, and only newer algorithms, which Kocher saw on the horizon but which had not yet been created, could provide something more reliable and durable.

Kocher also noted that legal export restrictions further compounded the challenge. U.S. export restrictions at that time required that “export” versions of browsers and servers support only short, insecure key sizes. At the same time, customers in areas like banking and e-commerce needed strong security. Complicating matters, non-export servers needed to be able to fall back to weak cryptography when communicating with exportable browsers, and non-export browsers needed to be able to fall back to weak cryptography when communicating with exportable servers—but these mechanisms needed to prevent attackers from tricking non-export browsers into using breakable cryptography with non-export servers.


While cryptographic agility mechanisms needed to be created to address the protocol’s needs, Kocher was unable to find any applicable agility research to guide his team. Rather, he discovered that SSLv2 and other existing protocols typically could be tricked into using the security of the worst supported configurations. This was one of the reasons he decided to work on SSLv2’s replacement, SSLv3, essentially from scratch.

First, Kocher said, the team decided not to design the protocol to negotiate every setting separately. With all the different combinations and possible interactions among encryption algorithms, modes of operations, authentication algorithms, and key sizes, some were bound to combine negatively. Compatibility problems also had to be addressed, such as cases where some hardware might support only certain key sizes for a particular algorithm.

The SSLv3 cipher suite approach is designed to address compatibility. Each cipher suite specifies a permitted configuration, typically including an encryption algorithm (and its mode of operation), an authentication algorithm (e.g., message authentication code [MAC]), and the associated key size. A cipher suite negotiation mechanism was also defined so that a client and server would select the strongest cipher suite they have in common. However, if the only algorithms a client and server have in common are insecure ones (e.g., because one or both allowed only weak exportable cryptography), then the “best” option could be very weak—or have no encryption at all. (SSLv3 defines cipher suites that include integrity checks but no encryption.)
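The selection logic described above can be sketched in a few lines. The suite names and strength rankings below are hypothetical simplifications, not the real SSLv3 identifiers or wire format:

```python
# Hypothetical cipher-suite table; a higher number means a stronger configuration.
STRENGTH = {
    "AES256-SHA": 3,       # strong encryption plus MAC
    "AES128-SHA": 2,
    "EXPORT-RC4-40": 1,    # weak "exportable" encryption
    "NULL-SHA": 0,         # integrity check only, no encryption
}

def select_suite(client_suites, server_suites):
    """Pick the strongest suite both sides support, as in the SSLv3 approach."""
    common = [s for s in client_suites if s in server_suites]
    if not common:
        raise ValueError("handshake failure: no cipher suite in common")
    return max(common, key=STRENGTH.__getitem__)

# Two capable peers end up with the strongest shared option...
assert select_suite(["AES128-SHA", "AES256-SHA"],
                    {"AES256-SHA", "AES128-SHA", "NULL-SHA"}) == "AES256-SHA"
# ...but if the only overlap is weak, the "best" choice is still weak.
assert select_suite(["EXPORT-RC4-40", "NULL-SHA"],
                    {"AES256-SHA", "EXPORT-RC4-40"}) == "EXPORT-RC4-40"
```

The second assertion illustrates the caveat in the text: negotiation maximizes over whatever both sides allow, so if the shared set contains only weak options, the outcome is weak no matter how the mechanism behaves.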

Backward compatibility is a double-edged sword.

In the actual “handshake” process used in SSLv3, the client sends the server a list of all cipher suites it will accept in the order of the client’s preference, and the server chooses one from the client’s given list. If the messages exchanged between the client and server are correct (and unmodified by an attacker), the protocol clearly works as intended. (The SSL/TLS protocol does not attempt to protect against badly behaved clients or servers making bad choices.) An active man-in-the-middle attacker could modify the client’s list and/or the server’s selection, tricking them into agreeing on a mutually supported option that is weaker than the one they would otherwise choose. To deal with this case, the parties include the negotiation messages in computing MACs, thwarting the attack. With this negotiation approach, Kocher said, “If things are done right, you end up with the strongest combination that they both support, and it works pretty well.” On the other hand, Kocher pointed out that if either the server or the client is not completely secure, the entire connection will be insecure, regardless of the strength of one side’s cryptography.
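The downgrade defense can be illustrated with a toy transcript check. The key derivation and message encoding here are invented for illustration and are much simpler than the real SSLv3 Finished computation:

```python
import hashlib
import hmac

def finished_mac(master_secret: bytes, transcript: bytes) -> bytes:
    """Toy stand-in for the Finished message: MAC the entire handshake transcript."""
    return hmac.new(master_secret, transcript, hashlib.sha256).digest()

secret = b"negotiated-master-secret"  # hypothetical shared secret for illustration

# Honest handshake: both sides saw the same messages, so the MACs match.
transcript = b"ClientHello:[AES256-SHA,AES128-SHA]|ServerHello:AES256-SHA"
assert finished_mac(secret, transcript) == finished_mac(secret, transcript)

# Downgrade attempt: an active attacker rewrote the client's offer in transit,
# so the two parties hold different transcripts and the MACs disagree.
client_view = b"ClientHello:[AES256-SHA,AES128-SHA]|ServerHello:EXPORT-RC4-40"
server_view = b"ClientHello:[EXPORT-RC4-40]|ServerHello:EXPORT-RC4-40"
assert finished_mac(secret, client_view) != finished_mac(secret, server_view)
```

Because each side MACs its own view of the negotiation, any tampering with the offered list or the selection produces mismatched MACs, and the handshake is aborted rather than completed with weakened parameters.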

Despite these advantages of SSLv3, Kocher noted some caveats. The negotiation works for symmetric (non-public-key) algorithms, but if the client or server supports weaker public keys, those can be forced into use by an adversary. Here, the public-key algorithms and key sizes are each a point of failure rather than a strong protector. An insecure connection can also result if the negotiated cipher suite is not sufficiently secure. Unfortunately, the ultimate decision about which public-key algorithms (and their key sizes) to support, as well as the list of cipher suites to support (and their rankings), is typically made by an implementer or systems administrator. Experience demonstrates that they often make poor security choices. To correct for this problem, Kocher said, “Implementations need to be correct and secure.”

Kocher emphasized the enormous complexities he encountered trying to build from scratch this real-world protocol with many challenging security, interoperability, and performance requirements. The constant push and pull, pain, and messiness of various versions of the protocol led him to believe that in the end, “it may have still been too complex for our brains.” In computer security, we have often “vastly overestimated our ability to understand complexity,” he said. Although he conceded that the team made some design mistakes, the overall protocol has been successful and the issues have been correctable with relatively modest changes. He also noted that while academics can ignore practicalities, real-world applications cannot. Agility mechanisms introduce complexity, which leads to unknown consequences; Kocher expressed skepticism of any proof of security that fails to take into account the way a product or algorithm is deployed and behaves in the field.

Key Lessons from the Secure Sockets Layer Experience

Kocher outlined some lessons learned through this arduous process. First, the SSL/TLS cipher suites provide a form of agility. New suites can be added or adapted gracefully; for example, cipher suites that offer quantum resistance can be added. Once standard quantum-resistant cryptographic algorithms become available, he believes that defining and adding new cipher suites should be a straightforward process.

Unfortunately, the ability to negotiate cipher suites also has downsides. For example, the choice of which mix to support in the protocol is often made by committees, which can get mired in personal concerns or pet projects. As a result, committees can allow bad choices to “muddle along,” leading to bad decisions in the long term.

In Kocher’s experience, backward compatibility is also a double-edged sword. In part because SSLv3 included smooth backward compatibility with SSLv2, the SSLv2 protocol was not fully retired as expected. Instead, many implementers left SSLv2 enabled.


Just last year, two research projects demonstrated problems in SSLv2 backward compatibility mechanisms that are still present in current software versions. Kocher described being “just absolutely aghast” that pieces of SSLv2, the protocol he was tasked to completely replace, were still around 20 years later.

Another solution that perhaps worked too well was SSLv3’s near-universal adoption. Other protocols—Secure Electronic Transaction, Secure Hypertext Transfer Protocol, Microsoft’s Private Communications Technology, among others—have faded from use. Kocher is unsure whether this is a good thing, in that there is less fragmentation and more consistency, or a bad thing, because other protocols could be better suited to particular use cases.

Another lesson stems from the idea that certificate authorities whose public keys are embedded in a web browser, for example, must all be secure for the ecosystem to be strong. When Kocher was working on SSLv3, VeriSign was the primary certificate authority, and he did not expect there to be more than a few. Now, there are hundreds because economic and political pressures led browser makers to include many certificate authorities. The economic pressures of having so many certificate authorities meant that they had little budget for diligence when issuing certificates. “There was no incentive to do a good job . . . Everything economically that could go wrong did,” Kocher said, adding that in some cases certificate authorities found their only profitable option was to issue unauthorized certificates to attackers wishing to impersonate legitimate websites. This story demonstrates how the success of cryptographic agility depends not only on sophisticated, “interesting math” (which tends to get more attention), but also on a mix of politics, economics, and policy.

A final lesson is the importance of implementation. To put it plainly, Kocher stated, “Implementation mistakes are the number one enemy and have been the absolutely overwhelming source of problems.” When Kocher reviews a new security method, he looks for common errors that could occur during implementation and be used to bypass the security, such as bugs in Universal Serial Bus (USB) driver software and certificate parsing routines. He admitted to regret in using the X.509 certificate format in SSLv3. A cleaner certificate format could have avoided implementation mistakes but would have made SSLv3 adoption more difficult since X.509 was already in widespread use.

Piracy in Pay Television

Kocher described his work on a project to prevent piracy in pay television, starting with a brief history of the evolution of pay television technology. Early plaintext feeds were easily hacked by hobbyists. The next format, television set-top boxes, had a relatively secure single DES encryption system, but the hardware and key management were “terrible,” Kocher said. The United States’ legal responses eventually just drove piracy overseas, he noted, instead of addressing the technical holes that made piracy possible.

The next innovation was a removable security chip, present in many of today’s set-top boxes, which creates a new key for every 30 seconds or so of video coming onto the television screen. However, if something is broken in the key-creation process (e.g., the chip is hacked), the only remedy is to mail the user a replacement card, which is expensive and logistically difficult. Kocher said his team has been deploying approaches that put agility mechanisms and tamper-resistant hardware in the video processing chips in the set-top box itself. It is an approach Kocher believes is working well, though he noted that its impending broader deployment will increase pressure on the new system and likely increase attention to other security risks that will need to be addressed such as recording video from displays.
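The per-interval keying Kocher describes can be sketched roughly as follows; the derivation scheme shown is an invented simplification, since the talk did not describe the actual mechanism:

```python
import hashlib

SEGMENT_SECONDS = 30  # roughly the rotation interval described in the talk

def segment_key(master_secret: bytes, segment_index: int) -> bytes:
    """Derive a fresh key for each ~30-second segment of video (illustrative only)."""
    return hashlib.sha256(master_secret + segment_index.to_bytes(8, "big")).digest()

# Each segment gets a distinct key, so compromising one key exposes only about
# 30 seconds of video rather than the whole stream.
keys = [segment_key(b"master-secret", i) for i in range(120)]  # one hour of video
assert len(set(keys)) == 120
```

The design intent is containment: frequent rekeying limits the value of any single recovered key, which is why breaking the key-creation process itself (e.g., hacking the chip) is so damaging and forces the expensive card-replacement response described above.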

Kocher said a key lesson from the television piracy experience is that criminals are agile, too, and that cryptography theory cannot predict all of the practical problems that will be encountered in real-world use. Pirates were able to hack the cryptographic systems in place and to continue accessing video despite the replacement-card system. For example, replacement cards often had to be readily available for consumers whose cards had been hacked, and replacement cards made without knowing the “hack” typically had the same vulnerabilities as the ones they were replacing. Because of issues like this, Kocher and his team focused heavily on preventive methods instead of relying solely on agility methods. He called the preventive approach a technological success, though it made existing vendors unhappy because they had made a profit from replacement cards. “The idea that you might have a system that does not need replacing on their planned revenue schedule was extremely threatening for them,” Kocher said, offering another example of economic interests gaining the upper hand in security decisions.

It also turned out that the pay television operators were subsidizing the cost of the set-top boxes and, unwittingly, making it easy for piracy to continue. A box including the security card cost about $200 to make but was being sold to customers for about $50. In this situation, Kocher claimed, the pirate could buy the box with the included card, “throw the box away, hack the card, and sell it,” making a profit. Kocher said this economic problem underscores the point that agility architectures can create risks that require non-agility responses. The card’s vulnerability to pirates has led some designers to eliminate the cards (and card slots) altogether.

Piracy of Content on Optical Discs

Kocher then described his experience with preventing piracy of content sold on optical discs. The original digital versatile disc (DVD) format and players each had some cryptography, but neither had any agility to deal with their inevitable vulnerabilities. Pirates were able to hack the players in a way that could not be fixed, leaving only unappealing choices: allow all copyrighted content issued on DVD to be pirated or issue a massive recall of DVD players, an option Kocher described as “trying to control shoplifting by using nuclear weapons.” Studios and DVD manufacturers never did “push the button,” and never could have, because “there was so much politics involved that nobody could actually decide what to do.” The result, he said, “was security that failed in some pretty catastrophic ways.”

In this environment, Kocher’s team focused on securing the Blu-ray format and created a system where pieces of security software could be delivered on the disc itself. This approach was a hybrid of copy protection schemes from the 1980s and modern cryptography that, when a disc was inserted, required players to run a specific set of cryptographic algorithms in order for the disc to play. This system meant that it was possible for new discs to bring new security code to mitigate attacks that had appeared.

Where strong and simple is possible, it should be pursued.

In addition to helping address vulnerabilities in players, the code on the disc can also embed tiny changes in the video stream that can be analyzed forensically on pirated copies. This information can help pinpoint a pirate’s methods and craft a suitable response to them.
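A toy version of this forensic-marking idea: encode an identifier by choosing between near-identical variants of each scene, then read the bits back out of a pirated copy. The variant mechanism and names are hypothetical, invented here for illustration:

```python
def embed(variants, ident_bits):
    """variants[i] is a (version0, version1) pair for scene i; pick one per identifier bit."""
    return [pair[bit] for pair, bit in zip(variants, ident_bits)]

def recover(marked_scenes, variants):
    """Recover the embedded identifier bits from a (possibly pirated) copy."""
    return [pair.index(scene) for scene, pair in zip(marked_scenes, variants)]

# Three scenes, each with two imperceptibly different versions:
variants = [("scene0_a", "scene0_b"), ("scene1_a", "scene1_b"), ("scene2_a", "scene2_b")]
ident = [1, 0, 1]  # hypothetical player/disc identifier

copy = embed(variants, ident)
assert recover(copy, variants) == ident  # the leak's source can be traced
```

Because the identifier survives in the video itself, a pirated copy reveals which player or disc produced it, which is what lets investigators "pinpoint a pirate's methods and craft a suitable response."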

Despite these advantages, Blu-ray players had many vulnerabilities. As a result, the renewability system became what Kocher described as “a whack-a-mole system.” In addition, the business model of pirates changed. With DVD, once a free circumvention tool was released, it never needed to be changed and pirates had limited revenue opportunities. With Blu-ray, piracy tools needed frequent updates, so hackers (usually located out of the reach of U.S. law enforcement) adopted a service model that provided them with recurring revenue.

When discs with a new security method were released, it typically took time for attackers to discover and exploit it, but they eventually did. Hollywood, Kocher said, viewed this as beneficial nonetheless because its business model favors the brand-new movie over the months-old one. However, this security timeline is poorly suited to paid software or other data meant to be used over time, and it has serious implications for long-term classified data. Kocher concluded by pointing out, “You cannot un-steal the information that is leaked.”

Agility Versus “Strong and Simple”

William Sanders, University of Illinois, Urbana-Champaign, opened the discussion by framing the issue as a tension between agility, which necessarily adds some complexity, and “strong and simple” security solutions. He questioned whether a strong and simple approach is even possible in many contexts. Kocher said that a simple approach is possible and beneficial, but only within a narrow set of problems, such as cryptographic algorithms themselves. Kocher also suggested looking to older systems for ideas. For example, an old personal computer with a floppy disk drive could easily be reverted to a known (presumably good) configuration simply by turning the machine off and on again. Creating a similar property in some of today’s systems should be achievable, he said.

However, Kocher suggested that a strong and simple method is likely neither appropriate nor achievable for complex systems of distributed databases, such as those involved in the Office of Personnel Management breach. That said, Kocher emphasized that where strong and simple is possible, it should be pursued. While the Blu-ray solution may seem somewhat complicated, he noted that it required thousands, but not tens of thousands, of lines of code, making it relatively strong and simple. But the economics and use cases of Blu-ray players made it impossible to build a system that would be truly impenetrable.

While there is a tension between the two approaches, Kocher said both are necessary. Prevention is obviously preferred, because a theft cannot be fully undone, but one also has to plan for failure. He suggested that equal attention be paid to preventive measures and agility, as opposed to the situation he sees today, where most of the attention (and resources) is spent on repairs after security breaches.

In May 2016, the National Academies of Sciences, Engineering, and Medicine hosted a workshop on Cryptographic Agility and Interoperability. Speakers at the workshop discussed the history and practice of cryptography, its current challenges, and its future possibilities. This publication summarizes the presentations and discussions from the workshop.