
2

Government and Infrastructure

The second section of the workshop featured two speakers, Kerry McKay and Richard George, both with extensive backgrounds in cryptography for the U.S. government. They covered the government’s use of cryptography and its standards-setting process, the evolution of cryptographic technology, and current and future challenges.

HOW THE NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY THINKS ABOUT CRYPTOGRAPHY

Kerry McKay, National Institute of Standards and Technology

Kerry McKay is a computer scientist in the Computer Security Division at the National Institute of Standards and Technology (NIST). Her talk focused on the process NIST uses when creating cryptographic standards.

Cryptographic Agility and Interoperability

McKay opened by outlining three facets of cryptographic agility: (1) the ability for machines to select their security algorithms in real time, based on their combined security functions; (2) the ability to add new cryptographic features or algorithms to existing hardware or software, resulting in new, stronger security features; and (3) the ability to gracefully retire cryptographic systems that have become either vulnerable or obsolete. McKay defined “interoperability” as the ability to communicate and exchange information between systems.
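These facets can be made concrete in a few lines. Below is a minimal, hypothetical sketch (the registry structure and all names are invented for illustration) showing how one algorithm registry can support real-time selection, addition, and graceful retirement:

```python
# Hypothetical sketch of the three facets McKay describes: a registry whose
# entries can be added (facet 2), negotiated in real time (facet 1), and
# gracefully retired (facet 3). Names and structure are illustrative only.
import hashlib

REGISTRY = {
    # name -> (implementation, retired?)
    "sha-256": (hashlib.sha256, False),
    "sha3-256": (hashlib.sha3_256, False),
    "sha-1": (hashlib.sha1, True),  # facet 3: kept only so old data can be read
}

def negotiate(our_prefs, peer_offers):
    """Facet 1: pick the first mutually supported, non-retired algorithm."""
    for name in our_prefs:
        impl, retired = REGISTRY.get(name, (None, True))
        if not retired and name in peer_offers:
            return name, impl
    raise ValueError("no common non-retired algorithm")

def add_algorithm(name, impl):
    """Facet 2: new algorithms can be introduced without touching callers."""
    REGISTRY[name] = (impl, False)

name, impl = negotiate(["sha3-256", "sha-256"], {"sha-256", "sha-1"})
print(name, impl(b"hello").hexdigest()[:16])
```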

Pursuing the dual aims of agility and interoperability works in some ways but not in others, she noted. Supporting a wide variety of algorithms for interoperability can increase a machine’s agility. For example, older devices can be updated with new algorithms that allow them to communicate securely with newer devices. However, interoperability can also complicate agility. It makes it harder, for example, to remove outdated protocols and algorithms that should be retired, such as Rivest Cipher 4 (RC4) and Secure Sockets Layer (SSL) 2.0. Some legacy systems cannot be updated because of hardware restrictions. A user might not be willing to pay to have an older machine properly secured or to give up a familiar operating system (such as Windows XP) that manufacturers are no longer updating. As a result, interoperability sometimes means supporting algorithms that are not very secure.

Reiterating a point Paul Kocher raised, McKay noted the importance of recognizing the potential hazards of implementation flaws. Supporting agility and interoperability requires more software, but that increases the number of potential security vulnerabilities and bugs, she said. In constrained environments, such as those required to manage small radio frequency identification (RFID) tags, there is not always room for all the security a device might need, making it especially important to employ the right security approach. Supporting weaker algorithms instead of retiring them can also lead to what are known as “downgrade attacks,” in which hackers are able to find and exploit known weaknesses in servers that still support these algorithms.
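One common way to close that downgrade window is simply to refuse old protocol versions at the server. Here is a minimal sketch with Python’s standard ssl module, assuming hypothetical certificate file names:

```python
# Sketch of closing the downgrade window McKay describes: a server context
# that refuses to negotiate below TLS 1.2, so clients cannot be pushed onto
# retired protocols such as SSL 2.0/3.0. Certificate paths are placeholders.
import ssl

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2     # reject anything older
ctx.load_cert_chain("server.crt", "server.key")  # hypothetical file names
# RC4 is already absent from modern OpenSSL defaults; this tightens further:
ctx.set_ciphers("ECDHE+AESGCM")
```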

The National Institute of Standards and Technology Process

McKay turned to the standards-making process at NIST, which seeks to balance agility and interoperability with security for non-classified government systems. NIST’s “Transitions: Recommendations for Transitioning the Use of Cryptographic Algorithms and Key Lengths” (SP 800-131A) outlines which cryptographic algorithms are allowed, allowed with conditions, or disallowed. For example, the Data Encryption Standard (DES) is not considered secure enough to encrypt new data, but older data can be decrypted via DES with a warning about its insecurity. In the case of two-key triple DES, NIST allowed its use for encryption only until the end of 2015 and only for limited amounts of data.
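The allowed/allowed-with-conditions/disallowed distinction is easy to picture as a policy lookup keyed on both the algorithm and the operation. The sketch below is illustrative only and is not a transcription of SP 800-131A, which remains the authoritative source:

```python
# Hedged sketch of the SP 800-131A idea: an algorithm's status depends on both
# the algorithm and the operation. This table is invented for illustration.
POLICY = {
    ("DES", "encrypt"): "disallowed",
    ("DES", "decrypt"): "legacy-use",    # allowed with a warning, per McKay
    ("2TDEA", "encrypt"): "disallowed",  # two-key triple DES, after 2015
    ("AES", "encrypt"): "acceptable",
}

def check(algorithm, operation):
    status = POLICY.get((algorithm, operation), "disallowed")
    if status == "legacy-use":
        print(f"warning: {algorithm} {operation} permitted for legacy data only")
    elif status == "disallowed":
        raise PermissionError(f"{algorithm} may not be used to {operation}")
    return status

check("DES", "decrypt")  # warns but proceeds
```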

In another example, NIST recommends against using Secure Hash Algorithm-1 (SHA-1) for digital signatures. When SHA-1 is used for creating digital signatures on short-lived parameters within a protocol, such as ephemeral keys in Transport Layer Security (TLS), there is a very small window of opportunity for an attacker to disrupt that process. The threat changes when SHA-1 is used in a certificate for server authentication: the signed data have value for a longer period of time, which gives an attacker more time to find a hash collision and create a fraudulent certificate, increasing the potential threat.
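An audit for the long-lived case can be as simple as inspecting a certificate’s signature hash. A hedged sketch using the third-party Python cryptography package, with a placeholder file name:

```python
# Sketch of auditing for the long-lived case McKay warns about: certificates
# signed with SHA-1. Requires the third-party "cryptography" package; the
# file name is a placeholder.
from cryptography import x509
from cryptography.hazmat.primitives import hashes

with open("server.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

if isinstance(cert.signature_hash_algorithm, hashes.SHA1):
    days = (cert.not_valid_after - cert.not_valid_before).days
    print(f"SHA-1 signature on a certificate valid {days} days: replace it")
```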

McKay conceded that NIST was not quick to make new standards. It has a small, specialized staff (though NIST is, she noted, actively hiring cryptographers) and a strong focus on completing projects carefully and thoroughly. When NIST determines that a new standard may be needed, the institute holds workshops and meetings with academics, government employees, industry leaders, and participants from related organizations like the International Organization for Standardization and the Internet Engineering Task Force. Through this process, NIST studies the available algorithms; determines if new ones are needed; and then develops new standards, adopts a standard that already exists, or holds a competition. The competition model has both benefits and drawbacks, McKay said. It worked well for Advanced Encryption Standard (AES) and Secure Hash Algorithm-3 (SHA-3), but competitions are time intensive and expensive to run. AES had 15 submissions and ran from 1997 to 2001. SHA-3 had 64 submissions and ran from approximately 2004 to 2015.1 Because of these downsides, sometimes it is more helpful to focus on algorithms that are already in use. This option works well for block cipher modes, which add features to block ciphers that have already been standardized, with faster timelines and lower implementation cost.

When NIST is ready to issue new standards, they are written up in a document that goes through a long internal and external review process. If it is a Federal Information Processing Standard, the document has to make its way up several management levels before being approved by the Secretary of Commerce, which adds more time and complexity. This long review process underscores the importance of being thorough and deliberate when creating new standards and explains why NIST cannot operate as nimbly as other entities might.

Those limitations, however, also mean that government systems have a high degree of interoperability because they are highly likely to have algorithms in common. Adding algorithms to the approved list could complicate that interoperability because each new cryptographic system would have to be fully validated by NIST before it could be used.

___________________

1 The idea for SHA-3 began in 2004 and the concept was finalized in 2015.

McKay said it works well for NIST, and the federal government more broadly, to have general-purpose primitives and modes for special features such as authentication or data storage. McKay listed the various standards that NIST recommends and uses: Secure Hash Algorithm-2 (SHA-2), SHA-3, triple DES (3DES), AES, RSA, the Elliptic Curve Digital Signature Algorithm, the Digital Signature Standard, and the keyed-hash message authentication code (HMAC). NIST also uses block cipher modes of operation to achieve confidentiality, authentication, key wrapping, and format preservation. In addition, she said NIST is working on new hash function modes, tentatively named Keccak Message Authentication Code, TupleHash, and ParallelHash.
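Several of these primitives are exposed directly in Python’s standard library, which makes them easy to demonstrate. A minimal sketch of SHA-2, SHA-3, and HMAC as standardized by NIST (FIPS 180-4, FIPS 202, and FIPS 198-1, respectively); the key is generated for the demo only:

```python
# Minimal demonstration of NIST-standardized primitives from McKay's list,
# using only Python's standard library.
import hashlib, hmac, secrets

msg = b"agility and interoperability"
print(hashlib.sha256(msg).hexdigest())    # SHA-2 family (FIPS 180-4)
print(hashlib.sha3_256(msg).hexdigest())  # SHA-3 / Keccak family (FIPS 202)

key = secrets.token_bytes(32)             # random demo key, not for reuse
tag = hmac.new(key, msg, hashlib.sha256).hexdigest()
print(tag)                                # HMAC-SHA-256 authentication tag
```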

Deciding when to make a new standard is just as hard as deciding whether to do so. Sometimes vendors request that NIST incorporate a particular algorithm into its standards to improve their business prospects with the government, a suggestion that holds little sway with NIST. “In the end, our standards are geared towards the unclassified government IT systems and they cannot be driven by vendors,” McKay said. Sharing a story to illustrate this point, she said a recent draft document initially recommended three encryption modes. An issue was discovered in one mode, and it was left out of the final version. This mode was repaired, but it was no longer needed because the first two were sufficient. Adding it back would have made that vendor happy, but it would have cost taxpayer money and NIST time to do it. “It did not align with our mission, so we decided not to do it,” said McKay.

Because of the intensive process of creating new standards, longevity is a key concern for NIST. These standards should be built to last and possibly even to outlast the technologies that are being used today. “We have had AES around since 2001 and it is still going strong,” McKay said. “That is the kind of thing we like.”

Future Directions

McKay noted that newer devices are sometimes too small to have adequate security. Light bulbs and RFID tags, for example, are not typically being manufactured to be secure. NIST is actively researching lightweight primitives for these small devices, loosely known by the umbrella term “the Internet of Things.” For example, NIST could add AES or another strong security mechanism, but with current approaches this would come at the expense of speed or price. Quantum computing is also a concern for NIST, she said.

The institute is currently working to create modes to enhance functionality while using the same core primitives. As an example, SHA-3 includes the Keccak permutation, which could be used to give users more features with less implementation overhead.
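The SHAKE extendable-output functions in the SHA-3 standard illustrate this point with the standard library alone; KMAC, TupleHash, and ParallelHash are not in hashlib, so SHAKE stands in here as the illustration:

```python
# One concrete payoff of building modes on the Keccak permutation: the SHA-3
# standard's SHAKE functions give variable-length output from the same core.
import hashlib

xof = hashlib.shake_128(b"one primitive, many output lengths")
print(xof.hexdigest(16))  # 128-bit digest
print(xof.hexdigest(64))  # 512-bit digest derived from the same state
```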

Launching the discussion session, Steven Lipner, an independent consultant, asked what NIST was doing to smooth transitions from one standard to another when they inevitably change. For quantum computing specifically, McKay responded that NIST was looking at several different designs and putting out a call for new algorithms. Donna Dodson, NIST, suggested creating an algorithm that was resistant to both today’s attacks and future quantum attacks, although it would present key management challenges. She added that it would be even better if such an algorithm could be completed and deployable before quantum computing became a reality. McKay noted that NIST does recommend transition periods in SP 800-131A, including considering the time a vendor needs to anticipate the change to its business.

When asked by Matthew Green, Johns Hopkins University, whether NIST had plans to modernize its approval process for new algorithms, McKay and Dodson answered that to do so, NIST would need more personnel. McKay explained that the NIST team goes through the math and its proofs itself rather than outsourcing the work, which is partly why it takes so long. She expressed her belief that it is important for the institute to be involved in the technical aspects, in part because it is also NIST’s duty to help policy makers understand the decisions that are being made.

Eric Grosse asked if NIST could adopt a management style similar to the one he used at Google, where he strongly encouraged creators to identify which old system would be shut down whenever a new one was initiated. McKay responded that NIST already uses a similar approach, citing SHA-3, which grew out of the discovery that SHA-1 had problems. Because NIST knew a replacement would take a while, it laid the groundwork fairly quickly, and because SHA-2 might eventually show the same mathematical weaknesses as SHA-1, NIST realized that it needed to go with a very different design. McKay also pointed to the core standards NIST recommends, which rarely have more than two or three options.

Dodson added that NIST does not have policing or enforcement power when it comes to its recommendations. When it no longer recommended SHA-1, some of the larger federal agencies had significant challenges figuring out where, and with which vendors, they were using this ineffective security mechanism, and it took some of them a full 18 months to replace it. McKay noted that a further complication is that some government websites must interact with the public. For example, there are some versions of TLS that are allowed because public-facing servers need to account for the security of the user’s browser. TLS 1.0 is still allowed in this case, despite not being as safe as NIST prefers. “It is kind of hard to tell someone they cannot go to [the] Social Security Administration web page and log in because they have to update their browser and they might not know how to do such a thing,” McKay explained. McKay closed by pointing out that creating government-to-government standards is much easier than interacting with the public. There, she said, “the water gets muddy really quickly.”

CRYPTOGRAPHY THROUGH THE YEARS

Richard George, The Johns Hopkins University, Applied Physics Laboratory

Richard “Dickie” George, currently a senior advisor for cybersecurity at The Johns Hopkins University Applied Physics Laboratory, has had a long career in the field of information security. He spent more than 30 years at the National Security Agency (NSA), including 8 years as the technical director at the NSA’s Information Assurance Directorate, a capacity in which he often confronted problems of resilience and agility. He presented a history of cryptographic technology in the government sphere.

While businesses may balk at the costs of repairing security, the uncomfortable truth is that all cryptography does eventually break. Reflecting on 45 years in the field, George emphasized how essential, yet complex, the issues of cryptographic resilience and agility are and underscored the urgent need to continually work on these issues as technology evolves.

Early Cryptography

George began with an overview of VINSON, a voice encryption device for military radios. VINSON’s initial objective was to encrypt radio communication between a forward observer and up to six contacts. The cryptography behind VINSON went through a lengthy design and testing process, spanning nearly two decades from its initiation in 1957 to the system’s approval for use in 1976. The results have been lasting: “It is still secure as far as we can tell,” said George. “It has never had any modifications—that is a pretty good run.”

Because of the limited purpose of VINSON, interoperability was not an issue. Radios with VINSON were not intended to communicate with receivers on satellites, bases, or fighter jets. “That made it very simple to design only the bare minimum functionality into that system that we needed,” George said. Also, before 1973, software and computers were not considered secure enough to contain cryptography as it existed at the time, so VINSON was designed only for the realm of radio. The limited nature of VINSON-equipped devices’ functionality had the added benefit of keeping communication secure. “Functionality,” George cautioned, “is opportunity for the adversary.”

The “agility” mechanism of the VINSON system was strikingly different from how we think of agility today. Each cryptographic machine contained one or more boards that implemented its own cryptographic algorithm. Any problems were thus limited to the copies of that machine, and the boards could be modified or removed and replaced without impact on the functioning of other cryptographic machines.

Transitioning to the Modern Era

Eventually, as interoperability became an essential component of secure communication systems, more challenges arose. VINSON had few modes—plaintext and encrypted, on and off. The next generation of devices that would be required to interoperate with VINSON had 1,024 modes, almost all of which had to be secure. This dramatically increased the challenge of evaluating each algorithm and its implementation. George emphasized the difficulty of evaluating the implementation in particular because in addition to knowing the designer’s intentions, one must also take into account a variety of unpredictable real-world scenarios. Underscoring the difficulty of this task, George noted that the NSA today is still evaluating VINSON’s algorithm, 60 years later, just in case there are surprises inside. “Occasionally, you find things,” said George. “When you do find them, you have to fix them or accept the risk,” he added, noting that if there are openings, it is often impossible to know whether an adversary has also found them.

Today’s technology has so much more functionality. Cryptography has also proliferated because increased functionality means there are more places to protect and more ways for data to be attacked. “In order to attack cryptography back in the ’70s, you had to attack the cryptography,” George said. “Today, you go around the cryptography.” Availability is also key to today’s technology, but early cryptography was not designed to accommodate availability. Integrity, authentication, confidentiality, and non-repudiation were the traditional aims.

The Problem of Legacy Cryptography

Building on a theme raised earlier in the workshop, George commented on the problem of legacy cryptography that remains in software or hardware after it has become obsolete. This typically happens either because new devices still have to be able to talk to older ones or because there is still information stored and encrypted with those older algorithms. These and other factors mean that in many cases old cryptography can never be fully retired. Where legacy algorithms are in use, they present a risk because adversaries can bypass the newer, stronger cryptography and attack the older, less secure algorithms. Preventing backward compatibility and forcing old cryptography to break can mitigate the issue, George said.

In the discussion following George’s presentation, Bob Blakley, Citigroup, Inc., noted that his mentor used to say that code was akin to “original sin”: Every line you write stays with you and your descendants forever. Therefore, perhaps instead of erasing old algorithms, as George suggested, participants should instead think about how to live with these mistakes, since the more likely scenario is that they will not be retired. George replied that at some point, technologies do become obsolete, such as vacuum tube televisions and rotor encryption machines. Butler Lampson, Microsoft Corporation, weighed in by paraphrasing a suggestion he attributed to Alan Kay: computers should be like tissues. “You use it once and throw it away.” Such an approach could greatly simplify the security required.

Properly storing information is essential to government work and required by law in certain situations. George shared an anecdote in which retrieving stored information proved to be nearly impossible, illustrating that these requirements do not always ensure that stored information will be retrievable—just that it will be stored.

International Dimensions

George described some of the international factors that influence cryptography. Foreign governments sometimes have very different standards and motivations than the United States. Some countries do not value citizens’ privacy in the same way that the United States does, and they are changing or adapting secure cryptography for their own ends. For example, some governments may refuse to use public algorithms and instead require the use of their own cryptography in systems, without sharing that algorithm or allowing designers to evaluate it or its implementation. In such cases, said George, “We are going to have to trust [that government] and trust that it has not done something to circumvent everything else that we have in there . . . You cannot possibly vet that algorithm.” Another wild card is randomization. Countries might require their own randomization, which can create ancillary issues. In this complex environment, the cryptography itself becomes the “simple” piece of the puzzle.

During the discussion, Deirdre Mulligan, University of California, Berkeley, asked George to expand on situations in which the requirements of foreign governments may be at odds with citizens’ privacy or other rights. While diplomacy is important, she said, citizens are often unaware that there is a security choice at all and end up using an insecure setting by default. George agreed that the average user chooses the default security settings, which are probably insecure. Mulligan clarified her question to ask if other governments, for example China, could create transparency in their information security process or at least allow their users to understand what level of privacy they actually have.

George suggested that the situation is difficult from any perspective. Other governments make their own decisions about what cryptography is allowed to be used and sold and are unlikely to be persuaded to change their policies. For the same reason, it would be hard to explain to, say, Chinese users what risks they are taking using software whose cryptography was likely amended. Unfortunately, in foreign countries, there is not much that can be done. “You do not hold a winning hand in trying to protect people when the government does not want to protect them,” George explained.

Preparing for Quantum Computing

George noted that the government is still studying VINSON because it is possible that holes may yet be revealed. Today’s much more complex cryptography, such as RSA cryptography or quantum key distribution, involves distinctive risks. There is a lot of cryptography in use today that is not fully understood and can perhaps never be fully secure, George argued. An approach known as quantum key distribution might be considered safe, for example, but George asserted that every implementation of quantum key distribution so far has been shown fairly quickly by academic researchers to be insecure.

Quantum-resistant cryptography is so little understood that it would require years of study—time that we really do not have, George said. The security of government and personal data is at stake. Classified documents, financial information, and health records all need to be protected as well. George pointed out that health records hold information that does not change, such as a person’s blood type or medical history. When a credit card account is breached, it is fairly easy to receive a new card and a new account number. “But if they steal your health records, you cannot change them that easily,” said George. “These are the kinds of things we have to understand. We have to be ready.” Quantum computers represent a real and urgent threat that must be addressed, he said.

In the discussion, Steven Lipner asked George to imagine that quantum computing was real and could break the widely used 2048-bit RSA keys. In that situation, how would he advise the President? George said his advice would be first to go back to an older system, such as symmetric key distribution, although he would prefer not to need such a stop-gap. It would be far better to be ready and prepared for quantum computing, he said. The older systems would be merely backup communication systems, like Morse code and sextants for navigation on ships. Lipner reflected that many businesses do not have any backup plan, which could cause big problems both for them and for their customers.
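Because symmetric cryptography is not known to fall to quantum algorithms beyond a Grover-style square-root speedup, a 256-bit symmetric key is generally expected to retain roughly 128-bit security against a quantum adversary. A hedged sketch of such a fallback channel using AES-256-GCM from the third-party Python cryptography package:

```python
# George's fallback is symmetric cryptography with pre-placed keys. This is a
# sketch, not a protocol: in practice the key would be distributed out of band
# rather than generated where it is used.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 256-bit key for quantum margin
aead = AESGCM(key)
nonce = os.urandom(12)                     # 96-bit nonce, never reused per key
ct = aead.encrypt(nonce, b"fallback traffic", b"header")
assert aead.decrypt(nonce, ct, b"header") == b"fallback traffic"
```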

Addressing Current Weaknesses

John Manferdelli, Google, Inc., asked George if insecure hardware, which is most of what people are using today, is a “catastrophic problem.” George described the issue as one of risk management, and the answer would depend on what the system is, who the adversary is, and how bad a data breach would be. The online banking industry might be able to live with some of the risk of quantum computing, but the nuclear weapons system probably could not.

George emphasized that the nature of our adversaries has changed. In the Cold War years, enemies’ motivations were clear and they followed “the rules,” both of which are no longer the case. Key questions have to be considered: “Who is going to have access to that quantum computer? What are they going to be able to do, and what are you worried about?”

Referencing the focus on quantum-resistant cryptography, Paul Kocher, Cryptography Research Division, Rambus, Inc., questioned whether those more “hypothetical” challenges should really be the priority or if the more everyday bugs and repairs—which we know are problems right now—should be attended to first. George agreed that today’s systems are definitely in need of repair and that part of the problem is that we rely too much on users to protect their computers and software. In a car, he noted, there are seatbelts and warning systems that protect the occupants. “We ought to have the same kind of regulation that says that the computer ought to protect the user rather than the user protecting the computer,” George asserted. In the context of the security problem known as buffer overflows, George suggested fixes must be made in hardware because software is too extensive and diverse.

George emphasized how critical it will be to know which adversaries have access to quantum computers and what types of extremely important information must be protected from this threat. However, the vast majority of attacks happening today involve much more run-of-the-mill adversaries who can be defeated through better user behavior. Citing a statistic that 91 percent of security attacks in 2015 came from phishing—a user clicking on a link that unwittingly let an adversary in—George asserted that better system design, not public education or training, is needed to help prevent these sorts of breaches. “You really cannot train them not to because you are talking about 4-year-olds and 90-year-olds . . . That training is not going to work. You have to protect them. Let the computer not let them do dumb things,” he said.
