
Autonomy Research for Civil Aviation: Toward a New Era of Flight (2014)


3

Barriers to Implementation

The committee has identified many substantial barriers to the increased use of autonomy in civil aviation systems and aircraft. These barriers cover a wide range of issues related to understanding and developing increasingly autonomous (IA) systems and incorporating them into the National Airspace System (NAS). Some of these issues are related to technology, some to certification and regulation, and some to legal and social concerns:

•  Technology Barriers1

Communications and data acquisition

Cyberphysical security

Decision making by adaptive/nondeterministic systems2

Diversity of aircraft

Human–machine integration

Sensing, perception, and cognition

System complexity and resilience

Verification and validation (V&V)

•  Regulation and Certification Barriers

Airspace access for unmanned aircraft

Certification process

Equivalent level of safety

Trust in adaptive/nondeterministic IA systems

•  Additional Barriers

Legal issues

Social issues

Each of these barriers overlaps with one or more of the others, and efforts to overcome them will not proceed in isolation from one another. For example, the technology barriers tend to overlap with the certification barriers because advanced civil aviation technologies cannot be deployed operationally unless and until they can successfully complete the certification process. Likewise, advanced technologies cannot be expected to complete the certification process unless there are robust V&V processes to support technology development and certification.

___________________

1 The committee did not prioritize the barriers; they are listed alphabetically within each group.

2 Adaptive and nondeterministic systems are defined in the section “Decision making by adaptive/nondeterministic systems.”

The committee did not individually prioritize these barriers (for example, by urgency or degree of difficulty). However, the committee believes that there is one critical, crosscutting challenge that must be overcome to unleash the full potential of advanced IA systems in civil aviation. This challenge may be described in terms of a question: “How can we assure that advanced IA systems—especially those systems that rely on adaptive/nondeterministic software—will enhance rather than diminish the safety and reliability of the NAS?” There are four particularly challenging barriers that stand in the way of meeting this critical challenge:

  • Certification process,
  • Decision making by adaptive/nondeterministic systems,
  • Trust in adaptive/nondeterministic IA systems, and
  • Verification and validation.

TECHNOLOGY BARRIERS

Communications and Data Acquisition

Barrier Statement: Civil aviation wireless communications are fundamentally limited in bandwidth, and the operation of unmanned aircraft in the NAS could substantially increase the demand for bandwidth.

Wireless communications and data acquisition are foundational to the NAS. Very high frequency (VHF) radio and satellite communications are used for voice communication and data transmission among pilots, air traffic controllers, and airline operational controllers. Other segments of the frequency spectrum are used for land- and satellite-based navigation. Additional frequencies are dedicated to systems such as aircraft radar transponders that facilitate informed decision making and control within the NAS.3 The Federal Communications Commission has allocated a fixed amount of bandwidth for civil aviation purposes. This amount is unlikely to increase. In fact, nonaviation commercial and consumer demand for bandwidth is increasing to the point where the use of some frequencies for aviation may be threatened.4 Wireless devices (cell phones, tablets, and the like) are proliferating at an astounding rate, and the demand for bandwidth to support such devices is growing just as fast.

As the number of IA systems in operation increases, it will become more of a challenge to share the limited radio spectrum with existing legacy systems. The Federal Aviation Administration’s (FAA’s) roadmap for integrating unmanned air systems (UAS) in civil airspace has concluded that

. . . harmonized radio spectrum is needed for UAS control and communications links to help ensure their protection from unintentional radio frequency interference, to help ensure adequate spectral bandwidth is available for meeting the projected command and control link capacity demands, and to facilitate operation of UAS across international borders. While spectrum is also needed for beyond-line-of-sight command and control links, the initial focus was on radio line-of-sight for civil UAS because demand for line-of-sight links is expected to be greater.5

The proliferation of remotely piloted UAS, as they are currently operated, will increase the demand for communications and data acquisition. These systems use communications/data links to connect remotely located pilots, aircraft, and air traffic controllers. Managing the transmission of video and airborne weather radar is key to maximizing available bandwidth. The demand for both of these high-bandwidth systems peaks during the departure and arrival phases of flight. Unmanned aircraft in a fully autonomous mode would greatly reduce the demand for bandwidth associated with aircraft operations. Even so, some UAS missions will involve real-time streaming video and/or sensor data, which could test the limits of available communications bandwidth.

___________________

3 FAA, “Spectrum Resources for Air Traffic Control Systems,” Office of Spectrum Management and Policy, January 2002, http://www.faa.gov/about/office_org/headquarters_offices/ato/service_units/techops/spec_management/library/general.cfm.

4 W.R. Voss, 2012, Gathering storm, AeroSafety World, February.

5 FAA, 2013, Integration of Civil Unmanned Aircraft Systems (UAS) in the National Airspace System (NAS) Roadmap, First Edition—2012, November 7, http://www.faa.gov/about/initiatives/uas/media/uas_roadmap_2013.pdf, p. 56.

Cyberphysical Security

Barrier Statement: The use of increasingly interconnected networks and increasingly complex software embedded throughout IA air- and ground-based system elements, as well as the increasing sophistication of potential cyberphysical attacks, threaten the safety and reliability of IA systems.

The NAS is a complex cyberphysical system composed of complex cyberphysical systems on aircraft, in air traffic management (ATM) systems, in maintenance systems, and in other support systems. These cyberphysical systems have integrated computational and physical elements, including distributed microprocessors with embedded software, sensors, control systems, navigation, internal data networks, and external communications. The complexity of these ever more interconnected systems can create multiple attack vectors for cyberthreats.

There are many positive motives and drivers for the growing complexity of the NAS. Embedded processors throughout the aircraft are used to improve operational performance, add important safety features, and perform automated vehicle health and maintenance functions. Higher levels of automation/autonomy drive the need for additional software, processors, and sensors throughout the system. Additional communications, wireless connectivity, and in-flight entertainment options are also adding to the complexity of these systems. The NextGen system6 offers the promise of greater capacity, operational safety, and efficiency, but it also increases networking and system-of-system complexity. The need for additional research and development to address the challenges for software V&V was noted in the NextGen UAS Research, Development and Demonstration Roadmap,7 but cyberphysical threats were not addressed.

In conventional desktop and network cyberenvironments, there is a growing asymmetry between the typical code size and attack surface of application software and the relatively small size of typical software viruses. In some cases, even antivirus software itself can create new vulnerabilities. On a recent software vulnerability watch list, about one-third of the reported software vulnerabilities were in the security software itself.8

Today’s aircraft have millions of lines of computer code embedded in distributed processors throughout the aircraft. Cyber vulnerabilities can arise from the reliance of aircraft systems on sensors, control systems, navigation, communications, and other physical elements, and from their interaction with other air and ground systems. There have been numerous documented examples of intentional and unintentional GPS jamming and spoofing. As the NAS transitions to the NextGen system and expands the use of Automatic Dependent Surveillance–Broadcast (ADS-B) systems,9 there will be increased reliance on onboard sensing such as GPS for efficient and safe operation. Communications are also a key vulnerability for cyberphysical systems. Radio hackers have used air traffic control frequencies to give pilots false commands.10 Communications systems are of increased importance for UAS because they provide the link to the human operators. As machine-to-machine communications share more information on aircraft navigation, health, and other data, there will be additional opportunities to exploit these network connections. The embedded processing elements themselves can be a point of vulnerability, with the threat of tampering with the chips to add hidden functionalities and insecure backdoor access.11
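As one illustration of the kind of onboard cross-check that can help detect (though not prevent) such interference, the sketch below compares a reported GPS fix against a short-horizon dead-reckoning prediction and flags large disagreements. It is a hypothetical sketch: the navigation model, threshold, and coordinate frame are notional and are not drawn from the report.

```python
import math

def dead_reckon(prev_pos, groundspeed_mps, track_rad, dt_s):
    """Predict the next position (meters, local east/north frame) from the
    last trusted fix and the inertial/air-data velocity estimate."""
    east = prev_pos[0] + groundspeed_mps * math.sin(track_rad) * dt_s
    north = prev_pos[1] + groundspeed_mps * math.cos(track_rad) * dt_s
    return (east, north)

def gps_consistent(gps_pos, predicted_pos, threshold_m=150.0):
    """Flag a GPS fix that disagrees with the inertial prediction by more
    than a notional threshold (possible jamming, spoofing, or fault)."""
    residual = math.dist(gps_pos, predicted_pos)
    return residual <= threshold_m, residual

predicted = dead_reckon(prev_pos=(0.0, 0.0), groundspeed_mps=70.0,
                        track_rad=math.radians(90), dt_s=1.0)
ok, residual = gps_consistent(gps_pos=(600.0, 5.0), predicted_pos=predicted)
print(ok, round(residual, 1))   # False, ~530 m: the fix fails the cross-check
```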

___________________

6 The Next Generation Air Transportation System (NextGen) is the new National Airspace System due for implementation across the United States over the next decade.

7 NextGen UAS Research, Development and Demonstration Roadmap, Version 1.0, March 15, 2012, http://www.jpdo.gov/library/20120315_UAS%20RDandD%20Roadmap.pdf.

8 DARPA High-Assurance Cyber Military Systems (HACMS) Proposer’s Day Brief.

9 ADS-B is a key component of the NextGen system. The ADS-B system on an aircraft continuously broadcasts that aircraft’s position and other flight information, all of which can be displayed on other aircraft and on ground systems.

10 Aerospace America, Getting ahead of the threat: Aviation and cyber security, July-August 2013, pp. 22-25.

11 The Guardian, Cyber-attack concerns raised over Boeing 787 chip’s back door, May 29, 2012.


Decision Making by Adaptive/Nondeterministic Systems

Barrier Statement: The lack of generally accepted design, implementation, and test practices for adaptive/ nondeterministic systems will impede the deployment of some advanced IA systems and aircraft in the NAS.

Adaptive systems have the ability to modify their behavior in response to their external environment. For aircraft systems, this could include commands from the pilot and inputs from aircraft systems, including sensors that report conditions outside the aircraft. Some of these inputs, such as airspeed, will be stochastic because of sensor noise as well as the complex relationship between atmospheric conditions and sensor readings that are not fully captured in calibration equations. Adaptive systems learn from their experience, either operational or simulated, so that the response of the system to a given set of inputs varies over time. This variation means that traditional verification, validation, and certification (VV&C) methods are incompatible with adaptive systems, since those methods are predicated on achieving a predictable output, albeit with some level of measurement error, for a given input. New approaches to VV&C are required to demonstrate that adaptive systems will consistently make appropriate decisions even though individual test results may not be repeatable. The ability of adaptive systems to improve over time is expected to be of such value that research into these systems is worthwhile despite the need for new VV&C methodologies.
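To make the repeatability problem concrete, the following hypothetical sketch (not from the report) shows an adaptive gain that is updated from experience; the same test input produces different outputs before and after the system has "learned," which is precisely what defeats test methods that expect a fixed input-output mapping.

```python
class AdaptiveGainController:
    """Toy adaptive controller: the gain is adjusted from experience,
    so the same commanded input maps to different outputs over time."""

    def __init__(self, gain=1.0, learning_rate=0.05):
        self.gain = gain
        self.learning_rate = learning_rate

    def command(self, error):
        """Compute a control output for the current tracking error."""
        return self.gain * error

    def learn(self, error, observed_response):
        """Adapt the gain from the observed response
        (undershoot increases the gain, overshoot decreases it)."""
        self.gain += self.learning_rate * (error - observed_response)


# The same test input yields different outputs before and after "experience".
ctrl = AdaptiveGainController()
print(ctrl.command(2.0))                       # 2.0 with the initial gain
ctrl.learn(error=2.0, observed_response=1.2)   # system undershot; gain grows
print(ctrl.command(2.0))                       # now 2.08: the earlier result is not repeatable
```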

Systems that are nondeterministic may or may not be adaptive. They may be subject to the stochastic influences imposed by their complex internal operational architectures or their external environment, meaning that they will not always respond in precisely the same way even when presented with identical inputs or stimuli. The resulting uncertainty means that traditional VV&C methods are incompatible with nondeterministic systems. VV&C methods for complex, nondeterministic systems may be even more difficult to develop than VV&C methods for adaptive systems. However, the software that is at the heart of nondeterministic systems is expected to enable improved performance because of its ability to manage and interact with complex “world models” (large and potentially distributed data sets) and execute sophisticated algorithms to perceive, decide, and act in real time. As with adaptive systems, the improved performance of nondeterministic systems is expected to be of such value that research into these systems is worthwhile despite the need for new VV&C methodologies.

Systems that are adaptive and nondeterministic demonstrate the performance enhancements—and the VV&C challenges—of both characteristics. Many advanced IA systems are expected to be adaptive and/or nondeterministic, and they should be evaluated on the basis of expected or intended responses to a representative set of input conditions to ensure that they will consistently produce an acceptable outcome under a wide variety of environmental conditions and operational situations. This requires an understanding of how these systems sense and perceive internal and external data and the rationale by which the system arrives at its output decision. Existing adaptive/ nondeterministic algorithms have not been widely applied to safety-critical civil aviation applications in part because of the lack of a mature process for designing, implementing, and testing such algorithms.
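One way to read "evaluated on the basis of expected or intended responses to a representative set of input conditions" is scenario-based statistical testing: sample many operating conditions, judge each outcome against an acceptability criterion rather than an exact expected value, and report the acceptable-outcome rate. The sketch below is illustrative only; the scenario distribution, decision function, and acceptance criterion are invented placeholders.

```python
import random

def sample_scenario(rng):
    """Draw one representative operating condition (placeholder distribution)."""
    return {"wind_kt": rng.uniform(0, 40), "visibility_nm": rng.uniform(0.5, 10)}

def ia_decision(scenario, rng):
    """Stand-in for an adaptive/nondeterministic decision function;
    the added noise models run-to-run variation."""
    margin = scenario["visibility_nm"] - 0.1 * scenario["wind_kt"] + rng.gauss(0, 0.3)
    return "continue" if margin > 1.0 else "divert"

def acceptable(scenario, decision):
    """Acceptance criterion: in poor conditions the system must divert."""
    poor = scenario["visibility_nm"] < 1.0 or scenario["wind_kt"] > 35
    return decision == "divert" if poor else True

rng = random.Random(0)
trials = 10_000
passes = sum(
    acceptable(s, ia_decision(s, rng))
    for s in (sample_scenario(rng) for _ in range(trials))
)
print(f"acceptable outcomes: {passes}/{trials} ({passes / trials:.3%})")
```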

Diversity of Aircraft

Barrier Statement: It will be difficult to engineer some IA systems so that they are backward-compatible with legacy airframes, ATM systems, and other elements of the NAS.

Many civil aircraft have life cycles extending to decades. In general, new aviation technologies are required to be backward compatible with legacy aircraft and systems. Integration of advanced systems, such as IA systems, requires consideration of the size, missions, and capabilities of legacy systems now and well into the future.

Legacy systems include aircraft that do not have transponder or voice radio capabilities and sailplanes that operate without engines. More typical legacy aircraft have radios, transponders, and engines, but some of their system capabilities may be out of date by several decades compared to those of new systems coming off the production line. Some advanced aviation systems—and this is likely to include many IA systems—work most efficiently when interconnected with other similarly equipped aircraft. However, IA systems will also need to be designed and engineered so that they can operate safely even in the absence of those connections and, preferably, without imposing new tasks or requirements on human pilots in legacy aircraft.12

Human–Machine Integration

Barrier Statement: Incorporating IA systems and aircraft in the NAS would require humans and machines to work together in new and different ways that have not yet been identified.

There will always be some level of human involvement in the operation of IA systems, if for no other reason than to initiate and/or terminate otherwise autonomously performed tasks or missions. Thus, there will always be some kind of interface that allows humans to communicate intent to the IA system and that allows the system to communicate appropriate state and status information to the human. Although many of the necessary characteristics and requirements of systems that will enable humans to interact effectively with IA systems are known from previous research on human–automation interaction, much remains to be learned as we transition to systems with ever-increasing levels of autonomy.

Certainly one fundamental aspect to consider is the distribution of roles, responsibilities, and workload between human operators and IA systems. Implementation of IA systems will be adversely impacted if the workload is distributed poorly between humans and machines, either by depriving humans of necessary information or by overloading them with too much information in too short a time frame. To be effective, autonomous components should be designed to be competent players in an overall system that teams humans with machines. Critical analysis to define the appropriate functional allocation of roles between the systems and the humans will be essential to avoid design pitfalls that would, in effect, require the human to be the ultimate fail-safe mechanism if the autonomous elements fail.

Effective integration of IA systems into the NAS will require consideration of the impacts of these operations on all stakeholders, including legacy aircraft and systems. For example, traffic management and collision avoidance in mixed operations of legacy aircraft and aircraft with IA systems will require transparency so that all responsible agents, human and machine, will understand the intentions and plans of the others. This presents the considerable challenge of designing human–machine interfaces that will enable the coordination of such operations within confined airspace.

Experience shows that human operators have a tendency to become reliant on the continued functionality and proper performance of automated/autonomous systems. This can be particularly problematic when the performance of these systems degrades, sometimes subtly, sometimes obviously, and sometimes suddenly. Overreliance on automation is frequently suspected as a contributing factor in, or even the cause of, aviation incidents and accidents. If the manual skills necessary to perform an operation or maneuver have not been sufficiently practiced and maintained, human operators may lose proficiency and be ill-prepared to take over in the event of system failures. Overreliance and loss of proficiency can each make operators reluctant to assume manual control even in the face of clearly faulty automation.

Conventional automated flight systems sometimes experience gradual failures without obvious indications, which can make it difficult for operators to detect faults and prepare themselves to take over manual control at an appropriate time. Proper human–machine integration for IA systems would promote graceful degradation of system performance, whereby operators are informed of impending or potential failure modes that may require human intervention.
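A minimal sketch of the graceful-degradation idea, under assumptions of our own (a single scalar health metric and a linear trend): the monitor projects when a limit will be crossed and advises the operator before an outright failure occurs.

```python
def cycles_to_limit(history, limit):
    """Estimate, from a recent history of a health metric sampled once per
    cycle, how many cycles remain before the limit is crossed.
    Returns None if the trend is flat or improving (simple linear trend)."""
    n = len(history)
    if n < 2:
        return None
    slope = (history[-1] - history[0]) / (n - 1)   # change per cycle
    if slope <= 0:
        return None
    return (limit - history[-1]) / slope

# A sensor-noise metric drifting upward; warn the crew well before the limit.
metric_history = [0.10, 0.12, 0.15, 0.19, 0.24]
remaining = cycles_to_limit(metric_history, limit=0.40)
if remaining is not None and remaining < 10:
    print(f"advisory: metric trending toward limit in ~{remaining:.0f} cycles")
```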

Formal training programs contribute to the NAS’s high level of safety. These programs have been developed through many years of experience and have evolved in response to lessons learned from accidents and incidents, new applications of advanced automation/autonomy, and new training technology such as high-fidelity simulators. Appropriate training requirements and programs for IA systems will need to be established before the systems can be deployed.

___________________

12 Boeing Commercial Airplanes, “Boeing Autonomy Research Recommendations,” presentation to the committee, November 13, 2013.


Sensing, Perception, and Cognition

Barrier Statement: The ability of IA systems to operate independently of human operators is fundamentally limited by the capabilities of machine sensory, perceptual, and cognitive systems.

Machine sensing uses sensors to detect and extract information about the internal states of the system and the external states of the operating environment. Machine perception is the transformation of this raw sensor data into high-level abstractions that can be interpreted by machines and/or understood by humans and used to support decision making. Machine cognition is the utilization and application of this processed information to recognize patterns and make decisions relevant to the missions or functions being performed. All three processes are central to the ability of machines to operate independently of human operators and are often described by use of the OODA loop (Observe, Orient, Decide, and Act), discussed in Chapter 1.
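The OODA framing can be made concrete with a minimal loop skeleton; the sensor names, thresholds, and actions below are illustrative placeholders rather than an architecture proposed in this report.

```python
def observe(sensors):
    """Sensing: collect raw internal and external measurements."""
    return {name: read() for name, read in sensors.items()}

def orient(raw):
    """Perception: turn raw data into a high-level abstraction (world model)."""
    return {"intruder_range_nm": raw["radar_nm"], "own_altitude_ft": raw["baro_ft"]}

def decide(world):
    """Cognition: pick an action from the abstracted situation."""
    return "climb_300ft" if world["intruder_range_nm"] < 5.0 else "maintain"

def act(action):
    """Execution: command the flight controls or ATM interface (stubbed as print)."""
    print("commanded:", action)

# One pass through the Observe-Orient-Decide-Act loop with stubbed sensors.
sensors = {"radar_nm": lambda: 3.2, "baro_ft": lambda: 8500.0}
act(decide(orient(observe(sensors))))
```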

Machine perception is currently very difficult for some combinations of sensors and tasks. For example, while it is relatively easy to detect the presence of an object in a specific location, it is much more difficult to identify or characterize that object (for example, as a building or a vehicle), and it is even more difficult to “contextualize” it (landing pad, conflicting traffic). Projecting the future state or actions of a detected object is still more daunting. Advances in machine perception capabilities are therefore crucial to the implementation of adaptive/ nondeterministic IA systems.

In some circumstances it might be beneficial to share onboard sensor data among multiple aircraft. In these situations, the sensors on board each aircraft become part of a distributed sensor network that can combine data to provide better information than any single aircraft can achieve on its own. However, networking sensors can also introduce communication lags and interference, which may decrease precision. As a result, distributed sensor networks may have performance profiles that differ dramatically from more traditional onboard sensors.13
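A hedged sketch of the distributed-fusion idea behind the consensus literature cited above (reference 13): each aircraft repeatedly averages its estimate with its neighbors', so the network converges toward a shared value at the cost of extra communication rounds. The topology and numbers are invented for illustration.

```python
def consensus_step(estimates, neighbors, step=0.3):
    """One round of distributed averaging: each node nudges its estimate
    toward the mean disagreement with its neighbors."""
    updated = {}
    for node, x in estimates.items():
        disagreement = sum(estimates[j] - x for j in neighbors[node])
        updated[node] = x + step * disagreement / max(len(neighbors[node]), 1)
    return updated

# Three aircraft with noisy local estimates of the same quantity
# (say, wind speed in knots) sharing data over a small network.
estimates = {"A": 18.0, "B": 22.0, "C": 26.0}
neighbors = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B"]}
for _ in range(10):
    estimates = consensus_step(estimates, neighbors)
print({k: round(v, 2) for k, v in estimates.items()})  # all near the mean, 22.0
```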

Machine cognition is a major challenge for the development of IA systems. High-level cognitive ability of machines is necessary to enable them to operate independently of humans for complex, dynamic missions. Achieving this will require consideration of the expectations of the humans who are managing these systems, especially if the system design includes provisions for some level of human intervention in the event of degraded performance or partial system failures.

System Complexity and Resilience

Barrier Statement: IA capabilities create a more complex aviation system, with new interdependencies and new relationships among various operational elements. This will likely reduce the resilience of the civil aviation system, because disturbances in one portion of the system could, in certain circumstances, cause the performance of the entire system to degrade precipitously.

The report produced by the Defense Science Board (DSB) Task Force on Autonomy notes that as capabilities of IA platforms continue to grow, design and testing activities should be focused on the larger multirole, multiechelon system in which the vehicle operates.14 For example, the report states that “a key challenge presented by the complexity of software is that the design space and trade-offs for incorporating autonomy into a mission are not well understood and can result in unintended operational consequences.” The Networking and Information Technology Research and Development (NITRD) report on Grand Challenges15 notes the opportunity to “understand how people, (software) agents, robots, and sensors (PARS) contribute to a collaboration” and the difficulties of understanding “the structural complexity of PARS collaborations (for example, teams, networks, or hierarchies into which the PARS components can self-organize).” Echoing the DSB findings, the NITRD report calls for research and development on architectures for adaptive layered networks. A 2012 NITRD workshop report directly addresses the complexity issues and notes that the technical means to design and test complex engineered networked systems do not yet exist.16 That report points to the need to develop the ability to design networks of IA vehicles and humans that demonstrate increased flexibility, robustness, resilience, and extensibility even as the operating environment, technology, and applications change over time.

___________________

13 R. Olfati-Saber, J.A. Fax, and R.M. Murray, 2007, Consensus and cooperation in networked multi-agent systems, Proceedings of the IEEE 95(1): 215-233.

14 Defense Science Board, 2012, Task Force Report on the Role of Autonomy in DoD Systems, Office of the Secretary of Defense, July, http://www.acq.osd.mil/dsb/reports/AutonomyReport.pdf.

15 Networking and Information Technology Research and Development Program (NITRD), 2006, Grand Challenges: Science, Engineering, and Societal Advances Requiring Networked Information Technology Research and Development, November, Washington, D.C., http://www.nitrd.gov/Publications/PublicationDetail.aspx?pubid=43, p. 31.

The DSB report cautions that “current designs of autonomous systems, and current design methods for increasing autonomy, can create brittle platforms, and have led to missed opportunities and new system failure modes when new capabilities are deployed. Brittle autonomous technologies result in unintended consequences and unnecessary performance trade-offs, and this brittleness, which is resident in many current designs, has severely retarded the potential benefits that could be obtained by using advances in autonomy.” The report recommends the “development of measures and models of the dimensions of system resilience/brittleness that can be used early in systems development as well as later in T&E.” The problem of brittle IA systems has been noted for years in human–machine systems research,17,18 and overcoming brittleness is a critical goal that guides the development of the growing field of resilience engineering.19

The barrier of complexity arises because current designs tend to focus on onboard autonomous capabilities and downplay the need for supporting coordination in a distributed and layered system. Complex engineered networks are broader in perspective—that is, they comprise a multirole, multiechelon system that includes both human and various computational and robotic roles. They present a challenge in terms of their ability to synchronize the distributed activities to keep up with the changing pace and tempo of dynamic situations. The distributed system has to be highly adaptive—changing the relationships across roles and echelons to be able to match changing demands and situations—if it is to meet the robustness and resilience goals as well as the cost and productivity goals, especially as growth in platform autonomy creates new opportunities. Complex engineered networks pose challenges for distributed architectures, such as accommodating new and multiple scales, managing life-cycle extensibility and adaptability, managing multiple trade-offs adaptively, and handling the extensive dependencies created by reliance on software-intensive systems and vital digital infrastructure. Together these sources indicate that

  • The growth of IA systems leads to the need for new models, measures, architectures, and testing and certification methods for complex engineered networks,
  • Complex engineered networks present many unsolved technical challenges, and
  • There are several promising trends in interdisciplinary research such as resilience engineering, resilient control systems, and cyberphysical systems.

Verification and Validation

Barrier Statement: Existing verification and validation (V&V) approaches and methods are insufficient for advanced IA systems.

V&V are critical steps on the path to operational acceptance. Verification refers to the processes for assuring that a given product, service, or system meets its specifications. Validation refers to the processes for assuring that the product will fulfill its intended purpose. Most often, the entity that sets the specifications performs the verification, while the prospective user or some other organization that thoroughly understands user needs performs the validation.20

___________________

16 NITRD, 2012, Workshop Report on Complex Engineered Networks, September 20-21, Washington, D.C., http://scenic.princeton.edu/NITRD-Workshop/.

17 P.J. Smith, C.E. McCoy, and C. Layton, 1997, Brittleness in the design of cooperative problem-solving systems: The effects on user performance, IEEE Transactions on Systems, Man, and Cybernetics 27(3): 360-371.

18 S.J. Perry, R.L. Wear, and R.I. Cook, 2005, The role of automation in complex system failures, Journal of Patient Safety 1(1): 56-61.

19 E. Hollnagel, D.D. Woods, and N. Leveson, eds., 2006, Resilience Engineering: Concepts and Precepts, Aldershot, U.K.: Ashgate Publishing.

V&V methodologies and techniques are typically underpinned by inviolate scientific principles. For example, physics and/or other related engineering sciences are most often used to determine whether an aeronautical system is capable of fulfilling its specifications and its intended purpose. Thus, V&V processes currently employed in aviation (and most other physical applications) are geared toward obtaining quantitatively predictable outcomes based on known inputs or stimuli. Specifically, to be viable, the selected architecture, instantiated with appropriate levels of sensing capability, must prove to be robust over a range of complex and uncertain operating environments. One logical alternative to the first-principles-based approach is to conduct a statistical evaluation (with associated confidence levels) of the behavior of the product, service, or system. However, even a statistical evaluation of system behavior may not be adequate for V&V of autonomous systems.
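As a concrete instance of "statistical evaluation (with associated confidence levels)," the sketch below computes a one-sided upper confidence bound on a per-operation failure probability after N failure-free trials, and shows why this route alone becomes impractical at the very low failure rates civil aviation targets. The numbers are illustrative, not requirements from the report.

```python
import math

def upper_bound_failure_rate(trials_without_failure, confidence=0.95):
    """One-sided upper confidence bound on the failure probability p after
    observing zero failures in N independent trials:
    solve (1 - p)**N = 1 - confidence  =>  p = 1 - (1 - confidence)**(1/N)."""
    return 1.0 - (1.0 - confidence) ** (1.0 / trials_without_failure)

def trials_needed(target_rate, confidence=0.95):
    """Failure-free trials required to bound p below target_rate:
    N >= ln(1 - confidence) / ln(1 - target_rate)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - target_rate))

print(f"{upper_bound_failure_rate(3_000):.2e}")   # ~1e-3 after 3,000 clean trials
print(trials_needed(1e-7))                        # ~3.0e7 trials to bound p below 1e-7
```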

If the capability in question is accomplished through software, the creators of that software are generally expected to prove that the software can deliver the intended capabilities at specified performance levels and in the relevant environment. In civil aviation applications, software V&V are a precursor to achieving certification, which in turn is required for flight operations. To this end, the FAA requires that software developers follow the guidance contained in DO-178B, Software Considerations in Airborne Systems and Equipment Certification.21,22 Approaches to software V&V such as those described in DO-178B are laborious because they require extensive examination and testing of every logic path. This approach is based on the well-established premise that errors may occur at any level. However, it is increasingly apparent that, given the complexity associated with software that has been and is being developed to provide increased functionality, these approaches are not scalable to the exceedingly complex software associated with some advanced IA systems. An alternative approach is required, especially at the system level.
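The scalability concern can be made quantitative with a back-of-the-envelope sketch: under the simplifying assumption that each independent two-way decision point can combine freely with every other, the number of execution paths grows as 2 to the power n, which is why examine-every-path strategies stop scaling for very large, highly interconnected codebases.

```python
def worst_case_paths(independent_branches: int) -> int:
    """Upper-bound path count if each of n independent two-way decision
    points can combine freely with every other: 2**n."""
    return 2 ** independent_branches

for n in (10, 40, 80):
    print(f"{n} branch points -> up to {worst_case_paths(n):.2e} paths")
```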

In twentieth century aviation, system-level “intelligence” resided almost exclusively with human operators. The segregated assessment of the aircraft and crew for flight readiness was acceptable, because the standards were for the most part independent. That is not the case with IA systems, where the intelligence is likely to be divided between the human and the system. Even if a V&V method were to prove useful for assessing intelligent software, the assessment of the total system, including the human operators and their interactions with the IA system, requires another approach. The barrier here is one of developing confidence in a system with a significant variation in the degree of human involvement, capability, and management. To this end, the validation process will likely require IA systems and human operators to respond quickly and correctly to a great number of questions and situations.

Another element of this barrier relates to the difficulty of establishing system-level performance requirements and functionality that are applicable to and derived from specified levels of safety, reliability, and operational performance. In the case of advanced IA systems, these attributes would be specifically related to the intent of the systems, the mission and operating environment, and the level of human interaction and uncertainty. A generalized approach to overcoming this part of the barrier does not currently exist.

Lastly, new approaches to V&V processes are likely to affect system design processes, especially for small unmanned aircraft, where the cost of V&V of IA systems will need to be modest in order for the aircraft to be affordable.

REGULATION AND CERTIFICATION

Airspace Access for Unmanned Aircraft

Barrier Statement: Unmanned aircraft may not operate in nonsegregated civil airspace unless the FAA issues a certificate of waiver or authorization (COA).

___________________

20 Institute of Electrical and Electronic Engineers (IEEE), 2008, 1490 WG-IEEE Guide: Adoptions of the Project Management Institute (PMI) Standard: A Guide to the Project Management Body of Knowledge (PMBOK Guide) Working Group, 4th ed., http://standards.ieee.org/develop/wg/software_and_systems_engineering_all.html.

21 FAA, 1993, Advisory Circular 20-115B, RTCA, Inc., Document RTCA/DO-178B, January 11.

22 FAA, 2011, Advisory Circular 20-171, Alternatives to RTCA/DO-178B for Software in Airborne Systems and Equipment, January 19.


Under current regulations there is no routine way to accommodate unmanned aircraft operations in nonsegregated civil airspace. Such operations have been possible only with (1) a certificate of authorization issued by the FAA on a case-by-case basis to public organizations or (2) a special airworthiness certificate in the experimental category. These certificates include operational restrictions that limit the utility and mission effectiveness of the affected unmanned aircraft. The FAA’s cumbersome regulatory mechanisms for authorizing UAS operations will not scale to the volume of operations expected in the near future, and they will continue to constrain potentially beneficial operations.

However, the status quo is changing. Two small unmanned aircraft, the InSitu Scan Eagle and the AeroVironment Puma, were certified for commercial usage (in restricted airspace) in September 2013, and the FAA subsequently approved commercial UAS operations, although this authorization was limited to very remote airspace (in the Arctic). In addition, the FAA has developed a roadmap for integrating UAS into the NAS with milestones for the FAA, other government agencies, and industry.23 The FAA is also developing enabling regulations for small unmanned aircraft that are expected to be released for public comment in November 2014. The FAA policy restricting the commercial use of small unmanned aircraft is being challenged in court. On March 6, 2014, a National Transportation Safety Board (NTSB) administrative law judge ruled that the FAA’s de facto ban on commercial use of small unmanned aircraft (model aircraft), which is based on a 2007 policy statement, “cannot be considered as establishing a rule or enforceable regulation” and that “policy statements are not binding on the general public.” The ruling is under appeal.

Most unmanned aircraft presently in operation are remotely piloted by a human operator on the ground who is in continuous operational control unless there is a failure or disruption of the command-and-control link. The human operator is assisted by automation that can maintain stabilized flight by controlling aircraft attitude and power. In more highly automated aircraft, the pilot simply defines the desired flight path by specifying three- and four-dimensional waypoints, and the path is then executed by automation on board the aircraft.
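For readers unfamiliar with the term, a four-dimensional waypoint simply adds a required time of arrival to a three-dimensional position. The data structure below is a minimal illustration, not an avionics or flight-plan format.

```python
from dataclasses import dataclass

@dataclass
class Waypoint4D:
    """A 3D position plus a required time of arrival (the fourth dimension)."""
    lat_deg: float
    lon_deg: float
    alt_ft: float
    rta_utc_s: float   # required time of arrival, seconds since midnight UTC

# A remote pilot's intent can be expressed as an ordered list of such points;
# onboard automation then closes the loop to meet each position and time.
flight_plan = [
    Waypoint4D(40.6413, -73.7781, 2000, 13 * 3600),
    Waypoint4D(40.9000, -73.5000, 8000, 13 * 3600 + 600),
]
print(len(flight_plan), "waypoints; first RTA:", flight_plan[0].rta_utc_s, "s UTC")
```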

The FAA says that “ultimately, UAS must be integrated into the NAS without reducing existing capacity, decreasing safety, negatively impacting current users, or increasing the risk to airspace users or persons and property on the ground any more than the integration of comparable new and novel technologies.”24 The committee believes that these are appropriate and achievable goals. In particular, the committee believes that the impact of IA systems on the performance of crewed aircraft, unmanned aircraft, and ATM systems will make it possible to introduce UAS into the NAS without reducing the capacity for crewed aircraft operations.

Full integration of UAS into civil airspace is limited by a variety of technical, operational, policy, and economic issues. The most daunting airspace integration challenges are these:

  • Detect-and-avoid capabilities. Because unmanned aircraft do not have a pilot on board, the ability to see and avoid other aircraft is dependent on the development of new technologies.
  • Vulnerabilities of the command-and-control link. The command-and-control link between the pilot and the aircraft is via a wireless communications link, which is subject to all the vulnerabilities of such links.
  • ATM integration. The unique operational characteristics and flight performance of unmanned aircraft require UAS-specific ATM procedures and technologies.25

Certification Process

Barrier Statement: Existing certification criteria, processes, and approaches do not take into account the special characteristics of advanced IA systems.

Certification of aircraft and other civil aviation systems is critical to maintaining safety, especially in the midst of growth and/or changes in the operating environment. To date, there are no commonly accepted methods, techniques, or practices for the certification of highly autonomous systems. Further, once new methods, techniques, and practices have been developed, they must be authorized by changes to FAA regulations; to the FAA Advisory Circulars that will alert the public on how the agency interprets and will apply new regulations; and to the guidance that is provided to certification specialists and other interested parties via FAA Orders on how to apply the new regulations. Developing new regulations, policies, procedures, guidance material, and training requirements is lengthy and resource intensive, especially when new technologies involve expertise that is not currently resident in the staff of the responsible FAA offices. However, this situation has been encountered and overcome in the past as other revolutionary new technologies, such as fly-by-wire flight controls and composite materials, were introduced into civil aviation.

___________________

23 FAA, 2013, Integration of Civil Unmanned Aircraft Systems (UAS) in the National Airspace System (NAS) Roadmap, First Edition—2012, November 7, http://www.faa.gov/about/initiatives/uas/media/uas_roadmap_2013.pdf, p. 56.

24 Ibid., p. 4.

25 Andrew Lacher, Andrew Zeitlin, Chris Jella, Charlotte Laqui, and Kelly Markin, 2010, Analysis of Key Airspace Integration Challenges and Alternatives for Unmanned Aircraft Systems, F046-L10-021, McLean, Va.: The MITRE Corporation, July.

Having a properly certificated aircraft does not entitle one to operate it in U.S. airspace, especially if it is done for commercial purposes. Operation of an aircraft may require certification of the company that will be operating the aircraft, and it certainly requires pilot certification, whether the pilot is on board or flying the aircraft remotely. The certification of UAS “operators” (as a class of individuals distinct from pilots) to remotely operate unmanned aircraft is not contemplated by present regulations and would likely need major changes to the FAA operating rules. Such changes, as with certification rule changes, would also require the development of Advisory Circulars and FAA Orders.26

One of the most important aspects of new guidance documents for IA systems will be their treatment of the human factors aspects of IA aircraft operations. As first articulated in the 1996 FAA report on flight deck human factors and updated in the recently released Flight Deck Automation Working Group report,

Pilots mitigate safety and operational risks on a frequent basis, and the aviation system is designed to rely on that mitigation. . . . Automated systems have been successfully used for many years, and have contributed significantly to improvements in safety, operational efficiency, and precise flight path management. However, pilot use of and interaction with automated systems were found to be vulnerable in the following areas:

  • Pilots sometimes rely too much on automated systems and may be reluctant to intervene,
  • Autoflight mode confusion errors continue to occur,
  • The use of information automation is increasing, including implementations that may result in errors and confusion, and
  • FMS [Flight Management System] programming and usage errors continue to occur.27

Certifying entities will need to acquire and maintain trust in the judgments that adaptive/nondeterministic components and systems exercise. The certification community does not yet know how to do this or which specific criteria should be applied. Appropriate reliability metrics and acceptable standards for software development have yet to be identified. Nor has the community established quantitative requirements for system reliability or determined how such requirements could be allocated among hardware, software, and pilots. Specific guidelines for redundancy management are also lacking; these will surely involve new approaches to failure modes and effects analyses as well as performance specifications for systems and sensors and for software and hardware testing and robustness. In addition, some of the unresolved certification challenges are of particular interest to rotorcraft. Examples include single-pilot workload evaluation, man–machine issues in the near-earth environment, and adaptation of functional allocations to single-pilot operations.

___________________

26 Advisory Circulars and informational documents are issued by the FAA to provide background, explanatory, and guidance material to the aviation community for complying with regulations. FAA Orders are issued by the FAA to provide guidance to its inspectors, controllers, and other agency employees in the conduct of their respective duties.

27 FAA, 2013, Operational Use of Flight Management Systems, Report of the PARC/CAST Flight Deck Automation Working Group, http://www.faa.gov/about/office_org/headquarters_offices/avs/offices/afs/afs400/parc/parc_reco/media/2013/130908_PARC_FltDAWG_Final_Report_Recommendations.pdf.


Equivalent Level of Safety

Barrier Statement: Many existing safety standards and requirements, which are focused on assuring the safety of aircraft passengers and crew on a particular aircraft, are not well suited to assure the safety of unmanned aircraft operations, where the primary concern is the safety of personnel in other aircraft and on the ground.

Equivalent safety implies that the use of a new system, such as one that possesses advanced IA systems, will demonstrate the same level of safety as traditional aircraft operations. The regulatory structure that underpins safety of flight is the result of an evolution that has its origins in the early crewed aircraft era and is crafted around the notion of a pilot in the cockpit. The first generation of modern UAS itself evolved from this model, being heavily dependent on interactions between a remote human pilot and onboard systems with extremely limited autonomous capabilities. IA systems can enable the operation of unmanned aircraft with reduced dependence on the remote crew and less need for highly reliable, real-time communications.

The current air safety regulatory framework implicitly assumes that the “system intelligence” resides exclusively in the human component of the system, primarily the pilots and air traffic controllers. Current automation within civil aircraft is focused on the execution of specific physical control tasks, such as stabilization and orientation, autotakeoff, autolanding, and autonavigation. Although some of these tasks may be complex, they are all inherently based on the execution of the OODA functions by air and ground elements of the NAS in a way that produces deterministic, predictable, repeatable outcomes. Today’s regulatory framework assumes that the pilot is solely responsible for how the onboard automation is used.

In contrast, advanced IA systems derived from rapidly evolving intelligent machine technology will be capable of mission-level decisions comparable to those made by the pilot or other aircrew. Decision logic will be cued and driven by the system’s perception, itself the product of inputs from sensors and other available information sources, along with instantiated and learned behaviors. For many UAS and their missions, the IA contribution to decision making will be an imperative rather than an option because of the real-world issues inherent in remote operation. To assure trust in IA systems, the regulatory system would therefore need to evolve to recognize and evaluate system performance in terms of how humans and machines work together in accomplishing their mission.

Trust in Adaptive/Nondeterministic Systems

Barrier Statement: Verification, validation, and certification are necessary but not sufficient to engender stakeholder trust in advanced adaptive/nondeterministic IA systems.

Successful deployment of IA systems will require mechanisms for establishing and maintaining trust in the systems’ ability to perceive relevant environmental circumstances and to make acceptable decisions regarding the course or courses of action. One important characteristic of IA systems is that they will be able to cope with a range of both known and unknown inputs without frequent or nearly continuous human cognition and control. Thus, it will be important for IA systems to incorporate pattern recognition, hypothesis testing, and other functionalities that are typically performed by humans. For example, IA systems will need to correctly perceive environmental and internal state variables and then apply machine reasoning and judgment to make appropriate decisions and take appropriate courses of action.

Trust is not an innate characteristic of a system; it is a perceived characteristic that is established through evidence and experience. To establish trust in IA systems, the basis on which humans trust these systems will need to change—for example, by developing trust in them in a fashion analogous to how people establish and maintain trust in other people or organizations. Much of this trust is based on a person’s confidence that others will behave in certain ways (through action or, in some cases, through carefully considered inaction), even under challenging and unforeseen circumstances. That is, building trust involves the estimation (with associated confidence) of both the bounds of performance of the system and the bounds of the control actions that will be performed within and by the system. This is in contrast to exhaustive testing over given operating conditions, as is typically now the case in VV&C.


After every incident or accident involving IA systems, it will be necessary to undertake a forensic evaluation of the performance of these systems, including their ability to appropriately perceive environmental circumstances and make suitable judgments. Evaluations could include formal testing, simulation, and limited operational deployments. Continued monitoring will be necessary to maintain confidence in adaptive/nondeterministic IA systems as they adapt to changes in the environment and feedback from previous experience. Establishing mechanisms to limit possible actions by IA systems will be necessary to effectively control the consequences of poor performance. Increasing levels of authority can follow as confidence grows in the ability of IA systems to perform appropriately. This is very similar to how extended-range twin operations (ETOPS) have been managed: As confidence in the reliability of engines and other aircraft systems has grown, the FAA has granted air carriers the authority to operate twin-engine commercial transports along routes with flight segments that are increasingly distant from airports to which such aircraft could be diverted in the case of an engine malfunction en route.
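A hedged sketch of one way to "limit possible actions by IA systems" while authority is expanded incrementally: a run-time guard clamps autonomous maneuver commands to an approved envelope, and the envelope is widened only as operational confidence grows. The envelope values and staging are invented for illustration and are not drawn from the report or from ETOPS practice.

```python
from dataclasses import dataclass

@dataclass
class AuthorityEnvelope:
    """Limits within which the IA system may act without human approval."""
    max_bank_deg: float
    max_altitude_change_ft: float

# Authority is expanded in stages as confidence in the system grows.
ENVELOPES = {
    "initial":  AuthorityEnvelope(max_bank_deg=15, max_altitude_change_ft=500),
    "expanded": AuthorityEnvelope(max_bank_deg=30, max_altitude_change_ft=2000),
}

def guard(command, envelope: AuthorityEnvelope):
    """Clamp an autonomous maneuver command to the approved envelope and
    report whether the original request exceeded current authority."""
    bank = max(-envelope.max_bank_deg, min(envelope.max_bank_deg, command["bank_deg"]))
    alt = max(-envelope.max_altitude_change_ft,
              min(envelope.max_altitude_change_ft, command["altitude_change_ft"]))
    limited = (bank != command["bank_deg"]) or (alt != command["altitude_change_ft"])
    return {"bank_deg": bank, "altitude_change_ft": alt}, limited

request = {"bank_deg": 25, "altitude_change_ft": 1200}
print(guard(request, ENVELOPES["initial"]))    # clamped; request exceeded authority
print(guard(request, ENVELOPES["expanded"]))   # passed through unchanged
```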

ADDITIONAL BARRIERS

Legal Issues

Barrier Statement: Public policy, as reflected in law and regulation, could significantly impede the degree and speed of adoption of IA technology in the NAS.

The FAA Modernization and Reform Act of 2012 included a congressional mandate to integrate UAS into the NAS by September 2015.28 On March 6, 2014, an NTSB administrative law judge ruled that the FAA’s de facto ban on commercial use of small unmanned aircraft (i.e., model aircraft) is not binding on the general public. (It remains to be seen how this ruling will fare during the appeals process.) In any case, the commercial use of UAS in the NAS raises a host of legal issues. A Congressional Research Service report29 identifies several legal issues, including constitutional concerns. The First Amendment guarantee of freedom of the press will be an important consideration as journalists increasingly employ UAS to gather information. Fourth Amendment protections regarding unreasonable searches conducted using UAS have yet to be established. UAS also raise new questions related to the Fifth Amendment prohibition against property “being taken for public use, without just compensation.” In this context, such “taking” involves government interference in the ability of people to use UAS over their own property.

Government actors and private citizens may be held accountable under laws relating to trespass, nuisance, stalking, harassment, privacy, and liability. How these legal concepts are applied to UAS will be determined by legislative and legal action at the state and federal level. An indeterminate period of uncertainty will persist as the legal system adapts to the introduction of UAS having various operating profiles into the NAS for various applications.

Liability may prove the most significant legal challenge to unmanned aircraft. As advances enable more fully autonomous operations, human interaction with UAS will become more remote. The legal system will be challenged to account for systems causing harm based on decisions made without direct human input.

Between 2010 and 2012 the U.S. Customs and Border Protection Agency loaned its Predator UAS to other law enforcement agencies on 700 occasions.30 On January 14, 2014, Rodney Brossart became the first American convicted of a crime in which surveillance video from an unmanned aircraft was presented as evidence at trial. In 2013, bills or resolutions addressing unmanned aircraft were introduced in the legislative bodies of 43 states.31 Topics included limitations on government and personal use of unmanned aircraft and funding for FAA test sites. In the absence of legislative guidance, courts will adapt existing laws to address concerns arising from the use of unmanned aircraft. It is inevitable that the law covering IA systems in the NAS will continue to evolve.

___________________

28 FAA, Modernization and Reform Act of 2012, P.L. 112-95, 126 Stat. 11.

29 Congressional Research Service, 2013, Integration of Drones into Domestic Airspace: Selected Legal Issues, April 4, http://www.fas.org/sgp/crs/natsec/R42940.pdf.

30 U.S. News and World Report, 2014, North Dakota man sentenced to jail in controversial drone-arrest case, January 15, http://www.usnews.com/news/articles/2014/01/15/north-dakota-man-sentenced-to-jail-in-controversial-drone-arrest-case.

31 National Conference of State Legislatures, “2013 Unmanned Aerial Systems (UAS) Legislation,” http://www.ncsl.org/research/civil-andcriminal-justice/unmanned-aerial-vehicles.aspx.


Even as advanced IA systems are deployed in the NAS, IA systems—like other aviation technologies—will continue to mature. As the operational record of advanced IA systems grows, it is likely to trigger additional reviews of policy, laws, and regulations that govern their use.

Social Issues

Barrier Statement: Social issues, particularly public concerns about privacy and safety, could significantly impede the degree and speed of adoption of IA technology in the NAS.

Civil aviation provides many benefits to the general public, such as the safe transport of millions of passengers annually, overnight delivery of packages, and emergency medical evacuation of accident victims. However, the opinions of the general public about technology are shaped by a wide variety of forces and factors. For example, the intense coverage of aviation accidents can create the impression that commercial air travel is less safe than objective measures indicate. Noise concerns, particularly around busy airports, can also color the public’s impressions of aviation.

The general public has limited detailed technical knowledge about the NAS. Most people do not know how air traffic is routed or coordinated. Nor do they know the technical challenges that had to be overcome to assure the safety of civil aviation as new technologies such as fly-by-wire flight controls and composite materials came to play vital roles in the design and operation of commercial transports. With two noteworthy exceptions, the same will likely be true when it comes to the application of IA systems in civil aviation.

First, unmanned aircraft are visibly different from crewed aircraft, and the military uses of drones for surveillance and active warfare are frequently discussed in print and electronic media and in online blogs and advocacy sites. This has stirred widespread concern about the privacy and safety of the civilian population.32 The potential proliferation of unmanned aircraft operating in civil airspace will be a new phenomenon that will be much more noticeable than a change in the design of flight controls or structural materials in large commercial transports. Privacy concerns are likely driven in part by public concerns about the collection and dissemination of personal data by government and industry in other areas. Getting past the privacy barrier associated with some UAS applications will require an evolution in public perception and trust. For example, UAS should be designed to ensure that data acquired for functions such as communication or navigation cannot be used to violate privacy.

Second, with crewed aircraft, the public will also take note if and when advances in IA systems enable the safe operation of commercial transports with fewer than two pilots, the currently required number. Over time, IA systems could make it possible to transition from two-pilot aircraft to single-pilot operations and, eventually, to fully autonomous aircraft with no pilots on board. To be sure, aircraft manufacturers are not yet ready to unveil pilotless passenger transports. Even if or when autonomous commercial passenger and cargo aircraft become available, public acceptance and trust will likely remain difficult barriers to overcome. A 2011 study of public perceptions of unmanned aircraft found that 63 percent of respondents would support the operation of autonomous cargo transports with no pilot on board, but just 8 percent would support the operation of autonomous passenger aircraft with no pilot on board. However, for autonomous transports carrying a pilot who could intervene in an emergency, support increased substantially: from 63 to 90 percent for cargo flights and from 8 to 77 percent for passenger flights.33 Driverless cars could perhaps pave the way for trust in pilotless aircraft, although achieving high levels of safety in driverless cars may prove more challenging owing to the dynamics and complexity of the highway environment.

Apart from public concerns about privacy and safety, other social issues are likely to arise. For example, the tendency of people to resist change will likely be a factor. Such resistance is especially likely when the self-interest of individuals or their organizations is threatened, directly or indirectly, by the changes that the deployment of IA systems is likely to bring.

___________________

32 Rachel L. Finn and David Wright, 2012, Unmanned aircraft systems: Surveillance, ethics and privacy in civil applications, Computer Law and Security Review 28(2):184-194.

33 Alice Tam, 2011, Public Perception of Unmanned Aerial Vehicles, Purdue University, Department of Aviation Technology Publications, http://docs.lib.purdue.edu/cgi/viewcontent.cgi?article=1002&context=atgrads.
