
4

Design and Development

This chapter considers three key aspects of industrial engineering methods for system design and development: (1) the need to assess the technological maturity of subsystems and components prior to insertion in a defense system in development, (2) the need to use objective metrics for assessment, and (3) the advantages of staged acquisition. These topics were discussed at the panel’s workshop (see Appendix B).

THE IMPORTANCE OF TECHNOLOGICAL MATURITY

Consequences of Using Immature Technology

Conclusion 4: The maturity of technologies at the initiation of an acquisition program is a critical determinant of the program’s success as measured by cost, schedule, and performance. The U.S. Department of Defense (DOD) continues to be plagued by problems caused by the insertion of immature technology into the critical path of major programs. Since there are DOD directives that are intended to ensure technological readiness, the problem appears to be caused by lack of strict enforcement of existing procedures.

There are many studies that describe problems caused by inserting insufficiently mature technologies in the critical path of acquisition programs, both in DOD and in commercial companies (see, e.g., National Research Council, 2010a, and Box 4-1).

BOX 4-1
Use of Immature Technologies: Consequences

Noteworthy conclusions from four major studies of the consequences of using immature technologies are summarized here.

1.  The “Streamlining Study” of the Defense Science Board was never published, but the Institute for Defense Analysis (1991) produced IDA Paper P-2551, which covered some 100 major defense acquisition programs and reached a firm conclusion: failure to identify technical issues, as well as real costs, before entering into full-scale development (now referred to as engineering and manufacturing development) was the overwhelming cause of subsequent schedule delays and the resulting cost increases.

2.  The U.S. General Accounting Office (1992:49) stated: “Successful programs have tended to pursue reasonable performance objectives and avoid the cascading effects of design instability.…”

3.  More than a decade later, the U.S. Government Accountability Office (2004:2) found: “FCS [Future Combat System] is at significant risk for not delivering required capability within budgeted resources. Three-fourths of FCS needed technologies were still immature when the program started. The first prototypes of FCS will not be delivered until just before the production decision. Full demonstration of FCS ability to work as an overarching system will not occur until after production has begun.” The report also concluded that based upon the experiences of past programs, the FCS strategy was likely to result in cost overruns and delays. In fact, the FCS program was terminated about 6 years later.

4.  At a November 30, 2005, meeting of the Naval Studies Board of the National Research Council, the then newly appointed Department of the Navy acquisition executive, Dr. Delores Etter, reported that she had just attended her first Defense Acquisition Board review, which was for the DDG-1000 (guided missile destroyer) program. She had anticipated that technologies for the program would be an issue with the Under Secretary of Defense (AT&L), DOD’s top acquisition executive, but they were not. The acquisition team had identified 10 high-risk areas that would have to mature in parallel for the acquisition program to meet its performance goals, and the program was approved for entry into engineering and manufacturing development. About 3-1/2 years later, in the summer of 2008, the Department of the Navy requested, and received approval for, termination of the prohibitively expensive program after having spent $10 billion on the first two ships.

The use of immature technology is a primary cause of schedule slippage and cost growth in DOD acquisition programs, and it often results from developers’ overly optimistic confidence in their ability to convert technological advances into reliable components and subsystems in a timely manner. The terminations of the FCS (the Army’s Future Combat System) and DDG-1000 (the Navy’s Zumwalt class of guided missile destroyers) programs, years after their entry into engineering and manufacturing development, are strong evidence of the adverse results of incorporating multiple immature technologies in the critical paths of large, complex product developments.

The dangers of immature technology are just as critical in the commercial sector. Globalization and rapid advances in technology have put immense pressure on industry to continually offer new products with the latest technological features. This pressure in turn has led to shorter product development cycles, increasing the risk of introducing immature and infeasible new technologies. Unlike the situation in DOD, product launch dates in many parts of the commercial sector, such as the automotive industry, are sacrosanct. Any slippage in a launch date has serious financial implications for automotive companies, ranging from millions of dollars in lost revenue for every day of delay, to major disruption across the entire supply chain (whose suppliers may be 10,000 miles away), to upheaval for the marketing group that has already made extensive plans and commitments. Launching a product that is not fully ready also has serious cost implications, including high warranty costs and, more importantly, lost customer goodwill. A major slip in quality at launch has severe consequences: the product may never sell at planned volumes, resulting in major losses for the company.

Faced with these challenges, top management in the commercial sector increasingly approves “pre-spend” money for major programs. This money is spent on technical feasibility studies of perceived program challenges while the program details are still being finalized for approval.1 The studies can cover a wide range of activities, such as establishing the feasibility of aggressive exterior styling, kicking off die development for body panels with long lead times, studying the feasibility of adapting a new powertrain, and obtaining better cost estimates for the project. The pre-spend money is often 1-2 percent of the cost of the overall program. In recent years, major industry programs have been cancelled or delayed on the basis of the technical feasibility analyses funded by pre-spend money, enabling automakers to avoid major losses later in the process.
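For illustration, the gating logic of pre-spend money can be reduced to a simple expected-value calculation. In the sketch below, only the 1-2 percent pre-spend ratio comes from the discussion above; the program cost, the probability that a key technology proves infeasible, and the size of a late-discovery loss are hypothetical assumptions.

```python
# Hypothetical expected-value sketch of a "pre-spend" feasibility gate.
# Only the 1-2 percent pre-spend ratio comes from the text; every other
# figure is an illustrative assumption.

PROGRAM_COST = 1_000.0    # total program cost, $M (assumed)
PRESPEND_RATIO = 0.015    # pre-spend at 1.5% of program cost
P_INFEASIBLE = 0.15       # assumed chance a key technology proves infeasible
LATE_LOSS = 400.0         # assumed loss if infeasibility surfaces late, $M

prespend = PRESPEND_RATIO * PROGRAM_COST

# Without a gate, infeasibility is discovered only after major commitments.
expected_loss_no_gate = P_INFEASIBLE * LATE_LOSS

# With a gate, the pre-spend is always paid, but an infeasible program is
# cancelled or delayed before the large downstream losses are incurred.
expected_loss_with_gate = prespend

print(f"pre-spend outlay:          ${prespend:6.1f}M")
print(f"expected loss, no gate:    ${expected_loss_no_gate:6.1f}M")
print(f"expected loss, with gate:  ${expected_loss_with_gate:6.1f}M")
```

Under these assumptions the gate pays for itself whenever the chance of late-discovered infeasibility times the late loss exceeds the pre-spend outlay, which is why a 1-2 percent outlay can be attractive insurance.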

Speakers at the workshop emphasized the importance of getting considered opinions from qualified domain experts about the maturity of new technologies or of new applications of existing technologies. It was evident that the commercial sector also places a great deal of emphasis on not risking failure by including an unproven technological advance in the critical path of a new program.


1 DOD has provided analogous funding for reducing major defense acquisition program technology risks and for demonstrating the value of new technologies in separately funded “advanced technology demonstrations” and now in the new technology development phase of major defense acquisition programs.


There are indeed examples in DOD of programs that have managed this issue successfully,2 so the department has exhibited the capability to properly assess technological maturity. What is needed is a way to instill, in those leading future programs, a willingness to acquire independent expert input and a collaborative spirit. Such a culture is the responsibility of the most senior DOD acquisition executives and of the secretary of defense. The problems result from the different cultures and practices of the different participants in the requirements development, acquisition, and resource allocation processes, not from stated DOD policies and procedures contained in DOD directives.

The Technology Readiness Assessment Deskbook

The current U.S. Department of Defense Instruction (DODI) 5000.02 of December 8, 2008 (which is consistent with the current DODI 5000.01, certified current as of November 20, 2007) contains the following guidance/requirement regarding technology for acquisition programs3:

Technology for use in product development (starting at Milestone B) “shall have been demonstrated in a relevant environment or, preferably, in an operational environment [emphasis added] to be considered mature enough to use for product development.… If technology is not mature, the DOD component shall use alternative technology that is mature and that can meet the user’s needs or engage the user in a dialog on appropriately modifying the requirements.” In addition, the current 2009 Technology Readiness Assessment (TRA) Deskbook (p. C-5) defines “hardware” readiness levels as follows; a schematic sketch of this maturity gate follows the definitions:

•   TRL 7 as “System prototype demonstrated in an operational environment” and

•   TRL 6 as “System/subsystem model or prototype demonstrated in a relevant environment.”
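To make the quoted rule concrete, the following sketch encodes the TRL 6 floor and the preferred TRL 7 for entry into product development at Milestone B. The program content and the assessed TRL values are hypothetical, and the substitution branch paraphrases the DODI 5000.02 language quoted above.

```python
# Schematic of the Milestone B maturity gate described in DODI 5000.02.
# Technology names and assessed TRLs are hypothetical.

MILESTONE_B_MINIMUM = 6    # demonstrated in a relevant environment
MILESTONE_B_PREFERRED = 7  # demonstrated in an operational environment

critical_path_trls = {
    "radar subsystem": 7,
    "new powertrain": 6,
    "integrated fire-control software": 5,
}

for tech, trl in critical_path_trls.items():
    if trl >= MILESTONE_B_PREFERRED:
        status = "mature at the preferred level"
    elif trl >= MILESTONE_B_MINIMUM:
        status = "minimally mature"
    else:
        # Per the instruction: substitute a mature alternative technology,
        # or engage the user on appropriately modifying the requirement.
        status = "immature: substitute mature technology or modify requirement"
    print(f"TRL {trl}  {tech}: {status}")
```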


2 A recent report on the F/A-18E/F Super Hornet development program is an example of the Navy’s ability to control technological maturity in a major DOD acquisition program. As noted in the report (Center for Naval Analysis, 2009:16), the program “did not over reach on technology or capability demands.” The collaboration of all the parties “allowed the E/F program to develop a clear and focused set of requirements that was simply stated and understood by all. The technology for all requirements was either already in hand, or all agreed to defer the requirement to a later block upgrade when the technology was ready” (p. 28).

3 See DODI 5000.02, Enclosure 2, paragraph 5.d.(4). Available: http://www.dtic.mil/whs/directives/corres/pdf/500002p.pdf [August 2011].

The current (2009) TRA Deskbook does not refer to the “preferred TRL 7” when describing the process for evaluating technology readiness at Milestone B. Rather, it is Title 10 of the U.S. Code (Section 2366b) that requires the milestone decision authority (the person so designated for each program) to certify that the technologies used have been demonstrated to perform at TRL 6 before Milestone B approval.4 This was not true of the previous version of the TRA Deskbook, which followed the DODI 5000.02 guidance.5

The 2009 TRA Deskbook also describes an elaborate process for preparing technology readiness assessments, involving a suggested schedule of 11 months and the selection of an integrated product team consisting of a balanced set of subject-matter experts (SMEs) from DOD components, other government agencies, and, possibly, nongovernment entities. Significant attention and space are devoted to the authorities of the various parties, the “equitable processes” for selecting SMEs, and the desire to arrive at a single agreed-on readiness assessment. However, how to deal with differing interpretations of, or opinions on, technological maturity is not a significant subject in the Deskbook.
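The gap is easy to state in operational terms: when SME ratings diverge, some explicit aggregation convention is needed. The sketch below illustrates one conservative convention, which is our construction and not anything prescribed by the Deskbook: report the minimum rating, and flag any spread wider than one level for explicit adjudication. The ratings are hypothetical.

```python
# One conservative convention for divergent SME readiness ratings
# (an illustration, not a Deskbook procedure).

def aggregate_trl(ratings):
    """Return (assessed TRL, disputed flag) for one technology."""
    values = list(ratings.values())
    assessed = min(values)               # credit only what every expert accepts
    disputed = (max(values) - assessed) > 1
    return assessed, disputed

sme_ratings = {"SME-A": 7, "SME-B": 5, "SME-C": 6}   # hypothetical ratings
trl, disputed = aggregate_trl(sme_ratings)
note = " (disputed: adjudicate before Milestone B)" if disputed else ""
print(f"assessed TRL {trl}{note}")
```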

The panel concludes that the philosophy behind DODI 5000.02 is adequate and that the statements about the preferred level of technology readiness (i.e., TRL 7) for approval at Milestone B are appropriate. However, we have two concerns. One is that the guidance for implementation in the 2009 TRA Deskbook is not adequate (i.e., the sole focus on TRL 6 for Milestone B approval). The second is the insufficient discipline exhibited by most program managers and most DOD acquisition executives, with regard both to the technological maturity of individual components and to the integration of multiple components involving interrelated technologies.

4 See DOD Technology Readiness Assessment (TRA) Deskbook, Section 1, paragraph 1.3.1. Available: http://www.dod.gov/ddre/doc/DoD_TRA_July_2009_Read_Version.pdf [August 2011].

5 The 2003 TRA Deskbook stated (available: http://www.dod.mil/ddre/doc/May2005_TRA_2005_D0D.pdf [August 2011]):

•   a central theme of the acquisition process is that technology employed in system development should be “mature” before system development begins (see p. ES-1);

•   for Milestone B, readiness levels of at least TRL 6 are typical (TRL 7 preferred); and

•   for Milestone C, readiness levels of at least TRL 8 are typical (TRL 9 preferred).

Implementation of DOD Instructions and Directives

The panel also concludes that there is a significant weakness in DOD’s implementation of its own Directive and Instruction for acquisition programs. The proposed solution should not be to lower the standard in DOD’s instruction to the level of just what is required by the U.S. Code (i.e., what happened in the revision of the TRA Deskbook from 2003 to 2009). Good industry practices, as well as past successful (in contrast with unsuccessful) DOD acquisition programs, support a higher level of technological readiness than has been, and is being, exhibited in most recent and current DOD acquisition programs. This view is strongly supported by the report of the first Director of Defense Research and Engineering (DDR&E) to Congress on the technical maturity and integration risk of major DOD acquisition programs.6

6 This report (U.S. Department of Defense, 2010) was written to comply with the Weapons Systems Acquisition Reform Act of 2009 (Public Law 111-23), which requires that the DDR&E submit an annual report. The report, covering 2009, was critical of the technology readiness levels assigned to technologies in the Joint Tactical Radio System and wideband networking waveform, as well as those used in the Army’s first increment of its brigade combat team modernization effort. The particular programs reported on are not as important as the fact that the DDR&E’s critical assessment was either not available to the relevant acquisition decision authority before Milestone B or was available but not appropriately acted on before or at the Milestone B decision point.


The comments from industry participants at the workshop and from several GAO reports (U.S. General Accounting Office, 1999; U.S. Government Accountability Office, 2006) indicate that, in general, technological readiness levels for commercial products are higher than those for DOD programs. There are several possible reasons for this difference. One is that commercial products are subject to product liability lawsuits and product warranties, both of which drive comprehensive performance and reliability testing prior to a product’s introduction on the market. In contrast, with very few exceptions, DOD does not require warranties, nor is the original equipment manufacturer (the contractor) held liable for deficiencies, as commercial manufacturers are. Additionally, most DOD products are at the leading edge of technology in the hope of providing a competitive edge over potential adversaries. Notwithstanding these differences, DOD pays too little attention, in general, to technological readiness prior to beginning system development.

Some of the industry participants at the panel’s workshop suggested that the real problem might be a lack of adherence to criteria in assessing the readiness of new technology. In addition, it was noted that risk assessment of the effects of technology insertion and integration on systems may be poor.

Recommendation 2: The U.S. Department of Defense (DOD) Under Secretary for Acquisition, Technology, and Logistics (USD-AT&L) should require that all technologies to be included in a formal acquisition program have sufficient technological maturity, consistent with TRL (technology readiness level) 7, before the acquisition program is approved at Milestone B (or earlier) or before the technology is inserted in a later increment if evolutionary acquisition procedures are being used. In addition, the USD-AT&L or the service acquisition executive should request the director of defense research and engineering (DOD’s most senior technologist) to certify or refuse to certify sufficient technological maturity before a Milestone B decision is made. The acquisition executive should also


•   review the analysis-of-alternatives assessment of technological risk and maturity;

•   obtain an independent evaluation of that assessment, as required in DODI 5000.02; and

•   ensure, during developmental test and evaluation, that the materiel developer assesses technical progress and maturity against critical technical parameters that are documented in the test and evaluation master plan.

We are aware that a substantial part of the above recommendation is currently required by law or by DOD instructions. In particular, DODI 5000.02 obligates DDR&E to perform an independent technology readiness assessment of major defense acquisition programs prior to Milestones B and C. The director of developmental test and evaluation is supposed to provide an assessment of the test process and results to support this readiness review. Furthermore, DODI 5000.02 requires a cost assessment and program evaluation. In addition, all of the military services currently perform an operational test readiness review and must certify that the system is ready for dedicated initial operational test and evaluation. These certifications, required by DODI 5000.02, have varying degrees of depth and credibility. Recently, the USD-AT&L began performing an independent assessment of operational test readiness, and the director of developmental test and evaluation is tasked to support this effort.

But the panel believes that DOD has been moving in the wrong direction regarding enforcement of an important and reasonable policy as stated in DODI 5000.02. The recommendation of an earlier report (National Research Council, 2006) was also concerned with immature technologies. Our recommendation supports and modifies the earlier one. Our intent is to make it more difficult for advocates to incorporate immature technologies into the critical paths of major DOD acquisition programs.

Suggested Citation:"4 Design and Development." National Research Council. 2012. Industrial Methods for the Effective Development and Testing of Defense Systems. Washington, DC: The National Academies Press. doi: 10.17226/13291.
×

USE OF OBJECTIVE METRICS FOR ASSESSMENT

Conclusion 5: The performance of a defense system early in development is often not rigorously assessed, and in some cases the results of assessments are ignored; this is especially so for suitability assessments. This lack of rigorous assessment occurs in the generation of system requirements; in the timing of the delivery of prototype components, subsystems, and systems from the developer to the government for developmental testing; and in the delivery of production-representative system prototypes for operational testing. As a result, throughout early development, systems are allowed to advance to later stages of development when substantial design problems remain. Instead, there should be clear-cut decision making during milestones based on the application of objective metrics. Adequate metrics do exist (e.g., contractual design specifications, key performance parameters, reliability criteria, critical operational issues). However, the primary problem appears to be a lack of enforcement.

The primary problem, once again, appears to be a lack of enforcement by the component and Office of the Secretary of Defense senior managers responsible for acquisition program oversight. More than one speaker at the workshop stressed that defense systems should not pass milestones unless there is objective, quantitative evidence that major design thresholds, key performance parameters, and reliability criteria have been met or can be achieved with minor product changes; a schematic example of such a gate follows the list below.

The lack of consistent use of objective, quantitative metrics occurs at many points during defense acquisition:

•   the generation of system requirements (see Chapter 3);

•   the timing of the delivery of prototype components, subsystems, and systems from the developer to the government for developmental testing;

•   the delivery of production-representative system prototypes for operational testing; and

•   the passage of systems into full-rate production.
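A schematic example of such a gate follows. The metric names, thresholds, and measured values below are hypothetical; in practice the thresholds would come from the contractual design specifications, the key performance parameters, and the test and evaluation master plan.

```python
# Schematic milestone gate: pass only if every documented threshold is met
# by measured evidence. All names and numbers are hypothetical.

thresholds = {
    "mean time between failures (hours)": 100.0,
    "probability of detection": 0.90,
    "sortie generation rate (per day)": 2.0,
}
measured = {
    "mean time between failures (hours)": 72.0,
    "probability of detection": 0.93,
    "sortie generation rate (per day)": 2.1,
}

shortfalls = {name: (measured[name], required)
              for name, required in thresholds.items()
              if measured[name] < required}

if shortfalls:
    print("HOLD at milestone; unmet thresholds:")
    for name, (got, required) in shortfalls.items():
        print(f"  {name}: measured {got}, required {required}")
else:
    print("Proceed: objective evidence meets every documented threshold.")
```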

The transition from developmental testing to dedicated initial operational test and evaluation (IOT&E) is often driven by schedules rather than by the availability of mature, production-representative test articles.

Articles should not be delivered to IOT&E until the system is performing at a level that will meet operational requirements, as determined by a disciplined operational test readiness review, noted above. It is counterproductive to place a system into operational testing when its reliability is 20 or 30 percent below what is required, in the hope that enough failure modes will be discovered during operational testing to raise the reliability to the required level. More often than not, such a system will need further development, and its operational testing will likely need to be repeated.
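A back-of-envelope calculation shows why. Under a Duane-style reliability growth model, demonstrated MTBF (mean time between failures) grows roughly as accumulated test time raised to a growth rate alpha, so closing even a 30 percent gap requires multiplying total test exposure severalfold. The growth rate below is an assumption, chosen from a commonly cited planning range.

```python
# Why a short operational test cannot close a 30 percent reliability gap,
# under a Duane-style growth model MTBF(t) ~ t**ALPHA. Numbers illustrative.

ALPHA = 0.35   # assumed reliability growth rate (0.3-0.5 is a common range)
GAP = 0.30     # system enters IOT&E 30 percent below required reliability

k = 1.0 / (1.0 - GAP)                   # required MTBF improvement (~1.43x)
time_multiplier = k ** (1.0 / ALPHA)    # required growth in total test hours

print(f"MTBF must grow by a factor of {k:.2f}")
print(f"accumulated test time must grow by a factor of {time_multiplier:.1f}")
# Roughly 2.8x the entire prior test program -- far more exposure than a
# dedicated operational test adds, so the gap must be closed in development.
```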

The passage of systems into full-rate production is typically justified on the basis of the results of a comprehensive operational test, which includes assessment of both effectiveness and suitability. From 2001 through 2006, the director of operational test and evaluation (DOT&E) found that 15 of the 28 systems undergoing IOT&E either were not operationally suitable or were suitable with limitations; 9 of the 28 were found to be either not effective or effective with significant deficiencies. Yet all of these systems were fielded, often with the deficiencies that had been identified during initial operational test and evaluation (see Defense Science Board, 2008).

Although the decision as to which systems in development are and are not fielded is complex, greater rigor in these decisions would reduce the chance of systems being delivered to the field and failing to meet their requirements. Such failure is particularly common with respect to system suitability. In such cases, systems often do not go back into development. Rather, a greater number of systems are purchased to ensure adequate availability (since systems may fail in the field or be under repair), thereby greatly increasing life-cycle costs.7

7 Adolph (2010:53) provides an excellent discussion of these issues: “Rigorous enforcement of key requirement thresholds, along with emphasis on performance in the intended mission environment, should be the norm when entering System Development and Demonstration. Issues that need to be addressed in relation to requirements setting include technology readiness, the translation of requirements into design criteria, with attention to testability at the subsystem and system levels, as well as defining thresholds for key performance parameters.”
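The overbuy arithmetic is simple, as the sketch below illustrates with hypothetical figures: to keep a fixed number of units mission-ready, the fleet must be enlarged by the inverse of steady-state availability, and life-cycle cost scales with fleet size.

```python
# Hypothetical illustration of availability-driven overbuy.
from math import ceil

REQUIRED_READY = 100        # units that must be mission-capable (assumed)
UNIT_LIFECYCLE_COST = 5.0   # assumed life-cycle cost per unit, $M

def fleet_size(availability):
    """Units to procure so that, on average, REQUIRED_READY are operational."""
    return ceil(REQUIRED_READY / availability)

for a in (0.90, 0.75, 0.60):   # steady-state availability (fraction of time up)
    n = fleet_size(a)
    print(f"availability {a:.2f}: procure {n} units, "
          f"life-cycle cost ${n * UNIT_LIFECYCLE_COST:,.0f}M")
```

Dropping availability from 0.90 to 0.60 in this sketch forces the fleet from 112 to 167 units, an increase of roughly half again in procurement and life-cycle cost for the same delivered capability.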

STAGED DEVELOPMENT WITH AN EMPHASIS ON SOFTWARE

As noted in another NRC report (2010:1): “Current fielding cycles are, at best, two to three times longer than successful commercial equivalents… representing multiyear delays in delivering improved IT systems to warfighters and the organizations that support them. As a result, the DOD is often unable to keep pace with the rate of IT innovation in the commercial marketplace.…” A particular issue is the growing importance of software (sub)systems: the functionality of defense systems is increasingly dependent on extremely complicated software.

BOX 4-2
Benefits of Agile Development

“In examining current DOD processes for acquiring IT systems and comparing them with the processes adopted by leading-edge firms in the commercial sector, the committee found stark differences. DOD is hampered by a culture and acquisition-related practices that favor large programs, high-level oversight, and a very deliberate, serial approach to development and testing (the waterfall model). Programs that are expected to deliver complete, nearly perfect solutions and that take years to develop are the norm in DOD. In contrast, leading-edge commercial firms have adopted agile approaches that focus on delivering smaller increments rapidly and aggregating them over time to meet capability objectives. Moreover, DOD’s process-bound, high-level oversight seems to make demands that cause developers to focus more on process than on product, and end-user participation often is too little and too late. These approaches are counter to agile acquisition practices in which the product is the primary focus, end users are engaged early and often, oversight of incremental product development is delegated to the lowest practical level, and the program management team has the flexibility to adjust the content of the increments in order to meet delivery schedules. The committee concluded that the key to resolving the chronic problems with DOD acquisition of IT systems is for DOD to adopt a fundamentally different process—one based on the lessons learned in the employment of agile management techniques in the commercial sector. Agile approaches have allowed their adopters to outstrip established industrial giants that were beset with ponderous, process-bound, industrial-age management structures. Agile approaches have succeeded because their adopters recognized the issues that contribute to risks in an IT program and changed their management structures and processes to mitigate the risks… for the DOD to succeed in adopting new approaches to IT acquisition, the first step is to acknowledge that simply tailoring the existing processes is not sufficient. DOD acquisition regulations do permit tailoring, but the committee found few examples of the successful application of the current acquisition regulations to IT programs, and those that were successful required herculean efforts or unique circumstances. Changes broader than tailoring are necessary; they must encompass changes to culture, redefinition of the categories of IT systems, and restructured procurement, development, and testing processes as identified in this report. In the aggregate, these changes must realign processes that today are dominated by deliberate approaches designed for the development of large, complex, hardware-dominated weapons systems to processes adapted to the very different world of software-dominated IT systems.”

SOURCE: National Research Council (2010a:ix-x).


Conclusion 6: There are substantial benefits to the use of staged development, with multiple releases, of large complex systems, especially in the case of software systems and software-intensive systems. Staged development allows for feedback from customers that can be used to guide subsequent releases.

The workshop speakers on software systems emphasized staged development as part of “agile” development processes: see Box 4-2. In the panel’s view, many elements of the agile processes are not new. What is needed, however, is a systematic approach that ensures that these practices are consistently used throughout system development. If properly implemented, these practices would ensure that defects and weaknesses in a system are detected early so that they can be addressed inexpensively.

Staged development appears to be natural for large-scale complex software systems. It may also be appropriate for some hardware systems: two examples in which substantial upgrades to fielded systems provided major increases in warfighting capability are the Apache helicopter and the M-1 tank.

A good example of the applicability of agile development to hardware systems is the F/A-18E/F, the twin-engine carrier-based multirole fighter aircraft discussed in footnote 2, for which technologies were not inserted in a release until they were determined to be fully ready. This approach is consistent with the agile philosophy. However, each stage must retain, at a minimum, the functionality of all its predecessors, to satisfy the natural expectations of the customer over time. We note, however, that with fluid requirements, the most challenging job is to select a system architecture that can accommodate the likely changes in requirements over the several anticipated stages of development. Meeting this challenge requires foresight about what capabilities may ultimately be requested and care in designing the architecture so that the ultimate system is not overly complicated, heavy, or expensive.
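The staging rule just described, that no release may regress previously fielded capability, can be expressed as a simple acceptance check. The three-block release plan below is hypothetical.

```python
# Acceptance check for staged releases: each block may add capability but
# must retain everything its predecessors fielded. The plan is hypothetical.

releases = [
    {"baseline radios", "core navigation"},                        # Block 1
    {"baseline radios", "core navigation", "networked targeting"}, # Block 2
    {"core navigation", "networked targeting", "new sensor"},      # Block 3
]

fielded = set()
for i, capabilities in enumerate(releases, start=1):
    lost = fielded - capabilities
    if lost:
        # Violates the customer's natural expectation that upgrades not regress.
        print(f"Block {i} rejected: drops fielded capability {sorted(lost)}")
        break
    fielded |= capabilities
    print(f"Block {i} accepted; cumulative capability: {sorted(fielded)}")
```

In this sketch Block 3 is rejected because it silently drops the baseline radios fielded in Blocks 1 and 2, which is exactly the regression an architecture planned for staged growth must be designed to avoid.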


