
Peer Review and Design Competition in the NNSA National Security Laboratories (2015)

Chapter: 3 The Present: From 1992 Until Today

Suggested Citation:"3 The Present: From 1992 Until Today." National Academies of Sciences, Engineering, and Medicine. 2015. Peer Review and Design Competition in the NNSA National Security Laboratories. Washington, DC: The National Academies Press. doi: 10.17226/21806.

3


The Present: From 1992 Until Today

After the testing moratorium in 1992, nuclear explosion testing was no longer an option for the nuclear explosive package (NEP) design laboratories. The Stockpile Stewardship Program (SSP, originally called the Science-Based Stockpile Stewardship Program) was announced in 1995 by President Bill Clinton to sustain the U.S. nuclear weapons arsenal without that source of validation and feedback. The SSP focuses on improving the weapons science base and the computational and experimental capabilities in order to ground the weapon systems in reliable scientific foundations and thus reduce the uncertainties associated with the inability to perform a full-system test of the weapons. At the same time, and for the same reasons, the three laboratories developed and implemented over time an extensive intra- and interlaboratory technical review methodology to enhance their confidence in the stockpile stewardship work that dominates their mission. Approximations to design competitions were also implemented in an attempt to reestablish practices that had been so successful during the Cold War.

TESTING/EXPERIMENTATION

The Department of Energy’s (DOE’s) Stockpile Stewardship and Management Program1 called for construction of a number of major experimental facilities and advanced computing capabilities to strengthen the underlying scientific understanding in the nuclear weapons program in the absence of nuclear explosion testing.2 Among the experimental facilities are the National Ignition Facility (NIF) at LLNL, the Dual-Axis Radiographic Hydrodynamic Test (DARHT) facility and the Los Alamos Neutron Science Center (LANSCE) at LANL, and the predecessors of the pulsed power Z machine at SNL, to name a few. On the computing side, the plan led to programs culminating in today’s Advanced Simulation and Computing (ASC) Program.3 These facilities and capabilities have improved our understanding of weapons physics, have enabled the performance of unique basic science experiments, and have attracted high-quality scientists and engineers to participate in the weapons program. The data generated are useful in peer reviews and for providing feedback to NEP designers, but these facilities and capabilities as a whole do not exercise the complete set of skills needed for actual design and engineering of an NEP.

Sandia National Laboratories

Tests of non-nuclear components and of subsystems and systems integrated with the delivery systems continued after the test ban. In the past and today, testing (with mock NEPs) of nuclear warheads integrated with their delivery systems is conducted for surveillance of stockpile systems and for certification of new designs for Life-Extension Programs (LEPs).

System-level tests and reviews are used extensively in the SNL design process, design qualification, and surveillance. These include flight tests of warheads integrated with delivery systems and various environmental tests on full systems in the laboratory to qualify elements of the stockpile-to-target sequence.

Following the test ban, stockpile stewardship has relied more heavily on modeling and simulation (M&S), especially for NEP work. SNL’s testing and experimentation continues to provide essential input for verification and validation of M&S at the component, subsystem, and system levels. The use of M&S in the non-nuclear design area has been driven by two factors: budget pressure to reduce costly testing (especially system tests) and the desire for better insight that comes from M&S used in concert with the traditional testing approach.

_________________

1 This May 1995 document responded to a Presidential Decision Directive and an act of Congress (P.L. 103-160). The Department of Energy was directed to “establish a stewardship program to ensure the preservation of the core intellectual and technical competencies of the U.S. in nuclear weapons.” http://fas.org/nuke/guide/usa/doctrine/doe/st01.htm.

2 JASON, 1994, “Science Based Stockpile Stewardship,” JSR-94-345.

3 See, for example, http://www.nnsa.energy.gov/asc.


PEER REVIEW

Peer review and experimental validation are time-tested and widely respected mechanisms for reducing risk in research and development. This includes the risk that an unstated assumption or an ingrained practice may bias one’s thinking or the outcome of an analysis—biases that might not otherwise be spotted in the absence of nuclear testing.

In this context, rigorous and effective peer review is needed to increase the degree of confidence the laboratories have in their scientific and technical work, thus contributing in a fundamental way to the laboratories’ ability to perform annual assessments concerning the reliability of the nuclear weapons. This chapter discusses the mechanisms used today by the laboratories for peer review and documents the committee’s view about how effective these reviews are.

At the same time that the laboratories have relied more on peer review, there have been fewer opportunities for design competition. This issue and these opportunities are also discussed in this chapter.

In general, the processes used today for peer review of nuclear weapon designs or development plans are more formal and less ad hoc than they were during the Cold War. Then, as now, all three laboratories have conducted a large number of technical and programmatic reviews using reviewers drawn primarily from their own institutions, as can be seen from the definitions of peer review provided in Chapter 1.

The committee asked each NNSA laboratory to estimate the number of nuclear weapons-related systems and science peer reviews that it participates in during a typical year. The laboratories were asked to classify these reviews into one of three categories: (1) reviews internal to the laboratory; (2) those involving external reviewers but still within the NNSA nuclear weapons complex (e.g., involving another laboratory or plant); and (3) those involving at least some reviewers external to the weapons complex, such as military customers, academics, or international organizations such as the Atomic Weapons Establishment of the United Kingdom (AWE).

The three NNSA laboratories used somewhat different criteria for rolling up their numbers in the different categories. For example, two of the laboratories had included work on foreign nuclear weapons assessments and nuclear forensics, while one had not. As a result, the committee agreed with the laboratories that it would not be useful to compare precise numbers across all three laboratories; rather, the committee draws the following semiquantitative conclusions:

  • Each laboratory participates in some 800 nuclear weapons system and/or weapons science-related peer reviews in a given year.
  • About half of those reviews are internal to the individual laboratories. These reviews are under their control, and they believe this to be about the right number for the work they do. The other half of these reviews involve some experts from one or both of the other NNSA laboratories and/or from outside the nuclear weapons complex.
  • Substantial staff time and resources are involved in organizing, preparing for, conducting, and responding to these reviews.

Since the start of Stockpile Stewardship, the NEP design laboratories and SNL have evolved distinctly different approaches to peer review. Because of the NEP laboratories’ reliance on simulations of complex physical processes, which are inherently imperfect, and SNL’s ability to test a much larger portion of its work, it is reasonable for the three laboratories to have different approaches to peer review, as described here.

Los Alamos National Laboratory and Lawrence Livermore National Laboratory

LANL and LLNL rely a good deal on one another through interlaboratory peer reviews where the NEPs are concerned. There is no perfect model for NEPs, and experiments with nuclear materials tend to be expensive (and nuclear explosion testing experiments are not allowed at all). Therefore, computational simulations using different tools4 are relied on extensively to predict NEP behavior. When teams with different perspectives use simulations based on different assumptions and methods and still obtain similar results, both laboratories tend to trust the scientific validity of those results. In contrast, if the two teams obtain disparate results, as happens sometimes, they work to resolve the differences, and that process leads to an improved understanding of how to model the NEPs.

At LLNL and LANL, about half of the reviews related to nuclear weapons or nuclear weapons science are totally internal. These peer reviews are controlled by the individual laboratories and the number is judged to be appropriate for the work they do. Of the remaining weapon-related reviews, about half are conducted with some participation external to the laboratory, and the other half are conducted with some participation from experts outside the NNSA weapons complex. Substantial effort is required to organize, prepare for, conduct, and respond to all of these peer reviews. In contrast to SNL, the committee did not hear about a formal

________________

4 Currently, the NEP laboratories use different codes and physics models, e.g., different plutonium equations of state.

Suggested Citation:"3 The Present: From 1992 Until Today." National Academies of Sciences, Engineering, and Medicine. 2015. Peer Review and Design Competition in the NNSA National Security Laboratories. Washington, DC: The National Academies Press. doi: 10.17226/21806.
×

written review process in the NEP design laboratories except in the case of INWAP reviews, red team reviews, reviews for the 6.x process (described below), and reviews to resolve significant finding investigations (SFIs) (see the section “Tri-Laboratory, Formalized, or Mandated Peer Reviews of Nuclear Warhead Systems,” below).

With many staff and managers moving from one laboratory to another today, and with continued encouragement from DOE/NNSA and others for the two NEP laboratories to cooperate—including sharing codes and analysis practices—some observers have raised the concern that the laboratories’ independence may be compromised by moving toward more commonality. The laboratory directors should monitor the composition of peer review panels to avoid this possibility.

Sandia National Laboratories

Although Sandia has an advantage over the NEP laboratories in that it is able to test its components and systems, peer review continues to be very important at SNL. One reason is the requirement to be right the first time with hardware introduced into the stockpile; another is the budget pressure to reduce the expense of testing, especially at the systems level.

Sandia has a more formalized and documented approach to peer review than do the NEP laboratories. There are two basic approaches. The first is the more formal interlaboratory peer reviews that are mandated by the NNSA Development and Production Manual5 to occur at key milestones in a weapon or warhead acquisition process. For these reviews, one or more technically qualified individuals from LANL or LLNL may participate, if appropriate for the subject being reviewed. Participation by technically qualified personnel from AWE and other agencies is also allowed, if appropriate. For interlaboratory peer reviews that include production issues, participation by technically qualified personnel from the applicable production agency would be appropriate.

The other type of peer review at Sandia is internal reviews that are conducted to manage technical risk throughout a product’s life cycle. The degree of rigor and the composition of the peer review panel are determined using a graded approach that depends on the degree of technical complexity and risk. The formal documentation of this procedure was finalized in September 2014.6

One requirement for these internal reviews is the inclusion of individuals with recent and relevant technical experience that is independent

________________

5 NNSA Development and Production Manual, Chapter 3.7, Inter-laboratory Peer Review Process.

6 Sandia National Laboratories Product Realization Assurance Committee, NW Realize Product Procedure, RPP-12, Peer Reviews, September 29, 2014.


of the product being reviewed. Although SNL has no full-spectrum external peer organization in the sense that LANL and LLNL are peers of one another, there are times when the appropriate expertise and independence can be found in individuals from outside Sandia, resulting in peer reviews that include experts from other NNSA sites, industry, AWE, universities, and DOD.

SNL management has established within its organization the Independent Surety Assessment (ISA) team to perform surety-related assessments of the SNL systems in the stockpile.7 The ISA team reports to the SNL laboratory director and is populated by laboratory technical experts who are assigned to ISA for as long as 3 years. It is separate organizationally from the activities it reviews. The ISA team has no members outside SNL.8 SNL also conducts some reviews with all external members, reporting to NNSA as well as to the Sandia board’s Mission Committee. The Mission Committee also reviews details of the design and engineering of nuclear weapons, including, for example, specification of tolerances and their impact on manufacturability.

Committee discussions with SNL presenters during the meeting of September 23, 2014 (see Appendix B), focused primarily on peer reviews at the component and subsystem levels. The committee’s probing indicated that these reviews only occasionally involve technical experts from outside SNL, in contrast to SNL’s guidance (adopted later that month; see footnote 6), which calls for broad participation by outside experts in peer reviews. This situation may be due in part to the fact that SNL does not have a counterpart in the United States with the full depth and breadth of expertise in non-nuclear components for nuclear weapons that could mirror the intimate interplay seen between LLNL and LANL with respect to the NEP.

Tri-Laboratory, Formalized, or Mandated Peer Reviews of Nuclear Warhead Systems

Reviews at each of the three laboratories are held with varying degrees of formality, ranging from fairly casual consultations with colleagues to carefully managed and process-driven reviews of high-stakes work. Interlaboratory peer review (IPR) involving all three laboratories is

________________

7 “Nuclear weapons surety refers to the materiel, personnel, and procedures that contribute to the security, safety, and reliability of nuclear weapons and to the assurance that there will be no nuclear weapon accidents, incidents, unauthorized weapon detonations, or degradation in performance at the target.” Quoted from The Nuclear Matters Handbook: Expanded Edition, Nuclear Weapons Council, 2011, Washington, D.C., p. 24.

8 Details about ISA are drawn from J.F. Nagel, “Organization 400 Independent Assessment,” SNL presentation to the committee on September 23, 2014.


often conducted under a more formal process. For example, an independent review is required as part of the Annual Assessment Report (AAR) process for a particular weapon, as prescribed both by law and by directives of NNSA. Another important process in the AAR is the use of a red team, which was stipulated by law in 2003. These teams are formally IPR teams that report to the laboratory director and contain representatives from all three laboratories; their job is to vigorously challenge the assumptions and assessments made by a laboratory design team.

In 2000, the joint DOD-NNSA Nuclear Weapons Council (NWC) formulated a process for the life cycle of nuclear weapons that describes the evolution from concept to retirement in seven phases.9 Currently the U.S. stockpile is largely in Phase 6 (production and maintenance), because present and future stockpile work will be focused on refurbishment10 and maintenance and not on new weapons concepts. In 2004, NNSA formalized the process of preparing for and conducting LEPs for weapons, which is commonly called the “6.x process.”11

NNSA’s 6.x process subdivides Phase 6 into stages that partially map onto the original seven phases of weapon life cycles prescribed by the NWC. An IPR team, populated as needed by experts from the design laboratory not being reviewed and the Department of Defense,12 is to be established in the 6.2 phase (which addresses the feasibility of various design options, the downselection to a provisional design, and a cost study). The IPR and design teams for a given weapon work together to prepare a plan and schedule for the needed peer reviews and their documentation.

A predecessor to the 6.x process—in that it used peer review as an input into planning and executing an LEP—was the dual revalidation of the W76 weapon in 1996 and 1997. The program included intensive and extensive peer review by separate and independent13 LLNL/Sandia Livermore and LANL/Sandia Albuquerque teams to assess the W76 and all of its systems and components in order to identify those components and/or systems that needed to be modified and upgraded in the W76

________________

9 See Nuclear Weapons Council, “Procedural Guideline for the Phase 6.X Process,” April 19, 2000, for detail on phases 6.1 through 6.6. The seven steps may be seen in DOD Instruction 5030.55, available at http://www.dtic.mil/whs/directives/corres/pdf/503055p.pdf.

10 The 2000 NWC memo defines refurbishment as referring “to all nuclear weapon alterations and modifications to include life extension, modernization, and revised military requirements.”

11 See Chapter 3.7 in the latest version, 56XB, Rev. 2, 03-31-14; see also the presentation to the committee by Kevin Greenaugh of NNSA, June 10, 2014.

12 Ibid.

13 Sandia systems design teams were separate and independent but drew mainly from Albuquerque for their component groups.


LEP.14 The two teams worked independently in a multiyear effort to assess and revalidate the ability of a specific weapon to meet its military requirements. Each team was required to evaluate the weapon’s design, test history, and history of surveillance in depth, a process that called for true peer expertise. This intellectually and technically competitive effort was presented to the committee as an example of a peer review, not a design competition. In addition to being the first intensive post-moratorium peer review of a weapon system, this activity helped in setting up the steps in an LEP process and established lessons learned for future peer reviews. However, the dual revalidation process was not continued; it was replaced by smaller and less formal peer review processes (e.g., INWAP; see below) that were perceived to be less cumbersome.

A somewhat similar review effort took place in 2001 when LLNL was given the task of refurbishing and rebaselining15 the NEP for the W80 cruise missile warhead, a LANL design. This was the first example of a weapon whose continuing evaluation was assigned to the laboratory that had not designed it. While neither a peer review nor a competitive design, this project had aspects of both, because it was necessary for the LLNL team to thoroughly review the warhead’s design as it planned its refurbishment—a process that might involve redesign and remanufacture of components of the NEP as well as redesign by SNL of components outside the NEP. Subsequently, the original NEP design laboratory (LANL) conducted several peer reviews of the LLNL work, and responsibility for the NEP was transferred to LLNL.

Another example of formalization of peer review is the Independent Nuclear Weapons Assessment Process (INWAP), which is part of the input to the annual assessment letter of the stockpile written by the LANL and LLNL directors. INWAP came about through the initiative of the NEP design laboratory directors and the NNSA in 2008, and it was subsequently authorized and funded by Congress. Under INWAP, each laboratory director is supported by a peer review team established at the other laboratory for an independent assessment of issues specified by the director regarding the safety, reliability, and effectiveness of each stockpile weapon for which his laboratory is responsible.16 That peer review team assesses the given weapon—using the team’s own approaches and computational models of that weapon and exercising their own tools to simulate weapon performance against the weapon’s requirements—and

________________

14 The Navy Strategic Systems Programs (SSP) office had a considerable role in the final decision on the scope of the W76 LEP.

15 Baselining a nuclear weapon involves constructing a physics and simulation model of the weapon that is capable of reproducing, to the needed accuracy, the nuclear and nonnuclear test history of that weapon.

16 See LANL, LLNL, and SNL, “INWAP Implementation Plan,” Initial Release 1.0, Effective February 16, 2010.


reports directly to the requesting director.17 The requesting laboratory director then incorporates the independent assessment and the response from his own laboratory’s design team as input to his annual assessment letter.

There are a number of positive aspects to the INWAP approach:

  • It encourages each laboratory to regard the stockpile as a shared responsibility held jointly by all of the laboratories and plants, instead of as something in which one laboratory is responsible for a specific weapon. This builds trust and collaboration while at the same time valuing the differences in technical approaches, tools, and expertise.
  • Because previous laboratory directors were personally involved in establishing INWAP, it is clear to all concerned that the heads of the NEP design laboratories value the benefit of alternative approaches and tools to inform their judgment. Because the results of the reviews are solely for the use of the laboratory director and no funding decisions are involved, the stakes are lower, and any technical exchanges between the teams are much more likely to be constructive.
  • If an issue arises with the NEP of a particular weapon, the two laboratories are well prepared to assist in addressing the issue, having already performed the necessary baseline work.
  • As a result of the process whereby two technical teams fully share their approaches, expertise, and results, each team improves its own understanding of the weapon and areas that may be of future concern with respect to weapon reliability. The committee sees this as a strong incentive for bench scientists to engage in peer reviews.

According to some current and former laboratory directors, one important key to the success of INWAP is that it is carried out between the laboratories in such a manner that there is no winner and no loser, only an improved understanding of the stockpile. INWAP, which is strongly supported by the laboratories, is very beneficial because it accomplishes just that.

________________

17 This emphasis on using independent approaches and models stems from the recognition that no one model or simulation is a perfect proxy for reality. Good models provide insights, and examining a phenomenon through multiple models can provide a broader set of insights while reducing the risk of being misled by spurious model-dependent results. Multiple steps are needed to represent a model in a computer code, and the use of different approaches (e.g., different numerical algorithms), none of which are uniquely suited, similarly reduces the risk of being misled by a particular choice.


Other Types of Review

As noted above, the committee learned that all three laboratories conduct a large number of reviews (many hundreds) of nuclear components, technology, or systems in any given year. Most of the reviews require staff to invest a substantial amount of time in preparation, which must be weighed against the value gained from the review.

Many of these reviews are required by each laboratory’s management and operating (M&O) contract and are initiated by the M&O contractor’s governing board, laboratory management, or laboratory project/program managers. Review bodies initiated by, and reporting to, the M&O contractor board tend to be populated by a wide spectrum of people from outside the laboratory with backgrounds relevant to the scope of the laboratory’s mission. Often they involve experts who work in the weapons program but who are not associated with the project being reviewed. In some instances, a few experts may come from other sites within the NNSA complex: they are generally included on an ad hoc basis when their expertise has been judged to be important to strengthen the review group. The frequency of these reviews typically ranges from quarterly to annually.

While these bodies may be asked to review selected laboratory capabilities, they are not nuclear weapons review panels. Rather, they are considered advisory boards or committees that assess the quality and direction of laboratory research; they may also advise on basic research priorities, the technical health of the laboratory, and overall laboratory (or laboratory division) strategic direction and planning. Within the definitions given in Chapter 1, assessments by these bodies would fall into the category of reviews by subject-matter experts; they are not peer reviews per se because most of the members are not weapons scientists or engineers and hence not “peers” in the strict sense of the word. However, the committee views these external boards as essential to the health of each laboratory and to the long-term viability of the national security effort within NNSA. Notwithstanding that benefit, because these reviews can require considerable preparation time and effort on the part of laboratory personnel that could otherwise be spent on core research and development, the committee counsels that these reviews be targeted thoughtfully and efficiently.

While not falling within the realm of the peer reviews that are the subject of this study, it is useful to recall those advisory panels that exist to provide input to each of the laboratories. For example, SNL has established the Nuclear Weapons External Advisory Board to give critical reviews and advice to Sandia; it reports to the SNL deputy director. It is composed of members of the Air Force, the Navy, the NEP laboratories, and the U.S. Strategic Command (STRATCOM). The NEP design laboratories have analogous advisory panels. Most of these advisory panels


report to senior management at the relevant laboratory and meet once or twice a year.18

In addition to laboratory-conducted reviews, the committee notes that STRATCOM conducts an annual review of the status of the nuclear weapon systems deployed. This review is conducted by individuals retired from the weapon design laboratories and retired members of the DOD nuclear forces. Its outcome contributes to the Strategic Commander’s input into the annual assessment letter and has historically identified areas of future research or heightened surveillance for the design laboratories. The committee understands this to be the only example of a standing technical review of a weapon or a weapon system and notes that it is conducted at the weapon system level. While this annual review does not delve into the technical depth the committee would associate with a true peer review, committee members associated with this process believe that its format has been very valuable in ensuring that the warheads are reliable.

Examples of Value Provided by Peer Review of Nuclear Warhead Work

In discussions during committee site visits at all three laboratories, examples were cited that highlighted the value of peer review. These examples showed how individuals outside the NNSA complex (e.g., from AWE, U.S. industry, or academia), and on occasion without nuclear weapon experience, provided valuable insight into a problem that had not been fully recognized by the design laboratory.

Some examples that were provided by the laboratories of the value of external peer review include the following:

  • Plutonium materials properties. These include the equation of state (EOS) and kinetics properties (e.g., behavior under different strain rates and different timescales for different phenomena). A clear understanding of these properties of plutonium is critical to the analysis of weapons performance. However, the two NEP design laboratories develop the EOS using fundamentally different approaches and from very different perspectives. In recent years, intensive evaluation of how each laboratory’s approach does, and does not, fit the data led to resolution of some long-standing differences. As a result, the two laboratories have attained a deeper and joint understanding of the EOS for plutonium, decreasing uncertainties associated with NEP simulations.

________________

18 Private communication between Gerry Sleefe of SNL and committee member David Overskei, October 9, 2014.

  • Safety architecture signal in an arming, fuzing, and firing (AF&F) system. A presentation to the committee by SNL19 illustrated how a mathematical error had gone unrecognized by SNL staff, in spite of in-house reviews, until it was identified by an external peer reviewer from AWE who had not worked on weapons systems. The Air Force subsequently implemented changes to mitigate this problem.
  • Pit lifetimes and plutonium aging. As plutonium ages, it could change in ways that would have important implications for the performance of pits and the long-term viability of the stockpile. Until fairly recently, LANL and LLNL had different views on the extent to which the aging of plutonium is a concern, and they were designing experiments to provide information on plutonium aging and its impact on predictions. An individual without weapons experience but with nuclear fuel experience suggested an approach that resulted in better understanding of plutonium aging as it relates to pit performance and lifetimes, and this method has been adopted by both laboratories.20

All three laboratories also cited examples in which their internal review processes caught significant design or production issues that were then rectified. In one example cited by SNL, Sandia recognized the need to seek additional options to reduce cost in the B61-12 LEP. Peer review teams were formed to focus on partial component reuse, and a new option was identified that saved over $1 billion.21 Several additional examples involved deterioration of a system or component that was found during internal peer review. In selected cases, though, the concern was not completely addressed by the production team. In one such case, it was suggested that more rigor in the peer review process would likely have provided the impetus to push for a more robust solution, because the results of the peer review had not been solid enough to convince the production team to invest resources in a rectification.22

In addition to the above list, some examples were given in which peer review identified issues and recommended actions, but the customer (Air Force or Navy) did not implement the suggested mitigation approaches, perhaps because of cost, impacts on other DOD systems, or differing priorities.23 In such cases, peer review still provides value by offering options that a decision maker can consider, and also by increasing the degree of confidence all involved have in the thoroughness with which weapons systems have been scrutinized.

________________

19 Jeff Brewer, “Unique Signals Technical Basis Peer Review,” SNL presentation to the committee on September 23, 2014.

20 This example arose during general committee discussion with LLNL staff on November 14, 2014.

21 Gary A. Sanders, Sandia National Laboratories, “National Academy of Sciences Study of Peer Review,” presentation to the committee on June 10, 2014.

22 Discussion following presentation by Steve Harris, “B61 Spin Rocket Motor Igniter Peer Review,” September 23, 2014.

It is worth noting that few, if any, of these issues would have been found or addressed by nuclear explosion testing. While testing results provided a crude form of validation (or demonstration of failure) of a particular design, teasing out the reasons for test successes or failures given all of the underlying variables, as well as predicting the future performance of aging weapons, has required an increased understanding of weapon physics and rigorous peer review of calculations and experimental results—both during the testing era and up to the present.

DESIGN COMPETITION

As noted in Chapter 1, there have been no full design competitions for new weapons in more than 20 years.

Los Alamos National Laboratory and Lawrence Livermore National Laboratory

Recent interlaboratory competitions capture some of the flavor of the design competitions of the Cold War years, but there has been no recent design competition at the NEP laboratories that—as often occurred during the testing era—culminated in the actual production of a prototype to verify that the weapon design was producible or viable.

One of the design studies was for a warhead to be included in a new Air Force cruise missile. It included an extensive look at the NEP, and it was decided that an LEP would be conducted on the existing W80 warhead to fill this need. However, that program has been postponed to the mid-2020s. Another example is the 120-day study on the 3 + 2 initiative.24 The goal of this study was to look at the strategic missile warhead stockpile as a whole and consider how the stockpile could be refurbished to provide three weapons for the ICBM/SLBM forces. The intent was to improve the safety and security and also to minimize the number of warheads that need to be kept in reserve as a hedge against a failure in the deployed force, because the warheads could be carried on either Navy or Air Force missiles. The idea was to provide a certain amount of interoperability and commonality of parts among the three stockpile weapons through redesign and reuse. Part of the initiative involved the competitive design of several interoperable warheads that could be used by both the Navy and Air Force legs of the Triad. However, this program was paused in 2013 for 5 years by the Air Force and NNSA, and in neither case did the studies go beyond computational simulations of the potential options.

________________

23 Discussion on January 22, 2015, with senior staffers from the Navy’s Strategic Systems Program.

24 This initiative will move the nation toward a stockpile consisting of three interoperable warheads deployed across the submarine-launched ballistic missile (SLBM) and the intercontinental ballistic missile (ICBM) legs of the Triad and two interoperable air-delivered warheads or bombs. See U.S. Department of Energy, 2014, Fiscal Year 2015 Stockpile Stewardship and Management Plan, Report to Congress, http://nnsa.energy.gov/sites/default/files/nnsa/04-14-inlinefiles/2014-04-11%20FY15SSMP_FINAL_4-10-2014.pdf.

In the Reliable Replacement Warhead (RRW) design study, both LLNL/SNL-California and LANL/SNL-New Mexico produced rather innovative RRW designs. The selection process had each team critically review the opposing team’s design in a large meeting that included staff from DOD and NNSA in addition to the designers and their management. Although substantive improvements to both designs were discovered through this opposing team review, the fact that the results of the reviews were presented in such a forum, and somewhat early in the process, created deep-seated negative feelings on the part of the two NEP laboratories and mistrust of NNSA that still exists. These unfortunate aspects of the manner in which the RRW competition was conducted were compounded by the fact that the program was canceled by Congress before either design had been validated by completing engineering and manufacturing a prototype—essential elements of any successful design competition. Specifically, in order to adequately exercise their design skills, designers must “close the loop” and, at the very least, receive feedback from the real world about whether their design is practical and can be manufactured.

Sandia National Laboratories

Sandia designers of non-nuclear components, subsystems, and systems integrated with the delivery system have more opportunities to exercise their skills today than do designers at the NEP laboratories. In addition to addressing the problem of deterioration, designers must replace components when obsolete technology is no longer amenable to remanufacture. Extensive LEPs for the W76 and B61 have also enabled Sandia designers to exercise their skills. The committee notes, however, that in these LEPs, the SNL designers were constrained to meet the interface requirements of the old NEPs, which were not changed. Thus, these cases do not stimulate the kind of innovation and creativity provided by a “clean slate” design competition.


CONCLUDING COMMENTS

Based on its extensive experience with nuclear weapons science and technology, the committee concludes that in the absence of nuclear testing, strong peer review and design competition create a higher level of confidence in the nation’s stockpile than could be generated without them. The ultimate measure of success of peer review and design competition will be a sustained, competent, and creative workforce capable of responding to emerging challenges. The judgment of that workforce will be informed, in turn, by extensive and deep engagement with the scientists and engineers in an environment that fosters and encourages free questioning and checking of one another’s work.
