In the FY2010 National Defense Authorization Act, P.L. 111-84, Congress directed DOE to request the National Academy of Sciences to review the quality of science and engineering (S&E) research at the three national security laboratories: Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), and Sandia National Laboratories (SNL). Specifically, Congress mandated that:
(a) IN GENERAL.—Not later than 60 days after the date of the enactment of this Act, the Secretary of Energy shall enter into an agreement with the National Academy of Sciences to conduct a study of the following laboratories:
(1) The Lawrence Livermore National Laboratory, California.
(2) The Los Alamos National Laboratory, New Mexico.
(3) The Sandia National Laboratories, California and New Mexico.
(b) ELEMENTS.—The study required under subsection (a) shall include, with respect to each laboratory specified in such subsection, an evaluation of the following:
(1) The quality of the scientific research being conducted at the laboratory, including research with respect to weapons science, nonproliferation, energy, and basic science.
(2) The quality of the engineering being conducted at the laboratory.
(3) The criteria used to assess the quality of scientific research and engineering being conducted at the laboratory.
(4) The relationship between the quality of the science and engineering at the laboratory and the contract for managing and operating the laboratory.
(5) The management of work conducted by the laboratory for entities other than the Department of Energy, including academic institutions and other Federal agencies, and interactions between the laboratory and such entities.
The principal motivation of Congress for this study is given in the conference report associated with this act:1
There is a growing concern about the ability of the Department of Energy to maintain the overall quality of the scientific research and engineering capability at the three laboratories. This concern was most recently highlighted in the report of the Congressional Commission on the Strategic Posture of the United States. The conferees believe that an even handed, unbiased assessment of the quality of the scientific research and engineering at each of the three laboratories, with a clear understanding of the criteria used to measure quality and what factors influence quality would be useful in long-term planning for the operations of the laboratories.
The study was divided into two consecutive phases: in phase I, a committee examined management issues, and in phase II a second committee assessed the quality of the science and engineering research.2 The phase I report, released on February 15, 2012, addresses Tasks 4 and 5 and partially addresses Task 3; roughly speaking, how management at all levels affects the quality of the S&E at the three laboratories. The phase I study identified major management concerns that have the potential to impede the conduct of high-quality work at all of these laboratories. Phase II of the study, which evaluates the quality of S&E in key subject areas, was begun after the release of the 2012 report. This report presents the results of that second-phase effort.
1 U.S. Congress, H. Report 111-288, 2010, p. 910.
The research and engineering programs of these three laboratories are very broad, as befits an enterprise whose total annual budget approaches $7 billion and which employs thousands of scientists and engineers. In addition to their diversified programs for the National Nuclear Security Administration (NNSA), they conduct significant work for other parts of the DOE, the Department of Defense, the Department of Homeland Security, and the intelligence community, as well as for many other sponsors and partners. Although each laboratory is a separately managed entity, the three form an integrated enterprise that performs a unique national security mission; inter-laboratory collaboration is an important pillar of their work, particularly in the nuclear weapons component of that mission. LANL and LLNL are the nation’s sole centers for work on the “physics package,” while SNL provides a unique function for engineering the non-nuclear components of warheads and integrating warheads into delivery systems. In a very general sense, LANL and LLNL are science laboratories that have (of necessity) vigorous advanced engineering capabilities, while SNL is an advanced systems engineering laboratory that maintains a vibrant science base.
The science base at all three institutions includes work in basic science, weapon science, nonproliferation, energy, and a long list of other mission areas and disciplines. One strength of these laboratories is the interconnections among work in various related areas. For example, materials science and engineering support the weapons program (including nonproliferation) as well as energy research and development and many other applications. Research related to the release of energy in nuclear reactions is directly relevant to weapons and nuclear power.
A complete, in-depth evaluation of all the work at the three laboratories is beyond what a single committee of individuals can do in a 1-year NRC study. Therefore, after obtaining agreement from the study sponsors at NNSA and relevant congressional staff, the committee’s efforts were focused on those areas of research, development, and engineering that are most closely aligned with the laboratories’ unique primary mission—that of maintaining the nuclear weapons deterrent. While the study committee examined how well some specific projects are executed,3 it looked primarily at factors that could inhibit the quality of S&E. It concluded that the long-term S&E quality is driven by foundational capabilities: the technical caliber of the S&E staff, plus capabilities such as strategic planning and support; relevance of the work to the advancement of the S&E field and to the mission; integration with other work at the laboratory and connections with the larger technical community; adequacy of facilities, equipment, infrastructure, and other resources; and sustainability of the workforce. This decision was also discussed with and agreed to by the NNSA study sponsors and relevant congressional staff.
An evaluation of capabilities, however, requires—to some degree—an assessment of current work. That evaluation provides insight about the current state of many of the key capabilities, such as the state of facilities, planning and support, and the quality of the workforce. Because the capabilities listed above are foundational, their quality affects all of the laboratories’ S&E work, including research on energy topics, basic research, and work for others; thus, this capabilities-focused evaluation also provides insight about the quality—or at least the upper limit that can be attained—in these other areas. Therefore, the analysis presented in this report with respect to the weapons mission should have some relevance for judging capabilities to perform quality work for the other responsibilities of the national security laboratories, including work for others.
2 This division was largely motivated by security concerns. However, it facilitated appointing two different study committees, one focused on management and one on science and engineering.
3 The projects were raised during discussions with the study committee and not selected in any systematic way prior to those meetings. The key areas of S&E in which these projects fell, however, were selected prior to the laboratory meetings, as discussed in the text below.
To support this approach, the study committee organized itself into study teams focused on four broad areas: (1) science base for nuclear weapons; (2) nuclear weapons design; (3) modeling and simulation; and (4) systems engineering and system aging. The science base team focused on materials physics, chemistry, and engineering; condensed matter at extreme conditions; high-energy-density science; and radiation transport/hydrodynamics. Each study team employed an extensive sampling approach to its task. This included discussions with managers having broad responsibilities across major areas and numerous meetings with scientists and engineers conducting specific projects within those broad areas.
To conduct this work, the NRC formed the Committee to Review the Quality of the Management and of the Science and Engineering Research at the Department of Energy’s National Security Laboratories—Phase II, whose members were carefully chosen to provide broad and deep applicable expertise and experience in the four topic areas. To provide continuity with the first phase of the study, several members of the phase I study committee—including the two co-chairs—agreed to serve on the phase II committee. The phase II committee was assembled so its expertise spanned the four topic areas across relevant national and international communities, and its members had had various direct and indirect interactions with one or more of the NNSA laboratories.
The study committee began its work with an open-session meeting with staff from several NNSA offices to gain clarity about sponsor concerns. This was followed by a closed session at which it established a framework for how it would carry out its assessment. While the laboratories and NNSA regularly collect some quantitative metrics related to the quality of both projects and capabilities, given the resource and time limitations inherent in this study, the committee decided to rely primarily on qualitative data that it could receive and evaluate itself. The study committee, therefore, constructed the study’s framework in a manner intended to impose as much rigor as possible.
First, to collect and assess its data, the study committee chose to meet with a broad and diverse selection of staff at the three laboratories in informal discussions with a format similar to that used in the phase I study. Laboratory staff made short presentations and then engaged in discussions with the appropriate study team. Much like the phase I study, questions raised by the study team were intended to elicit information that would be most useful to the study committee in its deliberations.
The study teams selected the laboratory staff with whom to meet so as to examine major S&E areas critical to the weapons mission of the laboratories (e.g., materials science and engineering, advanced computing, high-energy-density science, and weapons-design codes). Each study team determined these key areas for its subject area based on the experience and expert judgment of its members and on discussions with NNSA personnel at the first meeting. This information was given to the laboratories with a request to arrange sessions for a given study team with appropriate laboratory staff involved in these areas.
Second, in this context, the committee developed a set of criteria to guide its discussions with laboratory staff and its review and analysis of the results of these meetings.4 These criteria were not applied as a checklist; rather, they constituted a set of considerations from which committee members drew, and to which they added, as appropriate, for a given area of work. The following criteria were used by the study teams:
4 These criteria take into account standard processes, such as those recounted in National Research Council, Best Practices in Assessment of Research and Development Organizations, The National Academies Press, Washington, D.C., 2012.
1. Is scientific research and engineering in support of the missions well managed and well executed?
a. Conduct of the work: Is the work executed well? Are the results of high quality compared to similar work elsewhere? Is this judgment supported by recognized objective measures? Is the work innovative, creative, insightful?
b. What are the unique S&E accomplishments and impacts?
c. Was the work well-planned and well-prepared? Is there a reasonable strategic plan, including planning for future funding?
d. Are there alternative research and development (R&D) paths that are not being pursued, but which would better meet specific missions?
2. Is the work relevant to the advancement of the field, the advancement of the mission(s) (current and anticipated/emerging), and the advancement/continuation of this area of work at the lab?
3. Is the work effectively integrated with other work within the NNSA laboratories as appropriate?
4. Is this work a good use of laboratory resources?
a. Could the needed information have been obtained by monitoring work elsewhere?
b. Does it contribute to attracting new staff, sponsor interest, development of new laboratory capabilities?
5. Are the major facilities, equipment, and infrastructure necessary and sufficient for the missions? Is effective use being made of available facilities and equipment?
6. Is there an appropriate workplace culture, with evidence of enthusiasm, dedication, innovation, empowerment, flexibility and agility, leadership and mentoring, access to resources, and risk tolerance?
7. Is the workforce healthy, i.e., capable and sustainable?
8. Are scientists and engineers appropriately connected within the laboratory and with the broader S&E communities (national and international, both academic and industrial)?
a. Are grand challenges and a vision for the future defined and appropriate for national laboratory missions?
b. Do operations divisions appropriately support S&E?
c. Is Laboratory Directed Research and Development effectively addressing and preparing for future missions?
d. Is peer review appropriately used to evaluate S&E work?
e. What are the values of, and trends in, traditional professional metrics: publications, citations, invited talks, awards, and patents (in both classified and unclassified domains)?
In addition to assisting the committee with its analysis, the criteria are offered in response to the third item of the committee’s charge, as an indication of the multiple dimensions needed to assess the quality of an R&D laboratory and its S&E. Note that most of these questions must be addressed subjectively by peers who understand how to interpret the answers and who know which questions are most important at a given time. That is especially true when assessing the long-term capabilities and the risks that may affect a laboratory’s quality, as is the case in this study.
Third, the committee agreed that its assessment of these data would rely primarily on the collective experience, technical knowledge, and expertise of its members, whose backgrounds were carefully matched to the technical areas within which the activities of the laboratories are conducted. Because state-of-the-art work is best evaluated by peer judgment rather than by quantitative metrics, the committee applied a largely qualitative approach to the assessment. In examining key areas of S&E— including examination of illustrative projects and programs—the committee’s goal was to identify salient examples of accomplishments and opportunities for further improvement with respect to capabilities; evaluate the technical merit of these capabilities; assess their relevance to the laboratories’ missions; and
evaluate specific elements of the laboratories’ resource infrastructure that are intended to support the capabilities and technical work.
During the meetings at the three laboratories, committee members met with laboratory directors and other senior management and with more than 300 mid-level managers, senior scientists and engineers, and early career scientists and engineers. These meetings included presentations, discussions, and poster sessions. Some of these discussions involved personnel from more than one laboratory so that inter-laboratory coordination could be assessed. Subsequent to the meetings, committee members asked for, and received, supporting materials. As envisioned in the study’s statement of task, the quality of a laboratory’s S&E is intertwined with the quality of its management. Discussions, therefore, often included topics that had been raised in the phase I report, with particular emphasis on how these matters affect the ability of scientists and engineers to do high-quality work.
Chapters 2-5 present the findings and recommendations in the four subject areas—nuclear weapons design; systems engineering and system aging; the science base for nuclear weapons; and modeling and simulation. Although each of these chapters addresses the same general matters, they are not organized identically. In each case, the chapter organization is driven by the specifics of the subject area. Each presents a snapshot assessment of the quality of current work, as observed by the study team, and a broader assessment as described above. These chapters all address the major areas of concern that emerged across all discussions and data-gathering, as reflected in the summary: experimental science; facilities; work environment; and recruitment, retention, and continuity of knowledge and experience. In addition, each chapter raises issues of importance to individual subject matter areas.
Chapter 2 discusses the nuclear weapons design activities at the three national security laboratories, presents the study committee’s assessment of the quality of the design work being conducted, and briefly discusses how that work draws from and is connected to the other major areas assessed in this report.
Chapter 3 addresses systems engineering and aging. Weapon design and systems engineering are the direct bases of “the product”—that is, the ability to certify the safety and reliability of the nuclear weapons stockpile now and in the future (and to understand nuclear activities outside the United States). Understanding aging is critical to stockpile stewardship. Like weapons design, engineering and aging work draws heavily from work conducted in the science base and the incorporation of the results of that work into computer codes.
Chapter 4 is the assessment of the science base. The study team for this subject area chose to focus on four areas that are most germane to the nuclear weapons mission: materials science, chemistry, and engineering; condensed matter/materials science at extreme conditions; high-energy-density science; and radiation hydrodynamics/transport.
Chapter 5 addresses the laboratories’ capabilities in modeling and simulation.
Chapter 6 provides overarching observations on S&E quality at the laboratories and summarizes concerns that were raised in connection with more than one of the subject areas covered in Chapters 2 through 5.