
C

Review of Relevant Studies and Reports 1995-2010

As part of this study, the study committee reviewed a number of relevant studies conducted in the period 1995-2010; these are listed at the end of this appendix. This appendix summarizes what those studies said about issues relevant to this report. It is not an exhaustive analysis, in that (1) it does not review all matters addressed in the referenced reports, only those directly relevant to the work of the study committee, and (2) the list of major reports reviewed does not include every study of possible relevance.

This appendix first summarizes the four major issues that emerged consistently from the reviewed studies and then discusses each of these issues in greater detail.

EVOLVING AND PERSISTING ISSUES IN THE MANAGEMENT OF THE NUCLEAR WEAPONS LABORATORIES

Several issues have persisted and evolved in the management of the nuclear weapons laboratories since the mid-to-late 1990s. These issues have one theme in common: the absence of an effective governance structure. Four issues involving laboratory management, for which advisory groups continue to find evidence, pervade the weapons complex:

1. An unclear commitment to, and view of, the laboratory mission;

2. An unstable workforce and the lack of an adequate plan to maintain core competencies;

3. Unclear roles and responsibilities assigned to DOE/NNSA headquarters and to the offices and programs within the laboratory governance structure, and ill-defined and duplicated lines of authority and oversight, including the failure of NNSA to achieve its intended independence; and

4. An excessive number of reviews and excessive oversight by external organizations, particularly by the Defense Nuclear Facilities Safety Board.

Issue 1: An unclear commitment to, and view of, the laboratory mission.

It is evident from reports published in the mid-to-late 1990s that this was a hectic and disorganized period for the laboratories. The testing of nuclear weapons ended in 1992, and with the establishment of the Stockpile Stewardship Program, national priorities and the mission of the laboratories were changing in response to the end of nuclear testing (GAO, 1995). During this period, the laboratories were confused about which priorities should be deemed matters of national importance and commitment. Many reports cite the Department of Energy’s (DOE’s) lack of direction as a cause. A 1995 GAO report examining the laboratories’ missions stated that the laboratories lacked clearly defined missions and had failed to adapt them to changing national priorities and evolving Department objectives, despite recommendations from advisory groups to redefine the laboratory missions.


The 1995 Task Force on Alternative Futures (a.k.a. the “Galvin Task Force”) believed it was neither appropriate nor an efficient use of resources for the laboratories to acquire new mission areas outside of their traditional ones, including developing technologies for the private sector (DOE, 1995). The Task Force observed “excessive scrambling” on the part of the laboratories to acquire such new mission areas. While it approved of applying laboratory capabilities such as “high performance computation, advanced materials, energy technologies, and systems engineering” to other national priorities, it cautioned that:

These activities should be carefully managed, are not likely to evolve into “new missions” per se, and should not be a license to expand into areas of science and technology which already are being addressed effectively or more appropriately by other Research and Development (R&D) performers in government, academia and the private sector (DOE, 1995).

The Galvin Task Force expressed concern that expanding the laboratories’ roles to serve the needs of private industry was likely to distract them from their public missions, diverting both intellectual and material resources away from those missions. The Task Force described these activities as “add-ons” to be managed on a case-by-case basis. It stated that “the laboratories might be more likely to propose industrial programs merely based on ‘make work’ criteria” if their work expanded outside DOE mission areas. In addition, laboratory work performed for private industry was unfocused. It was unclear to the Task Force how large and broad-ranging these activities should be, how they should be funded, and how they should relate to the laboratories’ primary mission areas, “in particular, whether industrial competitiveness should be viewed as a primary or a derivative function.”

In the early 2000s, several reports, including the Report of the Commission on Maintaining United States Nuclear Weapons Expertise (a.k.a. the “Chiles Commission Report”) and the FY 2000 Report to Congress of the Panel to Assess the Reliability, Safety, and Security of the United States Nuclear Stockpile (a.k.a. the “Foster Report”), stressed the need to strengthen the national commitment to the stockpile stewardship mission or risk losing the ability to recruit and retain highly qualified scientists (Chiles et al., 1999; Foster et al., 2001). The 2000 Foster Panel noted that the stockpile stewardship mission was different from other nuclear weapons missions the laboratories had been accustomed to, and thus required a different approach from “the continuation of past technical activities”:

It is not possible to attract or retain a world-class staff absent clear articulation of this new stewardship mission and its national importance, and without a credible multi-year program. NNSA, working with DOE leadership, DOD, the President, and Congress must restore the sense of mission, rationalize the work program, and demonstrate commitment to stockpile stewardship (Foster et al., 2001).

The Secretary of Energy Advisory Board’s (SEAB’s) 2005 Nuclear Weapons Complex Infrastructure Task Force also observed the lack of an integrated and coordinated set of missions, citing DOE’s lack of policy guidance and the lack of uniformity among the design laboratories about requirements and regulations for weapons development. For example, the Nuclear Weapons Complex Infrastructure Task Force noted several occasions on which a laboratory justified the building of a new facility based on requirements that it had itself created, in order to appear superior to another laboratory. This resulted in the laboratories “competing for programmatic funds and priorities rather than relying upon their divergent and complementary strengths and thereby operating as a truly interdependent team, with shared success and rewards” (DOE, 2005).

The 2009 Stimson Center’s Task Force Report on Leveraging Science for Security: A Strategy for the Nuclear Weapons Laboratories in the 21st Century echoed the 2005 SEAB Task Force’s concern about the lack of a unified mission. The Stimson Center Task Force found the laboratories’ research areas had expanded to the point that the laboratories appeared “to have evolved from multipurpose to all-purpose,” resulting in a lack of a clearly defined set of missions (Townsend et al., 2009).

The Stimson Center Task Force and the 2009 Congressional Commission on the Strategic Posture of the United States stressed the related issue that work performed by the laboratories needed to support the long-term growth of the science and engineering enterprise underlying the mission. This meant that the laboratories should participate only in those agency partnerships committed to the long-term vitality of the laboratories. Agency partnerships should involve:

Capital investment, annual funding commitments, and participation in the long-term strategic focus of the laboratories. This requires creating a structure for multi-agency decision-making and investment and eliminating “primary” versus “secondary” access to the labs’ capabilities. This “investment” will require commitment and support by the Office of Management and Budget (OMB), the agencies, and the Congress. This multi-agency support should reduce costs for all agency clients, while preserving these national resources and maximizing their service to the nation (Townsend et al., 2009).1

In August 2009, the laboratories’ management and operating (M&O) contractors laid out several recommendations to the Department of Energy at the request of Secretary Chu (DOE National Laboratory Contractors Group, 2009). In particular, the contractors recommended that DOE “focus on mission outcomes, not process.” This recommendation entailed several actions, including:

Assign full responsibility and accountability for both laboratory programmatic accomplishment and operational performance to DOE’s mission organizations, with DOE’s functional organizations providing advice and support to the mission organizations (as opposed to independently exercising authority to impose requirements on the laboratories or oversee laboratory performance).

Focus laboratory performance appraisals on delivery of the mission outcomes specific to each laboratory, as well as stewardship of laboratory assets and achievement of appropriate operational standards, as opposed to process compliance or other “how” measures.

The recommendation to “provide laboratory contractors with increased flexibility in employment practices, partnership formation, technology transfer, and other areas” was also made, which included the following action:

Provide increased flexibility for engaging collaborators and other federal agency and private sector sponsors. Decreased transactional oversight or review and increased flexibility in contract terms in Work for Others and CRADAs will enable the laboratories to better meet DOE mission goals, and to engage with private industry on more commercial time scales and terms.

It is unclear what impact these recommendations have had.

Issue 2: An unstable workforce and the lack of an adequate plan to maintain core competencies.

Several factors have contributed to the unstable workforce experienced by the laboratories over the years, including poor morale resulting from excessive safety and security requirements and from downsizing, changing workforce demographics, and the opportunities available outside of the laboratories. The maintenance of the nuclear weapons “critical skills” and core competencies is also a major concern.

_______________________________

1 Regarding work for others (WFO) and memorandums of understanding (MOUs), the Stimson Center Task Force Report stated that partnerships involving these activities were “too limited and too ad hoc” to aid in the laboratories’ long-term planning of the S&T foundation.


Low morale is one reason cited for high rates of departure at the laboratories. The 1995 Galvin Task Force observed that the excessive number of laboratory audits, and the time and effort scientists spent interacting with auditors rather than conducting research, decreased workforce morale and led to the departure of more employees. The 1999 Chiles Commission and the 2000 Foster Panel both cited poor morale as an impediment to recruiting and retaining highly qualified scientists. The Chiles Commission found that low morale was due to uncertainty and frustration about the strength of the national commitment to stockpile stewardship, as well as insecurity about whether the downsizing that had occurred in the past decade would continue in the future (Chiles et al., 1999). The 2000 Foster Panel cited the highly publicized security breaches and ensuing recriminations as responsible for high departure rates and low job acceptance rates (Foster et al., 2001).

MAINTAINING CORE COMPETENCIES

Many reports, including the SEAB’s 2005 Nuclear Weapons Complex Infrastructure Task Force Report, the Defense Science Board’s 2008 Report on Nuclear Deterrence Skills, the 2009 Stimson Center’s Task Force Report on Leveraging Science for Security: A Strategy for the Nuclear Weapons Laboratories in the 21st Century, and America’s Strategic Posture: The Final Report of the Congressional Commission on the Strategic Posture of the United States, expressed concern that the NNSA lacks an adequate plan for recruiting scientists who possess the core capabilities needed to maintain the nuclear weapons program, and that scientists are not given the opportunity to exercise and strengthen these essential skills, threatening the safety and reliability of the stockpile (Townsend et al., 2009; Defense Science Board, 2008; Perry and Schlesinger, 2009). “Core competencies” are the skills and capabilities needed to support and foster the nuclear weapons program; they are also used to address other areas of national security, including “nonproliferation, threat reduction, and nuclear counterterrorism; including stabilization, assessment of terrorist nuclear devices, and nuclear forensics” (Townsend et al., 2009).

The design and development of nuclear weapons involve incorporating a diverse and unique set of skills from a variety of scientific fields (Perry and Schlesinger, 2009). To maintain the weapons program, an appropriate number of scientists must be employed from each needed field (employing too many would be a waste of money), and each must possess the skill set necessary to fulfill his or her numerous responsibilities. Reports indicated that the NNSA lacks a plan for ensuring that the number of scientists recruited, and the fields they are recruited from, align with what is needed to sustain the nuclear weapons program and maintain its high quality. The Strategic Posture Commission in 2009 noted, for example, that “NNSA expects to reduce the number of laboratory personnel funded by the weapons program by 20-30 percent. It is doing so without any understanding of what types of expertise to seek to retain or reduce.”

Beyond systematically identifying the number and types of experts the laboratories should recruit, reports indicated that scientists are not given the “hands-on experience” in weapons development and design that is necessary for maintaining the nuclear weapons program. Fine-tuning these skills using computer simulations is not adequate (Townsend et al., 2009; Perry and Schlesinger, 2009).

Due to the absence of a systematic plan for the recruitment and training of scientists, the design, development, and testing capabilities of the laboratory workforce are threatened and will continue to be unless further action is taken.

Issue 3: Unclear roles and responsibilities assigned to DOE/NNSA Headquarters and to the offices and programs included within the laboratory governance structure; ill-defined and duplicated lines of authority and oversight.

The role of headquarters should be to provide guidance, policy, and oversight. It should “focus on areas crucial for success of the organization, and should delegate operations and any activities that can be done elsewhere” (Richanbach et al., 1997). Evidence from numerous reports demonstrates this has not been the case. DOE headquarters and NNSA have tended to perform tasks and responsibilities that field and operations offices should be responsible for. The Government-Owned, Contractor-Operated (GOCO) model under which the laboratories are supposed to operate has not been put into practice; the system instead resembles a “Government-Owned, Government-Operated” model (DOE, 1995). The 2009 Commission on the Strategic Posture of the United States stated that the NNSA and DOE failed to distinguish between “what to do (a government function) and how to do it (a contractor responsibility).” There is uncertainty in determining where policy and oversight end and where implementation begins. The lack of defined roles and responsibilities within the management structure of the complex has resulted in multiple layers of oversight and compliance requirements, excessive overhead costs, and productivity losses, all of which divert attention from S&E research.

The 1995 Galvin Task Force observed many instances of DOE playing an inappropriate role in the day-to-day operations and management of the laboratories. The following are just a few of that Task Force’s observations:

•   Department of Energy orders to the laboratories range from a few to a few hundred pages in length and are prescriptive to detail processes; there are some 30 thousand individual requirements embodied in these orders to certain major laboratories. …

•   DOE Headquarters has insisted that copies of DOE terms and conditions be attached to all file copies of literally thousands of small purchase orders in order to document that these terms and conditions had been transmitted to vendors. …

•   Each laboratory acknowledges that it has more people than it needs because of the Federal prescriptions and the inability to add the flexibility of assigning people in the manner that would be most productive. …

•   There are at least 12 principal layers of management between the assistant secretary for defense programs down through the layers of DOE and the laboratory program management to the bench scientist working on a project financed through defense programs. There are additional oversight and administrative chains of command through the field offices which probably add two or three more layers (DOE, 1995).

The Galvin Task Force stressed the need to “de-federalize” the labs. Earlier groups had made similar findings and recommendations, but the Department had done little to make improvements. The Task Force noted that although excerpts from DOE’s Strategic Plan at the time stated that “communications, trust, and human resources” were vital for success, the Department’s tendency to over-regulate was detrimental to cultivating these factors: “The activities that it is obliged to direct and order are a countervention of the value of trust” (DOE, 1995).

A 1997 IDA study was commissioned to examine the management processes and structures of DOE’s Defense Programs (DP), which is responsible for ensuring the safety, security, and reliability of the nation’s nuclear weapons stockpile (Richanbach et al., 1997). The DP workforce oversees the contractors who manage the weapons complex (which includes the laboratories). The role of the field operations offices, area offices, and site offices is to implement the guidance provided by headquarters and to oversee the work carried out by the management and operating (M&O) contractors (Richanbach et al., 1997). Operations office managers are the formal contracting officers responsible for administering the M&O contracts. Site, or area, offices provide day-to-day interaction with the contractor and maintain awareness of operations and issues within the government’s facilities (Richanbach et al., 1997).

The IDA study identified areas where potential overlap exists in the roles played by headquarters, operations offices, and site/area offices. Examples of potential for overlap in responsibilities and corresponding duties are listed in Table C.1. (The asterisk indicates where potential overlap occurs).


TABLE C.1 Examples of Potential for Overlap in Responsibilities and Corresponding Duties Between Headquarters, Operations Offices, and Site/Area Offices

Headquarters Defense Programs

•   Major responsibility: Help formulate and apply corporate policy for support functions
    Selected duties: *Interpret ES&H policies and ensure programs apply

Operations Office

•   Major responsibility: Serve as contracting officer for M&O contract
    Selected duties: *Integrate and coordinate funding, program direction, functional policy direction, and guidance from multiple DOE offices and non-DOE customers; *Review and approve facility safety framework; *Consider site-wide institutional issues, health of contractual relationship

•   Major responsibility: Execute programs on behalf of DOE program offices
    Selected duties: *Develop performance measures and performance expectations for determining; *Coordinate and approve HQ’s work authorization; Provide planning input and support budget formulation and execution; *Provide matrix technical support to programs (and area offices), including ES&H and business operations

Area Office

•   Major responsibility: Ensure compliance with ES&H orders
    Selected duties: *Provide program direction and oversight for nuclear facility safety; *Maintain operational oversight awareness and perform independent management oversight of DOE facilities through Facility Representative program; *Conduct performance-based assessments of ES&H, safeguards and security

SOURCE: Adapted from Richanbach et al. (1997), Table I-2.

The IDA study concluded that although there was agreement that providing oversight and guidance is headquarters’ responsibility and that program execution should be done by the field, neither the boundary between these two major responsibilities nor the specific tasks that should be assigned to one and not the other was clearly articulated.

The chains of command in the laboratory management structure are also ill-defined. The IDA study found that the reporting chain of command runs in parallel with the chains of command for programmatic requirements; environmental, safety, and health activities; and administrative practices. Each of these management processes has its own formal as well as informal chains of command (where an office receives direction from another office outside its formal chain). These chains of command are ill-defined, creating confusing lines of authority and accountability within the management structure and fostering an environment where poorly established boundaries and redundant regulations are the norm.

The 1999 Chiles Commission and the 2000 Foster Panel observed similar confusing chains of command, emphasizing that parallel chains created “day-to-day frustration among those in the field performing hands-on stewardship tasks” and “inefficiency due to diffusion of authority and conflicting objectives. Unfunded mandates to meet functional requirements undermine program budget, plans, and milestones” (Foster et al., 2001).

In 2000, several security breaches led Congress to establish the NNSA. Congress cited “poor organization and failure of accountability” as causes of these security incidents (National Defense Authorization Act, 2000). The NNSA Act lays out the agency’s mission and organization.2 The NNSA took on several challenges that had yet to be resolved in the complex, including the need to define the roles and responsibilities of the laboratories, NNSA headquarters, and field organization units (Foster et al., 2001). The 2000 Foster Panel report emphasized that in order to overcome the challenges faced by NNSA, headquarters must:

Provide leadership and perform top management tasks, including: setting objectives; developing strategies, programs, priorities and budgets; providing guidance concerning milestones and objectives; setting measurable goals and appraising performance against these goals; and adjudicating differences among operating entities. Except for selected programs managed from headquarters, NNSA should not focus on the details of task execution. Achieving this goal will require simplifying, clarifying, and disciplining lines of command, communication, and authority within NNSA. Duplication of responsibilities should be eliminated and layers of headquarters and field management or oversight should be consolidated (Foster et al., 2001).

The 2001 Foster Panel report reiterated the points made in its previous report, emphasizing that the Secretary of Energy must remove the unnecessary duplication of staff in such areas as security, environmental oversight, safety, and resource management. It also stated that NNSA had done little to resolve the management issues existing within the complex, creating even more bureaucratic problems (Foster et al., 2002).

To help align responsibility and management, the 2005 SEAB’s Task Force on Nuclear Weapons recommended that Site Office Managers report to the Deputy Administrator for Defense Programs (NA-10) rather than the Administrator in order to “redirect the contractors’ focus on the Complex.”

An issue stemming from the ill-defined roles and responsibilities of DOE and NNSA is that NNSA failed to gain the level of authority and flexibility that its creators intended it to have. Although the Agency has authority over a range of operations, putting this authority into practice has been difficult.

The SEAB’s 2005 Nuclear Weapons Complex Infrastructure Task Force discussed in its report that because NNSA’s mission is vastly different from the rest of DOE’s, its management system must be tailored to its priorities. However, the Task Force found this was not the case, citing that “the DOE has burdened the Complex with rules and regulations that focus on process rather than mission safety. Cost/benefit analysis and risk informed decisions are absent, resulting in a risk-averse posture at all management levels.” The Task Force specifically noted:

Many administrative orders and procedures designed for the DOE civilian research and science laboratories are not well suited to the product-oriented Complex. The NNSA mission requires clear deliverables and requirements for the nuclear weapons life cycle, achieved by design, testing, manufacturing, and production with materials that by their very nature embody risk. The current DOE-NNSA structure should permit NNSA to apply appropriate rules and regulations to the NNSA Complex in a graded fashion (DOE, 2005).

_______________________________

2 The National Nuclear Security Administration Act was created as a provision under the National Defense Authorization Act for Fiscal Year 2000. For additional information about the NNSA Act, see http://www.gpo.gov/fdsys/pkg/BILLS-106s1059enr/pdf/BILLS-106s1059enr.pdf.

The 2009 Strategic Posture Commission and the 2009 Stimson Center Task Force both supported the premise that NNSA has failed to achieve its intended autonomy. The Stimson Task Force noted that because NNSA did not achieve the independence it was meant to have, “the laboratories now function under a complicated set of DOE and NNSA regulations, guidelines, and oversight.” The laboratories need better strategic direction from NNSA, without the risk of losing their flexibility and authority. The excessive oversight does not allow laboratory leadership to manage the labs sufficiently, hampering NNSA’s ability to perform national security missions (Townsend et al., 2009). The 2009 Strategic Posture Commission gave notable examples in its report:

During the first term of the Bush Administration, the DOE General Counsel effectively prevented any NNSA actions exempting the NNSA from any DOE regulations, arguing any such action required DOE staff concurrence.

In 2005, a Defense Science Board Task Force examined production at the Pantex plant and concluded that excessive regulation originating outside the NNSA in a risk-averse DOE was raising costs and hampering production. Although the Task Force specifically attributed the problem to non-NNSA DOE staff, the department limited its response to an intensive review of NNSA procedures (Perry and Schlesinger, 2009).

In August 2009, the laboratories’ M&O contractors laid out several recommendations to the Department of Energy at the request of Secretary Chu. Recommendations and associated actions issued by the M&O contractors are listed below.

Recommendation 2: “Restore the principles of the GOCO model to the DOE national laboratories.” This entails the following actions:

Reestablish the principle that DOE’s role is to set and assign program objectives and roles and to establish performance goals and that it is the contractor’s role to determine the most effective means for their accomplishment.

Implement a competition policy that is conducive to long-term partnership between DOE and its M&O contractors. In particular we recommend that the Department compete laboratory contracts when, in its judgment, it is in the national interest to do so, but not on the basis of arbitrary time limits.

Eliminate orders and contract requirements that instruct the contractors on “how” work is to be conducted to the maximum extent practical. As noted above, the past few years have seen a steady proliferation of DOE orders, other requirements, and “guidance” documents directing contractors in great detail how to perform work at the laboratories.

Recommendation 3: “accept performance and operational risk,” including the following actions:

Establish a culture that balances risk avoidance with mission accomplishment, accepting and managing appropriate risk.

Respond to unfavorable events by holding contractors accountable for performance, rather than by issuing new requirements.

It is unclear what impact these recommendations have had.

Issue 4: An excessive number of reviews and excessive oversight by external organizations (particularly by the Defense Nuclear Facilities Safety Board).

It has been evident since the mid-1990s that numerous DOE and external organizations have influenced, through oversight reviews, the environmental, safety, and security practices of the weapons complex. These organizations lack both an agreed-upon definition of safety and a formal mechanism for coordinating and evaluating their reviews (Richanbach et al., 1997). Each organization reviews a program believing that its view of how the laboratories should be regulated ought to become the standard. This has resulted in an excessive number of uncoordinated, often conflicting reviews. The 1997 IDA study stated: “At any time during what could be a multi-year process, the area office or contractor might, for example, receive a hundred pages of comments from just about anyone that must then be addressed. When conflicts arise between two or more reviewers, there is no formal method for resolving them” (Richanbach et al., 1997). The recommendations formulated by these organizations are developed without cost/benefit analysis and have resulted in extreme losses of productivity and unnecessary spending (DOE, 1995). The 1995 Galvin Task Force described the effect of the excessive number of audits on the laboratories:

Everyone wants in on the act—headquarters, the DOE area office, the DOE field office, program offices of the DOE, the Defense Nuclear Facilities Safety Board (DNFSB), the Department of Labor’s office of Federal Contract Compliance, the EPA, the General Accounting Office (GAO) and the state where the laboratory is located. Each has oversight entities and each thinks their audit is the most important. There are also increased costs and productivity loss of those individuals, who are mostly scientists, interacting with the auditors (DOE, 1995).

The influence that non-regulatory agencies (particularly the Defense Nuclear Facilities Safety Board) have had on the laboratories is excessive. Although the Board lacks independent regulatory enforcement authority, it has issued more than 30 formal recommendations to the Secretary of Energy since 1990 (DOE, 1995). Its mission was to move the DOE from its conventional “expert-based safety system” to a “standards-based system,” and disagreement ensued over how standardized and rigorous these standards should be. In the past, the Board was “too inflexibly committed to ES&H approaches,” adopting approaches that were both disproportionate and insufficient to address all safety requirements (Richanbach et al., 1997). The standards-based system resulted in increased formality and regulation in the procedures for evaluating hazards.

The 2003 SEAB Blue Ribbon Commission Report on Competing the Management and Operations Contracts for the Department of Energy Labs also observed an excessive number of external reviews of laboratory program and safety performance. It noted that the laboratories spend a great deal of time and overhead trying to fulfill a multitude of requirements in preparation for reviews. The table below, excerpted from the 2003 Blue Ribbon Commission Report, summarizes the number of peer reviews of LLNL’s Defense and Nuclear Technologies (DNT) Directorate conducted by various organizations:

TABLE C.2 Number of Peer Reviews of the LLNL Defense & Nuclear Technologies Directorate

Review Type | Number | Number Requiring Reports
External Program Peer Review | 17 | 14
University of California (UC) Peer Review of S&T Supporting DNT Program | 5 | Not indicated
UC-Based Review Panels and Councils | 17 | 17
Joint Lab, UC, NNSA Reviews of Contract Performance | 4 | Reports and briefings
NNSA Headquarters-Based Program Reviews | 38 | Not indicated
DNT External Safety Inspections, Assignments and Reviews | 35 | Number of reports not indicated; included 11 audits, 6 assessments, 3 analyses of fire hazards, 5 inspections, 6 reviews, 1 survey, and 3 miscellaneous


The Blue Ribbon Commission stated that DOE attempted to change its reviewing procedures to fix the problem of excessive reviews, but the Commission was under the impression that these revisions did not result in enough change (DOE, 2003).

The SEAB’s 2005 Task Force on Nuclear Weapons observed that the invasive role played by the Defense Nuclear Facilities Safety Board (DNFSB) and the DOE Office of Independent Oversight and Performance Assurance in security matters has contributed to the “multiple layers of oversight and responsibilities for compliance within the NNSA and in the parent DOE structure” (DOE, 2005). Although the DNFSB issues only recommendations, not requirements, “their recommendations have the implicit status of requirements because of the current lack of a specific mechanism for implementation assessment.” The SEAB Task Force strongly emphasized that an analysis of the costs of implementation, safety benefits, and risks of an idea should drive every decision and recommendation made to and within the Complex, and suggested the DNFSB use this mechanism every time it makes recommendations to the laboratories.

In 2009, the Strategic Posture Commission continued to highlight the excessive oversight of external agencies on the Complex, stating that “the regulatory burden on NNSA facilities is increased significantly by the on-going audits and reviews by the DOE Inspector General, the Defense Nuclear Facilities Safety Board, and the Government Accountability Office. These burdens are not under the control of either the Secretary of Energy or the NNSA administrator” (Perry and Schlesinger, 2009).

In August 2009, the laboratories’ M&O contractors laid out several recommendations to the Department of Energy at the request of Secretary Chu. One recommendation was to “accept appropriate performance and operational risk.” The actions set forth by the contractors pertaining to the excessive amount of external oversight of the laboratories are listed below:

DOE oversight of functions that are already regulated by other entities (such as OSHA, the NRC, or state environmental regulators) should be replaced with oversight provided by those entities.

Consider consolidating DOE oversight and audit activities.

To date, it is unclear what impact these recommendations have had on laboratory management.

REFERENCES

Chiles, H.G., R.B. Barker, C.B. Curtis, S.D. Drell, R.F. Herbst, R.A. Hoover, H.W. Kendall, and L.D. Welch. 1999. Report of the Commission on Maintaining United States Nuclear Weapons Expertise. Available at http://www.doeal.gov/llnlcompetition/reportsandcomments/chilesrpt.pdf.

Defense Science Board. 2008. Report of the Defense Science Board Task Force on Nuclear Deterrence Skills. Prepared by the Office of the Under Secretary of Defense. Available at http://www.acq.osd.mil/dsb/reports/ADA487983.pdf.

Foster, J. S., H.M. Agnew, S.P. Gold, S.J. Guidice, and J.R. Schlesinger. 2001. FY 2000 Report to Congress of the Panel to Assess the Reliability, Safety, and Security of the United States Nuclear Stockpile. Available at http://www.fas.org/nuke/control/ctbt/text/foster00.pdf.

Foster, J.S., H.M. Agnew, S.P. Gold, S.J. Guidice, and J.R. Schlesinger. 2002. FY 2001 Report to Congress of the Panel to Assess the Reliability, Safety, and Security of the United States Nuclear Stockpile. Available at http://www.fas.org/programs/ssp/nukes/testing/fosterpnlrpt01.pdf.

Hammel, E. 1997. Los Alamos Scientific Laboratory Energy-Related History, Research, Managerial Reorganization Proposals, Actions Taken, and Results. Available at http://www.fas.org/sgp/othergov/doe/lanl/osti/M97005280a.pdf.

Perry, W.J., and J.R. Schlesinger. 2009. America’s Strategic Posture: The Final Report of the Congressional Commission on the Strategic Posture of the United States. United States Institute of Peace, Washington, D.C. Available at http://media.usip.org/reports/strat_posture_report.pdf.


Richanbach, P.H., D.R. Graham, J.P. Bell, and J.D. Silk. 1997. The Organization and Management of the Nuclear Weapons Complex. Institute for Defense Analyses, Alexandria, Va. Available at http://www.dtic.mil/cgi-bin/GetTRDoc?Location=U2&doc=GetTRDoc.pdf&AD=ADA323402.

Townsend, F.F., D. Kerrick, and E. Turpen. 2009. Leveraging Science for Security: A Strategy for the Nuclear Weapons Laboratories in the 21st Century. Henry Stimson Center, Washington, D.C. Available at http://www.stimson.org/images/uploads/research-pdfs/Leveraging_Science_for_Security_FINAL.pdf.

DOE (U.S. Department of Energy). 1995. Report of the Task Force on Alternative Futures for the Department of Energy National Laboratories. Prepared by the Secretary of Energy Advisory Board. Available at http://www.lbl.gov/LBL-PID/Galvin-Report/Galvin-Report.html.

DOE. 2000. Review of the Department of Energy’s Laboratory Directed Research and Development Program. Prepared by the Laboratory Operations Board.

DOE. 2003. Competing the Management and Operations Contracts for the Department of Energy Labs: Report of the Blue Ribbon Commission on the Use of Competitive Procedures. Prepared by the Secretary of Energy Advisory Board. Available at http://www.doeal.gov/llnlcompetition/ReportsAndComments/BlueRibbonReport.pdf.

DOE. 2005. Report of the Nuclear Weapons Complex Infrastructure Task Force. Prepared by the Secretary of Energy Advisory Board. Available at http://www.globalsecurity.org/wmd/library/report/2005/nwcitf-rept_13jul2005.pdf.

DOE National Laboratory Contractors Group. 2009. Recommendations to the Department of Energy from the National Laboratory Management and Operations Contractors. Washington, D.C.

GAO (U.S. General Accounting Office). 1995. Department of Energy: National Laboratories Need Clearer Missions and Better Management. Washington, D.C. Available at http://www.gao.gov/archive/1995/rc95010.pdf.
