The committee’s second task was to describe the analytical procedures and data management practices in laboratories at the U.S. Geological Survey (USGS), other federal agencies, and geological surveys in other countries. Because the USGS is implementing a quality management system (QMS), it asked the committee to focus on laboratory QMS procedures. The data quality assurance practices of USGS laboratories are summarized in Chapter 3. This chapter focuses on the experiences of selected organizations in implementing and maintaining a QMS.
The committee invited eight organizations to share their approach and experiences with a QMS. The committee heard from four U.S. federal agencies with a substantial number of analytical laboratories using a QMS: the USGS, the Navy’s Environmental Laboratory Accreditation Program (ELAP) and Naval Sea Systems Command (NAVSEA) Shipyard Laboratory Program, the Centers for Disease Control and Prevention (CDC), and the Environmental Protection Agency (EPA). A QMS is less common in geological surveys and is rare in academic institutions. Two of five geological surveys from other nations that the committee consulted (France and Norway) use a QMS. Finally, the committee heard from the Geochemical and Environmental Research Group (GERG) Laboratory at Texas A&M University and a group of laboratories under quality assurance oversight of Duke University Medical Center that use a QMS. This group includes laboratories at Duke and other academic institutions. For simplicity, they are referred to in this chapter as the Duke University Medical Center laboratories.
This sample of organizations is small and cannot be considered representative of all organizations. However, the committee sought diversity in terms of scientific focus, number of laboratories under the QMS umbrella (see Table 4.1), and position of the presenters. Presentations by federal agencies were given by managers responsible for the QMS, and presentations by geological surveys and academic institutions were given by scientists or laboratory managers. The diverse perspectives and experiences of these organizations are highly relevant to the USGS, which is incorporating lessons learned into its own QMS.
TABLE 4.1 Scientific Focus and Number of Laboratories in the Presenting Organizations
|Organization|Number of Laboratories with QMS|Number of Staff|Scientific Focus|
|---|---|---|---|
|USGS|~250 planned|~1,300|Geology, hydrology, and ecosystems|
|CDC|~200 proposed|~1,700|Infectious disease|
|Navy|85 ELAP|NA|Environmental restoration and ship materials|
|EPA|NA (100 percent)|NA|Public health and the environment|
|Duke University Medical Center|16|NA|Human clinical trials|
|Texas A&M GERG Laboratory|1|15|Contaminants, petroleum markers, and metals|
|French Geological Survey|1|77|Geophysics and the environment|
|Norwegian Geological Survey|1|18|Geology|
NOTE: ELAP = Environmental Laboratory Accreditation Program; NA = not available; NAVSEA = Naval Sea Systems Command.
Each organization was asked to address the following points in their presentation:
- Why did you establish a QMS?
- Describe your current QMS.
- Is a QMS applied in all of your internal laboratories?
- How is the QMS resourced, managed, and assessed?
- What challenges did you face in initially integrating the QMS, and what did you learn?
- What challenges are you facing in maintaining the QMS, and what are you learning?
- Has the QMS evolved significantly and why?
- Has the QMS added value (e.g., examples where it either thwarted significant problems, saved money, created efficiencies, or produced defensible results that held up to scrutiny)?
- What opportunities or needs do you see for interagency collaboration on developing consensus-based approaches for QMS in nonregulated laboratories?
The diverse perspectives of these organizations and common themes that emerged from the presentations are summarized below.
Organizations have different motivations for having a QMS, and they find different benefits in it.
All four presenting federal agencies had developed a QMS to improve the quality, reliability, and reproducibility of laboratory data. In addition, the USGS, EPA, and CDC sought to preserve or enhance their reputation for providing high-quality data. USGS data support policy making and CDC data support public health, so high-quality data are needed to avoid unaccounted-for risk and error and to maintain public trust. EPA data support regulation, so the data must be scientifically and legally defensible. The French and Norwegian geological surveys also cited enhancing customer confidence in laboratory results as a key motivation for implementing a QMS. The Navy mentioned the benefit of a QMS for continually improving processes.
Some laboratories adopted a QMS to pursue contract work and research projects in fields that require compliance with specific data standards. The Texas A&M GERG laboratory complies with EPA standards for work on contaminants (e.g., pharmaceuticals, pesticides, and dioxin), petroleum biomarkers, and metals. The Duke University Medical Center laboratories comply with National Institutes of Health requirements to support clinical trials.
A key QMS benefit identified by many of the presenters was the value of good documentation. A documented, step-by-step quality assurance process allows consistent performance, data verification, and proof that accepted protocols were followed. The need for consistent performance was emphasized by the Norwegian Geological Survey, where multiple team members carry out analyses and need uniform standards for quality assurance and quality control. The value of documentation for retaining institutional memory through staff turnover was noted by staff in several laboratories visited by the committee (see Chapter 3).
Several presenters noted that a QMS does not prevent all data quality problems, but it can leave clues that help laboratory staff identify and respond to problems relatively quickly. Some of the problems mentioned concerned sample or instrument contamination, and the loss of original data through machine overwriting or human editing. The built-in external audit feature of a QMS is intended to catch problems. For example, an external audit of USGS Energy Mission Area laboratories identified a data quality problem related to anion concentrations measured using ion chromatography (Meeks et al., 2018).
Some organizations are not implementing QMS for all their laboratories.
Four of the presenting organizations have implemented or plan to implement QMS for all of their laboratories. EPA has one QMS that covers EPA organizations as well as other organizations that receive EPA funds, but allows flexibility in how these external organizations implement the system. The CDC is slowly implementing QMS through pilot programs that recruit successively more laboratories. Each CDC laboratory in the pilot program chooses from three QMS options: standard, customized, or both. The USGS has begun implementing a single QMS for all of its laboratories. The Norwegian Geological Survey has a QMS for its one laboratory.
The other organizations are implementing a QMS for only part of their enterprise. The Navy laboratories that operate under the Environmental Laboratory Accreditation Program and NAVSEA Shipyard Laboratory program require a QMS, but other Navy laboratory programs do not. The French Geological Survey laboratories are accredited for some environmental chemistry and mineral characterization activities, but not for isotope geochemistry or multiscale experimentation activities. The Texas A&M GERG laboratory embraced a flexible approach to support its diversity of programs. It maintains a QMS for most contract work, but not for research and development projects. Half of the 32 laboratories under the quality assurance oversight of Duke University Medical Center use the QMS.
A QMS requires substantial resources. The cost of implementation and maintenance may be covered by the organization or by its laboratories.
The USGS has found that its laboratory staff spend about 20 percent of their time implementing the QMS. This figure is in line with the experiences of other presenters—who estimated that their laboratories spent 10 to 20 percent of their time implementing a QMS and 5 to 10 percent thereafter maintaining it—and with published time estimates (e.g., Vermaercke, 2000). A substantial part of the investment involves record-keeping requirements. Laboratory information management systems or electronic QMS provide a higher level of traceability than traditional spreadsheets for data management, but they may require procurement and maintenance funding or dedicated staff. External audits required under some QMSs can also be costly.
The organizations have different approaches for paying the costs of the QMS. The French and Norwegian geological surveys fund their QMS centrally. The USGS and the CDC pay for training and support all quality assurance staff. No additional funding is provided to their laboratories for QMS implementation and maintenance. The laboratories under QMS programs at the Navy, EPA, Texas A&M GERG, and Duke University Medical Center pay all QMS costs, including training and quality assurance specialists. The Duke University Medical Center laboratories are also covering information technology support for their electronic QMS.
Implementation and maintenance of an effective QMS requires institutional commitment, leadership, and buy-in from the staff.
A point made by several presenters was the role of leadership and culture in implementing a QMS. Initial resistance from laboratory staff was not unusual, and a substantial amount of institutional change was often required. The Texas A&M GERG Laboratory presenter noted that implementing a QMS was a huge undertaking, and getting staff to buy into the QMS was a major challenge. The Navy encountered resistance from laboratory staff, particularly those in smaller laboratories, when a QMS was introduced. The Norwegian Geological Survey stressed the importance of ongoing leadership support and laboratory staff awareness for maintaining their QMS.
EPA leadership set out to persuade scientists that a QMS supports science, rather than interferes with it. Some of the desired cultural shift toward a QMS occurred as management and scientists made decisions together. At the Duke University Medical Center, the participating laboratories were involved from the beginning to change attitudes and create buy-in. The CDC also involved agency managers and scientists in developing laboratory quality management policies. The well-respected principal investigators in the pilot laboratories served as “QMS champions,” who were able to demonstrate the feasibility and benefits of a QMS and thus help facilitate the cultural shift. The Navy accelerated culture change by first showing its laboratories a functioning QMS that yielded benefits, then requiring the laboratories to adopt it.
Effective two-way communication between laboratories and management is important for developing and evolving a QMS.
Presenters pointed out that communication between laboratories and management is important for explaining why a QMS is needed, creating buy-in, developing a QMS that meets the needs of the laboratories, and evolving the QMS to accommodate new equipment or meet new user needs. The USGS found communication a challenge, given the complexity of the organization and the rapid development and implementation of its QMS. USGS leadership is working to provide training and communication plans that focus on why a QMS is necessary. The CDC noted the importance of consistent messaging about the QMS from both leadership and QMS staff. EPA found it helpful to keep quality assurance jargon to a minimum. The French Geological Survey noted the importance of reminders of why the QMS was put in place, especially in light of staff turnover.
Starting small and taking advantage of lessons learned helps the system evolve over time.
The Navy’s NAVSEA Shipyard Laboratory Program experienced significant growing pains, including a struggle to develop documents and maintain records, as it implemented a top-down QMS for its laboratories. The USGS faced a steep learning curve by developing and implementing a QMS on an accelerated schedule. In contrast, the CDC found that implementing a QMS slowly and in stages increased the likelihood of success because it allowed the agency to test and modify its approach, measure the necessary level of effort and investment, and demonstrate the feasibility and benefits of the QMS. It also provided an opportunity to develop guidance documents, templates, and expertise to support implementation across the agency. The number of laboratories under the Duke University Medical Center QMS expanded from 3 to 16 as researchers determined that their work would benefit from standard operating procedures and a QMS.
Task 2 was to summarize data management practices, specifically QMS, for laboratories at the USGS, other federal agencies, and geological surveys in other countries. The committee chose eight organizations that had different motivations for developing a QMS for some or all of their laboratories, different QMS challenges (e.g., number and diversity of laboratories and laboratory activities), and different QMS implementation strategies (e.g., top down or bottom up, and fast or slow). Despite these differences, some common themes emerged. First, implementing a QMS provides benefits, such as improving the documentation, reliability, and reproducibility of laboratory data; finding and correcting data quality problems; and enhancing the organization’s reputation for quality data. However, these benefits come with substantial monetary and personnel costs. The high costs and paperwork burden associated with implementing a QMS, as well as the need to learn a new way of doing things, can create resistance among laboratory staff. Institutional commitment and strong leadership are required to gain staff buy-in and to change the organization’s culture. Consistent messaging is important in explaining why a QMS is needed, and good two-way communication between managers, quality assurance staff, and laboratory staff is essential for developing a QMS that meets the needs of the laboratories. Finally, implementing the QMS slowly allows the system to evolve in response to lessons learned and thus to fulfill its intended purpose.