Suggested Citation:"Part 2 - Guidelines." National Academies of Sciences, Engineering, and Medicine. 2017. Guidelines for Optimizing the Risk and Cost of Materials QA Programs. Washington, DC: The National Academies Press. doi: 10.17226/23691.


Part 2: Guidelines

Chapter 1: Introduction

How much time and money should we devote to quality assurance? Will the results be worth the resources invested? In response to shrinking budgets and dramatic reductions in both the numbers and experience levels of inspectors and engineers, several state departments of transportation (DOTs) are seeking ways to achieve greater efficiencies in quality management, often targeting current practices that may be either disproportionate to what is needed to ensure a quality product (e.g., does a concrete sidewalk warrant the same level of QA as a bridge deck?) or outdated given advances in testing technology (i.e., is there a more efficient or effective way to accept this material?). The purpose of such inquiry is not to downplay the importance of QA but to recognize that it is an inherently scalable activity driven by, among other considerations, an organization's tolerance for risk, material/product variability, and cost.

As illustrated in Figure 1.1, a well-designed QA program can provide confidence that the materials and workmanship incorporated into a project will be in reasonably close conformity to the approved plans and specifications. Conversely, an inadequate QA plan can increase the risk of short- and long-term failures, possibly leading to reduced design life, increased maintenance costs, service interruptions, and/or safety hazards. Logically, the more comprehensive and robust the QA strategy, the lower the risk of material failure or non-conformance; however, an overly rigorous QA plan can result in unnecessary project costs, an outcome that most agencies cannot afford in this time of flat or declining resources. This guidance document has therefore been prepared under NCHRP Project 10-92 to help DOTs identify and evaluate opportunities to optimize or enhance their materials QA practices to achieve a better balance between efficiency and risk reduction.

For example, modifications or enhancements to an existing QA strategy might entail adopting a less rigorous plan (e.g., fewer tests, use of verified contractor test data for acceptance purposes, and/or greater reliance on certification or inspection) to achieve the same result, or incorporating more advanced, performance-oriented acceptance tests, even at a higher cost, if such practices improve durability, reduce the risk of failure, or enhance performance.

1.1 Framework Overview

1.1.1 The Economics of Quality Model

Striking the optimal balance between the cost of quality and the value of quality has long been an aspirational goal of managers and engineers, particularly in the manufacturing and production industries, where an effective mix of quality improvement activities can play a vital role in helping firms achieve and maintain both customer satisfaction and long-term profitability.

The conceptual foundation for analyzing the economics of quality is therefore well established in the literature. Over the years, researchers have proposed and refined a theoretical model for optimizing QA based on the principle of diminishing marginal returns (e.g., Juran, 1951; Kirkpatrick, 1970; Plunkett and Dale, 1988; Hylton Meier, 1991; Morse, 1993; Schiffauerova and Thomson, 2006). As depicted in Figure 1.2, the total cost of quality (i.e., total cost) can be represented as the sum of QA assessment/prevention costs (i.e., QA cost) and the cost of defective or non-conforming materials (i.e., conformance and correction cost). Theoretically, the cost of improving QA continues to rise while the cost of failure continues to fall, suggesting the existence of an optimum QA investment point at which the sum of the cost of materials QA and the cost of failure is at a minimum. This point yields the lowest total cost to achieve the desired level of quality.

1.1.2 Practical Implementation of the Quality Model

Although it may be conceptually elegant to frame the optimization problem as shown in Figure 1.2, practical implementation of such a model may prove difficult absent supporting cost data. For example, fully developing the model as shown requires the following key inputs:

• Cost to implement different levels of QA (testing, inspection, certification);
• The probability of a non-conforming material for each level of QA; and
• The cost of repairing or replacing non-conforming materials should a defect occur.

For organizations that actively track this information (e.g., through a materials management and/or asset management system), adopting such an optimization model would allow for a more explicit consideration of costs, thereby reducing the need for (or perception of) subjective decision-making.
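The total-cost relationship described above can be sketched numerically. The cost curves below are hypothetical placeholders, not data from the report: QA cost is assumed to rise linearly with QA effort, and expected failure cost to fall as effort grows.

```python
# Illustrative sketch of the economics-of-quality model: find the QA
# effort level that minimizes total cost = QA cost + expected failure cost.
# Both cost functions are invented for illustration only.

def qa_cost(effort):
    """Cost of testing/inspection/certification; assumed to rise with effort."""
    return 2_000.0 * effort

def failure_cost(effort):
    """Expected cost of non-conformance; assumed to fall as QA effort grows."""
    return 50_000.0 / (1.0 + effort)

def total_cost(effort):
    return qa_cost(effort) + failure_cost(effort)

# Evaluate candidate QA effort levels and keep the one with minimum total cost.
efforts = [e / 10 for e in range(1, 101)]  # effort levels 0.1 .. 10.0
optimum = min(efforts, key=total_cost)

print(f"Optimum QA effort: {optimum:.1f}")          # -> 4.0 for these curves
print(f"Total cost at optimum: {total_cost(optimum):,.0f}")  # -> 18,000
```

With these assumed curves, spending either much less or much more than the optimum raises total cost, which is exactly the U-shaped behavior Figure 1.2 depicts.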
For those that may find the required quality-related costs too difficult to capture and track, it may still be possible to develop reasonable approximations of the required inputs through expert judgment or other experimental data collection techniques (e.g., a Delphi survey). Alternatively, if reliable cost information is too elusive to collect, taking a more qualitative approach to optimization can also provide actionable information. Engineering judgment combined with classic risk management principles can be used to prioritize materials and QA

Figure 1.1. Optimized QA.
Figure 1.2. Economics of quality of conformance (adapted from Kirkpatrick, 1970).

activities on the basis of the probability and consequence of material failure. What such a qualitative process may lack in academic rigor, it can make up for in accessibility and ease of use.

1.1.3 Optimization Framework

To ensure broad applicability, the framework presented in this guidebook combines the explicit cost-based optimization approach with more qualitative processes for risk ranking materials and material properties. The resulting three-level analytical framework, summarized below and described in greater detail in subsequent chapters, is sufficiently flexible and robust to accommodate the range of materials and acceptance practices in use today as well as any new products that may emerge in the future. The three levels of the framework can be described as follows:

• Level 1 entails conducting a qualitative risk rating of materials and then aligning these ratings with QA methods that can provide reasonable assurance of acceptable quality. (See Chapter 3 for more details.)
• Level 2 has a narrower focus, attempting to optimize acceptance testing by emphasizing properties and test methods that are more direct indicators of performance. (See Chapter 4 for details.)
• Level 3 adds a financial dimension to the assessment by comparing the cost of different QA protocols to the cost of potential defects to arrive at the optimum QA investment point. In contrast to the other levels, this method allows for a direct approximation of costs. (See Chapter 5 for details.)

As illustrated in Figure 1.3, progressing through the levels requires an increasing degree of objective information and analysis. Users may choose to proceed through each level, or to stop after an answer of sufficient specificity is found or when the quality of the input data does not justify the additional analytical effort associated with the subsequent steps.

Figure 1.3. Optimization model.
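As a toy illustration of the Level 1 qualitative rating just described, a probability-times-consequence score can be used to rank materials. The material names and 1-to-5 scores below are invented for illustration and are not taken from the report.

```python
# Hypothetical Level 1 risk rating: risk = probability of non-conformance
# (1-5) multiplied by consequence of failure (1-5). All entries are
# illustrative placeholders, not agency data.

materials = {
    # material: (probability 1-5, consequence 1-5)
    "bridge deck concrete": (3, 5),
    "hot-mix asphalt":      (4, 4),
    "sidewalk concrete":    (3, 2),
    "pavement markings":    (2, 1),
}

def risk_score(probability, consequence):
    return probability * consequence

# Rank materials from highest to lowest risk to prioritize QA effort.
ranked = sorted(materials.items(),
                key=lambda item: risk_score(*item[1]),
                reverse=True)

for name, (p, c) in ranked:
    print(f"{name}: risk = {risk_score(p, c)}")
```

Higher-scoring materials would be candidates for full sampling and testing, while low-scoring ones might be accepted on certification or inspection alone.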

Table 1.1. Comparison of optimization levels.

Level 1
• Objective: Adjust QA effort (e.g., greater reliance on certification and inspection) based on a qualitative assessment of the risk of material failure.
• Inputs: Materials of interest; standard acceptance plans; expert judgment regarding material risk.
• Output: Prioritized listing of materials and QA activities.
• Benefit: Ease of use.
• Challenges: Subjectivity of qualitative ratings, which can be difficult to defend.

Level 2
• Objective: Adjust owner QA testing effort based on material property importance and risk.
• Inputs: Standard acceptance plans; primary and secondary properties and test methods; traditional and advanced properties and test methods.
• Output: Optimized acceptance testing emphasizing properties that are more direct indicators of performance.
• Benefit: Enhanced focus on materials properties.
• Challenges: Subjectivity of the relative contribution of properties to material performance.

Level 3
• Objective: Obtain the optimal QA investment point based on an explicit consideration of the cost of QA versus the cost of non-conformance.
• Inputs: Expert judgment regarding material risk; cost to implement different levels of QA (testing, inspection, certification); the probability and cost of potential defects.
• Output: Optimum QA investment point.
• Benefit: Direct approximation of QA and non-conformance costs.
• Challenges: Difficulty of obtaining actual cost data (or reasonable approximations thereof).

The decision of which level to use will depend in part on the application (e.g., programmatic versus project-level assessment), the availability and quality of the input data, and the desired level of analytical rigor. As summarized in Table 1.1, each level has unique objectives, benefits, and challenges, which should be considered when selecting the appropriate approach for a particular program, project, or project element.
1.2 Organization of the Manual

As an outgrowth of the NCHRP Project 10-92 research effort, this guidance document has been prepared to help DOTs identify opportunities to optimize their existing materials acceptance practices. Given the variability in QA practices across the DOTs, the optimization methodology presented here is not intended to replace the steps and procedures that a DOT would normally follow in developing a materials acceptance plan; instead, it provides a supplementary risk-based decision process that can serve to optimize or enhance existing practices.

Understanding current QA practices therefore provides an essential starting point for developing an optimization model that can be adapted to programs and projects of varying types, sizes, and complexity. To this end, Chapter 2 describes the current state of practice among the DOT community for materials QA, placing particular emphasis on existing optimization strategies being used to align acceptance practices with what is needed to ensure a quality product.

Subsequent chapters then present a detailed discussion of the framework methodology itself:

• Chapter 3 describes a "Level 1" materials-based optimization approach, which uses a qualitative risk-based rating process to prioritize materials and QA activities on the basis of the probability and impact of material failure or non-conformance;
• Chapter 4 describes a "Level 2" optimization of material properties (e.g., strength, density) for materials subject to sampling and testing; and
• Chapter 5 describes a "Level 3" cost-based optimization process that balances the cost of different QA protocols against the cost of potential material defects to determine an optimal QA investment point.

Finally, Chapter 6 presents strategies and tools that can facilitate the implementation of the optimization framework.

Chapter 2: Materials QA State of the Practice

To provide context for the optimization processes presented later in this manual, this chapter describes the current state of materials QA in the highway construction industry. A general summary of standard methods used by DOTs to assure materials quality is presented first, followed by a discussion of some existing optimization strategies being used by DOTs to achieve better efficiencies in quality management.

2.1 Background

2.1.1 Categories of Materials

To understand the current state of materials QA, it is important to first recognize that materials used in transportation construction can be broadly assigned to one of three categories based on their source and method of production. As described in AASHTO R 38 and summarized in Table 2.1, these materials categories are as follows: project-produced materials, fabricated structural materials, and standard manufactured materials (AASHTO, 2013).

To a large extent, materials falling within the same category will generally exhibit similar characteristics regarding the level of production control and stability of properties, and will thus likely require a similar level of QA to assure product acceptability. For example, for field-produced materials, a high level of testing and inspection is often required or anticipated to control variability and assure performance.

2.1.2 QA Methods

Despite differences in the specific details according to which DOTs manage the acceptance of construction materials, current programs all generally include some combination of prequalification, sampling and testing, certification, and inspection processes to assure that the materials, products, and workmanship incorporated into a project are in reasonably close conformance to the approved plans and specifications. Descriptions and objectives of several QA practices in standard use today are summarized in Appendix A.

2.2 Examples of Existing Optimization Strategies

Materials QA has historically been a critical though resource-intensive component of project delivery for DOTs. Driven by budget and resource constraints, DOTs have begun to investigate ways to more efficiently balance the risk of poor quality against the cost of materials QA. Greater use of alternative project delivery methods that shift more responsibility to industry

for managing quality, increased understanding of materials behavior, and innovations in nondestructive testing and related technologies are providing additional motivation for DOTs to revisit their existing QA practices. Examples of strategies DOTs are implementing to optimize materials QA are discussed in the following subsections.

2.2.1 Use of Contractor Test Results

Contractors have been assuming more sampling and testing responsibility in recent years in connection with the increasing use of statistically based QA specifications and alternative contracting methods that place more performance risk on industry. Including contractor quality control (QC) data in the acceptance decision allows for some optimization of DOT resources (even if, on the whole, the overall testing effort is not reduced).

In recognition of the increasing role played by industry in assuring materials quality, FHWA's sampling and testing regulation, "QA Practices for Construction," published at Title 23, Code of Federal Regulations, Part 637 (23 CFR 637), was revised in 1995 to expressly allow the use of contractor QC test results in a DOT's acceptance decision provided that:

• The sampling and testing is performed by qualified laboratories, using qualified sampling personnel.
• The DOT, or its designated agent (i.e., a consultant under direct contract with the DOT), validates the contractor's test results by performing some level of independent verification sampling and testing. (Use of a third-party testing and inspection firm hired by the contractor does not relieve the agency of its responsibility for verification. Likewise, splits of contractor-obtained samples cannot be used for verification purposes.)
• The QC sampling and testing is evaluated under an independent assurance (IA) program.
• The DOT has a dispute resolution system in place to resolve possible discrepancies between the contractor's QC and the agency's verification data.
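The validation requirement above is commonly implemented with an F-test (comparison of variances) and a t-test (comparison of means) on the contractor's QC data versus the agency's verification data, with a rule-of-thumb minimum verification rate of about 10% of the contractor's testing rate. The sketch below is a minimal illustration, not an agency-specified procedure: the function names and sample data are our assumptions, and the computed statistics would still need to be compared against tabulated critical values at the agency's chosen significance level.

```python
from math import sqrt, ceil
from statistics import mean, variance

def f_statistic(a, b):
    """F-test statistic: ratio of sample variances (larger over smaller)."""
    va, vb = variance(a), variance(b)
    return max(va, vb) / min(va, vb)

def t_statistic(a, b):
    """Two-sample t statistic with a pooled (equal-variance) estimate."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return abs(mean(a) - mean(b)) / sqrt(sp2 * (1 / na + 1 / nb))

def min_verification_tests(contractor_tests, rate=0.10):
    """Rule-of-thumb minimum verification sample count (10% of contractor's)."""
    return max(1, ceil(rate * contractor_tests))

# Hypothetical contractor QC vs. agency verification density results:
qc = [92.1, 93.0, 91.8, 92.6, 93.4, 92.2]
ver = [92.4, 91.9, 93.1]
F = f_statistic(qc, ver)
t = t_statistic(qc, ver)
# Compare F and t against tabulated critical values at the agency's
# significance level before combining the data sets for acceptance.
```

The statistics here only flag whether the two data sets plausibly come from the same population; the pass/fail decision against critical values, and any dispute resolution, remain agency policy.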
Table 2.1. Material classification based on production mode.

Project Produced
• Description: Items that are produced directly for a specific project, often at the project site
• Material Variability: Subsequent mixing, placing, compacting, finishing, curing, or other operations can substantively impact quality and material variability
• Examples: Earthwork, subbase and base courses, and pavement

Fabricated
• Description: Materials custom-made for a specific project under what are generally controlled conditions
• Material Variability: Properties are highly stable, assuming proper handling, transporting, and storage practices
• Examples: Structural steel and precast/prestressed concrete structural elements

Standard Manufactured
• Description: Items that are mass produced for routine use under highly controlled conditions
• Material Variability: Properties are highly stable, assuming proper handling, transporting, and storage practices
• Examples: Binders, paints and coatings, geosynthetics, landscaping items, piping, and traffic control devices

DOTs are expected to perform enough verification sampling and testing to be able to identify statistically valid differences between their results and those of the contractor. [The F-test (comparison of variances) and t-test (comparison of means) are commonly used together to validate contractor test data.] While there is no universally accepted standard, a minimum rate of 10% of the contractor's testing rate has been suggested as a rule of thumb. Construction Quality Assurance for Design-Build Highway Projects does acknowledge that verification rates may differ based on the risks involved, and offers as an example that structural concrete would likely require more verification testing than embankment materials (FHWA, 2012).

2.2.2 Small Quantity Acceptance

Several DOTs allow sampling and testing requirements to be waived or adjusted for small quantities of materials. For example, Washington State DOT (WSDOT), in Chapter 9 of its Construction Manual, provides project engineers with substantial latitude to judiciously adjust testing frequencies based on established guidelines (WSDOT, 2013).

In accordance with the WSDOT Construction Manual, project engineers may choose to accept small quantities of materials without meeting the minimum sampling and testing frequencies (e.g., by visual acceptance, certification, or other methods) if the proposed quantity for that material is less than the minimum required testing frequency. Other considerations that WSDOT project engineers may factor into their decision to use small quantity acceptance include whether or not:

• The material has been previously approved,
• The material is certified,
• A mix design or reference mix design is available,
• The material has been recently tested with satisfactory results, or
• The material is not structurally significant.
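The small-quantity screening just described can be sketched as a simple decision function. This is an illustrative reading of the policy, not WSDOT's actual procedure: the function name, argument names, and the rule that any single supporting consideration suffices are our assumptions, and in practice the project engineer weighs these factors with judgment.

```python
def small_quantity_acceptance_allowed(quantity, min_testing_quantity,
                                      previously_approved=False,
                                      certified=False,
                                      mix_design_available=False,
                                      recent_satisfactory_tests=False,
                                      structurally_significant=True):
    """Screen a material for small-quantity acceptance in lieu of the
    minimum sampling and testing frequencies (illustrative only)."""
    if quantity >= min_testing_quantity:
        return False  # quantity triggers the normal sampling and testing schedule
    # Supporting considerations a project engineer may weigh:
    supporting = (previously_approved or certified or mix_design_available
                  or recent_satisfactory_tests or not structurally_significant)
    return supporting

# 30 tons proposed against a hypothetical 100-ton minimum testing quantity,
# with a current mix design on file:
assert small_quantity_acceptance_allowed(30, 100, mix_design_available=True)
# A large quantity is tested normally even if certified:
assert not small_quantity_acceptance_allowed(300, 100, certified=True)
```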
WSDOT also allows use of small quantity acceptance for any quantity of the following:

• Curbs and sidewalks,
• Driveways and road approaches,
• Paved ditches and slopes, and
• Packaged concrete meeting ASTM C387 used for jobsite mixing.

2.2.3 Large Quantities under Control

DOTs will generally allow project engineers some discretion to reduce the frequency of sampling for high volume materials once initial testing has verified that the materials supply is relatively uniform or statistically under control and within specification limits. (However, such DOTs typically reserve the right to refocus and increase QA testing if the situation suggests that the process is changing and is not consistently under control.)

For projects with large quantities or volumes of materials, WSDOT's policy is that it may choose to reduce sampling frequency and/or eliminate selected test properties after ten consecutive samples taken at the standard testing frequency are shown to be well within the specification limits. If there are any failing tests, the sampling and testing frequency reverts to the normal schedule. The authority to approve deviations to testing frequencies is shown in Table 2.2.

2.2.4 Criticality of Materials or Activities

Some DOTs use or are investigating the use of tiered or risk-based processes to qualitatively evaluate and assign materials to different levels of QA based on their perceived criticality from

the perspective of difficulty to repair or replace, safety, maintenance cost, or cost of rework. Examples of such strategies include the following:

• The California DOT (Caltrans) Construction Quality Assurance Program Manual assigns materials to one of four tiers based on their consequence of failure. As summarized in Table 2.3, Tier 1 items are considered to have the greatest consequence of failure, while Tier 4 items have the least consequence (Caltrans, 2015).
• WSDOT's Construction Manual provides project engineers with the latitude to adjust normal acceptance procedures for minor or non-critical items as follows (WSDOT, 2013):
  – Project engineer discretionary materials acceptance. The project engineer may choose to modify the normal acceptance procedures for minor, non-critical work items occurring outside the traveled way. Acceptance in such cases typically entails verifying dimensional conformance to the plans and making a visual determination that the materials are suitable.
  – Optional approval/acceptance for materials. WSDOT's Construction Manual includes a list of materials that the project engineer may accept by visual inspection.
Table 2.2. WSDOT adjustments in testing frequency [Source: WSDOT, 2013].

Project Engineer
• May initiate and approve up to 10% deviations from the testing frequency schedule, with the exception of the following materials: hot-mix asphalt, warm-mix asphalt, structural concrete, and cement concrete pavement

Region Materials Engineers
• May approve requests from project engineers for:
  – An additional 10% deviation, with the exception of the following materials: hot-mix asphalt, warm-mix asphalt, structural concrete, and cement concrete pavement
  – Elimination of fracture and/or sand equivalent testing from a quarry site

State Materials Engineer or Assistant State Materials Engineer
• May approve requests:
  – To eliminate any other testing not expressly delegated to project engineers or region materials engineers
  – For sampling frequency deviations exceeding the authority of project engineers and region materials engineers
  – For sampling frequency deviations for hot-mix asphalt, warm-mix asphalt, structural concrete, and cement concrete pavement

Table 2.3. Caltrans tier levels based on consequence of failure [Source: Caltrans, 2015].

Tier 1 (Catastrophic): Failure is likely to cause loss of life or serious injury. Example items: structural steel, precast girders, pre-stressing
Tier 2 (Safety): Failure creates a safety hazard for employees or the public. Example items: delineation, safety barriers, lighting, signal controllers
Tier 3 (Interrupt Service): Failure or repair may cause an interruption in service or environmental impact. Example items: pavements, bases, embankment, storm water pollution prevention plan-best management practice devices
Tier 4 (Monetary): Monetary loss only. Example items: grass seed, drainage and irrigation products, fencing

If the quality

144 Guidelines for Optimizing the risk and Cost of Materials Qa programs or ability of the material to perform as intended is in question, the project engineer must determine if visual acceptance is appropriate or if additional acceptance testing or certifica- tion is necessary. The materials included on the list include items such as erosion control materials and miscellaneous fittings and hardware. • South Dakota DOT (SDDOT), as described in its manual of Required Samples, Test, and Certificates, implemented a 3-tiered process for certification of materials ranging from critical to non-critical on the basis of safety considerations or replacement costs (SDDOT, 2013). The certification requirements vary from certificates of compliance supported with test results to umbrella certificates for component products of a system or assembly. Verification methods for certification range from sampling and testing to documented inspection, random audits, or annual inspections of suppliers. • The Indiana DOT (INDOT) sponsored a research project performed by Purdue University to develop a tiered prioritization system for inspection resources. As described in a report prepared by Mostafavi and Abraham (2012), the objectives of this project were to evaluate the current inspection practices of INDOT and develop a risk-based inspection protocol to facilitate the efficient allocation of limited inspection resources to activities with higher risk consequences. Researchers at Purdue University conducted a risk analysis, using information obtained from 101 experts, representing INDOT, other DOTs, and consultants, to prioritize construction activ- ities based on the perceived risk impacts due to reduced inspection. The results of this analysis are summarized in Table 2.4. 
Based on this prioritization, a protocol for various inspection activities was developed that could be used to assist with the allocation of inspection resources when multiple activities are proceeding concurrently and available resources are not sufficient to fully inspect all ongoing activities. For example, site clearing, which was rated as a low priority activity, would require only random inspection. In contrast, the higher priority activities of

Table 2.4. List of prioritized construction activities for inspection [Source: Mostafavi and Abraham, 2012].

High Priority: Aggregate base courses; Asphalt paving; Bolting structural connections; Concrete paving; Driven piles; Embankment; Placement of concrete in structures; Post-tensioning (pre-stressed structures); Reinforcement steel in structures; Retaining walls; Structure rehabilitation (repair concrete deck)

Medium-High Priority: Beam erection; Pipe placement; Sub-grade treatment

Medium Priority: Drilled shafts; Guardrail; Overhead sign structure; Painting steel; Traffic marking

Medium-Low Priority: Barrier curb; Blasting; Concrete forms (structures); Drainage; Excavation; Handling/removal of regulated waste; Highway lighting; Installing soil erosion/sediment control items; Sound wall panel placement; Sound wall post placement; Traffic control—set up; Traffic signals; Cofferdam; Electrical conduit and wiring; Fence; ITS—fiber optic conduit and cable; Landscape plantings; Milling; Placement of lighting features; Seal coating; Sheet piles; Sidewalk

Low Priority: Clearing site; Clearing site—bridge; Stripping

base course and embankment construction would require frequent to constant inspection based on the specific inspection item in question (e.g., embankment lift height would require frequent inspection, whereas density would require constant inspection).

2.3 Summary

Several DOTs currently optimize their materials QA practices to some extent based on what is required for each material or product to assure the quality of the end product. The rationale for selecting a particular acceptance method (ranging from continuous or statistically based sampling and testing to certification and inspection) is often not well defined but may be based on factors such as material criticality, quantities, type/size of project, and project delivery method.

In general, the existing optimization efforts appear to be largely qualitative and informal in practice, with much discretion left to project engineers to modify rates or protocols based on engineering judgment. However, regardless of the level of rigor applied to current QA decision-making, the use of such optimization strategies suggests that a strong foundation and knowledge base exists for developing and implementing a more rational and in-depth process for optimizing the costs and risks of materials QA. Subsequent chapters of this guide thus focus on describing a flexible framework that DOTs may apply to determine how to efficiently allocate their QA resources.

Chapter 3. Level 1: Materials-Based Optimization

As introduced in Chapter 1, this guidebook presents a three-level analytical framework to help DOTs identify the QA acceptance plan necessary to meet specification requirements within an acceptable level of risk. Briefly, these levels can be described as follows:

• Level 1 entails conducting a qualitative risk rating of materials and then aligning these ratings with QA methods that can provide reasonable assurance of acceptable quality.
• Level 2 focuses on optimizing acceptance testing by emphasizing properties and test methods that are more direct indicators of performance.
• Level 3 compares the cost of different QA protocols to the cost of potential defects to arrive at an optimum QA investment point.

The focus of this chapter is on describing the Level 1 assessment process; Chapters 4 and 5 address Levels 2 and 3, respectively. Although this guidebook presents these three levels as a linear progression, the framework may be applied either in whole or in part, and in any order. The selection of which level(s) to use will depend in part on the application (e.g., programmatic or project-level assessment), the quality of input data, and the desired level of analytical rigor.

3.1 Conceptual Framework

The Level 1 assessment framework, as illustrated in Figure 3.1, provides a structured, risk-based decision process for prioritizing materials and QA activities on the basis of the probability and consequence of material failure or non-conformance. This knowledge is then used to effectively allocate QA resources to different project items on the basis of their perceived risk of failure. For example, high risk materials are aligned with acceptance methods designed to provide maximum confidence in the quality of the materials provided, whereas less resource-intensive methods are considered for lower risk materials.
3.2 Framework Steps

This section systematically leads users through each step of the Level 1 qualitative risk-based decision process. The initial step largely entails a planning effort to help users define and structure the specific QA problem or opportunity that they wish to address. Subsequent steps then entail the use of classic risk management principles to align materials with an appropriate level of QA.

Step 1. Identify Materials of Interest

Because QA protocols and risks are material-specific, it is important to first identify the specific materials or project elements that will be evaluated.

Given unlimited time and resources, DOTs would likely benefit from reviewing and optimizing the QA protocol for all materials included in their Standard Specifications. If, however, such an overarching study is not practical, smaller subsets of materials could be explored on a programmatic basis (e.g., safety-critical materials, fabricated items, project or field-produced materials). Similarly, if operating on a project level rather than a programmatic basis, one would have to decide whether to optimize the QA processes for all pay items or, for example, only a few select or high priority items.

Step 2. Assess the Risk of Non-Conformance for Each Material of Interest

Qualitative risk management principles can be used to prioritize the materials of interest on the basis of their relative risk of failure (or non-conformance). Once materials have been prioritized in this manner, QA resources can be allocated accordingly (i.e., the higher the risk of failure for a given material, the greater the amount of resources that should be dedicated to its QA).

Figure 3.1. Level 1 assessment:
• Step 1: Identify materials of interest. (Considerations: programmatic evaluation to assess all QA practices; project-specific evaluation to scale QA effort to project characteristics; evaluation of select or high priority materials.)
• Step 2: Assess the risk of non-conformance for each material of interest. (Risk assessment: material risk = probability × impact.)
• Step 3: Assign material tier levels based on risk scores.
• Step 4: Align QA methods with material risk tiers and modes of production.
• Step 5: Refine selection of QA method(s) as necessary.

As described in greater detail below, determining a given material's risk of failure requires assessing:

• The probability (or likelihood) of receiving non-conforming material and
• The impact (or consequence) of the material failing to meet specification. For example, possible impacts could range from low (e.g., acceptance at reduced pay or increased maintenance) to high (e.g., early failure of a safety critical material or product).

The risk of failure can then be determined as the product of the probability and impact ratings:

Risk = Probability × Impact (Eq. 3.1)

a. Assess Probability of Non-Conformance

For each material of interest, the likelihood or probability of that material failing to meet specifications must be determined. The likelihood of non-conformance will be driven in part by the material's inherent variability. For example, one would expect standard manufactured items produced under highly controlled conditions to have more stable properties (and thus be more likely to conform to specifications) than project-produced materials that are subject to subsequent mixing, compacting, finishing, curing, or other operations at the jobsite that could substantively impact quality and material variability.

Although a single person could make this determination, typically a group of experts and interested stakeholders will systematically arrive at this assessment in a workshop setting. As individuals initially may hold differing attitudes and tolerances toward risk, all participants must come to share a common understanding of the probability of occurrence (e.g., high probability should mean the same thing to all participants) for such a process to be effective. A key initial task is therefore to define the probability scales that will be used to assess risk.
Establishing this range upfront lends structure to the analysis exercise and ensures that all participants are viewing probability in a consistent manner. Table 3.1 provides an example of possible risk probability definitions. As shown, either numerical (1, 2, 3) or adjectival (low, medium, high) rating scales may be used.

b. Assess Impact of Non-Conformance

For each material of interest, the project team must also determine the potential impact associated with non-conformance. For example, the assessment could range from no impact on performance or safety to a potential catastrophic impact requiring complete removal and replacement of a safety critical item.

Table 3.1. Example of risk probability definitions.

Numerical Rating | Adjectival Description | Definition
1 | Non-conformance is unlikely | <15%
2 | Non-conformance is somewhat likely | >15% to <45%
3 | Non-conformance is likely | >45% to <75%
4 | Non-conformance is highly likely | >75% to <95%
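Using bands like those in Table 3.1, an observed or estimated historical non-conformance rate can be translated into a numerical probability rating. A minimal sketch follows; the function name and the handling of rates at and above 95% are our assumptions, and an agency would substitute its own bands.

```python
def probability_rating(nonconformance_rate):
    """Map an observed non-conformance rate (0.0 to 1.0) onto the
    example 1-4 rating scale of Table 3.1."""
    if nonconformance_rate < 0.15:
        return 1  # non-conformance is unlikely
    if nonconformance_rate < 0.45:
        return 2  # somewhat likely
    if nonconformance_rate < 0.75:
        return 3  # likely
    return 4      # highly likely

# A manufactured item failing 5% of historical lots rates a 1;
# a field-produced material failing 30% of lots rates a 2.
assert probability_rating(0.05) == 1
assert probability_rating(0.30) == 2
```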

Similar to estimating probabilities, all participants must share a common understanding of "impact" for the process to be effective. Possible perspectives from which to consider impacts include safety, difficulty to repair or replace, maintenance costs, and cost of rework. Before conducting the assessment, the evaluators should reach agreement on the impact definitions that will be used. The tables below provide examples of possible ways to define the impact of material non-conformance, with Table 3.2 focusing on performance issues and Table 3.3 on safety criticality.

c. Determine the Risk Score

Once probability and impact ratings are determined, they can be combined (by multiplying the probability and impact ratings or by using a probability-impact matrix similar to that shown in Figure 3.2) to arrive at a "score" for each material of interest. This score reflects the combined effect of a material's probability of non-conformance and the estimated severity of any unmitigated consequences associated with that non-conformance. Figure 3.2 provides an example probability-impact matrix, which incorporates the probability and impact rating scales previously shown in Tables 3.1 and 3.2.

Table 3.2. Example of risk impact definitions.

Numerical Rating | Adjectival Description | Definition
1 | Minimal Impact | Little if any impact to service life
2 | Some Impact | Earlier than planned maintenance needed
3 | Significant Impact | Earlier than planned major rehabilitation needed
4 | Catastrophic Impact | Immediate intervention needed

Table 3.3. Alternative example of risk impact definitions (adapted from Caltrans).

Numerical Rating | Adjectival Description | Definition
1 | Monetary Impact | Monetary loss only
2 | Service Interruption | Failure or corrective action may cause an interruption in service or an environmental impact
3 | Safety Impact | Failure creates a safety hazard for employees or the public
4 | Catastrophic Impact | Failure is likely to cause loss of life or serious injury

In the example matrix shown in Figure 3.2, the probability rating is multiplied by the impact rating to arrive at an overall risk score. This score will then be used in the following steps to prioritize materials for the purpose of efficiently allocating QA resources on the basis of material criticality.

Step 3. Assign Material Tier Levels Based on Risk Scores

The risk scores established in Step 2 can then be used to assign materials to different tiers of material criticality using a scale similar to that shown in Table 3.4, which translates the risk score (probability × impact) to different materials tiers (low, moderate, and high risk) based on the risk of material non-conformance.

Note that the risk scores included in Table 3.4, along with the other risk-related definitions and criteria included throughout this manual, are intended to be illustrative only. Users should tailor these rating scales based on their own agency's overall tolerance or appetite for risk. It is also important to bear in mind that risk tolerance (as reflected in the probability and impact definitions, probability-impact matrix, and risk scoring cutoffs) may change over time. Changing circumstances (e.g., heightened public, political, or regulatory scrutiny) may trigger a reevaluation of what constitutes an acceptable level of risk for the agency.

Step 4. Align QA Methods with Material Tiers and Modes of Production

The previous risk assessment step focused on evaluating material non-conformance risk. In this step, the focus shifts to identifying an appropriate level of QA given this material risk.

Figure 3.2. Example probability-impact matrix (cell value = probability rating × impact rating).

Probability \ Impact | 1 Minimal | 2 Some | 3 Significant | 4 Catastrophic
4 Highly Likely | 4 | 8 | 12 | 16
3 Likely | 3 | 6 | 9 | 12
2 Somewhat Likely | 2 | 4 | 6 | 8
1 Unlikely | 1 | 2 | 3 | 4
Table 3.4. Example material tiers.

Risk Score | Material Tier | Description
Risk Score > 8 | Tier 1 | Materials having the greatest risk of failure
2 ≤ Risk Score ≤ 8 | Tier 2 | Moderate risk materials
Risk Score < 2 | Tier 3 | Low risk materials
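Combining Equation 3.1 with cutoffs like those in Table 3.4, a material's tier follows directly from its probability and impact ratings. The sketch below uses the example cutoffs only; the function names and the handling of boundary scores (exactly 2 or 8) are our assumptions, and agencies would substitute their own scales.

```python
def risk_score(probability, impact):
    """Eq. 3.1: risk = probability x impact, each rated 1-4."""
    return probability * impact

def material_tier(score):
    """Example Table 3.4 cutoffs: above 8 is Tier 1, roughly 2 through 8
    is Tier 2, below 2 is Tier 3 (boundary handling assumed)."""
    if score > 8:
        return 1   # greatest risk of failure
    if score >= 2:
        return 2   # moderate risk
    return 3       # low risk

# A material likely to be non-conforming (3) with significant impact (3):
assert material_tier(risk_score(3, 3)) == 1   # score 9, Tier 1
# Unlikely non-conformance (1) with minimal impact (1):
assert material_tier(risk_score(1, 1)) == 3   # score 1, Tier 3
```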

Risks related to materials quality are generally managed using some combination of prevention and appraisal techniques:

• Prevention techniques refer to measures taken to avoid poor quality. These could include prequalification (e.g., of materials, sources of supply, contractors, fabricators), submittal reviews, and preconstruction meetings.
• Appraisal techniques refer to methods, such as inspection and sampling and testing, that are used to determine the degree of material conformance to specifications.

For the purpose of this step, the distinction between prevention and appraisal is not as important as the idea that assuring quality often requires use of multiple methods that vary in cost and effectiveness. A key step in the optimization process therefore entails first identifying the full spectrum of QA options available and then aligning these methods to the risk-based material tiers established in Step 3.

a. Identify All Possible QA Methods

As illustrated in Figure 3.3 and defined in greater detail in Appendix A, DOTs currently use a variety of methods to assure project and/or material quality, including but not limited to the following practices:

Prevention Methods:
• Materials prequalification
  – Qualified (or authorized) products list
  – Manufactured to national quality standard
  – Pre-approved sources of supply
• Qualification requirements for facilities, contractors, and personnel
  – Prefabrication audit
  – Authorized facility list
  – Authorized plant
  – Contractor qualifications
  – Qualifications for personnel

Figure 3.3. Example QA methods.

• Submittals
  – Review of working drawings
  – Review of mix designs
  – Review of quality management plans

Appraisal Methods:
• Contractor quality control sampling and testing
• Certificates of compliance
  – Certificate of compliance from a producer
  – Certificate of compliance from a producer with test results
• Sampling and testing
  – Source and field
  – Contractor and agency
• Inspection
  – Benchmark, intermittent, or continuous
• Warranties

Similar to most quality systems, the methods listed above represent a mix of measures used to both prevent and appraise material quality. Although it is possible that under certain circumstances one method alone could be used to assure the desired level of quality and performance of a material (e.g., requiring the use of materials selected from a qualified products list), the use of multiple methods in series or in combination is more typical for most construction materials.

For example, to assure the quality of hot-mix asphalt (HMA), DOTs will often require the use of prequalified constituent materials, submission of mix designs for DOT review, contractor QC, and DOT acceptance sampling and testing. Then, even if produced correctly, proper placement and final processing are still necessary to ensure an HMA pavement's final in-place quality. Improper handling and placement can cause segregation of the aggregate components of the product, thereby negatively impacting final uniformity, porosity, or ride quality. To ensure quality, inspection is therefore also a critical element of a QA program even when significant sampling and testing is being performed.

b.
Align Possible QA Methods to Material Tiers and Production Modes

The risk-based materials tiers established in Step 3 can be used to determine the type and level of QA needed for a particular item; the higher the risk (likelihood and impact) of failure for a given material, the greater the level of resources that should be allocated to its QA. For example, certification and intermittent inspection may be sufficient for low risk Tier 3 materials, whereas more frequent inspection and sampling and testing may be necessary for Tier 1 and 2 materials.

Focusing first just on an owner's materials acceptance practices, this risk-based philosophy of QA resource allocation can be applied to align the material tiers established in Step 3 to acceptance methods that would provide an appropriate degree of confidence in the quality of the material provided. Table 3.5 provides an example of how acceptance methods could vary based on material tiers. In addition to the material tiers, it is also important to recognize, as reflected in Table 3.5, that the mode of material production (i.e., project-produced, fabricated, and standard manufactured item) will also influence the selection of an appropriate materials acceptance strategy. As discussed previously in Chapter 2, materials produced in a similar manner will generally exhibit

similar characteristics regarding variability and stability of properties, and will thus likely require comparable levels and types of QA to assure product acceptability. Additional examples of risk-based materials acceptance methods are provided in case studies from WSDOT and Caltrans (see Boxes 3.1 and 3.2, respectively).

Looking beyond an owner's materials acceptance practices, a comprehensive materials QA program, as noted above, would likely entail the use of multiple QA methods (including both prevention and appraisal techniques), some of which may be the responsibility of the owner, while others are performed by industry. Table 3.6 therefore expands on the acceptance methods presented in Table 3.5 to provide a broader summary of the various QA techniques that are used to assure material quality (and thus contribute to a material's total cost of quality). The table is by no means exhaustive in its identification of possible QA methods, nor is it meant to suggest that DOTs should incorporate all of the identified methods into their QA program. For example, some QA methods, such as performance warranties on materials produced at the jobsite, may be challenging to implement. Users should tailor the table to the various QA methods employed by their own agencies. Furthermore, the suggested QA solutions in the table (as denoted by the "x" in the matrix) are not intended to be definitive. A method to further refine these selections is provided in Step 5.

Step 5. Refine Selection of QA Methods

Step 4, and more specifically Table 3.6, provides a starting point for identifying QA methods that may be appropriate for materials falling within a given tier. To help further refine

Table 3.5. Alignment of primary acceptance methods to material tiers and production modes.

Tier 1 (QA methods designed to provide maximum confidence in the quality of the materials provided)
• Project-Produced: DOT acceptance sampling and testing, or DOT verification sampling and testing (if using contractor QC data)
• Fabricated: Continuous DOT fabrication inspection combined with a requirement for a fabricator quality management system plan
• Manufactured: Certificates of compliance backed by periodic sampling and testing by the manufacturer and random DOT verification testing

Tier 2 (QA methods designed to provide a high level of confidence in the quality of the materials provided)
• Project-Produced: Same as Tier 1
• Fabricated: Intermittent DOT inspection combined with certificates of compliance
• Manufactured: Certificates of compliance backed by random or programmatic sampling and testing by the manufacturer

Tier 3 (QA methods designed to ensure that the specified material has been supplied)
• Project-Produced: Random or programmatic sampling and testing
• Fabricated: Random or programmatic assessment of the fabricator combined with certificates of compliance
• Manufactured: Certificates of compliance or catalogue cuts
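The alignment in Table 3.5 is essentially a lookup keyed on material tier and production mode. A sketch of that mapping follows; the dictionary structure, key spellings, and abbreviated method descriptions are our assumptions, condensed from the table.

```python
# (tier, production mode) -> primary acceptance method, abbreviated from Table 3.5
ACCEPTANCE = {
    (1, "project-produced"): "DOT acceptance (or verification) sampling and testing",
    (1, "fabricated"):       "Continuous DOT fabrication inspection plus fabricator QMS plan",
    (1, "manufactured"):     "Certificates of compliance plus periodic manufacturer "
                             "testing and random DOT verification",
    (2, "project-produced"): "DOT acceptance (or verification) sampling and testing",
    (2, "fabricated"):       "Intermittent DOT inspection plus certificates of compliance",
    (2, "manufactured"):     "Certificates of compliance plus manufacturer sampling and testing",
    (3, "project-produced"): "Random or programmatic sampling and testing",
    (3, "fabricated"):       "Random fabricator assessment plus certificates of compliance",
    (3, "manufactured"):     "Certificates of compliance or catalogue cuts",
}

def primary_acceptance_method(tier, mode):
    """Look up the primary acceptance method for a tier/production mode pair."""
    return ACCEPTANCE[(tier, mode)]

# A low risk, standard manufactured item (e.g., fencing):
assert "catalogue cuts" in primary_acceptance_method(3, "manufactured")
```

Expressing the table as data rather than prose makes it easy for an agency to audit coverage (all nine tier/mode pairs are present) and to swap in its own methods.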

154  Guidelines for Optimizing the Risk and Cost of Materials QA Programs

Box 3.1. WSDOT Risk Rating of Materials

Washington State DOT (WSDOT) performed an internal programmatic materials risk analysis between 2002 and 2005 to develop a system to more formally evaluate the risk of materials and to determine the level of assurance needed to accept each construction material based on that perceived risk. As documented in the WSDOT research report entitled Materials Risk Analysis (Baker et al., 2010), WSDOT first developed a conceptual internal ranking of QA acceptance practices, ranging from the most intensive level of scrutiny to the least intensive, as follows:

• Level 1: Highest level—WSDOT acceptance testing, or a combination of fabrication inspection coupled with a requirement for a manufacturer's quality system plan (e.g., structural steel)
• Level 2: Second highest level—Requires a manufacturer's certification of compliance coupled with a quality system plan (e.g., soil nails, structural earth walls, ground anchors, and guardrail)
• Level 3: Intermediate level—Either a certification from the contractor that the supplied material meets contract requirements or a catalog cut stating the qualities of the material being used (e.g., fencing, compost, soil amendments, and other non-structural items that do not require testing or certification)
• Level 4: Lowest level—Visual inspection in the field to verify that the correct product is used

WSDOT then plotted these QA acceptance levels on a five-by-five risk rating matrix to align QA levels with risk ratings. Once this framework was established, WSDOT followed a Delphi process to assign risk ratings for all the materials in its program. Material risk ratings were determined by considering each material's chance (or probability) of failure and the consequences of this failure.

As a result of this analysis, WSDOT was able to adjust or optimize the level of QA for some materials that previously were subject to more rigorous QA. The number of materials requiring the highest level of examination (i.e., acceptance testing and/or manufacturer's quality system plan) decreased from 98 to 88. Similarly, materials in the second level of acceptance also decreased, while the number of materials that could be accepted based on visual inspection or certification increased accordingly.
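The essence of a matrix like WSDOT's, which maps probability and consequence scores to an acceptance level, can be sketched as a small function. The 1–5 scoring and the cut-off scores below are illustrative assumptions, not WSDOT's actual matrix assignments (which are documented in Baker et al., 2010):

```python
def qa_level(probability, consequence):
    """Map a material's probability-of-failure and consequence scores
    (each 1-5) to a QA acceptance level, 1 = most intensive scrutiny,
    4 = least. Thresholds below are illustrative, not WSDOT's values."""
    if not (1 <= probability <= 5 and 1 <= consequence <= 5):
        raise ValueError("scores must be between 1 and 5")
    score = probability * consequence   # combined risk score, 1..25
    if score >= 16:
        return 1   # acceptance testing and/or quality system plan
    if score >= 9:
        return 2   # manufacturer's certification + quality system plan
    if score >= 4:
        return 3   # contractor certification or catalog cut
    return 4       # visual inspection in the field
```

A Delphi exercise would supply the probability and consequence scores per material; the function then standardizes how those scores translate into the four acceptance levels.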

Box 3.2. Caltrans Risk Rating of Fabricated Materials

The Caltrans Office of Structural Materials (OSM) has initiated a qualitative risk-based decision framework for acceptance of fabricated materials. The Material Plant Quality Program certification is used for plant-produced materials. The program assures that the QA effort is scaled correctly by deploying inspectors through a risk-based decision process that determines the level of inspection (i.e., from programmatic to intermittent to continuous). High-priority/high-risk items are inspected or tested at higher inspection levels. Project type is also considered in the evaluation, with higher-profile or higher-risk projects subjected to more rigorous QA.

The qualitative risk process involves a material and workmanship probability assessment (low-medium-high) and an impact assessment scaled according to material failure and project types. The risk assessment determines the level of plant inspection (QA effort) required, ranging from low (programmatic) to high (continuous inspection). A low or programmatic level might not entail a shop inspection, whereas higher levels would entail intermittent shop inspections or a full-time plant inspector. Caltrans distinguishes between projects with a regular production schedule and those with accelerated or higher-impact production schedules, which tend to increase the level of QA required.

Table 3.6. Alignment of QA methods to material tiers and production modes.
(Matrix columns: production mode—Project Produced, Fabricated, Manufactured—and material risk tier—Tier 1, Tier 2, Tier 3. x = compatible; blank cell = incompatible.)

Materials Prequalification
• Qualified (or Authorized) Products List (1)
• Tested Stock
• Manufactured to National Quality Standard (i.e., NTPEP)
• Commercial Quality
• Pre-approved Source

Qualification Requirements for Facilities, Contractors, and Personnel
• Prefabrication Audit
• Authorized Facility List
• Authorized Laboratory
• Approved Manufacturer
• Contractor Qualifications
• Qualification Requirements for Sampling/Testing Personnel
• Qualification Requirements for Installer/Fabricator Personnel

Submittals
• Agency Review of Contractor Working Drawings
• Agency Review of Contractor Mix Designs
• Agency Review of Quality Management Plan

Sampling and Testing by Contractor
• Quality Control Sampling and Testing (Source)
• Quality Control Sampling and Testing (Jobsite)
• Quality Control Sampling and Testing for Acceptance

Sampling and Testing by Agency
• Verification Sampling and Testing
• Programmatic QA Inspection and Testing (Jobsite)
• Acceptance Sampling and Testing
• Independent Assurance Testing (Project Basis)
• Independent Assurance Testing (System Basis)

Table 3.6. (Continued).

Certifications
• Certificate of Compliance (1)
• Certificate of Compliance with Test Results

Inspection
• Quality Control Inspection
• Continuous Inspection of Work In Progress
• Intermittent Inspection of Work In Progress
• Benchmark Inspection

Warranties
• Material and Workmanship Warranty
• Performance Warranty
• Manufacturer's Guarantee

(1) For project-produced materials, this method could be applied to constituent materials.
x = compatible; blank cell = incompatible

the selection of appropriate QA methods, Table 3.7 below summarizes some additional project-related considerations that can influence the QA strategy for a given material and/or project.

To further optimize the QA strategy for a given material requires taking a closer look at its failure scenarios to identify:

• The different threats that could lead to material non-conformance,
• The consequences associated with these failure modes,
• QA measures that could be used to manage (i.e., prevent or detect) the specific threats identified, and
• Possible response or recovery measures that could be used to mitigate the identified consequences.

This step provides a structured framework for analyzing such causal relationships to identify fit-for-purpose QA strategies that will help ensure ongoing minimization and management of risks related to material quality. Unlike prior steps, the focus here is not on assessing the level of risk but on demonstrating effective control of these risks.

a. Identify the Threats and Consequences

Designing an effective and efficient QA program for a given material requires first identifying the specific threats that could lead to the material's non-conformance, as well as the potential

Table 3.7. Factors that can influence selection of QA methods.

Factor: Contractor/supplier qualifications and experience
Considerations: Does the contractor, fabricator, or supplier (as applicable) have a history of consistently acceptable performance (i.e., compliance with specifications or with national or regional quality standards, such as NTPEP)? Are the materials being provided from a pre-approved source, qualified products list, or similar? Does the contractor/supplier have a quality management plan?
Possible optimization strategies: If the DOT has confidence in the reliability of the contractor or supplier, consider the following options:
• Reduce sampling and testing and/or inspection frequency
• Consider programmatic, system-wide, or regional testing or certification
• Accept by certification backed by desktop audit of certification documentation and visual inspection

Factor: Small material quantities
Considerations: Is the planned quantity less than the minimum required test lot? Has the material been previously approved or certified? Has the material been recently tested with satisfactory results? Is the material structurally significant?
Possible optimization strategies:
• Reduce sampling and testing frequency
• Accept based on visual inspection or certification

Factor: Large quantities
Considerations: Has initial testing verified that the material supply is relatively uniform or statistically under control?
Possible optimization strategies:
• Reduce sampling and testing frequency
• Eliminate selected test properties (e.g., focus on end-result testing)

Factor: Project criticality or complexity
Considerations: Does the project have a low risk profile based on size, location, or complexity? Is the work item in question located outside the traveled way? Is the item non-structural?
Possible optimization strategies: For a low-risk project (e.g., low-volume rural roadway or culvert reconstruction), consider:
• Acceptance based on visual inspection
• Acceptance based on certification backed by random or periodic tests

Factor: Project delivery method
Considerations: Is the project being delivered using an alternative delivery method (e.g., design-build) that places more responsibility for performance on industry? Is the contractor required to submit and implement a detailed, project-specific quality management plan? Will the material be covered under a warranty or post-construction maintenance provision?
Possible optimization strategies:
• Use contractor QC test data, independently verified by the DOT or its agent, in the acceptance decision
• Use more performance-oriented or end-result tests

consequences that could result from its failure. In essence, this entails providing a response to the following questions: What can cause the material to fail? How can control be lost? What consequences are associated with these different failure modes?

For example, possible threats could include limited owner QA resources, contractor inexperience, and material-specific issues (e.g., poor constituent materials in a mix design, or unsuitable local material sources). Consequences could include service interruptions, reduced design life, and safety issues.

b. Identify Control Measures to Minimize Threats

Once a comprehensive set of threats and consequences has been identified, the next step is to evaluate what controls (i.e., QA methods) could be used to prevent or otherwise monitor each individual threat. For example, to help prevent the threat of unsuitable local material sources, one could require the contractor to supply materials from agency pre-approved suppliers/sources or require higher levels of source testing. From the perspective of QA optimization, tailoring a QA protocol to address the specific issues or events that could lead to a material's failure can help minimize the implementation of unduly burdensome QA practices and unnecessary procedures that are disproportionate to the actual threats posed.

c. Identify Recovery Measures to Minimize Consequences

Despite the implementation of controls, a material non-conformance may still occur. As part of the overall QA strategy, it is therefore important to also consider what recovery methods could be used to minimize the consequences associated with material non-conformance. For example, if materials are non-compliant, the agency could either accept the non-conforming material at reduced payment or require removal and replacement of the smaller production lot or element.

A structured approach used across industries to facilitate the above analysis, particularly when performed in a workshop setting, is the Risk Bowtie method. As discussed further in Box 3.3, bowties are a useful technique for analyzing and communicating the interactions between risk causes, effects, and controls.

If the selected QA strategy ultimately involves sampling and testing, additional refinement can be achieved by performing a Level 2 assessment. As described in Chapter 4, the Level 2 analysis optimizes acceptance testing by emphasizing properties and test methods that are more direct indicators of performance and by establishing a risk-based approach to determine the appropriate level of owner verification testing (if contractor test data is being used). If the agency has access to cost data (or reasonable approximations thereof), a Level 3 assessment, as described in Chapter 5, can be performed to quantitatively arrive at an optimum QA investment point by comparing the cost of different QA strategies to the cost of potential defects.
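The Level 3 idea of balancing QA cost against the cost of potential defects can be illustrated with a toy calculation: total cost of quality is the cost of the QA effort plus the expected cost of escaped non-conformances, and the optimum strategy minimizes that total. The strategy names, costs, and escape probabilities below are hypothetical:

```python
def optimal_strategy(strategies, consequence_cost):
    """Pick the QA strategy minimizing total cost of quality:
    QA cost + (probability a non-conformance escapes) x consequence cost.
    strategies maps name -> (qa_cost, probability_of_escape)."""
    def total(item):
        qa_cost, p_escape = item[1]
        return qa_cost + p_escape * consequence_cost
    return min(strategies.items(), key=total)[0]

# Hypothetical figures for one material item (dollars, probabilities).
strategies = {
    "certification only":       (5_000, 0.10),
    "reduced sampling/testing": (20_000, 0.03),
    "full acceptance testing":  (60_000, 0.01),
}
best = optimal_strategy(strategies, consequence_cost=500_000)
```

With these assumed numbers, the intermediate strategy wins: full testing costs more than the risk it removes, while certification alone leaves too much expected failure cost on the table.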

Box 3.3. Using Risk Bowties to Refine QA Strategy

The Bowtie method is a structured risk evaluation technique that can be used to qualitatively assess and demonstrate effective control of risks. Popularized by the Royal Dutch/Shell Group, the Bowtie method is now used by many organizations, particularly in the chemical, oil and gas, and airline industries, to:

• Evaluate risk scenarios (threats and impacts) that could exist around a certain hazardous event (typically involving safety or quality), and
• Identify the measures that will be used to control these risks.

The Bowtie combines elements of fault tree (causes) and event tree (consequences) analyses to arrive at a single diagram, resembling a bowtie, which can be used to analyze and communicate the interactions between risk causes, effects, and controls.

The method, which is typically applied in a workshop setting, is built out piece-by-piece by having knowledgeable people talk through the different elements of a risk scenario (causes, impacts, controls, and recovery measures), as well as the linkages between these elements. The steps to this process include the following:

1. Identify the hazard of concern. Place the undesired or hazardous event (e.g., material non-conformance or material hazard) in the center of the model.

2. Assess the specific threats that could cause the hazard. Identify the potential threats that could cause the hazard to occur. In the case of material non-conformance, these could include:
• Inexperienced contractor/fabricator/producer
• Owner constraints related to QA resources (limited testing/inspection staff)
• Material-specific issues (e.g., for HMA, this could include segregation, rutting, cracking)

3. Assess the impacts of each threat. Identify the potential impacts if the undesired event were to occur. In the case of material non-conformance, these could include:
• Premature failure (or reduced design life)
• Service interruption
• Safety concerns
• Aesthetic issues

4. Identify controls (i.e., QA methods) that could be used to minimize the possibility that threats will materialize. For example:
• Use of prequalified materials
• Contractor qualifications
• Sampling and testing
• Inspection

5. Identify recovery measures that can be used to mitigate the potential impacts of the threat should it occur. For example:
• Accept at reduced pay
• Remove and replace
• Require warranty
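In code form, the bowtie built in the steps above reduces to a small data structure pairing each threat with its preventive controls and each impact with its recovery measures. The sketch below uses Box 3.3's example entries; a workshop would of course produce a longer, material-specific inventory:

```python
# A bowtie as plain data: threats -> controls on the left side,
# impacts -> recovery measures on the right. Entries follow Box 3.3.
BOWTIE = {
    "hazard": "material non-conformance (HMA)",
    "threats": {
        "inexperienced contractor/fabricator/producer": ["contractor qualifications"],
        "limited owner testing/inspection staff":       ["use of prequalified materials"],
        "mix segregation":                              ["sampling and testing", "inspection"],
    },
    "impacts": {
        "premature failure":    ["remove and replace", "require warranty"],
        "service interruption": ["accept at reduced pay"],
    },
}

def uncontrolled_threats(bowtie):
    """Return threats with no preventive control -- gaps the workshop
    should resolve before the QA strategy is considered complete."""
    return [t for t, controls in bowtie["threats"].items() if not controls]
```

A simple completeness check like `uncontrolled_threats` is one reason to capture the bowtie as data rather than only as a diagram: every threat without a control, or impact without a recovery measure, is immediately visible.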

Chapter 4

Level 2: Property-Based Optimization

The Level 1 optimization process discussed in Chapter 3 provides a framework for identifying suitable method(s) for assuring that a given material will be in reasonably close conformance to the approved specifications. For materials that are to be accepted by sampling and testing, the optimization process presented in this chapter provides a systematic approach for further evaluating

• properties and test methods that are more direct indicators of performance and
• the appropriate level of owner verification testing (if contractor test data is being used in the acceptance decision).

4.1 Conceptual Framework

The Level 2 assessment framework, as illustrated in Figure 4.1, provides a structured decision process for prioritizing material sampling and testing activities on the basis of the risk (i.e., likelihood and consequences) of non-conformance of key material properties.

4.2 Framework Steps

The framework shown in Figure 4.1 includes a six-step process to determine the extent to which acceptance testing can be optimized or reduced based on performance risk.

Step 1. Identify Materials of Interest

The Level 2 optimization process applies to materials that are accepted on the basis of sampling and testing, a resource-intensive method that is primarily used for higher risk or more critical materials.

Similar to the Level 1 assessment process, a key first step to fully defining the optimization problem entails identifying the specific materials of interest. One may wish to assess all material items requiring sampling and testing, or focus only on specific items having the greatest potential for optimization.

Step 2. Review Existing Acceptance Plans for Materials of Interest

For each material of interest, the DOT's current sampling and testing plan should be reviewed, bearing in mind that although such QA practices are likely based on methods that

have historically produced satisfactory quality, they may fail to take advantage of possible efficiencies to be gained from

• recent developments in the understanding of materials behavior,
• advances in non-destructive testing technology, and
• increasing use of performance specifications and alternative delivery methods that shift more responsibility for quality management to industry.

Figure 4.1. Level 2 assessment. (Flowchart: Step 1, identify materials of interest; Step 2, review the existing acceptance plan, considering properties, sampling plan, test methods, targets/thresholds, and pay factor adjustments; Step 3, evaluate the importance of acceptance properties as indicators of performance and the cost of acceptance sampling and testing; Step 4, assess the benefit of using alternative or advanced acceptance properties and test methods; Step 5, identify project-level considerations that may influence sampling and testing effort, including contractor/supplier qualifications, material quantities, project criticality or complexity, and project delivery method; Step 6, determine if acceptance/verification testing can be optimized based on material risk and property importance.)

Step 3. Evaluate the Importance of Acceptance Properties

Most, if not all, tested properties routinely used for acceptance have some relationship to product performance. However, DOT acceptance testing of less important properties can result in inefficient allocation of an agency's QA resources and unnecessary project costs. The resources allocated to the sampling and testing effort for a particular material property should be consistent with

• the criticality of the contract item (see the Level 1 process described in Chapter 3) and
• the ability of the individual property to act as an indicator of performance.

When reviewing the existing QA strategy for a given material (or alternatively, when selecting acceptance properties for a new pay item), consider the following questions to help determine the relative importance of a property as an indicator of performance:

• Is the property more strongly or less strongly associated with distresses?
• What is the likelihood that property non-conformance would result in reduced or impaired performance?
• Are any properties more suited for the contractor's quality control function (e.g., gradation for HMA) than for agency acceptance testing?
• Does the property provide a measure of fundamental material characteristics (e.g., stiffness, fatigue)?
• Is the acceptance property used to determine a payment adjustment or to support a remove-and-replace decision?

From the perspective of optimization, additional questions to consider may include:

• Is the standard frequency of sampling and testing for certain properties relatively higher than for others? What test(s) require the most resources?
• What test(s) can only be performed in the state laboratory?
• What test(s) require special equipment or expertise?
• What properties require destructive testing?

Answering such questions requires a thorough understanding and evaluation of each property being considered for inclusion in the QA plan. Historical data obtained from the agency's project quality management records or asset management system may provide a reliable source of information to support the decision process.

Table 4.1. Assignment of material property tiers based on property importance.

Primary
Description: Property has a strong relationship to performance and/or has the highest risk related to non-compliance.
Suggested level of QA: QA methods designed to provide maximum confidence in the quality of the materials provided.

Secondary
Description: Property has a moderate or less direct relationship to performance; risk related to non-compliance is moderate.
Suggested level of QA: QA methods designed to provide an adequate level of confidence in the quality of the materials provided. For optimization purposes, consider:
• Reducing the level of acceptance testing after start-up for material properties demonstrated to be under control, or
• Using alternative non-destructive sampling and testing strategies to expedite testing and increase the sample size.

Observational
Description: Property has an indirect relationship to performance, and risk related to non-compliance is low.
Suggested level of QA: QA methods entailing observational verification of contractor/producer tests combined with intermittent to random inspection, sampling, and testing of in-progress work.

Based on such an analysis, any given material property can be characterized as being a primary, secondary, or observational indicator of performance, as described in Table 4.1. If analysis

indicates that a property is strongly associated with common failure modes, then the property could be considered a primary indicator of performance.

It is also important to remember that, as a general rule, properties tested in the end product are more closely related to end-product performance than properties tested during production. Thus, testing of the in-place (in-situ) product (or at the point of placement), while not currently practical for some materials, is a desirable goal.

Step 4. Assess Benefit of Using Alternative or Advanced Acceptance Properties

Current sampling and testing protocols for material acceptance often rely on destructive testing of the in-place product. Although such testing generally provides more accurate and reliable results, it tends to be costly and time intensive. For example, obtaining core samples for HMA can be considerably more resource intensive than non-destructive test (NDT) methods. NDT methods, though often less precise, yield a higher volume of samples (or continuous sampling), which can offset this disadvantage. Examples include ground penetrating radar (GPR) as a measure of durability (in-place air voids, thickness) and intelligent compaction to more rapidly characterize material quality and variability. For less critical material items, or as a secondary measure of QA (e.g., as a QC tool for screening purposes), the use of rapid NDT methods may provide a cost-effective alternative to standard tests.

Use of advanced performance tests (i.e., performance-based mixture design or mechanistic properties) may also be appropriate for acceptance purposes if they can potentially provide stronger indicators of performance for high risk projects or if they correlate well with specific performance objectives (e.g., durability, fatigue, or rutting resistance).

Step 5. Identify Project-Level Considerations That May Influence Sampling and Testing

Consideration of project-related factors, such as those discussed previously in Table 3.7, can also help with the selection of effective sampling and testing strategies. In effect, a DOT can use a lower frequency of sampling and testing when a material item is supplied by an extremely reliable, qualified supplier, and/or the material is for a low-volume or less critical/complex project.

Step 6. Determine the Extent to Which Acceptance Testing Can Be Reduced or Optimized Based on Material Risk and Property Importance

The last step in the Level 2 property-based optimization entails deciding the extent to which acceptance testing can be reduced or optimized for a given material based on material risk and property significance. As described in the Level 1 assessment, materials can be generally characterized as high (Tier 1), moderate (Tier 2), and low risk (Tier 3) items. The QA sampling frequency of tested properties can be reduced for less critical materials and less important properties.

Table 4.2 provides an example of how a sampling and testing acceptance plan could be optimized based on material risk and property importance when the DOT performs all acceptance testing (i.e., contractor test data is not used in the acceptance decision).

In general, the DOT may choose to modify the normal inspection or testing procedures for lower risk projects or project elements. For primary properties, sampling frequency may be reduced for tests that are under control (e.g., after ten consecutive samples taken at the normal testing frequency indicate full conformance with the specifications). The sampling and testing frequency should revert to the default frequency if there are failing tests.

For the case where contractor QC test results are used in the DOT's acceptance decision, the DOT must still perform independent verification sampling and testing in accordance with 23 CFR Part 637. The optimal verification sampling and testing plan would be driven by material criticality and the ability of the property to act as an indicator of performance.

A case study example provided in Box 4.2 describes how Texas DOT has implemented a risk-based protocol that assigns higher levels of QA analysis to properties that have been historically shown to have greater residual risks related to long-term performance. A general summary of such a risk-based process follows below.

Table 4.2. Example property-based optimization (DOT performing acceptance testing).

Primary indicator (example properties: concrete strength, concrete air content, HMA in-place air voids, key mixture properties (e.g., asphalt content), performance tests*)
• High risk (e.g., structural or safety critical elements, high user impacts, large quantities): Use default frequencies; if the process is determined to be under control, reduce to 75% of the default frequency.
• Moderate risk (e.g., structural elements with moderate safety or user impacts): Use 90% of the default frequency after the process is determined to be under control.
• Low risk (e.g., non-structural elements, small quantities): Use 50% of the default frequency (double lot size) after production start-up, or waive acceptance testing at the engineer's discretion.

Secondary indicator (example properties: slump, gradation, secondary mixture properties, NDT*)
• High risk: Use 90% of the default frequency after the process is determined to be under control.
• Moderate risk: Use 50% of the default frequency (double lot size) after production start-up.
• Low risk: Observational verification of QC tests with audits of certifications and random verification tests.

Observational indicator (example properties: segregation profile, temperature, workmanship indicators, NDT*)
• High risk: Observational verification of QC tests with audits of certifications and random verification tests.
• Moderate risk: Observational verification of QC tests with audits of certifications.
• Low risk: Random verification of QC records for compliance with specifications.

*Note: Testing of more advanced material properties (i.e., stiffness, dynamic modulus, permeability) or advanced rapid NDT test methods can provide cost-effective strategies to reduce risk or to meet specific performance goals (e.g., durability or long life) when used at lower frequencies or on a once-per-project basis correlated with standard tests.
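The frequency-adjustment rule described above for a high-risk primary property (reduce after a sustained passing run, revert on any failing test) can be sketched as a small state machine. The ten-sample run and 75% reduction follow the text and Table 4.2; the class itself is an illustration, not agency policy:

```python
class SamplingPlan:
    """Track conformance results for one acceptance property and report
    the sampling frequency currently in effect, as a fraction of some
    default (e.g., tests per lot)."""

    def __init__(self, default_freq, reduced_fraction=0.75, run_required=10):
        self.default_freq = default_freq          # normal testing frequency
        self.reduced_fraction = reduced_fraction  # e.g., 75% for high-risk primary
        self.run_required = run_required          # consecutive passes needed
        self.passing_run = 0

    def record(self, conforming):
        """Record one test result; return the frequency now in effect."""
        # Any failing test breaks the run and reverts to the default.
        self.passing_run = self.passing_run + 1 if conforming else 0
        return self.current_frequency()

    def current_frequency(self):
        if self.passing_run >= self.run_required:
            return self.default_freq * self.reduced_fraction
        return self.default_freq
```

The same structure accommodates the 90% and 50% rows of Table 4.2 by changing `reduced_fraction`, and a waiver simply corresponds to a fraction of zero at the engineer's discretion.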

a. Primary Properties

For primary properties, verification testing frequency is typically a minimum of 10% to 25% of the QA testing frequency. The F-test and t-test are performed on these key material properties on a continuous basis with the addition of each verification test result. The p-values (from the F-test and t-test) are reported for each analysis and tracked over time against a specified level of significance for the material (e.g., α = 0.025 for structural concrete and 0.01 for non-structural concrete). The level of significance refers to the probability of rejecting the null hypothesis that the DOT and contractor populations are equal. AASHTO R 9 provides suggested values of αcritical used in the highway construction industry. An example of material categories and αcritical values used for statistical analyses is shown in Table 4.3.

This approach enables the DOT to monitor the validation status of each property daily and allows for corrective action to address non-validating test results. For a sampling and testing process determined to be under control, a continuous-cumulative lot or a chain lot strategy, as described in Box 4.1, has been used to increase the sample size for the F-test and t-test where contractor test results are used in the acceptance decision.

b. Secondary Property

A secondary property provides independent verification for those materials and test methods that are secondary indicators of performance. An example is a slump or gradation test for hydraulic cement concrete. For such low risk materials and properties, the DOT verification testing frequency could be scaled back and need not involve statistical validation (i.e., F-tests and t-tests).

c. Observational Property

An observational property can entail observational verification for those materials that require only a few QA tests for compliance with the standard guide schedule, or materials having a low risk of failure that will not affect long-term performance. Under the observational approach, the DOT does not directly perform tests but instead observes the contractor's QC testing for equipment and procedural compliance with the test standard.

Table 4.4 provides an example of how a sampling and testing acceptance plan could be optimized based on material risk and property importance when the DOT uses contractor QC data for acceptance purposes.

Table 4.3. Example levels of significance applied to materials.

Material Category: Level of Significance (αcritical)
• Embankment, Subgrades, Backfill, and Base Courses: 0.01
• Asphalt Stabilized Base (Plant Mix): 0.01
• Hydraulic Cement Concrete – Structural: 0.025
• Hydraulic Cement Concrete – Non-Structural: 0.01
• Hydraulic Cement Concrete Pavements: 0.025
• Asphalt Concrete Pavement: 0.025
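The F-test and t-test comparison of DOT verification results against contractor QC results can be sketched with basic statistics. To stay self-contained, this simplified version compares the test statistics against critical values supplied from standard tables (per AASHTO R 9) rather than computing p-values; the sample data and critical values in the usage check are hypothetical:

```python
import math

def f_t_validation(dot, qc, f_crit, t_crit):
    """Compare DOT verification results against contractor QC results.

    Returns (F, t, validated): validated is True only if both the F-test
    (equality of variances) and the pooled t-test (equality of means) pass
    at the supplied critical values from standard tables."""
    n1, n2 = len(dot), len(qc)
    m1, m2 = sum(dot) / n1, sum(qc) / n2
    v1 = sum((x - m1) ** 2 for x in dot) / (n1 - 1)   # sample variances
    v2 = sum((x - m2) ** 2 for x in qc) / (n2 - 1)
    # F statistic: ratio of the larger sample variance to the smaller one.
    F = max(v1, v2) / min(v1, v2)
    # Pooled two-sample t statistic (appropriate when the F-test passes).
    sp2 = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
    t = abs(m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return F, t, (F <= f_crit and t <= t_crit)
```

In practice the analysis is rerun with each added verification result, so a shift in the contractor's process shows up as a drift in the tracked statistics before acceptance is affected.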

Box 4.1. Continuous-Cumulative and Chain Lot Methods

The continuous-cumulative method consists of incrementally accumulating test results from sequential lots (i.e., results from lots 1 and 2; results from lots 1, 2, and 3; results from lots 1, 2, 3, and 4). Lot accumulation starts once one lot is found to be conforming (i.e., favorable t-test or other DOT application). Then, as long as the accumulated lots yield a favorable t-test, the accumulation of lots continues until a failing t-test occurs for consecutive sets of lots.

Similarly, in the chain-lot method, a fixed number of lots (e.g., 2 ≤ i ≤ 5) are individually tested. After the specified number of lots i are found conforming, the accumulation of lot results begins. The chain-lot method considers a constant set size of i + 1 lots for assessing the acceptance properties (Arambula and Gharaibeh, 2014).
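The continuous-cumulative accumulation rule can be sketched as a loop over sequential lots, where a `conforms` callback stands in for whatever pooled evaluation the DOT applies (e.g., a t-test against agency results). The simple mean-tolerance check in the usage example is a stand-in, not an agency criterion:

```python
def continuous_cumulative(lots, conforms):
    """Group sequential lot results under the continuous-cumulative method:
    once an evaluation conforms, subsequent lots are pooled with it and the
    pooled set is re-evaluated; a non-conforming evaluation restarts the
    accumulation with the next lot.

    lots: list of per-lot result lists.
    conforms: callback judging a pooled list of results.
    Returns the list of pooled groups actually evaluated, in order."""
    groups, pool = [], []
    for lot in lots:
        pool = pool + lot if pool else list(lot)
        groups.append(list(pool))
        if not conforms(pool):
            pool = []   # failing evaluation: restart accumulation
    return groups
```

The chain-lot variant differs only in waiting for i individually conforming lots before pooling begins and capping the evaluated set at i + 1 lots; the loop structure is otherwise the same.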

Box 4.2. TXDOT Risk-Based Approach to Verification Testing

Texas DOT (TXDOT), in its Design-Build Quality Assurance Program Implementation Guide, evaluated the ability of individual material properties to act as indicators of performance. This analysis of material properties was then used as a basis for determining how much owner verification testing should be performed to validate contractor test results used in the acceptance decision (TXDOT, 2011).

For design-build projects with 15-year capital maintenance agreements, TXDOT's guide applies three tiers of owner verification (OV) testing to specific materials and properties, based on TXDOT's perceived residual risk after the contractor has completed construction and fulfilled its maintenance obligations. As explained in the guide, these levels are as follows:

• Level 1 provides continuous analysis for those analysis categories that are strong indicators of performance. Examples include compressive strength for hydraulic cement concrete, percent soil compaction for embankment, and percent asphalt content for hot-mix asphalt concrete. The QA testing frequency must comply with the Guide Schedule, and the OV testing frequency should be a minimum of 10 percent of the QA testing frequency. F- and t-tests are performed on these material categories on a continuous basis with the addition of each OV test result.

• Level 2 provides independent verification for those materials that are secondary indicators of performance. An example is the slump test for hydraulic cement concrete. The QA testing frequency is required to comply with the Guide Schedule, and the OV testing frequency should be a minimum of once per quarter.
• Level 3 provides observation verification for those materials that require only very few QA tests for compliance with the Guide Schedule, or tests on materials whose risk of failure does not affect the long-term performance of the facility past the contractual maintenance obligations. An example is the entrained air test (Tex-416-A) for non-structural (miscellaneous) concrete riprap. Under the Level 3 approach, OV does not perform tests but observes the QA test performance for equipment and procedural compliance with the test procedure.

The figure below provides an example of how TXDOT's guide applies these analysis categories to specific materials and properties.

Table 4.4. Example property-based optimization (contractor QC data used in acceptance decision).

Material risk categories:
• High Risk: e.g., structural or safety-critical elements, high user impacts, large quantities.
• Moderate Risk: e.g., structural elements with moderate safety or user impacts.
• Low Risk: e.g., non-structural elements, small quantities.

Primary indicator
  High Risk:
    • Verification testing at 25% of QC frequency with continuous F and t analysis
    • If process under control, use chain lot or cumulative sampling
  Moderate Risk:
    • Verification at 25% of QC frequency
    • If process is under control, reduce verification by 50% and use chain lot or cumulative sampling
  Low Risk:
    • Acceptance tests at 50% of default frequencies
    • Verification once per quarter with no statistical validation
  Example properties: concrete strength, concrete air content, cover depth, HMA in-place air voids, primary mix properties (e.g., asphalt concrete), performance tests*

Secondary indicator
  High Risk:
    • Verification at 25% of QC frequency
    • If process under control, reduce verification by 50% and use chain lot or cumulative sampling
  Moderate Risk:
    • Acceptance tests at 50% of default frequencies after start-up
    • Verification once per quarter with no statistical validation
  Low Risk:
    • Observational verification of QC tests with audits of certifications
  Example properties: slump, gradation, secondary mix properties, NDT*

Observational indicator
  High Risk:
    • Observational verification of QC tests with audits of certifications and random verification tests
  Moderate Risk:
    • Observational verification of QC tests with audits of certifications
  Low Risk:
    • Random verification of QC records and compliance with specifications
  Example properties: segregation profile, cracking, joint consolidation, workmanship indicators, NDT*

*Note: Testing of more advanced material properties (i.e., stiffness, dynamic modulus, permeability) or advanced rapid NDT test methods, correlated with standard tests, can provide cost-effective strategies to reduce risk or to meet specific performance goals (e.g., durability or long life) when used at lower frequencies or on a once-per-project basis.
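One way to operationalize Table 4.4 is as a simple lookup keyed by property importance and material risk. The dictionary below is a hypothetical encoding with abbreviated wording, not agency policy text.

```python
# Hypothetical encoding of Table 4.4, keyed by (property importance,
# material risk); values abbreviate the table's verification strategies.
QA_STRATEGY = {
    ("primary", "high"): "Verify at 25% of QC frequency with continuous F/t "
                         "analysis; chain/cumulative lots once under control",
    ("primary", "moderate"): "Verify at 25% of QC frequency; halve verification "
                             "once under control, with chain/cumulative lots",
    ("primary", "low"): "Acceptance tests at 50% of default frequency; "
                        "quarterly verification, no statistical validation",
    ("secondary", "high"): "Verify at 25% of QC frequency; halve verification "
                           "once under control, with chain/cumulative lots",
    ("secondary", "moderate"): "Acceptance at 50% of default frequency after "
                               "start-up; quarterly verification",
    ("secondary", "low"): "Observational verification with audits of certifications",
    ("observational", "high"): "Observational verification with audits and "
                               "random verification tests",
    ("observational", "moderate"): "Observational verification with audits of "
                                   "certifications",
    ("observational", "low"): "Random verification of QC records and spec compliance",
}

def strategy(importance, risk):
    """Return the Table 4.4 strategy for a property importance / risk pair."""
    return QA_STRATEGY[(importance.lower(), risk.lower())]
```

A DOT materials management system could use such a lookup to pre-populate the default acceptance plan for each bid item before engineers adjust it.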

Chapter 5

Level 3: Cost-Based Optimization

In contrast to the Level 1 and 2 analyses presented in Chapters 3 and 4, respectively, the Level 3 optimization process described below explicitly compares the direct costs of different QA protocols to the associated cost of material failure to arrive at the optimum QA investment point.

5.1 Conceptual Framework

5.1.1 Cost of Quality

To apply the Level 3 optimization process, it is important to first understand what is meant by the cost of quality. Quality-related materials costs can generally be assigned to the two broad categories defined in Table 5.1.

5.1.2 Conceptual Model

While providing more and/or better QA measures should result in higher quality products, it also tends to entail higher costs of conformance. In turn, higher quality products display fewer defects and require less rework, and thus generally have lower costs of non-conformance. These relationships, which can be displayed graphically as represented in Figure 5.1, suggest that there is an optimum level of QA investment corresponding to the minimum point on the total cost of quality curve, where the total cost of quality can be calculated as:

Total Cost of Quality = Cost of Conformance + Cost of Non-Conformance  (Eq. 5.1)

Or, stated differently,

Total Cost of Quality = Cost of Materials QA + Cost (or Expected Value) of a Defect  (Eq. 5.2)

The goal of the Level 3 optimization process is therefore to find the minimum value of the total cost of quality, based on the understanding that, for any given material and QA protocol, a point exists where additional investment in QA would yield a sub-optimal return. The level of QA associated with this investment point represents the optimum balance between the cost of quality and the value (or benefit) of quality.

The steps in the Level 3 process are summarized in Figure 5.2 and described in detail in Section 5.2. Section 5.3 then introduces an Excel-based tool, included in Appendix B, which may be used to facilitate the analysis.

Table 5.1. Cost of quality.

Cost of Conformance (Prevention + Detection)
  Prevention Costs: costs related to assuring the product or project meets requirements. Example components: design analysis and reviews, constructability reviews, quality management systems.
  Appraisal Costs: costs related to determining the degree of product or project conformance. Example components: inspection, sampling and testing.

Cost of Non-Conformance (Defects or Failure)
  Cost of Defects or Failures: costs associated with non-conforming materials. Example components: repair/rework, schedule delays, road user impacts, reduced life.

Figure 5.1. Level 3 optimization framework.

Model Key Definitions

Total Cost of Quality = Cost of QA + Expected Value of a Defect

Where:
• Cost of Materials QA refers to the costs to implement different QA protocols (e.g., cost of sampling and testing vs. certification vs. inspection).
• Risk/Expected Value (EV) of a defect can be calculated as a function of the probability of receiving non-conforming material and the cost impact of the non-conformance should it occur (e.g., the cost of repairing or replacing the non-conforming material): EV = Probability × Impact.

5.2 Framework Steps

The Level 3 optimization framework, as conceptually described below, entails a structured decision process that explicitly considers the following variables to determine the optimum QA investment point:

• the specific material and properties of interest,
• project characteristics,
• possible incremental levels of QA that could be used to assure material quality,
• the costs associated with each of these QA levels, and
• the risk reduction (e.g., reduced cost of failure) associated with each of these QA levels.

Figure 5.2. Level 3 optimization.

Step 1: Identify materials and properties of interest.
Step 2: Identify project factors that may influence QA effort (criteria: industry experience, material quantity, criticality/complexity, delivery method).
Step 3: Identify levels of QA effort.
Step 4: Quantify the costs of QA.
Step 5: Quantify the probability of non-conforming material.
Step 6: Quantify the cost impact of non-conforming material.
Step 7: Quantify the expected value (EV) of non-conforming material: EV = Probability × Impact.
Step 8: Optimize QA effort based on the total cost of quality: costs of QA + EV of non-conforming material.

Step 1. Identify the Materials and Properties of Interest

As QA protocols, costs, and risks are material- and property-specific, the first step when attempting to optimize the costs and risks of QA, whether on a programmatic or project-level basis, is to identify the materials and properties of interest. For example, recognizing that obtaining the data needed to support the Level 3 model may entail a significant effort, one could choose to focus only on those materials that account for a significant proportion of project cost, QA effort, and/or risk-related cost if non-conformances were to occur.

Step 2. Consider Project Factors that Could Influence QA Effort

As previously summarized in Table 3.7, one should consider whether any project-related factors (e.g., project complexity, delivery method) could influence the QA practices applied to a specific project.

Step 3. Identify Levels of QA Effort

The next step in building the cost-of-quality model is to define the possible levels of QA effort. QA effort quantifies the amount of resources that a DOT may invest to assure that the material conforms to the specifications. Generally, to apply more rigorous QA measures, a DOT will have to commit more resources. The unit of analysis for QA effort is the cost, expressed as a percentage of the material cost, that must be committed to each of the QA effort categories.

Table 5.2 provides an example of possible levels of QA effort performed by the DOT, ranging from visual inspection up to continuous inspection and/or sampling and testing. As the levels increase, the QA effort may be compounded to some extent. For example, even if significant sampling and testing is conducted, inspection must still be performed to ensure that handling and placement activities do not affect the final in-place quality or performance of the material.
Although not explicitly addressed in the model, it is also important to bear in mind that the cost of an effective owner's acceptance program will be driven in part by the quality control efforts applied.

Table 5.2. Example levels of DOT acceptance practices.

Level 1. Visual inspection: visual inspection of the work to assure compliance with the contract and prevailing industry standards.
Level 2. Certification: verification that certified material complies with specification requirements or is on the current Qualified Products List.
Level 3. Certification with back-up data: Certificates of Compliance backed by random or programmatic sampling and testing by the manufacturer.
Level 4. Verification sampling and testing: owner verification testing at a reduced frequency and statistical comparison with the contractor's results.
Level 5. Full owner sampling and testing: agency acceptance sampling and testing.

Figure 5.3. Estimated cost of QA effort.

Step 4. Determine the Cost of QA

The next step entails quantifying the cost of each level of QA effort. DOTs typically account for the costs of various QA activities as a percentage of a program budget (lab tests and plant inspections) or as a percentage of project cost (field inspection and testing). This cost is specific to each material and property of interest. With the progressive increase in QA cost, expressed as a percentage of the budget for the material of interest, this information should yield a curve similar to that shown in Figure 5.3, assuming a compounding effect exists between the level of effort and the QA costs required.

As QA effort continues to increase, the expectation is that the risk of material non-conformance will ultimately decrease to a point where the risk-related cost of material non-conformance remains relatively constant. In other words, additional QA effort would not significantly reduce the risk of non-conformance. Determining this risk of non-conformance requires first evaluating the probability of the material failing to meet specifications (as described below in Step 5) and then the related impact of this non-conformance (as described in Step 6). Once probabilities and impacts are determined, they can be combined, as described in Step 7, to arrive at the expected value of non-conforming material.

Step 5. Quantify the Probability of Non-Conforming Material

Step 5 entails evaluating the probability that the material of interest (or material property, if performing a property-based assessment) will be non-conforming at each QA level. Conceptually, the probability is expected to decline with increasing levels of QA.
The probability of non-conformance can be estimated from historic QA non-conformance data. Absent such data, subject matter experts can be gathered to estimate the probability in a workshop setting.

Step 6. Quantify the Cost of Non-Conforming Material

The impact of non-conforming material can be evaluated as the cost of any assumed remedial work, reductions in service life, or some other cost impact related to the specific non-conforming material (or property). While historic data may be available, direct estimates by subject matter experts may be more practical.
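Step 5's frequency estimate from historic records can be sketched as below. The record format (one boolean per lot) is an assumption for illustration; real QA databases would carry richer fields per lot.

```python
def prob_nonconformance(history):
    """Estimate the probability of non-conformance for one QA level from
    historic lot records.  `history` is a list of booleans, one per lot,
    True where the lot failed to meet specification.  This is the simple
    relative-frequency estimate suggested in Step 5."""
    if not history:
        raise ValueError("no historic lots to estimate from")
    return sum(history) / len(history)
```

When historic records are thin, this estimate would be replaced or tempered by the expert judgment approaches described in Section 6.2.3.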

Figure 5.4. Expected value of non-conformance.

Specifications typically include options to address non-conforming materials, which may range from acceptance as is, to acceptance at reduced payment, to removal and replacement of the lot or unit of production. Accepting non-conforming material in an "as is" condition assumes that it is in reasonably close conformance to the requirements and that the non-conformance will have minimal impact on performance or longevity. Accepting at a reduced price recognizes the possibility of some service life reduction, and the reduction in payment may be representative of future costs. Removal and replacement is intended to restore the specified quality and intended service life.

Step 7. Quantify the Expected Value of a Non-Conforming Material

Based on the probability of non-conforming material (Step 5) and the cost of rework (Step 6), the expected value of non-conformance can be calculated as:

EV = PNC × I  (Eq. 5.3)

Where:
EV = expected value of non-conformance, expressed as a percentage of material cost,
PNC = probability of non-conformance, and
I = impact of non-conforming material as a percentage of material cost.

With increasing levels of QA effort, the expected value of non-conformance (as a percentage of material cost) is expected to decrease, as illustrated in Figure 5.4.

Step 8. Optimize QA Based on the Total Cost of Quality

To optimize the cost of QA, the curves associated with the QA effort and the expected value of non-conforming materials can be added together to determine the total cost of quality and plotted as shown conceptually in Figure 5.5.
In this case, the optimal investment point (i.e., QA level) corresponds to the Level 4 QA effort, the minimum point on the CoQ curve.

5.3 Optimization Tool

The Level 3 model is designed to use actual or estimated QA costs and potential cost impacts related to material non-conformance to optimize materials QA. Appendix B includes an Excel-based spreadsheet tool that automates the development of the optimization curves discussed

Figure 5.5. Total cost of quality (CoQ).

above. The data required for a comprehensive QA risk-based optimization include: (1) the cost of each level of QA effort; (2) the probability of non-conforming material at each QA level; and (3) the impact of non-conforming material. The Excel-based tool contains separate worksheets for each of these inputs. Detailed instructions on using the tool are included in Appendix B.

When historic data on QA costs and risk-related impacts (e.g., increased cost of maintenance, repair, or replacement of non-conforming materials) are available, the user can directly input these cost elements into the model. However, the effort of collecting these historic data can be significant, and even impractical in some cases. If historic data are not available, experts can be used to estimate QA costs, the likelihood of non-conformance, and the costs of non-conformance. Collection of expert data can take the form of judgments of a single subject matter expert or, more frequently, judgments of a group of experts using a workshop, survey, or Delphi data collection technique.

5.4 Applications of the Optimization Tool

The optimization framework can be applied as a planning tool to determine the optimal approach to QA for different project scenarios. It can also be used to assess the relative importance of material properties for a specific material item on a project.
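The calculations that the spreadsheet tool automates (Steps 4 through 8) reduce to a few lines. All numbers below are illustrative placeholders, not values from the guidebook or its Appendix B tool; costs, probabilities, and impacts are expressed relative to material cost, as in the figures.

```python
# Step 4: cost of each QA level (fraction of material cost), per Table 5.2.
qa_cost = {1: 0.005, 2: 0.010, 3: 0.025, 4: 0.060, 5: 0.140}
# Step 5: probability of non-conformance at each QA level (illustrative).
pnc = {1: 0.35, 2: 0.20, 3: 0.10, 4: 0.02, 5: 0.015}
# Step 6: impact of a defect as a multiple of material cost (illustrative).
impact = 2.0

# Step 7 (Eq. 5.3): expected value of non-conformance, EV = PNC x I.
ev = {lvl: pnc[lvl] * impact for lvl in pnc}
# Step 8 (Eq. 5.2): total cost of quality = cost of QA + EV of a defect.
coq = {lvl: qa_cost[lvl] + ev[lvl] for lvl in qa_cost}
optimum = min(coq, key=coq.get)   # QA level at the minimum of the CoQ curve
```

With these placeholder inputs the minimum falls at Level 4 (verification sampling and testing), mirroring the conceptual curves: below it, defects dominate; above it, QA cost dominates.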

Chapter 6

Implementation

This chapter provides a high-level recap of the optimization process, a discussion of potential implementation strategies, and examples to illustrate the application of the QA optimization framework.

6.1 Recap of the Optimization Process

This guidebook applies a three-level analytical framework to help DOTs identify the QA acceptance plan necessary to meet specification requirements within an acceptable level of risk. Briefly, these levels can be described as follows:

• Level 1 entails conducting a qualitative risk rating of materials and then aligning these ratings with QA methods that can provide reasonable assurance of acceptable quality.
• Level 2 focuses on optimizing acceptance testing by emphasizing properties and test methods that are more direct indicators of performance.
• Level 3 compares the cost of different QA protocols to the cost of potential defects to arrive at an optimum QA investment point.

Each level is based on the principle that materials QA is an inherently scalable activity, and that the approach and resource commitment invested by DOTs to assure quality should be commensurate with:

• material variability and the level of control required for materials to meet specifications, as well as
• material criticality from the perspective of:
  – difficulty to repair or replace,
  – safety,
  – maintenance cost, and/or
  – cost of rework.

6.2 Applying the Framework

Successfully applying the framework requires careful implementation planning and data collection efforts, as described below.

6.2.1 Implementation Planning

Current DOT QA programs are largely based on methods and protocols that have historically produced satisfactory results. Given the institutional knowledge and existing infrastructure (e.g.,

specifications, manuals, test equipment) associated with these traditional practices, some resistance may be encountered when attempting to implement a QA strategy that is new or different. To help achieve staff buy-in, it is important to establish and communicate the rationale for why optimization is necessary or would be beneficial. For example:

• Is the agency facing budget or staffing constraints?
• Is the agency increasing its use of alternative delivery methods that shift more responsibility for quality to the industry?
• Is the agency interested in exploring new or emerging test methods?
• Do the resources needed to implement traditional QA protocols in some cases exceed the value received (i.e., the risk of non-compliance does not justify the level of QA applied)?

Such awareness of agency goals and constraints related to materials QA will help establish a foundation for subsequent discussions on what is and is not an appropriate or implementable optimization strategy for a particular material or project. For example, an agency contending with limited field or source inspection staff may be open to different optimization strategies (e.g., contractor QC testing in the acceptance decision, industry standards for self-certification of materials, etc.) than one facing constraints related to industry inexperience.

Understanding what is driving the need for optimization will also help users frame or define the specific scope of the optimization effort. For example, based on the agency's needs or objectives, possible applications of the process could include:

• Conducting a programmatic evaluation of all materials and their associated QA practices to optimize a standard sampling and testing schedule or materials manual.
• Conducting a project-specific evaluation to scale the QA effort to a given set of project characteristics.
• Evaluating the cost/benefit of new or emerging QA methods against traditional methods.

6.2.2 Framework Application

The framework can be applied at the programmatic or project level. A Level 1 materials-based assessment can be applied as a programmatic assessment of all material/product items for incorporation into the DOT's materials management system, standard specifications, and the statewide construction and materials manual(s). Similarly, a Level 2 properties-based assessment could be applied at the program or project level for project-produced or fabricated materials requiring acceptance sampling and testing.

If the DOT is interested in shifting more responsibility for quality management to industry, a Level 2 properties-based assessment can be applied in particular to projects involving alternative delivery (e.g., Design-Build or Design-Build-Operate-Maintain), where industry assumes the risk for quality management and contractor test results are used in the acceptance decision.

A Level 3 cost-based risk assessment can be used when the costs of QA and the costs of non-conformance can be evaluated to determine the optimal QA investment point for a given project scenario and material.

6.2.3 Data Collection and Analysis Techniques

The optimization process depends on the availability of reliable input data for QA costs and risk (or reasonable approximations thereof). Although a single person could collect and evaluate such information, it is generally advisable to instead assemble a multi-disciplined team to provide a more balanced and accurate evaluation. Sufficient discussion should take

place among the experts to allow a consensus to be reached on the data input values. Members of this team should include internal DOT subject matter experts on materials and products, as well as construction and maintenance personnel. It may also be useful to consult with representatives of local materials suppliers and contractors, and of national organizations that certify materials and products for highway construction, to obtain their perspective on materials QA.

The techniques used to assess material risks can range from an informal brainstorming exercise to a more structured and facilitated work session in which project team members and independent subject matter experts are asked to evaluate material risks, QA costs, and impacts related to material non-conformance. In addition to brainstorming activities, other common techniques for assessing the risk and costs of materials QA are described below. These techniques may be used alone or in combination, depending on the optimization approach, the time and resource constraints, and the information available. For a given project, the approach taken to assess material risks should represent a balance between the objectives of the optimization effort and the available resources and skills of the team.

Brainstorming Workshops

One approach, commonly used in project risk assessment processes, is to conduct brainstorming workshops to assess the likelihood of material non-conformance and the potential impacts of material defects. Brainstorming can assume many forms, but it generally works as follows:

1. A facilitator organizes a workshop with a panel of multidisciplinary experts (materials, construction, maintenance, etc.), and may invite other stakeholders as appropriate.
Experts who are most knowledgeable about DOT QA practices, materials, and construction management, and who are regularly confronted with the impacts of material quality failures, are the most desirable candidates. Effective brainstorming workshops generally entail some or all of the following practices:

– Under the guidance of the facilitator, participants should establish the agenda and scope of the optimization effort.
– Relevant documents should be provided to the participants (preferably in advance to allow for some preparation). These may include programmatic materials (e.g., standard specifications, materials, and construction manuals) or project-specific specifications and material schedules with default materials QA plans.
– Participants should review or have knowledge of existing DOT testing and evaluation procedures, policies, and practices. It may be necessary to document certain information regarding the QA effort, for example:
  · resource capabilities (staffing, laboratory facilities, equipment),
  · time, effort, and cost to perform inspection or a particular test, and
  · inspection and testing data management.
– For more complex programmatic assessments, it may be desirable to hold multiple workshops, each focused on a specific material category based on specification categories (e.g., earthwork, base courses, pavements, structures), with key subject matter experts participating.
– A note-taker should be appointed to capture the ideas being discussed.

2. Panel members share their experience and opinions, which provides the opportunity to build on each other's ideas. (A more structured brainstorming session, where each group member presents ideas in turn, may be used to ensure that feedback is obtained from all group members.)

3.
For each of the materials of interest, the facilitator will structure the discussion to assess inherent material (or property) risks; project or other factors that may affect those risks; the likelihood and cost impact of material non-conformance; and the optimal levels of QA required to manage or mitigate these risks. The facilitator will attempt to reach group consensus through polling, surveys, or other methods.

4. After participants have identified the perceived optimum levels of QA necessary for materials management for the given scope and materials of interest, the facilitator will compile and present the optimized materials QA plan for final discussion and validation by the expert panel.

Retrospective Analysis

Material non-conformance issues can affect multiple projects unless something is done differently to avoid or mitigate the problem. Historic cost data from remediation work related to non-conformance or unplanned maintenance work, lessons learned, and project postmortems can provide a rich source of data for assessing the cost of material non-conformance. A retrospective analysis of projects, performed either on its own or in conjunction with a brainstorming workshop, can serve as a useful tool to quantify costs, as well as to identify the optimal QA efforts required to minimize the risk of failure.

Delphi Technique

The Delphi method is a technique used to obtain the judgment of a panel of experts on a complex issue or topic such as material risk. It provides a systematic method of data collection that can minimize bias and the influence any one person can have on the outcome. Conducting a Delphi analysis to evaluate materials QA risks could entail the following steps:

1. A facilitator develops and distributes a questionnaire to solicit ideas regarding perceived material non-conformance risks (likelihood and impact) and costs of QA.
2. Participants work independently and report data anonymously.
3. The facilitator receives and aggregates the data.
4. The aggregated data from this first round are then circulated back to the experts for further comment and refinement.
5. The process is repeated until consensus is achieved. Typically, this requires two or three more survey rounds.
6. The aggregated data are circulated back to the experts for reconsideration.
The iterative nature of this process has been found to yield more reliable results than a single survey round. Applying administrative controls, such as keeping responses anonymous and randomly ordering questions within a survey, has been shown to further improve the reliability and validity of the resulting data.

Interviews

Material risks and costs can also be validated by interviewing experienced project managers and/or independent subject matter experts. After being briefed on the materials QA optimization process, the interviewees can be asked to identify QA costs and material non-conformance risks and costs based on their experience and knowledge. Interview results can be used to validate the results of earlier brainstorming or other information-gathering techniques, or as an input to those methods.

6.2.4 Examples of the Optimization Process

Appendix C presents examples that demonstrate the use of the optimization framework.

References

AASHTO. 2010. "Standard Practice for Quality Assurance of Standard Manufactured Materials." AASHTO Designation: R 38.
AASHTO. 2013. "Standard Practice for Acceptance Sampling Plans for Highway Construction." AASHTO Designation: R 9.
Arambula, E., and Gharaibeh, N. 2014. "Methods for Accumulating Construction and Material Quality Test Results and Their Effect on Acceptance Decisions." Journal of Construction Engineering and Management, 140(8): 1–3.
Baker, T., Molohon, J., and McIntyre, R. 2010. Materials Risk Analysis. Washington State Department of Transportation, Olympia, Washington.
California Department of Transportation. 2015. Construction Quality Assurance Program Manual. http://www.dot.ca.gov/hq/construc/docs/cqap_manual.pdf.
Federal Highway Administration. 2004. "Use of Contractor Test Results in the Acceptance Decision, Recommended Quality Measures, and the Identification of Contractor/Department Risks." Technical Advisory T6120.3. http://www.fhwa.dot.gov/construction/t61203.cfm.
Federal Highway Administration. 2012. Construction Quality Assurance for Design-Build Highway Projects. Tech Brief. Washington, D.C.
Federal Highway Administration. 2013. "Quality Assurance Stewardship Review Summary Report for Fiscal Years 2009 Through 2012." Washington, D.C.
Hylton Meier, H. 1991. "A Control Model for Assessing Quality Costs." Mid-American Journal of Business, 6(1): 40–44.
Juran, J. M. 1951. Quality Control Handbook. McGraw-Hill, New York.
Kirkpatrick, E. G. 1970. Quality Control for Managers and Engineers. John Wiley & Sons, Inc.
Morse, W. J. 1993. "A Handle on Quality Costs." CMA Magazine, 67(1): 21.
Mostafavi, A., and Abraham, D. M. 2012. "INDOT Construction Inspection Priorities." Joint Transportation Research Program, Indiana Department of Transportation and Purdue University, West Lafayette, Indiana.
Plunkett, J. J., and Dale, B. G. 1988. "Quality Costs: A Critique of Some 'Economic Cost of Quality' Models." International Journal of Production Research, 26(11): 1713–1726.
Schiffauerova, A., and Thomson, V. 2006. "A Review of Research on Cost of Quality Models and Best Practices." International Journal of Quality & Reliability Management, 23(6): 647–669.
South Dakota Department of Transportation. 2013. "Required Samples, Tests, and Certificates." http://sddot.com/business/certification/forms/default.aspx.
Texas Department of Transportation. 2011. Design-Build Quality Assurance Program Implementation Guide. https://ftp.txdot.gov/pub/txdot-info/cst/db_gap_guide.pdf.
Washington State Department of Transportation. 2013. Construction Manual. Engineering and Regional Operations, Olympia, Washington.

Appendix A

Current State of Materials QA

While not an exhaustive list, the following summary describes several QA practices in standard use today, organized by QA method, with a description and the QA strategy/objectives each serves. Although it is possible that under certain circumstances one method alone could be used to assure a material's desired level of quality and performance (e.g., requiring use of materials selected from a Qualified Products List), the use of multiple methods in series or in combination is more typical for most construction materials.

Materials Prequalification

Qualified (or Authorized) Products List: A list of products that have been tested and/or otherwise evaluated by the DOT and found to meet specification requirements. Once prequalified, the listed materials typically undergo periodic testing and field performance evaluations at a prescribed frequency to ensure continued receipt of material of the specified quality.
• Minimize project-specific inspection and testing needs, particularly for materials that cannot be evaluated or tested within a typical construction project timeframe, and for materials that require extensive prequalification testing not practical to repeat for every job
• Ensure an objective and consistent process for evaluating new products for use in construction

Tested Stock: A defined quantity (batch, heat, lot, tank, etc.) of the manufacturer's or supplier's inventory that has been sampled and tested by the agency and set aside for use on agency projects. Approved material(s) may then be shipped to any project until the approved quantity is depleted.
• Minimize project-specific inspection and testing needs

Manufactured to National Quality Standard: Items produced to meet the specifications of industry-wide trade associations, professional societies, standards-writing organizations, etc. (e.g., AASHTO, ASTM, the American Wood-Preservers' Association, and the American Institute of Steel Construction (AISC), among others).
• Referencing national standards of quality provides a cost-effective alternative to developing agency-specific contract item specifications and test methods for manufactured items
• Minimize project-specific inspection and testing needs

Commercial Quality: "Off-the-shelf" items readily available for purchase at local supply houses.
• Eliminate the need to develop an agency-specific specification
• Minimize project-specific inspection and testing needs for low-risk, generic items

Pre-approved Source: A source designated as an approved and/or certified supplier (typically for a specific period of time) on the basis of acceptable testing and evaluation periodically performed by the agency.
• Streamline quality management and material acceptance by reducing the need for testing at the project site (if proper documentation accompanies the material delivered from the pre-approved source)

Qualification Requirements for Facilities, Contractors, and Personnel

Prefabrication Audit: An agency-performed audit to evaluate whether a fabricator has the processes and resources to fabricate project-specific products to the quality indicated in the specifications.
• Obtain a measure of assurance that a producer has the capability to perform, by conducting an onsite facility audit

Authorized Facility List: A list of facilities audited on a system basis to ensure the fabricator is adhering to its quality control plan.
• The audit process ensures that the fabricator has, and is adhering to, its quality control processes
• The prospect of periodic audits should help keep fabricators cognizant of their responsibility for quality control

Authorized Laboratory: Laboratories recognized by the agency (or a formal accrediting body, such as the AASHTO Accreditation Program (AAP)) as meeting quality system requirements.
• Ensure testing is performed by laboratories with demonstrated competence to perform standard test procedures

Approved Manufacturer: A manufacturer that has submitted quality control documentation and/or material samples and has been given approval status to certify specific material(s).
• Streamline quality management at the project site by allowing acceptance of certain products on the basis of certifications

Contractor Qualifications: Specification of minimum requirements for contractors.
• Ensure construction is performed by qualified contractors having the requisite experience

Qualification Requirements for Sampling/Testing Personnel: Requirement that sampling, testing, and inspection personnel be certified to a recognized standard or certification program (e.g., ACI).
• Ensure that contractor, vendor, and agency sampling, testing, and inspection data used in the acceptance decision are produced by qualified personnel

Qualification Requirements for Installer/Fabricator Personnel: Requirement that production personnel be certified to a recognized standard (e.g., AWS).
• Ensure work is performed by qualified personnel

Submittals

Agency Review of Contractor Working Drawings: Review of the contractor's working drawing submittals.
• Assure construction and fabrication details conform with design requirements prior to the start of construction or fabrication

Agency Review of Contractor Mix Designs: Review of the contractor's mix design submittals.
• Assure the contractor's planned mix proportions will meet performance requirements prior to the start of construction or fabrication

Agency Review of Quality Management Plan: Review of the contractor's planned quality control procedures describing how it intends to perform and control the work to meet the specifications.
• Assure the contractor performs, as a minimum, the quality inspections and testing identified in its plan
• Assure the contractor understands how its own actions (e.g., in scheduling, ordering, handling, placing, finishing, and curing) will impact in-place material properties and the performance of the work, and that the contractor has planned the work and allocated its resources accordingly

Sampling and Testing by Contractor

Quality Control Sampling and Testing (Source): Required testing performed by the contractor during the production process.
• Measure quality characteristics that affect production at a time when corrective action can be taken to prevent appreciable nonconforming material from being incorporated in the project

Quality Control Sampling and Testing (Jobsite): Required testing performed by the contractor during the production process.
• Measure quality characteristics that affect production at a time when corrective action can be taken to prevent appreciable nonconforming material from being incorporated in the project

Quality Control Sampling and Testing for Acceptance: The agency's use of contractor test results in the acceptance decision for select quality characteristics (as validated by the agency's verification and independent assurance testing).
• Use verified contractor test results to minimize the duplication of agency and contractor testing and reduce the agency's testing burden

Sampling and Testing by Agency

Acceptance Sampling and Testing: Sampling and testing to determine the degree of compliance with specifications.
• Measure the degree to which materials comply with specification requirements

Verification Sampling and Testing: Agency-performed testing to validate contractor test results used in the acceptance decision.
• Use verified contractor test results to minimize the duplication of agency and contractor testing and reduce the agency's testing burden; verification testing at 10–25% of the frequency of QA testing

Independent Assurance Testing (Project Basis): An unbiased and independent project-level evaluation of all the sampling and testing procedures used in the acceptance program (the frequency of IA testing is based on the testing frequency on a particular project, e.g., 10% of verification/acceptance testing).
• Assure sampling and testing activities are performed by qualified personnel using proper procedures and properly functioning, calibrated equipment
• Promote confidence in test results used for acceptance purposes

Independent Assurance Testing (System Basis): Centralized independent assurance testing, with a frequency generally established on a time basis for all testers and equipment (would likely require implementation of a materials management system).
• Improve the efficiency of IA testing by focusing on the testers rather than project quantities (ensures that most testers are reviewed over the period of a year rather than continually reviewing the same testers)
• Incorporate IA program results as part of technician qualification programs

System-Based Acceptance: Agency monitoring and management of material quality on a statewide basis (would likely require implementation of a materials management system).
• More efficient use of testing and sampling resources
• Streamline project-level quality management and material acceptance

Programmatic QA Inspection and Testing (Jobsite): Periodic inspection and testing performed by the agency on random "check" samples of manufactured products at the jobsite.
• Confirm that a manufacturer continues to provide products meeting the desired standard of quality
• Determine the reliability of the manufacturer's quality control process
• Provide data to support continued acceptance of a particular product on all agency projects via use of a certificate of compliance

Certificate of Compliance

Certificate of Compliance from a Producer: A written and signed statement from a producer, submitted before the material is incorporated into the work, for each batch or lot of the material, stating that the material provided complies with the contract. (All materials and products accepted by certificate of compliance require periodic programmatic quality assurance testing of random "check" samples, with results that support the reliability of the certificate provider.)
• Receive confirmation that the contractor has accepted the material and is confident that the material complies with the specifications
• Eliminate the need for agency sampling and testing at the jobsite

Certificate of Compliance from a Producer with Test Results: A written statement from a producer, accompanied by test data (e.g., mill test reports for steel, pressure-treating reports for timber), affirming that a product meets the specifications.
• Obtain field or laboratory test data verifying the suitability of material representative of the same lot as the material to be incorporated in the work
• Eliminate the need for agency sampling and testing at the jobsite

Inspection

Quality Control Inspection: Required inspection performed by the contractor during the production process to ensure that a material or product meets the contract requirements.
• Visual inspection of quality and workmanship during production or installation to ensure that a material or product meets contract requirements

Acceptance Inspection: Inspection performed by the agency or its designated agent to ensure that a product is acceptable in terms of the specifications for a specific project.
• Validate the quality of the product to ensure the proper combination of materials and details of construction

Continuous Inspection of Work in Progress: Agency monitoring of the contractor's construction processes on a continuous basis to ensure that construction quality and workmanship comply with the plans and specifications.
• Provide the highest degree of confidence in the quality of workmanship and fitness for purpose (e.g., inspect 80–100% of the time work is in progress, with assistants assigned to only one operation)

Intermittent Inspection of Work in Progress: Agency monitoring of the contractor's construction processes on an intermittent basis to ensure that construction quality and workmanship comply with the plans and specifications.
• Provide a reasonable degree of confidence in the quality of workmanship and fitness for purpose (e.g., inspect 30–80% of the time work is in progress, with assistants assigned to two or three operations simultaneously)

Benchmark Inspection: Agency inspection up to 30% of the time work is in progress, allowing construction operations to proceed until a predetermined critical activity or hold point has been reached.
• Provide some confidence in the quality of workmanship while minimizing disruption of construction operations

Warranties

Material and Workmanship Warranty:
• Requires the contractor to correct early defects in products caused by elements within the contractor's control

Performance Warranty:
• Shift some post-construction performance risk to industry
• Monitor and evaluate contractor performance over time

Appendix B

Optimization Tool

Description of Tool

The Level 3 model is designed to use actual or estimated QA costs, and the potential cost impacts related to material non-conformance, to optimize materials quality assurance (QA). This appendix introduces an Excel-based spreadsheet tool that automates the optimization process; it is available at http://apps.trb.org/cmsfeed/TRBNetProjectDisplay.asp?ProjectID=3403. The data required for a QA risk-based optimization are (1) the cost of each level of QA effort, (2) the probability of a non-conforming material at each QA level, and (3) the impact of a non-conforming material. The Excel tool has separate worksheets for these inputs.

Input Materials of Interest, Properties, QA Levels of Effort, and Factors

Select the material of interest or material property(ies) for evaluation of material QA. The user can select a material of interest (material item), or material properties used for acceptance as part of the agency's standard acceptance plan. The model may also be used to think through the use of advanced material properties and test methods (e.g., stiffness) to assess their cost benefit.

To complete the input sheet for Step 1 as shown in Figure B.1, users should input the material of interest and a subcategory within the material in the yellow color-coded cells for the "Specification/property." The tool can perform optimizations on one to six properties; the user should input the ones that are most important for optimization purposes.

For analysis of QA levels of effort and costs, the Excel tool defines five levels of QA effort performed by the DOT based on the primary mode of acceptance, ranging from visual inspection up to full sampling and testing, as shown in Table B.1. The Definitions worksheet includes definitions for the QA levels of effort; the user can modify (add or delete) levels of QA as appropriate.

If applicable, input factors that have an obvious impact on the QA approach (e.g., industry experience, project complexity, delivery method); the more factors, the greater the complexity. The model allows input of up to four factors. For certain material items (e.g., standard manufactured products such as plastic pipe), the impact of supplier experience may be the only important factor affecting the level of QA. If a material requires more than four factors, the user should identify factors that are strongly correlated and group them together. Because the optimization methodology can be applied iteratively, the user could initially run the process with four factors and then rerun it with different factors to determine which factors have the highest impact on the outcomes.

Figure B.1. Material/property inputs (worksheet template: a cell for the material name and numbered rows for the specification/properties to be analyzed).

Figure B.2. QA levels of effort.
  1. Visual inspection
  2. Certification
  3. Certification w/data
  4. Verification sampling and testing
  5. Full sampling and testing

A two-option dichotomous rating (i.e., low/high) is used in the tool to characterize the factors for generating scenarios: a score of low or high is assigned, and for delivery strategy either design-bid-build (DBB) or design-build (DB) is selected. Although this is not a comprehensive way of characterizing all potential scenarios, it is adequate for demonstrating data collection for use of the optimization model. For example, Figure B.3 illustrates three factors that were used to develop project scenarios; the factors may change depending on the material type and use. The tool auto-generates scenarios based on the number of factors selected, as shown in Figure B.4.

To gauge the sensitivity of different factors on the optimization results, the user can select different scenarios in the Definitions worksheet (select ON to activate a scenario) and analyze them iteratively. The user could initially run the process with selected factors and rerun it for different scenarios (e.g., small vs. large) to determine how they affect the level of QA. This may be useful in the project planning and development stages to determine the optimal level of QA effort for different project scenarios. If the model is used for a project-specific assessment, the user would input the factors for the one specific pre-defined project (e.g., high level of industry experience, large quantity of material, large complex project, and design-build delivery method). The user could then analyze materials of interest for the single project scenario to determine the optimal QA level of effort for the project.
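The scenario auto-generation described above, every combination of the selected two-option factors, can be sketched outside the spreadsheet. The factor names mirror Figure B.3; the function itself is illustrative and not part of the Excel tool:

```python
from itertools import product

# Each factor maps to its two possible ratings (dichotomous, per the tool).
factors = {
    "Industry Experience": ["High", "Low"],
    "Criticality/Complexity": ["High", "Low"],
    "Delivery Method": ["DBB", "DB"],
}

def generate_scenarios(factors):
    """Enumerate every combination of factor ratings (2^n scenarios for n factors)."""
    names = list(factors)
    return [dict(zip(names, combo)) for combo in product(*factors.values())]

scenarios = generate_scenarios(factors)
print(len(scenarios))  # 3 factors -> 8 scenarios
```

With the tool's four-factor limit, this enumeration never exceeds 16 scenarios, which is why the worksheet can list them all and let the user toggle each ON or OFF.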

Input Cost of QA

Input the cost of QA, as a percentage of cost, for each QA level, as shown in Table B.1. DOTs typically account for the costs of various QA activities as a percentage of a program budget (lab tests and plant inspections) or as a percentage of material cost (field inspection and testing) specific to each material item. A user can also input historic QA cost data as a percentage of item cost for each level of QA, or input data based on subject matter experts. The cost of QA is expected to increase with each level of QA and may vary somewhat with the level of industry experience, material quantities, criticality, and chosen project delivery method.

Input Probability of Non-Conforming Material

Input the probability that the material or material property will be non-conforming (i.e., the probability of a defect) at each QA level, as shown in Table B.2. Here, the probability is again expressed as a percentage (0–100%). The probability of non-conformance is expected to decrease with increasing levels of QA across all project scenarios.

Figure B.3. Factors affecting QA approach.
  1. Industry Experience: the confidence or reliability an owner has in the contractor and/or supplier (scenarios: High/Low)
  2. Criticality/Complexity: project size, location, criticality (urban/rural) (scenarios: High/Low)
  3. Delivery Method: the delivery system used by the owner for design, construction, operation, and maintenance services for the project (scenarios: DBB/DB)

Figure B.4. Scenarios for selected factors.
  Scenario | Industry Experience | Criticality/Complexity | Delivery Method | On/Off
  1 | High | High | DBB | ON
  2 | Low | Low | DBB | ON
  3 | Low | High | DBB | ON
  4 | High | Low | DBB | ON

Table B.1. Estimated cost of QA (input worksheet; one percentage per QA level and scenario, initialized to 0%). Rows: 1 Visual inspection; 2 Certification; 3 Certification w/data; 4 Verification sampling and testing; 5 Full sampling and testing. Columns: scenarios 1 through 4.
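The two expectations stated above, that QA cost rises and the probability of non-conformance falls as the level of effort increases, can be sanity-checked before running the optimization. A minimal sketch in Python; the numeric inputs are illustrative placeholders, not values from the tool:

```python
def is_non_decreasing(values):
    """True when each entry is greater than or equal to the one before it."""
    return all(a <= b for a, b in zip(values, values[1:]))

def is_non_increasing(values):
    """True when each entry is less than or equal to the one before it."""
    return all(a >= b for a, b in zip(values, values[1:]))

# Illustrative inputs for one scenario, QA levels 1..5
# (percent of material item cost, and percent probability of a defect).
qa_cost = [0.5, 1.0, 1.5, 3.0, 5.0]
p_nonconforming = [20.0, 15.0, 12.0, 6.0, 3.0]

assert is_non_decreasing(qa_cost), "QA cost should rise with the level of effort"
assert is_non_increasing(p_nonconforming), "defect probability should fall with more QA"
```

Checks of this kind can catch data-entry mistakes in the worksheets before they distort the heat-map output.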

Input Cost of Non-Conforming Material

Input the cost of a non-conforming property as shown in Table B.3. The user can assign expected values based on the perceived impact of the material non-conformance, considering the range of options previously discussed.

Outputs: Expected Value of a Non-Conforming Material and Total Cost of Quality

The tool calculates the expected value of non-conformance and the total cost of quality (CoQ). The CoQ output is presented as a heat map in Table B.4 to illustrate the optimal investment point for the material item: cells colored red have the highest expected cost of quality, and cells colored green have the lowest. For this example, the assessment indicates that Level 5, full agency sampling and testing, is the optimal QA investment for the example material across all scenarios. The user can optimize QA effort as a planning or decision tool to choose among different project scenarios. The tool can also be used to assess the relative importance of material properties for a specific material item on a project.

Table B.2. Probability of non-conformance worksheet (input; one percentage per QA level and scenario, initialized to 0%). Rows: QA levels 1 through 5. Columns: scenarios 1 through 4.

Table B.3. Impact of non-conformance (input; one percentage per property and scenario, initialized to 0%). Rows: properties 1 through 3. Columns: scenarios 1 through 4.

Table B.4. Total cost of quality (CoQ).
  QA Effort | Scenario 1 | Scenario 2 | Scenario 3 | Scenario 4
  1 Visual inspection | 78% | 93% | 77% | 70%
  2 Certification | 59% | 69% | 59% | 52%
  3 Certification w/data | 53% | 62% | 52% | 47%
  4 Verification sampling and testing | 31% | 40% | 31% | 24%
  5 Full sampling and testing | 18% | 27% | 18% | 18%
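The CoQ output combines the three inputs described above: the cost of performing QA at a given level, plus the expected cost of non-conformance (probability times impact) at that level. The worksheet formulas themselves are not reproduced in the report, so the sketch below is an illustrative reading of that expected-value logic for a single property and scenario; all numeric inputs are invented for the example:

```python
def total_cost_of_quality(qa_cost, p_nonconform, impact):
    """
    Expected total cost of quality per QA level, as a percent of item cost.

    qa_cost[i]      -- cost of performing QA at level i
    p_nonconform[i] -- probability of a non-conformance at level i (percent)
    impact          -- cost consequence of a non-conformance (percent of item cost)
    """
    return [c + (p / 100.0) * impact for c, p in zip(qa_cost, p_nonconform)]

def optimal_level(coq):
    """1-based index of the QA level with the lowest total cost of quality."""
    return min(range(len(coq)), key=coq.__getitem__) + 1

# Illustrative inputs for QA levels 1..5 (not taken from the report's worksheets):
qa_cost = [0.5, 1.0, 1.5, 3.0, 5.0]
p_nonconform = [60.0, 45.0, 40.0, 20.0, 10.0]
impact = 115.0  # e.g., a remove-and-replace consequence

coq = total_cost_of_quality(qa_cost, p_nonconform, impact)
print(optimal_level(coq))  # -> 5
```

With these invented numbers, the cheapest QA level by total cost of quality is Level 5, mirroring the shape of the Table B.4 example, where heavy QA pays for itself through a lower expected non-conformance cost.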

Appendix C

Case Study Examples of Optimization Process

Example 1: Level 2 QA Optimization for HMA Pavement Reconstruction

This example illustrates the use of the Level 2 property-based optimization for HMA pavement. HMA is categorized as a project-produced material: items are typically produced for a specific project, and mixing, placing, compacting, and other field operations can substantively impact quality and material variability. Because of these characteristics, QA processes for acceptance of HMA pavements typically include QA sampling and testing of material properties. The remainder of this example describes the steps that a DOT would follow to optimize QA.

Step 1: Identify Project Description, Objectives, and Material(s) of Interest

Level 2 Property-Based Optimization Assessment Worksheet
  Project name/ID: Asphalt Pavement Reconstruction/Widening
  Date of review: MM/DD/YY
  Project description: Reconstruction of a rural interstate highway to add capacity.
  Objectives:
  • Improve capacity
  • Speed of construction
  • Pavement durability
  Material(s) of interest:
  • Dense-graded hot-mix asphalt (QC/QA), Item xxx, high-volume roadway mix
  • Project-produced
  • Material quantities: ≥50,000 T

Step 2: Review Existing Acceptance Plan for Materials of Interest

The standard minimum sampling and testing guide for HMA is shown in Table C.1. The guide schedule establishes the minimum required frequencies for sampling and testing for

specific material properties. The worksheet can be used to determine the optimal QA frequency for acceptance testing of material properties.

Step 3: Evaluate Importance of Acceptance Properties

Consider the following in rating the importance of a property:
• Consequences of non-conformance (e.g., pay adjustment system, or remove and replace the unit of production)
• Sampling frequency in the standard guide schedule
• Location and test method (e.g., in-place destructive vs. production test)
• Relationship of the property to pavement distresses
• Relationship to fundamental material characteristics

Table C.2 illustrates the ratings of acceptance property importance for the HMA material item. For this material specification, in-place air voids and lab molded density are payment adjustment properties, and in-place air voids requires destructive core tests. Thus, the key material properties for acceptance are in-place air voids, lab molded density, and asphalt content. VMA is measured frequently but is not part of the payment adjustment system. Gradation is tested for acceptance less frequently (1 per 12 sublots). Ride quality is a pay-adjustment performance property. The remaining properties are tested less frequently or can be waived by the Engineer.

Step 4: Assess Benefit of Using Alternative or Advanced Testing Properties

Table C.1. Sampling and testing guide for HMA: dense-graded HMA pavement (source: TxDOT Item 341, 2014).
  Complete mixture:
  Test For | Analysis | Location/Timing | Std. Acceptance Frequency
  Asphalt content (%) | AC content of JMF | Truck sample | Minimum 1 per lot
  Voids in mineral aggregates (VMA) | Bulk specific gravity of lab-molded JMF | Truck sample | 1 per sublot
  Gradation | Dry gradation sieve analysis (Engineer) | Truck sample | 1 per 12 sublots
  Boil test | Lab specimen | Truck sample | 1 per project
  Moisture content | Oven drying (Engineer) | Truck sample | 1 per project
  Lab molded density | Density of production material | Truck sample | 1 per sublot
  Hamburg Wheel | Lab-molded JMF | Truck sample | 1 per project
  Roadway:
  In-place air voids | Core for density of in-place material | Roadway | 2 cores per sublot
  Joint density | Density gauge | Roadway | 1 per project
  Segregation profile | Density gauge | Roadway | 1 per project
  Ride quality | Inertial profiler (IRI) | Travel lanes | As per specification

As noted in the acceptance plan, this plan includes the Hamburg Wheel Tracking test, a performance test for rutting resistance. The frequency is one test per project, usually done at the

start of or early during production as a measure of mixture performance, particularly if recycled material is used in the mix. It is an important test in that production can be suspended until the mix design is adjusted if the test results do not meet the performance standard for Hamburg rutting. Other performance measures that might be considered as supplemental measures include mixture fatigue tests (beam fatigue), Intelligent Compaction monitoring of the in-place material as a rapid QC measure to control compaction variability, and ground penetrating radar (GPR) to non-destructively check density and layer thickness.

Table C.2. Assessment of acceptance property importance.
  Test For | Location/Timing | Std. Acceptance Frequency | Importance
  Lab molded density | Truck sample | 1 per sublot | Primary
  In-place air voids | Roadway | 2 cores per sublot | Primary
  Asphalt content (%) | Truck sample | Minimum 1 per lot | Primary
  Hamburg Wheel | Truck sample | 1 per project | Primary
  Voids in mineral aggregates (VMA) | Truck sample | 1 per sublot | Secondary
  Gradation | Engineer truck sample | 1 per 12 sublots | Secondary
  Boil test | Truck sample | 1 per project unless waived | Low
  Moisture content | Engineer truck sample | 1 per project | Low
  Joint density | Roadway | 1 per project | Low
  Segregation profile | Roadway | 1 per project | Low
  Ride quality | Travel lanes | Per specification | N/A

Step 5: Identify Project-Level Considerations

To refine the selection of appropriate QA methods, Table C.3 summarizes additional project-related considerations that can influence the QA strategy for a given material and/or project.

Table C.3. Project factors that influence QA strategy.
  Contractor qualifications: The contractor has a satisfactory performance history and is experienced with quality management and with design-build.
  Project delivery method: Design-build
  Acceptance method: Contractor test results used in the acceptance decision
  Material quantities: Large quantities
  Project criticality/complexity: Moderate (rural interstate)

Step 6: Determine the Extent That Acceptance Testing Can Be Reduced or Optimized Based on Material Risk and Property Importance

For this design-build project, contractor quality control (QC) test results are used in the DOT's acceptance decision, but the DOT must still perform independent verification sampling and testing in accordance with 23 CFR Part 637. The optimal verification sampling and testing plan

would be driven by material criticality and the ability of the property to act as an indicator of performance. Using the Level 2 guidelines, the optimal level of verification QA (testing or observational verification) is shown in Table C.4. In this example, the material item risks are characterized as moderate based on the material application.

Table C.4. Property-based optimization of HMA pavement (dense-graded hot-mix asphalt (QC/QA), Item xxx, high-volume roadway mix).
  Test For | Location | Std. Acceptance Frequency | Property Importance | Optimal Verification QA
  Asphalt content (%) | Truck sample | 1 per lot | Primary | Verification testing at 25% of QC frequency; reduce to 50% if process under control, using cumulative/chain lot sampling
  Gradation | Engineer truck sample | 1 per 12 sublots | Secondary | Acceptance at 50% of default frequency after start-up; no statistical validation
  Voids in mineral aggregates (VMA) | Plant-produced truck sample | 1 per sublot | Secondary | Reduce to 50% of standard frequency; no statistical validation
  Boil test | Truck sample | 1 per project | Observational | Observational verification of QC, with audits
  Moisture content | Engineer truck sample | 1 per project | Observational | Observational verification, with audits
  Lab molded density | Truck sample | 1 per sublot | Primary | Verification testing at 25% of QC frequency; reduce to 50% for process under control, using cumulative/chain lot sampling
  Hamburg Wheel Tracker | Engineer truck sample | 1 per project | Primary | 1 per project after start-up (for high-RAP mixes)
  In-place air voids (density) | Roadway | 2 cores per sublot | Primary | Verification testing at 25% of QC frequency; reduce to 50% for process under control, using cumulative/chain lot sampling
  Joint density | Roadway | 1 per project | Observational | Observational verification of QC, with audits
  Segregation profile | Roadway | 1 per project | Observational | Observational verification of QC, with audits
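The pattern in Table C.4, where verification intensity is keyed to a property's importance rating, can be captured as a small lookup. The strings below paraphrase the table's recommendations and are illustrative, not a specification:

```python
# Hedged paraphrase of the Table C.4 scheme: importance rating -> verification approach.
VERIFICATION_PLAN = {
    "primary": ("verification testing at 25% of QC frequency; "
                "reduce to 50% once the process is under control, "
                "using cumulative/chain lot sampling"),
    "secondary": "testing at 50% of the standard frequency; no statistical validation",
    "observational": "observational verification of contractor QC, with audits",
}

def verification_approach(importance):
    """Suggested verification strategy for a rated acceptance property."""
    return VERIFICATION_PLAN[importance.lower()]

print(verification_approach("Primary"))
```

A property rated Primary (e.g., in-place air voids) maps to the 25%-of-QC-frequency entry, while properties verified observationally map to audit-based verification; project-specific exceptions such as the Hamburg Wheel Tracker would still be handled case by case.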
Example 2: QA Optimization for Precast Structural Concrete

This example illustrates the use of the Level 3 cost-based optimization tool to optimize a DOT's QA program for different precast concrete applications. Precast concrete can be used for structural or non-structural applications. The remainder of this example describes the steps that a DOT would follow to use the optimization tool.

Case Study Examples of Optimization Process

Step 1: Define Material(s) and Properties (Categories) of Interest

In this example, precast concrete was selected for analysis. Materials have a wide variety of applications, but some applications are of greater interest because of cost, criticality, safety, or other reasons. Structural bridge elements and drainage structures commonly use precast concrete; however, these two applications differ significantly in terms of QA cost and the risks of non-conformance. Precast concrete is the material of interest in this example, with bridge members and drainage structures as subcategories within the material. There are many applications or uses for precast concrete, but the user should select only those of interest for a specific optimization strategy.

Level 3 Cost-Based Optimization Assessment Worksheet: Programmatic QA program for precast concrete (Date of Review: MM/DD/YY)

Material(s) of interest: precast concrete.
Bridge element: precast bridge deck (category: critical structure).
Drainage structure: system of precast inlets and drains (precast pipe) to collect and draw off water from structures and pavements (category: non-critical structure).

Figure C.1. Input for precast concrete example.

Step 2: Identify Project Factors

Use the factors that have the greatest impact on the way the agency approaches QA. Industry experience, material quantity, criticality or complexity, and project delivery method can all influence the QA procedures or the risks that the agency is managing. As noted in Figure C.2, all the scenarios represent extremes: project complexity is either high or low, and the delivery method is either traditional design-bid-build (DBB) or design-build-operate-maintain (DBOM). The four project scenarios selected for this example are shown in Table C.5.

Industry experience: the experience and reliability of the contractor and/or supplier (High / Low).
Material quantity: the planned quantity or volume of material (Large / Small).
Criticality/complexity: project size, location, and criticality (urban/rural) (High / Low).
Delivery method: the delivery system used by the owner for design, construction, operation, and maintenance services for the project (DBB / DBOM).

Figure C.2. Project factors.
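Because each factor in Figure C.2 takes one of two extreme values, the four factors define 2^4 = 16 possible scenario combinations, of which Table C.5 selects four for analysis. The following Python sketch is illustrative only (the report's tool is a worksheet, not code); the variable names are assumptions, and the factor names and levels are taken from Figure C.2:

```python
from itertools import product

# Binary project factors and their two extreme levels (from Figure C.2).
factors = {
    "industry_experience": ["High", "Low"],
    "material_quantity": ["Large", "Small"],
    "criticality_complexity": ["High", "Low"],
    "delivery_method": ["DBB", "DBOM"],
}

# Every combination of factor extremes: 2**4 = 16 candidate scenarios.
names = list(factors)
all_scenarios = [dict(zip(names, combo)) for combo in product(*factors.values())]
print(len(all_scenarios))  # 16

# Scenario 1 from Table C.5 is one point in that space.
scenario_1 = {"industry_experience": "High", "material_quantity": "Large",
              "criticality_complexity": "High", "delivery_method": "DBB"}
print(scenario_1 in all_scenarios)  # True
```

The user need not analyze all 16 combinations; as in this example, a handful of scenarios spanning the extremes is usually sufficient.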

Factors                   Scenario 1   Scenario 2   Scenario 3   Scenario 4
Industry experience       High         Low          High         High
Material quantity         Large        Large        Small        Large
Project delivery method   DBB          DBB          DBB          DBOM
Criticality/complexity    High         High         Low          High

Table C.5. Project scenarios.

Step 3: Identify Levels of QA Effort

Table C.6 provides the assumed levels of effort, offering a range of options for what the DOT's QA approach could be, with concise definitions for the levels of QA effort and the factors previously identified.

Step 4: Quantify Cost of QA (% of Material Cost)

Input the cost of QA by quantifying the cost to perform each level of QA for the different scenarios. The cost of QA is expressed as a percentage of the material element cost for each scenario, as shown in Figure C.3.

Step 5: Quantify the Probability of Non-Conforming Material

Estimate the likelihood or probability of non-conforming material properties, shown in Figure C.4, for each level of QA effort and each scenario. The probability is expressed as a percentage (0-100%).

Step 6: Quantify the Cost of Non-Conforming Material

Estimate the cost of non-conforming material.
If a key property (e.g., strength or air content) is not in conformance with the specifications, and it is determined that the degree of non-compliance requires replacement, the cost of replacing a precast bridge deck element (lot) is estimated at 115% (i.e., the initial cost of the replacement element, 100%, plus the cost of removal, 15%), as shown in Figure C.5. The cost for the drainage structure segment similarly includes an estimated replacement cost of 100% plus a removal cost of 15%.

Level 1, Visual inspection: visual inspection of the work to assure compliance with the contract and prevailing industry standards.
Level 2, Certification: verify that certified material complies with the contract or is on the current QPL.
Level 3, Certification w/data: review certification for compliance with specifications, supported by test data.
Level 4, Verification sampling and testing: agency verification testing at a reduced frequency and statistical comparison with the contractor's results; the agency is also responsible for independent assurance (IA).
Level 5, Full sampling and testing: the agency performs acceptance sampling and testing and is also responsible for IA.

Table C.6. Levels of QA.

Bridge Deck
QA Effort \ Scenario                  1     2     3     4
1 Visual inspection                   2%    1%    1%    1%
2 Certification                       3%    2%    2%    2%
3 Certification w/data                8%    5%    3%    3%
4 Verification sampling and testing   9%    8%    6%    9%
5 Full sampling and testing          12%   12%   10%   10%

Drainage Structure
QA Effort \ Scenario                  1     2     3     4
1 Visual inspection                   1%    2%    1%    1%
2 Certification                       2%    3%    2%    2%
3 Certification w/data                2%    0%    3%    3%
4 Verification sampling and testing   6%    7%    4%    8%
5 Full sampling and testing           9%   10%    6%   10%

Figure C.3. QA effort for each scenario.

Bridge Deck
QA Effort \ Scenario                  1     2     3     4
1 Visual inspection                  50%   60%   40%   20%
2 Certification                      30%   40%   35%   15%
3 Certification w/data               11%   20%    5%    8%
4 Verification sampling and testing  10%   11%    5%    5%
5 Full sampling and testing           6%    9%    4%    3%

Drainage Structure
QA Effort \ Scenario                  1     2     3     4
1 Visual inspection                  45%   50%   40%   25%
2 Certification                      25%   30%   25%   15%
3 Certification w/data               10%   18%    5%    6%
4 Verification sampling and testing   7%   12%    5%    3%
5 Full sampling and testing           5%    7%    5%    3%

Figure C.4. Probability of non-conforming material.

Spec \ Scenario          1      2      3      4
1 Bridge Deck          115%   115%   115%   115%
2 Drainage Structure   115%   115%   115%   115%

Figure C.5. Estimated impact of non-conforming element.

Step 7: Quantify the Expected Value of Non-Conforming Material

The expected value (EV) of non-conforming material is the product of the probability of non-conformance (PN) and the cost impact (I) of rework, expressed as a percentage of the installed material cost:

EV = PN × I

Multiplying the estimated impact of the non-conforming bridge element and drainage structure by the probability of non-conformance gives the EV of non-conformance for each level of QA effort and scenario, as calculated in Figure C.6.

Bridge Deck
QA Effort \ Scenario                  1     2     3     4
1 Visual inspection                  58%   69%   46%   23%
2 Certification                      35%   46%   40%   17%
3 Certification w/data               13%   23%    6%    9%
4 Verification sampling and testing  12%   13%    6%    6%
5 Full sampling and testing           7%   10%    5%    3%

Drainage Structure
QA Effort \ Scenario                  1     2     3     4
1 Visual inspection                  52%   58%   46%   29%
2 Certification                      29%   35%   29%   17%
3 Certification w/data               12%   21%    6%    7%
4 Verification sampling and testing   8%   14%    6%    3%
5 Full sampling and testing           6%    8%    6%    3%

Figure C.6. EV of non-conformance.

Step 8: Optimize the QA Effort Based on Total Cost of Quality

Table C.7 is a heat map that compares the cost of quality in two dimensions: the QA levels of effort on the vertical axis and the scenarios on the horizontal axis. Each percentage is the expected total cost of quality (CoQ) of the material. In the original color-coded table, green represents the lowest cost of quality and red the highest; the coding from green to red is specific to each heat map. In scenario 1, the overall expected cost of quality for the bridge element is 59% (orange) for visual inspection, compared with 20% (green) if the agency performs verification sampling and testing. In scenario 4, the lowest overall expected cost of quality (12%) is for certification with supporting test data. The table also shows that scenario 2 has a higher expected cost of quality at all QA levels.

Comparing the bridge deck CoQ with the drainage structure example, the optimal QA level for the bridge deck entails sampling and testing for scenarios 1 and 2 and certification with supporting test data for the lower-risk scenarios 3 and 4. The optimal QA level for the drainage structure entails certification with supporting test data for all scenarios.

Bridge Deck
QA Effort \ Scenario                  1     2     3     4
1 Visual inspection                  59%   70%   47%   24%
2 Certification                      37%   48%   42%   19%
3 Certification w/data               21%   28%    9%   12%
4 Verification sampling and testing  20%   21%   12%   15%
5 Full sampling and testing          19%   22%   15%   13%

Drainage Structure
QA Effort \ Scenario                  1     2     3     4
1 Visual inspection                  53%   59%   47%   30%
2 Certification                      31%   37%   31%   19%
3 Certification w/data               13%   21%    9%   10%
4 Verification sampling and testing  14%   21%   10%   11%
5 Full sampling and testing          15%   18%   12%   13%

Table C.7. Optimal level of QA effort based on total cost of quality.
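The Step 7 and Step 8 arithmetic can be reproduced from the published bridge deck inputs. The following Python sketch is illustrative only (the report's tool is a worksheet, and all variable names are assumptions): it applies EV = PN × I using the 115% impact from Figure C.5, then adds the QA cost from Figure C.3 to obtain the total CoQ, reproducing the Table C.7 bridge deck values to within rounding.

```python
# Sketch of Steps 7-8 for the bridge deck. Inputs are the QA costs from
# Figure C.3, non-conformance probabilities from Figure C.4, and the 115%
# impact from Figure C.5. The published tables round intermediate values,
# so totals may differ from Table C.7 by about a percentage point.

QA_LEVELS = ["Visual inspection", "Certification", "Certification w/data",
             "Verification sampling and testing", "Full sampling and testing"]

# Cost of QA as a fraction of material cost, per level, for scenarios 1-4.
qa_cost = [
    [0.02, 0.01, 0.01, 0.01],
    [0.03, 0.02, 0.02, 0.02],
    [0.08, 0.05, 0.03, 0.03],
    [0.09, 0.08, 0.06, 0.09],
    [0.12, 0.12, 0.10, 0.10],
]

# Probability of non-conforming material (Figure C.4).
prob_nc = [
    [0.50, 0.60, 0.40, 0.20],
    [0.30, 0.40, 0.35, 0.15],
    [0.11, 0.20, 0.05, 0.08],
    [0.10, 0.11, 0.05, 0.05],
    [0.06, 0.09, 0.04, 0.03],
]

IMPACT = 1.15  # replacement (100%) plus removal (15%), Figure C.5

# Step 7: EV = PN * I.  Step 8: total CoQ = cost of QA + EV.
ev = [[p * IMPACT for p in row] for row in prob_nc]
coq = [[c + e for c, e in zip(crow, erow)] for crow, erow in zip(qa_cost, ev)]

# Optimal QA level per scenario = the level with the lowest total CoQ.
for s in range(4):
    column = [coq[level][s] for level in range(5)]
    best = min(range(5), key=lambda lv: column[lv])
    print(f"Scenario {s + 1}: {QA_LEVELS[best]} (CoQ = {column[best]:.0%})")
```

Running the sketch selects sampling and testing for scenarios 1 and 2 and certification with supporting data for scenarios 3 and 4, matching the conclusion drawn from Table C.7.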

Abbreviations and acronyms used without definitions in TRB publications:

A4A Airlines for America
AAAE American Association of Airport Executives
AASHO American Association of State Highway Officials
AASHTO American Association of State Highway and Transportation Officials
ACI–NA Airports Council International–North America
ACRP Airport Cooperative Research Program
ADA Americans with Disabilities Act
APTA American Public Transportation Association
ASCE American Society of Civil Engineers
ASME American Society of Mechanical Engineers
ASTM American Society for Testing and Materials
ATA American Trucking Associations
CTAA Community Transportation Association of America
CTBSSP Commercial Truck and Bus Safety Synthesis Program
DHS Department of Homeland Security
DOE Department of Energy
EPA Environmental Protection Agency
FAA Federal Aviation Administration
FAST Fixing America's Surface Transportation Act (2015)
FHWA Federal Highway Administration
FMCSA Federal Motor Carrier Safety Administration
FRA Federal Railroad Administration
FTA Federal Transit Administration
HMCRP Hazardous Materials Cooperative Research Program
IEEE Institute of Electrical and Electronics Engineers
ISTEA Intermodal Surface Transportation Efficiency Act of 1991
ITE Institute of Transportation Engineers
MAP-21 Moving Ahead for Progress in the 21st Century Act (2012)
NASA National Aeronautics and Space Administration
NASAO National Association of State Aviation Officials
NCFRP National Cooperative Freight Research Program
NCHRP National Cooperative Highway Research Program
NHTSA National Highway Traffic Safety Administration
NTSB National Transportation Safety Board
PHMSA Pipeline and Hazardous Materials Safety Administration
RITA Research and Innovative Technology Administration
SAE Society of Automotive Engineers
SAFETEA-LU Safe, Accountable, Flexible, Efficient Transportation Equity Act: A Legacy for Users (2005)
TCRP Transit Cooperative Research Program
TDC Transit Development Corporation
TEA-21 Transportation Equity Act for the 21st Century (1998)
TRB Transportation Research Board
TSA Transportation Security Administration
U.S.DOT United States Department of Transportation

Guidelines for Optimizing the Risk and Cost of Materials QA Programs. NCHRP Research Report 838. Transportation Research Board, 500 Fifth Street, NW, Washington, DC 20001. ISBN 978-0-309-44634-1.

TRB’s National Cooperative Highway Research Program (NCHRP) Research Report 838: Guidelines for Optimizing the Risk and Cost of Materials QA Programs proposes guidelines for optimizing the risk and cost of materials quality assurance (QA) programs. It develops a methodology for establishing a materials QA program that optimizes risk and cost by providing appropriate types, levels, and frequencies of agency testing and inspection for transportation projects across their full range of type, size, complexity, and project-delivery method.
