CHAPTER 2: STATE OF PRACTICE

HISTORICAL PERSPECTIVE

The AASHO Road Test, conducted during the late 1950s and early 1960s, is often credited with providing the impetus for developing statistically based quality assurance procedures for managing highway construction. Results of extensive sampling and testing at the AASHO Road Test documented unexpectedly large variabilities in measured material and product properties. With the recognition that properties of highway construction materials and products may vary considerably from target values, a movement toward today's quality assurance procedures began. Statistical methods are the tools that allow variability to be quantified and considered, but significant changes in construction management philosophy have also occurred. The most important of these changes has been increased contractor involvement. The following are critical benchmarks in the evolutionary process that provide a basis for discussion:

• Recognition of material and product variability
• Application of statistical methods
• Method specifications
• End-result and performance-related specifications
• Quality assurance procedures/specifications
• Contractor-performed tests for acceptance (federal regulation 23 CFR 637B)
• Warranties and design-build procurement

After the true variability in properties of highway construction materials and products was recognized, the logical next step was to look for statistical methods to quantify and consider this variability in specification requirements.

This led to the use of numerous statistical techniques in today's specifications. Procedures to ensure random samples and consideration of differences between split and independent samples are common. Numbers of samples and tolerances are related to both contractor and state DOT risks. Hypothesis testing and other comparative techniques are routinely used to determine the significance of differences between test results and specification criteria or differences between sets of test results. Pay adjustment schedules recognize and account for, at least theoretically, variations in performance that result from variations in as-constructed properties.

State DOTs have the dominant role in method specifications, with minimal involvement of contractors. Materials and construction methods are specified, and state DOT personnel control the production, placement, and inspection processes. It has been said that with this type of specification, contractors provide financing, equipment, and manpower that are managed by state DOTs. An often-noted pitfall of such a system is that it may be difficult to determine cause when the end product is deficient.

End-result and performance-related specifications share a very important common characteristic: the assumption that desirable properties of the as-constructed product can be identified and measured. From a construction management perspective, this means the structures of these two types of specification will be similar. Differences between the two types are related to the identification of desirable properties. Performance-related specifications presume that fundamental engineering properties directly related to performance can be identified and measured at the time of construction.

In addition, it is presumed that a quantifiable relationship exists between measured properties and performance and that these relationships can be used in the acceptance process. NCHRP Synthesis 212 (8), completed in 1995, concluded that no operational examples of performance-related specifications could be identified, and this has changed little in the intervening years.

Requirements for properties used in end-result specifications are not so rigorous. Identified properties are certainly related to end-product quality, but direct relationships with performance may not exist. Ease and convenience of sampling and testing are often key considerations when selecting properties. Acceptance and pay adjustment computation are based on the degree of compliance with specification requirements.

Many current state DOT end-result specifications for HMAC use both mix and in-place mat properties. Gradation and asphalt content are common acceptance parameters that are good indicators of construction quality but may not be directly relatable to pavement performance. As-constructed smoothness of surface layers is thought to be directly related to pavement performance and is a common acceptance parameter. However, a direct link between acceptance and quantifiable relationships between smoothness and pavement performance is not apparent in current pay adjustment schedules. Air void content is a second property similar to smoothness in this respect. Segregation is a property with a direct link to performance that is beginning to appear in specifications, although measurement remains difficult.

Quality assurance is defined as "all those planned and systematic actions necessary to provide confidence that a product or facility will perform satisfactorily in service" (1).

As related to the construction process, the various elements of quality assurance are illustrated in Figure 1. Quality assurance specifications are defined as follows: "A combination of end-result specifications and materials and methods specifications. The contractor is responsible for QC (process control), and the highway agency is responsible for the acceptance of the product." (1)

This clear delineation of responsibilities has become distorted as some state DOTs, responding to provisions in federal regulation 23 CFR 637B (2), use contractor-performed tests in the acceptance process. No matter exactly how contractor-performed tests are used in the currently structured quality assurance process, i.e., for process control or for both process control and acceptance, increased contractor involvement is obvious.

A final benchmark for discussion is warranties and design-build. These procurement practices represent the next step in the evolution of contractor participation in the control and management of highway construction. With their implementation, the transformation from practically total state DOT control to practically total contractor control will be complete.

REVIEWS AND SURVEYS

Beginning in 1971 with HRB Special Report 118 (6), there have been periodic reviews of quality assurance procedures and specifications. NCHRP syntheses were published in 1976 (5), 1979 (7), 1995 (8), and 2005 (9). These are general reviews of the overall quality assurance process and most often employed surveys as the main source of information. Limited reviews of specifications and practices for HMAC are provided by Benson (10) and Schmitt et al. (11). Benson reviewed specifications from 16 states with particular attention to common practices that could be adopted with some confidence.

Schmitt et al. (11) conducted surveys of 42 state DOTs and 61 contractors to develop recommendations for modifying or developing quality control and quality assurance specifications. Key issues identified were (a) whether to use contractor or agency data for acceptance, (b) use of quantity or time to define lots, and (c) testing frequency.

As part of this study, a limited review of specifications was conducted. Specifications for the 37 state DOTs shown in Figure 2 were reviewed on-line. Questionnaires were sent to 25 state DOTs and the FHWA Western Federal Lands Highway Division (FHWA-WFLHD). Responses were received from 12 state DOTs and the FHWA-WFLHD.

Figure 2. States Where Specifications Were Reviewed

Procedures for Using Contractor-Performed Tests

One of the overarching impressions developed from the review and survey was the extreme diversity in the details of state DOT quality assurance processes. A likely reason is the variable progress state DOTs have made in moving from method specifications to more end-result-oriented specifications. Reluctance to commit to the required changes in construction management philosophy and uncertainty as to best practices have certainly been impediments to progress. The result is a condition of flux in many state DOT quality assurance systems, particularly regarding the use of contractor-performed tests for acceptance.

A second overarching impression was a lack of consensus regarding critical definitions. This is expressed rather succinctly in the latest synthesis of practice (9) as follows: "One problem associated with QA programs and specifications since their inception has been differing interpretations of the specialized vocabulary used in these programs." (9) One often finds terminology in various state DOT specifications that is unrelated to, or possibly in conflict with, the terms in TRB's "Glossary of Highway Quality Assurance Terms" (1). A desire for clarity in this research effort leads to some consideration of definitions as related to project objectives.

There seems to be general agreement, or at least no serious controversy, as to the definition or meaning of quality assurance, quality (process) control, acceptance, and independent assurance. As depicted in Figure 1, the latter three are part of an overall quality assurance process. In the traditional separation of responsibilities, contractors are responsible for quality control, and state DOTs are responsible for acceptance and independent assurance.

Issues arise when there is a mixing and mingling of responsibilities, particularly when contractor-performed tests are used for acceptance. In many instances, a lack of clear delineation of responsibilities is the reason for confusing terminology.

In order to use quality control sampling and testing results as part of the acceptance decision, 23 CFR 637B (2) requires that "The quality of the material has been validated by the verification sampling and testing." As used herein, verification will refer to a procedure intended to determine whether contractor and state DOT-performed tests provide measures of material or product properties that are comparable, i.e., when compared they are acceptably close. Verification is critical to using contractor-performed tests in the acceptance process. The viability of contractor-performed tests is a primary concern of state DOTs (3), and procedures for verifying contractor-performed tests have been identified as a topic needing additional study (4).

Table 1 groups state DOT specifications for various materials and products according to requirements for contractor-performed testing, requirements for comparing contractor test results with state DOT test results, and the use of contractor-performed test results for acceptance. Included are specifications from the 37 states illustrated in Figure 2. The materials and products are limited to those that require both contractor and state DOT-performed test results. Exceptions in Group B are HMAC in Illinois and Washington and PCC in Nevada and Montana. These materials in these four states were included because they have end-result type specifications that use statistical principles for acceptance but require no contractor-performed testing.

Table 1. Classification of Testing, Verification and Acceptance Procedures

Group A: Both State DOT and Contractor Testing Required

Subgroup A-1-a: State DOT Test Results for Acceptance and Comparisons Required
1. HMA properties and mat density - Colorado*
2. PCC pavement mix and slab properties - Colorado*

Subgroup A-1-b: State DOT Test Results for Acceptance and No Comparison Required
1. HMA mix properties and mat density - Indiana*, New Jersey, Utah, Arizona
2. HMA mix properties - Mississippi, Tennessee, Louisiana, Nevada, Texas
3. HMA mat density - West Virginia*, Idaho*, Montana
4. PCC properties - Arizona
5. PCC strength - Michigan
6. PCC pavement mix and slab properties - Indiana*
7. Base density - New Mexico

Subgroup A-2-a: Contractor Test Results for Acceptance and Comparisons Required
1. HMA mix properties and mat density - Florida*, South Carolina*, Maryland, Wyoming, Minnesota, Iowa, Missouri, Oregon, Kansas
2. HMA mix properties - Alabama*, Georgia*, North Carolina*, West Virginia*, Ohio, New York*, Kentucky, Nebraska
3. PCC - Florida*, Utah
4. PCC pavement - Missouri, Oregon, Texas, Kansas
5. Aggregate base - Florida*, Arkansas, Missouri
6. Earthwork - Florida, Arkansas

Subgroup A-2-b: Contractor Test Results for Acceptance and No Comparison Required
1. HMA mix properties - Idaho*
2. HMA mat density - Virginia*, New York* (series 70), Nebraska
3. PCC plastic properties - Michigan, Kansas (pavement)

Subgroup A-3-a: Combined Test Results for Acceptance and Comparisons Required
1. HMA mix properties and mat density - New Mexico, California
2. HMA mix properties - Virginia*, Michigan, Arkansas*
3. PCC pavement - Arkansas*
4. Base - New Mexico

Subgroup A-3-b: Combined Test Results for Acceptance and No Comparisons Required
1. HMA mix properties - Pennsylvania

Group B: Only State DOT Testing Required for Acceptance
1. HMA mix properties and mat density - Illinois, Washington
2. HMA mix properties - Montana
3. HMA mat density - Alabama*, Georgia*, Mississippi, Tennessee, Kentucky, Pennsylvania, Ohio, Michigan, Louisiana, New York* (series 50 and 60), Texas, Nevada
4. PCC - Nevada, Montana

* States that responded to the survey.

The combinations of materials and state DOTs in Group A are those where specifications require both contractor and state DOT sampling and testing. Subgroup A-1-b represents the traditional separation of contractor (quality control) and state DOT (acceptance) responsibilities. Subgroups A-2-a and A-3-a represent typical use of contractor-performed test results for acceptance when they are verified. Subgroups A-2-b and A-3-b indicate use of contractor-performed test results without verification, but in all cases other properties are involved in final acceptance.

The combinations of materials and state DOTs in Group B are those where specifications require only state DOT sampling and testing for acceptance. Certainly there are many other combinations with state DOT-dominated quality control and acceptance procedures that could have been included. However, as noted previously, the combinations included in Group B are those with basically end-result specifications that use statistical principles for acceptance. HMAC mat density is included in Group B for twelve states. For these same states, mix properties are included in Group A, where both contractor and state DOT-performed tests are required.

The complexity of Table 1 is indicative of the diversity in the use of contractor-performed test results in the quality assurance process. In the following sections, details of verification and acceptance procedures for HMAC and PCC are presented and discussed.

Procedures for Hot Mix Asphalt Concrete

The use of contractor-performed test results in the quality assurance process is most widespread for HMAC. This was apparent from the review of specifications and is also reported by Hughes (9).

It reflects considerable movement from method to end-result specifications with increased contractor involvement. Contractors are given responsibility for mix design, contractors are required to develop and implement quality control programs, and, ultimately, contractor-performed test results are used for acceptance.

The properties for HMAC may be separated into material and (in-place) mat properties. Material properties include gradation, asphalt content, voids in total mix (VTM), voids in mineral aggregate (VMA), and, occasionally, voids filled with asphalt (VFA), tensile strength ratio (TSR), moisture content, stability, and dust-to-asphalt ratio. Some material properties are used for process control, and some are used for both process control and acceptance. In-place mat properties include in-place density, smoothness (surface courses), and, occasionally, layer thickness. In-place density is always used as both a control and an acceptance property and was the only mat property considered for analysis. Smoothness and layer thickness were not considered for analysis because only one set of measurements is normally made, by either the contractor or the state DOT. For smoothness, different tests are occasionally used for control and acceptance; for example, straightedge measurements are made for quality control while profilograph (PI) or inertial profiler (IRI) measurements are made for acceptance. Cores are measured for layer thickness, and two sets of length measurements with a ruler or a caliper would be somewhat redundant.

Table 2 summarizes details of procedures for HMAC for the 37 state DOT specifications reviewed. The presentation is in the form of questions and responses based on the procedures.

The diversity encountered required, on occasion, creative reasoning to categorize some procedure details.

Responses to question 1 indicate that, when required, contractor-performed test results are more likely than not used in the acceptance process. The responses also indicate that contractor-performed tests of mix properties are used more than contractor-performed tests of mat density. Among the state DOTs responding to the survey, nine use contractor-performed test results for mix acceptance but their own test results for mat density acceptance. Final pay adjustments are most often based on combinations of pay adjustments for mix properties and mat density. In 11 states, contractor test results are used only for quality control.

Questions 2 and 3 provide insight into sampling and testing frequencies. For a majority of situations, the ratio of contractor to state DOT testing frequencies is 4 to 1 or less, with 4 to 1, by far, the most common.

Questions 4 through 9 are applicable when contractor test results are used for acceptance. In 16 of the 23 states that use contractor-performed test results for acceptance, verification is achieved by one-to-one comparisons. This verification procedure has been referred to as "statistically weak." Its widespread use is believed to be due to consideration of practical and reasonable numbers of tests and LOT sizes for acceptance.
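As a concrete illustration of the one-to-one comparison just described, the sketch below checks a single state DOT verification test against the paired contractor test. The property names and allowable differences are hypothetical; actual tolerances vary by state and by property.

```python
# Hypothetical one-to-one verification check. The properties and allowable
# differences below are illustrative only; each state DOT specifies its own.
ALLOWABLE_DIFFERENCE = {
    "asphalt_content": 0.3,  # percentage points (hypothetical)
    "air_voids": 1.0,        # percentage points (hypothetical)
    "mat_density": 2.0,      # percent of maximum density (hypothetical)
}

def one_to_one_verified(prop, dot_result, contractor_result):
    """A single state DOT test is compared with the paired contractor test;
    verification passes if the difference is within the allowable tolerance."""
    return abs(dot_result - contractor_result) <= ALLOWABLE_DIFFERENCE[prop]
```

The statistical weakness is apparent: a single pair of results says little about whether the two parties' testing processes have the same mean and variability.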

Table 2. Practices When Using Contractor-Performed Tests for HMAC (note 1)

1. Are contractor-performed test results used in the acceptance process?
Yes - 23 (note 2); No - 13 (note 3)

2. What are the ratios of contractor to state DOT testing frequencies for mix properties?
4 to 1 or less - 21; 5 to 1 through 9 to 1 - 3; 10 to 1 or greater - 6; Other - 2

3. What are the ratios of contractor to state DOT testing frequencies for mat density?
4 to 1 or less - 10; 5 to 1 through 9 to 1 - 5; 10 to 1 or greater - 3; Other - 1

4. What are the methods for verifying (comparing with state DOT) contractor-performed test results when used in the acceptance process?
1 state DOT to 1 contractor - 16; 1 state DOT to contractor average - 1; F and t tests - 4; Other - 2

5. What acceptance method is used?
Pass/fail - 2 (note 4); Pay adjustment - 22 (note 4)

6. What are LOT sizes for acceptance?
Day's production - 9; Project production - 2; Tonnage (>1000 tons) - 9; Other - 3

7. How are contractor-performed test results used in making acceptance decisions?
Alone - 18 (note 2); Combined with state DOT - 5

8. How are contractor-performed test results related to specification requirements for making acceptance decisions?
Deviation from target - 10; Absolute deviation from target - 4; Percent within limits - 9

9. How are pay adjustments for individual properties used to determine LOT pay adjustments (note 5)?
Lowest - 8 (note 4); Weighted average - 11; Cumulative - 4

Notes:
1. Review of Maryland DOT specifications is not included.
2. In Alabama, Georgia, Kentucky, West Virginia, Pennsylvania, Ohio, New York, Michigan, and Idaho, contractor-performed test results are used for mix properties, but DOT-performed test results are used for mat density.
3. Includes Illinois and Washington even though no contractor sampling and testing is required.
4. The Iowa DOT has a pass/fail system for mix properties and adjusts pay for mat density.
5. Surface smoothness is a commonly used acceptance property, but it is almost always applied independently of mix and mat properties and is almost always based on one set of measurements, i.e., state DOT or contractor.

An often expressed justification for using contractor-performed test results for acceptance is shrinking state DOT work forces for construction management, which limits state DOT testing to frequencies that are practical. From a contractor's perspective, timely acceptance decisions should reasonably be expected. These factors limit the number of state DOT tests available for verification.

Four state DOTs use the more statistically robust F and t tests. This method, however, is not without application problems: minimum sample sizes are required before valid comparisons can be made, but contractors want timely acceptance decisions. Table 3 illustrates how these four state DOTs use F and t tests to verify contractor-performed tests. The most common method to overcome the problem of minimum sample size is to designate the entire project production as a LOT for acceptance. Table 3 also illustrates differences in what constitutes comparable test results, i.e., mean only or mean and variability. The significance level used to determine statistically significant differences in all four states is 1%.

When contractor-performed test results are used in the acceptance process, they are most likely used for the computation of pay adjustments to bid prices. A pass/fail procedure is used in only two states. Included in the 22 procedures where pay adjustments are computed are several that compute pay reductions as a last resort in what is basically a pass/fail system. These acceptance procedures are somewhat flexible and contain remnants of method specifications. Both contractor and state DOT sampling and testing are required, and re-sampling and retesting may be allowed. Unfavorable comparisons or unacceptable test results first lead to investigations or additional sampling and testing before definitive decisions are made regarding acceptance and pay factor determination.

Table 3. Differences in Application Procedures for F and t Tests

State | LOT Size | Testing Procedure | Verification Procedure
Kansas | Day (mat density); 3000 tons (mix) | F and t for last 5 LOTs | F to decide t or modified t; t indicates same mean
New Mexico | Total project production | F and t cumulatively | F and t indicate same variance (s²) and mean
California | Total project production | t cumulatively | Equal variances assumed; t indicates same mean
Idaho | Day's production | F and t cumulatively | F to decide t or modified t; t indicates same mean
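The following is a minimal sketch, in Python with SciPy, of the F-then-t logic summarized in Table 3; it is an illustration under stated assumptions, not any state's official procedure. The function name and the require_equal_var switch are ours.

```python
# A minimal sketch of F- and t-test verification as outlined in Table 3:
# the F-test on variances decides whether the pooled t-test or the modified
# (Welch) t-test is applied to the means. All four states use a 1% level.
import numpy as np
from scipy import stats

ALPHA = 0.01  # significance level used by all four states

def f_and_t_verified(contractor, dot, alpha=ALPHA, require_equal_var=False):
    contractor = np.asarray(contractor, dtype=float)
    dot = np.asarray(dot, dtype=float)

    # Two-sided F-test: larger sample variance goes in the numerator.
    v_c, v_d = contractor.var(ddof=1), dot.var(ddof=1)
    if v_c >= v_d:
        f, dfn, dfd = v_c / v_d, contractor.size - 1, dot.size - 1
    else:
        f, dfn, dfd = v_d / v_c, dot.size - 1, contractor.size - 1
    p_f = 2.0 * stats.f.sf(f, dfn, dfd)
    equal_var = p_f >= alpha  # variances not significantly different

    # The F outcome selects the pooled t or the modified (Welch) t.
    _, p_t = stats.ttest_ind(contractor, dot, equal_var=equal_var)
    same_mean = p_t >= alpha

    # A New Mexico-style procedure would also require comparable variances.
    return same_mean and (equal_var or not require_equal_var)
```

With require_equal_var=True the check mirrors procedures that demand both the same variance and the same mean; with the default, only the means must agree.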

On the other hand, some specifications have more rigid acceptance procedures. In these there is a stronger commitment to a statistically based procedure and significant movement toward end results. These acceptance procedures are characterized by definite sampling and testing requirements, specific verification procedures (if contractor-performed tests are used for acceptance), and definite consequences based on comparisons of test results and comparisons of test results with acceptance criteria.

LOT sizes for acceptance are most often a day's production (9 states) or a discrete tonnage greater than 1000 tons (9 states). Two states designate the entire project production of a material or product as a LOT for acceptance. These are western states that use the F and t tests for verification; the larger quantities eliminate the impediment of minimum required numbers of test results, as noted above.

Questions 7 through 9 relate to the computation and application of pay adjustments. Eighteen of the 23 states use verified contractor-performed test results alone to compute pay adjustments. Five states combine their test results with contractor test results. To compute pay adjustments, ten states compare deviations from target values with numerical criteria. Deviations may be positive or negative. Four states use similar procedures, but the basis is absolute deviation from target values. Nine states use the percent within limits (PWL) method or some derivative. PWL methods use LOT means and standard deviations to compute pay adjustments.

Finally, pay adjustments for individual properties are used to compute LOT pay adjustments. The most common method (11 states) is a weighted average LOT pay adjustment computed with the individual property pay adjustments.
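The sketch below illustrates the mechanics of a PWL estimate from a LOT's mean and standard deviation, and the three ways of combining property pay factors just described. The normal approximation to PWL and the pay equation PF = 55 + 0.5·PWL are common textbook forms used here only for illustration; actual limits and pay schedules differ by state, and many agencies use beta-distribution-based PWL tables for small sample sizes.

```python
# Illustrative PWL and LOT pay adjustment computation; limits and the pay
# equation are assumptions for illustration, not any state's schedule.
from math import erf, sqrt
from statistics import mean, stdev

def pwl(results, lsl, usl):
    """Estimate percent within limits from the LOT mean and standard deviation."""
    x_bar, s = mean(results), stdev(results)
    q_low = (x_bar - lsl) / s  # quality index against the lower limit
    q_up = (usl - x_bar) / s   # quality index against the upper limit
    phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF
    return 100.0 * (phi(q_low) + phi(q_up) - 1.0)

def pay_factor(pwl_value):
    """Common linear form: 100% pay at PWL = 90, bonus above, reduction below."""
    return 55.0 + 0.5 * pwl_value

def lot_pay(property_pay, weights=None, method="weighted_average"):
    """Combine property pay factors into a LOT pay factor, three ways."""
    if method == "weighted_average":  # most common (11 states)
        w = weights or {p: 1.0 / len(property_pay) for p in property_pay}
        return sum(property_pay[p] * w[p] for p in property_pay)
    if method == "lowest":            # applied in 8 states
        return min(property_pay.values())
    if method == "cumulative":        # applied in 4 states
        return 100.0 + sum(pf - 100.0 for pf in property_pay.values())
    raise ValueError(method)
```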

The lowest property pay adjustment is applied in 8 states, and cumulative adjustments for several properties are applied in 4 states. Smoothness is a mat property that is commonly included for asphalt concrete surface courses. The pay adjustments for smoothness are, however, most often treated independently of material properties and mat density.

Procedures for Portland Cement Concrete

The shift in the responsibility for acceptance testing from state DOTs to contractors is not as advanced or pronounced for PCC products as for hot-mix asphalt concrete. Several factors may explain the relatively sparse use of contractor-performed tests for acceptance purposes. One is the fact that PCC is typically not produced by the contractor but rather by a producer or supplier who sells the product to the contractor. A second factor is that a wide variety of PCC applications exist (while HMA is used solely for pavements); lot sizes, testing frequencies, and the like typically vary among these applications, further complicating change from the traditional division of sampling and testing responsibilities. Finally, the most important property of PCC is strength, which can be determined only after some curing time (usually 28 days); this makes timely final acceptance decisions difficult.

Properties of PCC can be grouped into two categories: plastic properties and hardened, or in-place, properties. Some states that require contractor testing for acceptance have done so only for in-place properties (compressive strength, flexural strength, and pavement thickness), while other states have done so with both hardened and plastic properties (slump and air content).

Pavement smoothness is not considered in this analysis since typically only one measurement is made, rather than both the contractor and the agency measuring smoothness and then comparing the findings. In some states, the contractor is responsible for collecting the data, while the state DOT then reviews and interprets the results.

Table 4 presents a summary of how contractor-performed tests are used for acceptance and pay adjustments; there is little consensus among the states using contractor-performed tests. Item 1 in the table clearly shows that most state DOTs still control acceptance testing; only eight of those whose specifications were reviewed use contractor-performed tests for acceptance. In many other states, the contractor carries no testing responsibilities whatsoever. The remaining items in Table 4 apply only to the eight states noted in Item 1 (Arkansas, Colorado, Florida, Kansas, Missouri, Texas, Utah, and West Virginia).

It can be seen in Item 2 that a wide variety of approaches are used to compare results of tests conducted by contractors and state DOTs. At one extreme, one state uses the statistical approach of t-tests and F-tests to compare means and variances of test results from each party. While this approach may be more conceptually complex than others, it also maximizes the probability that the product is characterized correctly by the test results. However, this approach cannot be applied until some minimum sample size of test results is available (most commonly three test results per party, as found in the review of hot-mix asphalt acceptance procedures).

Table 4. Practices When Using Contractor-Performed Tests for PCC

1. Are contractor-performed tests used in the acceptance process? (note 1)
Yes: 8; No: 29

2. What methods are used for verification (comparison) of contractor-performed test results when used in the acceptance process? (note 2)
One DOT to one contractor: 2; DOT average to contractor average: 1; F- and t-tests: 1; One DOT to contractor average: 4; No direct comparison made: 1

3. What is the methodology of the acceptance process? (note 3)
Pass/fail: 2; Pay adjustment: 7

4. What are lot sizes for acceptance? (note 4)
Project production: 2; One day's production: 4; Area/volume: 2

5. How are contractor-performed test results used in the acceptance process?
Alone: 4; Combined with DOT: 4

6. How are contractor-performed test results related to specification requirements for making acceptance decisions?
Deviation from target value: 1; Variability/standard deviation: 2; Percent within limits: 2; As yet undetermined: 3

7. How are pay adjustments for individual properties used to determine lot pay adjustments?
Weighted average: 5; Other: 2

Notes:
1. The eight states (22% of those studied) are Arkansas, Colorado, Florida, Kansas, Missouri, Texas, Utah, and West Virginia. One of these states (Colorado) has two acceptance processes for PCC products: "flexural strength criteria," in which contractor-performed tests are used for acceptance, and "compressive strength criteria," in which only agency-performed tests are used for acceptance.
2. One state (Florida) compares an individual agency test result to an individual contractor-performed test result for slump, air content, compressive strength, and flexural strength, and compares the average of agency test results to the average of contractor test results for both strength properties.
3. One state (Texas) uses a pass/fail approach to acceptance for slump, air content, and flexural strength but applies a pay adjustment based on the results of pavement thickness tests.
4. One state (Colorado) that uses project production as the lot size allows a new lot to begin when a "process change" occurs. Process changes include changes in mix design, material source, design pavement thickness, or construction method. The other state (West Virginia) that uses project production as the lot size applies this definition to each lane of pavement (each lane constitutes a lot).

At the other extreme, two states use the simple and relatively inexpensive, but statistically weak, approach of comparing one state DOT result to the corresponding contractor test result. This approach, while appealing from a resource perspective, may not effectively compare contractor and state DOT tests. Other comparison procedures include comparing one state DOT-performed test to an average of contractor-performed test results and comparing averages of test results from both parties. Interestingly, although four states use this approach, they do not take advantage of the fact that such a data set can also be the basis for statistical testing.

Regarding acceptance and pay adjustment processes, all but one of the eight state DOTs that use contractor-performed tests for acceptance apply pay adjustments based on some or all of the properties tested, rather than simply accepting products on a pass/fail basis. While all of these states use contractor-generated results in the acceptance and pay adjustment processes, half of them use contractor data alone, while half use contractor and DOT-generated results in some combination.

For acceptance purposes, LOT size varies widely. Two states use the entire project quantity as the basis for acceptance and pay adjustment; two other states use an area- or volume-based quantity. Four states define a LOT as one day of production (or smaller subsets if production exceeds a specified amount). Pay adjustments are typically applied using weighted averages of the various factors for which pay adjustments are possible, except in two states, in one of which only one property is used for pay adjustment purposes.

A variety of practices are used to relate the results of contractor-performed tests to specification requirements. Two state DOTs use the percent within limits concept.

One state uses deviations from specified targets, while two others use measures of variability. The processes used in three states were not evident from the specifications review.

Confidence in Contractor-Performed Tests

The limited survey described earlier asked state DOTs how confident they were that contractor-performed tests provided the same measure of material quality as their own tests. The state DOTs were asked to rank their level of confidence from 5 ("confident") to 1 ("not confident"). The composite ranking was 4.1 for HMAC (12 state DOTs) and 4.8 for PCC (10 state DOTs). These rankings are contrary to the results of surveys reported by Hancher et al. (3), where the viability of contractor-performed tests was of primary concern to state DOTs. However, in the surveys reported by Hancher et al. (3), both state DOTs and contractors indicated that the major advantage of contractor-performed quality control was contractor responsibility for their products. This implies that the concern of state DOTs is related to the use of contractor-performed tests for acceptance.

A second question asked state DOTs how satisfied they were with their quality assurance programs. The composite ranking, on a scale of 1 to 5, was 3.8 for HMAC (12 state DOTs) and 4.0 for PCC (10 state DOTs). These are not as high as the rankings related to confidence in test results but do indicate a relatively high satisfaction level.

COMPARISONS OF CONTRACTOR AND STATE DOT-PERFORMED TESTS

Several studies that compared contractor and state DOT-performed test results were found in the literature. These studies include statistical comparisons of means and variability of contractor and state DOT test results.

In addition, indications of possible bias were examined. Several of the studies (17-21) were of data collected during the development and implementation of statistically based quality assurance procedures for HMAC by the Alabama DOT. A recently published study (3) compared contractor and Kentucky Transportation Cabinet test results for HMAC and for both paving and structural PCC. An unpublished study analyzed data collected during the trial implementation of a statistically based quality assurance procedure for structural PCC by the Alabama DOT.

Alabama DOT Hot Mix Asphalt Concrete Tests

Parker and Hossain (12) compared asphalt content and air voids measurements for HMAC (Marshall mix design) collected during implementation of a statistically based quality assurance procedure by the Alabama DOT. Table 5 contains the results of statistical comparisons (5% significance level) of asphalt content measurements for three mix types and combined data from three construction seasons. The variable used in the analyses was the difference between measured and target values (Δ = X − XT). Data used in the comparisons are illustrated in Figures 3 and 4.
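The sketch below mirrors the general approach of these studies: deviations from target are formed for each party's measurements, and variability and mean deviation are compared at the 5% significance level. The SciPy-based function is our illustration of the method, not the authors' code.

```python
# Sketch of the study methodology: form deviations from target
# (delta = X - X_T) for each party, then compare variability (F-test)
# and mean deviation (t-test) at the 5% significance level.
import numpy as np
from scipy import stats

def compare_deviations(contractor_x, dot_x, target, alpha=0.05):
    d_c = np.asarray(contractor_x, dtype=float) - target
    d_d = np.asarray(dot_x, dtype=float) - target

    # Variability: two-sided F-test on the sample variances.
    v_c, v_d = d_c.var(ddof=1), d_d.var(ddof=1)
    f = max(v_c, v_d) / min(v_c, v_d)
    dfn = (d_c.size if v_c >= v_d else d_d.size) - 1
    dfd = (d_d.size if v_c >= v_d else d_c.size) - 1
    var_differs = 2.0 * stats.f.sf(f, dfn, dfd) < alpha

    # Mean deviation: two-sample t-test on the deviations.
    mean_differs = stats.ttest_ind(d_c, d_d).pvalue < alpha
    return {"variability_differs": var_differs,
            "mean_deviation_differs": mean_differs,
            "higher_variability": "Contractor" if v_c > v_d else "DOT"}
```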

Table 5. Summary of Statistical Analyses of Differences Between AHD and Contractor Asphalt Content for Combined Mix Data (Ref. 12)

Year | Mix Type | Significantly Different Variability | Higher Variability | Significantly Different Mean Deviation | Higher Mean Deviation
1992 | 414 | Yes | AHD | No | …
1992 | 416 | Yes | AHD | Yes | AHD
1992 | 417 | No | … | No | …
1992 | Combined | Yes | AHD | Yes | AHD
1991 | 414 | No | … | Yes | Contractor
1991 | 416 | No | … | Yes | AHD
1991 | 417 | No | … | No | …
1991 | Combined | No | … | Yes | Contractor
1990 | 414 | Yes | AHD | No | …
1990 | 416 | No | … | No | …
1990 | 417 | No | … | Yes | Contractor
1990 | Combined | Yes | AHD | Yes | AHD

AHD = Alabama Highway Department (now Alabama Department of Transportation)

Figure 3. Summary of AHD and Contractor Asphalt Content Standard Deviation (Ref. 12)

Figure 4. Summary of AHD and Contractor Asphalt Content Mean Deviation (Ref. 12)

The statistical comparisons provide no strong indications of significant differences or similarities between the means or variabilities of contractor and Alabama DOT asphalt content measurements. The statistical comparisons and Figure 3 do, however, show that the variability of Alabama DOT asphalt content measurements is likely larger than the variability of contractor measurements.

Table 6 and Figures 5 and 6 show results of similar analyses for air voids. The statistical comparisons in Table 6 and Figure 6 suggest no significant differences in the means of contractor and Alabama DOT air voids measurements. As to variability, the statistical comparisons provide no strong indication of differences or similarities. However, similar to asphalt content, the statistical comparisons and Figure 5 suggest that the variability of Alabama DOT air voids measurements is likely larger than the variability of contractor measurements.

Parker and Hossain (13) compared mat density measurements (made with a nuclear gauge) collected during the above-mentioned implementation by the Alabama DOT.

Table 7 contains the results of statistical comparisons (5% significance level). No results are shown for 1991 because during that construction season the Alabama DOT measured mat density only with cores. Data used in the comparisons are depicted graphically in Figures 7 and 8. The statistical comparisons in Table 7 indicate significant differences in the means and variability of contractor and Alabama DOT measurements. Larger variability and larger deviations from target density (94% of theoretical maximum mix density) for Alabama DOT measurements are illustrated in Figures 7 and 8. Figure 7 also strongly indicates that measured mat densities are consistently lower than the target mat density (Δ = X − 94). It should be noted that the 94% target is more rigorous than the more common target of 92%. Implications of the consistent inability to achieve target compaction are meaningful for the acceptance process.

Table 6. Summary of Statistical Analyses of Differences Between AHD and Contractor Air Void Content for Combined Mix Data (Ref. 12)

Year | Mix Type | Significantly Different Variability | Higher Variability | Significantly Different Mean Deviation | Higher Mean Deviation
1992 | 414 | No | … | No | …
1992 | 416 | Yes | AHD | No | …
1992 | 417 | No | … | No | …
1992 | Combined | Yes | AHD | No | …
1991 | 414 | Yes | AHD | No | …
1991 | 416 | Yes | AHD | No | …
1991 | 417 | No | … | No | …
1991 | Combined | Yes | AHD | No | …
1990 | 414 | Yes | AHD | No | …
1990 | 416 | No | … | No | …
1990 | 417 | Yes | Contractor | Yes | AHD
1990 | Combined | No | … | No | …

Figure 5. Summary of AHD and Contractor Air Void Content Standard Deviation (Ref. 12)

Figure 6. Summary of AHD and Contractor Air Void Content Mean Deviation (Ref. 12)

Table 7. Summary of Statistical Analyses of Differences Between AHD and Contractor Mat Density (Nuclear Gage) Measurements for Combined Mix Data (Ref. 13)

Year | Mix Type | Significantly Different Variability | Numerically Higher Variability | Significantly Different Mean Deviation | Numerically Higher Mean Deviation
1993 | 414 | Yes | AHD | Yes | AHD
1993 | 416 | Yes | AHD | Yes | AHD
1993 | 417 | Yes | AHD | Yes | …
1993 | Combined | Yes | AHD | Yes | AHD
1992 | 414 | Yes | AHD | Yes | AHD
1992 | 416 | Yes | AHD | Yes | AHD
1992 | 417 | Yes | AHD | No | …
1992 | Combined | Yes | AHD | Yes | AHD
1990 | 414 | Yes | AHD | Yes | AHD
1990 | 416 | Yes | AHD | Yes | AHD
1990 | 417 | Yes | Contractor | Yes | AHD
1990 | Combined | Yes | AHD | Yes | AHD

Figure 7. Summary of Nuclear Gage Mat Density Measurement Mean Deviation From Target (Ref. 13)

Figure 8. Summary of Nuclear Gage Mat Density Measurement Variability (Ref. 13)

In the Alabama DOT system, the lowest pay adjustment among asphalt content, air void content, and mat density was applied to each LOT. This means that mat density was, in most cases, the most critical property for pay factor computation. Beginning in 2003, acceptance and pay factor computation for mat density were based on Alabama DOT tests on cores. Contractors are still required to test mat density with nuclear gages, but the results are for their process (quality) control only.

The Alabama DOT began implementing the Superpave mix design system for HMAC in the mid-1990s. Data were collected from 1997 to 2000 to determine quality assurance modifications for Superpave-designed mixes. Parker and Hossain (14) compared contractor and Alabama DOT measurements of asphalt content, air void content, and mat density collected to support these modifications.

Table 8 contains results from statistical comparisons (5% significance level) of the means of deviations from target values for asphalt content, air void content, and mat density. Differences for asphalt content are not significant, but differences for mat density are significant. Results are mixed for air void content. In terms of numerical differences, there are no strong indications that either contractor or Alabama DOT asphalt content measurements are closer to targets. However, for air voids and mat density, contractor measurements are consistently closer to targets.

Table 8. Comparison of Deviation from Target Values Between Contractor and Alabama DOT Measurements (Ref. 14)

Average Deviation from Target

Measured Property (Analysis Variable) | Year | DOT | Contractor | Statistical Difference @ 5% Level
Asphalt (AC − Target, JMF) | 1997 | −0.130 | −0.087 | S.D.
Asphalt | 1998 | −0.006 | −0.028 | N.S.D.
Asphalt | 1999 | −0.069 | −0.048 | N.S.D.
Asphalt | 2000 | +0.001 | +0.006 | N.S.D.
Asphalt | Combined | −0.045 | −0.036 | N.S.D.
Air Voids (Voids − 4%) | 1997 | −0.074 | −0.075 | N.S.D.
Air Voids | 1998 | −0.256 | −0.229 | N.S.D.
Air Voids | 1999 | −0.477 | −0.351 | S.D.
Air Voids | 2000 | −0.437 | −0.371 | N.S.D.
Air Voids | Combined | −0.357 | −0.281 | S.D.
Mat Density (% of TMD − 94%) | 1997 | −1.772 | −1.763 | N.S.D.
Mat Density | 1998 | −1.700 | −1.427 | S.D.
Mat Density | 1999 | −1.097 | −0.875 | S.D.
Mat Density | 2000 | −0.981 | −0.689 | S.D.
Mat Density | Combined | −1.245 | −0.997 | S.D.

Note: S.D. = significantly different; N.S.D. = not significantly different; JMF = job mix formula.

Of interest in Table 8 is the observation that both air void content and mat density are consistently less than target values. The low air void contents are inconsistent with asphalt contents that are close to design values. The difficulty in achieving mat compaction is consistent with the initially high Superpave mix design compaction levels and coarse gradations, which resulted in very harsh mixes. The Alabama DOT recently reduced mix design compaction levels to increase asphalt content and changed recommended gradations to produce more workable mixes.

Table 9 contains results from statistical comparisons of variability. There are strong indications that the variability of contractor asphalt content, air void content, and mat density measurements is significantly smaller than that of Alabama DOT measurements.

HMAC was the first, and to date the only, construction material managed by the Alabama DOT with a statistically based quality assurance procedure. The following analysis of data collected during the implementation of this procedure illustrates how product quality, as quantified by reduced variability and closer proximity to targets for measured properties, improved and stabilized with implementation. There is nothing particularly unusual about Alabama DOT procedures, and it might be argued that if certain details of the procedure had been different, product quality might have been even better. That argument can never be settled, but the noted improvements in quality during implementation are irrefutable.

Table 9. Comparison of Variability of Contractor and Alabama DOT Measurements (Ref. 14)

Measured Property            Year       Variability (Standard Deviation)   Statistical Difference
(Analysis Variable)                     DOT          Contractor            @ 5% Level
Asphalt                      1997       0.265        0.239                 S.D.
(AC - Target, JMF)           1998       0.237        0.197                 S.D.
                             1999       0.274        0.247                 S.D.
                             2000       0.288        0.219                 S.D.
                             Combined   0.272        0.230                 S.D.
Air Voids                    1997       1.054        0.989                 N.S.D.
(Voids - 4%)                 1998       1.019        0.791                 S.D.
                             1999       1.014        0.847                 S.D.
                             2000       0.992        0.840                 S.D.
                             Combined   1.025        0.863                 S.D.
Mat Density                  1997       1.493        1.475                 N.S.D.
(Density as % of TMD - 94%)  1998       1.742        1.406                 S.D.
                             1999       1.276        0.991                 S.D.
                             2000       1.382        0.975                 S.D.
                             Combined   1.470        1.175                 S.D.
Note: S.D. = significantly different; N.S.D. = not significantly different; JMF = job mix formula.

ALDOT began implementing a statistically based quality assurance program for Marshall-designed HMAC in 1990 and completed the process in 1994. Analyses of the data collected during implementation were described above (12 and 13), with additional data and analyses in Reference 15. A chronology of the implementation is as follows:

1990 - Model specification for four trial projects. Pay adjustments computed but not applied.

1991 - Modified specifications for 11 trial projects. Pay adjustments computed and applied at 50% of the computed values.
1992 - Modified specifications for all projects. Pay adjustments computed and applied at full value.
1993 - Final specifications for all projects. Pay adjustments computed and applied at full value.

Figures 9 through 11 illustrate how variability and proximity to target values for the measured properties changed during implementation. The standard deviations of asphalt content and voids content decreased from 1990 to 1991 and stabilized at about 0.2% for asphalt content and 0.6 to 0.7% for voids content. The trend for mat density variability was somewhat different, with a fairly uniform decrease in standard deviation over the entire implementation period. Among the possible reasons for the observed decreases in variability are better process control and improvements in technician sampling and testing skills. It is also likely that the application of pay adjustments, beginning in 1991, was an important factor in the considerable improvements in asphalt and voids content variability between 1990 and 1991. A trend common to all three properties is that the standard deviations of contractor test results were consistently smaller than the standard deviations of ALDOT test results.
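The phase-in described in the chronology, with pay adjustments first computed but withheld, then applied at 50%, then at full value, amounts to scaling the computed adjustment by a year-dependent factor. The sketch below illustrates only that scaling; the adjustment amount in the example is hypothetical.

```python
# Sketch of the phased application of pay adjustments described in the
# chronology. The computed adjustment amount below is hypothetical.

PHASE_IN = {1990: 0.0, 1991: 0.5, 1992: 1.0, 1993: 1.0}

def applied_adjustment(year, computed_adjustment):
    """Scale a computed pay adjustment by the phase-in factor for that year."""
    return PHASE_IN[year] * computed_adjustment

print(applied_adjustment(1991, -2500.0))  # -1250.0: applied at 50% in 1991
```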

Figure 9. Asphalt Content Statistics During Quality Assurance Implementation (panels show standard deviation, %, and mean deviation from target, %, by year, 1989-2000, for ALDOT, contractor, and combined measurements; Marshall and Superpave mix periods)

Figure 10. Voids Content Statistics During Quality Assurance Implementation (panels show standard deviation, %, and mean deviation from target, %, by year, 1989-2000, for ALDOT, contractor, and combined measurements; Marshall and Superpave mix periods)

Figure 11. Mat Density (%Gmm) Statistics During Quality Assurance Implementation (panels show standard deviation, %, and mean deviation from target, %, by year, 1989-2000, for ALDOT, contractor, and combined measurements; Marshall and Superpave mix periods)

Figure 9 indicates considerable improvement in achieving target asphalt contents between 1990 and 1991, with stabilization close to target values for the remainder of the implementation period. (Deviations from targets are defined as measured values minus target values.) The proximity of measured voids content to the 4% target (Figure 10) does not stabilize after 1991 but continues to improve rather uniformly. By 1993, when implementation was complete, measured air void contents were very close to the 4% target. The proximity of measured mat density to the 94% target (Figure 11) improves rather uniformly over the entire implementation period. However, unlike air void contents, which were very close to the 4% target in 1993, mat density measurements were still 0.5 to 0.7% below the 94% target. The consistent decrease in air void content and increase in mat density are inconsistent with the general decrease in asphalt content: air void content should increase and mat density should decrease as asphalt content decreases. For mat density, a plausible explanation for the observed increase is greater compactive effort. There is, however, no obvious physical explanation for the uniform, though small, decrease in voids content over the entire implementation period. For asphalt and air void contents, there are no consistent differences between the proximity of contractor and Alabama DOT measurements to target values. For mat density, however, contractor test results were consistently closer to target values. As noted above, the variability of contractor test results for all three properties was consistently smaller.

Beginning in 1997, the Alabama DOT conducted a study to determine whether changes to quality assurance procedures would be needed for HMAC designed with the Superpave method (16). Analyses of data collected during this study were described above (14). The chronology of the modification of the quality assurance procedures is as follows:

1997 - Data collected and analyzed for nine trial Superpave projects and nine comparable Marshall-designed projects.
1998 - Data collected and analyzed for 20 Superpave projects.
1999 - Modified specifications for 27 Superpave projects. Pay adjustments computed and applied at 50% of the computed values.
2000 - Final specifications for all Superpave projects. Pay adjustments computed and applied at full value.

The standard deviations of both asphalt content (Figure 9) and voids content (Figure 10) remained relatively constant during the 4-year implementation period. Both were somewhat larger than the standard deviations for Marshall-designed mixes. The larger standard deviations for Superpave air void contents are consistent with the larger standard deviations for asphalt content. Standard deviations and means of combined Alabama DOT and contractor tests for the 1997 Marshall mixes are shown in Figures 9 through 11. The standard deviation of these asphalt content measurements was larger than for the 1991-1993 Marshall measurements and about the same as the standard deviations for the comparable 1997 Superpave projects. The standard deviation of these air void content measurements

was essentially the same as for the 1991-1993 Marshall measurements and smaller than the standard deviations for the comparable 1997 Superpave projects. The variability of mat density measurements (Figure 11) for the nine Superpave projects in 1997 was higher than the variability for the nine comparable Marshall projects. However, the standard deviation of Superpave mat density measurements decreased during the 4-year implementation period to levels comparable to those achieved for Marshall mixes in 1993. As with the Marshall mixes, standard deviations of contractor-measured properties for Superpave mixes were consistently smaller than the standard deviations of Alabama DOT-measured properties. The proximity of asphalt content (Figure 9) and mat density (Figure 11) measurements to target values improved during the 4-year implementation period. Conversely, the proximity of air void content measurements (Figure 10) to the 4% target worsened, with the average air void content decreasing from about 3.9% in 1997 to about 3.6% in 2000. For both Marshall and Superpave mixes, asphalt contents were very close to target values after the 4-year implementation period. Mat densities for both mix types improved, but both remained below the 94% target value, by about 0.7% for Marshall mixes and 0.9% for Superpave mixes. Air void contents of Marshall mixes were very close to the 4% target after the 4-year implementation period. Air void content of Superpave mixes was the only quality measure that degraded during implementation. A possible reason for this behavior is the contractors' effort to minimize the potential for pay reductions. The general decrease in air void content and the general increase in mat density are consistent with the general increase in asphalt

content. Mat density was the critical property for pay adjustments, and the lower-than-desirable air void content was apparently accepted by contractors in order to achieve higher mat density. As with the Marshall mixes, contractor-measured properties for Superpave mixes generally tended to be closer to target values than Alabama DOT-measured properties, particularly for air void content and mat density. The implementation of statistically based quality assurance procedures for HMAC resulted in progressive improvements in quality that stabilized with time. However, differences remained in the level of quality indicated by contractor-performed and Alabama DOT-performed tests.

Alabama DOT Portland Cement Concrete Tests

Table 10 presents comparisons of tests from an unpublished Alabama DOT study of structural PCC. The data for the comparisons were collected during a bridge construction project that was part of a study to evaluate the feasibility of a statistically based quality assurance procedure for structural PCC. For this pilot project, Alabama DOT testing frequencies were increased and the contractor was required to conduct quality control sampling and testing. Since the contractor had no testing capabilities, a consultant was hired for this purpose. A model specification combining contractor tests with Alabama DOT tests for computing pay factors was developed. Pay factors were computed, but concrete was actually accepted for pours on a pass/fail basis determined with Alabama DOT compressive strength tests.

Table 10. Comparison of QC/QA Data for Substructure PCC

Property              n    Std. Dev. (σ)   σ Diff. @ 5% Level   Mean (x̄)   x̄ Diff. @ 5% Level
QC Slump, in          26   0.87            NSD                  3.88       NSD
QA Slump, in          32   0.82                                 4.01
QC Air, %             26   0.64            NSD                  3.71       NSD
QA Air, %             32   0.58                                 3.58
QC Comp. Str., psi    78   629             NSD                  5902       NSD
QA Comp. Str., psi    99   539                                  5737
Note: Target slump = 4 in; target air = 4%; minimum 28-day compressive strength = 3500 psi.

Comparisons in Table 10 indicate no significant differences in variability or in estimates of target values between contractor and Alabama DOT tests. These comparisons, however, should be viewed in light of the following facts:

• Contractor tests were not used for acceptance, and each pour was accepted based on Alabama DOT tests.
• Slump and entrained air are relatively quick tests and were run side-by-side and simultaneously at ready mix truck discharges.
• Cylinder fabrication is a process offering few, if any, opportunities to improve strength but numerous opportunities to impair strength.
• Separate, but side-by-side, initial (24-hour) curing facilities were provided on site for contractor and Alabama DOT cylinders.

• Concrete was purchased by the contractor from a ready mix supplier.
• Substructure construction required about 5 months for completion and comprised numerous pours.

The first four facts should promote comparable test results. Contractor and Alabama DOT slump and air content test results were compared immediately. After initial 24-hour curing, cylinders were transported to separate facilities for final (28-day) curing and testing, but all other sampling, preparation, and testing conditions were similar. The last two factors are quite different from HMAC. Unlike HMAC, where the contractor is also the producer, bridge contractors normally purchase concrete from local ready mix suppliers. This introduces another organization into the system whose risk assignment must be considered. The motivation for the contractor in this system is the same as that of the DOT in a two-organization system, i.e., to make sure the quality of the concrete purchased from the ready mix supplier is adequate. Unlike HMAC, where production and placement are relatively continuous, bridge substructure PCC placement can be quite sporadic over an extended period of time. This certainly affects the ability of a ready mix supplier to produce a consistent product. Finally, the "usual" method of acceptance leads to a very conservative approach to meeting strength requirements: accepting a "pour" requires that the average 28-day strength of cylinders be greater than a minimum compressive strength, as shown in the sketch below.
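The pass/fail form of this acceptance rule is simple to state in code. The sketch below is a minimal illustration of the "average strength must exceed the minimum" check; the cylinder strengths in the example are hypothetical.

```python
# Sketch of the "usual" pass/fail acceptance check for a pour: the average
# 28-day cylinder strength must exceed the specified minimum. The cylinder
# strengths below are hypothetical.

MIN_STRENGTH_PSI = 3500  # minimum 28-day compressive strength (Table 10)

def accept_pour(cylinder_strengths_psi):
    """Accept the pour if the mean 28-day strength exceeds the minimum."""
    mean_strength = sum(cylinder_strengths_psi) / len(cylinder_strengths_psi)
    return mean_strength > MIN_STRENGTH_PSI

print(accept_pour([5600, 5850, 6100]))  # True: well above the 3500 psi minimum
```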

Table 10 indicates that mean strengths are much larger than the required minimum 28-day compressive strength of 3500 psi. This may seem overly conservative but, as illustrated in Figure 12, it is reasonable in view of the risk inherent in the supply and acceptance systems. Figure 12 shows histograms of the compressive strength data in Table 10. Both histograms show three strengths near the minimum. These six low strengths came from a single pour and show that, considering the consequences of discovering after 28 days that concrete strength is unacceptable, the high mean strengths may indeed be reasonable: removal and replacement of hardened concrete is a costly exercise that contractors go to great lengths to avoid. PCC construction is much less likely than HMAC construction to be controlled with a statistically based quality assurance process, and, as noted above, there are reasons for this circumstance. Slump and entrained air content essentially serve as surrogate acceptance tests; they are relatively quick and easy to run, and sampling and testing are easily observable on site. But the single most important factor appears to be the 28 days required before definitive acceptance data are available. This time lag increases risk and has the potential to severely strain contractor-state DOT relationships. The introduction of the ready mix producer into the system can also complicate the contractor-state DOT relationship.

Figure 12. Compressive Strength Data for Substructure Portland Cement Concrete (a. Contractor Data; b. Alabama DOT Data)

Kentucky Transportation Cabinet Hot Mix Asphalt Concrete and Portland Cement Concrete Tests

Hancher et al. (3) compared contractor and Kentucky Transportation Cabinet measurements of properties of hot-mix asphalt concrete, paving PCC, and structural PCC. The properties compared are asphalt content, air void content, and VMA for HMAC, and air content, slump, and strength for both paving and structural PCC. Table 11 contains results from statistical comparisons (5% significance level). Note that Table 11 was prepared from information extracted from Reference 3; a table in Reference 17, presumably developed from the same data, has some different p-values and, thus, some different comparisons. The variability of Kentucky Transportation Cabinet HMAC air void content and asphalt content tests is significantly larger than the variability of contractor tests. The variabilities of the VMA tests are not significantly different, although the Kentucky Transportation Cabinet variability is larger. The means of the three HMAC properties are not significantly different. For paving concrete, the air content and slump variabilities are significantly different, with the contractor variabilities larger. The means of slump and strength are significantly different, with the Kentucky Transportation Cabinet means larger. It should be noted that the slump means exceed the specification target range of 1½ to 2 inches and may have been switched with the values for structural concrete (Class A target range of 2 to 4 inches).

Table 11. Comparisons Between Kentucky Transportation Cabinet and Contractor Tests

Property                n(KTC)  n(Cont)  s(KTC)  s(Cont)  Diff.  p-Value  Mean(KTC)  Mean(Cont)  Diff.  p-Value
HMA - Air Voids (%)     1827    1818     0.978   0.853    SD     <0.001   4.063      4.086       NSD    0.462
HMA - % Asphalt (%)     3082    3082     0.210   0.152    SD     <0.001   -0.007     -0.007      NSD    0.851
HMA - VMA (%)           422     422      1.037   0.940    NSD    0.083    1.255      1.267       NSD    0.854
PCCP - Air (%)          92      428      0.710   0.920    SD     0.004    5.530      5.500       NSD    0.792
PCCP - Slump (in)       92      428      0.836   1.399    SD     <0.001   3.407      2.669       SD     <0.001
PCCP - Strength (psi)   92      421      876     889      NSD    0.854    5676       5366        SD     0.002
PCCS - Air (%)          67      245      0.788   0.815    NSD    0.480    5.727      5.591       NSD    0.223
PCCS - Slump (in)       67      245      0.542   0.589    NSD    0.680    1.795      1.829       NSD    0.669
PCCS - Strength (psi)   67      242      624     622      NSD    0.766    5926       6032        NSD    0.219
Note: KTC = Kentucky Transportation Cabinet; Cont = contractor; PCCP = paving PCC; PCCS = structural PCC.
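Mean comparisons like those in Table 11 can be approximately reproduced from the summary statistics alone. The sketch below uses SciPy's t-test from summary statistics on the HMA air voids row; it is a plausible reconstruction of the kind of test used, not necessarily the authors' exact procedure.

```python
# Reproducing a Table 11 mean comparison from summary statistics using a
# two-sample t-test (Welch form). Inputs are the HMA air voids row; this is
# a plausible reconstruction, not necessarily the authors' exact procedure.
from scipy import stats

res = stats.ttest_ind_from_stats(
    mean1=4.063, std1=0.978, nobs1=1827,   # KTC
    mean2=4.086, std2=0.853, nobs2=1818,   # contractor
    equal_var=False,
)
print(round(res.pvalue, 3))  # in the neighborhood of the 0.462 reported in Table 11
```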

None of the variabilities or means of the structural concrete properties are significantly different, and there is no particular tendency for either the Kentucky Transportation Cabinet or the contractor statistics to be larger. The slump means seem more consistent with the requirements for paving concrete (1½ to 2 inches). The general trends exhibited by the comparisons of the HMAC and structural PCC tests are similar to the trends for the comparisons of Alabama DOT tests.

SUMMARY

The findings of the review of the state of practice for using contractor-performed tests in quality assurance can be summarized as follows:

● There is great diversity in how state DOTs use contractor-performed tests for quality assurance.
● Contractor-performed tests are most widely used in the quality assurance process for hot-mix asphalt concrete. Other materials, in order of level of use, are PCC (pavements and structures), granular base, and earthwork. The use for earthwork is quite limited.
● When construction with one of the materials listed above is controlled with a quality assurance process as currently defined, the use of contractor-performed tests for process (quality) control is widespread and well accepted.
● There is no consensus on how contractor-performed tests can best be used in the acceptance process.
● Verifying that contractor-performed tests provide the same measure of material quality as state DOT-performed tests is a major impediment to the use of contractor-performed tests in the acceptance process.

● Contractor-performed and state DOT-performed tests may provide different measures of material properties. This is most likely for HMAC properties and least likely for properties of PCC. No comparisons of granular base or earthwork properties were found.
