National Academies Press: OpenBook

Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report (2007)

Chapter: Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests

« Previous: Chapter 2: State of Practice
Page 57
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 57
Page 58
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 58
Page 59
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 59
Page 60
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 60
Page 61
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 61
Page 62
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 62
Page 63
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 63
Page 64
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 64
Page 65
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 65
Page 66
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 66
Page 67
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 67
Page 68
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 68
Page 69
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 69
Page 70
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 70
Page 71
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 71
Page 72
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 72
Page 73
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 73
Page 74
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 74
Page 75
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 75
Page 76
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 76
Page 77
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 77
Page 78
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 78
Page 79
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 79
Page 80
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 80
Page 81
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 81
Page 82
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 82
Page 83
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 83
Page 84
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 84
Page 85
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 85
Page 86
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 86
Page 87
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 87
Page 88
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 88
Page 89
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 89
Page 90
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 90
Page 91
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 91
Page 92
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 92
Page 93
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 93
Page 94
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 94
Page 95
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 95
Page 96
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 96
Page 97
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 97
Page 98
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 98
Page 99
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 99
Page 100
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 100
Page 101
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 101
Page 102
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 102
Page 103
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 103
Page 104
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 104
Page 105
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 105
Page 106
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 106
Page 107
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 107
Page 108
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 108
Page 109
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 109
Page 110
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 110
Page 111
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 111
Page 112
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 112
Page 113
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 113
Page 114
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 114
Page 115
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 115
Page 116
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 116
Page 117
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 117
Page 118
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 118
Page 119
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 119
Page 120
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 120
Page 121
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 121
Page 122
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 122
Page 123
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 123
Page 124
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 124
Page 125
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 125
Page 126
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 126
Page 127
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 127
Page 128
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 128
Page 129
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 129
Page 130
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 130
Page 131
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 131
Page 132
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 132
Page 133
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 133
Page 134
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 134
Page 135
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 135
Page 136
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 136
Page 137
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 137
Page 138
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 138
Page 139
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 139
Page 140
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 140
Page 141
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 141
Page 142
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.
×
Page 142
Page 143
Suggested Citation:"Chapter 3: Comparisons of Contractor-Performed and State DOT-Performed Tests." National Academies of Sciences, Engineering, and Medicine. 2007. Using the Results of Contractor-Performed Tests in Quality Assurance: Contractor's Final Report. Washington, DC: The National Academies Press. doi: 10.17226/23134.


CHAPTER 3: COMPARISONS OF CONTRACTOR-PERFORMED AND STATE DOT-PERFORMED TESTS

INTRODUCTION

HMAC test results were collected and analyzed from six state DOTs. Details of verification and acceptance procedures for these six state DOTs are contained in Table 12. Also included in Table 12 are details of procedures for the Alabama DOT and the Kentucky Transportation Cabinet. This information is provided to connect these analyses with the analysis of test results from Alabama and Kentucky discussed in Chapter 2.

The verification and acceptance procedures described in Table 12 provide a range of details that might affect comparisons of contractor-performed and state DOT-performed tests. Ratios of the number of contractor tests to the number of state DOT tests range from 2 to 1 to 20 to 1. Simple 1 to 1 comparisons of test results are used by three state DOTs to verify contractor-performed tests. More statistically robust comparisons of variances and means with F and t tests are used by three state DOTs. Pay adjustments are applied by all six state DOTs, but are applied as a last resort for mix properties by the North Carolina DOT. The North Carolina DOT acceptance procedure for HMAC mix properties is basically an accept/reject procedure based on monitoring both North Carolina DOT and contractor-performed tests with control charts. LOT size varies from 2000 tons or a day's production to the entire project production. Acceptance is based on verified contractor tests or on combined DOT and verified contractor tests. When contractor-performed tests are not verified, acceptance is based on state DOT tests. Acceptance criteria may be deviations from targets, absolute deviations from targets, or deviations from targets and variability with the percent within limits (PWL) method. The weighted average LOT pay factor for all properties considered is applied by three state DOTs. The lowest pay factor from all properties considered, or the pay factors for all properties considered, are applied by the other three state DOTs.

PCC pavement strength data were collected and analyzed from the Colorado DOT, and granular base course data were collected and analyzed from the FHWA-WFLHD. Details of verification and acceptance procedures are contained in Table 13. Details for Kentucky Transportation Cabinet PCC procedures are also included in Table 13 to provide a connection with analyses discussed in Chapter 2.

Test results generated during an entire construction season for a particular material were requested from state DOTs. Some provided the requested data, some provided partial data from a construction season, and some provided limited data from several construction seasons. This resulted in a wide range in the size of data sets. Examples are presented below. The North Carolina DOT provided HMAC tests for 735 mix designs from the 2004 construction year. This gave data sets with over 14,000 contractor mix tests, over 2,000 North Carolina DOT mix tests, over 20,000 contractor mat density tests, and over 6,000 North Carolina DOT mat density tests. The Florida DOT provided HMAC tests from 98 selected projects constructed during the 2003 and 2004 construction years. This gave data sets with over 2,000 contractor mix tests, over 500

Table 12. Details of Hot-Mix Asphalt Concrete Verification and Acceptance Procedures

State DOT | Properties | Cont. to DOT Testing Frequency | Verification Comparisons | Acceptance Method | LOT Size | Acceptance Data | Acceptance Criteria | Pay Factor Application
Georgia | AC, Gradation | 4 to 1 (Note 2) | 1 to 1 | Adjust Pay | Day's Production | Contractor | Absolute Deviations from Targets | Lowest Pay (Note 3)
Florida | AC, VTM, Gradation, Mat Density | 4 to 1 and 8 or 12 to 1 | 1 to 1 | Adjust Pay | 2000 or 4000 tons (Note 1) | Contractor | PWL | Weighted Average
North Carolina | AC, VTM, VFA, Gradation, Mat Density | 10 to 1 and 20 to 1 | 1 to 1 | Adjust Pay (Note 4) | Mix: indefinite; Mat Density: Day's Production | Contractor (Note 4) and DOT | Deviations from Targets | Lowest Mix Pay and Mat (Note 4)
Kansas | VTM, Mat Density | Mix 4 to 1; Mat 2 to 1 | F and t Tests | Adjust Pay | Mix: 3000 tons; Mat Density: Day's Production | Contractor | PWL | Both Mix and Mat (Note 5)
California | AC, Gradation, Mat Density | 10 to 1 | t Test and 1 to 1 | Adjust Pay | Project Production | Contractor | PWL | Weighted Average
New Mexico | AC, VTM, Gradation, Mat Density | 3 to 1 | F and t Tests | Adjust Pay | Project Production | Contractor and DOT | PWL | Weighted Average
Alabama | AC, VTM, Mat Density | Mix 3 to 1 (Note 6); Mat 2 to 1 | 1 to 1 | Adjust Pay | 2800 tons (Note 6) | Contractor | Absolute Deviations from Targets | Lowest
Kentucky | AC, VTM, VMA | 4 to 1 | 1 to 1 | Adjust Pay | 4000 tons | Contractor | Deviations from Targets | Weighted Average (Notes 3, 7)

Notes:
1. Contractor chooses 2000 or 4000 ton LOTs for acceptance.
2. Will vary based on production rate, but data provided indicate about 4 to 1.
3. Mat density pay adjustments are included but are based on state DOT tests.
4. Pay adjustments (reductions only) are applied independently for mix properties and mat density. Control charts with both contractor and DOT test results are used to control the mix production process and to decide when pay reductions are applied. Mix pay reductions appear to be a last resort. Mat density pay is computed for each LOT.
5. Mix pay factor based on VTM and mat density pay factor are applied independently.
6. The definition of a LOT (tonnage or time basis) has varied and, therefore, the mix contractor to ALDOT testing ratio has varied. The 3 to 1 ratio and 2800 ton LOT are approximate. The 2 to 1 mat density testing ratio has been constant.
7. Pay factors are computed for each property for each 1000 ton subLOT. LOT averages are computed for each property and weighting factors applied to compute an overall LOT pay factor.
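Note 7 describes a two-stage pay computation: each property's sub-LOT pay factors are averaged over the LOT, and the per-property averages are then combined with weighting factors. A minimal sketch of that arithmetic follows; the function name, property names, weights, and pay factor values are hypothetical illustrations, not the Cabinet's actual values.

```python
import statistics

def lot_pay_factor(sublot_pay, weights):
    """Overall LOT pay factor per the two-stage scheme in note 7:
    average each property's sub-LOT pay factors over the LOT, then
    combine the per-property averages with weighting factors."""
    lot_averages = {prop: statistics.fmean(pfs) for prop, pfs in sublot_pay.items()}
    return sum(weights[prop] * avg for prop, avg in lot_averages.items())

# Two sub-LOTs, three properties; weights sum to 1.0 (all values hypothetical).
overall = lot_pay_factor(
    {"AC": [1.00, 0.95], "VTM": [1.00, 1.00], "VMA": [0.90, 1.00]},
    {"AC": 0.40, "VTM": 0.35, "VMA": 0.25},
)
```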

Table 13. Details of PCC and Granular Base Course Verification and Acceptance Procedures

Agency | Material | Properties | Cont. to Agency Testing Frequency | Verification Comparisons | Acceptance Method | LOT Size | Acceptance Data | Acceptance Criteria | Pay Factor Application
Colorado DOT | PCC Pavement | Flexural Strength | 4 to 1 | F and t Tests | Adjust Pay | Project Production | Contractor | PWL | -
Kentucky Transportation Cabinet | PCC | Slump, Air and Compressive Strength | 4 to 1 | 1 to 1 | Adjust Pay | Structural: 200 cy; Pavement: 4000 sy | Contractor | PWL | Weighted Average (Air and Comp. Str.)
FHWA-WFLHD | Granular Base | Gradation, LL, PI, % Fractured Particles and SE/P200 | 10 to 1 after first 3 for a project (Note 1) | F and t Tests | Adjust Pay | Project Production | Contractor | PWL | Lowest (Note 2)

Notes:
1. Data provided indicate this results in an average testing frequency ratio of about 3 to 1.
2. Pay adjustment for density is also included but is based only on contractor-performed tests. FHWA-WFLHD witnesses density testing.
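Most of the agencies in Tables 12 and 13 accept material with the percent within limits (PWL) method. Agency procedures compute a quality index Q = (limit - mean)/s for each specification limit and read PWL from standard tables; the sketch below substitutes a normal-distribution approximation for those tables, which is only close for reasonably large samples, so it should be read as an illustration of the idea rather than any agency's actual procedure. The function name and the data are hypothetical.

```python
import math
import statistics

def pwl_estimate(sample, lsl=None, usl=None):
    """PWL via quality indices Q = (limit - mean)/s, using a normal
    approximation in place of the standard PWL tables (illustration
    only; the tabled values also correct for sample size)."""
    mean = statistics.fmean(sample)
    s = statistics.stdev(sample)

    def phi(q):  # standard normal cumulative distribution function
        return 0.5 * (1.0 + math.erf(q / math.sqrt(2.0)))

    frac_below_usl = phi((usl - mean) / s) if usl is not None else 1.0
    frac_below_lsl = phi((lsl - mean) / s) if lsl is not None else 0.0
    return 100.0 * (frac_below_usl - frac_below_lsl)

# Illustrative: limits one standard deviation either side of the mean
# give roughly 68 percent within limits under the normal model.
pwl = pwl_estimate([4.0, 5.0, 6.0], lsl=4.0, usl=6.0)
```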

Florida DOT mix tests, over 6,000 contractor mat density tests, and over 1,400 Florida DOT mat density tests. The Colorado DOT provided PCC pavement flexural strength tests from 3 projects constructed in 2000, 2001, and 2003, respectively. The total data sets were comprised of 221 contractor tests and 61 Colorado DOT tests.

STATISTICAL ANALYSIS TECHNIQUES

Variability and proximity to target or limiting values (means) of contractor-performed and state DOT-performed tests were statistically compared. Variability, as measured with variance, was compared with F tests. Proximity to target or limiting values, as measured with means of differences between test results and target or limiting values, was compared with t tests. Means for contractor and state DOT tests from split samples were compared with paired t tests.

Mean square deviations (MSD) provide a way to evaluate process control that considers both the accuracy and the variability of the process. Mean square deviations for contractor and state DOT tests were compared to determine which indicates the better material quality (process control).

Statistical comparisons were made at a 1% level of significance (α = 0.01). This is certainly arbitrary, but it provides a stronger determination of differences than the more widely used 5% level of significance (α = 0.05).

All data provided by a state DOT for a particular material were combined for analyses. Some properties have target values that vary by project or job mix formula, for example, asphalt content for HMAC. Therefore, it was necessary to subtract target values from measurements (Δ = X − X_T) to produce a variable that could be combined for all mixes or projects. For consistency, this practice was followed for all properties, i.e., for those with constant target values as well.

Data from small projects (n_DOT < 6) were eliminated to produce reduced data sets. Data from these larger projects were combined and analyzed to see if project size might affect comparisons. In addition, comparisons and analyses were conducted for these larger projects (n_DOT ≥ 6) on a project-by-project basis, i.e., variability and means for contractor and state DOT tests were compared for each project or, for the North Carolina DOT, for each job mix formula.

SAMPLING AND TESTING CAPABILITIES

When comparing contractor and state DOT tests there is always the issue of possible differences in technician capabilities. Technician motivation can also be an issue but will be addressed later. The federal regulation "23 CFR 637B" includes requirements for laboratory and sampling and testing qualifications for tests that will be used for acceptance decisions. The regulation states that state DOT central laboratories and non-state DOT laboratories involved with independent assurance or dispute resolution sampling must be accredited by the AASHTO Accreditation Program or a comparable laboratory accreditation program approved by the FHWA. The regulation also states that sampling and testing personnel will be qualified. However, the issue relevant to comparisons of tests is whether contractor and state DOT technician qualifications are equal.

Hughes (9) discusses technician and laboratory requirements with consideration of the terms "qualified" and "certified". The discussion of technician qualifications is summarized as follows: "It is generally understood that technicians must be qualified and that one way to ensure this is to require them to have undergone some form of certification." Specifications for the state DOTs in Tables 12 and 13 contain language that requires some form of certification for contractor technicians. Specifications may contain language indicating the same requirements for DOT technicians, but this is not the case in all states. In these states it is policy that state DOT technician requirements are the same as contractor technician requirements, even though this may not be explicitly stated in specifications. The one exception to the equal technician qualification requirement is the FHWA-WFLHD. The FHWA-WFLHD laboratory is AASHTO accredited, but contractor technicians must simply "be qualified".

Because contractor and state DOT technicians have the same qualification requirements, and because of the requirement for independent assurance sampling and testing, possible differences in sampling and testing will not be included as a factor when analyzing comparisons. Certainly there are individual differences in technician and laboratory capabilities, but there is no practical way that these differences might be considered when comparing contractor and state DOT tests.

ANALYSIS OF GEORGIA DOT HOT MIX ASPHALT CONCRETE DATA

Test results obtained for HMAC during the 2003 construction year were analyzed. Properties include gradation (% passing the 1", 3/4", 1/2", 3/8", #4, #8, #50 and #200 sieves) and asphalt content measured with either the vacuum solvent extraction or ignition methods. Mat density is also used in the acceptance process, but only Georgia DOT testing is required.

Figure 13 illustrates the Georgia DOT sampling and testing requirements for managing the production of HMAC. A LOT is a day's production, and contractor QC samples are taken and tested for each 500 ton subLOT. Results from these tests are used for LOT acceptance if verified by Georgia DOT Comparison test results. Georgia DOT Comparison tests are the type of testing often called "verification" in the vernacular of many state DOTs and Reference 1. Comparison samples are split from the contractor's QC samples for one of every 10 LOTs, and results are compared one to one. The Georgia DOT also tests what is referred to as a QA sample from two of every five LOTs. These samples are obtained independently of contractor samples, and results are compared with specification mix tolerances. It appears the purpose of this testing may be to satisfy "23 CFR 637B" requirements for validating the quality of material.

When Georgia DOT Comparison test results do not validate contractor QC test results and/or when Georgia DOT QA test results do not compare favorably with specification mix tolerances, additional testing is conducted. If additional testing does not resolve unfavorable comparisons, contractor QC test results may be replaced with Georgia DOT test results for acceptance of the LOTs where comparisons were unfavorable.

Figure 13. Georgia DOT HMAC Mix Sampling and Testing Requirements

The first comparison is between means of contractor QC and Georgia DOT Comparison test results with the paired t test. Results are summarized in Table 14. The strength or significance of the comparisons is indicated by the p-values from the hypothesis testing. The comparisons indicate significant differences in deviations from targets for only 4 of 8 sieves. However, for the 4 sieves used for pay adjustment computation, differences are significant for 3 sieves. Numerically, mean deviations from targets are larger for Georgia DOT tests for only 5 of 8 sieves. But, for the 4 sieves used for pay adjustment computation, deviations for Georgia DOT tests are always larger (as noted above, significantly so for 3 of 4 sieves). The deviations from target asphalt contents are not significantly different, but the deviations for Georgia DOT tests are larger.

Although not a part of the paired t tests for means, the variances for contractor and Georgia DOT gradation tests were compared with the F test. These comparisons are summarized in Table 15. The variances were significantly different for only 4 of the 8 sieves. However, for the 4 sieves used in pay adjustment computation, 3 variances were significantly different. Numerically, the variances for Georgia DOT gradation tests were larger for 7 of 8 sieves. The variance for the Georgia DOT asphalt content tests was significantly larger than the variance for contractor tests.

Table 14. Comparison of Georgia DOT Comparison and Contractor QC Test Result Means

Property | n | Mean Δ GDOT, % | Mean Δ CONT, % | Difference | p-value | Pay
% Pass 1" | 395 | 0.258 | 0.295 | NSD | 0.462 | No
% Pass 3/4" | 791 | 0.398 | 0.469 | NSD | 0.166 | No
% Pass 1/2" | 1067 | 0.314 | 0.118 | SD | 0.002 | Yes
% Pass 3/8" | 953 | 0.516 | 0.329 | SD | 0.005 | Yes
% Pass #4 | 402 | 0.506 | 0.392 | NSD | 0.128 | Yes
% Pass #8 | 1142 | 0.449 | 0.244 | SD | <0.001 | Yes
% Pass #50 | 282 | 0.897 | 0.763 | NSD | 0.094 | No
% Pass #200 | 1141 | 0.334 | 0.447 | SD | <0.001 | No
% Asphalt | 1135 | 0.005 | 0.002 | NSD | 0.634 | Yes

Table 15. Comparison of Georgia DOT Comparison and Contractor QC Test Result Variances

Property | n | S² GDOT | S² CONT | Difference | p-value | Pay
% Pass 1" | 395 | 1.527 | 1.363 | NSD | 0.131 | No
% Pass 3/4" | 791 | 4.410 | 3.831 | NSD | 0.024 | No
% Pass 1/2" | 1067 | 9.343 | 6.576 | SD | <0.001 | Yes
% Pass 3/8" | 953 | 8.479 | 5.545 | SD | <0.001 | Yes
% Pass #4 | 402 | 9.450 | 8.606 | NSD | 0.175 | Yes
% Pass #8 | 1142 | 8.673 | 6.561 | SD | <0.001 | Yes
% Pass #50 | 282 | 3.971 | 4.004 | NSD | 0.472 | No
% Pass #200 | 1141 | 1.137 | 0.791 | SD | <0.001 | No
% Asphalt | 1135 | 0.088 | 0.045 | SD | <0.001 | Yes

Scatter diagrams, with lines of equality, were plotted for the eight sieves and asphalt content to provide additional insight into the relationship between contractor and Georgia DOT test results from split samples. Examples for % passing the 3/8-inch sieve and for asphalt content are shown in Figures 14 and 15. Although somewhat difficult to visualize, the scatter diagrams confirm the larger means and variances of Georgia DOT-performed tests in Tables 14 and 15. More revealing is the distribution of points that are some distance from the origin but near the horizontal axis, i.e., large Δ_GDOT and small Δ_CONT. These points are potentially troublesome because contractor test results indicate small deviations from JMF targets which are not corroborated by Georgia DOT test results. It is large deviations from JMF targets that create issues with acceptance.
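The variance comparisons in Table 15 use the F test, whose statistic is simply a ratio of sample variances. A minimal sketch is below; placing the larger variance in the numerator is one common convention, and the p-value would come from the F distribution with the corresponding degrees of freedom (tables or, e.g., scipy.stats.f). The data are illustrative.

```python
import statistics

def f_ratio(sample_a, sample_b):
    """F statistic for comparing two variances: larger sample variance
    over the smaller, so the ratio is always >= 1."""
    va = statistics.variance(sample_a)
    vb = statistics.variance(sample_b)
    return max(va, vb) / min(va, vb)

# Illustrative data: the second sample is twice as spread out, so its
# variance is four times larger and F = 4.
f = f_ratio([1, 2, 3, 4, 5], [2, 4, 6, 8, 10])
```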

Table 16 summarizes comparisons between contractor QC and Georgia DOT QA test results. These test results are from independent samples, so one to one comparisons with the paired t test are not appropriate. Variances were compared with F tests, and means were compared with t or modified t tests, as required by the equality of variances. It should be noted that the contractor QC test results compared with Georgia DOT Comparison test results in Tables 14 and 15 are a subset of the total contractor QC test results data set.

Table 16 indicates that, except for the % passing the 1 and 3/4-inch sieves, the variances of Georgia DOT test results are significantly larger than the variances of contractor test results. However, Table 16 indicates no significant differences in the means of any of the test results. The Georgia DOT means for the percents passing the 4 sieves used for pay are larger, but the contractor mean for asphalt content is larger.
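For these independent-sample comparisons, the choice between the t test and the modified t test hinges on whether the F test found the variances equal. A sketch of both forms follows (illustrative only; scipy.stats.ttest_ind with its equal_var flag implements the same choice).

```python
import math
import statistics

def two_sample_t(a, b, equal_var):
    """Two-sample t statistic: pooled-variance t when the variances are
    judged equal, Welch's (modified) t otherwise."""
    na, nb = len(a), len(b)
    diff = statistics.fmean(a) - statistics.fmean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    if equal_var:
        pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
        se = math.sqrt(pooled * (1.0 / na + 1.0 / nb))
    else:
        # Welch: no pooling; the degrees of freedom would come from the
        # Welch-Satterthwaite formula.
        se = math.sqrt(va / na + vb / nb)
    return diff / se
```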

Figure 14. Scatter Diagram for Percent Passing the 3/8" Sieve - GDOT

Figure 15. Scatter Diagram for Asphalt Content - GDOT

Table 16. Comparison of Georgia DOT QA and Contractor QC Test Results – All Projects

Property | n GDOT | n CONT | s² GDOT | s² CONT | Difference | p-value | Mean Δ GDOT, % | Mean Δ CONT, % | Difference | p-value | Pay
% Pass 1" | 832 | 4775 | 1.425 | 1.296 | NSD | 0.034 | 0.187 | 0.184 | NSD | 0.941 | No
% Pass 3/4" | 1637 | 9444 | 4.167 | 4.378 | NSD | 0.099 | 0.418 | 0.535 | NSD | 0.036 | No
% Pass 1/2" | 2323 | 13157 | 6.793 | 5.565 | SD | <0.001 | 0.196 | 0.160 | NSD | 0.530 | Yes
% Pass 3/8" | 2099 | 11587 | 6.605 | 6.044 | SD | 0.004 | 0.246 | 0.231 | NSD | 0.805 | Yes
% Pass #4 | 1050 | 5532 | 9.959 | 7.707 | SD | <0.001 | 0.320 | 0.293 | NSD | 0.792 | Yes
% Pass #8 | 2488 | 14051 | 9.488 | 5.534 | SD | <0.001 | 0.253 | 0.196 | NSD | 0.380 | Yes
% Pass #50 | 749 | 4047 | 4.139 | 3.334 | SD | <0.001 | 0.727 | 0.837 | NSD | 0.170 | No
% Pass #200 | 2488 | 14036 | 1.212 | 0.769 | SD | <0.001 | 0.359 | 0.400 | NSD | 0.082 | No
% Asphalt | 2487 | 14061 | 0.064 | 0.040 | SD | <0.001 | 0.004 | 0.005 | NSD | 0.827 | Yes

Mean square deviation (MSD) provides a method for considering both accuracy (proximity to target) and precision, or variability, in evaluating measurements. For the nominal is best (NIB) situation, where test results may be either larger or smaller than targets, the MSD is computed with

MSD_NIB = [ Σ (X_i − X_T)² ] / n, summed over i = 1 to n ..........(1)

where X_i = test results, X_T = target, and n = number of measurements. For large n values this can be written as

MSD_NIB = s² + (X̄ − X_T)² ..........(2)

where s² = variance of the tests and X̄ = mean of the tests. The variable used to combine tests with different target values is the difference between tests and target values. Therefore, the most desirable value is always zero, and the equation for MSD_NIB reduces to

MSD_NIB = s² + (Δ̄)² ..........(3)

where Δ̄ = mean of the differences between tests and target values. Smaller MSD_NIB values for manufacturing processes mean better control. When comparing MSD_NIB values for two sets of test results for the same process, smaller MSD_NIB values indicate more precise tests with closer conformity to target values.

Table 17 contains MSD_NIB values for the set of contractor QC tests, the set of Georgia DOT Comparison tests, and the set of Georgia DOT QA tests. Values for contractor QC tests are smallest for all properties except percent passing the 3/4" sieve. The implication is that contractor tests are consistently more precise and closer to target values. The MSD_NIB for Georgia DOT QA tests are closer to the MSD_NIB for contractor QC tests for the percents passing 4 sieves and for asphalt content. The MSD_NIB for Georgia DOT Comparison test results are closer to the MSD_NIB for contractor QC tests for the percents passing 4 sieves. This is surprising because Georgia DOT Comparison and contractor QC samples are split samples and test results are directly compared one to one. It is reasonable to assume that this would promote similarities. Georgia DOT QA test results are from independent samples, and results are compared to acceptance criteria. It may be that this more direct relationship with the acceptance process is the reason contractor QC and Georgia DOT QA tests are more comparable than contractor QC and Georgia DOT Comparison tests.

It should be noted that variance and, therefore, measurement precision dominates the computation of MSD_NIB. Target values are zero, and the means of the differences from targets, when squared, are small. Except for percent passing the #50 sieve for Georgia DOT Comparison and QA tests, comparisons of variances would provide the same relative rankings as MSD_NIB.
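Because the combined variable is the deviation from target, Eq. (1) is just the mean of the squared deviations. A minimal sketch with illustrative data:

```python
def msd_nib(deviations):
    """Nominal-is-best mean square deviation, Eq. (1), applied to the
    combined variable (test result minus target), whose target is zero."""
    return sum(d * d for d in deviations) / len(deviations)

# Illustrative deviations from JMF targets.
msd = msd_nib([1.0, -1.0, 2.0, -2.0])  # (1 + 1 + 4 + 4) / 4 = 2.5
```

For large n this matches the decomposition in Eq. (3): the variance of the deviations plus the square of their mean, which is why measurement variance dominates when mean deviations are small.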

Table 17. Comparison of Mean Square Deviations (MSD_NIB)

Property | Contractor QC | GDOT Comp. | GDOT QA
% Pass 1" | 1.330 | 1.594 | 1.460
% Pass 3/4" | 4.664 | 4.559 | 4.342
% Pass 1/2" | 5.591 | 9.442 | 6.831
% Pass 3/8" | 6.097 | 8.745 | 6.666
% Pass #4 | 7.793 | 9.706 | 10.061
% Pass #8 | 5.572 | 8.875 | 9.552
% Pass #50 | 4.035 | 4.768 | 4.668
% Pass #200 | 0.929 | 1.249 | 1.341
% Asphalt | 0.040 | 0.088 | 0.064

The preceding analyses were performed on databases containing all test results collected during the 2003 construction season. From these databases, projects with at least 6 Georgia DOT QA tests or 6 Georgia DOT Comparison tests for asphalt content, % passing the 1/2-inch sieve, and % passing the #200 sieve were identified. Databases from these projects with n_GDOT ≥ 6 were compiled and their variabilities and means compared. In addition, variabilities and means for individual projects were compared. It should be noted that the format in which data were provided by the Georgia DOT made sorting by project somewhat tedious and was the reason only three properties were selected. Sorting by project or job mix formula of the data provided by other states was somewhat easier. As a result, data for all properties are included in the similar analyses for these states that are presented in following sections.

Comparisons of reduced database variances and means for Georgia DOT QA and contractor QC tests are summarized in Table 18. Comparisons of reduced database variances and means for Georgia DOT Comparison and contractor QC tests are summarized in Table 19. The variances and means are similar to those for all projects in Tables 14-16. Comparisons in Tables 18 and 19 are also similar to comparisons in Tables 14-16. The only difference is for the means of the % passing the 1/2" sieve for the Georgia DOT Comparison and contractor QC tests. In Table 14 the means for all projects are significantly different, but in Table 19 the means for the larger projects are not significantly different.

Table 18. Comparison of Georgia DOT QA and Contractor QC Test Results – Projects with n_GDOT ≥ 6

  Property      Projects   n_GDOT   n_CONT   s²_GDOT   s²_CONT   Diff.   p-Value   ∆_GDOT, %   ∆_CONT, %   Diff.   p-Value
  % Asphalt        114      1410     8453     0.058     0.040     SD     <0.001      0.011       0.010      NSD     0.638
  % Pass 1/2”      114      1385     8072     7.701     6.439     SD     <0.001      0.146       0.208      NSD     0.433
  % Pass #200      126      1565     8908     1.210     0.741     SD     <0.001      0.310       0.367      NSD     0.051

Table 19. Comparison of Georgia DOT Comparison and Contractor QC Test Results – Projects with n_GDOT ≥ 6

  Property      Projects     n     s²_GDOT   s²_CONT   Diff.   p-Value   ∆_GDOT, %   ∆_CONT, %   Diff.   p-Value
  % Asphalt        41       452     0.097     0.053     SD     <0.001      0.018       0.010      NSD     0.148
  % Pass 1/2”      35       400    12.286     9.251     SD      0.005      0.462       0.200      NSD     0.023
  % Pass #200      45       470     0.997     0.631     SD      0.719      0.159       0.278      SD      0.003

Project by project comparisons of Georgia DOT and contractor tests for asphalt content and % passing the 1/2” and #200 sieves are summarized in Tables 20 and 21. This analysis quantifies the numbers of projects where there are significant differences between Georgia DOT and contractor means and variances, and the numbers of projects where the Georgia DOT means and variances are largest. The comparisons generally confirm the trends indicated by the comparisons of combined tests, i.e., the variability of Georgia DOT tests is likely to be larger than the variability of contractor tests, but the means of Georgia DOT tests are less likely to be larger than the means of contractor tests. Except for asphalt content, the percentages in column 3 of Tables 20 and 21 indicate no particular tendency for Georgia DOT or contractor tests to be closer to target values. The percentages in column 4 indicate no strong tendency for the means of differences from targets to be significantly different but, when the differences are significant, the percentages in column 5 indicate the Georgia DOT means are likely larger. The percentages in column 6 indicate the Georgia DOT variances are likely larger. The percentages in column 7 indicate that variances are more likely to be significantly different than means (column 4). When variances are significantly different, the percentages in column 8 indicate the Georgia DOT variances are likely larger.

Table 20. Project by Project Comparisons of Georgia DOT QA and Contractor QC Test Results

  Property      Projects   Larger GDOT ∆   SD ∆      Sig. Larger GDOT ∆   Larger GDOT s²   SD s²      Sig. Larger GDOT s²
  % Asphalt        114       68 (60%)      8 (7%)         6 (5%)             77 (68%)      12 (10%)       10 (9%)
  % Pass 1/2”      114       61 (54%)      3 (3%)         3 (3%)             62 (54%)      13 (11%)       10 (9%)
  % Pass #200      126       52 (41%)      11 (9%)        5 (4%)             81 (64%)      15 (12%)       13 (10%)

Table 21. Project by Project Comparisons of Georgia DOT Comparison and Contractor QC Test Results

  Property      Projects   Larger GDOT ∆   SD ∆      Sig. Larger GDOT ∆   Larger GDOT s²   SD s²      Sig. Larger GDOT s²
  % Asphalt        41        27 (66%)      1 (2%)         0                  35 (85%)      1 (2%)         1 (2%)
  % Pass 1/2”      35        16 (46%)      0              0                  21 (60%)      2 (6%)         2 (6%)
  % Pass #200      35        21 (47%)      2 (4%)         2 (4%)             34 (76%)      3 (7%)         3 (7%)

Numbers in parentheses are percentages of total numbers of projects.

Plots of project means and variances were made to graphically illustrate the trends summarized in Tables 20 and 21. A complete set of these plots is contained in Appendix A. Figures 16 and 17 are for asphalt content and, together with row 1 of Table 20, will be used to illustrate interpretation of the plots. The 114 points on Figures 16 and 17 represent the projects where n_GDOT ≥ 6. The Georgia DOT means are larger for 68 (60%) of the projects; these plot in the half of Figure 16 defined by the lines of absolute equality that is centered about the horizontal axis. The means for 8 (7%) of the projects are significantly different; these are represented by points close to the vertical or horizontal axes. The Georgia DOT means are significantly larger for 6 (5%) of the projects, represented by points close to the horizontal axis. The Georgia DOT variances are larger for 77 (68%) of the projects; these points plot below the line of equality in Figure 17. The variances are significantly different for 12 (10%) of the projects and, for 10 (9%) of them, the Georgia DOT variances are larger. The distribution of the points below the line of equality and along the horizontal axis in Figure 17 clearly illustrates the larger Georgia DOT test variability.
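The plot-reading rules described above reduce to simple comparisons of each project's paired statistics. A minimal sketch of that counting logic (the data in the test are hypothetical, not report values):

```python
def larger_dot_counts(variance_pts, mean_pts):
    """Each list holds (DOT, contractor) statistics, one pair per project.
    Returns the number of projects whose point falls below the line of
    equality on the variance plot (DOT variance larger) and the number
    falling in the wedge about the horizontal axis on the means plot
    (DOT mean deviation farther from target)."""
    var_larger = sum(1 for d, c in variance_pts if d > c)
    mean_larger = sum(1 for d, c in mean_pts if abs(d) > abs(c))
    return var_larger, mean_larger
```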

Figure 16. Asphalt Content Project Means – QA and QC

Figure 17. Asphalt Content Project Variances – QA and QC

ANALYSIS OF FLORIDA DOT HOT MIX ASPHALT CONCRETE DATA

HMAC test results from 98 projects constructed during 2003 and 2004 were provided by the Florida DOT. All test results for one year were requested, but those provided were described as "an excellent sampling of the types of mixture properties that are used and a good sampling of the contractors that conduct FDOT work." Test results included gradation (percent passing the 3/4”, 1/2”, 3/8”, #4, #8, #16, #30, #50, #100 and #200 sieves), asphalt content, maximum mix specific gravity (Gmm), bulk density of laboratory compacted samples (Gmb), air voids and VMA (computed with Gmm and Gmb), and mat density (%Gmm, core bulk density as a percentage of Gmm). Percent passing the #8 sieve, percent passing the #200 sieve, asphalt content, air voids and mat density (%Gmm) are used in the PWL system to compute LOT composite pay factors. Asphalt content and gradation are determined with the ignition oven method. Mat density is measured with 6” cores. Figure 18 illustrates Florida DOT sampling and testing requirements for managing construction of HMAC pavement layers. A LOT may be 2000 or 4000 tons (contractor's choice) divided into four 500-ton or four 1000-ton subLOTs. Contractors test one mix sample and five cores per subLOT. Florida DOT conducts two types of sampling and testing: Verification and independent sample verification testing (ISVT). Florida DOT Verification tests and Contractor QC tests are on split samples; test results are compared one to one with numerical criteria to determine if Contractor QC test results are used for LOT pay factor computation.
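For orientation, the percent within limits for a LOT is estimated from the LOT's sample mean and standard deviation against the specification limits. The sketch below uses a simple normal approximation; the actual PWL procedure uses quality-index (Q) tables that account for sample size, so this is an illustration only, not the Florida DOT method:

```python
from math import erf, sqrt

def pwl_estimate(mean, std, lsl, usl):
    """Rough percent-within-limits estimate assuming normally distributed
    results (an illustration, not the specification's Q-value table method)."""
    def cdf(z):  # standard normal CDF
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))
    return 100.0 * (cdf((usl - mean) / std) - cdf((lsl - mean) / std))
```

A process centered between limits that are two standard deviations away on each side would score roughly 95 PWL under this approximation.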

[Figure 18 diagram: sampling frequencies for Contractor QC, FDOT Verification and FDOT ISVT tests over 12,000 tons of production, shown for 2000-ton LOTs (four 500-ton subLOTs) and 4000-ton LOTs (four 1000-ton subLOTs).]

Notes:
1. Contractor QC and FDOT mix verification tests on split samples. Results compared one to one with numerical criteria.
2. Independent Sample Verification Testing (ISVT): 2 per 12,000 tons on independent samples. Results compared with specification tolerances.
3. For mat density, Contractor QC tests 5 cores per subLOT. FDOT Verification tests 5 of these cores from 1 of 4 subLOTs. Results compared one to one with numerical criteria.
4. Mat density ISVT tests 5 independent cores from the same LOTs and subLOTs as mix ISVT tests. Results compared with specification tolerances.

Figure 18. Florida DOT HMAC Sampling and Testing Requirements

Florida DOT ISVT results are compared with specification tolerances; noncompliance with mix specification tolerances can result in stopping production. The first comparisons performed were between Contractor QC and Florida DOT Verification test results from split samples. Tables 22 and 23 contain comparisons of variances and of means of differences from target values (∆ = X − X_T) for data from all projects and for data from large projects (those with at least 6 Florida DOT test results, n_FDOT ≥ 6), respectively. Data from the large projects will also be compared on a project by project basis. The comparisons for all projects and for large projects are very consistent. Variances of contractor and Florida DOT test results are mostly significantly different; exceptions are % passing the #16, #30 and #100 sieves and, for large projects, % passing the #50 sieve. Proximities to target values are consistently not significantly different, although the p-values for mat density (%Gmm) indicate the means are approaching statistically significant differences. Numerically, for all cases, variances of Florida DOT test results are larger than variances of contractor test results. Except for % passing the #50 sieve and asphalt content among large projects, the mean differences indicate contractor test results are closer to target values. For both the all-project and large-project comparisons, the mean differences from target asphalt contents are quite small. It should also be noted that the target values used for VMA are minimum acceptable values. The negative mean differences of about −0.5% indicate that lower than desirable VMA values are obtained. Contractor VMA measurements are larger and, therefore, closer to the minimum acceptable values.

Table 22. Comparison of Florida DOT Verification and Contractor QC Test Results – All Projects

  Property         n_FDOT   n_CONT   s²_FDOT   s²_CONT   Diff.   p-Value   ∆_FDOT   ∆_CONT   Diff.   p-Value   Pay
  % Passing 1/2”     518     2288      3.802     2.869    SD     <0.001     0.533    0.409    NSD     0.183    No
  % Passing 3/8”     519     2286     10.514     8.553    SD      0.001     1.316    1.176    NSD     0.366    No
  % Passing #4       519     2288     17.179    13.247    SD     <0.001     1.237    0.762    NSD     0.016    No
  % Passing #8       520     2288      7.533     5.619    SD     <0.001     0.679    0.400    NSD     0.032    Yes
  % Passing #16      519     2287      6.576     6.006    NSD     0.089     0.224    0.005    NSD     0.069    No
  % Passing #30      519     2286      5.412     4.914    NSD     0.076     0.521    0.376    NSD     0.185    No
  % Passing #50      519     2284      5.570     4.614    SD      0.003     0.805    0.698    NSD     0.342    No
  % Passing #100     517     2284      2.123     1.850    NSD     0.021     0.755    0.630    NSD     0.063    No
  % Passing #200     521     2286      0.491     0.376    SD     <0.001     0.136    0.072    NSD     0.055    Yes
  % Asphalt          526     2307      0.084     0.062    SD     <0.001     0.016   -0.012    NSD     0.037    Yes
  Air Voids          469     2063      1.308     0.707    SD     <0.001    -0.285   -0.248    NSD     0.513    Yes
  VMA                469     2095      1.023     0.737    SD     <0.001    -0.508   -0.490    NSD     0.719    No
  %Gmm              1490     6874      2.958     2.570    SD     <0.001    -0.222   -0.103    NSD     0.014    Yes

Table 23. Comparison of Florida DOT Verification and Contractor QC Test Results – Large Projects (n_FDOT ≥ 6)

  Property         n_FDOT   n_CONT   s²_FDOT   s²_CONT   Diff.   p-Value   ∆_FDOT   ∆_CONT   Diff.   p-Value   Pay
  % Passing 1/2”     377     1528      3.682     2.700    SD     <0.001     0.625    0.494    NSD     0.225    No
  % Passing 3/8”     377     1527     10.240     7.334    SD     <0.001     1.277    1.146    NSD     0.462    No
  % Passing #4       377     1528     15.886    13.039    SD      0.006     1.211    0.885    NSD     0.148    No
  % Passing #8       383     1551      6.840     5.139    SD     <0.001     0.562    0.330    NSD     0.113    Yes
  % Passing #16      377     1527      5.652     5.489    NSD     0.353     0.325    0.204    NSD     0.369    No
  % Passing #30      377     1526      5.074     4.755    NSD     0.206     0.662    0.587    NSD     0.552    No
  % Passing #50      377     1525      5.083     4.655    NSD     0.134     0.825    0.836    NSD     0.932    No
  % Passing #100     376     1525      1.958     1.685    NSD     0.030     0.810    0.786    NSD     0.760    No
  % Passing #200     383     1549      0.492     0.386    SD      0.001     0.128    0.075    NSD     0.181    Yes
  % Asphalt          388     1571      0.078     0.057    SD     <0.001     0.001   -0.019    NSD     0.205    Yes
  Air Voids          345     1409      1.301     0.753    SD     <0.001    -0.337   -0.263    NSD     0.263    Yes
  VMA                335     1369      1.032     0.751    SD     <0.001    -0.595   -0.537    NSD     0.336    No
  %Gmm              1408     5770      2.851     2.511    SD      0.001    -0.172   -0.082    NSD     0.070    Yes

The second comparisons were between Contractor QC and Florida DOT ISVT test results from independent samples. Tables 24 and 25 contain comparisons of variances and means for data from all projects and from large projects, respectively. The comparisons in Tables 24 and 25 are reasonably consistent. Variances of Florida DOT ISVT and contractor test results are significantly different, except for % passing the #50 sieve and, for large projects, % passing the #200 sieve. Mean values are mostly not significantly different; important exceptions are air voids and mat density (%Gmm), where means are significantly different. Mean values for % passing the #4 and #8 sieves are also significantly different for test results from all projects. Numerically, for all cases, variances of Florida DOT test results are larger than variances of contractor test results. Numerically, contractor gradation test results are closer to target values than Florida DOT test results, except for % passing the 1/2” sieve. For asphalt content, Florida DOT test results are closer to targets; these differences are, however, quite small and are consistent with the Florida DOT Verification test results. The VMA comparisons indicate more favorable Florida DOT ISVT test results, i.e., larger test results relative to the minimum acceptable values. This is the opposite of the indication from the comparisons with Florida DOT Verification test results, where contractor test results were more favorable relative to the minimum acceptable values.

Table 24. Comparison of Florida DOT ISVT and Contractor QC Test Results – All Projects

  Property         n_FDOT   n_CONT   s²_FDOT   s²_CONT   Diff.   p-Value   ∆_FDOT   ∆_CONT   Diff.   p-Value   Pay
  % Passing 1/2”     540     2288      3.693     2.869    SD     <0.001     0.342    0.409    NSD     0.455    No
  % Passing 3/8”     539     2286     11.809     8.553    SD     <0.001     1.177    1.176    NSD     0.995    No
  % Passing #4       539     2288     16.853    13.247    SD     <0.001     1.273    0.762    SD      0.008    No
  % Passing #8       540     2288      9.555     5.619    SD     <0.001     0.836    0.400    SD      0.002    Yes
  % Passing #16      540     2287      7.525     6.006    SD     <0.001     0.271    0.005    NSD     0.039    No
  % Passing #30      540     2286      6.041     4.914    SD     <0.001     0.452    0.376    NSD     0.515    No
  % Passing #50      540     2284      5.103     4.614    NSD     0.065     0.873    0.698    NSD     0.091    No
  % Passing #100     540     2284      4.024     1.850    SD     <0.001     0.853    0.630    NSD     0.014    No
  % Passing #200     539     2286      0.480     0.376    SD     <0.001     0.132    0.072    NSD     0.062    Yes
  % Asphalt          545     2307      0.086     0.062    SD     <0.001     0.000   -0.012    NSD     0.386    Yes
  Air Voids          490     2036      1.400     0.707    SD     <0.001    -0.057   -0.248    SD      0.001    Yes
  VMA                499     2095      1.251     0.737    SD     <0.001    -0.414   -0.490    NSD     0.159    No
  %Gmm               437     6874      3.536     2.570    SD     <0.001    -0.640   -0.103    SD     <0.001    Yes

Table 25. Comparison of Florida DOT ISVT and Contractor QC Test Results – Large Projects (n_FDOT ≥ 6)

  Property         n_FDOT   n_CONT   s²_FDOT   s²_CONT   Diff.   p-Value   ∆_FDOT   ∆_CONT   Diff.   p-Value   Pay
  % Passing 1/2”     328     1351      4.246     2.717    SD     <0.001     0.480    0.496    NSD     0.892    No
  % Passing 3/8”     322     1330     10.761     7.652    SD     <0.001     1.325    1.192    NSD     0.502    No
  % Passing #4       327     1351     17.133    13.190    SD      0.001     1.415    0.942    NSD     0.058    No
  % Passing #8       328     1351      9.548     5.396    SD     <0.001     0.794    0.372    NSD     0.021    Yes
  % Passing #16      328     1350      7.174     5.601    SD      0.002     0.290    0.058    NSD     0.151    No
  % Passing #30      328     1349      6.598     4.813    SD     <0.001     0.616    0.411    NSD     0.183    No
  % Passing #50      328     1349      5.770     4.838    NSD     0.019     0.984    0.712    NSD     0.049    No
  % Passing #100     328     1348      5.134     1.697    SD     <0.001     1.036    0.790    NSD     0.060    No
  % Passing #200     328     1349      0.474     0.393    NSD     0.014     0.104    0.080    NSD     0.549    Yes
  % Asphalt          337     1422      0.084     0.057    SD     <0.001    -0.016   -0.017    NSD     0.936    Yes
  Air Voids          302     1172      1.185     0.731    SD     <0.001    -0.029   -0.241    SD      0.002    Yes
  VMA                302     1172      1.226     0.763    SD     <0.001    -0.374   -0.519    NSD     0.035    No
  %Gmm               363     2236      3.044     2.381    SD      0.001    -0.687   -0.272    SD     <0.001    Yes

For air voids, the mean differences indicate Florida DOT ISVT test results are significantly closer to the 4% target than contractor test results. This is the opposite of the comparisons with Florida DOT Verification test results (Tables 22 and 23), where contractor test results were closer to the 4% target, though not significantly closer. The numbers causing this discrepancy appear to be the Florida DOT ISVT mean differences (−0.057 and −0.029%), which indicate unusually close agreement with the 4% target. For mat density (%Gmm), the mean differences indicate contractor test results are significantly closer to targets than Florida DOT ISVT test results. The Florida DOT Verification comparisons (Tables 22 and 23) also indicate contractor test results closer to targets, but not significantly closer. The numbers causing this discrepancy appear to be the Florida DOT ISVT mean differences (−0.640 and −0.687%), which indicate unusually low levels of compaction. The third comparison will be between the means of paired Contractor QC and FDOT Verification test results. Tables 26 and 27 contain the results of paired t tests for data from all projects and from large projects, respectively. Note that comparisons of maximum mix specific gravity (Gmm), laboratory bulk specific gravity (Gmb) and core bulk specific gravity (Gmb) samples are added. Note also that the means of these three properties are reported rather than means of differences between test results and targets.
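The paired t tests in Tables 26 and 27 operate on the differences between split-sample results. A minimal sketch (the p-value would come from a t distribution with n − 1 degrees of freedom, e.g., scipy.stats.t):

```python
from statistics import mean, stdev

def paired_t(dot, contractor):
    """Paired t statistic for split-sample results: each DOT result is
    matched with the contractor result from the same sample.
    Returns (t, degrees of freedom)."""
    diffs = [d - c for d, c in zip(dot, contractor)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / n ** 0.5)
    return t, n - 1
```

Because sample-to-sample variation cancels in the differences, the paired test is more sensitive than the unpaired test on the same data, which is consistent with the paired comparisons flagging more significant differences than Tables 22 and 23.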

Table 26. Comparison of Paired Florida DOT Verification and Contractor QC Test Results – All Projects

  Property           n     ∆_FDOT   ∆_CONT   Diff.   p-Value   Pay
  % Passing 1/2”    489     0.517    0.450    SD     <0.001    No
  % Passing 3/8”    491     1.337    1.125    SD     <0.001    No
  % Passing #4      490     1.293    0.825    SD     <0.001    No
  % Passing #8      492     0.694    0.391    SD     <0.001    Yes
  % Passing #16     490     0.214   -0.038    SD     <0.001    No
  % Passing #30     489     0.508    0.319    SD     <0.001    No
  % Passing #50     490     0.777    0.627    SD     <0.001    No
  % Passing #100    487     0.752    0.644    SD     <0.001    No
  % Passing #200    490     0.143    0.084    SD     <0.001    Yes
  % Asphalt         499     0.016    0.003    NSD     0.418    Yes
  Air Voids         450    -0.302   -0.304    NSD     0.972    Yes
  VMA               449    -0.511   -0.490    NSD     0.713    No
  %Gmm             1374    -0.198   -0.052    SD     <0.001    Yes
  Gmm               443     2.433*   2.432*   NSD     0.741    No
  Lab. Gmb          450     2.342*   2.341*   NSD     0.760    No
  Core Gmb         1399     2.257*   2.259*   SD     <0.001    No

* Mean of Gmm or Gmb (not of difference from target).

Table 27. Comparison of Paired Florida DOT Verification and Contractor QC Test Results – Large Projects (n_FDOT ≥ 6)

  Property           n     ∆_FDOT   ∆_CONT   Diff.   p-Value   Pay
  % Passing 1/2”    346     0.615    0.587    NSD     0.144    No
  % Passing 3/8”    340     1.350    1.247    NSD     0.035    No
  % Passing #4      346     1.281    0.884    SD     <0.001    No
  % Passing #8      352     0.582    0.283    SD     <0.001    Yes
  % Passing #16     346     0.277    0.057    SD     <0.001    No
  % Passing #30     345     0.630    0.498    SD     <0.001    No
  % Passing #50     346     0.784    0.719    SD     <0.001    No
  % Passing #100    345     0.826    0.779    SD     <0.001    No
  % Passing #200    352     0.119    0.087    SD     <0.001    Yes
  % Asphalt         359     0.003   -0.004    NSD     0.570    Yes
  Air Voids         319    -0.377   -0.355    NSD     0.584    Yes
  VMA               319    -0.592   -0.546    NSD     0.137    No
  %Gmm             1302    -0.138   -0.005    SD     <0.001    Yes
  Gmm               318     2.433*   2.432*   NSD     0.096    No
  Lab. Gmb          320     2.343*   2.342*   NSD     0.040    No
  Core Gmb         1248     2.259*   2.261*   SD     <0.001    No

* Mean of Gmm or Gmb (not of difference from target).

The comparisons for gradation indicate significant differences in % passing for all sieves except the 1/2” and 3/8” sieves for large projects. This is quite different from the comparisons of unpaired test results in Tables 22 and 23, where none of the differences were significant. The magnitudes of the mean differences in Tables 22 and 23 are similar to those in Tables 26 and 27, so the inconsistency may be attributed to the paired t test being somewhat more discerning than the unpaired t test. The results of the comparisons of paired % asphalt, air voids and VMA test results are the same as for the unpaired comparisons in Tables 22 and 23, i.e., the differences in means are not significant. Likewise, the differences in Gmm and Lab. Gmb are not significant. However, unlike the unpaired data, the differences for the paired %Gmm test results are significant. This is surprising, as are the significant differences for core Gmb. The same cores are tested by Florida DOT and the contractors and used to compute %Gmm, and it would seem that pairing test results from the same cores would make significant differences unlikely, but this was not the case. An analysis of the magnitudes of the mean differences of %Gmm in Tables 22, 23, 26 and 27 provides a clue as to why the paired results are significantly different. The magnitudes of the mean differences are reasonably consistent for all except the paired contractor data, i.e., ∆_CONT = −0.052 and −0.005%, which indicate compaction much closer to target densities than any of the other test results. The next analysis performed was project by project comparisons for the large projects, i.e., projects with 6 or more Florida DOT test results. This analysis quantified the numbers of projects where there were significant differences between Florida DOT and contractor means and variances, and the numbers of projects

where Florida DOT means and variances were the largest. These analyses are summarized in Tables 28-30 for Contractor QC vs. Florida DOT Verification, Contractor QC vs. Florida DOT ISVT, and paired Contractor QC vs. paired Florida DOT Verification test results, respectively. The project by project comparisons generally confirm the trends indicated by the comparisons of combined test results. These trends can be summarized as follows:
• Numerically, the differences from target values of Florida DOT test results tend to be larger than those of contractor test results. Evidence for this conclusion is the percentages in column 3 of Tables 28-30, which are mostly greater than 50%. As noted previously for VMA, the opposite is true for the numerical differences, but contractor test results are closer to the minimum values. A graphical illustration is provided in Figure 19, where contractor and Florida DOT Verification air voids mean differences are plotted. Points for 25 of the 28 projects (89%) fall in the portion of the figure bounded by the dashed lines of absolute equality and centered about the horizontal axis. A complete set of figures for mean differences and variances for all the Florida DOT project by project comparisons is contained in Appendix B.
• Numerically, the variances of Florida DOT test results are larger than the variances of contractor test results. Evidence for this conclusion is the percentages in column 6 of Tables 28-30, which are mostly greater than 50%. A graphical illustration is provided in Figure 20, where contractor and Florida DOT Verification air voids variances are plotted. Points for 23 of the 28 projects (82%) fall below the dashed line of equality.

Table 28. Project by Project Comparisons of Florida DOT Verification and Contractor QC Test Results

  Property         Projects   Larger FDOT ∆   SD ∆      Sig. Larger FDOT ∆   Larger FDOT s²   SD s²      Sig. Larger FDOT s²   Pay
  % Passing 1/2”      29        20 (69%)      1 (3%)         1 (3%)             14 (48%)      3 (10%)         2 (7%)           No
  % Passing 3/8”      29        19 (66%)      0              0                  13 (45%)      2 (7%)          2 (7%)           No
  % Passing #4        29        17 (59%)      0              0                  14 (48%)      4 (14%)         4 (14%)          No
  % Passing #8        30        24 (80%)      1 (3%)         1 (3%)             18 (60%)      3 (10%)         3 (10%)          Yes
  % Passing #16       29        19 (66%)      1 (3%)         1 (3%)             15 (52%)      1 (3%)          1 (3%)           No
  % Passing #30       29        16 (55%)      2 (7%)         2 (7%)             12 (41%)      0               0                No
  % Passing #50       29        14 (48%)      1 (3%)         1 (3%)             13 (45%)      2 (7%)          1 (3%)           No
  % Passing #100      29        18 (62%)      1 (3%)         1 (3%)             17 (59%)      2 (7%)          2 (7%)           No
  % Passing #200      30        21 (70%)      2 (7%)         1 (3%)             11 (37%)      6 (20%)         5 (17%)          Yes
  % Asphalt           30        22 (73%)      0              0                  18 (60%)      3 (10%)         3 (10%)          Yes
  Air Voids           28        25 (89%)      0              0                  23 (82%)      3 (11%)         3 (11%)          Yes
  VMA                 28        17 (61%)*     0              0                  20 (71%)      2 (7%)          2 (7%)           No
  %Gmm                49        32 (65%)      2 (4%)         2 (4%)             33 (67%)      2 (4%)          1 (2%)           Yes
  Gmm                 29           -          0              -                  12 (41%)      1 (3%)          0                No
  Lab. Gmb            29           -          0              -                  21 (72%)      2 (7%)          2 (7%)           No
  Core Gmb            49           -          1 (2%)         -                  26 (53%)      1 (2%)          0                No

Numbers in parentheses are percentages of total numbers of projects.
* Minimum VMA requirements are specified. These numbers indicate the projects and percentages of projects where FDOT VMA test results were smaller than contractor VMA test results.
There are no targets for Gmm, Lab. Gmb or Core Gmb and, therefore, which test results might be larger is of no particular importance. Means of test results rather than means of differences between test results and targets are compared.

Table 29. Project by Project Comparisons of Florida DOT ISVT and Contractor QC Test Results

  Property         Projects   Larger FDOT ∆   SD ∆      Sig. Larger FDOT ∆   Larger FDOT s²   SD s²      Sig. Larger FDOT s²   Pay
  % Passing 1/2”      25        12 (48%)      1 (4%)         1 (4%)             18 (72%)      5 (20%)         5 (20%)          No
  % Passing 3/8”      24        11 (46%)      2 (8%)         1 (4%)             17 (71%)      3 (12%)         3 (12%)          No
  % Passing #4        25        13 (52%)      2 (8%)         0                  18 (72%)      5 (20%)         5 (20%)          No
  % Passing #8        25        16 (64%)      1 (4%)         0                  18 (72%)      5 (20%)         5 (20%)          Yes
  % Passing #16       25        14 (56%)      1 (4%)         1 (4%)             14 (56%)      1 (4%)          1 (4%)           No
  % Passing #30       25        14 (56%)      1 (4%)         1 (4%)             15 (60%)      1 (4%)          1 (4%)           No
  % Passing #50       25        12 (48%)      1 (4%)         0                  21 (84%)      2 (8%)          2 (8%)           No
  % Passing #100      25        15 (60%)      1 (4%)         1 (4%)             14 (56%)      1 (4%)          1 (4%)           No
  % Passing #200      25        17 (68%)      2 (8%)         1 (4%)             16 (64%)      0               0                Yes
  % Asphalt           26        19 (73%)      2 (8%)         2 (8%)             16 (62%)      3 (12%)         3 (12%)          Yes
  Air Voids           24        18 (75%)      0              0                  16 (67%)      3 (12%)         3 (12%)          Yes
  VMA                 24        10 (42%)*     2 (8%)         2 (8%)*            18 (75%)      4 (17%)         4 (17%)          No
  %Gmm                14        10 (71%)      3 (21%)        3 (21%)             7 (50%)      1 (7%)          1 (7%)           Yes
  Gmm                 25           -          2 (8%)         -                  14 (56%)      0               0                No
  Lab. Gmb            25           -          1 (4%)         -                  15 (60%)      1 (4%)          1 (4%)           No
  Core Gmb            13           -          3 (23%)        -                   6 (46%)      2 (15%)         2 (15%)          No

Numbers in parentheses are percentages of total numbers of projects.
* Minimum VMA requirements are specified. These numbers indicate the projects and percentages of projects where FDOT VMA test results were smaller than contractor VMA test results.
There are no targets for Gmm, Lab. Gmb or Core Gmb and, therefore, which test results might be larger is of no particular importance. Means of test results rather than means of differences between test results and targets are compared.

Table 30. Project by Project Comparisons of Paired Florida DOT Verification and Contractor QC Test Results

  Property         Projects   Larger FDOT ∆   SD ∆       Sig. Larger FDOT ∆   Larger FDOT s²   Pay
  % Passing 1/2”      27        16 (59%)      0               0                  13 (48%)      No
  % Passing 3/8”      26        14 (54%)      0               0                  12 (46%)      No
  % Passing #4        27        15 (56%)      0               0                  13 (48%)      No
  % Passing #8        28        18 (64%)      1 (4%)          0                  18 (64%)      Yes
  % Passing #16       27        17 (63%)      1 (4%)          1 (4%)             17 (63%)      No
  % Passing #30       27        18 (67%)      1 (4%)          1 (4%)             17 (63%)      No
  % Passing #50       27        15 (55%)      0               0                  16 (59%)      No
  % Passing #100      27        13 (48%)      3 (11%)         3 (11%)            19 (70%)      No
  % Passing #200      28        19 (68%)      2 (7%)          2 (7%)             14 (50%)      Yes
  % Asphalt           28        19 (68%)      0               0                  16 (57%)      Yes
  Air Voids           27        23 (85%)      0               0                  19 (70%)      Yes
  VMA                 27        15 (56%)*     0               0                  16 (59%)      No
  %Gmm                48        26 (54%)      19 (40%)       11 (23%)            31 (65%)      Yes
  Gmm                 27           -          1 (4%)          -                  12 (44%)      No
  Lab. Gmb            27           -          1 (4%)          -                  14 (52%)      No
  Core Gmb            48           -          12 (25%)        -                  36 (75%)      No

Numbers in parentheses are percentages of total numbers of projects. Statistical comparisons of variances were not conducted.
* Minimum VMA requirements are specified. These numbers indicate the projects and percentages of projects where FDOT VMA test results were smaller than contractor VMA test results.
There are no targets for Gmm, Lab. Gmb or Core Gmb and, therefore, which test results might be larger is of no particular importance. Means of test results rather than means of differences between test results and targets are compared.

[Plot: project FLDOT VT air voids mean differences vs. project Contractor QC air voids mean differences]
Figure 19. Project Air Voids Mean Differences

[Plot: project FLDOT VT air voids variances vs. project Contractor QC air voids variances]
Figure 20. Project Air Voids Variances

• The variances of project test results are more likely to be significantly different than the mean differences. This is confirmed by comparing the numbers and percentages of projects in column 4 of Tables 28 and 29 with those in column 7.
• When mean differences from target values are significantly different, the mean differences for Florida DOT test results are likely larger. This can be confirmed by comparing the numbers and percentages of projects in columns 4 and 5 of Tables 28-30, which are quite similar.
• When variances are significantly different, the variances for Florida DOT test results are likely larger. This can be confirmed by comparing the numbers and percentages of projects in columns 7 and 8 of Tables 28 and 29, which are quite similar. This is graphically illustrated in Figure 20 by the number of points close to the horizontal axis.
A final analysis will compare mean square deviations (nominal is best) computed with the means and variances from Tables 22 and 24. The nominal-is-best mean square deviation (MSD_NIB) is computed with Equation 3. Mean square deviations for VMA are not included because the Florida DOT specification contains minimum acceptable VMA requirements; a larger-is-best situation is therefore applicable, but the appropriate statistics were not available for computing MSD_LIB. The mean square deviations are contained in Table 31. These values confirm the trends indicated by the comparisons of mean differences and variances: contractor test results are more accurate, relative to target values, and more precise (less variable) than Florida DOT test results. The mean square deviations for contractor test results are all smaller than those for Florida DOT test results from split (Verification) and independent (ISVT) samples.

Table 31. Comparisons of Mean Square Deviations (MSD_NIB) for Florida DOT Data

  Property          Contractor QC   FDOT Verification   FDOT ISVT
  % Asphalt*            0.062            0.084             0.086
  Air Voids*            0.768            1.392             1.403
  %Gmm*                 0.977            1.281             1.422
  % Passing 1/2”        3.036            4.086             3.810
  % Passing 3/8”        9.936           10.516            13.194
  % Passing #4         13.828           18.709            18.474
  % Passing #8*         5.779            7.994            10.254
  % Passing #16         6.006            6.626             7.598
  % Passing #30         5.055            5.683             6.245
  % Passing #50         5.101            6.218             5.865
  % Passing #100        2.247            2.693             4.752
  % Passing #200*       0.381            0.509             0.497

* Property used for pay factor computation.

ANALYSIS OF NORTH CAROLINA DOT HOT MIX ASPHALT CONCRETE DATA

Test results for HMAC produced and placed for the North Carolina DOT during the 2004 construction year were provided. The North Carolina DOT manages HMAC production by job mix formula (JMF). Compaction is managed by project and, therefore, there is a disconnect between test results for mix properties and mat properties. However, mat densities were provided in a format such that sorting, and therefore analysis, was convenient only by JMF.

Mix Properties Comparisons

Test results for mix properties were received for a total of 735 mix designs. These were combined into a data set of all JMFs and sorted into a reduced data set comprising the JMFs with 6 or more North Carolina DOT test results. Proximities to targets and variances of contractor and North Carolina DOT test results were compared for the combined and reduced data sets. Comparisons were also conducted for each JMF with 6 or more North Carolina DOT test results. Test results included gradation (percent passing the 1”, 3/4”, 1/2”, 3/8”, #4, #8 and #200 sieves), asphalt content, air voids, VMA, VFA and % Gmm @ Ni in the gyratory compactor. The ignition oven method is used for asphalt content and gradation, except that the contractor may request an alternative method for asphalt content. Individual test results and moving averages of 4 test results for the percents passing the #8 and #200 sieves, asphalt content, air voids, VMA and % Gmm @ Ni are plotted on control charts with control limits. When test results exceed control limits, a series of actions may be taken

that include notification of the engineer, process adjustment, additional testing, retesting and, as a last resort, pay reduction. Figure 21 illustrates North Carolina DOT sampling and testing requirements for managing the production of HMAC. The increment for Contractor QC sampling and testing is 750 tons, but mix acceptance is not on a LOT basis. For mat compaction, a LOT is a day's production. North Carolina DOT conducts two types of mix sampling and testing: QA and Verification. North Carolina DOT QA tests are performed on samples split with Contractor QC samples; North Carolina DOT Verification tests are performed on independent samples. Contractor QC and North Carolina DOT QA test results are compared one to one with numerical criteria, with control limits on control charts, and with specification requirements. Unacceptable comparisons result in an investigation, as described below:

"In the event comparison test results are outside the above acceptable limits of precision or the quality assurance test results are either outside the individual test control limits or fail to meet specification requirements, the engineer will immediately investigate the reason for the difference."

Pay adjustments for mix properties appear to be applied only as a last resort. It was not clear from the review of the specifications exactly how North Carolina DOT Verification test results are used in the QA process. The first comparisons made were between Contractor QC and North Carolina DOT QA test results for all 735 JMFs. Table 32 contains comparisons of variances and of means of differences from target values. The first 6 properties in the table are plotted on control charts and are, therefore, used directly in the acceptance process.
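The control-chart checks described above (individual results and moving averages of 4 results against control limits) can be sketched as follows; the limits and window size in the example are illustrative, not North Carolina DOT values:

```python
def control_chart_flags(results, ind_limits, ma_limits, window=4):
    """Return indices of individual results outside ind_limits and of
    moving averages (of `window` results) outside ma_limits.  Limits are
    (lower, upper) tuples; moving-average limits are normally tighter
    than individual limits."""
    lo_i, hi_i = ind_limits
    lo_m, hi_m = ma_limits
    individual = [i for i, x in enumerate(results) if not lo_i <= x <= hi_i]
    moving = [i for i in range(window - 1, len(results))
              if not lo_m <= sum(results[i - window + 1:i + 1]) / window <= hi_m]
    return individual, moving
```

A flagged index would then trigger the escalating actions listed above, from notification of the engineer through, as a last resort, pay reduction.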

Notes:
1. Contractor QC and NCDOT QA tests on split samples.
2. NCDOT Verification tests on independent samples.
3. NCDOT QA tests at 10% and NCDOT Verification tests at 5% of Contractor QC test rate.
4. Contractor QC and NCDOT QA test results compared one to one with precision limits.

Figure 21. North Carolina DOT HMAC Mix Sampling and Testing Requirements

Table 32. Comparison of NCDOT QA and Contractor QC Test Results – All JMFs

Property       n(NCDOT)  n(CONT)  s²(NCDOT)  s²(CONT)  Diff.  p-Value  ∆(NCDOT)  ∆(CONT)  Diff.  p-Value  Control
% Asphalt      2295      14396    0.095      0.059     SD     <0.001   -0.021    -0.003   SD     0.008    Yes
Air Voids      2269      14225    1.080      0.564     SD     <0.001   -0.212    -0.097   SD     <0.001   Yes
VMA            2268      14225    1.856      1.803     NSD    0.177    1.224     1.507    SD     <0.001   Yes
% Gmm @ Ni     2223      14017    2.665      2.091     SD     <0.001   -0.969    -1.028   NSD    0.107    Yes
% Pass #200    2294      14396    0.765      0.490     SD     <0.001   0.221     0.095    SD     <0.001   Yes
% Pass #8      2296      14397    14.519     8.210     SD     <0.001   0.335     0.307    NSD    0.729    Yes
% Pass #4      2258      14175    20.717     15.277    SD     <0.001   1.333     1.140    NSD    0.056    No
% Pass 3/8"    2251      14194    19.008     13.753    SD     <0.001   0.928     0.547    SD     <0.001   No
% Pass 1/2"    2281      14347    15.349     10.621    SD     <0.001   0.921     0.506    SD     <0.001   No
% Pass 3/4"    2281      14335    6.903      5.316     SD     <0.001   0.179     0.121    NSD    0.316    No
% Pass 1"      2277      14383    1.385      0.992     SD     <0.001   0.018     -0.020   NSD    0.147    No
VFA            2015      12795    45.123     28.629    SD     <0.001   8.863     8.574    NSD    0.066    No

Except for VMA, Table 32 indicates that the variances of North Carolina DOT and contractor test results are statistically significantly different. For all properties, North Carolina DOT variances are larger. Significant differences for means are not so consistent. Table 32 indicates that means of 4 of the 6 properties used in control charts are statistically significantly different, but that only 2 of the remaining 6 properties have significantly different means. Except for VMA and % Gmm @ Ni, the means of differences from target values indicate contractor test results are closer to targets than North Carolina DOT test results. The specification requirement for VMA is a minimum acceptable value, and for % Gmm @ Ni it is a maximum acceptable value. The means of differences in Table 32 indicate more favorable contractor test results for both VMA and % Gmm @ Ni.

Comparisons of paired Contractor QC and NCDOT QA test results (paired t tests) are contained in Table 33. These comparisons indicate statistically significant differences for means of all properties. Except for % passing the 1" sieve, contractor test results are either closer to target values or, for VMA and % Gmm @ Ni, more favorable relative to specification requirements.

The comparisons summarized in Table 34 are for test results in a reduced data set (JMFs with n(NCDOT) ≥ 6). The numbers of test results are about 40% of those in Table 32 and represent about 110 of the 735 JMFs. There are a few differences for specific comparisons, but the general trends indicated in Table 32 are confirmed by Table 34.
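The unpaired comparisons in Tables 32 through 34 test equality of variances and of means of differences from targets. The report does not state the exact procedures, so the sketch below shows one plausible form, an F-type variance ratio and Welch's t statistic computed from raw samples; the p-values in the tables would come from the corresponding F and t distributions.

```python
import math
from statistics import mean, variance

def variance_ratio(x, y):
    """F statistic for comparing two sample variances (larger over smaller)."""
    vx, vy = variance(x), variance(y)
    return max(vx, vy) / min(vx, vy)

def welch_t(x, y):
    """Welch's t statistic for unpaired means with unequal variances,
    a plausible choice given the significantly different variances."""
    se = math.sqrt(variance(x) / len(x) + variance(y) / len(y))
    return (mean(x) - mean(y)) / se
```

The paired comparisons of Table 33, by contrast, would apply an ordinary one-sample t test to the within-pair differences of the split-sample results.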

Table 33. Comparison of Paired NCDOT QA and Contractor QC Test Results – All JMFs

Property       n     ∆(NCDOT)  ∆(CONT)  Diff.  p-Value  Control
% Asphalt      2287  -0.021    -0.002   SD     <0.001   Yes
Air Voids      2261  -0.214    -0.112   SD     <0.001   Yes
VMA            2260  1.221     1.464    SD     <0.001   Yes
% Gmm @ Ni     2214  -0.966    -1.129   SD     <0.001   Yes
% Pass #200    2286  0.222     0.130    SD     <0.001   Yes
% Pass #8      2286  0.366     0.161    SD     <0.001   Yes
% Pass #4      2249  1.341     1.062    SD     <0.001   No
% Pass 3/8"    2243  0.906     0.609    SD     <0.001   No
% Pass 1/2"    2273  0.904     0.517    SD     <0.001   No
% Pass 3/4"    2273  0.176     0.048    SD     0.007    No
% Pass 1"      2268  0.020     -0.048   SD     0.006    No
VFA            2005  8.900     8.629    SD     0.005    No

Table 34. Comparison of NCDOT QA and Contractor QC Test Results – JMFs with n(NCDOT) ≥ 6

Property       n(NCDOT)  n(CONT)  s²(NCDOT)  s²(CONT)  Diff.  p-Value  ∆(NCDOT)  ∆(CONT)  Diff.  p-Value  Control
% Asphalt      994       6059     0.083      0.057     SD     <0.001   -0.017    -0.001   NSD    0.098    Yes
Air Voids      973       5920     1.018      0.490     SD     <0.001   -0.285    -0.115   SD     <0.001   Yes
VMA            973       5920     2.338      2.374     NSD    0.380    1.186     1.451    SD     <0.001   Yes
% Gmm @ Ni     960       5864     2.400      1.971     SD     <0.001   -0.863    -0.981   NSD    0.027    Yes
% Pass #200    994       6059     0.751      0.462     SD     <0.001   0.179     0.064    SD     <0.001   Yes
% Pass #8      993       6059     11.955     8.412     SD     <0.001   0.524     0.321    NSD    0.080    Yes
% Pass #4      972       5926     17.785     13.627    SD     <0.001   1.504     1.166    NSD    0.019    No
% Pass 3/8"    972       5926     14.027     11.738    SD     <0.001   0.815     0.424    SD     0.002    No
% Pass 1/2"    989       6039     11.608     9.597     SD     <0.001   0.731     0.505    NSD    0.050    No
% Pass 3/4"    988       6009     4.849      4.357     NSD    0.012    0.033     0.079    NSD    0.531    No
% Pass 1"      994       6058     1.056      0.799     SD     <0.001   -0.054    -0.002   NSD    0.128    No
VFA            836       5210     35.811     24.960    SD     <0.001   9.657     8.845    SD     <0.001   No

Comparisons of paired Contractor QC and NCDOT QA test results in the reduced data set are contained in Table 35. As was the case for all JMFs, the comparisons of paired test results indicate more consistent statistically significant differences than comparisons of unpaired test results. Only the comparisons for % passing the 3/4" and 1" sieves are not significantly different.

The reduced data set was also analyzed by comparing JMF statistics. The analyses are similar to the project-by-project comparisons for Florida and Georgia DOT test results and are summarized in Table 36. The JMF by JMF comparisons generally confirm trends indicated by comparisons of combined test results. These trends can be summarized as follows:

• Numerically, differences from target values of North Carolina DOT test results tend to be larger than those of contractor test results. Evidence for this conclusion is the percentages in column 3 of Table 36, which are equal to 50% for % passing the 1/2" sieve and greater than 50% for all other properties. As noted previously, the interpretation for VMA and % Gmm @ Ni is different. A graphical illustration for asphalt content is provided in Figure 22. Points for 80 of the 112 JMFs (71%) fall in the portion of the figure bounded by the dashed lines of absolute equality and centered on the horizontal axis. A complete set of figures for means and variances of the comparisons in Table 36 is contained in Appendix C. A final observation is that use of a property on control charts for acceptance appears to affect the percentages in column 3. The average is 72% for the first 6 properties, which are used for acceptance, and 61% for the last 6 properties, which are not used for acceptance.

Table 35. Comparison of Paired NCDOT QA and Contractor QC Test Results – JMFs with n(NCDOT) ≥ 6

Property       n    ∆(NCDOT)  ∆(CONT)  Diff.  p-Value  Control
% Asphalt      992  -0.017    0.005    SD     0.003    Yes
Air Voids      971  -0.286    -0.146   SD     <0.001   Yes
VMA            971  1.185     1.452    SD     <0.001   Yes
% Gmm @ Ni     956  -0.863    -1.069   SD     <0.001   Yes
% Pass #200    992  0.181     0.124    SD     0.006    Yes
% Pass #8      991  0.526     0.223    SD     <0.001   Yes
% Pass #4      970  1.510     1.085    SD     <0.001   No
% Pass 3/8"    970  0.819     0.491    SD     <0.001   No
% Pass 1/2"    987  0.728     0.459    SD     0.002    No
% Pass 3/4"    986  0.038     -0.024   NSD    0.314    No
% Pass 1"      991  -0.047    -0.031   NSD    0.618    No
VFA            834  9.663     9.112    SD     <0.001   No

Table 36. JMF by JMF Comparisons of North Carolina DOT QA and Contractor QC Mix Properties Test Results

Property       JMFs  Larger       SD ∆      Sig. Larger  Larger     SD s²     Sig. Larger  Control
                     NCDOT ∆                NCDOT ∆      NCDOT s²             NCDOT s²
% Asphalt      112   80 (71%)*    5 (4%)    5 (4%)       59 (53%)   9 (8%)    7 (6%)       Yes
Air Voids      110   89 (81%)     15 (14%)  15 (14%)     88 (80%)   8 (7%)    8 (7%)       Yes
VMA            110   89 (81%)**   15 (14%)  15 (14%)**   73 (66%)   11 (10%)  9 (8%)       Yes
% Gmm @ Ni     108   71 (66%)***  16 (15%)  14 (13%)***  77 (71%)   9 (8%)    7 (6%)       Yes
% Pass #200    112   81 (72%)     28 (25%)  18 (16%)     61 (54%)   16 (14%)  13 (12%)     Yes
% Pass #8      112   71 (63%)     5 (4%)    4 (4%)       70 (62%)   9 (8%)    6 (5%)       Yes
% Pass #4      110   74 (67%)     2 (2%)    2 (2%)       55 (50%)   4 (4%)    4 (4%)       No
% Pass 3/8"    110   66 (60%)     4 (4%)    3 (3%)       48 (44%)   8 (7%)    4 (4%)       No
% Pass 1/2"    92    46 (50%)     5 (5%)    4 (4%)       43 (47%)   11 (12%)  11 (12%)     No
% Pass 3/4"    46    28 (61%)     0         0            21 (46%)   1 (2%)    1 (2%)       No
% Pass 1"      21    12 (57%)     0         0            12 (57%)   1 (5%)    1 (5%)       No
VFA            94    60 (64%)     12 (13%)  10 (11%)     72 (77%)   10 (11%)  9 (10%)      No

* Numbers in parentheses are percentages of JMFs.
** Minimum VMA requirements are specified. These numbers indicate projects and percentages of projects where NCDOT VMA test results were smaller than contractor test results.
*** Maximum % Gmm @ Ni requirements are specified. These numbers indicate projects and percentages of projects where NCDOT % Gmm @ Ni test results were larger than contractor test results.

[Figure: scatter plot of JMF Contractor QC means versus JMF NCDOT QA means, with dashed equality bounds]
Figure 22. JMF Asphalt Content Mean Differences

[Figure: scatter plot of JMF Contractor QC variances versus JMF NCDOT QA variances, with line of equality]
Figure 23. JMF Asphalt Content Variances

• Numerically, the variances of North Carolina DOT test results are generally larger than the variances of contractor test results. Evidence for this conclusion is the percentages in column 6 of Table 36, which are mostly (9 of 12) greater than or equal to 50%. A graphical illustration for asphalt content is shown in Figure 23, where points for 59 of 112 JMFs (53%) plot below the line of equality. The average percentage for the first 6 properties, which are used for acceptance, is 64%. For the last 6 properties, which are not used for acceptance, the average is 54%. The 54% average for the last 6 properties and the number of percentages near 50% are somewhat unusual; larger, and significantly larger, DOT variances have been the norm in most analyses of data from Georgia and Florida.

• When mean differences from target values are significantly different, mean differences for North Carolina DOT test results are likely larger (90 of 107 JMFs). This can be confirmed by comparing the numbers and percentages of JMFs in columns 4 and 5 of Table 36, which are quite similar. Again, the use of a property for acceptance affects the percentages in columns 4 and 5. The average is 13% of JMFs with significantly different means for the first 6 properties (those used for acceptance) and 4% for the last 6 properties (column 4). The average is 11% of JMFs with significantly larger North Carolina DOT means for the first 6 properties and 3% for the last 6 properties (column 5).

• When variances are significantly different, variances for North Carolina DOT are likely larger (80 of 97 JMFs). This can be confirmed by comparing the numbers and percentages of JMFs in columns 7 and 8, which are quite similar. The distribution of points along the horizontal axis in Figure 23 graphically illustrates the trend of significantly larger North Carolina DOT variances.
The second set of comparisons was between Contractor QC and NCDOT Verification test results from all 735 JMFs. These test results are from independent

samples. Table 37 contains comparisons of variances and means of differences from target values. The n(NCDOT) values in Table 37 are about 35% of the n(NCDOT) values in Table 32.

Except for % Gmm @ Ni, Table 37 indicates that the variances of North Carolina DOT and contractor test results are statistically significantly different. For all properties, North Carolina DOT variances are larger. The comparisons of variances in Table 37 for test results from independent samples are quite similar to the comparisons in Table 32 for test results from split samples, the only difference being the one property in each (VMA among split samples and % Gmm @ Ni among independent samples) that is not significantly different.

Significant differences for means are not as consistent. Table 37 indicates that means for 3 of the 6 properties used in control charts are statistically significantly different, but none of the remaining 6 properties have significantly different means. In total, only 3 of 12 means are significantly different, compared to 6 of 12 in Table 32. These numbers seem reasonable since Table 32 comparisons are for split-sample test results and Table 37 comparisons are for test results from independent samples. Contractor means of differences are smaller for all properties except VMA and % Gmm @ Ni. However, the mean differences for these properties also indicate more favorable contractor test results.

Table 37. Comparison of NCDOT Verification and Contractor QC Test Results – All JMFs

Property       n(NCDOT)  n(CONT)  s²(NCDOT)  s²(CONT)  Diff.  p-Value  ∆(NCDOT)  ∆(CONT)  Diff.  p-Value  Control
% Asphalt      814       14396    0.082      0.059     SD     <0.001   -0.021    -0.003   NSD    0.067    Yes
Air Voids      817       14225    1.079      0.564     SD     <0.001   -0.161    -0.097   NSD    0.086    Yes
VMA            808       14225    2.130      1.803     SD     <0.001   1.217     1.507    SD     <0.001   Yes
% Gmm @ Ni     798       14017    2.262      2.091     NSD    0.060    -0.826    -1.028   SD     <0.001   Yes
% Pass #200    814       14396    0.841      0.490     SD     <0.001   0.194     0.095    SD     0.002    Yes
% Pass #8      814       14397    13.829     8.210     SD     <0.001   0.489     0.307    NSD    0.168    Yes
% Pass #4      798       14175    22.830     15.277    SD     <0.001   1.428     1.140    NSD    0.094    No
% Pass 3/8"    797       14194    20.914     13.753    SD     <0.001   0.749     0.547    NSD    0.220    No
% Pass 1/2"    810       14347    16.661     10.621    SD     <0.001   0.776     0.506    NSD    0.065    No
% Pass 3/4"    808       14335    7.208      5.316     SD     <0.001   0.139     0.121    NSD    0.852    No
% Pass 1"      810       14383    1.586      0.992     SD     <0.001   -0.032    -0.020   NSD    0.792    No
VFA            723       12795    48.719     28.629    SD     <0.001   8.919     8.574    NSD    0.192    No

The comparisons summarized in Table 38 are for a reduced data set (JMFs for which n(NCDOT) ≥ 6). The numbers of test results are only about 10% of those in Table 37 and represent only 12 of the 735 JMFs. There are a few differences in the various comparisons, but the general trends indicated in Table 38 are similar to those indicated in Table 37.

The reduced Verification data set was also analyzed with JMF by JMF comparisons. The comparisons are similar to those for the reduced QA data set (Table 36) and are summarized in Table 39. The number of JMFs compared by property in Table 39 (12 maximum) is small compared to the number in Table 36 (112 maximum), but the general trends demonstrated are similar.

Mean square deviations for mix properties are summarized in Table 40. The nominal-is-best (NIB) condition is applicable for the properties in Table 40 since each has a target value. VMA and % Gmm @ Ni are not included since they have variable minimum and maximum acceptable values, respectively, and statistics for computation were not available. Statistics (s² and ∆) for all JMFs from Tables 32 and 37 were used in Equation 3 to compute MSD(NIB). The MSD(NIB) for Contractor QC tests were smallest for all properties, indicating the best process control (material properties). The North Carolina DOT QA test MSD(NIB) were largest for three properties, and the North Carolina DOT Verification test MSD(NIB) were largest for seven properties.
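Equation 3 appears earlier in the report; assuming it is the standard nominal-is-best form, MSD(NIB) = s² + ∆², the Table 40 entries can be reproduced from the statistics in Tables 32 and 37. For example, the VFA row:

```python
def msd_nib(s2, delta):
    """Nominal-is-best mean square deviation (assumed form of Equation 3):
    variance plus squared mean offset from the target."""
    return s2 + delta ** 2

# VFA entries of Table 40 from Table 32 statistics (s², mean difference)
contractor_vfa = msd_nib(28.629, 8.574)  # about 102.142
ncdot_qa_vfa = msd_nib(45.123, 8.863)    # about 123.676
```

That these computed values match Table 40 supports the assumed form of Equation 3.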

Table 38. Comparison of NCDOT Verification and Contractor QC Test Results – JMFs with n(NCDOT) ≥ 6

Property       n(NCDOT)  n(CONT)  s²(NCDOT)  s²(CONT)  Diff.  p-Value  ∆(NCDOT)  ∆(CONT)  Diff.  p-Value  Control
% Asphalt      93        1318     0.041      0.044     NSD    0.350    -0.041    -0.003   NSD    0.093    Yes
Air Voids      92        1318     0.663      0.403     SD     <0.001   -0.218    -0.111   NSD    0.220    Yes
VMA            92        1318     0.798      0.689     NSD    0.154    0.733     1.043    SD     0.001    Yes
% Gmm @ Ni     93        1318     0.740      1.030     NSD    0.022    -0.415    -0.717   SD     0.005    Yes
% Pass #200    93        1318     0.721      0.375     SD     <0.001   0.013     0.053    NSD    0.653    Yes
% Pass #8      93        1318     9.535      6.387     SD     0.002    0.495     0.291    NSD    0.535    Yes
% Pass #4      93        1318     13.529     8.879     SD     0.001    1.419     0.848    NSD    0.146    No
% Pass 3/8"    93        1318     10.013     7.911     NSD    0.050    -0.290    -0.957   NSD    0.029    No
% Pass 1/2"    93        1318     10.056     4.511     SD     <0.001   0.290     -0.162   NSD    0.178    No
% Pass 3/4"    93        1318     6.846      2.647     SD     <0.001   -0.258    0.083    NSD    0.218    No
% Pass 1"      93        1318     1.839      0.324     SD     <0.001   -0.290    -0.035   NSD    0.074    No
VFA            84        1227     32.760     20.346    SD     0.001    8.818     8.498    NSD    0.617    No

Table 39. JMF by JMF Comparisons of North Carolina DOT Verification and Contractor QC Mix Properties Test Results

Property       JMFs  Larger      SD ∆      Sig. Larger  Larger     SD s²     Sig. Larger  Control
                     NCDOT ∆               NCDOT ∆      NCDOT s²             NCDOT s²
% Asphalt      12    8 (67%)     0         0            5 (42%)    1 (8%)    0            Yes
Air Voids      11    10 (91%)    1 (9%)    1 (9%)       8 (73%)    2 (18%)   2 (18%)      Yes
VMA            11    8 (73%)**   2 (18%)   2 (18%)**    6 (55%)    1 (9%)    1 (9%)       Yes
% Gmm @ Ni     12    8 (67%)***  1 (8%)    1 (8%)***    7 (58%)    1 (8%)    0            Yes
% Pass #200    12    10 (83%)    5 (42%)   5 (42%)      8 (67%)    0         0            Yes
% Pass #8      12    7 (58%)     0         0            6 (50%)    0         0            Yes
% Pass #4      12    9 (75%)     0         0            5 (42%)    0         0            No
% Pass 3/8"    12    5 (42%)     0         0            5 (42%)    1 (8%)    0            No
% Pass 1/2"    8+    2 (25%)     1 (12%)   0            3 (38%)    1 (12%)   1 (12%)      No
% Pass 3/4"    5+    3 (60%)     1 (20%)   1 (20%)      2 (40%)    1 (20%)   0            No
% Pass 1"      2+    2 (100%)    0         0            2 (100%)   1 (50%)   1 (50%)      No
VFA            11    8 (73%)     1 (9%)    1 (9%)       8 (73%)    1 (9%)    1 (9%)       No

Numbers in parentheses are percentages of JMFs.
** Minimum VMA requirements are specified. These numbers indicate JMFs and percentages of JMFs where NCDOT VMA test results were smaller than contractor test results.
*** Maximum % Gmm @ Ni requirements are specified. These numbers indicate JMFs and percentages of JMFs where NCDOT % Gmm @ Ni test results were larger than contractor test results.
+ JMFs with 100% passing required and achieved were not included.

Table 40. Comparisons of Mean Square Deviations for North Carolina DOT Mix Data

                 MSD(NIB)
Property         Contractor QC  NCDOT QA  NCDOT Verification
% Asphalt*       0.059          0.095     0.082
Air Voids*       0.573          1.125     1.105
VFA              102.142        123.676   126.494
% Passing #200*  0.499          0.814     0.879
% Passing #8*    8.304          14.631    14.068
% Passing #4     16.577         22.494    24.869
% Passing 3/8"   14.052         19.869    21.475
% Passing 1/2"   10.877         16.197    17.263
% Passing 3/4"   5.331          6.935     7.227
% Passing 1"     0.992          1.385     1.587

* Property used for control.

Mat Density Comparisons

As noted previously, mat compaction is managed by project and acceptance is by LOT. A LOT is a day's production, and contractors may choose testing with nuclear gages or cores. A minimum mat density of 92% of Gmm is specified, and pay factors are computed with the equation

PF = 100 - 10(D)^1.465 ................................................................................(4)

where D = the deficiency in average LOT density, i.e., the amount by which average LOT density is less than 92% of Gmm. Average LOT compaction of 92% of Gmm and greater results in 100% pay.

Figure 24 illustrates North Carolina DOT's nuclear gage mat density testing requirements. Contractors conduct 5 tests at equal intervals in each 2000-foot test section. The North Carolina DOT conducts retests (same locations) in 10% of the test sections and conducts verification tests (independent locations) in 5% of the test sections. Results are reported as the average of 5 tests.

Analysis of combined nuclear gage testing for 141 JMFs is summarized in Table 41. The comparisons of variances and means are consistent for both North Carolina DOT Retest and Verification results and for all and for large JMFs (n(NCDOT) ≥ 6).

• Variances of North Carolina DOT nuclear gage mat density tests are significantly larger than variances of contractor tests.
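Equation 4 can be written directly as a function; this is a sketch (the function name is illustrative) that applies the full-pay rule at or above the 92% Gmm minimum:

```python
def pay_factor(avg_lot_density, minimum=92.0):
    """Pay factor per Equation 4: PF = 100 - 10 * D**1.465, where D is the
    deficiency of average LOT density below the minimum (% of Gmm).
    D is zero, and pay is 100%, at or above the minimum."""
    d = max(0.0, minimum - avg_lot_density)
    return 100.0 - 10.0 * d ** 1.465
```

For example, a one-point deficiency (average LOT density of 91% of Gmm) gives exactly 90% pay, since 10 × 1^1.465 = 10.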

Notes:
1. Contractor may select cores or nuclear gages for mat density testing.
2. 5 tests per 2000 LF test section. Results reported as average of 5 tests.
3. NCDOT Retesting at 10% of Contractor QC rate. Conducted at same test locations.
4. NCDOT Verification testing at 5% of Contractor QC rate. Independent test locations.
5. Contractor QC and NCDOT Retest compared (acceptable difference is ±2% Gmm).

Figure 24. North Carolina DOT HMAC Nuclear Gage Mat Density Testing Requirements

Table 41. Comparisons of North Carolina DOT and Contractor Nuclear Gage Mat Density Test Results

Data Set                         n(NCDOT)  n(CONT)  s²(NCDOT)  s²(CONT)  Diff.  p-Value  ∆(NCDOT)  ∆(CONT)  Diff.  p-Value
QC vs. Retest, all JMFs*         1255      9011     2.200      1.657     SD     <0.001   0.695     1.114    SD     <0.001
QC vs. Retest, n(NCDOT) ≥ 6**    1090      7466     1.992      1.118     SD     <0.001   0.663     1.118    SD     <0.001
QC vs. Verif., all JMFs*         588       9011     2.200      1.657     SD     <0.001   0.489     1.114    SD     <0.001
QC vs. Verif., n(NCDOT) ≥ 6***   379       4586     1.800      1.326     SD     <0.001   0.593     1.165    SD     <0.001

* Total of 141 JMFs analyzed.
** 76 JMFs with Retest n(NCDOT) ≥ 6.
*** 34 JMFs with Verification n(NCDOT) ≥ 6.

• Mean differences of North Carolina DOT nuclear gage mat density tests from the 92% Gmm minimum target are significantly different from those of contractor tests, and North Carolina DOT tests indicate poorer compaction, i.e., smaller ∆ = X̄ - 92.

The nuclear gage mat density test results were provided in a format such that North Carolina DOT Retest results could not be matched with specific Contractor QC test results. Therefore, paired t tests could not be performed on the subsets of matched Contractor QC and North Carolina DOT Retest test results.

JMF by JMF comparisons of nuclear gage tests are summarized in Table 42. Numerically, North Carolina DOT JMF mean differences from the 92% Gmm minimum indicate lower achieved densities (column 3). This is graphically illustrated in Figures 25 and 27, where 71% and 85%, respectively, of the points plot above the line of equality. In Figures 25 and 27 it is interesting to note the number of JMFs where contractor test results show densities exceeding the 92% Gmm minimum (+∆) but North Carolina DOT test results show densities less than the 92% Gmm minimum (-∆). When JMF mean differences are significantly different, the North Carolina DOT values are likely smaller (columns 4 and 5 of Table 42). Based on the differences in variances for combined tests shown in Table 41, the differences in JMF variances are smaller than expected (61% and 50% in column 6 of Table 42). However, when differences in variances are statistically significant, North Carolina DOT variances are always larger (columns 7 and 8). These trends are graphically illustrated in Figures 26 and 28 by the distribution of points along the horizontal axes.

Table 42. JMF Comparisons of North Carolina DOT and Contractor Nuclear Gage Mat Density Test Results

Data Set       JMFs*  Smaller    SD ∆      Sig. Smaller  Larger     SD s²     Sig. Larger
                      NCDOT ∆              NCDOT ∆       NCDOT s²             NCDOT s²
QC vs. Retest  76     54 (71%)   20 (26%)  18 (24%)      46 (61%)   16 (21%)  16 (21%)
QC vs. Verif.  34     29 (85%)   8 (24%)   7 (21%)       17 (50%)   8 (24%)   8 (24%)

* Total of 141 JMFs analyzed.

[Figure: scatter plot of JMF Contractor QC means versus JMF Retest means, with line of equality]
Figure 25. JMF Nuclear Gage Mat Density Mean Differences – NCDOT Retest and Contractor QC

[Figure: scatter plot of JMF Contractor QC variances versus JMF Retest variances, with line of equality]
Figure 26. JMF Nuclear Gage Mat Density Variances – NCDOT Retest and Contractor QC

[Figure: scatter plot of JMF Contractor QC means versus JMF Verification means, with line of equality]
Figure 27. JMF Nuclear Gage Mat Density Mean Differences – NCDOT Verification and Contractor QC

[Figure: scatter plot of JMF Contractor QC variances versus JMF Verification variances, with line of equality]
Figure 28. JMF Nuclear Gage Mat Density Variances – NCDOT Verification and Contractor QC

As noted previously, contractors may choose testing with either nuclear gages or cores for control and acceptance of mat compaction. Figure 29 illustrates North Carolina DOT core mat density testing requirements. Contractors take and test one core in each 2000-foot test section. The North Carolina DOT conducts retests (same cores) and tests comparison cores (taken adjacent to Contractor QC core locations) in 10% of the test sections. In addition, the North Carolina DOT tests one verification core from an independent location in 5% of the test sections.

Analyses of combined core testing for 585 JMFs are summarized in Table 43. The comparisons are consistent for North Carolina DOT Retest, Comparison, and Verification test results and for all and for large JMFs (n(NCDOT) ≥ 6).

• Variances of North Carolina DOT core mat density tests are significantly larger than variances of contractor tests.

• Mean differences of North Carolina DOT core mat density tests from the 92% Gmm minimum target are significantly different from those of contractor tests, and North Carolina DOT tests indicate poorer compaction, i.e., smaller ∆ = X̄ - 92.

Notes:
1. Contractor may select cores or nuclear gages for mat density testing.
2. Contractors take and test 1 core per 2000 LF test section.
3. NCDOT Retest and NCDOT Comparison Core testing at 10% of Contractor QC rate.
4. NCDOT Comparison Cores taken adjacent to Contractor QC core locations.
5. NCDOT Verification testing at 5% of Contractor QC rate. Cores taken from independent locations.
6. NCDOT Retest and NCDOT Comparison Core test results compared, 1 to 1, with Contractor QC Core test results – NCDOT Retest limit of precision is ±0.030 and NCDOT Comparison Core limit of precision is ±0.050.

Figure 29. North Carolina DOT HMAC Core Mat Density Sampling and Testing Requirements

Table 43. Comparisons of North Carolina DOT and Contractor Core Mat Density Test Results

Data Set                         n(NCDOT)  n(CONT)  s²(NCDOT)  s²(CONT)  Diff.  p-Value  ∆(NCDOT)  ∆(CONT)  Diff.  p-Value
QC vs. Retest, all JMFs*         1530      20282    3.790      3.005     SD     <0.001   0.588     1.109    SD     <0.001
QC vs. Retest, n(NCDOT) ≥ 6**    1260      6379     3.674      3.086     SD     <0.001   0.411     1.043    SD     <0.001
QC vs. Comp., all JMFs*          3250      20282    4.897      3.005     SD     <0.001   0.794     1.109    SD     <0.001
QC vs. Comp., n(NCDOT) ≥ 6***    2254      13567    5.230      2.780     SD     <0.001   0.702     1.050    SD     <0.001
QC vs. Verif., all JMFs*         1817      20282    6.218      3.005     SD     <0.001   0.674     1.109    SD     <0.001
QC vs. Verif., n(NCDOT) ≥ 6+     1017      9331     7.167      2.970     SD     <0.001   0.473     1.063    SD     <0.001

* Total of 585 JMFs analyzed.
** 76 JMFs with Retest n(NCDOT) ≥ 6.
*** 170 JMFs with Comparison n(NCDOT) ≥ 6.
+ 95 JMFs with Verification n(NCDOT) ≥ 6.

The comparisons for core mat density tests are similar to those for nuclear gage tests in Table 41, i.e., variances and means are significantly different. The mean differences from the 92% Gmm minimum target are also similar in magnitude. However, the variances for nuclear gage and core tests are numerically different, with core variances consistently larger. Although there may be differences in the variability of nuclear gage and core testing, at least some portion of the observed differences is thought to be due to sample size. One core is taken per 2000 linear foot test section, but 5 nuclear gage tests are run per test section and the average is recorded as a test result. This is thought to be the primary reason the variances of nuclear gage density tests are smaller than the variances of core density tests. Since the acceptance procedure is the same for either type of testing, it is surprising that contractors choose the core option (larger variance) about two times as often as the nuclear gage option (contractor n(CORE) = 20,282 and n(NG) = 9,011).

As was the case for nuclear gage tests, the core test results were provided in a format such that North Carolina DOT Retest and Comparison test results could not be matched with specific Contractor QC test results for paired t test analyses. JMF by JMF comparisons are summarized in Table 44. Numerically, North Carolina DOT JMF mean differences from the 92% Gmm minimum indicate lower achieved densities (column 3). The percentages in column 3 of Table 44 are similar to the percentages in column 3 of Table 42 for nuclear gage tests. The differences in contractor and North Carolina DOT core mean differences are illustrated in Figures 30, 32 and 34, where 80, 75 and 78%, respectively, of the points plot above the line of equality. As with nuclear gage tests, there are a surprising number of JMFs where

Table 44. JMF Comparisons of North Carolina DOT and Contractor Core Mat Density Test Results

Data Set             JMFs*  Smaller     SD ∆     Sig. Smaller  Larger      SD s²     Sig. Larger
                            NCDOT ∆              NCDOT ∆       NCDOT s²              NCDOT s²
QC vs. Retest        76     61 (80%)    8 (11%)  8 (11%)       37 (49%)    9 (12%)   4 (5%)
QC vs. Comparison    170    128 (75%)   8 (5%)   8 (5%)        113 (66%)   34 (20%)  32 (19%)
QC vs. Verification  95     74 (78%)    9 (9%)   9 (9%)        59 (62%)    15 (16%)  15 (16%)

* Total of 585 JMFs analyzed.
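The sample-size explanation offered above, that a recorded nuclear gage "test" is the mean of 5 readings while a core "test" is a single measurement, implies the gage test variance should be roughly one fifth of the point-to-point variance. A small simulation with hypothetical density values illustrates this:

```python
import random
from statistics import variance

random.seed(0)

# 2000 hypothetical test sections, 5 point density readings each (% of Gmm)
sections = [[random.gauss(92.5, 1.5) for _ in range(5)] for _ in range(2000)]

core_like = [grp[0] for grp in sections]        # one measurement per section
gage_like = [sum(grp) / 5 for grp in sections]  # average of 5 readings

ratio = variance(core_like) / variance(gage_like)  # should be near 5
```

The simulated ratio near 5 is consistent with the numerically larger core variances in Table 43 relative to the nuclear gage variances in Table 41.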

[Figure: scatter plot of JMF Contractor QC means versus JMF Retest means, with line of equality]
Figure 30. JMF Core Mat Density Mean Differences – NCDOT Retest and Contractor QC

[Figure: scatter plot of JMF Contractor QC variances versus JMF Retest variances, with line of equality]
Figure 31. JMF Core Mat Density Variances – NCDOT Retest and Contractor QC

[Figure: scatter plot of JMF Contractor QC means versus JMF Comparison means, with line of equality]
Figure 32. Core Mat Density Mean Differences – NCDOT Comparison and Contractor QC

[Figure: scatter plot of JMF Contractor QC variances versus JMF Comparison variances, with line of equality]
Figure 33. Core Mat Density Variances – NCDOT Comparison and Contractor QC

[Figure: scatter plot of JMF Contractor QC means versus JMF Verification means, with line of equality]
Figure 34. JMF Core Mat Density Mean Differences – NCDOT Verification and Contractor QC

[Figure: scatter plot of JMF Contractor QC variances versus JMF Verification variances, with line of equality]
Figure 35. JMF Core Mat Density Variances – NCDOT Verification and Contractor QC

contractor tests indicate densities exceeding the 92% Gmm minimum but North Carolina DOT tests indicate densities less than the 92% Gmm minimum. When JMF mean differences are significantly different, the North Carolina DOT values are always smaller (columns 4 and 5 of Table 44). Based on the differences in variances for combined tests illustrated in Table 43, the differences in JMF variances in Table 44 are smaller than expected (49, 66 and 62% in column 6). However, when differences in variances are significant, North Carolina DOT variances are almost always larger (columns 7 and 8). An exception is the Contractor QC and North Carolina DOT Retest core comparisons, where the North Carolina DOT variances are larger for only 4 of 9 JMFs. Since the same cores are tested by both agencies, any significant differences in mean differences from targets or in variances are surprising. Job mix variances are plotted in Figures 31, 33 and 35. The distributions of points along the horizontal axes graphically illustrate the larger North Carolina DOT test variances.

North Carolina DOT specifications have a minimum acceptable mat density requirement of 92% of Gmm. Therefore, when computing mean square deviations, the largest-is-best situation is applicable. The largest-is-best mean square deviation (MSD(LIB)) can be approximated with the equation

MSD(LIB) ≈ (1/X̄²)[1 + 3(s²/X̄²)] …………………………………..………………..(5)

where X̄ = mean of measurements and s² = variance of measurements.

To compute the MSD(LIB) for mat density tests in Table 45, the statistics (s² and ∆) for all JMFs from Tables 41 and 43 were used. Means were computed by adding the minimum mat density requirement (92% of Gmm) to the mean deviations (X̄ = ∆ + 92). The contractor MSD(LIB) are smallest for both nuclear gage and core tests of mat density, indicating the best process control (mat compaction). The MSD(LIB) for the several North Carolina DOT tests are all relatively similar. Comparable values for nuclear gage and core tests are close, an indication that the means dominate the computations, since the variances of core tests are considerably larger than the variances of nuclear gage tests.

Table 45. Comparisons of Mean Square Deviations for North Carolina DOT Mat Density Data

                     MSD(LIB)
Data Set             Nuclear Gage    Core
Contractor QC        1.154 × 10^-4   1.155 × 10^-4
NCDOT Retest         1.165 × 10^-4   1.168 × 10^-4
NCDOT Verification   1.170 × 10^-4   1.167 × 10^-4
NCDOT Comparison     -               1.163 × 10^-4
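Equation 5 and the Table 45 entries can be reproduced directly from the Table 41 statistics. For example, the Contractor QC nuclear gage value follows from s² = 1.657 and ∆ = 1.114:

```python
def msd_lib(mean_value, s2):
    """Largest-is-best mean square deviation (Equation 5):
    MSD(LIB) ~ (1 / mean**2) * (1 + 3 * s2 / mean**2)."""
    return (1.0 / mean_value ** 2) * (1.0 + 3.0 * s2 / mean_value ** 2)

# Contractor QC, nuclear gage: mean = 92 + 1.114 (Table 41, all JMFs)
contractor_ng = msd_lib(92 + 1.114, 1.657)    # about 1.154e-4
# NCDOT Retest, nuclear gage: mean = 92 + 0.695
ncdot_retest_ng = msd_lib(92 + 0.695, 2.200)  # about 1.165e-4
```

Since s²/X̄² is tiny here (less than 0.001), the 1/X̄² term dominates, which is why the nuclear gage and core values in Table 45 are so close despite their very different variances.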

ANALYSIS OF KANSAS DOT HOT MIX ASPHALT CONCRETE DATA

Test results from 49 projects constructed during the 2003 season were analyzed. Properties compared were theoretical maximum mix specific gravity, air void content of laboratory compacted specimens and mat density. Mat density is typically measured with nuclear gages but may also be measured with cores. Gradation and asphalt content are measured by both contractors and the Kansas DOT, but only for process control. Gradation and asphalt content test results are not archived by the Kansas DOT and, therefore, were not available for analysis.

Figure 36 illustrates Kansas DOT sampling and testing requirements for managing the production and placement of HMAC. A LOT for mix properties is 3000 tons, divided into four 750-ton subLOTs. Contractors take one QC sample per subLOT and the Kansas DOT takes one independent verification sample for each LOT (4 to 1 sampling ratio). Means of contractor QC test results are compared with means of Kansas DOT verification test results with t or modified t tests (α = 0.01). If verified, contractor QC test results are used for LOT acceptance.

A LOT for mat density is a day's production, divided into 5 subLOTs. Contractors make two and the Kansas DOT makes one independent mat density measurement for each subLOT (2 to 1 sampling ratio). Means of the 10 contractor LOT QC test results are compared with means of the 5 Kansas DOT LOT verification test results with t or modified t tests (α = 0.01). If verified, contractor QC test results are used for LOT acceptance.
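The verification comparison described above (t or modified t test at α = 0.01) can be sketched as follows. This is a generic illustration, not Kansas DOT software; the sample data are invented, and in practice the statistic would be compared with a critical value from a t table at the Welch-adjusted degrees of freedom.

```python
import math


def welch_t(x, y):
    """Return the modified (Welch) t statistic and the approximate
    degrees of freedom for two independent samples with possibly
    unequal variances."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    se2 = vx / nx + vy / ny                      # squared standard error
    t = (mx - my) / math.sqrt(se2)
    df = se2 ** 2 / ((vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1))
    return t, df


# Invented contractor QC results vs. verification results (% Gmm)
qc = [91.0, 92.0, 93.0, 94.0]
verification = [92.0, 93.0, 94.0]
t, df = welch_t(qc, verification)
```

A |t| smaller than the tabulated critical value at α = 0.01 would verify the contractor means; the 4-to-1 and 2-to-1 sampling ratios above are what make the unequal-sample-size form necessary.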

Figure 36. Kansas DOT HMAC Sampling and Testing Requirements

Kansas DOT has no target mat density but uses the PWL system for computing pay adjustments with a lower specification limit (LSL). To combine data from multiple projects with different LSLs, the following variable was defined:

∆ = X - LSL …………………………………………………………… (6)

where X = measured mat density (% of Gmm) and
LSL = 90% for shoulder paving
    = 91% for mainline paving 2 inches thick and less
    = 92% for mainline paving thicker than 2 inches.

Comparisons of variances and means of Kansas DOT and contractor tests for combined data from all projects are summarized in Table 46. Variances for Kansas DOT tests are significantly larger for all comparisons. Means are significantly different for mat density (% Gmm) for the combined data and for both thin (≤ 2”) and thick (> 2”) mainline (ML) paving. The Kansas DOT mean of differences from the air voids target is the largest. Contractor means of differences from mat density lower specification limits (LSL) are largest and indicate better mat compaction.

The data were sorted into a reduced data set for projects with nKDOT ≥ 6. Comparisons of variances and means for this reduced data set are summarized in Table 47. The comparisons for air voids and % Gmm are the same as those in Table 46. Kansas DOT variances are significantly larger than contractor variances. The air voids means are not significantly different, but the contractor mean difference from lower compaction specification limits is significantly larger than the Kansas DOT mean difference.
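The deviation variable ∆ = X - LSL of Equation 6 can be sketched as a small helper. The code below is illustrative only; the function name and the thickness argument are assumptions, while the LSL values (90, 91 and 92% of Gmm) follow the text.

```python
def density_deviation(measured_pct_gmm: float,
                      location: str,
                      thickness_in: float = 0.0) -> float:
    """Delta = X - LSL (Equation 6), with the LSL chosen by
    pavement location and lift thickness."""
    if location == "shoulder":
        lsl = 90.0
    elif thickness_in <= 2.0:       # mainline, 2 inches thick and less
        lsl = 91.0
    else:                           # mainline, thicker than 2 inches
        lsl = 92.0
    return measured_pct_gmm - lsl
```

Expressing every measurement as a deviation from its own LSL is what allows data from shoulder, thin-mainline and thick-mainline paving to be pooled in the combined comparisons of Table 46.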

Table 46. Comparisons of Kansas DOT Verification and Contractor QC Test Results – All Projects

Property          n(KDOT)  n(CONT)  s2(KDOT)  s2(CONT)  Difference  p-Value  ∆(KDOT)  ∆(CONT)  Difference  p-Value
Air Voids         393      1494     0.643     0.318     SD          <0.001   0.322    0.262    NSD         0.164
% Gmm Combined    2281     4554     3.016     1.674     SD          <0.001   1.429    1.642    SD          <0.001
% Gmm shoulders   341      681      2.443     0.927     SD          <0.001   2.322    2.375    NSD         0.569
% Gmm ML ≤ 2”     1301     2606     3.190     2.086     SD          <0.001   1.448    1.655    SD          <0.001
% Gmm ML > 2”     639      1267     2.283     0.764     SD          <0.001   0.914    1.219    SD          <0.001

Table 47. Comparisons of Kansas DOT Verification and Contractor QC Test Results – Projects with nKDOT ≥ 6

Property          n(KDOT)  n(CONT)  s2(KDOT)  s2(CONT)  Difference  p-Value  ∆(KDOT)  ∆(CONT)  Difference  p-Value
Air Voids         298      1140     0.568     0.310     SD          <0.001   0.366    0.281    NSD         0.068
% Gmm Combined    2214     4438     3.021     1.704     SD          <0.001   1.454    1.645    SD          <0.001

Project by project comparisons are summarized in Table 48. Data for theoretical maximum mix specific gravity (Gmm) are included in the project comparisons. The numbers of projects and percentages in column 3 indicate that Kansas DOT differences from target air voids and Gmm test results are likely largest but that contractor differences from mat density lower specification limits are likely largest. These trends are graphically illustrated in Figures 37, 39 and 41. Column 4 indicates only project deviations from mat density lower specification limits are likely to be significantly different, and column 5 indicates that it is the contractor project deviations that are likely larger. Column 6 indicates project variances for Kansas DOT tests are likely largest. These trends are graphically illustrated in Figures 38, 40 and 42, where more points plot below the lines of equality. The numbers of projects and percentages in column 7 show the likelihood that variances of Kansas DOT and contractor tests are significantly different. The numbers of projects and percentages in column 8 are the same as those in column 7 and indicate that, when variances are significantly different, Kansas DOT variances are always larger. The points that lie along the horizontal axes in Figures 38, 40 and 42 illustrate this trend.

A final comparison is between the mean square deviations of Kansas DOT and contractor tests in Table 49. The nominal-is-best situation is applicable for air voids and Equation 3 was used for the computations. The largest-is-best situation is applicable for mat density and Equation 5 was used for the computations. The MSD for contractor tests are always smaller, indicating better process control (material quality).
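The two mean square deviation computations referenced above can be sketched together. Equation 3 is the nominal-is-best form used earlier in the report (MSD_NIB = s² + ∆²) and Equation 5 the largest-is-best form; the code below, an illustration only, reproduces the Table 49 air void and LSL = 92% mat density values from the Table 46 statistics.

```python
def msd_nib(variance: float, mean_deviation: float) -> float:
    """Nominal-is-best MSD (Equation 3): s^2 + (mean deviation)^2."""
    return variance + mean_deviation ** 2


def msd_lib(mean: float, variance: float) -> float:
    """Largest-is-best MSD (Equation 5)."""
    return (1.0 / mean ** 2) * (1.0 + 3.0 * variance / mean ** 2)


# Air voids, Table 46: KDOT s^2 = 0.643, mean deviation = 0.322;
# contractor s^2 = 0.318, mean deviation = 0.262.
kdot_air = msd_nib(0.643, 0.322)             # ~0.747, as in Table 49
cont_air = msd_nib(0.318, 0.262)             # ~0.387, as in Table 49

# Mat density, ML > 2" (LSL = 92%), Table 46: mean = 92 + mean deviation.
kdot_density = msd_lib(92.0 + 0.914, 2.283)  # ~1.159e-4, as in Table 49
cont_density = msd_lib(92.0 + 1.219, 0.764)  # ~1.151e-4, as in Table 49
```

In both forms the contractor values come out smaller, matching the text's conclusion that contractor tests indicate better process control.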

Table 48. Project Comparisons of Kansas DOT and Contractor QC Test Results

                          Projects with   Projects   Projects with Sig.   Projects with   Projects    Projects with Sig.
Property        Projects  Smaller KDOT ∆  with SD ∆  Smaller KDOT ∆       Larger KDOT s2  with SD s2  Larger KDOT s2
Air Voids       24        18 (75%)        0          0                    19 (79%)        5 (21%)     5 (21%)
Gmm*            23        14 (61%)        3 (13%)    3 (13%)              16 (70%)        2 (9%)      2 (9%)
% Gmm Combined  24        18 (75%)        11 (46%)   10 (42%)             22 (92%)        13 (54%)    13 (54%)

* No target values for Gmm, so comparisons are for actual measurements.

Figure 37. Air Void Project Means - KDOT Verification and Contractor QC
[Scatter plot: project KDOT verification means versus project contractor QC means, both axes from -1.25 to 1.25.]

Figure 38. Air Void Project Variances - KDOT Verification and Contractor QC
[Scatter plot: project KDOT verification variances versus project contractor QC variances, both axes from 0 to 2.75.]

Figure 39. Theoretical Maximum Mix Specific Gravity Project Means - KDOT Verification and Contractor QC
[Scatter plot: project KDOT verification means versus project contractor QC means, both axes from 2.36 to 2.49.]

Figure 40. Theoretical Maximum Mix Specific Gravity Project Variances - KDOT Verification and Contractor QC
[Scatter plot: project KDOT verification variances versus project contractor QC variances, both axes from 0 to 0.0006.]

Figure 41. Mat Density Project Means - KDOT Verification and Contractor QC
[Scatter plot: project KDOT verification means (%) versus project contractor QC means (%), both axes from -4 to 4.]

Figure 42. Mat Density Project Variances - KDOT Verification and Contractor QC
[Scatter plot: project KDOT verification variances (%) versus project contractor QC variances (%), both axes from 0 to 6.5.]

Table 49. Mean Square Deviations for Kansas DOT Verification and Contractor QC Tests

                                        MSD
Property                    KDOT Verification   Contractor QC
Air Void Content            0.747               0.387
Mat Density – LSL = 90%     1.174 x 10-4        1.172 x 10-4
Mat Density – LSL = 91%     1.171 x 10-4        1.165 x 10-4
Mat Density – LSL = 92%     1.159 x 10-4        1.151 x 10-4

ANALYSIS OF CALTRANS HOT MIX ASPHALT CONCRETE DATA

HMAC test results from 149 projects constructed from 1996 to 2005 were provided by Caltrans. Caltrans' quality assurance procedures use both mix properties and mat density for acceptance, but only test results for mix properties were provided. Test results included asphalt content and gradation (percents passing the ¾” or ½”, 3/8”, #4, #8, #30 and #200 sieves).

Figure 43 illustrates Caltrans sampling and testing requirements for managing construction of HMAC pavement layers. A LOT for acceptance is the entire project production. For contractor QC sampling and testing, a subLOT is 500 tons; Caltrans samples and tests for verification at a frequency not less than 10% of the contractor QC frequency. Caltrans samples independently for mix properties. The PWL system is used to compute pay factors for mix properties and mat density. These pay factors are combined with weighting factors to compute composite LOT pay adjustments, which may include up to a 5% bonus. Contractor QC test results are used for pay factor computation if verified by comparisons with Caltrans test results. Verification requires acceptable comparison of means with the t test (α = 0.01) and with numerical allowable testing differences. During production, comparisons are apparently made that, if unfavorable, can result in a review process. Detection of errors can lead to retesting (portions of split samples), recalculations and, as a last resort, independent third party involvement.
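The PWL (percent within limits) computation that drives these pay factors can be sketched with a normal approximation. Agencies actually use standardized PWL tables (beta-distribution based for small samples); the version below, using the common quality index Q = (X̄ - LSL)/s and the normal CDF, is an illustrative simplification only, and the function names are assumptions.

```python
import math


def quality_index(mean: float, std_dev: float, lsl: float) -> float:
    """Quality index Q for a one-sided lower specification limit."""
    return (mean - lsl) / std_dev


def pwl_normal_approx(q: float) -> float:
    """Estimated percent within limits: 100 * Phi(Q), where Phi is
    the standard normal CDF (a large-sample approximation)."""
    return 100.0 * 0.5 * (1.0 + math.erf(q / math.sqrt(2.0)))


# A lot whose mean sits exactly at the LSL has Q = 0 and an
# estimated PWL of 50%; a mean one standard deviation above the
# LSL gives roughly 84%.
q = quality_index(92.0, 1.2, 92.0)
```

The estimated PWL would then be mapped to a pay factor through the agency's pay schedule.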

Notes:
1. A LOT for acceptance is the entire project production for a specific mix design.
2. A subLOT for contractor QC sampling and testing is 500 tons. Samples are split and one split portion is retained for possible dispute resolution.
3. Caltrans samples and tests at a frequency not less than 10% of the contractor QC sampling and testing frequency.
4. Caltrans verification samples for mix properties are independent of contractor QC samples. Samples are split into 4 portions. One portion is provided to the contractor and 2 portions are retained for possible dispute resolution.
5. Caltrans verification samples for theoretical maximum mix density are split samples with the contractor. Caltrans and contractor nuclear gage mat density tests are conducted at the same location. Therefore, theoretical maximum mix density and relative compaction test results are paired.

Figure 43. Caltrans HMAC Sampling and Testing Requirements

The first comparisons made were between contractor QC and Caltrans tests for all 149 projects. Table 50 contains comparisons of variances and means of differences from target values for mix properties. The variances for all 7 properties are significantly different, and the variances of Caltrans tests are always larger. The mean differences from targets for 4 of the 7 properties are significantly different. Except for the % passing the #30 sieve, the Caltrans mean differences from targets are larger for these 4 properties. The mean differences from targets are not significantly different for 3 properties, but the Caltrans mean differences from targets for all 3 of these properties are larger.

A reduced data set for large (nCAL ≥ 6) projects was created from the total data set. Comparisons of project means and variances for these larger projects are summarized in Table 51. The trends indicated by these comparisons are as follows:

● Numerically, differences from target values of Caltrans test results tend to be larger than differences from target values for contractor test results. Evidence of this is provided by the percentages of projects in column 3 of Table 51, which are all greater than 50%. A graphical illustration for asphalt content is provided in Figure 44, where 53 (65%) of the points fall in the portion of the figure that is bounded by the dashed lines of absolute equality and centered on the horizontal axis. A complete set of figures for mean differences and variances for all properties is contained in Appendix D.

Table 50. Comparisons of Caltrans Verification and Contractor QC Mix Properties

Property           n(CAL)  n(CONT)  s2(CAL)  s2(CONT)  Difference  p-Value  ∆(CAL)  ∆(CONT)  Difference  p-Value
% Asphalt          1405    9258     0.087    0.042     SD          <0.001   0.036   -0.003   SD          <0.001
% Passing ¾ or ½”  1331    8553     4.863    3.319     SD          <0.001   0.951   0.719    SD          <0.001
% Passing 3/8”     1514    9585     11.082   6.478     SD          <0.001   -0.093  -0.049   NSD         0.619
% Passing #4       1514    9585     9.258    6.770     SD          <0.001   -0.517  -0.381   NSD         0.099
% Passing #8       1513    9585     8.303    5.423     SD          <0.001   -0.356  0.012    SD          <0.001
% Passing #30      1513    9585     5.363    3.203     SD          <0.001   -0.058  0.148    SD          0.001
% Passing #200     1507    9439     1.095    0.602     SD          <0.001   0.062   0.011    NSD         0.072

Table 51. Project by Project Comparisons of Caltrans Verification and Contractor QC Mix Properties Test Results

                             Projects with      Projects   Projects with Sig.  Projects with       Projects    Projects with Sig.
Property           Projects  Larger Caltrans ∆  with SD ∆  Larger Caltrans ∆   Larger Caltrans s2  with SD s2  Larger Caltrans s2
% Asphalt          82        53 (65%)           26 (32%)   23 (28%)            56 (68%)            26 (32%)    25 (30%)
% Passing ¾ or ½”  77        54 (70%)           18 (23%)   16 (21%)            44 (57%)            17 (22%)    14 (18%)
% Passing 3/8”     86        61 (71%)           19 (22%)   15 (17%)            54 (63%)            17 (20%)    16 (19%)
% Passing #4       86        48 (56%)           12 (14%)   5 (6%)              61 (71%)            20 (23%)    20 (23%)
% Passing #8       86        53 (62%)           14 (16%)   11 (13%)            60 (70%)            23 (27%)    21 (24%)
% Passing #30      86        62 (72%)           13 (15%)   12 (14%)            57 (66%)            20 (23%)    18 (21%)
% Passing #200     85        52 (61%)           25 (29%)   18 (21%)            60 (71%)            31 (36%)    30 (35%)

Figure 44. Project Asphalt Content Means – Caltrans Verification and Contractor QC
[Scatter plot: Caltrans verification means (%) versus contractor QC means (%), both axes from -0.5 to 0.5.]

Figure 45. Project Asphalt Content Variances – Caltrans Verification and Contractor QC
[Scatter plot: Caltrans verification variances (%) versus contractor QC variances (%), both axes from 0.00 to 0.25.]

● Numerically, variances of Caltrans test results are larger than variances of contractor test results. Evidence of this is provided by the percentages of projects in column 6, which are all greater than 50%. A graphical illustration is provided in Figure 45 for asphalt content, where 56 (68%) of the points plot below the line of equality.

● Except for the % passing the #4 sieve, when mean differences from target values are significantly different, the mean differences for Caltrans test results are likely larger. This can be confirmed by comparing the numbers and percentages of projects in columns 4 and 5, which are similar.

● When variances are significantly different, variances of Caltrans test results are very likely larger. This can be confirmed by comparing the numbers and percentages of projects in columns 7 and 8, which are quite similar.

Comparisons of variances and means of deviations from target values for combined data from large (nCAL ≥ 6) projects are summarized in Table 52. Except for the #30 sieve, the comparisons for the large project data are the same as those in Table 50 for all projects. The contractor mean of differences from the #30 sieve target values is significantly larger for all projects, but for large projects the means are not significantly different.

A final comparison is between the mean square deviations of Caltrans and contractor tests. The nominal-is-best situation is applicable for all properties, and statistics from Table 50 were used in Equation 3 to compute the MSD_NIB in Table 53. The

MSD_NIB for contractor tests are smaller, indicating better process control (material quality).

Table 52. Comparisons of Caltrans Verification and QC Mix Properties Test Results – Large (nCAL ≥ 6) Projects

Property           n(CAL)  n(CONT)  s2(CAL)  s2(CONT)  Difference  p-Value  ∆(CAL)  ∆(CONT)  Difference  p-Value
% Asphalt          1201    7480     0.084    0.041     SD          <0.001   0.039   -0.014   SD          <0.001
% Passing ¾ or ½”  1128    6828     5.097    3.387     SD          <0.001   1.089   0.808    SD          <0.001
% Passing 3/8”     1311    7860     11.084   6.378     SD          <0.001   -0.125  0.012    NSD         0.153
% Passing #4       1311    7860     9.389    6.619     SD          <0.001   -0.458  -0.481   NSD         0.801
% Passing #8       1310    7860     8.503    5.556     SD          <0.001   -0.281  -0.059   SD          0.009
% Passing #30      1310    7860     5.293    3.079     SD          <0.001   -0.011  0.099    NSD         0.097
% Passing #200     1304    7714     1.120    0.558     SD          <0.001   0.067   0.023    NSD         0.154

Table 53. Mean Square Deviations for Caltrans Verification and Contractor QC Tests

                                 MSD_NIB
Property           Caltrans Verification   Contractor QC
Asphalt Content    0.088                   0.042
% Passing ¾ or ½”  5.767                   3.836
% Passing 3/8”     11.091                  6.480
% Passing #4       9.525                   6.915
% Passing #8       8.430                   5.423
% Passing #30      5.366                   3.225
% Passing #200     1.099                   0.602

ANALYSIS OF NEW MEXICO DOT HOT MIX ASPHALT CONCRETE DATA

Limited data for HMAC were provided by the New Mexico DOT. These data included results from 3 projects with 7 mixes. Test results themselves were not provided; rather, results from project analyses were provided. These results included target values, statistics (X̄ and s) and pay adjustments computed with both New Mexico DOT and contractor statistics.

New Mexico DOT accepts HMAC on a LOT by LOT basis, and a LOT is defined as the entire project production for a particular mix design. Sampling and testing requirements are illustrated in Figure 46. Contractor QC test results are plotted on control charts with applicable upper and lower specification limits. Contractor QC and New Mexico DOT acceptance test results are compared (α = 0.01) with F and t tests as they are accumulated. Verification requires that both the variance and the mean are not significantly different. If verified, contractor QC test results are combined with New Mexico DOT acceptance test results to compute LOT pay factors.

The PWL system is used to compute pay factors for the individual material properties listed in Table 54. These pay factors are combined with the weighting factors, also listed in Table 54, to compute a composite LOT pay factor as follows:

CPF = [f1(PF1) + f2(PF2) + ··· + fj(PFj)] / Σfj ...........................................(7)

where fj = weighting factors and PFj = pay factors for individual material properties.
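Equation 7 can be sketched directly. The function below is an illustration only (its name is an assumption); the example weights follow Table 54's dense graded mix, with invented pay factors.

```python
def composite_pay_factor(pay_factors, weights):
    """Composite LOT pay factor (Equation 7):
    CPF = sum(f_j * PF_j) / sum(f_j)."""
    if len(pay_factors) != len(weights):
        raise ValueError("pay_factors and weights must align")
    return sum(f * pf for f, pf in zip(weights, pay_factors)) / sum(weights)


# Invented pay factors for asphalt content, mat density and air voids,
# each carrying the Table 54 weighting factor of 50; bonuses on one
# property can offset deductions on another.
cpf = composite_pay_factor([1.05, 1.00, 0.95], [50, 50, 50])
```

Because the weights are normalized by their sum, a property with weight 50 influences the CPF more than three times as strongly as one with weight 15, which is how Table 54 prioritizes asphalt content, mat density and air voids over the individual sieves.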

Notes:
1. Contractor QC test results are plotted on control charts with applicable upper and lower specification limits.
2. Contractor QC and NMDOT acceptance testing are performed on independent samples.
3. Contractor QC and NMDOT acceptance test results are compared with F and t tests as they are accumulated. Final acceptance decisions and pay factor calculations are made when production of a mix design is complete. If contractor QC test results are validated, they are combined with NMDOT acceptance test results for final acceptance decisions and pay factor calculations.
4. Both variabilities (F test) and means (t test) at α = 0.01 must be comparable for validation of contractor test results.

Figure 46. New Mexico DOT HMAC Sampling and Testing Requirements

Table 54. Properties for Computing Pay Factors and Weighting Factors

Mix Type                       Properties              Weighting Factor
Dense Graded and SMA           Asphalt Content         50
                               Mat Density             50
                               Air Voids               50
                               Nominal Max. Agg Size   10
                               % Passing #8*           15
                               % Passing #10*          15
                               % Passing #16*          15
                               % Passing #30*          15
                               % Passing #40*          15
                               % Passing #50*          15
                               % Passing #200*         15
Open Graded Friction Course    Asphalt Content         20
                               % Passing #4            6
                               % Passing #10           20
                               % Passing #40           6
                               % Passing #200          6

* A combination of 2 to 3 sieves is used, depending on the specific mix type.

An example of the type of information provided by the New Mexico DOT for an open graded friction course (OGFC) mix is shown in Table 55. The test results for the example mix indicate no particular tendency for New Mexico DOT or contractor test results to be more or less variable or more or less close to targets. This was true for all seven mixes, where 30 of 60 (50%) contractor standard deviations were smaller and where 28 of 60 (47%) contractor means were closer to target values. None of the standard deviations or means for the example mix were significantly different (α = 0.01). Despite these similarities in variability and accuracy, the acceptance outcomes using New Mexico DOT and contractor test results were different. These acceptance outcomes are as follows:

New Mexico DOT CPF = 1.024    New Mexico DOT pay = $407,757
Contractor CPF = 1.045        Contractor pay = $415,994

Use of contractor test results yields $8,237 (2.02%) greater pay for the example mix. For all seven mixes, only 6 of 60 (10%) of the standard deviations were significantly different, and contractor standard deviations were smaller for 5 of the 6 cases. Seven of 60 (12%) of the means were significantly different, but contractor means were closer to targets for only 2 of the 7 cases. Again, despite the similarities for all seven mixes, the use of contractor test results gave $9,615,127 - $9,411,517 = $203,610, or 2.16%, greater pay.

The similarities between New Mexico DOT and contractor test results may be a result of the verification process, where F and t tests are used to compare variability and

Table 55. Example New Mexico DOT Data

Property        Target  nDOT  sDOT   X̄DOT    PFDOT  nCONT  sCONT  X̄CONT   PFCONT
% Asphalt       6.9     14    0.085  6.864   1.050  12     0.081  6.854   1.050
% Passing #4    40.0    10    2.791  36.700  1.050  12     2.221  30.750  1.050
% Passing #10   6.0     10    1.751  10.200  0.975  12     1.557  9.333   1.035
% Passing #40   4.0     10    0.707  5.500   1.050  12     0.793  5.417   1.050
% Passing #200  2.0     10    0.200  2.300   1.050  12     0.303  2.367   1.050

Shaded values are smallest standard deviations or means that are closest to targets.

accuracy as tests are accumulated for the entire project mix design production.

Tables 56 and 57 contain standard deviations and means of differences from target values for some properties from New Mexico, with comparable statistics from the other states studied. The standard deviations in Table 56 are interesting for several reasons. Except for the standard deviation of contractor-tested void contents in Kansas, the standard deviations for the New Mexico DOT tests (both DOT and contractor) are always the smallest. For asphalt content, the variability is considerably smaller than in any other state. As observed earlier for individual mixes, there is also no consistent indication that the variability of contractor test results is smaller than the variability of New Mexico DOT tests or that differences are significant. The standard deviations for both New Mexico DOT and contractor asphalt content tests are 0.116, which is about two and a half times smaller than the standard deviations of asphalt content tests for any other state. However, it should be noted that the asphalt content standard deviations are for data from only three New Mexico projects, and they are not unlike standard deviations for individual projects in other states. Their magnitude and closeness for contractor and New Mexico DOT data may be due to the limited size of the database compared to the other states.

The means in Table 57 illustrate no consistent trends. The means for New Mexico asphalt contents are in line with the other states and indicate that, on average, test results are quite close to target values. The New Mexico DOT voids content test results are much closer to the 4% target than test results for any of the other four states, except for the Florida DOT ISVT (independent sample verification test) test results.

Table 56. Standard Deviations of Test Results from Several States

                                    Standard Deviation
Agency         % Asphalt   Voids Content   Mat Density        % Passing #200
ALDOT (S)      0.272*      1.025           1.470*             -
CONT. (S)      0.230*      0.863           1.175*             -
GDOT (S)       0.297**     -               -                  1.066
CONT. (S)      0.200**     -               -                  0.877
GDOT (I)       0.253**     -               -                  1.101
FDOT (S)       0.290***    1.144           1.720++            0.701
CONT. (S)      0.250***    0.841           1.603++            0.613
FDOT (I)       0.293***    1.183           1.880++            0.693
KDOT (I)       -           0.802           1.737+++           -
CONT. (I)      -           0.564           1.294+++           -
NCDOT (S)      0.308+      1.039           1.483* / 1.947++   0.875
CONT. (S)      0.243+      0.751           1.287* / 1.733++   0.700
NCDOT (I)      0.286+      1.039           1.483* / 2.494++   0.917
Caltrans (I)   0.295***    -               -                  1.046
CONT. (I)      0.204***    -               -                  0.776
NMDOT (I)      0.116***    0.750           0.858++            0.522
CONT. (I)      0.116***    0.660           1.086++            0.480

(S) DOT and contractors test split samples
(I) DOT and contractors test independent samples
* Nuclear gage method
** Solvent extraction or ignition methods but primarily ignition method
*** Ignition method
+ Optional methods but primarily ignition method
++ Core method
+++ Optional core or nuclear gage methods

Table 57. Means of Test Results from Several States

                                        Mean
Agency         % Asphalt   Voids Content   Mat Density        % Passing #200
ALDOT (S)      -0.045*     -0.357          -1.245*            -
CONT. (S)      -0.036*     -0.281          -0.997*            -
GDOT (S)       0.005**     -               -                  0.334
CONT. (S)      0.005**     -               -                  0.400
GDOT (I)       0.004**     -               -                  0.359
FDOT (S)       0.016***    -0.289          -0.222++           0.136
CONT. (S)      -0.012***   -0.248          -0.103++           0.072
FDOT (I)       0.000***    -0.057          -0.640++           0.132
KDOT (I)       -           0.322           1.429+++           -
CONT. (I)      -           0.262           1.642+++           -
NCDOT (S)      -0.021+     -0.212          0.695* / 0.588++   0.221
CONT. (S)      -0.003+     -0.097          1.114* / 1.109++   0.095
NCDOT (I)      -0.021+     -0.161          0.489* / 0.674++   0.194
Caltrans (I)   0.036       -               -                  0.062
CONT. (I)      -0.003      -               -                  0.011
NMDOT (I)      -0.018***   0.072           -1.389++           -0.142
CONT. (I)      -0.024***   -0.050          -1.503++           -0.059

(S) DOT and contractors test split samples
(I) DOT and contractors test independent samples
* Nuclear gage method
** Solvent extraction or ignition methods but primarily ignition method
*** Ignition method
+ Optional methods but primarily ignition method
++ Core method
+++ Optional core or nuclear gage methods

Mat density means for ALDOT, FDOT and NMDOT reflect differences between measured and target values. Mat density means for KDOT and NCDOT reflect differences between measured values and lower specification limits or minimum acceptable values.

This mean was much closer to the target than the mean for any other type of Florida DOT test.

The New Mexico means for mat density are similar to those for the Alabama DOT. The mat density means for Florida indicate test results closer to targets than those for either the Alabama DOT or the New Mexico DOT. The negative values for all three states indicate that, on average, target mat density was not achieved (∆ = X - XT). Lower specification limits for the Kansas DOT and North Carolina DOT were subtracted from test results, which is the reason for the positive values. What is different about the New Mexico mat density measurements is that the DOT test results indicate better compaction, whereas in the other four states the contractor test results indicate better compaction.

The magnitudes of the means of the New Mexico DOT tests for % passing the #200 sieve are not unlike some of the means for the Florida DOT, North Carolina DOT or Caltrans tests. What is different is the sign. The New Mexico DOT means indicate less than target amounts passing the #200 sieve (gradations coarser than targets), whereas in the other states the amounts passing the #200 sieve are consistently larger than targets (gradations finer than targets).

To summarize, the statistics for the limited New Mexico DOT data are quite different from the statistics for the other five states studied. The New Mexico DOT and contractor test results appear more similar in variability and accuracy. However, acceptance outcomes are more favorable when contractor test results are used to compute pay factors. The reasons for the observed differences and similarities are not known. Possible factors include the verification and acceptance system that defines a LOT as the entire project mix production, the accumulation and comparison of DOT and

contractor test results with F and t tests, and/or the combining of DOT and contractor test results to make acceptance decisions. While the Caltrans system is similar to that of the New Mexico DOT, the variances for Caltrans asphalt content and % passing the #200 sieve are larger. Variances among Caltrans tests are more like the variances for Alabama, Georgia, Florida, North Carolina and Kansas DOT tests than the New Mexico DOT tests. The differences in acceptance outcomes will be considered further in Chapter 4. An analysis of acceptance outcomes computed with statistics for the other five states studied will be presented in Chapter 4, and the New Mexico outcomes will be compared with these outcomes.

ANALYSIS OF COLORADO DOT PORTLAND CEMENT CONCRETE PAVEMENT DATA

Flexural strength test results from 3 PCC pavement (PCCP) projects were provided by the Colorado DOT. Contractors can choose between acceptance processes that use either 28 day flexural or compressive strengths. With the compressive strength process, contractor test results are used only for quality control and DOT test results are used for acceptance. There is no required comparison of compressive strength test results. With the flexural strength process, contractor and Colorado DOT test results are compared with F and t tests (α = 0.05). If the contractor tests are verified, they are used for acceptance, i.e., to compute a flexural strength pay factor. Comparisons must indicate no significant difference for both variances and means for verification. Pay factors for pavement thickness and smoothness are computed with test results provided by contractors.

A LOT is the entire project production of a process, defined as consistent materials, mix design and construction method. Contractors fabricate and test a set of 3 beams per 2500 m2 of pavement or a minimum of 1 set of 3 beams per day. The Colorado DOT independently fabricates and tests a set of 3 beams per 10,000 m2 of pavement. A test result is the average flexural strength from 3 beams.

Comparisons of contractor and Colorado DOT flexural strength test results are summarized in Table 58. In order to combine data from the three projects, the analysis variable was the difference between test results and the lower specification limit flexural strength (∆ = x - xL). The comparisons indicate no significant differences (α = 0.01) between Colorado DOT and contractor flexural strength test results. The p-values for Project 3 are 0.014 and 0.084 for the variance and mean comparisons, respectively. Rounded to two decimal places, the Project 3 variances would be significantly different, as they certainly would be at the α = 0.05 significance level.

Comparisons of PCC compressive strength test results from Kentucky and Alabama were presented in Chapter 2. These comparisons, conducted at the α = 0.05 significance level, indicated no significant differences in variances or means for structural PCC. There were also test results for paving PCC from Kentucky. Comparisons indicated there was no significant difference in the means of the Kentucky paving PCC compressive strengths, but that there was a significant difference in variances (p-value = 0.002). The comparisons of the limited PCC test results indicate that, if there are significant differences between contractor and DOT test results, it is more likely these will be differences in variability.

Table 58. Comparisons of Colorado DOT and Contractor Flexural Strength Test Results

Project    n(CDOT)  n(CONT)  s2(CDOT)  s2(CONT)  Difference  p-Value  ∆(CDOT)  ∆(CONT)  Difference  p-Value
1          27       99       2367      2101      NSD         0.328    179      189      NSD         0.336
2          15       53       1639      2274      NSD         0.256    105      90       NSD         0.272
3          19       69       876       2265      NSD         0.014    59       80       NSD         0.084
Combined   61       221      4434      4963      NSD         0.329    124      131      NSD         0.461

The comparisons for the Colorado, Alabama and Kentucky PCC strength data suggest no particular tendency for contractor tests to be less variable or more favorable (larger strengths). This is contrary to the general tendencies noted for HMAC. The overall mean differences of 124 and 131 psi between test results and the lower specification limit in Table 58 indicate average strengths about 22% higher than the 570 psi lower specification limit. However, the overall mean differences indicate average strengths that are only about 7% higher than a 650 psi plan or design strength. Also, the Colorado DOT test mean for Project 3 of 629 psi is 3% lower than the 650 psi plan strength. These comparisons indicate a less conservative approach to assuring adequate PCC strength than do the comparisons for Alabama and Kentucky. The means in Table 10 for structural PCC indicate compressive strengths 69 and 64% higher than the minimum required for contractor and Alabama DOT tests, respectively. The means in Table 11 for Kentucky Transportation Cabinet and contractor tests indicate, respectively, structural PCC compressive strengths 62 and 53% higher than the minimum required. The means in Table 11 indicate paving PCC compressive strengths 69 and 72% higher than the minimum required.

ANALYSIS OF FHWA–WESTERN FEDERAL LANDS HIGHWAY DIVISION AGGREGATE COURSE DATA

Test results from 23 aggregate course construction projects were provided by the FHWA-Western Federal Lands Highway Division (FHWA-WFLHD). The projects involved several types of aggregate courses. Each type of aggregate course has some combination of properties for pay factor computation. For these properties, both contractor and FHWA-WFLHD testing of split samples is required. FHWA-WFLHD tests

are used to verify contractor tests with t or paired t tests at the 1% significance level. If verified, contractor tests are used to compute LOT pay factors with the PWL method. Bonuses of up to 5% may be obtained. Pavement layers have compaction requirements, but layer compaction is accepted or rejected based on contractor density tests.

A LOT is the entire project production for a particular type of aggregate course. Contractors take and test one sample per 1,000 tons of aggregate placed. The FHWA-WFLHD tests a split sample from the first 3 project samples and at least 10% of the remaining project samples. The data provided for the 23 projects indicate an average contractor to FHWA-WFLHD testing ratio of about 3 to 1; the ratio for a particular project depends on the project quantity.

The variable used for the comparisons was the difference between test results and the applicable target, maximum specification, or minimum specification value (i.e., the target, maximum, or minimum value was subtracted from the test result). Comparisons for the entire data sets of FHWA-WFLHD and contractor tests are contained in Table 59. The differences in variability are not as extensive as those for HMAC, but the differences in means are somewhat more extensive. The variabilities of only 5 of 12 properties were significantly different but, for these 5, the variability of FHWA-WFLHD tests was larger for 4 properties. Overall, 8 of 12 FHWA-WFLHD test property variabilities were larger. This observation of larger agency test variability is consistent with the observations for HMAC tests.
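The verification comparisons above rest on two standard statistics: a variance ratio (F) and a two-sample t statistic on the means. The following stdlib-only sketch shows both; the data values and the equal-variance pooled form of the t statistic are illustrative assumptions, not the exact FHWA-WFLHD procedure.

```python
import math
import statistics as st

def variance_ratio(x, y):
    """F statistic for comparing two sample variances (larger over smaller)."""
    vx, vy = st.variance(x), st.variance(y)
    return max(vx, vy) / min(vx, vy)

def two_sample_t(x, y):
    """Pooled two-sample t statistic for comparing means (equal-variance form)."""
    nx, ny = len(x), len(y)
    sp2 = ((nx - 1) * st.variance(x) + (ny - 1) * st.variance(y)) / (nx + ny - 2)
    return (st.mean(x) - st.mean(y)) / math.sqrt(sp2 * (1 / nx + 1 / ny))

# Hypothetical data: differences between test results and a gradation target
# for one property on one project (NOT actual FHWA-WFLHD or contractor data).
agency = [1.2, 0.8, 2.1, 1.6, 0.4, 1.9]
contractor = [0.3, -0.2, 0.9, 0.5, 0.1, 0.6, 0.4, -0.1, 0.7, 0.2]

F = variance_ratio(agency, contractor)
t = two_sample_t(agency, contractor)
print(f"F = {F:.2f}, t = {t:.2f}")
# Each statistic would be compared with a tabulated critical value at the
# 1% significance level to decide SD (significantly different) vs. NSD.
```

The 1% significance level makes a "significantly different" finding relatively hard to reach, which is consistent with the many NSD entries in the tables that follow.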

Table 59. Comparisons of FHWA-WFLHD and Contractor Aggregate Course Test Results

| Property       | n FHWA | n CONT | s² FHWA | s² CONT | Diff. | p-Value | Δ FHWA  | Δ CONT  | Diff. | p-Value |
|----------------|--------|--------|---------|---------|-------|---------|---------|---------|-------|---------|
| % Passing 1”   | 68     | 216    | 0.042   | 0.211   | SD    | <0.001  | -0.022  | 0.007   | NSD   | 0.611   |
| % Passing ¾”   | 30     | 96     | 3.547   | 4.518   | NSD   | 0.232   | 0.625   | 0.122   | NSD   | 0.248   |
| % Passing ½”   | 148    | 347    | 4.777   | 3.964   | NSD   | 0.085   | 1.541   | -0.010  | SD    | <0.001  |
| % Passing 3/8” | 36     | 103    | 19.332  | 12.354  | NSD   | 0.044   | 2.259   | 0.146   | SD    | 0.004   |
| % Passing #4   | 154    | 354    | 7.821   | 5.054   | SD    | 0.001   | -0.648  | -0.119  | NSD   | 0.024   |
| % Passing #10  | 136    | 290    | 4.461   | 3.839   | NSD   | 0.148   | -0.371  | 0.060   | NSD   | 0.040   |
| % Passing #40  | 154    | 354    | 2.805   | 3.139   | NSD   | 0.213   | -0.012  | 0.151   | NSD   | 0.330   |
| % Passing #200 | 148    | 348    | 1.558   | 0.968   | SD    | <0.001  | -0.022  | 0.065   | NSD   | 0.408   |
| LL             | 118    | 251    | 4.114   | 2.000   | SD    | <0.001  | -10.534 | -11.450 | SD    | <0.001  |
| PI             | 118    | 251    | 3.281   | 1.863   | SD    | <0.001  | 0.034   | -0.390  | SD    | 0.013   |
| SE/P200        | 13     | 80     | 0.074   | 0.061   | NSD   | 0.288   | 0.535   | 0.503   | NSD   | 0.668   |
| % Frac. Part.  | 135    | 331    | 81.514  | 90.453  | NSD   | 0.244   | 20.934  | 24.943  | SD    | <0.001  |

Specifications contain target values for gradation and PI, maximum values for LL, and minimum values for % fractured particles and for the ratio of the sand equivalent to the percent passing the #200 sieve (SE/P200).

The procedure FHWA-WFLHD uses to establish gradation targets affects interpretation of the comparisons of means. Gradation targets are set as the average of contractor tests, provided the average is within the specification's allowable limits. For example, if the allowable range for percent passing a sieve is 20 to 30% and the average for contractor tests is 27%, the target value would be 27%. This was the case for most of the 23 projects and accounts for the low gradation mean deviations for contractor tests. As a result, comparisons of the magnitudes of FHWA-WFLHD and contractor mean deviations from gradation targets are not meaningful.

The gradation means were significantly different for only 2 of 7 sieves. For the remaining properties, the FHWA-WFLHD and contractor test means were significantly different for 3 of 4 properties, and the contractor means were more favorable relative to specification limits for these 3 properties. The means for SE/P200 were not significantly different, and the FHWA-WFLHD mean was slightly more favorable relative to minimum specified values.

Means of contractor and FHWA-WFLHD tests from split samples were compared with paired t tests. These comparisons are summarized in Table 60. The comparisons of gradation means were the same as those for the entire contractor data set in Table 59: means for only 2 sieves (3/8” and 1/2”) were significantly different. The comparisons of paired tests for the remaining properties were the same as the comparisons for all data, except for PI; the means for paired PI tests were not significantly different (p-Value = 0.029).
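The paired t test used for Table 60 works on the differences between the two labs' results on each split sample, rather than on two independent samples. A minimal sketch, with hypothetical split-sample PI values (illustrative only, not project data):

```python
import math
import statistics as st

def paired_t(a, b):
    """Paired t statistic for split-sample results (same material, two labs)."""
    d = [ai - bi for ai, bi in zip(a, b)]
    return st.mean(d) / (st.stdev(d) / math.sqrt(len(d)))

# Hypothetical split-sample PI results (illustrative values only):
fhwa = [4.0, 3.5, 5.0, 4.2, 3.8, 4.6, 5.1, 3.9]
cont = [3.6, 3.4, 4.5, 4.0, 3.9, 4.1, 4.8, 3.7]
print(f"paired t = {paired_t(fhwa, cont):.2f}")
```

Pairing removes the sample-to-sample material variability from the comparison, which is why split samples are preferred for verification testing.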

Table 60. Comparisons of Paired FHWA-WFLHD and Contractor Aggregate Course Test Results

| Property       | n   | Δ FHWA  | Δ CONT  | Diff. | p-Value |
|----------------|-----|---------|---------|-------|---------|
| % Passing 1”   | 68  | -0.022  | -0.035  | NSD   | 0.810   |
| % Passing ¾”   | 30  | 0.625   | 0.053   | NSD   | 0.035   |
| % Passing ½”   | 148 | 1.541   | 0.071   | SD    | <0.001  |
| % Passing 3/8” | 36  | 2.259   | 0.135   | SD    | 0.006   |
| % Passing #4   | 154 | -0.648  | -0.181  | NSD   | 0.039   |
| % Passing #10  | 136 | -0.371  | 0.032   | NSD   | 0.023   |
| % Passing #40  | 154 | -0.012  | 0.121   | NSD   | 0.290   |
| % Passing #200 | 148 | -0.022  | 0.126   | NSD   | 0.054   |
| LL             | 118 | -10.534 | -11.178 | SD    | <0.001  |
| PI             | 118 | 0.034   | -0.246  | NSD   | 0.029   |
| SE/P200        | 13  | 0.535   | 0.425   | NSD   | 0.044   |
| % Frac. Part.  | 136 | 20.934  | 22.708  | SD    | 0.002   |

The comparisons of mean differences from target values for granular base are similar to the comparisons for HMAC: means are not consistently significantly different but, when they are significantly different, contractor tests are likely to be more favorable (LL, PI, and % fractured particles).

Project-by-project comparisons were made for the 9 properties in Table 61. Percentages passing the 1” and ¾” sieves and the ratio of the sand equivalent to the percent passing the #200 sieve (SE/P200) were omitted because individual project data were insufficient for meaningful comparisons. Projects with 5 or more FHWA-WFLHD tests were included. Previous project analyses defined a large project as one with 6 or more agency tests; however, these data included a number of projects with 5 FHWA-WFLHD tests, and including them greatly expanded the database.

The numbers and percentages of projects in column 3 of Table 61 indicate contractor gradation tests are consistently closer to targets. However, this is due to the designation of the project target percent passing as the average of contractor tests, provided this average is within specification tolerances. Figure 47 illustrates the resulting unusual distribution of project means for % passing the #200 sieve. A complete set of figures of project means and variances for all the properties in Table 61 is contained in Appendix E. The numbers and percentages of projects in column 4 indicate it is unlikely gradation means are significantly different.
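The project-level tallies in Table 61 amount to counting, across projects, which party's mean deviation is larger and whether each comparison is statistically significant. A sketch of that bookkeeping, with illustrative per-project values (not actual project results):

```python
# Bookkeeping for project-level tallies like those in Table 61.
# All values below are illustrative, not actual project results.
projects = [
    # (|FHWA mean diff|, |contractor mean diff|, p-value of the mean comparison)
    (1.8, 0.2, 0.004),
    (0.9, 0.3, 0.210),
    (2.4, 0.1, 0.008),
    (0.5, 0.4, 0.730),
]
ALPHA = 0.01  # significance level used for the verification comparisons

larger_fhwa = sum(1 for f, c, _ in projects if f > c)
sig_diff = sum(1 for _, _, p in projects if p < ALPHA)
sig_larger = sum(1 for f, c, p in projects if p < ALPHA and f > c)
print(f"{larger_fhwa} of {len(projects)} projects with larger FHWA deviation; "
      f"{sig_diff} significantly different; {sig_larger} SD with larger FHWA deviation")
```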

Table 61. Project Comparisons of FHWA-WFLHD and Contractor Aggregate Course Test Results

| Property       | Projects | Proj. with Larger FHWA Δ | Proj. with SD Δ | Proj. with Sig. Larger FHWA Δ | Proj. with Larger FHWA s² | Proj. with SD s² | Proj. with Sig. Larger FHWA s² |
|----------------|----------|--------------------------|-----------------|-------------------------------|---------------------------|------------------|--------------------------------|
| % Passing ½”   | 20       | 20 (100%)                | 5 (25%)         | 5 (25%)                       | 11 (55%)                  | 1 (5%)           | 1 (5%)                         |
| % Passing 3/8” | 5        | 4 (80%)                  | 0               | 0                             | 0                         | 0                | 0                              |
| % Passing #4   | 21       | 17 (81%)                 | 4 (19%)         | 4 (19%)                       | 18 (86%)                  | 1 (5%)           | 1 (5%)                         |
| % Passing #10  | 19       | 18 (95%)                 | 0               | 0                             | 10 (53%)                  | 0                | 0                              |
| % Passing #40  | 21       | 16 (76%)                 | 1 (5%)          | 1 (5%)                        | 8 (38%)                   | 0                | 0                              |
| % Passing #200 | 20       | 19 (95%)                 | 2 (10%)         | 2 (10%)                       | 10 (50%)                  | 1 (5%)           | 1 (5%)                         |
| LL             | 16       | 2 (12%)                  | 4 (25%)         | 0                             | 11 (69%)                  | 2 (12%)          | 2 (12%)                        |
| PI             | 16       | 9 (56%)                  | 3 (19%)         | 2 (12%)                       | 14 (88%)                  | 2 (12%)          | 2 (12%)                        |
| % Frac. Part.  | 18       | 7 (39%)                  | 6 (33%)         | 1 (6%)                        | 12 (67%)                  | 2 (11%)          | 2 (11%)                        |

[Figure 47. Project Percent Passing the #200 Sieve Means – FHWA-WFLHD Verification and Contractor QC. Scatter plot: FHWA mean difference from target (x-axis) vs. contractor mean difference from target (y-axis).]

[Figure 48. Project Percent Passing the #200 Sieve Variances – FHWA-WFLHD Verification and Contractor QC. Scatter plot: FHWA variance of difference from target (x-axis) vs. contractor variance of difference from target (y-axis).]

The numbers and percentages of projects in column 6 indicate no particular tendency for FHWA-WFLHD or contractor gradation variances to be larger, except for % passing the #4 sieve. The numbers and percentages of projects in columns 7 and 8 indicate that variances of gradation tests are not likely to be significantly different but that, if they are significantly different, the variances of FHWA-WFLHD gradation tests are always larger. These tendencies are illustrated in Figure 48 for % passing the #200 sieve.

The numbers and percentages of projects in column 3 for LL, PI, and % fractured particles indicate more favorable contractor test results relative to specification limits. The numbers and percentages of projects in columns 4 and 5 indicate some tendency for significant differences in means and that, if the means are significantly different, contractor tests are more favorable. These tendencies are illustrated in Figure 49 for LL. Upper limits for LL are specified.

The numbers and percentages of projects in column 6 for LL, PI, and % fractured particles indicate larger FHWA-WFLHD test variances. The numbers and percentages in column 7 indicate no strong tendency for significant differences in variances but, if the variances are significantly different, the numbers in column 8 indicate the variances for FHWA-WFLHD tests are always larger. These tendencies are illustrated in Figure 50 for LL.

[Figure 49. Project LL Means – FHWA-WFLHD Verification and Contractor QC. Scatter plot: FHWA difference from target (x-axis) vs. contractor difference from target (y-axis).]

[Figure 50. Project LL Variances – FHWA-WFLHD Verification and Contractor QC. Scatter plot: FHWA variance of difference from target (x-axis) vs. contractor variance of difference from target (y-axis).]

TRB's National Cooperative Highway Research Program (NCHRP) Web-Only Document 115: Using the Results of Contractor-Performed Tests in Quality Assurance includes select chapters of the contractor's final report on this project, which explores whether state departments of transportation can effectively use contractor-performed test results in the quality-assurance process. NCHRP Research Results Digest 323 summarizes the results and findings of this project.
