The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.
DATA MANAGEMENT ACTIVITIES

For data quality management practices to be effective and efficient, quality management methods need to be employed throughout the entire data collection process. Figure 11 summarizes the types of activities used for quality control and acceptance by the agencies that responded to the survey. It can be observed that several of the tools and methods used for quality control and acceptance are basically the same. This is probably one of the reasons why the two processes are often confused. However, the objective of the activities, the way they are conducted, and the personnel responsible for them are typically different in the two quality management phases, as discussed in chapter one.

The main techniques used by state and provincial DOTs for pavement data quality management are calibration of equipment and/or analysis criteria before the data collection, testing of "control" segments before and during data collection, and software routines for checking the reasonableness and completeness of the data. Similarly, 100% of the pavement data collection service providers indicated that they use calibration of equipment and/or analysis criteria before the data collection and software routines for checking the reasonableness and completeness of the data, and most (86%) reported that they use testing of "control" segments before and during data collection. These tools are briefly introduced in the following sections and are discussed in detail in chapter four.

- Personnel Training and Certification: Continuous training is very important to ensure that the personnel operating the equipment or conducting the visual surveys are properly trained. That the classification of the distresses is somewhat subjective makes training even more critical for the distress surveys. Some agencies require a formal "certification" of the pavement distress raters and equipment operators to verify that they have the required knowledge and skills.

- Equipment and Method Calibration, Certification, and Verification: This is to be conducted before the initiation of the data collection activities and periodically thereafter to verify that the equipment is functioning according to expectations and that the collection and analysis methods are being followed.

- Data Verification Procedures by Testing of Control or Verification Sites: These are used for both quality control and acceptance, before and during production. Typical verification techniques include periodic retesting of control or verification pavement segments, oversampling or cross-measurements, and reanalyzing or resurveying a sample of the sections measured by an independent evaluator. The locations of the sections can be known or unknown (blind) to the data collection crews.
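The software routines mentioned above — checks that the data fall within expected ranges and that no road segments or data elements are missing — can be sketched in a few lines. The following is a minimal illustration only; the field names, expected ranges, and sample records are hypothetical and not taken from any agency's actual specification.

```python
# Hypothetical reasonableness and completeness checks for pavement
# condition records. Field names, expected ranges, and sample data
# are illustrative, not from any agency's specification.

EXPECTED_RANGES = {
    "iri": (30.0, 500.0),        # roughness, e.g., inches/mile
    "rut_depth": (0.0, 3.0),     # e.g., inches
    "cracking_pct": (0.0, 100.0),
}

def range_check(record):
    """Return the names of fields whose values fall outside expected ranges."""
    return [field for field, (lo, hi) in EXPECTED_RANGES.items()
            if not lo <= record.get(field, lo) <= hi]

def completeness_check(surveyed_ids, inventory_ids):
    """Return inventory segments missing from the survey."""
    return sorted(set(inventory_ids) - set(surveyed_ids))

records = [
    {"segment": "A-001", "iri": 95.0, "rut_depth": 0.2, "cracking_pct": 4.0},
    {"segment": "A-002", "iri": 620.0, "rut_depth": 0.3, "cracking_pct": 2.0},
]
flagged = {r["segment"]: range_check(r) for r in records if range_check(r)}
missing = completeness_check([r["segment"] for r in records],
                             ["A-001", "A-002", "A-003"])
print(flagged)   # {'A-002': ['iri']}
print(missing)   # ['A-003']
```

In practice such routines would run against the full survey database and feed flagged records back to the quality control or acceptance reviewer rather than simply printing them.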
FIGURE 11 Percentage of state and provincial agencies using each quality control and acceptance activity (quality acceptance % / quality control %):

- Calibration of equipment and/or analysis criteria before the data collection: 80% / 94%
- Testing of known "control" segments before data collection: 73% / 94%
- Periodic testing of known "control" segments during production: 71% / 81%
- Software routines that check if the data is within the expected ranges: 71% / 57%
- Software routines that check for missing road segments or data elements: 61% / 55%
- Verification of the post-survey processing software/procedures: 48% / 47%
- Comparison with existing time-series data: 50% / 42%
- Statistical/software routines that check for inconsistencies in the data: 50% / 38%
- Cross-measurements (i.e., random assignment of repeated segments to different teams or devices): 27% / 26%
- Periodic testing of blind "control" segments during production: 21% / 24%
- Verification of sample data by an independent consultant: 12% / 4%
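The control-segment verification described above can also be automated: production measurements taken on a known reference segment are compared against its accepted values, and any metric deviating beyond a set tolerance is flagged for investigation. The sketch below assumes hypothetical reference values and tolerances; actual acceptance criteria vary by agency and metric.

```python
# Hypothetical verification of a control segment: production measurements
# are compared against accepted reference values, and any metric that
# deviates beyond its tolerance is flagged. All numbers are illustrative.

REFERENCE = {"iri": 102.0, "rut_depth": 0.25}   # accepted values for the site
TOLERANCE = {"iri": 5.0, "rut_depth": 0.05}     # allowable absolute deviation

def verify_control_segment(measured):
    """Return {metric: deviation} for metrics outside tolerance; empty if it passes."""
    failures = {}
    for metric, ref in REFERENCE.items():
        deviation = abs(measured[metric] - ref)
        if deviation > TOLERANCE[metric]:
            failures[metric] = round(deviation, 3)
    return failures

print(verify_control_segment({"iri": 104.0, "rut_depth": 0.24}))  # {} -> passes
print(verify_control_segment({"iri": 110.0, "rut_depth": 0.33}))  # both metrics flagged
```

A blind control site would use the same comparison; the only difference is that the data collection crew does not know which segment is being used for verification.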