The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.
CHAPTER FIVE

CASE STUDIES

This chapter documents the data management practices of a select group of transportation agencies. The case studies include an agency that conducts most of the data collection in-house and three agencies that contract most of the network-level pavement condition data collection to data collection service providers and use different quality acceptance approaches.

MARYLAND

MDSHA uses an in-house automated system to measure smoothness, rutting, and cracking, in addition to other data such as right-of-way images, longitudinal and transverse slopes, and GPS coordinates (66). The network-level data collection process includes: (1) data management, (2) preprocessing, (3) processing, (4) quality control, (5) quality acceptance (denoted as quality assurance), (6) classification and rating, and (7) data reduction. The quality control and acceptance procedures for the automated crack detection are discussed by Groeger et al. (66).

Quality Control

As discussed in chapter four, the quality control plan includes checks to verify that all fields are processed, reviews of section-level data in a search for abnormalities, and checks to verify that all the data have been saved. The reviewer then inputs a subjective evaluation of the crack detection process (good, fair, or poor).

Section-Level Review

The section-level review is conducted by examining the total quantity of cracking by station and searching for abnormalities; for example, a road segment with many spikes. The operator reviews the segments with abnormalities by manually superimposing the detected cracks on the actual pictures. At the time the plan was published, this process was applied to approximately 50% of the pictures, and the goal was to recognize 80% of the cracks. The operator also looks at the last rehabilitation date and verifies that the amount of cracking is consistent with the age of the surface; for example, a recently rehabilitated pavement would have little cracking.

Subjective Rating

In the last step, the operator verifies that the data have been saved and inputs a subjective evaluation of the crack detection process, following a set of recommendations summarized in Table 5. For example, if more than 90% of the stations reviewed pass the crack detection criteria and all the data were saved to the hard drive and the network, the batch is given a "good" rating.

Repeatability and Accuracy Examination

MDSHA has also implemented a quality control program to monitor data repeatability and the accuracy of test equipment using a test loop. The experiences of Virginia and other states have been incorporated into the program. The test loop is measured 20 times at the beginning of each data collection season and then run once every three weeks during the season. To analyze the data accuracy of a particular test-loop run, the moving average of all the previous runs, including the initial 10-run results, is taken as the reference value for that particular test. Each run is also compared with the previous one to check data repeatability.

Quality Acceptance

Quality acceptance is performed by a quality assurance auditor who is not the equipment operator. This process verifies that the data collection and quality control processes have been conducted properly. The independent auditor checks the data management spreadsheet, verifies that the data are complete, verifies that the data have been saved and backed up, and rechecks a random sample of 10% of the collected data files using the same procedure used for quality control. This sample includes any files with comments that are out of the ordinary. If there is one discrepancy in a file, it is noted on the data management form. If more than two discrepancies are detected, 50% of the file is reviewed to determine whether there is a systematic error. If more than 10% of the quality acceptance samples have discrepancies, consideration is given to repeating the crack detection process (66). All data are backed up on the server daily and copied to tapes once a week.

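The test-loop accuracy and repeatability checks described for MDSHA can be sketched as follows: the reference value for a run is the moving average of all previous runs, and each run is also compared with the one immediately before it. The tolerance threshold is an assumed placeholder, since the report does not state the actual limits:

```python
def check_accuracy(previous_runs, new_run, tolerance_pct=5.0):
    """Accuracy check for one test-loop run.

    The reference value is the mean (moving average) of all previous
    runs, including the initial beginning-of-season runs.
    tolerance_pct is an assumed, illustrative threshold.
    """
    reference = sum(previous_runs) / len(previous_runs)
    deviation_pct = abs(new_run - reference) / reference * 100
    return deviation_pct <= tolerance_pct


def check_repeatability(previous_run, new_run, tolerance_pct=5.0):
    """Repeatability check: compare a run with the one before it."""
    return abs(new_run - previous_run) / previous_run * 100 <= tolerance_pct
```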
OCR for page 43
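The quality acceptance sampling and per-file escalation rules described for the independent auditor can be sketched as below; the function names, return strings, and file representation are illustrative, since the source describes the rules only in prose:

```python
import random


def acceptance_sample(data_files, flagged_files, sample_frac=0.10):
    """Build the quality-acceptance sample: a random 10% of the
    collected files, plus every file whose comments were flagged
    as out of the ordinary."""
    n = max(1, round(sample_frac * len(data_files)))
    sample = set(random.sample(data_files, n))
    sample.update(f for f in flagged_files if f in data_files)
    return sample


def file_escalation(discrepancy_count):
    """Per-file escalation rule from the MDSHA procedure."""
    if discrepancy_count == 0:
        return "pass"
    if discrepancy_count <= 2:
        return "note on data management form"
    # More than two discrepancies: check for a systematic error.
    return "review 50% of file"


def batch_decision(sample_discrepancy_counts):
    """If more than 10% of the sampled files have discrepancies,
    consideration is given to repeating crack detection."""
    frac = (sum(1 for d in sample_discrepancy_counts if d > 0)
            / len(sample_discrepancy_counts))
    return "consider repeating crack detection" if frac > 0.10 else "accept"
```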
TABLE 5
EXAMPLE OF QUALITY CONTROL RATING MATRIX USED BY MDSHA

QC Procedure                              Good            Fair              Poor
Stations Processed                        100%                              <100%
Criterion 1: Detected >80% of Cracking    >90% Stations   70%-90% Stations  <70% Stations
Data Saved to Hard Drive                  Yes                               No
Data Saved to Network                     Yes                               No

Source: Groeger et al. (66).

Classification and Rating

After the data have been processed for crack detection, quality control, and quality acceptance, the next step in the process is to classify and rate cracks using an automated process. Cracks are classified as longitudinal or transverse; their locations in the pavement (outside wheel path, inside wheel path, center, left edge, or right edge) are determined; and the severity is rated as low, medium, or high using the AASHTO cracking protocol definition and the crack width determined by the system.

Data Reduction

The cracking data are reduced to a condition rating of 0 to 100 and assigned a condition state of very good, good, fair, mediocre, or poor using a software program known as the MDSHA Automated Distress Analysis Tool. This tool also performs the final quality check through a suite of logic, range, and trend checks on the data and generates a progress report to document the pace of data collection and data processing. If the checks detect any problems, the file is flagged and a note is written to an error log. If the file passes all the checks, the data are converted into U.S. units and reformatted to match PMS specifications, and the pavement condition index (PCI) is calculated.

Time-Series Comparisons

MDSHA also monitors network-level time-series data with the help of a software tool developed and implemented in 2004. This quality acceptance tool checks the reasonableness of the data trend. The tool is routinely used to test each data submission and includes two steps: data trend monitoring and data quality investigation.

Data Trend Monitoring

MDSHA uses the percentage of the network in acceptable condition, called the acceptable rate, to monitor the network-level pavement condition. A software program summarizes the acceptable rates in terms of IRI, rutting, cracking, and friction individually by route, county, district, or statewide. A table is prepared for each of the pavement condition indicators along with the values obtained for the last five years. As an example, Table 6 shows 2 years of the 5-year comparison for IRI for a sample of routes. For each year, columns 1 through 5 indicate the percentage of pavements in each condition state, and the last column shows the percentage of road in acceptable condition (states 1 through 3).

TABLE 6
IRI TREND MONITORING BY ROUTE

               2007                                       2006
Route   1      2      3      4      5     Acc. Rate   1      2      3      4      5     Acc. Rate
IS68E   41.04  36.57  18.16  3.23   1     95.77       30.6   37.81  25.87  4.23   1.49  94.28
IS68W   33.83  41.79  20.65  3.48   0.25  96.27       24.38  43.78  27.11  2.99   1.74  95.27
MD35N   0      33.33  58.33  4.17   4.17  91.67       0      33.33  58.33  8.33   0     91.67
MD35S   0      33.33  58.33  4.17   4.17  91.67       0      41.67  50     4.17   4.17  91.67
MD36N   12.97  40.61  29.35  7.85   9.22  82.94       9.62   39.52  34.02  9.28   7.56  83.16
MD36S   12.24  42.52  28.91  7.48   8.84  83.67       11.6   41.98  32.08  7.85   6.48  85.67
MD47N   0      11.76  58.82  23.53  5.88  70.59       0      17.65  58.82  23.53  0     76.47
MD47S   0      5.88   76.47  17.65  0     82.35       0      11.76  70.59  17.65  0     82.35
MD49E   0      9.09   72.73  9.09   9.09  81.82       0      10     80     10     0     90
MD49W   0      10     90     0      0     100         0      10     90     0      0     100

Source: W. Xiong, personal communication, 2008.
Note: Columns 1-5: the percentage of pavements in each condition state; last column: the percentage of road in acceptable condition. In bold, routes where pavement condition indicators differ by more than 2% from the previous year. Acc. = acceptable.