Automated Pavement Distress Collection Techniques (2004)

Chapter: Chapter Eight - Case Studies

Suggested Citation:"Chapter Eight - Case Studies." National Academies of Sciences, Engineering, and Medicine. 2004. Automated Pavement Distress Collection Techniques. Washington, DC: The National Academies Press. doi: 10.17226/23348.


This chapter covers case studies of agencies applying various methods to pavement condition data collection and processing. The studies were performed to provide additional insight into agency practices. The questionnaire responses, and the materials appended to those responses by some agencies, were thoroughly reviewed, and three agencies meeting the general criterion of approaching the work in different ways were chosen. All are state transportation agencies. One does everything with agency resources, whereas another does the same work exclusively through contract. The third accomplishes most of its tasks through a vendor, but with significant agency input to data processing. The studies incorporate work by two of the major North American vendors: the Roadware Group, Inc., in the first two cases, and Pathway, Inc., in the third. All three agencies appear to have been reasonably successful in accomplishing their goals of securing high-quality network-level pavement condition data.

MARYLAND STATE HIGHWAY ADMINISTRATION

Introduction

The Maryland State Highway Administration (MDSHA) provides a case study of an agency that has successfully moved from an essentially manual data collection and analysis process to one that is fully automated. In Maryland, all pavement condition data collection and processing is performed by agency resources using purchased data collection equipment, processing hardware and software, and training. Much of the agency's work in improving the quality of the collected data has been documented in TRB publications over the past few years. For a case history of automated collection and processing of pavement cracking data, Maryland engineers provided background in a 2003 TRB paper, summarized here (49).

The MDSHA started collecting cracking data on its roadways in 1984. Typically, the data were collected by teams of inspectors riding in vans.
However, with recent reengineering of the agency, the old process began to create resource and logistical problems. In response, over the past few years, the MDSHA pavement management group has developed and implemented a state-of-the-art network-level crack detection process (49).

The agency manages some 16,000 lane-mi of highway pavement and spends approximately $100 million annually to maintain and preserve the network. To use the funding wisely, the agency developed and implemented a pavement management system with the appropriate data analysis and performance modeling tools. Pavement cracking is an important input to the performance indicators those models require.

Data Collection

In 1995, the state purchased an ARAN device and software that enabled the capture of pavement images and the analysis of pavement cracking for the network. "After a period of experimentation and development, an efficient, accurate, and repeatable process evolved" (49). Ride, rutting, and cracking data are all collected with the ARAN by running one lane in each direction, or the curb lane for multilane roads. As a result, some 10,000 lane-mi per year are collected. The report describes the ARAN as follows:

The ARAN vehicle is equipped with state-of-the-art equipment to collect information about Maryland's highway infrastructure. The combination of high resolution digital video, ultrasonic sensors, accelerometers, gyroscopes, Global Positioning Systems (GPS), and a distance measuring device are used to collect data at highway speeds. As it travels, it collects information on rutting and roughness, grade, and curve radius. It also collects right-of-way digital video. Digital photographs of the pavement view are taken by two rear-mounted, downward-looking cameras (49).

The profiling capabilities of the Maryland ARAN exceed the requirements for a Class II profiling device.
A 37-sensor rut bar measures transverse pavement profile and determines the amount and severity of rutting. In addition, digital pavement images are collected and stored on removable hard drives. These images are processed by agency personnel at an off-line workstation.

Maryland indicated that its data collection procedures have matured to become a robust part of its business process. The agency noted:

Many customers within and outside the State use this data and it is now integrated with the Geographic Information System (GIS). By pushing a button within the GIS, a full database of inventory, performance, and right-of-way data can be accessed. The use of digital right-of-way images has now been rolled-out statewide and, as a result, the Pavement Division has seen a large increase in the number and variety of customers who are utilizing this information.

Maryland engineers noted that the automated procedures described here apply only to flexible pavements, because visual surveys are done on PCC pavements.

Image Processing

Each week the ARAN crew brings the pavement images into the office on removable hard drives. The files are archived and the following tasks, as given in the report, are performed (49):

• Data management,
• Pre-processing,
• Processing,
• Quality control,
• Quality assurance,
• Classification and rating, and
• Data reduction.

Data Management

Paper log sheets containing location-reference information and other collection information are entered into a database and serve as important measures of progress in covering the system. The data are cross-checked with inventory to make sure that the appropriate road sections are being collected. This database, along with another tool described in the data reduction portion of this document, monitors crack detection progress. The database is updated continuously during data collection and processing.

Pre-Processing

Pre-processing begins with loading data from the archived files into the WiseCrax processing computer. These data include a control file containing location-reference information and providing a tie-in between roadway locations and pavement images. The file is segmented into 0.016-km (0.01-mi) increments. The corresponding images are stored as JPEG digital files. No data can be processed without the control file.

To speed processing and review, pavement history files are reviewed to identify new overlays. Those no more than 2 years old are assumed to have zero cracking and are not processed. The WiseCrax program is then started and the control file and images are loaded.
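The overlay screening rule described above can be sketched as follows. This is a hedged illustration only; the section identifiers and the (section_id, overlay_age_years) data shape are assumptions, not MDSHA's actual data model.

```python
# Minimal sketch of the pre-processing rule: sections overlaid within
# the last 2 years are assumed crack-free and skipped; older sections
# go on to WiseCrax processing.

def sections_to_process(sections: list[tuple[str, float]]) -> list[str]:
    """Return IDs of sections whose most recent overlay is older than
    2 years; newer overlays are assigned zero cracking and skipped."""
    return [sid for sid, age in sections if age > 2]

# Hypothetical history records for illustration
history = [("MD-100-A", 1.5), ("MD-100-B", 4.0), ("I-95-C", 0.5)]
print(sections_to_process(history))  # ['MD-100-B']
```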
Images are then reviewed by the operator to ensure that they are of suitable quality to be processed. This review consists of making sure that the images are clear and that the lighting is even. Following this review, a series of adjustments is made to the software to ensure that the crack detection input parameters are acceptable. MDSHA engineers noted, "This is the most important part of the crack detection process and it involves a great deal of experience by the operator to perform well" (49). Although the details of this part of the process are not given by the agency, the MDSHA does report that "the manufacturer's user's guide and training ensure that operators are skilled to perform this task." The MDSHA engineers continued:

As part of this process, the operator performs trial crack detection on a variety of stations within the road section. During this process, the automated crack detection results are compared to a manual review of the data on the computer screen. This is done to ensure that crack detection is occurring correctly. Parameter settings are sensitive to changes in pavement color and texture so, in the end, a compromise set of parameters that best apply to the entire section is usually chosen. It is also possible to choose separate crack detection parameters for individual sections within a pavement, but this option is not normally performed in MDSHA due to time and efficiency constraints. Identification of 80 percent or greater of all visible cracks is the benchmark used to determine if crack detection is adequate (49).

Processing

Processing is done in the automatic mode of the WiseCrax program, which processes the images at an equivalent speed of 21 to 27 km/h (13 to 17 mph), depending on processor speed, the amount of cracking, and other factors. Because no human intervention is required, most processing is done overnight.
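The 80% visible-crack benchmark quoted above amounts to a simple recall check against a manual reference review. The following is a sketch under assumed names and counts; the report gives only the threshold, not an implementation.

```python
# Hypothetical sketch of the 80% crack-recognition benchmark: cracks
# found by the automated run are compared against cracks visible in a
# manual on-screen review of the same stations.

def recognition_rate(detected: int, visible: int) -> float:
    """Fraction of manually identified (visible) cracks that the
    automated run also detected."""
    if visible == 0:
        return 1.0  # nothing to find counts as fully detected
    return detected / visible

def detection_adequate(detected: int, visible: int,
                       benchmark: float = 0.80) -> bool:
    """True if the trial detection meets the 80% benchmark."""
    return recognition_rate(detected, visible) >= benchmark

# Example: 42 of 50 visible cracks detected is 84%, which passes
print(detection_adequate(42, 50))  # True
```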
Quality Control

QC of the processed files is very important and usually is carried out by the person who started the program for a given run. It is done as soon as the run is completed and consists of three reviews: completeness, section-level data, and data management. The completeness review ensures that all files have been processed. The section-level review determines whether the program is detecting most cracking; it is accomplished both through subjective evaluation by an experienced operator and through direct comparison of detected cracking to superimposed images. Approximately 50% of the 0.016-km (0.01-mi)-long stations are reviewed, because this high level of QC has been found to be feasible and warranted to produce good cracking data. The goal of the process is to achieve 80% crack recognition, for this level has been found to be sufficient for network-level surveys. Finally, the data management review is conducted to ensure that the data have been saved to the WiseCrax computer hard drive and then to the network. Table 17 summarizes the QC rating criteria applied by Maryland.

Quality Assurance

The Maryland QA process is of particular interest because it demonstrates the concern that the agency has for securing good-quality data and the extent to which one must go to achieve that objective. A QA auditor (QAA) does a weekly or biweekly review of the process to ensure that the QC has been properly conducted. For a given group of sections, the QAA must be someone other than the person who ran the original WiseCrax on the sections. The QAA inspects the previous week's documentation and then selects a 10% sample of the data for QA review. Maryland engineers do not cite a random sampling requirement for this 10%. An equal number of files for each operator are evaluated. The main review consists of conducting the QC process as described and assigning QC ratings as shown in Table 17. For files rated "Fair" in Table 17, disposition is at the discretion of the QAA. Any ratings not in agreement with the originals are noted. Two evaluation criteria are then applied:

• If more than two discrepancies are found within one file, 50% of the file must be reviewed for the possibility of a systematic crack detection problem; and
• From that 50%, if more than 10% have discrepancies, consideration is given to throwing out all data and repeating the WiseCrax process for the files represented.

All rating discrepancies are discussed with the appropriate operators and a consensus is reached as to the final rating to be assigned. Finally, the QAA ensures that all data have been backed up to the network daily and archived to a tape drive once a week. MDSHA engineers clearly consider those data to be very valuable and worthy of a sound QA program and of solid backup and archival procedures.

Classification and Rating

Upon completion of data QC and QA, the next step is to classify and rate the cracking detected. WiseCrax performs this process automatically at a rate of approximately 1300 km/h (800 mph) once the operator enters the proper commands. Longitudinal and transverse cracks are classified while their location on the pavement (outside wheel path, inside wheel path, center, left edge, or right edge) and widths are determined. Proprietary algorithms perform this procedure.
Finally, the cracks are rated as low, medium, or high severity through application of the AASHTO cracking standard (17). The final result is a text file of location-reference and cracking data. MDSHA produces about 1 million rows of such data each year.

Data Reduction

The raw data from the classification and rating process are used in the computation of a PCI. MDSHA uses a combination of the AASHTO cracking protocol (17) and the PCI developed by the U.S. Army Corps of Engineers (19) to compute the condition index.

Performance Indicators

MDSHA engineers noted that the ultimate use of the cracking data is as an overall performance indicator in performance models. They anticipate that an overall condition index combining roughness, rutting, and cracking eventually will be developed. To date they have not developed that index, because reliable cracking data have only recently become available through the new WiseCrax process.

Lessons Learned

Maryland engineers cited a number of lessons learned from their work with the automated crack detection and classification program. In the belief that these lessons are of special value to those considering automated distress collection and data reduction, they are listed here exactly as given in the report (49):

• Automated crack detection is a viable technology that is "ready for primetime."
• Performing (the process) in-house is a large resource commitment in terms of equipment purchase, personnel training, and operator time.
• The key to quality cracking data is to take a phased approach to implementation. Take each step slowly and work out all bugs before proceeding to the next step.
• Rigorous QC and QA are paramount. Large amounts of effort should be devoted to this cause.
• Partner with the manufacturer of equipment you use. Learn from them and allow personnel to attend training offered by the vendor. More was learned in two days with the vendor than in 10 weeks on our own.
• Secure commitment from above. The implementation process is time and resource intensive and progress sometimes appears to be slow.
• Validate your data and your process at each stage of implementation. "Ground-truth" resulting data as much as possible in the field by comparing office-generated data with actual field conditions.
• Keep it as simple as possible.

TABLE 17
MARYLAND PROCESSING QC RATING MATRIX

QC Procedure           Good               Fair                  Poor
Stations Processed     100%               —                     <100%
>80% Cracks Detected   >90% of stations   70%–90% of stations   <70% of stations
Saved to Hard Drive    Yes                —                     No
Saved to Network       Yes                —                     No
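One plausible reading of Table 17's rating matrix is sketched below. The report does not state how the four criteria combine into a single rating, so the combination logic here (any Poor criterion makes the file Poor; all Good criteria make it Good; otherwise Fair) is an assumption for illustration.

```python
# Hedged sketch of assigning a Table 17 QC rating to a processed file.
# The combination rule is an assumption, not stated in the report.

def qc_rating(pct_stations_processed: float,
              pct_stations_with_80pct_cracks: float,
              saved_to_hard_drive: bool,
              saved_to_network: bool) -> str:
    """Rate a file Good, Fair, or Poor per the Table 17 thresholds."""
    # Any criterion in the Poor column makes the file Poor.
    if (pct_stations_processed < 100
            or pct_stations_with_80pct_cracks < 70
            or not saved_to_hard_drive
            or not saved_to_network):
        return "Poor"
    # Crack detection above 90% of stations earns Good; 70%-90% is Fair.
    if pct_stations_with_80pct_cracks > 90:
        return "Good"
    return "Fair"

print(qc_rating(100, 95, True, True))  # Good
print(qc_rating(100, 85, True, True))  # Fair
print(qc_rating(98, 95, True, True))   # Poor
```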

Related Research

In work related to cracking data collection and reduction, MDSHA performed an evaluation in 2002 of the AASHTO protocol on quantifying cracks in asphalt pavement surfaces (79). The work spanned more than a year and consisted of a pilot study of feasibility, a survey to ensure that the results were compatible with expert opinion, and a production study to determine the standard's utility at the network level. It was concluded that the protocol is suitable for use at the network level. The MDSHA process as described previously is not entirely compliant with the AASHTO standard, however, owing to hardware, software, and even policy issues. Nevertheless, MDSHA engineers deemed their procedure to be "well within the spirit" of the AASHTO protocol.

Conclusion

The MDSHA has done a comprehensive study of the application of the WiseCrax method of pavement cracking identification and classification and finds the procedure to be ready for use on a network basis. The agency concedes that the process must be undertaken deliberately, that personnel must be fully trained through partnering with the equipment provider, and that a full agency commitment is necessary. Conclusions from the Maryland work include the following:

• A rigorous QA process is a must for automated surface distress collection and processing;
• WiseCrax cannot accurately detect cracks on Maryland concrete pavements or bridge decks, generally because the striations in the relatively coarse texture are detected as cracks; and
• A crack recognition level of 80% has been found to be sufficient for network-level surveys.

LOUISIANA DEPARTMENT OF TRANSPORTATION AND DEVELOPMENT

Introduction

Another agency chosen as a case study is the LADOTD. Essentially all of LADOTD pavement condition data collection and processing is done by contract, a sharp contrast with Maryland.
Although the LADOTD has not published a comprehensive study of its work, as Maryland did with the WiseCrax work, the agency has provided extensive operational and contractual data that make up the overall case history.

LADOTD began the current effort concurrent with the release of The Road Information Program (TRIP) report showing that Louisiana had the second worst pavement conditions in the nation, with 27% of the state's major roads rated as poor (80). Although the TRIP report was done for the Louisiana Associated General Contractors, Inc., it relied on FHWA data for its assessment. Contract and other LADOTD documents do not refer to the TRIP report, but there is little question that the agency was under considerable pressure to produce its own objective and efficient assessment of pavement conditions. LADOTD issued an RFP in January 2000 and received a proposal from Roadware Group, Inc., on March 7, 2000. The subsequent contract, discussed here, was an outgrowth of that proposal.

General Contract Provisions

The 2001 contract was awarded on October 25 and encompassed 32 000 lane-km (20,000 lane-mi) of comprehensive data collection and analysis. Following a period of experience with the original contract, including some renegotiation of contract provisions and pricing, a supplementary agreement was executed in June 2002. The work covered all nine districts administered by LADOTD. Included were GPS data; digital right-of-way and pavement images; IRI, faulting, and rutting measurements; and pavement distress evaluation (58). Furthermore, a workstation for agency review of images and validation of distress data was to be provided at an LADOTD office. Other general contract provisions addressed the following:

• A QC program,
• A master progress schedule,
• The collection and processing system configurations, and
• Specific data requirements.
The QC program requires the contractor to administer a plan that will ensure that data are accurately collected and that they reflect the actual pavement condition within specified precisions. The contractor's equipment is checked against an agency profiler and a Class I profiling instrument (Dipstick, etc.) before testing begins. During production, the contractor is required to use verification sites of known IRI, rutting, and faulting values. These sites are permitted to "roll"; that is, the contractor is not required to use the same sites all the time. Rather, the contractor may, on a given day, test a site that was tested 1 week previously. These reruns are evaluated to determine whether the profiler is still in calibration. Such tests are documented in writing and delivered to the agency weekly. This feature is helpful in testing over a widespread area, because extensive backtracking is not necessary to reach the verification sites.

The master progress schedule was to address scheduling of data collection over the nine districts beginning within 90 days of the notice to proceed. The contractor was to deliver data for at least one and no more than two districts per month beginning in November 2002, with all nine to be delivered by May 15, 2003. The contract did not address the reason for the two-district maximum, but it is presumed to relate to the time required for LADOTD evaluation of deliverables. Liquidated damages of $100 per day were to be assessed for each day that

data for the required number of districts were not delivered on time, and $500 per day for each day the data for all nine districts were not delivered on time. Finally, damages of $300 per day were to be assessed for each day that the final report was late.

Specific sensor data requirements are given in Table 18. Additional requirements were for at least three transverse laser sensors on a bar not to exceed 2.4 m (8 ft) in length for rut depth measurements, for continuous faulting measurements 0.3 to 0.9 m (1 to 3 ft) from the outside edge, and for statistical parameters (averages and standard deviations for both wheel paths) of IRI measurements.

The agreement called for the collection of pavement distress data in accordance with LADOTD protocols (81,82) based on the LTPP distress identification manual (18). These protocols include the distresses given in Table 19. The maximum and minimum values of each of these distresses are to be determined in Task 1 of the data collection portion of the contract discussed here.

PCI Data Collection, Quantification, and Reporting

This portion of the contract called for work to be done in four tasks as summarized here:

TABLE 18
LOUISIANA SENSOR DATA COLLECTION REQUIREMENTS (58)

Roughness:
• Scope: All pavements
• Definition: Longitudinal profile, both wheel paths
• Sampling: Max., 0.3 m (1 ft)
• Calculation and statistics: IRI, each wheel path and average of both wheel paths
• Units: Inches/mile
• Equipment configuration: Lasers and accelerometers, both wheel paths
• Standards: ASTM E950, HPMS Field Manual Class II
• Precision and bias: Max. error of 5% bias or 0.3 m/km (20 in./mi), whichever is less
• Initial verification: Section comparison of longitudinal profile with Class I profiling instrument and LADOTD's SD laser profiler
• Ongoing quality monitoring: QA/QC sections
• Special requirements: Correct/report low-speed sections; capability of monitoring data collection in real time in the data collection vehicle
• Reporting frequency: 0.16 km (0.10 mi)

Rut Depth:
• Scope: Asphalt surfaces
• Definition: Rutting of each wheel path
• Sampling: Max., 3 m (10 ft)
• Calculation and statistics: Each transverse profile of both wheel paths, for section report average
• Units: Inches [nearest 2.5 mm (0.1 in.)]
• Equipment configuration: Min., 3 laser sensors
• Standards: None given
• Precision and bias: Contractor to provide
• Initial verification: Test section comparison with field measurements provided by LADOTD
• Ongoing quality monitoring: QA/QC sections
• Special requirements: Capability of monitoring data collection in real time in the data collection vehicle
• Reporting frequency: 0.16 km (0.10 mi)

Faulting:
• Scope: Jointed concrete
• Definition: Elevation difference across joint (trailing slab lower)
• Sampling: All transverse joints
• Calculation and statistics: Wheel path absolute elevation difference averaged for each joint, for section report average
• Units: Inches [nearest 1 mm (0.04 in.)]
• Equipment configuration: Lasers in right wheel path
• Standards: None given
• Precision and bias: Contractor to provide
• Initial verification: Test section comparison with field measurements provided by LADOTD
• Ongoing quality monitoring: QA/QC sections
• Special requirements: Capability of monitoring data collection in real time in the data collection vehicle
• Reporting frequency: 0.16 km (0.10 mi)
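As an aside on Table 18, the roughness precision requirement (maximum error of 5% bias or 0.3 m/km, whichever is less) can be expressed as a simple acceptance check. This is a sketch for illustration, not contract language; the function name is an assumption.

```python
# Illustrative check of the Table 18 roughness tolerance: a measured IRI
# may differ from the reference IRI by at most 5% of the reference or
# 0.3 m/km (20 in./mi), whichever is less.

def iri_within_tolerance(measured: float, reference: float) -> bool:
    """IRI values in m/km; allowable error is the lesser of 5% of the
    reference value and 0.3 m/km."""
    allowable = min(0.05 * reference, 0.3)
    return abs(measured - reference) <= allowable

# Example: for a reference of 2.0 m/km the allowable error is
# min(0.10, 0.3) = 0.10 m/km
print(iri_within_tolerance(2.08, 2.0))  # True
print(iri_within_tolerance(2.15, 2.0))  # False
```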

• Task 1—Preliminary activities, in which the contractor did additional calibration testing, calibrated raters in identifying and classifying typical distresses, and delivered the LADOTD workstation.
• Task 2—Data collection by district, in which clear digital pavement images of all roads were required. These images were required to be location-reference identified to the nearest 0.0016 km (0.001 mi) and to provide resolution sufficient to identify cracks 3 mm (0.125 in.) wide. The images were to be loaded weekly onto an approximately four-terabyte server provided by the contractor in Task 1. Raw sensor data were also to be included in the Task 2 deliverables. This portion of the contract specified that roadway locations with unacceptable image quality were to undergo data collection again at no additional cost to the agency.
• Task 3—Distress quantification, in which distresses were evaluated and reported in 0.16-km (0.10-mi) increments and tied to the LADOTD's location-reference system. LTPP protocols and the Louisiana distress identification manual were specified for this distress data reduction. In that section of the contract, the agency reserved the right to review images and data quantification on the provided workstation and to require the contractor to resolve any problems with the quantified distress data. No specifics of this process are given. The D-Rate procedure discussed in chapter three was to be used for distress quantification.
• Task 4—Final documentation of the project, including a summation of the project detailing problems, solutions, and final outcome. Final documentation was also to include a final QC report and a report of proposed and actual schedules of work accomplishment.

Conclusion

The Louisiana contract is ongoing and no assessment of its success or failure has been released.
The LADOTD estimated the cost of delivered data at approximately $32 per km ($54 per mi) and did not note any special problems with the contract in response to the synthesis questionnaire.

Although the responses to the questionnaire provided little information on acceptance criteria, they did in part address the question of data reasonableness. For example, the maximum reporting value for IRI is given as 10 m/km (632 in./mi), whereas the minimum for joint faulting is 5 mm (0.20 in.), as provided by the AASHTO provisional standard. Furthermore, discussion with the agency management systems engineer (S.M. Ismail, LADOTD, personal communication, August 2003) revealed that data have been delivered in compliance with contract requirements. Digital images are delivered weekly for the agency to check quality. Once each month, and according to the delivery schedule, the contractor delivers data for one district, including digital images and distress data. The agency has 2 weeks in which to review the delivery. Anything considered not acceptable is rejected, and the contractor must correct the deliverable even if it is necessary to recollect the data.

At least two features of the Louisiana case are worthy of additional comment. The first is the use of rolling sensor data calibration sites, the concept being that a site production-tested one week may be tested as a calibration check the following week. Use of the rolling site does not replace a calibration requirement; rather, it is intended as a screening activity to determine whether a recalibration is needed. In view of the large systems managed by some agencies, this approach may be a reasonable interim QM tool.

A second feature is the provision for the definition of maximum and minimum values of distress data items during the preliminary phase of the contract.
This type of requirement is not unusual, and it points to the lack of solid pavement databases reflecting typical data variabilities and extreme values. The issue is closely related to the broader question of how pavement data items can and should be characterized. As found throughout the development of this synthesis, there is a great need for a focused study of pavement data items to determine typical variabilities so that defensible contract provisions, especially in the QA area, can be developed.

TABLE 19
LOUISIANA SURFACE DISTRESSES TO BE COLLECTED

• Asphalt pavements: cracking (alligator, block, longitudinal, transverse, reflection); miscellaneous (patching, patch deterioration, potholes).
• Jointed concrete pavements: cracking (longitudinal, transverse); miscellaneous (patching, patch deterioration).
• Continuously reinforced concrete pavements: cracking (longitudinal, transverse); miscellaneous (patching, patch deterioration, punch-outs).

MISSISSIPPI DEPARTMENT OF TRANSPORTATION

Introduction

MDOT was chosen as the third case study; its data collection and processing activities are done by a vendor, but with specific protocols provided by the agency. Whereas both of the previous case studies depended on equipment and

processes provided by one vendor, this third case represents the use of a competing vendor. Mississippi provided contractual and procedural attachments in responding to the synthesis questionnaire.

Background for the Mississippi work is provided by the pavement management overview given in a research report found on the agency's website (83). In 1986, MDOT contracted with the University of Mississippi to implement a pilot pavement management system in one district. Included was a rudimentary database containing distress and roughness data for the entire state-maintained system within the district. Later, the products of the university-developed pilot were used by MDOT to establish a statewide pavement management system. At that time, the database was expanded to include location reference, lane widths, roadway lengths, county and route information, and inventory and historical information (original construction as well as overlays). Presently, a contractor collects data every 2 years to assess the overall condition of the state's highways. Data collected include the IRI, a pavement condition rating (PCR), rut depth, joint faulting, and texture. Cracking, potholes, patching, punch-outs, and joint deterioration data are collected on two 150-m (500-ft)-long samples per mile within each analysis section.

Data Collection

A contractor has collected pavement condition data in Mississippi every 2 years since 1991 (83). Laser-equipped profilers carry GPS receivers that collect coordinate data while the profilers collect the pavement profile data used to generate IRI, rutting, and faulting values. Roughness, rutting, and faulting data are collected on 100% of the state-maintained system and on the HPMS roadways in the state. The vans also carry five video cameras used to capture images of the shoulders, wheel paths, and front perspective views.
Images of the pavement surface require a minimum shutter speed of 4,000 frames per second, which provides the clear, crisp resolution needed for distress analysis purposes.

The present contract provides for the inventory videologging and longitudinal profiling of approximately 34 700 km (21,700 mi) of state-maintained and non-state-maintained roadway (84). The non-state-maintained roadways inventoried are part of the state's HPMS mileage, and the contract serves as a means of meeting FHWA requirements for reporting on those roadways. Data are obtained from the outside lane in both directions for divided highways and in an easterly or northerly direction for undivided highways.

Sensor data are collected for 100% of the length of highways being surveyed at a maximum sampling interval of 25 cm (10 in.). A South Dakota-type profiler equipped with three laser height sensors and a real-time graphical display is required for sensor data collection. The agency does not specify rut-measuring methodology, but it reserves the right of approval of sensor orientation before the start of work.

The contractor is required to maintain a library of field data tapes in a fireproof enclosure until completion of the contract, when they will be provided to the agency. An additional set is furnished to the agency as soon as it is available. The agency provides the contractor with established file formats for the capture of all data items. These formats must be adhered to by the contractor, but they may be supplemented with additional files as the contractor requires.

Data Review and Processing

In addition to providing sensor data for the 20 900 km (13,000 mi) of state-maintained roadways, the contractor provides distress data reduction from the interpretation of digital images. The various distresses applicable to each pavement type are separately evaluated for location, severity, and extent.
Although either automated or manual processing is acceptable, the contract stated, “automated distress interpretation shall require, as a minimum, ninety-five (95) percent reliability and must have prior written approval by the COMMISSION before use.” Without stating a methodology, the contract indicated that “the COMMISSION reserves the exclusive right to make the determination of whether an automated distress interpretation scheme attains the ninety-five (95) percent reliability level” (84).

In the execution of the current contract, the data reduction process involves having individuals identify the various distresses from digitized video images; that is, a manual method is used. In this process, images are digitized at a frequency of approximately one every 15 m (50 ft). A distress evaluation whereby cracks, potholes, punch-outs, etc., are measured is then performed on the digitized images. The sampling technique used ensures that approximately 20% coverage of the state-maintained system is achieved in the distress evaluation.

Quality Control and Quality Assurance

The Mississippi data collection contract (84) provides that an overall QC plan for the verification of the accuracy of reduced data shall be presented for agency review and approval before the start of field surveys. Additional requirements were that the contractor shall provide accuracy and precision data from previous work and that the QC plan address both roughness and distress data.

In the current contract, QC procedures consist of vendor units traversing approved calibration sites in the district where work is ongoing (84). Such sites are strategically located in the districts. To develop a baseline for the district, any vendor unit must test all sites before beginning data collection. Agency IRI, rutting, and faulting statistics for the calibration sites are derived from manual data using an agency South Dakota profiler, a rut bar, and the Georgia Faultmeter, respectively.
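A minimal sketch of this calibration-site check: a vendor unit’s traverse is compared against the agency’s manually derived baseline using the precision thresholds specified in the contract (IRI ±0.30 mm/m, rutting ±0.23 cm, faulting ±0.18 cm). The `traverse_ok` helper and the sample values are hypothetical, not part of the contract:

```python
# Sketch of the daily calibration-site check (hypothetical helper and
# data; the tolerances are the contract's precision thresholds).

IRI_TOL = 0.30     # +/- mm/m (+/-19 in./mi)
RUT_TOL = 0.23     # +/- cm (+/-0.09 in.)
FAULT_TOL = 0.18   # +/- cm (+/-0.07 in.)

def traverse_ok(measured: dict, baseline: dict) -> bool:
    """True if the unit's traverse agrees with the agency baseline
    within all three precision thresholds."""
    return (abs(measured["iri"] - baseline["iri"]) <= IRI_TOL
            and abs(measured["rut"] - baseline["rut"]) <= RUT_TOL
            and abs(measured["fault"] - baseline["fault"]) <= FAULT_TOL)

# Hypothetical baseline from the agency South Dakota profiler, rut bar,
# and Georgia Faultmeter, and one vendor traverse that drifts on IRI.
baseline = {"iri": 1.50, "rut": 0.40, "fault": 0.20}
traverse = {"iri": 1.95, "rut": 0.45, "fault": 0.25}

print(traverse_ok(traverse, baseline))  # False: the IRI difference
                                        # (0.45 mm/m) exceeds the limit
```

Under the agreed procedure, a failed check would trigger the discard rule: everything collected since the previous acceptable traverse is disregarded.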
After initial approval in a district, each unit is required to make a traverse of one calibration site before the beginning of data collection each day. Precision thresholds for sensor-collected data for each traverse are IRI = ±0.30 mm/m (±19 in./mi), rutting = ±0.23 cm (±0.09 in.), and faulting = ±0.18 cm (±0.07 in.). The procedure agreed to by the contractor and the agency is to disregard any data between an unacceptable calibration site traverse and the previous acceptable traverse. Because of the daily traverse requirement, no more than 1 day’s work should be lost through this procedure.

Agency review of distress data involves an individual positioned at a video workstation who conducts QA on randomly selected samples. A 5% sampling of each pavement type, and all sections one-half mile or shorter, is examined (85). Distresses checked are cracking, potholes, spalling, and punch-outs. The LTPP Distress Identification Manual (18) is used as the standard for distress types and severity levels.

At this time, the agency does not provide written acceptance criteria for distress data determined from image evaluation. Instead, there is a contract provision stating, “It is understood that the work required of the CONSULTANT under this contract shall meet the normal standards of the state-of-the-art and state-of-the-practice for distress and roughness information and shall be performed to the satisfaction and approval of the COMMISSION.” Because those standards are not spelled out, it is likely that the state would use more specific acceptance criteria if databases existed to support appropriate limits and variabilities of the data.

Data Reporting

The state requires Excel spreadsheets for the overall PCR, the distress rating, and the roughness rating for each section by district. Excel files that list and quantify incremental distresses in each section with location, severity, and extent also are required. The format for those files is given in the contract.

Use of the Data

Mississippi uses pavement condition data to compute three indices: a roughness rating, a distress number, and a general PCR consisting of a combination of roughness and surface distress. The focus of the data is at the network level, so the data are used primarily to show the condition of the system as a whole or of a particular class of roads. Data are also used to monitor the performance of pavements over time and for long-term budgeting. Other uses include life-cycle cost analyses of various pavement types and of pavement rehabilitation and maintenance treatments. The data are sometimes used at the project level, but mainly as a tool to plan projects and evaluate performance so that allocated funds are spent more efficiently.

Two other important uses of pavement condition data identified by the agency are the following (83):

1. MDOT has let its first warranty job contract, which was aided by pavement management data. Standards for job acceptance based on pavement condition were developed, and a CD-ROM was made showing typical distress features and severity levels. This CD-ROM will be used to illustrate to contractors what is and is not acceptable for warranty jobs in the future.

2. The chief engineer uses PMS data to show how pavement condition declines over time if maintenance is not done, to request funding for the state’s four-lane system. In that way, PMS data support the concept of pavements being a long-term investment.

Conclusion

MDOT is implementing a comprehensive PMS that requires the data described in this case study. The agency has taken great strides in just a few years to meet its data needs through contracted data collection and processing. Although the reduction of pavement distresses from pavement images is not yet automated in practice for the state, such automation is desired and is a contract option at this time. The agency requires the contractor to develop and execute a QC plan while the state exercises some degree of QA on the delivered data. However, data acceptance procedures are not clearly spelled out in the current contract, suggesting a need for additional research in the QC/QA area.

SUMMARY

The three case histories on automated pavement data collection and processing show that agencies approach the matter in very different ways. Maryland prefers to do all collection and processing in-house, whereas Louisiana has similar work done entirely by a vendor. Mississippi also has most of the work done by a vendor, but it uses different procedures and software.

One of the contrasts between Maryland and the other two agencies is the enormous amount of work that Maryland has put into maturing its processes. It has taken the state years (since 1995 with the current equipment) to develop the technologies to the point where it is now very confident of the data received, provided that a rigorous QA program is used. It is reasonable to anticipate that other agencies can learn much from the Maryland experience. It is also reasonable to expect that the other two agencies highlighted are fairly early in the learning process. Although Louisiana has a QA process in place, it does not have a great deal of experience with how it works, although it has invoked some contractor penalties. Mississippi is developing its QA program as it proceeds with the work. The agency also has not developed a set of specifications on deliverables; it is depending heavily on the vendor to provide high-quality data in a timely manner. Only time and experience will determine whether the experiences in Louisiana and Mississippi will be as positive as that of Maryland.

TRB’s National Cooperative Highway Research Program (NCHRP) Synthesis 334: Automated Pavement Distress Collection Techniques examines highway community practice and research and development efforts in the automated collection and processing of the pavement condition data typically used in network-level pavement management. The scope of the study covered all phases of automated pavement data collection and processing for pavement surface distress, pavement ride quality, rut-depth measurements, and joint-faulting measurements. Included in the scope were technologies employed, contracting issues, quality assurance, costs and benefits of automated techniques, monitoring frequencies and sampling protocols in use, degree of adoption of national standards for data collection, and the contrast between the state of the art and the state of the practice.
