
Bridge Inspection Practices (2007)

Chapter Five - Quality Programs

Suggested Citation:"Chapter Five - Quality Programs." National Academies of Sciences, Engineering, and Medicine. 2007. Bridge Inspection Practices. Washington, DC: The National Academies Press. doi: 10.17226/14127.

U.S. federal regulations make QC and QA the responsibilities of each state’s bridge inspection program. Quality program activities may include office reviews of inspection programs, field review of inspection teams, refresher training for inspection staff, and independent reviews of inspection reports and computations.

The FHWA provides recommendations for QC/QA programs at state DOTs (18). Recommended procedures for QC include:

• Documentation of QC responsibilities of inspection program staff,
• Documentation of required qualifications for staff titles in the inspection program,
• A process for tracking the qualifications of current staff,
• Procedures for review and validation of inspection reports and data, and
• Procedures for identification and resolution of errors in inspection reports.

Recommended procedures for QA include:

• Documentation of QA responsibilities of inspection program staff,
• Procedures for office review and field review of inspection programs,
• Procedures for disqualification and requalification of inspection team leaders and inspection consulting firms, and
• Procedures for validation of QA programs.

QA reviews should verify bridge lists for underwater, fracture-critical, and other specific inspections, and follow up on critical findings. QA should verify a sample of inspections and reports. QA reviews should document their outcomes and recommend improvements to inspection programs.

QUALITY PROGRAMS OF U.S. STATE DEPARTMENTS OF TRANSPORTATION

Information on QC/QA programs of U.S. state DOTs is presented in the order of FHWA’s framework. Detailed responses can be found in the tables in Appendix G.

Quality Control Documentation

Thirty of 37 DOTs have or are preparing documentation of their QC/QA programs. Documentation appears as part of DOT bridge inspection manuals, as bulletins and directives, or as standard forms that are used in the course of QC/QA activities (see Table G1).
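Several of the recommended QC procedures above (review and validation of inspection reports and data, and identification of errors) are implemented at some DOTs as software checks for valid data entries. The following is a minimal Python sketch of such an automated check; the field names are hypothetical and not drawn from any particular DOT system, and only the 0 to 9 scale of NBI condition ratings is standard:

```python
# Sketch of an automated QC validity pass over one inspection report record.
# Field names such as "nbi_deck_rating" and "team_leader_id" are invented
# for illustration; NBI condition ratings use the standard 0-9 scale.

REQUIRED_FIELDS = {"bridge_id", "inspection_date", "team_leader_id",
                   "nbi_deck_rating", "nbi_superstructure_rating",
                   "nbi_substructure_rating"}
RATING_FIELDS = {"nbi_deck_rating", "nbi_superstructure_rating",
                 "nbi_substructure_rating"}

def qc_errors(report: dict) -> list[str]:
    """Return a list of QC errors found in one inspection report record."""
    errors = []
    # Flag any required fields that are absent from the record.
    for field in sorted(REQUIRED_FIELDS - report.keys()):
        errors.append(f"missing field: {field}")
    # NBI condition ratings must be integers on the 0-9 scale.
    for field in sorted(RATING_FIELDS & report.keys()):
        value = report[field]
        if not (isinstance(value, int) and 0 <= value <= 9):
            errors.append(f"{field}: {value!r} is not an NBI rating (0-9)")
    # A blank team leader identifier prevents verifying qualifications.
    if "team_leader_id" in report and not report["team_leader_id"]:
        errors.append("team_leader_id is blank (cannot verify qualifications)")
    return errors

report = {"bridge_id": "B-1042", "inspection_date": "2005-06-14",
          "team_leader_id": "CBI-117", "nbi_deck_rating": 6,
          "nbi_superstructure_rating": 7, "nbi_substructure_rating": 12}
print(qc_errors(report))  # flags the out-of-range substructure rating
```

A real DOT system would go further, for example cross-checking the team leader’s Certified Bridge Inspector number against personnel records, as described in the sections that follow.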
Program Staff Role in Quality Control and Quality Assurance

Thirty-five of 37 state DOTs and Eastern Federal Lands identified staff responsible for QC or QA activities (see Table G2). For nearly all of these personnel, QC/QA is one area in a larger set of job responsibilities. At 11 DOTs, the inspection program manager, or equally qualified staff, is directly involved in QC/QA. Most DOTs use peer team leaders for QC review of inspection reports and periodic QA reviews of districts. Two DOTs have central inspection teams that perform QC/QA activities.

Quality Control of Inspector Qualifications

Eight of 37 DOTs track an identified population of qualified team leaders, often by use of unique Certified Bridge Inspector numbers assigned to leaders. The team leader enters the Certified Bridge Inspector number on inspection reports. Twenty-four DOTs rely on personnel records or a personnel database having records of training and experience for team leaders. During QA review, personnel records provide verification that inspection leaders meet National Bridge Inspection Standards requirements (see Table G3).

Quality Control Review of Inspection Reports

Thirty-two of 38 state DOTs and Eastern Federal Lands perform QC review of all inspection reports. Reviewers may be peer team leaders, regional DOT staff, central DOT staff, or software applications performing checks for valid data entries. Nine DOTs review all reports plus a sample of reports. The sample is reviewed by the district program manager or other higher-level staff. Four DOTs do special QC review for bridges with fracture-critical members, load-posted bridges, or bridges in poor condition (see Table G4). Twelve DOTs track the progress of field inspections, reports, and report reviews as a QC activity (see Table G5).

Quality Control Field Reviews

Fifteen of 36 DOTs make QC field visits to inspection teams at work or field verifications of inspection reports. QC may be as frequent as twice-per-month verifications of one or a few bridge inspection reports or as infrequent as one or two field reviews of teams per inspection cycle. Six DOTs make QC site visits or field verifications targeted at every inspection team. Field visits are logged, and the results of field QC are recorded and discussed with the inspection team (see Table G6).

Quality Control of Inspections by Consultants

Thirteen of 35 DOTs delegate QC review to inspection consultants as a part of their contract work. Twelve DOTs perform their own QC reviews of consultants’ inspection reports (see Table G7).

Quality Control Program Validation

Sixteen of 36 DOTs reported methods for validation of QC programs in addition to the use of QA review. State DOTs approve QC plans of local government inspection programs and of inspection consultants. A state DOT may use annual review by the FHWA as a measure of validation of the DOT’s QC program. Four DOTs rely on check inspections of a sample of bridges to validate QC programs. Two DOTs view annual training of staff as a way to maintain effectiveness of QC programs (see Table G8).

QUALITY ASSURANCE

Activities in Quality Assurance Reviews of Inspection Programs

QA reviews are verifications of the organization and execution of bridge inspection programs. QA reviews determine whether inspection programs have qualified staff and adequate equipment. QA verifies that appropriate progress, records, identifications, and follow-up are achieved. Thirty of 39 DOTs make QA reviews that are directed at districts and local government inspection programs (16), at inspection leaders and teams (15), or at samples of bridge inspection reports (6). Sampling of inspection reports may be within a district or statewide. Other DOTs (six) are developing their QA policies or extending QC review to address QA needs (see Table G9).

Fifteen DOTs make QA reviews of inspection office procedures and records.
QA reviews verify:

• Staff qualifications and training, including refresher training.
• Bridge lists, especially lists of bridges having fracture-critical members, scour-critical bridges, posted bridges, bridges needing dive inspections, bridges needing access equipment, and bridges needing interim inspections.
• Records of critical findings, repair recommendations, and staff follow-up.
• Planning, scheduling, and progress of inspection work, including report review and acceptance.

Office review may include the review of a sample of inspection reports (nine DOTs), usually through comparison of condition ratings and maintenance recommendations with photographs and inspectors’ notes.

As a part of QA, field review can take several forms:

• Independent inspections by QA teams with subsequent comparison with current inspection reports (seven DOTs).
• Field verification of inspection reports by QA review teams generating lists of differences in condition ratings and other findings (seven DOTs).
• Field review of current inspection reports performed jointly by QA review teams and inspectors of record.
• Site visits of QA review teams to inspection teams at work (three DOTs).

Selection of bridges for QA review may be random (14 DOTs), based on bridge condition or special features (15 DOTs), or targeted at specific inspection leaders or teams (7 DOTs). QA review may include as few as two bridges or as many as 50% of inspections for the current cycle (see Table G10).

QA reviews produce reports of the review and its findings, often with a set of recommendations for continuing improvement of inspection work. Eleven of 30 DOTs employ standard forms, checklists, or questionnaires in QA review, and these become part of QA reports.

Intervals for Quality Assurance Review

Nineteen of 37 DOTs reported on intervals for QA review of inspection leaders and/or inspection teams. Intervals range from 1 to 36 months.
Nineteen DOTs reported on intervals for QA review of district and/or local government inspection programs. Intervals range from 12 months to 48 months (see Table G11).

Aspects of Quality Assurance Review of Bridge Inspections

Thirty DOTs reported items in QA review of bridge inspections. Most DOTs identify five items:

• Discovery of deterioration (21 DOTs).
• Recognition of critical conditions (24 DOTs).
• Accuracy of condition ratings (26 DOTs).
• Thoroughness of inspection reports (24 DOTs).
• Appropriate methods of inspection (17 DOTs).

Tolerances Used in Quality Assurance Review

Twenty-five of 32 DOTs reported tolerances used in QA reviews. Twenty-one DOTs reported a tolerance of ±1 for NBI condition ratings. Nine DOTs reported a tolerance on bridge load rating, with 10% being a common limit on differences. Twelve DOTs reported tolerance on element-level condition ratings, with ±1 condition state being a common value (see Table G12).

Benchmarks in Quality Assurance Reviews

DOTs that perform QA reviews of samples of bridge inspection reports can track accuracy of condition ratings as a benchmark of program quality. Various aspects of program compliance, such as timely completion of inspection reports, completion by staff of refresher training, and up-to-date bridge lists, each might serve as a measure of program quality. Most DOTs include these aspects in QA reviews. Few DOTs reported the use of any of these as benchmarks (see Table G13).

Disqualification of Inspection Program Staff

Fifteen of 32 DOTs reported on grounds for disqualification of inspection program personnel. Common concerns included timely completion of work (4 DOTs), accuracy and consistency of inspection findings (10 DOTs), and inadequate response to QA advice for improvement to performance (3 DOTs) (see Table G14).

Six DOTs allow requalification of team leaders after retraining. Remedies for poor performance, short of disqualification, include additional training, counseling or coaching, and further quality review (18 DOTs). Poor performance can affect career advancement of DOT personnel and selection of inspection consultants (11 DOTs) (see Table G15).

QUALITY PROGRAMS—FOREIGN PRACTICE

Denmark

QC activities in Denmark include:

• Review of all Principal inspection field reports by a peer bridge inspector.
• Review of data entry by experienced data personnel and verification by the bridge inspector.
• Comparison of field measurements over several inspection cycles.
• Automated checks within the bridge database system.
• Automated alerts for missing data as reports are generated.

Finland

Finnra uses automated checks in its bridge database for QC of inspection data. There are no other checks. Instead, Finnra emphasizes QA by inspector certification and training.
Consultants to Finnra must propose and implement inspection quality programs. These plans differ among consultants.

France

France implements ISO 9000 to direct its QC program. ISO 9000 is a set of standards for quality management published by the International Organization for Standardization.

Germany

In Germany, QC is a matter for the individual states. The federal ministry has no direct involvement. Bridge data and the use of the bridge management system are monitored by BASt. When errors in data are apparent, the federal ministry is notified and the state is asked to resolve the errors.

South Africa

In South Africa, QC is performed by inspection consultants. Typically, the degree-extent-relevancy component ratings and inspectors’ notes are compared with supporting photographs. Inspection data are entered into SANRAL’s bridge management system by consultants. Printouts of these data must be reviewed and signed by inspectors. In addition, the bridge management system performs automated checks of inspection data.

Sweden

Sweden uses standard inspection forms and the existing bridge record to guide inspectors and to ensure that all needed inspection tasks are completed. There is no independent review of inspection reports.

United Kingdom

Contract provisions for inspection services address some aspects of QC. Supervising engineers must sign inspection reports. Maintenance agents are required to have third-party review of inspection reports. Timely completion of reports, accuracy and completeness of bridge data, and provision of adequate equipment to inspectors are all aspects that may be tracked as measures of contract performance.

In addition, the administrator for the Structure Management Information System (SIMS), the Highways Agency’s bridge management system, makes spot checks on bridge data. Inspection reports that have errors are returned to the maintenance agent and ultimately the supervising engineer for the inspection.
Serious or persistent errors are recognized as poor service by the contractor, and these could influence future contract awards.

Quality Assurance

Among the nations included in this synthesis, QA usually entails training and workshops. Denmark, Finland, and Germany all conduct annual workshops for bridge inspectors, and all of these workshops include field inspections. Denmark and Germany use field work to recalibrate inspectors. Finland collects quantitative measures of accuracy of condition ratings and evaluates the performance of individual inspectors. In South Africa, SANRAL’s QA is a program of independent reinspection of 2% of bridge inspections per year. Sweden has no periodic QA program, but instead relies on contract supervision to ensure consistent work among consultants. In the United Kingdom, the detailed inspections that are made in preparation for repair projects are viewed as verification of previous inspection reports. These offer a measure of inspection quality.

Denmark

In Denmark, each bridge inspector is required to complete a QA review every year. Over a two-day period, teams inspect a number of selected bridges. Results are compared team by team, and the differences are discussed. Each year, different bridges are selected for this exercise. The outcomes of the reviews can include further training for inspectors, improvements to inspection procedures, or improvements to Danbro software. The Directorate views each Special inspection as a verification of conditions and previous inspection reports. Special inspections are done as needed. There is no sampling of bridges for QA review at a regular interval.

Finland

Finnra holds an Advanced Training Day each year at which certified inspectors participate in general inspections of two bridges. These two bridges are also inspected by a select group of Finnra personnel. Inspection data from individual inspectors are compared with Finnra results. Deviations are computed, and quantitative measures of the accuracy of the inspectors’ work are obtained. Finnra sets limits on permissible deviations, allowing larger deviations for evaluation of individual defects and smaller deviations in the overall evaluation of a bridge. Finnra central administration tracks the quality of the inspection program with the quantitative measures.

Inspection results are discussed with inspectors. The control inspections are used, in part, as refresher training for inspectors. The quality of work at Advanced Training Days affects awards of inspection contracts. Repeated, large deviations by an inspector can result in the loss of certification.

Similar control inspections are made within Finnra regions as well.
The number of control inspections for a region depends on the number of bridges inspected in the past year (Table 76).

Germany

In Germany, continuing training for bridge inspectors occurs at annual federal conferences conducted by BMVBS and lasting 2 or 3 days. Discussions at each conference focus on interesting bridges, as well as problems and new developments in bridge inspection. One day is spent in field observations of structures. The conference is held in a different state each year. Some states require attendance at the conference by their inspectors, whereas other states either do not require attendance or require attendance in only some years.

Other QA procedures, such as sampling of bridges and independent verification of inspection findings, are not performed.

South Africa

South Africa performs two activities for QA. First, when a consultant starts a contract for inspection services, SANRAL conducts an inspection workshop to calibrate all inspectors. The workshop and a briefing on inspection methods are mandatory for all inspectors who will participate in the contract.

Second, verification inspections are done for 2% of Principal inspections each year by senior bridge inspectors. If many and/or large discrepancies are found, a new Principal inspection may be ordered.

A third, though informal, type of QC is a product of the contract award process. As groups of bridges pass from one inspection firm to another, inspections by the new firm offer a verification of previous work.

QA can affect the tender process. Evidence of negligence in consultant work is grounds for disqualification from further work.

QA efforts do not evaluate or track individual inspectors. This too is a product of the tender process: there is no permanent inspection staff.

Sweden

In Sweden there is informal QA for inspection consultants. SRA staff acquires knowledge of consultants’ competence during the course of inspection contract work.
Firms that do not meet SRA expectations do not obtain further contracts for inspection services.

United Kingdom

In the United Kingdom, specific programs for QA are the responsibility of the maintenance contractor. The Highways Agency views the detailed inspections in preparation for repair projects as a verification of conditions at bridges.

Bridge data records stored as part of SIMS, the bridge management system, have been collected for about 5 years. The Highways Agency will engage a contractor to undertake a records health check for existing data.

Bridge data quality is considered in continuing development of SIMS. Here, the Highways Agency works cooperatively as one member of a users group made up of agencies using the bridge management system.

TABLE 76
NUMBER OF FINNISH QC INSPECTIONS IN 2005

No. of Inspected Bridges    No. of Control Inspections
1–100                       2
101–300                     3
>300                        4
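The banded mapping in Table 76, from the number of bridges a region inspected in the past year to the number of control inspections required, can be expressed as a small lookup. The following Python sketch is illustrative only; the function name is invented:

```python
def finnra_control_inspections(bridges_inspected: int) -> int:
    """Number of regional control inspections per Table 76 (Finland, 2005).

    The function name is hypothetical; the bands and counts are those
    reported in Table 76: 1-100 bridges -> 2 control inspections,
    101-300 -> 3, more than 300 -> 4.
    """
    if bridges_inspected < 1:
        raise ValueError("expected at least one inspected bridge")
    if bridges_inspected <= 100:
        return 2
    if bridges_inspected <= 300:
        return 3
    return 4

# One example from each band of Table 76.
for n in (50, 150, 500):
    print(n, "->", finnra_control_inspections(n))
```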

TRB's National Cooperative Highway Research Program (NCHRP) Synthesis 375: Bridge Inspection Practices examines bridge inspection practices in the United States and selected foreign countries. The report explores inspection personnel (staff titles and functions, qualifications, training and certification, inspection teams, and the assignment of teams to bridges), inspection types (focus, methods, and frequency), and inspection quality control and quality assurance. The report also reviews the uses agencies make of information gathered from bridge inspections, what triggers repairs, and plans for future development of inspection programs.
