...as frequent as twice-per-month verifications of one or a few bridge inspection reports, or as infrequent as one or two field reviews of teams per inspection cycle. Six DOTs make QC site visits or field verifications targeted at every inspection team. Field visits are logged, and the results of field QC are recorded and discussed with the inspection team (see Table G6).

Quality Control of Inspections by Consultants

Thirteen of 35 DOTs delegate QC review to inspection consultants as a part of their contract work. Twelve DOTs perform their own QC reviews of consultants' inspection reports (see Table G7).

Quality Control Program Validation

Sixteen of 36 DOTs reported methods for validation of QC programs in addition to the use of QA review. State DOTs approve QC plans of local government inspection programs and of inspection consultants. A state DOT may use annual review by the FHWA as a measure of validation of the DOT's QC program. Four DOTs rely on check inspections of a sample of bridges to validate QC programs. Two DOTs view annual training of staff as a way to maintain effectiveness of QC programs (see Table G8).

QUALITY ASSURANCE

Activities in Quality Assurance Reviews of Inspection Programs

QA reviews are verifications of the organization and execution of bridge inspection programs. QA reviews determine whether inspection programs have qualified staff and adequate equipment. QA verifies that appropriate progress, records, identifications, and follow-up are achieved. Thirty of 39 DOTs make QA reviews that are directed at districts and local government inspection programs (16), at inspection leaders and teams (15), or at samples of bridge inspection reports (6). Sampling of inspection reports may be within a district or statewide. Other DOTs (six) are developing their QA policies or extending QC review to address QA needs (see Table G9).

Fifteen DOTs make QA reviews of inspection office procedures and records. QA reviews verify:

· Staff qualifications and training, including refresher training.
· Bridge lists, especially lists of bridges having fracture-critical members, scour-critical bridges, posted bridges, bridges needing dive inspections, bridges needing access equipment, and bridges needing interim inspections.
· Records of critical findings, repair recommendations, and staff follow-up.
· Planning, scheduling, and progress of inspection work, including report review and acceptance.

Office review may include the review of a sample of inspection reports (nine DOTs), usually through comparison of condition ratings and maintenance recommendations with photographs and inspectors' notes.

As a part of QA, field review can have several forms:

· Independent inspections by QA teams with subsequent comparison with current inspection reports (seven DOTs).
· Field verification of inspection reports by QA review teams generating lists of differences in condition ratings and other findings (seven DOTs).
· Field review of current inspection reports performed jointly by QA review teams and inspectors of record.
· Site visits of QA review teams to inspection teams at work (three DOTs).

Selection of bridges for QA review may be random (14 DOTs), based on bridge condition or special features (15 DOTs), or targeted at specific inspection leaders or teams (7 DOTs). QA review may include as few as two bridges or as many as 50% of inspections for the current cycle (see Table G10).

QA reviews produce reports of the review and its findings, often with a set of recommendations for continuing improvement of inspection work. Eleven of 30 DOTs employ standard forms, checklists, or questionnaires in QA review, and these become part of QA reports.

Intervals for Quality Assurance Review

Nineteen of 37 DOTs reported on intervals for QA review of inspection leaders and/or inspection teams. Intervals range from 1 to 36 months. Nineteen DOTs reported on intervals for QA review of district and/or local government inspection programs. Intervals range from 12 to 48 months (see Table G11).

Aspects of Quality Assurance Review of Bridge Inspections

Thirty DOTs reported items in QA review of bridge inspections. Most DOTs identify five items:

· Discovery of deterioration (21 DOTs).
· Recognition of critical conditions (24 DOTs).
· Accuracy of condition ratings (26 DOTs).
· Thoroughness of inspection reports (24 DOTs).
· Appropriate methods of inspection (17 DOTs).

Tolerances Used in Quality Assurance Review

Twenty-five of 32 DOTs reported tolerances used in QA reviews. Twenty-one DOTs reported a tolerance of ±1 for NBI condition ratings. Nine DOTs reported a tolerance on bridge load rating, with 10% being a common limit on differences. Twelve DOTs reported tolerance on element-level
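The QA selection and tolerance figures reported by the DOTs lend themselves to a simple comparison rule. The sketch below is illustrative only and is not drawn from any DOT's program: the function names, the 0–9 NBI rating scale, and the 5% default sample fraction are assumptions for the example; only the ±1 rating tolerance, the 10% load-rating limit, and the random-selection method come from the survey responses.

```python
import random

# Illustrative sketch (not a DOT procedure): flag a QA re-inspection
# for follow-up when it differs from the inspection of record by more
# than the commonly reported tolerances.

NBI_TOLERANCE = 1              # +/-1 NBI condition rating point (21 DOTs)
LOAD_RATING_TOLERANCE = 0.10   # 10% limit on load-rating differences (9 DOTs)

def rating_within_tolerance(record_rating: int, qa_rating: int) -> bool:
    """Compare the NBI condition rating of record with the QA team's
    independent rating (0-9 scale assumed)."""
    return abs(record_rating - qa_rating) <= NBI_TOLERANCE

def load_rating_within_tolerance(record_lr: float, qa_lr: float) -> bool:
    """Relative difference in load rating, measured against the
    rating of record."""
    return abs(record_lr - qa_lr) / record_lr <= LOAD_RATING_TOLERANCE

def sample_for_qa(bridge_ids, fraction=0.05, seed=None):
    """Random selection of bridges for QA review -- one of the three
    selection methods reported (random, condition-based, team-targeted).
    Never selects fewer than two bridges, the smallest extent reported."""
    rng = random.Random(seed)
    n = max(2, round(fraction * len(bridge_ids)))
    return rng.sample(list(bridge_ids), min(n, len(bridge_ids)))
```

A condition rating pair of (6, 5) passes the ±1 tolerance, while (6, 4) would be flagged; a load-rating pair of (30.0, 25.0) exceeds the 10% limit and would likewise be listed among the differences in the QA report.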