APPENDIX A: Survey of Bridge Engineers
CONTENTS

A.1 BACKGROUND AND PURPOSE ................................. A-5
A.2 SURVEY PARTICIPATION ................................... A-5
A.3 SUMMARY OF KEY FINDINGS ................................ A-7
    Training and Qualifications ............................ A-7
    Manuals ................................................ A-7
    Data Quantification, Accuracy and Precision ............ A-8
    Use of Elements and Sub-Elements ....................... A-8
    Quality Assurance / Quality Control .................... A-9
    Inspection Process and Procedure ....................... A-11
A.4 DETAILED SURVEY RESULTS ................................ A-14
    Bridge Element Inspection Experience ................... A-14
    Background of Inspection Personnel ..................... A-15
    Qualification of Inspectors ............................ A-16
    Inspection Manual ...................................... A-19
    Use of Data ............................................ A-23
    Data Quantification, Accuracy and Precision ............ A-24
    Inspection Manual ...................................... A-25
    Quality Assurance and Quality Control .................. A-27
    Inspection Process and Procedure ....................... A-31
    Optional Question ...................................... A-39
REFERENCES ................................................. A-42
LIST OF FIGURES

Figure A-1. Pie chart showing distribution of responses for question 32 ................ A-10
Figure A-2. Image showing example bridge provided in the survey ........................ A-11
Figure A-3. Bar graph showing the survey response to question regarding estimating defect areas ... A-12
Figure A-4. Bar graph showing estimated time to complete a first-time and routine element-level inspection ... A-13
Figure A-5. Agencies' Experience with Bridge Element Inspection ........................ A-14
Figure A-6. Agencies' practice for conducting routine bridge inspection ................ A-16
Figure A-7. Agencies' types of bridge inspection team assignment ....................... A-19
Figure A-8. Agencies with/without element level inspection manual ...................... A-20
Figure A-9. Agencies' plans for developing an element level manual ..................... A-22
Figure A-10. Agencies' quality assurance procedure ..................................... A-28
Figure A-11. Agencies' quality control procedure ....................................... A-31
Figure A-12. Question 42 deck element .................................................. A-33
Figure A-13. Methods used to estimate the CS's of the deck element ..................... A-34
Figure A-14. Sources of the method used for Deck CS's estimation ....................... A-35
Figure A-15. Sounding frequency to determine the area of damage in the deck ............ A-36
Figure A-16. Time required for subsequent element level inspection of the bridge ....... A-37
LIST OF TABLES

Table A-1. NCHRP 12-104 Survey Respondents' Status ..................................... A-6
Table A-2. Summary of results for questions 25, 26, and 28-30 regarding use of elements and sub-elements ... A-9
Table A-3. Summary of responses for QA procedures analyzed according to the general models for QA (Washer and Chang, 2009) ... A-10
A.1 Background and Purpose

The objective of NCHRP Project 12-104 is to develop guidelines to improve the quality of element-level data collection for bridges on the National Highway System (NHS) in reference to the AASHTO Manual for Bridge Element Inspection, First Edition, with 2013 and 2015 Interim Revisions (MBEI). These guidelines will include recommendations to: (1) improve consistency in data collection and assessment of bridge element conditions and (2) establish accuracy levels for element conditions and applicable defect quantities to support bridge management system deterioration forecasting and evaluation. This appendix contains the results of a survey of bridge engineers conducted as part of the study. The purpose of the survey was to identify key data relevant to the objectives of the research based on the current state of the practice for element-level inspections.

A.2 Survey Participation

The survey was designed with a two-step response format. The initial email request was prepared by the research team and provided to AASHTO. The email requested that each SCOBS member identify an individual within their agency to complete the survey. A summary of the survey, the objectives of the research, and several sample questions were provided in this request. This email was transmitted from AASHTO to the members of SCOBS. Identified participants were then asked to complete an identification survey that consisted simply of providing their contact information. In this way, the appropriate individual within each state was identified by leadership within that state rather than by the peer group of the researchers, which we believe was the appropriate approach. Once a participant had completed the identification survey, they were provided with access to the electronic survey containing all 52 questions. This procedure allowed an individual participant to save partial survey responses, leave the survey, and return to it later. Table A-1 shows the status of survey respondents.
Table A-1. NCHRP 12-104 Survey Respondents' Status

Agencies that completed the full survey (36): Alabama, Alaska, Colorado, Delaware, Florida, Idaho, Illinois, Kansas, Kentucky, Maine, Maryland, Massachusetts, Michigan, Minnesota, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Mexico, New York, North Dakota, Ohio, Oklahoma, Pennsylvania, Rhode Island, South Dakota, Tennessee, Texas, US Army Corps of Engineers, Utah, Vermont, Virginia, Washington DC, Wisconsin, Wyoming.

Agencies that completed the identification survey but not the full survey (11): Arizona, Arkansas, California, Hawaii, Iowa, Mississippi, New Jersey, North Carolina, Washington, West Virginia, Federal Lands.

Agencies that completed neither survey (6): Connecticut, Georgia, Indiana, Louisiana, Oregon, South Carolina.
A.3 Summary of Key Findings

This portion of the appendix summarizes some of the key findings from the survey of bridge engineers.

Training and Qualifications

The survey asked if agencies had additional formal education or licensure requirements beyond those in the NBIS. In general, the responding agencies do not have additional formal education or professional licensure requirements. The Corps of Engineers and a few select state agencies have degree and/or Professional Engineer (PE) license requirements, but the majority of respondents indicated that they did not have such requirements. Questions were asked to develop a better picture of the respondents' approaches to training individuals for element-level inspections. It was found that a small majority of respondents require training for element-level team leaders, and 70% of those requiring training provide it in-house. Typically, respondents that either did not require training or did not have agency-based training identified the element-level training provided by the FHWA Resource Centers as a training method used for element-level inspection team leaders.

Manuals

A key question addressed in the survey was whether agencies using element-level inspection had element-level inspection manuals to support the inspections. It was significant that more than 2/3 of respondents indicated that they had an element-level inspection manual that included at least the FHWA-required elements and the defect elements identified in the MBEI. Among these respondents, more than half have a manual that includes defects as well as ADEs. About 1/5 of the respondents indicated they do not have a manual at this time, though half of these indicated a manual was being planned. These data are significant in demonstrating that agencies are responding to the need to develop manuals to support element-level inspections and are adjusting existing element-level manuals to meet the needs of the MBEI.
It could also be viewed as significant that about 10% of respondents indicated that they did not have a manual and had no plans to develop one, which may indicate a resistance to implementing element-level inspection. It may also suggest that these respondents feel the MBEI provides adequate guidance for their needs. However, a later question in the survey asked if an agency used the element-level data for anything other than reporting to the FHWA. Almost 1/3 of the respondents, or 10/36, indicated the data were not used for any purpose other than reporting to the FHWA. Seven of these respondents identified insufficient benefits of using the data as a challenge that precluded their use. Inadequate resources were identified by about half of these respondents, and incompatible data formats were identified by two. It is important to note that several respondents indicated they were not using the data at this time because they had trouble with the software, were waiting for a future software release, or had not yet performed enough element-level inspections to have sufficient data to be useful for planning. The use of photographs to illustrate different condition states is limited among those agencies with element-level manuals; only about 1/3 of respondents indicated that their manual included photographs of different condition states. The use of visual guides for describing or illustrating different condition states is an important factor contributing to the consistency of visual inspections. It is also significant to note that 2/3 of these respondents indicated that their manual was in a format suitable for use in the field by inspectors. Visual guides for defining condition states or condition state boundaries will be most effective if provided in a format suitable for field use, such as a pocket guide.
Among respondents that had a manual suitable for use in the field, five indicated the manual was available via laptop or tablet computers. Seven of these respondents indicated the use of a pocket guide; one noted that their pocket guide "required large pockets."
Data Quantification, Accuracy and Precision

Several questions in the survey (questions 20-24) were designed to develop an understanding of the current state of the practice regarding the use of quantitative data to describe CSs and the accuracy requirements used by agencies implementing element-level inspection. For example, question 20 in the survey asked if the responding agency was using the quantitative descriptors included in the commentary to determine the CS for concrete bridge decks with cracks. A majority of respondents, 70%, indicated that they were using the quantities identified in the commentary in the MBEI (with 2015 revisions), and another 20% indicated they were either using the description in the previous version of the MBEI (without 2015 revisions), which included density as well as crack widths, or had developed their own quantitative descriptions. In total, 33/36 respondents were using quantitative descriptors to define condition states. This is significant to the current study because quantitative CS descriptions are more objective than subjective descriptions and therefore more likely to have a measurable quality. Only three respondents indicated that they were using subjective criteria; of these, two indicated that they had procedures for "calibrating" inspectors, generally consisting of QA processes conducted on a yearly basis. This type of calibration exercise, which is common in Europe, provides on-site training on a periodic basis as a means of developing a common understanding of the desired CS definitions. Several questions were aimed at determining if agencies were using accuracy and precision requirements as part of their business practices. These questions were designed to address the question of accuracy requirements for element-level inspection. For example, question 23 asked if the respondent had a specified accuracy requirement for CS quantities.
In general, most respondents indicated that they did not have a specified accuracy tolerance for CS quantities, with only 5/36 respondents indicating that they did. Among the five indicating that they had tolerances, two appeared to be in error. One respondent indicated graduated tolerances that decrease as the CS goes from good to severe (CS 1: 15%, CS 2: 10%, CS 3: 5%, and CS 4: 1%). A second respondent indicated tolerances that varied according to element type, ranging from 0% for elements with units of EA up to 15% for wearing surfaces and protective coatings. Question 27 asked if any agencies had accuracy requirements for defects. Twenty-two agencies reported using defect elements, but only two of these indicated an accuracy requirement for defects. Of these two, one indicated a subjective requirement applied by comparing inspector ratings to a QA review rating, and the second indicated that accuracy requirements were included in their inspection manual. A review of that manual revealed guidance on measuring steel and concrete cracking to the nearest 1/32 in. (1 mm). Question 24 asked if any respondent was using a precision requirement different from that indicated in the MBEI; more than 90% of respondents (31/34) indicated they were not using different precision values, and two respondents skipped the question. One respondent indicating a different precision stated that they used percentages for CS quantities, which would not necessarily result in a whole-number value when multiplied by the total quantity of the element. In general, it was found that respondents were using the precision suggested in the MBEI.

Use of Elements and Sub-Elements

Several questions were asked in the survey to determine the extent to which agencies are using ADEs, defect elements, and Bridge Management Elements (BMEs).
Responses to the five questions regarding the use of elements are summarized in Table A-2. Given that the element-level inspection requirement and the MBEI are relatively new, these data provide a snapshot of the use of elements at this point in time. Overall, the results indicated that a majority of respondents were using ADEs and recording defects according to the MBEI. Only about a third of the respondents indicated that their agency had developed any additional defect elements, though several noted that these may be developed in the future. One agency indicated that they were incorporating defect descriptions into the CS language and recording quantities as a note in their manual, which was still being developed. It was significant that more than 40% of respondents indicated they had developed elements that were subsets of BMEs. These results indicate that more detailed descriptions of these elements are being used by many of the respondents, providing more accurate descriptions of the conditions in the field. It was also notable that only a small portion, about 1/5 of the respondents, indicated they had developed their own BMEs. In contrast, the portion of respondents that had developed ADEs was about 4/5.

Table A-2. Summary of results for questions 25, 26, and 28-30 regarding use of elements and sub-elements.

Question                                                                               Yes (%)   No (%)
Does your Agency use ADEs?                                                                78        22
Does your Agency record defects using the MBEI defects?                                   61        39
Has your Agency developed any additional defect elements?                                 34        66
Has your Agency developed elements that are subsets of the BMEs?                          42        58
Has your Agency developed specific BMEs in addition to those identified in the MBEI?      22        78

Quality Assurance / Quality Control

Several questions were asked to determine the approaches to QA and QC currently being implemented by survey respondents and how the results of these processes were communicated to inspectors. Question 31 asked respondents to select a QA procedure that best described the one used by their agency. These data were analyzed to characterize the QA procedures according to the general models provided in the literature (Washer and Chang, 2009). In this analysis, responses noted as "other" were evaluated and compared with the general models. It was found that most (10/11) of the "other" responses could be assigned to one of these models; the remaining "other" response indicated using all of the processes suggested in the survey question.
Responses indicating that "multiple inspection teams inspect the same bridge and results are compared between teams" were considered a type of independent oversight model (IOM); it was assumed that each team was conducting an independent inspection. Results of the analysis are shown in Table A-3. These data indicate that the IOM is used by the majority of respondents, followed by the Field Verification Model (FVM). The use of Collaborative Peer Review (CPR) and the Control Bridge Model (CBM) were identified by only one respondent each.
Table A-3. Summary of responses for QA procedures analyzed according to the general models for QA (Washer and Chang, 2009).

QA Process                      Responses (%)
Independent Oversight Model          61
Field Verification Model             31
Collaborative Peer Review             3
Control Bridge Model                  3
Other                                 3

Subjectively, these responses indicate an increase in the number of agencies using an IOM process as compared with research completed in 2009 (Washer and Chang, 2009), when the FVM appeared to be at least as common as the IOM. Although no formal survey was conducted at that time, a majority of state agencies were contacted during the course of that research regarding their QA practices. This increase in the prevalence of the IOM may reflect the increased focus on QA by the FHWA. Question 32 asked how the results of the QA process are conveyed to the inspection team leaders. Research has shown that feedback is an important factor in the quality of VI results (Megaw, 1979; Drury and Watson, 2002; Melchore, 2011; See, 2012). Consequently, the feedback provided through both formal QA processes and less formal group meetings is an important component of the quality of inspection data. The results shown in Figure A-1 illustrate that most respondents use a combination of face-to-face meetings and a report to convey the results of the QA process. Specifically, the data showed that 14% of respondents used a report, 23% used a face-to-face meeting, and 63% used both.

Figure A-1. Pie chart showing distribution of responses for question 32.

Another form of providing feedback to inspectors is to hold periodic meetings of inspectors to discuss inspection results, rating procedures, and requirements. Survey questions 34 and 35 asked if the agency held periodic inspection meetings and, if so, the frequency of the meetings. Question 35 also differentiated between meetings of all inspection teams and meetings at a district or other sub-division level.
Responses indicated that almost 90% of respondents hold periodic meetings of inspectors. Responses from those holding meetings were analyzed to include responses noted as "other" if those responses generally fit within a provided frequency option. For example, some respondents indicated a monthly meeting, but the choices provided were annual, biannual, or quarterly. Based on this analysis, it was found that 81% of the respondents held a meeting of all inspectors at least annually; 53% indicated an annual meeting, and 28% could be characterized as holding monthly, quarterly, or biannual meetings. These results are significant in showing that most agencies are providing feedback and information to their inspectors through face-to-face meetings.
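To make the graduated accuracy tolerances described earlier concrete, the short sketch below checks a measured condition state (CS) quantity against the schedule one respondent reported (CS 1: 15%, CS 2: 10%, CS 3: 5%, CS 4: 1%). This is an illustration only: the quantities are hypothetical, the function name is our own, and interpreting each percentage as a fraction of the total element quantity is an assumption, since the survey responses did not specify the basis of the tolerance.

```python
# Sketch: checking a measured CS quantity against a graduated tolerance
# schedule (CS1 15%, CS2 10%, CS3 5%, CS4 1%), as reported by one survey
# respondent. All quantities are hypothetical; the tolerance is assumed
# to be a fraction of the total element quantity.

TOLERANCE = {1: 0.15, 2: 0.10, 3: 0.05, 4: 0.01}

def within_tolerance(cs, measured_sqft, reference_sqft, total_sqft):
    """True if the measured CS quantity falls within the allowed deviation
    from a reference quantity (e.g., a QA review measurement)."""
    allowed = TOLERANCE[cs] * total_sqft
    return abs(measured_sqft - reference_sqft) <= allowed

# Hypothetical 12,000 sq ft deck, field measurement vs. QA reference:
total = 12000.0
print(within_tolerance(1, 11500.0, 11300.0, total))  # deviation 200 <= 1800: True
print(within_tolerance(4, 200.0, 50.0, total))       # deviation 150 >  120: False

# The MBEI precision convention reports CS quantities as whole numbers, so
# an agency recording percentages (as one respondent did) must round, e.g.,
# 2.1% of an 11,880 sq ft deck is 249.48 sq ft, reported as 249.
print(round(0.021 * 11880))  # 249
```

A working QA implementation would also need to settle details the survey does not resolve, such as whether the tolerance applies per condition state, per defect, or per element.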
Inspection Process and Procedure

A series of questions in the survey focused on inspection processes and procedures. Because the bridge inspection process can vary significantly for different situations encountered in the field, an example bridge was provided for consideration in responding to the survey. This example bridge, shown in Figure A-2, was taken from the MBEI, where it serves as one of the inspection examples. This was done with the expectation that most or all of the survey respondents would be familiar with it. The example bridge provided the context for certain questions regarding how such a bridge would typically be inspected. All of the respondents indicated that previous inspection results were available to inspectors in paper or electronic form, or both, prior to a bridge inspection. Interestingly, about 10% of respondents indicated that, based on their experience, these previous inspection results are not reviewed prior to the inspection. The accuracy of this response cannot be known; however, it does indicate that there are likely some cases in which the previous inspection results are not reviewed.

Figure A-2. Image showing example bridge provided in the survey.

One of the quality dimensions commonly assessed for element-level data is the consistency of area estimates. The ability of an inspector to estimate any given area is not well understood. As such, several questions were designed to gain insight regarding how estimates of quantities are made in the field and whether these procedures are documented. For example, question 42 asked how damage quantities would be estimated for the example bridge deck, assuming the deck had a total area of 11,880 sq. ft., with 11,628 sq. ft. in CS 1 and 252 sq. ft. in CS 3.
This question was focused on determining if the quantity is obtained subjectively, i.e., by a visual estimate of an area, or quantitatively, i.e., by using an objective measure such as a tape or measuring wheel. Figure A-3 shows that responses were fairly evenly divided between those that would use an area estimate and those that would use a measurement device. A small number indicated that the damage would first be drawn on a sketch, or that the respondent didn't know what method would be used. A follow-up question asked if the method of estimating area was documented in an inspection manual, provided through training, or left to the discretion of the inspection team. More than 60% of the respondents indicated that the method for making the measurement was at the discretion of the inspection team. Less than 10% of respondents indicated that the method used was documented in an inspection manual or other resource; about 30% indicated the method is described in training or periodic meetings. These data indicate that there is variation in the methodology for making a quantity estimate, and consequently one would expect some variation in the consistency of results.

Figure A-3. Bar graph showing the survey response to question regarding estimating defect areas.

Another question was aimed at determining the frequency at which sounding of the deck may be conducted as part of a routine inspection of the example deck. Condition states for decks include the defect of delamination/spall/patched area. Delaminations are subsurface and therefore not available for visual assessment. It could be expected that the results of an inspection that included sounding to reveal subsurface damage would vary from those of one that did not. It was found from the survey responses that about 40% of respondents expected sounding to be done as part of a routine inspection of the example bridge. About 30% of the respondents indicated that sounding would be done during an in-depth or special inspection, or when work was being scoped. Another 30% of respondents indicated "other." The common explanation among these responses was that sounding was done once damage to the deck became apparent visually or when the condition of the deck was downgraded. Previous research has indicated that the time required or allowed to complete an inspection has an impact on the quality of results. It is interesting to note that, similar to previous studies, the anticipated time to complete a routine element-level inspection of the example bridge varied significantly. Two questions focused on the time element for inspecting the example bridge.
The first asked how long the respondent would expect an inspection team to spend inspecting the example bridge and preparing the required report for a first-time inspection. As shown in Figure A-4, estimates ranged from less than 2 hours to greater than 8 hours. Another question asked how long the respondent would expect an inspection team to spend on a subsequent element-level inspection; these results are also shown in Figure A-4, and responses again ranged from less than 2 hours to greater than 8 hours. As expected, first-time inspections were generally expected to take longer than subsequent inspections. One respondent skipped these two questions. Given that different agencies have different business practices and are collecting different levels of detail during element-level inspections, the fact that time estimates differ is not surprising. However, the extent to which they differ, from less than 2 hours to greater than 8 hours, is indicative of how large the variation in approaches to element-level inspection may be among different bridge owners. Another question asked if the agency required inspectors to record the actual time taken to complete an inspection; 70% of respondents indicated that they did not. All respondents answered this question.

Figure A-4. Bar graph showing estimated time to complete a first-time and routine element-level inspection.

Access to the elements to be inspected is another factor in the quality of inspection results that was studied through the survey. Respondents were divided in their responses for the example bridge provided in the survey. Asked if access equipment would be used, approximately 47% of respondents said yes and 53% said no; two respondents skipped this question. Asked if traffic control would be used, approximately 46% said yes and 54% said no; one respondent skipped this question. Again, these data provide insight into variations between the business practices of different agencies. Finally, the survey asked what photographs were required as part of the inspection. Photographs can provide important documentation of conditions and defects present in bridges. They can provide context regarding the advancement of deterioration over time for future inspections and allow for quality review of inspection results. About 70% of respondents indicated that the photographs described in the MBEI were required; other responses indicated that photographs were required for defects that resulted in CS 2, 3, or 4 or an NBI component condition rating below 6.
About 16% of responses indicated that inspection teams determine the appropriate photographs. Overall, the survey provided important information regarding the state of the practice for element-level bridge inspection. Given the evolving nature of element-level inspection programs, the survey provides a snapshot in time of current practices. Generally, the survey illustrated that responding agencies were taking a proactive approach to implementing the MBEI and the new element-level inspection requirements: developing element-level manuals, developing ADEs and other elements, and utilizing quantitative descriptors to identify condition states. QA programs that included feedback through both face-to-face meetings and reports illustrate a commitment to quality and an important understanding of the need for feedback to improve the quality of visual inspections. The survey also revealed broad variations in approaches to inspecting a common highway bridge, with significantly different estimated times to complete the inspection and different methods of access, traffic control, and tools used during routine inspections.

A.4 Detailed Survey Results

The survey results for each question in the survey are presented in this section of the report. The purpose of each question, as conceived at the time the survey was developed, is shown, and all relevant comments submitted through the survey are included.

Bridge Element Inspection Experience

1. When did your Agency begin collecting element level data?
a) Between 1994-2000
b) Between 2000-2010
c) Between 2010-2012
d) Between 2013-2015

Purpose of the question: The purpose of this question is to determine if the responding Agency is new to the element level inspection practice or has experience with element level inspection. This question will demonstrate the experience level of responders. Figure A-5 shows the results for this question. The project survey shows that the majority of the agencies have significant experience with element level bridge inspection. Namely, 22 agencies began conducting element level inspection during 1994-2000, 7 agencies during 2000-2010, one agency during 2010-2012, and 6 agencies during 2013-2015. Also, it is important to note that two states, Vermont and Kansas, stopped and then restarted element level inspection after a break of several years. All respondents answered this question.

Figure A-5. Agencies' Experience with Bridge Element Inspection
Background of Inspection Personnel

2. Which of the following best describes the organization of bridge inspection teams (used for typical NHS bridges) in your Agency?
a) Teams stationed in the central or main office conduct inspections throughout the state
b) Teams stationed in districts (or other subdivisions within your Agency) conduct inspections across a defined geographic region
c) Consultants perform the majority of inspections across the state, assigned through a central office
d) Consultants perform the majority of inspections across the state, assigned through a district (or other subdivision within your Agency)
e) A mixture of Agency personnel and consultants/contractors conduct inspections across the state, assigned through a central office
f) A mixture of Agency personnel and consultants/contractors conduct inspections across the state, assigned through districts (or other subdivisions within your Agency)
g) Other ____________________________________

Purpose of the question: The purpose of the question is to determine the organization of bridge inspection teams within the responding agency. The division of responsibility between central office and district offices, and between consultants and agency personnel, is described in general terms. Greater insight into the bridge element inspection processes used by agencies will help to facilitate the development of guidelines in this project. The survey results show that the majority of the agencies have a decentralized organization for bridge element level inspection and use a mixture of agency personnel and consultants/contractors to conduct the inspections. Agencies use different methods to divide the work between agency personnel and consultants. Twelve agencies use a combination of agency personnel and consultants/contractors to conduct element level inspection across the state, and seven agencies have teams stationed in districts or subdivisions within the agency.
Centralized organization is found in 16 agencies; six of these agencies have teams stationed in the central/main office to conduct inspections across the state, three employ consultants to perform the majority of inspections across the state, and seven use a combination of agency personnel and consultants/contractors to conduct the inspections. One respondent skipped this question.
3. Which of the following choices best describes your Agency's practice for conducting routine bridge inspections?
a) A Team Leader works together with the same team members (i.e., inspectors or assistants) most of the time
b) A Team Leader works together with different team members (i.e., inspectors or assistants) most of the time
c) Don't know
d) Other (please specify) _____________________________________________________________
Purpose of the question: The purpose of the question is to determine the degree to which bridge element inspection teams consistently work together as a team. Consistency in team composition may affect the consistency of the inspection results.
As shown in Figure A-6, working as a consistent team when conducting routine element-level inspections is prevalent among the agencies. In 23 agencies, a team leader works with the same team members most of the time, and in 6 agencies the team leader works with different team members. In several states, team leaders work alone the majority of the time. All respondents answered this question.
Figure A-6. Agencies' practice for conducting routine bridge inspection
Qualification of Inspectors
4. Does your Agency require any of the following qualifications for inspection team leaders conducting routine inspections? (Select all that apply)
a) An associate's degree in engineering or engineering technology
b) A bachelor's degree in engineering
c) Professional Engineering License
d) None of the above
e) Other (please specify) _____________________
Purpose of the question: The purpose of the question is to determine if Agencies are using additional requirements beyond the NBIS to qualify team leaders.
The results indicate that most agencies do not require any additional qualifications for team leaders; twenty-seven agencies do not require any qualification beyond the NBIS. Additional qualifications beyond the NBIS standards varied among the remaining states: five agencies require inspection team leaders to hold a professional engineering license, three require a bachelor's degree in engineering, and one requires an associate's degree in engineering or engineering technology. Two respondents skipped this question.
5. Does your Agency require a field performance test for an inspection team leader to be qualified/certified/re-certified to perform routine bridge inspections? (A field performance test consists of an inspector completing an inspection in the field of sufficient quality to meet an established standard or policy.) Y/N
Purpose of the question: The purpose of the question is to determine how many states require a field performance test to qualify inspectors. Field performance tests are a quality tool intended to ensure that inspectors meet a standard level of quality when performing inspections in the field.
The results from this question indicate that most agencies do not require a field performance test for an inspection team leader to be qualified/certified/re-certified; only six agencies require a field performance test. All respondents answered this question.
6. Does your Agency require any form of a visual test for inspectors?
a) Yes, we use a Jaeger test
b) Yes, inspectors are required to possess a driver's license, which requires a periodic visual exam
c) No
d) Yes, other (please describe) ___________________________________
Purpose of the question: The purpose of this question is to determine the number of agencies requiring some form of a visual test for inspectors. The question also suggests the use of a vision test, which some participants may not have considered. Because the inspections conducted are primarily visual assessments, vision may have an impact on the quality of results.
The results from this question show that most agencies do not require any form of a visual test for inspectors. Among those that do require one, 11 agencies require that inspectors have a license to operate a vehicle, which requires a periodic visual exam, and one agency requires that inspectors maintain a Commercial Driver's License, which requires a visual test every two years. One agency is considering adding a visual testing requirement.
7. Does your Agency have any additional requirements beyond the National Bridge Inspection Standards (NBIS) requirements for inspection team leader qualification, other than those described above?
Y/N If yes, please describe ______________________________________________________________________
Purpose of the question: The purpose of the question is to determine if there are additional enhancements to the NBIS requirements currently in use by Agencies.
Twenty-eight of the agencies indicated that they do not have requirements beyond the NBIS for inspection team leader qualification. Requirements beyond the NBIS standards varied. Alabama requires that team leaders attend at least one of Alabama's Bridge Inspection Refresher Courses prior to certification, and then attend one at least once every two years to maintain certification. New York requires that inspection team leaders have at least 3 years of bridge experience in design, construction, inspection, or other bridge engineering related work. The United States Army Corps of Engineers (USACE) requires team leaders to be practicing structural engineers and to attend the Bridge Inspector Refresher course; inspectors must earn a minimum grade in the course and repeat it every 5 years in order to remain qualified. Michigan has a recurrent training requirement for the inspection team leader. Delaware requires the NHI Bridge Inspection Refresher course or equivalent to be taken every 5 years, requires a PE to be present for all Fracture Critical Member inspections, and requires all underwater divers to take the NHI Underwater Bridge Inspection Course. Ohio DOT bridge inspectors must take a climbing class, a confined space class, and regular refresher training in order to be a Team Leader. Missouri requires that team leaders take the refresher class every 4 to 6 years, and someone new to inspecting will accompany
an existing team leader for a week or two before working independently. Wisconsin requires the team leader to have a P.E. license or five years of bridge inspection experience in addition to the NHI bridge inspection course. Kansas requires that team leaders have completed a minimum number of inspections. One respondent skipped this question.
8. Do bridge element inspection teams specialize in certain types of bridges? For example, are there special teams for fracture critical inspections or complex structures? Y/N If yes, please describe. ____________________________________________________________________
Purpose of the question: The purpose of the question is to determine the level of specialization of the bridge element inspection teams.
The responses showed that there were no specialized inspection teams for certain types of bridges in 25 of the surveyed agencies; specialized bridge element inspection teams were indicated by 11 agencies. Maryland and Florida both use inspectors with specialized training for inspections of movable bridges. Texas has specialized teams for fracture critical member and underwater inspections. Alaska requires all team leaders to be able to inspect fracture critical or complex structures; this could include a team with an assistant who has previously performed an inspection of the bridge to help guide the current team leader. Minnesota houses a central bridge office with special teams for fracture critical, pin and hanger, and complex structure inspections, which include personnel with nondestructive testing Level 2 and 3 certifications. Kentucky has specialized rope access teams. In Missouri, all fracture critical member inspections and other specialized inspections are done by Central Office personnel. In Wyoming, all fracture critical inspections are performed by a centrally located team whose members have all attended NHI training.
Illinois uses specialized crews, which have special training and access equipment, only for major river (complex) bridge inspections. In Oklahoma, fracture critical members are inspected by team leaders with Professional Engineer (PE) licenses. Wisconsin requires that inspectors be NHI fracture critical course certified. All respondents answered this question.
9. Does your Agency require training for element level inspection team leaders? Y/N If yes, please describe.
Purpose of the question: The purpose of the question was to identify how commonly agencies require element-specific training for team leaders.
Twenty agencies, more than half of those surveyed, require training for element level inspection team leaders; sixteen agencies do not. All respondents answered this question.
10. You answered yes to the previous question. Is this training provided by your Agency? Y/N
Purpose of the question: The purpose of the question was to determine if agencies were developing internal training programs for element-level inspection.
The responses to this question indicated that 14 of the agencies that require training for element level inspection team leaders provide the training themselves. All respondents answered this question.
11. Which of the following best describes the assignment of bridge inspection teams?
a) Teams (or a small group of teams) inspect the same population of bridges in each inspection cycle (e.g., bridges in a particular district or geographic region)
b) Assignments are ad hoc, meaning inspection teams typically inspect a different population of bridges during each inspection cycle
c) Inspection teams are intentionally assigned different bridges in each inspection cycle
d) Other (please specify)
Purpose of the question: Some agencies believe that using the same teams on the same bridge improves the quality of results, and some literature suggests this is effective for improving accuracy. In contrast, some Agencies require that different inspection teams inspect bridges each cycle as a quality control procedure. Other agencies assign inspections in an ad hoc fashion. This question is intended to provide a distribution of these approaches among respondents.
As shown in Figure A-7, in 17 of the surveyed agencies, teams (or a small group of teams) inspect the same population of bridges in each inspection cycle (e.g., bridges in a particular district or geographic region). In three agencies, bridge inspection assignments are ad hoc, meaning inspection teams typically inspect a different population of bridges during each inspection cycle. In eight agencies, inspection teams are intentionally assigned different bridges in each inspection cycle; within this group, assignments may be made according to a consultant bridge inspection contract, by a specific schedule, by a set minimum percentage of bridges inspected, or at random. All respondents answered this question.
Figure A-7. Agencies' types of bridge inspection team assignment
Inspection Manual
12. Does your Agency have an element-level inspection manual that addresses the new element-level Federal Highway Administration (FHWA) requirements and/or includes the AASHTO Manual for Bridge Element Inspection (MBEI) additional definitions and data collection?
a) Yes, required FHWA data only
A-20 b) Yes, FHWA data plus MBEI defects c) Yes, FHWA data plus MBEI defects and Agency Developed Elements (ADEâs) d) No manual at this time e) Other (please describe) Purpose of the questions (Q11-Q15): The purpose of the questions is to determine the current level of development of manuals to support element level inspection. Responses indicate how many agencies are using photographic references for condition states. The question also determines if those manuals are presented in a suitable form for reference in the field. Figure A-8. Agencies with/without element level inspection manual As shown in Figure A-8, it is found that four of the surveyed agencies have an element level inspection manual that addresses the FHWA required data only, three agencies have manuals that address the FHWA data and MBEI defects, 20 agencies address FHWA data plus MBEI defects and Agency Developed Elements (ADEâs), and eight agencies do not have an element inspection manual at this time. One respondent skipped this question. 13. You answered yes to the previous question. Does the manual include photographs of different Condition States (CS) for National Bridge Elements (NBEs) and required Bridge Management Elements (BMEs)? Y/N Of the 27 agencies that have an element inspection manual, ten of these manuals include photographs of different Condition States (CS) for National Bridge Elements (NBEs) and required Bridge Management Elements (BMEs).
14. Is the manual available in a format that is suitable for use in the field by inspectors during the conduct of inspections? For example, is there a pocket guide, tablet version, field manual, or tables? Y/N
Of the 27 responding agencies, 19 have the manual available in the form of a pocket guide, tablet version, field manual, or tables to be used in the field by inspectors. In Maine the manual is available through software, and in North Dakota laminated 8 1/2" x 11" sheets contain the defects that can be coded for each element.
15. You answered No to question 12. Does your agency plan on developing an element-level manual in the future?
a) Yes, in the next 1-5 years
b) Yes, but no time schedule known
c) No
Of the eight agencies that do not have an element level manual, one plans to develop a manual in the next 1-5 years, three plan to develop a manual with no specific time frame, and four are not planning to develop a manual. These data are shown in Figure A-9. New York is going to release an Inspection Manual in January 2016 that is to be used in concert with the MBEI and the BIRM; it includes ADEs, and defects will be optional based upon planning need. Maryland's inspection manual, which addresses the FHWA data only, also has some ADEs, but defects are not required. Colorado is in the process of updating its current Pontis Element Inspection Manual to NBEs and ADEs. In Alaska, the MBEI is used, and any old CoRe elements, smart flags, and old ADEs are being converted into NBEs or BMEs; use of defects is not required, though it is recommended. In Minnesota, the new manual will be available online by January 2016; Minnesota will begin inspections based on the new elements with the 2016 inspection cycle, and all element data from the 2015 inspection cycle will be migrated internally to the new elements for FHWA submittal. Illinois has many ADEs that go into greater detail than the standard Federal elements.
One respondent skipped this question.
Figure A-9. Agencies' plans for developing an element level manual
16. What are some of the challenges that your Agency has faced with bridge element inspection? Choose all that apply.
a) Time required to complete the inspections
b) Documentation / data entry of inspection results
c) Developing initial quantity estimates
d) Training of inspectors in element-level inspections
e) Adequate resources (inspection teams) to complete the required inspection
f) Interpretation of element and condition definitions in the MBEI
g) Storage of data from inspections
h) Consistency of element level results among inspectors
i) What other significant challenge did you experience? (If different than shown above)
Purpose of the question: The purpose of the question is to identify potential obstacles that have impeded an agency's implementation of bridge element inspection. Results will help to indicate some of the challenges agencies have faced and will help to provide direction to this research project. The question is in two parts: the first is constrained to the listed challenges, and the second is more open-ended.
Time required to complete the inspections was the most common challenge identified, with 23 of the 36 respondents making this choice. Seventeen agencies each indicated documentation / data entry of inspection results, interpretation of element and condition definitions in the MBEI, and maintaining consistency of element level results among inspectors. Fifteen agencies indicated that training of inspectors in element-level inspections was a challenge; 15 agencies also indicated that developing initial quantity estimates was a challenge; and 10 agencies deemed adequate resources (inspection teams) to complete the required inspection a challenge. Other challenges that were
specifically mentioned in the survey were software installation, additional cost to transition to element level inspections for a particular bridge, changing personnel, unpredictable weather, internal restructuring of the regions, completing inspections within the 24 month time period, and BrM software issues. The lack of inclusion of all elements in the MBEI was identified as a challenge (e.g., there is no prestressed concrete slab element, yet this is a common element and was present in the CoRe elements). Other challenges noted were developing new field manuals, determining agency elements, upgrading current data collection software to handle the defects, wearing surfaces, and other sub-elements, and the application of defect elements.
It was noted by one respondent that the MBEI quantities were not adequate for describing deterioration. For example, a pier wall is measured by the linear foot regardless of whether the pier is 5 feet tall or 40 feet tall, which can cause a very large disparity in the percentage of pier wall in any particular condition state.
One respondent noted that the transition to element-level data was challenging and that the element level data do not capture the location of the defect, making it harder to give an overall assessment of the entire bridge based on a small quantity of a specific defect. For example, a rural bridge with heavy section loss in a fascia beam may have 5 ft of the girder in condition state 4 (severe); however, the overall assessment of the superstructure may not be poor condition (NBI rating 4), as this may have little impact on the overall safety of the bridge. Focusing on the specific quantities of defects dilutes the ability to make an overall assessment of the structure, especially when the locations of the defects are not collected. The respondent indicated that there needs to be a balance between the overall condition and minute defects.
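To make the pier-wall disparity concrete, the following sketch uses hypothetical numbers (not from the survey) to contrast the condition-state percentage obtained from a linear-foot element quantity, as the MBEI uses for pier walls, with the deteriorated fraction of the actual wall face:

```python
def cs_fraction_by_length(defect_length_ft, wall_length_ft):
    """CS fraction when the pier-wall element is quantified in linear feet:
    only the defect's length along the wall counts, not the wall height."""
    return defect_length_ft / wall_length_ft

def area_fraction(defect_area_sqft, wall_length_ft, wall_height_ft):
    """Deteriorated fraction of the actual wall face area."""
    return defect_area_sqft / (wall_length_ft * wall_height_ft)

# Hypothetical 100-ft-long pier wall with a spalled band 20 ft long x 2 ft high.
defect_len_ft = 20.0
defect_area_sqft = 20.0 * 2.0  # 40 sq ft

# Linear-foot accounting assigns 20% of the element to the defect CS
# regardless of wall height...
print(cs_fraction_by_length(defect_len_ft, 100.0))   # 0.2

# ...but the truly deteriorated face area is 8% of a 5-ft-tall wall
# versus only 1% of a 40-ft-tall wall.
print(area_fraction(defect_area_sqft, 100.0, 5.0))   # 0.08
print(area_fraction(defect_area_sqft, 100.0, 40.0))  # 0.01
```

The same recorded quantity (20 LF in the defect condition state) thus corresponds to deteriorated area fractions that differ by a factor of eight, which is the disparity the respondent describes.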
The primary task for the inspector is to give a "safety" condition assessment of the structure; collecting data for management needs is secondary. Three respondents skipped this question.
Use of Data
17. Does your Agency use data collected during element level inspection for purposes other than reporting to the FHWA to meet federal requirements? Y/N
Purpose of the question: The purpose of the question is to determine how the agency uses its bridge element inspection data. The results will provide greater insight into how agencies use these data or possible reasons why they do not. The results will also provide insight into how accuracy (or inaccuracy) may affect an Agency's program, based on how the data are used.
The answers provided for this question show that 26 of the agencies use the data collected during element level inspection for purposes other than reporting to the FHWA to meet federal requirements, while 10 agencies use the collected data for reporting to the FHWA only.
18. You answered yes to the previous question. Does your Agency use bridge element inspection data for the following applications? (Select all that apply)
a) Deterioration Modeling
b) Forecasting future funding needs for a system
c) Project level planning (e.g., estimating project costs for a particular bridge)
d) Identifying preservation / replacement needs
e) Performance measures / Health index
f) Other (please describe) _____________________________
Of the 26 agencies that use the data for purposes other than reporting to the FHWA, almost all use the data for identifying preservation/replacement needs. Fourteen agencies use these data for deterioration modeling; 16 for forecasting future funding needs for a system; 15 for project level planning (e.g., estimating project costs for a particular bridge); and 13 for performance
measures or a health index. It was also reported that the data have been used for deterioration modeling, forecasting future funding needs, and project level planning. One agency noted that it uses these data to identify structures that need painting.
19. You answered No to the previous question. What are some of the challenges which preclude your Agency's use of bridge element inspection data for these applications?
a) Inadequate resources to utilize data
b) Insufficient benefits of using data
c) Incompatible data formatting or database systems
d) Other (please specify) ________________________________________________________
Of the 10 agencies that do not use the data other than for reporting to the FHWA, the following challenges preclude those applications: inadequate resources to utilize the data was indicated by five agencies, insufficient benefits of using the data by seven agencies, and incompatible data formatting or database systems by two agencies. All respondents answered this question.
Data Quantification, Accuracy and Precision
20. The MBEI commentary includes quantified descriptors for certain defects such as cracks. For example, the commentary for Element 12, Reinforced Concrete Deck, suggests cracks ranging from 0.012 to 0.05 in. can be considered moderate. The condition state description for CS2 cracking describes cracking subjectively as "moderate" (page 3-4 of the MBEI). Does your agency utilize the quantitative descriptions in the MBEI commentary for determining the CS of elements?
a) Yes
b) Yes, but our agency uses the original descriptions in the 2013 version of the MBEI, which include both crack width and density
c) No
d) No, but we have developed similar quantitative descriptors for determining CS of elements
Purpose of the questions (Q20-Q22): The purpose of the questions is to determine the extent to which quantitative descriptors are being used to identify CSs.
Quantitative descriptors are more objective than qualitative descriptors; therefore, the extent to which quantitative descriptions are being used should be established. This question will also provide additional information on existing practices to assist with the development of guidelines.
The responses from the survey show that 30 agencies utilize quantitative descriptions for determining the CS of elements. Of the 30 agencies, 25 utilize the quantitative descriptions as written in the MBEI commentary, while five use the original descriptions in the 2013 version of the MBEI, which include both crack width and density. Of the six agencies not using the quantitative descriptions of the MBEI commentary, three are not using quantitative descriptions at all, and three have developed similar quantitative descriptors for determining the CS of elements. All respondents answered this question.
21. You answered No to the previous question, indicating subjective assessments are used to determine CS. Does your Agency have any procedures in place to "calibrate" the subjective assessments of inspectors? Y/N
Two respondents answered this question. Maine indicated that issues concerning subjective assessments are covered at annual training. Montana uses annual on-site quality assurance reviews and indicated that while it does not use the quantitative descriptions as an absolute, the descriptions are used as a guideline for inspectors to apply at their discretion.
22. You answered "No, but we have developed similar quantitative descriptors for determining CS of elements" to question 20. Please describe and/or provide a reference to where these descriptions can be found.
Agencies that have developed similar quantitative descriptors are Delaware, Florida, and Kansas; these descriptors are found in their respective inspection manuals.
23. Does your Agency have a specified accuracy requirement for condition state quantities (e.g., +/- 10% tolerance for CS2, etc.) for National Bridge Elements (NBEs) or required Bridge Management Elements (BMEs)? Y/N If yes, please describe and/or provide a reference to where these requirements may be found. ____________________________________________________________________
Purpose of the questions (Q23-Q24): The purpose of the questions is to determine how many states have specified the desired accuracy and precision for element level condition states. This will assist with identifying the current state of the practice in terms of element level inspection data accuracy and precision requirements.
Responses to this question indicate that only five of the surveyed agencies have specified accuracy requirements for condition state quantities; 31 agencies do not. All respondents answered this question.
Among the five respondents indicating that they had tolerances, two appeared to be in error. One respondent indicated graduated tolerances that decrease as the CS transitions from good to severe (CS 1: 15%, CS 2: 10%, CS 3: 5%, CS 4: 1%). Another respondent indicated tolerances that varied according to element type, ranging from 0% for elements with units of each (ea) up to 15% for wearing surfaces and protective coatings (sq ft).
24. The MBEI suggests the precision of CS quantities as the base unit for a given element, e.g., 1 square foot (sq. ft.) for bridge deck CS quantities. Does your Agency specify a precision of element CS quantities that is different than suggested in the MBEI? Y/N If yes, please describe or provide a reference where the requirement may be found. _________________________________________________________________
The answers to this question showed that 31 agencies do not specify a precision of element CS quantities different than that suggested in the MBEI; only three agencies specify the precision. Florida specifies precision to the nearest unit of measurement (sq. ft., ft., each, etc.). Montana typically records defects in percentages. Two respondents skipped this question.
Use of Elements and Sub-Elements
25. Does your Agency use Agency developed elements (ADEs)? Y/N If yes, please describe or provide a reference where these elements are described. (You may email data to email@example.com.) ___________________________________________________________
Purpose of the question: The purpose of the question is to determine if agencies have supplemented bridge elements with special defect elements or their own customized elements. These data suggest the expected or desired level of accuracy in the element-level data for a given Agency in terms of describing existing conditions, materials, and defects. Agencies that desire the greatest accuracy in these terms would typically use more ADEs, defects, or BMEs to enhance those provided in the MBEI. This will provide insight into how common the use of ADEs and special defect elements is among states. It will also provide information on common elements that may be missing from the current MBEI, if such elements are consistently utilized by all Agencies.
The survey results show that 28 agencies use ADEs, and only eight agencies do not.
26. Does your Agency record defects using the MBEI defects (e.g., defect 1130, cracking)? Y/N
The answers provided for this question show that 22 agencies record defects using the MBEI defects, and 14 agencies do not record defects.
27. You answered yes to the previous question. Does your Agency have any accuracy requirement for defects? Y/N If yes, please describe.
Of the 22 surveyed agencies recording defects, 20 do not have an accuracy requirement for defects; only two indicated an accuracy requirement. In Montana, there is a process for the QA review team and inspector to go through if there is a disagreement between the team and the inspector. Delaware uses defects in BrM software that reflect the MBEI defects; Minnesota records defects in the element notes; North Dakota uses reference sheets from the MBEI training provided by FHWA.
28. Has your Agency developed any additional defect elements? Y/N If yes, please describe or provide a reference. (You may email data to firstname.lastname@example.org.)
In response to this question, 12 agencies responded yes and 23 answered no.
The developed elements and defects were of a wide variety: stream hydraulics, a secondary member ADE with unique defects, beam ends, loss of riprap stone, paved slope effectiveness, timber fire damage, timber weathering, impact damage, shear cracks, and a debris defect. One respondent skipped this question.
29. Has your Agency developed elements that are subsets of the BMEs? For example, for element 510, Wearing Surfaces, subsets might include a concrete wearing surface, an asphaltic wearing surface, and/or a semi-rigid (epoxy or polyester material) wearing surface. Y/N
Twenty-one of the respondents answered that they had not developed elements that are subsets of BMEs; fifteen agencies have developed such elements. All respondents answered this question.
30. Has your Agency developed specific BMEs in addition to those identified in the MBEI? Y/N If yes, please describe the general background of the development of these elements.
Responses to this question show that 28 agencies have not developed any specific BMEs in addition to those identified in the MBEI, and only eight agencies have. Michigan is among those that have developed specific BMEs, such as reinforced concrete deck top surface, reinforced concrete bottom surface, false decking, maintenance sheeting, and stay-in-place forms. Virginia has developed specific BMEs for system preservation activities. Montana has added a BME for
corrosion resistant reinforcing and one for nonmetallic reinforcing. Appendix B lists the ADEs, BMEs, and defect elements submitted by participants.
Quality Assurance and Quality Control
31. Please select the quality assurance procedure that best describes the procedure used by your Agency.
a) Bridges are inspected by a different, independent inspection team and results are compared to the routine inspection results
b) Inspection results are reviewed in the field by a supervisor or peer
c) Multiple inspection teams inspect a particular bridge and results are compared with an "expert" inspection
d) Multiple inspection teams inspect a particular bridge and results are compared between teams
e) Other (please describe) _______________________________
Purpose of the question: The purpose of the question is to determine the quality assurance procedures being used by agencies to help ensure consistency of the bridge element inspections. This question will provide direction for ways in which the guidelines developed in this project can help to improve quality assurance procedures and inspection consistency. The question relates directly to the objectives of the research in terms of developing field exercises for ensuring the quality of inspection results.
Answers to this question, shown in Figure A-10, reveal that all agencies have a quality assurance procedure, and the procedures are similar among some agencies. Fourteen of the agencies indicated that bridges are inspected by a different, independent inspection team and results are compared to the routine inspection results; in eight agencies, inspection results are reviewed by a supervisor or peer; one agency uses an "expert" inspection as a benchmark against which the results of multiple inspection teams inspecting a particular bridge are compared; and in two agencies, multiple inspection teams inspect a particular bridge and results are compared between teams.
Other quality assurance procedures reported were a combination of the above-mentioned procedures (Utah), independent team leaders performing technical reviews followed by a QA review by a Division Program Manager to ensure regional consistency and compliance with the NBIS (USACE), and annual QA with the use of a consultant contract (Michigan).
• In Virginia, the central Office Structure and Bridge Division performs a QA review of all districts (a sample of in-house teams, consultant teams and a locality team), which consists of an office review of records (inspection and load rating) and a field review of inspected bridges.
• The Delaware QA team takes the finalized report to the bridge, compares the report to existing conditions, and completes the QA form with any changes; this information is then discussed with the team leader during their review.
• In Alaska, each year a random sampling of bridges is inspected by a different, independent inspection team and results are compared to the routine inspection results. All inspection reports are reviewed by another team leader to ensure appropriate NBI Item and MBEI Element ratings based on the narrative and photos in the draft inspection report.
• In Massachusetts, inspection reports are reviewed by a supervisor. Each team leader has two reports evaluated in the field yearly by the supervisor and Area Bridge Inspection Engineer as QA.
• In Kentucky, inspections are randomly selected by the central office and an independent field inspection is performed by the Chief Bridge Inspector on each QTL. Also, random inspections are selected and reviewed at the central office by the Chief Bridge Inspection Engineer, Chief Bridge Inspector, and the Chief Load Rating Engineer.
• In Missouri, the central office does independent QA/QC inspections on a sample of bridges each year that covers different areas of the state.
• In Montana, bridges are inspected by a QA Review team which consists of the original inspector, the district bridge inspection coordinator, any other inspectors who would like to join the team, and two Quality Assurance engineers from headquarters, who are considered "experts," and the result is compared to the routine inspection results. With the inspector on-site during the review, changes to an inspector's rating typically generate discussion that becomes on-site training.
• Oklahoma utilizes a combination of its QC/QA program, independent review, and a consultant for a one-time inspection QA contract.
All responded to this question.
Figure A-10. Agencies' quality assurance procedure
32. How are the results of quality assurance (QA) processes conveyed to inspection team leaders?
a) Report
b) Face-to-face meeting
c) Both of the above
Comments.
Purpose of the question: The purpose of questions 32 and 33 was to determine the feedback mechanisms currently used, and to learn if the agencies were disqualifying inspectors based on performance.
From the responses to this question it is found that 22 agencies use both reports and face-to-face meetings to convey the results of QA processes to the inspection team leader; five agencies use reports only; eight agencies use face-to-face meetings only. In Washington DC, a bridge inspection consultant is notified of any discrepancies by email or in person. Ohio uses comparative data in a face-to-face meeting to convey the QA result. One respondent skipped this question.
33. What is the result of poor performance of an inspection team leader during the QA process?
a) Retraining/re-certification
b) Disqualification
c) Other (please describe)
It is found that 16 agencies use retraining/recertification if poor performance of an inspection team leader is found during the QA process. One agency disqualifies the inspection team leader if determined to have poor performance, and Delaware uses both, and sometimes either, of the above to address poor performance by team leaders. Nineteen other agencies use different ways to address team leaders' poor performance.
• In Washington DC, poor performance is usually the action of the sub-consultant. The consultant is informed of the problem and asked to rectify it through instruction or some other means.
• New York uses retraining, but disqualification is also an option if the issue is severe.
• Since USACE team leaders are P.E.s and structural engineers, any inspection issue is resolved among themselves.
• Virginia uses performance counseling to address inspection team leader poor performance.
• In Texas, poor performance is reflected in the inspection team leader evaluation, which is used in the consultant contract selection process. It is in the consultants' best interest to pursue re-training for their chances at continued work.
• In Minnesota, poor performance is addressed in a face-to-face meeting in which the quality assurance team retrains the inspection team leader, and the bridge owner is informed through a report.
• Ohio only provides information and verbal correction to address inspection team leader poor performance.
• In Missouri, direct feedback is given to anyone who has significant issues with how to rate things. For other things, general feedback is given to all the inspectors on areas that need more focus.
• Nevada addresses this issue by talking about the problem in group meetings, and Montana uses on-site training.
• In Oklahoma, if an inspection team/consultant has been underperforming for a period of time, they are eliminated from the bridge inspection program.
• In Rhode Island, if an inspection team gets a poor rating in a QA/QC review, more bridges that the team inspected will be reviewed and a meeting will be set up with the team to review the results. Disciplinary actions may be taken that could result in suspension of that team for a time period.
• In Pennsylvania, the results are reported to the district executive.
34. Does your Agency hold periodic meetings of bridge inspectors to discuss inspection results, ratings, procedures and requirements? Y/N
Purpose of the question (Q34-Q35): The purpose of the question is to determine if the agency uses periodic team meetings to discuss inspection results and requirements, and to generally characterize the frequency and participation in those meetings.
Almost all the agencies, 32 in number, hold periodic meetings of bridge inspectors to discuss inspection results, ratings, procedures, and requirements. Only four agencies do not hold periodic meetings.
35. You answered yes to the previous question. Which of the following best describes your process?
a) Annual meeting of all inspection teams
b) Quarterly or semiannual (every six months) meetings of all inspection teams
c) Annual meetings of inspection teams within a district or other subdivision
d) Quarterly or semiannual (every six months) meetings of inspection teams within a district or other subdivision
e) Other (please describe)________________________________________
Of the 32 agencies that hold periodic meetings, 15 agencies hold an annual meeting of all inspection teams, four agencies hold quarterly or semiannual (every six months) meetings of all inspection teams, one agency holds an annual meeting of inspection teams within a district or other subdivision, and four agencies hold quarterly or semiannual (every six months) meetings of inspection teams within a district or other subdivision. Eight other agencies use different approaches. In Washington DC, meetings are not scheduled on a regular basis, but occur when issues must be discussed or prior to execution of an option year. Delaware and Utah hold monthly meetings; Delaware does so because it has a small group of 8-10 in-house inspectors and because of all the changes with the MBEI. Maine holds bi-weekly meetings with inspection teams. North Dakota holds an annual meeting with all teams and, in some years, meetings with teams in individual districts. In Montana, a biannual meeting of all inspection teams plus yearly Quality Assurance reviews are held in each district. Wisconsin holds annual and quarterly meetings with its Regional Bridge Engineers. Kansas holds an annual three-day training-review meeting every year and meets as needed to resolve other issues as they come up.
36. Please select the quality control procedure that best matches the procedure used in your Agency. Select any combination of answers that best describe your Agency's quality control procedure.
a) Inspection results (report content, rating, quantities, etc.) are reviewed by a supervisor or peer in the office to identify errors and confirm reasonable results
b) Inspection results (report content, ratings, quantities, etc.)
are reviewed in the office by a supervisor or peer to identify errors and confirm reasonable results, AND some field verification is performed.
c) Software is used to check for errors and omissions in the report.
d) Other (please describe) ________________________________________
Purpose of the question: The purpose of the question is to determine the number of agencies that utilize some form of field verification at the QC level, and how many states rely on a software tool. If an agency uses software tools, it may be useful for others to learn of that tool. The question also shows the level of review for original inspection results.
Responses to this question show that 25 agencies depend on supervisor or peer review to identify errors and confirm reasonable results, along with some field verification of element-level inspection results (report content, ratings, quantities, etc.), as a quality control procedure. As shown in Figure A-11, 13 agencies use only supervisor or peer review in the office to identify errors and confirm reasonable results of element-level inspection results (report content, rating, quantities, etc.). Seven agencies use software to check for errors and omissions in the report. In Kansas, every inspection is reviewed by one inspector in the office selected for that particular area. Each area inspected has an area meeting conducted with both the area and district to discuss the major findings of the inspections. Procedures for element level data have not yet been incorporated into the Agency's inspection QA procedures (Pennsylvania). All answered this question.
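Several agencies report using software to check for errors and omissions. A minimal sketch of one such automated check, in Python, flags element records whose condition-state quantities do not sum to the recorded total quantity; the field names and sample records are illustrative assumptions, not drawn from any agency's actual software.

```python
# Illustrative sketch of an automated QC check of the kind some agencies
# report using: flag element records whose condition-state quantities do
# not sum to the recorded total quantity. Field names and sample records
# are hypothetical, not drawn from any agency's software.

def qc_check(element_records):
    """Return a list of error messages for inconsistent element records."""
    errors = []
    for rec in element_records:
        cs_sum = sum(rec["cs_quantities"].values())
        if cs_sum != rec["total_quantity"]:
            errors.append(
                f"Element {rec['element_no']}: condition-state quantities "
                f"sum to {cs_sum}, but total quantity is {rec['total_quantity']}")
    return errors

# Example: the deck record is consistent; the wearing-surface record is not.
records = [
    {"element_no": 12, "total_quantity": 11880,
     "cs_quantities": {"CS1": 11628, "CS3": 252}},
    {"element_no": 510, "total_quantity": 11880,
     "cs_quantities": {"CS1": 11880, "CS2": 100}},
]
problems = qc_check(records)  # flags only element 510
```

A production tool would also check for missing required fields, out-of-range condition states, and omitted elements, but the quantity-balance test above is the core consistency check this survey question refers to.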
Figure A-11. Agencies' quality control procedure
Inspection Process and Procedure
37. When conducting the inspection in the field, do inspectors have the previous inspection results available to them? Y/N
Purpose of the questions (Q37-Q42): To determine the documentation available to inspectors in the field. Some data quality research suggests that multiple inspections of the same conditions increase the accuracy of the resulting data. The use of previous inspection results influences the quality of the data and may reduce scatter.
Responses to this question show that all 36 agencies have previous inspection results available to inspectors.
38. You answered yes to the previous question. Which of the following best describes the format of the previous inspection results?
a) Electronic (e.g. available on a laptop computer or tablet)
b) Paper copy
c) Other (Please specify) ________________________________________
For 12 of these agencies, the previous inspection results are available in electronic format (on a laptop computer or tablet), and for 13 agencies the previous results are available in paper copy. Eleven other agencies, such as Delaware, Minnesota, Kentucky, North Dakota, Montana, Oklahoma, Wisconsin, Rhode Island, Utah, and Kansas, use both electronic and paper copies.
39. Based on your experience, are the previous inspection results reviewed prior to the inspection? Y/N
Based on the responses to this question, it is found that in 32 agencies the previous inspection results are reviewed prior to conducting the inspection; four agencies do not review the previous inspection before conducting the inspection.
40. Which of the following is available to the inspector during the inspection process in your Agency? Choose all that apply.
a) Bridge plans
b) Scour evaluation/plan of action (POA)
c) Photos of damages/defects
d) Drawings of damage/defects
e) Recent maintenance/repair records
f) Load rating data/controlling member(s)
The survey shows that during inspection, bridge plans are available in 31 of the surveyed agencies, a scour evaluation/plan of action (POA) is available in 27 agencies, photos of damages/defects are available in 34 agencies, drawings of damages/defects are available in 29 agencies, recent maintenance/repair records are available in 21 agencies, and load rating data/controlling member(s) are available in 24 agencies. All answered this question.
41. Would you expect the areas of damage on the bridge to be sketched on the drawings as part of the inspection result? Y/N
The survey shows that 21 agencies expect that the areas of damage on the bridge will be sketched on the drawings as part of the inspection, and 15 agencies do not expect the areas of damage to be sketched on the drawings. All answered this question.
The following questions refer to common procedures and practices used in your inspection program to conduct routine inspections. To assist in responding to the questions, please consider the following example bridge. The bridge shown is an example bridge from the AASHTO Bridge Element Manual, Appendix B, page B-5. Assume that the bridge lies on an NHS route and has an Average Daily Traffic (ADT) of ~5000 vehicles per day. The deck of the bridge (element #12) includes 11,880 sq. ft. total; 11,628 sq. ft.
in Condition State (CS) 1 and 252 sq. ft. in CS3.
42. The deck of the bridge (element #12) shown in Figure A-12 includes 11,880 sq. ft. total; 11,628 sq. ft. in CS1 and 252 sq. ft. in CS3. To estimate these data, which of the following best describes the method used?
a) The areas are estimated based on the percentage of the deck in poor condition, multiplied by the recorded total area (11,880 sq. ft.)
b) The areas are estimated by an inspector using a measurement device, such as a tape measure or wheel, to calculate the total area in CS3.
c) The areas are determined by sketching the damage on a scaled diagram (drawing) and then calculating the area from the sketch.
d) The process for estimating the area is not known.
Purpose of the questions (Q42-Q44): The purpose of the question is to determine how quantities are estimated in the field for a common element. The question will determine if measurement devices are used in estimating the area and if the process is documented in inspection procedures or training. The use of sounding is also considered in the questions. A deck element was selected as the subject because all bridges have a deck, regardless of the superstructure form, and decks are a focus of maintenance and
preservation activities which may be suggested through bridge management processes based on element level inspection.
Figure A-12. Question 42 deck element
The following are the responses of the survey invitees, which are also shown graphically in Figure A-13. Fifteen agencies responded that the areas are estimated based on the percentage of the deck in poor condition, multiplied by the recorded total area (11,880 sq. ft.). Sixteen agencies responded that the areas are estimated by an inspector using a measurement device, such as a tape measure or wheel, to calculate the total area in CS3. Two agencies responded that the areas are determined by sketching the damage on a scaled diagram (drawing) and then calculating the area from the sketch. Three agencies noted that the method for estimating the area is not known. All responded to this question.
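Method (a) above, estimating condition-state areas as a percentage of the recorded total, amounts to simple arithmetic. The sketch below reproduces the example deck quantities; the helper function and its name are hypothetical, not taken from any agency manual.

```python
# Illustrative sketch of method (a): splitting a recorded total element
# quantity across condition states from visually judged percentages.
# The helper function and its name are hypothetical.

def cs_areas_from_fractions(total_area_sqft, cs_fractions):
    """Split a total quantity across condition states.

    cs_fractions maps condition state -> fraction of the element judged
    to be in that state; the fractions must sum to 1.
    """
    assert abs(sum(cs_fractions.values()) - 1.0) < 1e-9
    return {cs: round(total_area_sqft * frac)
            for cs, frac in cs_fractions.items()}

# The example deck: 11,880 sq. ft. total, with 252/11,880 (about 2.1%)
# of the area judged to be in CS3 and the remainder in CS1.
frac_cs3 = 252 / 11880
areas = cs_areas_from_fractions(11880, {"CS1": 1 - frac_cs3, "CS3": frac_cs3})
# areas == {"CS1": 11628, "CS3": 252}
```

The survey's methods (b) and (c) instead measure or sketch the CS3 area directly and subtract it from the total, which avoids compounding the error in a visually judged percentage.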
Figure A-13. Methods used to estimate the CS's of the deck element
43. Regarding the previous question, which of the following best describes the practice within your Agency?
a) The method for making the estimate is described in an inspection manual or otherwise documented.
b) The method for making the estimate is described in training or periodic meetings, but not documented.
c) The method for making the estimate is determined by individual inspection teams or groups of inspection teams.
As shown in Figure A-14, 22 respondents chose that the method for making the estimate is determined by individual inspection teams or groups of inspection teams. Eleven respondents chose that the method is described in training or periodic meetings, but not documented. Three respondents chose that the method is described in an inspection manual or otherwise documented.
Figure A-14. Sources of the method used for deck CS estimation
44. Regarding the bridge deck element, considering the example bridge and other similar bridges in your Agency's inventory, how often would you expect sounding to be used to determine the area of damage in the deck?
a) Each routine inspection
b) Every other routine inspection
c) During in-depth or special inspection only
d) Not until work is being planned/scoped
e) Other frequency/never (please describe)
Thirteen respondents replied that sounding is expected in every routine inspection, four respondents replied that sounding is only needed during an in-depth or special inspection, and six respondents replied that they do not expect sounding of the element until work is being planned/scoped. The result is shown in Figure A-15.
Figure A-15. Sounding frequency to determine the area of damage in the deck
New Hampshire, Nebraska, and Missouri deem sounding appropriate if visual signs of deterioration are present. Colorado deems sounding necessary if there are indications of delamination and traffic allows. Kansas uses chain drag if traffic allows. Michigan leaves this to the inspector's judgment; it is recommended that a detailed inspection be performed when the structure falls from good to fair condition, with the next detailed inspection completed at the inspectors' discretion or when the deck condition falls from fair to poor. After the deck is in poor condition, it is recommended that detailed inspections be completed at 48-month intervals. Delaware conducts random sounding during all routine inspections, usually in the shoulder areas unless there are signs of deterioration elsewhere; according to them, shoulders typically show problems first, with salt and water flowing to this location. In Alaska, although sounding is recommended during each routine inspection, it can be deferred to the next cycle if traffic, rain or visibility preclude sounding. If a couple of cycles have passed without soundings and a deck has prior documented damage with observed increases in cracking, pop-outs, spalls, or patches, then depending on the extent of damage, a deck inspection with traffic control may be scheduled to obtain updated condition information. If a deck has no prior documented damage, or no progression of prior documented damage with minimal extent, several cycles may pass until a deck inspection is scheduled. North Dakota requires sounding once a deck reaches an NBI rating of 5; decks are chained at least every four years. Florida responded that since they do not use deicing salts, they do not have the deck problems that other states have, so unless they have issues they generally do not sound. Thirty-three respondents answered this question.
45.
For the first element-level inspection of this bridge, how long would you expect an inspection team to spend inspecting this bridge and preparing the required report?
a) 0-2 hrs
b) 2-4 hrs
c) 4-6 hrs
d) 6-8 hrs
e) Greater than 8 hrs
Purpose of the question: The purpose of the question is to characterize the scope of a typical inspection in terms of the time anticipated for the inspection and the access equipment made available to the inspectors. The use of traffic control is also measured. These data provide insight regarding the thoroughness of inspections and the variability in the scope of inspections among different bridge owners.
For the first inspection, three respondents indicated that 0-2 hrs are required, seven indicated 2-4 hrs, nine indicated 4-6 hrs, eight indicated 6-8 hrs, and eight indicated more than 8 hrs. One respondent skipped this question.
46. For subsequent element level inspections of this bridge, how long would you expect an inspection team to spend inspecting this bridge and preparing the required report?
For subsequent inspections, 11 respondents answered 0-2 hrs, eight respondents answered 2-4 hrs, nine respondents answered 4-6 hrs, four respondents answered 6-8 hrs, and three respondents answered more than 8 hrs. The result is shown in Figure A-16. One respondent skipped this question.
Figure A-16. Time required for subsequent element level inspection of the bridge
47. Does your Agency require recording the inspection team's time to inspect the structure? Y/N
From the responses it is found that 25 agencies do not require recording the inspection team's time of inspection, and 11 agencies require recording the inspection team's time to inspect a structure.
48. To complete the element-level inspection, would access equipment be used during the inspection? Assume that the previous inspection results have indicated that all elements are in good or fair condition. Y/N
Responses to this question show that 18 agencies would not use access equipment during inspection of this structure, and 16 agencies would use access equipment for the inspection. Two respondents skipped this question.
49. Assuming this bridge lies on an NHS route with an Average Daily Traffic (ADT) of ~5000 vehicles, would traffic control be used to complete the inspection? Y/N
Nineteen of the respondents answered that no traffic control would be used to complete the inspection, and 16 respondents answered that traffic control would be used. One respondent skipped this question.
50. Does your Agency's inspection procedure or practice include baseline photographs documenting conditions when a bridge is added to the inventory? Y/N
Purpose of the question (Q50-Q51): The purpose of this question is to characterize the use of photographs to document bridge conditions. The question will determine if respondents have a systematic procedure for documenting damage conditions in the field.
For this question, 31 agencies replied that their inspection procedure or practice includes baseline photographs documenting conditions when a bridge is added to the inventory, and only five agencies' inspection procedures do not include baseline photographs for bridges when added to the inventory.
51. Does your Agency's inspection procedure REQUIRE photographs to be taken during the inspection? Choose all that apply.
a) No required photographs; inspection teams determine appropriate photographs independently.
b) Photographs showing the overall conditions at the bridge are required. For example, bridge elevation and waterway photographs are required.
c) Photographs are required for CS 2, CS 3 and CS 4 locations/conditions
d) Photographs are required for CS 3 and CS 4 locations/conditions
e) Photographs are required for CS 4 locations/conditions
f) Other (please describe)
Twenty-five agencies answered that photographs as described in the MBEI are required, including photographs showing a top view of the roadway across the bridge, a side elevation view, and an under view of the main or typical span superstructure configuration. Ten agencies answered that photographs are required for CS 3 and CS 4 locations/conditions, and six agencies answered that no photographs were required and that inspection teams determine appropriate photographs independently. Also, three agencies indicated that photographs are required for CS 2, CS 3 and CS 4 locations/conditions, and three agencies indicated that photographs are required for CS 4 locations/conditions.
• Virginia has a specific photograph requirement that current photographs, with a date stamp, will be included in the report for all items that warrant a general condition rating of '4' or less. Unless otherwise
approved by the District Bridge Safety Inspection Engineer, new photographs shall be required during each inspection. Old photos can be included for comparison or to show a better picture of a condition (e.g. the substructure when the water level was much lower during a past inspection).
• Alaska has a list of standard photos that are recommended to be taken with each routine inspection: Ahead/Back at Bridge; Looking Upstream/Downstream at Channel; Elevation Looking Upstream/Elevation Looking Downstream; Bank Protection at Bridge; Stream Bank Condition Upstream/Downstream; Near/Far End Signs (Close-up, No Flash) for load posting/restricted width/vertical clearance; Bridge Mounted Signs/Utilities; Typical Deck Condition/Typical Soffit Condition/Edge of Deck Condition; Underside of Superstructure in each span; Near End Abutment/Far End Abutment/Pier(s); Pin & Hangers/In-Span Hinges/Seismic Retrofits; Damage/Defects/Deterioration.
• In Massachusetts, general photos and photos of deficiencies are required.
• Oklahoma requires photographs of the centerline and profile photos for all inspections, in addition to those described in the MBEI and those for CS 3 and CS 4 locations/conditions.
• Florida requires photos for NBI ratings of 6 or less, in addition to those required for CS 3 and CS 4 locations/conditions.
• Kansas takes approach and elevation (including channel) photos on the initial inspection and updates them about every eight years. All defects with an NBIS rating of 6 or lower, all expansion joints, impact damage, and scour require a photo every inspection. Photos of all utilities, special features and under-clearance signs are taken at the initial inspection or as changes require. New minor defects often get photos, depending on the inspector.
The inspection manual requires a minimum number of photographs; however, the photographs are not required for submission to the DOT (Ohio).
Optional Question 52.
If you could change two things about element level bridge inspection, what would you change?
Purpose of the question: The purpose of the question was to determine if there were common complaints regarding the element-level inspection requirements.
An analysis of this survey question shows that the majority of proposed changes can be grouped as revision, reorganization, addition, and/or deletion of MBEI contents. There are a small number of proposed changes outside the MBEI content.
REVISION AND REORGANIZATION
One agency proposes a "centralized (i.e. federal) database such that states follow inspection based on standard elements. The advantage would be real-time information uploaded and a cost savings. It is expensive and time consuming for every state to independently develop systems and manuals while keeping up to date. Having this provided to the states would promote consistency, catch problems, identify needs, etc." Two agencies suggest mandating element level data collection for all NBI bridges, which would improve consistency. One of these respondents states that "FHWA only requires element inspection for NHS bridges, which make up less than 5% of our inventory." The other states, "The FHWA needs to either make it apply across the board or else forget about the whole idea and go back to the classic NBI approach. Fooling around with the 'incremental' application of the element-level approach (which the FHWA has done for years now) is just wasting everyone's time." Also, one agency proposed reorganization of the MBEI and clarification of the condition states. Another agency proposes to "find and eliminate areas of inconsistency, i.e. be specific regarding the environments," and this respondent writes, "communicate the benefits of element level." Several agencies propose updating or revising the quantity units for substructures.
One agency states that "collecting defects for columns in 'each' and for pier caps and abutments in 'lft' does not provide a good method for determining bridge management activities, especially for concrete material. Generally, if
substructure is going to be repaired, the repairs are measured in square feet." Two other agencies propose more consistency in units (deck in sq. ft. while abutment/pier in lft). One of them states, "Really need to review how substructure elements are considered; 'each' is probably not the best option." Another agency proposes "changing the unit of substructure elements from linear feet to square feet, which would allow identifying the actual quantity of deficiency and using contract unit costs." Another four agencies propose changing the protective coating measurement unit from square feet to the quantity of the parent element. One of the respondents gives examples of such elements, including bridge rail, piles, bearings, pins and hangers, bin walls, and sheet pile walls, whose protective coating can be measured in the same unit as the parent element itself. Also, one agency proposes setting the protective coating element quantity to 1 'each' instead of square feet, rated all in one condition state; this respondent states, "that is the way we handled it previously and we still do it this way." Similarly, one more agency asks for "changing 'other material' abutment or pier to cover combination-type abutments and piers in addition to materials other than steel, concrete or timber. Combination abutments and piers make up many substructures. Relying on the predominant system for combination type abutments is not ideal. Abutments are intended to be one element, but piers that are similarly built are not, i.e. a pier with steel piles and a steel pier cap is separately coded, but the same at the abutment is coded as a steel abutment. Another example would be with a concrete infill/web wall: at the abutment it retains fill, but at the pier it is more a protective system to prevent debris from snagging on the pier. The abutment fill retention system may be separate from the structural system for the bridge, but they are coded together, i.e.
a vertical sheet pile wall behind a steel pile and steel pile cap. An alternative may be to break abutments up into separate elements, much in the same way piers are separated into footing, columns and cap beams, though that still doesn't fully address combination substructures."
There are several change proposals from respondents about defects. One agency writes, "defect quantities do not need to add up to the condition state quantity because defects often overlap," and another agency wants "recommendations/examples for overlapping defects." Also, the former respondent proposes adding a fifth condition state, CS 5, and notes that "it is my understanding that if the review determines that there is not an effect on strength or serviceability of the element or bridge, the quantity is moved back to CS 3. This requires a lot of additional notes describing for the next inspector why it is not in CS 4. When a structural review determines there is an effect on strength or serviceability of the element or bridge, the quantity will be shown in CS 5." Another agency writes, "I really liked the deck element with five condition states. We had an action for each state that worked very well for us. We have established a work-around. We have had some trouble with wearing surface, deck and top flange when you have all three at one location." One agency asks for a "change in the language for the condition of timber bridges - the depth of checks as compared to the depth of the member part is ridiculous. Checks are a normal part of the drying process for timber. They shouldn't lower the condition state. Other language about the condition of timber is too conservative as well. The condition of our timber bridges is going to drop significantly once we start using the new elements. The location of checks/splits/cracks/rot is just as important as how big that split/check/crack/rot is.
Inspectors need to be able to use common sense, and the language should be based on how the condition affects the load carrying capacity or future load carrying capacity of the bridge, not just go by the size of the defect."

ADDITION

Two other agencies propose implementing 3D models of bridges to accurately collect the severity, quantity, and location of defects. One of these two agencies states that, to improve the accuracy and reliability of the data, "we really need a better way to collect data." One agency states, "Element level manual use photographs instead of text. 95% of inspections are visual, therefore in order to maximize consistency, a visual inspection should correlate with a visual manual. Each CS 2, 3 and 4 for every defect should have a representative photograph." Two agencies ask for "specific defect language for elements with sq. ft., lin. ft., and each quantities and more specific guidance for CS 4" and for "more and better examples, and better guidance when there is multiple defects in one location and on the hierarchy." Another agency proposes the same change, requesting a "better description of defects," and states, "one or two lines of text don't always cover the issues." One agency asks, "Why is it needed to add non-primary defects per inspection, which in the next inspection may become a primary defect?" One agency proposes adding a "prestressed slab element in the manual," another agency proposes to "include wingwalls, these have been added as ADEs," and one agency wants to "change how trusses and arches are measured and quantified."

DELETION

Another respondent proposes, "Remove the reference of (does not warrant structural review) from condition state three and remove the reference (warrants a structural review) from condition state four. Unless the condition is 'obvious' you cannot tell if a structural review is required or not. The condition state language should contain a description of an actual defect condition, for example rebar is exposed and core concrete is affected." One agency writes, "I would get rid of it entirely because we don't believe that it will ever provide anything that is meaningful to use. I would not expand the number of elements that are required to be collected and I would not expand the requirement for collection of this data to non-NHS bridges."

OTHERS

Among those who ask for changes other than MBEI content, one agency wants to "have the BrM software run smoothly," another wants "to have the software installed and deployed," and a third proposes that "FHWA develop the software for everyone to input and use." Also, one agency proposes to "stop changing the system. We take what you give us and make it work with our system and then FHWA decides to change it because it doesn't work for some big state that lead the development on the original system in the first place. Focus on the development of better forecasting tools and better deterioration curves."

28 of the respondents answered this question.