Key Bridge Preservation Reports

FHWA (2008). 2008 Status of the Nation's Highways, Bridges, and Transit: Conditions & Performance. Report to Congress.
• Chapter 2: Bridge System Characteristics
• Chapter 3: Bridge System Conditions
• Chapter 11: NHS Bridge Performance Projections
• Appendix B: Bridge Investment Analysis Method

TRB (2009). NCHRP Synthesis of Highway Practice 397: Bridge Management Systems for Transportation Agency Decision-Making. Provides an overview of how agencies make decisions about bridge investments.

Fiber-Optic Sensors

Table D-1 provides detailed information about projects using FOS.

Dispersion of Expert Opinion on NDE Bridge Deck Evaluation Technologies

Table D-2 provides an indication of the degree of uncertainty, or difference of opinion, that these survey scores represent. For each of the three measures, the coefficient of variation has been calculated: the standard deviation divided by the mean, which provides a relative measure of dispersion. A coefficient of variation closer to one indicates wide variation, in that the standard deviation approaches the magnitude of the mean. A coefficient of variation closer to zero indicates little variation about the mean.

Cost Assumptions

Several assumptions were used in the cost model developed by the research team, and a number of them were discussed in detail during the sessions at MnDOT. Among the most important was the question of scope of application. The relatively low assumed share of bridges within the jurisdiction evaluated each year (5 percent) is based on a presumption that no technology will replace the existing statutory methods (visual and audible). Rather, instances in which further evaluation, higher reliability, or lessened impact are desired (e.g., a heavily traveled bridge in a dense urban area) would form the target group of candidates for assessment using an alternative or supplemental technology.
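The dispersion measure described above can be computed directly. A minimal sketch follows; the function name and the sample replies are illustrative, not values from the report:

```python
from statistics import mean, stdev

def coefficient_of_variation(scores):
    """Standard deviation divided by the mean: a relative measure of
    dispersion.  Values near 0 mean the replies cluster tightly about
    the mean; values near 1 mean the spread is comparable in magnitude
    to the mean itself."""
    return stdev(scores) / mean(scores)

# Hypothetical expert survey replies for one technology/measure pair
replies = [2.0, 3.0, 2.5, 3.5, 2.0]
cv = coefficient_of_variation(replies)
print(round(cv, 2))  # → 0.25
```

Applied column by column to the survey replies behind Table D-2, this yields one coefficient per technology and measure.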
Appendix D: Materials Supporting Bridge Deck NDE STREAM Example

Report on STREAM Field Trial: Minnesota DOT

The research team's efforts to field-test STREAM, to refine it and ensure its utility and practicality, included a day-long workshop at the Minnesota DOT (MnDOT) in April 2012. The research team had two objectives in meeting with these representatives of the intended audience for STREAM. First, the research team sought feedback from individuals who might use STREAM, either by conducting a STREAM analysis or by using the resulting findings. Second, the research team sought to generate awareness of STREAM and explore opportunities for applying STREAM.

The research team sought a collaboration with MnDOT in particular because of close ties between the NCHRP Project Panel and MnDOT, and because the case study on bridge deck evaluation, which also served as the application for the workshop exercise, drew heavily on MnDOT's efforts to assess bridge deck evaluation methods (see, for example, Gastineau et al., 2009).

The research team divided the workshop into four sessions. Sessions 1 and 2 were aimed at broadly discussing STREAM with end users and soliciting general feedback. In Sessions 3 and 4, the research team validated STREAM (using the topic of bridge deck inspection) with experts from MnDOT's bridge maintenance and related offices. The
remainder of this appendix describes those sessions and the findings.

Sessions 1 and 2: Overview of STREAM

In Session 1, the research team provided an overview of the project and STREAM to a group of MnDOT middle and upper level managers from both operational and planning offices. In Session 2, the research team met with personnel involved in technology decision-making and technology use. In these sessions, the research team focused principally on STREAM itself, describing the thinking behind it and how the research team envisioned its implementation. In both sessions, participants indicated support for implementing STREAM as a technology decision tool in transportation agencies. When presented with the bridge deck inspection example, one participant commented that a side-by-side comparison of technologies illustrated that there are benefits to increasing the POSI of GPR, a technology application already being considered within MnDOT, and that doing so would add value to MnDOT's bridge operations.

Participants supported rooting the assessment in broad agency missions (safety, mobility, etc.) that are common to agencies and across functions, and then defining specific goals based on these fundamental agency missions. Participants also expressed concern that STREAM might still not be sufficient to help agencies identify and understand when or how technology will affect them in unforeseen or subtle ways. Computer-aided design (CAD) was offered as an example of a technology that has changed the way DOTs do business, but which agencies may not have identified as a technology to evaluate with STREAM.

Table D-1. Examples of bridge monitoring projects using FOS.

• A total of sixteen Fabry-Perot (FP) FOS were installed on the East Bay bridge in Hillsborough County, Florida. The bridge is a 4-span continuous reinforced concrete deck-type structure. (Mehrani et al., 2009) (Idriss and Liang, 2007)
• "Fiber Bragg sensors developed for structural health monitoring, and were installed on Hong Kong's landmark Tsing Ma bridge (TMB), which is the world's longest (1,377 m) suspension bridge that carried both railway and regular road traffic. Forty FBG sensors divided into three arrays were installed on the hanger cable, rocker bearing and truss girders of the TMB." (Chana et al., 2006)
• Siggenthal bridge in Baden, Switzerland: 58 FOS (Li et al., 2004)
• Versoix bridge, Switzerland: 104 FOS (Li et al., 2004)
• Waterbury bridge, Vermont: 36+16 FOS, steel bridge (as opposed to concrete) (Li et al., 2004)
• Planned: "The [FOS-based] SHM systems will be implemented on two existing steel bridges: the Government Bridge at Rock Island Arsenal, Rock Island, Illinois, and the Interstate 20 (I-20) Mississippi River Bridge between Mississippi and Louisiana." (Mason et al., 2009)
• I-35W bridge in Minnesota (the one that collapsed): array of sensors being used, including FOS. (Inaudi et al., 2009)
• "The Bridge Engineering Center at Iowa State University has been working with the Iowa DOT to improve methods of managing bridge infrastructures. Specifically, the Bridge Engineering Center is developing and utilizing short-term and long-term SHM systems to measure bridge behavior.... The HPS bridge SHM system consists of components developed from several different manufacturers. When possible, standard off-the-shelf components were utilized to maintain minimum cost for the system. The primary components of the SHM system are as follows: strain sensing equipment (Micron Optics si425-500 Interrogator); strain sensors (30 Fiber Bragg Grating (FBG) sensors); and video equipment, networking components, and three computers for web service, data collection, and data storage. [For real-time status, visit the SHM system online portal at http://www.ctre.iastate.edu/bec/structural_health/hps/index.htm. There clients can view live streaming video of traffic crossing the bridge and the resulting real-time girder strain measurements.]"
• (Graver et al., 2004) "An optical fiber monitoring system was designed and built into one span of the five-span high performance prestressed concrete I-10 Bridge over University in Las Cruces, NM."

Table D-2. Characterization of bridge deck evaluation technology alternatives, coefficients of variation for expert survey replies.

Technology Application    Preservation Value    Mobility Value    POSI
Robotic                   0.32                  0.27              0.22
NDE Suite                 0.11                  0.19              0.16
FOS                       0.49                  0.32              0.21
GPR                       0.11                  0.38              0.15
Visual                    0.30                  0.38              0.01
Visual + Audible          0.35                  0.36              0.03

Session 3: Introduction to Bridge Deck Inspection Technology Assessment with STREAM

Session 3 was devoted to looking at the example of bridge deck inspection technology applications in detail. The session was intended to validate with experts the set of data and inferences that the research team had been using, and to frame the conversation about bridge inspection technology within the STREAM framework. This was done first through general discussion of the bridge monitoring function, alternative technological approaches, and the approaches used or tested by MnDOT. The participants then reviewed the research team's work in detail.

Key Insights from Discussion

The discussion offered several insights that the research team used to refine STREAM and the bridge deck inspection case study. Participants noted that bridge deck inspection principally relates to preservation of transportation infrastructure and mobility for travelers. Safety is not a key concern of bridge deck inspection (though it is of substructure inspection, for example) because traveler safety is not typically in question. Safety of bridge inspectors is, however, a consideration.
Participants also noted that the effect of bridge deck inspection on mobility depends not only on the length of time that a bridge or bridge lane is closed, but also on the number of people who use the bridge. A technology that involves a half-day of bridge closure may have little impact on mobility in a very rural region, but even a 1-hour closure may be costly in an urban region.

As in previous sessions, participants made several comments related to technologies that fundamentally change how DOTs perform certain functions. For example, fully adopting FOS as a means of bridge deck inspection involves implementing fiber optics in every new bridge during construction and retrofitting existing bridges. Fiber-optic systems could eliminate the need for recurring bridge deck inspection. This approach is different from GPR and visual inspection, which both involve routinely scanning the bridge with sensors of some kind to obtain information. In contrast to fiber-optic systems, these technologies allow DOTs to use the same equipment and resources on multiple bridges.

Participants agreed that it was important for DOTs to be able to assess both "business as usual" and revolutionary technologies. There was some disagreement over whether different technologies should be evaluated side by side. Some participants thought that doing so would be comparing "apples to oranges," while others noted that such comparisons were necessary in order to adopt technology. The group discussed how STREAM could facilitate such a comparison, noting that its side-by-side treatment of cost, impact on goals, and feasibility assists in making it.

Participants also noted that DOTs typically experiment with and adopt technologies in phases, conducting larger pilot trials of technologies. They emphasized that STREAM should be able to assist in narrowing in on appropriate trials and in incorporating information from trials into the decision-making process.
Several of these points shaped how STREAM was developed. For example, STREAM's ability to facilitate "apples and oranges" comparisons was made explicit and intentional. Other aspects of the discussion provided valuable input as the research team refined not only the STREAM method, but also the materials the research team used to present it.

Review of Cost Model

The research team also shared with MnDOT its assumptions about the costs of different bridge deck inspection technologies. Table D-3 shows a sample of the assumptions used in the cost model developed by the research team. Participants discussed several of these assumptions in detail.

Among the most important was the scope of application. The relatively low share of bridges evaluated using the alternative technology each year (5%) is based on a presumption that no technology will completely replace the existing statutory methods of visual and audible inspection. Rather, instances in which further evaluation, higher reliability, or lessened impact are desired (e.g., inspecting a heavily traveled bridge in a dense urban area) would be key candidates for assessment using an alternative or supplemental technology.

The resulting calculations, incorporating much additional data on the candidate technology applications, are shown in Table D-4. This table shows the breakdown of net fixed and recurring costs for each technology during the initial 5-year period of its use.
Session 4: Real-Time Application of STREAM to Bridge Deck Inspection

In the final session, the research team worked with participants to apply STREAM to bridge deck inspection technologies. This exercise focused on the Characterize and Compare steps in STREAM and used the forms shown in Figure D-1 to solicit participants' ratings of the benefits, barriers, and costs of the six alternative technology approaches for bridge deck evaluation that had been discussed. The research team used the results of participants' input to generate new plots comparing technologies along these dimensions.

This exercise was more than a field test of STREAM. MnDOT has a high level of experience and familiarity with specific technology alternatives, gained through studies performed by MnDOT following the collapse of the I-35W bridge over the Mississippi River. The research team therefore used the results from this exercise as a proxy for the results that would be produced by an expert STREAM panel under the institutional Board/panel. The results could then be used as

Table D-3. Base case assumptions for calculating fixed and recurring costs for candidate bridge inspection technology applications.

Assumption                                            Base Value
Percent Using Chain-drag                              10%
Bridges in Area of Responsibility                     20,419
Percent of Potential Bridges Using Alternative        5%
Bridges per Year                                      1,000
Average Inspection Labor, Chain-drag (man-hours)      40
Average Inspection Labor, Visual (man-hours)          8
Average Distance to Bridge (mi)                       100
Discount Rate                                         7.0%
Hourly Labor Cost                                     $20.00
Overhead Rate                                         50%
Burdened Labor Rate                                   $30.00
Cost per Mile                                         $0.56
People Trained per Year for 1,000 Bridges per Year    4
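To illustrate how assumptions like those in Table D-3 combine, the sketch below rolls the labor and travel assumptions into an annual recurring inspection cost. This is only a hedged illustration: the report's actual cost model is more detailed, and the round-trip travel assumption is ours, not the report's.

```python
# Base-case assumptions drawn from Table D-3
BRIDGES_PER_YEAR = 1_000   # bridges inspected per year
LABOR_HOURS_CD = 40        # man-hours per chain-drag inspection
BURDENED_RATE = 30.0       # $/hour ($20 labor plus 50% overhead)
AVG_DISTANCE_MI = 100      # average one-way distance to bridge
COST_PER_MILE = 0.56       # $/mile

# Labor cost plus travel cost (round trip assumed, an illustration
# only; the report does not state its mileage convention)
labor = BRIDGES_PER_YEAR * LABOR_HOURS_CD * BURDENED_RATE
travel = BRIDGES_PER_YEAR * 2 * AVG_DISTANCE_MI * COST_PER_MILE
annual_cost = labor + travel
print(f"${annual_cost:,.0f}")  # → $1,312,000
```

The burdened rate of $30/hour follows directly from the table: $20 hourly labor cost marked up by the 50% overhead rate.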
Table D-4. Fixed and recurring costs for candidate bridge deck inspection technology applications over five-year period following implementation.

Cost Factors                        Visual        Visual +      GPR           Fiber Optics   Sensor Suite   Sensor Suite on
                                                  Audible                     (New)          (GPR + IR)     Robotic Platform
Net Fixed Costs
  Acquisition (net value of
  redundant equipment)              $87,285       $87,285       $129,463      $4,404,041     $217,311       $471,246
  Taxes/Penalties/Fees (net TPF
  no longer required)               $0            $0            $0            $0             $0             $0
  Training (net training no
  longer required)                  $0            $0            $43,200       $10,000        $97,200        $97,200
  Licenses, Royalties, etc. (net)   $0            $0            $15,000       $6,186         $24,405        $61,518
Net Recurring Costs
  O&M (net O&M of equipment
  made redundant)                   $278,588      $278,588      $269,813      $278,588       $282,975       $282,975
  Training (net training no
  longer required)                  $52,647       $52,647      $115,822       $67,271        $194,792       $194,792
  Taxes/Penalties/Fees (net TPF
  no longer required)               $0            $0            $0            $0             $0             $0
  Licenses, Royalties, etc. (net)   $0            $0            $0            $0             $0             $0
  Personnel (net personnel
  made redundant)                   $1,491,652    $2,680,586    $2,024,867    $1,491,652     $2,249,852     $2,249,852
Total                               $1,910,172    $3,099,106    $2,598,165    $6,257,738     $3,066,535     $3,357,583
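Five-year cost totals of the kind shown in Table D-4 are typically built from one-time fixed costs plus recurring costs discounted at a rate such as the 7.0 percent in Table D-3. A minimal sketch follows; the function name, the year-end discounting convention, and the dollar figures are illustrative assumptions, and the report's own model may aggregate differently:

```python
def five_year_cost(fixed, annual_recurring, rate=0.07, years=5):
    """Net present cost: one-time fixed costs up front, plus recurring
    annual costs discounted at `rate` for each of `years` years
    (year-end convention assumed)."""
    discounted = sum(annual_recurring / (1 + rate) ** t
                     for t in range(1, years + 1))
    return fixed + discounted

# Hypothetical technology: $100,000 fixed, $50,000/year recurring
print(round(five_year_cost(100_000, 50_000)))  # → 305010
```

Note how discounting matters: five years of $50,000 sums to $250,000 undiscounted, but only about $205,000 in present-value terms at 7 percent.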
Figure D-1. Sample workshop input collection form.

Please note the extent to which you believe the given technology will improve preservation and safety.
(Scale: Little or None / Small / Large / Significant)
Visual; Visual + Audible; GPR; Fiber Optics; Sensor Suite (GPR + IR); Sensor Suite on Robotic Platform

Please note the extent to which you believe the given technology will improve mobility (by reducing congestion due to lane closure or major bridge repair).
(Scale: Little or None / Small / Large / Significant)
Visual; Visual + Audible; GPR; Fiber Optics; Sensor Suite (GPR + IR); Sensor Suite on Robotic Platform

Please assess how the costs of each technology relate to your agency.
(Scale: No net cost/savings / "Little" net cost / "Major" net cost / "Excessive" net cost)
Visual; Visual + Audible; GPR; Fiber Optics; Sensor Suite (GPR + IR); Sensor Suite on Robotic Platform

Please assess which of the following technical barriers apply to each technology:
• Unfamiliarity with core or applied technology
• Uncertainty concerning actual performance
• Additional implementation requirements (training, standards, etc.)
(One response per barrier for each of the six technologies listed above.)

Please mark:
__ if you believe the barrier does not apply
L if you believe the barrier is a small concern
M if you believe the barrier is a major concern
S if you believe the barrier is a "showstopper"
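Turning worksheet responses like those in Figure D-1 into plot coordinates requires a numeric encoding of the scale labels. The report does not specify the mapping it used, so the one below is purely a hypothetical illustration:

```python
# Hypothetical numeric encodings of the Figure D-1 scales;
# the report does not state the mapping actually used.
RATING = {"Little or None": 0, "Small": 1, "Large": 2, "Significant": 3}
BARRIER = {"_": 0.0, "L": 0.25, "M": 0.5, "S": 1.0}  # "showstopper" weighted heaviest

def mean_rating(replies):
    """Average a list of scale labels from several participants
    into a single numeric score for one technology/measure pair."""
    return sum(RATING[r] for r in replies) / len(replies)

replies = ["Small", "Large", "Large", "Significant"]
print(mean_rating(replies))  # → 2.0
```

The spread of the individual replies around such a mean is what the uncertainty bars in Figures D-2 through D-4 depict.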
input to workshops and meetings with other DOTs that focus on how an agency would receive and use such input from the Board-sponsored body.

Figure D-1. (Continued).

Please assess which of the following process and institutional barriers apply to each technology:
• Need for new or conflict with existing regulations and standards
• Non-fungibility of funding for required expenditures
• Extended or problematic approval processes
(One response per barrier for each of the six technologies: Visual; Visual + Audible; GPR; Fiber Optics; Sensor Suite (GPR + IR); Sensor Suite on Robotic Platform.)

Please assess which of the following external barriers apply to each technology:
• Inertia of existing processes and methods
• Insufficient political or public acceptance
• Lacking presence of necessary vendor or support base
(One response per barrier for each of the six technologies.)

Technology Comparison

Figure D-2 shows how the six alternatives compare in terms of potential benefits.75 The current approaches, visual inspection alone and visual inspection coupled with audible inspection, score the least well. Visual inspection coupled with audible inspection provides some additional benefit over visual inspection alone on the Preservation axis. Three alternatives (GPR, a suite of sensor technologies, and the roboticized platform version of the latter) are clustered together and outperform visual and audible inspection on both preservation and mobility. However, large uncertainties mean they are largely indistinguishable from one another.

75 The results shown are for nine of the ten worksheets because one was apparently filled out incorrectly on some responses. We therefore excluded all the responses from this participant.

Figure D-3, however, clarifies the choices faced by transportation agencies. The vertical axis captures the findings in Figure D-2, combining (multiplicatively) the two benefit measures of preservation and mobility. The horizontal axis plots the POSI and makes clearer the distinctions between technologies. The current standard techniques are precisely that because of the almost nonexistent barriers to their application and use. Both the GPR and mixed sensor suite alternatives provide greater benefit from use, but have commensurately higher barriers to adoption and implementation than current approaches. Although they represent two distinct points, both are approximately located on the highest tradeoff curve within the range of uncertainty that surrounds them. Though close, the relatively narrow uncertainty ranges give the edge to GPR in terms of higher POSI.

Figure D-4 provides a third view of the results. The benefit values on the vertical axis remain the same as in Figure D-3, but the horizontal axis plots the cost barriers associated with each technology. Technologies follow a similar pattern in terms of cost barriers as they do in terms of non-cost barriers in Figure D-3, though there is more dispersion along this axis than before. Fiber-optic systems and a robotic testing suite have serious cost implications when compared to the benefits they return. As expected, the present standards of visual and audible inspection have the least serious cost implications. GPR and a sensor suite represent an increase in costs relative
to this baseline. Again, GPR appears to have a slight edge over the sensor suite.

The research team shared and discussed these results with participants during Session 4, after data entry into the software designed to support the Compare step. The exercise had two favorable outcomes. First, the results seemed to capture the actual relationships among these technological alternatives as understood individually and collectively by this group. Second, even though not surprising, the results provided insights that were considered valuable. They provided a comprehensive means to represent disparate information in a manner that could be understood and could (and did) provoke additional rounds of usefully focused discussion. Participants recognized and valued that the purpose was not to have the computer offer a single "answer" deus ex machina, but rather to provide a platform to facilitate the weighing of alternatives by a group of technical staff, managers, and decision makers.

Summary

Several lessons emerged from the workshop at MnDOT. STREAM appeared to pass the first market test to which it was subjected. While far from a complete validation of the concepts and their application, it was informative that practitioners playing several managerial, planning, research, and operational roles in a transportation agency reacted largely favorably.

The research team also received valuable input on how to present STREAM to agencies and how to provide instruction in its operation. This includes describing how STREAM can facilitate decision-making about successive implementations of technology alternatives that will co-exist to some extent with existing means, how better to motivate the Frame step, and similar insights. It also became clear that there were unintended ambiguities, for example, in the layout of the questionnaire and visualizations, which the research team subsequently improved.
Figure D-2. Estimated relative benefit values for Preservation [Safety] and Mobility for six candidate bridge inspection technology applications based on results from a workshop at MnDOT. (Axes: Preservation, vertical, versus Mobility, horizontal, each on a 0-4 scale.)
Note: The vertical and horizontal bars show the range of estimates from the nine participants. The curves show the points of equivalent tradeoff between Preservation and Mobility assuming them to be weighted equally (i.e., equally important).
Key: Vsl = visual; V+e = visual + audible; GPR = ground penetrating radar; FO = fiber optics; Sui = sensor suite; Rob = sensor suite on robotic platform.

Figure D-3. Estimated total combined benefit values for Preservation [Safety] and Mobility compared to scores for POSI for six candidate bridge inspection technology applications based on results from a workshop at MnDOT. (Axes: Safety * Mobility, vertical, 0-16, versus Probability of Successful Implementation, horizontal, 0-4.)
Note: The vertical and horizontal bars show the range of estimates from the ten participants. The curves show the points of equivalent tradeoff between the combined Safety and Mobility benefit and the POSI.
Key: Vsl = visual; V+e = visual + audible; GPR = ground penetrating radar; FO = fiber optics; Sui = sensor suite; Rob = sensor suite on robotic platform.

Figure D-4. Estimated total combined benefit values for Preservation [Safety] and Mobility compared to measures of cost for six candidate bridge inspection technology applications based on results from a workshop at MnDOT. (Axes: Safety * Mobility, vertical, 0-16, versus Costs, horizontal, 0-4.)
Note: Costs are less of a barrier to adoption as one moves further from the origin.
Key: Vsl = visual; V+e = visual + audible; GPR = ground penetrating radar; FO = fiber optics; Sui = sensor suite; Rob = sensor suite on robotic platform.
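The vertical axis of Figures D-3 and D-4 combines the two benefit measures multiplicatively, and the tradeoff curves mark points of equivalent value. Both constructions can be sketched as follows. The scores are illustrative stand-ins, not the workshop data, and the product form of the tradeoff curves is our assumption, since the report does not give the curves' functional form:

```python
# Hypothetical benefit scores on the 0-4 workshop scale
# (not the actual MnDOT results)
scores = {
    "Vsl": {"preservation": 1.0, "mobility": 0.5},
    "GPR": {"preservation": 2.0, "mobility": 3.0},
    "FO":  {"preservation": 3.0, "mobility": 2.5},
}

# Combined benefit for the vertical axis of Figure D-3:
# Preservation multiplied by Mobility
combined = {k: v["preservation"] * v["mobility"] for k, v in scores.items()}

def iso_tradeoff_curve(benefit, posi, posi_grid):
    """Points (p, b) sharing the same product benefit * posi as the
    given point: one possible reading of the 'equivalent tradeoff'
    curves, assumed here for illustration."""
    level = benefit * posi
    return [(p, level / p) for p in posi_grid if p > 0]

print(combined["GPR"])  # → 6.0
print(iso_tradeoff_curve(combined["GPR"], 2.0, [1.0, 2.0, 3.0]))
# → [(1.0, 12.0), (2.0, 6.0), (3.0, 4.0)]
```

Under this reading, two technologies lying on the same curve trade benefit against implementability at equivalent overall value, which is how the text can say GPR and the sensor suite sit on "the highest tradeoff curve" while remaining distinct points.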