Chapter 3

STREAM: A Systematic Technology Reconnaissance, Evaluation, and Adoption Method

The findings presented in the previous chapters suggest the value of a process that overcomes some current barriers while acknowledging and accommodating the realities of others. STREAM is built on the seven principles set forth in Chapter 2. STREAM is

• An explicit description of a five-step process for technology assessment and decision making that outlines key decision points;
• An attempt to demystify as well as to provide a common framework and vocabulary for discussion, evaluation, and knowledge sharing within and among transportation agencies, and between transportation agencies and stakeholders;
• A best practice checklist and a framework for improving the quality of agency evaluation and adoption decisions;17 and
• A means of providing a forensic checklist for understanding better which steps or stages have proven to be obstacles.

There are two aspects to STREAM. First is the technical method itself. STREAM proceeds in five overarching steps, as shown in Figure 3-1. In this representation, STREAM is shown as a linear process for ease of explication; however, the flow of the assessment and evaluation processes in practice is not likely to be linear. It is likely to be more recursive and bi-directional, and successive phases will themselves cause a re-evaluation of what has gone before.18 Chapter 3 describes these steps briefly. A fuller discussion of STREAM, provided in Chapter 4, grounds the method in a proof-of-principle application: an examination of NDE technologies for bridge deck evaluation. Appendices B and C provide two additional applications of STREAM, first to a materials technology application and then to a problem of information gathering and communications.

The second element is how STREAM should be implemented: by whom, how, and when for each step. This is discussed in Chapter 5. The efforts required from an individual agency may not be equivalent along the full course of a STREAM analysis. In particular, the first three steps could lend themselves to formal collaboration among several agencies, by some collaborative institution, or to informal knowledge transfer among agencies. In each instance, the latter steps would require an agency either to take the initial findings of that joint or external effort and then particularize them by applying the characteristics of its own local situation, or to use its own output from applying STREAM itself and carrying out the latter steps. It is the "open architecture" design of STREAM, along with the standardization of approach, that makes this economy and efficiency of effort possible.

17 Analyses of successful instances of technology assessment and implementation in transportation might well disclose a pattern in which all or most of the STREAM steps had been followed. Illuminating the process should help this become the routine rather than the exception. DOTs and MPOs tacitly but widely regarded as technology "leader" agencies may in many cases have performed well in analyzing adoption decisions regarding technologies that are either mature or may become so within 5 years, for functions that they undertake and understand. This is less the case for technologies 5–20 years away and for how those technologies might reshape agencies' functions.

18 Several times when large corporations or other multi-level organizations were confronted with results from the introduction of new technologies that were less than expected, they found it necessary to examine in detail, often for the first time, their processes and methods. In this sense, the need to comprehend the possible result from new technology led to a systematic accounting of the function and context of the production process. Similarly, coming to some common methodological framework such as STREAM may mean that new technology assessment and introduction become the lens for a re-examination by transportation agencies of their own functions and their relationship to mission goals.
Step 1: FRAME the Problem and Specify Goals

The Frame step is the foundation on which the subsequent steps are based. In this step, decisionmakers make explicit the following:

• The agency functions for which alternative technologies are being considered;19
• The goals on which the functions bear; and
• The objectives and metrics by which each goal should be measured.

This helps establish a common understanding of the decision's objectives and criteria, both within an agency and for other agencies that may turn to the analysis for guidance.

The Frame step may appear simple and straightforward. That in itself is a problem, because in many ways it is both the most important step and the one most often missing from transportation sector technology analyses, or given only cursory treatment.20 The absence of explicit framing may contribute materially to many of the downstream difficulties that have previously been presented as barriers to the more rapid adoption of transportation-relevant technologies.

In keeping with the core philosophy embodied in STREAM, the Frame step originates from the demand side (i.e., understanding the needs of the agency).21 The most important result from this step will be the set of criteria against which any technological solution may be judged. For a single agency function, several transportation agency mission goals might be affected. For each, there should be a specific measure that can then be applied to assess different technologies in the later stages of the STREAM process.

There will be practical considerations. One challenge will be to identify both the function and context at an appropriate level of abstraction. To borrow from the example discussed extensively in the next chapter, NDE technologies are aimed at the inspection function but also affect the agency's role in maintaining infrastructure and perhaps even in better modeling.
The framing of particular functions and goals should lead to similar results, even if undertaken by different agencies.

Figure 3-1. The major steps in the STREAM process.

Frame
• What is the function that technologies are to affect?
• What is the agency context within which the function is carried out?
• What are the goals and metrics associated with that context?

Identify
• What technologies are or will be available to affect an agency's ability to perform a particular function?
• What is the maturity of these technologies and when are they likely to be available?

Characterize
• For each technology, how does it affect the agency's ability to meet the goals associated with that function?
• What are the costs of technology adoption?
• What are the drivers of or barriers to technology adoption?

Compare
• What are the tradeoffs between adopting a technology or bundle of technologies now or in the future?
• What are the likely outcomes, both direct and indirect, on the target function as well as other agency functions?

Decide
• What action should an agency take (monitor, shape, adopt, etc.) with respect to these technologies?

19 In this step it will be important to specify how the function is being served so that, in the next step, technology applications can be considered either as improvements within the current approach or as introducing a new approach.

20 Organizations are charged with identifying and disseminating best practice or more widely publicizing the results of trials performed at DOTs or the federal level. Although these organizations provide a valuable service, they do not routinely incorporate the explicit contextual discussion intended with the Frame step.

21 This step could be part of a grand, comprehensive assessment of transportation agency missions, functions, and capabilities. That would be a massive undertaking. Something similar is conducted annually by TRADOC, the U.S. Army's Training and Doctrine Command. Rather, the research team envisions activities resembling the assessments of capability gaps and potential technology solutions performed by the Centers of Excellence (see National Institute of Justice, 2011, pp. 93–94, available online at https://www.justnet.org/pdf/NLECTC-System-2011-Annual-Report-508-compliant.pdf).
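The five steps and their guiding questions in Figure 3-1 can be captured as plain data, for instance as a checklist template that an agency might fill in when documenting an assessment. The encoding below is purely illustrative: the step names and questions are taken (some abridged) from Figure 3-1, but the data structure and the `checklist` helper are not part of the report.

```python
# Illustrative encoding of Figure 3-1: the five STREAM steps and the
# guiding questions associated with each (questions abridged from the figure).
STREAM_STEPS = [
    ("Frame", [
        "What is the function that technologies are to affect?",
        "What is the agency context within which the function is carried out?",
        "What are the goals and metrics associated with that context?",
    ]),
    ("Identify", [
        "What technologies are or will be available to affect the function?",
        "What is their maturity and when are they likely to be available?",
    ]),
    ("Characterize", [
        "How does each technology affect the agency's ability to meet goals?",
        "What are the costs of adoption?",
        "What are the drivers of or barriers to adoption?",
    ]),
    ("Compare", [
        "What are the tradeoffs of adopting now versus in the future?",
        "What are the likely direct and indirect outcomes?",
    ]),
    ("Decide", [
        "What action (monitor, shape, adopt, etc.) should the agency take?",
    ]),
]

def checklist(steps=STREAM_STEPS):
    """Render the steps as a plain-text checklist an assessor can fill in."""
    lines = []
    for name, questions in steps:
        lines.append(name.upper())
        lines.extend(f"  [ ] {q}" for q in questions)
    return "\n".join(lines)
```

A shared template of this kind is one simple way the common framework and vocabulary described above could be made concrete across agencies.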
Agencies may perform the functions of surface transportation stewardship slightly differently, but the existence of a document such as the AASHTO Bridge Inspection Manual (and other widely used standard documents, such as the Highway Capacity Manual, for other routine functions) suggests that agencies are performing the same essential functions with similar mechanisms and for similar purposes. Thus, interagency collaboration at this stage of STREAM may be beneficial, in contrast to having each agency frame the problem in a manner that masks this fundamental similarity.

Step 2: IDENTIFY Potentially Appropriate Technology Applications

The Identify step is a comprehensive screening process to determine which technologies are within or beyond the scope of the decision at hand. This step identifies technologies, or differing applications of technology, that are available, will become available, or prospectively could be available to affect an agency's ability to perform the functions noted in the Frame step.

Identifying appropriate technologies requires some characterization of each of them, including a description of

• The technology itself (i.e., what it is and how it works);
• How it could improve on the current approach to the agency's function, or introduce a new approach; and
• The literature that establishes performance and maturity, with references.

The initial description begins the process of fully characterizing technologies, the focus of the next step.

Step 3: CHARACTERIZE Alternative Technology Applications

This step builds on the first two to provide quantitative and qualitative assessments of key characteristics of each technology and how each may affect agency functions and goals. This step also involves selecting those technologies or technology bundles that will be compared in the next step.

Whereas the Frame step is the most challenging conceptually, this is in many ways the most technically challenging step.
It requires assessing key characteristics of technologies (e.g., how well they work, risks, costs, and benefits). This process is likely to bear largely common results for different agencies. If conducted in a formally collaborative manner or with informal communications between agencies, the results are also likely to be more systematic across such technology application assessments.

Again, an aspect of STREAM that may at first appear obvious or even simplistic is nonetheless crucial: characterizations will be in terms relevant to the agency's functions and decision process rather than in terms of more technology-specific attributes. Technology can be characterized according to the following dimensions:

• The effects of the technology on agency functions and their resulting influence on (all) agency mission goals,
• Performance requirements to achieve the overarching function and goals,
• Net costs of adopting the technology, and
• Potential barriers to the successful implementation of the technology.

These quantitative and qualitative characterizations will require subject matter expertise and should be made based on the best available data on technology development and demonstrated performance, taking into account the current state of practice of DOTs and MPOs in carrying out the functions in question. That notwithstanding, the goal is not necessarily to achieve consensus in these characterizations. For both current and potential technologies, there are often legitimate, irreducible uncertainties about their impacts, costs, and barriers to implementation. STREAM helps make these explicit and useful in comparing technologies.
STREAM also accounts for the time dimension, allowing comparisons between technologies with different expected maturities while supporting comparison among technologies expected to be of the same vintage.22 This last feature enables an agency to make a single evaluation of technologies that are ready for implementation as well as more prospective technologies. STREAM would tend to rate the latter less highly (largely because of the inherently greater uncertainty, as will be seen later), but it would allow agencies to make better-informed decisions about when and to what degree to commit to a given technological course in light of possible future developments.

22 An assessment could consider a technology in the 0–5 year time frame along with those expected to appear only in the 5–10 year time frame, or could use the same approach to look only at technologies that would fall into the same (e.g., 5–10 year or later) period of envisioned implementation.

Characterizing Effects on Agency Missions

The potential benefits to agencies from successful implementation of a given technology would be framed in terms of the goal-based measures developed in the Frame step. The relationship between agency functions and mission goals, and the potential for achieving more favorable outcomes by enhancing the capacity to perform these functions, are characterized in detail. For each goal, a scale of measurable benefit
is applied, and each candidate technology is rated according to that scale.

The ranges of informed opinion on each potential benefit are retained at this point. Too-early consensus, especially in the absence of data that could conclusively dispose of uncertainty and ambiguity, would actually waste the information currently available. Rather, ambiguous information and divergent opinions will be retained in the STREAM analysis and become an important aspect of the later step involving detailed comparisons among technological alternatives.

Characterizing Barriers to Successful Implementation

The next task of characterization is to weigh and evaluate things that could go wrong, especially those that go beyond the purely technical. For this purpose, the Characterize step requires developing for each alternative a composite measure, or score, that represents the probability of successful implementation (POSI).

The research team developed the POSI score by synthesizing the information received from the interviews, the literature, and the empirical case study work conducted by University of California, Berkeley, researchers, as well as input from staff at Berkeley PATH. The STREAM procedure for determining POSI consists of looking at three major categories of impediments: those that arise from

1. The technology itself,
2. Process or institutional issues at the level of or within the agency, and
3. External concerns.

Each of these three categories of impediments is judged to be of

• Negligible or no concern,
• Small concern,
• Major concern, or
• Significant concern.

A small concern is one that can be dealt with relatively easily as judged in local terms; a major concern is one that would require actions that could be difficult or challenging to carry out; and a significant concern is one that would require action that might not be possible or practicable. These definitions are deliberately loose.
Those conducting a STREAM assessment would provide their reasoning behind such assessments. Assessments carried out on a collaborative basis or by another agency will provide a rationale behind the POSI component scores. These then may be modified or accepted by other transportation agencies in light of local circumstances.

For each large category, there are three specific barriers, as shown in Table 3-1. Each one would be subject to quantitative or qualitative assessment. STREAM was designed to allow for simplification where this would not lead to serious misjudgment. Though a limited list, these nine impediments account for most obstacles that arise.

Technologies of different vintages would likely be scored differently from those currently available. Current technologies would presumably present fewer concerns about several of the barriers because agencies are more familiar with them and processes may be in place to support their implementation. For the same reason, there may be less uncertainty about barriers to implementing currently available technology. This is one way that STREAM allows technologies of dissimilar maturity to be assessed within the same analysis. It also provides a convenient entry point for updating as more information becomes available.

Once these assessments are made, a single POSI value is assigned to the technology according to the scale shown in Table 3-2. The conditions in Table 3-2 are not necessarily the only ones on which the POSI score could be assigned.23

Table 3-1. Sources of impediments that reduce POSI.

Technology:
• Unfamiliarity with core or applied technology
• Uncertainty concerning actual performance
• Additional implementation requirements (training, standards, etc.)

Agency Process or Institutions:
• Need for new or conflict with existing regulations or standards
• Non-fungibility of funding for required expenditures
• Extended or problematic approval processes

External to Agency:
• Inertia of existing processes and methods
• Insufficient political or public acceptance
• Lacking presence of necessary vendor or support base

23 The approach displayed in the table was arrived at after considerable theoretical and empirical discussion that considered a wide range of alternative approaches. It appears to meet the design criteria sought by the research team, particularly the balance between explicit expression of underlying assumptions and simplicity of application. For these reasons, the research team chose to use four-level measures at several points in the STREAM process. This simplification did not appear to constrict the ability to use sophisticated reasoning in the actual valuation assignment process while allowing for considerable generality.
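The nine impediment sources of Table 3-1 and the four concern levels lend themselves to a simple checklist structure. In the sketch below, the barrier and level text is transcribed from the report, while the data layout and the `rate_barrier` helper are illustrative conveniences, not part of STREAM itself.

```python
# The four concern levels used throughout STREAM's four-level measures.
CONCERN_LEVELS = ("Negligible or no concern", "Small concern",
                  "Major concern", "Significant concern")

# The nine specific barriers of Table 3-1, grouped by category of impediment.
POSI_BARRIERS = {
    "Technology": [
        "Unfamiliarity with core or applied technology",
        "Uncertainty concerning actual performance",
        "Additional implementation requirements (training, standards, etc.)",
    ],
    "Agency Process or Institutions": [
        "Need for new or conflict with existing regulations or standards",
        "Non-fungibility of funding for required expenditures",
        "Extended or problematic approval processes",
    ],
    "External to Agency": [
        "Inertia of existing processes and methods",
        "Insufficient political or public acceptance",
        "Lacking presence of necessary vendor or support base",
    ],
}

def rate_barrier(level, rationale):
    """Record one barrier rating together with the assessor's reasoning,
    which STREAM asks assessors to provide alongside each assessment."""
    if level not in CONCERN_LEVELS:
        raise ValueError(f"unknown concern level: {level!r}")
    return {"level": level, "rationale": rationale}
```

Recording the rationale next to each rating is what lets another agency later modify or accept the component scores in light of its own local circumstances.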
The advantage is that this allows a single metric to be constructed from what otherwise would be a heterogeneous set of potential obstacles, some of which might be quantifiable in natural unit terms while others could only reasonably be assigned a qualitative measure. The three examples of STREAM in application (in Chapter 4 and Appendices B and C) demonstrate the flexibility inherent in this approach to POSI scoring.

Characterizing Costs

Several aspects of cost shaped the approach to incorporating the cost factor into STREAM:

• Costs must be considered apart from barriers because they are fundamentally different.
• Costs should be calculated on a net basis.
• Local context is of great concern in measuring costs.

A simple approach to cost is necessary but difficult to achieve. The STREAM design team concluded that an approach that places cost information at the disposal of a transportation agency when considering technology adoption alternatives, but that does so in a simple, tractable manner, is most appropriate.25

Complicating Features of Cost

Costs are not simply another potential barrier to adoption, and they cannot just be made another component of POSI. The probabilistic approach used to construct the POSI score cannot be applied to cost. A key feature of the POSI score is that the larger the number or severity of barriers, the less chance for successful implementation. This is not true for costs. Although a less expensive alternative is preferable to a more expensive one when all else is held constant, it is rarely the case that such a direct comparison can be made. Sometimes the more expensive alternative might be considered desirable if it yields a higher probability of success, a greater expected benefit, or both. Thus, costs need to be considered on their own.

All costs should be calculated on a net basis.
So, for example, costs of acquisition, personnel, and maintenance should be net of the value of redundant equipment that could be sold on the secondary market, of personnel and maintenance that otherwise would have been required, and so forth. This would include consideration of tasks that no longer need to be performed as a result of adopting an alternative technology under consideration. Further, one-time costs required at the time of adoption should be distinguished from recurring costs that will be required during the effective life of the adopted technology.

Local issues are of great concern in measuring costs. We have previously spoken of the problem of fungibility among sources of funds, which we have accounted for in the POSI score. Budgets may differ in the extent to which either category of cost, fixed or recurring, might prove a problem at the local level. Cost also has both absolute and relative dimensions. Although a particular capital investment may cost $1M no matter where it is purchased, one DOT may require ten units while another requires one hundred. There is also a local, relative dimension to the consideration of costs that is particular to each DOT. The $10M required of a smaller DOT may be viewed locally as a greater challenge than the $100M required of a large DOT, or vice versa.

Cost Characterization Method

Given these considerations, cost considerations in STREAM are separate from but similar to the POSI measure. The characterization of cost within STREAM, shown in Table 3-3, is similar to the one the research team designed for assessing the constituent factors of the POSI score. It divides costs into fixed and recurring costs and lists the different potential categories of such costs.

Table 3-2. POSI evaluation for a technology based on level of expected impediments.

POSI score level 4: Number of Major concerns = 0
POSI score level 3: Does not meet the criteria for POSI score level 1, 2, or 4
POSI score level 2: Number of Significant concerns = 1,24 or number of Major concerns > 2
POSI score level 1: Number of Significant concerns > 1

24 While it might appear counter-intuitive that any technology with a Significant concern should receive a POSI score other than 1, this classification scheme accounts for possible uncertainty in evaluation. It is probably also best to think of the difference between a Significant and a Major concern as more of a gradient than a strict division. Therefore, and perhaps over a longer period of time, it would seem that if there is only one Significant concern it could well become a focus of greater attention, which might result in accommodations allowing it to be re-categorized as a Major concern.

25 Costs could be rigorously traded off against benefits in a manner consistent with decision theory, but only through significant data gathering and complicated mathematical algorithms made even more complex, if not insoluble, by the presence of uncertainty. In keeping with the philosophy behind STREAM, the research team developed a more straightforward way to bring cost into the realm of agency decision making.
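Read as a scoring rule, Table 3-2 can be sketched in a few lines. The precedence used here (checking the level 1 and level 2 conditions before the level 4 condition, so that a single Significant concern yields a score of 2 even when there are no Major concerns, consistent with footnote 24) is our reading of the table rather than an algorithm stated in the report.

```python
def posi_score(concern_levels):
    """Assign a POSI value per Table 3-2, given the list of per-barrier
    concern ratings ("Negligible", "Small", "Major", or "Significant")."""
    significant = concern_levels.count("Significant")
    major = concern_levels.count("Major")
    if significant > 1:
        return 1  # more than one Significant concern
    if significant == 1 or major > 2:
        return 2  # one Significant concern, or more than two Major concerns
    if major == 0:
        return 4  # no Major (and no Significant) concerns
    return 3      # one or two Major concerns, none Significant
```

For example, a technology rated no worse than Small on every barrier scores 4, while one with three Major concerns scores 2.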
The range of estimates for the discounted net cost produced by each specific cost factor will, in sum, determine the range of total fixed and recurring costs for the candidate technology being evaluated.26 The output must state how the scale of deployment (e.g., the necessity to purchase one versus 100 units) affects cost, so that in future iterations (e.g., one that would require acquisition of 50 units) the analysts can also calculate costs. This scaling can help other transportation agencies use the same calculation method for their own needs. In addition, the factor used to discount future costs back to the present must be made explicit. This, too, is something that will vary given the situation of each transportation agency and so must be amenable to being tailored.27

The result of this process is a calculation of estimated discounted net cost associated with each technology alternative, by component of cost as well as in aggregate. However, this is not applied as a raw score directly into STREAM, because the effects of costs vary among agencies. Instead, each transportation agency assesses what this balance of cost means for it. One possible approach to doing so is illustrated in Table 3-4.

This assessment carries some subjectivity, and the meanings associated with each score level are designed as relative terms: what may be "Little" net cost to one agency may, because of various factors, be seen as "Major" cost by another. These weightings can be subject to different opinions and assessments and so, as with both POSI and the calculation of benefit, would be amenable to a Delphi-type treatment in which the assessments of individual assessors can be aggregated and uncertainty ranges disclosed.

Several issues associated with costs are not accounted for in the quantitatively based assessment of net cost presented in Table 3-4. Different technology alternatives may involve a shifting of cost centers either within or between organizations.
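The discounted net cost arithmetic described above can be sketched as follows. The report prescribes only that net one-time and net recurring costs be distinguished, that the discount factor be made explicit, and that scale effects be stated; the constant-recurring-cost, end-of-year, linear-scaling assumptions below are simplifications introduced for this illustration.

```python
def discounted_net_cost(one_time_net, recurring_net, years, rate, units=1):
    """Estimated discounted net cost of a technology alternative.

    one_time_net  -- net one-time cost per unit at adoption (acquisition,
                     taxes/fees, licenses, net of redundant equipment value)
    recurring_net -- net recurring cost per unit per year (O&M, training,
                     personnel, etc., net of costs no longer incurred)
    years         -- effective life of the adopted technology
    rate          -- agency-specific discount rate, made explicit
    units         -- scale of deployment; linear scaling is assumed here
    """
    # Present value of recurring costs, assumed incurred at each year end.
    recurring_npv = sum(recurring_net / (1 + rate) ** t
                        for t in range(1, years + 1))
    return units * (one_time_net + recurring_npv)
```

An agency would then map the resulting range of estimates onto the relative cost score levels of Table 3-4 in light of its own budget situation.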
This shifting of cost centers is related to but distinct from the question of differential fungibility discussed above. These issues provide additional reasons for rejecting a purely mathematical treatment of costs in relation to benefits. The goal is not to provide a deus ex machina that will automatically derive a solution to technology adoption decisions by agencies, but to create a framework that will fully account for and present actual decisionmakers with the factors necessary for a decision while also accounting for the nuances that will affect that decision.

Table 3-3. Factors affecting fixed and recurring cost and discounted net cost.

Net One-Time Costs:
• Acquisition (net the value of redundant equipment)
• Taxes/Penalties/Fees (net TPF no longer required)
• Licenses, Royalties, etc. (net)

Net Recurring Costs:
• O&M (net O&M costs of equipment made redundant)
• Training (net training no longer required)
• Taxes/Penalties/Fees (net TPF no longer required)
• Licenses, Royalties, etc. (net)
• Personnel (net personnel made redundant)

26 Taxes, penalties, and fees, as well as licenses and royalties, appear as both fixed and recurring costs. This is to allow for the possibility of both one-time and recurring charges being required for a particular technology being assessed.

27 The discount rate will determine the extent to which the benefit or burden between generations will be shared. Discount rates may differ because of differences in philosophical approach, local law or regulation, or local financial considerations.

Table 3-4. Cost evaluation for a technology based on agency-specific considerations of cost.

Cost score level 4: No net cost, or net cost savings
Cost score level 3: "Little" net cost
Cost score level 2: "Major" net cost
Cost score level 1: "Excessive" cost

Step 4: COMPARE Technology Alternatives and Tradeoffs

This step uses a series of visualizations and other presentation techniques to compare key characteristics of each technology, which may have differential effects on agency functions and goals.28

28 During this study, the research team developed an Excel-based software tool for use in the Compare step to generate figures that plot the characterization data developed in the previous step.

Decisionmakers can use these visualizations,
together with the context in which the technology applications are to be applied, to examine the tradeoffs between alternative technology applications. This step is discussed in greater detail in the next chapter, in which an in-depth treatment is presented by application to a specific function performed by transportation agencies.

These visualizations offer a consistent approach for comparison, as opposed to requiring each agency to determine what techniques should be used for evaluating technology options. With wide use of a common approach, the experience of one agency can be made more meaningful to another, and collaborative assessments can also be produced. It then would fall to the agencies to carry out the comparison, but with greater hope of consistency, foundation in a wider range of goals, and more definitive results with greater integration into existing ongoing processes.

Step 5: DECIDE: Adopt, Shape, Monitor, Research

In this final step, decisionmakers determine what the agency's response should be to technological opportunities for enhancing agency functions and the ability to achieve mission goals. The decision framework takes into account the tradeoffs described in the previous step and the local context and constraints. The purpose of STREAM is to present a plausible basis for making these decisions by creating a detailed guide to which an agency may then refer.
Transportation agencies may adopt different postures when presented with technological alternatives:

• Ignore such an opportunity entirely;
• Monitor developments and the accumulation of more information;
• Engage in research to generate information about one or more technological options;
• Engage in various shaping activities to influence the pace and direction of development of promising technologies; or
• Move forward with adoption and use within their own agency.

Given that the ultimate decision rests with the individual transportation agencies themselves, agencies require a sound basis for their decisions. Agencies also require an understanding of the choices available to them. In some cases, the decision is simply whether or not to adopt a particular technology application. In many cases, the combination of differences in technologies, different applications that embody those technologies, differences in likely dates of maturity among technologies, and the differing specific characteristics of technologies will suggest a richer set of possible choices. A decision to monitor developments would be warranted by sufficient uncertainty about what the eventual characteristics of a rapidly developing technology might prove to be. In addition, agencies can, either individually or collectively, seek to shape the course of development. This step therefore requires sufficient preparation through the preceding steps, as well as a means to determine the stance best taken given the circumstances of each agency.
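STREAM deliberately leaves the final choice of posture to each agency rather than prescribing a decision rule. Purely as an illustration of how the characterizations from the earlier steps might inform that choice, one hypothetical mapping is sketched below; the posture names come from the list above, but every threshold is invented for this sketch and is not part of STREAM.

```python
def suggest_posture(posi, cost_score, benefit, high_uncertainty):
    """Hypothetical posture suggestion from Step 3 and 4 outputs.

    posi, cost_score, benefit -- four-level scores (4 = most favorable)
    high_uncertainty -- whether a technology's eventual characteristics
                        remain very unclear
    All thresholds below are invented for illustration; STREAM leaves
    this judgment to the individual agency.
    """
    if benefit <= 1:
        return "ignore"    # little bearing on agency mission goals
    if high_uncertainty:
        return "monitor"   # await the accumulation of more information
    if posi <= 2 and benefit >= 3:
        return "shape"     # promising but impeded: influence development
    if posi >= 3 and cost_score >= 3:
        return "adopt"
    return "research"      # generate information before committing
```

In practice, an agency would weigh these factors against the local context and constraints described above rather than applying any fixed rule.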