Chapter 4 - Tool Implementation Playbook

4.1 Applying the Tool for Agency Decision Making

State DOTs and MPOs can apply the framework and use the tool prototype in a variety of ways, summarized here as conducting analyses and producing critical inputs for at least four key agency activities:

• Planning: Providing analytical support for development of SLRTPs, metropolitan transportation plans (MTPs), comprehensive freight plans, TAMPs, and other planning activities (e.g., medium-range plans and alternatives analyses);
• Programming: Evaluating and prioritizing projects to support development of STIPs, MPO transportation improvement programs (TIPs), and other budgeting and project evaluation/selection activities;
• Strategic activities: Providing analysis, information, and documentation to support strategic efforts such as the development of revenue initiatives or a special program of projects; and
• Communications and public involvement: In conjunction with the previous activities or as stand-alone efforts, using the tool prototype to engage stakeholders and the public in policy development and investment analysis and to provide documentation and transparency for agency decision making.

The following sections provide an overview of how the various tool prototype functionalities can be applied to support agencies in addressing one or more of these activities, along with a road map agencies can use to pursue these applications.

4.2 Example Applications and Use Cases

Agency application of the framework can be customized to the unique interests and needs of individual agencies with respect to the four activities listed previously and used to support the following agency planning and policy development functions:

1. Overarching project prioritization,
2. Program-level analysis,
3. Project-level analysis,
4. Performance analysis and target setting,
5. Scenario analysis,
6. Establishing relative priorities, and
7. Risk analysis.

4.2.1 Overarching Project Prioritization

At the most basic level, the prototype is a project prioritization tool that enables agencies to comprehensively rank projects and optimize their selection given unique program areas, performance measures/targets, priorities (i.e., weighting of performance measures), needs (i.e., candidate projects), resources, and policies. Agencies can apply this capability to a wide range of functions, including:

• Prioritizing projects for inclusion in SLRTPs (if project-specific) and MTPs;
• Prioritizing projects for inclusion in STIPs and TIPs; and
• Providing a tool to educate stakeholders and the public about project selection processes (or even engage them in the process).

Example Application: A state DOT that wants to develop a project-specific long-range plan can use the framework and tool prototype to conduct analysis and policy development at the front end of its planning process to identify a long list of candidate projects; define/revise goals, program categories, and performance measures; and evaluate minimal investment levels corresponding to system performance targets, establish achievable system performance targets, and set relative priorities. The tool prototype could then be employed to evaluate and rank projects to inform development of a final list of selected projects to be included in the plan. The tool could also be used for a top-down analysis, where trade-off curves established by management systems are used to develop budgets as inputs into the initial project development phase.

4.2.2 Program-Level Analysis

In a more targeted application than the one described previously, agencies could use the tool prototype to evaluate and rank projects within specific program areas such as mobility, pavement, bridge, safety, operations, and economic development. Associated uses may include:

• Prioritizing projects in program areas where management systems are weak or do not exist,
• Developing a list of preferred projects to be funded through a targeted revenue initiative,
• Assessing the performance trade-offs associated with different sets of priorities (i.e., changing criteria weighting) or program funding levels, and
• Providing documentation and transparency for how a given set of projects was selected.

Example Application: An agency has good pavement and bridge management systems that support prioritization of preservation projects but lacks a meaningful and defensible approach for selecting mobility projects that are funded through a dedicated funding source. The tool prototype can be used to provide a defensible mechanism for informing the selection of mobility projects and to enable the agency to respond to inquiries about what projects would get funded at different program spending levels. Data to be analyzed may include qualitative data to support regional development or to identify a project by location, for example, an expansion project on the primary freight network.

4.2.3 Project-Level Analysis

The tool prototype can be applied to conduct analyses of individual projects and programs of projects to support decisions about investing in them.
The following are some of the associated analyses an agency could use the tool to perform:

• Assess the system performance trade-offs between two individual projects,
• Assess the system performance trade-offs between two portfolios of projects, and
• Assess the merits of a specific project or a portfolio of projects relative to all other candidate projects and evaluate the implications of advancing or deferring the project or projects.

These analyses can better inform stakeholders and the public by comparing the merits of preferred projects to other candidate projects and by showing the impacts advancing these projects would have on overall system performance.

Example Application: An agency receives a politically based request (e.g., from the governor) to advance at least one of two projects not currently programmed. The tool prototype could be applied to evaluate how each project ranks relative to projects in the program and other candidate projects that have not yet been selected for implementation. The tool prototype could also be used to quantify the system performance implications of implementing a predetermined project or set of projects in lieu of one selected purely on the basis of investment optimization. The findings of the analysis could be used to (1) support one project over the other, (2) justify advancing the project(s) (e.g., if the project is not that far out of the money or if reprioritization would have minor performance impacts), or (3) provide documentation to support opposing the request (e.g., if the desired projects have relatively little merit or advancing them would have significant performance implications).

4.2.4 Performance Analysis and Target Setting

The prototype can also serve as a tool to conduct performance analyses across multiple investment categories. Potential applications include:

• Forecasting the overall system performance and other consequences that would occur at different investment levels (e.g., baseline, high, and low), for either planning or programming horizons;
• Assessing the overall system performance trade-offs associated with achieving different performance standards for individual goal areas;
• Helping stakeholders understand what performance targets are attainable, both individually and in concert with system-wide performance for other goal areas; and
• Providing performance-based documentation and justification for plan and program decisions.

Example Application: A state legislature is considering a statewide transportation funding initiative and has created a blue-ribbon panel to develop recommendations for the size of the revenue package and how the new money should be used. To help inform the panel's recommendations, the state DOT and the state's MPOs could use the tool prototype to forecast the system performance that could be achieved at different funding levels. Similarly, the tool prototype could be used to quantify and articulate the impacts of a proposed funding cut to legislators, other stakeholders, and the public.

4.2.5 Scenario Analysis

The tool prototype can be used to conduct scenario analyses in support of performance-based planning and programming. Potential applications include:

• Assessing system performance and resource allocation at different investment levels,
• Determining the minimum investment/resource allocation required to achieve specific performance targets,
• Developing and analyzing scenarios based on specific constraints (e.g., prescribed resource allocations), and
• Enabling stakeholders to develop and evaluate different investment strategies.

Example Application: As a state develops a long-range transportation plan, it becomes apparent that senior management and stakeholders have widely divergent views about what the state's future highway investment priorities should be.
Moreover, these views tend to be based on narrow perspectives of what is important and how investment decisions can affect other performance areas. The tool prototype could be employed to develop scenarios that emphasize different program areas to help inform stakeholders about the need for balanced investment.

Similarly, a state could use the tool prototype to support top-down development of its TAMP by helping officials look across asset classes to determine where they want each class to fall on cost-versus-performance curves and allocate resources accordingly. These allocations could then serve as inputs to agency management systems.

4.2.6 Establishing Relative Priorities

The AHP contained in the tool's weighting module provides a means to help agency officials and stakeholders understand how their views of relative priorities translate into program area weights and, ultimately, the corresponding resource allocations. Potential applications of this capability include:

• Generating inputs to resource allocation decisions/guidance for statewide and metropolitan long-range plans, STIPs/TIPs, and other budgeting activities,
• Identifying parameters for developing alternative investment strategies, and
• Providing a tool to engage the public in discussions about how agencies can balance competing priorities.

Example Application: A state DOT wishes to actively engage its stakeholders in development of its long-range plan's resource allocation guidance. To do so, the agency conducts a series of stakeholder workshops that use the tool prototype's AHP weighting module to help participants reach consensus on relative priorities and associated resource allocation options. These findings are then used to inform the development and analysis of alternative investment scenarios.

4.2.7 Risk Analysis

Uncertainty surrounds transportation decision making. Whether the uncertainty pertains to cost estimates, budgets, asset deterioration, or project performance impacts, agencies are at risk of delivering programs with outcomes that differ from original estimates. This realization led MAP-21 to call for risk-based transportation asset management plans. The tool prototype provides an example of how uncertainty could be accommodated in a decision-making framework by:

• Evaluating performance impacts for plus-or-minus changes in available budget,
• Building probability distributions around performance outcomes given the estimated standard deviation of a project's impact, and
• Supporting exploratory sensitivity testing of outcomes under varying inputs.

Example Application: A state DOT is asked how confident it is that it can achieve MAP-21 targets for bridge conditions in light of possible budget cuts over the next 10 years. To answer this, the agency first identifies bridges that are on the bubble, which, while not expected to become deficient, are at least questionable within the time horizon. Second, the agency assesses the uncertainty surrounding the efficacy of selected activities to correct or prevent structural deficiency. Simulating the likelihood that unplanned-for bridges become deficient and the likelihood of successfully getting the planned-for bridges out of the deficient state, the agency runs an optimization process multiple times at the reduced funding level. The findings of these analyses are used to provide the stakeholder with a statistical level of confidence, expressed as the percentage of runs in which the target was reached despite the uncertainties.
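A minimal sketch of the simulation logic behind this kind of confidence estimate follows. It is illustrative only and is not drawn from the tool prototype: it collapses the repeated optimization runs into fixed success probabilities, and every bridge count, probability, and target value is a hypothetical placeholder.

```python
import random

# Hypothetical inputs (placeholders, not data from the tool prototype).
N_RUNS = 5000
TARGET_MAX_DEFICIENT = 120     # target: no more than 120 deficient bridges at the horizon
currently_deficient = 200      # deficient bridges at the start of the horizon
planned_fixes = 150            # bridges programmed for corrective work at the reduced budget
p_fix_success = 0.90           # chance a programmed fix gets/keeps a bridge out of deficiency
on_bubble_bridges = 300        # bridges not expected to become deficient, but questionable
p_unplanned_deficient = 0.15   # chance an "on the bubble" bridge becomes deficient anyway

hits = 0
for _ in range(N_RUNS):
    # Programmed fixes that fail leave those bridges deficient.
    failed_fixes = sum(random.random() > p_fix_success for _ in range(planned_fixes))
    # "On the bubble" bridges that deteriorate despite not being programmed.
    new_deficient = sum(random.random() < p_unplanned_deficient
                        for _ in range(on_bubble_bridges))
    deficient_at_horizon = (currently_deficient
                            - (planned_fixes - failed_fixes)
                            + new_deficient)
    if deficient_at_horizon <= TARGET_MAX_DEFICIENT:
        hits += 1

# Confidence is reported as the share of simulation runs that met the target.
confidence = hits / N_RUNS
print(f"Estimated confidence of meeting the target: {confidence:.0%}")
```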
4.3 Getting Started—Self-Assessment

Agencies interested in exploring application of the framework and tool prototype may wish to begin by considering the questions and issues identified in Table 3. The table content builds on the findings of the initial project literature review on resource allocation practices and is meant to serve as a guide to help agencies (1) determine the highest-value uses for the tool given an agency's unique circumstances, (2) assess its level of readiness, and (3) identify the need to develop any resources or conduct preliminary research and analysis to support application of the tool.

Table 3. Tool prototype self-assessment questions, issues, and considerations.

Intended Applications – What planning processes do you wish to support and what decisions do you want to inform through use of the tool prototype?
Considerations: Intended applications will drive both the data needed to support the tool prototype and the processes that can be used to apply it. To the extent possible, agencies will want to integrate the use of the framework and tool prototype at the beginning of planning and decision-making processes. Applications will benefit from well-thought-out development of analytical scenarios (e.g., baseline, high, and low funding).

Time Horizon – What is the time frame for applications?
Considerations: Running the tool prototype over different planning horizons will likely lead to different results. The ability to apply different planning horizons will be influenced by the specificity of project data. Optimizing projects over an LRTP horizon should incorporate windows of opportunity into the optimization for a long-range, multiyear analysis period.

Strategic Frameworks – Has your agency established adequate system goals, objectives, and performance measures needed to drive the cross-asset allocation analysis?
Considerations: Goals and objectives drive configuration of tool prototype program areas and are essential to the NCHRP Project 08-91 framework. Efforts to establish relative priorities among goals and objectives can help influence tool weightings. MAP-21 goals and performance measures can/should also be integrated into the tool.

Program Structure – How well do program areas (i.e., funding categories) align with goals and objectives?
Considerations: The better the alignment, the more the tool prototype will be able to show the relationship between recommended resource allocations and achievement of agency goals.

Performance Targets – Have performance targets been established for key goal areas?
Considerations: Agencies will need to enter performance targets into the tool, or agencies can use the decision science processes to identify preference-driven targets, as long as they meet MAP-21 and state criteria.

Candidate Projects – Is there an existing list of candidate projects to be evaluated? If not, are there plans for developing the lists or systems and processes in place to support development of a list?
Considerations: The level of candidate project information will influence tool prototype analytical capabilities. A top-down analysis can be implemented as a first step to performance-based budgeting and cross-asset resource allocation.

Stakeholder Roles – Do you intend to use the tool prototype as a stakeholder or public engagement tool?
Considerations: Agencies will need to determine if they are using the tool to educate stakeholders/the public or to bring them into decision making; the latter will require more work to bring people up to speed on how the tool prototype works. The application of the tool prototype as an engagement tool will need to be designed around the transportation literacy of those that will be involved, and some modifications will be needed.

Institutional Constraints – What institutional barriers could limit the use or effectiveness of the tool prototype? Does the agency's organizational structure create barriers to implementation of the tool prototype?
Considerations: In instances where there is a well-established process or set of planning tools, agencies may need to clearly articulate the benefits of using the tool prototype (particularly if it is replacing rather than complementing an entrenched process/methodology). Outreach may be needed to help planning partners (e.g., MPOs) understand the tool prototype and its implications for decision making. Agencies with highly decentralized decision making may find application of the tool prototype more challenging (at least on a statewide basis).

Resource Allocation Parameters – What statutory, administrative, or other factors influence resource allocation decisions?
Considerations: Parameters such as statutory requirements for certain program allocations can be accommodated in the tool prototype. The tool prototype can be used to show the performance implications of greater or reduced funding flexibility.

4.4 Using the Tool

The tool's user guide demonstrates hands-on use of the tool prototype. Starting with the home page (Figure 14), the user guide walks through each tab of the Microsoft Excel-based spreadsheet, following the framework of NCHRP Project 08-91. Once project data are entered and performance measures are selected, weighting, scaling, and scoring are performed. Various trade-off and optimization options may then be chosen.

Figure 14. Tool prototype home page.

Within the spreadsheet interface, different colors indicate different functions of the cells, allowing the user to quickly identify where input is required or where there is the potential for an override. The screenshots in the user guide are for demonstration purposes only, and all data inputs are merely examples. Any performance measure may be used, based on the goals and objectives developed by the user agency.

The following sections describe how each tab within the tool prototype is used for the preferred bottom-up approach. As previously mentioned, a top-down approach may also be used, and the user guide addresses that approach as well.

4.4.1 Data Integration and Performance Measures

• Projects tab: Input candidate projects and their attributes into this tab. Required attributes are a unique project ID, project cost, sponsoring program area, and the variables needed to convert between project- and network-level performance [e.g., annual average daily traffic (AADT), project length]. Geographic identifiers may also be entered for user reference.
• Settings tab: Users specify the program areas among which resources will be allocated and the measures against which performance preferences will be weighed. For each program area, project-level and corresponding network-level performance measures should be chosen. The tool is prepopulated with common program areas and performance measures, but any area or measure may be entered.
• Performance tab: Projects and performance measures are populated from the previous input. The user enters the estimated value of each performance measure for the cases with and without project implementation.

4.4.2 Weighting, Scaling, and Scoring

• Weighting tab: A rating scale is used to compare the relative importance of performance measures in pairs (Figure 15). The resulting weights are calculated and stored as the AHP weights, and a consistency check is performed on the selected pairwise comparisons. Expert overrides/fixed weights may be entered as well.
• Scaling tab: All performance measures are normalized to a level playing field. If linear scaling is chosen, users must specify the minimum/maximum possible values for each performance measure. For utility scaling, users must enter the coefficients of the calibrated utility function.
• Scoring tab: Scores, representing the relative importance of each project, are calculated by combining the weighting and scaling results. Users can iteratively override any result; satisfactory scores are then passed to the optimization/trade-off analysis.

Figure 15. Rating scale for pairwise comparison.
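For readers who want to see the arithmetic these three tabs imply, the following sketch works through the standard AHP weight calculation (the principal eigenvector of a pairwise comparison matrix, plus a consistency ratio check), linear min/max scaling, and a weighted-sum project score. It illustrates the general techniques rather than reproducing the tool prototype's spreadsheet formulas, and all pairwise judgments, bounds, and performance values are hypothetical.

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for three
# performance measures: pavement condition, bridge condition, and delay.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# AHP weights: normalized principal eigenvector of the comparison matrix.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights = weights / weights.sum()

# Consistency ratio (CR); judgments are typically revisited if CR exceeds about 0.10.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # random consistency index for n measures
cr = ci / ri

# Linear scaling: map a raw performance value onto 0-1 given min/max bounds.
def linear_scale(value, lo, hi):
    return (value - lo) / (hi - lo)

# Hypothetical per-project performance improvements (with-project minus without-project).
raw = np.array([4.0, 0.8, 120.0])                  # one value per measure
bounds = [(0.0, 10.0), (0.0, 2.0), (0.0, 500.0)]   # assumed min/max for each measure
scaled = np.array([linear_scale(v, lo, hi) for v, (lo, hi) in zip(raw, bounds)])

# Score: weighted sum of the scaled performance values.
score = float(weights @ scaled)
print(f"weights={np.round(weights, 3)}, CR={cr:.3f}, project score={score:.3f}")
```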

4.4.3 Trade-off Analysis and Optimization

• Optimization tab: Users input budget and performance constraints. The analysis shows the recommended resource allocation for each program area, as well as a list of selected projects. State-of-repair dials are displayed based on user definitions of good, fair, and poor for each performance measure.
• Trade-off tab(s): For the bottom-up mode, six trade-off analyses are available:
  – TF1: Trade-off between two individual projects—Any two projects may be compared by entering their unique project IDs;
  – TF2: Trade-off between two project portfolios—Projects are chosen for each portfolio (project selections may be copied from previous optimization runs) and relative preferences are compared;
  – TF3: Trade-off between performance and investment level—Any performance measure and objective (minimize or maximize) may be selected; the results show the general linkage between the performance measure and investment level;
  – TF4: Minimum investment level to achieve performance targets—The user enters a performance target for all network-level performance measures; the analysis approximates the minimum investment budget necessary to reach each target;
  – TF5: Resource allocation scenario analysis—The user inputs various resource allocations for each program area; the results show the final performance levels associated with each scenario;
  – TF6: Weighting scenario analysis—The weights of performance measures may be varied for each scenario; the results show the allocated budget and performance for each scenario.
• Example scenario comparison tab: Different scenarios can be run and compared, presenting the outputs in a variety of formats.
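The Optimization tab's core task, selecting the most valuable set of projects within a budget, is in essence a 0/1 knapsack problem. The sketch below shows one simple way to approximate it with a greedy score-per-dollar heuristic under a single overall budget cap; it illustrates the general idea rather than the tool prototype's solver, and the project list and budget are hypothetical. Program-area floors and ceilings or performance targets would enter as additional constraints, for which an exact integer-programming formulation is usually preferred.

```python
from dataclasses import dataclass

@dataclass
class Project:
    project_id: str
    program_area: str
    cost: float      # in the same units as the budget (e.g., $M)
    score: float     # weighted, scaled score from the scoring step

def greedy_select(projects, total_budget):
    """Pick projects by descending score per dollar until the budget is exhausted."""
    selected, spent = [], 0.0
    for p in sorted(projects, key=lambda p: p.score / p.cost, reverse=True):
        if spent + p.cost <= total_budget:
            selected.append(p)
            spent += p.cost
    return selected, spent

# Hypothetical candidate list and budget.
candidates = [
    Project("PVMT-01", "Pavement", 4.0, 0.62),
    Project("BRDG-07", "Bridge",   6.5, 0.81),
    Project("MOBL-03", "Mobility", 9.0, 0.74),
    Project("SFTY-02", "Safety",   2.5, 0.40),
]

chosen, spent = greedy_select(candidates, total_budget=12.0)
print([p.project_id for p in chosen], f"spent={spent}")
```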

4.4.4 Top-Down Analysis

The top-down approach allows decision makers to observe the impact of different resource allocations across assets at a strategic level. It is also helpful in identifying the optimal resource allocation across assets, given budget limitations, objectives, and performance targets. To run a top-down analysis, users need to input the following data for performance measures (such as IRI, bridge condition, and travel speed) and program areas (such as pavement, bridge, and mobility):

• Performance measure bounds (minimum and maximum values),
• Performance measure targets,
• Weight of each performance measure,
• Investment level versus performance data,
• Budget floor and ceiling for each program area, and
• Total available budget.

The investment-versus-performance data are used to develop trade-off curves from which the resulting performance can be interpolated for any resource allocation. The tool then determines the optimal resource allocation across program areas that maximizes system performance per agency preferences.

4.4.5 Risk Analysis

The tool considers the uncertainties in the expected budget and the predicted performance measures and provides a range of possible performance outcomes by simulating different budget scenarios.

• Risk tab: The user inputs how much each performance measure may vary from its expected value, for both the with-project and without-project cases, for each project.
• Risk summary tab: Using the expected budget, the amount by which the budget may vary from that expectation (its standard deviation), and a user-defined confidence interval, the tool provides a range for the performance measures given budget and performance uncertainties. The tool also produces a graphical spectrum of expected performance values.
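The two calculations described in Sections 4.4.4 and 4.4.5, interpolating performance from investment-versus-performance data and translating budget uncertainty into a performance range, can be sketched as follows. The trade-off curve points, budget statistics, and confidence level are hypothetical, and a normally distributed budget is assumed purely for illustration; the tool prototype's internal calculations may differ.

```python
import numpy as np
from statistics import NormalDist

# Hypothetical investment-versus-performance points for one program area
# (e.g., pavement spending in $M vs. percent of lane-miles in good condition).
investment = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
performance = np.array([55.0, 68.0, 76.0, 81.0, 84.0])

def performance_at(budget):
    """Piecewise-linear interpolation on the trade-off curve (Section 4.4.4)."""
    return float(np.interp(budget, investment, performance))

# Section 4.4.5: map a budget confidence interval to a performance range,
# assuming (for illustration only) a normally distributed budget.
expected_budget = 120.0   # $M
budget_sd = 20.0          # standard deviation of the budget, $M
confidence = 0.90         # user-defined confidence level

z = NormalDist().inv_cdf(0.5 + confidence / 2.0)
budget_low = expected_budget - z * budget_sd
budget_high = expected_budget + z * budget_sd

print(f"Expected performance: {performance_at(expected_budget):.1f}")
print(f"{confidence:.0%} performance range: "
      f"{performance_at(budget_low):.1f} to {performance_at(budget_high):.1f}")
```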
