CHAPTER 4

Guidance for Implementing a Cross-Asset Resource Allocation Approach

Overview

This chapter provides updated guidance for implementing a cross-asset resource allocation approach using the NCHRP Report 806 framework. The guidance is intended to assist state and local agencies interested in implementing a data-driven, transparent approach to allocating funding to particular investment areas or to specific projects. It is meant to help agencies move away from allocating funding strictly within silos of assets or investment categories and toward considering investments and/or projects that are cross-asset in nature.

The guidance provides an overview of the process of implementing a cross-asset allocation approach, including a list of possible goals and examples of performance measures. Note that the examples provided in the guidance are based largely on the case studies detailed in Chapter 3, supplemented by additional examples. Agencies should tailor their selections of goals, objectives, and performance measures to their needs and desired outcomes. The guidance is written assuming agencies are prioritizing projects, but it can be applied to prioritizing specific investments as well. The guidance includes the following eight steps:

• Step 1: Establish the Scope
• Step 2: Define Goals and Objectives
• Step 3: Select Performance Measures and Evaluation Criteria
• Step 4: Assess Data and Analytical Capabilities
• Step 5: Prototype the Approach
• Step 6: Set Weights on Goals and Objectives
• Step 7: Apply the Model
• Step 8: Communicate the Results

The steps walk through the process of selecting the goals and measures used to prioritize projects, developing a utility function that combines the scores from individual performance measures, and optimizing the selection of projects for funding given a particular budget. This process results in a set of projects that effectively and efficiently achieves the goals and objectives of the agency within the budget constraints.

Step 1: Establish the Scope

The first step in implementing a cross-asset resource allocation approach is to establish the scope of the analysis. The scope will differ depending on the desired goal of the resource allocation approach. Key questions to consider when determining the scope include:

• What asset classes are included in the allocation approach?
  – Determine if funding is being allocated between pavement and bridge projects, or if other asset classes are being considered in addition to pavement and bridge. Further, assess
whether projects are defined as targeting a single asset class or are already cross-asset in nature. This guidance is organized assuming the decision maker will use it to prioritize investments or projects across asset classes. The guidance is also applicable within asset categories, though agencies likely already have systems in place for allocating funds within asset categories.
• What investments are included in the allocation approach?
  – Determine if the allocation approach is directed at capital projects, preservation and maintenance projects, both, or other projects.
• What is the decision period?
  – Are projects typically programmed over a 1- or 2-year time period? 5 years? Based on the typical planning timeline, determine when the prioritization will be updated.
• How does the approach fit into the existing business process?
  – Determine how the allocation approach fits in with the existing capital programming and maintenance and operations processes. In addition, you may want to consider how the approach interfaces with district- or division-level project decision making. A cross-asset resource allocation approach may also affect decisions about budgeting. While budget decisions are often made prior to allocating resources to investments or projects, you may want to consider the results of a cross-asset resource allocation approach before making final budget decisions. Finally, determine how the approach integrates with project development. Are there any new communication steps or other processes that may need to be put in place to develop projects for the allocation approach?
• How will the results be used?
  – Determine if the results will be the definitive set of projects selected for funding or if the results will be used as one of many factors considered when selecting projects for funding. What are those other factors?
What is the process for ultimately selecting projects or allocating resources? Also, determine if the resulting ranks and optimization will be made available to the public.

Examples: Establishing the Scope

The California Department of Transportation (Caltrans) developed a methodology to prioritize projects in the California State Highway Operation and Protection Program (SHOPP), which funds repair, preservation, and safety improvements on the California State Highway System. (1)

The Maryland Department of Transportation (MDOT) uses a cross-asset resource allocation approach to score and rank major transportation projects for inclusion in its Consolidated Transportation Plan (CTP). These are highway or transit capacity projects that exceed $5 million. While some of these projects can include system preservation elements, the methodology excludes projects that are solely for system preservation. Projects are scored and ranked on an annual basis as the CTP is updated annually. (2)

Instead of prioritizing specific projects, the Arizona Department of Transportation (ADOT) uses a methodology to identify the desired allocation of resources among the following investment areas: preservation, modernization, and expansion. This allocation is used in its long-range plan, which is updated every 5 years. (3)
There are many ways to either broaden or narrow the scope of the allocation approach to best fit the goals of the agency. See the examples in the call-out box for how other agencies defined the scope of their resource allocation approaches.

Step 2: Define Goals and Objectives

Once the scope is established, the next step is to formally define the goals and objectives for the resource allocation approach. The goals should represent what your agency desires to accomplish with investment in particular areas or projects. A good place to start in establishing goals is to look at existing agency documents that describe the mission or vision of the agency. Strategic or long-range plans often include agency goals that can be adapted for use in cross-asset resource allocation. Typically, a subset of the broader agency goals and objectives is best, depending on the analysis scope. Regardless, the goals selected for the resource allocation approach should be consistent with other agency documents.

To keep the analysis from becoming too burdensome, a resource allocation approach should have no more than five to seven goals. In addition, it is important to ensure that goals do not overlap, as overlap can lead to double counting in the analysis. Overlap can be avoided by selecting fundamental goal areas, such as those in Table 4-1. Not all of these goal areas will be applicable to every agency, and it is important to select goals that best indicate what the projects will accomplish.

Table 4-1. Typical investment goals.

• Mobility: The ability and ease of moving people and goods. Includes measures that indicate travel time savings or throughput gains due to a project. The ability to get to places of employment faster is a mobility improvement.
• Preservation: Captures the cost savings achieved when an agency improves assets over their lifespan. Spending money on maintenance saves money in the long run, and this goal area measures the benefits of those preservation activities. In addition, assets in poor condition are more likely to fail, so preservation activities include a risk reduction component that can be measured under this goal area.
• Safety: Captures the benefits of making safety improvements (e.g., reducing crashes, improving bike/pedestrian safety) in the project area.
• Security: Captures the benefits of activities performed to reduce harm to the population from external forces. Measures in this goal area link to reducing crime, risk of terrorism, etc.
• Resilience: The ability of the system to withstand risk. This goal area encourages work to prepare the transportation system for extreme weather events and also encompasses activities that promote multiple options for travel.
• Environment: Encompasses progress on improving air quality and reducing emissions, improving health, and work related to natural resources (e.g., fish passages, wildlife crossings).
• Community: Captures the benefits to the particular neighborhood surrounding the project area. Building bike lanes or improving pedestrian facilities are examples of project activities that benefit the community.
• Economic Development: Captures benefits to the economy due to project activities. Includes measures related to job growth, tax income, number of people employed, or number of jobs available. Spending money generates economic effects, and this goal incorporates those effects on employment opportunities in the area.
• Accessibility: The ability and ease of accessing transportation. Includes measures that indicate the level of improvement to elevators, sidewalk ramps, and other important facilities. Accessibility activities ensure that individuals can get where they need to go even if they cannot drive.
• Environmental Justice: Values progress in ensuring that transportation improvements are not benefiting (or harming) people disproportionately.
Examples: Defining Goals and Objectives

Caltrans adapted the goals listed in its Strategic Management Plan for 2015–2020 for use in its resource allocation methodology. Not all of the goals in the Strategic Management Plan were applicable to the prioritization of preservation projects, so some adjustments and reconfiguring were needed to ensure the goals represented what the agency sought to accomplish with the projects. This process ensured that the goals for resource allocation were consistent with existing goals within the agency. The lists below show the evolution of Strategic Management Plan goals into the goals used for resource allocation purposes. (5)

Strategic Management Plan Goals:
• Safety and Health
• Stewardship and Efficiency
• Sustainability, Livability, and Economy
• System Performance
• Organizational Excellence

Project Prioritization Goals:
• Safety
• Air Quality and Health
• Stewardship and Efficiency
• System Performance and Economy
• Sustainability and Livability

Delaware Valley Regional Planning Commission (DVRPC) similarly established its evaluation criteria based on the goals of its long-range plan, Connections 2040. DVRPC selected a subset of the goals under the "Core Plan Principles" (specifically the principles of Manage Growth and Protect the Environment, Create Livable Communities, and Establish a Modern, Multimodal Transportation System) to formulate the nine objectives in the allocation methodology. (6)

Example: Avoid Overlapping and Non-Fundamental Goals

It is crucial that the goals and objectives do not overlap, to ensure no double counting of benefits when setting up the measures for the methodology. The following two objectives are examples of overlapping objectives to avoid: (1) improve asset condition, and (2) reduce long-term maintenance costs.
These objectives overlap because improved asset conditions are generally associated with reduced maintenance costs. Many agencies are tempted to include cost-effectiveness or return on investment (ROI) as a goal or objective in the approach. This is an example of a goal that is not fundamental, as it does not identify the root of what a project should accomplish. At this stage in the process, it is not necessary to identify cost-effectiveness as a goal; rather, project cost and cost-effectiveness are incorporated into the resource allocation approach in subsequent steps.
Then, within each of the goal areas, identify second-tier objectives. Objectives are specific, measurable statements that indicate the desired result in a particular goal area. For example, for the goal area of Safety, an agency could identify two objectives:

1. Minimize injuries and fatalities to road workers, and
2. Minimize injuries and fatalities to road users.

Strictly speaking, goals and objectives are not required for implementing a cross-asset resource allocation approach. However, it is recommended to structure your approach with goals and objectives in order to establish a hierarchy. Keeney and Raiffa, in their book Decisions with Multiple Objectives: Preferences and Value Tradeoffs, provide an excellent treatment of the hierarchical nature of objectives in developing an approach for multi-objective decision making. Developing a hierarchy of goals and objectives provides specificity and helps clarify the intended meaning of broad, general goals. In addition, this structure of goals and objectives is helpful when using the tools described in the following chapter of this report. (4)

Step 3: Select Performance Measures and Evaluation Criteria

With the goals and objectives defined, the next step is to select the performance measures and evaluation criteria. Performance measures quantify progress toward goals and objectives. For each objective defined in the previous step, select one or more performance measures to quantify progress. There are several items to keep in mind when selecting performance measures:

• Measures should be quantitative rather than qualitative. Wherever possible, measures should be quantitative, determined with data, modeling, and/or a calculation.
An example of a quantitative measure is the "average annual reduction in fuel consumption in gallons." This value can be determined using the estimated reduction in VMT as a result of the project, as well as the improved IRI of the pavement, if applicable. A reduction in fuel consumption can then be used as an indication of a reduction in emissions under the environment goal area. A qualitative measure, by contrast, might be a 1–5 score evaluating the degree to which the project promotes a reduction in emissions. Such a qualitative approach relies solely on judgment and may lack consistency depending on who assigns the scores.

• Consider how measures scale with project size. Measures should scale appropriately based on project size (e.g., project acres, length of road, cost). If Project A and Project B reduce the gallons of fuel consumed by the same amount, but Project A is half the cost of Project B, then the resulting score for Project A should be twice that of Project B to indicate that Project A provides more benefit for the cost. One way to scale measures is to divide the measure value by the project cost.

• Normalize the measures to obtain a utility. Once all the measures are scaled appropriately in relation to project size, scores need to be normalized so they can be combined into a utility. One approach is to convert all measures into 0–1 scores: for any particular measure, each value is divided by the highest value for that measure, leaving one project with a score of 1 and the rest normalized accordingly. Alternatively, the measures can be constructed so that they are analogous to economic benefits, or dollars. These monetized benefits can be summed without normalization to obtain a utility.

Table 4-2 provides example performance measures for each of the goal areas listed in the previous step.
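The cost-scaling and 0–1 normalization described above can be sketched in a few lines. This is an illustrative example, not the report's model; the project names and fuel-savings values are invented:

```python
# Scale each measure by project cost, then normalize to a 0-1 score by
# dividing by the highest scaled value, as described in Step 3.
# All project data below are hypothetical.
projects = {
    "A": {"fuel_gallons_saved": 12000, "cost_millions": 2.0},
    "B": {"fuel_gallons_saved": 12000, "cost_millions": 4.0},
    "C": {"fuel_gallons_saved": 30000, "cost_millions": 5.0},
}

# Step 1: scale by cost (gallons saved per $1 million spent)
scaled = {p: d["fuel_gallons_saved"] / d["cost_millions"] for p, d in projects.items()}

# Step 2: normalize to 0-1 by dividing by the maximum scaled value
max_val = max(scaled.values())
normalized = {p: v / max_val for p, v in scaled.items()}

for p in sorted(normalized, key=normalized.get, reverse=True):
    print(f"Project {p}: {scaled[p]:,.0f} gal/$M, score {normalized[p]:.2f}")
```

Note that Project A, with the same fuel savings as Project B at half the cost, ends up with twice B's score, which is exactly the scaling behavior the text calls for.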
These examples were selected from the case study agencies as well as from other agencies that have implemented a cross-asset resource allocation methodology or include relevant performance measures in their TAMP. Example performance measures were selected from the following agencies:

• California Department of Transportation (Caltrans),
• DVRPC,
• Florida Department of Transportation (FDOT),
• MDOT,
• Minnesota Department of Transportation (MnDOT), and
• Virginia Department of Transportation (VDOT).

The table also includes the FHWA performance management measures that are relevant to the goal areas of Mobility, Safety, and Preservation. Finally, SHRP 2 Report S2-C02-RR is included as it provides example Environmental Justice measures.

Table 4-2. Performance measure examples.

Mobility
• FHWA National Performance Management Measures related to mobility, defined in 23 CFR 490.507, 490.607, and 490.707 (7):
  – Predicted change in percent of person-miles traveled on the Interstate that are reliable.
  – Predicted change in percent of person-miles traveled on the non-Interstate NHS that are reliable.
  – Predicted change in the truck travel time reliability (TTTR) index.
  – Predicted change in the annual hours of peak hour excessive delay (PHED) per capita.
  – Predicted change in the percent of non-single occupancy vehicle (SOV) travel.
• Person Throughput (VDOT): Predicted increase in corridor total person throughput attributed to the project. (8)
• Travel Time Savings (MDOT): Predicted savings in travel time attributed to the project. (2)

Preservation
• FHWA National Performance Management Measures related to asset condition, defined in 23 CFR 490.307 and 490.407 (7):
  – Percent of pavement on the Interstate System in good/poor condition.
  – Percent of pavement on the non-Interstate National Highway System (NHS) in good/poor condition.
  – Percent of NHS bridges classified in good/poor condition.
• Asset Preservation Benefit (Caltrans; MDOT SHA): Given the quantity of bridge deck area and pavement lane miles in fair and poor condition that will be improved in the project, this measure incorporates the unit cost of preservation and the benefit–cost ratio of performing the work to compute a monetized benefit for preservation. (See case study description in Chapter 3.)

Safety
• FHWA National Performance Management Measures related to safety, defined in 23 CFR 490.207 (7):
  – Predicted change in number of fatalities.
  – Predicted change in fatalities per 100 million VMT.
  – Predicted change in number of serious injuries.
  – Predicted change in rate of serious injuries per 100 million VMT.
  – Predicted change in number of non-motorized fatalities and non-motorized serious injuries.
• Impact on Reducing Fatalities and Injuries (MDOT): Based on the road severity index and the number of safety improvements included in the project. (2)
• Annual Vehicle User Crash Savings (Caltrans): Computed based on predicted reduction in crash rate and crash cost.
• Annual Non-Vehicle User Crash Savings (Caltrans): Computed based on estimated worker exposure hours and incident cost. (See case study description in Chapter 3.)

Security
• Link to Military Bases (FDOT): Assigns points to a project based on its proximity to a military access facility. (9)

Environment
• Annual Emissions Reduction Benefit (Caltrans): Estimated based on reduction in fuel consumption and the costs of various pollutants.
• Annual Active Transportation Health Benefit (Caltrans): Estimated based on average annual benefit to health and quantities of bike lanes, paths, and sidewalks.
• Water Quality Benefit (Caltrans): Estimated based on the average benefit per acre treated for water quality and the acreage of treated land.
• Biological Benefit (Caltrans): Estimated based on the quantities associated with each type of biological improvement and the average unit benefit for each type of improvement. (See case study description in Chapter 3.)

Community
• Bicycle and Pedestrian Access (FDOT): Assigns points to categorize access to bike/ped facilities in the project area. (9)
• Community Access (MDOT): Assigns points based on the project's contribution to enhancing community assets (e.g., schools, community centers). (2)
• Multimodal Bike/Pedestrian (DVRPC): Assigns points based on construction of new bike/pedestrian facilities and connections to existing multimodal facilities. (10)

Economic Development
• Per Capita Sales Tax (FDOT): Amount of sales tax collected per capita for each census tract. (9)
• Access to Jobs (VDOT): Predicted change in cumulative jobs accessible within 45 minutes for highway projects or within 60 minutes for transit projects.
• Project Support for Economic Development (VDOT): Project consistency with regional and local economic development plans. (8)

Accessibility
• Percentage of curb ramps that are ADA compliant (MnDOT). (11)

Environmental Justice
• SHRP 2 Report measures:
  – Predicted change in person-hours of delay for disadvantaged populations compared to the entire population.
  – Predicted change in noise levels for disadvantaged populations compared to the entire population.
  – Predicted change in air quality for disadvantaged populations compared to the entire population.
  – Predicted change in sidewalk connectivity for disadvantaged populations compared to the entire population.
  – Percent of region's unemployed or poor who cite transportation access as a principal barrier to seeking employment. (12)
• Environmental Justice (DVRPC): Values projects that benefit census tracts with high IPD communities. (10)
• Access to Jobs for Disadvantaged Populations (VDOT): Predicted change in cumulative jobs accessible to disadvantaged populations within 45 minutes for highway projects or within 60 minutes for transit projects. (8)

Resilience
• Facility Resiliency (MDOT): Estimates the project's impact on future flood risk based on whether the project is located in a 100-year flood plain and whether it mitigates the risk of flooding.
• Transportation Alternatives (MDOT): Assigns points based on the degree to which the project will increase transportation alternatives and redundancy. (2)
• Annual Vehicle Detour Benefit (Caltrans): Computed based on predicted reduction in service disruptions, such as road closures, due to reduced risk of hazards. (See case study description in Chapter 3.)

Step 4: Assess Data and Analytical Capabilities

To determine if you can support the analysis scope and the recommended set of measures, you must assess your data and analytical capabilities. There are clear tradeoffs in this step. On one hand, better data improve the analysis of utility for each project being prioritized. On the other hand, obtaining better data takes time and effort. It may be helpful to involve relevant stakeholders to determine what analysis can realistically be accomplished; stakeholders can also provide information on what data can be obtained from existing systems.

There are several techniques available to obtain data for use in a resource allocation approach:

• Direct measurement of observable values.
• Predictive models, such as travel demand models or geospatial analysis tools. These also include trend analysis to predict values from historical data. The FHWA crash modification factors can be used in computing performance measures for safety as well (13).
• Representative defaults, established through a review of the literature or from selected representative projects.
• Subjective judgment from experts.
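As one example of the predictive-model technique, a crash modification factor (CMF) expresses the expected multiplicative change in crash frequency after a countermeasure, so a CMF below 1.0 implies a reduction. The sketch below is illustrative only; the countermeasure, CMF value, and crash history are hypothetical, not taken from the report or the FHWA clearinghouse:

```python
# Predict a safety performance measure using a crash modification factor.
# Expected crashes after treatment = observed crashes * CMF, so the
# expected annual reduction is observed * (1 - CMF).
def predicted_crash_reduction(annual_crashes: float, cmf: float) -> float:
    """Expected crashes avoided per year for a given CMF."""
    return annual_crashes * (1.0 - cmf)

# Hypothetical example: a countermeasure with an assumed CMF of 0.85
# applied to a segment averaging 20 crashes per year.
avoided = predicted_crash_reduction(annual_crashes=20.0, cmf=0.85)
print(f"Expected crashes avoided per year: {avoided:.1f}")
```

A value computed this way can feed directly into a safety performance measure such as predicted change in number of crashes or injuries.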
Examples: Normalizing a Measure

MDOT normalizes each measure score by dividing the individual scores by the highest score for that particular measure. This converts all the scores to a 0–1 scale. The agency decided on this approach since the prioritization methodology incorporates both quantitative and qualitative measures that are not easily combined without normalization. (2)

Caltrans constructed performance measures that are analogous to economic benefits, thus sidestepping the need to normalize the measures. These performance measures use benefit–cost terms to estimate annual benefits or savings such as those described in Table 4-2. (See the case study description in Chapter 3 for more information on the Caltrans approach.)
To overcome data challenges in this process, you may need to:

• Revisit the analysis scope and measures. Steps 3 and 4 of implementing a cross-asset resource allocation approach are iterative. You may discover that you don't have the data or analytical capabilities necessary for particular performance measures. This is the time to revisit the evaluation criteria and select measures that can realistically be used given data and/or analytical constraints.
• Collect more data. Another way to overcome a lack of available data is to collect more data. Coordinating with other data collection efforts across the agency may yield additional data that can be used in the prioritization process.

Note that much of the data used in a cross-asset resource allocation approach is calculated or modeled, not directly measured. Modeled data will likely be used in the approach and is a good way to estimate parameters that are not easily observed. Using modeled data is not an undesirable option and is certainly preferable to simply soliciting opinions from individuals in the agency. Overall, it is important to know that there are many options for obtaining data besides expert judgment.

Step 5: Prototype the Approach

With the objectives, performance measures, and data in hand, it is time to prototype the approach. This step is broken down into a series of tasks that walk through the process of testing the approach. It is suggested that the approach be prototyped in a simple spreadsheet. This organizes the work and easily enables the calculation of utility for each project. It also avoids the issues associated with adopting a software system before the approach is ready. In general, a software system can be a great tool to support cross-asset resource allocation, and many off-the-shelf systems exist, but it is essential to review and validate the approach before implementing a system.
Examples: Assessing Data Capabilities

MDOT performed a series of workshops to select the performance measures and to assess its data and analytical capabilities at the same time. These workshops involved relevant stakeholders, and discussion was held over each potential measure to determine the feasibility of gathering data and performing the analysis. This was a time-effective way to accomplish Steps 3 and 4 of the process, as it reduced the back-and-forth iteration often required in these steps. (2)

In setting up its approach, DVRPC recognized the challenges of collecting data and the need to select performance measures with data requirements that could be met by small MPOs with limited capabilities. For this reason, it opted to implement simpler performance measures initially and build its data capabilities over time. (10)

Task 5.1 Collect Data for a Set of Sample Projects

First, identify a set of sample projects to use for testing the approach. The sample set should cover a range of project types in order to exercise all the measures included in the analysis. For example, if an approach includes a system preservation measure, be sure to identify a sample project that includes system preservation work. It is best to use 15 or more projects to test the approach; this helps ensure that each measure is utilized and provides more information when reviewing the approach.

Once the projects are identified, gather all the data necessary to calculate the various performance measures and organize the projects and data in a spreadsheet. Obtain the raw values for each performance measure, without any normalizing.

Task 5.2 Calculate the Utility for Each Project

Calculating the utility for each project involves combining all the performance measures to obtain a total value for each project. To do this, first apply the normalizing approach described in Step 3. One approach is to convert all measures into 0–1 scores: for any particular measure, each value is divided by the highest value for that measure, leaving one project with a score of 1 and the rest normalized accordingly. (You could also convert measures into 0–100 scores simply by multiplying by 100.) Alternatively, if the measures are constructed so that they are analogous to economic benefits, or dollars, then normalizing is not necessary.

Next, assign a nominal weight to the objectives. For testing purposes, you could use equal weights for all objectives or estimate what the final weighting might be and apply those weights. The exact values are not critical during the prototyping phase, as the weights will be set officially in a subsequent step. Once the performance measures are normalized and the weights are applied, sum the weighted measures to obtain a single utility value for each project.

Task 5.3 Review and Revise the Approach

Review the utility of each project as well as the utility divided by the cost.
At this stage, it is appropriate to rank the projects by either utility or utility/cost to see where different projects stand. Items to consider when reviewing the results include:

• Were you able to get all the data?
• Are the data correct?
• Is the method scalable?
• How well did the process of defining projects, gathering data, and calculating the utility work? Are there aspects of the process that need improvement?
• Are the rankings reasonable? Are they what you would expect? Are certain measures dominating the utility values and skewing the results?
• Is the ranking by utility/cost the same as the ranking of projects by cost? This is something to avoid: if cost is driving the ranking too strongly, the approach may need to change.

You may need to revise the approach slightly at this point in the process. Improvements could include changing the assumptions in the data, especially for estimated values; eliminating or refining performance measures to address data issues; or adjusting modeling assumptions.

Task 5.4 Document the Approach and Assumptions

Finally, it is critical to document the approach and the assumptions used throughout the process. This documentation will form the foundation of the final model used to allocate resources, which should build on the prototype. It is especially important to document the assumptions made about data items that were estimated in the process.
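Tasks 5.2 and 5.3 can be prototyped in a few lines of code mirroring the spreadsheet logic: normalize each measure to 0–1, apply nominal weights, sum to a utility, and rank by utility/cost. The projects, measure values, weights, and costs below are placeholders, not data from any agency:

```python
# Prototype utility calculation for a handful of sample projects.
# All measure values, weights, and costs are illustrative placeholders.
projects = {
    "P1": {"safety": 8.0, "mobility": 120.0, "cost": 4.0},
    "P2": {"safety": 3.0, "mobility": 200.0, "cost": 2.5},
    "P3": {"safety": 5.0, "mobility": 60.0, "cost": 1.0},
}
weights = {"safety": 0.5, "mobility": 0.5}  # nominal equal weights for testing

measures = list(weights)
# Normalize each measure by its maximum across projects (0-1 scores).
max_by_measure = {m: max(p[m] for p in projects.values()) for m in measures}

# Weighted sum of normalized scores gives each project's utility.
utilities = {
    name: sum(weights[m] * p[m] / max_by_measure[m] for m in measures)
    for name, p in projects.items()
}

# Rank projects by utility per dollar of cost.
ranked = sorted(projects, key=lambda n: utilities[n] / projects[n]["cost"], reverse=True)
for name in ranked:
    ratio = utilities[name] / projects[name]["cost"]
    print(f"{name}: utility={utilities[name]:.3f}, utility/cost={ratio:.3f}")
```

A review per Task 5.3 would then ask whether this ranking is reasonable and whether any single measure, or cost alone, is dominating it.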
Guidance for Implementing a Cross-Asset Resource Allocation Approach 59 Example: Using a Spreadsheet When using a spreadsheet to prototype your cross-asset resource allocation approach, it is important to ensure that it can be used to communicate the approach to others. In particular, it should be developed in a way that decision makers can understand the scores and see the results. VDOT utilizes a spreadsheet showing the details of the methodology for SMART SCALE. The latest documents are featured on their website (8). The example spreadsheet shown below is adapted from the VDOT Smart Scale Project Scores spreadsheet. The spreadsheet includes basic identifying information about the projects and the scores for each of the measures utilized in the approach. The spreadsheet also has columns for the total benefit of projects, the score divided by the project cost, and the rankings, which can be filtered to see the results. Step 6: Set Weights on Goals and Objectives Setting weights on the objectives allows you to indicate the priority level of different goals. While it is important for a portfolio of projects to make progress in each of the defined goal areas, weights can provide another level of granularity to the prioritization. There are two common approaches to setting weights on objectives: pairwise comparison and the Delphi method. Pairwise comparison involves making judgments on the relative importance of two objec- tives when compared side by side. Participants in the pairwise comparison process are asked to compare two objectives and determine on a scale which is more important. For example, participants could be asked to determine whether or not the objective of safety is more or less important than mobility. Pairwise comparison can be a good approach to determine the weights on objectives if you have: â¢ A small number of objectives. 
This approach can become very burdensome with more than a few objectives, since the number of required comparisons grows quickly with the number of objectives (n objectives require n(n − 1)/2 pairwise comparisons).
Case Studies in Implementing Cross-Asset, Multi-Objective Resource Allocation

• A sense of the scale. When assigning a value to an objective on, for example, a 1–10 scale, people can have very different senses of what each point on the scale represents. When using pairwise comparison, it is beneficial to define the scale consistently so that everyone has the same sense of it.
• An authoritative set of decision makers. Pairwise comparison works best when the individuals making the judgments are authoritative decision makers within the agency and have expertise in the goal areas.

An alternative approach for setting weights is the Delphi method. In the Delphi method, relevant stakeholders are asked to vote on the weights for each objective. The results of the vote are then shared with the whole group, and participants take time to discuss differing opinions. After the discussion, another round of voting commences. The goal of the Delphi method is to reach consensus, so in theory rounds of voting and discussion can continue until consensus is reached; in practice, two to three rounds of voting are usually needed to reach consensus on the weights.

There are also two approaches that sidestep the need to set explicit weights in this step. First, using measures that are analogous to economic benefits avoids the need for further weighting, since the dollar amounts can be compared directly. Second, DEA is an alternative that tries to maximize progress toward each objective without determining the value of achieving one objective versus another. This approach is described further in the Appendix.

Step 7: Apply the Model

With the approach prototyped and the weights set, you're ready to apply the model.
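Before moving on, the pairwise comparison described in Step 6 can be sketched in code. The following is a minimal illustration using the geometric-mean method common in the AHP literature; the objectives and judgment values are hypothetical and are not drawn from any agency's process.

```python
# Illustrative sketch of turning pairwise-comparison judgments into
# normalized objective weights using the geometric-mean (AHP-style)
# method. Objectives and judgment ratios here are hypothetical.
import math

objectives = ["safety", "mobility", "condition"]

# ratio[(i, j)] = how many times more important objective i is judged to
# be than objective j (1.0 = equally important). Only n(n-1)/2 judgments
# are elicited; the reciprocals are implied.
ratio = {
    ("safety", "mobility"): 3.0,     # safety judged 3x as important
    ("safety", "condition"): 2.0,
    ("mobility", "condition"): 0.5,  # mobility judged half as important
}

def compare(i, j):
    if i == j:
        return 1.0
    if (i, j) in ratio:
        return ratio[(i, j)]
    return 1.0 / ratio[(j, i)]  # reciprocal of the elicited judgment

# Geometric mean of each row of the comparison matrix, normalized to 1.
gm = {i: math.prod(compare(i, j) for j in objectives) ** (1 / len(objectives))
      for i in objectives}
total = sum(gm.values())
weights = {i: gm[i] / total for i in objectives}

for obj, w in weights.items():
    print(f"{obj}: {w:.3f}")
```

In a real application, the elicited judgments should also be checked for consistency (e.g., if safety outranks mobility and mobility outranks condition, safety should outrank condition), which is where the clear scale definitions discussed above pay off.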
Example: Delphi Method

MDOT used the Delphi method to assign weights to the goals and objectives in its methodology. Relevant stakeholders from MDOT, SHA, MTA, and MACo were gathered for a workshop. Each participant voted on the weights, and discussion was then held to review differences of opinion and observe the ranking results under various weight options. Ultimately, two rounds of voting were needed to reach consensus on the weights for each goal and objective. (2)

Example: Pairwise Comparison

DVRPC used pairwise comparison to assign weights to the objectives. DVRPC staff, as well as member agency, county, and city officials, were surveyed to determine the directionality and degree of preference between the criteria. DVRPC found that clear definitions were essential when facilitating the elicitation of stakeholder preferences. By communicating the types of projects that would score higher as a result of their elicited preferences, decision makers were able to make better informed decisions. (10)

In order to successfully apply the model, it is important to establish a process for each of the following:
• Identifying candidate investments,
• Calculating measures for each candidate,
• Prioritizing candidates,
• Using the initial prioritization to support resource allocation, and
• Updating key assumptions and parameters.

In addition, it may be helpful to implement a system to support your agency's resource allocation efforts. Depending on the problem size and the scope of the analysis, it may be appropriate to support the process in a spreadsheet, at least initially. This is a low-cost, simple way to keep track of projects and rankings. NCHRP tools are also available to provide a means for initial implementation; the documentation for the tool developed through this project can be found in Chapter 5. If a more sophisticated system is desired, multiple COTS systems are available for this purpose.

Example: Using DEA for Project Prioritization

DEA values projects that make progress in all of the goal or objective areas. This is a desirable feature in cases where you don't want to weight one goal over another. However, in many cases some goals are, in fact, more important than others, and this should be taken into account in the prioritization. If this is the case, one may use hypothetical projects that show representative progress in each of the objectives to guide the algorithm and produce relative efficiency results that better represent reality for your objectives. A paper by Sowlati, Paradi, and Suld titled "Information Systems Project Prioritization Using Data Envelopment Analysis" demonstrates the use of "sample/artificial projects" to calibrate the analysis. In this paper, decision makers determine a set of sample information system (IS) projects, including the priority score for each project. Real projects are then evaluated based on the priority scores of the sample projects. In the end, the results of the proposed DEA model using the sample projects were compared to the conventional application of DEA (without sample projects as a guide) and to the results obtained by performing a pairwise comparison to determine the weights on the different objectives. The authors reach the following conclusions (14):
• "Decision makers are generally better at providing sample projects rather than determining what the weights on variables should be."
• Decision makers are "faster at providing the samples than at performing all the comparison and scoring required in an Analytical Hierarchy Process (AHP)."
• "The proposed [DEA] model produced the appropriate scores to allow the projects to be ranked in priority order."

Step 8: Communicate the Results

The final step in implementing a cross-asset resource allocation approach is to communicate the results of the prioritization and integrate them with existing business processes. First, it is important to document the key assumptions used throughout the allocation process. Data assumptions are especially important, particularly for data items that are estimated. The approach used to calculate the weights on the objectives should also be documented, as should the process for applying the model established in Step 7 above.

You may wish to develop a communications plan to specify when and how to share information about the decision-making process. Sharing details of the prioritization approach and the results obtained using it enhances transparency concerning how decisions are made. The case studies of DVRPC and MDOT described in Chapter 3 illustrate how these two agencies documented their processes to provide information to other stakeholders and the public.

References

1. California Department of Transportation. 2018 SHOPP: Fiscal Years 2018–19 through 2021–22. March 2018.
2. Maryland Department of Transportation. Chapter 30 Transportation Project-Based Scoring Model, Technical Guide. 2018.
3. Arizona Department of Transportation. Arizona Long-Range Transportation Plan Update, Final Working Paper #5: Recommended Investment Choice (RIC) Development. October 2017.
4. Keeney, Ralph L., and Raiffa, Howard. Decisions with Multiple Objectives: Preferences and Value Tradeoffs. Cambridge, UK: Cambridge University Press, 1993.
5. California Department of Transportation. Caltrans Strategic Management Plan 2015–2020. 2015.
6. Delaware Valley Regional Planning Commission. Connections 2040 Plan for Greater Philadelphia. September 2013.
7. National Performance Management Measures. 23 C.F.R. § 490. 2016.
8. Virginia Department of Transportation. SMART SCALE Technical Guide. 2016.
9. Florida Department of Transportation. Strategic Investment Tool Highway Component, Measures Handbook. 2014.
10. Delaware Valley Regional Planning Commission. Transportation Improvement Program, Appendix D: DVRPC TIP Project Benefit Criteria. 2015.
11. Minnesota Department of Transportation. DRAFT Transportation Asset Management Plan. April 2018.
12. Cambridge Systematics, Inc., High Street Consulting Group, TransTech Management, Inc., Spy Pond Partners, and Ross & Associates. SHRP 2 Report S2-C02-RR: Performance Measurement Framework for Highway Capacity Decision Making. Transportation Research Board of the National Academies, Washington, D.C., 2009.
13.
Baha, Geni, Masliah, Maurice, Wolff, Rhys, and Park, Peter. Desktop Reference for Crash Reduction Factors. U.S. DOT Federal Highway Administration, 2008.
14. Sowlati, T., Paradi, J. C., and Suld, C. "Information Systems Project Prioritization Using Data Envelopment Analysis." Mathematical and Computer Modelling, Vol. 41, pp. 1279–1298, 2005.