**Suggested Citation:** "Step 11 - Evaluate and Integrate Risk and Uncertainty." National Academies of Sciences, Engineering, and Medicine. 2017.

*Guide for Conducting Benefit-Cost Analyses of Multimodal, Multijurisdictional Freight Corridor Investments*. Washington, DC: The National Academies Press. doi: 10.17226/24680.



**11.1 Goal**

The goal of this step is to deal with uncertainty in the estimates of benefits and costs in the BCA. Many of the data that analysts use to estimate the benefits and costs of a transportation investment are uncertain. The problem is magnified when a multimodal, multijurisdictional freight corridor investment is under consideration, requiring the use of diverse data and assumptions regarding freight movement. Examples of sources of uncertainty include those relating to freight mode choice decision making, future policy decisions made by different local and state governments, and future changes in freight trip generation rates. Uncertainty leads to risk that a BCA will produce results that are not supported by actual costs and benefits. This guidebook offers recommendations for dealing with risk and uncertainty within a BCA, starting with an identification of sources of uncertainty.

**11.2 Tasks**

**Identify Sources of Uncertainty**

Identify the sources of uncertainty within a BCA. This includes assumptions made during analysis as well as inputs to the analysis. It also includes model parameters that are used to estimate benefits and/or costs but that are not known and/or fixed at the time of analysis.

**Sources of Uncertainty**

Categorize the sources of uncertainty according to whether each is a source of benefit-side uncertainty, a source of project-cost uncertainty, or a member of the set of other sources of uncertainty. This categorization helps consumers of a BCA compare and contrast the uncertainty in anticipated benefits and costs. When evaluating multimodal, multijurisdictional freight projects, there is a need to estimate diversion and induced demand. There are likely to be multiple sources of uncertainty in models of diversion and induced demand. For example, the impedance function and the travel times used in a travel demand model can be sources of uncertainty. Table 18 lists several typical examples of sources of uncertainty.
Note specifically where, within the BCA, each source of uncertainty appears. For example, a population growth rate may be used within a travel demand model and a commodity flow model. The cost of construction materials may be used in a formula to estimate construction costs. This step is particularly important when evaluating multimodal, multijurisdictional projects because analysis requires a relatively complex research methodology and it is not always obvious where model parameters are used.
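The bookkeeping described above can be kept in a simple machine-readable register. The sketch below is illustrative only; the listed sources, the `used_in` notes, and the CSV layout are assumptions for the example, not prescriptions from this guidebook:

```python
import csv

# Illustrative uncertainty register: each entry records a source of
# uncertainty and where in the BCA it is used (all values hypothetical).
register = [
    {"source": "Population growth rate",
     "used_in": "travel demand model; commodity flow model"},
    {"source": "Cost of construction materials",
     "used_in": "construction cost estimation formula"},
    {"source": "Discount rate",
     "used_in": "final combination and comparison of benefits and costs"},
]

def write_register(entries, path):
    """Publish the register as a spreadsheet-friendly CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["source", "used_in"])
        writer.writeheader()
        writer.writerows(entries)
```

Additional columns for category, nominal value, rationale, and type of uncertainty can be carried as extra dictionary keys as the later tasks in this step are completed.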

**Nominal Value**

Report the nominal value assigned to each uncertain model input and the rationale behind this value. For example, an analyst should report that the interest rate used in discounting future and current benefits and costs follows the guidance of OMB Circular A-94. Completion of this step and the preceding step may reveal inconsistencies in the analysis, particularly given the diversity of models needed to estimate the benefits and costs of certain large-scale multimodal projects.

**Categories of Uncertainty**

Categorize the uncertainty created by each input parameter. A parameter may be a source of:

• Epistemic uncertainty: If uncertainty could be eliminated or reduced by collecting additional data, then the parameter in question is creating epistemic uncertainty. The VOFT is likely a source of epistemic uncertainty.
• Aleatory uncertainty: If there is natural randomness in a parameter, such as in the cost of a construction material, then the parameter is a source of aleatory uncertainty.
• Deep uncertainty: If there is no defensible way to assign a probability distribution to a parameter, such as a discount rate, then it is a source of deep uncertainty.

This step is important for determining how best to mitigate uncertainty and risk, both during the analysis in question and for future analyses. Model parameters used to value outcomes, for example the value of reductions in travel times, are typically sources of epistemic uncertainty. Data can be collected to develop more precise estimates of these parameters. Models of diversion are also often sources of epistemic uncertainty. Uncertainty stems from the fact that analysts do not have the data needed to forecast firms' decisions. More data would reduce this uncertainty. Models of induced demand, particularly when looking far into the future, are often sources of deep uncertainty.
It is impossible to assign defensible probability distributions to future land use patterns or to freight and passenger traffic volumes 30 years into the future.

Categorize the parameters according to whether they represent exogenous factors or policy variables. It is important to distinguish between uncertainty that is and is not created by public policy. For example, the costs of construction materials cannot be controlled by a government agency and are an exogenous factor. On the other hand, the discount rate used within an analysis is set by public policy. Certain model parameters can be considered either exogenous factors or policy variables depending upon context. As an example, consider the costs of accidents. The actual costs of accidents are, of course, not set by public policy and can go up or down over time, for example as a result of new technologies that reduce the probability that passengers will die as a result of an accident. On the other hand, parameters that fix the costs of accidents within a BCA can be set by public policy and will influence the results of an analysis.

Table 18. Typical sources of uncertainty in BCA.

| Benefit-Side (1 of 2) | Benefit-Side (2 of 2) | Cost-Side | Other Sources |
|---|---|---|---|
| Population growth rates | Assumed travel times | Material construction costs | Discount rates |
| Model forms used to forecast economic growth | Induced demand | Results of any contract negotiations | Temporal scales of analysis |
| Specific parameter values used to forecast economic growth | Impedance function | Property acquisition costs | Spatial resolution of analysis |
| Freight and passenger traffic growth rates | Rail company operating costs model | Labor costs | Optimism bias |
| Land use changes | Mode choice models | (Other) construction costs | Supply chain structure changes |
| Accident rates | Externality costs | Construction delays | Legislative action |
| Energy use models | Value of travel time reliability | Maintenance costs | Legal action |
| Values of time | Value of travel time reliability for freight | (Other) operating costs | Financial difficulties |
| Freight values of time | | | |

The results of multimodal and multijurisdictional projects are often particularly sensitive to public policy, and it is worth pointing out this sensitivity when evaluating projects. Categorization of uncertain parameters helps analysts discover and/or highlight the importance of policy levers. The results of efforts to further study risk and uncertainty may reveal ways to manipulate policy levers to minimize risk or maximize project benefits. Publication of results showing how exogenous factors create uncertainty also helps analysts point out unavoidable project risk.

**Publication of Results**

The identification and description of sources of uncertainty can be published in a spreadsheet or in a matrix within a written report. An example is shown in Table 19. Publication of these results is an acknowledgment of the risk inherent in the evaluation of multimodal, multijurisdictional freight corridor investments. Planners will have information they can use not just to evaluate the results of a BCA, but also to begin discussions of how to mitigate risk and otherwise ensure project success. Analysts and planners can and should account for the sources of uncertainty that have been identified, using methods described in the remaining sections of this chapter.

**Account for Uncertainty**

A variety of techniques can be used to account for uncertainty. This guidebook recommends transparency, sensitivity testing, Monte Carlo analysis, and robust decision making (RDM).
The mix of methods to apply in a given situation depends on available time and resources as well as on the results of the preceding effort to identify sources of uncertainty.

Table 19. Example description of sources of uncertainty.

| Source of Uncertainty | Category | Use | Nominal Value/Default Assumption | Rationale | Type of Uncertainty | Exogenous Factor or Policy Variable |
|---|---|---|---|---|---|---|
| Discount rate | Other (used in estimation of benefits and costs) | Final combination, comparison of benefits and costs | 7% | OMB Circular A-94 | Deep uncertainty | Policy variable |
| Value of travel time savings (TTS) and personal travel | Benefits-side uncertainty | Economic valuation of TTS | 12.30 (US 2012 $ per hour) | U.S. Department of Transportation (USDOT) Value of Time Guidance 2014 | Epistemic uncertainty | Can be considered either |
| Costs of construction materials | Project-cost uncertainty | Estimation of project capital costs | 12,300,000 (US 2016 $) | Engineer's notes (as documented elsewhere in BCA) | Aleatory uncertainty | Exogenous factor |

**Transparency**

Publish the details of the methods used during analysis, assumptions made, model input data, and model results. In other words, emphasize transparency. This guidebook recognizes that not all data can be shared. The best available data should be used even if the data cannot be published. Data that can be published should be published. Transparency allows others to replicate and/or interrogate the results of a BCA. Credibility is increased, issues can be addressed early to limit risk, and post hoc meta-analyses can reduce risk in future benefit-cost studies. Transparency also allows the various stakeholders to build off the work done during the BCA.

**Sensitivity Testing**

Building on the sources of uncertainty identified in the previous task, identify realistic ranges for uncertain parameters and realistic scenarios for uncertain assumptions, and then perform sensitivity studies over these ranges and scenarios. Identify the items on the list that are likely to generate the most risk. The number of items identified depends on both the amount of risk inherent in the study and the time and resources available for sensitivity analysis.

At a minimum, perform a sensitivity analysis. For each model parameter (assumption) in the list of identified items, note the nominal value of the parameter (the default assumption) along with the highest and lowest values that are deemed to be realistic (the most extreme among the set of plausible assumptions). Go through each parameter and assumption in turn. Set all other parameters and assumptions to their nominal values. Set the parameter or assumption of interest to one of its most extreme but realistic values and calculate project benefits and costs. Then set the parameter or assumption of interest to its other most extreme but realistic value, and again calculate project benefits and costs. The discount rate is an example of a model parameter that typically would be considered when performing sensitivity analysis.
There is no single correct discount rate to use in an analysis, in the sense that no universally agreed-upon way exists to relate future and current costs and benefits. It is worthwhile to see how project benefits and costs compare when using a discount rate thought to be reasonable but low, and again when using a discount rate thought to be reasonable but high. Table 20 shows an example of the results. These results identify the sources of uncertainty that carry the most risk and deserve further study.

For larger projects, estimate project costs and benefits for a small number of specific, named scenarios that are interesting, represent the values of particular stakeholders, or constitute a plausible composite vision of the future. In one or more of the named scenarios, set model parameters to values that are thought to be pessimistic but realistic. This secondary form of sensitivity analysis allows consumers of the analysis to evaluate project risk and uncertainty in an intuitive way, via comparison of scenarios that have meaning to them. The results of the pessimistic scenarios highlight the risks of an investment and are an important way to combat optimism bias. Publication of the results of sensitivity testing demonstrates that risk and uncertainty have been considered and quantified.

Table 20. Example results of a sensitivity analysis.

| Source of Risk and Uncertainty | Nominal Value/Default Assumption | Range of Realistic Values/Most Extreme Assumptions | Range of Resulting Project Costs (2012 US $, in Millions) | Range of Resulting Project Benefits (2012 US $, in Millions) |
|---|---|---|---|---|
| Annual population growth rate | 0.67% | 0.00–1.50% | 88.3 | 121.4–147.4 |
| Discount rate | 5% | 3–7% | 84.2–92.4 | 110.7–150.8 |
| Cost of construction materials | 44.7 (2012 US $, in millions) | 35.0–75.0 (2012 US $, in millions) | 78.7–118.6 | 135.7 |
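The one-at-a-time procedure described above can be sketched in a few lines of Python. The NPV model, parameter names, nominal values, and ranges below are hypothetical placeholders, not values taken from this guidebook:

```python
# One-at-a-time sensitivity sketch: vary each uncertain parameter between
# its realistic extremes while holding the others at their nominal values,
# and record the resulting net present value (NPV).

def npv(benefit_per_year, cost, discount_rate, years=30):
    """Discounted annual benefits minus an up-front cost (hypothetical model)."""
    pv_benefits = sum(benefit_per_year / (1 + discount_rate) ** t
                      for t in range(1, years + 1))
    return pv_benefits - cost

# Nominal value and realistic (low, high) extremes -- all illustrative ($M).
params = {
    "benefit_per_year": (10.0, 7.0, 14.0),
    "cost":             (90.0, 80.0, 120.0),
    "discount_rate":    (0.05, 0.03, 0.07),
}

nominal = {name: values[0] for name, values in params.items()}
results = {}
for name, (nom, low, high) in params.items():
    row = []
    for extreme in (low, high):
        trial = dict(nominal)   # all other parameters stay at nominal
        trial[name] = extreme   # parameter of interest set to an extreme
        row.append(npv(**trial))
    results[name] = tuple(row)  # NPV at (low, high) for this parameter
```

The `results` mapping shows how far the NPV swings for each parameter, which identifies the inputs that carry the most risk and deserve further study.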

Include at least four but no more than 15 named scenarios to study. It is difficult to have a meaningful discussion on risk and uncertainty if the results of only two or three scenarios are provided. On the other hand, it is time and resource consuming to define and compare 16 or more scenarios. For multijurisdictional, multimodal freight transportation projects, there may be a temptation to develop large numbers of scenarios to describe what may happen after an investment. Rather than separately coming up with 20 or more specific scenarios to study in a sensitivity analysis, the authors recommend a smaller-scale sensitivity study followed by applications of Monte Carlo analysis, RDM, and other advanced methods for dealing with risk and uncertainty.

**Monte Carlo Analysis**

Apply Monte Carlo analysis[^15] where uncertain model input parameters can be assumed to follow known probability distributions and where time and resources are sufficient. Review the list of sources of uncertainty, this time identifying the model inputs that can be assumed to follow a known probability distribution function. Such items include any model parameters that are not considered sources of deep uncertainty. For example, the VOFT can be assumed to follow a distribution function and can be used in a Monte Carlo analysis, but the discount rate used to relate future and current benefits and costs should not be used in a Monte Carlo analysis.

**Estimate Probability Distributions**

Estimate or assume probability distributions for the parameters of interest. For example, an analyst may assume that the VOFT is drawn from a normal distribution with a given mean and standard deviation, following the advice of a published research report. When defining a distribution for annual population growth, the authors recommend analyzing census data.
If observed annual population growth rates in the study area in the recent past look as though they have been drawn from a normal distribution, then assume that future annual population growth rates will follow a similar distribution. If empirical observations of a parameter, such as the annual population growth rate, do not appear to follow a known distribution function, then use the observations themselves to form an empirical distribution function. An empirical distribution function assigns probability mass 1/n to each of n observed data points. This guidebook describes how to use both empirical and assumed distribution functions in the context of a Monte Carlo analysis in the following paragraphs.

Define joint distribution functions for any sets of parameters that cannot be assumed to be independent of one another. For example, annual population growth and freight traffic growth rates in an area are likely to be correlated. The authors recommend collecting pairs of observations of these parameters and defining an empirical distribution function based on these pairs of observations.

**Sample from Distributions**

The next step in a Monte Carlo analysis is to sample from the distributions. Go through each uncertain parameter in turn and select a random value for that parameter based on its distribution. The result is a full set of parameter values. Determine and record the benefits and costs of the different project alternatives under study given the identified parameter values. Do the same for other important model outputs such as pollutant emissions and jobs generated. Repeat this process of sampling from previously defined distribution functions and calculating important outputs of an analysis multiple times. The number of times to repeat the process should be determined by available time and resources, but should typically be on the order of 10,000.
[^15]: Monte Carlo analysis refers to a class of computational algorithms that rely on repeated random sampling to obtain numerical results.
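The sampling loop just described can be sketched as follows. The VOFT distribution, the observed growth rates, and the benefit model are illustrative assumptions for the example; a real analysis would use the distributions estimated in the preceding task:

```python
import random
import statistics

random.seed(0)  # fixed seed so the illustration is reproducible

# Empirical distribution: each of the n observed growth rates receives
# probability mass 1/n when drawn uniformly at random.
observed_growth = [0.004, 0.006, 0.007, 0.009, 0.011]

def simulate_benefits():
    voft = random.gauss(25.0, 4.0)           # VOFT assumed Normal($25/hr, $4/hr)
    growth = random.choice(observed_growth)  # draw from the empirical distribution
    base_hours = 100_000                     # annual hours saved (hypothetical)
    # Hypothetical valuation: hours saved, valued at VOFT, grown over 20 years.
    return base_hours * voft * (1 + growth) ** 20

# Repeat the sample-and-calculate process on the order of 10,000 times.
draws = sorted(simulate_benefits() for _ in range(10_000))

# Summaries recommended for publication: mean, five-number summary,
# and a central 95% interval of the simulated output.
mean = statistics.fmean(draws)
five_number = (draws[0], draws[len(draws) // 4], statistics.median(draws),
               draws[3 * len(draws) // 4], draws[-1])
interval_95 = (draws[int(0.025 * len(draws))], draws[int(0.975 * len(draws))])
```

Correlated parameters, such as population and freight traffic growth, can be handled by drawing observed pairs with a single `random.choice` call instead of drawing each parameter separately.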

The resulting 10,000 or so estimates of model outputs describe the uncertainty inherent in evaluating the outputs. The authors recommend publishing:

• The mean: the average value of the model output across all of the runs of the Monte Carlo analysis and arguably the most commonly used and widely understood statistic.
• The five-number summary: the minimum, 25th percentile, median, 75th percentile, and maximum observed values of the model output. These quantities are frequently used in statistics as a way to summarize a set of observed data. The median is often labeled a typical value for a set of observed data, and the other statistics let readers see how close to this typical value most of the observed data fall.
• A 95% confidence interval: often used as a range of plausible values for a statistic of interest. Observed data points outside of the 95% confidence interval may be considered outliers, and it may not make sense to evaluate a project based on particularly unusual outlying data points.
• A histogram for each model output of interest: a visual summary of the distribution of a statistic of interest. This can reveal clusters, symmetry, or skewness in the results.

Many software packages can help analysts run Monte Carlo simulation studies. These packages automate the steps, including the production of a variety of statistics and graphs. Be careful not to produce and publish too many statistics and graphs. It is important to avoid confusing the consumers of a BCA with statistics that are not particularly relevant to the conclusions of the analysis.

**Robust Decision Making**

Analysts should apply RDM when multiple parameters or assumptions have been categorized as sources of deep uncertainty and when time and resources allow application of RDM.
The list of sources of uncertainty referred to in this section includes notes on which items are sources of deep uncertainty (that is, model parameters and assumptions that cannot be described via probability distributions). An example is the discount rate used to relate future and current benefits and costs.

**Define Plausible Values**

First, define a discrete set or a continuous range of plausible values for each important but uncertain model input parameter, and a discrete set of alternatives for each assumption that introduces substantial uncertainty into the modeling process. As in the application of sensitivity analysis and Monte Carlo analysis, there is no universally correct number of items to consider, and expert judgment must be applied. For example, discount rates are particularly important for aggregating estimates of project benefits and costs. Discount rates of anywhere from 0 to 7% seem plausible.

Use an automated procedure to generate a large number of scenarios that cover the space of plausible parameter values and assumptions. There is no correct number of scenarios to generate, but prior studies have used on the order of 100,000 scenarios. There is no need to assign probabilities to the scenarios. Indeed, one of the strengths of RDM is that it avoids assuming how likely different scenarios are.

**Estimate Benefits and Costs**

Estimate benefits and costs of project alternatives in all generated scenarios. Tools developed specifically with RDM in mind allow analysts to rerun analyses and then to display and interrogate the results.

Discuss the results. The interrogation of the results by different stakeholders and by analysts, and the discussions that follow, are key and somewhat unique steps in RDM. In particular, many tools are available for identifying particularly interesting scenarios. Find scenarios where project alternatives perform particularly well or poorly.
For example, a large investment in passenger rail may not perform well in analyses where the future price of gasoline is assumed to be low. Find thresholds in parameter values where the relative ranking of project alternatives changes. For example, a project's benefits may exceed its costs only when a discount rate of less than 3% is applied.
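A stripped-down version of this scenario exploration can be sketched as follows. The parameter ranges and the NPV model are hypothetical, and real RDM studies use purpose-built tools rather than a bare loop:

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

def npv(annual_benefit, cost, discount_rate, years=30):
    """Hypothetical project model: discounted benefits minus an up-front cost."""
    pv = sum(annual_benefit / (1 + discount_rate) ** t
             for t in range(1, years + 1))
    return pv - cost

# Cover the space of plausible values; no probabilities are assigned to
# scenarios, in keeping with the RDM approach (ranges are illustrative).
scenarios = [{"annual_benefit": random.uniform(5.0, 15.0),   # $M per year
              "cost": random.uniform(80.0, 150.0),           # $M
              "discount_rate": random.uniform(0.0, 0.07)}
             for _ in range(20_000)]

results = [dict(s, npv=npv(**s)) for s in scenarios]

# Interrogate the results: in which scenarios does the project look weak?
weak = [r for r in results if r["npv"] < 0]
# Threshold-style question: does the project fail even at low discount rates?
weak_despite_low_rate = [r for r in weak if r["discount_rate"] < 0.03]
```

Stakeholders can then filter `results` by whatever conditions interest them, such as scenarios where NPV falls below a chosen threshold, much as described in the surrounding text.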

**Examine Results**

The results of an RDM study show the situations under which different project alternatives appear most and least promising. Publication of these results leads to fruitful discussions. Discussion can focus on how likely or important different scenarios are, which is an intuitive way to discuss risk and uncertainty. This type of discussion will be notably different from discussions that follow Monte Carlo analysis, where results are aggregated and interested parties cannot typically examine individual scenarios that may interest them.

RDM differs from sensitivity analysis in several ways. The scenarios considered in an RDM analysis will fully cover the space of feasible ways to set model parameters, which is not typically the case in sensitivity analysis. The initial results of an RDM analysis are often examined by studying where scenarios sit in objective space and by using custom RDM tools. This can, for example, allow an environmental non-governmental organization to immediately and automatically pick out scenarios where greenhouse gas emissions exceed some threshold value.

Perhaps most importantly, RDM analysis is set up to encourage discussion among stakeholders and plan revision. Different stakeholders can focus on scenarios that interest them, either because the scenarios are set up in a way that interests the stakeholders or because the outcomes of the analysis in these scenarios concern the stakeholders. As an example, say that two stakeholders are evaluating the results of an RDM analysis: a state DOT and an environmental non-governmental organization. The DOT may focus on the results where the discount rate parameter is set to 7%, as this is the value used in internal discussions at the organization. The DOT can compare the relevant results to results obtained when evaluating other projects, again using a 7% discount rate. The DOT can also request further details on the scenarios where the NPV of a proposed project is less than $40 million, which will allow it to see the circumstances under which the proposed project looks weak. The environmental organization, for its part, can study the situations where greenhouse gas emissions exceed a threshold value that was set prior to analysis.

Discussions can lead to the identification of new project alternatives that are more robust than the initial alternatives, performing relatively well across the space of plausible scenarios and in the eyes of all stakeholders. With RDM, policy makers can study and evaluate projects in terms of scenarios instead of having to compare point estimates or previously established distributions of benefits and costs. Stakeholders with different beliefs regarding future developments can refer to the same analysis during discussion. While RDM has rarely been applied in BCA of transportation projects, it offers clear benefits for stakeholder involvement.

**Other Approaches**

Numerous other useful approaches are available for dealing with risk and uncertainty. For example, an analyst can document sources of risk and uncertainty, along with mitigation plans, in a risk register. External review of studies of multimodal, multijurisdictional freight corridor investments can highlight inappropriate assumptions or methods used in a study. First focus on the basic techniques described above. If the results of sensitivity testing, Monte Carlo analysis, or RDM indicate that substantial risk exists, consider taking additional action, such as developing a risk register or asking external reviewers to go over a study.

**Address Optimism Bias**

Optimism bias is a source of risk and uncertainty. Deal with optimism bias by using the same techniques used to address other sources of risk and uncertainty. Past studies have typically underestimated the costs and overestimated the benefits of transportation investments. For

example, Flyvbjerg et al. (80) found that engineers underestimated the costs of highway and rail improvements by 20 and 45%, respectively. Optimism bias is the cause of this widespread and consistent error.

There is no single solution to the problem of optimism bias. Do not simply increase the discount rate to account for optimism bias. Doing so impacts estimates of both benefits and costs in unforeseen ways, leads to more myopic policies and investments, and may convince analysts to distort their projections.

**Transparency**

Providing transparency for the assumptions, tools, and models used is the best approach. Transparency allows for post hoc analysis of the assumptions used in transportation studies. This, in turn, helps transportation engineers learn from past mistakes, develop more realistic assumptions, and, ultimately, reduce optimism bias. In addition, when managers make transparency a priority, research methods come under increased scrutiny even before publication. This can lead to increased realism and more rapid innovation and adoption of best practices.

**Testing Scenarios**

Sensitivity testing, Monte Carlo analysis, and RDM reveal the performance of transportation projects in situations that are realistic even if analysts have (consciously or inadvertently) framed their analyses using optimistic scenarios or assumptions. Pay particular attention to publishing the results of sensitivity testing and RDM for the pessimistic cases. Exploration of the results in these cases is particularly valuable in the fight against optimism bias. The results of an application of RDM are scenarios that consist of specific assumptions and parameter values with resulting cost and benefit estimates. No judgment is made regarding how likely different scenarios are, and the influence of optimism bias is minimized. If an optimistic scenario is chosen as the nominal or default scenario in a study, RDM, at the very least, reveals the results in more realistic scenarios.

The optional additional steps mentioned above for dealing with risk and uncertainty, including risk register development and external review of studies, also mitigate optimism bias.

**Rerun the BCA and Update BCA Results**

Once the risk factors are discussed and parsed out by approach in the BCA, return to earlier steps and rerun the BCA. An accompanying worksheet walks the user through these approaches and shows how to parse inputs to the BCA. Recognize that when using Monte Carlo analysis, the decision criterion for NPV changes to the expected NPV (the expected value of the NPV), the "most likely" NPV based on the mode of the distribution used, or the most conservative (lowest) value of NPV. Use Table 18 and Table 19 to guide the choice of variables to home in on and to determine the treatment and identification of critical factors, variables, or parameters in terms of sensitivity analysis, scenario analysis, Monte Carlo simulation, or RDM. When there is much at stake and the projects are truly transformational, a conservative approach may be to use the minimum NPV and RDM. When projects are directed to certain areas or groups, it is important to focus on the regional analysis of NPVs.

**11.3 Inputs: Recommended Tools and Data Sources**

A number of tools can assist the analyst with evaluating and integrating sources of risk and uncertainty in the BCA:

• United Kingdom Department for Transport's Transport Analysis Guidance: WebTAG (2013) (https://www.gov.uk/guidance/transport-analysis-guidance-webtag).

• Developing Harmonised European Approaches for Transport Costing and Project Assessment (HEATCO) (2006) (http://heatco.ier.uni-stuttgart.de).
• European Union's RAILPAG: Railway Project Appraisal Guidelines (2005) (http://www.eib.org/attachments/pj/railpag_en.pdf).
• Procedures for Dealing with Optimism Bias in Transport Planning (2004) (80).
• New Zealand Transport Agency's Economic Evaluation Manual (2013) (http://www.nzta.govt.nz/resources/economic-evaluation-manual/).
• NCHRP Report 658: Guidebook on Risk Analysis Tools and Management Practices to Control Transportation Project Costs (2010).
• RAND Corporation's RDMlab: Robust Decision Making for Good Decisions without Predictions (http://www.rand.org/methods/rdmlab.html).
• In-house tools or private-sector risk analysis tools such as Palisade Corporation's @RISK or Crystal Ball.

**11.4 Best Practices and Examples**

Best practices for Step 11:

• Identify sources of uncertainty and risk, and describe them in detail.
• Compare projects in terms of the distributions of their projected benefits and costs, and/or in terms of their projected benefits and costs across a range of realistic scenarios.
• Publish the results of studies investigating project uncertainty and risk, and allow external parties to conduct follow-on work, including post hoc analyses of assumptions and model results.

Example 1: The National Gateway (www.nationalgateway.org) is a multijurisdictional railway project that allows double-stack trains to travel between the Midwest and ports in the Mid-Atlantic region. In their TIGER grant application, project proponents included a clear description of a sensitivity study they ran in an appendix entitled "Cost Benefit Methodology in Evaluation of Project Costs and Benefits and Economic Impacts." Further details regarding this example are provided in Appendix K to this guidebook.
Example 2: The Downeaster Service Optimization Project (www.amtrakdowneaster.com/tiger-6-grant-application) aims to improve rail service between Brunswick, Maine, and Boston, Massachusetts. In their application for a 2014 TIGER discretionary grant, project proponents put together a BCA that included 19 pages of appendix material, including the results of a Monte Carlo simulation. Although not perfect, the material did demonstrate an acknowledgment of the importance of risk and uncertainty. Further details regarding this example are provided in Appendix K to this guidebook.

**11.5 Common Mistakes**

Common mistakes occur when the project team:

• Ignores risk and uncertainty and presents specific results, leading to an appearance of false precision.
• Publishes only point estimates of project benefits and costs, leading to an appearance of false precision.
• Publishes ranges of project costs and benefits without reference to how the ranges were found.
• Uses only two or three scenarios or relatively small ranges of parameter values when performing sensitivity analysis, leading to an underestimation of uncertainty and risk.

• Ignores uncertainty in policy variables, leading to an underestimation of uncertainty and risk.
• Ignores uncertainty in exogenous factors, leading to an underestimation of uncertainty and risk.
• Ignores potential correlations between variables and uses individual variables considered separately to describe uncertainty. The results of each sensitivity study may be unrealistic, and the resulting ranges of project costs and benefits may be too narrow to capture true uncertainty and risk.
• Ignores sources of deep uncertainty, leading to an underestimation of uncertainty and risk.