CHAPTER 7

Implementation

This chapter describes lessons learned regarding the implementation of performance management programs. It combines relevant insights from previous chapters and organizes them around an implementation model that consists of four basic steps: Initiate, Design, Execute, and Apply. While this model provides a general approach for structuring implementation efforts, the specific details for any given agency need to be tailored during the Design step. In practice, few agencies are likely to use a simple linear approach to developing a performance management program. Many agencies already report measures or have pieces of a program in place. Regardless of where an agency finds itself within each of these steps, the questions described in the four steps will help agencies determine how to strengthen their existing programs.

7.0 Initiate

The first step, Initiate, involves setting the direction for the performance management program. It requires answering three basic questions:

• Why implement the program?
• Who will be involved in it?
• What is its scope?

7.0.1 Why Implement the Program?

Clearly defining the need for performance management is important for focusing implementation activities and can help create buy-in for the initiative. The agencies interviewed as part of this research project described several catalysts for their performance management efforts. The examples ranged from a very broad leadership initiative aimed at instilling a performance culture in an agency (Ohio DOT) to a very narrow initiative focused on addressing a specific project delivery challenge (Virginia DOT). Other examples of "why" included the following:

• A desire to communicate achievement of strategic goals in terms of "tangible results" (Missouri DOT);
• The need to improve an agency's accountability (Washington State DOT); and
• A goal of providing more efficient and effective transit service (PACE).

Regardless of the nature of the specific issues driving a performance management initiative, defining and communicating those issues is critical in the implementation process. The fundamental question that needs to be answered is, "What do you hope to achieve with the performance management program?"
7.0.2 Who Will Be Involved in It?

Strong leadership from the chief executive is almost always a defining factor in the success of a performance management program. However, significant progress can be made and substantial benefits can be realized even before upper management becomes fully engaged. Performance management programs can begin within any organizational unit and be successful if key staff in that unit are supportive. Once established and applied within an individual unit, performance data tend to filter up through an agency because managers have a strong desire for more and better information on which to base their decisions. Initially, a clear understanding of who is involved in the process, regardless of their level of responsibility, is more important than ensuring that upper management is supportive. In defining roles and responsibilities, agencies should, at a minimum, consider the following:

• The Champion of the Program: Who will be responsible for coordinating the effort and for ensuring its overall success?
• The Audience for the Performance Results (internal and external): Who will use the information, and what specifically will they use it for?
• Data and Measure Owners: Who manages the data that will be used in the program, and who will be responsible for reviewing each measure before it is published?

7.0.3 What Is Its Scope?

As described in the previous sections, there is no optimal size or shape for a performance management program. Successful programs are highly tailored to the implementing agency: its wants, its needs, its culture, and so on. Regardless of the breadth or depth of a program, agencies should clearly define its scope during the Initiate phase.
Issues to consider include the following:

• What is the final product, and what is the frequency of reporting? For example, a web-based dashboard with real-time data (Virginia DOT); a comprehensive performance report published quarterly (Washington State DOT); a series of annual reports designed to support the programming process (Florida DOT); etc.
• What functions, program areas, and modes will be measured? For example, highway congestion, safety, preservation, maintenance, operations, project delivery, etc.
• What portions of the network will be measured, and at what level of granularity? For example, will all functional classes be included? Will results be reported statewide or by district?

7.1 Design

The second step, Design, consists of developing the details of the program. This step includes developing specific measures and designing mechanisms for reporting results.

7.1.1 Selecting Measures for Agency Strategic Priorities

Performance measures are the building blocks of any performance management program. Therefore, the selection of specific measures can make or break a new initiative. A number of previous research reports have covered the selection of measures in detail. Examples include:

• NCHRP Report 446: A Guidebook for Performance-Based Transportation Planning, Transportation Research Board of the National Academies, Washington, D.C., 2000.
• Strategic Performance Measures for State DOTs: A Handbook for CEOs and Executives, American Association of State Highway and Transportation Officials, Washington, D.C., 2003.
• NCHRP Report 551: Performance Measures and Targets for Transportation Asset Management, Transportation Research Board of the National Academies, Washington, D.C., 2006.

Following are some highlights from these and other publications on measure selection. Agencies are encouraged to refer to these reports for more detailed guidance.

7.1.1.1 Criteria for Selecting Measures

A number of criteria have been developed for evaluating potential measures. The criteria can be highly tailored to an agency's specific performance needs, and in practice, the development of specific criteria is often treated as its own step in the implementation process. However, based on the results from previous research and insights from the agencies that participated in this study, performance measures at a minimum should meet the following three criteria:

• Strategic Alignment: The measures are consistent with the policies and priorities identified in the Initiate step.
• Useful for Decision Support: The measures enable decision-makers to identify problems and assess the implications of DOT action.
• Feasible to Report: The measures can be calculated with existing data; or, if new data are required, these data can be collected and managed in a cost-effective manner.

Most agencies have performance data in one form or another. A practical approach to developing measures is to review existing measures and data resources, assess the measures in terms of the above criteria, and then fill in the gaps, relying heavily on existing data sets.

7.1.1.2 Documenting Measures

Documenting the details of the selected measures enables consumers of the results to fully understand the sources and uses of the information being provided. It also captures the details required to compile data and calculate the measure in subsequent reporting periods.
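A documented measure of this kind can be captured in a structured record so that it is reviewable and reusable across reporting periods. The following is a minimal sketch, assuming Python as the implementation language; the class and field names are illustrative, not taken from any agency's system.

```python
from dataclasses import dataclass, field

@dataclass
class MeasureDefinition:
    """One documented performance measure; fields mirror a documentation template."""
    name: str
    definition: str
    owner: str                  # unit responsible for reviewing the measure
    use: str                    # how the measure supports decisions
    derivation: list[str]       # ordered calculation steps
    data_sources: dict[str, str] = field(default_factory=dict)
    aggregation: str = ""

# Hypothetical example, loosely based on a state crash-rate measure:
crash_rate = MeasureDefinition(
    name="State Highway System Crash Rate",
    definition="Crashes per 100 million VMT",
    owner="Traffic Engineering Services Unit",
    use="Lagging indicator of safety performance, reported annually",
    derivation=[
        "Count crashes by severity on state highways",
        "Obtain vehicle miles traveled (VMT) on state highways",
        "Divide the number of crashes by VMT in millions",
    ],
    data_sources={"Number of Crashes": "Statewide crash database"},
    aggregation="By region and functional class",
)
print(crash_rate.owner)  # Traffic Engineering Services Unit
```

Keeping the derivation as an ordered list of steps makes the calculation auditable when a different analyst compiles the measure in a later reporting period.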
Table 7.1 presents a template for documenting performance measures and provides an example of one of the measures developed by the Oregon DOT as part of its Highway Performance Management System.²

7.1.2 Design Reports

Effective performance reports enable stakeholders to access and understand results. In designing these reports, agencies should consider what information to provide, how best to present it, and the mechanics of generating and accessing it.

The content and format of performance reports vary widely based on an agency's specific needs as determined during the Initiate step. However, effective reports typically contain the following information:

• Measures organized by goal or strategic objective;
• The current value of each measure in relation to a specified target;
• Trend information;
• Future projections of performance (if appropriate); and
• Background material and/or a narrative so that the audience can better understand the results.

The two basic options for the reporting mechanisms are (1) standard reports or brochures or (2) interactive access to results via a web portal or management system. In evaluating these two options, agencies should consider the context in which the results will be used. For example, if the main objective is to provide an annual snapshot of system performance as background information for the planning process, a standard report may be appropriate. If, on the other hand, the objectives include enabling external stakeholders to track the real-time progress of construction projects, a web-based system might be more appropriate.

Figures 7.1 and 7.2 present two reporting examples. Figure 7.1 shows the Virginia DOT's online dashboard. It provides a snapshot of current performance; indicates the degree to which current performance varies from target values using a green, yellow, and red scale; and enables users to drill down for further details. For example, users can click on the Projects gauge and view detailed cost and schedule information for individual construction projects.

Figure 7.2 illustrates a performance scorecard used by the Minnesota DOT. The scorecard represents a static snapshot of current performance in terms of whether the performance in each area is good, satisfactory, or poor. In addition, smaller arrows provide trend information. For example, the up arrow next to "Bridges in Poor Condition" indicates that this measure has improved since the previous reporting cycle.

7.2 Execute

The next step in the process, Execute, involves performing the mechanics of the performance management program. This step includes collecting and/or compiling data, calculating the measures, and generating and distributing reports. These activities represent a sustained effort that must be performed on a continuous basis throughout the life of the performance management program. The long-term commitment (and costs) associated with formally adopting and reporting performance measures should be considered in the Design step of the implementation process.

Transportation Performance Management: Insight from Practitioners

² Oregon DOT, Highway Performance Management System User Guide, 2006.
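The green/yellow/red scale described for the Virginia DOT dashboard boils down to a simple classification of each measure's current value against its target. A minimal sketch follows; the 5% tolerance band and the assumption that higher values are worse are illustrative choices, not taken from any agency's dashboard.

```python
def status_color(value: float, target: float, tolerance: float = 0.05) -> str:
    """Classify a measure on a green/yellow/red scale relative to its target.

    Assumes higher values are worse (e.g., a crash rate). The tolerance
    parameter defines the yellow band above the target.
    """
    if value <= target:
        return "green"                       # meeting or beating the target
    if value <= target * (1 + tolerance):
        return "yellow"                      # slightly off target
    return "red"                             # well off target

print(status_color(0.95, 1.0))  # green
print(status_color(1.03, 1.0))  # yellow
print(status_color(1.20, 1.0))  # red
```

A dashboard gauge would then render the returned color and let users drill down into the underlying data for any measure that is yellow or red.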
Table 7.1. Template for documenting performance measures.

Measure: State Highway System Crash Rate

Definition: Number of total crashes and fatalities per 100 million VMT and 1,000 population

Owner: Traffic Engineering Services Unit

Use: Tracking crashes by severity and type on the state system allows the Oregon DOT to better gauge the success of engineering strategies geared toward specific types of crashes (e.g., run-off-the-road crashes). The measure is a lagging indicator of safety performance. The measure is reported annually.

Derivation:
1. Identify the number of crashes by severity (fatalities, injuries, property damage) on state highways.
2. Identify the number of vehicle miles traveled on state highways and the number of people in the state.
3. Divide the number of crashes by vehicle miles traveled in millions.

Data Sources:
Number of Crashes: Statewide crash database.
Number of Fatalities: Fatality Analysis Reporting System (FARS).
Vehicle Miles Traveled: Oregon mileage report.
Population: To be determined.

Aggregation: By region and functional class (functional class aggregation will use VMT base only, not population).
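The derivation steps in Table 7.1 reduce to two normalizations: crashes per 100 million VMT and fatalities per 1,000 population. A minimal sketch of the arithmetic follows; the function names and the input numbers are illustrative, not actual Oregon data.

```python
def crash_rate_per_100m_vmt(crashes: int, vmt: float) -> float:
    """Crashes per 100 million vehicle miles traveled (step 3 of the derivation)."""
    return crashes / (vmt / 100_000_000)

def rate_per_1000_population(fatalities: int, population: int) -> float:
    """Fatalities per 1,000 population (the second normalization in the definition)."""
    return fatalities / (population / 1_000)

# Illustrative numbers only: 25,000 crashes over 25 billion VMT.
print(crash_rate_per_100m_vmt(25_000, 2.5e10))      # 100.0 crashes per 100M VMT
print(rate_per_1000_population(450, 3_000_000))     # 0.15 fatalities per 1,000 people
```

Normalizing by exposure (VMT or population) is what makes the measure comparable across regions and functional classes, as the Aggregation row of the template anticipates.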
Each potential measure should be evaluated in terms of its benefits relative to the costs of calculating it. For this reason, performance programs often rely heavily on existing data sets. In these cases, the actual collection and management of the underlying source data does not represent a new initiative; rather, these activities occur as part of the agency's existing operating procedures. However, the supporting data often reside in a number of systems and databases throughout an agency and are managed by different organizational units. Therefore, the amount of effort required to compile even existing data into an integrated performance report should not be underestimated.

All successful performance management programs have a champion or designated staff responsible for sustaining the program. The time commitment associated with these responsibilities will vary significantly based on the breadth and depth of the overall effort.

A common strategy for decreasing the time and effort required to execute a performance program is to automate as much of the process as possible. In most cases, performance measure values are derived by performing a series of calculations on data that reside somewhere in an agency. The process of pulling data from various sources and performing calculations lends itself well to automation. Other aspects of the program that can be automated include the workflow associated with reviewing and approving results and the generation of standard reports. For example, the Maryland DOT has implemented a Performance Assessment and Collection Tool (PACT) that automates some of the day-to-day efforts associated with collecting and reporting performance. The tool enables an agency to identify, document, manage, and report on its goals, objectives, and performance measures.

Figure 7.1. Virginia DOT dashboard.
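The pull-data-then-calculate pattern described above lends itself to a simple registry: each measure pairs a data-pull function with a calculation, and a report run iterates over the registry. This is a minimal sketch under that assumption; the names and the stubbed data source are illustrative and are not drawn from Maryland's PACT tool.

```python
from typing import Callable

# Each measure registers (pull, calc): pull fetches raw values from a source,
# calc reduces them to the reported measure value.
Pipeline = dict[str, tuple[Callable[[], list[float]], Callable[[list[float]], float]]]

def run_report(pipeline: Pipeline) -> dict[str, float]:
    """Pull data and compute every registered measure; the result feeds a standard report."""
    return {name: calc(pull()) for name, (pull, calc) in pipeline.items()}

# Example registration: a stubbed data source and an averaging calculation.
pipeline: Pipeline = {
    "avg_bridge_condition": (
        lambda: [72.0, 68.0, 80.0],          # stand-in for a database query
        lambda xs: sum(xs) / len(xs),        # measure calculation
    ),
}
print(run_report(pipeline))
```

In practice, the pull functions would query the agency's existing databases, and the same registry could drive the review/approval workflow and scheduled report generation mentioned above.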
7.3 Apply and Evaluate

The final step in the implementation model, Apply, involves using the performance results to make better decisions. Similar to the Execute step, the Apply step represents a sustained long-term commitment. The main difference between these two steps is that using performance results requires agencies to address organizational, institutional, and cultural issues that go well beyond the logistical challenges of calculating them. In fact, many agencies that have made significant investments in collecting performance data have not yet made it to the Apply step. Implementing this step represents the major challenge in moving from performance measurement to performance management. For this reason, the insights from practitioners presented in Chapters 3 through 6 focused largely on this area.

Figure 7.2. Minnesota DOT performance scorecard.

Implementing performance management at an organization is inevitably a challenge. Managers and employees are often used to a way of conducting business that, for a variety of reasons, they tend to hold onto. At the same time, increasing challenges in project delivery, intractable problems with congestion and safety, and a renewed focus on achieving efficient use of public funds have placed increased attention on any failures. Performance management takes significant time and effort to develop, especially if it is to last.

This Guidebook has provided some insights into how other transportation agencies have successfully begun and sustained the performance management process, including the following:

• Begin by focusing on a clear and present challenge faced by the agency, and use performance measures to help describe the problem and provide evidence for the most appropriate solution;
• Bring managers and employees along with the program, building their capability to use and manage with data, while also focusing on ways that they can do better;
• Expand the program over time and into the day-to-day processes and culture of the agency, such that there is an expectation that quality data will be used to support major decisions and agency staff will take ownership of their work;
• Train agency managers and employees to focus on the needs of agency customers and to balance standard engineering and programmatic considerations against these needs so that the agency appears credible and capable to the public and legislative bodies;
• Sustain these efforts over time by ensuring that the program is not tied to a single individual or office within the agency; and
• Ensure broad distribution of performance data to legislators, stakeholders, and the public, building constituencies for the continued use of performance management at the agency.

As a DOT applies performance management to its day-to-day processes, it is vital that it also evaluate the program. This evaluation should take into account the design of the program and its implementation, and it should provide a feedback loop so that adjustments can be made to performance measures and procedures.