Suggested Citation: "Resource Papers." National Academies of Sciences, Engineering, and Medicine. 2005. Performance Measures to Improve Transportation Systems: Summary of the Second National Conference. Washington, DC: The National Academies Press. doi: 10.17226/13658.


RESOURCE PAPERS


RESOURCE PAPER

Performance Measurement in Transportation: State of the Practice

Theodore H. Poister, Andrew Young School of Policy Studies, Georgia State University

Transportation agencies in the United States and elsewhere have dramatically transformed the way they do business over the past 10 to 15 years, and performance measurement is an essential ingredient in their quest for managing effectively to produce results. The general movement toward managing for results has been driven by (a) increased demands for accountability and improved performance from the public, elected officials, and the media; (b) strong leadership and the desire to strive for excellence within agencies; and (c) recognition that sea changes in the environment in which transportation agencies function require strategic thinking to plot new courses of action and then measure success in implementing them.

The commitment to increased accountability and performance has led to a plethora of approaches to improved management and decision making, typically initiated first by a few leading-edge agencies and then adopted by the mainstream. The approaches have radically transformed, or have the potential to transform, the way these agencies operate on a day-to-day basis. Tools include strategic planning and management, performance-based transportation systems planning, stakeholder engagement processes, asset management, performance management, performance budgeting, process reengineering, and quality/productivity improvement processes. Transportation agencies have also adopted performance measurement systems widely. These systems are results-oriented management tools in their own right but are also critically important in linking and aligning other planning and management processes.

Transportation professionals need to remember that pressure for more effective and responsive government in this country is by no means limited to the field of transportation (Newcomer et al. 2002).
Requirements for systematic goal setting and performance measurement are embodied in the Government Performance and Results Act of 1993 at the federal level and in legislative or executive mandates in virtually all of the states (Melkers and Willoughby 1998; Aristigueta 1999). Local governments have also jumped on the performance measurement bandwagon, as exhorted by Osborne and Gaebler (1992), who pointed out that "if you don't measure results, you can't tell success from failure."

The extent to which this results-oriented management approach has permeated government in the United States was examined by the Government Performance Project (GPP), conducted jointly by university researchers and the editors of Governing magazine. Applying a set of systematic criteria through detailed surveys and site visits, the GPP evaluated all 50 state governments, 35 major cities, and a sample of federal agencies in terms of their practices in the areas of financial management, human resources, information technology, capital management, and managing for results. The resulting grades ranged from A to F and indicated, not surprisingly, that while some jurisdictions indeed have strong management capabilities, there is still considerable room for improvement (Ingraham et al. 2003). Parenthetically, in its second round of evaluations at the state level, the GPP plans to grade the performance of the departments of transportation (DOTs) and environmental protection programs as well as central state governments.

Transportation agencies are arguably often on the leading edge of results-oriented management and performance measurement practices at all levels of government. This is illustrated by the fact that transportation agencies have often been asked to pilot goal-setting and performance measurement processes in the federal government and various state and local jurisdictions and that DOT personnel have often been called on to help other agencies undertake these initiatives.

Local public transit agencies have been monitoring comprehensive sets of performance measures with regard to operational efficiency, ridership, and revenue versus expense for more than two decades in an effort to manage strategically in a competitive industry (Fielding 1987). State DOTs have been experimenting with, refining, expanding, and enhancing their performance measurement systems over that period. While the DOTs have always been "data rich" agencies in some respects, early measurement systems were oriented internally and focused principally on production and cost-efficiency. However, the field continued to evolve, and a Transportation Research Board synthesis report published in 1997 found that the "new" performance measures tracked by DOTs were significantly more outcome oriented, were tied to strategic goals and objectives, and focused more on service quality and customer service (Poister 1997). Other articles published around the same time illustrated such developments in a number of states, including New York (Albertin et al. 1995), Wisconsin (Etmanczyk 1995), Washington (Ziegler 1996), Delaware (Abbott et al. 1998), Virginia (Sorrell and Lewis 1998), and Texas (Doyle 1998).

Most of the managerially oriented work cited above focuses more on measuring organizational performance than transportation systems performance.
However, other transportation professionals have been working to incorporate performance measurement more centrally in transportation planning processes (Halvorson et al. 2000; Newman and Markow 2004). A few years ago a guidebook on performance-based planning was published to help agencies improve the development, implementation, and management of their transportation plans and programs. The guidebook added performance measurement to existing planning processes to allow evaluation of alternative programs, projects, and services against overall transportation plan goals and objectives (Cambridge Systematics 2000).

Growing out of a CEO workshop on managing change, a recent report addresses the need for transportation agencies to tie performance measures to strategic planning processes (TransTech Management 2003). The report makes a distinction between externally and internally driven performance measures and summarizes the kinds of measures used by DOTs in such areas as mobility and congestion, safety, community quality of life, environment, economic development, system preservation and maintenance, project delivery, and human resources.

In fall 2000 a national conference focused on the use of performance measures to improve the performance of both transportation systems and transportation agencies (Transportation Research Board 2001). The general sense of the conference was that performance measurement was becoming a permanent way of doing business in transportation agencies and that, although several issues remained, a number of lessons had been learned concerning the development of measurement systems, data collection, effective utilization of performance data, and maintenance of measurement programs over the long run.
The purpose of this paper, then, is to track recent trends in the development and use of performance measures in transportation, assess the current state of the practice, and point out further issues that must be addressed to use measurement systems most advantageously. Focusing primarily on state DOTs, it addresses the questions of what is measured, how performance is measured, how performance data are reported, and how performance measures are used. The paper concludes with a summary of recent trends in the field and outlines continuing challenges that need to be addressed.

WHAT IS BEING MEASURED?

Transportation agencies have become more holistic in the coverage of their measurement systems. They focus on the full range of performance as illustrated in the program logic model shown in Figure 1. The major focus areas are agency performance, system performance, and broader impacts. While agency performance concerns service delivery, projects completed, improvements made, and so forth, system performance focuses on the capacity and condition of transportation systems and their performance in terms of travel time, cost, convenience, and safety. Increasingly, transportation agencies are also concerned with the broader impacts of transportation initiatives with regard to community quality of life, economic development, and the environment.

The kinds of performance measures monitored by transportation agencies span the entire model. Some focus on resources, primarily human and financial, while others measure outputs and agencies' operating efficiency in producing them. Effectiveness measures are tied to outcome-oriented objectives for improving transportation system performance and generating positive impacts. Quality measures relate both to service outputs and to outcomes, and customer satisfaction measures similarly reflect satisfaction with outputs but even more so with transportation outcomes.
Finally,

cost-effectiveness measures and benefit–cost measures relate transportation outcomes and broader impacts to resources consumed and other costs.

Balanced Scorecard Models

Many transportation agencies have used the balanced scorecard model (Kaplan and Norton 1996) to ensure both an internal and an external perspective and a process as well as results orientation in defining goals and performance measures. The Charlotte, North Carolina, Department of Transportation pioneered the use of this approach in the field of transportation. It developed measures for each of the four original quadrants of performance: the customer, financial, internal business practices, and learning and growth perspectives. Other DOTs have customized the model. The Illinois Department of Transportation, for example, has identified customer satisfaction and partnerships, business practices, delivery of programs and services, and learning and growth as the focal points of its strategic objectives and performance measures. The Georgia Department of Transportation organizes its strategic goals and performance measures in the six domains related in its strategy map (see Figure 2).

Program Delivery

Many states now are especially concerned with getting more effective control over program delivery and implementing their annual State Transportation Improvement Programs (STIPs). This concerns the entire process, from planning, preliminary engineering, design, and letting through actual construction. This is an important core business of all state DOTs and consumes a substantial proportion of their financial resources and professional workforce on an ongoing basis. In this age of accountability, governors, legislatures, and transportation commissions have mounted substantial pressure on DOTs to deliver on the projects to which funds have been committed.
Thus, the top priority of many state DOTs around the country now focuses on delivering the projects that have been promised to their customers. For example, in the wake of serious financial mismanagement issues in a previous administration, the Virginia Department of Transportation "jump-started" its renewed strategic planning process in 2002 with the central objective of getting its program delivery process back on track. Thus, measures associated with moving capital projects through the pipeline to completion predominate in all of the department's internally and externally focused performance reporting systems. In addition, at least a few states, such as Washington, Oregon, and New Mexico, have received large increases in funding through additional revenue sources or expanded revenues from traditional sources and will now be responsible for delivering significantly larger programs. In Ohio, Governor Taft's Jobs and Progress Plan calls for significant increases in both federal and state funds to put more Ohioans back to work on expanded highway construction programs over the next 10 years. In Georgia, Governor Perdue's Fast Forward Program provides for some $15 billion over the next 6 years, mostly for roads and bridges but also some for transit and rail. This infusion of new money from bond issues will double the size of the overall program, which has been running at about $1.2 billion per year.

[Figure 1. Transportation program logic model. Programs (construction, maintenance, safety, operations, public transportation) are linked to outputs (e.g., projects completed, lane miles, bridges built, miles resurfaced, incidents cleared, signals timed, vehicle miles), immediate outcomes (e.g., capacity, connectivity, condition, smoother pavements), intermediate outcomes (e.g., congestion, travel times, crashes, injuries, fatalities, ridership), and longer-term outcomes (mobility, quality of life, economic development, environmental enhancement), grouped under agency performance, system performance, and impacts.]

Given the pressure to make good on their commitments to the public and in some cases the added challenge of moving significantly more projects through the process, most DOTs are increasingly concerned with tracking measures of program delivery. The focus is usually on bringing in projects on time and within budget. In examining these measures more closely, some DOTs have been surprised to find that their performance in bringing projects to contract letting in the year programmed for letting is significantly lower than previously assumed. This has led to substantial efforts to streamline their processes, but it has also led to recognition that in some cases STIPs have been dramatically overprogrammed and that programs need to be "right-sized" to afford more reasonable expectations of what they can accomplish.

DOTs recognize the need to maximize the number of projects completed given the resources available for a given year; at the same time, they are concerned that projects can be stalled or slowed down by numerous factors beyond their control. Therefore, some DOTs see a need for providing a reservoir of projects in the annual STIP so that if some projects encounter difficulties along the way, others will have approval and be ready to go. At least a couple of states, for instance, program a 25 percent overrun in the number or dollar value of projects to be brought to letting in a given year. This translates into a target of letting 80 percent of all projects programmed for the year, which these DOTs see as ambitious but feasible.
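The overprogramming arithmetic above can be made explicit: programming a 25 percent overrun means the program contains 1.25 times what is expected to reach letting, so the letting target is 1/1.25, or 80 percent. A minimal sketch (the function name is illustrative, not any DOT's actual terminology):

```python
# Hypothetical sketch of the overprogramming arithmetic described above:
# a program overrun of 25% implies a letting target of 1 / 1.25 = 80%
# of the projects programmed for the year.

def letting_target(overrun_fraction: float) -> float:
    """Share of programmed projects expected to reach letting,
    given the fraction by which the annual STIP is overprogrammed."""
    return 1.0 / (1.0 + overrun_fraction)

print(letting_target(0.25))  # 0.8
```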
To improve program delivery through a highly structured process improvement effort, the Pennsylvania Department of Transportation has 10 working groups focusing on issues such as planning and preliminary engineering, funding and programming, environmental clearance, right-of-way acquisition, utilities, permitting, design and development, consulting agreements, contract management and construction, and bridge design and construction. The performance measures the department is developing to track the efficiency of its program delivery focus on cycle times for completing overall projects as well as the individual elements of the process. However, at this point it is encountering difficulties in operationalizing some of these indicators because it needs better reporting mechanisms to provide the data input.

[FIGURE 2 Georgia Department of Transportation strategy map, fiscal year 2005: the map links the department’s vision (“Keep Georgia Moving with Quality Transportation”), mission, and core values (committed, accountable, responsible, ethical) to performance measures grouped under six goal perspectives: internal processes, customer, financial, workforce, leadership, and stakeholders.]

The Pennsylvania Department of Transportation task force is also trying to develop good measures of the cost
of and level of effort required in completing various elements of the overall program delivery process. It is considering development of an information system for design and project delivery analogous to its highway maintenance management system. Such a system would track person-hours of effort and other direct costs expended on each stage of each project in order to monitor productivity and cycle times. In addition, the department would like to build measures of quality into such a system and is beginning to think about what kinds of measures to use in this regard.

Incident Management

Over the past several years, operating the highway system has been increasingly recognized as a central component of a DOT’s highway program responsibilities. In addition to utilities and work zone management, this includes coordination of signals and other traffic controls, pavement markings, use of high-occupancy vehicle lanes, signage, and the use of intelligent transportation systems technology to provide motorists with current traffic information via on-site variable message signs, websites, and the media.

A major thrust in this emphasis on system operations concerns incident management—in particular, coordination of effective responses to traffic disruptions on highways due to crashes, debris in the road or spills of hazardous materials, vehicle repairs, repair work zones and construction lane closures, and so forth. Effective management of such incidents can have major impacts on both subsequent traffic congestion and secondary crashes, and DOTs have developed proactive programs to respond to them, particularly in large urbanized areas. Thus, DOTs are now monitoring the occurrence of such incidents and tracking their performance in coping with them. For example, the Maryland State Highway Administration tracks such indicators as incident duration, initial response time, and overall recovery time in terms of service quality and initial outcomes.
In annual evaluations of its Coordinated Highway Accident Response Team, the administration also computes measures of the reduction in vehicle operating hours, total traffic delay time, fuel consumption, total emissions, and secondary accidents avoided because of its incident management efforts, as well as an overall benefit–cost ratio (Chang et al. 2003).

Customer Satisfaction

In addition to focusing on service delivery and operations, transportation agencies increasingly have been monitoring quality and effectiveness from the customer perspective (Stein and Sloane 2003). Public transit agencies have a long history of using customer surveys, not only to obtain information on trip origins and destinations but also to solicit feedback on customers’ perceptions of the reliability, safety, convenience, and overall quality of the services they provide.

State DOTs that conduct regular surveys of the public at large, motorists, or other stakeholder groups include those in Minnesota, New Mexico, Illinois, Kentucky, Pennsylvania, Ohio, and Georgia. The Pennsylvania Department of Transportation, for example, has conducted periodic surveys to monitor residents’ overall ratings of a variety of services ranging from highway construction and maintenance to roadside beautification, snow and ice removal, welcome centers, transportation planning, financial support for public transportation, vehicle inspection programs, and vehicle registration and titling services.

Closer to the operating level, the department conducts an annual Highway Administration Customer Survey mailed out to 1,000 randomly selected licensed drivers in each of the state’s 67 counties. The survey tracks changes in customer satisfaction with a number of performance attributes related to ride quality, traffic flow, and safety separately for Interstate highways, numbered traffic routes, and secondary roads on the state system.
It provides statistically reliable measures at the statewide, district, and county maintenance unit levels.

The Florida Department of Transportation uses a mix of telephone surveys, mail surveys, and response cards to monitor feedback from six customer segments including resident travelers, visitor travelers, commercial travelers, special needs travelers, property owners, and elected officials. The data indicate the percentages of these groups who are satisfied versus dissatisfied with Florida highways, transit services, and other modes as well as with the department’s communications and interaction with external stakeholders. They are reported as individual performance measures but are also aggregated into an overall index of customer satisfaction.

HOW IS PERFORMANCE MEASURED?

Transportation agencies are increasingly careful about how they specify particular measures of performance, and this can be critically important in driving decisions and actions. In the area of highway safety, for example, trying to reduce fatalities as measured by the number of traffic fatalities per 1,000,000 vehicle miles focuses attention on safety improvement projects, improved operations, and more effective enforcement activities—
a strategy of making the roads safer. Tracking the number of traffic fatalities per 100,000 resident population might prompt the same kinds of policies but also emphasize the use of alternative modes and telecommuting and in the long run the changing of land use patterns to reduce highway usage—basically a strategy of getting people off the roads. The National Highway Traffic Safety Administration and some state DOTs track data on both these measures.

In an example relating to highway system preservation, focusing attention on increasing the percentage of lane miles in good condition would tend to prompt a worst-first strategy for targeting resurfacing projects, perhaps even giving priority to lower-volume roads where repair work is less disruptive and easier to perform precisely because traffic management problems are not at all severe. In contrast, monitoring the percentage of vehicle miles traveled (VMT) carried on lane miles in good condition would encourage targeting higher-volume roads in substandard condition for resurfacing and would produce more beneficial consequences.

Highway System Safety and Condition

All state DOTs track measures of highway safety and highway and bridge condition on a regular basis. The standard safety measures concern the number of crashes, injuries, and fatalities per million vehicle miles traveled on an aggregate basis, but some states also compare numbers of crashes occurring at high-accident locations after safety improvement projects have been completed with the numbers before the projects were undertaken. Other outcome-oriented safety measures include the number of crashes at at-grade railroad crossings, the number of pedestrian and bicycle injuries or fatalities on state highways, and the number of crashes at highway repair work zones.
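The two normalizations contrasted above can be computed directly from crash and exposure data. The sketch below uses invented figures purely for illustration; the function names are not from any agency's system.

```python
# Two ways of normalizing traffic fatalities, as contrasted above:
# by travel exposure (VMT) and by resident population.
# All input figures below are hypothetical.

def fatalities_per_million_vmt(fatalities: int, vmt: float) -> float:
    """Fatalities per 1,000,000 vehicle miles traveled (exposure-based)."""
    return fatalities / (vmt / 1_000_000)

def fatalities_per_100k_population(fatalities: int, population: int) -> float:
    """Fatalities per 100,000 resident population (population-based)."""
    return fatalities / (population / 100_000)

fatalities = 1_200
annual_vmt = 95_000_000_000   # statewide annual VMT (hypothetical)
population = 8_500_000        # state population (hypothetical)

print(f"per million VMT: {fatalities_per_million_vmt(fatalities, annual_vmt):.4f}")
print(f"per 100,000 residents: {fatalities_per_100k_population(fatalities, population):.1f}")
```

The exposure-based rate rewards making each mile of travel safer, while the population-based rate also improves when total driving falls, which is exactly the policy distinction drawn in the text.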
State DOTs monitor the condition of their highway systems in terms of ride quality, measured chiefly by the international roughness index (IRI), and pavement condition, as monitored in Georgia for instance with the Pavement Condition Evaluation System. The Ohio Department of Transportation measures system condition every year with pavement condition ratings based on visual inspections of 100 percent of its pavements. Similarly, the Pennsylvania Department of Transportation conducts annual windshield surveys of its roads to track pavement deficiencies, shoulder conditions, drainage problems, guiderails, signs, and other appurtenances. The department also emphasizes ride quality data. It runs 100 percent of its Interstate highways and 50 percent of its other National Highway System highways for IRI data every year. States also monitor the number of deficient or weight-limited bridges.

Traffic Flow and Congestion

General

Tracking good measures relating to traffic flow and congestion on state highways has been more challenging. Traditional volume–capacity ratios have been the mainstay in this area, but they may be problematic in terms of knowing what the traffic-handling capacity of particular segments of highway really is. The Ohio Department of Transportation runs computer models applied to highway capacity measures to compute volume–capacity ratios on the entire network over 24 hours. From the results the department develops annual estimates of the percentage of lane miles congested, the percentage of VMT in areas exceeding congestion limits, and the percentage of peak-hour VMT exceeding congestion limits.

The Annual Urban Mobility Report produced by the Texas Transportation Institute tracks traffic congestion in the 75 largest urban areas in the United States (Schrank and Lomax 2003). This method uses Highway Performance Monitoring System data collected by the Federal Highway Administration, with supporting data from state and local agencies.
The resulting measures are computer-modeled estimates based on roadway characteristics and traffic volume counts because high-quality actual speed data are not available for many cities. The measures include

• Travel time index—ratio of peak-period travel time to free-flow travel time,
• Delay per person—hours of extra travel time divided by number of residents,
• Cost of congestion—value of extra time and fuel consumed because of congestion,
• Percentage of VMT congested—traffic occurring on congested roads during peak periods,
• Percentage of congested lane miles—lane miles congested during peak periods, and
• Percentage of congested time—percentage of time travelers expected to encounter congestion.

Methodologies exist or are being developed to estimate the impact of a range of solutions on reducing congestion, including additional highway construction, demand reduction, freeway entrance ramp metering, freeway incident management, traffic signal coordination, use of high-occupancy lanes, and public transportation improvements.

Urban Congestion Measures

Mitretek reports travel time trends on a monthly basis for 10 metropolitan areas where public- or private-sector organizations provide suitable point-to-point travel time
data (Wunderlich et al. 2004). This approach involves the automated acquisition of roadway travel times posted to traveler information websites, such as www.georgia-navigator.com in Atlanta or www.smartraveler.com in Miami, Florida, for 5-minute intervals for times of the day and days of the week when information is monitored. The measures reported include

• Travel time index—ratio of congested travel duration to free-flow duration for all congested trips (those over 130 percent of free-flow travel time);
• Buffer index—average time a traveler would have to reserve during the day for a trip to be on time 95 percent of the time;
• Average duration of congested travel per day—hours that a network is designated as congested, when 20 percent or more of all trips are congested (over 130 percent of free-flow travel time); and
• Percentage of congested travel—hours of congested travel as a percentage of time.

The urban congestion measures reported by Mitretek sometimes differ substantially from the urban mobility measures reported by the Texas Transportation Institute because they are derived from travel time data rather than traffic volume counts.

In the Atlanta area, one of the cities included in the urban congestion report, the Georgia Department of Transportation has instrumented some major corridors. It uses its Automated Traffic Monitoring System to measure speeds and then converts the data to trip times for particular point-to-point segments. In other areas, the department uses pilot vehicles to drive specified segments and measure trip times. It sticks to the same season and same operating conditions to assess changes over time. The department has the potential to get trip time data through traffic signal systems by using detector loops but has not implemented this yet.
Washington State

The Washington State Department of Transportation uses archived loop detector data to track travel times between specific pairs of origins and destinations on 12 of the most heavily traveled corridors in the Puget Sound region. These real-time data are used to report current travel times in each of these corridors for each 5-minute interval throughout the day and are posted on a website (www.wsdot.wa.gov/pugetsoundtraffic/traveltimes) along with the average travel times for the same trips. The department believes that it is important to distinguish between recurrent and nonrecurrent congestion that might be due to incidents, construction lane closures, debris in the road, inclement weather, unusual driving conditions, or abnormally high traffic volumes. At this point the department simply labels as nonrecurrent trips that take twice as long as in normal free-flow conditions, but it is hoping to develop actual incident data sets that can be correlated with the archived travel time data to identify incident-related delays.

The department recognizes the limitations of loop technology for monitoring travel times and is beginning to experiment with other emerging technologies that may provide more accurate data and be more cost-effective than loop detectors. In one county the department has begun to use roadside speed cameras to estimate travel speeds, and the data are processed and reported the same way as the loop detector data. In addition, the department is considering the possibility of using automatic vehicle locators for this purpose in the future. Many public transit systems use this geographic information systems satellite-based technology at present, which is feasible because they are operating relatively few vehicles.

Travel time reliability is another important indicator of the quality of transportation.
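The simple screening rule described above, labeling as nonrecurrent any trip that takes twice as long as under free-flow conditions, can be sketched as follows. The corridor constant and observation data are invented for illustration.

```python
# Sketch of the screening rule described above: a 5-minute travel time
# observation is flagged as likely nonrecurrent congestion when the trip
# takes at least twice as long as under free-flow conditions.
# The free-flow time and observations below are invented.

FREE_FLOW_MINUTES = 15.0  # free-flow travel time for one corridor

def is_nonrecurrent(observed_minutes: float,
                    free_flow_minutes: float = FREE_FLOW_MINUTES) -> bool:
    """Flag a travel time observation as likely nonrecurrent congestion."""
    return observed_minutes >= 2.0 * free_flow_minutes

# One day of 5-minute travel time observations for the corridor (minutes).
observations = [16.0, 18.5, 31.0, 44.0, 22.0, 17.5, 30.0]
flagged = [t for t in observations if is_nonrecurrent(t)]
print(f"{len(flagged)} of {len(observations)} intervals flagged as nonrecurrent")
```

As the text notes, this is only a proxy; correlating the flagged intervals with actual incident records would distinguish incident-related delay from, say, abnormally high volumes.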
The Washington State Department of Transportation also uses its travel time data to compute estimated 95 percent reliable travel times, within which trips in particular corridors can be completed 95 percent of the time. Interestingly, the Minnesota Department of Transportation has measured perceived travel time reliability for numerous pairs of origins and destinations around the state from motorist survey data.

Florida Mobility Measures

The Florida Department of Transportation has been developing a set of mobility measures for some time now, and it is still a work in progress. The department has identified four dimensions of mobility—quantity of travel, quality of travel, accessibility, and utilization—and has defined multiple indicators of each for both highways and public transit. The department is working to overcome some data problems in operationalizing these measures, for instance by using automobile occupancy data to track person miles traveled as well as VMT. The department reports data on some of the mobility measures in the Short-Range Component of its statewide transportation plan and expects to be able to set targets for these measures in the plan and its short-range objectives. It is also hoping to expand the mobility measures to incorporate rail, aviation, and water transportation in addition to highways and transit.

Use of Alternative Modes

Most DOTs track the volumes of passenger trips on their urban and rural public transit systems, and at
least one, the Georgia Department of Transportation, measures the annual growth in transit ridership as compared with the growth in VMT on highways. Occasional surveys can provide measures of the percentage of work trips, or total trips, made by transit, bicycles, or walking, but this requires heavy sampling fractions. A desirable measure of comparable service quality might focus on average transit travel times in an urban area in ratio to average driving times for the same trips, but that is difficult to operationalize. Other indicators of the efficient utilization of urban area transportation systems focus on numbers of vehicles or individuals using transit park-and-ride lots and the percentage of vehicular traffic on highways that does not consist of single-occupant vehicles.

Environmental and Economic Impact

The kinds of performance measures typically used on an aggregate basis to track the environmental impacts of transportation projects focus on outputs in environmental compliance such as the number of wetlands affected and preserved or the number of sites mitigated. The Maryland Department of Transportation, for example, tracks the number of acres of wetlands created and reforestation planted as a percentage of acres required and the number of storm water management enhancements completed compared with the number targeted. The Washington State Department of Transportation uses the following measures of environmental compliance:

• Number of noncompliance events concerning fish habitats, wetlands, water quality, or other issues;
• Total acreage of replacement wetlands through creation, enhancement, buffers, or restoration; and
• Number of replacement wetland projects meeting all standards, some standards, or no standards.
With respect to the economic development impact of transportation improvements, one of the most important performance measures may be the number of jobs that are created or retained in a state through initiatives in which transportation commitments or projects are a contributing factor.

Generally speaking, ongoing research, program and project evaluations, and case studies continue to illuminate our understanding of the real impacts of transportation facilities and services. Quantitatively and qualitatively, however, few practical measures exist at present for environmental quality and economic development that can be incorporated into performance-monitoring systems on a regular basis (Meyer 2001).

Benefit–Cost Ratios

While benefit–cost ratios have long been used by transportation agencies to evaluate the worth of proposed projects, agencies are now starting to use them to monitor the overall economic efficiency of their programs on an annual basis. For example, in the state of Victoria, Australia, VicRoads tracks the aggregate benefit–cost ratios for all projects completed in a given year. In addition, VicRoads monitors the “achievement index” of all projects, which compares the benefit–cost ratios computed 2 years after project completion with the benefit–cost ratios projected before project implementation.

Use of Indices

As the number of measures incorporated in some monitoring systems proliferates to a point where minutiae tend to overwhelm a central overall focus, some agencies have developed indices that combine multiple measures in order to summarize performance with fewer numbers. The Florida Department of Transportation, for example, focuses on 11 key performance measures to monitor progress in accomplishing its strategic objectives and executive board initiatives.
While a few of these measures, such as construction project time changes and cost changes, are captured by single indicators, others are indices in which a number of indicators are combined to provide an overall assessment of performance.

As discussed earlier, for instance, the department’s key indicator of customer satisfaction is an index combining measures of feedback from six customer segments with a variety of transportation modes and departmental services. Another of the department’s key performance measures focuses on system condition, which is an index of maintenance ratings and bridge condition and replacement measures. In turn, the maintenance rating program includes 84 separate indicators of various aspects of highway condition. All of these items are used to compute an overall weighted index of system condition, which is one of the few key measures monitored by top management. Where the data indicate slippage, executives can drill down to find which components of system condition are the source of the problem. At the program management and operating levels, the individual measures are more meaningful because they are more directly actionable in terms of resource allocation, treatments, and managerial initiatives.

The Ohio Department of Transportation uses its Organizational Performance Index (OPI), created in 1997, to evaluate performance in various areas and combine dozens of performance measures into a single index of overall departmental performance. The department has decentralized decision making and operational responsibilities, and the OPI focuses on the performance of the department’s 12 districts, and by extension the 88 county-level maintenance units, in the following eight functional or “topical” areas:

• Construction management,
• Contract administration,
• Equipment and facilities,
• Finance,
• Information technology,
• Plan delivery,
• Quality and human resources, and
• System condition.

While the measures for finance and contract administration consist of single indicators, in the other six areas performance is monitored on multiple indicators, which are tracked individually but also combined into indices. In each case the index is evaluated on the basis of the sum of points scored as a percentage of the total points available. The eight indices are further combined into a single index of overall departmental performance in the same percentage format.

All the individual measures and the eight indices are monitored in monthly reports for each district. As illustrated in Figure 3, they are also combined into total index values for each district and the department as a whole. While the system has the capability of setting differential weights for the measures and subindices, at present all or most of them are weighted evenly. The measures in each of the eight topic reports are “owned” by a deputy director or other senior manager in the department’s four central office functional divisions, who reviews the monthly OPI reports and works with district engineers to take corrective actions in areas where performance might be slipping.
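The points-based roll-up described for the OPI can be sketched as follows. This is a minimal illustration of the percentage-of-points-available scheme, assuming even weights; the point values and the subset of topic areas shown are hypothetical, not Ohio's actual data.

```python
# Sketch of the OPI-style roll-up described above: each topic area is
# scored as points earned over points available, and the topic indices
# are combined (here with even weights) into a single district index.
# Point values are hypothetical.

def topic_index(points_scored: float, points_available: float) -> float:
    """Topic-area index: points scored as a percentage of points available."""
    return 100.0 * points_scored / points_available

# (points scored, points available) for a few topic areas in one district.
district_topics = {
    "Construction Management": (23, 24),
    "Finance": (12, 12),
    "Plan Delivery": (17, 20),
    "System Condition": (15, 16),
}

indices = {name: topic_index(s, a) for name, (s, a) in district_topics.items()}
overall = sum(indices.values()) / len(indices)  # evenly weighted combination

for name, value in indices.items():
    print(f"{name}: {value:.1f}%")
print(f"district total index: {overall:.1f}%")
```

Because every level is expressed in the same percentage format, a manager can read the total index, then drill down to the topic indices and individual measures when a district's number slips.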
[FIGURE 3 OPI executive summary, Ohio Department of Transportation: monthly index values for each of the department’s 12 districts and for the department as a whole, by topic area (construction management, contract administration, equipment and facilities, finance, information technology, plan delivery, quality and human resources, and system conditions), together with total index values and a statewide historical trend.]

Setting Standards and Targets

Increasingly, transportation agencies are identifying performance measures and then setting numerical standards or targets to be attained on those measures by
specified years rather than simply calling for improvement over time. For example, a state DOT might set targets for reducing the travel time index during congested hours in its major urban areas from the current value of 1.8 down to an average of 1.75 in 2005, 1.70 in 2006, and 1.65 in 2007. With respect to ride quality, for instance, a DOT might define a standard for “good” ride quality on non-Interstate numbered traffic routes as IRI values of 120 or lower and set targets for achieving that standard on 80 percent of those roads in 2005, 85 percent in 2006, and 90 percent in 2007.

Measuring performance against preset targets, however, raises the question of how to establish “stretch” objectives that challenge the organization to make meaningful improvements while still being realistic. Considerations that usually go into decisions regarding targets include past and current performance, service delivery characteristics, available resources and technologies, customer preferences, public sentiment, media attention, and political feasibility. Obviously, these factors do not always point in the same direction, and transportation managers are forced to make difficult decisions about priorities.

HOW ARE PERFORMANCE DATA REPORTED?

Since usefulness to managers, decision makers, and policy makers is the bottom line for assessing the worth of any performance measurement system, transportation agencies have learned that reporting performance data in terms of informative comparisons is of critical importance. Typically, the most relevant analysis tracks change in performance over time and compares actual performance against targeted performance. Other useful reporting formats compare performance across organizational units (e.g., offices, districts, maintenance units), across user groups, or against counterpart agencies or programs.
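Comparing actual against targeted performance, as described above, reduces to a simple status check per measure and year. The sketch below uses the multi-year travel time index targets given earlier with a common green/yellow/red convention; the actual values and the yellow tolerance are illustrative assumptions, not any agency's figures.

```python
# Sketch of tracking one measure against multi-year targets, using the
# travel time index example above (targets 1.75, 1.70, 1.65 for
# 2005-2007). Actual values and the yellow tolerance are illustrative.

targets = {2005: 1.75, 2006: 1.70, 2007: 1.65}
actuals = {2005: 1.74, 2006: 1.73, 2007: 1.71}
TOLERANCE = 0.03  # within this of the target counts as "yellow"

def status(actual: float, target: float) -> str:
    """Green/yellow/red rating; a lower travel time index is better."""
    if actual <= target:
        return "green"
    if actual <= target + TOLERANCE:
        return "yellow"
    return "red"

for year in sorted(targets):
    print(year, status(actuals[year], targets[year]))
```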
Scorecards and Dashboards

“Scorecard” in many agencies refers to a format for presenting performance data, often tied to strategic plans, in which key indicators are presented as measures of success in achieving goals and objectives, often in terms of targeted versus actual values over recent years (quarters or months) as well as targets projected into future time periods. “Dashboard” typically refers to a display of performance data in a format designed to convey critical information at a glance. It often uses green, red, and yellow light designations to provide a quick look at the status of programs or initiatives.

The Pennsylvania Department of Transportation was one of the first transportation agencies to use both scorecards and dashboards. The department’s scorecard consists of a set of strategic goals and objectives, performance measures, and targets against which current progress is monitored by a strategic management committee on a quarterly basis. As department executives became concerned, however, that the scorecard measures keyed to change-oriented strategic initiatives did not help to track the status of certain other core business functions, they developed a dashboard for management-by-exception monitoring of the status of their ongoing core functions on a monthly basis. At this point, most of the department’s districts and central office bureaus use their own scorecards and dashboards.

Many other DOTs use dashboards to track selected sets of key performance measures at the executive level. Figure 4, for example, shows the Minnesota Department of Transportation’s dashboard, which provides a monthly snapshot of the performance of infrastructure investment and planning programs as well as maintenance and operations. Similarly, the Missouri Department of Transportation uses a dashboard format to monitor progress in implementing its departmentwide business plan through a high-level set of performance measures.
The dashboard, which is geared toward outcomes envisioned in the department’s business plan, is produced every 6 months and is targeted to the Transportation Commission, legislators, and other key external stakeholders.

In addition, every headquarters business unit at the Missouri Department of Transportation maintains a scorecard of measures tied to its work plan. These scorecards track the implementation of strategies in the department’s business plan as well as service delivery and work processes in key core functions. The scorecards are reviewed by top management on a quarterly basis and are used more as a management tool to ensure the accountability of these units in advancing the department’s strategic plan.

Aggregation and Disaggregation Options

Many transportation agencies now have performance measurement systems that afford the ability to support denser and more finely grained performance data at the same time. As the software systems supporting the data have become more powerful, flexible, integrative, and interactive, performance data collected and monitored at operating levels are routinely rolled up to higher levels in the organization. At the same time, managers can monitor summary data for the department as a whole or for major divisions and then drill down to lower levels in the organization to examine the variation among operating units or projects and sometimes identify sources of problems.

FIGURE 4 Minnesota Department of Transportation dashboard: performance versus targets, June 2004. The dashboard tracks some 35 measures grouped under Infrastructure Investment and Planning (pavement ride quality, remaining service life, and preventive maintenance investment; bridge condition; travel speed and congestion; transit advantages; letting schedules and timelines; STIP cost estimates; construction milestones, limits, cost deviation, and timelines; EIS duration; right-of-way processing time; plan quality) and Operations and Maintenance (public satisfaction with maintenance; snow and ice removal; incident clearance time; striping; signing; fatalities and crash rates; high crash cost locations; fleet life cycle, utilization, and preventive maintenance). For each measure the dashboard shows status in the prior and current years, the data trend, and district-level performance against targets.

Summary of measures:

Status    Year Before    Current
Green     10             5
Yellow    13             20
Red       7              6

• 9 measures dropped to yellow or red (4 in pavement)
• 5 measures improved to yellow or green
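A summary like the one at the bottom of Figure 4 is essentially a count of measures by status color plus a comparison between reporting periods; a minimal sketch, with invented measure names and statuses:

```python
# Count dashboard measures by status color and flag those that slipped
# or improved between periods. Names and statuses are hypothetical.
from collections import Counter

ORDER = ["green", "yellow", "red"]  # best to worst

prior = {"ride_quality": "green", "bridge_condition": "yellow", "letting": "red"}
current = {"ride_quality": "yellow", "bridge_condition": "yellow", "letting": "yellow"}

counts = {"before": Counter(prior.values()), "current": Counter(current.values())}
slipped = [m for m in prior if ORDER.index(current[m]) > ORDER.index(prior[m])]
improved = [m for m in prior if ORDER.index(current[m]) < ORDER.index(prior[m])]
```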

HOW ARE PERFORMANCE MEASURES USED?

State DOTs use performance measures for a number of purposes, and systems are being designed to support particular uses. The uses include reporting performance to governor’s offices, legislatures, oversight bodies, and funding agencies; communicating with the public at large; and planning, budgeting, and performance management.

Communicating with the Public and Other External Stakeholders

The Virginia Department of Transportation publishes a quarterly report card for public consumption on its website and makes hard copies available to groups of external stakeholders. Consistent with the department’s top priority, the report card focuses solely on project delivery: the number of construction contracts actually completed versus scheduled for completion, the number of maintenance contracts completed versus scheduled, the percentage of construction projects completed within budget, and the percentage of maintenance contracts completed within budget. The report card also compares projects completed in the current period against previous years in terms of aggregated cost overruns and time extensions. An innovation is the department’s project dashboard, also published on the web. The project dashboard indicates the status of all projects and allows the user to select projects by district, local government jurisdiction, road system, route, or contract ID.

The Washington State Department of Transportation produces a quarterly report, Measures, Markers and Mileposts, for the Washington State Transportation Commission and makes it available to the public on its website at www.wsdot.wa.gov. Often referred to as the Gray Notebook, this report uses “performance journalism” to provide brief narrative explanations and illustrations along with a mix of tables, charts, and graphs conveying a wide range of performance data. The Gray Notebook contains a major section on project delivery.
In addition, it serves to monitor performance in such areas as worker safety; employment levels and training; highway safety; asset management; highway maintenance; environmental programs; incident response; and the use of vanpools, park-and-ride lots, Washington State ferries, and state-supported Amtrak service. This report simultaneously provides accountability and makes a departmental case statement to a variety of external stakeholders.

Transportation Planning and Programming

Performance measures increasingly are being used to establish the criteria for transportation systems plans as well as for subsequent decisions about preserving existing assets and programming projects to advance those plans. The Ohio Department of Transportation recently completed work on its statewide multimodal transportation plan, Access Ohio. This project-specific plan is keyed to a number of performance-based objectives for the next 10 years, including the following:

• Reduce the frequency of crashes from current levels by 10 percent.
• Reduce the crash fatality rate from the current 1.31 fatalities per 100 million VMT so that it does not exceed 1 fatality per 100 million VMT.
• Maintain an average level of service of D on the urban state freeway system and an average level of service of B on the rural freeway system.
• Reduce the growth in vehicle hours of delay on the state’s multilane divided system from the current 12 percent per year to 8 percent per year.
• Sustain Ohio’s pavements so that at least 93 percent of all state-maintained lane miles meet pavement condition rating standards.

Planners at the department ran separate sufficiency ratings for safety, condition, and congestion over the entire network to identify locations that fell below the standards established in each area for each functional class of highway. The projects contained in the plan were selected to remedy these deficiencies and meet other objectives established by Access Ohio.
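The fatality-rate objective above is simple arithmetic: fatalities divided by VMT expressed in units of 100 million miles. The counts below are invented for illustration and are not Ohio’s actual data:

```python
# Crash fatality rate per 100 million vehicle miles traveled (VMT),
# checked against an Access Ohio-style target. Figures are hypothetical.

def fatality_rate(fatalities, vmt):
    """Fatalities per 100 million VMT."""
    return fatalities / (vmt / 1e8)

rate = fatality_rate(1310, 100e9)   # 1,310 deaths over 100 billion VMT -> 1.31
meets_target = rate <= 1.0          # target: at most 1.0 per 100 million VMT
# At constant VMT, hitting the target allows at most this many fatalities:
max_fatalities = 1.0 * (100e9 / 1e8)
```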
The department will monitor progress in moving toward these performance targets and will alter programming or adjust funding between programs and districts as necessary to achieve these objectives.

The Minnesota Department of Transportation updated its statewide transportation systems plan in August 2003 in an effort that completed its conversion to a performance-based planning process. This statewide plan is guided by the department’s three strategic objectives and 10 strategic policies, which then led to identifying outcomes, performance measures, and 20-year targets. One or more performance measurement sets have been identified for each of the 10 strategic policies, with specific performance measures defined within each category, separately for each modal group as appropriate. While Minnesota’s statewide transportation plan specifies transportation outcomes, measures, and targets, it does not quantify needs on a statewide basis. However, the eight districts are now developing district plans that will represent the first attempt to quantify needs in order to fulfill the performance-based 20-year plan.

Strategic Planning and Management

Many state DOTs have well-established strategic planning processes at this point, and performance measurement systems are a critical element as these agencies focus more attention on implementing their strategic agendas and accomplishing strategic goals and objectives (Poister 2004). The Virginia Department of Transportation’s strategic planning process illustrates the central role of performance measures in this regard. Strategic goals are defined to address critical issues, and then performance measures are identified for tracking the intended results. Strategies are developed for achieving the goals, and the measures are monitored to evaluate success in implementing them.

In past years, the New Mexico Department of Transportation’s Compass has been the prototypical example of a performance measurement system used proactively as a management tool. Initiated under a previous administration to help the organization stay focused on its “true north” values, the Compass incorporated 16 customer-focused key results monitored through a total of 80 performance measures. Because it grew out of a quality improvement tradition, departmental executives looked for continuous improvement on these measures rather than setting annual targets for them. Through detailed quarterly reviews involving 50 to 70 managers, the Compass became the driving force behind all departmental management and decision making. For several years the Compass served as a de facto strategic agenda for the department, even though the department had never conducted a formal strategic planning effort. Recently, however, the new administration at the department has developed a strategic plan on the basis of an assessment that, although it was useful as a performance management tool, the Compass lacked a big-picture strategic orientation, included too many measures, and was limited by its total reliance on available data.
Thus, the New Mexico Department of Transportation’s strategic plan for 2004–2005 includes strategic objectives and approximately 40 high-level performance measures with ambitious targets, some of which will have to be operationalized with new data collection procedures. The top management team plans to review these strategic performance measures on a monthly basis. While some of the Compass measures have been incorporated in the new strategic plan, the Compass itself is being aligned with the strategic agenda and redirected to the operating level, where it will continue to be used as a principal performance management tool.

The Pennsylvania Department of Transportation has been involved with strategic planning for more than 20 years, with performance measures as a central part of the process (Mallory 2002). Under the previous administration, the process was redesigned into a comprehensive strategic management process that was fiscally realistic and more oriented toward implementation. Performance measures and targets were established for each strategic objective, and a department-level scorecard was developed for tracking progress on these initiatives on a monthly basis. Districts and central office divisions developed their own strategic goals and scorecards to support the enterprise-level plan, and these guided the development of annually updated business plans, which also emphasized performance measures and targets for all programmed activities.

The current administration has retained the strategic management focus but is moving to streamline the process. The department’s current strategic plan contains five strategic focus areas and eight strategic goals, with performance measures and targets identified for each, and it drives the business-planning process in the districts and central office divisions.
Scorecards are used at both levels, and the Pennsylvania Department of Transportation is currently deciding how frequently these performance data will be reported and reviewed.

Performance Management

In many DOTs performance measures play a central role in managing the work of managers and employees and focusing their attention on strategic objectives and other organizational goals. Most DOTs assign individuals as owners, sponsors, champions, or results drivers of specific strategic objectives or other initiatives and hold them responsible for achieving the expected results. In some states, such as Pennsylvania, South Carolina, and California, these expectations are incorporated in individual-level performance contracts and performance plans, while other states, such as Illinois and Kentucky, use less formal approaches. In either case, however, periodic reviews of the performance measures associated with these initiatives provide powerful incentives for individuals to keep their assigned initiatives on track.

The Ohio Department of Transportation’s business plan identifies objectives to be achieved by executives and members of the department’s Career Professional Service—the top 200 to 300 managers in the department and professional employees not in the collective bargaining unit—over a 2-year period. For senior managers in the department’s central office support functions (e.g., planning, finance, and information technology), the individual-level objectives and their accompanying action plans are developed to ensure that they provide effective support to the districts, where services are delivered. Many of these performance expectations for both district and central office managers are tied to measures included in the OPI, discussed earlier, supplemented by other individualized objectives.
All members of the Career Professional Service are evaluated in annual performance reviews, which are based in part on predetermined performance measures, including in some cases OPI measures, as well as 360-degree appraisals. The results, in the case of strong performance, influence decisions with regard to promotions and pay raises (although pay rates are frozen at present). Poor performance triggers plans for corrective actions and in some cases has led to demotions into positions with fewer responsibilities and lower salaries.

Performance Budgeting

As transportation agencies become more committed to results-based allocation of funds to programs and organizational units, performance data are used to project differential levels of outputs and outcomes associated with alternative funding levels. The performance measures provide the linkage between plans and budgets, sometimes between strategic plans and budgets, or between strategic plans and business plans and then between business plans and budgets.

The Minnesota Department of Transportation’s budget is organized along product and service lines on the basis of an activity-based budgeting structure. The department has established four product and service lines: multimodal systems, state roads, local roads, and general support and services. Within each product and service line, the department’s budget is formatted to a hierarchy consisting of budget activities, products and services, core activities, and specific work activities. Parenthetically, with respect to the lowest level in this budget structure, the department has established sets of approved activity codes, and each individual employee’s time sheet records the time spent on each activity, which allows actual costs to be assigned to programmatic activities.
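The activity-code mechanism described above amounts to a roll-up of timesheet hours into labor cost per activity; a minimal sketch, with invented codes, rates, and hours:

```python
# Assign labor cost to programmatic activities by summing timesheet hours
# per approved activity code. All codes, rates, and hours are hypothetical.
from collections import defaultdict

# (employee, activity_code, hours)
timesheets = [
    ("emp1", "ACT-PAVE", 6.0),
    ("emp1", "ACT-BRDG", 2.0),
    ("emp2", "ACT-PAVE", 8.0),
]
hourly_rate = {"emp1": 30.0, "emp2": 25.0}

cost_by_activity = defaultdict(float)
for emp, code, hours in timesheets:
    cost_by_activity[code] += hours * hourly_rate[emp]
```

With actual costs accumulated per activity code, the same totals can then be rolled up the budget hierarchy to core activities and products and services.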
The department’s biennial budget allocates resources to these products and services, and each product and service is tracked in the budget with performance measures that are linked to the department’s strategic plan, the 20-year statewide transportation plan, supporting district and metropolitan area plans, modal plans, and the business plans prepared by the districts and functional divisions. The business plans are developed to advance the department’s strategic agenda and 20-year transportation plan. Because funds are being allocated to products and services, core activities, and specific activities, the Minnesota Department of Transportation is building the capability to track the dollar investment in each of its strategic policies and evaluate the results by aggregating the corresponding sets of performance measures.

The New Mexico Department of Transportation began transitioning to a performance budgeting process in 2001, as required by New Mexico’s Government Accountability Act. This entails budgeting funds to programs rather than organizational divisions and relating budgets to outputs and outcomes with performance measures. Thus, the department has developed a program structure that overlays the organizational structure. The major program areas consist of construction, maintenance, program support, aviation, traffic safety, and public transportation, and each of these is divided into various programs.

For example, the overall maintenance program comprises three separate but related programs: preservation, scheduled maintenance, and routine maintenance. Responsibility for these programs crosses organizational lines. For instance, the Engineering Design Division, the Transportation Planning Division, the Highway Operations Division, and the Road Betterments Division all share responsibility for the construction program, and the overall budget for the construction program therefore is allocated among these units, as shown in Figure 5.
The performance measures related to this budget include project profilograph numbers for construction projects, the percentage of final cost increase over bid amounts, the number of calendar days between the date of physical completion of a project and the date of final payment notification, and the number of combined systemwide miles in deficient condition.

Comparative Performance Measures

There are many examples of comparative performance measurement across transportation agencies in the United States, including the U.S. Department of Transportation’s Conditions and Needs Report, the National Bridge Inventory, and the National Transit Database, in addition to the urban mobility and urban congestion reports discussed earlier. In addition, many state DOTs in the United States benchmark performance data against DOTs in neighboring states or those with similar-size systems or programs to see where they stand in the field and where they might get ideas for improving their own programs and operations.

One university-based research report annually compares the 50 states on a wide range of performance measures relating to their highway programs in terms of condition, congestion, safety, and expenditures (Hartgen 2004). The ongoing controversy concerning the rankings produced by this report illustrates the complex issues involved in trying to secure uniform measures from different agencies as well as in specifying and standardizing measures that afford fair and useful comparisons.

A project being conducted for the Transportation Research Board, NCHRP Project 20-24(37), is exploring possibilities for systematic comparative performance measurement within peer groupings of state DOTs, with voluntary participation, which might be configured differently for different focus areas. The use of “adjusted” performance measures, the percentage of miles of good

pavement statistically adjusted for average number of freeze–thaw cycles or maintenance expenditures per lane mile, for example, might also help make comparative performance data more palatable and more meaningful to transportation agencies.

Integration of Measurement Processes

As transportation agencies utilize a number of measurement systems of varying scope and purpose, it is important to articulate the relationships among them and ensure that collectively they meet the organization’s need for performance information. The current performance measurement framework at the Minnesota Department of Transportation is referred to as the performance measures pyramid. As illustrated in Figure 6, it starts at the top with transportation system–level measures that are based on the department’s statewide system plan and strategic policies. Below that are measures associated with district plans and modal plans, and then the pyramid flows down to business plan measures and then to operating measures linked to work plans for individual organizational units.

The policy-based, system-level measures at the top of the pyramid reflect outcome targets over a 20-year period, identified in the state’s long-range transportation plan, which in turn is consistent with the department’s strategic directions and 10 strategic policies. The business plan measures, on the other hand, are tied to both output and outcome targets over a 2-year period, while the operations-oriented or project-related measures at the bottom of the pyramid are tied to output targets to be achieved within 1 year.

FIGURE 5 New Mexico Department of Transportation budget example: the FY05 construction program budget, allocating funds among the Engineering Design, Transportation Planning, Highway Operations, Road Betterments, and Administrative (IT Bureau) divisions by object class (salaries and benefits, contractual services, other costs, and debt service), with FY04 and FY05 operating budgets and the change between them; the total construction program grows from 412,965.0 in FY04 to 474,213.0 in FY05. The figure also lists the program’s purpose statement, its four performance measures (systemwide miles in deficient condition, calendar days from physical completion to final payment notification, project profilograph quality, and percent final cost increase over bid) with FY04 and FY05 targets, and budget notes (a 1.8% vacancy factor, a 2% salary increase effective the first full pay period after July 1, 2004, reconciliation to January 2004 revenue projections, and IT budget administration through the CIO).
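One plausible way to compute the “adjusted” comparative measures described under Comparative Performance Measures is to fit a simple cross-state trend of percent good pavement against freeze–thaw exposure and compare each state’s residual. This is a sketch of the idea, not the NCHRP project’s actual method; the states, cycle counts, and pavement figures are all invented:

```python
# Adjust percent-good-pavement for climate: fit an ordinary least squares
# line of the measure against freeze-thaw cycles, then score each state by
# its residual (performance above or below what climate alone predicts).

states = {  # state: (avg freeze-thaw cycles, pct good pavement) -- invented
    "A": (10, 90.0), "B": (50, 70.0), "C": (30, 85.0), "D": (70, 60.0),
}

def fit_line(points):
    """Ordinary least squares for y = a + b*x."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    b = (sum((x - mx) * (y - my) for x, y in points)
         / sum((x - mx) ** 2 for x, _ in points))
    return my - b * mx, b

a, b = fit_line(list(states.values()))
adjusted = {s: y - (a + b * x) for s, (x, y) in states.items()}
# A positive residual means better pavement than the state's freeze-thaw
# exposure alone would predict.
```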

At the systems level, the Minnesota Department of Transportation has been developing measure sets (groups of performance measures that collectively track performance related to a particular policy or measurement category) for each of the 10 policies in its strategic plan. Wherever appropriate, these measure sets include subsets or specific measures linked to a given policy for each of four modal groups: highways and bridges, passenger service/bicycle/pedestrian, motor carrier/railroad/waterways, and aeronautics. The top management team at the department uses its monthly dashboard to monitor the status of some 30 selected performance measures that are tied to the strategic plan. Where performance is slipping, managers can drill down into district reports, business plan measures, or operating measures to locate the source of problems and request explanations and remedial actions.

The Florida Department of Transportation ties its organizational and program planning to transportation systems planning through a strategic planning process that includes four components: the Florida Transportation Plan, the Short-Range Component, annual strategic objectives, and executive board initiatives. The Florida Transportation Plan, updated every 5 years and about to be updated to a 2025 plan, is a 20-year project-specific transportation plan for the state developed in conjunction with a wide range of transportation agencies and stakeholders. The Short-Range Component is basically a 5-year work program that the department will undertake to advance the Florida Transportation Plan. It is updated annually, and the Florida Department of Transportation guarantees complete delivery of the first year of the Short-Range Component to the governor and the legislature. While performance measures tied to the Florida Transportation Plan are reviewed annually, the department monitors delivery of the Short-Range Component on a monthly basis.
The department has identified nine strategic objectives, also updated annually, that are largely focused on strengthening its workforce, business processes, leadership, and customer orientation to ensure the organizational capacity needed to deliver the Short-Range Component effectively. In addition, a number of executive board initiatives concern various other issues or processes that are also important elements of overall organizational effectiveness. The department monitors dashboards for the strategic objectives and executive board initiatives on a quarterly basis.

The Florida Department of Transportation is developing a business planning process in three tiers to provide a focal point for implementing the elements embodied in the various components of its strategic plan. Tier 1 is the department’s statewide business plan, which is monitored on a quarterly basis and updated annually. Tier 2 will consist of business plans developed for 24 core processes and major functional areas that sometimes cross organizational lines, while Tier 3 will consist of business plans developed by each of the department’s districts and functional units. Performance measures tied to the initiatives, work programs, and other activities included in these business plans will allow departmental managers to monitor progress in implementing the overall strategies for achieving the goals identified in the long-range transportation plan, the Short-Range Component, strategic objectives, and executive board initiatives.

CONCLUSIONS

This review of performance measurement in transportation indicates that the state of the practice continues to advance. Obviously, there is wide variation among agencies with respect to the evolution of performance measures, the kinds of measurement systems they have, and how and to what extent they use performance data in planning, management, and decision making.
A few agencies have mature systems at this point, characterized by (a) a range of sophisticated measurement systems in place; (b) alignment of measures with performance-oriented goals, objectives, standards, and targets; (c) useful performance-reporting processes tailored for various audiences and management needs; and (d) systematic procedures for reviewing performance data and using the information to strengthen planning and decision making.

FIGURE 6 Minnesota Department of Transportation performance measures pyramid: policy-based, system-level, essential mode and program measures at the top (statewide transportation plan), flowing down through additional district- and metro-specific system measures and modal/submodal-specific measures (district/metro plans and modal plans), additional business plan measures (business plans, 2-year horizon), and operating measures at the base (work plans, less than 1 year), with planning horizons ranging from 20+ years at the top to less than 1 year at the bottom.
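The pyramid’s levels, planning documents, horizons, and target types described in the Integration of Measurement Processes discussion can be captured as a small lookup structure. This is an illustrative encoding only; the target type for the district/modal level is inferred, since the text specifies targets explicitly only for the system, business plan, and operating levels:

```python
# Encode the measurement pyramid as data: each level pairs a planning
# document with a horizon and the kind of targets tracked there.
pyramid = [  # (measure level, planning document, horizon, target types)
    ("system/policy", "statewide transportation plan", "20 years", {"outcome"}),
    ("district/modal", "district/metro and modal plans", "20+ years", {"outcome"}),  # inferred
    ("business plan", "business plans", "2 years", {"output", "outcome"}),
    ("operating", "work plans", "under 1 year", {"output"}),
]

def levels_with(target_type):
    """Levels whose measures track the given target type."""
    return [level for level, _, _, targets in pyramid if target_type in targets]
```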

Summary of Recent Trends

Some of the more notable developments over the past few years include the following:

• More states committed to using performance measures. Some are just starting out, but leading-edge agencies are implementing second- and third-generation systems that are more sophisticated.
• A continuing trend toward more strategic performance measures, more outcome measures, and more customer-oriented measures.
• Emphasis on measures to track performance in program and project delivery, but also advances in implementing better measures of transportation system performance.
• Increased use of customer satisfaction measures.
• More holistic approaches in terms of coverage that relate different performance measurement systems and track data at different levels (roll up, drill down).
• More disciplined efforts to align measures with goals and objectives and to use them as tools for integrating systems.
• More sophisticated software applications, system support, and data displays.
• Proliferation of performance measures, but also recognition of the need to focus more selectively on the “vital few” strategic objectives.
• More intentional use of measurement systems to support other management, planning, and decision-making processes.
• Increased reporting of performance data directly to the public, especially in online report cards, to promote transparency in government.

Continuing Challenges

Transportation agencies are investing increased resources in performance measurement and finding innovative ways of measuring the performance of transportation systems and programs. The real objective has to be the development and implementation of measurement systems that are cost-effective tools whose contribution to improved planning and decision making is worth the effort. Some of the issues that still need to be addressed include the following:

• Agreeing on a common terminology (e.g., dashboards, benchmarking).
• Improving measures concerning travel times, congestion, and delay, especially on noninstrumented roads.
• Developing measures that allow cross-modal comparisons with regard to service levels, quality, travel times, and costs.
• Developing improved measures for freight transportation.
• Obtaining systematic feedback from external stakeholders beyond motorists and the public at large (e.g., other user groups, local governments, legislators, the media).
• Interpreting the implications of customer satisfaction measures in relationship to engineering and professional planning performance criteria.
• Setting appropriate targets that are aggressive yet realistic (e.g., Federal Highway Administration ride quality standards).
• Using objectives and performance measures relating to system performance to articulate the relationship between strategic plans and transportation system plans more clearly.
• Implementing workable comparative performance measurement systems that provide information that is useful for benchmarking and process improvement.
• Strengthening linkages between measures and employee performance management systems to ensure that individual managers and employees are motivated and held accountable for accomplishing agency goals and objectives and hitting targets.
• Institutionalizing strategic planning and performance measurement more effectively in agencies so that they can provide useful support rather than be derailed by changes in elected officials, politics, funding, or top administration priorities.

REFERENCES

Abbott, E. E., J. Cantalupo, and L. B. Dixon. 1998. Performance Measures: Linking Outputs and Outcomes to Achieve Goals. In Transportation Research Record 1617, Transportation Research Board, National Research Council, Washington, D.C., pp. 90–95.
Albertin, R., J. Romeo, L. Weiskopf, J. Prochera, and J. Rowen. 1995.
Facilitating Transportation Agency Management Through Performance Measurement: The NYSDOT Experience with the "Management Performance Indicators" Report. In Transportation Research Record 1498, Transportation Research Board, National Research Council, Washington, D.C., pp. 44–50.

Aristigueta, M. P. 1999. Managing for Results in State Government. Quorum/Greenwood, Westport, Conn.

Cambridge Systematics. 2000. NCHRP Report 446: A Guidebook for Performance-Based Transportation Planning. Transportation Research Board, National Research Council, Washington, D.C.

Chang, G.-L., A. Petrov, P.-W. Lin, N. Zou, and J. Y. Point-Du-Jour. 2003. Performance Evaluation of CHART: The Real-Time Incident Management System. Office of Traffic and Safety, Maryland State Highway Administration, Baltimore.

Doyle, D. 1998. Performance Measure Initiative at the Texas Department of Transportation. In Transportation Research Record 1649, Transportation Research Board, National Research Council, Washington, D.C., pp. 124–128.

Etmanczyk, J. S. 1995. Wisconsin DOT Measures Quality from Top to Bottom. Journal of Management in Engineering, Vol. 11, No. 4, July–Aug., pp. 19–23.

Fielding, G. J. 1987. Measuring and Monitoring Transit Performance. In Managing Public Transit Strategically, Jossey-Bass, San Francisco, Calif., Chapter 4.

Hartgen, D. T. 2004. The Looming Highway Condition Crisis: Performance of State Highway Systems, 1984–2002. University of North Carolina, Charlotte.

Halvorson, R., T. Hatata, M. L. Tischer, and P. Kolakowski. 2000. Performance-Based Planning, Asset Management, and Management Systems. In Transportation Research E-Circular E-C015: Statewide Transportation Planning, Transportation Research Board, National Research Council, Washington, D.C., Chapter 5. gulliver.trb.org/publications/circulars/ec015/ch5.pdf.

Ingraham, P. W., P. G. Joyce, and A. K. Donahue. 2003. Government Performance: Why Management Matters. Johns Hopkins University Press, Baltimore, Md.

Kaplan, R. S., and D. P. Norton. 1996. The Balanced Scorecard: Translating Strategy into Action. Harvard Business School Press, Boston, Mass.

Mallory, B. L. 2002. Managing the Strategic Plan with Measures. Presented at 81st Annual Meeting of the Transportation Research Board, Washington, D.C.

Melkers, J. E., and K. G. Willoughby. 1998. The State of the States: Performance-Based Budgeting in 47 out of 50. Public Administration Review, Vol. 58, No. 1, pp. 66–73.

Meyer, M. 2001. Measuring That Which Cannot Be Measured—At Least According to Conventional Wisdom.
In Conference Proceedings 26: Performance Measures to Improve Transportation Systems and Agency Operations, Transportation Research Board, National Research Council, Washington, D.C., pp. 105–125.

Newcomer, K., E. T. Jennings, Jr., C. Broom, and A. Lomax. 2002. Meeting the Challenges of Performance-Oriented Government. Center for Accountability and Performance, American Society for Public Administration, Washington, D.C.

Newman, L. A., and M. J. Markow. 2004. Performance-Based Planning and Asset Management. Journal of Public Works Management and Policy, Vol. 8, No. 3, pp. 156–161.

Osborne, D., and T. Gaebler. 1992. Reinventing Government: How the Entrepreneurial Spirit Is Transforming the Public Sector. Addison-Wesley, Reading, Mass.

Poister, T. H. 1997. NCHRP Synthesis of Highway Practice 238: Performance Measurement in State Departments of Transportation. Transportation Research Board, National Research Council, Washington, D.C.

Poister, T. H. 2004. NCHRP Synthesis of Highway Practice 326: Strategic Planning and Decision Making in State Departments of Transportation. Transportation Research Board of the National Academies, Washington, D.C.

Schrank, D., and T. Lomax. 2003. 2003 Annual Urban Mobility Report. Texas Transportation Institute, College Station.

Sorrell, C. S., and J. F. Lewis. 1998. VDOT Is Moving in a New Direction: How Virginia's Department of Transportation Is Using Strategic Management to Revamp Its Entire Operation. In Transportation Research Record 1649, Transportation Research Board, National Research Council, Washington, D.C., pp. 115–123.

Stein, K. E., and R. K. Sloane. 2003. NCHRP Report 487: Using Customer Needs to Drive Transportation Decisions. Transportation Research Board of the National Academies, Washington, D.C.

Transportation Research Board. 2001. Conference Proceedings 26: Performance Measures to Improve Transportation Systems and Agency Operations. National Research Council, Washington, D.C.

TransTech Management, Inc.
2003. Strategic Performance Measures for State Departments of Transportation: A Handbook for CEOs and Executives. American Association of State Highway and Transportation Officials, Washington, D.C.

Wunderlich, K., S. Proper, and S.-J. Jung. 2004. Urban Congestion Reporting in Major U.S. Cities: Key Findings and Lessons Learned. Presented at 11th World Congress on ITS, Nagoya, Japan, Oct.

Ziegler, B. J. 1996. Transportation Planning and Performance Measurement in Washington State. In Transportation Research Circular 465: Conference on Transportation Programming Methods and Issues, Transportation Research Board, National Research Council, Washington, D.C., pp. 14–16.

RESOURCE PAPER

Organizing for Performance Management

Mark C. Larson, Minnesota Department of Transportation

The public-sector transportation community now has 15 to 20 years of experience in learning to plan and manage with performance measures. The paths of development vary widely from one organization to the next, yet they converge toward some common elements of effective practice.

Great progress on this journey has been logged. Today, transportation investments are being selected on the basis of performance deficiencies and forecast benefits. Project status is reported regularly to managers, legislators, and the public. Politicians debate the performance level of snow and ice removal. And a state transportation commission has posted experience in performance management as a critical qualification for a new secretary of transportation.

This paper focuses on how that progress has been achieved—the factors that have contributed to success. Through interviews with veterans of practice and evidence from the Minnesota Department of Transportation's experience, the paper examines key drivers of development and elements of successful practice. The paper shares what experienced organizations see as the next steps in development and looks at what tools can be added to the repertoire to make performance management more effective. Finally, the paper explores emerging challenges and issues.

Interviews were conducted with eight states and two metropolitan planning organizations (MPOs). Additional information was collected from the American Association of State Highway and Transportation Officials' Standing Committee on Planning meeting and peer exchange in Charleston, South Carolina, in May 2004; the October 2003 U.S. Department of Transportation roundtable in Washington, D.C.; the 2004 Transportation Research Board international scan; and other sources.

This paper aims to crystallize the experience of a number of organizations and provoke thought and discussion.
Other organizations may be blazing different paths to effective performance management. Transit organizations and regional organizations, for example, have experiences that differ from those of states.

DRIVERS OF DEVELOPMENT

Performance-based management was well established in the private sector before being brought to the public sector. Its growth reflects the need for public organizations in a democratic nation to be customer driven and open and transparent to citizens. It also reflects the requirement that public organizations adopt modern, data-driven, scientific tools of management or risk becoming noncompetitive and marginalized. Our transportation counterparts—freight haulers, railroads, airlines, and transit operators—are driven by industrywide measures of on-time performance, capacity utilization, and operations efficiency that are monitored regularly by stakeholders. Some of their measures are set by regulation or statute. We may be moving in that direction.

External Drivers

It is instructive to look at the origins and triggers of performance measurement in public transportation organizations across the country. Among the states interviewed, the most common drivers have been external—legislatures, governors, and transportation commissions.

• Oregon, Maryland, and New Mexico: The legislature required regular performance reporting by state agencies. In Maryland, the governor mandated that all agencies tie measures to budget submissions.
• South Carolina: The legislature mandated reporting. The state department of transportation (DOT) produces annual and quarterly reports.
• Arizona: The legislature mandated performance-based planning and programming and prescribed categories of measures.
• Montana: The legislature requested information on what could be accomplished with additional funding and what had been accomplished with the last increase. This sensitized DOT leaders to the need to communicate performance and built support for developing the Performance Programming Process (P3).
• Washington State: The Washington State Department of Transportation has used measures for years, but performance management was held back because the agency had trouble agreeing on "good enough" measures. In early 2001 the Washington State Transportation Commission put out a call for a new secretary with a background in performance management. Douglas B. MacDonald was hired from the outside and quickly broke the deadlock. He directed the organization to move ahead with applying measures in public communications and decisions. He stressed that performance measurement is an iterative process; measures would be refined, adjusted, or replaced as the agency gained skills and data. In 2002 the legislature directed the establishment of benchmarks in nine priority areas. They were added to the department's already extensive Gray Notebook quarterly performance reporting process.
• California: Despite some 10 years of experience with a comprehensive policy and performance framework, California also has been hampered by an inability to reach consensus on specific measures. In 2004, under Governor Arnold Schwarzenegger, the new Cabinet Secretary of Business, Transportation, and Housing asked the California Department of Transportation (Caltrans), regional organizations, and stakeholders to develop common measures for the state's transportation system by summer 2004. Reporting is to begin in the fall. The secretary has exerted ongoing leadership to support this goal.

Legislatures, governors, and transportation commissions have issued a wake-up call. In response to crises in public credibility or for other reasons, some legislatures have stepped into what they saw as a vacuum of accountability. They not only have prescribed performance accountability but also have participated in selecting goals, measures, targets, and even projects. This may be understandable, since the industry has not agreed on standard measures of performance to the extent many other industries have. Design standards and other traditional guidelines are no longer enough to satisfy stakeholders. If we do not set performance outcome standards, others eventually will—sometimes to a significant level of detail. Nevertheless, whether we are accustomed to it or not, there is no question that the role of governing bodies is to set broad policy priorities and monitor results.

Internal Drivers

A somewhat different, internal path to development emerged in several Midwest industrial states and elsewhere. Large-scale total quality management or Malcolm Baldrige programs initiated in the 1990s were vital building blocks of performance management in Pennsylvania, Ohio, and New Mexico. Quality programs helped build cultures of accountability.
Paralleling the manufacturing sector, industrial engineers have helped develop performance management tools in several states. Some of these initiatives were started by governors but then took strong root in transportation departments. The following are examples:

• Pennsylvania: From 1994 to 2002 Secretary Mallory embraced the Malcolm Baldrige business model and "management by fact" (i.e., implementation of performance measures).
• Ohio: The Ohio Department of Transportation began a corporate change process at a leadership conference in 1991 that led to development of a culture and system of accountability. By 1994 this new direction was embodied in the department's Vision 2000 Strategic Plan. Dramatic changes followed, including the redefinition of core business functions, decentralization of responsibility to 12 districts, and the beginning of a quality culture of continuous improvement. An initiative to incorporate performance measurement led to creation of process measures and the Organizational Performance Index (OPI). The department's quality movement was part of a statewide quality improvement program enacted by the governor, called Quality Services Through Partnership. It brought union leadership and management together in a quality management program that resulted in mass training of state employees.
• New Mexico: The New Mexico Department of Transportation has had a Quality Bureau since the 1990s. It created widespread awareness of measures and accountability in the agency and was instrumental in the growth of performance management practice. Recently the new secretary of transportation transformed it into a small Quality and Business Performance Division that reports directly to her and will have representatives in all six districts.
• Montana and Minnesota: In these two states, the use of measures took hold internally as a result of the state transportation planning process in the mid-1990s. Montana developed the P3 investment analysis approach from its pavement, congestion, and bridge management systems and began using it to optimize funding distributions to maximize system performance. In the Minnesota Department of Transportation, additional roots of measurement practice in the 1990s lay in a maintenance business planning team, which was influenced by quality management principles.
• California: The new cabinet secretary has started a process to choose internal organizational performance measures, in addition to system measures, to be completed by late summer 2004.

FUNDING FOR ACCOUNTABILITY: THE NEW QUID PRO QUO?

In Ohio the statewide quality program laid the groundwork, but the legislature became the catalyst for further development. There, as in several other states, the legislature agreed to provide more resources in return for greater accountability. In some parts of the nation, carte blanche increases in gasoline taxes or other funding are no longer on the menu. Instead, resources are offered only when the transportation commission, legislature, or governor has a strong role in buying into or setting performance priorities, or even in selecting projects. Here are several cases:

• Florida: The legislature is seen as the key driver behind what is today one of the most highly developed performance management systems among state DOTs. Since a gasoline tax increase in the early 1980s, the legislature, the governor, and the Florida Department of Transportation have coupled new funding initiatives to accountability measures.
A major stimulus was a "crash and burn" failure in which $800 million in contracts had to be canceled. Afterward, extensive project delivery measures were created. Targets for preservation of pavement and bridges were written into statute. The Florida Transportation Commission was set up in the late 1980s to oversee accountability. Today the department believes it has great credibility with the legislature and stakeholders.
• South Carolina: The state DOT provides regular reports to the legislature on the on-time, on-budget status of the largest construction project in state history, the Cooper River project.
• Ohio: In 2003 the Ohio Department of Transportation obtained a 6-cent gasoline tax increase amounting to $5 billion over 10 years while other agencies were being cut. This Jobs and Progress Plan funds system improvement and expansion projects. Now the department is focusing on delivering on its promises. It has established a detailed tracking system, and monthly meetings are held to monitor projects.
• Mississippi: The legislature was concerned that projects were not selected on the basis of "professional criteria" and worried about cost escalation. It passed a capacity improvement program—Vision 21—to be based on "technical criteria" and "needs-based priorities." Now the department has a regular business plan and reporting process.
• Arizona: Planning, programming, and project measures are reported every 6 months and are tied to a half-cent sales tax expansion program. The program expires in 2007, and renewal for another 20 years is seen as dependent on achieving a target of 90 percent on-time program delivery. Performance as of May 2004 was 95 percent.
• Washington State: In 2003 the legislature enacted a new 5-cent gasoline tax in response to enhanced trust built through regular performance reporting and in exchange for accountability for project delivery.
Projects were rated and ranked with performance-based criteria, then selected by the legislature. Now progress on more than 100 projects funded by the tax is tracked regularly via the "beige pages" within the quarterly Gray Notebook performance report and on an enhanced public website. Revenues will be from $600 million to $1.1 billion for each of the next five biennia.

From the standpoint of performance-based management, the quid pro quo can take either a positive or a negative turn. If the criteria for project selection are not significantly performance based but purely political, we will move backward. If the governing entities are brought into the fold of understanding, trusting, and supporting performance-based criteria, they may become forces for a bigger and better-performing pie. If, as in Florida and Washington, greater transparency brings greater public support for transportation, a win–win partnership can result.

CURRENT AGENDAS FOR DEVELOPMENT

Let's turn now to the current stage of development in performance management practice (see Box 1), as reported by the organizations recently interviewed. They were asked what their focus of development is at this time. The answers vary widely, but there are some common emphases:

• Project delivery measures reporting;
• Reducing the number of top-level measures and aligning measures with strategic and long-range transportation plans;
• Improving performance-based decision support tools;
• Developing modal and intermodal measures, typically as part of expanding long-range plans to be multimodal; and
• Moving beyond well-established measures for preservation and maintenance to develop better measures for quality-of-life-related goals, mobility, and safety.

Project Delivery Measures and Real-Time Tracking and Reporting Systems

Confidence in project delivery is often essential to restore or maintain accountability with legislatures, especially when new funding is sought. At the October 2003 U.S. Department of Transportation roundtable in Washington, D.C., it was agreed that credibility in obtaining greater long-term system investments is closely tied to credibility in delivering short-term outputs. The delivery of priority construction projects on time and on budget is paramount. Highly visible maintenance services, such as snow and ice removal, are also important.

The Ohio Department of Transportation has a strong performance management system and now has development of a project delivery reporting system as its top priority. After gaining a 6-cent hike in gasoline tax revenue in 2003, the department's challenge became to deliver on time and on budget. It has developed a tracking program to monitor progress milestones specifically for Jobs and Progress projects.

Examples of organizations with project delivery tracking systems in place are the San Francisco Bay Area Metropolitan Transportation Commission (MTC) and the DOTs of Virginia, California, Washington State, Florida, Minnesota, and Arizona.

Typically, project delivery reporting systems encompass the department's highest-priority projects, especially those funded under special legislative or gubernatorial packages.
Failure to make delivery of those projects transparent, on time, and on budget exposes organizations to great risk. Conversely, project reporting for high-priority initiatives is a compelling way to build a performance management culture. The Minnesota Department of Transportation's greatest advance in acceptance of real-time dashboard reporting came with monthly statewide monitoring sessions initiated in 2001 for 50 projects in the Moving Minnesota package funded by the legislature out of one-time general fund dollars. This practice continues with the new bond-accelerated projects program, which consists of more than $800 million in projects.

BOX 1 General Stages of Development in State DOTs

Stage 1. Measures creation and monitoring—past oriented. Passive, learning, undifferentiated. Either system or process measures developed, or both, but not connected. Typified by annual legislative reports. Focus may be internal (process and quality improvement) or external (system). Customer awareness develops. The pursuit of "perfect" measures may paralyze progress. Inadequate measures or poor data are often cited as reasons for not reporting.

Stage 2. Beginning to manage and plan with measures—real-time and future oriented. Active, aligned to organizational priorities. Still deterministic. Targets set, but difficulty prioritizing among them. Movement to quarterly, monthly, and real-time reporting. Beginning analysis of performance trends and factors, integration with strategic and long-range plans, and application of measures in programming and project selection.

Stage 3. Modeling performance and resource decisions. Future orientation aggressively developed. Multifactorial. Integration of cost and benefit information. Scenario oriented. Optimization driven rather than deterministic targets—the best blend of cost and results. Aim to minimize life-cycle costs.
More systematically align capital programs with best scenarios. Stage 3 may include more progress on complex measures that are beyond direct transportation organization control: safety, quality of life, access, economic development, and environmental and land use measures that require multiagency analysis and planning and more modeling.
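The on-time, on-budget delivery rates discussed above reduce to simple portfolio arithmetic. The following sketch is purely illustrative; the project names, dates, and dollar figures are invented and do not come from any agency's tracking system.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Project:
    name: str
    planned_finish: date
    actual_finish: date
    budget_m: float       # budget in $ millions
    actual_cost_m: float  # cost to date or final cost in $ millions

def delivery_rates(projects):
    """Return (percent on time, percent on budget) for a project portfolio."""
    n = len(projects)
    on_time = sum(p.actual_finish <= p.planned_finish for p in projects)
    on_budget = sum(p.actual_cost_m <= p.budget_m for p in projects)
    return 100.0 * on_time / n, 100.0 * on_budget / n

# Hypothetical two-project portfolio.
portfolio = [
    Project("I-94 resurfacing", date(2004, 10, 1), date(2004, 9, 15), 12.0, 11.4),
    Project("Bridge 5900 deck", date(2004, 8, 1), date(2004, 8, 20), 4.5, 4.4),
]

pct_time, pct_budget = delivery_rates(portfolio)
print(pct_time, pct_budget)  # prints: 50.0 100.0
```

A real tracking system would add milestone-level status and rollups by district or funding package, but the headline measures reported to legislatures are typically of this form.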

Aligning Large Measurement Systems with Strategic or Long-Range Plans and Goals

Numerous states developed measures in response to legislative mandates or quality initiatives but ended up with dozens of unrelated measures without clear priorities. They laid a cultural foundation for measurement practice but did not give clear direction. In an era of tight resources, it becomes obvious that achieving system performance targets across the board is not affordable. If priorities are not set, the process of performance management can be undermined.

• New Mexico is revamping its long-standing Compass report to support five strategic priorities in the new governor's strategic plan. This will mean fewer department measures than in the past. In some cases measures are being modified and new ones sought.
• Pennsylvania, one of the older practitioner states, is moving from more than 70 department measures to a smaller select set tracked at the executive level. Other measures will be delegated.
• Washington State is working to align its new transportation plan and measures with nine key issue areas. This will lead to a constrained investment proposal to the 2007 legislature.
• Minnesota completed refocusing its measures on 10 policies in a comprehensive 20-year statewide transportation plan in 2003. The Minnesota Department of Transportation is now applying the policy framework and measures down the organization, into district long-range plans, a new freight plan, and a first-time Highway System Operations Plan. Also in 2003, a regular monthly reporting schedule was initiated for commissioner's staff for measures tied to the nine highest-priority objectives in the transportation plan and strategic plan. The department is also reevaluating its funding formula for districts and MPOs and adding performance factors.
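Applying one policy framework "down the organization" implies that district-level measures must roll up consistently to the statewide figure. A minimal sketch of such a roll-up, weighting each district's condition measure by a size variable so that drill-down detail reconciles with the statewide number; the district names, lane-mile counts, and percent-good values are hypothetical:

```python
# Hypothetical district records: (name, lane-miles, % of pavement rated good).
districts = [
    ("District 1", 4200, 71.0),
    ("District 2", 3100, 66.5),
    ("Metro",      5600, 59.8),
]

def roll_up(districts):
    """Lane-mile-weighted statewide % good, so district drill-downs reconcile."""
    total_miles = sum(miles for _, miles, _ in districts)
    return sum(miles * pct for _, miles, pct in districts) / total_miles

print(round(roll_up(districts), 1))  # prints: 65.1
```

Weighting matters: a simple average of the three districts would overstate statewide condition here, because the largest district has the worst pavement.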
Improving Performance-Based Decision-Support Models

Montana is working to develop its P3 business process (see Figure 1) further. P3 develops a performance-based budget by system level, district, and type of work for about 70 percent of the capital program. It predicts future performance on the basis of funding alternatives within the three primary management systems the state uses—those for pavement, bridges, and congestion. The process requires iteration. Efforts are under way to simplify the process and to incorporate explicit performance consideration of economic development in the analysis and funding distribution. Work is proceeding on a software tool for selecting projects on the basis of linkage to performance status and outcomes. The Montana Department of Transportation is attempting to improve its bridge condition forecasting capabilities and to add performance information for selection of safety projects. It wants to do more evaluation of program results. In addition, Montana is training planners in the Highway Economic Analysis Tool for use in performance-based systems development and project development.

Minnesota has hired a consultant to help adapt PONTIS to model and forecast bridge condition. This capability is essential for developing investment scenarios. One objective is to adjust bridge condition targets to reflect lowest life-cycle costs. Also, benefit–cost factors are being developed for five priority bridge preventive maintenance strategies. Performance measures and targets for bridge preventive maintenance will be proposed in fall 2004 as part of the department's new Highway System Operations Plan.

FIGURE 1 Montana's P3 and mechanics of the P3 funding plan. P3 is a method for developing an optimal funding allocation and investment plan based on strategic highway system performance goals and the continual measurement of progress toward those goals. Steps: (1) understand future funding; (2) predict performance as a function of funding over time; (3) iterate against all goal areas until acceptable and fundable performance is predicted by the management systems; (4) establish a funding distribution plan linked to investments based on performance goals; and (5) gain approval and support of decision makers.
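The P3 idea of predicting performance as a function of funding and iterating toward an acceptable distribution can be caricatured in a few lines. This toy loop is not Montana's actual model: the program names, starting conditions, and assumed performance gain per million dollars are all invented, and a real management system would use nonlinear deterioration and improvement curves rather than constant gains.

```python
# Hypothetical programs: current % of system rated good, and an assumed
# (invented) gain in % good per $1 million invested.
programs = {
    "pavement":   {"pct_good": 70.0, "gain_per_m": 0.10},
    "bridges":    {"pct_good": 78.0, "gain_per_m": 0.05},
    "congestion": {"pct_good": 60.0, "gain_per_m": 0.08},
}

def allocate(programs, budget_m, step_m=10.0):
    """Iteratively place each funding increment where predicted % good is lowest."""
    alloc = {name: 0.0 for name in programs}
    pred = {name: p["pct_good"] for name, p in programs.items()}
    remaining = budget_m
    while remaining >= step_m:
        worst = min(pred, key=pred.get)          # program with lowest predicted condition
        alloc[worst] += step_m                   # fund it with one increment
        pred[worst] += programs[worst]["gain_per_m"] * step_m  # update the forecast
        remaining -= step_m
    return alloc, pred

alloc, pred = allocate(programs, budget_m=200.0)
print(alloc)
print(pred)
```

Under these assumptions the loop pours money into congestion until its predicted condition catches up with pavement, then alternates, which is the intuition behind iterating "against all goal areas" until a fundable performance level is predicted.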

California cites a more basic challenge—finding the resources and developing better data collection and monitoring systems across districts, in concert with regional organizations, to support uniform performance measurement.

Modal Measures Development

Several states with performance measurement systems are expanding them to encompass modal or intermodal objectives. The Mississippi Department of Transportation is developing level-of-service measures for its ports and modal operations. Wisconsin is expanding its statewide transportation plan, and New Mexico its strategic plan, to include modal measures. Florida faces a major challenge in partnering with outside agencies to develop measures related to the performance of the recently designated Strategic Intermodal System. Currently the various modal administrations in Florida do not use the same data or measures.

Development of freight measures is a priority for California. Minnesota and others are working on freight measures as well. System and operational measures have been in place for 3 years in Minnesota for freight and commercial vehicle operations, aeronautics (see Figure 2), and transit. They are rooted in the statewide transportation plan and business plan and are reported three times a year and in budget documents.

Quality-of-Life Measures Development: Beyond Preservation and Maintenance

Florida, like a number of other states, believes its preservation and maintenance measures are well established. It is now engaged in developing better measures for mobility and quality of life—safety, bicycle and pedestrian, and economic development measures. California has had difficulty agreeing on measures in some of these areas going back to the 1990s but now appears to be breaking through. Colorado has established preservation measures and targets and is beginning to look at development of mobility measures but is concerned about the cost of data development.
These are some of the focal efforts of organizations today. Later we will reflect on some of the challenges and issues for the future of performance management.

FIGURE 2 Minnesota Department of Transportation statewide plan preservation policy, aeronautics: percentage of regional trade center airport pavements meeting good and poor pavement condition index (PCI) thresholds, 1995–2020, with targets and trend-based projections. (Good: PCI ≥ 56; poor: PCI ≤ 40. Data are updated at individual airports every three years.)
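The good and poor shares plotted in Figure 2 are threshold counts over the airport pavement inventory. A minimal sketch using the plan's thresholds (good: PCI ≥ 56; poor: PCI ≤ 40); the PCI values below are made up for illustration:

```python
def pci_summary(pci_values, good_min=56, poor_max=40):
    """Percent of pavements rated good (PCI >= 56) and poor (PCI <= 40)."""
    n = len(pci_values)
    pct_good = 100.0 * sum(v >= good_min for v in pci_values) / n
    pct_poor = 100.0 * sum(v <= poor_max for v in pci_values) / n
    return pct_good, pct_poor

# Ten hypothetical airport pavement sections.
print(pci_summary([90, 75, 60, 58, 55, 50, 38, 30, 82, 61]))  # prints: (60.0, 20.0)
```

Note that the two shares need not sum to 100; sections with PCI between 41 and 55 fall into neither category.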

ELEMENTS OF SUCCESSFUL PRACTICE

The common objective among transportation organizations is to build performance management systems that provide compelling information for decision making. We have observed that paths to this goal vary widely. Nevertheless, we can identify some practices that are important to success.

The 2002 Transportation Asset Management Guide (1), prepared under NCHRP Project 20-24(11), provides some of the most forward-looking direction available, including tables of "benchmark state-of-the-art" best practices. The guide recommends that at the policy level, "The agency proactively influences policy formation with realistic estimates of agency resources that are needed to achieve specific goals. It works with its governing bodies to instill this realistic vision in resulting policy statements and objectives, as well as in measurable performance targets" (1, p. 2-6).

Another valuable source is NCHRP Synthesis of Highway Practice 326 (2). It offers a practical list of "success factors" for strategic management, many of which relate to performance measurement (2, pp. 2–3).

SELF-IDENTIFIED SUCCESS FACTORS

Organizations interviewed for this paper were asked to name the key success factors in performance management. The following are some highlights:

• Leadership and regular executive performance monitoring from within top management. An attitude from leadership to "do it." Improve measures as you go—don't get paralyzed waiting for the perfect measure.
• Accountability links from policy to programming to performance monitoring. Individual program areas should not make policy independently (Florida, Wisconsin).
• Turn data into information and apply it in decision making—plans, programs, and budgets. If districts are making budget and program decisions, provide them with performance information feedback on the results their proposals will achieve.
• Ownership by staff and program offices (bridge, maintenance, etc.).
Instill expectations that measures adopted will be reported regularly at top management meetings (Florida, Minnesota, Ohio).
• Build legislative buy-in. The Washington State Department of Transportation provided a prioritized list of projects based on performance evaluation criteria to the legislature and gave it a role in selection.
• Build a vision of the payoff. Create a wide understanding of where the tools of measurement can take us. Understand the funding implications of different performance goals (Montana).
• Have practical measures tied to compelling needs of the public or the organization.

In the case of Maryland, the Port of Baltimore was rapidly losing market share. Establishment of operational measures agreed on by the port, state, and union led to greatly improved operations and resulted in a revival of business. In Minnesota, monthly statewide video reviews of milestones for priority projects specially funded by the legislature accelerated understanding and acceptance of performance management.

BASICS OF SUCCESSFUL PRACTICE

Institutionalizing sound practices into mainstream processes should increase prospects of performance management's being sustained and reduce the possibility of its being marginalized as an add-on activity. Some elements deserving further discussion are leadership, ownership, staffing, customer information, regular reporting schedules, performance targets, and integration into planning and decision making.

Leadership

The importance of sustained leadership from within top management was cited by many states as a key success factor. One organization stressed the importance of top managers demonstrating to others that they are using performance data for decisions. The literature agrees. While leadership from appointed officials can be powerful, it is important to develop support from the career managers and supervisors who will endure changes in administration.
Ownership

The list of success factors includes the importance of building ownership of an organization's strategic objectives (2, p. 3):

• Identify "owners" of strategic objectives, initiatives, measures, and action plans who will be responsible for achieving results.
• Communicate strategic objectives to internal (and external) stakeholders at every opportunity.

Representatives of several states cited assigning ownership of goals and measures as an important success factor, as well as involving key staff in developing goals and

measures. The Pennsylvania Department of Transportation has assigned owners to its goals and measures for many years. The Ohio Department of Transportation's quality culture empowered many employees to participate, starting at the line level. The Minnesota Department of Transportation has strengthened its practice in the last year by charging the division director with the lead responsibility for results and the director with lead responsibility for reporting for each measurement area.

Representatives of several states conveyed the value of seeing performance reporting sessions as opportunities for problem solving to make the organization and its staff successful, not as sessions for judgment.

The Montana Department of Transportation undertook an arduous 3-year process involving multiple divisions to develop performance measures and its P3. It credits broad participation with spurring internal acceptance of its performance-based funding allocation process.

Staffing: Dedicated Versus Integrated

In earlier stages of development, during the 1990s when state revenues were healthier, at least a few organizations had substantial units to lead performance measurement. Intensive staff efforts are required in the development phase. Today, the maturing of performance management appears to have led to mainstreaming measurement responsibilities across organizations, with a surprisingly small central function. This sends the message to line managers and supervisors that measurement is part of their jobs. The central function typically stimulates and guides development of measures and targets by program areas and assembles department-level reports for management and stakeholders. As long as the central function has enough staff, resources, and clout to carry out its mission, keeping its role limited appears to support wider institutionalization of performance management practice.
Central staff dedicated primarily to performance measurement amount to one in Florida and Montana, one and one-half at the Bay Area MTC, three in Washington and Minnesota, and two in New Mexico, with six liaisons in districts to be added. Caltrans has two central staff and is examining how to increase the measurement presence in districts and throughout the organization. In addition to central staff, others are often assigned to measurement in such areas as maintenance or regional traffic management. They put substantial hours into developing measures and collecting, processing, analyzing, and reporting performance data.

Often the central measurement function is in the planning division. Examples are Florida, Montana, Wisconsin, Ohio, and Minnesota. Pennsylvania indicates that responsibility is distributed, though the lead staff person reports to the secretary's office.

Customer Information

The Transportation Asset Management Guide best practices recommend, "Policy formulation seeks input from customers and stakeholders, and reflects customer priorities and concerns in resulting policy statements and objectives, and performance targets" (1, p. 2-6). A further recommendation is that information on customer perceptions be "updated regularly through surveys, focus groups, complaint tracking, or other means" (1, p. 2-11).

A growing number of organizations are using customer information. The San Francisco Bay Area MTC hires third-party consultants to evaluate customer satisfaction with its 511 travel information program, which is operated by a contractor. The Minnesota Department of Transportation uses professional market research staff and consultants to measure customer priorities in tandem with customer satisfaction levels. Satisfaction levels with maintenance services have been tracked for 10 years.
The process has contributed to prioritizing resources for services rated most important, such as smooth pavement, and to reducing investment in services rated less important by the public, such as roadside mowing. Interestingly, recent declines in satisfaction with road smoothness and litter removal have correlated with reduced spending resulting from tight budgets.

Market research can be a powerful tool to assist in calibrating service level targets. Minnesota's market research staff has utilized specialized simulation studies with customers to establish target service levels for pavement smoothness, snow and ice removal, and other services.

Regular Reporting

Regular reporting of performance data, trends, and forecasts within the organization is essential to performance management for many reasons. It is one of the four legs of the Plan → Do → Check → Act performance management cycle.

The Transportation Asset Management Guide states that good performance-based management "implies that the right information is available to the right levels at the right time" (1, p. 2-4). Performance reporting should take place for a manageable number of objectives or measures on a regular cycle that is integrated as much as possible into existing planning and decision venues.

The Pennsylvania Department of Transportation cites the importance of a regular reporting system in keeping the performance measurement process alive and instilling a culture of evaluation and continuous improvement. Maryland is now exploring moving from annual to quarterly reporting to keep performance management on the agenda and stimulate program adjustments during the year.

Ohio encountered problems when it was unclear what was to be reported when. Now its OPI, covering some 70 transportation services, is reported monthly agencywide. The OPI has a common six-point scale and uniform reporting format. In addition, units in all divisions provide a focus report monthly to the division director on more detailed operational measures. This kind of intensity provides constant reinforcement of what is important. It facilitates real-time corrective action.

The Minnesota Department of Transportation first established a regular monthly rotating reporting cycle for its Operations Division in 2001 and has now extended it to commissioner's staff. Top staff review measures for highest-priority strategic objectives and delegate reporting for lower-priority objectives to lower levels. Managers determined the frequency of reporting for each area on the basis of the sensitivity of the measure to change and its urgency. For example, milestones for high-priority projects are reported monthly to districts and quarterly to commissioner's staff. Fleet management measures are reported quarterly to districts. Bridge and pavement condition measures are reported annually. Reports are done at regular management meetings to make performance discussions a normal part of business management. The impact has been gradual but strong, especially for managers now in the fourth year of this process. At the same time, use of measurement information is growing in budget meetings and other dedicated venues.
Setting Targets

The Transportation Asset Management Guide and several states have identified setting quantitative goals, or targets, as a key success factor. In Minnesota, the takeoff of performance management was in the 2000–2002 period, when business plan and transportation plan targets were first established. Setting targets sent the message that measurement is about managing to get results, not just monitoring. In states such as Washington and Pennsylvania, targets for maintenance service levels and other areas are essential to the budget process.

In Minnesota most state transportation plan measures have 20-year targets. They are being applied in district long-range plan development. The performance-based targets define performance gaps and fiscal gaps and form the basis for calculating long-term resource needs to close the gaps. Figures 3 and 4 show examples of 20-year targets.

Poister states that "numerical targets to be accomplished within specified time frames tied to strategic objectives and performance measures" are vital to guiding strategic initiatives (2, p. 3). The Minnesota Department of Transportation has defined performance targets as "the level of service (quantity, quality or cost) to be delivered to customers (external or internal) for a specific period."

Several elements are important in providing direction in setting performance targets:

• Baseline data trends;
• Projected or forecast performance trend based on programmed projects or other factors;
• Budget information and forecast;
• Analysis, such as benefit–cost, life-cycle optimization, and so forth;
• Customer expectations information—importance, satisfaction, and desired service level;
• Industry benchmarks and engineering standards; and
• Strategic vision and priorities of the commissioner, governor, and legislature.
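The interplay of baseline trends, projections, and targets described above can be sketched as a simple calculation. In the sketch below, the measure, the data values, and the linear-trend assumption are all hypothetical illustrations, not any agency's actual forecasting model:

```python
# Hypothetical sketch: project a performance trend and measure the gap
# against a long-range target. The measure, values, and linear-trend
# assumption are invented for illustration only.

def linear_trend(history: list[tuple[int, float]], year: int) -> float:
    """Least-squares linear projection of a measure to a future year."""
    n = len(history)
    xs = [y for y, _ in history]
    ys = [v for _, v in history]
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    slope = sum((x - x_mean) * (v - y_mean) for x, v in history) / \
            sum((x - x_mean) ** 2 for x in xs)
    return y_mean + slope * (year - x_mean)

# Hypothetical percent of pavement miles rated "good" in past years
history = [(2000, 84.0), (2001, 83.0), (2002, 82.5), (2003, 81.0)]

target_year, target = 2023, 85.0      # hypothetical 20-year policy target
projected = linear_trend(history, target_year)
performance_gap = target - projected  # points of "good" pavement to close

print(f"projected {projected:.1f}% good vs target {target:.1f}%")
print(f"future performance gap: {performance_gap:.1f} points")
```

A worsening trend, as in Figure 3, yields a widening gap; the gap, priced out, is what drives the long-term resource-need calculation.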
The draft International Scan on Performance Measures cites the importance of analysis, such as benefit–cost, in underpinning the setting of targets and in determining the effectiveness of actions to reach the targets (3).

The ideal condition for setting targets is to have historical and forecast performance in hand along with customer information and budget data on unit costs. Sometimes, however, organizations need to set strategic targets with limited data to drive organizational behavior

FIGURE 3 Setting 20-year targets for Minnesota's Transportation Plan—Case 3: performance is worsening.

and focus public attention. This is especially true when high-priority innovative strategic initiatives are undertaken. On the one hand, failure to set a target may slow progress. On the other hand, casual targets without any credibility can undermine the performance management process.

An important issue is who sets the target levels. Participation by management teams and functional areas is important in achieving the buy-in necessary to mobilize and achieve the target. A balance of expert analysis and management vision is desirable.

In Minnesota, the "expert office," such as Bridge, Materials, or Traffic, typically recommends a target level on the basis of baseline data, programmed projects, and planned initiatives. The division or department management team may adjust the target. Recently, aggressive 20-year plan targets for bridge structural condition were moderated when the assistant chief engineer, state bridge engineer, and planning office all agreed that there was no foreseeable scenario whereby resources would be available to meet the target.

Montana believes that setting targets becomes less academic once tools are available to predict performance over time as a function of budget. Hence, it has focused its efforts on reorienting management systems to do forecasting.

When transportation organizations themselves have not set credible goals and targets, the likelihood increases that governance bodies will. In Colorado, the Transportation Commission has set ambitious targets. With a state constitution that bars tax increases, the Colorado Department of Transportation is concerned about the availability of resources to achieve targets for bridges and mobility.

Managing with performance measures and targets is a fact of life for transit systems. All transit systems receiving federal support must report uniform measures annually to the National Transit Database.
Measures of ridership, capacity utilization, and subsidy levels are widely accepted and compared across systems. In Minnesota, the Metropolitan Council bus system uses two measures to help decide what routes are candidates for reduction or removal of service when budget cuts force action: subsidy per passenger and operating cost per revenue hour. All 35 providers in the region are required by state law to report on the same measures.

INTEGRATION INTO PLANNING, BUDGETING, AND DECISION MAKING

Transportation decisions take place in many processes. Not bringing performance information into the processes marginalizes its value. There must be a link to the budget process "or you have nothing," said one state planning division director. Details of integration are touched on throughout the paper, and selected examples are provided in Boxes 2, 3, and 4.

FIGURE 4 Percentage of congested miles of Twin Cities urban freeway system.
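The two transit screening measures described above reduce to simple ratios. The sketch below ranks routes by subsidy per passenger to flag candidates for service reduction; route names and all dollar and ridership figures are hypothetical, invented for illustration:

```python
# Sketch of the two transit screening measures: subsidy per passenger
# and operating cost per revenue hour. Route names and figures are
# hypothetical, for illustration only.

routes = [
    # (route, operating cost $, fare revenue $, passengers, revenue hours)
    ("Route 5",  1_200_000, 450_000, 600_000, 14_000),
    ("Route 74",   310_000,  40_000,  55_000,  4_100),
    ("Route 88",   520_000, 210_000, 240_000,  5_600),
]

def subsidy_per_passenger(cost, revenue, passengers):
    # Subsidy is the operating cost not covered by fare revenue.
    return (cost - revenue) / passengers

def cost_per_revenue_hour(cost, hours):
    return cost / hours

# Rank routes from highest to lowest subsidy per passenger; the
# highest-subsidy routes are the candidates for service reduction.
ranked = sorted(routes,
                key=lambda r: subsidy_per_passenger(r[1], r[2], r[3]),
                reverse=True)
for name, cost, rev, pax, hrs in ranked:
    print(f"{name}: ${subsidy_per_passenger(cost, rev, pax):.2f}/passenger, "
          f"${cost_per_revenue_hour(cost, hrs):.2f}/revenue hour")
```

Because every provider reports the same two ratios, comparisons of this kind can be made uniformly across all 35 systems in the region.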

BOX 2 Integration of Performance Management Practices with Decision Processes and Budget: Selected Examples

Governance—legislative

Florida: Sixty annual budget activity measures reported to legislature. Targets for pavement, routine maintenance, and bridge condition set in law.
Washington: Specific legislative mandate in nine measurement areas. Budget-based performance measurement requirements and performance agreements with governor. Additional voluntary tracking of system, organizational, and project delivery measures quarterly.
South Carolina: Legislature mandates annual reports.

Budget process

California: New administration moving toward performance-based budgeting (2004).
Florida: Input–output targets and unit costs required by legislature and governor for all activities, tracked by new activity-based budgeting system.
Minnesota: Performance information incorporated in biennial budget documents. Activity-based budgeting expenditure data being developed to provide unit costs for products and services.
Washington: Maintenance accountability. Performance information used by department and legislature in negotiating targets for service levels and budget allocations for maintenance (see Figure 5).
Ohio: Performance data for up to 70 measures used in allocating funds to districts. For example, a "steady-state" goal is pursued, defined as the rate of rehabilitation projects matching the rate of deterioration. "Normalization" process used to equalize conditions across the state by adjusting district goals and funding.
Wisconsin: State budget office requires at least two measures for every program in budget.

Executive strategic initiatives

California: Governor and cabinet secretary asked Caltrans and regional organizations to adopt common core performance measures for the transportation system in 2004.
Alaska: Missions and Measures citizen accountability program.
In 2003 the governor asked all agencies to set missions and goals and report on measures annually as part of their annual budget requests.

FIGURE 5 Maintenance Accountability Process, Washington State Department of Transportation: activity service level targets and service levels delivered.

BOX 3 Integration of Performance Management Practices with Plans and Programs: Selected Examples

Strategic plan

Minnesota: Rotating monthly reporting schedule initiated in 2003 for measures tied to objectives in strategic plan and statewide transportation plan.
New Mexico: Measures recently realigned and reduced at executive level to support five strategic plan priorities of new governor. Targets set for 2004, 2005, and 2006.
Pennsylvania: Measures and targets set for eight strategic and focus areas in strategic plan, with scorecards for executive management and chief engineer.
Montana: The strategic business plan explicitly requires that 95 percent of projects programmed be consistent with P3.

Long-range transportation plan

Minnesota: Performance-based plan for 2003 sets 33 measures for three strategic directions and 10 policies. Eight district plans under development in 2004 will operationalize the plan and define the 20-year performance-based funding needs to meet performance targets. Constrained plans will identify projects to address performance gaps. New freight plan and Highway System Operations Plan will also support state plan policies and targets.
Florida: Twenty-year Florida Transportation Plan defines goals and objectives for all state and local transportation organizations. Annual Short-Range Component Plan documents strategic goals and objectives for Florida Transportation Plan for periods up to 10 years. Supported by Florida Department of Transportation 10-year Program and Resource Plan, which sets targets and guides program and funding decisions. Sixty legislative outcomes and activity measures used.
Montana: First statewide plan led to development of P3. The most recent plan emphasizes system preservation and consideration of economic development in process. Performance-based programming is consistently supported.
Capital District Regional Transportation Plan (Albany, New York): Performance objectives and measures used to identify magnitude of gaps and support evaluation of alternative strategies in New Visions 2021 Regional Transportation Plan.
Maricopa Regional Transportation Plan (Phoenix, Arizona): Performance measures used to evaluate the strengths and weaknesses of various future transportation approaches and scenarios in the Maricopa Association of Governments Regional Transportation Plan.

Program and project selection

Florida: Decision support tool includes information on bridge, pavement, safety, mobility, vehicle miles traveled, economic development (under development), and other performance factors. It identifies deficiencies and prioritizes projects. Can do preservation–capacity trade-off analyses.
Southern California Association of Governments: Multimodal trade-off analysis tool applied to project analysis and selection process.
Wisconsin: Metamanagement system prioritizes projects on the basis of performance needs. Performance targets used to evaluate adequacy of program; analysis is fed back to districts.
Montana: Seventy percent of the capital program is distributed on the basis of P3, and 95 percent of projects entering the program are consistent with its performance-based budgets for districts, systems, and types of work. Preservation is the first priority and is funded first.

(continued)

BOX 3 Integration of Performance Management Practices with Plans and Programs: Selected Examples (continued)

Washington: Performance-based programming and prioritization process. Asset condition and system performance (levels of delay, crash type, and frequency) are used in determining level of investment in program areas.

Corridor plans

Colorado: "Corridor Visions" based on performance categories (mobility, safety, preservation) developed by citizens and rolled up into statewide transportation plan.
Minnesota: Travel speed targets used as basis for community-based corridor planning process. Deficient corridors targeted for projects.

BOX 4 Integration of Performance Management Practices with Operations and Management Processes: Selected Examples

Project delivery management

Virginia: Public online project delivery dashboards established in response to legislative demands after a crisis in program delivery. Reported quarterly to governor.
Washington: In-depth project delivery reporting in the Gray Notebook's beige pages. Comprehensive state web-based project status information posted.
San Francisco Bay Area MTC: Annual project performance report for operational projects provided to the public and the commission.
South Carolina: On-schedule, on-budget measures reports required by legislature for huge Cooper River bridge project.
Minnesota: Commissioner's bond-accelerated projects milestones, right-of-way, and costs monitored monthly at district video meetings and quarterly at executive meetings.
Florida: Florida Department of Transportation Executive Board and Transportation Commission monitor production management report, which shows percentage of project deliveries in compliance.
California: Project delivery milestones and budget status reported quarterly to the California Transportation Commission. Posted on the web.
Montana: Project delivery targets set annually. Quarterly reports posted on the web.
Delivery targets for total lettings and program mix are seen as the most crucial for achieving system performance objectives.

Maintenance and operations

Ohio: OPI—exception reporting monthly to executive leadership and district engineers. Index of some 70 measures including maintenance reviewed quarterly.
Pennsylvania: Maintenance First. Well-developed system of regular reporting on performance versus targets as well as customer satisfaction.
Washington: Maintenance Accountability Process. Advanced system of maintenance measures reporting calibrated on a common numeric scale. Legislature helps set targets and funds to the target levels.
Minnesota: Highway System Operations Plan. New 4-year plan under development to set target levels of service and identify funding gaps for pavement and bridge maintenance, traffic operations, safety operations, and facilities and fleet.
San Francisco Bay Area MTC: Targets and results reported annually for operations such as incident management, transit electronic fare system, and 511 traveler information.

The Florida Department of Transportation and other participants in the May 2004 peer exchange in Charleston, South Carolina, emphasized the importance of performance measures being driven by policy and not being an end in themselves. The general business management literature reinforces this point. "Strategic performance measures should be closely focused around a line of sight, driven from the strategy," according to a DePaul University expert in strategy design and balanced scorecard initiatives (4).

Programming and Project Selection

The Transportation Asset Management Guide indicates that performance information should directly inform the process that builds the recommended program and budget (1, p. 2-6).

Project Delivery Management

The Transportation Asset Management Guide indicates that "well-understood project delivery measures and procedures are used to track adherence to scope, schedule, and budget" (1, p. 2-6). The guide disdains the use of "ad hoc processes" to control projects on an exception basis and states, "A process exists and is enforced to approve changes in project scope, schedule and cost." While Minnesota and others have developed systems that use project milestones to monitor project schedule status, it appears that fewer organizations have developed effective measures and controls for project scope and costs.

The Transportation Asset Management Guide recommends that information on costs and project outputs be "maintained in a form that can be used to track program delivery" (1, p. 2-12).

Conclusion: Moving Toward Institutionalization

The examples presented in the boxes are a small sample of the progress being made toward integration. Integration—along with leadership support, regular reporting systems, customer monitoring, targets, effective mainstreaming of staff responsibilities, and building a culture that owns performance measurement—is essential in achieving institutionalization of performance management.
EMERGING STEPS IN DEVELOPMENT

Planning for Performance—Tools to Model and Manage Future Results

While the early years of performance measurement tended to rely on relatively passive retrospective monitoring of results, the future will rely more and more on planning for positive performance outcomes with the use of predictive tools. Among the tools for forecasting and evaluating results of investment options are modeling, scenario planning, benefit–cost analysis, life-cycle costing, and trade-off analysis. These tools are already being developed, used, and blended together. Among the most sophisticated integrated systems are those being used by Florida, Wisconsin, Montana, and the Southern California Association of Governments (SCAG). Minnesota has forecasting tools for pavement and interregional corridor mobility (see Figure 6) and shares these analyses with district planners. In Minnesota's decentralized system some statewide priorities are set, but districts do the integration by balancing competing targets and making trade-off decisions.

Modeling and Scenario Planning

The Transportation Asset Management Guide best practices recommend that performance reports provide scenario testing of trends in performance versus cost and optimal timing of preventive and corrective maintenance and provide information on benefit impacts of proposed investments (1, p. 2-7). For state-of-the-art practice it recommends "decision-support tools that facilitate exploration of capital versus maintenance trade-offs for different asset classes" (1, p. 2-12). Montana's work in this area was discussed earlier and offers a valuable case study.

Scenario planning tools are an emerging approach to modeling future performance on the basis of alternative sets of investments.
The Capacity Building Program of the Federal Highway Administration (FHWA) is providing technical assistance and financial support for the use of several software tools that bring together transportation and land use planning. CORE PLAN is being used

FIGURE 6 Percentage of interregional corridor miles meeting speed targets, Minnesota. (2003–2005 estimates based on projects under construction; 2004 estimate based on projects scheduled for completion.)

by Charlottesville, Virginia, among others, and PLACE 3S by the Sacramento Area Council of Governments. The tools allow inputting of demographics and land use and investment information, and they produce outputs for performance on level of service, access, and other measures. FHWA is looking for more interested parties and believes that small organizations with limited resources find these tools much easier to use than the old four-step model.

The Chicago, Illinois, region and the state of Utah are engaged in intensive visioning and scenario-planning efforts involving the public. For the Envision Utah project, investments are inputs, and performance results for measures such as transportation access are outputs. FHWA has held roundtable sessions with other interested organizations (www.chicagometropolis2020.org/ and www.envisionutah.org/).

In the Minnesota Department of Transportation's decentralized planning system, districts use performance information for all major policy goals in the statewide plan to forecast gaps in infrastructure performance through 2030. As shown in Figure 7, the districts develop two scenarios: a performance-based scenario and a fiscally constrained scenario. Key areas include pavement condition forecast, bridge structural condition, forecast interregional corridor travel speeds, regional trade center mobility, transit fleets, freight connectors, and safety. Pavement and interregional corridor travel speed have predictive models.

Aggregating the fiscal gaps in performance and the costs to close the gaps will provide a statewide estimate of unmet future infrastructure needs—something that some legislators and stakeholders have been asking for. Districts have been asked to make preservation their highest priority. Pavement targets should be achieved by 2014 and demonstrable progress made toward bridge targets by 2023 (see Figure 8).
The difference between the two scenarios will define the fiscal gap to meet long-range policy and customer-based performance targets. In addition to this process, in the short term some projects are nominated by districts and selected centrally on the basis of performance criteria.

Life-Cycle Costing, Benefit–Cost, and Other Optimization Tools

The Transportation Asset Management Guide best practices recommend, "Projects are evaluated in terms of realistic estimates of life-cycle costs, benefits, and performance impacts" (1, p. 2-6). It suggests that simply ranking the condition of assets and programming the "worst first" is not the answer.

Life-cycle costing tools have progressed in certain areas and have some way to go in others. Minnesota has expanded from doing benefit–cost analysis on large projects to setting targets on the basis of minimizing life-cycle cost for pavement, truck fleets, and transit fleets by using measures of years of life of the asset. Efforts to calibrate bridge targets to least life-cycle cost are under way. The Minnesota Department of Transportation's Bridge Office has estimated a 4:1 benefit for application of a priority package of bridge preventive maintenance treatments in combination with capital investments versus the historic approach, which has been almost entirely capital investment. Measures and target service levels are being developed as part of a larger new Highway System Operations Plan, which supports the statewide transportation plan.

Trade-Off Analysis

Development of trade-off analysis is a goal of transportation performance management. Among the organizations that have created tools to do it at some level are SCAG and the DOTs of Montana, Wisconsin, and Florida. The Florida package includes pavement and bridge, congestion, vehicle miles traveled, and safety performance factors. Florida and Montana are both working to add economic development analysis elements to their systems.
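A life-cycle comparison of the kind described above, preventive maintenance plus capital work versus capital reconstruction alone, can be sketched as a present-value calculation. All costs, timings, and the discount rate below are hypothetical, invented for illustration; this is not any bridge office's actual analysis:

```python
# Hypothetical life-cycle cost sketch: compare periodic preventive
# maintenance plus one mid-life rehabilitation against waiting for a
# single full reconstruction. All figures are invented for illustration.

def present_value(cash_flows, rate=0.04):
    """Discount a list of (year, cost) pairs to year-0 dollars."""
    return sum(cost / (1 + rate) ** year for year, cost in cash_flows)

# Strategy A: preventive treatments every 10 years, rehab at year 30 ($M)
preventive = [(10, 0.5), (20, 0.5), (30, 4.0), (40, 0.5)]

# Strategy B: do little until full reconstruction at year 35 ($M)
capital_only = [(35, 20.0)]

pv_a = present_value(preventive)
pv_b = present_value(capital_only)
print(f"preventive strategy: ${pv_a:.2f}M  capital-only: ${pv_b:.2f}M")
print(f"life-cycle cost ratio (capital-only / preventive): {pv_b / pv_a:.1f}")
```

The point of such a sketch is the one the guide makes: ranking assets "worst first" ignores the timing of costs, whereas discounting each strategy's cash flows makes the cheaper life-cycle path explicit.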
Figures 9 and 10 provide examples from Montana.

FIGURE 7 Scope of Minnesota Department of Transportation district plans. Plan horizon 2008–2030 (2008–2014, 2015–2023, 2024–2030); two scenarios: a performance-based plan (investments needed to meet targets by 2023) and a fiscally constrained plan (priorities for forecast revenues).

FIGURE 8 Guidance, Minnesota Department of Transportation district plans: priority among performance targets. System preservation is the top priority; allocate sufficient resources in the fiscally constrained plan to meet and maintain the pavement target by 2014 and make demonstrable progress toward the bridge target by 2023.

Conclusion

Further development and dissemination of these tools is a priority. Participants in the 2003 U.S. Department of
Transportation roundtable recommended upgrading management systems, such as PONTIS, to increase their capacity for forecasting. One cautionary note: while future-oriented tools are becoming essential to planners, participants in the roundtable cautioned that projected data are sometimes not as effective as historical data in making a case for investments to legislators and the public.

ADDITIONAL TOOLS

Geographic Information Systems—Mapping Performance

The Transportation Asset Management Guide state-of-the-art benchmarks recommend that performance information be based on a common geographic referencing system and a “common map-based interface for analysis, display and reporting” (1, p. 2-12).

Geographic information systems mapping is a promising tool for analyzing and communicating performance information. Numerous organizations show real-time congestion maps on websites and television. Maps of high-crash locations have dramatized safety problems in certain corridors in performance reports to Minnesota Department of Transportation management. Next winter, the Maintenance Office plans to communicate electronic maps of snow and ice removal time results by route throughout its eight districts for use by supervisors, drivers, and managers. Figure 11 illustrates the speed performance forecast for 2014 for the interregional corridor system in Minnesota.

Dashboards and Performance Measurement Software: Real-Time Performance Monitoring

The private sector has learned that automated performance reporting systems are valuable in reducing staff burden and increasing management and staff access to performance status data. Many DOTs use Microsoft Excel and Access applications. Some are experimenting with commercial business intelligence software. The U.S. Department of Transportation roundtable in October 2003 recommended setting specifications for commercial software vendors for transportation performance measurement and planning applications.
Dashboard gauges are a visual tool promoted by commercial vendors and used by private- and public-sector enterprises. They are especially useful in monitoring large volumes of performance information. A common application is real-time monitoring of project management, as on the Virginia Department of Transportation website (see Figure 12). Typically green means “on target”; yellow means “at risk, requires monitoring”; and red means “seriously short of target, needs intervention.” Gauges may be less effective in displaying trend information, which is critical for sound analysis. The Minnesota Department of Transportation has used dashboards effectively to monitor construction project development monthly, snow and ice removal monthly, and fleet management quarterly, as well as information technology support and other administrative areas.

Past crises in program delivery in Virginia led to the creation of public online dashboards. They track progress on advertisements, construction contract deadlines, construction contract budgets, and contract work orders (dashboard.virginiadot.org). As in Minnesota, this information helps stimulate project teams to solve problems, shift resources, make decisions to get lagging projects on track, or adjust schedules if no other choice is available. The Virginia Department of Transportation uses the dashboards to hold project managers accountable and reviews them at monthly commissioner updates.
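The green/yellow/red convention just described amounts to a simple thresholding rule. Here is a minimal sketch, assuming a higher-is-better measure; the 10 percent “at risk” band is an invented threshold for illustration, not any agency’s published rule.

```python
def dashboard_status(actual, target, at_risk_band=0.10):
    """Map a measure's actual value against its target to a gauge color.

    Assumes higher values are better; the 10% band is a hypothetical
    threshold chosen for this example.
    """
    if actual >= target:
        return "green"   # on target
    if actual >= target * (1 - at_risk_band):
        return "yellow"  # at risk, requires monitoring
    return "red"         # seriously short of target, needs intervention

print(dashboard_status(95, 90))  # at or above target
print(dashboard_status(85, 90))  # within the 10% band below target
print(dashboard_status(70, 90))  # well short of target
```

Real dashboards set the band per measure, and lower-is-better measures (e.g., cost or crash counts) invert the comparison.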
Virginia also posts a quarterly report card showing performance on core business outcomes—construction and maintenance contract schedules and budgets. The Minnesota Department of Transportation shares project status information with commissioner’s staff quarterly and posts summary information and dashboards on its public website (www.dot.state.mn.us/financing/#projects).

FIGURE 9 To predict performance as a function of budget, the Montana Department of Transportation uses scenario testing capabilities of bridge, congestion, and pavement management systems. (The chart plots performance over a 10-year horizon under three policies: high expenditures, increase over status quo; moderate expenditures, maintain status quo; and no additional expenditures, do nothing.)

FIGURE 10 Trade-offs in performance goals (Montana). Some goal areas take more resources to change. NHS = National Highway System.

Minnesota is piloting application of commercial software (Hyperion Performance Scorecard) to map its hierarchy of measures. In winter 2004–2005 the Maintenance Office will test the software to manage and report large volumes of snow and ice removal data by district, route, and event. The Florida Department of Transportation, which has a large number of measures, now has a performance measure for the percentage of key measures reports automated. It relied on Access database software but is in the second year of implementing commercial software (PB Views). It expects that the software will eventually enable all managers and employees to see to which measures in a hierarchy they contribute. Implementation is tied to a Baldrige quality initiative, the Sterling model.

Auditing Performance Data

As performance management matures and data become essential in decision making and funding, the obligation to guarantee accurate, auditable data arises. The Governmental Accounting Standards Board has recognized performance data as an essential component of sound management. Texas was one of the early states to require auditing of performance measures data for all state agencies. The practice appears to be growing, sometimes with specific applications in transportation. In New Mexico the Legislative Finance Committee conducts audits every other year. The state DOT has initiated its own audits of performance data.

FIGURE 11 Interregional corridor system speed performance forecast, by corridor, 2014 (Minnesota).

In Florida the inspector general does a required yearly audit, including extensive
reviews of data collection and processing methods. All 60 measures in the Florida Short-Range Component Plan must document replicable processes.

In 2003 the Washington State legislature created a Transportation Performance Audit Board consisting of legislative leaders and citizens. It is charged with reviewing performance measures used by several state transportation agencies and recommending performance audits (ltc.leg.wa.gov/tpab/default.htm). A request for proposals has been published, and a consultant will review and evaluate Washington State Department of Transportation measures.

Transportation organizations may wish to initiate their own processes. Use of third-party auditors may enhance credibility. Caltrans has requested its audit section to work with all program areas to ensure that proper paper trails exist for performance data. In Ohio, where funding and individual performance reviews depend on performance data, the state DOT has a nearly full-time person auditing district data. Deputy directors must sign off on monthly reports of the OPI.

Measuring and Managing Cost-Efficiency and Competitiveness

The Transportation Asset Management Guide recommends as a best practice that an agency know “its cost for delivering its programs and services” by activity or class. Furthermore, it recommends periodic evaluation of options for delivery, such as internal delivery, intergovernmental partnerships, outsourcing, or managed competition (1, p. 2-11).

The trend today is toward focusing performance management on a few vital strategic objectives. But financial managers and legislators in some jurisdictions want universal accountability for all products and services. This direction has also gained momentum from pronouncements of the Governmental Accounting Standards Board.
The DOTs of Florida and Minnesota are developing activity-based budgeting (ABB) systems aimed at answering the following questions: What is the unit cost of each transportation product and service? What is the performance result for each additional dollar invested? What are the cost and result for in-house delivery versus other options for delivery of the services? Florida’s legislature and governor have directed development of ABB unit cost information and input–output measures and targets for every activity in the budget. One early result is 200 additional pages of budget documents. Minnesota is at an earlier stage, with the final scope and extent yet to be determined. A productivity and cost management task force is developing information for the “plan delivery” budget activity.

FIGURE 12 Project dashboard, Virginia.
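The first ABB question—unit cost per product or service—can be illustrated with a toy calculation. The activities, outputs, and dollar figures below are hypothetical, not Florida or Minnesota data.

```python
def unit_cost(total_cost, units):
    """Cost per unit of output for one budget activity."""
    return total_cost / units

# Hypothetical activities: name -> (annual cost in dollars, output units, unit label)
activities = {
    "resurfacing": (24_000_000, 400, "$/lane-mile"),
    "bridge inspection": (1_800_000, 1_200, "$/inspection"),
}
for name, (cost, units, label) in activities.items():
    print(f"{name}: {unit_cost(cost, units):,.0f} {label}")
```

Tracked over time or compared against an outsourced bid, such unit costs are what let an agency answer the in-house-versus-contract question in the same terms for every activity.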

Monitoring Consultant Results—Performance-Based Contracting

Expansion of performance-based contracting beyond traditional construction project incentives is an issue for the near future. Maintenance, information technology, and project development all offer opportunities.

Benchmarking the Industry

Comparison of performance is done intensively in many if not all major industries. It is done for transit systems via required reporting to the National Transit Database. Benchmarking of highway system and transportation organization performance across states and metropolitan areas is not highly developed or accepted except in selected areas, such as congestion. Diverse definitions and data collection processes complicate the issue. Nevertheless, if the transportation community does not agree on common measures, it is likely that others will define them. An American Association of State Highway and Transportation Officials initiative, NCHRP Project 20-24(37), Strategic Performance Measures for State Departments of Transportation: Benchmarking Performance, is exploring identification of shared measures. Figure 13 shows an example of benchmarking transit measures in Minneapolis–St. Paul, Minnesota.

LESSONS LEARNED

Looking back on the past decade or two of experience in learning performance management, transportation organizations surveyed offered these lessons:

• Keep it simple (Pennsylvania). Make measures understandable. Make the data reporting as simple and consistent as possible (Ohio, Washington, Minnesota). Stick with key measures—don’t try to measure everything (Florida).
• However, one size will not fit all. Be flexible in your approach—you cannot simplistically apply the same practices for measures, targets, and reporting to all areas of the organization (Washington).
• Start the process by taking small steps, then commit for the long haul. Use the best; don’t wait for the perfect. Don’t wait endlessly for the perfect measures and the perfect data.
Do it and learn and adjust as you go. Even the most experienced states and MPOs are still refining their measures. First get your system established, then work continuously to make it more effective over time (Montana, New Mexico, Washington, Minnesota).
• Be dynamic and keep evolving. You are never done. When the environment changes because of a new administration or new public concerns, performance measurement must adapt (Washington).
• The practice of measurement must be aligned with and driven by policy or management priorities—it does not exist as an end in itself (Washington, Florida, Minnesota, Montana).
  – The framework goes from goals to objectives to policy to measures to targets.
  – There must be a clear link from policy to programming to performance monitoring.
• Combine system outcome measures and organizational measures. The latter are more likely to be input or output measures (California).
• Getting local partners involved is vital, especially in states with many large and strong regional organizations (California).

Many of these points were reinforced by the international scan (3). The draft scan added another lesson: incorporate postevaluations. Compare the outcomes of transportation investments with the performance benefits projected before the investment was made.

ISSUES AND CHALLENGES

Several challenging issues deserve further exploration if we are to continue progressing: how many measures to use, individual performance accountability, balancing strategic vision with use of measures, and cooperation between states and regional organizations.

How Many Measures? Steering with the Vital Few Versus Keeping All Areas Accountable

A prevailing view on the issue of how many measures to use is to limit them to those aligned with strategic and management priorities.
FIGURE 13 Benchmarking transit measures, Metropolitan Council, Minneapolis–St. Paul, Minnesota: operating cost per revenue hour, 1996–2002, Metro Transit versus a six-system peer average.

The
report from the 2004 international scan recommends in its conclusions, “Don’t measure too many things. . . . In those situations where large numbers of performance measures were considered, a lack of focus resulted” (3, pp. 98–99).

A contrasting view is emerging. Some financial managers and legislatures wish to have measures for all products and services, regardless of their strategic priority. This means knowing the unit cost, productivity, and effectiveness of all activities. The benefit is to benchmark the competitiveness of the organization versus alternative modes of delivery or to identify opportunities for process improvement. As mentioned previously, without indicating the extent desirable, the Transportation Asset Management Guide recommends knowing the costs of programs and services and examining delivery options. The development of ABB and activity-based costing tools by the DOTs of Florida and Minnesota and others is making this more feasible.

The Minnesota Department of Transportation’s solution to managing large numbers of measures is, in concept, to delegate appropriate measures and submeasures to each level of an organization, within the strategic framework. Done properly, this can avoid measures overload while enabling all to understand their responsibilities. Executive staff have more high-level system and customer measures, while functional areas have more process and output measures to which they can actually manage day by day. Commercial software packages facilitate the mapping of measures hierarchies and the calculation of composite performance results.

Individual Performance Accountability as a Management Tool

Private-sector enterprises typically align individual performance goals and performance reviews with organization measures and goals. This approach has started to emerge in public-sector transportation organizations.
In Caltrans, all nine deputy directors have had annual written performance agreements with the Caltrans director. They include goals, objectives, and operational performance measures aligned to Caltrans’s strategic plan. Some of the same measures, such as those for project delivery, are also reported to the California Transportation Commission and the legislature. Division directors under the deputies also have performance agreements.

Ohio has an advanced level of accountability linking individual managers’ performance reviews to progress on department and district performance targets. Monthly state- and district-level exception reports of the comprehensive OPI are reviewed by the 12 district engineers and the deputy director. Two-year business plan targets are set for 2004, 2006, and 2008. All districts are expected to know their goal deficiencies and to set goals to close the gaps. As a result of civil service reforms originating in the total quality management initiative of the 1990s, career managers’ performance reviews are tied to progress on the measures. In addition, funding is tied to performance targets. Districts with system deficiencies may get additional funding to close their gaps, but failure to make adequate progress can put a manager into a probationary period.

The Washington State Department of Transportation, like that of Ohio, has gained major new gasoline tax revenues and is moving toward stronger accountability for individual managers in delivering priority projects on schedule and on budget.

The general management literature points to the need for some kind of feedback cycle: “If performance measurement systems are to operate as organizational catalysts rather than as mere historical records, there must be an interaction between the degree of performance observed by the system and the rewards and sanctions impacting on actors within the business.
If this is not the case, then the degree of impact of the system will be lessened dramatically” (5).

How widely and deeply the practice of integrating managers’ and employees’ performance accountability with organizational performance objectives will be adopted remains to be seen. Experience shows that lack of accountability systems to manage project delivery can trigger imposition of accountability tools by elected officials.

Strategic Vision and Innovation: The Limits of Managing with Measures and Targets

There are many issues in the appropriate use of performance targets. First, when are measures and targets not the right planning tool? Second, when should targets be set? Third, how aggressively should they be pursued? Finally, under what conditions can they be adjusted to meet changing realities?

The role of planners and management is to be strategic. Organizations seeking major breakthroughs will not always find guidance in measures. Measures typically apply to what is being done today, not to new strategies and activities. For example, several years ago under Governor Jesse Ventura, the Minnesota Department of Transportation had as one of its strategic objectives to “increase multimodal transportation options,” with supporting performance measures. Governor Ventura led construction of the state’s first light rail line amid great controversy. As is typical of an entrepreneurial venture, its initial benefit–cost ratio was not high. With the system now up and running, the owner-operator, the Metropolitan Council,
now has the responsibility to track its performance and optimize cost-effectiveness.

The Washington State Department of Transportation exercises caution in setting targets for areas that lack mature baseline data. The department rightly asserts that management targets need to be achieved. Once targets are set for strategic objectives, organizations must pursue them steadfastly to sustain the momentum and credibility of their goals. Leaving the door too wide open can erode the power of target setting.

At the same time, Washington and others acknowledge that sometimes “aspirational” or “stretch” targets are necessary to motivate action and innovation. In Minnesota, the 2003 state transportation plan set moderate and aggressive targets to reduce highway fatalities. Initially there was trepidation among safety engineers and district engineers about being accountable for reversing trends. But 2 years later, the targets have helped stimulate formation of a multiagency state–local task force to develop a comprehensive highway safety plan addressing the areas that state DOT engineers said were outside their influence: education, enforcement, and emergency response.

Performance targets should be serious and credible but should not dictate decisions mechanistically. Such an approach would negate executive managers’ responsibility to make decisions. Meyer states in the concluding lessons from the international scan, “Performance measures position you well to engage in debate, but may not necessarily be the determining factor in a decision, especially in the legislative arena. Measures sharpen and focus the debate” (3).

The Florida Department of Transportation has a balanced, flexible arrangement with its legislature. Annual targets are set for 60 required budget measures, and results are reported at the end of each fiscal year in June. At midyear, the legislature allows midcourse adjustment of targets, if justification is provided.
This approach has contributed to realistic planning and a good relationship between the two bodies.

In general, shorter-term targets, such as those for operations and budget planning, should be achievable and pegged to available resources. Where there is a major performance gap, short-term targets guide incremental gains toward long-term targets. More aggressive long-term targets can be maintained at the same time as part of long-range plans. They should embody customer expectations, sound engineering, and optimum results with minimum life-cycle costs.

Cooperation Between States and Regional Organizations

The problem of differing measurement frameworks among states and regional organizations within states was an important concern discussed at the U.S. Department of Transportation performance measures roundtable in 2003. It is an issue for all states but is especially challenging in large states with many MPOs.

In California planning is conducted in 43 regions, including MPOs and county commissions, which receive 75 percent of funding. Progress in establishing performance-based planning has been hampered by the lack of consensus on measures for policies long established in California’s state transportation plan. This year, with a new governor and cabinet secretary, the logjam is being broken. They have asked Caltrans to work with regional organizations and other stakeholders to agree on a common core set of system performance measures by summer 2004 and begin reporting them in the fall. If it is successful, this breakthrough may serve as a guiding light for other areas of the country.
SOURCES

Notes from the following were used in the preparation of this paper: the session of the American Association of State Highway and Transportation Officials Standing Committee on Planning, Charleston, South Carolina, May 2–5, 2004; the Transportation Research Board Peer Exchange on Performance Measurement, Charleston, South Carolina, May 6, 2004; the session on Using Business Systems for Accountability and Public Awareness of the Southern Transportation Finance Conference, May 17, 2004; and the U.S. Department of Transportation’s Roundtable on System Performance Measurement in Statewide and Metropolitan Transportation Planning, Washington, D.C., October 7–9, 2003.

The following organizations were interviewed or surveyed: the California Department of Transportation; the Federal Highway Administration; the Florida Department of Transportation; the Metropolitan Council, Twin Cities, Minnesota; the Metropolitan Transportation Commission, San Francisco Bay Area, California; the Minnesota Department of Transportation; the Montana Department of Transportation; the New Mexico Department of Transportation; the Ohio Department of Transportation; the Pennsylvania Department of Transportation; and the Washington State Department of Transportation.

REFERENCES

1. Cambridge Systematics, Inc. Transportation Asset Management Guide. American Association of State Highway and Transportation Officials, Washington, D.C., 2002.

2. Poister, T. H. NCHRP Synthesis of Highway Practice 326: Strategic Planning and Decision Making in State Departments of Transportation. Transportation Research Board of the National Academies, Washington, D.C., 2004.
3. Meyer, M. Summary: International Scan on Performance Measures. Draft. May 2004.
4. Frigo, M. L. Strategy-Focused Performance Measures. Strategic Finance, Vol. 84, No. 3, 2002, pp. 10–15.
5. Carlin, T. Simplifying Corporate Performance Measurement. Australian CPA, Vol. 69, No. 11, 1999, pp. 48–50.

RESOURCE PAPER

Linking Performance-Based Program Development and Delivery

Patricia G. Hendren, Lance A. Neumann, and Steven M. Pickrell, Cambridge Systematics, Inc.

As citizens, system managers, and legislatures nationwide demand enhanced levels of accountability, many transportation agencies have turned to performance measurement to improve the planning, programming, and delivery of transportation projects and services. Performance information is also being used to monitor the transportation system by demonstrating not only that successive decisions lead to the identification, selection, and funding of the most effective projects and services but also that they are then delivered efficiently and produce the intended results. The combination of performance-based program development, project delivery, and system monitoring creates an effective, efficient, and accountable transportation management structure.

State departments of transportation (DOTs), metropolitan planning organizations (MPOs) and councils of governments, system operators such as transit agencies and toll authorities, local transportation agencies, and citizen oversight groups have all experimented with performance measures to varying degrees to predict, shape, and report on the results of system investments and operations. Performance-based long-range transportation plans (LRTPs) and capital or transportation improvement programs are increasingly common and illustrate the desire of agencies to demonstrate that their planning and programming processes are founded on fundamental goals and objectives that have the support of elected officials and the general public.

In addition to plan and program development, the delivery of transportation projects and services (simply, “the program”) has also come under greater scrutiny, particularly from elected or appointed officials and watchdog organizations.
The success of an agency’s revenue proposals increasingly depends on a proven track record of program delivery. In many states and regions, special-purpose revenue measures have been developed to fund a specific set of projects and services, and the implementing agencies must demonstrate delivery of the funded program, given the clear link between the funding measure (e.g., an incremental sales tax or bridge toll) and the voter-approved list of projects and services.

System condition and performance reports, which make available current information about the actual condition and performance of various components of an agency’s modal or multimodal systems, are also becoming more popular, particularly as the Internet has made wide distribution of such information easier and less costly. Agencies have also started releasing data on the timeliness and cost of delivered projects. This last component of performance-based management, system monitoring and reporting, brings full circle the use of measures to assess need, prioritize solutions, implement, and gauge results.

Although all three components—program development, delivery, and monitoring—are essential to performance-based management, the focus of this paper is on the linkage between program development and program delivery. Establishing a relationship between program development and program delivery will enable agencies to guide transportation decisions from conception to implementation. This paper begins with a discussion of the application of performance measures to the identification
and selection of projects and services and to subsequent delivery of the transportation program. Next, the importance of implementing these two components of performance-based management and the relationship between the processes are assessed. The paper concludes with an assessment of some of the challenges of further connecting program development and delivery and a few suggestions as to how these challenges can be addressed. With limited examples from transportation agencies that have fully connected these two procedures, the issues raised in the paper are designed to identify the need and opportunity to advance performance-based management of transportation systems.

USE OF PERFORMANCE MEASURES IN PLANNING AND PROGRAM DEVELOPMENT

A notable degree of progress in performance-based program development has been made since the passage of the Intermodal Surface Transportation Efficiency Act, which established the requirement that state DOTs develop multimodal transportation plans, similar to the requirement for MPO long-range plans. Initially, states’ long-range planning processes produced numerous goals and objectives, often with extensive public outreach, but the resulting transportation program was often not well connected with those goals (1). The addition of specific performance measures and analytical procedures to predict the benefits and impacts of alternative investment scenarios has improved the connection between higher-level system goals and the resulting plan, whether it be a policy plan or a more specific long-range list of projects.

Beyond the development of the LRTP or modal plans, some agencies have used performance measures more aggressively to drive prioritization and selection of projects for inclusion in a capital improvement program. The process of “programming” projects in accordance with periodic agency budgets has historically been a common point of divergence between system goals and the actual projects.
Many factors other than system condition or performance enter the programming decision process, such as regional formulas for distribution of funds, the need to keep construction or maintenance crews active, and political influence. While performance-based programming is unlikely to eliminate these other considerations and factors, it does allow agency managers to put more emphasis behind actual performance returns as a rationale for implementing certain projects rather than (or sooner than) others.

Performance measures, when introduced into the planning process, are useful in a number of ways (2):

• Establishment of a link between statewide goals and projects. Performance measures help clarify goals by causing planners and decision makers to be specific about what they are trying to achieve through investment of public funds in transportation system improvements. For example, total hours of congestion and pavement condition by facility class are two measures that help define broader goals such as mobility or system preservation.
• Prioritization and selection of programs and projects. Performance measures can be used to identify programs and projects with the highest return per dollar invested or those that will improve system performance the most overall.
• Accountability. Clarifying the decision process behind program and project selection creates an atmosphere of trust between transportation agencies, elected officials, and the public. Performance measures also provide agencies with the means to demonstrate the benefits of public expenditures.
• Allocation of funds and other program resources. Agencies have used performance measures to guide budgetary decisions. For example, some transit agencies incorporate performance measures (e.g., operating cost per passenger mile) and single-dimension factors (e.g., ridership) into the allocation of resources to various services (3).
• Trade-off analysis.
Performance measures can help sort out and address multimodal demands at trans- portation agencies that have responsibility for more than one modal system. Trade-off analysis becomes more rational when clearly defined objectives and mea- sures of performance have been articulated. For exam- ple, mutually exclusive corridor or program investments can be compared on performance metrics to identify not only the benefits of choosing Project A over Project B but the costs (or impacts) of not choosing B. • External and internal communication. Reports documenting performance measure data help clarify the implications of existing transportation programs, whether positive or negative, to elected officials and the public. Use of performance measures can also help cre- ate a unified message throughout an agency or depart- ment by making common goals and priorities a clearer part of the agency culture, much in the way that job site safety statistics have long been used in the construction industry to create an awareness and culture of accident prevention. • Benchmarking. Performance measures have been used by some agencies to compare their performance with that of peer agencies or to help set achievement goals (e.g., the Oregon Benchmark Report). The cumulative effect of performance measurement in planning and program development is both long- range guidance and near-term project selection that promises to deliver the system goals and objectives that 1 2 2 PERFORMANCE MEASURES TO IMPROVE TRANSPORTATION SYSTEMS 99395mvp129_152 12/13/05 12:43 PM Page 122

were adopted, usually with buy-in from the public, special user groups, and elected officials. Whether these are traditional highway system goals such as safety, mobility, and preservation or more multidimensional goals such as environmental protection, context-sensitive solutions, and economic development, an agency can point to a long-range plan and capital program of projects and services that have been selected because of their ability to deliver performance in these areas.

The application of performance measures in program development can be seen in a variety of examples around the United States. A performance-based LRTP, PennPlan, was adopted by the Pennsylvania Department of Transportation in January 2000. PennPlan outlines 10 statewide goals and lists action items and objectives that are linked to the broad goals (4). To monitor the attainment of the statewide goals, the department developed performance measures for each objective and target dates for implementation. Figure 1 shows an example of the relationship between the goals, objectives, measures, and targets. PennPlan also identified 29 corridors of statewide significance and established objectives for each corridor. The combination of statewide and corridor-specific objectives provides continuity between state and local goals. The performance measures in PennPlan help define policy direction and provide the means to report program results. Currently, under new leadership, the Pennsylvania Department of Transportation is updating PennPlan, which may change the role of performance measures. As part of this process, the department is evaluating how performance measures can be further used to select projects and help guide resource allocation decisions.

Several state DOTs have used performance measures to establish a more direct link between system plans and project selection.
The Arizona Department of Transportation has recently substantially overhauled and updated its long-range transportation plan, MoveAZ, the core of which is now the identification, performance-based evaluation, and ranking of a 20-year program containing specific projects, consistent with recently enacted state law. An important component of the new draft plan is its linkage to the programming process. While different performance measures are used in the long-range plan and 5-year program, the "planning to programming" aspect of MoveAZ formalizes the connection between the planning and programming processes through common goals and procedures, and the system needs set forth in the LRTP are better linked to the performance impact of selected projects.

A clear connection between statewide goals and project selection has also been pursued by the Montana Department of Transportation, where the Performance Programming Process (P3) is used to distribute funds and select projects for the State Transportation Improvement Program (STIP) on the basis of congestion and pavement management systems (5). P3 was started in 1995 with the Montana Department of Transportation's new LRTP, TRANPLAN21, which emphasized the linkage between policy goals and project selection. Before the implementation of P3, funding allocation to Montana's infrastructure was based on lane miles or roadway attributes. P3, on the other hand, evaluates project selection options on the basis of the attainment of the performance objectives outlined in the LRTP and resource constraints. By adopting P3, funding is allocated to ensure that overall system goals are met. Currently, approximately 70 percent of the state's capital program is designated under the P3 process.
The Montana Department of Transportation believes that the P3 system keeps the planning focus on the customer, helps unify the various divisions under common goals, and enables the department to monitor its progress and thus improve accountability (6).

Although the application and degree of integrating performance measures in program development vary across transportation agencies, the clear benefits of measures ensure that they will continue to play an important role in transportation system management.

FIGURE 1 Connection between goals and performance measures in PennPlan (4).
Goal: Maintain, upgrade, and improve the transportation system.
Objective: Improve pavement ride quality.
Performance measure: International roughness index.
Targets (percentage of miles rated poor):
• Interstates: <5% by 2002, <1% by 2005
• National Highway System: <10% by 2002, <5% by 2005
• Other department roads: <20% by 2002, <15% by 2005

USE OF PERFORMANCE MEASURES IN PROGRAM DELIVERY

In addition to developing a sound transportation program, agencies are confronted with the difficult task of delivering highway and transit projects within expected scope, time frame, and budget. The combination of funding shortfalls, continual demand for additional capacity, public perception of construction delays, and cost overruns has intensified the focus on project delivery management. Some agencies must demonstrate successful project delivery before receiving additional funding. Political pressure is a driving force behind the increasing demand for performance reporting. For example, as a result of legislative and public opinion, a blue ribbon panel was created to evaluate Missouri Department of Transportation fiscal management. One result of this assessment was the production of a semiannual report, Dashboard of Performance, to document project delivery performance data (7).

Project costs and schedules inevitably change for a variety of reasons, including environmental mitigation requirements, utility relocation, right-of-way acquisition, inconsistencies regarding contingencies, and design errors (8, 9). Several transportation agencies also report that work is sometimes added to a project ("scope creep") after approval (10). Although these changes to project scope, schedule, and cost are likely to occur, performance-based management of program delivery will help an agency to assess and understand the reasons for the changes and to take assertive, corrective action where warranted. Applying performance measures to project delivery can lead to more effective utilization of funds.

Performance-based management of program delivery relies on measures to summarize project costs, track changes in project scope, monitor changes in delivery schedule, and minimize negative impacts of individual project-level changes on the overall program. Transportation agencies can use the performance measure data to explain project changes to elected officials and the public and thus improve accountability for public funds. Maintaining accurate project delivery information assists in the estimation of future human resources and cash flow needs across multiple years. In addition, regular reporting of project delivery performance increases its strategic importance with senior administrators and highlights the vital role of effective project management procedures and staff.
The role of performance measures in project delivery can be summarized as follows:

• Promote efficient program management and delivery.
• Minimize unnecessary or avoidable changes to scope, cost, and schedule to maintain the integrity and intended impact of the approved program.
• Track the number, extent, and cause of scope, budget, and schedule changes and use these data over time to identify process improvements.
• Communicate results both internally and externally.
• Demonstrate accountability.

The concept of monitoring the delivery of projects is simple in theory but can be difficult to execute. In 1997 the General Accounting Office (GAO) reported that data on the cost of highway and bridge projects were not readily available from the Federal Highway Administration or the states. Without sufficient data, it was difficult to understand the magnitude and cause of cost increases, which highlighted the need for better project cost management (9). The true timing of project delivery can also be unclear because of variation in the delivery phase at which schedule tracking begins, revised "baselining" of schedules, and completion date adjustments during construction (8).

Since the 1997 GAO report was released, several state DOTs have enhanced their project management practices through the use of performance measures. For example, the Arizona Department of Transportation created the Program and Project Management Section (PPMS). This group tracks several project delivery performance measures on a monthly basis, including percentage difference of final contract cost from original bid, planned versus actual construction projects advertised, planned versus actual construction projects awarded, and percentage difference between actual days worked and original contract days. An important aspect of the PPMS system is that the Arizona Department of Transportation has set targets for several of these measures (e.g., 90 percent of the STIP projects should be awarded).
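Delivery measures of the kind tracked in Arizona are simple ratios of actual to planned values. The following minimal sketch computes two of them, cost growth from original bid and time growth from contract days, plus an award-rate check against a 90 percent target; the project records, field names, and dollar figures are hypothetical illustrations, not data from any agency system.

```python
# Illustrative sketch of PPMS-style project delivery measures.
# All project records and field names below are hypothetical.

def pct_change(final, original):
    """Percentage difference of a final value from an original value."""
    return 100.0 * (final - original) / original

projects = [
    {"bid": 4_000_000, "final_cost": 4_300_000, "contract_days": 200, "actual_days": 230},
    {"bid": 1_500_000, "final_cost": 1_450_000, "contract_days": 120, "actual_days": 118},
]

for p in projects:
    cost_growth = pct_change(p["final_cost"], p["bid"])
    time_growth = pct_change(p["actual_days"], p["contract_days"])
    print(f"cost growth {cost_growth:+.1f}%, time growth {time_growth:+.1f}%")

# Planned-versus-actual awards, checked against a target such as
# "90 percent of STIP projects should be awarded."
planned_awards, actual_awards = 50, 46
award_rate = 100.0 * actual_awards / planned_awards
meets_target = award_rate >= 90.0
print(f"awarded {award_rate:.0f}% of planned projects; target met: {meets_target}")
```

The value of such measures comes less from the arithmetic than from computing them the same way every month, so that trends and targets are comparable across reporting cycles.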
Establishing performance goals for project delivery is an important element in further improving accountability and cost containment (9).

A few state legislative mandates requiring transportation agencies to improve the documentation of project delivery performance have been passed recently. In 1995 the California legislature required the California Department of Transportation (Caltrans) to

provide a report to the Legislature that proposes and evaluates performance measures for all major capital outlay support functions, including project studies, project development, right-of-way acquisition, and construction oversight. The department shall propose measures that 1) provide an accurate measure of annual efficiency, as well as 2) provide a consistent basis for year-to-year comparisons, and 3) evaluate both the department's cost and its timeliness in completing work. Furthermore, the department shall demonstrate that each measure that it proposes can be accurately generated from the department's existing or planned information systems. (11)

Since 1995, Caltrans has released annual project management reports that document 12 performance measures evaluating capital outlay functions. The reports summarize the status of the measures and, where available, present historical data and agency progress toward established targets. Figure 2 is an example of the annually reported performance data Caltrans uses to

evaluate the timing of projects. The measures tracked in Figure 2 indicate Caltrans's success in completing the design of programmed projects within or ahead of schedule.

Several states have implemented systematic tracking protocols to monitor various aspects of project delivery. The New Jersey Department of Transportation's performance-based project management procedures were highlighted in the 2003 American Association of State Highway and Transportation Officials (AASHTO) report Strategies for Reducing Highway Project Delivery Time and Cost (8). The department's cost accounting system tracks all expenses associated with the delivery of highway projects, including indirect costs eligible under the Transportation Equity Act for the 21st Century. Its tracking and scheduling system sets a baseline schedule that cannot be changed and monitors the project budget. In addition, the department has created a performance-based rating system that evaluates a project on the basis of completion time, safety, environmental compliance, pavement smoothness, and air voids. The rating system is designed to reward or penalize contractors on the basis of project completion time (8). The combination of a cost accounting, project tracking, and rating system provides the New Jersey Department of Transportation with the tools to manage project delivery effectively.

The Virginia Department of Transportation tracks the delivery of its 6-year program through an interactive web-based tool, the project dashboard. The dashboard system evaluates project delivery efforts (advertisement, construction contract deadlines, construction contract awards, and construction contract work orders) on the basis of a three-point scale: green (on track), yellow (warning of potential problems), and red (problem exists). The straightforward presentation of project delivery data makes this information accessible to a wide audience.
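A three-point dashboard rating of this kind can be expressed as a simple threshold rule on observed variances. The sketch below is a hypothetical illustration; the 5 and 15 percent cutoffs are assumptions for the example, not VDOT's published criteria.

```python
# Illustrative green/yellow/red project dashboard rating.
# The variance thresholds are assumptions, not VDOT's actual criteria.

def dashboard_status(pct_schedule_slip: float, pct_cost_growth: float) -> str:
    """Rate a project green (on track), yellow (warning of potential
    problems), or red (problem exists) from its schedule slippage and
    cost growth, both expressed in percent over plan."""
    worst = max(pct_schedule_slip, pct_cost_growth)
    if worst <= 5.0:        # small variances: on track
        return "green"
    if worst <= 15.0:       # moderate variances: warning
        return "yellow"
    return "red"            # large variances: problem exists

print(dashboard_status(2.0, 1.0))    # project close to plan
print(dashboard_status(12.0, 4.0))   # schedule slipping
print(dashboard_status(3.0, 30.0))   # major cost overrun
```

Taking the worst of the two variances keeps the rating conservative: a project cannot show green overall while any one dimension is in trouble.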
The performance trends for project delivery (on time and on budget) are also documented in the quarterly report card. The dashboard was created in response to legislative and public questioning of the department's fiscal management practices. The enhanced transparency of operations produced by the dashboard and the report card has not only improved the agency's project management practices but also strengthened the Virginia Department of Transportation's standing as an accountable agency.

Even though the use of performance measures for project delivery management varies across transportation agencies, there are several commonly used performance measures. The majority of measures address cost and time elements; however, some agencies are also tracking the safety and quality of project work. Table 1 presents several performance measures currently used by agencies to monitor project delivery.

Most existing performance-based project delivery systems focus on current and historical performance data to evaluate progress over time. A few agencies set targets for selected performance measures to help focus project delivery improvements. It is uncommon to use project delivery measures to compare agencies, and a recent AASHTO report cautions against benchmarking project delivery measures without an in-depth analysis of how project costs are categorized (8).

FIGURE 2 Caltrans project management measures: time growth (12). Percent delivered/programmed, by fiscal year (targets: projects >90%, dollars >100%):
• FY 1997/1998: projects 89%, dollars 117%
• FY 1998/1999: projects 91%, dollars 117%
• FY 1999/2000: projects 95%, dollars 104%
• FY 2000/2001: projects 95%, dollars 109%
• FY 2001/2002: projects 91%, dollars 112%

For example, a

state with a more sophisticated cost accounting system that captures all costs charged to a project may register a higher design and construction inspection/engineering cost than a state with a less comprehensive system.

Compared with the development and use of performance measures in transportation planning and programming, only a small body of research currently documents performance-based project delivery. Questions concerning the selection of appropriate measures, reporting cycles, and data sources remain. Agencies are under increasing pressure to track and communicate project delivery performance, and they have access to a limited number of tools. Although the "dashboards" produced by a few agencies such as the Virginia Department of Transportation have been well received, it is unclear whether this approach will be able to provide the project delivery details required by some oversight agencies (7). In addition, existing project delivery measures fail to address the impact of a project on transportation system performance (7). Agencies may also be interested in the cost savings associated with implementing performance-based project delivery. Useful insights may be provided soon by some recently initiated studies (e.g., National Cooperative Highway Research Program Project 8-49, Procedures for Cost Estimation and Management for Highway Projects During Planning, Programming, and Preconstruction). However, the absence of project delivery performance guidebooks, documentation of the state of the practice, and calculation of potential cost savings and the lack of postproject delivery evaluation tools clearly expose the need for further research.

LINKING PROGRAM DEVELOPMENT AND DELIVERY

Performance-based management of a transportation system is made up of three components: program development, program delivery, and system monitoring and reporting.
The focus of this paper has been on the first two components, and we have documented in the previous sections how performance measures have been applied to program development and program delivery. The next level of sophistication and benefits occurs when existing performance-based program development and project delivery processes are linked.

The linkage of performance-based program development and program delivery in an agency is rare. In fact, many agencies are engaged in only one of these processes, which reduces the overall effectiveness of their performance-based management. For example, a performance-based program development process in isolation may identify the best projects to fund but may not guard against excessive scope creep, schedule slippage, or cost escalation. As a result, the program as delivered may cost more than promised and deliver less in terms of system condition and performance. If costs rise and the amount or quality of work diminishes, the benefit–cost ratio of the program is almost certain to be measurably lower than expected. Conversely, a performance-based delivery process in isolation may result in the efficient delivery of a program that includes marginal projects, and again the ultimate effectiveness of the program may be reduced. By executing both performance-based program development and project delivery, the most effective set of projects is not only selected but also implemented efficiently.

We have documented how the implementation of performance-based program development and delivery varies across transportation agencies. Although there are some similar benefits to using performance measures in both processes, there are also distinct differences. A recognition of how performance measures vary between program development and delivery is important in beginning to understand the benefits and how to link the two processes.
Table 2 provides a comparison of characteristics of performance measures for program development and delivery.

A graphical display of a performance-based management structure further illustrates why agencies should link program development and project delivery (see Figure 3). Component 1, program development, typically begins with establishing agency goals and objectives that are in turn monitored through performance measures. On the basis of resource constraints, performance targets are set and projects and programs are identified and selected on the basis of performance criteria intended to lead a transportation agency toward its goals and objectives. For example, a state DOT could identify the goal "preserve the existing system," with the related performance measure "percentage of highway miles with acceptable pavement condition." In turn, a project selection (or program budgeting) criterion would be the project's estimated impact on highway pavement condition. The relationship between performance targets and project selection is an iterative process based on changing needs, available resources, and political support.

TABLE 1 Project Delivery Performance Measures (13, 14)

Cost
• Ratio of preliminary engineering, environmental, design, right-of-way, or construction costs to total project costs
• Administrative cost by project
• Actual project cost versus award costs
• Dollars per project mile
• Percentage of unprogrammed costs (e.g., errors in materials)
• Number of contract change orders and costs

Schedule
• Percentage of construction projects completed on time
• Actual versus planned project award or advertisement schedules
• Additional days required to complete project
• Percentage of STIP delivered by year planned

Safety
• Number of accidents in construction zones

Quality
• Evaluation of contractor's work by certification acceptance (field review of project)
• Percentage of engineering work requiring rework
• Survey of contractors evaluating construction process
• Project manager evaluation of contractors

Component 2, project delivery, begins when preselected projects are passed off to the delivery team. A performance-based process uses measures to evaluate and monitor project implementation (e.g., percentage of construction contracts completed on time). Figure 3 illustrates not only that selection and delivery of projects and programs are separate processes but also that there are two distinct groups of performance measures. Measures in one group relate to project selection and are linked to agency goals and objectives, while those in the other focus on delivering projects. Typically, different groups of people are involved in developing each set of measures, which further separates and works against effective linkage of the two processes.

Although there are different performance measures and procedures associated with program development and program delivery, it is the delivery of projects that produces the result (i.e., system performance). The third performance-based management component, system monitoring, reports on the performance changes that are due to implemented projects and programs. Although the delivered projects and programs are ideally selected through performance-based program development, the delivery of the program is what forms the foundation on which transportation goals and objectives will be met. This fact highlights the importance of linking program development and delivery.

To date, there are few examples of successful linkage between performance-based program development and delivery. However, agencies' past experience with the two components of performance-based management highlights several effective approaches.
To form a connection between program development and delivery, transportation agencies need to communicate the value of selected projects to the delivery team and clarify how these projects were chosen. Although the performance measures that guide program development and delivery are different, a worthwhile exercise would be bringing together planners and project managers to discuss how the two processes enable an agency to reach its goals or even to discuss potential common measures. In creating this type of strategic linkage, the balanced relationship between program development and delivery should be emphasized; both are necessary components of a successful approach.

TABLE 2 Comparison of Performance Measures for Program Development and Delivery

Key objective
• Planning and program development: allocating resources to programs and projects to achieve system performance goals.
• Program delivery: delivery of selected programs and projects as efficiently as possible, with minimal impact on cost, scope, and schedule.

Types of measures
• Planning and program development: system condition and performance (pavement condition, congestion, safety, etc.).
• Program delivery: measures addressing costs, scope, schedule, and work safety and quality.

Data reporting frequency
• Planning and program development: data collected and reported over long periods of time; the impact of selected programs and projects on the transportation system is often not known for several years.
• Program delivery: project delivery information tracked on a regular basis (annually, monthly, and even weekly).

External factors
• Planning and program development: existing external factors (driver behavior, demographics, etc.) affect performance.
• Program delivery: unexpected changes in external factors affect performance.

Challenges
• Planning and program development: selecting measures; data availability; analytical tools to predict performance; external factors; defining expenditure impacts on system performance; monitoring over time.
• Program delivery: selecting measures; data availability; tracking project changes; external factors; assessing the impact of program/project changes on system performance.

FIGURE 3 Performance-based management structure.
• Component 1, Program Development: goals/objectives → system performance measures to monitor progress on meeting goals/objectives → performance targets for planning/programming given resource availability → program development and project selection.
• Component 2, Project Delivery: specific set of programs and projects with a defined budget, schedule, and scope → performance measures to track delivery → program/project implementation.
• Component 3, System Monitoring and Reporting: result = system performance.
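The selection step in Component 1, ranking candidate projects on a performance criterion and funding them within resource constraints, can be illustrated with a small greedy sketch. The candidate projects, their costs, and the single pavement-condition benefit measure below are hypothetical; real selection processes weigh many criteria and constraints.

```python
# Illustrative sketch of performance-based project selection: rank
# candidates by performance benefit per dollar and fund within budget.
# Project names, costs, and benefit estimates are hypothetical.

candidates = [
    # (name, cost in $M, estimated lane miles restored to
    #  acceptable pavement condition -- the selection criterion)
    ("Resurface I-10 segment", 12.0, 85.0),
    ("Rehabilitate SR-87", 6.0, 50.0),
    ("Widen US-60", 30.0, 40.0),
    ("Preserve rural collectors", 8.0, 45.0),
]

budget = 30.0  # $M available this programming cycle

# Rank by benefit per dollar invested (the "highest return" rule),
# then fund down the ranked list until the budget is exhausted.
ranked = sorted(candidates, key=lambda p: p[2] / p[1], reverse=True)

selected, spent = [], 0.0
for name, cost, gain in ranked:
    if spent + cost <= budget:
        selected.append(name)
        spent += cost

print(selected, spent)
```

A greedy ranking like this is only a first cut; agencies typically iterate against targets, regional distribution rules, and political constraints, as the surrounding text notes.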

An important aspect of a strategic linkage between program development and delivery is that the criteria for evaluating and accepting scope changes need to be consistent with the criteria used for selecting the projects in the first place. In this way, the effect and benefits of the project with respect to system goals remain consistent with the original intent and assessment of the project. For example, if benefit–cost ratio was a criterion in selecting a project, it would be prudent for project managers to reevaluate this criterion when proposed scope changes are considered. If the revised scope and cost would result in a substantially different benefit–cost metric, this should be considered before accepting the scope change. While often the sunk costs and momentum of a large project dictate that work must progress, reevaluation of the cost-effectiveness may result in further scope (or overall program) refinements that minimize negative impact on the program. This information should be gathered with the assistance of, or at least be communicated to, the program development team.

The sharing of knowledge during and after project delivery is also vital to this strategic linkage. Information needs to flow not only from the program development team to the project delivery team but also from the project delivery team back to the program developers. To select projects and programs effectively in the future, an understanding of why a program changed or why the individual projects within a program changed is vital. Did cost issues derail a project, or was there politically facilitated "queue jumping"? Another important piece of information is the impact of project delivery changes. If only 75 percent of the project was able to be delivered because of cost escalation, how were the necessary scope changes determined?
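The benefit–cost recheck suggested above for proposed scope changes is straightforward to sketch. In the hypothetical example below, the benefit and cost figures and the 10 percent erosion tolerance are assumptions for illustration, not any agency's actual policy.

```python
# Illustrative recheck of a project's benefit-cost ratio when a scope
# change is proposed. All figures and the tolerance are assumptions.

def bc_ratio(benefits: float, costs: float) -> float:
    """Benefit-cost ratio of a project (same units for both inputs)."""
    return benefits / costs

original = bc_ratio(benefits=45.0, costs=30.0)   # ratio at selection: 1.5
revised = bc_ratio(benefits=40.0, costs=36.0)    # after the proposed change

# Flag the change for program-development review if the ratio has
# eroded by more than an agreed tolerance (10 percent here).
eroded = (original - revised) / original
flag_for_review = eroded > 0.10

print(f"B/C {original:.2f} -> {revised:.2f}; review needed: {flag_for_review}")
```

The point of the flag is procedural, not mathematical: it routes a materially weakened project back to the team that originally justified its selection.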
If, for example, one or more of the proposed direct connector ramps had to be deferred or dropped entirely in a major freeway-to-freeway interchange reconstruction project, how were the most desirable scope changes (i.e., those with the least impact) determined? If only some of the projects in the overall program were delivered, how was it decided which projects would be deferred, and what was the impact on the original program goals? While such compromises and sacrifices are not uncommon, the lessons learned during the delivery of a project could provide useful information for future program development cycles if the details are tracked and reported.

A good example of the benefit associated with the "knowledge linkage" between program development and delivery is the impact of the Washington State Department of Transportation quarterly performance report, Measures, Markers, and Mileposts (also known as the Gray Notebook). This report includes performance measures that address program development as well as project delivery. The combination of reported measures has helped diminish agency and program silos and create a new level of collaboration throughout the agency (15). The Gray Notebook has given project managers and senior managers the means to discuss and diagnose the department's transportation program. Recently, the department underwent a reorganization that resulted in programming, system analysis, system planning, and strategic assessment responsibilities being housed under one division. This change further improves links between planning and programming. In addition, the information presented in the Gray Notebook will be the foundation for the next statewide transportation plan.

Another means of connecting program development and program delivery to each other is to make efficient project delivery itself a high-level agency goal.
The goals and objectives that guide program development typically do not address the delivery of projects, just as measures used to evaluate project delivery do not often link back to agency goals, other than in terms of general efficiency and accountability. The goals established at the highest level of performance-based management should be influenced by both program development and delivery decisions and actions and therefore create a natural link between the two processes.

For example, the Missouri Department of Transportation produces a semiannual report documenting the department's progress toward three goals: "take better care of what we have," "finish what we've started," and "build public trust" (16). The department's progress is evaluated on the basis of 16 performance measures that are linked to one of the three goals. The majority of the reported performance measures are common program-oriented indicators (e.g., fatality rates, deficient bridges). However, the department uses four project delivery measures to assess how well it is "finishing what we've started." Including a project delivery objective as an element of broad agency goals or vision elevates project delivery performance to a departmentwide level. This helps raise awareness of project delivery and communicate its importance to staff and the public.

CHALLENGES TO LINKING PROGRAM DEVELOPMENT AND PROGRAM DELIVERY

To link program development and program delivery effectively, several issues need to be addressed, including time and resource constraints, internal organizational restrictions, external factors, and internal and external communication challenges.

A successful connection between performance-based planning and project delivery will require a joint effort between those involved in each process. Since transportation agencies are struggling to implement existing programs and projects, it will be a challenge to dedicate resources (funding and staff time) for this effort.
Initial steps, such as jointly reporting and publishing program

development and delivery data as in the Washington State Department of Transportation's Gray Notebook, will facilitate the essential process of knowledge sharing. Ideally, the end result of linking the two procedures will be a more efficient use of resources, although reaching that point will take time and reallocation of internal resources.

The organizational structure of a transportation agency will also affect performance-based management. Some agencies have firewalls, intended or not, between the various functions such as programming, project development, environmental clearance, procurement, and construction management. The separation of the activities can lead to some cost efficiencies but may negatively affect the flow of information necessary for a comprehensive performance-based process. A recent AASHTO report focusing on project delivery concluded that states whose employees could perform many functions and understood the project delivery process experienced more efficient handling of workloads. Multifunctional staff would be a key component in linking performance-based program development and project delivery.

As in setting up any aspect of a performance-based management system, support from top management and buy-in from those who must implement the system are essential. If it is perceived as a cumbersome but nonessential reporting process, a performance management system will fail to achieve its central objective (i.e., efficient delivery of an effective program of projects). To increase employee support for performance-based management, many agencies have carried out training and meetings to further explain the benefits (e.g., the Minnesota Department of Transportation). Other agencies have established a more direct link between staff and performance outcomes to create ownership over an agency's success (7).
External factors that influence the development of performance-based programs and project delivery will also affect an agency's ability to improve the connection between the two processes. For example, the relatively short time frame for results expected by elected officials in many states influences project selection and delivery. Elected officials may be impatient to show immediate results in terms of project completion, regardless of whether the projects deliver the magnitude of improvement that was anticipated during the planning and selection phases. Furthermore, as legislative turnover occurs because of term limits or other factors, changing priorities can undermine the longer-term objectives of a coordinated performance-based programming and project delivery process. Practitioners report difficulty in balancing the short-term interests of elected officials with the longer-term perspective they have as system owners and conservators. Similarly, there is always pressure to spend the allocated funds even if the available projects are not ideal. Thus, the shorter time frame for project budgeting and construction relative to the planning and programming processes may work against linking program development and delivery. A clearly articulated agency goal such as "efficient program delivery," supported by a culture of performance-driven decisions, will help agencies stay the course even as they respond to external change forces and wavering political support.

Improved communication and data management are both requirements and key benefits of connecting performance-based program development and project delivery. Internally, agency staff need to improve the speed and quality of information transfer between program development and delivery functions. For example, the lag time between construction activities and receipt of accurate cost-to-complete data inhibits the process of reassessing project cost-performance metrics.
If agencies are to assess the potential impact of changes to the delivered program (e.g., deciding which projects to defer to allow completion of a project in significant overrun) in advance of making the decision, they need current information on cost to date and cost to complete. The time frame for making decisions on scope changes is constrained by the high cost of keeping construction crews and equipment fully utilized; it is no doubt difficult for most agencies to reassess program impacts on short notice because of data limitations and staff availability, even if this level of cooperation and coordination exists between the development and delivery functions. Creating a more open dialogue and free exchange of timely information between planners, program developers, and the project delivery team will enhance an agency's overall performance management.

An equally important element of communication is a frank and honest presentation of information to high-level decision makers, external stakeholders, and customers. Even though the data may show some inefficiencies or problems, it is important to convey the full story to create a lasting atmosphere of trust and credibility. However, there can be resistance to documenting problems with either program development or delivery. Effectively communicating technical aspects of the two processes can be challenging, and the means of communication may vary according to the audience (e.g., internal program managers, policy makers, lay public, stakeholder agencies). On the whole, explaining system performance expectations and results may be easier than explaining the more technical project delivery data, but decision makers and stakeholders would benefit from a better understanding of the major factors that contribute to delays and overruns in project delivery.
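The cost reassessment described above reduces to simple arithmetic: projected final cost is cost to date plus estimated cost to complete, compared against the programmed budget. The following sketch illustrates the idea; the function name, dollar figures, and 10 percent tolerance are invented for illustration, not drawn from the paper.

```python
def assess_overrun(budget, cost_to_date, cost_to_complete, tolerance=0.10):
    """Flag a project whose projected final cost exceeds its budget.

    Returns (projected_cost, overrun_fraction, flagged). The tolerance is an
    illustrative threshold (0.10 = flag anything more than 10% over budget).
    """
    projected = cost_to_date + cost_to_complete
    overrun = (projected - budget) / budget
    return projected, overrun, overrun > tolerance

# Illustrative project: $10M budget, $6M spent, $5.5M still estimated to finish.
projected, overrun, flagged = assess_overrun(10_000_000, 6_000_000, 5_500_000)
```

With current cost-to-complete figures, a program manager could run this screen across all active projects to see which candidates might be deferred before the overrun decision is forced.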
SUMMARY

Performance-based management is grounded in three components: program development, project delivery,

and system monitoring and reporting. The degree of implementation of these three components varies across transportation agencies. However, a large number of agencies have incorporated performance measures of some form into their transportation system management. As this practice expands, the benefits of improving connections between program development and program delivery become more apparent and significant. Performance-based program development will increase the likelihood that the selected program of projects has the capability to improve the effectiveness of the system. Performance-based project delivery ensures a more efficient delivery of those program benefits. And ongoing monitoring and reporting of the results lead to incremental improvements in both processes, greater awareness of the benefits of system investment, and improved agency accountability to the public and elected officials. By building the relationship between program development and program delivery, agencies will guide transportation decisions from conception to implementation more proactively and consciously.

As the relationship between program development and delivery strengthens, goals common to the two processes will become clear. Even though program development measures will differ from project delivery measures, common agency goals will help link the processes. In addition, linking program development and delivery will improve communication across the agency (e.g., between planners and programmers) as well as with external stakeholders.

A number of internal and external challenges exist in linking the program development and delivery processes. As is often the case, the best approach to addressing these challenges varies from one agency to another, depending on its key objectives, organizational structure and governance, and resources.

REFERENCES

1. Cambridge Systematics, Inc.
NCHRP Web Document 26: Multimodal Transportation: Development of a Performance-Based Planning Process. Transportation Research Board, National Research Council, Washington, D.C., 1999.
2. Pickrell, S., and L. Neumann. Use of Performance Measures in Transportation Decision Making. Presented at the Conference on Performance Measures to Improve Transportation Systems and Agency Operations, Irvine, Calif., 2000.
3. Stanley, R. G., and P. G. Hendren. TCRP Synthesis of Transit Practice 56: Performance-Based Measures in Transit Fund Allocation. Transportation Research Board of the National Academies, Washington, D.C., 2004.
4. PennPlan Moves! Pennsylvania Statewide Long-Range Plan—2000–2025. Pennsylvania Department of Transportation, 2000.
5. Straehl, S. S., and L. A. Neumann. Performance Programming: Guiding Resource Allocation to Achieve Policy Objectives. In Transportation Research Record: Journal of the Transportation Research Board, No. 1817, Transportation Research Board of the National Academies, Washington, D.C., 2002, pp. 110–119.
6. Performance Programming Process. Montana Department of Transportation, 2000.
7. Bremmer, D., K. C. Cotton, and B. Hamilton. Emerging Performance Measurement Responses to Changing Political Pressures at State Departments of Transportation: Practitioners' Perspective. In Transportation Research Record: Journal of the Transportation Research Board, No. 1924, Transportation Research Board of the National Academies, Washington, D.C., 2005, pp. 175–183.
8. Strategies for Reducing Highway Project Delivery Time and Cost. American Association of State Highway and Transportation Officials, Washington, D.C., 2003.
9. Transportation Infrastructure: Cost and Oversight Issues on Major Highway and Bridge Projects. GAO-02-702T. U.S. General Accounting Office, Washington, D.C., 2002.
10. Transportation Research Circular E-C062: Addressing Fiscal Constraint and Congestion Issues in State Transportation Planning.
Transportation Research Board of the National Academies, Washington, D.C., 2004. gulliver.trb.org/publications/circulars/ec062.pdf.
11. Project Management Performance Report 1995–1996. California Department of Transportation, Sacramento, 1995.
12. Project Management Performance Report 2001–2002. California Department of Transportation, Sacramento, 2003.
13. Rogge, D., T. Carbonell, and R. Hinrichsen. Evaluation of Oregon Department of Transportation Project Delivery: Literature Review and DOT Survey. Oregon State University, 2003.
14. CTC and Associates. Highway Construction: Program and Project Performance Measures. Wisconsin Department of Transportation, 2003.
15. Howard, C. Washington's Gray Notebook: Performance Journalism and Its Implications for Planning. Presented to the AASHTO Standing Committee on Planning, Charleston, S.C., 2004.
16. MoDOT Dashboard Measurements of Performance. Missouri Department of Transportation, 2004.

RESOURCE PAPER

Issues and Challenges in Using Existing Data and Tools for Performance Measurement

Louis H. Adams, New York State Department of Transportation
Frances D. Harrison and Anita Vandervalk, Cambridge Systematics, Inc.

Many transportation agencies seek to improve business processes by expanding the use of performance measurement without making significant additional investments in data collection and analysis tools. The purpose of this paper is to highlight technical issues associated with the use of existing data and tools for performance measurement in a transportation agency. Common challenges are identified and recommendations are included so that agency staff can anticipate and address the challenges in a proactive manner.

INTRODUCTION

Transportation agencies seeking to implement performance-based planning and budgeting methods rarely have the luxury of embarking on completely new data collection efforts and acquiring new information systems and analysis tools in support of these efforts. Agencies must rely, for the most part, on existing data and tools. This is not necessarily a hardship—most agencies that provide transportation facilities and services typically collect considerable amounts of data about system condition, performance, supply, and demand. The challenge is how to take best advantage of the data collection resources and tools that are in place.

Implementing or improving a performance measurement system need not be costly, time-consuming, or resource intensive. A simple initial program can be established by using a single indicator that is of interest to an executive-level sponsor, with targets indicating whether performance ratings are satisfactory or need improvement. The program can evolve incrementally over time as the agency learns from the initial experience.
It can add more measures; improve the sophistication of measures; further integrate the use of the measures into strategic and tactical decision-making processes; and communicate results to a wider group of customers, partners, and stakeholders. This evolutionary model of performance measurement improvement (as opposed to a "big bang") is typical in agencies with successful programs.

One common misconception is that performance measurement is synonymous with data collection: "If only we had the resources to collect better, more detailed data, then we could implement performance-based budgeting properly." While the importance of good data cannot be ignored, there are many examples of agencies that have plenty of data but cannot or do not put the data to good use. The success of a performance-based planning and programming effort in an agency depends on a host of "downstream" activities. Among them are processing and quality checking the raw data, integrating data from different sources for mapping and analysis, transforming raw data into meaningful information to be reported as performance measures, developing trend information, projecting into the future for purposes of target setting and what-if analysis, and providing tools to decision makers to give them access to performance information in a convenient and useful way.
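The "downstream" activities just listed can be made concrete with a toy sketch: quality checking raw records, applying a measure definition, and developing a trend for reporting. All records, field names, bounds, and cutoffs below are invented for illustration only.

```python
# Toy illustration of downstream steps: quality-check raw records, integrate
# them with an inventory, transform them into a measure, and trend it.
raw = [
    {"segment": "A", "year": 2003, "iri": 85},
    {"segment": "B", "year": 2003, "iri": 140},
    {"segment": "B", "year": 2004, "iri": 999},   # implausible value to screen out
    {"segment": "A", "year": 2004, "iri": 95},
]
inventory = {"A": "Interstate", "B": "Primary"}   # integration: segment -> system

# 1. Quality check: drop roughness readings outside illustrative bounds.
clean = [r for r in raw if 0 < r["iri"] < 500]

# 2. Transform raw data into a reportable measure: share of observations
#    rated "rough" (IRI above an illustrative cutoff of 120) in a given year.
def pct_rough(records, year, cutoff=120):
    obs = [r for r in records if r["year"] == year]
    return 100 * sum(r["iri"] > cutoff for r in obs) / len(obs)

# 3. Develop trend information by comparing successive years.
trend = pct_rough(clean, 2004) - pct_rough(clean, 2003)
```

The point is not the specific measure but the chain: each reported number depends on screening, integration, and transformation steps that happen well after data collection.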

There also are challenges on the business side with respect to using performance data effectively for decision making and ensuring that the more mundane but critical processes and responsibilities for data processing, analysis, and distribution are working as well as possible. Each of these elements of performance measurement—data collection, processing, integration, management, analysis tools and methods, dissemination, and use in the business process—is important to the ultimate success of the effort. Agencies should determine which elements need more attention and develop a balanced strategy for improvement. Typically this strategy will require effort on multiple fronts:

• Measuring the right things at a level of detail appropriate to what they will be used for;
• Taking advantage of current technologies and tools for data collection, processing, and analysis;
• Making the best possible use of existing data and legacy systems;
• Enhancing tools over time to provide better decision support; and
• Building the staff capability and commitment required to ensure quality information and analyses that are actually used to make decisions.

The scope of this paper is limited to performance measures related to the transportation system and service provided to system users as opposed to the performance of transportation agency personnel or organizational units. The paper also focuses on those measures used for high-level strategic (rather than tactical) decision making. Such performance measures allow executives and managers to identify key trends and conditions, understand causal factors, and act on the information.

DATA AND TOOLS FOR SPECIFIC TYPES OF MEASURES—EXAMPLE APPLICATIONS

This section discusses four categories of performance measures:

• Infrastructure condition and deficiency measures,
• Mobility measures,
• Safety measures, and
• Customer service measures.
For each of these four categories, common types of measures and current issues related to the use of existing data and tools are reviewed briefly. Then an example is provided to illustrate how agencies are making effective use of existing data and tools for performance measurement. The examples provide the background for the more general guidance provided in the following section of the paper.

Infrastructure Condition and Deficiency Measures

Condition is often included as a key performance measure to indicate how well the agency is preserving the substantial investments that have been made in infrastructure. It also is often used as a proxy measure for the quality of service provided to transportation system users. Examples of infrastructure condition measures are average ride quality, percentage of asset length or count by condition range or category, remaining life, bridge health index, and bridge deck condition.

Measures of functional deficiency are used to describe how well transportation facilities are serving their intended purpose. Examples of infrastructure functional deficiency measures include number or percentage of load-posted bridges, percentage of miles not meeting shoulder width standards, and percentage of underpasses with height postings.

Measures of backlog or need can be derived on the basis of standards for condition or functional deficiency. For these types of measures, it is best to establish precise criteria for what constitutes a need so that identification of a need can be an automated process. Because agency policies change over time, it is useful to provide tools that can derive deficiency or backlog measures from physical characteristics of facilities (lane and shoulder widths, load ratings, sign reflectivity, etc.) on the basis of varying definitions for what constitutes a deficiency.
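The idea of deriving deficiency measures from stored physical attributes under a policy-dependent definition can be sketched as follows. The facility attributes, field names, and thresholds are invented for illustration; an agency would substitute its own inventory fields and policy criteria.

```python
# Sketch: derive a deficiency/backlog measure from physical attributes, with
# the definition of "deficient" kept as adjustable policy parameters.
def deficient(facility, min_shoulder_ft, min_load_rating_tons):
    """Apply the current policy definition of 'deficient' to one facility."""
    return (facility["shoulder_ft"] < min_shoulder_ft
            or facility["load_rating_tons"] < min_load_rating_tons)

facilities = [
    {"id": 1, "shoulder_ft": 2, "load_rating_tons": 40},
    {"id": 2, "shoulder_ft": 6, "load_rating_tons": 40},
    {"id": 3, "shoulder_ft": 8, "load_rating_tons": 15},
]

def pct_deficient(facilities, **policy):
    flags = [deficient(f, **policy) for f in facilities]
    return 100 * sum(flags) / len(flags)

# The same inventory yields different backlog figures as the definition changes.
strict = pct_deficient(facilities, min_shoulder_ft=4, min_load_rating_tons=20)
lenient = pct_deficient(facilities, min_shoulder_ft=2, min_load_rating_tons=20)
```

Keeping the thresholds as parameters, rather than hard-coding them, is what lets the tool re-derive backlog estimates automatically whenever policy changes.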
The use of remaining life as a performance measure is one approach that allows agencies to compare performance across different classes of assets. This requires reasonable estimates of the expected life of different types of assets under varying circumstances (traffic, environmental conditions, construction methods, maintenance practices). Agencies using this measure will need to address the eventuality that regardless of the expected life values they set, there will be some facilities in operation with a remaining life of zero (or less than zero), which may be difficult to explain to the public.

Asset management systems are indispensable tools for performance measurement in this area, with capabilities to maintain inventory and inspection data and to store condition trends over time. Some systems also provide capabilities for performing analysis to understand how future investment levels and patterns could affect system performance.

Example: The Montana Department of Transportation's Performance Programming Process (P3) establishes pavement performance targets for different portions of the roadway system (Interstate, National Highway System, and primary). Performance measures for pavement are the average ride quality and the percentage of system length in poor condition. The department's Planning Division makes extensive use of a

pavement management system (PMS) to report current pavement performance and to develop resource allocations to work types and systems that will allow the department to meet the established performance targets for each system in each district.

The PMS already was in use within the department's Materials Bureau before the implementation of P3 in 2000. Its primary application was for site-specific analysis and determination of appropriate pavement treatments. Pavement condition data collection procedures were well established, and the system had performance curves and decision rules needed to predict future performance and simulate performance impacts of alternative budget levels. The department undertook several activities to make use of this tool for performance-based planning:

• Coordination and agreement with the Materials Bureau on data updating protocols and schedules, including responsibilities for updating the list of programmed pavement projects to be considered by the simulation;
• Minor modifications to the software to provide reporting of aggregated performance results by the three system categories used for performance reporting;
• Development of new spreadsheet-based tools that take the system's outputs and produce charts and graphs needed for P3;
• An annual quality assurance process consisting of running the simulation, reviewing network-level results against prior trends, and correcting the modeling parameters and data as needed; and
• Iterative runs of the system to determine the best allocation of resources across work types and districts to meet established performance targets.

This example shows how infrastructure management systems can be used to provide technical information and analysis that are needed to support a performance-based planning process.
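The aggregation step in the Montana example (reporting PMS results by system category) amounts to mileage-weighted rollups of segment-level output. A minimal sketch follows; the segment data, field names, and the "poor" ride-quality cutoff are invented for illustration, not Montana's actual values.

```python
# Sketch: roll segment-level PMS output up to the system categories used for
# performance reporting (e.g., Interstate, NHS, primary). Data are invented.
segments = [
    {"system": "Interstate", "miles": 10, "ride": 4.2},
    {"system": "Interstate", "miles": 5,  "ride": 2.0},
    {"system": "Primary",    "miles": 20, "ride": 3.0},
]

def report(segments, system, poor_below=2.5):
    """Return (mileage-weighted average ride, percent of miles in poor condition)."""
    subset = [s for s in segments if s["system"] == system]
    total = sum(s["miles"] for s in subset)
    avg_ride = sum(s["ride"] * s["miles"] for s in subset) / total
    pct_poor = 100 * sum(s["miles"] for s in subset if s["ride"] < poor_below) / total
    return avg_ride, pct_poor

avg, poor = report(segments, "Interstate")
```

Weighting by mileage (rather than averaging segments equally) is the design choice that keeps a few short, rough segments from distorting a systemwide figure.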
It illustrates how systems that are implemented by one organizational unit (e.g., one concerned with facility inspection and development of maintenance, rehabilitation, and repair strategies) can be adapted by the unit responsible for performance-based planning.

Safety Measures

Examples of safety measures are number and rate of fatalities, injuries, run-off-the-road crashes, pedestrian crashes, heavy-vehicle crashes, impaired driver crashes, repeat offender crashes, uninsured driver crashes, and unlicensed driver crashes.

In view of a new federal strategic goal to reduce by 9,000 the number of annual deaths attributable to highway crashes, safety data needs and new levels of data integration must be given more attention before the 2008 implementation target. More than half of the fatalities are on rural two-lane roads, and half of those are off the state highway system. Effective design of a program to reduce fatalities will require accurate crash records to be readily available in context with highway inventory and built environment attributes in a geographic information system (GIS) analysis framework across multiple government functions and levels of jurisdiction.

Lack of standardization of crash data collection criteria and methods within and across states and changes to these criteria and methods over time have made it difficult to benchmark one jurisdiction against another and to develop valid trend information for nonfatal crashes. Time-consuming manual data entry procedures for police crash reports also have presented problems of timeliness and accuracy in safety data, particularly with respect to establishment of reliable accident locations that can be correlated with highway design and condition attributes. Progress is being made in all of these areas, however, as states implement uniform crash-reporting procedures and automated processes.
Example: The New York State Department of Transportation's safety goal is to reduce deaths, injuries, and total accidents. Three intervention strategies are safety capital projects; safety enhancements implemented within capital projects that are programmed for other purposes; and highway maintenance actions such as signs, delineation, traffic control devices, and other low-cost accident countermeasures. The accomplishment target is an average annual reduction of 1,500 accidents occurring at identified high-accident locations (HALs) on the state highway system, which would result in annual reductions in accident costs of $80 million. The project selection criteria are to address and treat HALs cost-effectively, reduce severe accidents at the lowest possible cost, and engineer accident countermeasures into all capital projects.

HALs are identified by comparing prevailing accident rates with average rates for similar facilities. Sites with rates at least three standard deviations above the mean are then evaluated in detail with the use of collision diagrams. Expected accident reductions attributable to engineering countermeasures are calculated from historical before-and-after studies. Benefit–cost analysis is required for all safety capital improvements. Those with benefit–cost ratios greater than 1.0 are programmed for implementation within 5 years of problem identification. Tallies of expected accident reductions and accident cost savings are kept in the program and project management system database. Postimplementation monitoring is used to refine accident reduction factors. Regions that do not propose sufficient safety goal accomplishment in their

biennial program update proposals are directed by executive management to rebalance their program proposals appropriately.

A web-enabled safety information management system is the technological centerpiece for conducting network-level statistical analysis. Linear referencing of crash locations with data from the traffic-monitoring and highway inventory systems is essential for calculation of accident rates. Availability of HALs in a GIS environment is one of many mappable attributes of the highway network.

This example illustrates many successful techniques for using existing data and tools for effective performance measurement: linkage of an outcome-oriented performance measure to targets and specific project selection criteria and procedures, monitoring of program impacts through a combination of output-oriented measures (projects implemented) and modeled impacts, use of the existing GIS tool to disseminate crash information widely within the agency for use by decision makers, and use of GIS to integrate the existing data sources (accident data and traffic data) needed to derive useful performance measures (crash rates) from raw data (crashes).

Mobility and Reliability Measures

Examples of mobility and reliability measures are annual average daily traffic per lane mile, average travel rate (minutes per mile), nonrecurring delay, incident-related delay, travel time index (median reliability measure), planning time index (95th percentile reliability measure), and percentage of vehicle miles of travel under congested conditions.
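The two reliability indices named above have standard forms: the travel time index divides a typical travel time (here the median, following the parenthetical above) by the free-flow travel time, and the planning time index uses the 95th percentile travel time instead. A sketch with invented corridor observations; the nearest-rank percentile is a deliberate simplification:

```python
# Sketch: travel time index (TTI) and planning time index (PTI) from archived
# travel times on one corridor. Observations (minutes) are invented;
# free-flow time is assumed to be 10 minutes.
times = sorted([10, 11, 12, 12, 13, 14, 15, 18, 22, 30])
free_flow = 10.0

def percentile(sorted_vals, p):
    """Nearest-rank percentile; intentionally simple for this sketch."""
    k = max(0, int(round(p / 100 * len(sorted_vals))) - 1)
    return sorted_vals[k]

tti = percentile(times, 50) / free_flow   # median-based travel time index
pti = percentile(times, 95) / free_flow   # 95th percentile planning time index
```

The gap between the two indices is the useful signal: a corridor can have a modest typical delay (TTI) yet be highly unreliable (large PTI), which is exactly the distinction the logistics industry cares about.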
Rapid progress and change are occurring in this category of measures as a result of the congestion management systems implemented by states in response to federal Intermodal Surface Transportation Efficiency Act legislation during the early 1990s, standardization of archived data user services information flows from intelligent transportation systems, and advances in theoretical understanding of how agency interventions to address mobility issues change transportation system performance. Reliability of expected travel time through a metropolitan area is being articulated by the logistics industry as its principal performance concern.

Determining the locations, magnitudes, and durations of disruptions to expected travel time patterns requires processing of extensive data flows. Interval-binned vehicle classification, speed, and volume continuous and coverage counts are routinely stored and made available in traffic-monitoring systems. Results of macroscopic and microscopic travel simulation models and associated two- and three-dimensional visualizations are in widespread use in metropolitan areas for areawide and project-level analysis.

Current challenges include capturing, quality checking, and archiving data flows from traffic management centers and ensuring that traffic simulations and visualizations are an accurate portrayal of performance for the scenario being modeled. Both occur upstream in the work flows that lead to reporting of transportation system performance.

Example: Simulation modeling of recurring and incident congestion is at the heart of mobility performance measurement in the New York State Department of Transportation. Primary data sources are the highway inventory (containing both physical and administrative attributes) and weekday hourly directional traffic counts from the coverage count data collection program.
Evaluation of strategies to reduce congestion is accomplished by modifying policy variables in the simulation models, such as incident detection, response, and clearance times. Excess delay incurred by persons and goods traveling on the highway system and wasted fuel are the primary simulated performance measures. The economic losses associated with excess delay are monetized and estimated at more than $4.3 billion annually for state highways. The simulation model is implemented on a desktop computer. Input data files for the model are updated on an annual cycle, and output results are linearly referenced for use in GIS.

Cost-effectiveness ratio criteria are the basis for programming projects. In the greater New York City metropolitan area, projects must reduce simulated excess delay for the opening year of the project by at least 75 person-hours per day for each $1 million of initial investment. The corresponding criterion for Upstate New York is 35 person-hours. Congestion is growing in New York, and current funding constraints necessitate a modest goal of reducing the growth in excess person-hours of delay by the end of the 5-year program period to 10 percent less than the simulated base case forecast.

Other selected aspects of the mobility program and its performance measures in New York are as follows:

• Travel demand management: Actions that reduce single-occupant vehicle travel during peak hours must be funded to at least a level of $3 million during the 5-year program. The performance measures used are percentage increase in peak-hour average vehicle occupancy and percentage reduction in growth of peak-hour vehicle miles of travel within the first 5 years of the program period.
• Transportation system management: Each of the 11 regions must program investments for at least 10 of the most congested spot locations where peak-hour recurring queued conditions (Level-of-Service E or worse) can be addressed with relatively low-cost short-term strategies. The performance measures used are number of spot locations eliminated and reductions in person-hours of delay and truck-hours of delay by the end of the 5-year program period.
• Bicycle–pedestrian facilities: Programming of projects to implement approved bicycle–pedestrian facilities to increase use of nonmotorized transportation in congested corridors. The performance measures used are new miles of on-street bicycle facilities, quantity of new or upgraded sidewalks and crosswalks, miles of multiuse paths, number of transit facilities and activity centers accessible to bicycles and pedestrians, and the increase in bicycle and pedestrian usage by the end of the program period.

This example shows how existing data from roadway inventory and traffic counts can be combined with traffic simulation tools to calculate mobility performance measures. Like the safety example, it demonstrates the value of integrating a high-level outcome-based measure with targets and project selection criteria and the establishment of specific monitoring measures (some modeled and some output based).

Customer-Oriented Performance Measures

Most existing performance measurement efforts have focused on performance from the facility or supplier point of view. There is growing interest in reflecting the customer point of view and in using customer perceptions of transportation service as a performance measure. Examples of sources from which perception performance measures can be gleaned are random sample surveys of travelers and the public, web feedback, phone calls to 311 municipal services, phone calls to 511 traveler information services, press clippings, and media editorials.

Not surprisingly, transportation user advocacy groups frequently use customer-related performance measures in their publications.
The Road Information Project (www.tripnet.org) and the American Highway Users Alliance (www.highways.org) express the backlog of needs in terms of the excess cost of travel per highway user on an annual basis. The costs attributed to condition and performance shortcomings are likened to a hidden tax that drags on regional and national economic efficiency in a globally competitive marketplace. The Texas Transportation Institute's annual Urban Mobility Report (mobility.tamu.edu), a leading national source of congestion information, also focuses on customer-based measures.

The following are challenges in using customer-oriented measures:

• Routine collection of customer perceptions to establish valid trend data may be costly.
• A relationship between customer perceptions and actual facility or service characteristics must be established.
• Customers typically cannot distinguish facility ownership; poor customer ratings of the road network collected by a state transportation department could therefore reflect conditions on the local street network.

Example: The Pennsylvania Department of Transportation has been one of the most active state transportation departments in collecting information from customers. Customer data have been used for business planning and program development processes. Sources have included annual statewide telephone surveys measuring customer perceptions of highway services, focus groups, and interview data gathered during development of the long-range transportation plan.
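The cost concern noted above can be made concrete with a standard sample-size calculation. The 95 percent confidence level, the ±3 percent margin of error, and the count of 11 districts used here are illustrative assumptions, not figures from the Pennsylvania program:

```python
import math

def sample_size(margin, z=1.96, p=0.5):
    """Minimum simple-random-sample size for estimating a proportion.

    margin: desired margin of error (0.03 means +/-3 percentage points);
    z: critical value (1.96 for 95 percent confidence);
    p=0.5 is the conservative worst-case proportion.
    """
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

# One statewide figure is affordable, but holding the same precision
# in each of, say, 11 engineering districts multiplies the cost.
statewide = sample_size(0.03)                # 1,068 completed interviews
per_district_total = 11 * sample_size(0.03)  # 11,748 across all districts
```

At roughly 1,000 completed interviews per reporting unit, district-level accountability requires an order of magnitude more survey effort than a single statewide satisfaction figure.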
A number of techniques have been used to refine customer data collection over time and improve its usefulness:

• Survey sampling plans and question designs detailed enough to provide accountability at the district level—county-level surveys addressing topics that are easily related to the perceived mission of district engineers are more useful than surveys designed to track overall statewide customer satisfaction;
• Inclusion of questions that distinguish among different types of routes (e.g., Interstate versus other);
• Survey designs that assess the perceived importance of a given performance element, the customer's rating of that element, and the range of ratings the customer considers acceptable;
• Focus group methods that assess willingness to pay—for example, by asking respondents to allocate $100 across strategic focus areas; and
• Focus group questions that ascertain customer perceptions about whether performance indicators are improving over time.

Customer data have been used to conduct strengths, weaknesses, opportunities, and challenges analyses during the strategic planning process. Strengths are department products and services that have high customer approval and are perceived as important (high importance, high grade). Opportunities are products and services that have low customer approval but are still perceived as important (high importance, low grade). Weaknesses are internal policies that could limit the department's ability to meet customer expectations, and challenges are external factors that could limit the department's effectiveness.

Recent initiatives at the Pennsylvania Department of Transportation have pursued tighter linkages between data from operational systems (e.g., roadway inventory and pavement management) and customer data. Operational data can help interpret customer satisfaction and importance ratings and avoid simple reliance on those measures. A 2003 research report (1) sponsored by the department identified "break points" in the international roughness index at which customers perceive pavement condition to shift from acceptable to unsatisfactory. The department also is working to develop predictive models (2) that estimate the changes in customer satisfaction that would result from changes in operational performance targets. The results of these efforts will be used to inform the establishment of performance targets and to guide resource allocation decisions.

GENERAL GUIDANCE: MAKING THE MOST OF AVAILABLE DATA AND SYSTEMS

Challenges in using existing data and tools for performance measurement fall into three categories:

• Defining performance measures—deciding which performance measures and data sources to use;
• Collecting and managing the data; and
• Using performance data to support decisions.

Guidance for each of these areas is provided below.

Defining Performance Measures and Identifying Data Sources

Defining performance measures and gaining consensus on which to use is the first challenge faced by agencies that are starting or expanding a performance-based planning program. An agency should establish clear goals for what is being measured and how performance measurement will be used in the agency's decision-making process. At least initially, deciding how to measure performance should be based not on what data are now being collected but on what the agency is trying to accomplish, the framework of policy goals and objectives that has been established, and the expectations of customers and partners.
With these goals established, the following strategies can be used to define a performance measurement effort that makes the best use of existing data and tools.

Build on what is already in place. Consider the information and tools already being used to make decisions at both the strategic and tactical levels of the organization. Establishing data programs and analysis methods for performance measurement is not an overnight process; it can take years to refine data collection techniques, smooth data, and establish trends that ensure reliable results. Therefore, build on established data collection practices and procedures to the extent possible.

Measure what will be used, and use what is measured. Data items should not be collected just because they are available—they should have a critical business process use. It is best to select a small set of performance measures that can realistically be tracked and used. Ideally, performance measurement is integral to an agency's business processes: managers depend on the information for both strategic and tactical decisions, and external partners and customers demand it. When performance information is in active use, errors in data and modeling results are quickly recognized.

Assess the need for data quality improvements. Where existing data sources are to be used, evaluate their accuracy, precision, timeliness, and consistency, and consider the ways in which the data will be used in performance measurement. Data quality improvements may be warranted, particularly for data that support multiple performance measurement processes or that will be used for critical decisions. Developing a strategic data plan for the agency is one way to assess whether it makes sense to reallocate existing data collection resources to support the performance measurement program.
It may be cost-effective to make incremental improvements in existing data collection programs that will yield quality improvements essential to the credibility and value of the performance-based planning process.

Check for inconsistencies across systems. It is particularly important to ensure the accuracy and consistency of fundamental measures, such as system mileage and vehicle miles of travel, that are used to calculate many types of performance measures (e.g., weighted average condition or condition distributions). Inconsistencies in these measures across data sets and analysis tools can arise from the use of different data sources and estimation methods. Sometimes the problem relates to data definitions; for example, many pavement management systems treat a 1-mile section of divided highway as 2 centerline miles, whereas Highway Performance Monitoring System data sets may treat it as 1 centerline mile. Other times the problem is a lack of data integration and updating procedures, with the result that some systems do not have up-to-date information. Whatever the source of the problem, it is important to perform basic consistency checks on fundamental measures before combining or comparing data from different analysis tools or data sources.

Measure or model the agency's contribution to improved performance. Agencies have tended to rely on output-type measures rather than outcome-type measures because what they do can be controlled, but the impact of what they do is not always easy to assess, let alone control. However, agencies need to demonstrate how expenditures of public tax dollars are in fact making things better for customers than they otherwise would have been. The benefits of operational improvements are particularly difficult to isolate for performance measurement: frequently, correlation can be established but not causality, and exogenous factors (such as fuel prices) that affect the result are difficult to predict. From a data standpoint, this is the most challenging performance measurement issue. Several strategies can be used to address causality. First, agencies should strive to maintain consistent data collection methods over time in order to have valid trend information. Second, well-planned before-and-after studies can provide some level of control over certain variables. Third, groupings of performance measures can be set up that combine broad customer-oriented measures with measures more directly related to the agency's actions. Finally, modeling tools can be used to estimate the impacts or benefits of agency actions, as illustrated by the New York State examples above.

Identify trend data. Investigate the availability of trend data for the measures that are selected. Have consistent trend data been established for the data sources to be used? How far back in time? Have measurement or computation methods that affect the validity of the trend line changed? If changes in data collection methods or schedules are under consideration, consider their impacts on the agency's ability to maintain valid time-series information for key performance indicators.

Anticipate data integration and quality-checking requirements. Calculation of many performance measures requires integration of data from multiple sources.
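A minimal sketch of such an integration joins crash counts and vehicle miles of travel keyed by route segment, using the conventional crashes-per-100-million-VMT normalization; the segment identifiers and figures are hypothetical:

```python
# Two separately maintained sources keyed by route segment;
# all identifiers and figures here are invented for illustration.
crashes = {"I-80:12": 14, "I-80:13": 9, "US-30:04": 21}      # crashes/year
vmt = {"I-80:12": 180e6, "I-80:13": 95e6, "US-30:04": 60e6}  # annual VMT

def crash_rate_per_100m_vmt(crash_counts, vmt_by_segment):
    """Crashes per 100 million vehicle miles of travel, by segment.

    Segments present in only one source are reported separately so
    the mismatch is visible rather than silently dropped.
    """
    rates = {}
    unmatched = set(crash_counts) ^ set(vmt_by_segment)
    for seg in set(crash_counts) & set(vmt_by_segment):
        rates[seg] = crash_counts[seg] / vmt_by_segment[seg] * 1e8
    return rates, unmatched

rates, unmatched = crash_rate_per_100m_vmt(crashes, vmt)
```

Surfacing the unmatched segments, rather than dropping them quietly, is one small instance of the basic consistency checks recommended above.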
For example, accident rates are derived from crash statistics and vehicle miles of travel. Integration is even more crucial for display and analysis of monitored data—for example, to allow district engineers to view pavement condition, accident rates, and congestion hot spots on the same map for use in project development. It is therefore important to examine the location referencing methods and the level of locational accuracy of the data to be used in performance measurement. Temporal referencing also matters for data integration and quality checking—for example, to correlate measured improvements in infrastructure condition with capital projects or maintenance activities.

Define performance rating scales and establish a feedback loop. Knowing whether goals are met is insufficient information for a leader making decisions about how to improve agency performance. It is helpful to define a quantitative performance rating scale for each goal, with an upper and a lower bound and ranges such as "excellent," "satisfactory," and "needs improvement." It is also helpful to track whether the current performance rating is better or worse than an expected value and than the value for the prior reporting period. Feedback that can be used to adjust goals is also critical to a continuous process of performance measurement.

Use peer comparisons. When viewed without context, quantified performance targets are often hard to evaluate. How is an executive to know whether an organization is performing well, marginally, or unacceptably? Benchmarking, or comparison with peers, is one method of validating whether the targets set for an agency make sense. Viewing the agency's performance as a share of national activity is another technique. For example, the U.S. goal of reducing annual highway deaths by 9,000 by 2008 compared with the 2002 baseline was motivated by a comparison of U.S. fatality rates with those of other nations with extensive rural and metropolitan highway systems.

Collecting and Managing the Data

Once the agency has a plan for which measures will be used and what the data sources are, much work remains to ensure a smooth process from collecting the raw data to making the data available as performance measures for decision makers. Guidelines for data collection and management are provided below.

Manage performance data as an enterprise asset. If a data element is judged to be a critical input to the performance measurement process, it should have a data owner, a data element definition, an updating schedule, and a fixed level of precision. Processes should be developed for quality-checking the raw data and turning the data into an aggregated, value-added information asset accessible to the whole enterprise. The details of data transformations should be clearly documented to avoid downstream problems with inconsistent methods. The definition must be clear to end users and decision makers and applied consistently throughout the agency. Enterprise-level data elements must be accessible throughout the data-owning agency and, for authorized uses, among business process partners in cooperating local, state, and federal agencies. Data owners must provide metadata alerting users to data limitations and variability, and data access providers should prominently display metadata as notes in data transmittals and presentation graphics.

Nail down data definitions. Particularly where data from secondary sources are being used to derive performance measures, it is important to obtain and document precise data definitions. To take the example of employment data, figures could vary widely on the basis
of data sources and adjustment methods [e.g., adjusting for proprietors, adjusting for persons who work two jobs, and determining the location for which data are reported (the payroll office, the work site, or the residence of the worker)].

Recognize and plan for data management costs. Adequate resources must be provided to collect, store, archive, analyze, and disseminate critical data elements. Data- and analysis-intensive areas such as mobility and reliability require explicit resource allocations. Separating the data production function from the data analysis and dissemination function can help ensure that neither consumes a disproportionate share of resources.

Adjust data collection, analysis, and reporting responsibilities. It is common to find similar data being collected and analyzed for different purposes in various parts of the agency. Implementing an enterprise-level performance measurement program that uses existing data sources will often reveal this duplication and the inconsistencies that go with it. Once data sources for the performance measurement program are clearly defined, accountability for the various data support functions should be assigned to specific functional areas within the agency. Responsibility and expectations should be clearly communicated to and understood by all stakeholders in the process. Be sure to include time and resources to coordinate across organizational units and to adapt existing systems and processes to meet new requirements. To reduce the likelihood of bottlenecks in the delivery of performance information, use a decentralized approach: separate the business processes of producing data, analyzing data in a single subject area, and providing integrated views of data across systems. Trying to do everything for everybody in a single step is too risky in terms of cost and schedule, especially when technology leaps occur as frequently as every 18 months.
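One lightweight way to make the enterprise-asset discipline described above concrete is a structured metadata record for each critical data element, capturing owner, definition, update schedule, fixed precision, and known limitations. The field names and the example values below are illustrative, not an agency standard:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataElement:
    """Metadata record for one enterprise-level performance data element."""
    name: str
    definition: str          # precise, agency-wide definition
    owner: str               # accountable business unit (data owner)
    update_schedule: str     # e.g., "annual", "continuous"
    precision: int           # fixed number of decimal places
    limitations: tuple = ()  # metadata alerting users to known caveats

    def format_value(self, value: float) -> str:
        # Enforce the fixed precision wherever the element is displayed.
        return f"{value:.{self.precision}f}"

# Hypothetical example entry for a roughness measure.
iri = DataElement(
    name="IRI",
    definition="International roughness index, inches per mile",
    owner="Pavement Management Unit",
    update_schedule="annual",
    precision=1,
    limitations=("Collected on National Highway System routes only",),
)
```

Because every consumer formats and interprets the element through the same record, the definition and precision are applied consistently, and the limitations travel with the data as displayable metadata.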
Put data quality controls in place. Location and temporal validity and integrity control systems for all enterprise-level data elements must be compatible. When data are collected from multiple pieces of equipment, by multiple methods, or from multiple sources, consistency of measurement must be ensured. For example, one state can measure roughness on a sample of segments just across the border in a neighboring state (while repositioning the equipment for more work in its own state); such cross-checks enable each state to confirm that its equipment measures consistently with equipment used in the neighboring state. Other information quality factors to consider include relevance, correctness, accuracy, precision, completeness, timeliness, usability, accessibility, data life, and conformity to expectations.

Avoid linear referencing pitfalls. Trying to join aged linearly referenced highway attribute data with an up-to-date cartographic model of highways is a sure formula for loss of data integrity. The real-world highway system and the current cartographic model of it change frequently through route retirements, route additions, and route remeasurements that occur whenever geometric changes are included in a project. Archived linearly referenced highway attributes can be mapped correctly in a GIS application only by joining them to the matching archived cartographic model or by spatially transforming the archived attributes to the current linear referencing datum. Failure of GIS users to account for this temporal aspect of linear referencing systems is a major data integrity issue. One solution is to establish a business rule requiring all linearly referenced data enterprisewide to be transformed to the current cartographic model and to enforce the rule each time the cartographic model is updated.

Plan for smooth transitions as legacy systems are replaced.
New or upgraded systems should be planned to share a common method for locating assets and for recording events in the temporal dimension. Metadata about the legacy data, including gaps, quality, and integrity, should be maintained. Any differences between performance measures calculated by the legacy system and by the modern system should be identified before the parallel-running phase of the conversion ends. Migration of the historical values of data elements critical for time-series analysis and presentation of performance measures should be specified as part of the upgrade process, including data quality checks, transformation of data code values, and transformation of the legacy record format to the modernized format. Loading only the most recent data value into a new system yields minuscule cost savings compared with loading all the data archived during legacy system operation.

Consider outsourcing data collection. Many data elements are commodities and can be procured from private-sector vendors by low-bid methods at unit costs competitive with those of a public-sector work group. For example, privatized data collection is commonplace for automated pavement condition surveys, periodic bridge inspections, and highway traffic counting programs.

Evaluate the use of new data collection technology. Transportation agencies need to keep up with technology. For example, a common method for video logging in large counties involves periodic digital photo logging of assets with three pairs of cameras (left looking, right looking, and forward looking) from a van equipped with a differentially corrected Global Positioning System.
Staff at desktop workstations can then use the photos to locate existing assets accurately by triangulation from the stereo images of the camera pairs, add attributed assets to an inventory database, identify damaged or missing assets, and assign condition ratings. Agencies with a larger base of assets are lagging far behind metropolitan counties in developing comprehensive GIS-based, enterprisewide databases and associated user interface tools. Large agencies appear to have difficulty quantifying and communicating the benefits of this and other technology, as well as competing for resources alongside other information technology proposals.

Develop a data business plan. A data business plan can address customer needs; return on investment for data collection; assessment of which data types are most beneficial to collect relative to the resources invested; data stewardship; metadata; changing agency needs; data definitions; and many other critical elements discussed above.

Using Performance Data to Support Decisions

Once in place, the performance data must be disseminated to decision makers in an effective manner, and analytical tools must be available to assist in developing performance targets and in investment analysis. The guidelines below focus on the use of office productivity, GIS, and specialized tools for performance-based planning.

Provide methods and tools for drilling down and rolling up. It is best to focus on only a small set of performance measures for external reporting and strategic budgeting. However, more detail is needed for decision making at the staff level. Ideally, the few high-level performance measures can be derived from more detailed measures, and a drill-down capability will allow staff to see the detailed data and assumptions behind aggregate measures (e.g., whether pavement condition is poor primarily because of rutting).
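The roll-up idea can be sketched with hypothetical section-level records: district figures aggregate into a statewide, VMT-weighted measure, while the underlying records remain available for drill-down. The district names and figures are invented for illustration:

```python
from collections import defaultdict

# Hypothetical section records: (district, condition, annual VMT).
sections = [
    ("D1", "poor", 40e6), ("D1", "good", 160e6),
    ("D2", "poor", 10e6), ("D2", "good", 190e6),
]

def pct_vmt_poor(records):
    """Percentage of vehicle miles of travel on poor-condition sections."""
    total = sum(v for _, _, v in records)
    poor = sum(v for _, cond, v in records if cond == "poor")
    return 100 * poor / total

by_district = defaultdict(list)
for rec in sections:
    by_district[rec[0]].append(rec)

# Roll up to one statewide figure; drill down to per-district detail.
statewide = pct_vmt_poor(sections)                             # 12.5
detail = {d: pct_vmt_poor(r) for d, r in by_district.items()}  # D1 20.0, D2 5.0
```

The same grouping logic extends down another level (e.g., by distress type within a district) to answer questions such as whether poor condition is driven primarily by rutting.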
Desktop database, spreadsheet, and GIS tools, as well as more specialized analysis tools, should provide capabilities to summarize performance measures for different parts of the system (e.g., by district or functional class) and from a user-oriented perspective (e.g., percentage of vehicle miles of travel on poor roads). They can provide a drill-down capability that allows a user to explore conditions at different levels of geographic aggregation. A hierarchy of measures also ensures that measures are backed by adequate detail, since systemwide aggregation does not always lead to meaningful decisions.

Make use of GIS software and office productivity applications. Use of a desktop GIS application by well-trained staff can be an efficient means of preparing input data for a legacy system to process; the same is true of a desktop database application. Minor modifications to legacy system source code are usually worth the effort if they enable the system to produce comma-separated values (CSV) output files of legacy system results. CSV files can be read easily into desktop spreadsheet and GIS applications, which can then be used for analysis, presentation, and printing. The open database connectivity method enables staff trained in desktop database applications to import or link to data stored in enterprise-level relational database management systems. The File Transfer Protocol is a frequently used method for moving legacy data sets among computing platforms. Use of desktop tools keeps staff training costs low by avoiding the need for workforce skills in antiquated mainframe programming languages and editors. Mission-critical files created by desktop application users should be uploaded to agency file servers at least daily and backed up periodically from there; the process should include off-site disaster recovery protection of information assets.

Make use of simulation tools.
Fact-based what-if analysis of alternative funding scenarios and policy choices is a fundamental part of performance-based planning and programming. For many types of performance indicators, simulation tools can help provide an understanding of how future performance may be affected by the quantity, timing, and type of agency interventions and by variations in factors outside the agency's control (e.g., growth patterns). They can analyze future needs and indicate the performance achievable at various investment levels. This type of analysis is valuable for setting realistic performance targets and guiding budget allocations consistent with those targets. Pavement and bridge management systems are important resources to tap for performance measurement. However, agencies should anticipate and plan for some effort to make the best use of these systems, particularly when they are being used primarily for inspection data management and project-level decision making. To be credible and useful, these systems require calibration and validation against actual experience in a particular locale; comparison of predicted trends against measured past trends provides a good reality check. Sufficient time and resources need to be allocated for this activity, and ideally a technical champion should be designated to exercise the system and ensure that it produces reasonable results. Agencies should examine performance predictions both at the site-specific level (i.e., individual pavement sections or bridges) and at the network level (e.g., predicted average Interstate pavement condition compared with past trends).

Be aware of the prioritization methods used by tools. Tools that can predict system performance as a function of investment levels use a variety of methods for identifying needs and determining how the available budget is allocated. These methods should be well understood when selecting or configuring a tool and when determining how to make the best use of an existing tool in the performance target–setting process. For example, an agency trying to determine the investment necessary to reduce the percentage of poor pavement miles in a district to 20 percent may make several runs of its pavement management system but find that even with fairly high investment levels, it cannot reduce the percentage below 30. The reason may be that the management system is not allocating resources on a "worst-first" basis; it may be allocating the available budget to more cost-effective investments in preventive maintenance. This raises several questions—for example, whether the target should be reconsidered, since it may imply an inefficient use of funds over the long term, and whether the management system's models adequately reflect the user costs of poor pavements. This type of debate is valuable and arguably a necessary part of determining how to use simulation tools in the context of performance targeting and investment analyses.

Integrate project and program data. Tools that predict future condition should reflect work that is scheduled or programmed. Keeping infrastructure management systems in sync with program and project databases is often a challenge. Efforts to ensure that data structures are consistent, so that information can flow between management systems and program/project databases, would be worthwhile in most agencies. At a minimum, specific workflow processes should be defined to update management systems as program/project databases change.

Ensure consistent cost assumptions. When tools are used for investment versus performance analysis, it is important to pay attention to the costing side.
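The contrast drawn above between "worst-first" spending and cost-effectiveness-based allocation can be reduced to a toy budget exercise; the sections, condition scores, treatment costs, and benefit figures are invented for illustration and are not drawn from any management system:

```python
# Hypothetical candidate projects: (section, condition score where low
# is worse, treatment cost in dollars, expected benefit in weighted
# condition-index points). All figures are invented.
candidates = [
    ("A", 25, 900_000, 30),   # costly rebuild of the worst section
    ("B", 60, 100_000, 25),   # cheap preventive seal, fair section
    ("C", 65, 100_000, 20),
    ("D", 30, 800_000, 28),
]

def allocate(projects, budget, key):
    """Greedy selection under a fixed budget, ranked by the given key."""
    chosen, remaining = [], budget
    for proj in sorted(projects, key=key):
        if proj[2] <= remaining:
            chosen.append(proj[0])
            remaining -= proj[2]
    return chosen

BUDGET = 1_000_000
worst_first = allocate(candidates, BUDGET, key=lambda p: p[1])
best_ratio = allocate(candidates, BUDGET, key=lambda p: -p[3] / p[2])
# worst_first -> ["A", "B"]; best_ratio -> ["B", "C", "D"]
```

Under the same budget, the benefit-cost ranking treats three sections while deferring the worst one, which is precisely the behavior an agency needs to recognize, and debate, when a worst-first target proves unreachable.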
Tools and inputs should account for inflation and reflect proper use of discounting methods. Budgets and work costs should be consistent; for example, if work costs do not include indirect costs, the budgets should be reduced accordingly.

CONCLUSIONS

This paper has provided examples of successful approaches in the use of existing data and tools to support performance measurement applications for four critical categories of measures: infrastructure, mobility, safety, and customer service. Each example was selected to show how performance measurement programs can be built by using available data and standard tools (GIS, desktop applications, management systems). In each case, the key to success was not the sophistication of the individual measures selected or the level of detail of the data collection effort but the way in which different data sources and tools were used in combination and the processes that were developed for using the performance measures to establish priorities and allocate resources.

General guidance was provided for the major steps in defining and implementing a performance measurement program. The guidance is intended to help agencies successfully navigate the array of technical, process, and organizational issues that can be anticipated as they undertake performance measurement. The general guidance and examples lay a framework for the practitioner; they are not meant to be exhaustive or to answer all questions associated with the topic. [Additional guidance concerning the use of data and tools in performance-based planning can be found elsewhere (3, Chapter 4).] They do suggest, however, that a systematic approach to performance program design that considers the interrelated issues of measure definition, data management, and business process can help agencies avoid major roadblocks and anticipate the nature and extent of the effort required for success.

REFERENCES

1. Poister, T. H., P. M. Garvey, M. T. Pietrucha, R. S. Ghebrial, and C. L. Smith. Ride Quality Thresholds from the Motorist's Perspective—Final Report. Pennsylvania Department of Transportation, Dec. 2003.
2. Cambridge Systematics, Inc. Improving Decision Support with Customer Data. Draft final report. Pennsylvania Department of Transportation, June 2004.
3. Cambridge Systematics, Inc. NCHRP Report 446: A Guidebook for Performance-Based Transportation Planning. Transportation Research Board, National Research Council, Washington, D.C., 2000.

TRB’s Conference Proceedings 36, Performance Measures to Improve Transportation Systems: Summary of the Second National Conference are the proceedings from a conference held on August 22-24, 2004, in Irvine, California. The purpose of the conference was to explore the implementation and use of performance measures and to discuss how to monitor the impact of performance measures on the delivery and quality of transportation services. The proceedings include summaries of presentations made in each conference session and of resource papers. These summaries highlight a variety of agencies' experiences with the use of performance measures and identify research that could improve the use of performance measures. The resource papers prepared for the conference are also included in the proceedings.
