National Academies Press: OpenBook

Developing an Airport Performance-Measurement System (2010)

Chapter: PART III - Field Research on Performance Measurement

Suggested Citation:"PART III - Field Research on Performance Measurement." National Academies of Sciences, Engineering, and Medicine. 2010. Developing an Airport Performance-Measurement System. Washington, DC: The National Academies Press. doi: 10.17226/14428.


PART III: Field Research on Performance Measurement

Chapter 1: Airport Case Studies on Performance-Measurement Systems
• What Is a Case Study?
• Case Study Sample
• Learning Objectives
• Case Study Reports of Selected Airports

Chapter 2: Regional, State, and Federal Applications of Performance-Measurement Systems
• Government Performance and Results Act of 1993 (GPRA)
• Airport Economics Manual (ICAO Document 9562) and Report of the Conference on the Economics of Airports and Air Navigation Services (ICAO Document 9908)
• Association Support for Performance-Measurement Practices
• The UK Centre for the Measurement of Government Activity

This chapter presents a detailed summary of current performance-measurement practices from a representative sample of airports that volunteered to share their experience. The purpose of the case study airport analysis is to leverage the knowledge of current industry practices, determine how outcomes and efficiencies are commonly measured, evaluate the role of performance-measure results in the decision-making process, and identify the types of commonly used measures and their benefits, as well as the factors that prevent airports from implementing a performance-measurement system.

What Is a Case Study?

Case studies are examinations of problems or issues in their real-world settings that serve as learning tools for those in a professional environment by contributing a tangible and innovative approach or particular scenario. Case studies share knowledge and experience gained by those who have already traveled similar paths. Readers can learn from others' achievements, challenges, successes, and failures in pursuing goals similar to their own. This concept has been used to study performance-measurement practices in strategically chosen airports of all sizes and types. Case studies of how airports develop and manage their performance-measurement systems can help individual airports evaluate their own performance-measurement system, avoid the mistakes that other airports may have made, and learn from airport best practices.

Case Study Sample

The sample consists of 13 airports, 12 in the United States and 1 in Canada. The airports selected are of varying sizes, locations, and management types in an effort to most effectively represent the potential end users of the guidebook. Information on airport size was derived from the most recent data reported by the FAA.
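The FAA size classification given below is purely threshold-based, so it can be sketched as a simple function. The 2007 enplanement cutoffs are the ones cited in this chapter; the exact boundary handling and the 2,500-enplanement commercial-service floor are assumptions, and the function is illustrative rather than an FAA tool:

```python
def hub_category(annual_enplanements: int) -> str:
    """Classify an airport by annual enplanements using the FAA's 2007
    cutoffs as cited in this chapter (boundary handling is an assumption)."""
    if annual_enplanements >= 7_635_056:
        return "large hub"
    if annual_enplanements >= 1_908_764:
        return "medium hub"
    if annual_enplanements >= 381_753:
        return "small hub"
    if annual_enplanements >= 2_500:
        return "commercial service (non-hub)"
    return "general aviation or other"

# The two case study airports whose 2007 enplanement counts appear in this chapter:
print(hub_category(379_089))    # Mahlon Sweet Field -> commercial service (non-hub)
print(hub_category(1_412_758))  # Dayton International -> small hub
```

These two results match the chapter's own descriptions of Mahlon Sweet Field as a non-hub commercial service airport and Dayton International as a small hub.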
In 2007, the FAA defined large hubs as airports with more than 7,635,056 reported annual enplanements; medium hubs as airports with between 1,908,764 and 7,635,055 annual enplanements; and small hubs as airports with between 381,753 and 1,908,764 annual enplanements. Airports with commercial airline operations and at least 2,500 annual enplanements (the minimum number of enplaned passengers necessary to be categorized as a commercial service airport) but fewer than the small-hub threshold are referred to as commercial service airports. Finally, GA airports (as defined by the FAA) are the remaining airports, excluding reliever airports. Based on this classification, the selected case study airport sample is the following:
• 7.7% Commercial Service, Non-Hubs
• 15.4% General Aviation

• 7.7% Small Hubs
• 23.0% Medium Hubs
• 46.2% Large Hubs

Exhibit III-1.1 lists the selected airports and provides their classification and approximate geographical location.

[Exhibit III-1.1. Case study airports: map of the performance-measurement case study airport locations.]

Learning Objectives

Through the 13 airport case studies, information was gathered on the following topics:
• Airport approaches to performance-measurement systems and the relationship of the performance-measurement system to size and governance;
• Industry understanding of the relevance of performance measures in monitoring outcomes and reaching goals;
• Best practices and industry trends;

• Performance-measurement system elements of success: communication, participation, and personnel buy-in;
• Advantages, challenges, limitations, and trade-offs of the performance-measurement system;
• Commonly used performance measures; and
• Reasons airports have not implemented a performance-measurement system.

Case Study Reports of Selected Airports

Sebring Regional Airport

Abstract

Formerly Hendricks Field Army Air Base, Sebring Regional Airport is a GA airport operated by the Sebring Airport Authority. Sebring Regional Airport's strategy is to become the economic engine of Highlands County in central Florida, a traditionally agricultural area. The airport has defined a clear mission and vision that give direction to Sebring Regional Airport for the next 50 years. Management is currently in the process of defining goals and strategies, and no measures have been identified yet. However, financial performance and operational performance are monitored on a weekly basis. A small staff with access to all available data, limited time and financial resources, and the need to address urgent matters immediately have constrained Sebring Regional Airport's efforts to develop and implement a performance-measurement system.

Airport Profile

Sebring Regional Airport is a GA airport located in the center of Florida in an agricultural area approximately 7 miles from downtown Sebring and an average of 100 miles (or a 2-hour drive) from the Orlando, Tampa, St. Petersburg-Clearwater, Sarasota-Bradenton, Southwest Florida, Melbourne, Vero Beach, and Palm Beach airports. Since its inception as a military flying school in 1941, Sebring has become an economic engine for the area. Conceived to boost war-related activity, the field served the government until it was declared inactive at the end of the Second World War in 1945.
In early 1946, the city received a temporary permit to operate the former Hendricks Field as a civilian airport, which opened to the public in 1947. In 1967, the city turned the deed of the airport over to the Sebring Airport Authority, which has managed the airport ever since. The airport has evolved into an economic engine, developing a commerce park that is home to 17 organizations, including Sebring International Raceway, which attracts thousands of race fans from around the world every year. Sebring Regional Airport is run by a five-member staff, and all services are outsourced to the private sector.

Program Goals and Objectives

The airport has defined a vision and a mission that establish a clear direction for the airport as an economic generator for Highlands County for the next 50 years. The airport is currently in the process of defining goals and strategies, and as such, no measures have been established yet. However, airport management believes in the relevance of performance measures and the benefits of monitoring performance. Sebring Regional Airport is in the first stages of strategically aligning the organization; a performance-measurement system will be the final component in the development of the system.

Because the airport's strategic approach to attracting air-traffic activity is through economic development, the airport works as a channel for businesses and infrastructure development. Management is very active in regional projects to improve transportation as a means of promoting economic activity in the area. Some of these projects are the following:

• The Heartland Coast-to-Coast Corridor, which will connect the east and west coasts of Florida through I-75/Florida's Turnpike and I-75/I-95;

• The Encouragement Zone, which allows airport-adjacent property owners to enjoy the same development opportunities as the Sebring Airport Authority by creating a strategic alliance for economic advantage that capitalizes on higher land values and land-use compatibility and provides for future airport needs; and
• The Scale and Economic Stewardship Program, which acts as an economic and community steward to provide infrastructure development and prosperity to the county and the region.

The airport also capitalizes on the catalyst project, which promotes tax breaks. The most pressing goal for the airport is the acquisition of surrounding land to expand airport property and attract more businesses to the airport commerce park, which is supported by a global platform with intermodal transportation (air, rail, and roads). Airport principles to be considered as goals are the following:

• Community perception of the airport as a leader in growth and development of the area;
• Airport growth;
• Provision of the benefits of economic zone status to airport tenants through airport participation in permitting, zoning, elimination of fees, and so forth; and
• Retention of a young population.

Data-Collection Procedure

Financial and operational data are collected by administrative staff on a daily basis. Although the data are not compiled or documented in a performance-measurement system, they are easily accessible through financial and operational records. Customer service is measured through customer service surveys. The drawback of not documenting performance measures lies in the difficulty of determining trends, leveraging gains from improvements and learning, and accessing the data when required, since the data are not readily available in one place. Finally, the alignment of strategic elements and the achievement of goals are unknown or difficult to determine with accuracy.
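The drawback described above is, at bottom, a record-keeping problem: trends cannot be determined from data that are never written down in one place. A minimal sketch of what documenting even a single weekly measure would make possible (the measure name and the values are hypothetical, not Sebring data):

```python
from statistics import mean

# Hypothetical weekly log of one financial measure (e.g., weekly revenue, dollars).
weekly_revenue = [12_400, 11_900, 12_800, 13_100, 13_500, 13_900]

def trend(series, window=3):
    """Compare the mean of the most recent `window` values with the mean of
    the values before them: a crude direction signal that is impossible to
    compute when results are monitored but never recorded."""
    recent, earlier = series[-window:], series[:-window]
    return "up" if mean(recent) > mean(earlier) else "down or flat"

print(trend(weekly_revenue))  # -> up
```

Even this trivial log addresses two of the drawbacks named above: the data are in one place, and trends become computable.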
Measures, Standards, and Targets

Currently, Sebring Regional Airport management collects data in four areas considered to be essential:

• Financial performance
• Operational performance
• Capital investment programs
• Customer service

Employee satisfaction and training, although relevant measures, are not being documented because monitoring a small staff (five employees) is simple; data collection is embedded in the day-to-day operations. Performance measures under discussion that are related to potential airport goals are the following:

• Increased number of organizations in the airport commerce park, increased airport revenues, and expanded airport property;
• City and county grants received by Sebring Regional Airport in relation to grants received by other airports;
• Florida Department of Transportation and FAA grants; and
• The results of an economic impact study.

Monitoring and Reporting

Because no performance-measurement system is in place and the airport staff is small, monitoring and reporting are done by management on a weekly basis. Since Sebring Regional

Airport staff members are not involved with airport operations, information is shared among managers only. Maintenance of the current process is fairly simple, with minimal time and cost implications.

Successes and Challenges

Sebring Regional Airport management sees a performance-measurement system as an imperative tool for airports of all sizes. However, for airports with a smaller volume of operations, the development, implementation, and maintenance of a full-scale performance-measurement system is not always a priority. Some of the characteristics of smaller airports that contribute to this situation are the following:

• Small Staff. Tracking training and overall staff performance without a measurement tool is feasible when the number of employees is small, as is the case at Sebring Regional Airport. Airport management strives to deliver outstanding customer service and believes the private sector excels in that field; thus, Sebring Regional Airport outsources many of its services.
• Longevity of Management. Senior staff appointed for long periods of time leverage their knowledge of airport data that are not formally documented.
• Limited Resources. Lacking the time required to administer a performance-measurement system, having a small staff, and having limited funds can all affect a small airport's decision on whether to develop and implement a system to monitor performance.
• Urgent versus Relevant Matters. In an environment run by a small staff with limited resources, urgent matters take precedence over relevant matters. Thus, the development and implementation of performance measures and performance-measurement systems, although relevant to airport outcomes, are constantly postponed in order to resolve emergencies or attend to immediate needs.
Two major reasons to implement a performance-measurement system in spite of the difficulties are the following: (1) transferability of knowledge from airport officer to airport officer within the organization and (2) collection of data to be used for internal marketing directed at elected officials and constituents.

Mahlon Sweet Field, Eugene Airport

Abstract

Owned and operated by the City of Eugene, Mahlon Sweet Field Airport is a small airport and the second busiest in Oregon. The core of the business at the airport is GA activity. Mahlon Sweet Field has many components of a sound performance-measurement system, including customer satisfaction surveys, benchmarking, demand studies, tenant surveys, and periodic performance reports. The city performance-measurement manager works with the management teams of each city department to develop service profiles that include very broad strategies and some high-level performance measures. The airport performance measures are compiled through a collaborative effort involving the city performance-measurement manager and the Mahlon Sweet Field management team and are used as part of the annual city budget document.

Airport Profile

Located along the I-5 corridor, 5 miles from downtown Eugene in the middle of the State of Oregon, Mahlon Sweet Field is the second busiest airport in the state and the fifth largest airport in the Pacific Northwest. Owned and operated by the City of Eugene, Mahlon Sweet Field serves an expansive six-county region. In 2007, Mahlon Sweet Field experienced 379,089 enplanements, a 26% increase since 2003. The airport operates with 40 full-time employees, including police and Aircraft Rescue and Fire Fighting (ARFF) personnel. Providing excellent customer

service is the focus of this non-hub commercial service airport. Leadership at Mahlon Sweet Field regards its agreements with existing air carriers and other service providers as partnerships. Airport officials are committed to creating expanded and new relationships with airlines, with a goal of supporting demand for air service in the region.

Program Goals and Objectives

Mahlon Sweet Field has many components of a sound performance-measurement system. The airport has a strategic framework to leverage the development of a performance-measurement system: a mission, outcomes, operating principles, and a SWOT analysis are in place. Mahlon Sweet Field has also identified three overarching strategies with corresponding targets and action plans and tracks performance measures to monitor core processes. Because the pieces are viewed independently, they are not perceived as an integrated performance-measurement system, although independently they provide valuable information to airport operators.

Mahlon Sweet Field objectives are developed at a broad level as a collaborative effort between the management team and the city performance-measurement manager. At this broad level, the objectives are aligned with the high-level strategies but are not detailed enough to be aligned with stakeholder expectations. Exhibit III-1.2 shows Mahlon Sweet Field's current strategies and outcome measures.

Along with its internal goals and objectives, Mahlon Sweet Field benchmarks against similar airports to gain a comparative perspective. The airport uses its performance measurement and external benchmarking to determine how Mahlon Sweet Field compares with peers. The airport also uses these techniques and measurement results as a way to communicate with the Airport Advisory Committee. The airport is able to use its performance-measurement strategies effectively as a mechanism for annual and periodic performance improvement, focused primarily on customer service.
Exhibit III-1.2. Mahlon Sweet Field strategies and outcome measures.

• Strategy: Recruit and retain air service providers that meet regional needs.
  Outcome measures: Increase the number of passengers using Mahlon Sweet Field by 2% annually; establish a sustainable air service development fund.
• Strategy: Develop airport facilities and infrastructure to accommodate operational, safety, and security requirements and to meet projected demand.
  Outcome measures: Within 4 years, meet airfield development needs as identified in the new Airport Master Plan; within 4 years, complete the projects identified in the terminal rehabilitation plan; annually meet FAA and TSA safety and security mandates.
• Strategy: Provide the products and services needed by customers at Mahlon Sweet Field.
  Outcome measure: Maintain a satisfaction rating from customers of 80% that they are able to find the products and services they need.

Data-Collection Procedure

Data-collection requirements are not substantial, given that the bulk of the work is conducted annually and that a clearly defined and integrated performance-measurement system consisting of visible "dashboards" does not yet exist. Because the customer survey and demand survey are outsourced, there are limited data requirements other than an annual review and revision of the survey instrument. The benchmarking survey is the most time consuming and includes the annual distribution of an MS Excel survey to the study partners. The monthly activity report is gathered from Mahlon Sweet Field's financial reporting system and monthly tenant submissions and is managed

by a finance and administration staff person. Management uses the gathered data primarily as a means of communicating with the city, improving customer service, and negotiating with airlines.

Mahlon Sweet Field uses a number of discrete tools to gather data for its performance-measurement system. The tools include the following:

• Customer Satisfaction Survey. The customer satisfaction survey receives the most attention from the management team and is analyzed for areas of improvement; for example, a new volunteer program resulted from feedback on the survey, and implementation of free WiFi also emerged from this tool.
• Demand Forecast Survey. This tool is used by the director and his team to negotiate with existing airlines and attract new ones.
• Benchmarking Survey. The benchmarking survey involves the tracking of 19 key performance indicators from 8 peer airports.
• Activity Report. This report summarizes operational data and is distributed to all tenants, Mahlon Sweet Field managers, and the Airport Advisory Committee.
• Tenant Survey.
• Tenant Meetings. Airport tenants are met with on a regular basis.
• Airport Advisory Committee Meetings. These committee meetings provide good forums in which tenants can voice their concerns.

Measures, Standards, and Targets

Mahlon Sweet Field has defined 17 performance measures that monitor core processes and the total system. Core processes are the following: operate and maintain the airfield, provide traveler support facilities and services, provide GA facilities and services, and plan and develop regional air service and facilities. The total system, on the other hand, involves the efficiency, effectiveness, financial performance, and customer satisfaction generated by Mahlon Sweet Field. (See Exhibit III-1.3 for a list of short-term and intermediate measures of Mahlon Sweet Field's core processes and total system.)
Six of the 17 performance measures overlap with the benchmarking study, in which Mahlon Sweet Field monitors four key performance areas: operations, productivity, revenues, and expenses and debt, for a total of 21 benchmarking measures.

Monitoring and Reporting

Each year, the Mahlon Sweet Field management team presents a summary of its performance (using all of the tools listed under "Data-Collection Procedure") to the Airport Advisory Committee:

• Customer Satisfaction Survey (outsourced and conducted annually).
• Demand Forecast Survey (outsourced and conducted annually).
• Benchmarking Survey (led by Mahlon Sweet Field and conducted annually).
• Activity Report (led by Mahlon Sweet Field and conducted monthly).
• Tenant Survey (led by Mahlon Sweet Field and conducted only once, in 2006).
• Tenant Meetings (led by Mahlon Sweet Field and conducted monthly).
• Airport Advisory Committee Meetings (led by Mahlon Sweet Field and conducted monthly).

Successes and Challenges

Mahlon Sweet Field has implemented many components of a performance-measurement system and works closely with the city to establish measures to track performance. The biggest challenges associated with Mahlon Sweet Field's performance-measurement system include the following:

• Difficulty in obtaining data from third parties (i.e., tenants and benchmarking partners);
• Annual increases in the cost of the customer service and demand surveys; and
• Systematic use of the information to communicate and improve performance.
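Mahlon Sweet Field's benchmarking survey tracks key performance indicators across eight peer airports. A hedged sketch of the kind of comparison such a survey supports follows; the peer values and the example KPI figure are invented for illustration, not actual survey data:

```python
def peer_position(own_value, peer_values, lower_is_better=False):
    """Report how many of the peer airports this airport outperforms
    on a single key performance indicator."""
    if lower_is_better:
        beaten = sum(1 for v in peer_values if own_value < v)
    else:
        beaten = sum(1 for v in peer_values if own_value > v)
    return f"better than {beaten} of {len(peer_values)} peers"

# Hypothetical KPI: operating cost per enplaned passenger, where lower is better.
peers = [9.10, 7.80, 11.25, 8.40, 10.05, 6.95, 12.30, 8.90]  # 8 peer airports
print(peer_position(8.15, peers, lower_is_better=True))  # -> better than 6 of 8 peers
```

Note the `lower_is_better` flag: cost-type measures and satisfaction-type measures rank in opposite directions, a distinction any benchmarking comparison has to encode.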

From a transferability perspective, the discrete elements of the Mahlon Sweet Field system could be implemented at another organization with relative ease. They would serve as a founda- tion to develop an overarching strategy and an integrated approach. Dayton International Airport Abstract City owned and operated, Dayton International Airport is a small hub that is recovering remarkably from a period of recession. With a new director in 2006, Dayton International Airport changed its approach to doing business, understanding the need to define strategies and mea- sure performance in order to take the organization in a clear direction. Dayton International Airport implemented a scorecard system for measuring performance that is embedded in the decision-making process, thereby reducing thinking time and increasing accomplishment of new endeavors. Using the scorecard system, the airport can determine which tactics are not con- tributing to the achievement of airport goals and take corrective action. Dayton International Airport measures are limited to tangible assets and are determined based on what is important in the industry and what other airports measure in order to benchmark. 
104 Part III: Field Research on Performance Measurement

Core Processes (Short-Term and Intermediate Measures)

• Operate and Maintain the Airfield: landing fees per 1,000 lb of landed weight; total aircraft operations; meeting FAA safety requirements.
• Provide Traveler Support Facilities and Services: customer satisfaction rating of quality and importance of terminal services; airline passenger-related revenue per enplaned passenger; meeting TSA security requirements.
• Provide General Aviation Facilities and Services: change in based aircraft; gallons of fuel sold.
• Plan and Develop Regional Air Service & Facilities: market demand for air service; demand triggers as identified in the Airport Master Plan; percent of regional trips through Mahlon Sweet Field.

Total System (Short-Term and Intermediate Measures)

• Efficiency: average airline cost per enplaned passenger.
• Effectiveness: number of passengers using Mahlon Sweet Field; percent of regional trips through Mahlon Sweet Field.
• Financial Performance: operating expense per enplaned passenger.
• Customer Satisfaction: customer satisfaction rating of signage, cleanliness, and appearance of the terminal; number of Mahlon Sweet Field passenger top-10 destination markets receiving direct service from Mahlon Sweet Field.

Exhibit III-1.3. Short-Term and Intermediate Measures at Mahlon Sweet Field.

Dayton International
Airport uses 18 performance measures to monitor finance, operations, productivity, and safety and to establish the next year's goals, targets, and budget.

Airport Profile

Dayton International Airport is owned and operated by the City of Dayton and has 203 full-time employees. It is considered a small hub airport and is located approximately 14 miles from the center of Dayton, Ohio. Surrounded by three main airports (Port Columbus International Airport, Indianapolis International Airport, and Cincinnati/Northern Kentucky International Airport), Dayton International Airport has managed to excel in a recession while other airports are facing a challenge in retaining demand. In 2005, Dayton International Airport endured a 15% drop in passenger demand and the phasing out of Emery Aircraft, adding to the impact of dropping cargo volumes, which began in 2001 and left the airport with a large, unoccupied cargo facility. With a new airport director in 2006, Dayton International Airport increased the number of enplaned passengers by 6.85%. By 2007, the increase in enplaned passengers was 8.4%, and this trend continued throughout 2008. Dayton International Airport served 1,412,758 passengers in 2007, decreased cost per enplaned passenger from $13.00 in 2006 to $5.50 in 2008, and expected to reduce it further to approximately $4.50 in 2009. These results reflect Dayton International Airport's new leadership style, which has emphasized creating a goal-oriented environment with measurable outcomes and working closely with stakeholders to understand customer needs and gain support from the city, businesses, and the community.

Program Goals and Objectives

Under the premise that an organization must have a strategy in order to know what to measure and to move forward in a clear direction, Dayton International Airport completed a strategic plan and implemented a Balanced Scorecard in 2007 that serves as the navigation system of the organization.
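The cost-per-enplaned-passenger (CPEP) figure cited above is the standard ratio of total airline payments to the airport divided by enplaned passengers. A minimal sketch of the calculation follows; the dollar amounts and the 700,000-enplanement input are illustrative stand-ins chosen to be consistent with the reported $13.00 and $5.50 values, not figures from the report.

```python
def cpep(total_airline_payments: float, enplaned_passengers: int) -> float:
    # Cost per enplaned passenger: landing fees, terminal rents, and other
    # payments airlines make to the airport, divided by enplaned passengers.
    return total_airline_payments / enplaned_passengers

# Illustrative inputs only (not from the report):
print(round(cpep(9_100_000, 700_000), 2))   # -> 13.0
print(round(cpep(3_850_000, 700_000), 2))   # -> 5.5
```

At constant traffic, the only lever is the numerator, which is why the case study emphasizes cost cutting and non-aeronautical revenue as the route to a lower CPEP.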
The performance-measurement system was created based on the strategic plan, and performance metrics are based on external benchmarks.

Dayton International Airport's strategy differs from the strategies of similar airports in that it focuses on airport opportunities. In order to increase opportunities, an airport should position itself advantageously. Factors such as size, competition, and SWOT analysis play a key role in determining an airport's business strategy and differ from airport to airport. Therefore, managers must know an airport's strategic market positioning and its public to maximize opportunities and measure the right data.

Unfortunately, not all measurable data are under the airport's control. In some instances, airports have little or no control over some of the key elements that impact airport performance, such as the number of enplaned passengers. Although the airport's ultimate goal is to increase enplanements, whether or not this goal is reached is the result of a joint effort by airlines and airports to attract demand. Enhancing the airport's appeal and facilitating the passenger experience in the airport can contribute to an increase in the number of enplaned passengers. However, Dayton International Airport also focused on decreasing costs to airlines to make the airport an attractive place to operate. A concentration and/or increase in airline operations at Dayton International Airport would increase the number of enplaned passengers as well. The decreased cost per enplanement was a significant factor in the airport starting to offer international flights to Toronto, Canada. Dayton International Airport has continuously reduced cost per enplaned passenger since 2006 to remain competitive and attract airlines where possible. Despite an uncertain economy, Dayton International Airport has been able to reduce its cost per enplaned passenger by prioritizing objectives and focusing on one at a time.
Once a goal was attained, a new objective was established, shifting all energies to the achievement of the new goal. A goal could be achieved through a myriad of alternative efforts that the airport defined based on its unique characteristics, identity, and the needs of its users.

Dayton International Airport sought strategic alignment by implementing a Balanced Scorecard tailored to the strategic plan. Dayton International Airport remains strategically focused through a selective prioritization of objectives, channeling all departments' synergies toward the achievement of the same goal.

For instance, in
order to decrease airline costs, Dayton International Airport increased non-aeronautical revenues to pass on more incentives to airlines. Dayton International Airport was able to do this by addressing, and thus realizing, value on both the cost and revenue sides. On the cost side, the measures included decreasing staffing from 203 personnel to 160, cutting overtime, and creating efficiency in operating processes. The operating processes included work methods related to seasonal mowing, snow removal, security measures, and so forth. On the revenue side, the measures included capitalizing labor in construction projects, marketing and then leasing hangars at the airport, and shifting costs, for instance, by increasing parking fees while decreasing landing fees. Even with higher parking fees, enplaned passengers increased, because passengers are drawn more to the fare benefits airlines can offer them than they are deterred by the extra dollars spent on airport services.

In order to successfully implement the performance-measurement system, Dayton International Airport created a "Continuous Improvement Coordinator" position fully dedicated to the management of the system and other strategic endeavors, such as the development and future update of the strategic plan. The success of the performance-measurement system implementation was complemented by a training program for all staff to promote objectives. The benefits realized from the performance-measurement system exceeded its implementation cost in the short term.
The benefits of implementing a performance-measurement system for Dayton International Airport can be summarized in the following three main outcomes:

• Establishment of the organization's direction,
• Alignment of all organizational efforts, leveraging the synergy of different departments working towards the same goal, and
• Performance improvement in a downturn economy when the aviation industry is suffering.

Data-Collection Procedure

Data are collected on a regular basis, depending on each measure, and processed manually into a performance-measurement spreadsheet that is updated every quarter. For instance, the budget for the airport is divided into nine areas, and the director meets with those nine departments each quarter. Data are collected by the finance department, which sends updates to the director each month. Two customer surveys were completed in 2007 to measure customer service. There is also a venue for channeling customer complaints through the airport's web site that ensures a response within 72 hours. Although Dayton International Airport led an effort to automate data processing, it was restricted from doing so by the transparency rules of other city departments.

Measures, Standards, and Targets

Internal benchmarking is valuable for the information it provides on airport performance trends: how the airport is doing in relation to its own previous performance. External benchmarking is valuable for identifying industry trends. Thus, in selecting measures, it is important to adopt measures that allow for both internal and external benchmarking. These measures should also be based on indicators that the industry embraces as important and indicators that other airports use, to allow for proper comparison. Unfortunately, U.S. airports are not doing much to measure passenger comfort, not only because of the difficulty of quantifying such subjective data, but also because of the implications of cultural diversity on such data.
106 Part III: Field Research on Performance Measurement

Dayton International Airport appointed a dedicated person to be in charge of the performance-measurement system and other related strategic issues. Dayton International Airport selects performance measures that are being tracked by peer airports for use in benchmarking studies.

However, measures of intangible assets such as aesthetics, sense of space, shuttle bus waiting time, or distance between the parking lot and the terminal building, to name a few, need to be incorporated into performance-measurement systems in
order to monitor customer satisfaction. Until this type of measure is widely adopted by the industry, benchmarking of such intangible assets will not be feasible. Dayton International Airport strongly supports the incorporation of this type of measure into airport performance-measurement systems, since no organization can efficiently manage those areas that are not being measured.

Dayton International Airport's Balanced Scorecard has a total of 25 measures, selected according to what the airport industry is measuring in the following areas:

• Cost Performance
• Revenue Performance
• Concessions Performance
• Efficiency and Effectiveness
• Customer Service

Each measure serves as a baseline for that reporting year and is compared with the previous year's performance, as shown in Exhibit III-1.4.

Key Performance Measures | FY 07 Goal | Q1 Actual | Q1 % Chg | Q2 Actual | Q2 % Chg | Q3 Actual | Q3 % Chg

Cost Performance
Meet "Target" Budget | 95% of Approved Budget | 65% | 46% | 60% | 58.33% | 68% | 58%
Total Airline Cost Per Enplaned Passenger* | $12.00 | $8.42 | 43% | $8.42 | 43% | $8.42 | 43%
Total Operating Costs Per Enplaned Passenger | $22.70 | $18.40 | 23% | $13.88 | 64% | $16.12 | 41%
Total Operations & Maintenance Costs Per Enplaned Passenger | $8.46 | $7.11 | 19% | $5.92 | 43% | $5.22 | 62%
Operations & Maintenance Costs Per Terminal Square Foot | $16.27 | $6.24 | 161% | $6.15 | 165% | $5.30 | 207%
Public Safety Costs Per Enplaned Passenger | $2.25 | $2.64 | -15% | $1.88 | 20% | $1.81 | 24%
Fire Safety Cost Per Enplaned Passenger | $1.94 | $1.86 | 5% | $1.73 | 12% | $1.50 | 29%
Soft Costs of Projects | 20% | 11% | 82% | 0% | 0% | 0% | 0%

Revenue Performance
Non-Airline Revenue to Airline Revenue | 50% | 51% | 2% | 49% | -2% | 51% | 2%
Total Non-Airline Revenue Per Enplaned Passenger | $12 | $16.46 | 37% | $12.99 | 8% | $12.93 | 8%
Cargo Space Leased | 100,000 sf | 0 | 0% | 0 | 0% | 0 | 0%

Concessions Performance
Total Concessions Revenue Per Enplaned Passenger | $5.00 | $5.62 | 12% | $5.44 | 9% | $5.58 | 12%
Total News & Gifts Revenue Per Enplaned Passenger | $1.62 | $1.42 | -12% | $1.40 | -14% | $1.50 | -7%
News & Gift DBE | 25% | 25% | 0% | 25% | 0% | 25% | 0%
Food & Beverage Revenue Per Enplaned Passenger | $3.65 | $3.84 | 5% | $3.67 | 1% | $3.72 | 2%
Food & Beverage DBE | 3% | 0% | -100% | 0% | -100% | 0% | -100%
Parking Revenue Per Enplaned Passenger | $7 | $9.60 | 37% | $9.08 | 30% | $8.82 | 26%
All Other Concessions Revenue Per Enplaned Passenger | $0.20 | $0.37 | 85% | $0.37 | 85% | $0.36 | 80%
All Other Concessions DBE | 3% | 1.46% | -51% | 1.46% | -51% | 1.46% | -51%

Efficiency & Effectiveness
Change Order Costs / Project Costs | 5% | 0 | 100% | 0 | 100% | 0 | 100%
Number of Breaches of Airport Security Plan | 0 | 1 | -100% | 0 | 100% | 0 | 100%
Number of Violations, Airfield/Runway Incursions | 0 | 0 | 100% | 0 | 100% | 0 | 100%

Customer Service
Customer Satisfaction-Parking & Signage | 85% | 70% | -18% | 70% | -18% | 70% | -18%
Customer Satisfaction-Appearance | 85% | 70% | -18% | 70% | -18% | 70% | -18%
Customer Satisfaction-Concessions | 85% | 72% | -15% | 72% | -15% | 72% | -15%
Customer Satisfaction-Complaint Response Time | 72 hours | 72 hours | 100% | 72 | 100% | 72 | 100%

*Based on DAY preliminary draft with residual method of rates & charges.
Source: Dayton International Airport.

Exhibit III-1.4. Dayton International Airport 2007 Balanced Scorecard.
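The "% Chg" cells in Exhibit III-1.4 appear to report favorable deviation from the FY 07 goal: for lower-is-better cost measures most published cells match (goal - actual) / actual, while for higher-is-better measures they match (actual - goal) / goal. A minimal sketch of that apparent convention (the function name is mine, not from the report, and a few published cells deviate from it):

```python
def variance_vs_goal(goal: float, actual: float, lower_is_better: bool = False) -> float:
    """Favorable % deviation from goal, matching the scorecard's apparent convention."""
    if lower_is_better:
        return (goal - actual) / actual * 100  # cost below goal reads as a positive change
    return (actual - goal) / goal * 100        # result above goal reads as a positive change

# Reproducing three published Q1 cells from Exhibit III-1.4:
print(round(variance_vs_goal(12.00, 8.42, lower_is_better=True)))   # Total Airline CPEP -> 43
print(round(variance_vs_goal(12.00, 16.46)))                        # Non-Airline Revenue/EPAX -> 37
print(round(variance_vs_goal(85, 70)))                              # Customer Satisfaction -> -18
```

Under this reading, a positive number is always favorable relative to goal, which is why the cost rows show large positive changes as per-passenger costs fell.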

Monitoring and Reporting

The information is summarized in different forms and discussed at management meetings as follows:

• Biweekly management meetings to discuss strategies and tactics, with the participation of the Finance, Safety, Engineering, and Operations departments, the director of aviation, and the deputy director of aviation.
• Monthly financial update (expenses and revenues) presented to the director.
• Quarterly budget overview report presented to the city.
• Annual budget hearing in which performance measures and goals are presented to the city and to the airport council as guiding facts to legitimize additions to the budget.

With all this information in hand, the scorecard is updated on a quarterly basis to show the progress in goal achievement through the course of the year. This information is widely available to staff at all levels.

Successes and Challenges

The implementation of the performance-measurement system proved to be successful at Dayton International Airport. The results are overall improved operations and revenues at a time when airport activity is slowing down across the country. The data support this assessment:

• Enplaned passengers (EPAX) increased by 6.9% in 2006 and 9.3% in 2007, and the latest numbers, as of October 2009, show that enplanements are up by about 3% for the year.
• CPEP dropped from $13.84 in 2005 to $5.50 in 2008 and is estimated to be $4.50 for 2009.
• Aeronautical revenues dropped while non-aeronautical revenues increased.
• Landing fees dropped from $3.95 in 2007 to $1.10 in 2008.
• The airport's impact on the community is calculated at $1 billion a year.

Dayton International Airport did not escape the challenges generally encountered in the process of implementing a performance-measurement system; however, having a solid strategic foundation and a clear understanding of the airport's positioning and vision was critical in overcoming these challenges.
Lessons learned that Dayton International Airport can share with other airports on the successful implementation of a performance-measurement system are summarized below:

• It is important to bring external stakeholders into the process to obtain support.
• It is important to clearly communicate to staff members that an airport strategy with defined objectives and direction improves efficiencies when decisions need to be made. Knowing in what direction to move simplifies the decision-making process, thereby reducing thinking time. The basic question employees should ask themselves when in doubt is: "How does this action play into the strategy?"
• Staff members need to know what is in the performance-measurement system for them in order to embrace it. People follow success, and their buy-in is a reflection of it.
• A strategy is as much about the set of initiatives you will fully embrace as it is about those you will divest yourself of.
• Employees look for more than compensation in a job. In the words of President Roosevelt, it is the "joy of achievement and the thrill of creativity" that provides them with ultimate job satisfaction.

Dayton International Airport management strongly believes that people follow success; as such, it has not had the need to implement an employee reward program. Understanding the benefits of the performance-measurement system and how it translates into an employee's individual interests appears to support the success of the system.

The implementation of the system was also a challenge, as it has proven to be in other organizations. Implementation demanded time and effort at all levels of the airport. The performance-measurement system was seen as a radical change, and change is uncomfortable. To overcome resistance to change and anxiety, the director of aviation met twice in a 6-month period in small groups with all staff to explain what was going to take place and where the airport wanted to go. Complaints were heard and addressed. Simultaneously, the deputy director and managers also met with their staffs to reiterate what was happening and what the next steps were. With 203 full-time employees, it took Dayton International Airport a year to gain employee buy-in at all levels of the organization, including managers, who were skeptical that the airport could implement such a drastic change. Employees embraced the changes once they saw the positive results from the system and understood what was in it for them.

Finally, as a city department, the airport must conform to city standards regarding automation. These limitations pose a restriction on the transferability of the system.

Dallas/Fort Worth International Airport

Abstract

Dallas/Fort Worth International Airport is the third largest airport in the world in terms of daily operations, the seventh in terms of passenger traffic, and the 28th in terms of cargo volume. The airport supports this growth with a strong strategic plan that establishes a clear mission, vision, and primary business goal, which is to grow the airport's core business. After a solid strategic foundation was established, the implementation phase of the formal performance-measurement system took place in 2006. Previously, the airport had monitored a wealth of measures by department but lacked a structured airportwide system. The current system reflects good top-level outcome measures and targets.
It uses process tools such as the Balanced Scorecard and Six Sigma. Top executives are rewarded with bonuses for achieving performance results.

Airport Profile

Dallas/Fort Worth International Airport is located 21 miles northwest of the city of Dallas and 26 miles northeast of Fort Worth in Texas. In 2007, the airport served over 59.7 million passengers and operated over 690,000 flights. The airport also has a strong cargo operation, having handled about 799,000 tons in 2007. Airport management reports to a board of directors, a semi-autonomous body charged with governing the airport that is appointed by the city councils of Dallas and Fort Worth. Although the board enjoys some freedom in carrying out its core activity, it needs the councils' approval for its annual budget, bond sales, and other financial activities. Dallas/Fort Worth International Airport employs approximately 1,700 people and has seven runways, five terminals, and 155 gates. It brands itself as a global airport. American Airlines is its primary carrier, with a market share of about 85% of the airport's passengers. In 2007, Dallas/Fort Worth International Airport won "Highest Customer Satisfaction for Large Airports" from J. D. Power and Associates and was rated the "Best Airport in the Americas" by ACI.

Program Goals and Objectives

Dallas/Fort Worth International Airport developed a strategic plan with a clear mission, vision, and primary business goal, and four strategic focus areas derived from the airport's mission. The airport's primary business goal is to grow the core business of domestic and international passenger and cargo airline service. This primary business goal is to be accomplished through the following four strategic focus areas: (1) keep Dallas/Fort Worth International Airport cost-competitive, (2) create customer satisfaction, (3) deliver operational excellence, and (4) foster employee engagement.
Each strategic focus area is broken down into specific goals and actions and tied to specific key performance indicators.

Dallas/Fort Worth International Airport has a well-developed strategic plan with a clear mission and vision, primary goals, and a strategic focus.

At Dallas/Fort Worth International Airport, goals and actions are assigned top-level outcome measures referred to as "Level 1" measures, which are addressed in a separate report called Key Airport Measures, an internal Dallas/Fort Worth International Airport document. Each measure is assigned a target for the year. Management bonuses down to the associate vice president (AVP) level are tied to achieving specified targets for each strategic goal. The airport has not yet taken performance measurement further down into the organization's structure.

The airport implemented a good internal process for developing the strategic plan and selecting measures. This process consisted of a series of six 1-day sessions that included top executives down to the AVP level. Dallas/Fort Worth International Airport used an outside consultant to facilitate the process. The executives did an external scan of the environment and considered what they perceived to be the views of customers and stakeholders, but did not involve stakeholders or the board of directors directly in the process. The vision, mission, primary business goal, and key results were developed collaboratively by the executive vice presidents and vice presidents of the organization. The assistant vice presidents provided input on the strategic objectives and initiatives.

Once the strategic foundation was established, a formal performance-measurement system was developed. The implementation phase started in 2006. Previously, the airport had monitored a wealth of measures by department but lacked a structured airportwide system. The performance-measurement system is headed by the airport's CFO. The system uses process tools such as the Balanced Scorecard and Six Sigma. To secure a successful implementation and employee buy-in, Dallas/Fort Worth International Airport has trained its employees on the strategic plan.
The CFO would like to benchmark Dallas/Fort Worth International Airport measures and practices against those of peer airports in the future and so would like to see more transferability of measures among airports.

Data-Collection Procedures

Right now, data on the airport's Level 1 outcome-based performance measures are gathered by the finance department annually and published in the report Key Airport Measures for that year, which is not distributed outside the airport or published on the Web. Currently, the performance-measurement system does not track Level 2, or "lower," measures, although there is a great deal of performance measurement at lower levels. The finance department also acts as the repository for the Level 1 data. Most measures are taken monthly, but some are taken quarterly or annually. Financial information is provided by the AVPs and is directly tracked by the responsible department. Operational data come from the Airport Operations Database maintained by the operations department. Customer service information comes from the annual ACI International Benchmarking Survey.

Measures, Standards, and Targets

The airport's primary business goal and the four strategic focus areas are supported by Level 1 outcome-based performance measures, each with a target for the current fiscal year. All the information is gathered internally, with the exception of customer service data, much of which comes from the ACI International Benchmarking Survey and the employee engagement survey.
110 Part III: Field Research on Performance Measurement

Goals and actions are assigned top-level outcome measures referred to as "Level 1" measures. The strategic plan and the selection of performance measures are interrelated.

Following is a list of key performance measures for the primary business goal and each of the strategic focus areas:

• Primary Business Goal: Grow the Airport's Core Business
– International Passenger Airline Destinations
– Number of Passengers (total and O&D)
– Landed Weights (total and cargo only)
• Strategic Focus Area 1: Keep Dallas/Fort Worth International Airport Cost-Competitive
– Total Airline Costs
– Airline Cost per Enplaned Passenger (CPEP)
– Revenue Management (parking revenue per originating passenger, concessions sales per enplanement, commercial development acres leased, natural gas wells in production)
– Underlying Bond Ratings
• Strategic Focus Area 2: Create Customer Satisfaction
– ACI Survey Rank (International)
– ACI Survey Rank (Over 40 Million Passengers)
• Strategic Focus Area 3: Deliver Operational Excellence
– FAA Safety Compliance
– Environmental Compliance
• Strategic Focus Area 4: Foster Employee Engagement
– Employee Engagement Index Score
– Wellness Program Participation

Dallas/Fort Worth International Airport is cognizant that one of the limitations of its performance-measurement system is that it includes measures of areas that escape the airport's control yet directly affect the airport's performance. One example of this kind of measure is CPEP, a widely used measure that is affected by factors such as the nation's or a region's economic and financial conditions, airline performance and policies, and other factors beyond the control of the airport. These factors are often unique to a particular area or airport and therefore make it difficult to compare peer airports.

Because the performance-measurement system has been in place for only about a year at Dallas/Fort Worth International Airport, the airport currently monitors only Level 1 measures, and the system is currently structured around lagging indicators. The next version of the performance-measurement system will include Level 2 and Level 3 measures as well as leading indicators to help the airport be more proactive in making certain decisions. This will not be formalized until a business intelligence system is in place at the airport.

Monitoring and Reporting

Most monitoring and reporting is done monthly, although, depending on the nature of the measure, some reporting is done quarterly and some annually. The responsible departments assemble the information and provide it to the CEO and other top executives.
Some of this information is shared in meetings of the board of directors. The CFO would like to see all senior airport executives meet quarterly to discuss the status of Level 1 measures and those measures most closely linked to them.

Successes and Challenges

Dallas/Fort Worth International Airport has gone a long way toward developing a well-structured performance-measurement system with a solid strategic foundation. Dallas/Fort Worth International Airport's best practices include the following:

• A strong strategic-planning process giving proper direction to the organization.
• Clearly defined, measurable strategic focus areas and primary business goal.
• A good set of top-level performance outcome measures, with near-term (Fiscal Year 2008) targets.
• IT platform adoption to ensure accuracy and availability of data, with the double benefit of reduced data-collection effort and time.
• An employee training program on strategic planning to ensure strategic element alignment and employee buy-in.
• Executive compensation (bonuses) tied to outcome performance targets, down to the AVP level.
• A good planning process that involved executives at the AVP level and above.
• Branding and strategic positioning as a global airport.

Dallas/Fort Worth International Airport is still in the first implementation cycle of its performance-measurement system. The airport has not escaped employee resistance to change, making performance-measurement system implementation a tedious and longer process than originally expected.

The current development of the Dallas/Fort Worth International Airport performance-measurement system reflects good top-level outcome measures and targets. At a later stage, the airport will cascade measures to link them to programs at lower levels, thus aligning the entire organization with the primary business goal and its four strategic focus areas. Ultimately, the airport plans to develop the performance-measurement system to the point where it can tie individual employee performance standards to airport results.

San Diego International Airport

Abstract

San Diego International Airport is a large hub governed by the San Diego County Regional Airport Authority (the Authority). The need for an airport performance-measurement system began with the realization of an ongoing need to strengthen management accountability and manage the Authority's key initiatives, crucial outcomes, and overall results. The system is aligned with the organization's mission, vision, values, and strategic objectives as well as the annual operational and capital budgets. Over the past 2 years, the airport has implemented a state-of-the-art performance-management system, incorporating sophisticated software and data-collection technology into the airport's ERP system.

Airport Profile

Founded in 1928, San Diego International Airport is the 30th busiest airport in the country in terms of passengers and the nation's busiest single-runway commercial service airport.
San Diego International Airport is located approximately 3 miles from downtown San Diego, California, and served 8.9 million enplaned passengers from throughout the Southern California region (and Mexico) in fiscal year 2007. The Authority was created in 2003 as an independent agency to man- age daily airport operations and work collaboratively with local, state, and federal agencies to address the region’s long-term air transportation needs. As a financially self-sufficient agency, the Authority does not rely on taxpayer dollars or city or county funds for its operations. Program Goals and Objectives The Authority recognized the need for a more robust and effective performance-measurement system in late 2006. Prior to 2006, the management staff relied on a variety of distributed legacy software applications, electronic spreadsheets, paper forms, and other ad hoc materials and methods to collect, report, and manage key performance measures. The data collection and performance-measurement processes were cumbersome, often not timely, and very labor inten- sive. The Authority wanted to embed greater accountability and establish better, more accurate, measurement of performance indicators. With the appropriate technology tools, the Authority was able to better define the expected performance level and each individual’s contribution to divisional and organizational results. The Authority’s Business Planning Department was tasked to align and integrate the performance-management system with the organization’s strategic goals and each division’s annual goals. With functional cooperation and support, the performance-management system was designed, tested, and implemented. The performance-measurement system has matured from a limited scope of primarily reporting financial information to a fully integrated desktop application accessible by all management personnel through the Airport’s ERP software appli- cations. 
The organizational and functional scorecards are supplemented with external benchmarking results collected from comparable peer airports.

The performance-management system provides timely and accurate visibility of organizational goals and performance. Accountability is enhanced when those responsible have access to the right data, at the right time, and in the right format to make the appropriate decisions. All management personnel have access to the performance-measurement system application (known as QPR) to monitor both organizational and divisional performance trends and results. Management's involvement in setting the performance-measurement system criteria, and its periodic review of results, has strengthened its commitment to and accountability for performance.

The Authority's performance-management system monitors key organizational metrics as well as specific project cost and schedule performance. The QPR application offers a more user-friendly way of collecting and reporting performance data and a visual display of current performance and past trends (see Exhibit III-1.5). The dashboard allows users to drill down on each performance measure to better understand the underlying measures and to identify which measure may not be performing to plan. The QPR software is linked to the Authority's ERP system, which greatly enhances data utilization, accessibility, and consistency.

The Authority periodically modifies its strategic plan and key objectives to reflect a 3-year planning horizon instead of the traditional 1-year perspective. Five objectives have specific initiatives and action plans over this 3-year period: Financial (e.g., Rev/EPAX, Cost/EPAX, Cash on Hand), Sustainability (e.g., energy and water conservation, waste reduction, and recycling), Customer Satisfaction (e.g., passenger, employee, and tenant), Performance Excellence, and Community Outreach. The dashboard tracks both the results and the progress of various projects within each of these objectives.
Measures, Standards, and Targets

Airport Case Studies on Performance-Measurement Systems

Exhibit III-1.5. Snapshot of a dashboard implemented by San Diego International Airport.

Strategic objectives and the annual budget are closely linked, so many of the initial measures reflect financial and operating results. As the performance-management system has matured, additional measures reflecting the five objectives have been created to present a more balanced measurement scorecard approach. The Authority's performance-measurement system will be continually updated and realigned to these new objectives, and more relevant measures will be linked to report progress and trends. The measurement system's utility is very dependent on data collection and analysis, the generation of timely reports, and the availability of current and pertinent data to the organization. The updated performance-management system provides extensive information on the airport's operations for better management decision-making and faster response to critical issues. The Authority views the performance-measurement system as an essential
management tool to achieve world-class performance and recognition as best in class. The Authority also realizes that the performance-measurement system has yet to attain its full potential value to the organization and its management staff.

Business Planning works closely with divisional management to identify and select the appropriate performance measures. Core measures are made up of more granular measures to thoroughly understand cause and effect. A number of performance measures are financially oriented because of the close link between strategic planning and the budget process. In addition to the traditional financial and operational measures, the Authority is also measuring internal operational process efficiency, such as contract and hiring process cycle time, internal cost activity, and internal customer satisfaction. Performance targets are established on the basis of projected results and are expected to reflect business needs and the Authority's objectives.

Data-Collection Procedure

Performance-measurement data are compiled from several sources, e.g., financial application software, external vendor databases, and individual MS Excel spreadsheets. Many of these data are gathered and downloaded either electronically or manually into the Authority's ERP system. Having the data centrally located makes them readily accessible and easy to extract and present on the QPR dashboard. Collecting performance data requires a more systematic and rigorous process to ensure the data's accuracy and timeliness. Most Authority operational and financial data are collected monthly, while the remainder are collected quarterly or annually.

Monitoring and Reporting

Data and performance results are available to all management personnel for review and evaluation. Most performance measures and associated reports are generated within the QPR application and available electronically.
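The consolidation step described above can be sketched minimally: readings from several feeds are merged into one central store, keeping the freshest value per measure, and measures that have not been refreshed within their expected cadence are flagged. All function names, measure names, dates, and cadences are hypothetical, not the Authority's actual feeds.

```python
from datetime import date

def collect(sources, today, cadence_days):
    """Merge per-source readings into one central store and flag stale measures.

    sources: list of dicts mapping measure name -> (value, as_of date).
    cadence_days: expected refresh interval per measure; ~30 days (monthly)
    is assumed as the default, matching the collection cycle in the text."""
    central = {}
    for source in sources:
        for measure, (value, as_of) in source.items():
            # Keep the most recent reading when feeds overlap.
            if measure not in central or as_of > central[measure][1]:
                central[measure] = (value, as_of)
    stale = [m for m, (_, as_of) in central.items()
             if (today - as_of).days > cadence_days.get(m, 30)]
    return central, stale

# Hypothetical feeds: a financial-system extract and a spreadsheet measure.
erp_feed = {"operating_cost": (41.2e6, date(2009, 5, 31))}          # monthly
spreadsheet = {"passenger_satisfaction": (4.1, date(2009, 1, 15))}  # quarterly
central, stale = collect([erp_feed, spreadsheet], today=date(2009, 6, 15),
                         cadence_days={"passenger_satisfaction": 90})
```

With these dates, the spreadsheet measure is flagged as stale (last refreshed well beyond its quarterly cadence) while the monthly financial figure is current, illustrating why a systematic cadence check matters once data are centralized.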
Successes and Challenges

The Authority's biggest challenge is refining the data-collection process and integrating the performance-measurement system application throughout the Authority. The organization has a great deal of interest in and enthusiasm for the performance-measurement system and its ability to improve problem solving and decision-making. Both the manual and electronic activities associated with the performance-management system need continual refinement and employee input to enhance the system's capabilities.

The Authority's performance-measurement system continues to mature and evolve through a number of adaptive stages. It offers the organization a number of advantages, including better performance data accuracy and timeliness. The performance-management system's full benefits and payback have yet to be realized in the short time the system has been operational. The Airport Authority, however, fully expects that with continued use and ongoing refinements, the full functionality of the system will greatly exceed all expectations.

Toronto Pearson International Airport

Abstract

Toronto Pearson International Airport is located in Ontario, Canada, and is governed by the Greater Toronto Airports Authority (GTAA). The GTAA has developed a new strategic plan, which defines the airport's three main strategic themes for the next 5 years: global competitiveness, gateway development, and corporate sustainability. The strategy and performance measurement build on a Balanced Scorecard framework that incorporates four perspectives: financial, customer, internal business processes, and learning and growth. Each of these four perspectives has associated strategic initiatives, actions, measures to identify performance, and targets to monitor results and provide accountability. In turn, the actions, measures, and targets drive the development of annual
business plans to map annual goals and budgets. High-level corporate measures and extensive operational metrics are available to each department for measuring performance. Results are used to determine performance-based rewards.

Airport Profile

Toronto Pearson International Airport is Canada's busiest airport, located 17 miles northwest of downtown Toronto. Toronto Pearson is a large hub handling close to 32 million annual passengers. It is operated by the GTAA, which was incorporated in 1993 as a non-share corporation under Part II of the Canada Corporations Act and was recognized as a Canadian Airport Authority by the federal government in 1994. The GTAA operates Toronto Pearson as a public facility to benefit its customers, partners, and other stakeholders. Entirely self-funding, the GTAA is a not-for-profit corporation that reinvests any operating surpluses to expand and develop Toronto Pearson International Airport.

Since 1996, when the GTAA assumed management of Toronto Pearson, it has replaced outdated airport infrastructure and expanded airport facilities, including terminal buildings, hangars, runways, parking garages, and other facilities. In 2007, the GTAA completed the Airport Development Program. Future expansions will be built by the GTAA when warranted by demand, giving the airport an ultimate capacity of 54 million passengers annually.

Program Goals and Objectives

Toronto Pearson's strategic planning serves as the basis of the performance-measurement system. Both the strategic-planning and performance-measurement system efforts commenced in 2005 and are comparatively new initiatives, with many aspects still developing. One function of the SPAD department is to develop and oversee the performance-measurement system and report the results of the corporation's performance to the executive team and CEO. The strategic plan is based on the Balanced Scorecard developed by Kaplan and Norton.
In this approach, the airport's vision leads to three strategic themes, based on core organizational values and beliefs: global competitiveness, gateway development, and corporate sustainability. These themes, in turn, lead to a number of broader strategic objectives grouped into four perspectives: financial, customer, internal processes, and learning and growth. The Integrated Corporate Plan sets out specific initiatives with defined targets and measures for achieving each strategic objective.

Performance measurement is a three-tier process. The first tier involves SPAD developing and tracking corporate measures, which are high-level measures developed for each strategic theme and reported to the CEO and the board. The second tier encompasses the operating departments, which develop micro-level operating measures that make up the Integrated Corporate Plan (i.e., the means of achieving the vision and strategic themes). Airline CPEP demonstrates this process well. CPEP is a corporate measure that falls under the strategic theme of global competitiveness. CPEP is further composed of operating activity metrics, such as logistics, baggage cart retrieval, commercial vehicles, and public parking, which are measured at the corresponding operating department levels.

The GTAA believes that when strategic planning is developed, a performance-measurement system needs to be implemented as well in order to assess the success of strategic initiatives. Measures and targets give employees something to focus on and track, and they create an urgency to achieve the target. Measures and targets also allow managers to base their decisions on performance-measurement results and take action to improve outcomes.
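The two measurement tiers can be sketched as a simple roll-up. Here CPEP is read as cost per enplaned passenger (an assumption; the source does not expand the acronym), computed at the corporate tier from department-level cost metrics. The department names only echo the activity areas mentioned above, and all figures are invented.

```python
def cpep(department_costs, enplaned_passengers):
    """Corporate tier: total cost rolled up from departmental metrics,
    divided by enplaned passengers."""
    return sum(department_costs.values()) / enplaned_passengers

# Second tier: hypothetical department-level cost metrics (CAD).
dept_costs = {
    "baggage_cart_retrieval": 48.0e6,
    "commercial_vehicles": 12.5e6,
    "public_parking": 19.5e6,
}
rate = cpep(dept_costs, enplaned_passengers=16.0e6)  # 80M / 16M passengers
```

The point of the cascade is that the corporate number is fully traceable: when `rate` moves, each departmental term in the sum shows which operating area drove the change.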
There are five key objectives in the present GTAA performance-measurement system:

• To understand cost drivers, efficiency, and reliability of airport operations
• To measure progress
• To improve performance
• To set priorities
• To respond to changing conditions

In addition to the objectives listed above, the GTAA also includes reward for performance as one of the performance-measurement objectives, to encourage accountability for results. Currently, the reward program extends only to the third level of management. However, the long-term goal is to put the reward program into practice throughout all levels of the organization.

Data-Collection Procedure

To ensure a comprehensive performance-measurement system approach, data at Toronto Pearson are collected for various periods from all organizational units and from various external sources, including airlines (for passenger data), government agencies, and tenants (for customer data). SPAD organizes and centralizes data for reporting purposes. Micro-level data stay with individual units and are not shared across the organization. Data that affect several departments, such as financial information (costs), are collected centrally and are accessible to various departments within the organization. Data that affect only a limited number of business units, such as customer survey results, are available only to the unit responsible for that activity.

Currently, no central data warehouse exists; therefore, one of the challenges experienced by the GTAA is gathering data from various sources across the organization. The GTAA aims to improve the system's efficiency in order to sustain employee support of the process.

Measures, Standards, and Targets

Performance measures at the corporate, departmental, and sub-departmental levels are prepared for each strategic objective and initiative and are categorized by one of the four Balanced Scorecard areas. Each measurement area is built on department-level measures that, in some cases, may correlate only loosely or may be relevant to several overarching strategic themes at the same time.
For instance, outcomes of measuring taxiway delay may show how much the airport accrues in costs, prompting a decision to invest in new facilities. On the other hand, taxiway delay could be considered an environmental measure, showing the correlation between delays and the amount of emissions released into the atmosphere.

Selection and prioritization of measures assessing overall corporate performance are largely determined by executive team requests. The executives review and reassess the measures annually to ensure continued support of the strategic themes. Most of the measures in Toronto Pearson's performance-measurement system are budgetary figures. All Toronto Pearson measures are expressed in constant Canadian dollars to avoid an inflation effect and overstatement of results. The GTAA acknowledges that more leading measures could be included in the performance-measurement system. Measures incorporate a mix of qualitative and quantitative metrics because certain important aspects of operations, such as process efficiency, cannot be quantified and could be omitted in a system that was only "numbers" focused. The GTAA aims for all measures to be results-oriented.

There are three key measures that the board considers critical in its assessment of the organization's overall performance: airline CPEP, the airport's service-quality level in ACI's ASQ Survey, and the revenue-under-expenses ratio. If the targets for these three measures are not met, the organization as a whole will not have met the desired level of performance for that year.

Targets are in the process of being developed for all measures. Targets for performance measures are set through a consultation process at the executive level, with considerable research. For example, if the executive team is concerned about whether a target for ownership costs (rent) of 12% of airport revenue is feasible, SPAD will run a model to estimate how such a target would affect various operational areas and whether it is reasonable and achievable.
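Two calculations implied by this passage can be sketched directly: deflating a nominal figure to constant dollars, and testing the rent-to-revenue target used in the example. The CPI values and dollar amounts below are invented for illustration.

```python
def to_constant_dollars(nominal, cpi_current, cpi_base):
    """Deflate a nominal CAD amount to base-year (constant) dollars,
    so year-over-year comparisons are not overstated by inflation."""
    return nominal * cpi_base / cpi_current

def rent_target_met(rent, revenue, target_share=0.12):
    """True if ownership costs (rent) stay within the target share of revenue."""
    return rent / revenue <= target_share

# Invented figures: CAD 110M nominal against a 15% cumulative price rise.
real = to_constant_dollars(110.0e6, cpi_current=115.0, cpi_base=100.0)

# Invented figures: CAD 54M rent on CAD 400M revenue is 13.5% of revenue,
# so the 12% target from the example would not be met.
ok = rent_target_met(rent=54.0e6, revenue=400.0e6)
```

The constant-dollar step is what lets the GTAA compare budgetary measures across years without inflation inflating apparent growth; the second function is the kind of feasibility check a planning model would apply across operational areas.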
Organizational standards also play an important role in the performance-measurement system. According to the GTAA, if organizational standards are set low, the improvement in performance will be minimal as well.

Due to a unique cost structure at Toronto Pearson (no government support, single till, debt, rent payments, etc.), benchmarking is mostly done internally to track annual financial performance. Internal benchmarking is also performed between Toronto Pearson's two terminals, which are measured and analyzed at each level of performance as two separate entities. When it comes to external benchmarking, Toronto Pearson management is very selective about its peer airports to ensure data comparability. Benchmarking peers are limited to only those airports that specify the underlying components of their measures or whose internal operations are familiar to the GTAA. Benchmarking reports, such as the Air Transport Research Society (ATRS) publication, are not used by the GTAA because the components underlying other airports' performance measures are unspecified.

Monitoring and Reporting

The GTAA has a cascading model in which performance measures are reported throughout the organization. There are three levels of reporting: corporate, departmental, and sub-departmental. The highest tier reports 15 high-level corporate performance measures to the CEO/executive team and the board of directors by way of the strategic management section. This reporting is done quarterly and includes highly aggregated performance measures for review, not action. The second tier of performance reporting and monitoring is at the departmental level. Monthly reports are developed and directed to each department's vice president. The monthly reports present more detailed performance measures and are often financially based. The third tier provides further detailed performance measures and occurs within a department at a sub-departmental level. These are often operational performance measures.
Similar to the reports at the departmental tier, lower-level reporting highlights areas of concern, pinpoints immediate sources of difficulty, and acts as a prompt for action. This cascading model allows problems to be identified and resolved early, so that issues at lower levels are addressed before they surface, untraceably, in higher-level performance measures. The model also promotes ownership and understanding of performance in all areas of the operation.

Successes and Challenges

In an effort to maintain a successful and competitive airport, the strategic planning process is cyclical, with scheduled reviews and updates of the strategic plan to adapt to a changing environment. Exhibit III-1.6 presents the full business cycle at Toronto Pearson International Airport.

One of the benefits that the performance-measurement system has brought to the GTAA is the confidence the board has gained in airport management. The performance-measurement system is an accountability tool with results presented periodically. The board can take quick action if it notices signs of poor performance and can request detailed explanations of specific outcomes. Management also has more knowledge and therefore better control over airport operations.

In addition, a performance-based compensation system has also contributed to the improvement of performance results. The compensation system at the GTAA takes into consideration both individual and collective performance to avoid competition among employees and encourage teamwork. Forty percent of the assessment is based on group performance; therefore, there is an incentive for both the individual manager and the whole team to meet the targets and improve organizational performance.
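The 40% group weighting described above amounts to a simple weighted average. A minimal sketch, assuming performance is expressed as the fraction of targets met (the rating scale is not specified in the source):

```python
def performance_score(individual, group, group_weight=0.40):
    """Blend individual and group results so both drive the reward.

    Inputs are assumed to be fractions of targets met (0.0-1.0);
    the 40% group weight matches the split described in the text."""
    return (1 - group_weight) * individual + group_weight * group

# A manager who met 90% of individual targets on a team that met 75%.
score = performance_score(individual=0.90, group=0.75)
```

Because 40% of the score rides on the team's result, a manager cannot maximize the reward by outperforming colleagues alone, which is the teamwork incentive the GTAA describes.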

Together with the improvement in performance, Toronto Pearson experienced an increase in communication among its organizational units when the performance-measurement system was implemented. The GTAA reported that prior to the performance-measurement system, departmental statistics and information were not always shared across the organization. Even though opening lines of communication within the organization is still a work in progress, horizontal communication has improved considerably. Data sharing has especially improved in cases where data from various departments are needed to calculate corporate measures and produce reports for executives.

Because its system is still maturing, Toronto Pearson faces one of the most common performance-measurement system challenges: getting people to accept the system and accept accountability for results. This challenge may be explained by a change in the organization's purpose and scope. Twelve years ago, the GTAA's focus was on developing the airport infrastructure. Today, the focus is on developing the airport business and making its operations more efficient. The change in the organization's vision needs to be reflected in employee actions and attitudes. People do not yet fully recognize how their activities and actions affect overall organizational performance. To deal with this challenge, a strategy map illustrating the strategic themes, initiatives, and measures was introduced organizationwide by the executive team, with the aim of showing staff at all levels of the organization how to incorporate strategic planning and performance measurement into their daily activity.

The GTAA's desire to have a comprehensive performance-measurement system presented the challenge of measuring intangible assets. The GTAA had an abundance of statistical measures but had difficulty measuring and benchmarking unquantifiable aspects of organizational activity.
Measurement is even more complex when the results will be compared with those of other organizations. This aspect of the performance-measurement system is a work in progress.

Exhibit III-1.6. Toronto Pearson International Airport annual review and planning cycle. (Source: Toronto Pearson International Airport)

CHAPTER 2

Regional, State, and Federal Applications of Performance-Measurement Systems

This chapter serves as a reference to the reader on relevant regional, state, and federal applications of performance-measurement systems. A vignette of the efforts carried out by organizations that set a precedent in performance measurement, along with available information and where to find it, is presented. The efforts described include the following:

• Government Performance and Results Act of 1993 (GPRA)
• Airport Economics Manual (ICAO Document 9562) and Report of the Conference on the Economics of Airports and Air Navigation Services (ICAO Document 9908) (ICAO)
• Association Support for Performance-Measurement Practices
  – Airports Council International (ACI)
  – Civil Air Navigation Services Organisation (CANSO)
  – National Association of State Budget Officers (NASBO)
  – Service Efforts and Accomplishments (SEA) Reports advocated by the Governmental Accounting Standards Board (GASB)
• The UK Centre for the Measurement of Government Activity

Government Performance and Results Act of 1993 (GPRA)

The Government Performance and Results Act of 1993, also known as GPRA and "The Results Act," was one of a series of laws passed in the early 1990s designed to bring a higher level of management practice to federal organizations. Together with the Chief Financial Officers Act of 1990, which instituted more rigorous financial management, and Chief Information Officer legislation, which raised technology decisions to the executive level, GPRA sought to bring a higher level of performance to the federal government.

Interest in GPRA arose out of Congressional and White House interest in Total Quality Management (TQM) in the late 1980s. The work of W. Edwards Deming was cited by the Defense Department and other organizations as a model for improving performance and quality.
Malcolm Baldrige, President Ronald Reagan's first Secretary of Commerce, emphasized this model as important for improving America's competitiveness and quality levels in both the private and public sectors. This effort resulted in the creation of the Baldrige National Quality Award in 1987. The award criteria have been widely used as a performance framework by corporations and by government at all levels.

In 1992, building on this rising awareness of performance initiatives across all sectors, Senator Fred Thompson, Representative Newt Gingrich, and others championed a new "Results Act" to bring increased transparency and accountability to the federal government's many activities and programs. GPRA required agencies to author a strategic plan with measurable performance targets and was signed into law in 1993. However, it did not take effect for most agencies until
1997. GPRA marked the first time Congress had required that firm agency measures be integrated into the budget process. GPRA requires all federal agencies to do the following:

• Set a 3- to 5-year strategic plan with a clear mission and vision.
• Set measurable outcome goals for all major functions.
• Develop target levels for all goals.
• Develop specific strategies for achieving goals.
• Regularly measure results and set annual goals.
• Incorporate measures and analysis into the annual budget process through publicly available Performance and Accountability Reports.
• Set timeframes for regular program evaluation.

GPRA also allows the Office of Management and Budget (OMB) to authorize an alternative form of a goal, including a description of a successful and minimally effective program.

In 2006, the OMB developed a pilot program to make agencies' goals and measures more transparent and easily accessible to the public. Agencies participating in the pilot program prepare a 25-page Citizens' Report that sums up key financial and performance issues, key goals, and the way that funds are spent to meet those goals.

While the White House and OMB emphasized the importance of keeping performance information short and concise, others have argued that the Citizens' Report separates performance from budgeting. The standard Performance and Accountability Reports (PARs) publish performance and budget data together, but these large reports have been characterized as too long and complex for the general public. All major agencies, regardless of whether they participate in the pilot program, are now also required to create a two-page summary that gives the reader a quick snapshot of agency results, in addition to the standard PAR.

More than 15 years after GPRA became law, it stands as the only piece of legislation that addresses government strategic planning and performance management.
GPRA has continued to gain importance as agencies have begun not only to meet the letter of the law, but also to incorporate the principles underlying the law into their management practices.

Airport Economics Manual (ICAO Document 9562) and Report of the Conference on the Economics of Airports and Air Navigation Services (ICAO Document 9908)

The International Civil Aviation Organization (ICAO), a UN specialized agency, develops regulatory principles, policies, and techniques of international air navigation to foster the planning and development of international air transport. ICAO was founded in 1947, and it operates through regional offices throughout the world, with headquarters in Montreal, Canada. One of the major duties of ICAO is to adopt international standards and recommended practices. In this capacity, ICAO initially addressed performance and productivity measures in the Airport Economics Manual (ICAO Document 9562), dedicating Chapter 3, Section C, to discussing performance measurement as a financial management tool for airport managers, regulators, and users. The document stresses that performance measures can be applied to all aspects of an airport: not only to its airside and landside operations, but also to safety, security, and commercial practices. ICAO encourages airports to select relevant areas of measurement and suggests four key areas: safety, delay, productivity, and cost-effectiveness.

ICAO suggests a five-part performance system that airports developing performance measures could use: (1) selecting the most important goals, (2) establishing a measurement method, (3) setting targets, (4) determining what work or initiatives are needed to achieve those goals,
and then (5) assessing the results of performance measures and their impact on achieving the goals. Within this context, ICAO identifies three units of measurement: (1) inputs (capital assets, staff numbers, supplies, and services), (2) outputs (enplanements, operations, cargo handled, and financial aspects such as costs and revenues, including aeronautical and non-aeronautical revenues and their components), and (3) outcomes (quality and efficiency of services, such as safety, timeliness, productivity, and cost-effectiveness). Applications of performance measures include benchmarking, identification of best practices and performance drivers, investment analysis, consultation with users, forecasting, and internal assessment tools.

Performance-measurement practices were further discussed at the Conference on the Economics of Airports and Air Navigation Services (CEANS) in September 2008, where the interaction among states, providers, and users was reviewed and incorporated. The revisions and several recommendations were documented in Report of the Conference on the Economics of Airports and Air Navigation Services (CEANS) (ICAO Document 9908). Contributions relate to two main areas:

• Economic Performance and Minimum Reporting Requirements
  – States should ensure that appropriate performance-management systems are developed and implemented by their service providers.
  – States should ensure that service providers establish performance objectives with the purpose, as a minimum, of continuously improving performance in four key performance areas and reporting at least one relevant performance indicator for each key performance area.
  – Based on ICAO's Policies on Charges for Airports and Air Navigation Services (ICAO Document 9082), the establishment of performance-management systems by service providers is recommended.
• Consultation with Users as a Source of Allocating Funds to the Right Projects
  – A dialogue should be established with regional organizations on economic performance with a view to improving performance of the air navigation services system.
  – States should ensure that a clearly defined, regular consultation process is established by their airports with users.
  – Users will be consulted on the level and structure of charges as well as on capacity development and investments.
  – Users' feedback obtained during consultations will be considered as far as possible before a decision is reached regarding any proposal.
  – Confidentiality of market-sensitive data will be properly protected.
  – Relevant decision documents will provide an appropriate rationale for the decision.

Association Support for Performance-Measurement Practices

Airports Council International (ACI)

ACI is actively involved in performance measurement as a tool to improve customer service and overall airport performance. Among the many efforts pursued by the organization, two have been successfully deployed: the ACI-NA Airport Performance Benchmarking Survey and the ASQ Survey.

ACI-NA Airport Performance Benchmarking Survey

ACI-NA proactively conducts the Airport Performance Benchmarking Survey, which generates operational and financial data on participating airports in the United States and Canada. The origin of the survey was the Airport Initiatives in Measurement (AIM) Survey developed by

Tampa International Airport, an outreach to eight Florida airports in 2003. By 2005, ACI-NA had launched the Airport Performance Benchmarking Survey, a consolidation of the AIM Survey with the Macro Benchmarking Survey administered by ACI-NA, which provided data on four to five macro indicators. Over the years, the ACI-NA Finance Committee has revised and improved the survey form to incorporate information from other industry surveys, with the aim of reducing the reporting burden on participating airports. As a consequence, the FAA has submitted a request to OMB to revise FAA Form 5100-127 to consolidate it with this survey. The Airport Performance Benchmarking Survey is also aligned with the ACI Economics Survey. The survey currently comprises approximately 75 operational and financial measures.

A snapshot of the Airport Performance Benchmarking Survey is presented in Exhibit III-2.1. Results from the survey reflect raw data provided by all participating airports. It is at the discretion of the individual airports to select the best peer group for benchmarking purposes. Data are available to participating airports in MS Excel and MS Access formats through email and on CD-ROM. Airport contact information is also provided to encourage communication between airports so that they can learn from each other's best practices. A summary report that presents the aggregate performance of participating airports is presented at the ACI-NA Finance Committee Meeting; however, there is no specific mention of data or actions undertaken by individual airports.

The Airport Performance Benchmarking Survey is administered by the ACI-NA Strategic Planning and Performance Management Group once a year. It is rolled out by the end of March, and data are made available to participating airports in two releases. The first release occurs in June–July and involves airports with a fiscal year ending in December of the previous year.
The second release occurs in August–September and includes data from airports with a fiscal year ending in June of that year. This two-phase release process allows for up-to-date information.

Participation in the survey has increased by 67% since it was first rolled out in 2005, and the survey captures 90% of large hubs and 57% of medium hubs. Small hubs and non-hubs also participate, but to a lesser extent. Five Canadian airports were incorporated into the sample in 2008. A total of 77 airports participated in 2008. Enrollment in the survey is open to all airports in North America, and there is no charge.

ACI-NA also sponsors a series of training sessions offered to airports on what information to include and how it should be reported. The goal is to ensure that airports enter comparable data that can be widely used in external benchmarking efforts. During the training sessions, airports are encouraged to ask questions, which helps ACI-NA improve the survey tool. Sixty people have attended the training sessions since the series began in 2007.

Airports Council International-Airport Service Quality

Within ACI there is a subgroup dedicated to airport service quality (ASQ). ASQ offers services and information to the airport industry specifically tailored to airport performance measurement, with an emphasis on customer service. Within ASQ there are four main initiatives: (1) the ASQ Survey, (2) ASQ Performance, (3) ASQ Assured, and (4) ASQ Management.

The ASQ Survey is the world's leading airport customer satisfaction benchmark program, with over 120 airports in more than 45 countries surveying their passengers every month of the year. All airports use the same questionnaire and follow the same methodology. Airports can participate in the process at four different levels:

• ASQ Survey Main Programme.
The Main Programme has over 120 airports participating and is designed for all airports that require regularly updated information on their service performance for operational and strategic decision-making. Every month, at all participating airports, departing passengers are interviewed about their experience on the day of travel. All airports use the same questionnaire and methodology. The ASQ Survey Main Programme is tailored for

Exhibit III-2.1. ACI-NA Airport Performance Benchmarking Survey sample.30

airports of all sizes, from 0.5 million passengers to 85 million. The wide range of ASQ participants allows each airport to select an appropriate benchmarking panel. The Main Programme offers quarterly results, providing insight into and comparisons with the service performance of airports all over the world. A powerful array of deliverables is available within weeks of the end of each calendar quarter. These deliverables include management summaries and interactive data mining and analysis tools, as well as individually tailored reports, panels, and raw data. Participating airports have access to all other participating airports' results.

• ASQ Survey Regional Programme. ASQ Regional is specifically designed to provide benchmarking for airports with fewer than 2 million passengers. It allows smaller airports to take advantage of the benefits of ASQ without having to invest in the full ASQ Main Programme. ASQ Regional surveys passengers twice a year, once for each season's schedule, and is fully compatible with the ASQ Main Programme. It also offers the powerful array of ASQ management tools and customer insight, including benchmarking indices based on the entire ASQ airport list. Airports can upgrade from ASQ Regional to the ASQ Main Programme at any time. The Regional Programme is particularly attractive for airport management companies looking for a tool to measure and monitor the customer service performance of their regional airports. Larger airports, which need the full depth of information provided by the Main Programme, can then be compared against smaller regional and developing airports.

• ASQ Survey Unique. ASQ Unique is a one-off survey and review of an airport's customer service performance that can be conducted at a time of the airport's choosing.
ASQ Unique is a fully flexible customer service benchmark that gives airports access to ASQ management insight and information without requiring significant investment. ASQ Unique allows airports to analyze and investigate their customer service performance secure in the knowledge that they are benefiting from the techniques and methodologies used in the world's most advanced airports. ASQ Unique is distinguished from ordinary customer satisfaction surveys in that it offers benchmark indices that allow airports to place their results in perspective and compare themselves against the industry average. ASQ Unique is fully compatible with the ASQ Main and ASQ Regional Programmes and permits airports to "test drive" the ASQ management tools.

• Airport Specific Survey. The ASQ Survey identifies areas of concern, but getting to the heart of an issue and highlighting its causes and potential solutions often requires more detailed research. Airport Specific Research concentrates only on areas of concern and potential key revenue/success drivers. This ensures that airports receive detailed action points focused exactly where they are needed. This research operates at varying levels, depending on individual airport requirements. Each airport is different and requires a different mix.

ASQ Performance was developed at the request of a number of airports that wished to complement the information obtained from the ASQ Survey with actual measures of the service delivered. Faced with a wide range of methodologies and measurements, airports worldwide have agreed to standardize performance measurement on 16 KPIs that define the passenger experience through the airport, and on one unique methodology: ASQ Performance. ASQ Performance measures the levels of service delivered by an airport through a series of observations scheduled to ensure an accurate reflection of key issues, and it puts those measures into context through comparison with other airports.
It allows airport management to measure the service performance actually delivered by the airport and to accurately pinpoint underperformance, bottlenecks, and over-performance.

Each participating airport receives the data from all other participating airports, allowing it to identify best practices and to measure its own performance precisely. Excellence in service is not a singular occurrence; it is the result of continual effort and commitment to providing the best possible service. Conceived as a tool for airport management, ASQ Performance offers monthly feedback and a range of deliverables, from management summaries to full databases capable of displaying each recorded observation. The ASQ Performance methodology is

simple in concept, but experience has shown that an in-depth understanding of each airport is vital to creating accurate measures that can be trusted. Key aspects of the ASQ Performance methodology are the following:

• 16 KPIs are measured by all airports all year round, with the option to add additional KPIs.
• Observations are conducted using a PDA (personal digital assistant) to keep fieldwork costs to a minimum.
• All airports use an identical methodology to guarantee consistent data for trend and benchmark analysis.
• KPIs are measured only during peak times.
• A minimum sample size is mandatory to guarantee representative benchmarking.
• Participating airports can choose to increase the sample size or the number of surveyed KPIs.

ASQ Assured, the third component of ASQ, is a certification scheme specifically designed and operated for airports by ACI. It helps airports meet the challenge of managing service quality in a very dynamic industry environment by recognizing the initiatives and processes in place at the airport. Airports participating in ASQ Assured go through ACI's custom-designed audit process to assess their commitment to service quality.

ASQ Assured is unique in that it recognizes that providing a continually high quality of service is a journey, not a destination. ASQ Assured does not mandate an arbitrary service level. Certification recognizes airports' commitment to service quality and the fact that systems and processes are in place to constantly improve the service provided to passengers.

ASQ Management, the fourth and final aspect of ASQ, provides support, advice, and advisory services for airports looking to improve their quality of service. Advisory projects range from supporting airports looking to achieve ASQ Assured Certification to assisting in changing airport culture and implementing best practices throughout the airport.
ASQ Management can also provide training for staff involved in using the ASQ Survey and ASQ Performance. The training ranges from understanding and analyzing market research to presentation skills and assistance in embedding ASQ in day-to-day business processes at the airport.

One of the core roles of the ASQ Management service is to assist airports in the implementation of best practices identified through the ASQ Survey and ASQ Performance. By analyzing and collating the results of hundreds of airports' customer satisfaction surveys, and through regular meetings and discussions with the airports, the ASQ initiative is building a best-practice model of service quality in an airport. Airports wishing to improve customer service quality can now access knowledge of these best practices through the ASQ Survey and ASQ Performance to accelerate their own improvements.

Civil Air Navigation Services Organisation

The Civil Air Navigation Services Organisation (CANSO) acts as the global voice of the companies that provide air traffic control and represents the interests of Air Navigation Service Providers (ANSPs) worldwide. CANSO's mission is to assist its members in providing a safe and seamless airspace, with particular emphasis on customer-driven performance, cost efficiency, and optimized air traffic management.

CANSO's strategic goals are focused on improving global air navigation service performance. As such, its aim is to provide a global platform for customer- and stakeholder-driven civil air navigation services, with emphasis on safety, efficiency, and cost-effectiveness. Air navigation service performance measurement and global benchmarking lie at the heart of this objective. It is recognized that the ability to monitor and measure performance is a key requirement for any business or industry in identifying areas for improvement and setting performance-based targets.

It is for this reason that CANSO launched its global benchmarking work program, supported by its Global Benchmarking Workgroup (GBWG). One of the key objectives of this ANSP initiative is to support the establishment of performance-based air traffic management (ATM). Improved transparency of air navigation service performance and visibility of the performance of others promote understanding of what drives good performance. Further, they support improved decision-making and facilitate target setting. Overall, CANSO's aims are the following:

• To develop a set of key global performance indicators for air navigation services,
• To identify international best practices,
• To support constructive dialogue with customers and other stakeholders, and
• To assist individual ANSPs in optimizing their performance.

The CANSO GBWG initiative acknowledges the significant achievements in the field of performance measurement and benchmarking by the Eurocontrol Performance Review Unit (PRU). The approach taken by the GBWG sought to draw from a range of existing initiatives, including those of the Eurocontrol PRU, the Asia Pacific ANSP benchmarking initiative, the International Air Transport Association's (IATA's) work on air navigation service performance, and individual ANSPs' international benchmarking studies and harmonization efforts. The GBWG has also been developing global performance indicators in air navigation service productivity, cost-effectiveness, and quality of service. The CANSO Safety Standing Committee is also developing safety metrics. The ultimate goal for the GBWG is to develop robust reports suitable for external publication; however, it is acknowledged that before this can be achieved, more work is required to refine supporting processes, improve the speed of data collection and validation, and establish an appropriate scope of measures.
CANSO member ANSPs have, through their own initiative and the demands and expectations of their customers, placed a great deal of importance on performance measurement and benchmarking. The CANSO Global Benchmarking initiative has provided, and will continue to provide, an essential opportunity to share knowledge and collaborate globally. It will promote understanding of what drives good performance in ATM, reveal best practices that will assist individual ANSPs in optimizing their performance, and serve the needs of air navigation service oversight bodies.

National Association of State Budget Officers

The National Association of State Budget Officers (NASBO) has served as the professional membership organization for state finance officers for more than 60 years. NASBO is the instrument through which the states collectively advance state budget practices. As the chief financial advisors to the nation's governors, NASBO members are active participants in public policy discussions at the state level. The major functions of the organization consist of research, policy development, education, training, and technical assistance. These are achieved primarily through NASBO's publications, membership meetings, and training sessions. NASBO is an independent professional and educational association and is also an affiliate of the National Governors' Association.

NASBO's role in the performance-measurement arena is to serve as an information source for its members so that best practices can be shared. In this capacity, NASBO staff members monitor the performance-measurement activities of the federal, state, and local governments that are at the forefront in this area. Further, NASBO maintains contacts with states, the federal government, and other associations and keeps apprised of recent developments, including new publications and training.
NASBO recognizes that, due to both fiscal constraints and the public's desire for government to be more accountable, state governments have increasingly turned to incorporating performance measures in state budgeting. Whether performance measures are legislative or gubernatorial initiatives, the budget office is often responsible for implementing the process; at the very least, it is an integral part of the process. NASBO points out that just as each state is unique, its approach to incorporating performance measures is also individualized, tailored to the specific political realities and governance of the state. States continue to experiment in this area, and their experiences provide insight into best practices for the implementation of performance measures. NASBO's 2008 document Budget Processes in the States (available at www.nasbo.org) illustrates the measures and processes adopted in each of the 50 states and Puerto Rico.

Governmental Accounting Standards Board: Service Efforts and Accomplishments

The Governmental Accounting Standards Board (GASB) has been studying the use of performance management, measurement, and reporting by governments almost since its creation in 1984 by the Financial Accounting Foundation (FAF). The GASB's focus with its Service Efforts and Accomplishments (SEA) project is on only one aspect of the performance-management process: the external reporting of SEA performance information. The GASB is providing state and local governments with voluntary guidance to assist them in effectively communicating SEA performance information to city councils, staff, and the public. Such guidance will assist governments in their duty to be publicly accountable and to make informed economic, social, and political decisions. In addition, the GASB seeks to guide and educate the public about SEA reports.
The objective of the SEA project is to encourage the reporting and use of SEA performance information by doing the following:

• Developing conceptually based suggested guidelines for voluntary reporting of SEA performance information that will help officials effectively communicate the government's SEA performance in a way that the public will find meaningful and understandable, and
• Completing a limited update of selected sections of "Concepts Statement No. 2, Service Efforts and Accomplishments Reporting," based on information gained through the GASB's combined research, to clarify the scope and limitations of SEA reports.

The scope of the overall project does not include establishing the goals and objectives of state or local governmental services, establishing specific nonfinancial measures or indicators of service performance, or establishing standards of or benchmarks for service performance.

The reporting of the results of governmental programs and services is referred to as SEA reporting for government. This type of reporting encompasses not only information about the acquisition and use of resources, but also information about the outputs and outcomes of the services provided. Information is included about the relationship between the use of resources (costs) and those outputs and outcomes, which may be referred to as measures of performance. A variety of measures are needed to assist users in assessing governmental performance, including measures of inputs, outputs, efficiency, and outcomes (measures that relate service efforts to service accomplishments), as well as external factors that influence results.

SEA reporting is the act of preparing and publishing a report measuring the efficiency and effectiveness with which an organization operates in trying to achieve desired results. SEA reporting produces information on the results of government programs or services that can be used to help make decisions.
It provides citizens and other users with measures or indicators of the volume, quality, efficiency, and results of public services. These indicators of performance, when publicly reported, are yardsticks that can be used to determine whether government is performing well, poorly, or somewhere in between.

Reporting of SEA performance information is more effective if the government's officers, elected officials, citizens, public entities and organizations, and public servants realize they are all accountable for their performance and the use of public resources. The reporting of this information for assessing accountability is more commonly done internally, but there is a growing desire among external users and recipients (such as citizens) to know how the government is performing.

The UK Centre for the Measurement of Government Activity

Recognized by the United Nations and the McKinsey Group as a leader in government performance management, the UK Centre for the Measurement of Government Activity is a division of the Office for National Statistics and is responsible for measuring the volume of inputs, outputs, and productivity change over time for public services. The United Kingdom, along with Italy and other European countries, has taken the lead in addressing productivity and performance in government, as well as the broader issue of the impact of government spending on national gross domestic product figures.

The aim of the UK Centre is to measure all outputs resulting from several broad areas of public spending, including health, education, transportation, and so forth. The Centre contributes government production and performance data to the overall UK national accounts figures, giving a more complete view of the nation's economic activity and production than is currently available in the United States. As government spending continues to grow and represents an increasing proportion of all economic activity, the failure to determine how productive the public sector is can greatly impede understanding of the national economy. The UK Centre is developing an advanced framework to truly determine the quantity and quality of products and services resulting from government spending.
A specific focus is to develop measures of the quality of public services in terms of their actual impact on desired, pre-determined outcomes, and to make use of output and productivity measures. The UK Centre was created in 2005 as the result of recommendations in a high-profile government report, which determined that the lack of performance measures was damaging the national economy and government management. The UK Centre's specific duties are the following:

• To ensure that the measures of key government services in the national economic statistics are accurate and meaningful;
• To keep improving performance measures, working with government departments, managers, and other stakeholders;
• To conduct rolling reviews of methods of measurement of different public services, ensuring that methodology keeps pace with changing circumstances and modes of delivery;
• To continue publishing a regular series of authoritative "productivity" articles describing the output and productivity performance of the main public services; and
• To develop and publish credible and coherent individual reports on specific government program areas.


TRB’s Airport Cooperative Research Program (ACRP) Report 19: Developing an Airport Performance-Measurement System provides guidance on developing and implementing an effective performance-measurement system for airports. The report’s accompanying CD-ROM provides tools designed to help users complete the step-by-step process for developing an airport performance-measurement system as presented in ACRP Report 19.

The CD-ROM is also available for download from TRB's website as an ISO image.

