Chapter 1: Introduction to Benchmarking

GETTING STARTED

Two important actions are required to begin customer-driven benchmarking. The first is to establish the internal benchmarking team. The second is to explore related management processes within the agency that affect performance measurement.

Internal Team

To begin, establish the internal team that will be responsible for implementing customer-driven benchmarking. This team will do all that is necessary to establish the agreements, processes, and procedures for customer-driven benchmarking and to inform and support the line organization regarding what is going on. This team may be a task force that operates under the direction and management of the senior maintenance leader, or it may consist of several people who already report to the senior maintenance leader. In either case, recognize that this team will function for as long as the agency is involved in customer-driven benchmarking. This team must include the most senior maintenance leadership, other maintenance and systems staff, and, if necessary, consultants. Collectively, this team requires background and experience in the following:

- Defining maintenance practices and managing maintenance work (i.e., the team requires an expert who has credibility with field managers);
- Designing, administering, and interpreting customer surveys and related consumer research;
- Collecting and utilizing data for performance measurement;
- Inputting, manipulating, and extracting data from the maintenance and related asset management systems;
- Inputting, manipulating, and extracting data from the financial management system;
- Setting performance targets, budgeting, and allocating resources to field organizations; and
- Training superintendents, crew leaders, and equipment operators.

One person may provide several of these capabilities. This team will work across internal maintenance suborganizations such as districts, counties, areas, and garages. The team will also be the primary coordinating body with partner agencies.

Related Management Processes

A typical organization, whether public or private, has many related management processes and systems that seek to achieve some of the same goals as customer-driven benchmarking. It is important to be aware of these related management processes, to use relevant data and performance measures from them, and to coordinate with them. The following management processes related to benchmarking are found in many organizations:

- Asset management,
- Outsourcing,
- Performance-based planning, and
- Public reporting in conformance with the Governmental Accounting Standards Board (GASB).

Asset Management

With the completion of the Interstate Highway System and the enactment of the Intermodal Surface Transportation Efficiency Act in the early 1990s, national policy regarding roads turned decisively in the direction of preserving the existing investment and making the best use of existing highway capacity. By the end of the 1990s, the thrust to preserve existing investment was folded into the idea of "asset management."
Asset management is a systematic process of maintaining, upgrading, and operating physical assets cost-effectively, although in the broadest sense it can apply also to materials, equipment, and financial resources.2 According to the proceedings of an executive seminar on asset management conducted in 1996, the attributes, key components, procedures, and outputs of an asset management system include the following:

- A common understanding of performance measures and criteria;
- Understandable results in a user-friendly environment;
- Customer focus;
- A mission-driven orientation (i.e., asset management strives to help the organization achieve its mission);
- Accessibility at many levels within the organization;
- Linkages to technical analysis, decision making, and budgetary processes;
- Inventory information and condition databases;
- Life-cycle cost analysis; and
- Optimization (i.e., allocating limited funds in order to maximize net benefits or minimize total costs).3

These attributes are strikingly similar to key elements of a customer-driven benchmarking process. Because of this similarity, the likelihood of succeeding in a benchmarking effort can be substantially strengthened by properly coordinating with the agency's asset management program and by thoroughly understanding the asset management systems that are in place, under development, or being planned. Therefore, at the start of a benchmarking effort, it is desirable to take an inventory of your agency's asset management efforts. By doing so, you will be able to identify procedures, performance measures, sources of data and information, and other resources that can help in benchmarking. You are also likely to find increased support for your benchmarking efforts. Those charged with asset management will usually recognize that customer-driven benchmarking can benefit asset management, and vice versa. There may also be shared recognition of the desirability of integrating benchmarking into the overall asset management program.

2 Center for Infrastructure and Transportation Studies at Rensselaer Polytechnic Institute, 21st Century Asset Management, Executive Summary, Proceedings of a workshop sponsored by the American Association of State Highway and Transportation Officials and the Federal Highway Administration, October 1997.

3 Asset Management, Advancing the State of the Art into the 21st Century Through Public-Private Dialogue, Proceedings of an executive seminar sponsored by the American Association of State Highway and Transportation Officials and the Federal Highway Administration, Washington, DC, October 1997.

Outsourcing

Nearly all agencies outsource at least some of their maintenance operations. A critical issue in outsourcing is determining which activities to outsource, developing performance specifications for contracting these activities, and evaluating the performance of contractors. In addition, contractors themselves have a compelling need to evaluate their own performance in order to serve their clients effectively and to remain competitive. Doing each of these tasks well depends on having appropriate performance measures. Many of these performance measures are similar to those that might be used for benchmarking.

When getting started on benchmarking, it is desirable to determine what type of performance measurement, if any, is being used in conjunction with outsourcing. You should coordinate with those responsible for performance-based outsourcing and, if possible, arrange to share data and results. It is also desirable to contact contractors who must work under performance-based specifications. Contractors may have insights regarding how to establish an effective benchmarking process. Also, contractors may wish to become benchmarking partners.

Performance-Based Planning

The reinvention of government to make it more responsive to customer needs and more accountable has been going on for a long time and accelerated in the late 1980s. Gradually, and then with increasing speed, public officials and managers in government recognized that establishing customer-oriented performance measures and targets for accomplishments is one of the most effective ways to improve government efficiency and effectiveness.
With the enactment of the Government Performance and Results Act, all federal agencies were required to develop a performance-based strategic plan by identifying appropriate input, outcome, and output measures; setting targets; striving to meet the targets; and reporting on their progress. Many states have enacted similar legislation, as have cities and counties. An excellent example of performance reporting is Oregon Benchmarks, which received national recognition.

The private sector has also been using performance-based planning. In order to avoid the dangers of relying upon an overly narrow set of performance measures, many private firms (and government agencies) have been developing "balanced scorecards." The balanced scorecard approach involves performance measurement, goal setting, reporting, and monitoring in four areas:

1. Customer perspective,
2. Internal perspective,
3. Innovation and learning perspective, and
4. Financial perspective.

When beginning a benchmarking process, you should determine what types of performance-based planning are occurring in your agency and identify opportunities to cooperate and share information and results.

Accountability and the GASB

The GASB has played a major role in fostering performance assessment, mainly to increase the accountability of agencies to their customers and to the people who finance and pay for government services. To this end, the GASB carried out a major research program entitled "Service Efforts and Accomplishments Reporting: Its Time Has Come," which produced a series of reports on performance measurement and reporting, including one on road maintenance.4 An outgrowth of this effort has been GASB Statement 34, which calls for government transportation agencies to depreciate their assets or report on the condition of assets by using data in an asset management system when they prepare their annual financial reports to the public. Virtually every government agency prepares its financial reports in conformity with GASB

4 Hyman, W., R. M. Alfelor, and J. A. Allen, Service Efforts and Accomplishments Reporting: Its Time Has Come: Road Maintenance, Governmental Accounting Standards Board, February 1993.