The National Academies of Sciences, Engineering, and Medicine
500 Fifth St. N.W. | Washington, D.C. 20001

Copyright © National Academy of Sciences. All rights reserved.


Reporting of and accountability for performance; and
Evaluation of staff performance.

The range exhibited includes development of standards, use of outcome measures, and the performance-related data acquisition process.

Development of Standards

There have been some self-evaluation instruments for SO&M, such as for signal operations and incident management. However, there are no accepted standards or warrants for the applications' performance, and benchmarking efforts are generally informal. An FHWA survey indicates that the mature states identified in this project's survey had developed measures oriented to meeting the SAFETEA-LU performance reporting requirements, including some outcome measures related to speed and travel time, incident numbers and duration, and some operations activities output data. In addition, a few districts in a few states are measuring incident-related times for procedural improvements. However, the current lack of common definitions and uniform means of recording and archiving the information substantially reduces the utility of much of the available data and makes it virtually impossible to manage a program toward improved effectiveness. A recently completed NCHRP study developed the needed standards, but few states are considering their adoption.

Use of Outcome Measures

Only outcome measures (i.e., impact on service levels) are true measures of program improvement and success, but with very few exceptions (and those on a pilot basis), performance data are not being used to tune strategy applications in the context of a commitment to continuous improvement. A few of the mature states use output data to improve their activity effectiveness. The development of performance measures, systems, data, and the analytics to utilize them requires a considerable time frame.

Performance-Related Data Acquisition Process

The increasing focus on performance data, both for strategy applications analysis and improvement and for accountability reporting (internally and externally), places a significant burden on the acquisition, development, and analysis of performance data, both outputs and outcomes. Whereas output measures (such as event time-stamping in incident response) can be helpful in tuning up procedures, performance metrics related to reliability depend as well on travel time and delay measurement. Some travel-time data can be modeled, using volume and capacity relationship assumptions and facility-level detection (where available), especially over long time periods; but real-time analysis on a regional or corridor basis is increasingly seen as dependent on vehicle speeds and on travel time and variance measurements, requiring either extensive loop detector deployment and maintenance or vehicle probe data (which must be acquired from the private sector). The costs of extensive detection (especially on arterials and outside congested areas) must be weighed against its advantages in discrete areas and its provision of volume data. In both cases, complex analytics requiring experimentation and experience are needed to produce useful data. Very few states have made this investment.

Process Maturity as a Bridge to Defining Improvements in Institutional Architecture

In this report, the concept of capability maturity is applied to both process and institutional characteristics, based on the recognition that those aspects of process essential for program effectiveness must be present at defined levels of criteria-based maturity to be effective. Process maturity levels are identified for the purpose of indicating the types of change needed to advance toward increasingly supportive institutional architecture related to culture, organization and staffing, and resources and partnerships.

It should be noted that whereas the maturity model approach is used to structure the relationship between process maturity and supportive institutional architecture, the guidance is focused on improving institutional architecture, not on business or technical processes.

Levels of Process Maturity

The range of process capabilities determined in the survey and interviews can be grouped into three capability maturity levels. The lowest level (Level 1, or L1) is identified with the state of SO&M associated with a transitioning transportation agency. A logical increment (Level 2, or L2) has been observed in the more mature transportation agency practice and represents current best practice, as determined from project research. The highest level of capability (Level 3, or L3) is not directly observable among transportation agencies, but is extrapolated as the presumed ideal outcome of the vectors of improvement in capability established by L2.
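The travel time and variance measurements discussed above under the performance-related data acquisition process feed standard FHWA-defined reliability metrics such as the buffer index and the planning time index. The short sketch below (illustrative only; the function name and the sample probe data are assumptions, not from the report) shows the kind of analytics involved:

```python
import statistics

def reliability_metrics(travel_times_min, free_flow_min):
    """Two standard FHWA travel-time reliability metrics from a
    sample of observed corridor travel times (in minutes).

    Buffer index        = (95th percentile - mean) / mean
    Planning time index = 95th percentile / free-flow travel time
    """
    tt = sorted(travel_times_min)
    # 95th percentile: with n=20 quantiles, the last cut point is the 95th
    p95 = statistics.quantiles(tt, n=20, method="inclusive")[-1]
    mean = statistics.fmean(tt)
    return {
        "mean": round(mean, 2),
        "p95": round(p95, 2),
        "buffer_index": round((p95 - mean) / mean, 2),
        "planning_time_index": round(p95 / free_flow_min, 2),
    }

# Hypothetical probe-vehicle sample: most runs near 12 min,
# two incident-delayed outliers that drive the reliability metrics
sample = [11, 12, 12, 13, 12, 14, 12, 13, 25, 12, 13, 12, 12, 22, 13]
print(reliability_metrics(sample, free_flow_min=10.0))
```

A handful of delayed runs leaves the mean nearly unchanged but inflates the 95th percentile, which is why reliability reporting requires continuous speed or probe data rather than modeled averages.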

Level 1: Transitioning

The point of departure for most state DOTs, as determined by the survey performed by this project, was a situation in which SO&M strategy applications were becoming somewhat standardized, but on an ad hoc basis. At this level, SO&M is recognized as an issue. Individuals are charged with certain specific projects or activities at the project level. The process is siloed and hero driven. This state of play is illustrated by many of the transitioning transportation agencies. What is often missing are the formalization and documentation of plans and procedures, the full range of professional capacities that provide for institutionalizing good practices, standardization of systems, and performance monitoring, all of which provide the basis for improving program effectiveness.

Level 2: Mature

SO&M is recognized as an agency activity and is beginning to be managed as a program. Business processes are being developed and standardized. Capabilities are being developed at the unit level, but are program-unstable and dependent on particular staff. This state of play is illustrated by many of the "mature" state DOTs and other transportation agencies. What is missing at this level is often statewide coverage and consistency, a clearly understood and appropriate project development process, full integration into programwide resource allocation decisions, and the use of outcome measures to support the appropriately expanded role of SO&M within the agency's overall mobility improvement portfolio.

Level 3: Integrated

Level 3 provides a best practice-defined benchmark for each of the business processes (often defined by current best practice or analogies in other industries). At this level, SO&M is established as a program with predictable outcomes. Activities are developed using standard processes (e.g., planning, systems engineering, project development, budgeting), and effectiveness is measured and used to support a program of continuous process improvement. Best practices are installed and measured consistently within the program framework that characterizes the mature states described.

Table 5.2 presents SO&M processes in an operations maturity framework at the conceptual level used in the model, illustrating the four key aspects of business processes, the levels of maturity, and the general descriptions of each level. As part of the determination of institutional architecture, a further breakdown of the business processes into specific activities was developed. Table 5.3 further defines the criteria that distinguish among the three levels of capability. These criteria indicate the capabilities at each level and suggest the types of actions that must be taken to advance to the next level of capability, such as documentation, training, and performance measurement. Although process levels are not a direct focus of the guidance, the definitions of process levels provided the basis for determining the necessary institutional characteristics for each aspect needed to move processes up in level.

Table 5.2. The Process Maturity Framework Used in the Model at a Conceptual Level

Level 1: Transitioning -- Getting organized: unique ad hoc activities at project level; siloed; hero driven.
  Program scoping: Narrow and opportunistic
  Technical processes: Informal, undocumented
  Technology and systems development: Project oriented; qualitative evaluation
  Performance measurement: Outputs reported

Level 2: Mature -- Developing methods and processes: capabilities developed at the strategy application level, but unintegrated.
  Program scoping: Needs based and standardized
  Technical processes: Planned, mainstreamed
  Technology and systems development: Implementation using a rational process for evaluating and prioritizing
  Performance measurement: Outcomes used

Level 3: Integrated -- Best practice integrated, documented and measured consistently within program framework.
  Program scoping: Full-range core program
  Technical processes: Integrated, documented
  Technology and systems development: Standardized, cost-effective systems/platforms
  Performance measurement: Performance accountability
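The conceptual framework of Table 5.2 lends itself to a simple self-assessment sketch. The dimension names below follow the table's four key aspects; the convention that overall maturity equals the minimum rating across aspects is a common capability-maturity-model assumption adopted here for illustration, not something the report prescribes:

```python
from enum import IntEnum

class Level(IntEnum):
    TRANSITIONING = 1  # ad hoc, siloed, hero driven
    MATURE = 2         # developed at strategy-application level, unintegrated
    INTEGRATED = 3     # best practice, documented, measured in program framework

# The four key aspects of business processes from Table 5.2
ASPECTS = (
    "program_scoping",
    "technical_processes",
    "technology_and_systems_development",
    "performance_measurement",
)

def overall_maturity(ratings: dict) -> Level:
    """Overall process maturity, taken as the minimum rating across the
    four aspects (assumed convention: a program is only as mature as its
    weakest business process)."""
    missing = [a for a in ASPECTS if a not in ratings]
    if missing:
        raise ValueError(f"unrated aspects: {missing}")
    return Level(min(ratings[a] for a in ASPECTS))

# Hypothetical agency: performance measurement lags the other aspects
agency = {
    "program_scoping": Level.MATURE,
    "technical_processes": Level.MATURE,
    "technology_and_systems_development": Level.MATURE,
    "performance_measurement": Level.TRANSITIONING,
}
print(overall_maturity(agency).name)  # the weakest aspect governs
```

The min-across-dimensions rule mirrors the report's observation that a single undeveloped process (here, performance measurement) prevents an agency from managing SO&M as a mature program.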

Table 5.3. The Criteria for Process Maturity

Program scoping
  Level 1 (Transitioning) -- Narrow and opportunistic:
    - Mission vague; ad hoc operations activities based on regional initiatives, with limited central office support
    - Narrow/ITS-project based; low-hanging fruit
    - Absence of statewide service standards
  Level 2 (Mature) -- Needs based and standardized:
    - Operations business case made as needed; mobility-based multistrategy application program
    - Consistent statewide strategy applications related to specific problems, desired outcomes by function, geography, network
  Level 3 (Integrated) -- Full-range core program:
    - Full-staged program of synergizing functionalities
    - Operations as key trade-off investment with other improvements in terms of mobility management
    - Program extended to lower jurisdictions

Technical processes
  Level 1 (Transitioning) -- Informal, undocumented:
    - Projects/issues handled on firefight basis with only modest formal regional/district planning (but no standard template)
    - Minimal concepts of operations and architecture; procedures ad hoc/no consistency; National Incident Management procedures compliance
    - No/limited documentation
  Level 2 (Mature) -- Planned:
    - Strategic planning and budgeting of staged improvements, including maintenance and construction implications ("exercising the plan")
    - Concepts of operations and related processes developed, including major communications structure
    - Procedures and protocols fully exploit systems
  Level 3 (Integrated) -- Integrated and documented:
    - Integrated operations-related planning, budgeting, staffing, deployment, and maintenance, both within operations and with statewide and metropolitan area planning
    - Full documentation of key concepts of operations, procedures, and protocols

Technology and systems development
  Level 1 (Transitioning) -- Qualitative, opportunistic:
    - Technologies selected at project level ("big bang")
    - Limited understanding of operating platform needs
    - Mixed data items
    - Lack of appropriate procurement process
  Level 2 (Mature) -- Evaluated platforms:
    - Basic stable technology for existing strategy applications, evaluated on qualitative basis
    - Identification of standardized, statewide interoperable/integrated operating platforms and related procurement procedures
  Level 3 (Integrated) -- Standardized, interoperable systems:
    - Systematic evaluation/application of best available technology/procedure combinations, with incremental evolution
    - Standard technology platforms developed/maintained
    - Assets inventoried
    - Continuity of operations plans in place

Performance measurement
  Level 1 (Transitioning) -- Outputs reported:
    - Concept of continuous improvement absent
    - Projects lack objectives: measurement of outputs only, with limited analysis/remediation
    - Output measures reported
    - Limited after-action analysis
  Level 2 (Mature) -- Outcomes used:
    - Procedures exercised
    - Outcome measures developed and used for improvement
    - Outcome measures reported
  Level 3 (Integrated) -- Performance accountability:
    - Continuous improvement perspective adopted (requires intra- and interagency after-action analysis)
    - Accountability and benchmarking at unit and agency level via regular outcome performance reporting, internal and public