
PART II: Research Report

CONTENTS

Chapter 1: Introduction
Chapter 2: Guidance Development Process
Chapter 3: Literature Review
Chapter 4: Stakeholder Interviews
Chapter 5: Pilot Testing
Resources Reviewed

CHAPTER 1: Introduction

1.1 Research Objectives

State departments of transportation (DOTs) have made steady progress in the use of data and information systems to inform transportation asset management (TAM) decision-making. Advances in data acquisition, management, and reporting tools and technologies are enabling more automated, efficient, and integrated flows of data across systems and more agile and effective ways of delivering information to end users.

The objective of NCHRP Project 08-115 was to develop guidance to assist DOTs in advancing their use of data and information systems in TAM practices. This guidance was intended to provide a means for DOTs to benchmark their current practices and to select and prioritize improvements. DOTs have strong incentives to use this guidance to advance their practices: they face growing expectations from the public, increasing demand for transparency and accountability, and pressures to reduce staff and improve internal efficiencies.

The research was designed to explicitly recognize that meaningful change cannot be enacted simply through the procurement of new tools and technologies. DOTs must also consider the institutional challenges they face and the organizational practices needed to overcome those challenges. For this reason, the guidance developed through this research covers strategies and practices for managing the organizational change associated with making data and information system-related improvements.

1.2 Research Tasks

NCHRP Project 08-115 was accomplished through a series of 12 tasks and subtasks, summarized as follows:

• Task 1. Kickoff Meeting
  – Prepare briefing materials and participate in a 60- to 90-minute kickoff web conference with the NCHRP project panel to discuss the Work Plan, technical approach, schedule, and research product review procedures; and
  – Prepare a memorandum documenting the discussions and key decisions made.
• Task 2. Practice Review
  – Conduct a critical review of the literature, current practices, and near-market technology related to TAM data acquisition, management, and use;
  – Include examples of applications of emerging data collection technologies, use of private and crowd-sourced data, data governance and management, data integration (e.g., across asset data repositories, analytical tools, road-network systems, engineering design systems, and projects), and information delivery methods;

  – Include practices that are used among domestic DOTs, international transportation agencies, and other industries whose practices may be adaptable to DOTs; and
  – Prepare a technical memorandum presenting the results of the task, including (a) an annotated bibliography; (b) a summary of current practices, challenges, and opportunities identified in the sources reviewed; (c) a list of asset management data collection, analysis, communication, and decision support techniques and associated success factors as identified in the sources reviewed; and (d) an annotated list of the emerging technologies and practices that are likely to become available within 2 to 5 years.
• Task 3. Interim Report 1
  – Describe an organizing practice-maturity level framework for the guidance to be developed in this research that will be useful for supporting DOTs' data-driven asset management analyses and decision-making;
  – Within this framework, catalog the practices documented in Task 2;
  – Develop a preliminary guidebook outline suitable for print- and digital-format presentation of the guidebook;
  – Prepare a draft Interim Report 1 presenting the organizing framework and guidebook outline and incorporating the results of Task 2;
  – Prepare a preliminary plan for the stakeholder consultation to be conducted as Task 4;
  – Meet with the NCHRP project panel via web conference to discuss the draft Interim Report 1 and the Task 4 plan; and
  – Document the discussions and key decisions from the meeting, and make revisions as appropriate to produce a revised Interim Report 1 and Task 4 plan.
• Task 4. Stakeholder Consultation
  – Provide Interim Report 1 to 25 to 30 stakeholders who have been recruited to represent the target audience for the guidance and to review the document;
  – Conduct five structured online focus group discussions, with each group comprising 5 to 6 individuals;
  – Based on the focus group discussions, modify the Interim Report 1 content as appropriate to enhance the guidebook's effectiveness; and
  – Document the focus groups' key findings and planned revisions of the organizing framework and guidebook outline in a technical memorandum provided to the NCHRP project panel.
• Task 5. Practices Benchmark Review
  – Working within the practice-maturity level framework established in previous tasks, establish criteria for the guidebook to assist users as they benchmark an agency's state of practice;
  – Draft copy describing self-assessment tools and formats for presenting current and aspirational state-of-practice level descriptions, gap analysis, and an improvement roadmap;
  – Compile a candidate set of application vignettes to illustrate the application of the benchmark assessment and improvement planning process; and
  – Present the results of Task 5 in a technical memorandum to the NCHRP project panel.
• Task 6. Interim Report 2
  – Prepare a draft Interim Report 2 presenting the results of Tasks 4 and 5 and describing the institutional resources and workforce capabilities required to support each benchmark level of practice;
  – Meet with the NCHRP project panel via web conference to discuss the draft Interim Report 2; and
  – Document the discussion and key decisions from the web conference and make revisions as appropriate to produce a final Interim Report 2.
• Task 7. Implementation Strategy
  – Define improvement strategies for advancing the level of practice within an agency or organizational unit;

  – Describe the general scope of typical improvement projects and the likely ranges of resource requirements, key stakeholders, and leadership necessary for successful implementation;
  – Describe tools for practice-level self-assessment and improved benchmark descriptions; and
  – Present the results of this task to the NCHRP project panel in a technical memorandum suitable for use in Task 8, Pilot Testing, and include a plan for the testing.
• Task 8. Pilot Testing
  – In consultation with the NCHRP project panel, identify and recruit two agencies to test the draft guidance developed in the preceding tasks;
  – Provide support as needed to observe and assess the effectiveness of each agency's application of the self-assessment, benchmarking, and improvement strategy development framework, tools, and methods; and
  – Prepare a technical memorandum documenting the pilot testing, the test agencies' comments on the value of the guidance and the results of the pilot testing, and suggested improvements to the guidance content and format.
• Task 9. Interim Report 3
  – Prepare a draft Interim Report 3 presenting the guidance content developed in the preceding tasks, including practice-maturity levels, assessment tools, improvement strategies, a roadmap development process, capabilities self-assessment checklists, and detailed outlines for incorporating this content into print and digital versions of the guidebook.
• Task 10. Panel Meeting
  – Prepare presentation materials summarizing the content of draft Interim Report 3 and participate in a 1-day meeting with the NCHRP project panel at an NCHRP facility to discuss content and formats for the guidebook; and
  – Document the discussions and key decisions of the meeting in a brief memorandum.
• Task 11. Final Guidebook
  – Prepare the final draft of the guidebook in print and digital formats;
  – In consultation with the NCHRP project panel, identify the site and servers that will host the digital version of the guidebook, and prepare the digital version to be compatible with the site and its servers; and
  – Solicit NCHRP review of both versions of the final draft guidebook, respond to comments received, and prepare the final guidebook in print and digital versions.
• Task 12. Project Report and Technical Memorandum on Implementation of Research Findings
  – Prepare a technical report summarizing the methodology and results of the research, a set of presentation slides to introduce the project and guidebook to potential users, and a memorandum on actions the NCHRP and others can take to encourage dissemination and adoption of the guidebook;
  – Solicit review from the NCHRP project panel;
  – Respond to comments received; and
  – Prepare the final guidebook in print and digital versions.

1.3 Report Overview

This report documents the key findings and recommendations of NCHRP Project 08-115, "Guidebook for Data and Information Systems for Transportation Asset Management." The primary product of the research, the final guidebook, is provided as Part I of this report. The guidebook also documents the companion digital tool that was developed through this project.

The balance of the research report is presented in four chapters, as follows:
• Chapter 2 describes the guidance development process;
• Chapter 3 presents the results of the literature review and synthesis of current practices, challenges and opportunities, data management techniques, and success factors;

• Chapter 4 documents the results of the stakeholder interviews; and
• Chapter 5 documents the results of and feedback from the pilot-testing process.

A list of the resources consulted in the literature review appears after Chapter 5, and Part III of this report collects the appendices that supplement both the guidebook and this research report. In Part III, Appendix J provides additional details about the literature review in the form of an annotated bibliography, and Appendix K presents a research implementation plan, including actions for dissemination and implementation of the research products.

CHAPTER 2: Guidance Development Process

2.1 Overview

The guidebook was developed through the following process:
• Literature Review: A literature review was conducted to provide an effective starting point for the research project.
• Initial Content: The project team then created an initial technical framework, guidebook content outline, and sample content based on the conclusions of the literature review and on input from the NCHRP project panel.
• Stakeholder Interviews: A series of stakeholder interviews was held to gather feedback on the proposed guidebook framework, organization, and sample content; to identify specific DOT successes and challenges in the use of data and information systems for TAM; and to examine DOT motivations for using the proposed guidebook materials.
• Revised Content: A revised framework and technical content (including assessment elements, practice benchmarks, and potential improvements) were developed based on key findings from the stakeholder interviews and on the guidance and input provided by the NCHRP project panel. These materials were developed to support DOT pilot testing.
• Pilot Testing: Pilot tests with two DOTs were organized to apply the developed framework and detailed content, assess the effectiveness of the guidebook content, and gather input from the DOTs.
• Draft Guidebook Development: A final framework, outline, and sample content were created based on the pilot-testing feedback and the input of the project panel.
• Digital Tool Development: A digital tool (eventually named the TAM Data Assistant) was created to support the assessment, improvement identification, and improvement evaluation activities proposed within the guidebook.
• Panel Meeting: A full-day, in-person project panel meeting was organized to review the revised guidebook content and supporting digital tool. The meeting included exercises during which the panel was able to use the digital tool and provide input on its functionality and features.
• Draft Final Research Products: A draft final guidebook and supporting digital tool were provided for final panel review.
• Final Research Products: Based on feedback received from the final panel review, final versions of the guidebook and digital tool were developed.

Detailed results of the literature review, the stakeholder interviews, and the pilot testing are provided in Chapters 3, 4, and 5. The remainder of this chapter:
• Highlights the key findings of the literature review and stakeholder interviews;
• Discusses how these findings influenced the guidance development;
• Reviews feedback from the project panel and pilot testing; and
• Discusses how the guidance was revised based on the feedback.

2.2 Literature Review

The project started with a comprehensive literature review to document the current state of the practices being assessed and to identify resources that could be leveraged in developing the research products. The literature review identified:
• A range of tools and technologies supporting data collection, analysis, and reporting;
• Several data management, integration, institutional, and organizational practices that are available to DOTs to advance their asset management programs;
• Technical and institutional barriers to the advancement of data and information systems supporting asset management;
• Common motivations that drive DOTs to seek continuous improvement of their asset management programs and services; and
• Existing management tools that are available to help DOTs assess their existing capabilities related to asset management and data management.

The literature review found that successful implementation of data and information systems for TAM requires three ingredients. First, agencies must deploy core tools and technologies such as asset management systems, data collection technologies, and reporting and mapping tools. Second, agencies must implement data management and governance practices to integrate their data and produce reliable information from raw data. Third, agencies must employ change-management, knowledge-management, and other organizational practices to adapt their business processes and workforces to take full advantage of new tools and technologies for TAM.

This three-level model of TAM data practices is illustrated in Figure II-1. The figure represents TAM data practices as a series of three concentric circles. The innermost circle represents the core tools and technologies that are used to create or collect information. The next layer represents the information management and integration practices (e.g., data standards) that facilitate the consistent use and application of collected information. The outer layer represents the organizational and institutional practices that are necessary for the successful implementation of the tools, technologies, and information management practices highlighted by the inner layers.

Figure II-1. Layers of TAM data practices.
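As a rough, hypothetical illustration of the layered model (not an artifact of the research), the three layers and representative practices named in the text can be captured in a simple ordered structure:

```python
# Hypothetical sketch: the three layers of TAM data practices from Figure II-1,
# ordered from the innermost circle outward, with representative practices
# drawn from the surrounding text. Illustrative only.
TAM_DATA_PRACTICE_LAYERS = [
    ("Core tools and technologies",
     ["asset management systems", "data collection technologies",
      "reporting and mapping tools"]),
    ("Information management and integration practices",
     ["data standards", "data governance", "data integration"]),
    ("Organizational and institutional practices",
     ["change management", "knowledge management",
      "workforce and business process adaptation"]),
]

# Print the model from the innermost layer outward.
for layer_name, example_practices in TAM_DATA_PRACTICE_LAYERS:
    print(f"{layer_name}: {', '.join(example_practices)}")
```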

2.3 Initial Framework

Based on the literature review and the experience of the research team, an initial conceptual framework for the guidebook was developed. As shown in Figure II-2, the initial framework proposed to identify specific TAM motivations that would serve as the entry point to the guidebook. From these motivations, the user would be led to specific technical areas (originally called the "organizational model") in the guidance materials. Within these technical areas, the user would complete self-assessment, gap analysis, and improvement roadmapping activities.

Figure II-2. Initial framework.

The technical areas of the guidebook would be defined by the technical framework and would be organized around five stages in the data life-cycle. Each of these five stages would be further broken down into sub-elements for more targeted discussion and evaluation. Figure II-3 depicts the five data life-cycle areas proposed within the initial framework:
• Specify and Standardize Data;
• Collect Data;
• Store, Integrate, and Access Data;
• Analyze Data; and
• Act as Informed by Data.

Figure II-3. Data life-cycle as proposed for guidebook.

The initial framework also included a draft outline for the guidebook (see Figure II-4). The outline was organized around a proposed series of seven steps that a user would take to identify gaps and plan improvements:
1. Overview: Reviewing the purpose and intended audience for the guidance, this step would highlight the process of identifying improvement needs, conducting an assessment, identifying gaps, creating a roadmap for improvement, and planning and implementing specific improvement actions;
2. Improvement Motivation: Oriented to a DOT asset manager or executive, this step would recognize their motivation to improve asset management systems, processes, or outcomes; recognize the possibility of data or information system improvements; and recommend stakeholders to engage in the assessment and improvement roadmap development;
3. Assessment: Based on the motivations of the DOT asset manager or executive, this step would guide the assessment team in conducting an initial, high-level assessment of the current state of practice and identifying the desired future state;
4. Gap Analysis: This step would discuss how initial assessment results are used to complete a high-level gap analysis and would identify various technologies, tools, and practices that could be improved upon or implemented to deliver the desired state;
5. Roadmap: This step would present the reader with options from which an improvement roadmap can be developed;
6. Action Planning: In this step, for each improvement action, the guidance would provide a detailed list of recommended actions, internal and external stakeholders for engagement, general time and budget considerations, organizational and institutional requirements, and short vignettes highlighting practical examples or case studies to provide context for the development of a detailed action plan; and
7. Implementation: In the final step, general support would be provided to assist the reader in engaging the DOT in implementing the improvement roadmap.

Figure II-4. Initial guidebook outline.

This framework served as a starting point for the initial panel input and stakeholder engagement activities.

2.4 Stakeholder Interviews

The research team sought to validate and refine the initial framework through a series of stakeholder interviews that were organized as focus group meetings on selected aspects of the framework. Each focus group session was 1 hour in length and began with a brief introduction of the research effort, the guidebook to be developed, and the proposed framework. After this introduction, the research team solicited feedback on the proposed framework and then engaged focus group participants in a discussion of agency successes and challenges related to the introduction of new data, information management systems, decision support tools, or related information management practices for TAM. To ensure meaningful discussion, the focus groups were organized into four topic areas and were limited to small groups (targeting 5 to 6 participants).

The research team recognized that participation from a broad range of DOT professionals would be necessary to ensure that comprehensive input was provided. Targeted DOT professionals included:
• Asset managers and TAM program leadership;
• Information technology managers and technical leads;

• Business intelligence or data warehouse managers, as well as workforce, risk, and knowledge managers; and
• Executives involved in TAM and information technology decision-making.

The research team developed and executed a strategy to engage staff from DOTs across the nation, resulting in the successful execution of the five planned focus groups: one addressing each topic area and a fifth focusing on multiple perspectives within a single DOT. The focus groups are discussed in further detail in Chapter 4.

2.4.1 Focus Group Organization

As shown in Figure II-5, the focus group discussions were organized, executed, and documented to address four topics: asset data collection, information access and transparency, managing condition and performance, and efficient project delivery.

Figure II-5. Focus group topic areas.

The fifth focus group involved multiple participants from a single DOT, the Louisiana Department of Transportation and Development (Louisiana DOTD). This group addressed all four topic areas, ensuring input from a fuller, cross-functional perspective. Through these discussions, the research team collected input on the proposed framework. A summary of key findings in each topic area is presented in the next section.

2.4.2 Summary of Feedback

Feedback from the focus group participants indicated that the proposed framework, organization, and anticipated content were comprehensive and meaningful. No significant changes were recommended to the general framework and organization, but key findings related to DOT challenges and needs were identified in each of the proposed data life-cycle areas. These findings are discussed in further detail in Chapter 4.

2.5 Revised Guidance

Based on the feedback from the focus groups, the research team refined the framework and drafted a set of assessment materials for each of the five data life-cycle areas that would be suitable for pilot testing. The following materials were created:
• Assessment benchmarks, consisting of a comprehensive set of assessment elements and practice benchmarks for different levels of advancement;

• Assessment response templates to facilitate assessment responses and improvement selection;
• Assessment summary and improvement recommendations, organized by assessment area and section, summarizing performance and providing potential improvement recommendations; and
• Organizational practices, consisting of descriptions of four types of organizational practices that are useful in supporting the development of agency expertise, coordination, and change management.

2.5.1 Summary of Framework Changes

Table II-1 compares the initial recommended framework with the streamlined framework. All elements of the original framework were represented within the final concept; however, the organization was simplified to allow for more detailed and more comprehensive assessment elements without creating an undue burden on users of the guidance.

2.6 Pilot Testing: Process and Outcomes

2.6.1 Background

In consultation with the NCHRP project panel, the Connecticut and Utah DOTs were identified for participation in pilot-testing activities. Before their selection, each DOT had been asked to identify a particular motivation and focus for its participation. This section provides a summary of each DOT's focus and motivation for participation and lists the pilot-test participants and meeting dates.

2.6.1.1 Connecticut DOT

The Connecticut DOT was interested in exploring the data and information systems associated with its guiderail asset management programs and activities. The agency identified opportunities to capture and integrate guiderail-related data both from design files, through ongoing CADD-to-GIS integration activities, and from maintenance activities, through an anticipated Maintenance Management System implementation. Connecticut DOT participants included representatives from the agency's asset management, guiderail, civil integrated management (CIM), and CADD programs.

2.6.1.2 Utah DOT

The Utah DOT sought to evaluate how data and information-related improvements could be developed to support the management of the agency's pavement striping assets, which currently have limited statewide programs for data-informed decision-making. The Utah DOT hoped to use this guidance to advance striping data and information systems to support the management of pavement striping as a "Tier 1 asset," meaning that life-cycle planning, performance measurement, and performance targeting would be incorporated into the asset's statewide asset management program.

The Utah DOT pilot testing included participation from the agency's asset management, maintenance, operations, traffic operations, safety, performance and process improvement, data, technology, and analytics programs.

2.6.2 Process

Pilot testing was conducted through a series of online meetings independently organized with each of the two participating DOTs. The process involved (1) application of the self-assessment, benchmarking, and improvement strategy framework, content, and methodology that had been developed through prior research activities and (2) examination of the value of the guidance, its results, and suggested improvements from the perspective of the target DOT audience.

Table II-1. Comparison of proposed guidance areas (revised vs. initial framework).

Entry Point to Guidance
• Revised framework: Assessment Areas (Specify and Standardize Data; Collect Data; Store, Integrate, and Access Data; Analyze Data; Act as Informed by Data).
• Initial framework: TAM Motivations (Efficient Data Collection; Improve Information Access; Better Manage System Condition and/or Performance; Transparency; Efficient Project Delivery).
• Rationale for change (significantly streamlined): In the initial framework, TAM motivations had an indirect relationship to the proposed organizational areas, and the discussion of motivations added unnecessary complexity given that the data life-cycle areas were already relatable to the target audience.

Organizational Hierarchy
• Revised framework (relabeled): Assessment Areas, Assessment Sections, Assessment Elements, and Practice Benchmarks.
• Initial framework (multiple "areas"): Organizational Areas, Assessment Areas, Assessment Elements, and Practice Benchmarks.
• Rationale for change (added clarity): The category "Organizational Areas" was removed to avoid confusion with "Assessment Areas."

Assessment Summary and Improvement Recommendations
• Revised framework (section level): Assessment results are interpreted by section; general and detailed improvement recommendations are provided by ranges of practice benchmark levels.
• Initial framework (element level): Assessment results are interpreted by element; current and desired states of practice are used to identify improvements by element.
• Rationale for change (significantly streamlined): The revised structure allows for more assessment elements without creating an undue burden on the user, and improvements that address multiple elements no longer create overlapping improvement recommendations.

Technical Library
• Revised framework (integrated): Improvement references are embedded within the assessment summary and improvement recommendations and are easily reviewed in aggregate to determine the best options.
• Initial framework (cross-referenced): Each element (by performance range) generates potential improvements and associated technical library entries; individual review of multiple technical entries is required.
• Rationale for change (significantly streamlined): The revised framework reduces the complexity of cross-referencing and summarizing many discrete improvement recommendations. These revisions are supported by feedback from the panel and stakeholder consultations indicating that the primary challenges are organizational, not technical.
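The Organizational Hierarchy row of Table II-1 implies a simple nesting of assessment areas, sections, elements, and practice benchmarks. The following is a minimal, hypothetical sketch of that nesting; the class and field names are illustrative only and are not taken from the guidebook or the TAM Data Assistant, and the 0-to-4 benchmark scale reflects the range descriptions given later in Table II-2.

```python
# Hypothetical data model for the revised framework's hierarchy:
# Assessment Areas -> Assessment Sections -> Assessment Elements ->
# Practice Benchmarks. Illustrative only.
from dataclasses import dataclass, field
from typing import List


@dataclass
class PracticeBenchmark:
    level: int        # position on the assumed 0-4 benchmark scale
    description: str  # what practice looks like at this level


@dataclass
class AssessmentElement:
    name: str
    benchmarks: List[PracticeBenchmark] = field(default_factory=list)


@dataclass
class AssessmentSection:
    name: str
    elements: List[AssessmentElement] = field(default_factory=list)


@dataclass
class AssessmentArea:
    name: str
    sections: List[AssessmentSection] = field(default_factory=list)


# The five assessment areas correspond to the data life-cycle stages.
AREAS = [
    AssessmentArea("Specify and Standardize Data"),
    AssessmentArea("Collect Data"),
    AssessmentArea("Store, Integrate, and Access Data"),
    AssessmentArea("Analyze Data"),
    AssessmentArea("Act as Informed by Data"),
]
```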

To complete a comprehensive assessment and evaluate the developed improvement recommendations, the pilot testing involved six 90-minute meetings with each DOT:
• Initial Kickoff (Meeting 1): The research team learned about the DOT's motivation and context for use of the guidance, explained the anticipated pilot-testing process, and introduced the guidance framework and sample materials to the participants;
• Self-Assessment (Meetings 2 and 3): DOT participants used the assessment benchmark and response template materials to evaluate the DOT's current state of practice and identify areas for potential improvement;
• Improvement Selection (Meetings 4 and 5): DOT participants reviewed the assessment summary and improvement recommendation materials, providing feedback on the presentation of the materials and their adequacy for practical implementation; and
• Closeout (Meeting 6): Outcomes from the self-assessment and improvement selections were discussed, including a detailed review of the improvement roadmap, organizational capability, use case/vignette examples, and general guidance and support content. The DOT participants were asked to share their views about what worked and what could be improved in the current framework.

2.6.3 Outcomes

The pilot testing yielded critical feedback and improvement recommendations. The balance of this section presents the key inputs that were gathered and developed during the pilot tests. A more detailed summary of the pilot tests is presented in Chapter 5.

2.6.3.1 Assessment Elements and Benchmarks

Generally speaking, participants found the assessment elements and benchmarks to be comprehensive and meaningful. Beyond simple adjustments to add clarity to specific language, most recommendations were to provide supporting examples to help DOTs understand key concepts, themes, or contexts that are likely to arise within various elements or benchmark practice levels.

The final organization of the assessment elements is shown in Figure II-6.

Figure II-6. Final organization of assessment areas, sections, and elements.

For each assessment element, five distinct benchmark levels were defined. DOTs conducting an assessment would identify the benchmark level that best describes the agency's current practices (for the selected assessment scope) and the target benchmark level they hope to achieve. The benchmark levels are described in general terms in Table II-2.

Table II-2. Benchmark practice levels: range descriptions.

• Initial Steps (practice level from 0 to less than 1): DOTs operating within this practice range lack significant, formal processes or structures in the area of practice. What practices may exist are characterized by ad hoc, informal application, as management has not yet set or endorsed a specific practice or policy.
• Incremental Improvement (practice level from 1 to less than 2): DOTs operating within this practice range are beginning to see formalization of processes and structures in the area of practice; however, roles may not be recognized or well known, and the resources to execute these duties may not be in place.
• Advanced Practice (practice level from 2 to less than 3): DOTs operating within this practice range are performing at or above the standard of their peers. In most aspects of these programs, formal structures, roles, and responsibilities are in place, and the necessary staff time and resources are dedicated to the practice.
• Top Performing (practice level from 3 to 4): DOTs operating within this practice range are performing at the very top and are leading examples among their peers. Formal structures, roles, and responsibilities are in place and are integrated into regular business practices in a manner that ensures constant evaluation and improvement as business and technology evolve.

2.6.3.2 Assessment Summary and Improvement Recommendation Materials

Participants felt that the improvement descriptions provided enough detail for DOT consideration. However, challenges were noted in connecting improvements (which were presented at the section level) to individual assessment elements and practice benchmarking results (which were presented at the element level). In the revised framework, improvements were associated with each assessment element rather than presented at the section level.

2.6.3.3 Response Templates

Section-level response templates were intended to capture assessment outcomes and selected improvements. Pilot-test participants found the assessment rating and note taking to be effective; however, it was difficult to connect the assessment results to the improvement selection. In the revised framework, the templates integrated the assessment and improvement information on the same page.


2.6.3.4 Implementation Support Materials

The DOT participants expressed that detailed implementation plans would not be a practical outcome of the assessment process. They recommended simplifying the process by focusing on the evaluation of selected improvements to support executive communication. This evaluation would consider priorities for improvement, assess the relative effort of implementation versus its impact, and acknowledge implementation challenges.

In the revised framework, detailed improvement recommendations were provided and organized by element and practice range. Each detailed improvement included an overview of the time, resources, expertise, coordination, and change requirements involved in making the improvement. If available, an additional reference was also highlighted where further detail regarding the potential improvement could be found.

2.6.3.5 General Guidebook Materials

Participants expressed that the guidebook should be made as simple as possible but acknowledged that some complexity was unavoidable given the desire to cover any collection of assets and the entire data life-cycle. It was agreed that strong introductory materials were needed to explain the framework and establish a clear understanding of how to use it.

The concern about complexity reinforced the importance and value of a digital tool that could streamline the process and workflow for conducting an assessment, allowing users to easily focus on specific areas or aspects of interest. Participants were enthusiastic about a supporting digital tool that would simplify the assessment process and streamline improvement selection and evaluation relative to the pen-and-paper approach.

Participants also noted that they envisioned the typical use case for the guidebook to be improvement within an individual asset program. They indicated that a single, comprehensive evaluation of the full TAM program would likely be too complex because of fundamental differences across assets and the typically wide variation in the current and desired states of the data, systems, and tools used in different asset programs.
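To illustrate the evaluation mechanics described above, the following is a minimal sketch of how a numeric practice level might be mapped to the benchmark ranges of Table II-2, how the gap between current and target levels might be computed, and how selected improvements might be ranked by impact relative to implementation effort, as recommended in Section 2.6.3.4. The function names and the 1-to-5 impact and effort scales are assumptions made for illustration; this is not code from the TAM Data Assistant.

```python
# Hypothetical sketch of the assessment and evaluation mechanics.
# Benchmark range labels follow Table II-2.

def benchmark_range(practice_level: float) -> str:
    """Map a 0-4 practice level to its Table II-2 range label."""
    if not 0 <= practice_level <= 4:
        raise ValueError("practice level must be between 0 and 4")
    if practice_level < 1:
        return "Initial Steps"
    if practice_level < 2:
        return "Incremental Improvement"
    if practice_level < 3:
        return "Advanced Practice"
    return "Top Performing"


def rank_improvements(improvements):
    """Order candidate improvements by assessed impact relative to
    implementation effort (assumed 1-5 scales), highest payoff first."""
    return sorted(improvements,
                  key=lambda imp: imp["impact"] / imp["effort"],
                  reverse=True)


# Example: an element assessed at 1.5 with a target level of 3.0.
current_level, target_level = 1.5, 3.0
print(benchmark_range(current_level))  # Incremental Improvement
print(benchmark_range(target_level))   # Top Performing
print(target_level - current_level)    # gap of 1.5 benchmark levels

# Rank two hypothetical candidate improvements for executive review.
candidates = [
    {"name": "Adopt a data collection quality management plan",
     "impact": 4, "effort": 2},
    {"name": "Implement enterprise data integration",
     "impact": 5, "effort": 5},
]
for improvement in rank_improvements(candidates):
    print(improvement["name"])
```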

2.7 Final Guidebook Organization and Content

Extensive updates were made to the initial guidebook materials in response to the pilot testing. The revised guidebook organization, shown in Figure II-7, included the following chapters:
• An initial Introduction and Overview chapter to outline the guidebook materials and content, as well as its intended purpose and process of use;
• A Pre-Assessment Preparation chapter to cover the steps needed to prepare for the assessment process, including establishing a scope, enlisting participation, and setting expectations;
• A Self-Assessment and Improvement Identification chapter to provide guidance for conducting the assessment and identifying improvements;
• An Evaluation and Summary chapter to provide guidance on evaluating the identified improvements and presenting improvement recommendations for executive consideration;
• An Implementation Support chapter to provide references to supporting materials on organizational practices and case studies; and
• Case studies, detailed in an appendix, to provide examples of how specific DOTs were able to develop and deliver data and information system-related improvements to their TAM programs.

Figure II-7. Adjusted guidebook content.

The case studies addressed a variety of topics across all five data life-cycle areas, as follows:
• Area A: Specify and Standardize Data, specifically in relation to establishing and applying a governance framework;
• Area B: Collect Data, specifically in relation to statewide vehicle-based data collection, data collection quality management plan implementation, and mobile field data collection implementation;
• Area C: Store, Integrate, and Access Data, specifically in relation to asset management system integration with CADD;
• Area D: Analyze Data, specifically in relation to reporting and business intelligence tool implementation and cross-asset resource allocation implementation; and
• Area E: Act as Informed by Data, specifically in relation to multi-objective project prioritization program implementation.

Part III, Appendix F, provides further descriptions of the organizational practices referenced in Chapter 5. Briefly, the organizational practices are:
• Strategic management, covering management practices that relate to how the organization's priorities are identified, evaluated, and planned for, including strategic planning, establishment of decision-making structures and authorities, and performance management;
• Initiative management, covering management practices that enable the organization to plan, coordinate, deliver, and improve programs, projects, and overarching initiatives, including business analysis, process improvement, change management, and program and project management;
• Talent management, covering management practices that relate to the workforce needs of the organization, including workforce planning, recruiting and retention, succession management, and training; and
• Knowledge management, covering management practices for building, leveraging, and sustaining the know-how and experience of an organization's employees, including knowledge audits, knowledge capture and codification, collaboration and communities, and knowledge transfer and learning.

2.8 Full-Day Panel Meeting

The research team conducted a full-day meeting with the project panel to review the final guidance framework, organization, technical content, and guidebook content examples, and to demonstrate the proposed digital tool. Examples of the tool's early functionality are provided in Figures II-8, II-9, and II-10.

Figure II-8. Digital tool assessment page.

Figure II-9. Digital tool assessment summary.

Figure II-10. Digital tool improvement evaluation page.

The research team walked the panel members through the full guidance assessment and improvement selection process, providing examples from the pilot testing and soliciting feedback from the panel. The panel generally agreed with the changes that had been proposed to the guidance following the pilot testing. The supporting digital tool was met with enthusiasm and, even in beta form, was found to be a more streamlined method of performing the assessment and improvement selection. The panel provided several feature suggestions and function-testing remarks.

The meeting also included a research implementation discussion. Part III, Appendix K, presents the ideas that were documented for research dissemination and implementation.

2.9 Final Guide

Based on final feedback from the project panel, the research team completed a draft final guidebook and research report. These materials were reviewed by the panel, whose comments were addressed in the final guidebook, the digital tool, and this supporting research report.

CHAPTER 3: Literature Review

3.1 Overview

The literature review identified:
• Technical (technology-related) and organizational challenges faced by DOTs seeking to advance data and information systems for TAM;
• Key technologies and supporting tools that support data-driven TAM practices;
• Information management and integration practices important to the management, analysis, sharing, and reporting of information; and
• Existing management models, self-assessment tools, and organizational frameworks that should be considered during guidebook development.

3.1.1 Challenges

Many specific challenges and unique case studies were identified in the reviewed literature. In general, the challenges identified could be categorized as technology-related or organizational.

Technology-related challenges could typically be grouped into issues related to:
• Lack of internal data standards and documentation;
• Limited formal documentation of DOT business processes and requirements;
• Greater system complexity associated with higher maturity programs;
• Difficulty of integrating legacy systems and/or migrating to an enterprise asset management system;
• Lack of system interoperability across the asset life-cycle;
• Lack of industry standards for emerging technologies and tools (such as Building Information Modeling, or BIM); and
• Data privacy and security concerns.

Organizational challenges could typically be grouped into issues related to:
• Resource limitations (funding, staffing);
• Institutional silos between various DOT programs and assets;
• Lack of readiness to change;
• Lagging workforce skills, expertise, and adoption;
• Effective integration of data governance into the business culture;
• Contracting and legal hurdles relating to new technologies;
• Industry and other external stakeholder acceptance of new approaches; and
• Political challenges.

Given these challenges, the final guidance recognizes that DOTs must tackle any substantial improvement strategy through incremental, iterative improvement actions. This approach is reflected in the form and format of the improvement recommendations provided through the developed guidance and tools.

3.1.2 Technologies

The research identified several key technologies and tools that form the backbone of data-driven asset management systems by providing powerful data storage or data collection functionality. The functionality provided by these systems is typically useful throughout the asset life-cycle, from initial planning through construction and maintenance. The manner in which these core systems are leveraged by a DOT is typically indicative of the effectiveness and maturity of the agency's programs. As such, these technologies will often form the foundation of a DOT improvement roadmap. These key technologies are described in Part III, Appendix J, Section J.1.

Additional supporting technologies identified through the literature review play a more focused role in particular areas of the asset or data life-cycle. These supporting technologies include:
• Data storage tools,
• Data collection tools, and
• Project delivery tools.

Often, these technologies will not strictly govern the overall maturity of a program; however, they may contribute significant value in targeted applications. These supporting technologies are described in Part III, Appendix J, Section J.2.

3.1.3 Information Management and Integration Practices

A wide range of practices relating to the specification, collection, storage, management, analysis, sharing, and reporting of information are currently used by DOTs to implement data-driven asset management processes. These practices are described in Part III, Appendix J, Section J.3.

3.1.4 Organizational Practices

The literature review indicated that the successful incorporation of data and information systems into the delivery of DOT asset management programs depends at least as much on the organizational and institutional practices of the agency as it does on the technology, information systems, and information management practices themselves.

A wide range of successful practices can be found in the lessons learned and case studies documented in previous research and synthesis projects. Previous research and literature also provide numerous guidance and management models that highlight the requisite and supporting organizational practices of DOTs.

Organizational practices that are relevant to the successful implementation of data and information systems for DOT asset management programs are discussed further in Part III, Appendix J, Section J.4.

3.1.5 Existing Maturity Models and Self-Assessment Tools

The literature review identified several examples of management tools that can be used to help a DOT assess the agency's existing or desired capabilities in order to improve performance. These tools are documented in Part III, Appendix J, Section J.5.

3.2 Relevant Resources

A list of the key resources examined in the literature review is provided at the end of this research report. Additional detail is provided in the form of an annotated bibliography in Part III, Appendix J, Section J.6.

Chapter 4: Stakeholder Interviews

Stakeholder interviews were conducted to introduce initial concepts for the organization of the to-be-developed guidebook to a broad group representing the target audience. Two separate approaches were used to raise national awareness of this research initiative and meet the research team's need for input from the target audience.

First, the research team worked to have this research featured on the agenda of the Asset Management Committee (ABC40) meeting at the 2019 Transportation Research Board (TRB) Annual Meeting. At this meeting, the research team made a brief presentation to share the purposes of the research and the need for informed volunteers to participate in focus groups to help shape the guidebook development. Second, a detailed email soliciting focus group participation was distributed by AASHTO's program director for planning and performance management to members of the AASHTO Subcommittee on Asset Management and the AASHTO Committee on Data Management and Analytics.

To ensure the development of a comprehensive understanding of DOT needs for this guidance, the research team engaged a limited number of participants through its professional networks. These participants were targeted in each focus group topic area based on the research team's understanding of the current practices and level of engagement of the individual or DOT in that topic area. This approach helped ensure that a fuller context of successful DOT practices was represented.

A total of four focus groups were organized, executed, and documented by the research team, addressing the topic areas of asset data collection, information access and transparency, managing condition and performance, and efficient project delivery. A fifth focus group was organized with the Louisiana DOTD to provide a more cross-functional view of the guidebook from the perspective of a single DOT looking across all the topic areas.

4.1 Focus Group Participation

4.1.1 Asset Data Collection

A 1-hour asset data collection focus group was held on May 3, 2019 (see Table II-3). This focus group was organized to better understand how DOTs are collecting asset inventory, condition, performance, and work history information, and how this information is meaningfully integrated into their TAM programs and associated business processes.

The focus group discussions covered the following topics:
• Broad network-level data collection systems and techniques (such as vehicle-mounted LiDAR collection);

• Detailed, project-level data collection practices and supporting tools (such as field-based mobile data collection); and
• Technical, procedural, and organizational challenges to developing trust in, and use of, these data across the enterprise.

Table II-3. Asset data collection focus group participants.

Participant (by Title) | Organization
Program Manager, Asset and Performance Management | South Carolina DOT
Performance Coordinator, Office of the Chief Planner | Florida DOT
Director, Office of Maintenance | Maryland State Highway Administration (SHA)
Director, Maintenance, Asset & Facility Management Division | Utah DOT
Technology Transfer Engineer | Utah DOT
Highway Data Analytics Engineer | Idaho DOT

4.1.2 Information Access and Transparency

A 1-hour information access and transparency focus group was held on April 29, 2019 (see Table II-4). This focus group was organized to better understand how DOTs are storing, integrating, reporting, and communicating information to support their TAM programs and associated business processes.

The focus group discussions covered the following topics:
• Importance of data integration across business silos;
• Needs and complexity associated with developing and maintaining an enterprise location referencing system (a minimal illustration follows Table II-4);
• Value of, as well as data management and governance challenges posed by, simple, configurable data collection and business intelligence tools that are available to DOT business units;
• Current strategies to provide mobile devices to field staff; and
• Increased focus on performance reporting and dashboards, particularly in response to federally required Transportation Asset Management Plans (TAMPs).

Table II-4. Information access and transparency focus group participants.

Participant (by Title) | Organization
Chief Information Officer | Montana DOT
Assistant Director, Data Management Services | Washington State DOT
Manager, Transportation Performance Management Bureau | New Mexico DOT
Manager, Information Management Branch | Colorado DOT
Director, IT Strategy and Portfolio Management | Texas DOT
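To make the location-referencing discussion above concrete, the following minimal sketch shows one common pattern behind an enterprise location referencing system: resolving a route-and-milepoint reference to map coordinates by interpolating along calibrated centerline vertices. The route name, vertices, and measures are hypothetical illustrations, not data from any participating DOT.

```python
from bisect import bisect_right

# Hypothetical calibrated centerline for one route: (milepoint, x, y) vertices.
# An enterprise LRS would store this per route in a GIS, with calibration
# points maintained as the network is realigned.
ROUTE_CALIBRATION = {
    "SR-1": [(0.0, 100.0, 200.0), (1.5, 180.0, 260.0), (4.0, 300.0, 310.0)],
}

def locate(route: str, milepoint: float) -> tuple[float, float]:
    """Resolve a route + milepoint reference to an (x, y) coordinate."""
    vertices = ROUTE_CALIBRATION[route]
    measures = [m for m, _, _ in vertices]
    if not measures[0] <= milepoint <= measures[-1]:
        raise ValueError(f"{milepoint} is off the calibrated range of {route}")
    i = max(bisect_right(measures, milepoint) - 1, 0)
    i = min(i, len(vertices) - 2)
    m0, x0, y0 = vertices[i]
    m1, x1, y1 = vertices[i + 1]
    t = (milepoint - m0) / (m1 - m0)  # linear interpolation between vertices
    return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))

print(locate("SR-1", 2.0))  # a point between the second and third vertices
```

The interpolation itself is simple; the complexity the participants described lies in keeping the calibration current as routes are realigned and in migrating asset records off legacy referencing methods.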

4.1.3 Managing Condition and Performance

A 1-hour managing condition and performance focus group was held on May 3, 2019 (see Table II-5). This focus group was organized to better understand how DOTs are using data and information systems to invest more effectively in their transportation assets, including network-level life-cycle management, tradeoff analysis, prioritization and treatment selection, and the connection of these activities with project-level TAM investment decisions and field decisions.

The focus group discussions covered the following topics:
• Tracking and target-setting against federal and state measures;
• Programming and project selection strategies and tools;
• Alignment of TAM analysis and communication with executive expectations, field staff needs, and statewide transportation improvement plans;
• The importance of knowledge-management and change-management processes and training;
• The inclusion of "Tier 2" and "Tier 3" assets in asset management processes and the importance of cross-asset resource allocation in an enterprise TAM program; and
• The need to recognize and emphasize the quality and sustainability of the data collection efforts that drive condition and performance management processes.

Table II-5. Managing condition and performance focus group participants.

Participant (by Title) | Organization
Director, Asset Management and Performance Bureau | Vermont DOT
Director, Performance and Process Improvement | Utah DOT
Director, Maintenance, Asset & Facility Management Division | Utah DOT
Technology Transfer Engineer | Utah DOT
Highway Data Analytics Engineer | Idaho DOT
Executive Vice President | Applied Pavement Technology, Inc.

4.1.4 Efficient Project Delivery

Table II-6. Efficient project delivery focus group participants.

Participant (by Title) | Organization
GIS Manager | Utah DOT
Principal Engineer, Architecture, Engineering, and Construction (AEC) Applications | Connecticut DOT
TAM Administrator | Iowa DOT
Project Surveyor, New Technology | Oregon DOT

A 1-hour efficient project delivery focus group was held on April 26, 2019 (see Table II-6). This focus group was organized to better understand how DOT processes for planning, designing, and delivering projects are improved through the agency's incorporation of TAM data and information systems, and to identify how project design and construction information can be effectively integrated with TAM systems and tools.

The focus group discussions covered the following topics:
• Tracking and mapping of transportation projects during design and construction;
• Integration of asset information into projects, including "BIM for Infrastructure" approaches;
• Limitations of typical project information tracked at the bid item level or activity level;
• 3D asset data collection, its value in automating and improving project development and delivery, and challenges associated with maintaining information quality and trust; and
• Data duplication, and data management and governance strategies to build trust in, and understanding of, information collected across the enterprise.

4.1.5 Louisiana DOTD Focus Group

A 1-hour Louisiana DOTD focus group was held on July 2, 2019 (see Table II-7). This focus group incorporated members of a cross-functional team within the department that oversees the agency's asset data collection and management systems.

Table II-7. Louisiana DOTD focus group participants.

Participant (by Title) | Organization
Administrator, Data Collection and Management Systems Section | Louisiana DOTD
Asset Management Engineer | Louisiana DOTD
Highway Inventory Engineer | Louisiana DOTD
Traffic Monitoring Engineer | Louisiana DOTD
Pavement Management Engineer | Louisiana DOTD
Bridge Management Engineer | Louisiana DOTD
Quality Assurance (QA) Manager for Pavement Management System (PMS) | Louisiana DOTD

4.2 Key Findings

Although each focus group was conducted independently, had different participants, and covered unique topic areas, common themes arose in discussions regarding the current challenges faced by DOTs and the desired areas of guidance to be included in the research document. The key findings developed through the focus group meetings are documented in this section, organized by data life-cycle phases.

4.2.1 Specify and Standardize Data

The focus group participants expressed a range of needs and challenges in areas relating to the specification and standardization of data, as documented in Table II-8. A common theme related to the challenges of establishing and maintaining a strong governance, data management, and change-management culture within a DOT.

Table II-8. Specify and standardize needs and challenges.

Need/Challenge | Description
Data Cataloging, Metadata | Understanding and communicating the wide range of existing data already collected: Without a strong understanding of existing data and information sources, and an effective means to communicate this information across the enterprise, duplicative data collection can and does occur.
Data Governance | Establishment and integration of an effective data governance strategy.
Base Data Standards | Interest in expanding data collection programs for assets other than pavement and bridges: Many participants noted they found it challenging to identify and agree on data needs for these assets. They wanted to avoid over-collection and to ensure that any data collection program could be sustained. It was noted that collection should be clearly aligned with existing or anticipated business processes.
Location Referencing Systems | Specifying, managing, replacing, and retiring legacy location referencing systems is an area of significant DOT investment and a potential source of, and solution to, many data integration challenges facing TAM programs.
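The data cataloging challenge in Table II-8 is easier to picture with a concrete record. The sketch below shows the kind of minimal catalog entry a business unit might publish so that other units can discover an existing data set before commissioning a duplicative collection; the field names and the example entry are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """Minimal metadata record for one asset data set (illustrative fields)."""
    dataset: str            # what was collected
    steward: str            # business unit accountable for the data
    source_system: str      # system of record
    collection_method: str  # how the data were gathered
    last_collected: str     # currency indicator (ISO date)
    coverage: str           # spatial/network extent
    tags: list[str] = field(default_factory=list)  # discovery keywords

catalog = [
    CatalogEntry(
        dataset="Guardrail inventory",
        steward="Maintenance Division",
        source_system="Enterprise asset management system",
        collection_method="Vehicle-mounted LiDAR extraction",
        last_collected="2019-06-30",
        coverage="State highway network, mainline only",
        tags=["guardrail", "safety hardware", "inventory"],
    ),
]

# Simple discovery query: has anyone already collected guardrail data?
hits = [e for e in catalog if "guardrail" in e.tags]
print(hits[0].steward, hits[0].last_collected)
```

Even a flat, tag-searchable list that names a steward and a collection date addresses the discovery and communication gap the participants described; enterprise data catalog products layer governance workflow on top of the same idea.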

4.2.2 Collect Data

All participants reported having generally strong data collection programs but struggled with the wide variety of tools supporting data collection and with overlapping data collection efforts that result in duplicated effort and require secondary rectification (see Table II-9). Most participants observed a strong correlation between their agency's mission and objectives and their data collection. Some agencies were also taking steps to build data collection into the project life-cycle by including data requirements in contract terms.

Table II-9. Collect data needs and challenges.

Need/Challenge | Description
Contract Requirements for Data Provision | Certain participants highlighted the possibility of requiring construction contractors to submit as-built plans or other data products that would include certain asset data.
Mobile Device Strategies | Many of the participants noted that, while their DOTs had provided tablets for field-based data collection and review, it was much easier to anticipate the use of smartphones for these purposes because these devices were more widely available.
Proliferation of Simple Data Collection Tools | The prevalence of simple-to-configure data collection tools and applications was seen as valuable, but also as a complicating factor for traditional data management and governance processes due to the ease with which business users can circumvent collection policies.
LiDAR | LiDAR data collection was a repeated topic of discussion. A wide range of perceptions was shared about the value of these data collection efforts, and a widely acknowledged challenge was managing the data processing and storage requirements of these collections.
Project-Level Data Collection | Most participants noted that they were at various stages in moving toward the incorporation of detailed asset information within their project files. These participants recognized that, traditionally, DOTs have managed and tracked their projects by activity or by bid item but not by specific assets.

4.2.3 Store, Integrate, and Access Data

Nearly every participant noted the presence of multiple disparate systems as a key challenge. Many participants were in the process of either procuring a new enterprise asset management system or trying to integrate existing systems.

With respect to procuring new systems, there was a desire for guidance on best-available systems and how best to address data model requirements in a holistic fashion. Agencies noted struggles with defining master data when trying to integrate information across systems with overlapping content (see Table II-10). Agencies recognized the value of, and need for, clear sources of record for all enterprise data.

Table II-10. Store, integrate, and access data needs and challenges.

Need/Challenge | Description
Breaking Down Silos | Participants expressed the need to break down traditional silos within the DOT through data and information sharing.
Cloud-Based Systems/Data Storage | A relatively new development for DOTs, this is an area of interest, with some agencies noting the ease of data integration, partner collaboration, and connection to remote devices as triggers for this interest. Cloud-based storage was identified as a possible solution to DOT data storage problems.
Dashboards | The prevalence of simple-to-configure dashboard tools and reporting applications was seen as valuable, but also as a complicating factor for traditional data management and governance processes due to the ease with which business users can circumvent these policies.
Business Intelligence Tools | Business intelligence tools of all types (from executive to detailed asset management to public-facing dashboards) are an area of significant interest. The use of business intelligence tools to examine data has frequently aided in the identification of data gaps.

4.2.4 Analyze Data

Most participants described efforts to supplement data for additional asset classes to enable cross-asset portfolio analysis (see Table II-11). Agencies faced data quality deficiencies for non-pavement and non-bridge assets and were seeking guidance on techniques to close data and quality gaps effectively and efficiently.

Table II-11. Analyze data needs and challenges.

Need/Challenge | Description
Cross-Asset Optimization | Several focus group participants suggested that a TAM program is not complete unless it includes true, cross-asset optimization and decision-making. Cross-asset optimization appears to be an area where many DOTs are heading, both for network-level resource allocation and for project selection. Additionally, DOTs are looking for means to better optimize investments in multiple assets under a single project.
Data Quality and Currency | Most participants expressed that concerns over data quality and data currency are major obstacles to the use of existing enterprise-level data collection programs. There was interest in tools, techniques, and practices to assess, improve, and communicate data quality and data currency.

4.2.5 Act as Informed by Data

Most focus group participants appeared to still be at a relatively immature state of practice with respect to confidently acting on data. The level of practice appears to be largely driven by challenges in data collection and data consistency, and by the lack of a common enterprise system or reporting environment (see Table II-12). One participant described acting on data only to realize later that the data were bad, which underscores the importance of data governance in improving data reliability.

Table II-12. Act on data needs and challenges.

Need/Challenge | Description
Asset-Based Performance Management | Several participants noted a shift from activity-based tracking of performance to asset-based tracking, with the ability to inform investment decisions by comparing the volume of investment to the value obtained.
Data Governance | Participants cited a critical need for strong data governance to achieve confidence in data completeness and quality. This is a prerequisite for being able to confidently act on data in making asset-related decisions.
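Table II-11's call for ways to "assess, improve, and communicate data quality and data currency" — and the bad-data experience noted above — can be made tangible with two simple metrics: completeness (the share of required attributes that are populated) and currency (the share of records inspected within a policy window). The records, required fields, and four-year window below are hypothetical, chosen only to illustrate the computation.

```python
from datetime import date

# Hypothetical inventory records for a non-pavement asset class (culverts).
records = [
    {"id": 1, "material": "RCP", "diameter_in": 36, "last_inspected": date(2018, 5, 1)},
    {"id": 2, "material": None,  "diameter_in": 48, "last_inspected": date(2014, 9, 3)},
    {"id": 3, "material": "CMP", "diameter_in": None, "last_inspected": None},
]
REQUIRED = ("material", "diameter_in")
CURRENCY_WINDOW_YEARS = 4  # assumed agency policy, illustrative only

def completeness(recs, fields):
    """Share of required attribute values that are populated."""
    filled = sum(r[f] is not None for r in recs for f in fields)
    return filled / (len(recs) * len(fields))

def currency(recs, as_of):
    """Share of records inspected within the currency window."""
    cutoff_days = CURRENCY_WINDOW_YEARS * 365
    fresh = sum(
        r["last_inspected"] is not None
        and (as_of - r["last_inspected"]).days <= cutoff_days
        for r in recs
    )
    return fresh / len(recs)

print(f"completeness: {completeness(records, REQUIRED):.0%}")  # 67%
print(f"currency:     {currency(records, date(2019, 7, 1)):.0%}")  # 33%
```

Published alongside the data, metrics like these give decision-makers a basis for judging whether the data are fit to act on before investment decisions are made.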

4.2.6 Other/Cross-Cutting Needs and Challenges

The focus group discussions revealed common needs and challenges in several areas that cut across the agency and were not specific to a particular tool or data need (see Table II-13). Many of these needs appeared to emanate from ongoing challenges related to staffing changes and retirements, the pace of technological change, and the lack of a common vocabulary for data and information systems, which impedes communication.

Table II-13. Other/cross-cutting needs and challenges.

Need/Challenge | Description
Business Engagement | Numerous participants in the focus group discussions noted the importance of business engagement in IT requirements development, project delivery, and deployment. Various strategies were discussed, such as the use of "Agile" project management methodologies, the involvement of both IT and business project managers in major projects, and the early engagement of power users (e.g., "Train-the-Trainer" sessions).
Custom-Coded vs. Off-the-Shelf IT Solutions | Several DOTs expressed that they were in various stages of scoping, developing, or implementing enterprise asset-management, maintenance-management, project-management, or other such systems to migrate from existing, custom-coded applications to an off-the-shelf product.
Knowledge Management | Participants noted their challenges with knowledge transfer given the high rate of turnover and retirement of key staff in their aging workforce.
Change Management | Most participants noted challenges in addressing the cultural and attitude changes required to break down silos, unite on common systems and tools, and implement new and consistent processes. Some participants noted recently initiated or soon-to-be-initiated enterprise change-management training programs.

Chapter 5: Pilot Testing

In consultation with the NCHRP project panel, the Connecticut DOT and the Utah DOT were identified for participation in pilot-testing activities. The pilot testing was intended to provide opportunities to observe and assess the effectiveness of each agency's application of previously developed guidebook content, focusing on the self-assessment, improvement selection, and improvement roadmapping framework, tools, methods, and content. Agency comments on the value of the guidance and suggested improvements to the materials were captured as part of this process.

5.1 Pilot Testing: Participation

5.1.1 Connecticut DOT

The Connecticut DOT was interested in exploring the data and information systems associated with its guiderail asset management programs and activities. Specifically, the DOT identified the capture and integration of guiderail-related data from design files through ongoing CADD-to-GIS integration activities, as well as from maintenance activities through an anticipated Maintenance Management System implementation.

The Connecticut DOT participants included representatives from the agency's asset management, guiderail, civil integrated management (CIM), and CADD programs. By program area, the participants included:
• Asset management (1 participant),
• AEC applications (1 participant),
• Guiderail (2 participants),
• CIM/AEC applications lead (1 participant), and
• Civil applications and CADD.

The six meetings of the pilot test took place on the following dates:
• Initial Kickoff Meeting, October 7, 2019;
• Self-Assessment Meeting 1, October 11, 2019;
• Self-Assessment Meeting 2, October 24, 2019;
• Improvement Selection Meeting 1, October 28, 2019;
• Improvement Selection Meeting 2, November 6, 2019; and
• Closeout Meeting, November 15, 2019.

5.1.2 Utah DOT

The Utah DOT was interested in data and information-related improvements to support the management of the agency's pavement striping assets.

These assets currently have limited statewide programs for data-informed decision-making. The DOT wanted to use the assessment process to identify ways of advancing striping data and information systems to support pavement striping's management as a "Tier 1 asset," meaning that life-cycle planning, performance measurement, and performance targeting are incorporated into the asset's statewide asset management program.

The Utah DOT pilot test included participation from the agency's asset management, maintenance, operations, traffic operations, safety, and performance and process improvement departments, as well as the data, technology, and analytics programs. By program area, the participants included:
• Traffic operations (1 participant),
• Performance and process improvement (1 participant),
• Asset management (1 participant),
• Data, technology, and analytics (1 participant),
• Traffic and safety (1 participant),
• Maintenance (1 participant), and
• Operations (1 participant).

The six meetings of the pilot test took place on the following dates:
• Initial Kickoff Meeting, September 29, 2019;
• Self-Assessment Meeting 1, October 3, 2019;
• Self-Assessment Meeting 2, October 10, 2019;
• Improvement Selection Meeting 1, October 16, 2019;
• Improvement Selection Meeting 2, October 23, 2019; and
• Closeout Meeting, November 21, 2019.

5.2 Key Findings

5.2.1 Outcomes

The pilot-testing process resulted in critical feedback and improvement recommendations. Key findings from the pilot tests are presented in this section.

5.2.2 Assessment Materials

After a general introduction to the guidebook framework, the participants in the pilot tests completed a self-assessment of current performance against the full set of 51 assessment elements. Generally speaking, participants found the assessment elements and benchmarks to be comprehensive and meaningful. Beyond simple adjustments to add clarity using more specific language, most of the recommendations were to provide supporting examples to help with the DOTs' understanding of key concepts, themes, or contexts that are likely to be presented within various elements or benchmark practice levels.

As shown in Tables II-14 and II-15, participants recommended several adjustments to the assessment materials. Table II-15 lists specific elements for which substantial recommendations were made to rework the associated practice benchmarks or benchmark levels.

Table II-14. Assessment material recommendations: general.

Recommendation | Rationale for Change
Use "asset-specific" benchmarks. | • Eliminate broad, "enterprise-wide" practice benchmark descriptions (e.g., "the agency has an asset breakdown structure for each asset type"); • Focus the user on an asset-specific application (e.g., "the agency has established an asset breakdown structure for the asset"); and • Avoid conflicts where "most" but not "all" cases achieve a particular level.
Eliminate overlaps between benchmarks. | • Avoid confusion by strictly evaluating applicable aspects of practices. Note: Without clear delineation, users tended to evaluate themselves beyond the limits of the area under assessment. For example: a data model had been defined (Specify Data); data collection had not yet begun (Collect Data); and users rated the data modeling low because the data model had not yet been implemented as part of a data collection effort.
Add support to each element. | • Provide supporting content to ensure proper evaluation; and • Address common themes and contexts, and share examples.

Table II-15. Assessment material recommendations: element-specific.

Assessment Area | Element-Specific Rationale for Change
Area A: Specify and Standardize Data | Sections/elements targeted for benchmark reworking:
• A.1.a: Data Model Element - Relabel and focus this element entirely on the Asset Inventory Data Model.
• A.1.b: Condition Indices Element - Relabel and focus this element entirely on the Asset Condition and Performance Data Model; condition indices are a specific aspect of the element.
• A.1.c: Design Model Standards Element - Revise Benchmark Level 4 to differentiate it from Benchmark Level 3.
• A.1.d: Location Referencing Element - Avoid benchmarks related to "level of implementation."
• A.2.a: Treatments and Work Data Model Element - The current Benchmark Level 4 is more consistent with a Benchmark Level 3.
• A.2.b: Treatment and Work Location Referencing Element - Avoid benchmarks related to "level of implementation."
• A.4: Metadata Section - All associated elements apply more to an enterprise-wide perspective than to the individual asset level.
• A.5: Governance Section - All associated elements apply more to an enterprise-wide perspective than to the individual asset level.

Area B: Collect Data | Sections/elements targeted for benchmark reworking:
• B.2: Project Information Section - Clarify what "type" of project data collection is the focus of this section. DOTs often develop/store/collect contract-level information and details relating to contractor payment; the focus should be on the asset work accomplishment information that must be field-collected/validated during acceptance inspection.
• B.2.b: Project Information Automation Element - Avoid duplicative reference to "integration"-type benchmarks.
• B.3: Maintenance Information Section - Clarify what "type" of data collection is the focus of this section. DOTs often develop/store/collect contract-level information and details relating to contractor payment; the focus should be on the asset work accomplishment information that must be field-collected/validated during acceptance inspection.
• B.3.b: Maintenance Information Automation Element - Avoid duplicative reference to "integration"-type benchmarks.
• B.4: Priority Criteria and Values Section - Change the focus of element benchmarks from frequency of collection to what is collected.
• B.4.b: Decision-Maker Values Element - This element was particularly confusing; revise benchmarks to bring clarity.

Area C: Store, Integrate, and Access Data | Sections/elements targeted for benchmark reworking:
• C.2: Asset Life-Cycle Data Integration Workflows Section - This section was understood in concept but difficult in application. Adjust benchmarks of all elements to focus on "what" data is integrated through the workflow, as opposed to the extent to which the workflow is implemented, and to make clearer what types of information are expected for exchange at various life-cycle stages.
• C.3: Other Data Integration Workflows Section - This section was understood in concept but difficult in application. Adjust benchmarks of all elements to focus on "what" data is integrated through the workflow, as opposed to the extent to which the workflow is implemented, and to make clearer what types of information are expected for exchange in various external data areas.

Area D: Analyze Data | No major recommendations in this area.

Area E: Act as Informed by Data | Sections/elements targeted for benchmark reworking:
• E.2.a: Data-Driven Project Planning and Scoping - Adjust Benchmark Level 4 to focus and clarify the project scope template concept.
• E.3.a: Infrastructure Maintenance - The current focus is explicitly preventive maintenance; should this be expanded?
• E.3.b: Equipment Maintenance - This element generated confusion and may require support for when/how it applies, given that equipment maintenance is often considered separately from asset management at a DOT.

5.2.3 Assessment Summary and Improvement Recommendation Materials

After the initial assessment, an assessment summary and potential improvement recommendations were presented for each assessment section and benchmark (practice) level.

The participants in the pilot tests naturally moved directly to the improvement materials and did not spend much time considering the general assessment summary materials. Additionally, there were challenges connecting the improvements (which were presented at the section level) to individual assessment elements and practice benchmarking results (which were presented at the element level). Once this connection was made, most of the improvements were found to stand well on their own and provided enough detail for DOT consideration.

Table II-16 summarizes the recommended adjustments to the improvement recommendation materials.

5.2.4 Response Templates

To support the process of assessment and improvement selection, section-level response templates were developed. These templates described each element in the section and included places for a user of the guidance to record assessed practice levels and associated notes. Additional space was provided where specific improvements and associated notes could be identified during improvement selection.

Participants in the pilot tests found the assessment rating and note-taking to be effective; however, it was difficult to connect the assessment results to the improvement selections. The participants felt that providing assessment and improvement information within the template would improve the process. Additionally, the need to reference separate materials (e.g., assessment benchmark or assessment summary and potential improvement materials) was also a barrier to easy use.

Table II-17 summarizes the recommendations to improve the response templates.

Table II-16. Assessment summary and improvement recommendation material recommendations.

Recommendation | Rationale for Change
Provide "element-level" improvements. | • Simplify association of potential improvements with assessment results. • Eliminate the need to identify average or minimum practice levels within a section. • Support simultaneous presentation and completion of self-assessment and improvement selection. • Ensure improvements are defined for all elements at all practice levels.
Remove "assessment summary" details. | • Simplify association of potential improvements with assessment results. • Eliminate the need to identify average or minimum practice levels within a section. • Support simultaneous presentation and completion of self-assessment and improvement selection.
Remove "improvement requirements" details. | • Simplify "improvement selection." • Eliminate confusion where pre-populated Low/Medium/High values do not align with the DOT's specific context. Note: These categorizations of implementation effort would be most useful if they could be defined on a case-by-case basis by the user during the improvement evaluation.
Re-phrase as "actions." | • Clarify the action recommended to the user by beginning the improvement with a verb (e.g., "create a plan," "document a business case").
Add contextual examples. | • Explain unfamiliar concepts (e.g., "asset breakdown structures"). Note: Provide examples from several perspectives (e.g., for two differing assets).
Ensure that all improvements are unique. | • Support user recognition of differences in actions between various practice levels or elements. • Avoid repetition (even if only small adjustments addressing minor differences). • Eliminate concerns raised by repeated content.

Table II-17. Response template recommendations.

Recommendation | Rationale for Change
Provide "element-level" response templates. | • Simplify the association of potential improvements with assessment results. • Eliminate the need to identify average or minimum practice levels within a section. • Support the simultaneous presentation and completion of self-assessment and improvement selections.
Add "benchmark levels." | • Eliminate the need for additional reference material during the assessment process.
Add "potential improvements." | • Eliminate the need for additional reference material during the improvement selection process.
Recognize both current and desired states. | • Clearly identify gaps and provide a basis for a compelling assessment summary. • Focus the user on elements where improvement is desired. • Establish the extent of desired improvement.
Add checkboxes to standardize/streamline the process. | • Eliminate the need to transcribe improvement selections. • Generate clearer outcomes. • Standardize the content for digital tools.
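A rough sketch of what Table II-17's element-level response template implies in data terms: each element carries a current and a desired benchmark level on the guidebook's 1-4 practice scale, and the improvement-focus list falls directly out of the comparison. The element names and levels below are invented for illustration, not drawn from either pilot agency's results.

```python
# Illustrative element-level responses: (element, current level, desired level).
# Benchmark levels follow the guidebook's 1-4 practice scale.
responses = [
    ("A.1.a Asset Inventory Data Model", 2, 4),
    ("A.1.d Location Referencing",       3, 3),
    ("B.2 Project Information",          1, 3),
    ("C.2 Life-Cycle Data Integration",  1, 2),
]

# Elements where improvement is desired, largest gap first.
gaps = sorted(
    ((e, cur, des, des - cur) for e, cur, des in responses if des > cur),
    key=lambda row: row[3],
    reverse=True,
)
for element, cur, des, gap in gaps:
    print(f"{element}: level {cur} -> {des} (gap {gap})")
```

Recording both states per element, as the participants recommended, is what makes this gap list (and the summary charts discussed below) fall out automatically rather than requiring a separate transcription step.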

5.2.5 Implementation Support Materials

During the closeout meetings, participants reviewed the roadmapping and organizational practice content in detail. The DOT participants expressed that detailed implementation plans would not be a practical outcome of the assessment process; significant stakeholder engagement activities and institutional processes are required to finalize implementation plans. Instead, implementation support can focus on (1) categorizing selected improvements into immediate, short-term, and long-term actions; (2) assessing the relative impacts and efforts required to accomplish various improvement actions; and (3) supporting the identification of general strategies to help overcome organizational obstacles to improvement projects.

Overall, the participants recommended simplifying the process of producing implementation plans and focusing on materials suitable for presenting to executives to gain support (as opposed to detailed plan development). Table II-18 summarizes the recommended improvements to the improvement support materials.

Table II-18. Improvement support material recommendations.

Recommendation | Rationale for Change
List selected improvements. | • Generate a list of improvements. Generating a list was confirmed as valuable (despite recommendations to eliminate detailed improvement roadmapping).
Group improvements into immediate, short-term, and long-term actions. | • Simplify the improvement roadmapping process. • Eliminate the need for a time-consuming ranking process. • Support efficient communication to executives or other stakeholders.
Simplify "improvement requirements" to "key challenges." | • Simplify the process. • Given that assigning Low/Medium/High values to each of the five aspects (time, resources, expertise, coordination, and change) is cumbersome, focus on what adds value: key challenges are important, whereas showing that four of five aspects have "low" needs is not. • Add simple checkboxes or pick lists to identify key challenges (when they exist).
Consider improvement impact versus effort. | • Support efficient communication. • Allow users to create an overall "impact vs. effort" chart for comparison of selected improvements. • Allow users to select impact and effort; do not pre-populate these values.
No need to directly link improvements to organizational practices. | • Avoid adding unnecessary complexity. • Instead, provide a strong connection at the guidance level (clearly connecting general organizational challenges to each organizational practice).
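Two of the recommendations in Table II-18 — grouping improvements by action horizon and charting user-assigned impact against effort — are straightforward to prototype. The sketch below produces the kind of "impact vs. effort" comparison chart the participants described; the improvements, scores, and horizons are hypothetical, and matplotlib is assumed to be available.

```python
import matplotlib.pyplot as plt

# Hypothetical selected improvements: (name, impact 1-5, effort 1-5, horizon).
# Impact and effort are user-assigned, per the pilot feedback (not pre-populated).
improvements = [
    ("Document guardrail data model",   4, 2, "immediate"),
    ("Stand up a data catalog",         5, 4, "short-term"),
    ("Integrate CADD-to-GIS workflow",  5, 5, "long-term"),
    ("Add QA checks to LiDAR pipeline", 3, 2, "short-term"),
]

fig, ax = plt.subplots()
for name, impact, effort, horizon in improvements:
    ax.scatter(effort, impact)
    ax.annotate(f"{name} ({horizon})", (effort, impact),
                textcoords="offset points", xytext=(5, 3))
ax.set_xlabel("Effort (user-assigned, 1-5)")
ax.set_ylabel("Impact (user-assigned, 1-5)")
ax.set_title("Selected improvements: impact vs. effort")
plt.savefig("impact_vs_effort.png")
```

High-impact, low-effort items in the upper left of such a chart are natural candidates for the "immediate" action group when presenting to executives.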

5.2.6 General Guidebook Materials

During the closeout meetings, participants discussed the general guidebook content and the process of using the guidebook. The DOT participants were asked to share recommendations based on their experience in the pilot test. The general sentiment was that the guidebook could be simplified to an extent, but it would (necessarily) remain complex. Therefore, strong introductory materials would need to be provided, and clear expectations would need to be set in advance of initial use. This feedback reinforced the value of a digital tool, which could streamline the process and workflow for users while also targeting content to the specific areas or aspects of user interest.

Participants also identified the typical use case for the guidebook as improvement within an individual asset program. They expressed that a single, comprehensive evaluation of the full TAM program would be too complex because of the fundamental differences across assets and the typically wide-ranging current and desired states of the data, systems, and tools used in various asset programs. To address this issue, the guidebook was adjusted to recommend an asset-specific focus, or that TAM program-wide evaluations be focused within a single assessment area (e.g., Area A: Specify and Standardize Data). An introductory chapter of the guidebook supports appropriate scoping based on these intended use cases.

Table II-19 summarizes the recommended improvements to the guidebook content.

Table II-19. General guidebook material recommendations.

Recommendation | Rationale for Change
Use cases/vignettes. | • Support understanding during assessment and improvement selection. • Broader, high-level case studies are useful, but simpler examples are also needed.
Adjust introductory materials. | • Clearly explain the guidance framework and the process for using the guidance up front. • Recommend an expert facilitator to guide stakeholders through the process. • Identify recommended participants in the process (include business and IT). Note: The process is seen as best not targeted to typical field staff. • Provide a slimmed-down, simplified, area- or section-level assessment tool to facilitate quick engagement of decision-makers and to identify potential value or motivation for the full process.
Adjust output materials. | Focus on executive-level engagement tools such as: • "Radar"/"spider web" charts showing current and desired states for the various elements; • "Impact versus effort" charts showing selected improvements; • Lists of selected improvements, grouped into immediate, short-term, and long-term actions and key challenges; and • Details provided in completed, element-level response templates and notes.
Adjust other materials. | • Individual improvements do not require a direct connection to the various organizational practices; instead, provide a strong connection at the guidance level, connecting general organizational challenges to improvement with specific organizational capabilities and practices. • Provide section- or element-level context for each element (e.g., provide context during assessment and improvement selection, and explain common approaches or themes in advancement).
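The "radar"/"spider web" chart of current versus desired state recommended in Table II-19 is also easy to mock up. The sketch below plots illustrative area-level scores for the guidebook's five assessment areas on the 1-4 practice scale; the scores are invented for the example, and numpy/matplotlib availability is assumed.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative area-level scores (guidebook areas A-E), on the 1-4 scale.
areas   = ["Specify", "Collect", "Store/Integrate", "Analyze", "Act"]
current = [2, 3, 1, 2, 1]
desired = [3, 3, 3, 3, 2]

# Evenly spaced angles; repeat the first point to close each polygon.
angles = np.linspace(0, 2 * np.pi, len(areas), endpoint=False).tolist()
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for label, scores in (("current", current), ("desired", desired)):
    vals = scores + scores[:1]
    ax.plot(angles, vals, label=label)
    ax.fill(angles, vals, alpha=0.15)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(areas)
ax.set_ylim(0, 4)
ax.legend(loc="upper right")
plt.savefig("assessment_radar.png")
```

The visible gap between the two polygons is exactly the executive-level summary the participants asked for: where the agency is, where it wants to be, and how far apart those states are.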

Resources Reviewed

The project team prepared an annotated review of the following resources, which appears in Part III, Appendix I:

1. Spy Pond Partners and Iteris, Inc. (2015). NCHRP Report 814: Data to Support Transportation Agency Business Needs: A Self-Assessment Guide. Transportation Research Board, Washington, D.C. Available at: http://www.trb.org/Main/Blurbs/173470.aspx.
2. AASHTO (2015). Transportation Asset Management Gap Analysis Tool. Available at: https://www.tam-portal.com/resource/aashto-transportation-asset-management-gap-analysis-tool-users-guide/.
3. Secrest, C., K. Schneweis, and G. Yarbrough (2011). Final Report of NCHRP Project 08-36/Task 100, "Transportation Data Self-Assessment Guide." Available at: http://onlinepubs.trb.org/onlinepubs/nchrp/docs/NCHRP08-36(100)_FR.pdf.
4. Gharaibeh, N., I. Oti, D. Schrank, and J. Zmud (2017). NCHRP Synthesis 508: Data Management and Governance Practices. Transportation Research Board, Washington, D.C. Available at: http://www.trb.org/NCHRP/Blurbs/176005.aspx.
5. Draft research from NCHRP Project 03-128, "Business Intelligence Techniques for Transportation Agency Decision-Making." [Ed. Note: Information on this currently active project is available at: https://apps.trb.org/cmsfeed/TRBNetProjectDisplay.asp?ProjectID=4352.]
6. Spy Pond Partners, LLC, Transcend Spatial Solutions, LLC, and James P. Hall (2015). NCHRP Report 800: Successful Practices in GIS-Based Asset Management. Transportation Research Board of the National Academies, Washington, D.C. Available at: http://www.trb.org/Publications/Blurbs/172204.aspx.
7. Harrison, F. D., W. Duke, J. Eldred, M. Pack, N. Ivanov, J. Crosset, and L. Chan (2019). NCHRP Research Report 920: Management and Use of Data for Transportation Performance Management: Guide for Practitioners. Transportation Research Board, Washington, D.C. Available at: http://www.trb.org/Main/Blurbs/179095.aspx.
8. FHWA (2018). TPM Implementation Guidebook (the Transportation Performance Management Technical Assistance Program Guidebook). Available at: https://www.tpmtools.org/guidebook/.
9. Cambridge Systematics, Inc., Boston Strategies International, Inc., Gordon Proctor and Associates, and M. J. Markow (2010). NCHRP Report 666: Target-Setting Methods and Data Management to Support Performance-Based Resource Allocation by Transportation Agencies: Volume I: Research Report, Volume II: Guide for Target-Setting and Data Management. Transportation Research Board of the National Academies, Washington, D.C. Available at: http://www.trb.org/Publications/Blurbs/164178.aspx.
10. Cambridge Systematics (2011). NCHRP Report 706: Uses of Risk Management and Data Management to Support Target-Setting for Performance-Based Resource Allocation by Transportation Agencies. Transportation Research Board of the National Academies, Washington, D.C. Available at: http://www.trb.org/Publications/Blurbs/166250.aspx.
11. Maggiore, M., K. M. Ford, High Street Consulting Group, and Burns & McDonnell (2015). NCHRP Report 806: Guide to Cross-Asset Resource Allocation and the Impact on Transportation System Performance. Transportation Research Board of the National Academies, Washington, D.C. Available at: http://www.trb.org/Publications/Blurbs/172356.aspx.
12. Adam, J., et al. (2015). Advances in Civil Integrated Management. NCHRP Project 20-68A, Scan 13-02, Final Scan Team Report. Available at: http://onlinepubs.trb.org/onlinepubs/nchrp/docs/NCHRP20-68A_13-02.pdf.
13. O'Brien, W. J., B. Sankaran, F. L. Leite, N. Khwaja, P. Goodrum, K. Molenaar, G. Nevett, and J. Johnson (2016). NCHRP Report 831: Civil Integrated Management (CIM) for Departments of Transportation, Volume 1: Guidebook and Volume 2: Research Report. Transportation Research Board, Washington, D.C. Available at: http://www.trb.org/Main/Blurbs/174318.aspx.

14. Austroads (2018). Guide to Asset Management Processes, Part 9: Asset Information Management Systems and Data. Available at: https://austroads.com.au/publications/asset-management/agam09.
15. Austroads (2018). Guide to Asset Management Processes, Part 10: Asset Management Implementation and Improvement. Available at: https://austroads.com.au/publications/asset-management/agam10.
16. Austroads (2011). Harmonization of Location Referencing for Related Data Collection. Available at: https://austroads.com.au/publications/asset-management/ap-t190-11.
17. McCuen, T. L., and D. M. Pittenger (2016). ACRP Synthesis 70: Building Information Modeling for Airports. Transportation Research Board, Washington, D.C. Available at: http://www.trb.org/main/blurbs/174386.aspx.
18. McCuen, T. L., and D. M. Pittenger (2015). "BIM for Airports, ACRP – A Synthesis of Airport Practice." PowerPoint presentation. Available at: http://onlinepubs.trb.org/Onlinepubs/acrp/acrp_syn_070.pptx.
19. BS 8536 Briefing for Design and Construction – Part 1: Code of Practice for Facilities Management (Buildings Infrastructure). Available at: https://shop.bsigroup.com/ProductDetail/?pid=000000000030315621 (requires purchase).
20. BS 8536 Briefing for Design and Construction – Part 2: Code of Practice for Asset Management (Linear and Geographic Infrastructure). Available at: https://shop.bsigroup.com/ProductDetail?pid=000000000030333121 (requires purchase).
21. Integrating 3D Digital Models into Asset Management (2018). Draft report on FHWA research shared with the NCHRP project team.
22. Identifying Data Frameworks and Governance for Establishing Future CIM Standards (2018). Draft report on FHWA research shared with the NCHRP project team.
23. "Identifying Data Frameworks and Governance for Establishing Future CIM Standards" (2018). Overview briefing on FHWA research shared with the NCHRP project team.
24. Jackson, P. (2018). Infrastructure Asset Managers BIM Requirements - TR 1010: Delivering the Information 'Asset Managers' Need and Can Trust Using openBIM™. Available at: https://buildingsmart-1xbd3ajdayi.netdna-ssl.com/wp-content/uploads/2018/01/18-01-09-AM-TR1010.pdf.
