
P A R T I

Guidebook

C O N T E N T S

Chapter 1: Introduction
Chapter 2: Pre-Assessment Preparation
Chapter 3: Self-Assessment and Improvement Identification
Chapter 4: Evaluation and Summary of Results
Chapter 5: Implementation Support

C H A P T E R 1

Introduction

Transportation Asset Management (TAM) is defined by the American Association of State Highway and Transportation Officials (AASHTO) as a "strategic and systematic process of operating, maintaining, upgrading, and expanding physical assets effectively throughout their life cycle. It focuses on business and engineering practices for resource allocation and utilization, with the objective of better decision-making based upon quality information and well-defined objectives." This chapter provides an overview of the guidebook and presents its purpose, scope, and organization.

Background

State departments of transportation (DOTs) have made steady progress in the use of data and information systems to manage transportation assets. Advances in data acquisition, management, and reporting tools and technologies are enabling more automated, efficient, and integrated flows of data across systems and more agile and effective ways of meeting end users' information needs. DOTs have strong incentives to take advantage of these advances: they face growing expectations from the public, increasing demand for transparency and accountability, and the challenge of making the best use of limited resources to deliver value.

This guidebook provides step-by-step techniques and a digital tool to:
• Assess current practice and establish a desired state,
• Identify and evaluate data- and information system-related improvements, and
• Secure agency support for improvements and plan an implementation strategy.

Effectively and sustainably advancing how information is managed, shared, and used within and across organizations poses institutional challenges that require much more than procuring new tools and technologies. In recognition of these challenges, this guidebook identifies organizational capabilities and typical strategies that will help DOTs accelerate and implement cross-functional and enterprise-wide changes to how data are collected, managed, shared, and used in their TAM-related programs.

Purpose and Scope

The purpose of this guidebook is to assist DOTs in advancing the use of data and information systems for TAM. It is intended to be used in conjunction with a companion digital tool, the TAM Data Assistant, providing a comprehensive way to benchmark agency practices and identify and evaluate improvements.

Guidebook Purpose

TAM is by nature data- and analysis-intensive. Data (and information derived from data) about asset inventory, condition, performance, and related work activities are used to inform agency strategies for maintenance, rehabilitation, and improvement. Data also inform the allocation of increasingly scarce resources.

Most transportation agencies have asset management systems in place and use a variety of systems to plan and track maintenance activities and capital projects. However, they face challenges with integrating data across systems and across the asset life-cycle. Agencies also seek to capitalize on opportunities to adopt new, emerging tools and technologies for data collection and analysis.

This guidebook provides a structured approach that agencies can use to assess current practices in the use of data and information for TAM. This approach can be applied comprehensively; it can be targeted for a particular asset, or it can focus on a particular topic area such as data collection or data integration. A companion digital tool, the TAM Data Assistant (available at www.dataassessment.tam-portal.com), can facilitate conducting the assessment, identifying improvements, and evaluating candidate improvements. This guidebook is intended to be used initially to help agencies plan and organize an assessment. It also provides supplemental resources that can help agencies with each step of the process: understanding the context for each of the assessment elements, learning about and evaluating possible improvements, and planning an implementation strategy.

Guidebook Scope

The guidebook is structured around a data life-cycle framework. This life-cycle (illustrated in Figure I-1) consists of five essential steps for making efficient and effective use of data and information for TAM. The data life-cycle approach was selected to reinforce the importance of anticipating how data will be used before collecting it. The data life-cycle can be viewed as a supply chain in which the finished product is a data-informed decision. Getting a quality product depends on sound practices for specifying data, collecting it, storing and integrating it, providing access to potential users, and having suitable analysis tools and processes in place.

Each step of the data life-cycle represents an assessment area (area) in the guidebook. For an overview of each area, see the text box titled "Technical Framework." Further details are provided in Chapters 2 and 3.

Figure I-1. Guidance framework: the data life-cycle.
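The pipeline ordering shown in Figure I-1 can be made concrete with a small sketch. The Python fragment below is purely illustrative (neither the guidebook nor the TAM Data Assistant prescribes any code); it simply encodes the five areas, lettered A through E as in Chapter 3, in their life-cycle order:

```python
from enum import Enum

class LifeCycleArea(Enum):
    """The five data life-cycle areas of Figure I-1, in pipeline order."""
    SPECIFY_AND_STANDARDIZE_DATA = "A"
    COLLECT_DATA = "B"
    STORE_INTEGRATE_AND_ACCESS_DATA = "C"
    ANALYZE_DATA = "D"
    ACT_INFORMED_BY_DATA = "E"

# Read as a supply chain: each stage consumes the output of the stage
# before it, and the finished product is a data-informed decision.
PIPELINE = list(LifeCycleArea)  # A -> B -> C -> D -> E
```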

For each area, the guidebook provides benchmark levels (modeled levels described by the research) and candidate improvements. The benchmarks describe levels of practice representing a trajectory for advancement from a nonexistent or minimal practice to a more sophisticated practice. The candidate improvements describe initiatives that agencies can pursue to move from one benchmark level to the next.

The guidebook provides two types of supplemental resources to help agencies select and plan improvements:
• Case studies, which provide examples of implementation experience; and
• Organizational practice descriptions, which highlight approaches that can be taken to overcome the very real challenges of implementing data and information improvements.

Technical Framework

Specify and Standardize Data. Supports the understanding of the needs and full costs of asset inventory, condition and performance, treatment, and work history data. Also addresses the documentation of data meaning, derivation, and quality, and the establishment of governance structures and processes and stewardship roles and responsibilities.

Collect Data. Explores TAM-related data collection processes, tools and technologies, and quality as delivered with respect to existing data standards.

Store, Integrate, and Access Data. Addresses data availability across the enterprise and the elimination of redundant and duplicative data. Specific asset life-cycle process areas are identified for data standardization and integration, as well as other data and process areas important to TAM decision-making.

Analyze Data. Examines decision-support tools, techniques, and practices that facilitate development of actionable information and insights supporting decision-making. Data exploration, reporting, visualization, and asset modeling are a focus within this area.

Act Informed by Data. Covers data-informed TAM practices, exploring asset life-cycle management through resource allocation and prioritization; project planning, scoping, and design; and maintenance decision-making.

Anticipated Uses

The guidebook is intended to be used to carry out a formal assessment and improvement planning effort. However, it can also be used as a resource for DOTs that are not ready to pursue an assessment process but are interested in what they can do to improve their practices. Formal applications of the guidance would involve selecting a focus area, forming a team, using the companion tool to carry out the assessment, and then producing a plan for improvements. Informal or individual applications could involve using the guidebook as a reference for agency managers or TAM practitioners to understand possible future directions for advancement, review case studies, and generate ideas for evaluating improvement strategies.

Intended Outcomes

Completing the full assessment process will result in:
• A shared understanding of current agency practice (e.g., as measured using benchmark levels) and a shared vision for how the agency wants to advance (e.g., as measured by targeted benchmark levels);
• A list of candidate data and information system-related improvements that could be used to close the gaps between current and target practice levels; and
• A prioritized list of improvements, created through a systematic process of evaluating likely impacts against the effort required and the implementation challenges.

Intended Audience

This guidebook is targeted at state DOT asset managers, business leads, system owners, and stewards who are interested in evaluating and improving how data and information systems are used within their TAM programs. Although state DOTs are the primary intended audience for the guidance, it is also applicable to other transportation asset owners (e.g., transit agencies). To fully realize the benefits of the assessment, other business, technical, and supporting functions should be involved in the process, including:
• Field asset management staff;
• Information technology (IT) managers;
• Business intelligence and geographic information system (GIS) managers; and
• Workforce, human resource, and organizational change-management leads.

The TAM Data Assistant

Agencies can use the TAM Data Assistant, the guidebook's companion electronic tool, to conduct the assessment. This tool supports benchmarking of an agency's current and desired practice levels, improvement identification and evaluation, and creation of results summary and communication materials. Although this guidebook contains all of the materials needed to carry out the assessment process, use of the TAM Data Assistant is strongly recommended because the tool will streamline workflow and provide summary materials useful in communicating and engaging with agency executive management or other decision-makers.

A brief overview of the TAM Data Assistant functions is provided in Chapter 2. Chapter 4 details how the TAM Data Assistant supports improvement evaluation, results summary, and executive communication. Appendix H describes how the TAM Data Assistant facilitates the complete assessment process, and Appendix I contains a quick reference guide that explains its functionality and use in greater detail.

Relationship to Other Guidance

This guidebook provides a comprehensive perspective on data and information system use in TAM; however, other related assessment and guidance products are available. These products include:

• The AASHTO Transportation Asset Management Gap Analysis Tool (part of the "TAM Portal"), developed under National Cooperative Highway Research Program (NCHRP) Project 08-90.
• Various data management self-assessment tools, developed under NCHRP Project 08-92 and available for download from TRB's report webpage for NCHRP Report 814: Data to Support Transportation Agency Business Needs: A Self-Assessment Guide.
• The FHWA Transportation Performance Management (TPM) Self-Assessment Tool.
• NCHRP Research Report 920: Management and Use of Data for Transportation Performance Management: Guide for Practitioners.

This guidebook and the TAM Data Assistant complement these existing tools by providing an in-depth look specifically at how data and information systems are applied to TAM practice.

Guidebook Organization

This guidebook is organized to help agencies step through the process of preparing for an assessment, conducting an assessment, evaluating improvements, and planning for implementation. The self-assessment and improvement identification materials are organized in a three-level hierarchy:
• Areas, representing each phase of the data life-cycle (e.g., data collection);
• Sections, representing topics relevant to a data life-cycle phase (e.g., asset inventory, condition, and performance data collection); and
• Elements, representing items for benchmarking and improvement (e.g., asset inventory, condition, and performance data quality).

The five chapters that follow constitute the remainder of Part I; the technical appendices that supplement the guidebook are provided in Part III of this report.
• Chapter 1: Introduction. This chapter has described the purpose, scope, and target audience for the guidebook and provided an overview of the assessment process.
• Chapter 2: Pre-Assessment Preparation. This chapter helps prepare an agency for conducting an assessment. It covers selecting a focus area, assigning a facilitator, and engaging the most suitable participants. Chapter 2 also reviews the steps for conducting the assessment and selecting and evaluating improvements.
• Chapter 3: Self-Assessment and Improvement Identification. This chapter provides guidance and links to resources that can be useful as participants assess current practices and select improvements. The content in Chapter 3 is organized around the five areas representing data life-cycle phases.
• Chapter 4: Evaluation and Summary of Results. This chapter provides guidance that can be used as participants evaluate candidate improvements, set priorities, and develop materials for gaining executive support for improvements.
• Chapter 5: Implementation Support. This chapter describes additional resources that can be used to support the implementation of data and information system-related improvements. Such resources include case studies of DOT practice and resources about general organizational practices (such as change management).

C H A P T E R 2

Pre-Assessment Preparation

This chapter describes activities that are needed to prepare for conducting an assessment of agency practices.

Selecting a Focus

TAM practices and context vary by DOT, asset area, and individual business function or working group. Data and information system practices also vary among and within DOTs. Selecting a focused use case is essential to developing meaningful results from the use of this guidance. Specifically:
• An asset-specific focus allows the DOT to examine data and information system practices within a given asset program, across one or more of the five data life-cycle areas of the framework.
• An area-specific focus supports DOT examination of enterprise practices impacting multiple assets within a specific area of the data life-cycle.

This section of the guidebook examines each of these use cases and their anticipated value.

Use Case Overview and Value

Many DOTs will want to use this guidance to improve a targeted asset program. In this application, a DOT can evaluate and improve how data are defined, collected, accessed, analyzed, and used in that asset program's decision-making processes. DOTs may also target a specific data life-cycle area to identify improvements benefiting the TAM program or the enterprise at large. It is not advisable to assess multiple data life-cycle areas without narrowing the focus to an individual asset program; with such a broad scope, achieving meaningful results is impractical at best.

Asset-Specific Focus

The goal of this use case is to improve outcomes or prepare for a major system or business process change within a specific asset program. This focus involves bringing together diverse, informed perspectives in a well-documented discussion of asset-related needs and possible improvements to maximize the value of the effort. Central office program management and analysts, district decision-makers, field staff, and other key stakeholders should be involved to raise awareness of key contexts and challenges faced across the program and to identify meaningful improvements. For examples of how an asset-focused effort could add value, see the text box titled "Asset-Specific Focus: Anticipated Value by Data Life-Cycle Area."

Asset-Specific Focus: Anticipated Value by Data Life-Cycle Area

Specify and Standardize Data: Standardize Data and Information Meaning and Use
• Identify areas where existing asset data standards are not serving the needs of various stakeholders;
• Examine how location referencing and design file standards are inconsistently applied between various asset systems and processes;
• Raise awareness of resource allocation and decision-making values and criteria, identifying inconsistencies between field, central office, and executive values; and
• Improve understanding of and involvement in metadata- and governance-related activities.

Collect Data: Deliver Asset Data Collection Needs, Improve Data Quality, and Generate Efficiencies
• Identify asset data-collection needs, technologies, or efficiencies within the asset program;
• Examine field-based tools and systems to collect needed project and maintenance information; and
• Capture public perception and decision-maker values to inform asset priorities and decisions.

Store, Integrate, and Access Data: Increase Data Access and Integration Within the Asset Program
• Evaluate database tools and structures to ensure that data are stored and accessed efficiently and can be integrated across various asset life-cycle systems and workflows;
• Examine other data sources (such as revenue, budget, expenditure, demand, utilization, or environmental information) as may be needed to improve asset decision-making; and
• Explore field data access needs, public data access needs, and data access security considerations.

Analyze Data: Advance Analytical and Reporting Capabilities Supporting Asset Decision-Making
• Evaluate TAM analysis capabilities, including improved tools, practices, and environments; and
• Identify methods to improve asset performance prediction, optimization, and prioritization models.

Act Informed by Data: Improve Asset Management Decision Quality and Outcomes
• Consider methods to integrate data into network-level resource allocation and prioritization decisions; project planning, scoping, and design; and infrastructure and equipment maintenance.

Area-Specific Focus

The goal of this use case is to improve data and information systems, tools, practices, and techniques within a given data life-cycle area, advancing related TAM program capabilities. To achieve the desired results, it is important to include asset management staff and business system owners, as well as IT and business support staff. For examples of how an area-specific focus could add value, see the text box titled "Area-Specific Focus: Anticipated Value by Data Life-Cycle Area."

Area-Specific Focus: Anticipated Value by Data Life-Cycle Area

Specify and Standardize Data
• Advance and standardize asset-related data models, including location referencing, resource allocation, and project design standards; and
• Define and implement enterprise metadata and governance programs.

Collect Data
• Streamline collection of asset and project data through standardized tools and multi-purpose collection programs; and
• Capture public opinion and decision-maker values to support cross-asset and/or cross-program investment prioritization.

Store, Integrate, and Access Data
• Explore cross-functional data integration initiatives and examine enterprise data and information system solutions; and
• Increase internal and external stakeholder access through TAM-related data warehouses and dashboards.

Analyze Data
• Develop advanced, cross-asset resource allocation or multi-objective project selection systems, processes, or tools; and
• Provide enterprise business intelligence and/or analysis solutions meeting TAM program needs.

Act Informed by Data
• Establish enterprise performance targeting and project prioritization programs;
• Advance data-driven, project-level design and scoping decisions; and
• Improve agency infrastructure and equipment maintenance practices.

Key Roles and Responsibilities

This section highlights the roles of key participants and the general process involved in the efficient, effective use of this guidebook. A diverse set of perspectives will be needed to examine current and desired capabilities and identify targeted improvements. A cross-functional team should be formed and led by a knowledgeable, trusted, and respected facilitator. The team's participants should be selected based on their background, their ability to constructively participate in the focused discussion, and their ability to advance the anticipated outcomes of the process. Recommended participants and their respective responsibilities are described below.

Project Sponsor

It is strongly recommended that a project sponsor be identified for any formal application of this guidance. The project sponsor should have decision-making authority, be willing to stay engaged throughout the process, and share enthusiasm for achieving the targeted improvements within the focus area. The project sponsor should:
1. Provide leadership by giving executive- or management-level endorsement and support for the assessment and the recommended improvements;
2. Select a facilitator by appointing a member of the team to organize, communicate, and manage the process and detailed activities; and
3. Champion the project by engaging with leadership and management to cultivate and sustain enthusiasm and cross-functional participation by targeted business, IT, and support units.

Assessment Facilitator

The role of the assessment facilitator is essential. The assessment facilitator organizes and leads the self-assessment, improvement identification, and improvement evaluation activities. A good candidate for this role is a team member who is organized, empathetic to the diverse perspectives of the participants, and able to command the attention and respect of the group. Ideally, this individual should be knowledgeable about the DOT asset management program and the supporting data and information systems. The facilitator should not have a particular agenda or bias concerning the outcome or conclusions of the group; their role or perspective should not be seen as inherently favoring certain assets or data areas. The ideal candidate is a program or project manager from an enterprise asset management, business process improvement, or similar program. If agency staff cannot dedicate the time necessary to prepare, facilitate, document, and summarize the results of the process, a qualified external consultant should be used.

Key responsibilities of the assessment facilitator are:
1. Assessment scoping, which involves establishing the assessment focus with the project sponsor;
2. Participant selection, accomplished by identifying and engaging targeted participants in the process;
3. Participant preparation, which involves sharing context and direction throughout the process, ensuring that expectations are clear, and ensuring that individuals are adequately prepared to participate constructively;
4. Group facilitation, which involves organizing meeting attendance and directing meeting activities; ensuring productive discussion and full participation; documenting key meeting outcomes; providing summary materials for group review and preparation in advance of future meetings or activities; and using tools like the TAM Data Assistant to capture group consensus during the assessment, improvement identification, and improvement evaluation activities;
5. Assessment leadership, provided by capturing group consensus on the current and desired state and selected improvements; documenting supporting contexts and takeaways from the assessment meetings; and delegating action items (e.g., gaps in understanding that need to be closed by targeted participants);
6. Improvement evaluation leadership, provided by reviewing practice gaps and assessment notes; considering organizational needs, challenges, and context; asking questions that support informed discussion of agency improvement priorities; preparing supporting materials (e.g., "radar" charts); capturing group consensus on improvement challenges, impact and effort, and priority; and considering when reassessment is needed to refine the assessed current or desired state, identify additional improvements, or remove previously selected improvements;
7. Results summary preparation, which involves summarizing outcomes for implementation action and presenting improvement priorities for executive endorsement and action; and
8. Implementation support, achieved by working with the project sponsor and other participants to advocate for implementation; seeking funding opportunities; and leading efforts to incorporate recommendations into the agency's technology, business, and/or process improvement plans, initiatives, and actions.

Asset Program Leads

Program leads from within the selected TAM focus area (or who rely on the data and information systems within the identified data life-cycle area) are critical participants in the process. These individuals are typically central office program management staff, project managers, analysts, or engineers who understand asset management decision-making needs from a statewide and policy perspective. They should also be able to discuss the organizational challenges posed by substantial data, information system, or business process changes. A typical team includes several such individuals, spanning key asset and/or program areas, and at least one program lead who can share executive management perspectives.

Field Asset Management Leads

These team members will typically be district asset managers, engineers, or maintenance supervisors who are involved in day-to-day field asset management decision-making and execution. These staff must be able to share the practical realities, challenges, priorities, and constraints of field asset management. A typical team should include members who can offer differing perspectives: a district management perspective is necessary, as are project-level decision-making and boots-on-the-ground field perspectives.

IT Management and Staff

The team should include key IT staff, particularly those who understand the existing technologies, applications, and priorities within the targeted area. Team members may include IT relationship managers (those engaged with or integrated with key business units or applications), system administrators, project managers, or business or technical analysts. IT staff should be prepared to share data, technology, or application-related context and perspective as business needs or capabilities are discussed. These individuals should identify technology solutions from other agency business functions that may be useful to the TAM program. During improvement evaluation, IT staff should share the technical process, challenges, and constraints anticipated when delivering IT solutions.

Data Life-Cycle Area Subject Matter Experts

Other important perspectives should be represented on the team by subject matter experts, as appropriate to the asset program or when focusing on specific data life-cycle areas and tasks. Subject matter experts can contribute to each of the five areas in the framework, as follows:
• Area A: Specify and Standardize Data: computer-aided design and drafting (CADD) and location referencing system managers and technical experts; metadata and governance leadership or staff;
• Area B: Collect Data: statewide data collection (e.g., LiDAR or image-based vehicle collection), GIS program, and/or mobile data collection program managers;
• Area C: Store, Integrate, and Access Data: data warehouse and GIS program managers and technical experts; business, data, and/or enterprise architecture staff;
• Area D: Analyze Data: business intelligence and data analysis/science program managers or staff; and
• Area E: Act Informed by Data: performance management or performance dashboard staff; capital, operations, and maintenance program budgeting staff; and/or field project and construction managers.

Recommended Preparation

This section outlines the recommended process for guidebook use and identifies the keys to success. Detailed instructions are provided in Part III, Appendix H.

Process Overview

Full, formal use of this guidance includes seven activities, some of which may involve iterative stages:
1. Initial scoping;
2. Participant engagement;
3. Process kickoff meeting;
4. Self-assessment and improvement identification meetings;
5. Improvement evaluation meetings;
6. Outcome summary and communication; and
7. Implementation support.

These activities are led by the assessment facilitator, though initial scoping should also involve the leadership of a project sponsor.

Keys to Successful Use

Facilitator preparation, participant engagement, and use of the TAM Data Assistant are strongly recommended. In particular:
• Facilitator preparation is essential to ensure an active, prepared assessment facilitator. Appendix H provides a detailed walk-through of each activity in the process, discusses sharing anticipated outcomes, and provides detailed facilitator instructions and information about uses of digital tools and supporting materials (such as sample meeting agendas and participant engagement materials).
• Participant engagement is needed to ensure that a small but representative, cross-functional group of knowledgeable and engaged individuals can share perspectives on existing TAM processes, related data and information systems, and potential improvements.
• The TAM Data Assistant is an online, digital tool that provides a streamlined workflow to create assessments; benchmark performance; select, evaluate, and prioritize improvements; and summarize and communicate outcomes. Part III, Appendix I, provides a detailed quick reference guide for users of the TAM Data Assistant.

TAM Data Assistant: Overview

The companion digital tool is available online through the AASHTO TAM Portal at www.dataassessment.tam-portal.com. Figure I-2 illustrates the workflow a user would follow in the TAM Data Assistant:
• Create Assessments. Create and customize assessments of the agency's TAM programs.
• Benchmark Performance. Benchmark current practices and the desired state for 51 individual elements.
• Select, Evaluate, and Prioritize Improvements. Select from candidate improvements to address identified practice gaps. Prioritize selected improvements based on implementation impact, effort, and challenges.
• Summarize and Communicate Outcomes. Export summary communication materials directly from the tool to engage executives, advocate for implementation priorities, and frame decision-making.

Figure I-2. Workflow (screenshots from the TAM Data Assistant).

C H A P T E R 3

Self-Assessment and Improvement Identification

This chapter provides detailed supporting materials for assessment and improvement identification. The content is organized around the five areas of the data life-cycle. Agencies are encouraged to limit the scope of the assessment to a single asset class, or to one or two data life-cycle areas for multiple assets. This approach ensures that the process does not pose an undue burden on participants while still providing a substantive look at a specific area of interest.

Self-Assessment Framework and Materials Overview

This section details the technical framework used to organize the assessment and improvement identification materials and provides area- and section-specific materials to support the assessment process. The technical framework is developed around the data life-cycle, which, for the purposes of this guidance, has been broken down into five areas. Each area is further organized into sections containing a variety of individual elements.

Area A: Specify and Standardize Data

This area supports the understanding of the needs and full costs of asset inventory, condition and performance, treatment, and work history data. It also addresses the documentation of data meaning, derivation, and quality, and the establishment of governance structures and processes and associated stewardship roles and responsibilities. Area A is subdivided into five sections that together comprise fifteen individual elements:
• A.1: Inventory, Condition, and Performance Data Standards
  – A.1.a: Asset Inventory Data Model
  – A.1.b: Asset Condition and/or Performance Data Model
  – A.1.c: Design Model Standards
  – A.1.d: Location Referencing
• A.2: Treatments and Work Data Standards
  – A.2.a: Treatments and Work Data Model
  – A.2.b: Treatments and Work Location Referencing
  – A.2.c: Process Documentation and Management
• A.3: Resource Allocation and Prioritization Standards
  – A.3.a: Prioritization Factors
  – A.3.b: Analysis Parameters
• A.4: Metadata Standards
  – A.4.a: Data Dictionary Standards and Guidelines
  – A.4.b: Dataset Metadata Standards and Guidelines
• A.5: Governance Standards
  – A.5.a: Data Stewardship
  – A.5.b: Data Standards and Guidelines Development/Adoption Processes
  – A.5.c: Data Collection Approval/Coordination Processes
  – A.5.d: Change Control (Systems and Data) Processes

Area B: Collect Data

This area explores TAM-related data collection processes and practices, tools and technologies, and quality as delivered with respect to existing data standards. Area B is subdivided into four sections that together comprise eleven individual elements:
• B.1: Inventory, Condition, and Performance Collection
  – B.1.a: Inventory, Condition, and Performance Coverage
  – B.1.b: Inventory, Condition, and Performance Automation
  – B.1.c: Inventory, Condition, and Performance Quality
• B.2: Project Information Collection
  – B.2.a: Project Information Coverage
  – B.2.b: Project Information Automation
  – B.2.c: Project Information Quality
• B.3: Maintenance Information Collection
  – B.3.a: Maintenance Information Coverage
  – B.3.b: Maintenance Information Automation
  – B.3.c: Maintenance Information Quality
• B.4: Priority Criteria and Values Collection
  – B.4.a: Public Perceptions
  – B.4.b: Decision-Maker Values

Area C: Store, Integrate, and Access Data

This area addresses data availability across the enterprise and the elimination of redundant and duplicative data. Specific asset life-cycle process areas, as well as external data and process areas, are identified for data standardization and integration to streamline business processes and improve decision-making. Area C is subdivided into four sections that together comprise fourteen individual elements:
• C.1: Databases
  – C.1.a: Efficient Storage
  – C.1.b: Database Linkages
  – C.1.c: Document Linkages
  – C.1.d: Data Storage Capacity
• C.2: Asset Life-Cycle Data Integration Workflows
  – C.2.a: Asset Management Data to Project or Work Order
  – C.2.b: Project Planning to Project Development
  – C.2.c: Project Development to Project Delivery
  – C.2.d: Project Delivery to Asset Management Data
• C.3: Other Data Integration Workflows
  – C.3.a: Financial (Revenue, Budget, and Expenditure) Data
  – C.3.b: Demand and/or Utilization Data
  – C.3.c: Environmental Data
• C.4: Data Access
  – C.4.a: Field Access to Data
  – C.4.b: Public Access to Data
  – C.4.c: Access Security

Area D: Analyze Data

This area examines decision-support tools, techniques, and practices that facilitate the development of actionable information and insights to support decision-making. Area D is subdivided into two sections that together comprise five individual elements:
• D.1: Data Exploration, Reporting, and Visualization
  – D.1.a: Analysis Environment
  – D.1.b: Analysis Practices
  – D.1.c: Analysis Tools
• D.2: Modeling
  – D.2.a: Asset Performance Prediction
  – D.2.b: Optimization/Prioritization

Area E: Act Informed by Data

This area covers data-informed TAM practices, exploring asset life-cycle management through resource allocation and prioritization; project planning, scoping, and design; and maintenance decision-making. Area E is subdivided into three sections that together comprise six individual elements:
• E.1: Resource Allocation and Prioritization
  – E.1.a: Performance Targeting
  – E.1.b: Project Prioritization
• E.2: Project Planning, Scoping, and Design
  – E.2.a: Data-Driven Project Planning and Scoping
  – E.2.b: Data-Driven Project Design
• E.3: Maintenance
  – E.3.a: Infrastructure Maintenance
  – E.3.b: Equipment Maintenance

Detailed Data Life-Cycle Framework

Figure I-3 provides a representation of the complete data life-cycle framework, comprising five areas, 18 sections, and 51 elements. The balance of this chapter shares supporting guidance and context for each area and section within this framework; this guidance is designed to supplement and support the detailed element-level practice benchmarks and potential improvement recommendations.

Assessment and Improvement Identification

The TAM Data Assistant includes descriptions of each assessment element, practice benchmarks, and associated improvements. In this report, reproducible versions of these descriptions are included as Element-Level Response Templates in Part III, Appendices A through E.

Figure I-3. The complete data life-cycle framework.

Assessment participants can use the templates to become familiar with the material before using the tool in a group setting. Before completing an assessment, participants are encouraged to review the area- and section-specific guidance materials that follow. These materials provide context and examples that will help participants understand the benchmark levels and their associated improvements. Agency staff can evaluate the current and desired state of agency practice against element-level practice benchmarks and select from potential improvements to close identified gaps.

Completing the Assessment

The following steps are recommended to complete the assessment:
1. Review the element-level response templates in Appendices A through E to become familiar with the material. For any given assessment, it is only necessary to look at those areas/sections that have been selected as relevant to that assessment.
2. Read the guidance material in this chapter for the selected areas/sections. This material helps define the scope of each portion of the assessment.
3. Work through the assessment before meeting as a team, either by using the templates in Appendices A through E or by using the TAM Data Assistant. This step will allow all members of the team to think through the material in advance.
4. Share the results of Step 3 at group assessment sessions and work toward a set of consensus results that characterize the agency's current and desired practices in relation to benchmark levels and the improvements to be considered.

Understanding Benchmark Levels

For each assessment element, the individuals completing the assessment will select two benchmark levels (ranging from 0 to 4) to represent the agency's practice levels related to that element. The first benchmark level selected represents the agency's current level of practice, and the second represents the agency's targeted level of practice. General practice-level descriptions for each benchmark level are provided in the text box titled "Benchmark Levels." For each assessment element, a more tailored set of benchmark levels has been provided in Part III, Appendices A through E.

Benchmark Levels
General Practice-Level Descriptions
0 Non-Existent: The DOT does not have any significant practices within this aspect of its business.
1 Initial Steps: DOT practices are found; however, these are characterized by ad hoc or informal application, and they are not likely to be endorsed by management.
2 Incremental Improvement: The DOT is beginning to see formalization of the processes and structures within this aspect of its business.
3 Advanced Practice: The DOT is performing at a level at or above the standard of its peers.
4 Top Performing: The DOT is a leading example of practice among its peers.
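The benchmark selections lend themselves to a simple data representation. The following is a minimal sketch (in Python) of how an element's current and target levels might be recorded and checked; the element ID and the idea of enumerating an improvement path are illustrative only, anticipating the guidance below that levels cannot be skipped.

```python
# Minimal sketch of recording benchmark levels for one assessment element.
# The element ID is a hypothetical illustration, not output of the TAM Data Assistant.
from dataclasses import dataclass

BENCHMARK_LEVELS = {
    0: "Non-Existent",
    1: "Initial Steps",
    2: "Incremental Improvement",
    3: "Advanced Practice",
    4: "Top Performing",
}

@dataclass
class ElementAssessment:
    element_id: str      # e.g., "A.1.a"
    current_level: int   # agency's current level of practice
    target_level: int    # agency's targeted level of practice

    def __post_init__(self):
        for level in (self.current_level, self.target_level):
            if level not in BENCHMARK_LEVELS:
                raise ValueError(f"Benchmark levels must be 0-4, got {level}")

    def improvement_path(self):
        """Levels to pass through, one at a time (levels cannot be skipped)."""
        return list(range(self.current_level + 1, self.target_level + 1))

# Example: an agency at Level 1 targeting Level 3 within a 2-3 year timeframe.
assessment = ElementAssessment("A.1.a", current_level=1, target_level=3)
for level in assessment.improvement_path():
    print(f"Advance to Level {level}: {BENCHMARK_LEVELS[level]}")
```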

Selecting Target Levels

Assessment teams should not assume that the target benchmark level should be set at the highest possible level (4) for every assessment element. An agency may currently be at Benchmark Level 0 for a particular element because there is no particular benefit to advancing in that area. For example, it may not be cost-effective to collect data or perform sophisticated analysis for an asset that has a very short life-cycle and accounts for a small portion of the agency's budget.

The benchmark levels have been defined in a way that assumes the agency must pass through each level to get to the next level. In other words, it is not possible to skip levels. This means that, to advance, the most reasonable target for any specific element will often be the next benchmark level up from where the agency currently is.

It is helpful to use a standard timeframe (e.g., 2–3 years) when setting target levels. The target level should be (1) beneficial for the agency to reach and (2) realistic to achieve within the target timeframe.

Selecting Candidate Improvements

In this guidebook, candidate improvements are suggested for each benchmark level. These improvements are designed to move an agency up from its current level to the next level. Keep in mind that not all candidate improvements will be appropriate for every agency. Agencies should feel free to tailor them to their specific situation—or create new candidate improvements that would help the agency advance from its current level to the next level.

Area and Section Guidance

The next sections of this chapter provide guidance for each assessment area and section. Transportation agencies can use this guidance to:
• Understand the definition and scope of each assessment area and section;
• Understand some of the issues and key decisions to be made when considering improvements related to each section; and
• Review conceptual examples of agency practice relevant to the section.

Response Templates

Appendices A through E provide paper-based response templates that can be used to complete self-assessment and improvement identification independently of the digital TAM Data Assistant. A paper-based approach is not recommended for full application of the guidebook; however, it can be useful for individual preparation in advance of group self-assessment discussions.

Response Templates
• Area A: Specify and Standardize Data (Appendix A)
• Area B: Collect Data (Appendix B)
• Area C: Store, Integrate, and Access Data (Appendix C)
• Area D: Analyze Data (Appendix D)
• Area E: Act as Informed by Data (Appendix E)

Area A: Specify and Standardize Data

This area deals with establishing asset, treatment, and work data standards, standard prioritization factors, metadata standards, and comprehensive governance programs.

Area Overview

Area A is organized into five distinct sections:
• A.1: Inventory, Condition, and Performance Data Standards, covering asset inventory, condition, and performance data models, as well as supporting design and location referencing standards;
• A.2: Treatments and Work Data Standards, covering asset treatment and work data models, as well as the supporting design and location referencing standards;
• A.3: Resource Allocation and Prioritization Standards, covering definition of standardized prioritization factors and analysis parameters;
• A.4: Metadata Standards, covering dataset-level and data element (data dictionary) metadata standards and guidelines; and
• A.5: Governance Standards, covering roles, responsibilities, and processes for adoption of data standards and guidelines, data change control, and data collection approval and coordination practices.

Improvements in this area are aimed at specifying data requirements to align with agency business needs, standardizing data models so that information from different systems can be integrated and aggregated for analysis and reporting, and formalizing roles and processes to ensure alignment and coordination across different stakeholders.

Section A.1: Inventory, Condition, and Performance Data Standards

Credible, reliable data begin with well-defined and understood standards. Inventory, condition, and performance data are the most important components of a data-informed TAM program. Location referencing and design standards also are essential to support integration across life-cycle systems and with other data (such as roadway use or environmental data).

Required, Recommended, and Optional Data Attributes

Agencies can establish clear data requirements by documenting comprehensive inventory, condition, and performance data models. Gather input from key stakeholders to ensure models meet business needs and are practical to collect and maintain.

Clearly identify required, recommended, and optional data attributes (see Figure I-4, and the validation sketch that follows). Required and recommended fields should be feasible to collect and maintain; optional fields may only be collected under specific circumstances. Data that cannot be reliably collected or maintained should be excluded from the data model.

Figure I-4. Examples of required, recommended, and optional inventory, condition, and performance data attributes.
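The distinction between required, recommended, and optional attributes can be made operational. The following minimal sketch, with illustrative attribute names only (an agency's actual data model would come from stakeholder input), tags each attribute and flags records that violate the model:

```python
# Sketch of a data model that tags each attribute as required, recommended,
# or optional, and flags records that violate the model. Attribute names are
# illustrative, not a prescribed standard.

SIGN_DATA_MODEL = {
    "asset_id":     "required",     # unique asset identifier
    "sign_type":    "required",     # e.g., a sign designation code
    "install_date": "recommended",  # feasible to collect, strongly encouraged
    "sheeting":     "recommended",
    "vendor":       "optional",     # collected only under specific circumstances
}

def check_record(record: dict) -> list:
    """Return data-quality findings for one inventory record."""
    findings = []
    for attr, status in SIGN_DATA_MODEL.items():
        if record.get(attr) in (None, ""):
            if status == "required":
                findings.append(f"missing required attribute: {attr}")
            elif status == "recommended":
                findings.append(f"missing recommended attribute: {attr}")
    # Attributes outside the model cannot be reliably maintained, so flag them.
    for attr in record:
        if attr not in SIGN_DATA_MODEL:
            findings.append(f"attribute not in data model: {attr}")
    return findings

print(check_record({"asset_id": "SGN-001", "sign_type": "R1-1", "legacy_code": "X"}))
```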

Minimum Data Coverage

Agencies can examine asset and performance management decision-making needs to establish clear requirements for where, when, and how often asset inventory, condition, and/or performance data will be collected (see text box for conceptual examples).

Conceptual Examples

Targeted Data Coverage Based on Asset Life-Cycle Considerations
Streamlining Pavement Marking Performance Data Collection
Certain pavement marking materials have a service life of less than 1 year. An annual retro-reflectivity performance data collection adds little value where these markings are used; therefore, these markings can be excluded from collection.

Network-Level Data Collection Requirements
Focusing Network-Level Collection
Maintaining network-level data collection to meet detailed project-level design requirements is not generally cost-effective—data do not stay accurate, due to changing field conditions as well as maintenance and project work. Focus statewide collection on meeting requirements for network-level use cases (e.g., performance management, needs analysis, investment prioritization).

In some cases, a complete network-wide collection may not be needed or may not be practical given resource constraints. In these situations, collection can be limited to what can be collected and maintained in a timely, cost-effective manner—and targeted to what is most valuable to decision-makers.

Important Terminology

The following terms are used within this section:
• Data attribute, meaning a specific piece of the data model, describing a data entity. A data attribute contains a specific fact important to the business (e.g., Bridge ID, Sign Type, Pavement Roughness, or Install Date).
• Asset breakdown structure, meaning a hierarchical model of the agency's assets, with high-level categories (e.g., traffic assets) and subcategories (e.g., traffic signals).
• Location referencing system, meaning a set of data and procedures for managing locations of geographic objects using one or more methods for specifying a location. For TAM, a location referencing system often includes a linear referencing system that specifies location as the distance along the roadway from a reference point (e.g., a county boundary or intersection).
• Component breakdown, involving models that divide complex assets into individual parts of the larger whole, such as dividing a bridge into the deck, superstructure, and substructure.
• Asset information model, as defined by Building Information Modeling (BIM) standards (ISO 19650), meaning a model that compiles the data and information related to or required for the operation of an asset.

Associated Response Templates (Appendix A)
A.1.a: Asset Inventory Data Model
A.1.b: Asset Condition and/or Performance Data Model
A.1.c: Design Model Standards
A.1.d: Location Referencing

Section A.2: Treatments and Work Data Standards

Standardized data on asset treatments and work allows agencies to coordinate improvement planning across funding programs, understand asset maintenance and rehabilitation costs, compile a unified work history for an asset, and build meaningful models of the performance of different treatments.

Required, Recommended, and Optional Data Elements

Required and recommended fields should be feasible to collect and maintain; optional fields may be collected only under specific circumstances; and data that cannot be reliably collected or maintained should be excluded from the data model (see Figure I-5).

Minimum Data Coverage

Agency staff can examine the DOT's asset management decision-making needs to establish clear requirements for the extent to which treatment or work history data will be collected (e.g., Interstate work history may be required, whereas other work may not be captured).

It is important to balance the cost of tracking treatment information against the value the data will add to decision-making. For example, it may be costly to record itemized, location-specific information about certain minor or routine maintenance activities, particularly if the needed information can be collected and tracked in a more aggregated manner.

Location-Based, Asset-Based, and Other Work Tracking Mechanisms

Asset treatment and work can be captured by work location (e.g., paving on a particular route between specific mile points) or against the asset inventory (e.g., rehabilitation of a specific bridge). In some cases, geographic or organizational-level (e.g., county-wide summary) or contract or project-level information is sufficient. For each work type, consider and incorporate the necessary level of granularity for tracking into the data model and collection requirements (see text box for conceptual examples; a simple record structure supporting these mechanisms is sketched below).

Figure I-5. Examples of required, recommended, and optional treatment and work data.
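The following minimal sketch shows one way a work-accomplishment record could support both asset-based and location-based tracking, with an aggregate fallback. Field names are illustrative assumptions, not a prescribed standard:

```python
# Sketch of a work-accomplishment record that supports both asset-based and
# location-based tracking, per the granularity decisions discussed above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class WorkRecord:
    work_type: str                      # e.g., "thin overlay", "guardrail repair"
    quantity: float
    units: str
    asset_id: Optional[str] = None      # asset-based: e.g., a specific bridge
    route: Optional[str] = None         # location-based: LRS route identifier
    from_measure: Optional[float] = None
    to_measure: Optional[float] = None

    def tracking_mechanism(self) -> str:
        if self.asset_id:
            return "asset-based"
        if self.route is not None and self.from_measure is not None:
            return "location-based"
        return "aggregate"  # e.g., county-wide or contract-level summary only

paving = WorkRecord("thin overlay", 2.4, "lane-miles",
                    route="SR-7", from_measure=12.1, to_measure=14.5)
print(paving.tracking_mechanism())  # -> location-based
```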

Conceptual Examples

Targeted Data Collection for Minor/Routine Work
Streamlined Guardrail Damage Repair Treatment Data Collection
When spot repairs to guardrail rail sections do not impact the overall guardrail system length, configuration, product type, or functional condition, detailed work or project information may not be necessary for asset management purposes. This decision can be reflected in the data model and requirements.

Confirmation of As-Built Data
Pavement Maintenance Treatment History
Planned paving activities may be subject to field adjustments (e.g., project limits may be shortened or extended, paving sections added or removed, or treatment material types or thicknesses may be modified based on prevailing field conditions or available funding). For certain activities, it may only be necessary to reflect adjustments to the work location (e.g., when patching, for which treatment details are unlikely to change). For other activities, however, detailed treatment information may need to be captured in the field to ensure the accuracy of critical information such as material types and thicknesses. Agency staff need to consider these circumstances when establishing detailed field data collection data models and requirements.

Important Terminology

The following terms are used within this section:
• Work accomplishments, meaning the type and quantity of completed work on assets (e.g., inspections, repairs, or replacements). Descriptions of work accomplishments may include other information, such as date completed, whether the work was performed by state forces or contract, resources used, and cost.
• Data exchange protocol, meaning the standard rules for data transfer between project design, delivery, and asset life-cycle management systems and/or process participants.
• Project information model, as defined by the ISO 19650 standard, meaning a model developed during project design and construction that begins as a design intent model and then evolves into a virtual construction model.

Associated Response Templates (Appendix A)
A.2.a: Treatments and Work Data Model
A.2.b: Treatments and Work Location Referencing
A.2.c: Design Model Standards

Section A.3: Resource Allocation and Prioritization Standards

Standardized prioritization factors and analysis parameters are critical to support high-level asset management decision-making and resource allocation. When standardized across the agency asset portfolio, these factors and parameters support alignment of investments with the agency's mission, goals, and objectives, and support transparency in decision-making.

Alignment with Planning

TAM resource allocation prioritization factors and analysis parameters should reflect the organizational goals, objectives, and priorities established in the agency's long-range transportation plans, state transportation improvement programs, TAM plans, and other similar or related documents. Through alignment with planning, TAM resource allocation can effectively communicate program value, priorities, and needs to better compete for limited agency funds.

Typical Prioritization Factors

An agency might prioritize asset or investment decisions in any number of ways. Conceptual examples of prioritization factors are shared in the text box.

Conceptual Examples

TAM Investment Prioritization Factors

Asset Tiers
Agencies may group assets into management tiers to support cross-asset prioritization. For example, Bridge, Pavement, and ITS assets may be prioritized over other assets.

Roadway Classification
Agencies commonly prioritize maintenance or replacement of assets on Interstate or higher functional class roadways over roadways with lower classifications.

Asset Classification
It is helpful to examine asset sub-types or classifications to identify investment priorities. For example, 4-bolt cantilever structures may be prioritized for maintenance or inspection over other types of structures due to safety concerns.

Asset Condition, Performance, or Known Deficiencies
Investment in assets may be prioritized within certain condition or performance levels. For example, traffic signals that are operating inefficiently may be prioritized for retiming, component repair or replacement, or even a full rebuild.

Asset Usage or Risk-Based Factors
High-use or high-risk assets are often a prioritized TAM investment. Examples include bridges with long detour lengths and the use of roadway departure crash rates to prioritize roadside safety hardware investment.

Analysis Parameters

The following are typical analysis parameters (a simple worked example follows this list):
• Analysis horizon, which establishes the base and future years for the analysis.
• Network or inventory, which identifies the included/excluded subsets of the asset inventory or agency network.
• Available funding, which documents current and/or projected funding constraints.
• Minimum or desired state, which sets the minimum or desired condition or performance levels that must be delivered by an optimized TAM investment strategy.
• Treatment benefits and costs, which quantify eligible investment types, their impact on asset condition, performance, or other metrics included in the analysis, and their associated costs.
• Asset deterioration models, which estimate the impacts of foregone investment on asset condition, performance, or other metrics.
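To make the parameters concrete, the following minimal sketch drives a trivial no-investment condition projection from a standardized parameter set. All values, including the linear deterioration rate, are illustrative assumptions rather than calibrated models:

```python
# Sketch showing how standardized analysis parameters might drive a simple
# condition projection. All values are illustrative assumptions.

ANALYSIS_PARAMETERS = {
    "base_year": 2024,
    "horizon_years": 10,           # analysis horizon
    "annual_funding": 5_000_000,   # available funding (constraint, unused here)
    "minimum_condition": 60.0,     # minimum desired state (0-100 index)
    "deterioration_per_year": 2.5, # simple linear deterioration model
}

def project_condition(start_condition: float, params: dict):
    """Project condition over the horizon with no investment."""
    results = []
    condition = start_condition
    for year in range(params["base_year"],
                      params["base_year"] + params["horizon_years"] + 1):
        results.append((year, round(condition, 1)))
        condition -= params["deterioration_per_year"]
    return results

for year, condition in project_condition(75.0, ANALYSIS_PARAMETERS):
    flag = ("  <- below minimum desired state"
            if condition < ANALYSIS_PARAMETERS["minimum_condition"] else "")
    print(year, condition, flag)
```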

Important Terminology

The following terms are used within this section:
• Cross-asset metrics, meaning metrics that allow for measurement and comparison of outcomes across asset programs. Cross-asset metrics typically are established based on agency goals and performance objectives. Examples include benefit, value, need backlog, safety, and operational performance.
• Investment prioritization factors, meaning factors that allow individual projects or other asset management investment opportunities to be evaluated against program goals or performance objectives for purposes of investment optimization or prioritization (see examples above).
• Analysis parameters, meaning key inputs to agency asset management or investment optimization analysis, such as asset deterioration rates, treatment condition reset values, treatment unit costs, or analysis time horizons.

Associated Response Templates (Appendix A)
A.3.a: Prioritization Factors
A.3.b: Analysis Parameters

Section A.4: Metadata Standards

Standard formats and processes for documenting data element definitions and calculations, as well as dataset-level information, ensure that data are well understood and useful to TAM staff, IT staff, and data users. Accurate, accessible metadata enables users to identify data sources and elements that are available across the enterprise and to understand their limitations.

Metadata Upkeep

Metadata upkeep is often a challenge for an agency. It is important to ensure that appropriate procedures, roles, and responsibilities are in place for adding, changing, or deleting metadata items. Additionally, the DOT must consider what metadata management tools (such as a web-based metadata repository) are needed to ensure efficiency in recording and sharing.

Data Dictionary Standards

When establishing standards for data dictionary metadata, staff can consider available national/international standards (e.g., ISO 19115 and other supplemental ISO 191xx series standards).
The following are standardized attributions for a data dictionary (a representative entry is sketched below):
• Application, system, table, and field names, which identify the IT application, system, table, and field associated with the data dictionary entry;
• Description, a meaningful description of the documented field or data element;
• Required, indicating whether the field is required during data entry or can be left unfilled;
• Field type and requirements, which capture information relating to the nature of the information being stored in the field, such as the field type, length, precision, or acceptable values;
• ID/key/uniqueness, which captures whether the field is a primary or foreign key, or is otherwise required to be unique;
• Confidentiality/sensitivity, which classifies the potential confidentiality or sensitivity of the information contained in the field (for example, if it contains personally identifiable information);
• Usage, which documents any particular context or limits to the use of the data in the field; and
• Associated business terms, which identify what business terms or concepts are represented by the field.
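A single data dictionary entry built from these attributions might look like the following minimal sketch; the application, table, and field names are hypothetical:

```python
# One representative data dictionary entry using the standardized attributions
# listed above. The application, table, and field names are hypothetical.
data_dictionary_entry = {
    "application": "BridgeInventorySystem",
    "table": "BRIDGE",
    "field": "BRIDGE_ID",
    "description": "Agency-assigned unique identifier for a bridge structure.",
    "required": True,                  # must be populated at data entry
    "field_type": "VARCHAR(15)",       # type, length, acceptable values
    "key": "primary",                  # ID/key/uniqueness
    "confidentiality": "public",       # no personally identifiable information
    "usage": "Join key for linking inspection, work, and project records.",
    "business_terms": ["Bridge", "Structure Number"],
}

# A web-based metadata repository could store many such entries; here we
# simply verify that the entry covers every standardized attribution.
REQUIRED_ATTRIBUTIONS = {"application", "table", "field", "description",
                         "required", "field_type", "key", "confidentiality",
                         "usage", "business_terms"}
assert REQUIRED_ATTRIBUTIONS <= set(data_dictionary_entry)
```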

Dataset- or Database-Level Standards

When establishing standards for dataset-level metadata, consider the following standardized attributions:
• Application or system name, a unique name and/or identifier of the IT application or system associated with the dataset;
• Owner/steward, typically a business point of contact or subject matter expert;
• Creation/update dates, which indicate when the dataset was created or last updated;
• Security/sensitivity, which categorizes the security or sensitivity level of the dataset or system; and
• Acceptable uses, which document the acceptable uses of the system or dataset.

Important Terminology

The following terms are used within the assessment and improvement identification materials associated with this section:
• Metadata, referring to data that provides information about other data. This information can be technical (e.g., field names and formats) or business-oriented (e.g., data definitions);
• Data catalog, referring to a listing of available data resources compiled to facilitate discovery and understanding; and
• Data dictionary, referring to a table documenting individual data elements in a dataset, containing information such as data element name, description, and type.

Associated Response Templates (Appendix A)
A.4.a: Data Dictionary Standards and Guidelines
A.4.b: Dataset Metadata Standards and Guidelines

Section A.5: Governance Standards

Formal policies and procedures, oversight structures, roles, and processes are critical for data standards development and adoption. Data governance standards help to ensure that the data collected and maintained by the DOT are well understood, used appropriately, and effectively and efficiently collected and leveraged across the enterprise.

Governance of Complex Data-Informed Business Processes

Implementing or advancing data-informed TAM business processes will increase the amount, complexity, and integration of data collected and managed by the agency. Data management challenges will grow as practices advance upward through the benchmark levels, with each level requiring increased emphasis on governance to sustain advanced practice.

Data Governance Structure

A commonly used governance structure consists of an upper-level committee that establishes governance policy and direction and is supported by one or more lower-level committees (Figure I-6). These governance committees establish data stewardship roles to provide accountability for data within individual business units, functional areas, and/or data subject areas. (For brief descriptions of these data stewardship roles, see the conceptual examples in the text box.)

Figure I-6. Example data governance structure.

Conceptual Examples

Data Governance Roles and Responsibilities

Upper-Level Policy Committee (e.g., Governance Council)
The Governance Council is a decision- and policy-making authority, typically reporting directly to high-level executive management, providing oversight and direction to the enterprise governance program.

Technical Working Committee (e.g., Enterprise Data Stewards Committee)
The Technical Working Committee develops governance policy, standards, practices, and guidance; addresses implementation issues; and promotes adherence within the agency.

Enterprise Data Stewards
Enterprise data stewards represent enterprise-level interests in data within specific subject areas, facilitating coordination and agreement across business units.

Data Stewards
Data stewards are accountable for data within a specific business area, working with individual business owners to ensure that the data are well managed and provide value to the organization.

Data Custodians
Data custodians are the technical staff who work under the direction of the data steward and are responsible for execution of governance and data management activities by supporting direct entry, quality control, and maintenance of the data.

Data Governance Policy

Agencies can adopt policies that establish data governance roles and responsibilities and ensure that data are treated as an agency asset. Such policies can define data of agency-wide interest and lay the groundwork for data standardization and other processes to ensure coordination across business units on data collection and development.

Important Terminology

The following terms are used within this section:
• Data governance, describing the roles and lines of accountability for the management of an organization's data assets to achieve its business purposes and compliance with any relevant legislation, regulation, and business practice;
• Data stewardship, describing the formal, specifically assigned and entrusted lines of accountability for business responsibilities (as opposed to IT responsibilities), ensuring effective control and use of data and information assets;
• Community of interest, referring to a group of stakeholders with a common interest in a type of data or another topic area. In contrast to a community of practice (wherein members have similar job functions), the members of a community of interest may come from different parts of the organization and have distinct goals; and
• Change management or change control, referring to processes in place to review, evaluate, and coordinate changes to data products, applications, and systems, intended to minimize impacts to users and reduce change-related errors.

Associated Response Templates (Appendix A)
A.5.a: Data Stewardship
A.5.b: Data Standards and Guidelines Development/Adoption Processes
A.5.c: Data Collection Approval/Coordination Processes
A.5.d: Change Control (Systems and Data) Processes

Area B: Collect Data

This area addresses the collection of asset inventory, condition, and performance data; treatment and work history data; and information about external decision-maker and public perceptions in a manner that can be incorporated into DOT TAM programs.

Area Overview

Area B is organized into four distinct sections:
• B.1: Inventory, Condition, and Performance Data Collection, covering the collection of asset inventory, condition, and performance data, together with specific consideration of coverage, automation, and quality;
• B.2: Project Information Collection, covering the collection of project work accomplishments to update asset inventory and maintain the work history for specific assets, with a focus on data collection coverage, automation, and quality;
• B.3: Maintenance Information Collection, covering the collection of maintenance work accomplishments to update the asset inventory and maintain the work history for specific assets, with a focus on data collection coverage, automation, and quality; and
• B.4: Priority Criteria and Values Collection, addressing the capture of public perceptions and decision-maker values to help guide DOT TAM decision-making.

Improvements in this area are aimed at advancing methods for collecting and assuring the quality of key data supporting TAM analysis, reporting, and decision-making. Improvements may include the deployment of innovative technology solutions, as well as improved quality control and assurance techniques and streamlined business processes.

Section B.1: Inventory, Condition, and Performance Collection

Understanding asset inventory, condition, and performance is fundamental to TAM. Data collection activities must be planned to ensure that the right data are gathered with sufficient quality to support decision-making. Data collection is costly, so agencies must carefully manage scope and work to achieve efficiencies.

Data Collection Program Review

Most transportation agencies have programs in place for collecting inventory, condition, and performance data; however, needs and requirements change over time as technology advances and new data sources create opportunities to improve efficiencies. Periodic review of data collection programs across assets is helpful to determine whether adjustments are warranted. Key questions include:
• What data collection is happening now, and how can those processes be automated?
• What information is available in other departments that could be brought into the data collection program?

Quality Management and Governance

A comprehensive Data Quality Management Plan (DQMP) enables a consistent collection process across assets and departments. The development of a DQMP can begin with individual assets and be expanded and integrated over time. Governance processes should also be put into place to ensure that data collection and quality control measures remain aligned with business processes and needs. (A sketch of the kind of automated checks a DQMP might codify follows the text box below.)

Digital Transformation and Automation

As asset data collection is standardized, manual and paper processes can be replaced by digital systems and automated processes. Agencies with several disparate collection and management tools can find opportunities for consolidation. Examples of network-level and project-level data collection are discussed in the "Conceptual Examples" text box.

Conceptual Examples

Network-Level Collection
Mobile LiDAR
Data for multiple asset categories (e.g., pavements, roadside assets, signage, marking, drainage) can be bulk-captured using mobile LiDAR vehicles. The data can be processed to extract feature inventory, asset condition, and detailed asset attribution.

Project-Level Collection
Destructive and Nondestructive Pavement Investigation
Project-level data collection for pavements can be informed by destructive methods such as drilling/coring rigs or by nondestructive deflection testing via falling weight deflectometers. Ride quality can be measured using tools such as a high-speed profiler.
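The following is a minimal sketch of automated quality-control rules of the kind a DQMP might codify for delivered condition data. The plausibility range for roughness values and the completeness threshold are illustrative assumptions, not standard acceptance criteria:

```python
# Sketch of automated quality-control rules for delivered pavement roughness
# records. The IRI range and completeness threshold are assumed values.

def validate_pavement_delivery(records: list) -> dict:
    """Run basic completeness and range checks on delivered IRI records."""
    findings = {"range_errors": [], "missing_values": 0}
    for rec in records:
        iri = rec.get("iri")  # International Roughness Index value
        if iri is None:
            findings["missing_values"] += 1
        elif not (20 <= iri <= 600):            # plausibility window (assumed)
            findings["range_errors"].append(rec)
    complete = 1 - findings["missing_values"] / max(len(records), 1)
    findings["accepted"] = complete >= 0.98 and not findings["range_errors"]
    return findings

delivery = [{"route": "SR-7", "mp": 12.1, "iri": 95},
            {"route": "SR-7", "mp": 12.2, "iri": None},
            {"route": "SR-7", "mp": 12.3, "iri": 1500}]  # out of range
print(validate_pavement_delivery(delivery))
```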

Data Sourcing and Collection Opportunities

Evaluate existing data sources before developing a new data collection program. If new or additional data collection is needed, consider whether outsourcing would be more sustainable than establishing a new, internal data collection program.

Important Terminology

The following terms are used within this section:
• Data Collection Plan, referring to an initiative or program planning document that outlines how a data collection program will be executed and improved to meet identified business needs. This plan should attempt to make the best use of current resources to leverage capital investment and technology, and it should be guided by documented business cases and value for data collection.
• Data Quality Management Plan (DQMP), referring to a documented management system that details the quality objectives and controls to be applied during the various phases of asset data collection. Its purpose is to ensure quality in all work processes, products, and outputs, and to support continuous quality improvement. Management sponsorship and governance are critical to ensuring the success of the plan.

Associated Response Templates (Appendix B)
B.1.a: Inventory, Condition, and Performance Coverage
B.1.b: Inventory, Condition, and Performance Automation
B.1.c: Inventory, Condition, and Performance Quality

Section B.2: Project Information Collection

Agencies track information about capital projects from planning through construction phases. If properly structured, this information can be leveraged within TAM to update asset inventories and condition projections and to maintain asset-specific work histories that can be used to better understand asset performance.

Network-Level Tracking

Program-level asset investments and accomplishments may be quantified from project scoping documentation. Typically, this information is only useful for the primary asset in the project and provides limited activity-level or asset-specific information. With this level of detail, project information is useful in estimating general, network-level trends and/or impacts to the TAM program.

Activity-Level Tracking

Project-level asset information often is extracted from project development documents (e.g., contract bid tab information) or project delivery systems used to track contract payments (e.g., AASHTOWare Project SiteManager). This information often is not structured in a manner that can be related to specific assets; however, it provides valuable insight into activity-level investments and accomplishments within the project limits.

Asset-Level Tracking

Tracking project activities against specific project line-items and including asset information (i.e., an asset ID, location, manufacturer, and other details) allows asset managers to understand specifically what assets were impacted or installed through a project. Developing a comprehensive work history for an asset is a valuable support for detailed TAM decision-making.

Project Data Template Creation

Developing project delivery templates that include key assets and standard asset data naming supports data extraction and integration. Some examples related to legacy project data extraction and asset-level project information are provided in the "Conceptual Examples" text box.

Legacy Project Data Conversion

Historical project files contain valuable asset inventory information. Programmatic conversion of these files is an effective means of asset data collection. Useful technologies include:
• Optical character recognition (OCR) tools, which are useful for automated recognition of typed, handwritten, or printed text within imagery; and
• Text analytics and natural language processing (NLP) techniques, which can be used to process text into useful data (see the sketch at the end of this section).

Important Terminology

The following terms are used within this section:
• Automated file validation, referring to software created to process digital project files and validate and quality-assure the information they contain;
• Project data extraction automation, referring to automated extraction of asset and non-asset information from current or legacy digital project files; and
• Project data templates, referring to pre-populated project files that include asset types and standard asset information. Using these templates to begin projects enables better quality management and consistent delivery of asset information.

Conceptual Examples

Legacy Project Data Extraction
Regulatory Signage Extraction
Standard design templates for regulatory signage can include individual cells or blocks at each sign location, and modern design files are prepared using real local/global coordinate systems. The combination of consistent nomenclature and geo-positional information allows for the automatic extraction of signage assets within a project file for upload to a GIS-based asset management system or asset inventory.

Asset-Level Project Information
Regulatory Signage Asset Information
Project information models that include regulatory signage in a consistent format and with geospatial accuracy enable the automatic extraction of these items. Additional metadata applied to these assets can be extracted alongside the type and location (e.g., installation date, sign dimensions, sheeting material).

Associated Response Templates (Appendix B)
B.2.a: Project Information Coverage
B.2.b: Project Information Automation
B.2.c: Project Information Quality
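The following minimal sketch illustrates the text-analytics step of legacy data conversion: parsing OCR output into structured sign records. The line format and sign codes are assumptions about what OCR of a plan sheet might yield; a real pipeline would first run an OCR tool over scanned sheets:

```python
# Sketch of the text-analytics step in legacy project data conversion:
# turning OCR output into structured sign records. The plan-sheet line
# format assumed here is hypothetical.
import re

# Matches, e.g., "R1-1 STOP STA 104+50 RT" (hypothetical plan-sheet format)
SIGN_LINE = re.compile(
    r"(?P<code>[A-Z]{1,2}\d+-\d+[A-Za-z]?)\s+"   # sign designation
    r"(?P<legend>[A-Z ]+?)\s+"                   # sign legend
    r"STA\s+(?P<station>\d+\+\d+)\s+"            # station along alignment
    r"(?P<side>LT|RT)"                           # side of roadway
)

def extract_signs(ocr_text: str) -> list:
    """Parse OCR'd plan-sheet text into structured sign records."""
    signs = []
    for line in ocr_text.splitlines():
        match = SIGN_LINE.search(line)
        if match:
            signs.append(match.groupdict())
    return signs

sample = """GENERAL NOTES
R1-1 STOP STA 104+50 RT
W3-1 STOP AHEAD STA 98+00 RT"""
for sign in extract_signs(sample):
    print(sign)
```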

Section B.3: Maintenance Information Collection

Capture of standard maintenance work order and accomplishment information can provide valuable insights for asset life-cycle planning and maintenance budgeting.

Digital Work Order Data Collection

Field maintenance forces have a unique opportunity to capture needed asset information while on-site. Traditionally, DOT maintenance programs have been delivered through manual (often pen-and-paper) business processes. Converting to a digital work order process or a Computerized Maintenance Management System (CMMS) can bring uniformity and efficiency to these tasks. Using digital tools also supports the development of a useful, comprehensive work history.

Mobile Data Collection

Mobile data collection devices and applications significantly improve the quality of field-collected information. These tools offer simplified data collection and share valuable reference information. Built-in GPS and GIS capabilities support accurate location against aerial imagery or custom base maps. Field maintenance information also can be collected directly against the asset inventory.

Work Order Automation

Automation of the service request to work order process dramatically streamlines maintenance business processes. As an additional benefit, these techniques improve the accuracy and efficiency of work accomplishment data collection. Examples related to work order management software are provided in the "Conceptual Examples" text box, and a scheduling sketch appears at the end of this section.

Conceptual Examples

Work Order Management Software
Work Association with Asset and Organizational Structures
Transforming traditional maintenance programs to a centralized CMMS allows association of maintenance work accomplishments with organizational, work, and asset hierarchies.

Automated Work Ordering
A simple example of work order automation is the use of scheduling tools to generate recurring work orders for routine maintenance activities.

Important Terminology

The following terms are used within this section:
• Routine maintenance, referring to maintenance tasks that are planned in advance. These can be recurring tasks or one-off scheduled preventive-care tasks.
• Service requests, referring to asset-repair requests that are filed due to damage or wear. These requests can originate from inside or outside an agency.
• Computerized Maintenance Management System (CMMS), referring to software that is used to manage an organization's maintenance operations.
• Work orders, referring to authorized maintenance tasks. Work orders can result from approved service requests or be generated as part of planned preventive or routine maintenance schedules.

Associated Response Templates (Appendix B)
B.3.a: Maintenance Information Coverage
B.3.b: Maintenance Information Automation
B.3.c: Maintenance Information Quality
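The scheduling logic behind automated recurring work orders can be quite simple, as in the following minimal sketch. The activity names, assets, and frequencies are hypothetical:

```python
# Sketch of the scheduling logic behind automated recurring work orders.
# Activity names, assets, and frequencies are hypothetical.
from datetime import date, timedelta

ROUTINE_SCHEDULE = [
    {"activity": "Litter pickup",      "asset": "SR-7 corridor", "every_days": 30},
    {"activity": "Culvert inspection", "asset": "CUL-2041",      "every_days": 365},
]

def generate_work_orders(schedule, start: date, end: date):
    """Emit one open work order per due date within the window."""
    orders = []
    for item in schedule:
        due = start
        while due <= end:
            orders.append({"activity": item["activity"],
                           "asset": item["asset"],
                           "due_date": due.isoformat(),
                           "status": "open"})
            due += timedelta(days=item["every_days"])
    return orders

for wo in generate_work_orders(ROUTINE_SCHEDULE, date(2024, 1, 1), date(2024, 3, 31)):
    print(wo)
```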

Self-Assessment and Improvement Identification I-35 Conceptual Examples Work Order Management Software Work Association with Asset and Organizational Structures Transforming traditional maintenance programs to a centralized CMMS will allow association of maintenance work accomplishments with organizational, work, and asset hierarchy. Automated Work Ordering A simple example of work order automation is through the use of scheduling tools to generate recurring work orders for routine maintenance activities.

Section B.4: Priority Criteria and Values Collection

Although many TAM decisions are based on technical factors such as condition indices, it is important to put processes in place to understand both asset-user and decision-maker values and priorities. These values and priorities can be used to inform the appropriate use of technical information for decision-making.

Public Perceptions

The public's perceptions of and priorities for transportation infrastructure are a critical component of long-term TAM planning and strategic TAM prioritization. Data collection about the public's perceptions and priorities can begin through ad hoc engagement, during individual public meetings, or with targeted surveys. In more advanced applications, these techniques should be coordinated and refined into standardized, continuous feedback programs. Valuable tools to collect data on public perceptions include:
• Focus groups, in which small, diverse groups of people are directly engaged to gather feedback;
• Surveys and polling, conducted to gather broader samples of public opinion;
• Social media outreach, offering two-way communication with the public regarding TAM priorities and values; and
• Public meetings, scheduled to directly engage interested members of the community regarding the TAM program.

Agency Values and Perceptions

Internal agency perceptions and leadership values are additional data points needed to plan asset management capital programs. These values and perceptions can include requirements that originate from historical/legacy organizational structures as well as from political influences affecting executive leadership. Decision-makers should be engaged using a structured approach. Some example tools and approaches are:
• The Delphi technique, a structured, interactive approach to building consensus among panels of experts;
• The nominal group technique, a group problem-solving and decision-making process that promotes consideration of all opinions and is useful in groups of various sizes; and
• Decision trees, which are used to map observations and conclusions about decision processes and priorities.

Insights and Value Criteria

Modern decision science and data analytics programs require more than inventories and conditions to provide long-term value. The information obtained from public and agency perception surveys provides the context required to support investment decisions. Examples of two approaches to outreach and engagement are described in the "Conceptual Examples" text box.

Conceptual Examples

Customer Service and Social Outreach
Tying the Public-Facing Side to Maintenance
DOTs can use their customer service centers to tie feedback directly into work orders and maintenance functions. Many public interactions take place on Twitter, Facebook, or other social media outlets. Agencies can introduce ways to connect feedback and maintenance issues into their data collection systems.

Targeted Focus Group Engagement
Capital Plan Roadshows
Some agencies may decide to take their capital improvement plan and budget on the road. Targeting a series of town hall-style public outreach sessions and/or focus group sessions is an effective means of obtaining regionally varied input into a transportation asset management capital program.

Important Terminology

The following term is used within this section:
• Decision science, referring to quantitative techniques that are used to inform decision-making by identifying optimal choices based on available information. Decision science seeks to make plain the scientific issues and value judgments underlying these decisions and to identify tradeoffs that might accompany any particular action or inaction.

Associated Response Templates (Appendix B)
B.4.a: Public Perceptions
B.4.b: Decision-Maker Values

Area C: Store, Integrate, and Access Data

This area involves structuring, integrating, and providing access to data to support TAM operations, management, and reporting needs.

Area Overview

Area C is organized into four distinct sections:
• C.1: Databases, covering the design of TAM databases to support efficient and effective storage of and access to contained data and documentation;
• C.2: Asset Life-Cycle Data Integration Workflows, covering the efficient exchange of information across the various business processes, tools, and systems involved in the complete asset life-cycle, from project planning through delivery and ongoing asset management and operation;
• C.3: Other Data Integration Workflows, covering the efficient exchange of information between asset management and financial and other supporting business systems and practices; and
• C.4: Data Access, covering the ability of agency field staff and the general public to efficiently and securely access TAM information.

Improvements in this area are aimed at efficiently managing asset data and integrating data across systems supporting different assets, different asset life-cycle phases, and agency business processes.

Section C.1: Databases

Making data available across the enterprise while eliminating redundant information is critical to driving efficiency and reliability. Moving from paper-based data to fully integrated databases also requires planning and critical thinking about how and where the data will be stored and how different databases will be linked to each other.

Data Storage Management

Data retention, backup, and disaster recovery are essential to the sustainability of agency asset databases and the continuity of critical, data-informed TAM business processes. It is important to examine and quantify risks and select tiered data storage solutions that align with the agency's risk tolerance and budget. Examples of key components of a Disaster Recovery Plan are provided in the "Conceptual Examples" text box.

Conceptual Examples

Components of a Disaster Recovery Plan

Summary
The summary takes a few pages to cover the most important steps of the plan and key contacts. A stand-alone Executive Summary document also may be prepared.

Scope and Purpose
This section provides an introduction that describes the purpose and scope of the plan, along with documentation of authority and approvals and the frequency of review and updates required for the plan.

Roles and Responsibilities
This section describes the key roles and responsibilities of each member of the disaster recovery team and any limitations based on governance and approval thresholds.

Response Procedures
This section describes the processes to be initiated and followed in the disaster response, including assessment of the situation, documentation of any damages, and notifications required based on the severity of the damages.

Documentation Requirements
This section provides clear directions for the documentation of activities as required if the plan is activated.

Source Systems and Master Data

The agency should identify and designate the source systems in which essential agency business information (such as agency assets, financials, contracts, or employees) will be gathered and stored. A best practice is to create master data from the source data to (1) provide a single "source of truth" for reporting, (2) protect source data integrity, and (3) ensure that changes to source data are reflected in replicated or derivative datasets.

Asset Identifiers and Linear Referencing for Data Linkages

Asset identification and linear referencing schemes are vital to agency database integration. New and existing TAM databases should be structured to provide these standardized data linkages. This practice will enable integration of asset and non-asset data for TAM analysis and decision-making.

Establishing these data linkages typically requires programming; however, more and more commercial software tools provide end-user utilities that help automate development, decreasing reliance on staff with specialized skill sets. (A minimal linkage example is sketched below.)

Data Warehousing

Across the enterprise, asset and non-asset data commonly are stored in different systems. A data warehouse is a central repository of integrated data that supports efficient reporting and analysis. Data typically will be loaded into a data warehouse through automated routines that run on a set frequency based on end-user requirements.
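The following minimal sketch shows the kind of location-based linkage described above: point assets held in one database are matched to pavement sections in another using the shared linear referencing scheme (route identifier plus milepost). The data values are hypothetical:

```python
# Sketch of a location-based data linkage: point assets are matched to
# pavement sections using the shared LRS (route identifier + milepost).
# Data values are hypothetical.

pavement_sections = [
    {"route": "SR-7", "from_mp": 10.0, "to_mp": 12.5, "condition": "good"},
    {"route": "SR-7", "from_mp": 12.5, "to_mp": 15.0, "condition": "poor"},
]

signs = [
    {"asset_id": "SGN-001", "route": "SR-7", "mp": 11.2},
    {"asset_id": "SGN-002", "route": "SR-7", "mp": 13.7},
]

def link_assets_to_sections(assets, sections):
    """Attach the enclosing section's attributes to each point asset."""
    linked = []
    for asset in assets:
        for section in sections:
            if (asset["route"] == section["route"]
                    and section["from_mp"] <= asset["mp"] < section["to_mp"]):
                linked.append({**asset, "pavement_condition": section["condition"]})
                break
    return linked

for row in link_assets_to_sections(signs, pavement_sections):
    print(row)
```

In a production environment, the same join would typically run inside the database or data warehouse rather than in application code; the point here is only that a well-governed LRS makes the linkage deterministic.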

Important Terminology

The following terms are used within this section:
• Linear referencing system (LRS), referring to a method used for spatial referencing of the locations of physical features along a linear element. LRS features are described in terms of measurements from a fixed point, such as a mile marker or station along a road. Each feature is located by either a point (e.g., a signpost) or a line (e.g., a no-passing zone). A well-governed LRS helps ensure that the spatial relationships between assets held in different databases can be viewed and analyzed.
• Data lake, meaning a single repository of different databases held in native form, which is typically used for data exploration rather than routine analysis. When implementing a data lake, it is important to consider the desired end-uses and such details as storage, security, agility, and end-user sophistication.
• Disaster Recovery Plan, referring to a specific plan that documents a set of policies and procedures that support the recovery of data and data infrastructure in the event of a natural or human-made disaster. In the context of asset management, the Disaster Recovery Plan should consider necessary access to data related to life-line assets such as evacuation routes, utilities, and communications.
• Cloud storage, referring to an arrangement by which the physical storage of data is managed by an external service provider (e.g., Amazon Web Services or Microsoft Azure). Cloud storage often provides lower cost and less direct maintenance but requires additional data access and security considerations. Cloud storage has emerged as a useful tool to address rapidly growing data storage needs.

Associated Response Templates (Appendix C)
C.1.a: Efficient Storage
C.1.b: Database Linkages
C.1.c: Document Linkages
C.1.d: Data Storage Capacity

Section C.2: Asset Life-Cycle Data Integration Workflows

Through technology advancement, it is now possible for data to persist across the asset life-cycle (i.e., from planning through design and delivery and ultimately through operations and maintenance). Integrating workflows between the various phases of the asset life-cycle is critical to unlocking the value that is amassed through these phases (see Figure I-7).

Figure I-7. Example data and information flow supporting asset registry development and maintenance.

Asset Life-Cycle and BIM for Transportation

BIM for Transportation has emerged as a process that supports asset life-cycle data integration. As technology has evolved to become increasingly interoperable, more industries and agencies are seeing the advantages of leveraging BIM—and a common data environment—to support the asset life-cycle from beginning to end.

Project Planning

Asset information developed during project planning and design can flow seamlessly through to the construction and operations phases, generating efficiencies in each project phase and yielding a detailed asset information model to inform asset operation and maintenance.

During project planning, key decisions are made regarding site location, constructability, and regulatory requirements. Assets within the project limits may inform planning decisions. For example, federal project requirements may dictate that safety assets (e.g., guardrails) must be evaluated and brought to current standards during the project. By integrating existing asset information into project planning systems, these activities can be streamlined and planning outcomes can be improved.

GIS databases and tools are customarily used during planning. GIS data collected and generated during planning regarding boundary constraints, material considerations, or site conditions will be key inputs to project delivery. It is important to ensure that data workflows from planning to development consider how these data are passed on to create efficiencies in future phases.

Project Development

Project development includes the financing and design aspects of an infrastructure project. Schematic and detailed designs progress the project plan into a constructible state. Designs typically incorporate information about asset location, size, material, type or standard, and other details. Design information can be coded into an owner-specified design model as data that can then be leveraged during project delivery.

Project Delivery

Project delivery encompasses the physical construction of the new asset. During this phase, the information contained in the design model can be carried forward to inform and optimize the construction phase. As-built information also can be captured during project delivery.

Handover/Handoff

When construction has been completed, the asset is handed over for management and maintenance. At this stage, asset life-cycle data integration workflows yield the most significant benefits. Data that have been developed and evolved through the project phases can be transferred directly into the owner's asset data model, providing updated asset inventory, condition, and work history. These data should subsequently be maintained through the owner/operator's asset management systems. Further discussion of data integration across asset life-cycle stages is provided in the "Conceptual Examples" text box, and a handover transformation is sketched below.
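The handover step is essentially a field-mapping exercise. The following minimal sketch transforms attributes exported from an as-built model into the owner's asset registry schema; the export format and field names are assumptions for illustration, and a real exchange would follow the owner's model specification (e.g., one aligned with ISO 19650):

```python
# Sketch of a handover transformation: mapping attributes exported from an
# as-built model into the owner's asset inventory schema. The export format
# and field names are hypothetical.

asbuilt_export = [
    {"ModelId": "BRG-2041-DECK", "Category": "Bridge Deck",
     "Material": "Concrete", "InstallYear": 2024, "Warranty": "5 yr"},
]

FIELD_MAP = {            # design/as-built field -> asset registry field
    "ModelId": "asset_id",
    "Category": "asset_class",
    "Material": "material",
    "InstallYear": "install_year",
    "Warranty": "warranty",
}

def to_registry_records(export_rows):
    """Transform as-built rows into asset registry records."""
    records = []
    for row in export_rows:
        record = {FIELD_MAP[key]: value
                  for key, value in row.items() if key in FIELD_MAP}
        record["source"] = "as-built model handover"
        records.append(record)
    return records

print(to_registry_records(asbuilt_export))
```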

Important Terminology

The following terms are used within this section:
• Asset life-cycle, covering phases that represent key milestones in the development of the asset, starting with planning, then development, then delivery, and ultimately operation and maintenance;
• Data transformation, covering the process of converting data from one format to another, which often is required to support data integration workflows (particularly when different technologies are employed by different users or stakeholders over the asset life-cycle);
• Batch processing/transfer, a process that supports mature data integration workflows by allowing certain data transformation tasks to be performed according to a routine, frequently without human intervention; and
• Performance targets, meaning targets based on measurements that are intended to provide evidence or give an indication of an asset's level of service or performance. Performance targets can be directly imposed by regulators or set based on the strategic objectives of an organization. They can be established to meet a committed minimum level of service to the users of that asset (e.g., smoothness of pavement), or they can be aspirational (e.g., if an organization is trying to enhance the level of service to encourage more use or thwart competition, as by reducing congestion levels on managed lanes).

Conceptual Examples
Data Integration Across Life-Cycle Stages

Project Planning to Project Development
Most planning activity occurs in 2D space using tools like GIS. Planning tasks include site selection, choosing among alignment alternatives, and conducting environmental constraints analysis. The information gathered for one aspect of planning (e.g., soil information) can influence another aspect (e.g., material selection or placement of pilings in project design). Interoperability between GIS and design tools affords new efficiencies that can eliminate the need to export/import data to inform design. (One example of an interoperable tool is the Autodesk Connector for ArcGIS.) Through such tools, planning data can be shared directly with designers to inform decision-making.

Project Development to Project Delivery
Current DOT practices typically involve disconnected processes in which the design model is discarded and the contractor defines a model optimized for their delivery purpose. These disconnected processes are inefficient and ultimately translate to additional costs for the owner. Alternative delivery methods are creating new efficiencies, but infrastructure owners can take responsibility for defining model requirements throughout the phases of design-bid-build projects.

Project Delivery to Asset Inventory/Condition and Work Orders
When model specifications are defined contractually, the as-built plans can be delivered digitally as as-built models. With model specifications directly aligned with the asset information model, data for the assets defined in the as-built model can be imported to supplement the asset inventory. Valuable as-built information (e.g., location, photos, manufacturer-recommended maintenance schedules, and warranties) can be retained for use in maintenance work ordering, warranty claims, and general asset management and operation.
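To make the handover step concrete, the following minimal sketch illustrates mapping records from an as-built model export into an owner's asset registry. It is written in Python, and all field names, identifiers, and values are hypothetical placeholders rather than any standard as-built or registry schema.

```python
# Hypothetical as-built export records from a delivered model.
asbuilt_records = [
    {"guid": "7f3a-01", "type": "Guardrail", "route": "SR-12", "milepost": 4.2,
     "install_date": "2021-06-30", "warranty_years": 5},
]

# Field mapping from as-built model attributes to the owner's asset data model.
FIELD_MAP = {"guid": "source_model_id", "type": "asset_class", "route": "route_id",
             "milepost": "milepost", "install_date": "in_service_date",
             "warranty_years": "warranty_years"}

asset_registry = []
for record in asbuilt_records:
    # Rename as-built fields to registry fields and tag the data's provenance.
    asset = {FIELD_MAP[key]: value for key, value in record.items()}
    asset["source"] = "as-built model handover"
    asset_registry.append(asset)

print(asset_registry)
```

In practice, the mapping would be defined by the owner's model specification and executed by the asset management system rather than a standalone script.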

Section C.3: Other Data Integration Workflows

Information contained in agency financial, traffic monitoring, transportation modeling, and environmental systems supports asset risk analysis and prioritization and improves TAM treatment selection and decision-making.

Demand and Utilization

Traffic monitoring and traffic and demand modeling data can help TAM practitioners understand the current and forecasted level of service required of existing or proposed roadway assets. This information is highly valuable in asset management decision-making, particularly to inform prioritization, selection, and scoping of projects and maintenance work. By establishing a means to integrate measured or modeled demand with the DOT's asset data, the agency can:
• Improve planning and optimize funding to coincide with current and forecasted levels of service required to meet user needs;
• Improve design of new or reconstructed assets based on existing demand, reducing future TAM needs;
• Improve maintenance treatment selection to account for known or modeled demand; and
• Prioritize decision-making to deliver maximum value for the public and other TAM stakeholders.

Environmental Modeling

Environmental systems contain data valuable for understanding asset deterioration, the environmental impacts of assets, and the regulatory requirements applicable to individual projects and maintenance actions. Environmental data often is managed in GIS. Geoprocessing tools can efficiently combine GIS information with the associated assets based on their geolocation.

Examples of other data integrations are presented in the "Conceptual Examples" text box.

Conceptual Examples
Other Data Integrations

Use of Demand Modeling Data in Asset Management
Traffic demand modeling provides roadway asset owners with forecasted traffic volumes. Forecasted traffic volumes are used in prioritizing work programs and capital projects. Database considerations that support linking demand models to asset inventory are the key to unlocking integration efficiencies (e.g., by using common location referencing or roadway section identifiers).

Use of Environmental Data in Asset Management
Flooding risk is a key environmental factor considered in TAM. With accurate spatial location referencing for asset data, analysis can be conducted to assess the impacts and risks posed by flooding events. The analysis results can be used to inform prioritization of maintenance procedures, treatment selection, and asset design decisions, and may also influence new development decisions.
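As one illustration of the geoprocessing integration described above, the following sketch joins a flood-hazard layer to an asset inventory by geolocation. It assumes the open-source GeoPandas library; the file names, layer contents, and attribute names are hypothetical.

```python
import geopandas as gpd

# Hypothetical layers: asset points and flood-hazard polygons.
assets = gpd.read_file("asset_inventory.geojson")
flood_zones = gpd.read_file("flood_hazard_zones.geojson").to_crs(assets.crs)

# Spatial join: tag each asset with any flood zone it intersects.
assets_with_risk = gpd.sjoin(
    assets,
    flood_zones[["zone_id", "geometry"]],
    how="left",
    predicate="intersects",
)

# Flag at-risk assets to inform maintenance prioritization and design decisions.
at_risk = assets_with_risk[assets_with_risk["zone_id"].notna()]
print(f"{len(at_risk)} of {len(assets)} assets intersect a mapped flood zone")
```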

Financial Register Versus Operational Register

A financial asset register is used to produce financial statements and support long-term financial planning and budgeting. In contrast, the operational asset register, which is typically stored in the asset data model of an asset management system, supports the ability to associate and track work orders and maintenance.

It is important to establish a tie between the financial asset register and the operational asset register to support accurate financial reporting and to leverage asset work order and maintenance history to support financial forecasting and planning.

Important Terminology

The following terms are used within this section:
• Two-way data exchange, meaning bi-directional reading and/or writing of data between two databases (see the "Conceptual Examples" text box on "Two-Way Data Exchange");
• Non-asset data, meaning data that is contextual to the asset but not directly about the asset. For example, the soil type in the area of a buried utility pipe is not data explicitly about the asset but is highly relevant to how the asset will perform;
• Data transformation, covering the process of converting data from one format to another as may be required to support data integration workflows, particularly when different technologies are employed by different users and stakeholders throughout the project phases; and
• Batch processing/transfer, a process that supports mature data integration workflows by allowing certain data transformation tasks to be performed according to a routine, frequently without human intervention.

Associated Response Templates (Appendix C):
• C.3.a: Financial (Revenue, Budget, and Expenditure) Data
• C.3.b: Demand and/or Utilization Data
• C.3.c: Environmental Data

Conceptual Examples
Two-Way Data Exchange

Exchange Between Financial and Operational Asset Registries
A two-way data exchange between an operational asset registry and a financial asset registry may occur when a new asset is constructed:
• In one direction, the asset value may need to be exchanged from the operational asset registry to the financial asset registry.
• In the other direction, the acquisition cost of an asset, which is stored in the financial asset registry, may need to be exchanged from the financial asset registry to the operational asset registry to support life-cycle cost analysis.
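The following minimal sketch shows what such a two-way exchange can look like in code. The registries are represented here as simple Python dictionaries with hypothetical field names; an actual implementation would read from and write to the respective database systems.

```python
# Hypothetical registries keyed by a shared asset identifier.
operational = {"BR-1042": {"replacement_value": 1_250_000, "condition": "Good"}}
financial = {"BR-1042": {"acquisition_cost": 980_000}}

def sync_registries(operational, financial):
    """Two-way exchange between the operational and financial registries."""
    for asset_id, op_rec in operational.items():
        fin_rec = financial.setdefault(asset_id, {})
        # Direction 1: operational -> financial (asset value for reporting).
        fin_rec["reported_value"] = op_rec["replacement_value"]
        # Direction 2: financial -> operational (cost for life-cycle analysis).
        if "acquisition_cost" in fin_rec:
            op_rec["acquisition_cost"] = fin_rec["acquisition_cost"]

sync_registries(operational, financial)
print(operational["BR-1042"], financial["BR-1042"])
```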

Section C.4: Data Access

Data access must be carefully planned to balance the agency's business needs and public accountability with the need to protect data integrity and mitigate the risks of data misuse or misinterpretation.

Connected and Disconnected Editing

Field maintenance staff need the ability to access, update, or input data in both connected and disconnected environments.
• Connected editing requires an internet connection (cellular or Wi-Fi) to read data from or write data back to a database. In a connected environment, field data can be made available immediately upon collection, allowing efficient coordination with office staff or other stakeholders who are not in the field. Field staff also can access information that would otherwise be unavailable without advance planning (e.g., detailed design files from previous projects or comprehensive asset work histories). Connected editing improves field decision-making and generates significant efficiencies by avoiding unnecessary travel time between field and office locations.
• Disconnected editing allows a user to download and store data locally on a mobile device (e.g., in the office or at another location with a reliable internet connection). The user can add to or update the local data on the mobile device and later, once an internet connection is reestablished, upload the enhanced/revised information back to the main database. If fieldwork requires users to retrieve or collect data in remote areas, disconnected editing options will likely be required to support these activities.

Access Levels and Data Security

Data access and security are more easily managed early on, during system development (i.e., when the system and associated data models can be structured to support the assignment and enforcement of data access or security levels). With proper consideration, data access and security can be controlled at the system level, the application level, or even the database level. Sample questions to aid in determining data access levels are provided in the "Conceptual Examples" text box.

Conceptual Examples
Data Access and Security

Database Access Qualifying Questions
Determining who should have access to different datasets can be a daunting task. Key questions that can serve as a guide include:
• Why does the user need to access the data?
• How will the data be used?
• Is the data being accessed sensitive (i.e., would release of the data pose risks)?
• Does the user need read-only access, or will the user need to update the data as part of their task?

Role-Based Data Access
Role-based data access is an approach to granting or denying access to users based on their designated roles in an organization. By defining roles and responsibilities, the appropriate levels of data access can be granted across enterprise-wide systems.

Data governance programs commonly define such roles and implement oversight to monitor and manage the roles and responsibilities so that they can evolve over time to support the changing data and systems environment of the organization.
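The following sketch illustrates the role-based access idea in miniature. The roles, datasets, and permissions are hypothetical; in practice this logic is typically enforced by the database platform or application tier rather than by a script.

```python
# Hypothetical role-to-permission mapping for enterprise datasets.
ROLE_PERMISSIONS = {
    "field_maintenance": {"asset_inventory": {"read", "update"}},
    "planner": {"asset_inventory": {"read"}, "condition_history": {"read"}},
    "data_steward": {
        "asset_inventory": {"read", "update", "delete"},
        "condition_history": {"read", "update"},
    },
}

def can_access(role: str, dataset: str, action: str) -> bool:
    """Return True if the given role grants the requested action on the dataset."""
    return action in ROLE_PERMISSIONS.get(role, {}).get(dataset, set())

assert can_access("planner", "asset_inventory", "read")
assert not can_access("planner", "asset_inventory", "update")
```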

Mobile Access

Mobile devices such as laptops, phones, or tablets provide handy means for users to access data when they are away from the primary office environment. When evaluating mobile access, it is important to consider:
• Security protocols and technical programming that are required to make data and/or tools available;
• Data that are required in the field versus data that are desired or useful only in the office; and
• Agency policies and practices relating to mobile device procurement and personal cell phone use.

The proliferation of mobile technologies offers a perception of ease and convenience, but having too much data or overcomplicated tools can reduce efficiency and create adoption challenges. Industry trends are toward targeting mobile tools for niche functions and employing responsive web design (RWD) in primary applications, so that web pages render well on a variety of devices and window or screen sizes, avoiding costly additional programming to support mobile device use.

Story Boards and Dashboards

Story boards and dashboards have emerged as key data visualization tools that enhance communications. The ability to use illustrations, maps, charts, and other graphics is critical to effective communication of the complex messages of a DOT asset management program. For example, Esri's GIS story boards with embedded maps and charts can communicate critical asset risk areas or forecasted network-level asset conditions far more powerfully than presenting the same data in written reports and spreadsheets. Similarly, tools such as Microsoft Power BI and Tableau make it easy to mine and present trends for historical asset condition values or projected savings based on various project prioritization schemes to support funding approvals. Such tools also allow a DOT to provide curated access to agency data, which is particularly useful when engaging non-expert or external stakeholders.
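As a loose stand-in for what a dashboard tool would render interactively, the following scripted sketch charts a historical condition trend. The data values and column names are hypothetical, and the example assumes the pandas and Matplotlib libraries.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical history: percent of network in good condition by year.
history = pd.DataFrame({
    "year": [2017, 2018, 2019, 2020, 2021],
    "pavement_good_pct": [62, 64, 63, 66, 68],
    "bridge_good_pct": [71, 70, 72, 72, 74],
})

# A simple trend chart of the kind a dashboard would present interactively.
ax = history.plot(x="year", y=["pavement_good_pct", "bridge_good_pct"], marker="o")
ax.set_ylabel("Percent of network in good condition")
ax.set_title("Network condition trend")
plt.tight_layout()
plt.savefig("condition_trend.png")
```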

Important Terminology

The following terms are used within this section:
• Firewall, meaning an IT security system that monitors and controls incoming and outgoing network traffic, screening what is and is not let through based on predetermined security rules. It is essentially a barrier between trusted sources and untrusted sources. Adjustments may be required in firewall security protocols to account for new means of access (e.g., mobile or third-party access to agency systems).
• Single sign-on (SSO), referring to technology that facilitates ease of data access across various enterprise applications and network resources through an authentication process that allows access to multiple applications with one set of login credentials. SSO reduces or eliminates the need for users to maintain different user names and passwords for different systems.

Associated Response Templates (Appendix C):
• C.4.a: Field Access to Data
• C.4.b: Public Access to Data
• C.4.c: Access Security

Area D: Analyze Data

This area addresses the use of established decision-support tools, techniques, and practices to support the development of actionable information and insights that in turn support decision-making.

Area Overview

Area D is organized into two distinct sections:
• D.1: Data Exploration, Reporting, and Visualization, addressing the analytical environment, practices, and tools used within the agency data analysis practice; and
• D.2: Modeling, presenting specific asset performance prediction and TAM investment optimization and prioritization methodologies that can be supported through data analysis.

Improvements in this area are aimed at advancing practices for transforming raw data into information that can support decision-making.

Section D.1: Data Exploration, Reporting, and Visualization

Transforming raw data into actionable information requires establishing consolidated data processing, analysis, and reporting environments and tools, as well as standardized reporting procedures and training to support effective data analysis.

Data Analysis Environment

A centralized data analysis environment offers significant time savings, improved analysis quality and trust, and a common platform around which to standardize reporting, visualization, and analysis tools, techniques, and practices. Agencies can populate this environment with authoritative, curated datasets and develop the standardized data transformations needed to support routine TAM data analysis needs (a scripted illustration follows at the end of this section).

It is important to provide standardized capabilities and solutions to address ad hoc analysis requirements (e.g., use of a data lake to temporarily expose data for time-bound data exploration activities).

TAM Data Visualization Practices

Standardized data reports and visualizations are effective communication and information-sharing tools. Common visualization techniques include:
• Straight-line diagramming tools that simplify the representation of the roadway to provide location referencing context.
• Performance dashboards that track and represent agency goals, objectives, and performance measures to guide daily asset management work activities and decisions.
• Data marts and interactive reporting tools that provide highly usable, ad hoc reporting functions.

TAM Data Analysis Practices

Many transportation agencies are developing specialized data analysis and data science programs to support TAM and other business areas. Descriptions of analytical techniques commonly leveraged to support TAM are provided in the "Conceptual Examples" text box.

Associated Response Templates (Appendix D):
• D.1.a: Analysis Environment
• D.1.b: Analysis Practices
• D.1.c: Analysis Tools
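The following minimal sketch illustrates the idea of a curated analysis environment: a small data mart is populated once with an authoritative summary table, and analysts reuse a standardized query against it instead of rebuilding extracts from source systems. The table, columns, and values are hypothetical, and SQLite stands in for whatever database platform an agency actually uses.

```python
import sqlite3
import pandas as pd

# Populate a hypothetical data mart with an authoritative, curated summary.
conn = sqlite3.connect("tam_data_mart.db")
pd.DataFrame({
    "district": ["A", "A", "B", "B"],
    "asset_type": ["pavement", "bridge", "pavement", "bridge"],
    "pct_good": [68, 74, 61, 70],
}).to_sql("condition_summary", conn, if_exists="replace", index=False)

# A standardized, reusable transformation for routine reporting needs.
report = pd.read_sql(
    "SELECT asset_type, AVG(pct_good) AS avg_pct_good "
    "FROM condition_summary GROUP BY asset_type",
    conn,
)
print(report)
```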

Important Terminology

The following terms are used within this section:
• Big Data, meaning data that is too large and complex to be dealt with through traditional data processing applications and methods;
• Business intelligence, meaning the systems, applications, and processes that change raw data into useful business information;
• Data science, meaning the use of scientific methods, processes, algorithms, and systems to extract knowledge and insights from structured and unstructured data; and
• Data mart, a scaled-down version of a data warehouse, meeting a particular analytical, reporting, or decision-support need.

Conceptual Examples
Analysis Techniques Supporting TAM

Geo-Processing
Geospatial information can be used to integrate and compile disparate datasets useful for TAM analysis.

Data, Text, and Process Mining
Techniques can be applied to identify anomalies, patterns, and correlations within the large datasets available to TAM practitioners.

Temporal Analysis
This technique enables examination or modeling of a variable within a dataset over time, which is useful for applications such as asset deterioration modeling, performance trend analysis, investment scenario analysis, and asset work history or use evaluation.

Trade-Off Analysis
This technique facilitates the comparison of investment priorities with fiscal constraints (either within a given asset program or across multiple programs).

Prescriptive Analytics
Business analytics can be used to find the best course of action for a given situation (e.g., selecting a TAM treatment for a specific location or asset).

Predictive Modeling
Business analytics can be applied to forecast future conditions (e.g., asset condition forecasting).

Predictive Analytics
Data mining, statistics, modeling, machine learning, artificial intelligence, or other techniques can be used to make predictions about unknown future events. These techniques are emerging in DOT practice.

Decision Science
Decision science can be applied to score projects and optimize selections for programming based on benefits, costs, and other measures that are used to assign relative importance. This technique is often seen in multi-objective project prioritization and decision-analysis applications.

Section D.2: Modeling

Agencies are implementing increasingly powerful asset performance models and investment prioritization and optimization techniques to support asset life-cycle planning, project prioritization, and network-level resource allocation.

Asset Performance Models

Asset performance models support condition- or performance-based management strategies by forecasting asset performance over time. Transportation agencies can use modeling outcomes to improve asset life-cycle planning strategy selection and TAM investment and resource allocation decisions. Common asset performance models include:
• Improvement benefit models that anticipate future asset condition and/or performance for a given TAM investment; and
• Asset deterioration models that forecast future condition or performance of the asset, assuming no TAM investment.

Ideally, development of these models is based on statistical analysis of trusted work and performance history; however, where trusted data are unavailable, expert opinion can be used to develop or refine asset performance models. Used in combination, improvement benefit models and asset deterioration models form the backbone of TAM investment optimization and prioritization analysis. Modeling techniques can be deterministic or probabilistic. Examples of these modeling techniques are provided in the "Conceptual Examples" text box.

Conceptual Examples
Modeling Techniques

Deterministic Modeling
A relatively simple and commonly used modeling approach in TAM, deterministic modeling applies regression analysis to develop "best-fit" equations that characterize asset performance changes over time or in response to TAM investment.

Probabilistic Modeling
Useful for incorporating uncertainty by providing a distribution of possible outcomes, probabilistic models are most applicable to network-level analysis in TAM (such as setting funding expectations or needs).
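A minimal sketch of the deterministic approach follows: a least-squares best-fit curve through observed condition data is used to forecast future condition with no treatment applied. The condition scale and observations are hypothetical.

```python
import numpy as np

# Hypothetical observed pavement condition (0-100 scale) at various ages.
age = np.array([0, 2, 4, 6, 8, 10, 12])
condition = np.array([98, 93, 86, 80, 71, 60, 48])

# Deterministic deterioration model: least-squares best-fit polynomial.
coefficients = np.polyfit(age, condition, deg=2)
model = np.poly1d(coefficients)

# Forecast condition at year 15, assuming no TAM investment.
print(f"Predicted condition at age 15: {model(15):.1f}")
```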

TAM Optimization and Prioritization Analysis

DOT asset management systems are often used to conduct network-level optimization analysis of potential investment strategies or treatment options. Key inputs to the analysis are:
• Current inventory and condition information, which is necessary as a baseline for the analysis and to establish the potential investment options;
• Asset performance models (as described in the section titled "Asset Performance Models");
• Treatment rules and costs, which define the conditions under which a specific TAM treatment may be applied (e.g., triggering conditions) and the costs of those interventions; and
• Analysis parameters, including the:
– Analysis horizon (the number of years to be analyzed);
– Analysis objective (e.g., to maximize benefit or to minimize treatment cost); and
– Analysis constraints (e.g., minimum performance expectations or maximum funding limits).

A simplified illustration of these inputs in use is provided in the sketch at the end of this section.

For some assets, condition- or performance-based management information may not be available. In these cases, age-based or reactive management techniques can be useful. These approaches can still rely on network-level analysis to prioritize investment options based on available asset information, associated prioritization factors, and existing funding and resources.

Cross-Asset Resource Analysis

The output from asset-specific investment optimizations can be combined and analyzed to identify the optimal distribution of resources across asset and program areas. In this approach, a DOT must relate performance outcomes from individual asset programs to a common benefit or value (typically based on overarching agency goals and objectives). With these relationships established, trade-off analyses can be completed to optimize the total agency benefit or value based on asset-specific outcomes that have been modeled at various potential investment levels.

Important Terminology

The following terms are used within this section:
• Investment optimization, referring to analysis techniques applied to select ideal TAM investments for a given analysis horizon, objective, and set of constraints;
• Investment prioritization, referring to screening and ranking techniques used to establish TAM investment priorities; and
• Analysis parameters, referring to key inputs to agency asset management or investment optimization analysis, such as asset deterioration rates, treatment condition reset values, treatment unit costs, or analysis time horizons.

Associated Response Templates (Appendix D):
• D.2.a: Asset Performance Prediction
• D.2.b: Optimization/Prioritization
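The following sketch shows budget-constrained prioritization in its simplest form: candidate treatments are ranked by benefit-cost ratio and selected greedily until a funding constraint is reached. All candidates, scores, and costs are hypothetical, and the greedy heuristic is a stand-in for the formal optimization an asset management system would perform.

```python
# Hypothetical treatment candidates: name -> (benefit score, cost in $1,000s).
candidates = {
    "Seal coat, Route 7": (40, 120),
    "Mill and overlay, Route 3": (90, 600),
    "Bridge deck patch, BR-21": (55, 250),
    "Culvert lining, CU-88": (25, 80),
}
budget = 700  # analysis constraint: maximum funding limit ($1,000s)

# Rank by benefit-cost ratio, then select greedily within the budget.
ranked = sorted(candidates.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)

selected, spent = [], 0
for name, (benefit, cost) in ranked:
    if spent + cost <= budget:
        selected.append(name)
        spent += cost

print(f"Selected within ${budget}k: {selected} (total ${spent}k)")
```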

Area E: Act as Informed by Data

This area involves the application of data and information systems, processes, tools, and techniques to TAM decision-making, including performance targeting and prioritization programs, project-level scoping and design, as well as infrastructure and equipment maintenance practices.

Area Overview

Area E is organized into three distinct sections:
• E.1: Resource Allocation and Prioritization, addressing how data are applied within TAM performance targeting and project prioritization;
• E.2: Project Planning, Scoping, and Design, presenting how project planning and scoping, as well as design, are informed by data; and
• E.3: Maintenance, capturing how available data are incorporated into agency infrastructure and equipment maintenance practices.

Improvements in this area are aimed at better integrating the use of data and information within TAM processes, including network-level investment decision-making, project-level prioritization and development, and routine maintenance decisions.

Section E.1: Resource Allocation and Prioritization

Optimizing the allocation of scarce resources is one of the essential goals of a TAM program. As data and analysis methods improve, agencies can prioritize projects and allocate resources on a firmer, more defensible foundation.

Performance Targeting

Asset performance targets should be:
• Aligned with TAM goals, objectives, performance measures, and analysis;
• Achievable through existing practices; and
• Integrated meaningfully into actual TAM decision-making.

Project Prioritization

Project-level investment decision-making should be aligned with stated agency goals and objectives. A variety of techniques can be employed to support informed maintenance, rehabilitation, and replacement project selection. Examples include:
• Simple asset information summaries that expose trends in asset allocations, inventory, condition, and/or performance through business intelligence tools (ideally as formal performance dashboards). By promoting the visibility of performance outcomes, TAM practitioners and management will be encouraged to evaluate and improve existing project-selection practices.
• Network screening tools, which are used to evaluate available data (e.g., asset inventory, condition, utilization, risk, or other factors) to identify ideal TAM investments for individual assets or locations.
• Multi-objective decision analysis techniques that can be employed to objectively evaluate project costs in relation to the projects' anticipated benefits to agency goals or objectives. Evaluation results should be formally incorporated into decision-making through well-documented procedures that are routinely evaluated against agency priorities. (A simplified scoring sketch follows the "Conceptual Examples" text box below.)

Important Terminology

The following terms are used within the assessment and improvement identification materials associated with this section:
• Cross-asset metrics, which allow comparison and evaluation of agency performance across multiple asset areas and programs (e.g., in terms of asset need, value, or benefit);
• Cross-asset resource allocation, a technique by which potential investment strategies across multiple assets and/or program areas are evaluated to identify an investment program that best meets overarching agency priorities; and
• Multi-objective decision analysis (MODA), a decision-making process utilized to make the best decision given a complex set of competing criteria and priorities (e.g., as used by agencies in capital project selection). Utilizing an established objective hierarchy and a defined value function based on agency goals and objectives, the DOT completes detailed, project-level data collection and analysis to score potential projects and identify those with the highest returns on investment. These projects are then prioritized in programming of available funds.

Associated Response Templates (Appendix E):
• E.1.a: Performance Targeting
• E.1.b: Project Prioritization

Examples that illustrate the incorporation of performance targets into TAM decision-making are provided in the "Conceptual Examples" text box.

Conceptual Examples
Performance Target Incorporation into TAM Decision-Making

Agency Strategic Plans
TAM performance targets reflect agency TAM goals, objectives, measures, and targets; document funding expectations, key asset life-cycle practices, and roles and responsibilities; and raise TAM awareness and establish agency direction, priorities, and strategy.

TAM Resource Allocation and Budgeting
TAM performance targets should be aligned with resources by adjusting available resources and budgets or targeted performance as appropriate and necessary.

Performance Dashboards
TAM performance targets guide decision-makers as progress is reported and by supporting course corrections when needed. For this purpose, output measures (e.g., miles paved, bridges rehabilitated) may be correlated to outcome measures that are more difficult to monitor (e.g., percentage of good pavement, number of deficient bridges).

Continual Improvement
TAM performance targets include the identified roles and responsibilities of staff or contracted vendors who have accountability for achieving the targets. Targets are routinely evaluated to support continual improvement of TAM decision-making business processes.
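Returning to the multi-objective decision analysis technique described above, the following sketch applies a weighted value function across competing criteria and ranks projects by value per unit cost. The objectives, weights, scores, and costs are hypothetical placeholders, not a prescribed MODA formulation.

```python
# Hypothetical weights from an agency objective hierarchy (sum to 1.0).
weights = {"safety": 0.40, "condition": 0.35, "mobility": 0.25}

# Hypothetical project scores (0-10 per objective) and costs ($ millions).
projects = {
    "Project A": {"safety": 8, "condition": 5, "mobility": 6, "cost": 2.0},
    "Project B": {"safety": 5, "condition": 9, "mobility": 4, "cost": 1.2},
    "Project C": {"safety": 7, "condition": 6, "mobility": 8, "cost": 3.1},
}

def value_score(scores: dict) -> float:
    """Weighted value function across the competing objectives."""
    return sum(weights[obj] * scores[obj] for obj in weights)

# Rank by value per unit cost to inform programming of available funds.
ranked = sorted(projects, reverse=True,
                key=lambda p: value_score(projects[p]) / projects[p]["cost"])
print(ranked)
```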

Section E.2: Project Planning, Scoping, and Design

Asset inventory, condition, work history, and treatment recommendation data are used to support efficient project development, as well as the selection of optimal design features to meet maintenance and operational needs.

Project Planning and Scoping

DOT policies often recommend (or even require) that certain asset maintenance, repair, or replacement activities be incorporated within planned capital projects. These requirements often are based on existing field conditions; however, transportation agencies often have limited formal tools to support informed decision-making.

Typically, simple checklists are developed to support these processes. Agencies should seek to improve scoping by integrating available asset inventory, condition, and performance information directly into project planning.

Incorporation of Optimization and Prioritization Analysis

Transportation agency asset management systems are often used to conduct network-level optimization analysis of potential investment strategies or treatment options (further detailed in the support materials in this chapter for Area D).

Agencies are often challenged to meaningfully incorporate outcomes from network-level TAM optimization analysis into project-level decision-making. Policy, procedures, and tools are all necessary to overcome these challenges; however, they must be balanced by recognition of the reality that project-specific field conditions cannot be fully accounted for in network-level analysis. It is important that appropriate flexibility be offered to decision-makers in the field.

Field Performance Verification

Transportation agencies can monitor asset condition and performance after project delivery to validate actual versus predicted outcomes. If significant discrepancies are observed, analysis can help determine whether design or construction practices can be improved or whether asset performance models should be adjusted. An approach of continual monitoring and improvement benefits design and construction practices, project-level decision-making, and network-level TAM analysis.

Examples that illustrate the incorporation of TAM analysis in project-level decision-making are provided in the "Conceptual Examples" text box.

Conceptual Examples
TAM Analysis Incorporation in Project-Level Decision-Making

Treatment Selection Screening
Prescriptive decision-analysis techniques can be applied to individual assets or potential investment locations to establish acceptable treatment categories. These techniques are very useful in preventive maintenance scoping, where certain field conditions may be known to result in low performance benefits (e.g., application of preventive maintenance seal coats to pavements exhibiting fatigue or "alligator" cracking).

Network-Level "Best Mix of Fixes"
Predictive modeling and analytics can provide optimized TAM investment strategies. Rather than applying modeled outcomes directly to specified locations, it is useful to aggregate outcomes by treatment or activity type. Then, the agency can communicate the investment targets to field decision-makers and allow field selection of the specific locations and detailed TAM treatments or activities. This approach balances the optimal strategies with the field realities that are not accounted for in the network-level analysis. Predictive modeling and analytics also can be paired with treatment selection screening to ensure that the field-selected treatments are appropriate to the specific locations selected.
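A minimal sketch of treatment selection screening follows: simple prescriptive rules reject or redirect treatment candidates whose field conditions are known to yield low benefits. The thresholds, field names, and rule set are hypothetical illustrations only.

```python
def screen_treatment(candidate: dict) -> str:
    """Apply prescriptive screening rules to a treatment candidate."""
    # Seal coats add little value on pavements with significant fatigue cracking.
    if candidate["treatment"] == "seal_coat" and candidate["fatigue_cracking_pct"] > 10:
        return "reject: fatigue cracking too severe for preventive maintenance"
    # Very poor condition suggests a heavier treatment category is warranted.
    if candidate["condition_index"] < 40:
        return "refer: candidate for rehabilitation or reconstruction"
    return "accept"

print(screen_treatment(
    {"treatment": "seal_coat", "fatigue_cracking_pct": 22, "condition_index": 55}))
```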

Important Terminology

The following terms are used within this section:
• Project scoping templates, developed for common project types, which can be pre-populated with TAM analysis outcomes and asset inventory and condition information as the basis for field project scoping. These templates provide efficiencies in scoping activities and encourage investment decisions aligned with TAM priorities; and
• Evidence-based design and construction, meaning the use of a scientific methodology and statistical techniques to evaluate project design decisions and construction practices to achieve the best possible outcomes. Evidence-based design and construction can be useful to TAM programs in identifying changes to design standards and processes that support improved asset management and operations outcomes.

Associated Response Templates (Appendix E):
• E.2.a: Data-Driven Project Planning and Scoping
• E.2.b: Data-Driven Project Design

Section E.3: Maintenance

Asset life-cycle modeling techniques can be used to develop effective routine, preventive, and reactive maintenance programs for transportation infrastructure and equipment.

Common Maintenance Practices

Most agency equipment and many transportation assets have well-documented standards for routine and preventive maintenance. Equipment manufacturers recommend regular preventive maintenance cycles (e.g., oil changes or tune-ups), and many DOTs, other transportation asset owners, and industry representatives have recommendations for when various transportation assets should be inspected or receive preventive care.

These practices can be incorporated into DOT life-cycle models and maintenance programs. By investing in these activities, even following a defined, interval-based methodology, the DOT can expect to generate long-term savings through improved performance and extended service life.

Examples that promote awareness of standard operating procedures, targeted communication and outreach, and performance targets are provided in the "Conceptual Examples" text box.

Conceptual Examples
Promoting Awareness

Standard Operating Procedures
Agencies can identify targeted routine and preventive maintenance activities and document clear standard operating procedures for maintenance staff. It is advisable to share this documentation in an easily accessible location and to advertise its availability. For complex activities, agencies can consider establishing formal training courses to ensure proper application.

Targeted Communication and Outreach
DOTs can organize regular meetings with field maintenance staff to promote awareness of preventive and routine maintenance expectations and opportunities, and to share anticipated outcomes and benefits of these practices.

Performance Targets
Using life-cycle modeling, interval-based methods, or available funding, DOTs can establish and track preventive maintenance targets. It is important to validate that maintenance efforts are being targeted to appropriate candidate assets or equipment (as preventive maintenance is typically only useful to extend the life of an asset that is still in good condition).

Automated Work Ordering

To the extent that asset management and work ordering systems can be integrated, agencies can implement tools to automatically screen the asset inventory for routine, preventive, and reactive maintenance candidates. When assets have been identified, the integrated system can automatically generate and assign work orders to trigger the necessary maintenance activities. (A simplified sketch of this screening logic follows the terminology below.)

Important Terminology

The following terms are used within this section:
• Automated work ordering, an automated process that generates maintenance work orders, typically based on asset use, age, maintenance or work logs, inspection results, observed defects, or condition ratings;
• Preventive maintenance, meaning programs or activities that employ a network-level, long-term strategy that enhances asset performance or extends asset life through a set of proactive, cost-effective practices; and
• Routine maintenance, meaning recurring maintenance activities that are regularly employed over the life of an asset (e.g., cyclical inspection, servicing, or replacement of components).

Associated Response Templates (Appendix E):
• E.3.a: Infrastructure Maintenance
• E.3.b: Equipment Maintenance
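The following minimal sketch illustrates interval-based automated work ordering: the inventory is screened for assets past their service interval, and a work order record is generated for each. The asset identifiers, dates, and intervals are hypothetical.

```python
from datetime import date, timedelta

# Hypothetical inventory with last-service dates and service intervals (days).
inventory = [
    {"asset_id": "SIG-004", "last_service": date(2020, 5, 1), "interval_days": 365},
    {"asset_id": "CUL-117", "last_service": date(2021, 2, 15), "interval_days": 730},
]

def generate_work_orders(inventory, today):
    """Screen the inventory and emit preventive-maintenance work orders."""
    orders = []
    for asset in inventory:
        due = asset["last_service"] + timedelta(days=asset["interval_days"])
        if today >= due:
            orders.append({
                "asset_id": asset["asset_id"],
                "due_since": due.isoformat(),
                "activity": "preventive maintenance",
            })
    return orders

print(generate_work_orders(inventory, today=date(2021, 6, 1)))
```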

CHAPTER 4

Evaluation and Summary of Results

This chapter provides guidance for evaluating candidate improvements and communicating information about recommended improvements to build support and secure resources for implementation.

Practice Summary, Improvement Evaluation, and Result Communication

At this stage of the process, the agency will have a list of candidate improvements to close gaps between where the agency is now and where it wants to be.

The next important step is to evaluate the candidate improvements. The purpose of this evaluation is to set priorities and build an understanding of the likely implementation challenges the DOT will face.

The result of this step is a summary of the gaps to be closed and the recommended improvements for closing them. These results can be developed into communication materials that make a case for resourcing improvements.

Current and Desired State Summary

Users of the guidebook will establish the current and desired state of practice (in relation to the benchmark levels) for each assessed area, section, and element of the framework. This step provides a clear picture of where gaps exist in current practice, exposing opportunities for potential improvement.

Element-level response templates are provided in the printed guidebook and can be used to complete a pen-and-paper assessment; however, use of the TAM Data Assistant can facilitate easier summary and communication of the assessment results.

A visual summary and presentation of current and desired practice benchmarking will be the most effective means of communicating assessment outcomes. "Spider web" or "radar" charts are best used for this communication (see Figure I-8). Given the number of individual elements, individual summary charts should be developed for each assessed area within the guidance framework. These charts will provide a compelling visual representation of where current performance is high or low, and where there are gaps between current and desired performance. Using these charts will clearly identify priorities for advancement and will support improvement evaluation and prioritization.
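For readers working outside the TAM Data Assistant, the following sketch builds a comparable radar chart from assessment data using Matplotlib. The element identifiers and benchmark levels shown are hypothetical.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical current and desired benchmark levels (0-4) for five elements.
elements = ["B.1.a", "B.1.b", "B.2.a", "B.2.b", "B.3.a"]
current = [1, 2, 1, 3, 2]
desired = [3, 3, 2, 4, 4]

# Compute one angle per element and close the polygon.
angles = np.linspace(0, 2 * np.pi, len(elements), endpoint=False).tolist()
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
for values, label in [(current, "Current"), (desired, "Desired")]:
    closed = values + values[:1]
    ax.plot(angles, closed, label=label)
    ax.fill(angles, closed, alpha=0.15)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(elements)
ax.set_ylim(0, 4)
ax.legend(loc="lower right")
plt.savefig("assessment_radar.png")
```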

TAM Data Assistant

The TAM Data Assistant simplifies the summary of assessment outcomes by automatically generating these charts from the detailed assessment data.

TAM Data Assistant Reference Materials: general uses (Chapter 2); assessment facilitation uses (Appendix H); User Quick Reference Guide (Appendix I).

Additional Recommendations

Preparing a summary and review of assessment results can generate new insights from the assessment team and allow for broader engagement and input beyond the observations of the individuals who were involved in the initial assessment process.

The assessment summary materials proposed in this guidebook can be used to iteratively refine the assessment details and generate more meaningful assessment results and improvement priorities.

Explanation of Recommended Summary Charting

Figure I-8 exemplifies the recommended approach to visualizing the current and desired state captured through the assessment process. Four key elements of this visualization are:
1. The "spider web" or "radar" chart itself, including each assessed element within the area, organized by section, and representing each possible level of performance (from Benchmark Level 0 to Benchmark Level 4);
2. The current performance (highlighted in blue in Figure I-8), which is provided for each assessed element within the targeted area;
3. The desired performance (highlighted in green), which is provided for each assessed element within the targeted area; and
4. The element identifier and name for each assessment element represented in the summary chart.

[Figure I-8. TAM Data Assistant assessment summary example.]

Use of Recommended Summary Charting

• Identification of Low- and High-Performing Sections and Elements. In Figure I-8, governance and metadata practices are easily identified as low-performing practices, whereas treatment and work data standards are relatively high-performing practices. Low-performing practices may become obstacles to ongoing advancement and may need to be prioritized for improvement, even if these capabilities are not specifically an area of focus for the agency. Referring again to Figure I-8, without advancing governance and metadata capabilities, the ability to efficiently and effectively collect, integrate, or analyze TAM data may be compromised due to lack of understanding of, and compliance with, data standards as business needs and practices change.
• Gaps in Current and Desired Performance. In Figure I-8, all assessed elements show a gap between current and desired performance; however, certain elements have larger gaps than others. Governance elements typically are two levels lower than desired, and implementing improvements will require significant investment and potentially face substantial institutional hurdles and organizational challenges. Based on this summary, a long-term governance implementation initiative could be considered. Communication to decision-makers could highlight the significant gap between current practices and the desired state of practice, as well as the value and benefits of investment in advancing governance practice.

Detailed Analysis

Detailed assessment data can be exported from the TAM Data Assistant to an Excel spreadsheet (an export file). The export file can be used to readily list, filter, sort, and apply calculations that may be helpful in communicating the current practice (current benchmark level), desired state of practice (target benchmark level), or practice gaps. The raw data in the export file could potentially be used to create a "radar" or "spider web" chart, but the TAM Data Assistant does this automatically for each framework area.

The assessment information can also be combined with detailed improvement evaluation outcomes (also included within the export file) to relate current and desired practice to individual improvement opportunities (as is discussed under "Improvement Evaluation").

TAM Data Assistant Quick Reference Guide

Detailed information on the functions and use of the TAM Data Assistant can be found in the TAM Data Assistant Quick Reference Guide provided in Part III, Appendix I.

Improvement Evaluation

After candidate improvements have been identified, the next step is to evaluate them, recognize the effort that will be required versus the likely payoff for executing them, and anticipate any implementation challenges. This evaluation step is important for setting priorities and developing a comprehensive improvement strategy.

The TAM Data Assistant allows users to sort, filter, and review a list of improvements that have been identified during the assessment process. Through this interface, users can track evaluation results based on the criteria described in the next section of this guidebook.

Each candidate improvement should be evaluated in the context of other selected improvements. This evaluation allows the relative impact, effort, and priority of each improvement to be established (as high, medium, or low) with respect to the other identified options.

Improvement-specific challenges can also be identified for consideration during strategy development. For example:
• Impact is characterized by the extent to which new or existing practices will transform TAM-related business practices;
• Effort is characterized by the level of resources and staff time required and the extent to which those can be incorporated into the responsibilities and budgets of existing business units;
• Priority is established on the basis of when the improvement would be targeted for implementation, ranging from immediate action to being recognized for future, as-yet unplanned action; and
• Challenges can be grouped into distinct categories (e.g., time, resource, expertise, coordination, change, or other).

Applications of these evaluation factors are illustrated in the "Conceptual Examples" text box.

Conceptual Examples
Evaluations of Proposed Improvements

Impact Evaluation
• High Impact: Transforms current business in a way that addresses major process pain points, is likely to extend to multiple business units, and adds value to multiple business processes.
• Medium Impact: Makes existing business processes significantly more efficient and effective; however, may be within a limited area of business (e.g., a specific business function or process area).
• Low Impact: Contributes a minor adjustment to an existing business process but will not significantly change the business; may already exist informally but is being formalized or clarified in the context of the program at large.

Effort Evaluation
• High Effort: Requires a major commitment of resources and staff time, typically across multiple business units (e.g., a major IT application, a statewide technology deployment).
• Medium Effort: May be incorporated within typical budgets and resources but would require planning and coordination, typically limited to a specific business function or process area.
• Low Effort: Can be included within routine responsibilities of a business unit or working group; typically can be completed within a short timeframe.

Priority Evaluation
• High Priority: Targeted for immediate action.
• Medium Priority: Intended to begin within the next several investment or planning cycles (e.g., 1–2 years).
• Low Priority: Recognized, but not anticipated for action within the near future; unlikely to be incorporated into near-term planning activities.

Conceptual Examples
Challenge Categorization

• Time: Recommended when the time available is limited for the extent of the effort.
• Resources: Recommended when the level of resources or staff time would require executive approval.
• Expertise: Recommended when the expertise required is not available to the DOT without specialized support.
• Coordination: Recommended when engagement and agreement are required across many different areas of business within the DOT, particularly when many of the impacted business units do not typically work together as part of the routine business of the agency.
• Change: Recommended when the improvement will significantly transform current business across multiple business units and processes, requiring extensive process reengineering and/or training for those impacted.

TAM Data Assistant and Improvement Evaluation

The TAM Data Assistant provides functionality for recording ratings of impact, effort, priority, and challenges for each selected candidate improvement. Additional information can be found in Part III, Appendix I.

Additional Recommendations

An iterative approach to improvement evaluation is recommended. To the extent practical, this process should also involve external stakeholders and external planning processes. For example, the goals and objectives stated in an agency's strategic plans should be incorporated into the prioritization of improvement actions. The availability, workload, and resources of affected business units should also be considered, as well as the engagement and enthusiasm for change found in potential project sponsors and business leads. Without stakeholder engagement, it is unlikely that a data or information system improvement will be successfully and sustainably implemented within routine business.

Improvement Evaluation Tools

Figure I-9 demonstrates the TAM Data Assistant functionality supporting improvement evaluation. The five key aspects of the evaluation organized by this interface are:
1. Sort and display functionality, to organize improvements identified during the self-assessment process;
2. Filter functionality, to apply criteria to filter the improvements based on area, challenge, priority, effort, impact, and other factors;
3. Individual improvement details, to highlight details for each selected improvement;
4. Evaluation criteria, to establish the improvement's impact versus effort, priority, and associated challenges; and
5. Assessment information, to review the current and desired state of the associated element and provide a link to quickly return to, and adjust, the associated assessment information.

[Figure I-9. Using the TAM Data Assistant to evaluate selected improvements.]

TAM Data Assistant reference materials: overview (Chapter 2); improvement evaluation (Chapter 4); executive communication (Chapter 4); facilitator materials (Appendix H); Quick Reference Guide (Appendix I).

Detailed Analysis

Detailed improvement data can be exported from the TAM Data Assistant to an Excel spreadsheet. One of the worksheets within the export will include each selected improvement and other potential improvements. This spreadsheet file should be used for any external analysis. If the TAM Data Assistant is not used, a similar spreadsheet could be developed based on the information presented in this guidebook. The export file will contain the following information:
• Element ID: A unique identification number assigned to the associated element in the detailed technical framework. This unique identifier allows the improvement information to be joined to the assessment details;
• Improvement Description: The detailed descriptive language for the assessed element;
• Priority: The low-, medium-, or high-priority value assigned to the improvement;
• Impact: The low-, medium-, or high-impact value assigned to the improvement;
• Effort: The low-, medium-, or high-effort value assigned to the improvement;
• Time Challenge: An indicator of whether a time challenge was identified for the improvement (0 if no challenge was identified, 1 if a challenge was identified);
• Resource Challenge: An indicator of whether a resource challenge was identified for the improvement (0 if no challenge was identified, 1 if a challenge was identified);
• Expertise Challenge: An indicator of whether an expertise challenge was identified for the improvement (0 if no challenge was identified, 1 if a challenge was identified);
• Coordination Challenge: An indicator of whether a coordination challenge was identified for the improvement (0 if no challenge was identified, 1 if a challenge was identified);
• Change Challenge: An indicator of whether a change challenge was identified for the improvement (0 if no challenge was identified, 1 if a challenge was identified);

• Other Challenge: An indicator of whether another type of challenge was identified for the improvement (0 if no challenge was identified, 1 if a challenge was identified);
• Status: An indicator of whether the improvement was or was not selected for implementation; and
• Evaluation Notes: Improvement notes captured during the self-assessment activity.

The export file can be used to readily list, filter, sort, and apply calculations that may be helpful in communicating the priorities for improvement. By joining these results with the detailed assessment information, the user can further refine the priorities for improvement. The "Conceptual Examples" text box in this section illustrates a variety of ways in which filters and sorting can be used in the export file.

Conceptual Examples
Detailed Result Evaluation

High-Impact, Low-Effort Improvements
The agency can filter for high-impact, low-effort improvements. Consider improvement opportunities that deliver significant value without substantial effort. Where practical for immediate investment, communicate these "low-hanging fruit" to decision-makers as easy wins.

Combine Assessment and Improvement Information
Assessment and improvement information can be combined using the Element ID field. Use this approach to improve communication of improvement priorities by also relating current or desired performance.

Improvement of Low-Performing Elements
The combined assessment and improvement information can be used to identify improvements for low-performing elements. By sorting on the "Assessment Current Level" field and ordering the results from lowest to highest value, improvements associated with the lowest-performing elements will be moved to the top of the list. A low-performing element may not always stand on its own as a priority of the organization, but it may be relevant given the interrelated nature of performance within the framework. Lagging performance in one aspect can have an impact on the ability to be successful in other areas.

Improvement of Elements with Large Performance Gaps
The combined assessment and improvement information can be used to identify improvements for elements with large gaps between current and desired performance. Calculate the difference between the "Assessment Desired Level" and "Assessment Current Level" fields, and sort on the resulting value from largest to smallest. Improvements associated with the largest performance gaps will now be found at the top of the file. Consider how initial improvements in these areas should be prioritized, given that multiple improvements over an extended period will likely need to be implemented.
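The filter, join, and gap calculations described above can be scripted directly against the export file. The following sketch uses pandas; the worksheet names and exact column labels are assumptions based on the field list above and may differ in an actual export.

```python
import pandas as pd

# Load the hypothetical export worksheets (names assumed for this sketch).
improvements = pd.read_excel("tam_export.xlsx", sheet_name="Improvements")
assessment = pd.read_excel("tam_export.xlsx", sheet_name="Assessment")

# Filter for "low-hanging fruit": high impact, low effort.
easy_wins = improvements[
    (improvements["Impact"] == "High") & (improvements["Effort"] == "Low")
]

# Join to assessment details on Element ID and compute the performance gap.
combined = easy_wins.merge(assessment, on="Element ID")
combined["Gap"] = (
    combined["Assessment Desired Level"] - combined["Assessment Current Level"]
)

# Largest gaps first, to surface priority candidates.
print(combined.sort_values("Gap", ascending=False).head())
```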

Executive Communication

Securing support to implement improvements requires clear, concise communication about the current state of DOT practices, the desired state of those practices, key performance gaps, and which improvements have priority. The assessment facilitator, project sponsor, and other key team members should be involved in the development of executive communication materials.

Digital Output and Uses

Radar charts, individual improvement evaluation data entries, and summary improvement "impact versus effort" charts can be selected and used directly in briefing materials and other executive communication materials that are designed to speak to the specific needs and interests of the targeted decision-makers.

Detailed export output should be used as the basis for any nonstandard communication materials. This approach will ensure that the materials prepared are easily maintained or updated should the assessment results be revisited at a future date.

Recommendations for effective executive communication include the following:
1. Present the assessment focus and context, emphasizing the motivation, the desired value in selecting the focus, and the cross-functional nature of the assessment team;
2. Communicate the current and desired state quickly, demonstrating where performance is low, where it is high, and where improvement is most necessary, and providing practical examples of the impacts that low performance is having on current TAM business;
3. Share a clear set of implementation priorities that address gaps in current practices, emphasizing that these are the agreed-upon priorities of the cross-functional team; and
4. Acknowledge challenges that will be faced, outlining organizational practices and real-world case studies that will support successful implementation.

The TAM Data Assistant and Executive Communication

Summary materials generated from the TAM Data Assistant or the downloadable export file can be used by evaluation teams to prepare executive communication materials. Additional information on the tool's Results Page, Excel Report, All Assessments Page, and Current Assessment Page can be found in the TAM Data Assistant Quick Reference Guide in Part III, Appendix I.

Implementation support materials: organizational practices (Chapter 5); case studies (Chapter 5).

CHAPTER 5
Implementation Support

This chapter references supplemental materials, provided in the guidebook appendices, that can be used to support implementation planning for data and information improvements. These materials provide background on organizational practices as well as DOT case studies highlighting implementation.

Organizational Practices

Making meaningful changes to how data are managed, shared, and used within and across a DOT TAM program requires much more than procuring new tools and technologies. Agencies must ensure that they have the necessary workforce capabilities to successfully use and integrate new technologies—and that they can adapt to new processes for creating and using information.

Four key organizational practices can be employed to support the implementation of TAM data and information improvements:
• Strategic management,
• Initiative management,
• Talent management, and
• Knowledge management.

Part III, Appendix F provides additional guidance for each of these practices, identifying typical strategies and documentation that can be employed to apply them within a DOT TAM program.

Organizational Practice Use
Large organizations like DOTs will face institutional challenges to sustained improvement. Many of these challenges can be addressed through the deliberate application of the identified organizational practices.

Appendix F Strategies by Challenge
• Time and Resource Challenges: Strategic Planning, Strategic Governance, Portfolio Management
• Expertise Challenges: Workforce Planning, Employee Development, Succession Management, Knowledge Capture and Dissemination
• Coordination Challenges: Collaboration and Peer-to-Peer Learning
• Change Challenges: Organizational Change Management, Performance Management, Enterprise Architecture

Implementation Challenges: Overview

Improvement of DOT data and information systems and related TAM business practices requires:
• Time and resources for technical work and review and revision cycles;
• Expertise as needed to create workable standards and facilitate review and negotiation processes;

• Coordination to reach agreement among different business and IT stakeholders, and potentially to create and manage agreements with outside vendors and partner agencies; and
• Changes to data collection processes, IT systems, and business processes for collecting, entering, reporting, and using data.

Assembling the needed time, resources, and expertise and navigating an agency’s coordination and change management needs can be more challenging than conducting the actual technical work for improvement. This section of the guidebook highlights some organizational strategies that can be used to overcome these challenges.

DOTs can address time and resource challenges through:
• Strategic planning, by increasing agency direction and support for data- and information-system-related improvements and initiatives;
• Governance, by establishing decision-making structures and prioritized investment in activities to develop enterprise data standards; and
• Portfolio management, by offering techniques to identify and advocate for the business value and return on investment from data and information system and TAM investments.

Expertise challenges can be addressed through:
• Workforce planning and employee development, by providing techniques to identify job skills and develop the associated recruitment and training as necessary to build the specialized technical skills (e.g., data modeling) and soft skills (e.g., group facilitation and engagement) needed to develop and implement meaningful data and information system practices; and
• Knowledge capture and dissemination, by establishing a common resource base for individuals involved in data- and information-system-related efforts and TAM business processes.

Coordination challenges can be addressed through:
• Collaboration and peer-to-peer learning, by creating opportunities for collaboration between business and IT professionals and creating cross-functional work groups to focus on data- and information-system-related improvements.

Change challenges can be addressed through:
• Change management, by developing approaches to communicate the purpose of, build awareness of, and facilitate the adoption of new data standards, information systems, and related TAM business practices by individual staff, business units, and the organization at large;
• Performance management, by setting objectives and performance measures that promote awareness of and compliance with new, data-informed business practices; and
• Enterprise architecture, by establishing a reference for existing or proposed business processes.

Organizational Practice Guidance

Detailed information on the four organizational practice areas can be found in Part III, Appendix F. For each practice area, the appendix content provides a brief practice overview, describes typical strategies, and addresses how they can be applied to specific implementation challenges (see the text box titled “Appendix F: Detailed Organizational Practices”). Appendix F also provides additional external references for further examination.

Case Studies

The case studies provided in Part III, Appendix G, offer practical examples of real projects completed by state DOTs that can serve as best-practice references. These references can be used in conjunction with improvement recommendations to support projects and initiatives to enhance data management maturity in accordance with this guidebook. The format and content of the case studies are discussed in the text box titled “Appendix G: Case Studies.”

Case Studies: Overview
Case study selection was guided by an understanding of some of the more challenging and progressive areas of the guidance in this report. To help with focus, each of the case studies presented has been aligned to an assessment area; however, any single case study could potentially cover more than one area or element.

Appendix F: Detailed Organizational Practices

Appendix F offers detailed information and resources to assist transportation agencies as they address implementation challenges using strategic management, initiative management, talent management, and knowledge management.

Opening Pages (Overview)
For each organizational practice, the opening pages provide the essential concepts pertaining to the practice and recognize each of the typical strategies that are used with that practice.

Typical Strategy Details
Individual strategies are identified and documented within each organizational practice area. These strategies are not meant to be comprehensive; they are identified because they address institutional and organizational challenges that DOTs may face as they advance the data- and information-system-related practices supporting their TAM programs. For each strategy, specific but high-level guidance is shared relating to its execution at the DOT. This approach is intended to provide a base understanding of how the DOT can pursue application of the strategy to address identified challenges.

References
Given the focus of this guidebook, it is not practical to provide comprehensive guidance for application of each practice area, even within the specific context of supporting DOT TAM programs. However, recognizing the critical role these organizational capabilities will play in sustained improvement in DOT practices, Appendix F provides additional, external reference materials for each organizational practice.

Implementation Support: Case Studies (Appendix G)

Appendix G: Case Studies

The case studies presented in Appendix G are based on real projects completed by state DOTs that can serve as best-practice references. The interpretation and application of the improvement recommendations will vary among DOTs based on size, organizational structure, leadership objectives, and other factors. However, by reviewing how improvement recommendations have aligned with real project examples and various areas of the assessment, DOTs will be able to see how other agencies have approached similar challenges, how those challenges were addressed, and how desired outcomes were achieved.

Each case study is provided in a consistent format. This format gives the reader a clear, concise description of why the project was undertaken, the approach applied, the value delivered, and the key challenges faced. Supporting graphics are included with each case study to provide visual context in the form of charts, workflows, screen captures, or other artifacts.

Motivation
The motivation section of the case study is designed to create a relatable position for why a DOT would undertake such a project. The goal of the motivation description is to help the reader identify with the originating challenge or opportunity and relate it to a similar challenge or opportunity within their own organization.

Approach
The approach is intended to provide a high-level walk-through of the key steps the DOT took to execute the project or initiative. Specific steps, actions, tactics, and engagement strategies employed by the DOT are detailed as applicable.

Value Delivered
In this section, the outcome of the project or initiative is described in qualitative or quantitative fashion. From the outcome value information in the case study, readers can infer similar outcome value propositions for improvements that they are considering.

Key Challenges Faced
Each case study highlights important organizational requirements and challenges faced during implementation. Each case study categorizes these challenges by time, resources, expertise, coordination, and change.

Supporting Graphics and Content
To bring the project or initiative to life, select images are provided in each case study to support the text. Depending on the project, the images include photographs, screen shots of applications, charts, or other representative graphics to help illustrate motivation, approach, value, or challenges.

For each case study, a brief overview has been provided in the remaining sections of this chapter. Each overview presents the assessment area, section, and/or element references identified for the case study, together with a brief description of how the case study provides a useful example of practice and how it is linked to the assessment and improvement framework. The detailed case study materials are provided in Part III, Appendix G.

Ohio DOT: Establishing a Governance Framework

Area A: Specify and Standardize Data
Section A.5: Governance Standards
Elements: A.5.a–A.5.d (All)
Project: Establishing and Applying a Data Governance Framework

Description: This project illustrates the criticality of stewardship and formal oversight for data standards within an organization. The case study reveals the necessity of engaging across all levels of the organization to ensure that there is investment to provide a comprehensive, sustainable governance structure established by policy. This case study demonstrates how a specific DOT could advance governance elements from Benchmark Level 1 or 2 to Benchmark Level 3 by implementing improvements for stewardship roles and governance structures, data management maturity self-assessments, and data and integration through process mapping.

Utah DOT: Statewide Vehicle-Based Data Collection

Area B: Collect Data
Section B.1: Inventory, Condition, and Performance Data Collection
Elements: B.1.a, B.2.a, and B.3.a (Coverage) and B.1.b, B.2.b, and B.3.b (Automation)
Project: Statewide Mobile LiDAR Data Collection

Description: This project demonstrates establishing enterprise standards and driving consistency so that statewide inventory can be collected uniformly. It also illustrates the need for careful analysis to determine the value of data collection and guide investment decisions on how much data to collect, as well as the importance of automating processing steps to create efficiencies when dealing with large datasets. This case study demonstrates how a specific DOT could advance these elements from Benchmark Level 2 to Benchmark Level 3 by implementing improvements for manual data collection automation and collection tools and methods consolidation.

Colorado DOT: DQMP Development

Area B: Collect Data
Section B.1: Inventory, Condition, and Performance Data Collection
Elements B.1.c, B.2.c, and B.3.c: Quality
Project: Pavement DQMP

Description: This project showcases the ability to leverage federal requirements as an impetus to address a larger and more complex issue. Additionally, this project reveals the importance of change management and careful attention to understanding:
• The business process,
• What will change, and
• How the proposed change will affect the stakeholders.

The training and certification aspects of the case study illustrate a method to support sustained change. This case study demonstrates how a specific DOT could advance these elements from Benchmark Level 1 to Benchmark Level 2 or 3 by implementing improvements for a Data Quality Collection Plan.

Virginia DOT: Mobile Field Data Collection Implementation

Area B: Collect Data
Sections B.1, B.2, and B.3:
• B.1: Inventory, Condition, and Performance Collection
• B.2: Project Information Collection
• B.3: Maintenance Information Collection
Elements B.1.c, B.2.c, and B.3.c: Quality
Project: Mobile Field Data Collection of Maintenance Work Accomplishments

Description: This project showcases the value of defining data collection standards and data capture strategies that allow for consistency in field data collection. These standards and strategies are foundational to the development of mobile field data collection tools and downstream analysis tasks. Also highlighted in this project is the cost-benefit analysis that must be made for software customization decisions. This case study demonstrates how a specific DOT could advance these elements from Benchmark Level 3 to Benchmark Level 4 by implementing improvements for automated data quality collection audits.

Utah DOT: Mobile LiDAR and BIM/CADD Integration

Area C: Store, Integrate, and Access Data
Section C.2: Asset Life-Cycle Data Integration Workflows
Element C.2.c: Project Development to Project Delivery
Project: Integration of 3D Modeling Data to Support Asset Management

Description: This project exemplifies the value of carrying asset inventory and condition data into the project delivery phase. It also shows that investments in one asset life-cycle stage can pay dividends in another: the additional returns on investment from taking the larger life-cycle viewpoint can help justify new data and digitalization projects. This case study demonstrates how a specific DOT could advance this element from Benchmark Level 3 to Benchmark Level 4 by implementing improvements for automation of asset life-cycle data transfers.

Ohio DOT: Multi-Objective Project Prioritization Program Implementation

Area E: Act as Informed by Data
Section E.2: Project Planning, Scoping, and Design

Element E.2.a: Data-Driven Project Planning and Scoping
Project: Transportation Asset Management Decision-Support Tool (TAMDST)

Description: This project illustrates the accumulated value and derived benefits from normalizing ratings and metrics to support cross-asset planning. It further demonstrates the value of dashboards and visualization techniques both to support decision-making and to make prioritization decisions defensible. This case study demonstrates how a specific DOT could advance this element from Benchmark Level 2 to Benchmark Level 3 by implementing improvements for network-level performance monitoring programs.
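Cross-asset planning of the kind TAMDST supports depends on putting dissimilar ratings onto a common scale before they are compared or combined. The fragment below is a generic min-max normalization sketch in Python; it is not Ohio DOT’s actual TAMDST method, and the metric names, scale endpoints, and weights are invented for illustration.

# Generic min-max normalization onto a common 0-100 scale so that metrics
# measured in different units can be compared and combined across assets.
# Illustrative only -- not the TAMDST algorithm; all values are invented.

def normalize(value, worst, best):
    """Map a raw metric onto 0-100, where 100 is best. Works whether the
    raw scale improves upward (best > worst) or downward (best < worst)."""
    return 100.0 * (value - worst) / (best - worst)

# Hypothetical raw scores for one candidate project.
pavement_score = normalize(95.0, worst=170.0, best=60.0)  # IRI: lower is better
bridge_score = normalize(6.0, worst=3.0, best=9.0)        # 0-9 rating: higher is better

# A weighted composite makes cross-asset prioritization explicit and defensible.
weights = {"pavement": 0.6, "bridge": 0.4}
composite = weights["pavement"] * pavement_score + weights["bridge"] * bridge_score
print(f"Composite priority score: {composite:.1f}")  # prints 60.9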
