Suggested Citation:"Reporting." National Academies of Sciences, Engineering, and Medicine. 2019. Management and Use of Data for Transportation Performance Management: Guide for Practitioners. Washington, DC: The National Academies Press. doi: 10.17226/25462.

Reporting

Introduction • Foundation • Reporting • Insight • Cases

Step 3 Store & Manage Data

This step includes validating, cleaning, normalizing, aggregating, and integrating data; storing the data in one or more repositories, either within the agency or "in the cloud"; producing documentation needed for both technical and business users of the data; and managing access to the data, both to protect it from unauthorized use and to ensure that it is accessible to those who need it. This step also includes activities to design, develop, and manage databases and technical infrastructure for data storage and data integration.

The key decisions that agencies must make are:
• where and how to store data,
• how to make sure data can be integrated across repositories as needed,
• which best practices should be implemented for QA and documentation, and
• how much data to keep.

"Data is just like crude. It's valuable, but if unrefined it cannot really be used."
Michael Palmer

Step 3.1 Establish Databases

Design databases to support analysis needs. Performance measures rely on a deep archive of data to develop an accurate baseline; understand multi-year and seasonal trends; and establish reasonable targets. Database design supporting performance measures should consider requirements for reporting, trend analysis, and root cause analysis. Design should also consider the possibility that requirements may change over time; for example, an agency may decide to calculate different metrics drawing on the same raw data sources. Therefore, both raw and transformed data may need to be stored. When raw data is voluminous (for example, pavement images), processed data can be maintained in active storage and the raw data can be kept in lower-cost archive storage.

Determine data retention policies. If retention policies are not modernized to reflect changes in storage costs, or if they are set without a full understanding of business needs, there is a danger of losing valuable data and TPM capability. Ten or more years ago, data storage hardware was both physically large and expensive, so agencies implemented retention policies that limited the amount and duration of storage in order to manage budgets and constrained physical space in data centers. Both the size and cost of storage have dropped dramatically since then. Given these cost savings and the storage options now available, agencies can re-examine their retention policies to make sure they align with business needs. For example:

• An agency may be required to report system performance to the federal or state government in 15-minute intervals. That agency may be tempted to aggregate raw data arriving in 1-minute intervals and retain only the aggregate information to save space. Later, the agency may identify a need to track incident management performance metrics, which requires the original 1-minute data that captures growing and shrinking queue lengths, user delay, arterial signal performance, and the effects of secondary incidents. If the 1-minute data are gone, the agency may be unable to track those metrics accurately, if at all.

• A data set may have limited value by itself and be considered unimportant for retention. However, when combined with other data sets, it may provide new and important performance measurement insights that neither data set could provide in isolation.

Case G: MARC repurposed two open positions, a "GIS specialist" and a "demographer," into "data developer" positions capable of creating and managing systematic workflows for data gathering and organization. The developers have since created automated processes to obtain data sets and import them into SQL databases. The databases have front-end interfaces that greatly simplify the process of querying them to extract the information MARC needs.

Data storage architectures can be designed to consider both current and future TPM data requirements. For example, cleaned and processed data that support frequent and/or immediate TPM needs can be stored on faster, more accessible servers. Raw data used to generate those measures and calculations can be stored on lower-end servers or storage arrays, locally or with a cloud-based back-up provider. Agency TPM personnel should work collaboratively with information technology (IT) departments and records managers within the agency to draft more modern retention policies.

Plan for both "big" and "small" data. As the concept of "big data" becomes more prevalent and hyped, data proponents and decision makers risk focusing too much on big data and overlooking the value of "small data." For example, in an effort to utilize a big data set, agencies may invest in big data platforms that are not designed to handle smaller data sets. The push to keep on top of the latest technology trends can divert resources from existing data and performance management programs and may not meet the agency's predominant requirements.

How does an agency determine when data is big enough to require a big data storage approach? The true sign of a need for a big data platform is when traditional storage and processing techniques become inadequate for the specific usage needs that have been identified. Agencies should strike a balance in their investment in big and small data platforms and work backwards from their use cases to their storage strategies. For example:

• It is appropriate to store some data sets in relational databases if they lend themselves well to normalization, indexing, joining, and database-supported statistical analysis.
Calculating incident clearance metrics, for example, can be done very effectively using a traditional database management system.

• On the other hand, complex analytics that parse billions or even trillions of data points to identify problem locations, prioritize projects, compute user delay, or understand the complexities of signal retiming efforts are computationally expensive and are not cost effective in traditional relational databases.

In order to fully leverage both types of data sets,

agencies can develop simple interfaces that operate across both the traditional relational databases and big data platforms. Agencies should have strategies for both big and small data, including a way to integrate across both types. A well-designed storage architecture can be flexible enough to accommodate both kinds of data sets as well as incremental changes as technology and data continue to develop.

Identify storage options. In the past, agencies had to rely on in-house systems to store and process their data. As cloud computing becomes more mainstream, agencies now face a choice between storing data in-house or in the cloud, with a variety of commercial and open source storage options available for both. What is right for each agency depends heavily on the TPM use case, agency policies (procurement, IT, etc.), IT staff support, the type of data, and how frequently the data will be accessed. There are several general considerations each agency should address as it evaluates its options.

Consider commercial cloud and specialized hosted solutions. Commercial cloud providers' services (such as Amazon and Azure) appear very affordable when pricing is based solely on the amount of data to be stored. However, agencies need to consider more than just the size of their data when estimating costs. Pricing also depends heavily on the number of transactions (the number of times the data is accessed and processed) and how much bandwidth is used. For example, it may be inexpensive to load raw data into the cloud but very costly to extract it back when needed to calculate metrics to support TPM. The appropriate approach may then be to move data processing and metrics calculations to the cloud as well, avoiding the recurring cost of downloading data each time it needs to be processed.

Alternatively, agencies can rely on specialized hosted solutions from their partners: universities, consultants, and sister agencies. Partners may have a better understanding of the transportation and TPM domain and provide more cost-effective approaches for storing transportation data. While in-house solutions often provide more control, they also require a larger investment in workforce and physical infrastructure; conversely, storing data in the cloud may not be cost effective for highly transactional systems. It is important to consider all these aspects to avoid becoming a "hostage" of a vendor or service, or investing in an internal system that becomes obsolete because it cannot innovate and stay current.

Case H: New Jersey DOT uses an outsourced cloud-based hosting platform and subject matter expertise to collect, store, and analyze complex data sets to support agency TPM efforts.
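The retention trade-off in the first example earlier in this step (15-minute reporting intervals versus 1-minute raw data) can be sketched in a few lines of pandas. All column names and values here are invented for illustration:

```python
# Sketch of the aggregation trade-off: 15-minute averages satisfy the
# reporting requirement, but detail needed for incident metrics lives
# only in the raw 1-minute data. Names and values are hypothetical.
import pandas as pd

# One day of simulated 1-minute detector speeds.
idx = pd.date_range("2019-01-01", periods=1440, freq="1min")
raw = pd.DataFrame({"speed_mph": 55.0}, index=idx)
raw.iloc[485:490] = 20.0  # a 5-minute slowdown starting at 8:05 am

# Aggregating to 15-minute intervals meets the reporting obligation...
agg = raw.resample("15min").mean()

# ...but the 20 mph dip is averaged away: (10*55 + 5*20)/15 ≈ 43.3 mph.
# If only "agg" is retained, the queue that caused the dip is invisible
# to any later incident-management metric.
print(round(agg.loc["2019-01-01 08:00", "speed_mph"], 2))  # 43.33
print(raw["speed_mph"].min())                               # 20.0
```

Retaining the raw frame in low-cost archive storage, alongside the aggregate in active storage, keeps both the reporting product and the future analysis option.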

Plan for data security. Implement a sound data back-up strategy that will allow you to restore data in the event of a hardware failure, cyber-attack, or inability to physically access facilities. If your data contains personally identifiable information (PII) or other sensitive elements, it should be clearly categorized as sensitive and managed to prevent unauthorized access. TPM-related data that may be sensitive include crash reports, travel survey data, and data from mobile devices. Some agencies have policies that do not allow sensitive data to be stored in the cloud. However, many cloud providers have robust security policies to both prevent and recover from cybersecurity compromises, whereas agencies may have limited funds and expertise to implement comparably robust security mechanisms.

Maintain metadata. While some data sets are considered "self-explanatory," metadata and documentation are critical. For example, highway crashes may appear to be a straightforward data set. On closer examination, you may find that data from one jurisdiction is gathered using different definitions of serious injury than another. Data may be collected using a mixture of electronic and manual processes with different quality assurance processes applied. Newer data sets may be provisional and subject to further updates.

Metadata and documentation become even more important when data is used in calculations to support TPM. Two individuals can start from the same raw data and measure definition but execute the calculations differently depending on context, and interpret the results completely differently.

Metadata should be maintained at both the data set and the data element level. Data set metadata covers information such as source, spatial and temporal scope, quality, and access classification. Data element metadata covers meaning, origins, usage, value domain, and format. Standards for data set level metadata can be found in International Organization for Standardization (ISO) 19115 and the Office of Management and Budget's (OMB) Project Open Data (POD) schema. Standards for data element level metadata can be found in ISO/IEC 11179.

Proper metadata and documentation, frequently updated and audited, help minimize confusion and variation in interpretation. Metadata and documentation must also be properly versioned so that data processing spanning different versions of the metadata can be interpreted and processed correctly.
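As a concrete sketch, data-set-level metadata for the crash example above might look as follows, loosely following field names from the Project Open Data (POD) schema. The agency name, dates, and notes are invented for illustration:

```python
# Hypothetical data-set-level metadata record, loosely following POD
# schema field names (title, description, modified, accessLevel,
# spatial, temporal, publisher). All values are invented.
import json

crash_data_metadata = {
    "title": "Statewide Crash Records",
    "description": "Police-reported crashes; serious-injury definitions "
                   "vary by jurisdiction in earlier years (see notes).",
    "modified": "2019-06-30",
    "accessLevel": "restricted public",   # contains PII-derived fields
    "spatial": "Statewide",
    "temporal": "2014-01-01/2018-12-31",
    "publisher": {"name": "Example State DOT"},  # hypothetical agency
    "version": "2.1",  # explicit versioning, as urged above
}

print(json.dumps(crash_data_metadata, indent=2))
```

Machine-readable records like this can be validated, versioned, and published alongside the data itself, rather than living in a document no one updates.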

For more information...
1. ISO 19115, Geographic Information: Metadata
   https://www.iso.org/standard/53798.html
2. ISO/IEC 11179, Information Technology: Metadata Registries
   http://metadata-standards.org/11179/
3. NCHRP Project 17-75, Leveraging Big Data to Improve Traffic Incident Management (TRB, 2019; publication forthcoming)
   http://apps.trb.org/cmsfeed/TRBNetProjectDisplay.asp?ProjectID=4051
4. Integrating Emerging Data Sources into Operational Practice (FHWA, 2017)
   https://rosap.ntl.bts.gov/view/dot/34175
5. NCHRP Project 08-36, Task 130, Inventory and Assessment of Methods for Making Collected Transportation Data Anonymous (TRB, 2016)
   http://onlinepubs.trb.org/onlinepubs/nchrp/docs/NCHRP08-36(130)_FR.pdf

Step 3.2 Load & Integrate Data

Establish repeatable data loading processes. Ad hoc data loading conducted in a rush is a recipe for disaster. Repeatable processes need to be set up and, ideally, automated to load and transform raw data into a form suitable for use. When a problem occurs with a data load, procedures should be in place to roll back and then repeat the process once the issue is identified. Sometimes a series of loads is needed to refresh data in various repositories. For example, new bridge inspection data may be loaded into a staging database for review and quality assurance; the data may then be transferred to the bridge management system database for analysis, and to the agency's road inventory system. These data flows should be thoroughly tested, automated, and well documented. Accurate and detailed documentation is essential, especially when data loads occur infrequently and multiple systems and staff from different business units are involved.

Store both raw and processed data. Storing transformed performance data in addition to raw data can facilitate analysis and reporting.

Make use of data integration tools. There is a wide array of commercial and open source tools supporting data integration processes. Some tools are geared to building extract-transform-load processes for data warehouse environments; others are geared to big data sets; several excellent tools focus on integrating geospatial data. Using these tools requires expertise and involves a learning curve, but it can save a great deal of time on data loading and integration tasks while reducing the risk of errors introduced by highly manual processes.

Case G: At MARC, use of automated processes and commercial data integration tools for maintaining key data sets has greatly simplified the process of querying, which means MARC is able to dedicate more time to analyzing data, not just collecting it.

Case K: Virginia DOT integrated pavement condition data with data on planned paving projects to produce performance monitoring reports that tracked anticipated versus actual changes in condition and the likelihood of achieving performance targets. The data integration effort relied on standardization of several data elements across two databases.
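The staging, validation, and rollback pattern described for the bridge-inspection example can be sketched with sqlite3 transactions. Table names, the rating scale, and the validation rule are all invented for illustration:

```python
# Sketch of a repeatable, roll-back-able load: records land in a staging
# table, are validated, and are promoted only if the whole batch passes.
# Schema and the 0-9 rating rule are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE staging (bridge_id TEXT, rating INTEGER)")
con.execute("CREATE TABLE bridge_mgmt (bridge_id TEXT, rating INTEGER)")

def load(records):
    """Load into staging, validate, then promote; roll back on failure."""
    try:
        with con:  # one transaction: commits on success, rolls back on error
            con.executemany("INSERT INTO staging VALUES (?, ?)", records)
            (bad,) = con.execute(
                "SELECT COUNT(*) FROM staging WHERE rating NOT BETWEEN 0 AND 9"
            ).fetchone()
            if bad:
                raise ValueError(f"{bad} record(s) failed validation")
            con.execute("INSERT INTO bridge_mgmt SELECT * FROM staging")
            con.execute("DELETE FROM staging")
    except ValueError:
        return False
    return True

load([("B-101", 7), ("B-102", 5)])  # clean batch is promoted
load([("B-103", 42)])               # invalid rating rolls everything back
```

Because the entire load runs in one transaction, a failed batch leaves both the staging and production tables exactly as they were, so the load can simply be repeated after the issue is fixed.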

For more information...
1. NCHRP Synthesis 523: Integration of Road Safety Data from State and Local Sources (TRB, 2018)
   http://www.trb.org/Main/Blurbs/177990.aspx
2. NCHRP Project 08-36, Task 131: Transportation Data Integration to Develop Planning Performance Measures (TRB, 2017)
   http://onlinepubs.trb.org/onlinepubs/nchrp/docs/NCHRP08-36(131)_FR.pdf
3. Informational Guide for State, Tribal and Local Safety Data Integration (FHWA, 2016)
   https://safety.fhwa.dot.gov/rsdp/downloads/fhwasa16118.pdf
4. Data Integration Primer (FHWA, 2010)
   https://www.fhwa.dot.gov/asset/dataintegration/if10019/if10019.pdf
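In outline, the kind of key-based integration Case K describes (joining pavement condition to planned projects on a standardized identifier) might look as follows. The route IDs, column names, and values are invented for illustration:

```python
# Sketch of joining two data sets on a standardized key with pandas.
# All identifiers and values are hypothetical.
import pandas as pd

condition = pd.DataFrame({
    "route_id": ["I-64", "I-81", "US-29"],
    "iri": [85, 140, 210],          # roughness, inches/mile
})
projects = pd.DataFrame({
    "route_id": ["I-81", "US-29"],
    "planned_year": [2020, 2021],
})

# A left join keeps every monitored route; indicator=True flags routes
# with no planned work, feeding an anticipated-vs-actual report.
report = condition.merge(projects, on="route_id", how="left", indicator=True)
unplanned = report[report["_merge"] == "left_only"]["route_id"].tolist()
print(unplanned)  # ['I-64']
```

The join only works because `route_id` is standardized across both sources, which is exactly the data-element standardization Case K highlights.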

Step 3.3 Assess & Improve Data Quality

Data quality assessment. Poor quality data may have significant impacts on calculated performance metrics and therefore on TPM decisions. Step 2.2 discussed the importance of planning for data quality as part of data acquisition and outlined the contents of a data quality management plan. However, there may be existing data sets needed for TPM whose quality is unknown. A data quality assessment can be conducted to determine a data set's suitability for use in TPM. The assessment can consider multiple characteristics, including completeness, currency, accuracy, and consistency; data accessibility and interoperability are also sometimes considered.

Assessing data quality involves establishing data quality metrics and measurement methods. For example, a metric for crash data completeness might be the percentage of records missing a location code, which could be measured through a simple data query. Accuracy is typically assessed through a combination of independent verification of a sample of records and validation checks that confirm measured values are within expected ranges.

Quality management. Quality management is a continuous process that starts prior to data acquisition and continues through the entire data life cycle. It should include analysis and flagging of data records that fail specific quality policies and thresholds. For example, pavement roughness measurements of less than 30 inches/mile or travel speeds over 150 mph might be flagged as suspect.

It is important to find the right balance when planning for data quality improvement. All too often, agencies spend large amounts of resources attempting to clean, scrub, and validate data, only to find that data issues persist regardless of how much time and energy is spent on cleaning. Perfection becomes the enemy of good, and agencies end up never fully using the data to inform decisions. Worse, the department (or person) responsible for the data hides it or prevents others from using it because of potential issues, fear, or liability concerns. As soon as data (in any form) becomes available, it can and should be analyzed for quality and consistency. The act of analyzing data, even when it has not been cleaned or validated, is important for guiding and informing potential users, applications, and data investment decisions.

"Data that is loved tends to survive."
Kurt Bollacker

Case D: I-95 Corridor Coalition data use agreements contain explicit data quality specifications that ensure third-party-provided data meets the quality standards required to support TPM.

Annotate (don't discard) suspect records. When suspect data records are encountered, a methodical process should be followed to flag them and address the gaps in a carefully planned manner. Bad data records should not be summarily deleted, because doing so can cause downstream analysis problems; discarding bad data can distort calculations if the resulting gaps are not properly addressed. One way to address suspect or missing records is to fill gaps with historical or otherwise imputed data. When this approach is used, the imputed records must be flagged so that TPM decisions account for the simulated/modeled input.

In some situations, marking and tracking bad data can provide important information for improving future data quality. For example, a crash data manager might observe a pattern of inaccurate or incomplete crash records from one particular source. As another example, a traffic sensor may report erroneous data (or no data at all) every day during the 8 a.m. hour. If bad data is discarded or not otherwise tracked, this failure pattern (and its cause) may never become evident and will continue to affect metrics based on that data.

For more information...
1. Development of a Computational Framework for Big Data-Driven Prediction of Long-Term Bridge Performance and Traffic Flow (Midwest Transportation Center, 2018)
   https://rosap.ntl.bts.gov/view/dot/36042
2. Crash Data Improvement Program Guide (NHTSA, 2017)
   https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/812419
3. National Performance Management Research Data Set (NPMRDS): Speed Validation for Traffic Performance Measures (Oklahoma Department of Transportation, 2017)
   http://www.okladot.state.ok.us/Research/FinalRep_2300_FHWA-OK-17-02.pdf
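The flag-don't-delete approach in this step can be sketched in a few lines. The 150 mph speed threshold comes from the text; the data layout, sensor IDs, and values are invented:

```python
# Sketch of annotating suspect records instead of deleting them.
# Threshold (speed > 150 mph) is from the guide; data is hypothetical.
import pandas as pd

obs = pd.DataFrame({
    "sensor_id": ["S1", "S1", "S2", "S2"],
    "speed_mph": [62.0, 180.0, 58.0, 61.0],
})

# Annotate suspect records rather than dropping them...
obs["suspect"] = obs["speed_mph"] > 150

# ...so downstream metrics can exclude them explicitly...
clean_mean = obs.loc[~obs["suspect"], "speed_mph"].mean()

# ...and per-sensor failure patterns remain visible for diagnosis.
suspect_by_sensor = obs.groupby("sensor_id")["suspect"].sum()
print(round(clean_mean, 2))        # 60.33
print(dict(suspect_by_sensor))     # S1 has 1 suspect record, S2 has 0
```

Because the suspect row survives with a flag, the recurring failure on sensor S1 stays visible; had it been deleted, the pattern (and its cause) might never surface.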

Capabilities Checklist: Store & Manage Data

Basic
o Data needed for TPM is stored in databases that are managed and regularly backed up to provide protection from unauthorized access and corruption.
o Back-ups are tested on a regular, established cycle (e.g., monthly).
o Quality control procedures are in place to flag records that do not meet established validation criteria.
o Data dictionary information (metadata) is maintained and stored in a standardized fashion.
o Annual data snapshots are created for coordinated reporting across data programs.

Advancing
o Hardware and software requirements for data storage, updating, integration, and access are understood.
o Central data repositories have been established to integrate data from multiple sources and provide source data for reporting and analysis.
o Cloud and hosted storage options are considered for larger and more complex data sets.
o Data retention policies and archiving protocols have been updated to reflect lower storage costs and analysis of TPM business data needs.
o A range of data storage options are available to support databases with high transaction volumes and memory-intensive calculations, as well as archived data retained for future use.
o Standards have been adopted to enable combining data from different sources.
o Data from multiple sources are fused to assemble a more complete and accurate data set than would be possible from any single source.
o Where appropriate, edge computing techniques are used, involving data processing at the source (e.g., at the site of the field sensor) rather than within a centralized repository.

Do's and Don'ts

Do:
• Consider cloud storage to reduce or minimize the agency's IT footprint and make it easier to scale storage up or down based on need.
• Explore hosted solutions from partners (universities, consultants, and sister agencies) to provide cost-effective approaches to managing large and complex data sources.
• Explore how fusing disparate data sources can add value to your analysis and capabilities.
• Build or hire expertise in statistical analysis and computer programming to effectively analyze and transform data into TPM-related information.
• Adjust your agency's data retention policies and storage architectures so that potentially useful data isn't destroyed permanently.
• Establish repeatable, automated, and documented data-loading processes.
• Take advantage of commercial data integration tools.

Don't:
• Delete older data. The minute you get rid of it, you'll find you need it again.
• Delete erroneous data records. Flag them instead.
• Aggregate data sets to the lowest common denominator to save on storage space.
• Let the allure of "big data" technologies prevent you from continuing to invest in proven solutions.
• Rely on ad hoc approaches to loading and integrating data.

Step 4 Share Data

This step includes sharing transportation performance data across business units within an agency, across agencies, or with the general public. This includes but is not limited to transmitting data and reports to meet reporting obligations. Agencies benefit from sharing data through improved coordination across jurisdictions, enhanced understanding of joint priorities, and leveraging of investments.

Note: this step focuses on the mechanics of data sharing and reporting, including tool selection. See Step 5 for a discussion of data analysis and Step 6 for a discussion of communicating data.

"There's a digital revolution taking place both in and out of government in favor of open-sourced data, innovation, and collaboration."
Kathleen Sebelius, Former United States Secretary of Health and Human Services

Step 4.1 Establish Reporting & Presentation Infrastructure

Select and deploy analysis and reporting tools. The data analysis and reporting tools available to agency staff are a critical element in making effective use of data. These can include tools that fuse "siloed" data from disparate sources, tools that fill gaps (missing data), and tools that identify or screen outliers. Other important tools support analytics and visualization that help agencies "see" into the data: asking questions, identifying issues, deriving meaning from the data, and communicating those insights to others. Tools include commercial business intelligence packages that support both traditional reporting and dashboards, GIS tools, statistical analysis packages, and specialized tools geared to particular types of performance data, such as asset management systems and analytics platforms for congestion performance reporting.

While it is unlikely that a single reporting and analysis tool can meet all of an agency's needs, keep in mind that every new tool requires support to bring on new releases, train users, and troubleshoot issues. It is best to follow a disciplined and coordinated process of defining needs and requirements, and to consider whether existing tools are sufficient, before bringing on a new tool.

Make build versus buy decisions. Developing analytics software and databases that make data easier to analyze and accessible to end users can be a significant hurdle for agencies. To build successful tools independently, an agency will typically need to draw on the expertise of software engineers, system architects, user interface and user experience design specialists, developers, and project managers.
The tools will need to be maintained over time; therefore, ample documentation and knowledgeable staff are needed who can be called upon over the course of many years to keep the tools up to date. Building complex tools with extremely small teams can be risky and costly to an agency. Because of the high barrier to entry and continuing maintenance costs of developing custom tools, many agencies are now choosing either to purchase off-the-shelf tools or to leverage tools that other agencies and universities have already paid to develop. This effectively creates a pooled-fund approach to software development and maintenance. This approach is becoming easier for agencies that are unaccustomed to purchasing services and for those that have historically not adopted tools and products developed outside their own organization or state.

Case E: Maryland State Highway Administration (SHA) uses Regional Integrated Transportation Information System (RITIS) visual analytics to combine disparate data sets and derive valuable information as part of after-action reviews for operational improvements.

Whether an agency decides to build its own tools, hire consultants to build custom tools, or leverage existing tools, the following items should be considered.

In-House Development
• Allocate ample time to developing requirements for usability and functionality, and recruit multiple user groups to get an understanding of expected usage.
• Find an experienced partner. Attempt to procure the services of a consultant who has performed similar work for other agencies. Analysis tools may need customization and tailoring, but a proven provider is often more reliable than a standard consultant.
• Recognize that initial startup will be costly. There are several private-sector and university providers with excellent archiving, fusion, and analytics products. Some of these systems work across borders and across multiple agencies. Consider adopting similar technologies or products as neighboring jurisdictions when possible so that shared experiences, knowledge, and benefits from shared resources can be leveraged.
• Avoid "black box" solutions that do not explain the underlying technologies, algorithms, or methods used to calculate the performance measures. Ensure the chosen provider has documented procedures that can be shared with software engineers and data analysts. Some providers have multistate/agency steering committees that collectively drive the features of the archive products to ensure they are constantly meeting user needs.
Purchasing Tools
• More and more states and MPOs are starting to purchase probe-based speed data; however, fewer agencies are investing in tools to analyze the data in ways that enable better decisions. Probe data vendors, for example, offer analytic tools at prices lower than the cost of reproducing those tools inside the agency. These tools dramatically improve the productivity of analysts by making the data much easier to analyze. In addition, they provide capabilities that might previously have taken agencies months of effort to produce.
• Most vendors can provide return on investment (ROI) examples—including case studies from previous applications—showing how much money other organizations have been able to save by investing in 3rd-party data analytics tools or 3rd-party data. The I-95 Corridor Coalition has produced these types of ROI documents for its member agencies showing the benefits of some of their probe and incident data analytics products. More information can be obtained at www.i95coalition.org, or by reaching out to any 3rd-party data provider.

Purchasing Services
• For agencies that are not comfortable using analytic tools and are not interested in doing in-house data analysis, hiring outside consultants or university support may prove to be a viable option. Consultants and universities frequently have access to scientists, statisticians, database programmers, economists, and other analysts who would otherwise be difficult to hire at state and local agencies. When seeking out-of-agency services, it is wise to review product and project portfolios for examples of prior work to ensure an agency's needs match the skills of the consultant or university personnel being proposed on a project.
• When hiring outside support (consultants or universities), consider a phased approach to projects. Start small, and ensure the consultant is able to perform basic analysis and fusion tasks with the data available. If the consultants are successful, then work can progress to bigger analysis tasks—adding layers of complexity and building on prior work and available data sets. Initiating extremely large analysis tasks that are not easily broken down into smaller deliverables can be a recipe for confusion, cost overruns, disappointment, and waste.
• Regardless of who does the work, it is advisable to avoid mandating that consultants use specific tools, technologies, or techniques to deliver a solution. New technologies, methodologies, and tools are developed quickly and often. Requiring outdated technologies can unnecessarily limit the agency and the consultant in performing analytical tasks. Allow consultants to drive these decisions based on what they perceive to be the most efficient and effective tools and methods.

For more information...
1. NCHRP Project 03-128, Business Intelligence Techniques for Transportation Agency Decision Making (TRB, report forthcoming) http://apps.trb.org/cmsfeed/TRBNetProjectDisplay.asp?ProjectID=4352
2. Development of a Travel-Time Reliability Measurement System (Minnesota Department of Transportation, 2018) http://www.dot.state.mn.us/research/reports/2018/201828.pdf
3. Implementation of Probe Data Performance Measures (Pennsylvania Department of Transportation, 2017) https://rosap.ntl.bts.gov/view/dot/32283

Step 4.2 Establish Data Standards & Formats

Take advantage of data standards. A number of data standards can be adopted for agency data sets and/or used when sharing transportation system performance data between agencies (see Table 2). Some data standards cover data dictionary information (data elements and their definitions); others are more comprehensive and specify data formats, message structures, and technical mechanisms and protocols for sharing.

Data standards can make sharing processes easier. However, standardization should not be a prerequisite for sharing: it is more beneficial to share non-standard data than to share nothing at all. While standards are absolutely necessary in some instances (such as for vehicle-to-vehicle safety communications), their use can break down in practice. Standards may become cumbersome because they try to address every possible data element and use case; alternatively, standards are extended with custom fields and effectively lose the benefit of being a standard.

Agencies may be asked to comply with a standard imposed by an external entity as a condition for data sharing. This can have unintended consequences if that standard requires data to be "dumbed down" to the lowest common denominator to satisfy the needs of the external entity. In order to comply with the standard and remain on budget, agencies may permanently modify their data to match it, and thereby lose significant value of that data for future use.

The issue of standards becomes even more challenging when dealing with big data. Unstructured data and crowdsourced data are rarely standardized or clean, but they may still have substantial value to an agency in supporting TPM. The key to successful data sharing is to adhere to standards when possible, but not at the cost of losing insight or capability from non-standard data sets.
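The trade-off described above—adhering to a shared schema without discarding non-standard content—can be illustrated with a small normalization sketch. The field names, codes, and mapping below are purely illustrative assumptions, not actual MMUCC or ANSI D16 elements; a real mapping would follow the published data dictionary.

```python
# Sketch: normalizing a non-standard crash record into a shared schema
# before publishing it. All field names and code values here are
# hypothetical, chosen only to illustrate the pattern.

# Rename map from a hypothetical legacy export to shared-schema fields.
FIELD_MAP = {
    "acc_date": "crash_date",
    "rte": "route_id",
    "sev": "severity",
}

# Legacy severity codes translated to a shared vocabulary (illustrative).
SEVERITY_CODES = {1: "fatal", 2: "injury", 3: "property_damage_only"}

def normalize(record: dict) -> dict:
    """Rename fields and translate coded values; retain unmapped fields
    under an 'extra' key so no data is thrown away when standardizing."""
    out = {}
    extra = {}
    for key, value in record.items():
        if key in FIELD_MAP:
            out[FIELD_MAP[key]] = value
        else:
            extra[key] = value
    if "severity" in out:
        out["severity"] = SEVERITY_CODES.get(out["severity"], "unknown")
    if extra:
        out["extra"] = extra
    return out

legacy = {"acc_date": "2019-03-01", "rte": "I-95", "sev": 2, "weather": "rain"}
print(normalize(legacy))
```

Note the design choice: unmapped fields are carried along rather than dropped, echoing the guidance that complying with an external standard should not mean permanently "dumbing down" the agency's own data.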
"The wonderful thing about standards is that there are so many of them to choose from."
Grace Murray Hopper

Table 2. Example data standards related to TPM (grouped by data type).

Safety
• American National Standards Institute (ANSI) Standard D16, Manual on Classification of Motor Vehicle Traffic Crashes
• Model Minimum Uniform Crash Criteria (MMUCC) (https://crashstats.nhtsa.dot.gov/Api/Public/Publication/812433)
• Model Inventory of Roadway Elements (MIRE) (https://safety.fhwa.dot.gov/rsdp/downloads/fhwasa17048.pdf)

Pavement Condition
• HPMS Field Manual—defines pavement data elements
• AASHTO Standard R43-13, Standard Specification for Transportation Materials and Methods of Sampling and Testing, Standard Practice for Quantifying Roughness of Pavement

Bridge Condition
• FHWA National Bridge Inspection Standards

System Performance
• IEEE 1512-2006, Standard for Common Incident Management Message Sets for Use by Emergency Management Centers
• ITE TMDD 3.3, Traffic Management Data Dictionary (TMDD) Standard for Center-to-Center Communications
• ASTM E2665-08, Standard Specifications for Archiving ITS-Generated Traffic Monitoring Data

Other
• Open Geospatial Consortium—variety of standards for geospatial data
• All-Roads Network of Linearly Referenced Data (ARNOLD) manual—provides guidance and best practices for building linear referencing systems (LRS) covering all public roads
• National Information Exchange Model (NIEM)—provides a common vocabulary that enables efficient information exchange across diverse public and private organizations

Select file formats. Certain file formats have advantages over others when it comes to sharing data between agencies. For example, exchanging PDF files containing detour plans may make sense on an individual case basis, but it significantly reduces the ability to automatically process the information and incorporate it in TPM processes. Ideally, data should be published in a machine-readable format that provides the most flexibility for integration in TPM tools. Common data file formats found in open data platforms include JSON, XML, CSV, and KML.

For more information...
1. Open Geospatial Consortium http://www.opengeospatial.org/standards
2. Project Open Data https://project-open-data.cio.gov/
3. General Transit Feed Specification http://gtfs.org/
4. National Information Exchange Model https://www.niem.gov/
5. National Bridge Inventory Resources https://www.fhwa.dot.gov/bridge/nbi.cfm
6. USDOT JPO ITS Standards Program https://www.standards.its.dot.gov/
7. Manual on Classification of Motor Vehicle Traffic Crashes, Eighth Edition—ANSI D16 (Association of Transportation Safety Information Professionals, 2017) http://www.atsip.org/ANSI_Ver_2017_D16.pdf
8. HPMS Field Manual (FHWA, 2016) https://www.fhwa.dot.gov/policyinformation/hpms/fieldmanual/
9. Traffic Monitoring Guide (FHWA, 2016) https://www.fhwa.dot.gov/policyinformation/tmguide/tmg_fhwa_pl_17_003.pdf
10. All Roads Network of Linear Referenced Data (ARNOLD) Reference Manual (FHWA, 2014) https://www.fhwa.dot.gov/policyinformation/hpms/documents/arnold_reference_manual_2014.pdf
11. AASHTO Standard R43-13, Standard Specification for Transportation Materials and Methods of Sampling and Testing, Standard Practice for Quantifying Roughness of Pavement, 34th/2014 Edition (AASHTO, 2014)
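As a minimal illustration of the machine-readable formats discussed in Step 4.2, the sketch below writes the same record as both JSON and CSV using only standard-library tools. The detour record and its field names are assumptions for illustration, not a prescribed schema.

```python
# Sketch: publishing one record in two machine-readable formats
# (JSON and CSV) instead of a PDF. Field names are illustrative only.
import csv
import io
import json

detour = {
    "route_id": "MD-100",
    "direction": "EB",
    "begin_mile": 3.2,
    "end_mile": 5.7,
    "reason": "bridge maintenance",
}

# JSON: self-describing, easy for downstream TPM tools to ingest.
json_text = json.dumps(detour, indent=2)

# CSV: compact tabular form, convenient for spreadsheet users.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=detour.keys())
writer.writeheader()
writer.writerow(detour)
csv_text = buf.getvalue()

print(json_text)
print(csv_text)
```

Either output can be parsed automatically by a consuming application, which is precisely what a PDF detour plan cannot offer.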

Step 4.3 Publish Data

Designate authoritative data sources. Authoritative data sources for performance measure calculation should have been established as part of Step 1.3—Identify Analysis and Reporting Requirements. In preparation for publication, it is also important to designate authoritative sources for the computed performance measures and for any contextual data to be provided in the reports. Only designated authoritative sources should be used for reporting. Following this guideline will ensure that information released to the public is consistent and quality-checked.

Determine what data to share. The growing "open data" movement is creating the need for agencies to decide what data to proactively make available to the public, what data to provide on request, and what data to keep restricted. Several states have developed policy guidance on data classification. For example, the District of Columbia defines five levels:
• Level 0—Open (the default classification)
• Level 1—Public, Not Proactively Released (e.g., due to potential litigation risk or administrative burden)
• Level 2—For District Government Use (exempt from the Freedom of Information Act but not confidential and of value within the agency)
• Level 3—Confidential (sensitive or restricted from disclosure)
• Level 4—Restricted Confidential (unauthorized disclosure can result in major damage or injury)

DC has adopted the philosophy that data should be open by default and restricted only if there is a reason to do so.

Select data sharing methods. Sharing methods can vary from very basic file transmission, such as FTP, to more complex asynchronous, persistent transmission methods such as subscriptions and web services. Open data sharing platforms such as data.gov have been established at the federal level and by many state agencies.
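The five classification levels above can also be captured directly in code. The sketch below expresses an "open by default" decision rule; the tag names and release logic are assumptions for illustration, not the District's actual criteria.

```python
# Sketch: DC-style data classification levels as an enum, with an
# illustrative "open by default" rule. Tags and logic are hypothetical.
from enum import IntEnum

class DataLevel(IntEnum):
    OPEN = 0
    PUBLIC_NOT_PROACTIVE = 1
    DISTRICT_USE = 2
    CONFIDENTIAL = 3
    RESTRICTED_CONFIDENTIAL = 4

def classify(dataset_tags: set) -> DataLevel:
    """Default to OPEN unless a tag demonstrates a need to restrict,
    checking the most restrictive conditions first."""
    if "restricted" in dataset_tags:
        return DataLevel.RESTRICTED_CONFIDENTIAL
    if "sensitive" in dataset_tags:
        return DataLevel.CONFIDENTIAL
    if "internal" in dataset_tags:
        return DataLevel.DISTRICT_USE
    if "litigation_risk" in dataset_tags:
        return DataLevel.PUBLIC_NOT_PROACTIVE
    return DataLevel.OPEN

# A dataset with no restricting tag falls through to the default.
print(classify({"traffic_counts"}).name)  # prints OPEN
```

Encoding the policy this way makes the default explicit: a data set must earn its restriction, rather than openness having to be justified.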
While simple methods may be quick and inexpensive to implement, they can, in some situations, diminish the value of shared data. For example, files posted to an FTP site once a day introduce unnecessary latency and reduce certain TPM capabilities.

To support system performance management, agencies should strive to share data in near real time and at the highest possible resolution in order to provide the most flexibility and usefulness. For example, the Maryland State Highway Administration (SHA) allows external entities to securely subscribe to its real-time operations data sharing interface, which pushes incident information out to subscribers as it is entered by operators into the SHA Advanced Traffic Management System (ATMS) platform. This approach allows partners to integrate the data into their systems and become aware of significant incidents as soon as they occur (as opposed to minutes or hours later). This is important because it enables better real-time tracking of incident clearance times, responder activities, and other measures that are often requested by senior managers.

Share data within the agency. Departments within agencies often invest in data collection and data services to satisfy specific needs. For example, operations groups may procure and install sensors to support real-time operations, while planning groups may install different devices with a slightly different configuration to support planning and modeling needs. However, agencies frequently fail to evaluate existing investments within the agency. There could be significant overall cost savings if departments evaluated existing data sets within the agency and adjusted existing configurations or agreements rather than going through a completely separate procurement process. This is particularly true for larger agencies.

Share data with other agencies. Sharing data with other agencies provides significant benefits to all parties, as well as the traveling public. Access to other agencies' data allows a more holistic approach to TPM, as well as better coordination in efforts to improve performance.
Some of the challenges to sharing data with other agencies include data sharing methods, formats, and agreements. It is also important for agencies to develop methods for the integration of external data. Separating relevant data from noise is an important exercise that can have a significant impact on TPM output. For example, an agency integrating incident data from a neighboring agency may want to focus only on external incidents close to the border, major regional incidents, or external incidents that may have an impact on the agency's area of responsibility. This means that for TPM purposes, the agency must develop a policy for identifying incidents of significance to the agency and its system while avoiding the trap of throwing away too much data that may be of use in the future.

Case F: The Metropolitan Area Transportation Operations Coordination (MATOC) Program enhanced real-time data sharing among agencies. This allowed agencies to become aware of incidents more quickly, to respond more quickly, to clear incidents more quickly, to alert travelers more quickly, and to develop standard operating procedures (SOPs) that account for impacts of regional and cross-jurisdictional events.

Case C: The Florida DOT created an open data portal for sharing data both internally and with the public. Several FDOT business units had already begun to post important data sets online at various web sites. The portal provided a central location for data sharing, making it easier for people to locate available data.
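A significance policy of the kind described above might be sketched as a simple filter over an external incident feed. The field names and thresholds below are illustrative assumptions, not a recommended standard.

```python
# Sketch: flagging which incidents from a neighboring agency's feed
# are significant for this agency's TPM. Fields and thresholds are
# hypothetical; a real policy would be tuned to the agency's network.
def is_significant(incident: dict,
                   border_buffer_miles: float = 5.0,
                   major_severity: int = 3) -> bool:
    """Keep external incidents that are near the border, are severe,
    or are flagged as having regional impact."""
    if incident.get("distance_to_border_miles", float("inf")) <= border_buffer_miles:
        return True
    if incident.get("severity", 0) >= major_severity:
        return True
    return bool(incident.get("regional_impact", False))

feed = [
    {"id": "X1", "distance_to_border_miles": 2.1, "severity": 1},
    {"id": "X2", "distance_to_border_miles": 40.0, "severity": 4},
    {"id": "X3", "distance_to_border_miles": 40.0, "severity": 1},
]
significant = [i["id"] for i in feed if is_significant(i)]
print(significant)  # X1 (near border) and X2 (major) pass; X3 does not
```

In keeping with the guidance above, incidents that fail the filter should be archived rather than discarded, since they may prove useful later.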

Share data with the public. Public agencies have a responsibility to provide the best possible service to their customers: the general traveling public. One component of this responsibility is sharing the agency's performance with the public. While some data elements may be sensitive, most transportation data can be shared to better inform the public regarding system performance. In addition to open data, agencies can provide easily digestible and interactive reports regarding system performance.

One challenge with open data is that it is exposed to a general public with varying levels of understanding of raw data, which can lead to distorted interpretations. While this is a real challenge, it should not be a barrier to sharing data with the public. Agencies can provide metadata, documentation, and sample applications to help users better understand raw data and its potential uses.

Application programming interfaces (APIs) allow users to develop their own data ingestion and processing applications and add value to existing data sets. In order for an agency to effectively distribute information, it must be able to share data via APIs for integration with other applications and systems. For example, the City of Chicago publishes APIs for historical congestion estimates, average daily traffic counts, and other TPM-related data sets. Similarly, the New York State DOT publishes APIs for historical traffic and transit events across the state and New York City.

Consider data sharing agreements. When agencies share data with each other, there is frequently a need for some type of agreement or memorandum of understanding. Interagency agreements—especially those between states—have sometimes been difficult to negotiate because of governing law language and other restrictive terms and conditions.
Because of this, many agencies are now opting to make their data "open" to the public—eliminating the need for data sharing agreements. Other agencies opt for informal, handshake agreements. Informal agreements can work well, though agencies with significant staff turnover will want to document any agreements in place.

Investigate public–private data sharing arrangements. Over the last decade, the private sector has emerged as an important partner when it comes to transportation data to support TPM. Private-sector data providers have been able to leverage technology and innovation in ways that public agencies are often unable to do. The concept of sharing data with the private sector has become both more important and more prevalent in recent years. Not only are agencies benefiting from obtaining new data sets from the private sector, but they are also benefiting from the private-sector value added to existing agency data sets.

Agencies must be careful when negotiating data sharing contracts with private-sector entities. In particular, they should pay close attention to data use restrictions and seek maximum flexibility in the use of data. This includes the ability to share data with universities and partner agencies and the ability to generate and share reports and summaries with the general public. Agencies, in turn, should treat the private sector as equal partners who can assist in disseminating information to the public and provide valuable insight into customers' behavior and travel patterns.

Provide tools for easy data access. Data has little value if it is not easily accessible. With continued improvement in bandwidth capabilities, web-based tools and data portals are becoming the norm. These tools allow users to log in and access data from anywhere with an internet connection. In addition to web-based access, the user interface and efficiency of the applications are critical. Poor user interfaces can make it difficult to understand what data and capabilities are available. Similarly, executing a query on a data set and waiting several hours or even days to receive an answer is unacceptable. Users must be able to quickly define a question and receive a response to make data and information useful. This means that agencies need to go beyond establishing databases or big data platforms and ensure that appropriate tools exist to access, visualize, and manipulate data for TPM. In many cases, more than one type of tool will be required to meet the needs and skill sets of different types of users.
For example, some agencies make available one reporting package for technical staff and "power users" and a second for more casual users.

Case D: The I-95 Corridor Coalition collaboratively developed a public–private partnership between member agencies and 3rd-party data providers to take advantage of the latest private-sector data offerings. They created a liberal and flexible model data use agreement that has become the "gold standard" for agencies and consortiums across the country for over a decade.
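API-based access of the kind described in this step can be sketched as follows. The endpoint URL, record fields, and helper names are hypothetical; real portals document their own schemas. Parsing is kept separate from fetching so the transformation logic can be exercised offline.

```python
# Sketch: ingesting TPM data from a hypothetical open-data API.
# The endpoint and JSON schema are illustrative assumptions only.
import json
from urllib.request import urlopen

def parse_segment_speeds(payload: str) -> dict:
    """Map a JSON array of segment records to {segment_id: speed_mph}."""
    return {rec["segment_id"]: rec["speed_mph"] for rec in json.loads(payload)}

def fetch_segment_speeds(url: str) -> dict:
    """Fetch and parse a speed feed from a (hypothetical) API endpoint."""
    with urlopen(url) as resp:
        return parse_segment_speeds(resp.read().decode("utf-8"))

# Offline demonstration with a sample payload in the assumed schema.
sample = '[{"segment_id": "S-101", "speed_mph": 52.5}, {"segment_id": "S-102", "speed_mph": 31.0}]'
print(parse_segment_speeds(sample))
```

Because the payload is machine-readable JSON, third parties can build their own ingestion applications against the same feed, which is the value-add the text attributes to published APIs.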

For more information...
1. Data Presentation on Transportation Agency Websites: Trends and Best Practices (Caltrans, 2017) http://transweb.sjsu.edu/sites/default/files/1501-data-presentation-on-transportation-agency-websites-trends-and-best-practices.pdf
2. Uses of Geospatial Applications for Transportation Performance Management (FHWA, 2016) https://www.gis.fhwa.dot.gov/documents/Uses_of_Geospatial_Applications_for_Transportation_Performance_Management_Case_Studies.pdf
3. State of the Practice on Data Access, Sharing, and Integration (FHWA, 2016) https://rosap.ntl.bts.gov/view/dot/35860
4. NCHRP Synthesis 460: Sharing Operations Data Among Agencies (Transportation Research Board of the National Academies, 2014) http://www.trb.org/Publications/Blurbs/170868.aspx
5. Geospatial Tools for Data Sharing: Case Studies of Select Transportation Agencies (FHWA, 2014) https://rosap.ntl.bts.gov/view/dot/12147

Capabilities Checklist: Share Data

Basic
• Employees are aware of key performance data sources within the agency.
• There are clear agency policies in place that data should be shared unless the need to protect it is demonstrated.
• There are protocols defined for how to share data to meet different needs that consider use of state and federal open data portals and hosted or cloud solutions.
• Open data portals are used to share data.
• Data explanations are provided in "plain English" to help users understand meaning, sources, and limitations.

Advancing
• Data governance and stewardship structures have been established to facilitate communication about data sharing and identify opportunities for synergies across business units for collaborating or combining data sources.
• Data sharing agreements are used (internal to an agency and between an agency and its partners) that specify what data will be shared, when and how it will be shared, and establish a clear understanding of data limitations and expectations for use.
• Data are shared in formats that are designed to meet the needs of different users, which may include standard reports, data feeds, and dashboards.
• Data with sensitive elements are sanitized for public distribution.
• Data contracts and sharing agreements are reviewed to ensure that agency flexibility is retained.

Do's and Don'ts

Do:
• Strive to open your data up to partner agencies and the public.
• Make sure that your data sets are ready to be shared by putting in place some standard criteria (no sensitive information, passed basic quality review, from an authoritative source, etc.).
• Treat other agencies as partners with whom you want to share your data so that you can improve system safety and reliability.
• Put your data into standard formats when it is simple and improves upon your capabilities.
Don't:
• Assume a one-size-fits-all data feed will work for both the public and your agency partners.
• Sign data sharing agreements with restrictive "governing law" language.
• Let a lack of standardization become an excuse for not sharing data.

