
Crash Records Systems (2005)

Chapter Three - Survey Results


Twenty-six states responded to the survey for this synthesis. The vast majority of responses (23 of 26) came from DOTs or the equivalent. One response was received from a highway safety office within the state police agency, a second came from an office of highway safety within the department of public safety, and the final response did not specify the agency. The survey responses reflect crash records systems in place and/or under development during the early summer of 2004.

Figure 3 shows the geographic distribution of the responding states, which are shaded on the map. Table 1 gives an overview of their crash experience. As seen in Figure 3 and Table 1, the responding states are from all areas of the country and represent a broad distribution of crash experience. It is of interest to note that the data in Table 1 also came from different years of crash records. The oldest data came from Washington State (which plans to have 1996–2002 data available soon). As of this writing, only 3 of the 26 states had made their 2003 data easily accessible to the public. Because these data are derived from crash summary data available on the Internet at a particular point in time, it is likely that some of the other states either have 2003 data available for internal use or have a policy against providing crash data on the Internet.

SURVEY RESPONSES

Question 1 of the survey asked respondents whether their responses applied to an existing system, a new system currently under implementation, or a planned future system. Twenty-one of the 26 respondents reported that their answers described a current system. Four of the remaining five respondents indicated that they were describing a new system that was currently under implementation. One respondent (Colorado) said that its responses describe a planned system for which funding is already in place. Because almost 90% of the respondents were from state DOTs, in some cases (e.g., Mississippi) there is an existing crash database at the custodial agency, but a separate state DOT crash system of linked data is currently under development.

Question 2 of the survey asked respondents how long it takes (from the date of the incident to final data entry) to enter a crash into their central crash database. Figure 4 shows the distribution of answers. As may be seen in the figure, the most frequent responses were "Within 90 Days" (10) and "Within 30 Days" (9). Still, almost 20% of respondents indicated that it could take from 91 to 364 days for a crash to be entered into their statewide system ("Less than 1 year"). Two respondents indicated that it takes more than a year to enter a crash into their system. Again, it is likely that some of the respondents who report that their DOT system does not receive crash data for more than 90 days (e.g., New York reports more than a year) have crash data readily available from that state's custodial agency at an earlier time.

Question 3 asked respondents whether all crashes meeting the state threshold are collected and entered into the crash records system. Twenty-two of 26 states (85%) responded "Yes." Three states (California, Connecticut, and Oregon) stated that not all reportable crashes are entered in the system, and one did not answer.

Question 4 was a compound question. The first part of the question asked if users are able to obtain reports from the system. The second part asked how this is accomplished. Figure 5 summarizes the answers to these questions.
Because it is possible for a system to support more than one level of reporting, the data in Figure 5 show the cumulative totals for all answers from each respondent. No respondent reported that users are unable to obtain reports from their crash records system. Twenty of the 26 systems (almost 77%) support ad hoc queries specified by the user, and 17 (65%) indicated that their crash systems support predefined "canned" reports. Of the 17, only 3 states indicated that canned reports were the highest level of reporting available; the remaining 14 crash systems supported both predefined and ad hoc reporting. Four states have systems designed to support users by having them submit requests to trained analysts. In three of these four cases, this was the only way for users to obtain a report. At least some of the states that reported having to submit report requests, such as Iowa, have a university-based center that actively supports these requests.

Question 5 asked respondents to indicate whether roadway, vehicle, driver, emergency medical service (EMS), and other sources of data can be linked with the crash data. Because it is possible for a system to include linkage to more than one external data source, Figure 6 shows the cumulative totals for all answers from each respondent. Twenty of the 26 respondents (77%) indicated that their crash database has links to roadway data.

FIGURE 3 Geographic distribution of responding states.

TABLE 1 CRASH DATA FOR STATES RESPONDING TO THE SURVEY

State          Total     Fatal    Injury     PDO       Fatalities  Injuries  Total Casualties  Year of Data
Arizona        134,228     974     46,209     87,045      1,119      74,230       75,349        2002
Arkansas        70,904     557     28,125     42,222        641      52,474       53,115        2002
California     522,562   3,517    201,478    317,567      3,926     305,907      309,833        2001
Colorado        96,990     595     26,208     70,187        655      38,283       38,938        2002
Connecticut     82,787     319     34,448     48,020        343      51,129       51,472        2000
Delaware        20,408     118      6,021     14,269        137       9,967       10,104        2001
Hawaii          10,848     133      6,125      4,590        140       8,620        8,760        2001
Idaho           26,700     261      9,661     16,778        293      14,601       14,894        2003
Illinois       438,990   1,273     87,458    350,259      1,420     127,719      129,139        2002
Iowa            64,361     394     23,763     40,204        445      36,031       36,476        2000
Kansas          78,271     449     18,495     59,327        511      27,059       27,570        2002
Kentucky       130,347     810     32,393     97,144        915      49,329       50,244        2002
Louisiana      160,991     791     48,800    111,400        902      82,800       83,702        2003
Maine           37,251     153     11,538     25,713        165      16,415       16,580        2000
Maryland       104,843     606     38,875     65,362        661      59,517       60,178        2002
Mississippi     91,687     786     24,228     66,673        871      37,174       38,045        2003
Missouri        94,623     822     27,376     66,425        922      42,298       43,220        2002
Montana         23,529     232      6,479     16,818        269      10,083       10,352        2002
New York       306,050   1,431    172,174    132,445      1,554     259,143      260,697        2001
Nevada          62,237     330     20,475     41,432        381      31,522       31,903        2002
Oregon          48,282     388     18,679     29,215        436      27,791       28,227        2002
S. Carolina    108,280     949     32,427     74,904      1,053      52,095       53,148        2002
Virginia       154,848     860     55,041     98,947        942      78,842       79,784        2003
Washington      51,474     318     22,298     28,858        360      34,178       34,538        1996
W. Virginia     49,913     405     16,859     32,649        444      25,788       26,232        2002
Wisconsin      129,072     723     39,634     88,715        805      57,776       58,581        2002
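The Table 1 columns fit together arithmetically: for each state, Total appears to equal Fatal + Injury + PDO crashes, and Total Casualties appears to equal Fatalities + Injuries. The short Python sketch below transcribes three rows to illustrate the layout; the consistency check is an editorial aid for reading the table, not something stated in the synthesis.

```python
# Illustrative reading of Table 1: three rows transcribed from the table.
# The asserts check the apparent column relationships (Total = Fatal +
# Injury + PDO; Total Casualties = Fatalities + Injuries); the check is
# ours, not part of the report.
rows = [
    # state, total, fatal, injury, pdo, fatalities, injuries, casualties, year
    ("Arizona",    134_228,   974,  46_209,  87_045, 1_119,  74_230,  75_349, 2002),
    ("California", 522_562, 3_517, 201_478, 317_567, 3_926, 305_907, 309_833, 2001),
    ("Washington",  51_474,   318,  22_298,  28_858,   360,  34_178,  34_538, 1996),
]
for state, total, fatal, injury, pdo, fatalities, injuries, casualties, year in rows:
    assert total == fatal + injury + pdo
    assert casualties == fatalities + injuries
    print(f"{state} ({year}): {total:,} crashes, {casualties:,} casualties")
```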

This was more than double the number of linkages reported for any other data source; however, it is not unexpected because the respondents were predominantly from the state DOTs. The next most frequent linkages cited were to the vehicle and driver data files.

Seven respondents reported linkages to sources of data "other" than those cited in the question. These were identified in the survey responses as:

• Linkages to annual average daily traffic volume data (3),
• A link to citation data (1),
• A link to hospital discharge data (1),
• A link to their bridge inventory (1), and
• No source of linked data identified (1).

FIGURE 4 How long it takes (from the date of the crash) for a report to be entered into a traffic records system. (Within 30 Days: 9; Within 90 Days: 10; Less than 1 year: 5; Over 1 year: 2.)

FIGURE 5 How easily can users obtain reports from the system? How does this process work? (Can run my own ad hoc queries: 20; Pre-defined reports come from the system: 17; Request reports from a trained analyst/programmer: 4; No user reports come from the system: 0.)

FIGURE 6 Other sources of safety data that are linked to the system. EMS = emergency medical service. (Roadway: 20; Vehicle: 9; Driver: 8; EMS: 4; Other: 7; None: 1; No answer: 3.)

The following states reported more than one linkage in their survey response:

Colorado—Vehicle and driver.
Iowa—Roadway, vehicle, and driver.
Maryland—Roadway, vehicle, driver, and EMS.
Missouri—Roadway and vehicle.
Mississippi—Roadway, vehicle, driver, and EMS.
Nevada—Roadway, vehicle, driver, EMS, and citations.
New York—Roadway, vehicle, and driver.
South Carolina—Roadway, vehicle, driver, EMS, and hospital discharge.
Virginia—Roadway, vehicle, driver, and bridge inventory.

It is interesting to note that there are many more documented linkages of data available in these reporting states (e.g., CODES projects); however, the survey respondents did not report the additional linkages that might be available. Even within the DOTs, for example, Missouri's report of a roadway linkage refers not to a single roadway characteristic file, but rather to their comprehensive transportation management system. This enterprise-wide system is a GIS-based data system supporting their activities with extensive information about traffic, pavement, safety, bridges, and travelways.
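As an illustration of the file linkage Question 5 asks about, the sketch below joins a small crash extract to a roadway inventory by route and mile point, and to a driver file by license number. The field names, values, and use of pandas are assumptions made for the example; no responding state's actual schema or software is implied.

```python
# A minimal linkage sketch with hypothetical flat-file extracts.
import pandas as pd

crashes = pd.DataFrame({
    "crash_id": [1, 2],
    "route_id": ["US-30", "IA-1"],
    "milepoint": [12.4, 3.1],
    "driver_license": ["D123", "D456"],
})
roadway = pd.DataFrame({          # one row per homogeneous roadway segment
    "route_id": ["US-30", "US-30", "IA-1"],
    "begin_mp": [0.0, 10.0, 0.0],
    "aadt": [8_200, 6_400, 3_100],
})
drivers = pd.DataFrame({"driver_license": ["D123", "D456"], "age": [34, 61]})

# Roadway linkage: attach the segment whose begin mile point most recently
# precedes the crash mile point (merge_asof needs both frames sorted on the key).
linked = pd.merge_asof(
    crashes.sort_values("milepoint"),
    roadway.sort_values("begin_mp"),
    left_on="milepoint", right_on="begin_mp",
    by="route_id", direction="backward",
)
# Driver linkage: an ordinary key join.
linked = linked.merge(drivers, on="driver_license", how="left")
print(linked[["crash_id", "route_id", "milepoint", "aadt", "age"]])
```

States with GIS-based systems would typically express the roadway linkage spatially rather than through a mile-point join, but the idea of carrying a common key (location, license number, plate, EMS run number) is the same.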

Question 6 asked respondents to tell us what location coding methods are used in their systems. The answers are sorted into three basic categories:

1. Locations based on posted locations in the field (e.g., mileposts),
2. Document-based systems that assign a calculated location code (e.g., mile point, log point), and
3. Locations in which a latitude and longitude are collected by GPS or a GIS map is used to pinpoint the location.

Figure 7 summarizes the responses to Question 6. Because it is possible for a system to include more than one location coding method, this figure shows the cumulative totals for all answers from each respondent.

FIGURE 7 Location coding methods used. (GIS/GPS map-based: 14; Posted in field: 12; Document-based: 16.)

Fourteen of the 26 states that responded to the survey, or almost 54%, are using GIS and/or GPS map-based systems with coordinates. These systems include reading a GPS unit to obtain coordinates either automatically or manually at a crash site, GIS locator routines to identify a site by pointing to a map, and after-the-fact locating of a crash on a map based on the officer's description of the location. Twelve crash records systems use location coding schemes based on reference posts or mile markers placed on roadsides, and 16 systems use a document-based mile point and calculated displacement methodology for locating a crash.

Question 7 asked respondents to specify what percentage of crashes is located reliably in their system. The answers ranged from 50% to 100%, with the median response at 94%. The mean response was 88.7%. Two respondents reported that the percentage of crashes reliably located was unknown. In general, the crash records systems that identify locations based on one of the methods of obtaining coordinates are perceived as more accurate and descriptive than the crash systems using traditional field and document-based methods. Being able to conduct spatial analysis with crash and related data in a GIS was cited as a considerable advantage of the coordinate-based location method.
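To make the coordinate-based category concrete, the sketch below shows one way a GPS-reported crash point can be reduced to a linear reference: the point (assumed already projected to planar feet) is snapped to the nearest spot on a route centerline and reported as a mile point plus an offset. The geometry, units, and function are illustrative assumptions, not a description of any state's locator routine.

```python
# A minimal, planar linear-referencing sketch (illustrative only).
from math import hypot

def snap_to_route(route, point):
    """Return (mile point, offset in feet) of `point` against `route`, a list
    of (x, y) vertices in feet ordered by increasing mile point."""
    px, py = point
    best_offset, best_along = float("inf"), 0.0
    travelled = 0.0
    for (x1, y1), (x2, y2) in zip(route, route[1:]):
        dx, dy = x2 - x1, y2 - y1
        seg_len = hypot(dx, dy)
        if seg_len == 0.0:
            continue
        # Parameter of the perpendicular projection, clamped to the segment.
        t = max(0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / seg_len**2))
        cx, cy = x1 + t * dx, y1 + t * dy
        offset = hypot(px - cx, py - cy)
        if offset < best_offset:
            best_offset, best_along = offset, travelled + t * seg_len
        travelled += seg_len
    return best_along / 5280.0, best_offset

# Example: a two-segment centerline and a crash point roughly half a mile along it.
centerline = [(0.0, 0.0), (2640.0, 0.0), (5280.0, 100.0)]
mp, offset_ft = snap_to_route(centerline, (2650.0, 30.0))
print(f"mile point {mp:.2f}, offset {offset_ft:.0f} ft from centerline")
```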
Question 8 asked respondents for an estimate of the cost to develop their crash records system. Thirteen respondents reported a total cost for developing their statewide crash records system. Although it was not possible to determine the system elements included in the total cost, the mean cost of the crash systems reported was just over $861,000. The median cost was $500,000. The difference between these two measures indicates that some outliers likely affected the mean; in this case, one system came in at $3,500,000 and two systems were near $2,000,000. The remaining 10 systems reported much lower costs. Excluding the three multimillion-dollar systems, the mean cost of the systems was approximately $390,000. This estimate is much closer to the median value of $400,000 for these 10 systems, indicating that for these lower-priced systems, the measure of central tendency is not overly affected by outliers.
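The outlier effect described above is easy to reproduce. In the sketch below, only the three multimillion-dollar figures come from the text; the ten lower-cost values are invented solely to show how a few large systems pull the mean well above the median.

```python
# Illustrative only: invented development-cost figures for the ten lower-cost
# systems, plus the three multimillion-dollar systems mentioned in the text.
from statistics import mean, median

costs = [150_000, 250_000, 300_000, 350_000, 400_000,
         400_000, 450_000, 500_000, 550_000, 600_000,   # invented lower-cost systems
         2_000_000, 2_000_000, 3_500_000]                # outliers reported in the survey

print(f"all 13 systems: mean ${mean(costs):,.0f}, median ${median(costs):,.0f}")
print(f"lowest 10 only: mean ${mean(costs[:10]):,.0f}, median ${median(costs[:10]):,.0f}")
```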

Twelve of the respondents reported that they did not know the cost of developing their systems. Missouri reported a cost for their enterprise-wide system of $24 million, which included numerous data files linked to their crash data. This cost was not included in the averages cited previously, as there was no obvious way to apportion the cost of the crash component of the system. Recent costs for large crash systems that were not reported in this survey include the Texas crash system, which is expected to cost approximately $9 million, and the Indiana crash system, which has cost approximately $5.5 million.

Question 9 asked respondents for the cost of collecting and entering crash data into their crash records system. Eleven of the 26 states reported this cost, with 3 providing a cost per crash. For the other eight, cost per crash was calculated using the summary data (annual total cost) of the system and an estimate of the total number of crashes based on the data reported in Table 1. The costs ranged from a low of $1.53 per crash in California to a high of $38.85 in Washington State. Costs for Washington State and Oregon (at $19.88 per crash) were by far the highest reported, with the next highest cost per crash reported at $7.61 (estimated for Missouri). It should be noted that the discrepancies in the crash costs reported could be the result of many factors, including inconsistencies in the cost components counted as part of the estimate, the methods used to compute the component costs, and actual differences in the costs of labor and other items in the various locales. These costs bracket the average of $21.00 per crash calculated in the 1998 crash cost study described in chapter two.
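For the eight states that reported only an annual total, the per-crash figure described above is a simple ratio of annual cost to annual crash count. A minimal sketch, pairing a hypothetical annual cost with one of the Table 1 crash totals:

```python
# Illustrative only: the $500,000 annual cost is hypothetical; the crash count
# is Iowa's Table 1 total, used purely to give the ratio a realistic scale.
def cost_per_crash(annual_cost_dollars: float, annual_crashes: int) -> float:
    """Annual cost of collecting and entering crash data, divided by crash count."""
    return annual_cost_dollars / annual_crashes

print(f"${cost_per_crash(500_000, 64_361):.2f} per crash")
```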
Question 10 asked respondents what features and capabilities they like about their crash records systems. Twenty-four of the states responded to this open-ended question by listing one or more features. Because most respondents indicated that there were several features that they liked about their system, the data in Figure 8 show the cumulative totals for all answers from each respondent when summarized in four broad categories.

• Data collection—Ten of the 24 respondents mentioned data collection as a feature they like about their crash records system. Three of these respondents mentioned electronic transfer of crash data into their system, and eight were particularly pleased with the data edits and quality control in their system.
• Management—Nine of the 24 respondents favorably mentioned the management and maintenance of their crash records system. Of these, five spoke of the benefits of their document management system and five spoke of the ease with which the crash records system could be managed.
• Linkage—Ten of the 24 respondents were pleased with the ability of their crash records systems to link with other components of the traffic records system. Four of the 10 were particularly pleased with their ability to use location or GIS as a means of linking data with roadway and other inventories. Other data components mentioned as linked to crash records systems included driver, vehicle, EMS, and hospital discharge data.
• Analysis and reporting—Approximately two-thirds of the respondents (15 of 24) believe that the best feature of their crash records system is the ease with which they can do analysis and reporting of the data. The reporting responses include query capability, canned reports, ad hoc reports, and exporting of data to other systems.

FIGURE 8 What features and capabilities do you like about your crash records system? (Data collection: 10; Management: 9; Linkage: 10; Analysis and reporting: 15.)

Question 11 asked respondents what they would change about their crash records system if they could start over. Twenty-three of the states responded to this open-ended question by listing one or more features. Because most states indicated that there were several features they would like to change about their systems, the data in Figure 9 show the cumulative totals for all answers from each respondent when summarized in four broad categories.

• No change—Four of the 23 respondents indicated that they would not change anything about their crash records system.
• Easier data collection and access—More than half of the respondents (12 of 23) specified that they would like to automate and streamline data collection procedures with electronic data collection and use of additional technology in the field (e.g., bar code scanning and GPS), user-friendly interfaces, personal computer-based relational databases, simpler ad hoc reporting, and/or Internet-based access.
• Agency coordination—Three of the 23 respondents indicated that they would like better interagency coordination to ensure that all needs are being met or to effect legal changes that would mandate standards for crash data collection.
• Linkages to other data sources—More than half of the respondents (13 of 23) mentioned that they would like to include better linkages to other data components for both data entry and reporting. Nine of these specifically would have included better linkage with location control data for both data entry and reporting.

FIGURE 9 What would you change about your crash records system if you could start over? (No change: 4; Ease of collection: 12; Coordination: 3; Linkage: 13.)

Question 12 was an open-ended question asking respondents to indicate if they know of any good crash records systems. Twelve states responded to this question. Half (6) believed their own state has a good crash records system. Two of the respondents indicated they had looked at several systems and had found only components of a good crash records system, and no completely good system. Six respondents named other states as having a good crash records system, with four of these responses citing the Iowa TraCS system.

Question 13 was an open-ended question asking respondents to indicate what characteristics they like about the crash systems they named in Question 12. Ten states responded to this question, and half (5) mentioned characteristics of their own crash systems that they had provided in their answers to Question 10. Four of the remaining five discussed characteristics that they like about the Iowa TraCS system, such as the location tool, electronic crash entry on laptop computers in the car, and electronic submission with internal edits and sharing of common data. The remaining respondent liked the cluster search program in Colorado that was part of an FHWA research project.

SUMMARY

Overall, the survey response was gratifying. The geographic dispersion of the responses gives some measure of comfort in claiming that the results are representative of the United States. Unfortunately, some large states and some states that are known to be working on new crash records and traffic records systems did not respond to the survey. Where possible, the information on successful practices and initiatives to be presented in the next chapter will be supplemented with information gathered on various crash systems and practices from other sources, such as periodic traffic records assessments.

The survey results show that the data are more timely and complete than might be expected: more than 80% of states claim to have data entry completed within 90 days of a crash, and almost 85% of states claim to have all reportable crashes coded into their systems. Access to analytic results also appears to be satisfactory, with more than 75% of states giving users the capability to run ad hoc queries on their own. Not surprisingly, linkage of the crash file to other sources of traffic records information is uneven.
More than 75% of crash records systems link to roadway data, but this was more than double the percentage of linkages reported to driver and vehicle data. Most crash systems use more than one location coding method, with traditional document-based and map-based methods being the most prevalent. The ability of states to code the locations of crashes is quite good, with almost 90% of crashes located in the crash records system. The use of GPS to locate crashes in the field, or of GIS maps to pinpoint a crash location, is increasing.

All of these capabilities cost money. For the half of the respondents who gave us this information, the average cost to develop a system was just over $850,000. There were 10 systems that cost less than $1 million and 3 systems that cost more than $1 million. The ongoing cost of having data in the crash records system was addressed by 11 of the states, and the cost per crash varied widely, from a high of almost $40 to a low of just over $1.50. It is likely that the wide variance reflects which steps of crash records system processing were actually included in the costs cited.

The most popular features of the current crash records systems were analysis and reporting, linkage, and data collection; however, only analysis and reporting were cited by a majority of the users. The most frequently requested improvements are linkage to other systems and easier data collection and access. Approximately one-quarter of the respondents mentioned their own crash records system as a model, and only a handful mentioned other crash records systems as having all the features they would like to have in their own.

From the survey responses, there was no consensus among practitioners about a crash records system that served all aspects of a successful traffic records system. There are systems that, perhaps, efficiently capture all the needed data in a single area (e.g., crashes). However, that does not translate to the broader traffic records arena and the other systems needed to support users. As might be expected, the situation is characterized best as a patchwork of data ranging from delayed and incomplete to timely and complete.
