
Training of Traffic Incident Responders (2012)

Chapter: Chapter 3 - Findings and Applications

Suggested Citation:"Chapter 3 - Findings and Applications." National Academies of Sciences, Engineering, and Medicine. 2012. Training of Traffic Incident Responders. Washington, DC: The National Academies Press. doi: 10.17226/22810.


Assessment of Training Effectiveness

An assessment of training effectiveness was performed using the Kirkpatrick four-level model of training evaluation, the training industry standard for evaluating training program effectiveness (8). An overview of the model and the four levels of assessment is presented in Table 3.1. For the purposes of the SHRP 2 L12 project, Levels 1 and 2 were the only levels that could be assessed effectively.

At the end of each pilot course, participants were asked to complete a course evaluation form. The data compiled from these evaluation forms were used to analyze the Level 1 (reaction) data, which assess how the participants felt about the training experience. Analysis consisted of identifying any trend data indicating areas in which most participants (80% or more) had issues with critical items. Table 3.2 provides the combined Level 1 (reaction) responses from both pilots. The data are based on 33 respondents from the first pilot and 17 respondents from the second pilot who completed the Level 1 evaluation forms. The overall Level 1 evaluation indicates that the students reacted positively to the instructional program. Issues identified by the evaluation analysis were addressed between the pilots, as the data demonstrate, as well as after the pilots.

At the end of each pilot course, participants were also asked to complete a test to determine their level of understanding of the course materials. The data were analyzed in several ways: the overall score obtained by each student, student performance on individual questions (to determine the clarity, reliability, and validity of the questions), and how scores compared across disciplines. Initially, the results of the test were used to assess participants' Kirkpatrick Level 2 (learning) understanding of course materials and increases in knowledge and capabilities based on completion of the course. The research team then analyzed the total participant results on each test item.
This analysis assessed participants' increase in knowledge and ability related to multidisciplinary TIM programs. Test item analysis was performed to determine the root cause of any statistically significant failure of the group on specific test items. This test item analysis provided a qualitative assessment of test item and instructional issues. When test items or instructional changes were made between pilots, the test items driving the change were eliminated from pilot-to-pilot comparisons.

The third set of data analyzed was the assessment results (Level 2), organized by incident responder type. The mean, median, and mode scores for each group were determined, along with the range of results. A standard deviation was derived for each grouping (cluster) to determine whether any group's scores deviated from the total group to a statistically significant degree. Had such a deviation been identified, cross-comparisons would have been performed to identify any deficiencies in instruction for specific participant groups; none was found.

Table 3.3 provides the analysis of the total group and the individual groups of participants for the Georgia pilot course delivery. Five participants' results are included in the total group but are not evaluated as a specific role-based group because they attended the pilot as evaluators. One other candidate was discounted because the name on the assessment did not match any name on the class roster; therefore, the candidate's discipline could not be identified. The results, in aggregate, were within an expected or normal (bell) distribution; that is, the students' test results were within planned parameters. No group deviated significantly from the total group in terms of results, indicating that the instruction worked satisfactorily regardless of an individual's job role, as measured by the Kirkpatrick Level 2 testing.
Participants were expected to achieve a score of at least 80 out of 100 to receive a certificate establishing successful completion of the course. The data indicate that no changes to the instruction or testing are required to accomplish the learning objectives adequately.
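The per-group score analysis described above (mean, median, mode, range, standard deviation, and the 80-point certification threshold) can be sketched with Python's statistics module. The scores below are illustrative only, not the pilot data.

```python
# Sketch of the per-group Level 2 score analysis; scores are hypothetical.
import statistics

PASS_SCORE = 80  # certificate threshold out of 100, per the report

scores = [94, 88, 90, 85, 90, 83, 87]  # one hypothetical responder group

mean = statistics.mean(scores)
median = statistics.median(scores)
mode = statistics.mode(scores)
stdev = statistics.pstdev(scores)        # population std dev over the group
score_range = max(scores) - min(scores)
pass_rate = sum(s >= PASS_SCORE for s in scores) / len(scores)

print(f"mean={mean:.1f} median={median} mode={mode} "
      f"stdev={stdev:.2f} range={score_range} pass_rate={pass_rate:.0%}")
```

In the report's analysis, a group whose standard deviation or mean departed markedly from the total group would have triggered a cross-comparison for instructional deficiencies; here every hypothetical score clears the 80-point threshold.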

Table 3.1. Kirkpatrick Four-Level Model of Training Evaluation

| Level | Evaluation Type (What Is Measured) | Evaluation Description and Characteristics | Examples of Evaluation Tools and Methods |
| --- | --- | --- | --- |
| 1 | Reaction | Reaction evaluation is how the learner felt about the training; it is a form of customer satisfaction. | Smile sheets, feedback forms, verbal reaction, post-training surveys, or questionnaires. |
| 2 | Learning | Learning evaluation is the measurement of the increase in the learner's knowledge and skill and the change in attitude from the beginning to the end of instruction. | Assessments or tests. |
| 3 | Behavior | Behavior evaluation is the extent of applied learning and level of implementation when the learner is back on the job. | Observation and interviews over time are required to assess change, relevance of change, and sustainability of change. |
| 4 | Results | Results evaluation measures the effect of the learning that the learner acquires on business measures. | Measures are already in place via normal management systems and reporting; the challenge is to relate them to the trainee. |

Table 3.2. Combined Level 1 Reactions

| Question | Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree |
| --- | --- | --- | --- | --- | --- |
| Scheduling | | | | | |
| Date and timing of training fit my schedule. | 28% | 62% | 8% | 0% | 0% |
| Trainer | | | | | |
| The instructors clearly explained the goals and objectives of the training. | 46% | 54% | 0% | 0% | 0% |
| The instructors clearly conveyed the material to the audience. | 54% | 44% | 0% | 2% | 0% |
| The instructors' knowledge of the subject material was satisfactory. | 85.71% | 14.29% | 0% | 0% | 0% |
| The instructors' pace of presenting the material was appropriate. | 42% | 46% | 10% | 2% | 0% |
| The instructors satisfactorily answered participants' questions. | 64% | 34% | 2% | 0% | 0% |
| The instructors satisfactorily used training aids to help facilitate a clearer understanding of the topic. | 64% | 34% | 0% | 0% | 2% |
| The written material provided helped me understand the content of the training. | 34% | 28% | 30% | 8% | 0% |
| Overall Training | | | | | |
| The content of this training course was valuable to me in developing my understanding of this subject matter. | 62% | 38% | 0% | 0% | 0% |
| The content of this training appropriately built on my existing knowledge of this subject matter. | 60% | 38% | 2% | 0% | 0% |
| I am satisfied that the learning objectives for this training were met. | 48% | 52% | 0% | 0% | 0% |
| The duration of the training was sufficient for learning the subject matter. | 40.43% | 42.55% | 17.02% | 0% | 0% |
| Based on the training I received, I am able to explain the subject matter to others who may need future assistance on this topic. | 36% | 56% | 6% | 2% | 0% |
| I am likely to request or attend additional training on this topic in the future. | 50% | 30% | 14% | 6% | 0% |
| I would recommend this training to others. | 70% | 30% | 0% | 0% | 0% |
| The training environment was comfortable/appropriate for the class. | 72% | 22% | 4% | 2% | 0% |

| Time-Saving Measures | >10 hours | 6–10 hours | 3–5 hours | 1–2 hours | 0 hours |
| --- | --- | --- | --- | --- | --- |
| Estimate the time this training may save you on researching information. | 31.91% | 25.53% | 31.91% | 10.64% | 0% |
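The Level 1 trend screen described earlier, which flags any item where most participants (80% or more) reported issues with critical items, can be sketched as follows. The question names, response categories, and percentages here are hypothetical, not the pilot data.

```python
# Sketch of the Level 1 (reaction) trend screen; all data are illustrative.
ISSUE_THRESHOLD = 80.0  # percent of respondents, per the report's criterion

# Each entry: question -> percentage of respondents per response category.
responses = {
    "Written material helped me": {"Agree": 62, "Neutral": 30, "Disagree": 8},
    "Pace was appropriate": {"Agree": 88, "Neutral": 10, "Disagree": 2},
}

def flag_critical_items(responses, threshold=ISSUE_THRESHOLD):
    """Return questions where negative responses meet the trend threshold."""
    flagged = []
    for question, pcts in responses.items():
        negative = pcts.get("Disagree", 0) + pcts.get("Strongly Disagree", 0)
        if negative >= threshold:
            flagged.append(question)
    return flagged

print(flag_critical_items(responses))  # no item crosses the 80% threshold here
```

Consistent with the report's finding of broadly positive Level 1 reactions, no item in this illustrative data set would be flagged.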

Table 3.3. Analysis of Certification Test Results: Georgia Pilot Course Delivery

| | Total Group | Law Enforcement | Fire and Rescue | Towing | Hazmat | Notification and Dispatch | DOT |
| --- | --- | --- | --- | --- | --- | --- | --- |
| No. of students | 31 | 6 | 4 | 5 | 1 | 3 | 6 |
| Mean | 86% | 89% | 88% | 87% | 94% | 85% | 83% |
| Median | 87% | 89% | 89% | 87% | 94% | 85% | 83% |
| Mode | 90% | 88% | 90% | 90% | 94% | NA | NA |
| Range | 19 | 6 | 6 | 14 | 0 | 3 | 13 |
| Standard deviation | 0.050403 | 0.020619 | 0.028570 | 0.055967 | 0.000000 | 0.015430 | 0.053226 |

Course Length

The training course was originally proposed as a 1-day course with 8 hours of class time and an additional 4 hours for the train-the-trainer component. However, when the research team gathered the core competencies and began the curriculum design process, it became apparent that this time was insufficient to cover all content adequately. During the pilot course deliveries, the time to deliver each lesson was recorded and analyzed to determine the amount of time required for the successful completion of each lesson. The results of these analyses are shown in Table 3.4. According to standard classroom practice, students should be allowed a 10-minute break for every 50 minutes of instruction, meaning that 15.5 hours should be allocated to ensure adequate time to deliver the incident responder portion of the training, with an additional 3 hours allocated to the train-the-trainer portion.

Table 3.4. Analysis of Lesson Durations (No. of Minutes)

Incident responder

| Lesson | Pilot 1 | Pilot 2 | Recommended | Comments |
| --- | --- | --- | --- | --- |
| Lesson 0: Course introduction | 140 | 70 | 60 | After the first pilot, a decision was made to move the terminology and statistics component to another lesson to improve the clarity of the message. Additional evaluator introductions and comments were included in the timing of this lesson during the second pilot, which would not routinely occur. |
| Lesson 1: Statistics, terminology, and standards | NA | 80 | 80 | This lesson now exists as a stand-alone lesson. |
| Lesson 2: Notification and response | 20 | 40 | 40 | The team decided to use additional examples in this lesson to underscore teaching points and focus more on the role of dispatch (traffic operations center and traffic management center), which increased the time requirement for this lesson. |
| Lesson 3: Arrival | 70 | 75 | 75 | The timing of this lesson remained consistent during both pilots. |
| Lesson 4: Initial size-up | 30 | 30 | 30 | The timing of this lesson remained consistent during both pilots. |
| Lesson 5: Command responsibilities | 20 | 45 | 60 | Recommendations from the first pilot and the working group suggested expanding this section to include more detail and examples. However, during the second pilot, time constraints meant that not all the material could be covered. |
| Lesson 6: Safety, patient care, and investigation | 90 | 80 | 90 | During the second pilot, time constraints meant that not all the material could be covered. |
| Lesson 7: Traffic management | 110 | 80 | 120 | During the second pilot, time constraints meant that approximately one-third of the material and activities in this lesson could not be covered. |
| Lesson 8: Clearance | 30 | 60 | 60 | Recommendations from the first pilot and the working group suggested expanding this section to include more examples as well as more detail on quick clearance laws. |
| Lesson 9: Termination | 3 | 2 | 30 | Time constraints meant that the team was unable to test this lesson during either pilot. |
| Lesson 10: Tabletop | 60 | 75 | 90 | Recommendations from the first pilot and the working group suggested including a two-dimensional walkthrough before moving to the tables for the hands-on portion. During the second pilot, time constraints meant that student groups were only able to rotate through two scenarios. |
| Lesson 11: Situational awareness | 20 | 30 | 45 | A detailed demonstration of cone placement was added to this lesson after the first pilot. This change increased the lesson duration but allowed it to engage participants and be adequately staged. |

Train the trainer

| Lesson | Pilot 1 | Pilot 2 | Recommended | Comments |
| --- | --- | --- | --- | --- |
| Assessment | 60 | 90 | 60 | Additional assessment questions were tested during the second pilot, which added to the duration of this component. Because future participants will complete only a randomized selection of test questions, 60 minutes will be adequate for assessment. |
| Adult learning theory | 60 | NA | NA | A decision was made to drop this lesson after the first pilot because it was determined that potential instructors should have acquired this knowledge through training and experience as instructors at their respective agencies. |
| Legal guidelines and considerations | 60 | 60 | 60 | The timing of this lesson remained consistent during both pilots. |
| Resources, best practices, and real-world scenarios | 35 | 35 | 60 | Additional content and resources were added as a result of the second pilot. |
| Hands-on activity setup | NA | 15 | 15 | The need for this lesson was identified during the first pilot. |
| Situational awareness setup | NA | NA | 15 | The need for this lesson was identified during the second pilot. |
| Course logistics and orientation | NA | NA | 30 | The need for this lesson was identified during the second pilot. |

NA = Not applicable.
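The course-length arithmetic above can be checked directly: summing the recommended minutes for the incident responder lessons in Table 3.4 and scaling by the 10-minute break per 50 minutes of instruction rule reproduces the roughly 15.5-hour allocation stated in the text.

```python
# Worked check of the course-length arithmetic: recommended minutes per
# incident responder lesson (Lessons 0-11, from Table 3.4).
responder_minutes = [60, 80, 40, 75, 30, 60, 90, 120, 60, 30, 90, 45]

instruction = sum(responder_minutes)   # total minutes of instruction
with_breaks = instruction * 60 / 50    # add one 10-min break per 50 min taught

print(f"{instruction} min of instruction -> {with_breaks / 60:.1f} h with breaks")
# 780 min of instruction -> 15.6 h with breaks
```

The 15.6-hour result matches the approximately 15.5 hours the report allocates for the incident responder portion once breaks are included.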

Modular Design

To make the training more readily accessible to participants, the lessons were designed as self-contained modules. This modular construction enables the training to be administered over several consecutive days, weeks, or months, depending on the needs of the intended recipients or trainers, making the content accessible to, for example, volunteer firefighters, who may not be able to attend a full 2-day session. Examples of modular delivery options could include the following:

• Brief (e.g., 15-minute) segments in daily roll call training delivered to patrol officers during briefing sessions at shift change;
• Fire department weekly safety meetings; and
• Integration with existing training programs (see section in Chapter 4 on online training).

The course includes curriculum materials that could be exported to other delivery platforms, such as the Transportation Operations Academy offered by the Center for Advanced Transportation Technology at the University of Maryland, the I-95 Corridor Coalition's 3-D interactive training program, or the initial in-service training offered by law enforcement or fire academies.

The research team recommends that facilitated feedback sessions with national and regional stakeholder organizations (e.g., the International Association of Chiefs of Police, the National Fire Protection Association, the Towing and Recovery Association of America, the National Emergency Management Association, and the American Association of State Highway and Transportation Officials) be convened at upcoming conferences to explore the attractiveness, feasibility, and ideal environments for modular delivery of the training.

Multidisciplinary Training

Each of the pilot training courses was delivered by a multidisciplinary training team that combined practical TIM experience with extensive training experience.
This model proved highly effective:

• The training team's practical experience established credibility with the course participants. Participants viewed the trainers as having real-world experience, which in turn helped establish the credibility of the course materials.
• The training team represented fire and rescue and law enforcement. This dual representation helped establish credibility with the course participants and also encouraged more student participation. The trainers discussed their respective disciplines' views on TIM, which encouraged the representatives of stakeholder groups to share their own experiences.
• The instructors' previous training experience ensured that the training team understood how to deliver training using the multiple instructional methods included in the course.

During the pilots, a deliberate effort was made to ensure that the class consisted of a balanced mix of participants from the primary incident response roles at a typical incident scene. In many cases, this was the first time any of the students had sat down with members of other disciplines. Throughout the training, participation in instructional activities allowed students to gain unique perspectives on and understanding of the roles that responders from other disciplines play. Because this training was conducted on a regional level, responders also had an opportunity to build relationships; this opportunity is valuable because some of the attending responders are likely to respond to the same incidents.


TRB’s second Strategic Highway Research Program (SHRP 2) Report S2-L12-RW-1: Training of Traffic Incident Responders presents the results of a project that developed a training program for traffic incident responders and managers.

The training program described in the report contains two components: training of trainers and incident responder training.

This report is available only in electronic format.

For more information on traffic incident responder training, contact your state's FHWA division office.
