
Chapter 3 - Findings

This section of the report presents major findings from the research efforts. The team began with an analysis of business needs and continued with the specification, design, development, and testing of the TIM assessment tool.

Needs Analysis

Figure 3.1 provides a visual overview of the assessment needs and the assessment process.

Assessment Needs: What to Assess

The research team believes that the TIM training courses should bring about two aspects of learning: knowledge transfer and on-the-job behavioral changes. Both types of learning have the potential to positively affect performance results for responder agencies in terms of safety and efficiency. This section describes assessment needs in terms of Kirkpatrick's four-level evaluation model.

Level 1 Assessment: Reaction

A reaction evaluation measures students' personal reactions to a training experience. Questions explore whether
• The content was relevant to the student's job;
• The course was perceived to be helpful in terms of improving the student's job performance;
• The subject matter was well organized;
• Training materials were effectively presented;
• The training session provided the student opportunities to participate; and
• The training was a satisfactory learning experience.
Because these types of assessments are rather general, the research team decided to use the National Highway Institute (NHI) course evaluation form as the basis for the Level 1 questions.

Level 2 Assessment: Learning

A learning evaluation measures the increase in knowledge or intellectual capability from before to after the training experience. It aims to determine the following:
• Did the students learn what was intended to be taught?
• Did the students experience what they were intended to experience?
• What is the extent of advancement or change in the students after the training, in the direction or area that was intended?
Questions for students might include these:
• According to the TIM phases of incident response, which of the following is the next responder duty after incident arrival?
  A. Initial size-up
  B. Traffic management
  C. Investigation
  D. Clearance
• Why is it important for Communications Center personnel to provide the geographic location of an incident using mile markers or the nearest intersection?
  A. To provide the most accurate description for later-arriving responders
  B. To track which intersections see the greatest occurrence of incidents
  C. To identify the type of incident
  D. To more accurately identify the specific location of the incident
• Why should responders approach a burning vehicle from a vantage point other than the front or rear of the vehicle?
  A. Items may violently explode, propelling loose parts off the vehicle
  B. To avoid smoke inhalation
  C. So as not to interfere with other firefighting activities
  D. To mitigate the dangers of passing traffic
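For illustration, the following is a minimal sketch of how multiple-choice items like these might be represented and machine-scored in the assessment tool. The data structure and function are hypothetical (they are not taken from the L32A exam), and the answer key shown is a placeholder, since the report does not reproduce the exam's key.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ExamItem:
        """One multiple-choice Level 2 (Learning) exam item."""
        item_id: str
        prompt: str
        choices: dict   # choice letter -> choice text
        correct: str    # letter of the keyed choice

    ITEM_BANK = [
        ExamItem(
            item_id="L2-001",
            prompt=("According to the TIM phases of incident response, which of "
                    "the following is the next responder duty after incident "
                    "arrival?"),
            choices={"A": "Initial size-up", "B": "Traffic management",
                     "C": "Investigation", "D": "Clearance"},
            correct="A",  # placeholder key for illustration; not the L32A key
        ),
    ]

    def score_exam(answers: dict, bank: list) -> float:
        """Return the fraction of items answered correctly."""
        right = sum(1 for item in bank if answers.get(item.item_id) == item.correct)
        return right / len(bank)

    print(score_exam({"L2-001": "A"}, ITEM_BANK))  # 1.0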

Level 2 assessment questions are typically created by the subject experts who develop a training program, since they are the most familiar with the course material. The Project L32A final report included a 92-question student exam, which the research team decided to use as the basis for this project's Level 2 assessment (Transportation Research Board 2013).

Figure 3.1. SHRP 2 Project L32C TIM training assessment process.

Level 3 Assessment: Behavior

A behavior evaluation measures the extent to which the trainees applied the learning and changed their behaviors; this may occur immediately or several months after the training, depending on the situation. The goal of this evaluation is to determine
• Did the trainees put their learning into effect when back on the job?
• Were the relevant skills and knowledge used?
• Was there noticeable and measurable change in the activity and performance of the trainees when back in their roles?
• Was the change in behavior and/or new level of knowledge sustained?
• Would the trainee be able to transfer his/her learning to another person?
Questions at this level are designed for trainees, their peers, and their immediate supervisors, who observe the trainees' on-the-job behaviors on a regular basis. Sample behavior-oriented questions based on the TIM training material from Projects L12 and L32A include the following:
• Using the definition of a TIM timeline in Figure 3.2, what student behavioral changes were implemented or observed to shorten the duration of each phase?
• Were student behavioral changes implemented or observed to
  - Better communicate locations of incidents?
  - Better describe the nature of incidents?
• Were student behavioral changes implemented or observed to better ensure the response contains the appropriate resources?
• Were student behavioral changes implemented or observed in terms of responder vehicle positioning?
• Were student behavioral changes implemented or observed in terms of lane blocking?
• Were student behavioral changes implemented or observed to ensure that TIM responders wear appropriate safety apparel?
• Were student behavioral changes implemented or observed to better coordinate multiagency TIM operations?
• Were student behavioral changes implemented or observed to better anticipate and prepare the necessary TIM resources?
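Several of these behavior questions, and the Level 4 measures that follow, rest on the phase durations of the TIM timeline (Figure 3.2). As a rough sketch, the two clearance measures commonly used in FHWA TIM performance measurement could be computed from incident milestone timestamps as shown below; the record layout and field names are assumptions for illustration, not a prescribed schema.

    from datetime import datetime

    def minutes_between(start: datetime, end: datetime) -> float:
        """Elapsed minutes between two timeline milestones."""
        return (end - start).total_seconds() / 60.0

    # Illustrative incident record. The milestones follow the common FHWA TIM
    # measures (first recordable awareness, all lanes open, last responder
    # departed), but this exact structure is an assumption.
    incident = {
        "awareness":      datetime(2014, 5, 1, 14, 2),
        "all_lanes_open": datetime(2014, 5, 1, 14, 47),
        "last_responder": datetime(2014, 5, 1, 15, 10),
    }

    roadway_clearance = minutes_between(incident["awareness"], incident["all_lanes_open"])
    incident_clearance = minutes_between(incident["awareness"], incident["last_responder"])
    print(f"Roadway clearance: {roadway_clearance:.0f} min")    # 45 min
    print(f"Incident clearance: {incident_clearance:.0f} min")  # 68 min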

Level 4 Assessment: Results

A results evaluation measures the effect on the organization or environment resulting from the improved performance of trainees. Measures typically involve business or organizational key performance indicators, such as volumes, values, percentages, timescales, return on investment, and other quantifiable aspects of organizational performance. For example,
• Reduction in number of TIM responder deaths and injuries;
• Improved incident and roadway clearance time;
• Reduction in number of secondary incidents;
• Equipment and resource readiness; and
• Reduction in TIM responder turnover.
Level 4 questions are generally directed toward senior management. Example results-oriented questions for post-TIM training evaluation include the following:
• How many TIM responder injuries occurred on average in a 6-month period before TIM training? How many TIM responder injuries occurred in the most recent 6 months after TIM training?
• How many secondary incidents occurred on average in a 6-month period before TIM training in your state? How many secondary incidents occurred in the most recent 6 months after TIM training?
• What was the average time needed to clear a major incident before TIM training? What was the average time needed to clear a major incident during the most recent 6 months after the TIM training?
• How many times was incident clearance delayed due to lack of equipment and/or resource readiness in a 6-month period before TIM training? How many times was incident clearance delayed due to lack of equipment and/or resource readiness in the most recent 6 months after TIM training?
• What was the TIM responder turnover rate before TIM training? What was the TIM responder turnover rate during the most recent 6 months after the TIM training?
To achieve optimal assessment results, the following may need to be adjusted:
• Duration of the evaluation period (the research team used a 6-month period to obtain assessment data within the project's time frame); and
• When the evaluation period starts—as stated previously, behavioral changes take time, and those changes need to be observed before evaluating organizational results.

Source: FHWA.
Figure 3.2. TIM timeline.

Assessment Process: How and When to Assess

This section describes the assessment process in terms of how and when to conduct an assessment. Figure 3.3 is a concept-of-operations diagram for the overall assessment process.

Figure 3.3. Concept of operations for assessment process.

Essential Student Information

A clear requirement for the tool is the ability to perform multidimensional analysis. Certain essential student information must be collected at course registration, or when the student completes an assessment form, to enable this level of analysis with the L32C assessment tool. This includes information about the student's agency and the nature of the student's affiliation with it, his/her responder discipline, and so on. The type of data that is required is discussed in more detail in a subsequent section of this chapter, Database Design. Note that some data administration is required to enter survey results, test scores, and other input data into the assessment tool.
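As a concrete illustration of the essential student record, the sketch below shows one plausible shape for this information. The field names are hypothetical; the authoritative list of fields is given in the Database Design section.

    from dataclasses import dataclass

    @dataclass
    class StudentRecord:
        """Essential student information captured at registration (illustrative).

        These dimensions (agency, affiliation, discipline, location) are what
        allow assessment results to be sliced by state, discipline, course, etc.
        """
        student_id: str
        agency: str             # e.g., "Example County Fire Department"
        affiliation: str        # e.g., "full-time employee", "volunteer"
        discipline: str         # e.g., "law enforcement", "fire and rescue"
        state: str              # used for state/region roll-ups
        supervisor_email: str   # needed for Level 3/4 follow-up surveys

    record = StudentRecord(
        student_id="S-0001",
        agency="Example County Fire Department",
        affiliation="full-time employee",
        discipline="fire and rescue",
        state="VA",
        supervisor_email="supervisor@example.org",
    )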

The research team believes there is a two-fold aspect to TIM training assessment based on the Kirkpatrick evaluation model. In one aspect, at each of the four levels, data must be collected from the intended sources and then analyzed using the assessment tool to generate the output. The other aspect involves cross-level examination to determine which positive outcomes from each level carry forward to the next level. For example, of the many things that students learned in the classroom, what was retained and turned into behavioral changes on the job, which, in turn, translated to TIM safety and efficiency improvements on a regional or national level? This type of assessment can provide additional feedback on training materials as well as on the training environment and methodology.

The output of the data analysis is presented in the form of statistical analysis results and, when appropriate, their graphical representations. The output can be stored in an electronic format such as an Excel spreadsheet, an XML file, or a PDF file, which can be downloaded or e-mailed to interested parties. Additionally, the assessment tool will allow registered users to review input data and perform their own analyses.

To achieve the desired evaluation goal at each of the four evaluation levels, it is imperative not only that the proper questions are asked, but also that the evaluations are done at the right time, using the most appropriate methods, and followed by relevant data analysis. Table 3.1 provides a summary of critical success factors and input data sources at each evaluation level. The balance of this section provides further discussion on when and how to assess the TIM training programs based on Kirkpatrick's four-level evaluation model.

Table 3.1. Critical Success Factors and Data Sources for Evaluations

1. Reaction
   Critical success factors: Evaluation must be done immediately after training ends.
   Data source: Trainees.
2. Learning
   Critical success factors: Evaluation must be done before and after training.
   Data source: Trainees.
3. Behavior
   Critical success factors: Evaluators must allow time for behavioral changes to be observed. Trainees must be allowed the right work environment to implement behavioral changes. Peers and/or immediate supervisors must be able to observe the behavioral changes.
   Data sources: Supervisors, peers, and trainees.
4. Results
   Critical success factors: More time will likely be needed to obtain organizational results. Management support is a must. Pretraining and posttraining results are needed for comparison. Evaluation must be able to determine which improvements are due to training efforts as opposed to other organizational initiatives.
   Data sources: Management and TIM performance measures.

Level 1 Assessment: Reaction

When to Assess
To get an accurate gauge of the students' reactions to the training, this evaluation is best carried out immediately following the completion of the TIM training, preferably before students leave the classroom.

Inputs
Table 3.2 contains the origin and format of the expected Level 1 input data.

Table 3.2. Level 1 Input Data Origin and Format

• National TIM responder train-the-trainer course, participant feedback: hardcopies
• L32B e-learning TIM training course, participant feedback: electronic
• L12-based TIM training programs, participant feedback: hardcopies
• NHI TIM training courses, Level 1 evaluation form: electronic

Analysis
The research team expects that data analysis at this level will evaluate the following:
• Student enthusiasm by state or region;
• Perceived relevance by students based on course modules; and
• Perceived effectiveness of in-person training versus online training.
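A minimal sketch of this kind of roll-up and electronic output follows, assuming Level 1 responses have already been entered into a pandas DataFrame. The column names and the 1-to-5 satisfaction scale are assumptions, and writing .xlsx files requires the openpyxl package.

    import pandas as pd

    # Illustrative Level 1 (Reaction) responses; one row per student.
    responses = pd.DataFrame({
        "state":        ["VA", "VA", "MD", "MD", "TX"],
        "delivery":     ["classroom", "online", "classroom", "online", "classroom"],
        "satisfaction": [4.5, 4.0, 3.5, 4.2, 4.8],   # 1-5 scale (assumed)
    })

    # Student enthusiasm by state, and in-person versus online effectiveness.
    by_state = responses.groupby("state")["satisfaction"].mean()
    by_delivery = responses.groupby("delivery")["satisfaction"].mean()

    # Store the output in an electronic format that can be downloaded or e-mailed.
    with pd.ExcelWriter("level1_summary.xlsx") as writer:
        by_state.to_excel(writer, sheet_name="By State")
        by_delivery.to_excel(writer, sheet_name="By Delivery")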

Level 2 Assessment: Learning

When to Assess
To obtain an accurate measure of the knowledge and skills learned, this evaluation is ideally performed both before and after the training. Students can fill out the pretraining survey any time after registering for the TIM training and before the training class commences. The same survey can be completed again immediately or shortly following the completion of the TIM training.

Inputs
Table 3.3 contains the origin and format of the expected Level 2 input data.

Table 3.3. Level 2 Input Data Origin and Format

• National TIM responder train-the-trainer course, student exam scores: hardcopies
• L32B e-learning TIM training course, student exam scores: electronic
• L12-based TIM training programs, student exam scores: hardcopies
• NHI TIM training courses, Level 2 test scores: electronic

Analysis
The research team expects that data analysis at this level will evaluate the TIM training in terms of the following:
• The effectiveness of each trainer;
• The effectiveness of each module taught; and
• Whether students consistently miss certain questions.
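The per-question part of this analysis (whether students consistently miss certain questions) is essentially an item-difficulty computation over pretest and posttest results. The following is a sketch under the assumption that graded responses are available as per-question booleans; the variable names are illustrative.

    from statistics import mean

    # Illustrative graded results: for each student, question_id -> correct?
    pre  = [{"Q1": False, "Q2": True, "Q3": False},
            {"Q1": False, "Q2": True, "Q3": True}]
    post = [{"Q1": True,  "Q2": True, "Q3": False},
            {"Q1": True,  "Q2": True, "Q3": False}]

    def difficulty(results, qid):
        """Proportion of students answering question qid correctly."""
        return mean(1.0 if r[qid] else 0.0 for r in results)

    for qid in ("Q1", "Q2", "Q3"):
        print(qid, "pre:", difficulty(pre, qid), "post:", difficulty(post, qid))

    # A question whose posttraining proportion stays low (Q3 here) is being
    # consistently missed, pointing at a gap in instruction or materials, or
    # at a deficiency in how the test item is constructed.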

Level 3 Assessment: Behavior

When to Assess
Changes in behavior take time. Therefore, observation and evaluation of behavior over time are required to assess change, relevance of change, and sustainability of change brought about by the TIM training. The research team suggests that Level 3 behavior evaluation be performed at least 1 to 2 months after the completion of a TIM training course. The research team also believes that repeating this evaluation over a longer period of time is beneficial, if feasible. Repeated evaluation gives insight into the sustainability of the behavioral changes, and additional changes that take longer to implement may be discovered.

Inputs
The research team developed an initial set of Level 3 survey questions. The survey is hosted on the L32C assessment platform; the target survey responders are the trainees' supervisors and peers, or the trainees themselves. Ideally, input data will be captured electronically, but paper-based surveys can be input by manual data entry.

Analysis
The research team expects that data analysis at this level will provide information such as the following:
• What areas of learning tend to be retained over time?
• What areas of learning lead to the most positive behavioral changes on the job?

Level 4 Assessment: Results

When to Assess
Measurement of organizational results may take many months. The time needed to measure the results of the TIM training may be even longer given the multidisciplinary and multiagency nature of the training programs. The research team suggests that Level 4 evaluation be performed at least 3 months after the completion of a TIM training course. Repeating this evaluation over a longer period of time will also be beneficial. The repeated evaluation may uncover not only organizational results that take longer to realize but also improvements that result from more cross-disciplinary participation on a regional and/or national level.

Inputs
The research team developed an initial set of Level 4 survey questions. The survey is hosted on the L32C assessment platform; the target survey responders are agencies' senior management. Ideally, input data will be captured electronically, but paper-based surveys can be input by manual data entry.

Analysis
The research team expects that data analysis at this level will give insight into the following:
• What behavioral changes translate into TIM performance improvement?
• What measurable improvements have been achieved in terms of TIM safety and efficiency as a result of the TIM training?
• What TIM resources are missing that may hinder safety or performance?

Literature Review

The research team began this review by examining pertinent sources from the transportation realm that were available in the public domain. Because such sources usually provide limited visibility into process, governance, technology, and best practices, the team eventually pursued additional avenues to gain insight into these critical matters.

Through the auspices of the TETG, the research team gained access to contacts involved in training and knowledge transfer activities at other agencies and organizations. The team also interviewed professionals from various parts of the responder community with supervisory and management experience; these individuals provided useful input into training assessment. Finally, because the Kirkpatrick Model is used across many domains, the team sought best-practice lessons and recommendations that could apply to any significant training initiative. Table 3.4 summarizes the sources the team examined in the literature review and the lessons they offer for Project L32C, if any.

Synthesis of Best Practices

After casting a fairly wide net in search of insights into how organizations assess the effectiveness of their training programs, the research team did not find anything close to a perfect model that could be emulated when developing the TIM assessment processes and tool for this project. In fact, when talking with others about SHRP 2's aspirations for Project L32C, the team was frequently asked about the project's progress, because it was exactly what was needed. The research team did, however, find many discrete current-practice examples that can be mapped against the requirements outlined in the RFP, which provides a framework for synthesizing the best practices that the team believes are applicable to the project. These are summarized in Table 3.5.

Recommended Business Model

The research team concluded that no single training assessment program provides a complete pattern for the long-term implementation of the TIM assessment program, in terms of either process or business model. The team's approach to developing a recommended business model was to map a synthesis of best practices to the known set of requirements for Project L32C.

The research team believes that the TIM training tool cannot be thought of as a single, stand-alone software program. Consistent with the objectives for the tool outlined in the RFP (e.g., multifaceted, sustainable, scalable to a variety of applications, providing a framework for coordination at the local program level, suitable for integration into FHWA and other national program efforts), the tool needs to be viewed as a system whose success will be highly dependent on a sustainable business model. While a government-funded national training initiative differs from a private-sector product or service venture, many of the fundamental building blocks of a traditional business plan apply. Figure 3.4 shows how these fit together conceptually.

Table 3.4. Sources of SHRP 2 Project L32C Literature Review and Lessons Learned

AASHTO
The American Association of State Highway and Transportation Officials (AASHTO) does not directly offer any training, but related entities provide information about or access to training resources.

NTIMC
The National Traffic Incident Management Coalition (NTIMC) has a presence on the AASHTO website, which provides a link to training resources at http://ntimc.transportation.org/Pages/TRAININGRESOURCES.aspx. This is a compendium of links to materials that can be ordered or, in some cases, downloaded. It also references training courses at other sites, which the research team investigated as part of the literature search and covers when discussing those sources.

Emergency Transportation Operations, FHWA
This arm of FHWA provides a link to training at http://www.ops.fhwa.dot.gov/eto_tim_pse/training/index.htm. At the time of the literature review, the only available reference was to an August 2012 webinar related to SHRP 2 national TIM responder training.

National Highway Institute
This FHWA entity offers an array of classroom and web-based courses. NHI's online training platform is expected to be used to deliver the training modules being developed in Project L32B. Basic contact information is required to register and subsequently purchase access to NHI's online courses (some are free); similar information is captured on paper-based registration forms for classroom training. Once a course has been purchased, it is accessible by logging into a personalized My Training page. A typical Level 1 assessment captures a student's reaction to training, and learning assessment (Level 2) is based on the particular course's subject. NHI's online platform has the ability to capture high-level test results, but it is not clear that any analysis is performed. The research team did not identify any Level 3 or 4 assessment program.

National Volunteer Fire Council
This organization offers a range of training focused on health and safety subjects at http://www.nvfc.org/training/education/health-and-safety-training. Most of the webinars offered are simply YouTube videos, which means that no registration is required, anyone can watch them, and no assessment seems feasible. A few of the webinars are actually online training modules hosted by the insurer McNeil & Company at http://www.mcneilandcompany.com/risk-management/e-learning/. The online training is free, but an individual learner cannot enroll in a course until his/her training officer registers the organization in the program: a specific access code is required, and the student's organizational affiliation is chosen from a dropdown list. Since a continuing education (CE) identification number (ID) is required, it can be inferred that some Level 2 assessment is done to ensure course completion. The team's research suggests that no systematic Level 3 or 4 follow-up assessments are done.

I-95 Corridor Coalition
This organization provides a number of online courses accessible via http://www.i95coalition.org/i95/Training/tabid/87/Default.aspx. Some of the courses, including ones related to incident management, are self-hosted. Registration is required but only involves providing an e-mail address, which is validated by clicking through on an e-mailed link. Courses of this type consist of a sequence of video modules, each followed by a quiz; a multiple-choice exam concludes the course. This approach provides the basis for a basic Level 2 (Learning) assessment, albeit a limited one, since the learner is given multiple tries to get the correct answer, and feedback given following an incorrect answer steers the student to the correct one. Other courses are actually hosted by CITE (discussed separately), and those that are fee-based require registration with that organization.

ERSI
The Emergency Responder Safety Institute (ERSI) has a Learning Network accessible at https://learning.respondersafety.com. It currently offers five online training modules relevant to TIM. The content is available only to registered users, but anyone can register; only basic contact information is required. An organization name is required, but it can be entered arbitrarily; organization type is selected from a dropdown list. A unique e-mail address per registrant is required so that a unique training record can be maintained. Adobe Flash is a technical prerequisite for taking a course, and the content is video-heavy. Modules must be completed end to end, and built-in Knowledge Check and Skills Challenge steps provide a basic Level 2 (Learning) assessment. For example, the Intro to Fire Service Traffic Control Professional course skills challenge consists of 12 multiple-choice questions; correctly answering 75% or more of the questions generates a course completion certificate in PDF format that automatically downloads. The team's research suggests that no systematic Level 3 or 4 follow-up assessments are done.

International Association of Chiefs of Police (IACP)
This association's Center for Police Leadership and Training (CPLT) provides information and registration services for training offerings at http://www.theiacp.org/LeadershipandTraining/tabid/68/Default.aspx. A close inspection of the offerings shows that the courses are mostly leadership-oriented and classroom-based. Online courses are mostly links to downloadable, previously recorded webinars. The IACP does not seem to offer anything instructive for Project L32C.

Towing and Recovery Association of America (TRAA)
This organization operates the National Driver Certification Program (NDCP), which involves a paper-based application process, self-study (only the first-level study guide is downloadable), and either pencil-and-paper-based testing at a local community college or computer-based testing in certain states. As currently constituted, this program does not seem to be instructive for Project L32C. TRAA also offers other training materials on the Products page of its website. The TIM Training Program for Entry Level Towers is a new, featured product that consists of a CD and one paper copy for $20.

NTED
The Federal Emergency Management Agency (FEMA) National Training and Education Division (NTED) offers over 200 courses to first responders. The majority of the courses are delivered on or near the requesting agency's site or at a training partner site (e.g., a university). Some courses are offered online. Registration requires chain-of-command approval and facilitation by designated training points of contact. Because of the restricted nature of access to this entity's offerings, the research team was only able to gain insight into the assessment process through a TETG-facilitated introduction to an agency employee. In summary, the research team learned that
• There is a centralized registration function for all courses.
• A standard form is used for Level 1 (Reaction) assessment.
• Any immediate Level 2 (Learning) assessment is done on a course-by-course basis, and the team's contact was not certain that it was done in a manner that facilitated meaningful analysis.
• Kirkpatrick-style Level 3 and 4 assessments (Behavior and Results) are at the discussion stage inside the agency, but no concrete plan is in place to implement them.
• The current long-term assessment strategy involves mailing a cover letter and standard survey posing three broad, open-ended questions to students 6 months after they have attended a course.
• Survey responses are manually entered into an Access database and analyzed qualitatively.

CITE
The Consortium for ITS Training and Education (CITE) offers a range of certificate programs, blended courses, and online training. The latter is available at http://www.i95coalition.org/i95/Training/tabid/87/Default.aspx. Registration is required for all courses, and all but two involve fees of $50 to $200. The free courses require between 2 and 8 hours to complete, so the research team did not examine them. The registration process collects basic contact information; organization type and the registrant's role in the organization are selected via checkboxes.

National Fire Academy (NFA)
This arm of the U.S. Fire Administration, which is part of FEMA, provides a wide range of courses in on-campus and off-campus classroom settings; it also offers a subset of courses online via NFA Online at http://www.usfa.fema.gov/nfa/nfaonline/browse/index.shtm. Offerings include the National Incident Management System (NIMS) Incident Command System (ICS)-series courses familiar to many responders, especially those in supervisory positions or higher. An introduction facilitated by the TETG was essential to get a broad overview of process from an agency employee. However, the insights gained from one person—while useful—cannot be considered comprehensive given the vast scope of the organization and its mission. Nonetheless, the research team believes the NFA model is the one most relevant to L32C. In summary, the research team learned that
• There is a centralized admissions function for the NFA and also an organization responsible for long-term evaluation.
• Level 1 and 2 assessments (Reaction, Learning) are done at course completion. Pretesting (a potentially useful strategy for deeper Level 2 assessment) is done only in a Hazmat-oriented chemistry course, so that the instructor can determine students' baseline level.
• In an online setting, mandating completion of a survey and exam can be easily done, thus accomplishing Level 1 and 2 assessments. When asked how NFA gets students to comply in classroom settings, the team's source said an unspoken rule stipulates that students have to complete an evaluation to receive a certificate.
• In classroom settings, the Level 1 assessment input has moved from paper forms that were optically scanned to an online system. Students are handed a business-card-sized form with a unique ID, which is entered when the student accesses the evaluation system.
• Similar to NTED, 6 months after a student has completed a course, the student's supervisor is e-mailed a survey, which is designed to address Level 3 and 4 (Behavior, Results) assessments. The admissions/registration process requires chain-of-command approval or sponsorship, and it always captures the student's then-current supervisor information.
• NFA publishes an annual evaluation report. The most recent one available is for 2009 (National Fire Academy 2009).

Law enforcement training professional
The team's discussion with this senior professional focused mainly on issues related to participation, governance, and sustainability:
• He stressed the critical importance of the supervisor in obtaining feedback on training effectiveness with respect to long-term changes in individual behavior and organizational change (i.e., Kirkpatrick Levels 3 and 4).
• He emphasized the need for programmatic incentives or mandates (he used the term carrot and stick) to drive participation at both the individual learner and supervisor/management levels. In eras of tight budgets, allotting officer and management time to training and assessment is difficult.
• He felt that online delivery of training was essential to the long-term success of the program, for all the expected reasons. He also made the point that online training should be available "where cops go all the time" and cited NLEARN (discussed next) as an example of an online venue that is very familiar in law enforcement circles.

NLEARN
The National Law Enforcement Academy Resource Network (NLEARN) is part of the International Association of Directors of Law Enforcement Standards and Training (IADLEST). This source was suggested by the law enforcement training professional the research team interviewed as part of the literature search effort. NLEARN offers over 200 online courses. Access is restricted to official law enforcement use only, and content is not for public release, so the team was ultimately unable to gain any insights from this source.

Fire service professional
This senior professional is experienced with managing and training fire and rescue personnel. During his discussion with the team,
• He stressed the vital importance of supervisor and management participation in any Level 3 or 4 assessment process and of ensuring that individuals responding to any survey that measures strategic outcomes (i.e., Level 4, Results) have the authority to do so.
• He focused on the interdisciplinary nature of the TIM training curriculum and suggested that assessment of behavioral changes (Level 3) attempt to measure what students learned about other disciplines and responder roles, as well as any changes they made in how they communicate with other disciplines.
• In terms of Level 4 (Results) assessment, he felt that incident clearance time was an important measure.

Kirkpatrick community
Donald Kirkpatrick and family have established an online community that provides members with access to training evaluation resources that build on the conceptual model he created. The team engaged this community to identify best practices that might be applicable to Project L32C. The information the research team garnered is general in nature and not specific to the transportation sector; it is cited—where applicable—in the Synthesis of Best Practices section of this chapter.

Table 3.5. Synthesis of Best Practices

General
• The assessment program should be driven by the desired end result (strategic goals) and how the training will specifically affect it.

Level 1 Assessment—Reaction
• The assessment should be done immediately after the training ends.
• Level 1 typically should use the least resources.
• The scope should be limited to identifying opportunities to improve the program, instruction, support, and administration.

Level 2 Assessment—Learning
• This assessment can precede Level 1. Learning can be measured in steps throughout the learning event.
• The results of quizzes and exams can be used to provide insight into gaps in instruction and student materials, or deficiencies in test-item construction.

Level 3 Assessment—Behavior
• Most resources and effort should be devoted to Level 3.
• This is arguably the most important assessment, as there is little point in a good reaction and an increase in knowledge if nothing changes once the learner is back on the job.
• Input from both graduates and their supervisors is necessary to evaluate the effectiveness of the program.
• Involving line management in this level of assessment is critically important, since observation over time is required to assess change, relevance of change, and its sustainability.
• The trainee's opinion is relevant but tends to be more subjective and less reliable.

Level 4 Assessment—Results
• Effective Level 4 assessment requires senior management participation, as they are most attuned to their agency's key performance indicators.
• As with Level 3, results must be measured over time.

Organizational/Institutional
• There must be a long-term commitment to collecting necessary data and conducting systematic assessment over time.
• Incentives (in the form of either mandates or value in exchange) can be created to drive program participation over time.

Process/Technology
• A centralized, consistent registration function should be used for all levels of training and types of delivery methods.
• The recipient agency should be involved in the enrollment process, even though individual trainees may self-register.
• Organizational affiliation, role, supervisor, and so on should be captured for subsequent follow-up assessment, data aggregation, and analysis.

The four key elements of this business model are the following:
1. Value proposition. This is the most important element. Like any other product or service, the long-term success of the TIM assessment tool depends on its ability to help customers achieve demonstrable results. Value to customers will be delivered in multiple forms, including classroom, online, and hybrid courses; train-the-trainer events and webinars; plus automated follow-up evaluations and self-assessment capabilities.
   a) Customers are the responder agencies nationwide that will train their personnel using the new TIM curriculum. These personnel will come from law enforcement, fire and rescue, departments of transportation (DOTs), towing and recovery, hazmat, and other disciplines.

   b) Partners may be needed to engage the wide range of responder disciplines that the TIM training curriculum is designed to reach, particularly in terms of promotion and shaping customers' perceptions of value. Examples of potential partners include AASHTO, CITE, ERSI, various FEMA divisions, and IADLEST.
   c) Performance measurement will be based on measures of strategic significance to FHWA, participating agencies, and other partners. Examples of these performance measures are roadway clearance time, incident clearance time, and secondary crashes. Example performance measurement categories with which the TIM assessment business model can be aligned include
      • FHWA's Focus States Initiative: TIM Performance Measures, and
      • The National Unified Goal for Traffic Incident Management from the National Traffic Incident Management Coalition.
2. Governance. Postimplementation success of Project L32C will also depend on sustained funding, leadership, policies, and decision making.
3. Process and activities. A number of key operational processes and activities must be planned and managed over the TIM assessment tool's life cycle.
   a) Management. The research team expects that management activities will involve how and when the four levels of assessment will be planned, initiated, executed, tracked, and analyzed. These activities include
      • Program management—administering, executing, and enhancing the overall assessment program; and
      • Analysis and reporting—assessing various dimensions of performance from a programwide perspective.
   b) Operations. These processes will cover day-to-day activities in support of operating, administering, and maintaining the assessment tool. These include
      • Systems management—including administration, backup/recovery, monitoring, security, and troubleshooting of the computing, network, and storage environment that supports the assessment tool;
      • Application maintenance—implementing bug fixes and functional enhancements to the tool; and
      • Help desk—providing a contact point for end-user assistance and troubleshooting.
   c) Marketing. These activities will promote initial agency uptake and ongoing use of the assessment tool.
4. Resources. Human resources, facilities, and technology will be required to operate and sustain the assessment tool.
   a) People. Although it is too early to forecast exact headcounts, operation and management of the overall assessment program will require human resources to cover all of the management, operations, and marketing activities described above.
   b) Facilities. Facilities requirements and costs should be commensurate with final staffing numbers. No unusual facility requirements are foreseen.
   c) Technology. There will be technology-related costs for operating the TIM assessment program. The research team advocates the use of cloud computing services, open source software, and/or commercial off-the-shelf software to minimize hardware and software acquisition costs and maintenance fees.

Figure 3.4. Conceptual business model.

Requirements Analysis

Users of the Assessment and Reporting Tool

Based on the concept of operations, the research team identified the following groups of users who will access the assessment and reporting tool.
• Participating agency points of contact (POCs). These are agency-designated staff responsible for the overall administration and day-to-day management of the TIM training program. For example, the POCs ensure that trainees from the agency are properly registered and that their pertinent information is updated properly and promptly in the assessment and reporting tool.
• Trainers. In addition to teaching the TIM training courses, the trainers will likely create or assist the application administrators in creating pre- and posttraining tests and surveys.
• Trainees. These are students of the TIM training program. They will use the assessment tool to take the pretraining tests, posttraining Level 1 reaction surveys, and posttraining Level 2 learning tests.
• Agency managers. Agency managers and/or their designated staff will use the assessment and reporting tool to take Level 3 behavior surveys and Level 4 results surveys. They will also be interested in performing various data analyses and generating performance reports.
• Application administrators. Application administrators are responsible for the proper setup of all constituents of the assessment and reporting tool, as well as for the creation and maintenance of tests and surveys. Additionally, they will create, maintain, or assist in creating and maintaining assessment reports.

Use Cases

Based on the concept of operations and the assessment and reporting tool's target user groups, the research team developed a set of primary use cases. These use cases define the high-level business requirements and were used as a framework for the subsequent development of the system functional requirements. Table 3.6 summarizes these use cases.

These use cases can be mapped to the three functional areas of the tool introduced previously:
• Survey management—used to execute a particular kind of assessment, from a pretraining assessment survey to a Level 4 (Results) survey long after a training event;
• Constituent relationship management (CRM)—used to manage relationships and communications with all key constituents (e.g., individual students, agency training officers and/or management, and trainers); and
• Analysis and reporting—used to enable program staff as well as participating agencies to analyze and report on training participation, needs, effectiveness, and so on.
Figure 3.5 shows the use case framework in terms of these three blocks of functionality. The balance of this section describes the main success scenario and alternate scenarios, as applicable, for each use case (Tables 3.7–3.16).

Table 3.6. High-Level Use Cases

UC01: Administrator captures/modifies a participating agency
UC02: Administrator captures/modifies a contact
UC03: Student takes pretraining assessment test
UC04: Student takes posttraining Reaction survey (Level 1)
UC05: Student takes posttraining Learning test (Level 2)
UC06: Student's supervisor/agency POC submits Behavior survey (Level 3)
UC07: Student's agency management submits Results survey (Level 4)
UC08: TIM program staff/agency personnel perform data analysis
UC09: Administrator authors/modifies surveys/tests
UC10: Administrator authors/modifies analysis reports

Functional Requirements

Driven by the use cases defined in the previous section, the functional requirements for the system are organized around the three functional areas and the interactions among them.

Architecture

The high-level operational concept and deployment topology of the TIM assessment tool are depicted in Figure 3.6.

Commercial Off-the-Shelf Products and Services

The limited scope, funding, and time frame of this research project dictated maximum use of off-the-shelf technology and little to no custom software development. This drove the selection of the following products and services that were used in the system.

Figure 3.5. Use case framework.

Table 3.7. Administrator Captures/Modifies a Participating Agency

1. Administrator logs into the system.
2. Administrator adds a new participating agency. (Alternate: if the agency already exists, the administrator can modify agency information or delete the agency.)
3. Administrator specifies a POC for the newly added agency.

Table 3.8. Administrator Captures/Modifies a Contact

1. Administrator logs into the system.
2. Administrator adds a new contact and assigns the contact to a participating agency. (Alternate: if the contact already exists, the administrator can modify contact information or delete the contact.)

Table 3.9. Student Takes Pretraining Assessment Test

Step 1. Student logs into the system. (Alternate: if this is a new student, the system will prompt him/her to register and select his/her associated agency.)
Step 2. System presents the pretraining assessment test to the student. (Alternate: if the test results are supplied by external sources, the system will import and store the test results.)
Step 3. Student completes the test. (Alternate: student may save an incomplete test and finish later.)
Step 4. Student submits test results. (Alternate: if the test is incomplete, the system will prompt the student to complete the test before submitting.)
Step 5. System saves test results.

Table 3.10. Student Takes Posttraining Reaction Survey (Level 1)

Step 1. Student logs into the system immediately following the training class.
Step 2. System presents the Level 1 Reaction survey to the student. (Alternate: if the survey results are supplied by external sources, the system will import and store the survey results.)
Step 3. Student completes the survey.
Step 4. Student submits survey results. (Alternate: if the survey is incomplete, the system will prompt the student to complete the survey before submitting.)
Step 5. System saves survey results.

Table 3.11. Student Takes Posttraining Learning Test (Level 2)

Step 1. When the student submits the Level 1 Reaction survey, the system presents the Level 2 Learning survey/test to the student. (Alternate: the student will need to log in first if not already logged in.)
Step 2. Student completes the survey. (Alternate: if the test results are supplied by external sources, the system will import and store the test results.)
Step 3. Student submits test results. (Alternate: if the test is incomplete, the system will prompt the student to complete the test before submitting.)
Step 4. System saves test results.

Table 3.12. Student's Supervisor/Agency POC Submits Behavior Survey (Level 3)

Step 1. A configurable period of time after the class, the system sends a reminder to the student's supervisor or the agency's designated personnel about completing the Level 3 Behavior survey.
Step 2. User logs into the system.
Step 3. User takes the survey.
Step 4. User completes the survey. (Alternate: user saves an incomplete survey; if the survey is not completed after a configurable period of time, the system will send a reminder to the user to complete the survey.)
Step 5. User submits survey results. (Alternate: if the survey is incomplete, the system will prompt the user to complete the survey before submitting.)
Step 6. System saves survey results.

Table 3.13. Agency Management Submits Results Survey (Level 4)

Step 1. A configurable period of time after the class, the system sends a reminder to the agency's manager about completing the Level 4 Results survey.
Step 2. User logs into the system.
Step 3. User takes the survey.
Step 4. User completes the survey. (Alternate: user saves an incomplete survey; if the survey is not completed after a configurable period of time, the system will send a reminder to the user to complete the survey.)
Step 5. User submits survey results. (Alternate: if the survey is incomplete, the system will prompt the user to complete the survey before submitting.)
Step 6. System saves survey results.

Table 3.14. TIM Program Staff/Agency Personnel Perform Data Analysis

Step 1. User logs into the system.
Step 2. User selects a report.
Step 3. User specifies report parameters. (Alternate: there are no parameters for the report.)
Step 4. System performs data analysis.
Step 5. System generates the report.
Step 6. User saves the report.

Table 3.15. Administrator Authors/Modifies Surveys/Tests

Step 1. Administrator logs into the system.
Step 2. Administrator authors a new survey or test. (Alternate: if the survey/test already exists, the administrator can modify or delete it.)
Step 3. Administrator saves the new or modified survey/test. (Alternate: administrator discards the changes.)

Table 3.16. Administrator Authors/Modifies Analysis Reports

Step 1. Administrator logs into the system.
Step 2. Administrator authors a new data analysis report. (Alternate: if the report already exists, the administrator can modify or delete it.)
Step 3. Administrator saves the new or modified report. (Alternate: administrator discards the changes.)

Table 3.17. CRM Requirements

CRM-1: The system shall allow administrators to create different types of constituents, such as participating agencies, trainers, and trainees.
CRM-2: The system shall maintain pertinent information about each constituent.
CRM-3: The system shall allow administrators to modify or delete existing constituents.
CRM-4: The system shall allow authorized users to create calendar events for constituents.
CRM-5: The system shall manage follow-up tasks such as sending notifications or reminders to constituents to take appropriate surveys.
CRM-6: The system shall allow administrators to import constituent-related information such as newly registered trainees and the training classes for which they registered.

Table 3.18. Survey Management Requirements

SVY-1: The system shall allow an authorized user to take a survey.
SVY-2: The system shall allow a user to save an incomplete survey and finish it at a later time.
SVY-3: The system shall allow a user to submit a completed survey.
SVY-4: The system shall allow an administrator to create new surveys.
SVY-5: The system shall allow an administrator to modify or delete existing surveys.
SVY-6: The system shall be accessible to users with Internet connectivity and current versions of Internet browsers.
SVY-7: The system shall allow an authorized user to import survey results from external sources.

Table 3.19. Analysis and Reporting Requirements

AR-1: The system shall allow an authorized user to create new data analysis reports.
AR-2: The system shall allow an authorized user to modify or delete existing analysis reports.
AR-3: The system shall allow reports to have zero or more input parameters.
AR-4: The system shall provide data aggregation capability.
AR-5: The system shall provide data filtering capability.
AR-6: The system shall allow reports to be saved in PDF and/or other appropriate formats.

Table 3.20. Integration Requirements

INT-1: The Survey Management component shall provide survey access information (such as a URL) to the CRM component.
INT-2: The CRM component shall provide survey access information to the recipient when sending out a notification or reminder for the recipient to take the survey.
INT-3: The CRM component shall provide certain pertinent information, such as agency name and department, to the Survey Management component.
INT-4: The Survey Management component shall provide survey results to the Analysis and Reporting component.
INT-5: The CRM component shall provide certain pertinent information, such as agency name, department, and state, to the Analysis and Reporting component.
INT-6: The Survey Management component shall provide survey results to the Analysis and Reporting component.

Figure 3.6. System architecture.

CRM

Salesforce.com (www.salesforce.com) is a market-leading customer/constituent relationship management (CRM) system that is delivered as software-as-a-service (SaaS). In a SaaS model, an organization subscribes to a particular service level and pays a monthly or annual usage fee, and the vendor hosts and manages the entire application environment in the cloud. No hardware or software needs to be purchased, managed, or maintained, and the application is accessed through a web browser. Salesforce.com was selected for this project for numerous reasons, including its

• Market leadership status;
• Completeness of functionality in key areas related to managing information about organizations/agencies, individual contacts, and scheduling and tracking follow-up activities and communications;
• Ease of customization and integration with other systems; and
• Adoption by various federal agencies, including the U.S. DOT.

Salesforce.com features that were used in the TIM assessment tool include the following:

• Contact management—adding and updating information about people and the entities with which they are associated;
• Activity tracking—creating follow-up tasks, calls, e-mails, and other relevant events associated with people or entities;
• Workflows—triggering time- and event-based activities related to people or entities;
• Reports—with tabular lists of people or entities by state, responder discipline [emergency medical services (EMS), law enforcement, etc.], and other attributes; and
• Data loader—importing data from external sources and exporting data for more complex reporting and analysis.

Survey Management

FluidSurveys (www.fluidsurveys.com) is a widely used survey management system that is also delivered in SaaS form. FluidSurveys was selected because it is

• A complete survey authoring and collection system;
• Cost-effective and easy to use; and
• Simple to integrate with Salesforce.com.

FluidSurveys features that were used in the TIM assessment tool include

• A drag-and-drop editor for authoring and modifying surveys;
• Multiple question types (only a few of the 35+ types were used);
• A styling tool to create a theme and brand surveys;
• Online survey results collection; and
• Surveys exportable to printable format.

Analysis and Reporting

The research team selected Microsoft Access for the initial implementation of the TIM analysis and reporting functionality. The rationale for using this product is that

• It is a commonly installed component of the Microsoft Office suite.
• Users with analysis and reporting responsibilities are usually familiar with it.
• Access databases can be readily up-scaled to SQL Server.

The Access database serves as a repository containing data from both the CRM and survey applications. These data are used to generate reports and answer questions such as "What percentage of trainees from each state found the training helpful to their job performance?" and "What percentage of agencies saw TIM performance improvement after participating in TIM training?"

Data and Integration Models

Information about TIM stakeholders is managed in Salesforce.com. A simplified view of the core Salesforce.com data model applicable to this project is shown in Figure 3.7. An account object represents an agency or organization, which can have many associated contacts. In turn, each contact can have many associated tasks and events. This is the fundamental, hierarchical data structure that is managed by this portion of the TIM assessment tool.

Information about training effectiveness is collected using FluidSurveys. The FluidSurveys data model is very simple. Each survey that is "published" is accessible at a unique URL and associated with a specific survey collector ID. As surveys are completed, the data are stored in the FluidSurveys cloud. At any point, an authorized user can export the collected data in a variety of formats. Each record is a self-contained representation of the survey, consisting of information collected by the system (date, time started, time completed, etc.), each question, and each response.

While both Salesforce.com and FluidSurveys provide built-in reporting capabilities, these are mainly oriented to generating lists and tabular reports of current information. Historical reporting and analysis over time require a separate database in which to accumulate information, which is why such a database is part of the system.

By design, the overall system architecture is loosely coupled to simplify development and preserve flexibility. While both Salesforce.com and FluidSurveys provide comprehensive application programming interfaces (APIs), a simpler integration model based on data export/import scripts supports all of the use cases envisioned within the scope of the L32C project. These points of integration are shown in Figure 3.6. Registration data from Project L32B/NHI are imported into Salesforce.com using the service's built-in data loader. Other sources of contact/agency data (e.g., rosters from previous classes and workshops, mailing lists) can be imported through the same mechanism. Data are exported to the reporting and analysis database using the FluidSurveys export tool for survey results and the Salesforce.com data loader for contact and agency information.

Figure 3.7. Simplified core CRM data model.
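To make the export/import step concrete, the following is a minimal sketch of loading a FluidSurveys export into the Access reporting database, assuming the export is a CSV with one row per completed survey (a few metadata columns followed by one column per question) and that the target table follows the Responses design described later in Table 3.33. All file paths, collector identifiers, and column names here are illustrative assumptions, not the project's actual artifacts; the connection uses the standard Microsoft Access ODBC driver via pyodbc.

```python
import csv
import pyodbc

# Illustrative paths and identifiers -- the real export layout depends on how
# the surveys and the Access tables are configured.
ACCESS_DB = r"C:\TIM\tim_reporting.accdb"
EXPORT_CSV = r"C:\TIM\level1_export.csv"
SURVEY_TYPE = "Level 1"
COLLECTOR_ID = "L1-2014-01"  # survey collector for this export batch
META_COLUMNS = {"response_id", "status", "created_at", "invite_email"}

conn = pyodbc.connect(
    r"Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=" + ACCESS_DB
)
cur = conn.cursor()

with open(EXPORT_CSV, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # Unpivot the wide export: one Responses record per question column,
        # matching the one-row-per-question design of the Responses table.
        question_id = 0
        for column, answer in row.items():
            if column in META_COLUMNS:
                continue
            question_id += 1
            cur.execute(
                "INSERT INTO Responses (SurveyType, SurveyCollector, ResponseID,"
                " QuestionID, Question, Answer, Status, CreatedAt)"
                " VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
                SURVEY_TYPE, COLLECTOR_ID, row["response_id"], question_id,
                column, answer, row["status"], row["created_at"],
            )

conn.commit()
conn.close()
```

The same pattern applies to the contact and agency extracts produced by the Salesforce.com data loader, with the rows landing in the Organizations and Contacts tables instead.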

Functional Specifications

The functional specifications for the TIM assessment tool fall into two broad categories: user-visible functions and the design of the reporting and analysis database. Each category is discussed in the following sections.

User-Visible Functions

Each major user-visible function of the TIM assessment tool is explained in terms of the use cases described previously. (See Tables 3.21–3.30 and Figures 3.8–3.24.)

Database Design

The TIM assessment tool database is designed for the purpose of data analysis and reporting. It assimilates data from both the CRM and the Survey Management software. The database contains the following tables:

1. Organizations;
2. Contacts;
3. Responses; and
4. Answer keys.

Figure 3.25 shows an entity relationship diagram of the tool's database.

Guiding Principles

The research team used the following guiding principles when designing the database:

a) The database shall not contain information that links a particular survey response or test result to a specific trainee.
b) The database shall be flexible and scalable to accommodate future survey changes or expansion.
c) The database shall allow for ease of report creation.

As a result of adhering to guiding principle (a), certain essential information required for analysis and reporting must be collected as a part of each Level 1 and Level 2 survey. The trainee will need to provide the following information:

• Agency;
• Discipline (DOT, EMS, law enforcement, etc.);
• Number of years in the position;
• Affiliation (paid professional or volunteer);
• Reason for training;
• Whether this is a retake;
• Whether this is an online course;
• Training session start date;
• Training session city; and
• Training session state.

Table 3.21. UC01: Administrator Captures/Modifies Participating Agency

Description: This function is used to add, change, or delete information about any agency, organization, company, or other entity that participates in the TIM training program. The function was implemented by customizing the Salesforce.com account-related form layouts and fields.
Actors: Constituent Data Administrator (CDA, which is a role or responsibility)
Preconditions: The CDA has to be logged into the CRM module and have permission to edit account objects.
Inputs: Data from an external source, such as a list of registrants for a training class
Events sequence: The CDA is able to create a new account by clicking the "New" button on the Accounts home screen, which takes him/her to the New Account screen. Account Name is the only required field and is the link that connects all contacts to an account. Note that Salesforce.com can model an organizational hierarchy through the optional Parent Account field; whether or not to implement this field is a business decision. Once the account-related information has been entered, the CDA can click the "Save" button, or "Cancel" to exit without saving. The CDA has numerous ways to look up an account to modify it, including a dropdown list of all accounts and numerous preconfigured views or reports—all accessible on the Accounts home page. A search box at the top of every screen in the CRM module can also be used to locate the desired account. Once the target account is located, double-clicking on the account name will open the Account Detail screen. The CDA can click the "Edit" button, modify any of the necessary fields, then click the "Save" button to apply the changes, or "Cancel" to exit without saving.
Postconditions: The new account (agency, organization, etc.) is created or modified.
Requirements map: CRM-1, CRM-2, CRM-3, CRM-4
Related user interface (UI): Figures 3.8–3.10

Figure 3.8. Accounts home screen.
Figure 3.9. New account screen.

Figure 3.10. Accounts detail screen.

Table 3.22. UC02: Administrator Captures/Modifies Contact

Description: This function is used to add, change, or delete information about any person who participates in the TIM training program. The function was implemented by customizing the Salesforce.com contact form layout and fields.
Actors: CDA (which is a role or responsibility)
Preconditions: The CDA has to be logged into the CRM module and have permission to edit contact objects.
Inputs: Data from an external source, such as a list of registrants for a training class
Events sequence: The CDA is able to create a new contact by clicking the "New" button on the Contacts home screen, which takes him/her to the New Contact screen. As described for UC01 (Table 3.21), Account Name forms the link between Accounts and Contacts and is thus a required field. A look-up function (accessed by clicking a magnifying glass symbol) can be used to find the account with which the contact is associated. Note that Salesforce.com can also model reporting relationships through the optional Reports To field; whether or not to use this field is a business decision. Once the contact-related information has been entered, the CDA can click the "Save" button, or "Cancel" to exit without saving. The CDA will have numerous ways to look up a contact to modify information about the individual, including a dropdown list of all contacts, a drilldown by account, and numerous preconfigured views or reports—all accessible on the Contacts home page. A search box at the top of every screen in the CRM module can also be used to locate the desired contact. Once the target contact is located, double-clicking on the contact name will open the Contact Detail screen. The CDA can click the "Edit" button, modify any of the necessary fields, then click the "Save" button to apply the changes, or "Cancel" to exit without saving. A contact's link to training events is captured using Salesforce.com's Campaign functionality. An individual contact's training history can be seen in the Campaign History of the Contact Detail screen. Training events are defined as campaigns, which can be seen in the Campaigns home and Campaign Detail screens. The Campaign Members section of the latter screen shows the linkage of multiple contacts to a particular training event.
Postconditions: The new contact is created or modified and linked to the correct account.
Requirements map: CRM-1, CRM-2, CRM-3, CRM-4
Related UI: Figures 3.11–3.15

Figure 3.11. Contacts home screen.

Figure 3.12. New contact screen.

Figure 3.13. Contacts detail screen.

Figure 3.14. Campaigns home screen.
Figure 3.15. Campaigns detail screen.

Table 3.23. UC03: Student Takes Pretraining Assessment Test

Description: This function allows the TIM training program to assess a student's knowledge before he/she participates in a training course or module. This function was implemented through FluidSurveys' data collection capabilities.
Actors: Student, plus a Survey Data Administrator (SDA, which is a role or responsibility) in the case of a paper-based assessment
Preconditions: The survey has been authored and published on the web and, optionally, exported to PDF format to make it print-ready—all as described in UC09 (Table 3.29). The student has been provided with a URL to take the assessment online or has been provided with a paper-based version of the assessment vehicle that will be entered into the online system by the SDA, who must be logged into the Survey Management module to do so.
Inputs: Online: not applicable. Paper-based: completed assessment form
Events sequence: Online: The student enters the URL provided into his/her preferred web browser and begins to enter his/her responses. Alternatively, if the URL has been e-mailed to the student, the e-mail client software may allow him/her to click through to the target URL. An on-screen progress bar shows the percentage completed. When all required questions have been answered, the student clicks "Submit" to save his/her responses. Paper-based: The SDA visits an administrative URL and follows a similar procedure to enter the student's written responses into the Survey Management module.
Outputs: not applicable
Postconditions: The student's responses are recorded and available for subsequent reporting and analysis.
Requirements map: SVY-1, SVY-2, SVY-3, SVY-6, SVY-7
Related UI: Variant of Figure 3.17

Table 3.24. UC04: Student Takes Posttraining Reaction Survey (Level 1)

Description: This function allows the TIM training program to assess a student's reaction to a training course. This function was implemented through FluidSurveys' data collection capabilities.
Actors: Student, plus an SDA (which is a role or responsibility) in the case of a paper-based assessment
Preconditions: The student has been provided with a URL to take the assessment online or has been provided with a paper-based version of the assessment vehicle that will be entered into the online system by the SDA.
Inputs: Online: not applicable. Paper-based: completed assessment form
Events sequence: Online: The student enters the URL provided into his/her preferred web browser and begins to enter his/her responses. Alternatively, if the URL has been e-mailed to the student, the e-mail client software may allow him/her to click through to the target URL. An on-screen progress bar shows the percentage completed. When all required questions have been answered, the student clicks "Submit" to save his/her responses. Paper-based: The SDA visits an administrative URL and follows a similar procedure to enter the student's written responses into the Survey Management module.
Outputs: not applicable
Postconditions: The student's responses are recorded and available for subsequent reporting and analysis.
Requirements map: SVY-1, SVY-2, SVY-3, SVY-6, SVY-7
Related UI: Figure 3.16

Figure 3.16. Level 1 reaction survey.

Table 3.25. UC05: Student Takes Posttraining Learning Test (Level 2)

Description: This function allows the TIM training program to assess how well a student has learned material presented in a training course or module. This function was implemented through FluidSurveys' data collection capabilities.
Actors: Student, plus an SDA (which is a role or responsibility) in the case of a paper-based assessment
Preconditions: The student has been provided with a URL to take the assessment online or has been provided with a paper-based version of the assessment vehicle that will be entered into the online system by the SDA.
Inputs: Online: not applicable. Paper-based: completed assessment form
Events sequence: Online: The student enters the URL provided into his/her preferred web browser and begins to enter his/her responses. Alternatively, if the URL has been e-mailed to the student, the e-mail client software may allow him/her to click through to the target URL. An on-screen progress bar shows the percentage completed. When all required questions have been answered, the student clicks "Submit" to save his/her responses. Paper-based: The SDA visits an administrative URL and follows a similar procedure to enter the student's written responses into the Survey Management module.
Outputs: not applicable
Postconditions: The student's responses are recorded and available for subsequent reporting and analysis.
Requirements map: SVY-1, SVY-2, SVY-3, SVY-6, SVY-7
Related UI: Figure 3.17

Figure 3.17. Level 2 learning survey.

Table 3.26. UC06: Student's Supervisor/Agency POC Submits Behavior Survey (Level 3)

Description: This function allows the TIM training program to assess long-term changes in student behavior. This function was implemented through FluidSurveys' data collection capabilities. The follow-up activity is scheduled by a Salesforce.com workflow. The request to complete the survey may be sent as an e-mail from Salesforce.com, in which case it will be recorded as an activity related to the contact. Such an e-mail will contain the URL at which the recipient can complete the survey. The request can also be sent by mail (a manual process) and, optionally, recorded as an activity related to the contact.
Actors: Designated point of contact (POC) at an organization whose personnel have participated in the TIM training program, plus an SDA (which is a role or responsibility) in the case of a paper-based assessment
Preconditions: The POC has been provided with a URL to take the assessment online or has been provided with a paper-based version of the assessment vehicle that will be entered into the online system by the SDA.
Inputs: Online: not applicable. Paper-based: completed survey form
Events sequence: Online: The POC enters the URL provided into his/her preferred web browser and begins to enter his/her responses. Alternatively, if the URL has been e-mailed to the POC, the e-mail client software may allow him/her to click through to the target URL. An on-screen progress bar shows the percentage completed. When all required questions have been answered, the POC clicks "Submit" to save his/her responses. Paper-based: The SDA visits an administrative URL and follows a similar procedure to enter the POC's written responses into the Survey Management module.
Postconditions: The POC's responses are recorded and available for subsequent reporting and analysis.
Requirements map: CRM-4, CRM-5, SVY-1, SVY-2, SVY-3, SVY-6, SVY-7
Related UI: Figure 3.18

Figure 3.18. Level 3 behavior impact survey.

Table 3.27. UC07: Student's Agency Management/POC Submits Results Survey (Level 4)

Description: This function allows the TIM training program to assess long-term changes in strategic outcomes after an organization's personnel have participated in a training event. This function was implemented through FluidSurveys' data collection capabilities. The follow-up activity is scheduled by a Salesforce.com workflow. The request to complete the survey may be sent as an e-mail from Salesforce.com, in which case it will be recorded as an activity related to the contact. Such an e-mail will contain the URL at which the recipient can complete the survey. The request can also be sent by mail (a manual process) and, optionally, recorded as an activity related to the contact.
Actors: Designated POC at an organization whose personnel have participated in the TIM training program, plus an SDA (which is a role or responsibility) in the case of a paper-based assessment
Preconditions: The POC has been provided with a URL to take the assessment online or has been provided with a paper-based version of the assessment vehicle that will be entered into the online system by the SDA.
Inputs: Online: not applicable. Paper-based: completed survey form
Events sequence: Online: The POC enters the URL provided into his/her preferred web browser and begins to enter his/her responses. Alternatively, if the URL has been e-mailed to the POC, the e-mail client software may allow him/her to click through to the target URL. An on-screen progress bar shows the percentage completed. When all required questions have been answered, the POC clicks "Submit" to save his/her responses. Paper-based: The SDA visits an administrative URL and follows a similar procedure to enter the POC's written responses into the Survey Management module.
Postconditions: The POC's responses are recorded and available for subsequent reporting and analysis.
Requirements map: CRM-4, CRM-5, SVY-1, SVY-2, SVY-3, SVY-6, SVY-7
Related UI: Figure 3.19

Figure 3.19. Level 4 online survey example.
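The Level 3 and Level 4 follow-up requests in UC06 and UC07 are both driven by the same mechanism: a Salesforce.com workflow fires a configurable period after the class and sends the recipient the survey URL, with a second reminder if the survey remains incomplete. The sketch below only illustrates that timing logic in Python; in the deployed system this is configured declaratively as Salesforce workflows rather than written as code, and every type, field, and delay value here is a placeholder.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# Placeholder delays standing in for the "configurable period of time"
# described in Tables 3.12 and 3.13.
FOLLOWUP_DELAY = timedelta(days=90)  # request Level 3/4 survey this long after class

@dataclass
class Poc:
    email: str
    survey_submitted: bool = False

@dataclass
class TrainingEvent:
    class_date: date
    survey_url: str
    agency_pocs: list = field(default_factory=list)

def followups_due(events, today=None):
    """Yield (poc_email, survey_url) for follow-up requests now due."""
    today = today or date.today()
    for ev in events:
        if today >= ev.class_date + FOLLOWUP_DELAY:
            for poc in ev.agency_pocs:
                if not poc.survey_submitted:
                    yield poc.email, ev.survey_url
```

In Salesforce itself, the e-mail sent by the workflow is also recorded as an activity on the contact, which is what makes the reminder history visible in the CRM.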

Table 3.28. UC08: TIM Program Staff/Agency Personnel Perform Data Analysis

Description: This function allows TIM program staff or agency personnel to perform analysis on data collected via Level 1 through Level 4 surveys. Data analysis is performed using a Microsoft Access–based assessment tool. A set of predefined assessment reports was created as part of this tool. The reports answer questions such as whether students think the TIM course helps their job performance, how well students scored on each lesson, and whether participating agencies see TIM performance improvement after the training. Reports can be aggregated by agency, state, discipline, student affiliation, or training method as appropriate.
Actors: Program manager or analyst
Preconditions: Predefined reports have been created in Access. The user has been granted login credentials and access privileges to run reports and to save them in PDF format or export them to Excel.
Inputs: Organization and contact information from Salesforce.com; survey results from FluidSurveys
Events sequence: The user logs into the assessment reporting tool and navigates through the user interface to select the report level (1–4) and a desired report. He/she may also specify data filters and/or a report aggregation level when appropriate and desired. The user will run the selected report and save the report as a PDF file or export the data to a CSV/Excel file.
Outputs: A PDF report or a CSV/Excel data file
Postconditions: The report file or data file can be distributed; the Excel data file can be further analyzed.
Requirements map: AR-3, AR-4, AR-5, AR-6
Related UI: Figures 3.20 and 3.21

Figure 3.20. Select and run assessment reports.

Figure 3.21. Sample assessment report.

Sample report: Trainees Who Found the Training to Be Helpful for Job Performance, by Organization (Affiliation: Paid Professional)

Branson Department of Public Safety: 1 responder, 100%
Ozark Fire Department: 1 responder, 100%
Ozark Wrecker Service: 4 responders, 25%
Springfield EMS: 4 responders, 50%
Willard EMS: 2 responders, 50%
Willard Police Department: 5 responders, 60%

Table 3.29. UC09: Administrator Authors/Modifies Surveys/Tests

Description: This function allows the TIM training program to create new assessment vehicles or change existing ones. This function was implemented through FluidSurveys' survey authoring capabilities.
Actors: Survey Author (SA, which is a role or responsibility)
Preconditions: The SA is familiar with training objectives and desired outcomes and is knowledgeable about designing assessments. The SA must be logged into the Survey Management module and have permission to author, edit, and publish surveys.
Inputs: not applicable
Events sequence: The SA clicks the "New Survey" button to create a new survey or selects the name of an existing survey and then clicks the "Edit" button to begin modifying the survey. If the SA wishes to use an existing survey as the basis for a new one, he/she can select the existing survey, then click the "Actions" button and select "Duplicate" from the dropdown list to create a copy that can subsequently be modified. The process of creating and modifying surveys is covered in detail by documentation and how-to videos available at http://fluidsurveys.com/help-tutorials/.
Outputs: not applicable
Postconditions: The new or modified survey is available to collect responses online or be exported to a printable format.
Requirements map: SVY-4, SVY-5
Related UI: Figure 3.22

Figure 3.22. Authoring and editing surveys.

Table 3.30. UC10: Administrator Authors/Modifies Analysis Reports

Description: This function allows the training program manager or business analyst to create new assessment reports and to modify or delete existing reports. Newly created reports are made available through the Access assessment reporting tool's user interface; deleted reports are removed from the user interface.
Actors: Training program manager or business analyst
Preconditions: The user has been granted login credentials to the Access assessment reporting tool and has privileges to create, modify, and delete reports. The user has also been granted privileges to create and modify user interface screens.
Inputs: not applicable
Events sequence: The training program manager or business analyst logs into the Access tool. He/she will create new assessment reports using SQL queries and report layouts. The reports will have the ability to filter and/or aggregate data when appropriate. The user will modify the assessment user interface to make the newly created reports available for program managers and analysts to perform data analysis. If the user deletes an existing report, it will be unavailable from the Access user interface.
Outputs: New or modified assessment reports
Postconditions: New assessment reports are created; existing reports are modified or removed.
Requirements map: AR-1, AR-2, AR-3, AR-4, AR-5
Related UI: Figures 3.23 and 3.24

Figure 3.23. Report query design.
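Because the predefined reports are ordinary Access SQL queries, a report like the one in Figure 3.21 reduces to a single aggregate query over the reporting database. The sketch below is illustrative only: the table and column identifiers loosely follow the designs in Tables 3.31 and 3.33, but the exact schema names and the wording of the survey question are assumptions rather than the tool's actual definitions.

```python
import pyodbc

# Hedged sketch of the kind of query that might sit behind Figure 3.21:
# for each organization, count Level 1 responders and the share who
# answered "Yes" to the helpfulness question.
conn = pyodbc.connect(
    r"Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=C:\TIM\tim_reporting.accdb"
)

sql = """
    SELECT o.OrganizationName,
           COUNT(*) AS NumResponders,
           SUM(IIF(r.Answer = 'Yes', 1, 0)) * 100.0 / COUNT(*) AS PctHelpful
    FROM Responses AS r
    INNER JOIN Organizations AS o
        ON r.ResponderOrganizationID = o.OrganizationID
    WHERE r.SurveyType = 'Level 1'
      AND r.Question = 'Was the training helpful for your job performance?'
    GROUP BY o.OrganizationName
"""

for org, n, pct in conn.cursor().execute(sql):
    print(f"{org}: {n} responders, {pct:.0f}% found the training helpful")

conn.close()
```

Filters such as affiliation or state would be added as further WHERE conditions, which is how the tool's aggregation and filtering requirements (AR-4, AR-5) are met in practice.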

Figure 3.24. Report design.
Figure 3.25. Database entity relationship diagram.

Database Tables

The remainder of this section provides a detailed description of each database table.

Organizations. Table 3.31 contains a list of all agencies/organizations participating in the TIM training program.

Contacts. Table 3.32 contains a list of all trainees and trainers participating in the TIM training program. It also identifies an organization's POC for Level 3 and Level 4 follow-up surveys.

Responses. Table 3.33 contains responses for all surveys. For flexibility and scalability, each row in this table represents the response to one survey question.

Answer Keys. Table 3.34 contains the answer keys to the Level 2 survey (test) questions. Each row in this table represents the answer key to one question.

Table 3.31. Organizations Table Design

Organization ID (PK): Unique identifier for an organization, generated by the CRM system
Organization name: Name of the organization
Street: Street address of the organization
City: City where the organization is located
State: State of the organization
Zip code: Postal code for the organization
Phone: Main phone number of the organization
Website: The organization's website
Note: PK = primary key.

Table 3.32. Contacts Table Design

Contact ID (PK): Unique identifier for the person, generated by the CRM system
First name: Contact's first name
Last name: Contact's last name
Salutation: Contact's salutation
Organization ID: The organization that the contact belongs to
Job title: Contact's job title
Department: The department of the organization for the contact
Mailing street: Contact's street address
Mailing city: City of the contact's mailing address
Mailing state: State of the contact's mailing address
Mailing zip code: Postal code for the contact's mailing address
Phone: Contact's phone number
Fax: Contact's fax number
Mobile: Contact's mobile phone number
E-mail: Contact's e-mail address
Responder discipline (a): Contact's disciplinary field, chosen from the following list of values: DOT; Emergency Medical Services; Fire and Rescue; Hazmat; Law Enforcement; Towing and Recovery; Other
Responder affiliation (a): Contact's affiliation, chosen from the following list of values: Paid Professional; Volunteer; Not Applicable
Is trainer (a): Whether the contact is a TIM trainer (Yes/No)
Is designated POC (a): Whether the person is a designated point of contact for the organization
(a) This information is not currently collected during the NHI course registration process. These attributes are collected via Level 1 and Level 2 surveys and stored in the Responses table (Table 3.33). However, since there is no direct connection between a survey response and the survey responder, these attributes are not reflected in this (Contacts) table. The related fields are thus placeholders should the TIM training registration process be modified in the future to gather these additional attributes.

Table 3.33. Responses Table Design

Survey type: Level 1, Level 2, Level 3, or Level 4
Survey collector (PK): A means to gather survey responses over time and through multiple versions of the surveys
Response ID (PK): Unique identifier of a response to a survey
Question ID (PK): Unique identifier of a survey question within a survey collector
Question: Text of the survey question
Answer: Responder's answer to the question
Lesson: The training lesson this question is related to
Score: Whether the answer matches the answer key (1) or not (0); used only by Level 2 surveys/tests
Status: Status of the survey response (Complete or Incomplete)
Created at: Date/time when the response was created
Invite e-mail: E-mail address for the survey invite
Responder organization ID: Responder's organization ID

Table 3.34. Answer Keys Table Design

Survey type: Level 1, Level 2, Level 3, or Level 4
Survey collector (PK): A means to gather survey responses over time and through multiple versions of the surveys
Question ID (PK): Unique identifier of a survey question within a survey collector
Question: Text of the survey question
Answer: Answer key to the survey question
Lesson: The training lesson this question is related to
Created at: Date/time when the answer key was created
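Since the Score column in the Responses table is defined as a match against the answer key, Level 2 tests can be scored in a single pass that joins the two tables on survey collector and question ID. The following is a minimal sketch of that step, assuming the designs above map to Access tables named Responses and AnswerKeys; those identifiers, like the connection path, are illustrative rather than the tool's actual names.

```python
import pyodbc

# Score Level 2 responses by matching each answer to its answer key on
# (survey collector, question ID), per the designs in Tables 3.33 and 3.34.
conn = pyodbc.connect(
    r"Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=C:\TIM\tim_reporting.accdb"
)
cur = conn.cursor()

cur.execute(
    """
    UPDATE Responses AS r
    INNER JOIN AnswerKeys AS k
        ON r.SurveyCollector = k.SurveyCollector
       AND r.QuestionID = k.QuestionID
    SET r.Score = IIF(r.Answer = k.Answer, 1, 0)
    WHERE r.SurveyType = 'Level 2'
    """
)

conn.commit()
conn.close()
```

With scores populated this way, per-lesson results roll up directly from the Lesson column, which is how reports such as "how well students scored on each lesson" can be produced without ever linking a response to an individual trainee.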

System Test and Pilot

Scope of Testing

The research team employed standard practices for testing the software. These included unit testing during development, the execution of a set of tests based on the use cases described earlier, and various ad hoc tests. For example, when core product functionality was being used to create an account or contact in Salesforce.com or to author a survey and collect survey data online with FluidSurveys, the research team made the assumption that the vendor had performed thorough quality assurance testing, which did not need to be replicated. The research team therefore confined the scope of testing mostly to extensions or customizations, such as the addition of custom fields, and to validating the integrity of data flowing through the system.

Pilot

The original project plan anticipated conducting a pilot to test the TIM assessment tool in a setting approximating production usage. The pilot was to be conducted in conjunction with pilot testing of the online TIM training course being developed by SHRP 2 Project L32B. However, because that project was extended beyond the March 31, 2014, end date of Project L32C, a joint pilot was not feasible.

In lieu of a pilot, SHRP 2 program staff requested that the L32C research team document the requirements for data integration between the TIM assessment tool and the NHI system hosting the L32B online courseware. The research team produced such a document, which is attached to this report as Appendix A. The team also provided NHI with sample data files conforming to this document so the agency could assess its ability to produce exportable data in this format.

During the pilot time frame, the research team also conducted three separate briefings and demonstrations of the TIM assessment tool to SHRP 2 program staff, personnel from various FHWA departments, and the TETG.
