Suggested Citation:"Chapter 2 - Research Approach." National Academies of Sciences, Engineering, and Medicine. 2014. Post-Course Assessment and Reporting Tool for Trainers and TIM Responders Using the SHRP 2 Interdisciplinary Traffic Incident Management Curriculum. Washington, DC: The National Academies Press. doi: 10.17226/22320.


Chapter 2. Research Approach

The work for this project was roughly divided into four stages, as shown in Figure 2.1. The initial task was to develop and describe a full-range assessment process for the TIM training program that would address what would be assessed, how the assessment would be done, and when each assessment step would take place. A parallel and related task was to perform a literature review, including an assessment of other relevant training initiatives, and to develop a business model that specified how best to develop, implement, and sustain the TIM assessment tool. The research team then developed a set of use cases that served as a framework for the development of functional requirements; that framework guided subsequent system architectural design, development, and testing. Finally, the research team demonstrated the TIM assessment tool to SHRP 2 program staff, Federal Highway Administration (FHWA) personnel, and the Technical Expert Task Group (TETG). Key aspects of the research approach are discussed in the following sections.

Collaborative Nature of Project L32C

The L32C project is tightly coupled with Projects L12, L32A, and L32B and with other TIM training programs nationwide, and it requires support from many federal, state, and local agencies. That made L32C a highly collaborative effort. Figure 2.2 shows the interactions and communications between L32C and other projects and organizations.
During the course of the work, the L32C team

• Attended project meetings with L32B, SHRP 2, and FHWA staff to review tasks, milestones, and timelines for Projects L32B and L32C and to discuss dependencies between the two projects;
• Engaged with the National Highway Institute (NHI) regarding its e-learning platform and mechanisms for integrating with it;
• Reviewed the L12 and L32A projects' final reports and course materials;
• Reviewed Project L32A course evaluation questionnaires and student exam;
• Reviewed Project L32B results and initial system requirements and design approaches;
• Reviewed NHI's registration survey and questions for course evaluation;
• Attended an FHWA-run TIM train-the-trainer class held in Rhode Island; and
• Interviewed responder agency managers and training professionals.

Conceptual Model for Training Evaluation

Early in the project the research team decided to use the widely used "Kirkpatrick Model" as a conceptual reference for the TIM assessment tool. Donald L. Kirkpatrick's four-level evaluation model first appeared in a series of articles published in 1959 and became popular with his 1994 book, Evaluating Training Programs (Kirkpatrick 1959; Kirkpatrick 1994).

The idea behind the Kirkpatrick Model is to provide organizations with meaningful ways to evaluate training programs or learning in the organization. The four levels of evaluation described by the model are depicted in Figure 2.3.

Level 1. This level tries to ascertain how students feel about the training; it is a measure of student motivation and satisfaction. Students are typically asked to fill out evaluation or feedback forms immediately after the training ends. These forms usually include questions to evaluate instructors, training materials, and training logistics.

Level 2. This level measures how much the students have learned by attending the training.
The measurements aim to find out what knowledge was learned, what skills were developed or improved, and what attitudes were changed. Students typically need to complete evaluation forms or perform some type of test both before and after the training.

Figure 2.1. SHRP 2 L32C project stages.
Figure 2.2. Collaboration between the SHRP 2 L32C project and other projects and organizations.

Level 3. This level measures whether on-the-job behavioral changes have occurred as a result of students attending the training, and if so, to what extent. Trainees, their immediate supervisors, and their subordinates or peers who often observe their behaviors may be asked to participate in this level of evaluation. The degree of assessment difficulty is increased at this level because behavioral changes often take time, and the right environment must be provided for the students to implement their behavioral changes. Additionally, those who participate in this evaluation need to be observant to note the behavioral changes that took place.

Level 4. This level measures the impact on the business as a result of students attending the training and their subsequent on-the-job behavioral changes. The impact may be determined in terms of improved safety, increased productivity and efficiency, and reduced staff turnover. This level of assessment is usually the most difficult because results take time to achieve; measurements are needed both before and after the training. The evaluation also needs to determine what business results have been achieved as a result of student participation in the training, as opposed to other organizational initiatives.

The research team applied this four-level evaluation model when designing the TIM assessment process and tool.

Figure 2.3. Kirkpatrick's four levels of learning evaluation.

System Development Methodology and Approach

The research team followed a typical systems development life cycle (SDLC) approach to designing and developing the TIM assessment tool. The analysis stage of the project provided an understanding of the business requirements, an essential first step in any SDLC methodology. From there the team

• Documented a set of use cases;
• Developed a concept of operations and initial meta-architecture for the system;
• Translated this foundational information into functional requirements;
• Established the systems architecture, developed functional specifications, and documented test cases; and
• Developed the software and performed unit testing.

The high-level concept for the system that the team envisioned is shown in Figure 2.4. It consists of three major blocks of functionality:

• Survey Management—functionality to execute a particular kind of assessment, from a Level 1 (Reaction) survey at the conclusion of a training event to a Level 4 (Results) survey long after a training event or series of events.
• Constituent Management—functionality to manage relationships and communications with all key constituents (e.g., individual students, agency training officers and/or management, and trainers).
• Analysis and Reporting—functionality that enables program staff as well as participating agencies to analyze and report on training participation, needs, effectiveness, and so on.

Figure 2.4. High-level system concept (Survey Management System; Constituent (Customer) Relationship Management; Analysis & Reporting Tools).

Custom software development is almost always the least desirable approach to system implementation, and that was certainly the case for this project. Controlling costs, ensuring sustainability, and preserving long-term flexibility are always important considerations, particularly in a research project with limited scope and funding. All of these factors pointed to the use of off-the-shelf technology, with a focus on integration and customization via configuration, as opposed to writing code from scratch. Fortunately, the major functional elements of the envisioned system were all available in various cost-effective forms:

• Highly capable and popular survey management and constituent relationship management packages (which evolved from customer relationship management, or CRM) are available as software-as-a-service (SaaS) subscriptions. In this model, an organization pays a monthly or annual usage fee, and the vendor hosts and manages the entire application environment in the cloud.
• Cloud computing services allow an organization to subscribe to a cloud-based, virtual computing, storage, and network environment and pay for usage on a time-, capacity-, and bandwidth-used basis. In this case the organization is responsible for licensing, installing, and maintaining the application that runs in this virtual environment.

As described in more detail in Chapter 3 of this report, the team's approach was to use SaaS subscriptions for survey management and CRM functions and to base the analysis and reporting functions on desktop products that integrate with a cloud computing platform.
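To make the mapping between the Kirkpatrick levels and the assessment process concrete, the four-level plan described above could be sketched as a small data model. This is a minimal illustration only; all names (`KirkpatrickLevel`, `Assessment`, `PLAN`) are hypothetical and are not part of the report's deliverable.

```python
from dataclasses import dataclass
from enum import Enum


class KirkpatrickLevel(Enum):
    """The four evaluation levels described in the chapter."""
    REACTION = 1   # how students feel about the training
    LEARNING = 2   # knowledge, skills, and attitudes gained
    BEHAVIOR = 3   # on-the-job behavioral change
    RESULTS = 4    # business impact (safety, productivity, turnover)


@dataclass
class Assessment:
    """One assessment event tied to a training course."""
    level: KirkpatrickLevel
    instrument: str   # e.g., feedback form, exam, supervisor survey
    timing: str       # when it is administered relative to the training


# Typical instrument and timing for each level, following the text above.
PLAN = [
    Assessment(KirkpatrickLevel.REACTION, "course evaluation form",
               "immediately after training"),
    Assessment(KirkpatrickLevel.LEARNING, "pre/post exam",
               "before and after training"),
    Assessment(KirkpatrickLevel.BEHAVIOR, "supervisor/peer survey",
               "months after training"),
    Assessment(KirkpatrickLevel.RESULTS, "agency outcome review",
               "before and long after training"),
]


def assessments_for(level: KirkpatrickLevel) -> list:
    """Return the planned assessments for a given evaluation level."""
    return [a for a in PLAN if a.level is level]
```

A sketch like this makes explicit why Levels 3 and 4 are harder to administer: their instruments run well after the training event, which is exactly what the Constituent Management block's scheduled follow-ups would support.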

TRB’s second Strategic Highway Research Program (SHRP 2) Report S2-L32C-RW-1: Post-Course Assessment and Reporting Tool for Trainers and TIM Responders Using the SHRP 2 Interdisciplinary Traffic Incident Management Curriculum documents the development of a tool to assess the effectiveness of a multidisciplinary, multiagency training curriculum for traffic incident management.
