Appendix D
Guidebook Vetting Protocol

The research team adopted the vetting protocol that it used on the Preconstruction Services Cost Estimating Guidebook, a deliverable of the NCHRP 15-51 project. The process was carried out twice, first at the Minnesota DOT and then at the Alabama DOT. The vetting process was adapted from a process used in adult education research to validate the content of university-level textbooks (Grant et al. 2009; Lane and Burtuzzi 2011). The vetting protocol has five steps, which are detailed in the following paragraphs.

Step 1: Engage stakeholders

The audience for a guidebook is typically DOT employees or independent practitioners. This audience should be involved throughout the evaluation process, including decisions about where and how to conduct the evaluation. Because funding within a DOT is often restricted, it is usually more practical to hold the evaluation on site than to expect staff to travel. Generally, a diverse group of people within an organization will be affected by (and benefit from) the contents of a guidebook, so it is wise to include as many of those people as practical in the vetting. The research team identified the intended users of the research and, from them, established the stakeholders who would be involved in evaluating the produced document. The ATC process typically involves staff from several DOT offices. To ensure that the guidebook communicated effectively with all of these people, a workshop was held at each DOT's main office to attract the highest possible number of attendees.

Step 2: Create a logic model

A logic model for the entire research project was developed, as shown in Figure D.1. The model identifies the wider situation that initially prompted the research and the desired outcomes of the work.
It should be noted that creating and evaluating the quality of the PCS Estimating Guidebook is an output of the model, which is then expected to facilitate achievement of the outcomes.

Figure D.1. Logic Model (adapted from University of Wisconsin Extension, 2002). The model flows from the Situation through Inputs (staff, time, money, equipment, facilities) to Outputs (develop products, workshops, publications, field days) and then to Outcomes: short-term changes in learning (awareness, knowledge, skills, attitude, motivation), medium-term changes in action (behavior, practice, decision-making, policies), and long-term changes in conditions (social, economic, civic, environmental). Assumptions and external factors underlie the model.
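A logic model such as the one in Figure D.1 can also be encoded as a simple data structure so that a research team can check collected evidence against the model's outcome indicators. The sketch below is purely illustrative: the function name and the idea of tracking "achieved" indicators are assumptions, not part of the NCHRP study; the category lists simply transcribe Figure D.1.

```python
# Illustrative encoding of the Figure D.1 logic model; structure and
# helper function are hypothetical, added only to show one way a team
# could track progress against the model's outcome indicators.
logic_model = {
    "inputs": ["staff", "time", "money", "equipment", "facilities"],
    "outputs": ["develop products", "workshops", "publications", "field days"],
    "outcomes": {
        "short_term (changes in learning)": [
            "awareness", "knowledge", "skills", "attitude", "motivation"],
        "medium_term (changes in action)": [
            "behavior", "practice", "decision-making", "policies"],
        "long_term (changes in conditions)": [
            "social", "economic", "civic", "environmental"],
    },
}

def unmet_outcomes(model, achieved):
    """Return outcome indicators from the model not yet marked achieved."""
    return [indicator
            for indicators in model["outcomes"].values()
            for indicator in indicators
            if indicator not in achieved]

# Example: after one vetting round, two indicators are judged achieved.
print(unmet_outcomes(logic_model, {"awareness", "knowledge"}))
```

Such a checklist view supports the iteration in Step 5, where the team asks whether the collected workshop data fulfill some or all of the prescribed outcomes.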
Step 3: Collect data

The vetting workshops took one day and involved upper management, design, construction, and contracting personnel from the two selected DOTs. Two weeks before the workshop, a copy of the draft ATC Guidebook was circulated to invited participants. The workshops consisted of a mixed program of presentations of guidebook material and interactive exercises applying the concepts of the guide in practice. Participants worked together to complete each of the exercises. Data to evaluate the quality of the guidebook were collected through observations, a survey, and a focus group.

Observations were made during Day 1 of the workshop. Of particular interest to the research team was each participant's engagement with the material provided. Three members of the research team used an evaluation observation form (Figure D.2) to tally what they observed. Specifically, the observers were looking to see whether participants were able to relate the ATC concepts to their prior ATC experience and find value in the information provided. Observations were also made during the exercises to gauge how well the guidebook aided participants in completing the designated exercises. If a group had to ask additional questions, this was noted, as it implied that the guide was not comprehensive enough in that particular area.

Figure D.2. PCS Guidebook Evaluation Observation Form
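The tallying described above can be sketched as combining several observers' counts into one total per behavior. The category names and counts below are hypothetical, chosen only to illustrate the mechanics; the actual categories come from the observation form in Figure D.2.

```python
from collections import Counter

def combine_tallies(observer_tallies):
    """Sum per-category tallies from several observers into one Counter."""
    total = Counter()
    for tally in observer_tallies:
        total.update(tally)
    return total

# Hypothetical tallies from three observers during one exercise.
obs1 = Counter({"related to prior ATC experience": 4,
                "asked additional question": 1})
obs2 = Counter({"found value in material": 3,
                "asked additional question": 2})
obs3 = Counter({"related to prior ATC experience": 2})

totals = combine_tallies([obs1, obs2, obs3])
# A high count of additional questions flags areas where the guide
# may not be comprehensive enough.
print(totals["asked additional question"])  # 3
```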
A series of standard survey questions was asked at the end of the workshop to measure participants' views on the guidebook and the concepts presented within it. The questions were developed following the principles of Taylor-Powell and Renner (2009) and aimed to quantify perceived changes in motivation, knowledge, and skills as a result of the guidebook content.

The second section of the workshop involved a focus group with DOT preconstruction personnel and upper management. The purpose of a focus group is to obtain perceptions about a defined topic through a planned discussion (Yu et al. 2006). The workshop focus group involved a structured discussion of the strengths and weaknesses of the guidebook and the realities of implementing its concepts within an agency. It was important for the research team to steer the discussion toward the guidebook's applicability to all national highway agencies and to limit the focus on its application to the one DOT participating in the vetting. Notes taken by the research team were consolidated into a master list of feedback points.

Step 4: Analyze and interpret

Process the collected data into a useful format and then analyze it to interpret the key messages from the workshop participants. Establish what can be learned from this information and note any limitations of the results collected.

Step 5: Use the results

Use the knowledge gained from the collected information to make decisions on improving the guidebook. Document how these insights influenced the guidebook revisions. Assess whether the key objectives of the research project have been met. Ask the vetting team: Does the guidebook fulfill some or all of the outcomes prescribed in the logic model?

This vetting process can be repeated iteratively until the research team is satisfied that the data collected from the workshops fulfill the research outcomes.
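The "analyze and interpret" step for the survey data can be as simple as averaging Likert-scale responses per dimension. The sketch below invents three participants' responses for illustration; the dimensions follow the motivation/knowledge/skills categories the workshop survey aimed to quantify, but the scale and values are assumptions.

```python
from statistics import mean

# Hypothetical end-of-workshop survey responses on a 1-5 Likert scale,
# one dict per participant (values are invented for illustration).
responses = [
    {"motivation": 4, "knowledge": 5, "skills": 3},
    {"motivation": 5, "knowledge": 4, "skills": 4},
    {"motivation": 3, "knowledge": 4, "skills": 4},
]

def summarize(responses):
    """Mean perceived change per dimension across all participants."""
    dimensions = responses[0].keys()
    return {d: round(mean(r[d] for r in responses), 2) for d in dimensions}

print(summarize(responses)["knowledge"])  # 4.33
```

Low-scoring dimensions, read alongside the consolidated focus-group feedback, point to the guidebook sections most in need of revision before the next vetting round.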
It is recommended that a different agency be used each time to ensure the relevance of the Guidebook across the nation.