CHAPTER 7

Agency Contract Administration Tools

7.1 Introduction

The identification and presentation of contract administration tools for D-B and CM-GC projects are key contributions of these guidebooks. For the purposes of this research project, a tool is defined as a tactic or process relating to D-B or CM-GC contract administration, such as checklists, spreadsheets, guidelines, and structured meetings. Although the tools take many forms and accomplish a variety of objectives, their core function remains the same. This chapter discusses tool identification, selection, development, and examples.

7.2 Tool Identification

The tools presented in these guidebooks were identified during the case study interviews discussed in Chapter 6. While learning about the detailed processes used during the execution of these D-B and CM-GC projects, the research team identified tools that were used to perform specific operations. To be considered for these guidebooks, a tool also had to be unique to D-B or CM-GC, or at least modified to fit these delivery methods. If an agency or team used a tool on projects regardless of the delivery method, it was not considered specific to ACMs. However, if a tool was used on D-B-B projects but modified to fit D-B or CM-GC more appropriately, it was considered for these guidebooks. After a tool was identified, the research team requested specific documentation or detailed descriptions of the tool's function, why it was created and used, when it was used most effectively, and how to use it most appropriately.

In some instances, similar tools were identified in multiple case studies from different agencies or from different teams within the same agency. This allowed the research team to obtain multiple examples of how a tool was implemented on projects and to gain more insight into its specific benefits.
Some agencies had different names for certain tools or processes, or they implemented them in a different manner. However, if the tools performed the same function or operation, the research team created one comprehensive tool description for the guidebooks.
Guidebooks for Post-Award Contract Administration for Highway Projects Delivered Using Alternative Contracting Methods

These instances also provided multiple examples to be included with the tools. Multiple tool examples allowed agencies less familiar with the tools to understand their function and processes more clearly. This also allowed agencies to tailor the tools to fit more appropriately within their current systems or standards.

7.3 Initial Tool Selection Survey

After the case study interviews were completed, a survey was presented to the consultant team and experienced practitioners to test the applicability and effectiveness of the identified tools.

Survey Participants

The survey was distributed to a list of 20 experienced practitioners who agreed to work with the research team. These individuals were deeply involved in post-award contract administration of projects delivered using ACMs. It was determined that their experience would provide useful insight into the effectiveness and appropriateness of the identified tools. Of the 20 practitioners who received the questionnaire, 16 were able to provide their responses within the given time frame.

Survey Layout

The original 36 identified tools were separated into three categories: Team Alignment Tools, Design Tools, and Construction Tools. A brief description of each tool was provided to ensure that the participants were familiar with the tool and its purpose. The tool description also explained whether the tool was compatible with CM-GC projects, D-B projects, or both project types. After each tool was described, the first question asked was, "Is this tool effective enough to include in the guidebook for Post-Award Contract Administration?" Participants were asked to select one of the following answers:

• Yes
• No
• I do not feel comfortable reviewing this tool.

If the participant responded "Yes," the participant moved on to answer more questions regarding this tool.
If the participant selected "No" or "I do not feel comfortable reviewing this tool," the participant moved on to the next tool description. Participants who responded "Yes" to the first question were then asked, "Is this an appropriate tool for the following project complexities?" The selection options were presented to participants in a matrix (Figure 7.1).

Figure 7.1. Project complexities matrix.
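The per-tool skip logic described above can be sketched as a short Python function. This is purely illustrative; the question wording follows the survey description in this chapter, and the function and variable names are hypothetical, not part of any published survey instrument.

```python
# Illustrative sketch of the survey's per-tool branching (assumed
# structure; the actual survey instrument is not reproduced here).
FOLLOW_UPS = [
    "Is this an appropriate tool for the following project complexities?",
    "Is this an appropriate tool for the following project sizes?",
]

def questions_for_tool(first_answer: str) -> list[str]:
    """Return the follow-up questions a participant sees for one tool."""
    if first_answer == "Yes":
        # Effectiveness confirmed: present the complexity and size matrices.
        return FOLLOW_UPS
    # "No" or "I do not feel comfortable reviewing this tool":
    # skip straight to the next tool description.
    return []

print(questions_for_tool("No"))  # -> []
```

A "Yes" response thus triggers both matrix questions (and the optional comment field), while any other response advances the participant to the next tool.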
Based on their own experience with ACM projects and the descriptions of the tools, the participants were asked to select whether each tool was appropriate for non-complex, moderately complex, and complex projects. The following definitions were provided regarding project complexities:

Complex (Major) Projects
• New highways or major relocations,
• New interchanges,
• Capacity adding or major widening,
• Major reconstruction (4R, 3R with multiphase traffic control),
• Congestion management studies required, and
• Environmental impact statement or complex environmental assessment required.

Moderately Complex Projects
• 3R and 4R projects that do not add capacity,
• Minor roadway relocations,
• Non-complex bridge replacements with minor roadway approach work, and
• Categorical exclusion or non-complex environmental assessment required.

Non-Complex (Minor) Projects
• Maintenance betterment projects;
• Overlay projects, simple widening without right-of-way (or very minimal right-of-way take), or little or no utility coordination;
• Non-complex enhancement projects without new bridges (e.g., bike trails); and
• Categorical exclusion.

(Note: 3R = resurfacing, restoration, rehabilitation; 4R = new construction or reconstruction.)

Next, the participants were asked, "Is this an appropriate tool for the following project sizes?" Similar to the previous question, the selection options were presented as a matrix (Figure 7.2). The project sizes were presented in the form of contract dollar amounts ranging from less than $10 million to more than $50 million.

Finally, participants were given the option to provide additional comments about the tool presented. These comments could concern the description of the tool, their experience with the tool, or a justification for their answers to any of the questions.

Results

After all participants who were able to give responses had completed the survey, the data were analyzed.
Of the 36 tools included in the survey, none were eliminated from consideration for the guidebooks based on the responses provided.

Figure 7.2. Project size matrix.

A summary of the responses to the first question, "Is this tool effective enough to include in the guidebook for post-award contract administration?" can be seen in Table 7.1.

Percent Consensus    Number of Tools
≥ 70%                36
≥ 70% to < 80%       6
≥ 80% to < 90%       6
≥ 90% to 100%        24

Table 7.1. Question 1 results.

The responses to the second question, "Is this an appropriate tool for the following project complexities?" were divided into three classifications, seen in Table 7.2.

Percent Consensus    Classification
< 50%                Not recommended
≥ 50% to < 80%       Consider case by case
≥ 80% to 100%        Recommended

Table 7.2. Classifications.

The purpose of these classifications was to share the recommendations of the practitioners with the agencies that were considering using these tools. If a tool was classified as "Not recommended" for a project complexity, the practitioners felt that it would be ineffective, not appropriate, or not worth the resources to implement with that type of project. When a tool was classified as "Consider case by case," the tool should be considered, but depending on the details of the project it could still be effective or ineffective; it was at the discretion of the agency to determine whether the tool would function properly with a project of that complexity. Finally, if a tool was classified as "Recommended" for a project's level of complexity, the practitioners felt that the tool should be used for the corresponding complexities. Details regarding the classifications for each tool can be found with the tool descriptions. A summary of the results can be seen in Table 7.3, which shows that as projects became more complex, the practitioners recommended the use of more tools to execute the project.

Classification          Non-Complex    Moderately Complex    Complex
Not Recommended         6 tools        0 tools               0 tools
Consider Case by Case   23 tools       4 tools               0 tools
Recommended             7 tools        32 tools              36 tools

Table 7.3. Classification versus complexity summary.

Classification          <$10 million   $10 million to $50 million   >$50 million
Not Recommended         2 tools        0 tools                      0 tools
Consider Case by Case   20 tools       2 tools                      0 tools
Recommended             14 tools       34 tools                     36 tools

Table 7.4. Classification versus size summary.
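As a concrete illustration, the percent-consensus thresholds in Table 7.2 can be expressed as a small function. This is a hypothetical sketch for readers who want to apply the same bands to their own survey data; the report itself does not prescribe any implementation, and the function name is invented here.

```python
def classify_tool(yes_votes: int, total_votes: int) -> str:
    """Map a percent-consensus value to a Table 7.2 classification band.

    Thresholds follow the survey analysis described in this chapter:
    < 50% -> Not recommended; >= 50% to < 80% -> Consider case by case;
    >= 80% -> Recommended. (Illustrative only, not the report's code.)
    """
    if total_votes <= 0:
        raise ValueError("no responses to classify")
    consensus = 100 * yes_votes / total_votes
    if consensus < 50:
        return "Not recommended"
    elif consensus < 80:
        return "Consider case by case"
    else:
        return "Recommended"

# Example: 13 of the 16 responding practitioners marking a tool
# appropriate gives 81.25% consensus, which falls in the
# "Recommended" band.
print(classify_tool(13, 16))  # -> Recommended
```

With 16 respondents, for instance, 7 "yes" votes (43.75%) would fall below the 50% threshold, while 10 votes (62.5%) would land in the case-by-case band.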
The results of the third and final question, "Is this an appropriate tool for the following project sizes?" were classified using the same ranges as the complexities in the second question, as shown in Table 7.2. Again, the details regarding the classifications can be seen in each tool description, and a summary of the results can be seen in Table 7.4. Similar to the results of the complexity question, the practitioners felt that as the contract value of a project increases, the number of tools used should increase as well.

7.4 Final Tool Selection and Examples

The survey results showing the most appropriate project sizes and complexities were incorporated into the tool descriptions, which can be found in the guidebooks. However, some tools were removed after the survey because their examples were missing or of low quality. The research team felt that tools without high-quality examples would be unclear and potentially misunderstood by guidebook users. Therefore, before any of the removed tools are considered for future use, they should be further developed and tested by state transportation agencies.

The research ultimately incorporated 28 tools into NCHRP Research Report 939: Guidebooks for Post-Award Contract Administration for Highway Projects Delivered Using Alternative Contracting Methods, Volume 1: Design-Build Delivery, and 32 tools into NCHRP Research Report 939: Guidebooks for Post-Award Contract Administration for Highway Projects Delivered Using Alternative Contracting Methods, Volume 2: Construction Manager-General Contractor Delivery. The provided tool descriptions were generated using the information obtained during the case studies, feedback from the practitioner reviewers, and comments collected from the initial tool selection survey.
7.5 Summary

The tools presented in these guidebooks were created and used by state transportation agencies across the country to administer contracts for D-B and CM-GC projects. The tool descriptions and recommended use, depending on project size and complexity, were generated through a process involving case studies of 30 projects at 13 agencies and the feedback and opinions of the research team's group of practitioners. These tools have proven effective in administering contracts for D-B and CM-GC projects and should be considered by all state transportation agencies for future use. The provided examples are meant to serve as guidance, but tailoring should also take place to ensure that the tools fit specifically within each agency's established system.