CHAPTER FOUR

IMPLEMENTATION AND MONITORING

IMPLEMENTATION

The VE Job Plan establishes a sequence of activities that has been proven to successfully improve a product, project, or process. However, no similar sequence of activities has been uniformly adopted to implement the proposed VE ideas. The different team roles, design and VE, have traditionally (and in some jurisdictions legally) required a complete separation of the design and value activities. This can lead to a potentially adversarial relationship if human relations are not respected during the VE study (2). Miles (4) cautioned as early as his first VA training workshop group in 1952 that this segregation of roles could lead to the "competitor instinct." It is common practice to have the designer complete the initial review of the VE proposals and advise the STA of how they should be addressed (i.e., accept, modify and accept, or reject).

The majority (almost 60%) of the responding transportation agencies indicated that some form of defined implementation strategy was in place for their VE programs. In many cases, an implementation or design review meeting is held following the VE workshop to consider the proposed VE ideas. For example, Michigan convenes a meeting immediately following the workshop with the VE and design teams. The combined group considers the disposition of each idea by deciding on one of three outcomes:

• Accept for implementation into design,
• Accept for further study, or
• Reject and list specific reasons.

New York State uses a similar process; however, it permits the regional VE office to conditionally accept the recommendation for further study, but defer the final decision to the main office.

The implementation process must confirm who is responsible for making the decision, define a response time frame, and manage stakeholder and political expectations as well as sensitivities. The process may necessitate different implementation team compositions to suit the idea being evaluated. For instance, the design branch would likely defer to the construction branch of the agency if the VE idea was a construction idea.

Virginia noted that the agency also uses an appeals process. The VE report is forwarded to all of the discipline managers that will be affected if the recommendation is accepted. All comments are synthesized by the regional VE manager and forwarded to the chief engineer for program development. The chief engineer has the final authority, but may consider an appeal supported by the appropriate justification materials.

California's process includes three steps: (1) review VE alternatives, (2) resolve alternatives, and (3) present results. The entire VE team is involved in the meeting. A written record regarding resolution (some agencies refer to this as disposition) of the VE alternatives identifies whether the VE proposal was accepted, modified and accepted, or rejected. Resolution of each idea is based on the validation of the accepted results.

MONITORING VALUE ENGINEERING IDEA IMPLEMENTATION

Implementation of the VE proposals is the only way to truly determine the total impacts and costs. The initial effort made during the development phase of the Job Plan is intended to refine and confirm the cost estimates. Monitoring idea implementation can promote a greater understanding of the impacts. There are two aspects of monitoring that must be considered:

• Confirming that the idea was included in the design, and
• Developing a better understanding of the true impacts and costs.

As discussed earlier in this report, FHWA is required to report the VE activities on an annual basis. STAs must provide supporting information on the VE proposals in terms of cost.

MONITORING VALUE ENGINEERING PROGRAM PERFORMANCE

Monitoring VE program performance ensures that the expenditures and effort to deliver the program are well understood.
Sixty-four percent of the responding STAs monitor program expenditures and avoided costs to develop a Return-on-Investment (ROI) report. This report is only one of the metrics measured. Other measurements include:

• Number of VE studies performed,
• Cost of the VE studies,
• Estimated project costs,
• Number of VE recommendations,
• Value of VE recommendations,
• Number of approved recommendations,
• Value of approved recommendations, and
• VE change proposals.

These are the program metrics that must be submitted to FHWA.

In addition to the FHWA-required program results, several transportation agencies, including California, Missouri, New Mexico, Virginia, and Washington State, have now begun to measure nonmonetary results. Figures 13 and 14 present the program results for Washington State (VE performance measures for 2001–2003) and New Mexico (performance-based budget measure), respectively.

FIGURE 13 Sample VE program performance measures summary report for Washington State DOT (65).

FIGURE 14 Sample return on investment summary report for New Mexico (66).

In Florida, program performance is measured as a percentage of the annual VE work plan. Ontario follows up with VE workshop participants after a workshop to obtain timely feedback. Arizona develops a benefit-cost ratio for its program.
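To illustrate how study-level figures such as these can be rolled up into an ROI or benefit-cost measure, the sketch below tallies a set of hypothetical study records. The record fields, the example dollar figures, and the simple savings-to-cost ratio are illustrative assumptions only, not an FHWA-prescribed schema or any particular STA's formula.

```python
from dataclasses import dataclass

# Hypothetical record of one VE study; the field names are
# illustrative, not an FHWA-prescribed reporting schema.
@dataclass
class VEStudy:
    study_cost: float       # cost of conducting the VE study
    recommendations: int    # number of VE recommendations made
    approved: int           # number of recommendations approved
    approved_value: float   # estimated savings of approved recommendations

def program_metrics(studies):
    """Roll individual studies up into program-level metrics,
    including a simple savings-to-cost benefit-cost ratio."""
    total_cost = sum(s.study_cost for s in studies)
    total_value = sum(s.approved_value for s in studies)
    return {
        "studies_performed": len(studies),
        "cost_of_studies": total_cost,
        "recommendations": sum(s.recommendations for s in studies),
        "approved_recommendations": sum(s.approved for s in studies),
        "value_of_approved": total_value,
        # Benefit-cost ratio: approved savings per dollar spent on studies.
        "benefit_cost_ratio": total_value / total_cost if total_cost else 0.0,
    }

# Example with made-up numbers for two studies.
studies = [
    VEStudy(study_cost=50_000, recommendations=12, approved=7,
            approved_value=1_200_000),
    VEStudy(study_cost=40_000, recommendations=8, approved=3,
            approved_value=300_000),
]
metrics = program_metrics(studies)
print(metrics["benefit_cost_ratio"])  # 1,500,000 / 90,000 ≈ 16.7
```

A ratio computed this way captures only the monetary side of performance; as the agencies above note, nonmonetary results (workshop feedback, work-plan completion) require separate measures.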