The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.
...effectiveness, testing of skills acquired, and peer exchange. Approximately 60% of the LTAP/TTAP centers used the information for input to annual reports, accountability to senior management, and program justification. Other uses for the outcomes of the measures were to modify programs and manuals, as justification for more activities, and for various forms of information dissemination.

For the types of methods used for measuring performance, state DOTs mentioned measures such as the number of research findings implemented, tally of outcomes that result in change, quantification of savings, relationship of project to priority needs, and number of organizations that changed methods. The most frequently cited means for evaluation were benefit-cost and return-on-investment determinations for quantitative data and surveys for qualitative data. Only about 25% of the state DOT survey respondents used the information generated from measuring performance in their annual reports or for program justification. Approximately 35% of responding agencies used the information for accountability to senior management. (Note that only half of the respondents used measures for performance.) State DOTs also reported the information on their web pages and published it in research newsletters, received additional funds for programs, and, for specific projects, used the performance data to facilitate implementation.

State DOTs with a role defined to coordinate technology transfer tend to use or not use performance measures equally. However, when there is no coordinating function, the agency is twice as likely not to use performance measures for technology transfer and implementation of research results activities. Furthermore, the experience level of the respondent has little influence on whether performance measures are used.

According to the LTAP/TTAP respondents, performance measures were used at approximately the same rate whether the LTAP/TTAP center was operated by a state DOT or by others. Such measures were used about three times more frequently than not used.

REPLICATING SUCCESSFUL TECHNOLOGY TRANSFER

Of the successful technology transfer projects reported on in the LTAP/TTAP survey, respondents indicated that they were moderate to easy to replicate in another agency. This is an important factor for enhancing the content and increasing the number of technology transfer activities. Representative projects include:

- Statewide workshops;
- Training and technical assistance for the new Highway Capacity Manual;
- Summer intern program management;
- Product demonstration/showcase;
- Maintenance resource guides, training modules, and CD-ROM files developed, produced, and distributed nationally; and
- Roads Scholar Program.

LTAP/TTAP centers identified elements of their technology transfer projects that were either easy or difficult to replicate. Examples of those easiest to replicate are:

- Basic course design and curriculum,
- Classroom presentations,
- Convening stakeholders,
- Finding training locations,
- Core program of slides, and
- Setting up a program.

Examples of those hardest to replicate include:

- Dedication and knowledge of the lead team;
- Getting a committed group willing to help;
- Private-sector involvement;
- Securing funding (about one-quarter of respondents highlighted this item);
- Field demonstration, owing to the need for equipment, an operator, and good weather; and
- Interagency communications.

State DOTs were also asked if the successful project they reported on would be easy to replicate in another agency, with responses spanning the range from easiest to hardest with little consensus. The degree to which the technology transfer effort could be replicated had little relationship to the various technology transfer processes conducted during these efforts.

The state DOTs provided some insight into the elements that were easiest or hardest to replicate. Examples of the easiest to replicate are:

- Marketing efforts,
- Partnership with a transportation association,
- Mechanics of the training process,
- Cooperation among DOT sections,
- Arranging the workshop,
- Having training manuals and modules available on the DOT website, and
- Showing benefits through demonstration.

Examples of the hardest to replicate include:

- Finding a champion;
- Staffing for technology transfer;
- Policy and legislative changes;
- Tailoring the system to a state's specific needs;
- Finding resources, expertise, time, and funds;
- Overcoming opposition of contractors; and
- Technical expertise to sustain production.
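The benefit-cost and return-on-investment determinations that respondents cited as their most common quantitative evaluation methods can be sketched as a simple calculation. The sketch below is illustrative only; the dollar figures and the program it describes are hypothetical and do not come from the survey.

```python
def benefit_cost_ratio(total_benefits: float, total_costs: float) -> float:
    """Benefit-cost ratio: benefits divided by costs; a value above 1.0
    indicates the program returned more than it cost."""
    return total_benefits / total_costs


def return_on_investment(total_benefits: float, total_costs: float) -> float:
    """ROI as a fraction: net benefit relative to cost."""
    return (total_benefits - total_costs) / total_costs


if __name__ == "__main__":
    # Hypothetical example: a training program costing $40,000 that is
    # credited with avoiding $100,000 in maintenance rework.
    costs, benefits = 40_000.0, 100_000.0
    print(f"B/C ratio: {benefit_cost_ratio(benefits, costs):.2f}")  # 2.50
    print(f"ROI: {return_on_investment(benefits, costs):.0%}")      # 150%
```

In practice, the hard part an agency faces is not this arithmetic but attributing a defensible dollar value to the benefits, which is why the survey found such measures used mainly for program justification and accountability reporting.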