Suggested Citation:"Chapter 10 - Evaluate Progress." Transportation Research Board. 2014. Guide to Accelerating New Technology Adoption through Directed Technology Transfer. Washington, DC: The National Academies Press. doi: 10.17226/22342.


Chapter 10: Evaluate Progress

Tier 3: Evaluation and Decision Making Components
• Evaluate Progress
• Reach Deployment Decision

It is axiomatic in organization theory that performance measurement and feedback are essential to improvement (e.g., London, 2003). The importance of evaluation is emphasized throughout the T2 literature. For example, evaluating success is a core objective of FHWA's Highways for LIFE program (Bergeron, 2010), evaluating failures is noted as an organizational characteristic for successful diffusion and implementation (Desouza et al., 2009), and outcome measures are regarded as critical for process improvement (Hodges and Wotring, 2012).

Just as most T2 initiatives are complex, with many intersecting parts, evaluating T2 progress can be a multifaceted endeavor. Considerations in evaluating T2 progress are framed in terms of these questions: What should be evaluated? How should the evaluation be conducted? Who should have responsibility for the evaluation process? How should the evaluation information be used?

Considerations for Evaluating Progress

What Should Be Evaluated?

The Innovation Adoption Process serves as a guide to determining what to consider in evaluating T2 progress. Evaluation should focus on the components of the guided T2 phase of the Innovation Adoption Process, but it is advisable to also include evaluations of related Innovation Adoption Process phases, beginning with the need or problem, research on possible solutions and the choice of a technology to transfer, and deployment actions.

1. Has the need been documented? If yes, proceed to the next question. If no or unsure, proceed to the component discussion.
2. Has the evaluation coordinator been identified? If yes, proceed to the next question. If no or unsure, proceed to the component discussion.
3. Has the feasibility of the solution been identified? If yes, proceed to the next question. If no or unsure, proceed to the component discussion.
4. Have the components of the T2 process been evaluated? If yes, proceed to the next question. If no or unsure, proceed to the component discussion.
5. Has the deployment been evaluated? If yes, proceed to the next component. If no or unsure, proceed to the component discussion.

Here are some specific points to keep in mind:

Evaluate and document the need
• What is the need? Is it based on a particular problem, deficiency, or opportunity to be pursued?
• How was the need determined: through excessive costs, production problems, customer complaints, or something else?
• Who decided that the need is a priority now, and how was this determined?

Answers to these questions will help to plan and shape the T2 effort and the evaluation of any deployment that might follow, so documenting the answers from the outset is important.

Evaluate the feasibility of potential solutions
• What was done to evaluate potential internal solutions, and what were the findings of those evaluations?
• What was done to evaluate potential external solutions, and what were the findings of those evaluations?
• If original research was conducted to develop a solution, did this effort yield a product that could solve the problem? If so, is it a feasible solution for this organization?

If a decision is made to proceed with the T2 effort, answers to the questions above will be helpful in several respects, including planning specific T2 activities and informing decision makers and stakeholders. Key players should be aware of the steps leading to the T2 decision; compelling evaluation data will encourage them to be supportive of that decision. If problems or obstacles are encountered during T2, answers to the questions above may help to retrace steps and reevaluate prior decisions. If a decision is made not to proceed with the T2 effort and the search for a solution continues, documentation of answers to these questions will help avoid duplicating effort.

Evaluate components of the guided T2 effort
• Who is the champion? What role is the champion expected to play? Is the champion effective in this role?
• Who is (are) the decision maker(s)? Do they have timely information about the technology? How do they relate to other stakeholders?
• Who are the stakeholders? What are their perspectives on the need and the technology being transferred? What roles do they play in the T2 effort? What are their criteria for successful T2?
• Will a cost/benefit analysis be conducted to determine if resources were well spent?
• What communication and education activities will be conducted (demonstrations, showcases, technical assistance)? For what audience(s)? How will the effectiveness of these activities be measured?
• Are intellectual property issues being managed properly?

These questions should be revisited periodically throughout the guided T2 effort; negative answers could indicate the need to adjust or correct T2 activities to keep the initiative on track. Indeed, consideration of these questions should be specified in the T2 plan, detailing who should ask them, when, and how the information they yield should be used. To help with this, a checklist is provided in Appendix A for tracking progress through a guided T2 effort: you can check off items once they have been identified, defined, and/or addressed. The checklist includes each component most likely to be present in a guided T2 effort and can provide a solid foundation for a complete evaluation.
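As an illustration, the chapter's five gating questions (documented need, evaluation coordinator, solution feasibility, T2 process components, deployment) can be walked through in code. This is a minimal sketch: the question wording follows the guide, but the function name, the answers dictionary, and the yes/no representation are our own assumptions.

```python
# Sketch of the five gating questions for evaluating T2 progress.
# The questions come from the guide; the data structure and function
# name are illustrative, not part of the guide itself.

GATING_QUESTIONS = [
    "Has the need been documented?",
    "Has the evaluation coordinator been identified?",
    "Has the feasibility of the solution been identified?",
    "Have the components of the T2 process been evaluated?",
    "Has the deployment been evaluated?",
]

def first_open_item(answers):
    """Return the first question not yet answered 'yes' (i.e., the
    component discussion to turn to next); None if all questions pass."""
    for question in GATING_QUESTIONS:
        if answers.get(question) is not True:  # "no" or unsure
            return question
    return None

# Example: the need and coordinator are settled, feasibility is not.
answers = {
    "Has the need been documented?": True,
    "Has the evaluation coordinator been identified?": True,
    "Has the feasibility of the solution been identified?": False,
}
print(first_open_item(answers))
# -> Has the feasibility of the solution been identified?
```

Treating an unanswered question the same as a "no" mirrors the guide's "if no or unsure, proceed to the component discussion" instruction.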
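A progress tracker in the spirit of the Appendix A checklist might look like the following sketch. The component names are paraphrased from the discussion of the guided T2 effort; the class and its methods are illustrative assumptions, not part of the guide or its appendix.

```python
# Illustrative tracker modeled loosely on the Appendix A checklist.
# Component names paraphrase the guided T2 components discussed in the
# chapter; the class itself is our own sketch.

class T2Checklist:
    COMPONENTS = [
        "champion identified",
        "decision maker(s) identified",
        "stakeholders and their success criteria identified",
        "cost/benefit analysis planned",
        "communication and education activities planned",
        "intellectual property issues managed",
    ]

    def __init__(self):
        self.done = set()

    def check_off(self, item):
        """Mark an item as identified, defined, and/or addressed."""
        if item not in self.COMPONENTS:
            raise ValueError(f"unknown checklist item: {item!r}")
        self.done.add(item)

    def outstanding(self):
        """Items still needing attention, in checklist order."""
        return [c for c in self.COMPONENTS if c not in self.done]

checklist = T2Checklist()
checklist.check_off("champion identified")
print(checklist.outstanding())  # every component except the champion item
```

Revisiting `outstanding()` at the review points specified in the T2 plan corresponds to the periodic reconsideration the chapter recommends.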

Evaluate deployment
• Is there a deployment plan?
• How will deployment be evaluated?

If the outcome of the guided T2 effort is a decision to proceed with full-scale deployment of the technology, consideration of the questions above will help to prepare for its evaluation.

How Should the Evaluation Be Conducted?

Evaluation procedures take many forms and serve many purposes. They can be formal or informal, formative or summative, qualitative or quantitative; they can require original data collection or rely on existing records. Choices concerning how to evaluate T2 should be based on the questions the evaluation is supposed to answer and the decisions it is intended to inform (see the section titled "What Should Be Evaluated?").

An organization may already have in place many of the measures needed for T2 evaluation as part of its performance management and quality control practices. Meeting minutes, memos, email exchanges, and other correspondence document communications among key players (champions, decision makers, stakeholders). Reports and briefings prepared by individuals, working groups, and committees document major and minor decisions made as T2 progresses.

Of course, evaluation procedures may also require development of measures tailored to the particular requirements of a transfer initiative. If stakeholders constitute a large and diverse group, for example, a survey may be an efficient way to assess their perspectives regarding the problem that is driving the T2 initiative and the feasibility of potential solutions. Effectiveness of a demonstration, showcase, or other educational activity could be evaluated in terms of Kirkpatrick's (1998) criteria for training evaluation: reactions of participants concerning the perceived value of the activity, learning achieved or knowledge transferred, behavioral changes or skills developed through participation, and organizational results or outcomes attributable to the activity.
A formal T2 evaluation study may also sometimes be warranted, particularly for large-scale initiatives involving many stakeholders representing multiple units of one or more organizations. The focus of an evaluation study could be on the problem or need that initiated the search for a solution, on the formative aspects of the T2 effort, and/or on its summative outcomes, including deployment of the technology. A qualified researcher who is well versed in program evaluation methodologies should plan and guide the study, including formulation of the research question, design of the research protocol, selection of measures and oversight of the measurement process, analysis and interpretation of the data gathered, and reporting of the findings. The research report should inform champions and decision makers regarding choices such as adjustments that may be needed to foster a successful T2 initiative and whether to proceed with deployment of the technology.

Who Should Have Responsibility for the Evaluation Process?

The T2 plan should address all aspects of the evaluation process to keep it coordinated and efficient and to ensure that the information it yields is credible, relevant, and understood. The plan should assign responsibility for management of the process to someone with broad responsibility for the T2 initiative. Often this will be the champion, but it could be anyone with the requisite authority and performance management skills. In particular, this person must see to it that the relevant parties have the evaluation information they need at the right time to keep the T2 effort on track for ultimate success.

How Should the Evaluation Information Be Used?

Information yielded by evaluation of the T2 effort serves multiple purposes, including tracking progress toward goals, indicating where and when adjustments are needed, acknowledging and reinforcing contributions of participants, informing decision makers, documenting and communicating successes, and determining when the transfer process is complete and the deployment process can begin. The T2 plan should anticipate decision makers' and stakeholders' uses for evaluation information and map the information required to serve the purposes of each.

However "objective" an evaluation may be, it is always open to interpretation. The interpretations of decision makers and stakeholders will be colored by their individual perspectives on the need, the technology being transferred, and the organizational implications if the technology were to be fully deployed. This is both a reality and a strength of process evaluation, not a shortcoming or limitation. If the benefits of the technology are to be realized, they must be recognized and valued by all stakeholders. The champion (or other guided T2 effort manager) should provide stakeholders with timely information and help them reach a common understanding of it. Careful attention to collection, distribution, and interpretation of evaluation information helps an organization become more proficient at T2 and reinforces its innovative culture.

Infosys Develops Metrics to Evaluate Innovation Progress and Enhance Development

Developing appropriate metrics is important in enhancing innovation and, by extension, successful T2. Unique metrics must often be developed when novel approaches or technologies are being used or developed. For example, Infosys began as a small software company, experiencing early success providing software solutions to organizations. Leaders at Infosys, however, saw even greater opportunity in providing solutions outside of software. The organization took a significant risk and developed Infosys Consulting, a much broader approach to helping organizations solve their problems, but also a much more complicated one. The result was a tremendous leap in revenue, awards for innovation, and recognition as a leading innovator in the industry.

A keystone of their success in innovation was developing unique metrics and scorecards for their new approach to consulting. Rather than applying the metrics used in the original business, leaders sought to develop adaptive metrics aimed at assessing trends and providing useful feedback to managers. Decision makers realized that novel and unique metrics were a necessary part of the development process, and they were wise in their decision NOT to apply commonly used metrics from their original software business. In addition, leaders were willing to adapt and adjust evaluation methods as new information was gathered about the new consulting business. Original concepts and approaches are often in a state of near-constant revision; the metrics were fluid enough to move with changes, yet explicit enough to provide the information necessary to make additional requisite adjustments (Govindarajan and Trimble, 2010).

Suggested Readings

Bergeron, K. A., "Highways for LIFE." Public Roads, Vol. 73, No. 4 (2010), pp. 2–9.

Desouza, K., C. Dombrowski, Y. Awazu, P. Baloh, S. Papagari, S. Jha, and J. Kim, "Crafting Organizational Innovation Processes." Innovation: Management, Policy & Practice, Vol. 11, No. 1 (2009), pp. 6–33.

Hodges, K., and J. R. Wotring, "Outcomes Management: Incorporating and Sustaining Processes Critical to Using Outcome Data to Guide Practice Improvement." The Journal of Behavioral Health Services & Research, Vol. 39, No. 2 (2012), pp. 130–143.

Kirkpatrick, D., Evaluating Training Programs: The Four Levels, 2nd Edition, Berrett-Koehler Publishers, Inc., San Francisco, CA (1998).

London, M., Job Feedback: Giving, Seeking, and Using Feedback for Performance Improvement, 2nd Edition, Lawrence Erlbaum Associates, Publishers, Mahwah, NJ (2003).

McDavid, J. C., and L. R. L. Hawthorn, Program Evaluation and Performance Measurement: An Introduction to Practice, Sage Publications, Inc., Thousand Oaks, CA (2006).

Mertens, D. M., and A. T. Wilson, Program Evaluation Theory and Practice: A Comprehensive Guide, Guilford Press, New York, NY (2012).

Shadish, W. R., T. D. Cook, and D. T. Campbell, Experimental and Quasi-Experimental Designs for Generalized Causal Inference, Houghton Mifflin Company, Boston, MA (2002).

TRB’s National Cooperative Highway Research Program (NCHRP) Report 768: Guide to Accelerating New Technology Adoption through Directed Technology Transfer presents a framework and guidance on how to use technology transfer to accelerate innovation within a state department of transportation or other such agency.
