Best Practices from Evaluation Processes in the Federal Sector
As noted above, numerous federal agencies perform formal independent assessments of major programs and use well-defined procedures in the reviews.1,2 The agencies’ processes often start from a generic set of principles and procedures and then develop into a tailored process that reflects the unique nature of the organization or program being reviewed. Our purpose in reviewing the practices of other agencies is to identify useful guiding principles that could underpin a DoD-unique evaluation process.
Federal program evaluation processes are often based on a set of core principles that are consistent across agencies. A good example is NSF’s evaluation policy, which was updated in 2020.3 NSF notes that assessments can have different purposes, such as monitoring progress, guiding improvement efforts, or determining the effectiveness or efficiency of a program. The five evaluation principles in the policy, along with some relevant text from the policy sections, are reproduced here:
- Relevance and Utility: Address questions of importance, address information needs of the stakeholders.
- High Quality and Rigor: Provide credible evidence while balancing time and resources for obtaining it, use appropriate evaluation designs to address key questions, promote proper interpretation of findings.
- Independence and Objectivity: Evaluations strive for objectivity in planning, conducting, interpreting and disseminating findings, and evaluators shall be insulated from influence which might affect their objectivity.
- Transparency and Reproducibility: NSF promotes transparency in the planning, implementation, and reporting phases. To the extent possible, decisions regarding the specification of the evaluation (such as objectives, design and methods, and dissemination of findings) will be documented ahead of time.
- Ethics: Evaluations shall be planned and implemented to safeguard the dignity, rights, safety, and privacy of participants, stakeholders, and other affected entities, and be equitable, fair, just, and consider contextual factors which could influence findings or their interpretation.4
2 See National Science Foundation (NSF), 2020, Business Systems Review (BSR) Guide, NSF 21-046, October, https://www.nsf.gov/pubs/2021/nsf21046/nsf21046.pdf.
3 See NSF, 2020, “Evaluation Policy,” September, https://www.nsf.gov/od/oia/eac/PDFs/nsf_evaluation_policy_september_2020.pdf.
In addition to these core principles, some agencies have published detailed guidance to support the development and execution of evaluation processes. For instance, NSF has a guide to support major facility program reviews;5 likewise, the National Aeronautics and Space Administration (NASA) has the NASA Standing Review Board Handbook for mission reviews.6 These are broad guideline documents intended to cover a wide range of agency programs. Both documents assert that specific program reviews in their agencies need to be tailored to the unique attributes of the program being evaluated, drawing from the larger set of evaluation processes and related metrics. It is useful for DoD’s MII evaluation process to consider some of the common themes from these guides.
The first important consideration of the evaluation process is the selection of the evaluation team. The review team leads are selected based on their recognized leadership in the community and their relevant expertise in the technology area. The review team lead is also responsible for planning and scheduling review activities as well as for assuring consistent implementation of agency policies and processes in the evaluations. NASA suggests a separate review manager, who works with the review team lead, to address these additional responsibilities. The review team lead is responsible for developing a candidate list of team members. The evaluation process requires a diverse team of experts with knowledge of all of the essential elements of the program. Depth of knowledge in the technology area is important, but capabilities in management, programmatics, testing, or integration competencies might also be relevant. NASA suggests that having a review team that supports all of the formal evaluations of a program over its life has benefits in terms of continuity and familiarity with the program’s purpose and history. Also, a suggested best practice is to keep the team size to the minimum required to cover all important aspects of the evaluation.
Any evaluation process needs to be tailored for the specific characteristics of the program reviewed. NASA’s guide mentions the importance of factoring in program maturity in developing the review process. Also, both NASA and NSF’s guides stress the need for sensitivity to the cost and organizational burdens of the review process for the program being reviewed.
Each of the guides divides activities in the review broadly into pre-site visit, site visit, and post-site visit phases. The pre-site visit phase is where all of the information necessary to support the evaluation process is gathered. For both NSF and NASA, the pre-site visit activities start many months prior to the site visit. NSF’s guide mentions activities starting 4 months in advance; NASA’s begin earlier, 6 months before the site visit. The earliest activity in the NASA process is a planning session between the review team leads and program leadership, which sets the expectations for the data and information that will be required in the review process. NASA has formal data delivery requirements set at 60 and 20 days before the site visit.
A related activity is the development of and the agreement to the terms of reference (TOR) for the evaluation. The review team lead is responsible for drafting the TOR, which covers the nature, scope, schedule, and ground rules for the review. The team lead then works with the government convening authorities and the program leadership to collaboratively develop a TOR that meets the agency’s assessment expectations. NSF’s process also includes formal scheduling, planning, and scoping activities. As the site visit draws near, reviews are made of progress in assembling the supporting information and data needed for the formal evaluation, along with coordination meetings between the review team and the program leadership. The site visit is typically 1 to 3 days in length, depending on the size and complexity of the program. The review team lead is responsible for developing the agenda in concert with the program leadership. During the pre-site visit phase and at the site visit, the review team, both individually and as a group, performs an assessment of the program’s status against a set of core requirements or strategic criteria. The NSF guide provides a comprehensive catalog of assessment questions organized in focus area modules to support the activities of the review team. The team uses the set of modules and questions to tailor the assessment of the specific program. In the NASA process, the team develops an assessment of performance against a set of criteria and reports status as successful, partially successful, or unsuccessful.
The post-site visit activities include report generation and briefing schedules, which again are tailored for different types of programs. Interestingly, in NASA’s process, there is a requirement that the review lead provide a “Snapshot Report” 24 to 48 hours after the site visit. This one-page report summarizes the program review, presenting the review team’s findings and a discussion of key issues and risks.
Another useful source of insights for the DoD MII evaluation process is the National Institute of Standards and Technology (NIST)-led Manufacturing Extension Partnership (MEP) program.7 MEPs are public–private partnerships (PPPs) dedicated to supporting small- and medium-sized manufacturing companies. MEPs and MIIs share a focus on the development and growth of manufacturing ecosystems. The MEP program has been successfully providing support to industry for more than 30 years.

In 2018, the MEP Advisory Group created a Performance and Research Development Working Group, which focused on providing input and guidance to the MEP program on the performance measurement, management, and evaluation processes that support the MEP National Network. The objective of the activity was to improve center evaluation processes and approaches, to promote National Network learning, and to suggest additional data and analysis capabilities for the MEP centers. Some of the findings and observations of the group are of interest. First and foremost, the goal of the MEP evaluation process is to provide accountability to all of the program stakeholders and to demonstrate that the MEPs are making a meaningful difference to their clients and that they also have a broader economic impact. The working group noted that the MEP has had a consistent focus on measuring performance and impact, which has driven improvements in the program and in the centers. A key component of the evaluation is the NIST MEP Client Survey, which captures relevant information from industry, such as sales, cost savings, and hiring. Also, an IMPACT metrics report is produced that assesses impact and market penetration. A panel review process focuses on trends in center performance. This information allows center-to-center comparisons to be made and best practices to be identified.
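The center-to-center comparison described above can be illustrated with a small sketch. The survey field names, metric definitions, and figures below are hypothetical and do not represent the actual NIST MEP Client Survey schema; the sketch only shows the general roll-up-and-compare pattern.

```python
# Illustrative sketch: aggregate hypothetical client-survey records into
# per-center impact metrics so centers of different sizes can be compared.
# All field names and numbers are invented for illustration only.
from collections import defaultdict

def summarize_centers(survey_records):
    """Roll up client-reported outcomes by center."""
    totals = defaultdict(lambda: {"sales": 0.0, "cost_savings": 0.0,
                                  "jobs": 0, "clients": 0})
    for rec in survey_records:
        c = totals[rec["center"]]
        c["sales"] += rec["new_sales"]
        c["cost_savings"] += rec["cost_savings"]
        c["jobs"] += rec["jobs_created"]
        c["clients"] += 1
    # Normalize to per-client impact so large and small centers are comparable.
    return {
        center: {**vals, "sales_per_client": vals["sales"] / vals["clients"]}
        for center, vals in totals.items()
    }

records = [
    {"center": "A", "new_sales": 2.0e6, "cost_savings": 3.0e5, "jobs_created": 12},
    {"center": "A", "new_sales": 1.0e6, "cost_savings": 1.0e5, "jobs_created": 5},
    {"center": "B", "new_sales": 4.5e6, "cost_savings": 2.0e5, "jobs_created": 20},
]
summary = summarize_centers(records)
ranked = sorted(summary, key=lambda c: summary[c]["sales_per_client"], reverse=True)
```

Normalizing by client count is one design choice; a real panel review would weigh multiple metrics and their trends rather than a single ranking.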
There have been many studies over the 30-year life of the MEP program that have examined the effectiveness of these PPPs, improvements to the MEP National Network, and improvements to the MEP evaluation processes. Although the MEP program has been in existence for a long time, the working group suggested areas for research to further refine and improve the assessment processes used for the MEPs in the future.
Guidance to DoD MII Evaluation Process
Even though the DoD MIIs have significantly different goals than the agency programs described above, the DoD 5-year evaluation of the MIIs can benefit from insights and lessons learned provided by these guides and earlier studies. First, the selection of the review team lead is obviously critical. This individual is responsible for all of the planning, execution, and reporting steps in the evaluation and faces a substantial time commitment for the period of the evaluation process. Regarding the review team, the JDMC review team could benefit from members who have knowledge of the technical field, of the EWD needs in the field, of DoD’s needs in the technical field, of the ecosystem in the field, and of DoD’s management of similar major technology development programs. The Joint Defense Manufacturing Technology Panel (JDMTP) is an important source of manufacturing technology expertise in DoD and can support the MII reviews.
7 MEP Advisory Board Performance & Research Development Working Group, 2019, Performance Framework Final Report, https://www.nist.gov/system/files/documents/2019/05/08/report_wgperformanceframework_final.pdf.
Although across the set of DoD MIIs there are many organizational similarities, the individual MIIs have some unique characteristics. Also, the MIIs are at different levels of maturity (institute age). There will be a need to develop a tailored evaluation process for individual MIIs. Collaborative planning activities and the development of a TOR-like agreement among the JDMC team lead, OSD ManTech, and MII leadership will enable the development of a transparent evaluation process that balances DoD’s needs for a strategic assessment against the time and cost resource requirements placed on the MII. The information and data provided prior to the site visit would include OSD ManTech’s perspectives on specific attributes of the MII being evaluated and the MII-prepared responses to the relevant metrics-driven questions developed by OSD ManTech. As with DoD’s review of UARCs and FFRDCs, the results of customer satisfaction surveys need to be included in such material. Since the MIIs are PPPs, the customers include both department organizations that have sponsored work at the MII and industry members and other stakeholders from the ecosystem in the technology area. Finally, the assessment of MII performance would include consideration of responses to evaluation criteria questions in the strategic areas of importance defined for the DoD MII review process as the four evaluation questions. This topic is discussed further in the section “Strategic Evaluation Criteria for DoD MIIs” below.
At the site visit, the team would interact with MII leadership and institute members and receive detailed briefings on technology developments and transitions, EWD programs, ecosystem development, and the MII business plan’s status and direction. The team will meet to develop the set of findings and recommendations from the evaluation. The team lead can then provide a recommendation to OSD ManTech and OUSD R&E on the future of the MII. The reports developed by the review team will also be critical for the Department’s tracking of trends in performance of individual MIIs over time.
Finally, it should be noted that as the JDMC MII evaluation process starts in FY 2021, it is expected that an effort will be made to capture lessons learned from the process so that improvements can be incorporated as the 5-year evaluations progress over time. It can also be expected that refinements will be made to team structure, data and information sets, timelines for their delivery prior to the site visit, and reporting requirements as the process matures. A formal meta-evaluation of the first two MII reviews planned for 2021 would be useful to the JDMC. This would be a collaborative effort among OSD ManTech, the JDMC, and MII leadership, leading to an effective and efficient evaluation process. In addition, the repository of the 5-year evaluation results for all of the MIIs will enable the identification of lessons learned and best practices from individual MIIs that can be incorporated into future reviews and that can support general improvements within the set of MIIs.
The JDMC evaluation of the DoD Manufacturing USA institutes provides an opportunity to better ensure and document the value of the MIIs for DoD. The nature of the JDMC evaluation is different from the annual review of MII performance, in that the JDMC has an opportunity to assess the broader impact and value of the institutes in meeting DoD’s mission needs in advanced manufacturing technologies. The metrics previously developed by DoD are very valuable to the JDMC evaluation process. Although the four-question evaluation framework is appropriate to accomplish the strategic evaluation of the MIIs, there are a number of topics that should be considered to expand the scope of the questions to better reflect an evaluation process designed for the MIIs.
There are three chartering principles, as seen in Figure 3.1, which are common to all DoD MIIs:
- Perform advanced manufacturing research and scale-up,
- Develop and sustain a resilient domestic ecosystem, and
- Develop a sufficient, well-trained workforce.
The combination of these principles underscores the uniqueness of the MIIs, which necessitates the development of a similarly unique set of evaluation criteria. Assessments within these areas serve as the starting point for the evaluation.

There are a number of considerations that will require the JDMC to tailor the MII evaluation based on the specific institute being reviewed. DoD started nine MIIs in a wide variety of technical fields from 2012 to 2020. Within the set of MIIs, there are institutes that focus on a facility (or facilities) with advanced technology equipment and resident expertise (e.g., the Manufacturing Innovation Institute NextFlex), institutes that serve as convening bodies for expertise and capabilities distributed among the members (e.g., the Manufacturing Innovation Institute AM), and institutes that combine these characteristics. Over time, the institutes will evolve naturally from their initial, start-up phase to a more mature, execution phase. The technology readiness level (TRL) and manufacturing readiness level (MRL) of technologies will vary with the portfolio mix of breakthroughs, developments, applications, and new concepts from the pipeline of ideas coming to an institute. The number of transitions to DoD applications will also vary over time based on opportunities, viable solution candidates, and application funding. Demonstrations of expansion of the DoD supply chain or enhancements to supply-chain resilience would increase over time as well. The EWD needs of DoD will evolve as technologies mature and are transitioned to DoD systems.

As noted in the section “Evaluation Process for DoD MIIs” above, in anticipation of the MII evaluation, the JDMC team lead will need to work with OSD ManTech and MII leadership to develop a department assessment of the appropriate characteristics to be evaluated and the relevant metrics and discussion topics that can be applied to the specific MII.
As has been done by other agencies, DoD should have a broad set of evaluation topics within the framework that can be used to guide the design of an evaluation process for a particular MII. The evaluation questions, along with suggested additional critical topics for consideration in the JDMC evaluations, are shown below.
Question 1: Is There a Continuing Need for the MII?
This question concerns DoD’s assessment of the importance of the MII being evaluated in the context of DoD’s overall needs in advanced manufacturing technologies. MIIs have unique combinations of organizational attributes that provide advanced technologies, an ecosystem of importance to DoD and the nation, and a knowledgeable workforce in a specific manufacturing technology area. The programmatic focus of the MIIs is on advanced technology development and demonstration, with a concentration on programs at TRLs 4 to 7. MIIs typically engage in pre-competitive technology development projects and also perform technology demonstration projects. Other organizations within DoD focus on development of technologies up to TRL 4 and on transition to implementation at TRLs 7 to 9. The question of continuing need for the MII depends on both DoD’s long-term plans for the MII technology area and how the MII is integrated into DoD’s strategy for advanced technology implementation. As has been noted before, the MII assessment acknowledges that within the set of MIIs, there are technologies that are at different maturities and at different levels of penetration into DoD programs and systems. Furthermore, MIIs are at different points within the start-up and follow-on stages when the 5-year reviews are performed. Therefore, a long-term view of the continuing need is required.
In these strategic reviews, DoD assesses the MII in the context of emerging needs of DoD agencies and military services. The JDMC review process will need input from these organizations and DoD stakeholders. Relevant topics to examine would include the following:
- Does DoD leadership view the MII technical field as critical to the future of DoD? For example, does this MII cover any of the needs identified in the Industrial Capabilities Report (A&S) to Congress?
- Is there evidence that science and technology work in OSD and the agencies and services supports the need for technology maturation in an MII?
- For mature MIIs, is there evidence that the DoD Research, Development, Test and Evaluation (RDT&E), acquisition, and sustainment communities are engaged in the development, transition, and implementation of MII technologies?
- Does DoD see a need for an ecosystem to drive advanced manufacturing technology implementation activities (e.g., roadmapping, community convening, standards and regulations development, etc.)?
- Does DoD see a need for sustainment and growth of a workforce to support this technology area for DoD and the nation?
- Does DoD view the MII as essential so that DoD and the nation will benefit from the expansion of the ecosystem in this technology area (e.g., resilient supply chain, versatile production capability, globally competitive position in the technology area, etc.)?
- What is the DoD’s sunsetting strategy for the MII?
Question 2: Is an MII the Best Alternative?
With this question, DoD will assess whether the organizational characteristics of the MII still afford the best approach to achieving DoD’s goals in a specific technology field. The unique organizational characteristic of the MIIs is that they are industry-led, public–private partnerships, unlike other sources available to DoD. MIIs were chartered8 and organized for establishing and growing a manufacturing ecosystem, advancing research and technology, and securing human capital. These outcomes support OSD ManTech’s congressionally mandated mission to support the warfighter while also enhancing U.S. manufacturing base capabilities, expertise, and intellectual property. In assessing
8 As outlined in the MII Strategy and Assessment document MII Strategy, received on November 9, 2020, from ManTech.
whether continuation of the MII is the best way to achieve these outcomes, the JDMC is expected to assess the strengths and weaknesses of available alternatives in the MII field of discipline. Relevant topics to examine would include the following:
- Are alternative sources available in the MII technology area with capabilities for delivering the needed outcomes in technology maturation, EWD, and manufacturing ecosystem? Alternatives might include existing R&D consortia, contractors, FFRDCs, UARCs, national laboratories, other agency programs, etc.
- What are the strengths and weaknesses of alternative sources in comparison to those of the MII public–private partnership model that could be used to validate that the MII is still the best alternative?
- What evidence exists of continued “private” commitment to the public–private partnership to ensure MII viability for renewing the 5-year agreement?
We note that DoD has well-established best practices for analysis of alternatives in the acquisition of defense systems.9 A much more streamlined process is called for in the 5-year review of an MII, focused on the decision as to whether to continue an agreement that was considered the best alternative at the time of original award. The primary emphasis should be on updating the identification and qualitative assessment of alternative sources to include both those originally considered and new alternatives that may now be available to DoD. Standard qualitative techniques such as strengths, weaknesses, opportunities, and threats (SWOT) analysis may be appropriate. If the qualitative evaluation determines that a viable competing source may be equally effective in meeting DoD needs, a more detailed quantitative analysis of costs and benefits is required.
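A streamlined qualitative comparison of this kind could be tabulated as a simple weighted score, with a near-tie triggering the deeper quantitative cost-benefit analysis described above. The criteria, weights, ratings, and tie threshold below are hypothetical choices for illustration, not a DoD-prescribed method.

```python
# Illustrative sketch of a streamlined alternatives comparison: each candidate
# source is rated 1-5 against weighted criteria. A near-tie with the incumbent
# would warrant the more detailed quantitative analysis described in the text.
# Criteria, weights, ratings, and the 0.5-point threshold are all hypothetical.

CRITERIA_WEIGHTS = {          # weights sum to 1.0
    "technology_maturation": 0.35,
    "ewd_capability": 0.25,
    "ecosystem_convening": 0.25,
    "cost_to_dod": 0.15,
}

def weighted_score(ratings):
    return sum(CRITERIA_WEIGHTS[c] * r for c, r in ratings.items())

alternatives = {
    "MII (incumbent)": {"technology_maturation": 4, "ewd_capability": 5,
                        "ecosystem_convening": 5, "cost_to_dod": 3},
    "R&D consortium":  {"technology_maturation": 4, "ewd_capability": 2,
                        "ecosystem_convening": 3, "cost_to_dod": 4},
}
scores = {name: weighted_score(r) for name, r in alternatives.items()}
best = max(scores, key=scores.get)
# Flag near-ties that would trigger a detailed quantitative cost-benefit study.
needs_detailed_analysis = any(
    abs(scores[best] - s) < 0.5 for n, s in scores.items() if n != best
)
```

The point of the sketch is the decision structure, not the numbers: the qualitative screen is cheap to run, and only a genuinely competitive alternative forces the costlier quantitative analysis.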
Question 3: Has the MII Performed Well?
This question assesses the performance of the MII against DoD’s goals for the institutes. Consistent with DoD’s chartering principles for the MIIs, there are three components of the assessment: technology development, ecosystem development, and EWD. Since the formation of the institutes, DoD has had an active effort to develop and improve the metrics it uses to perform annual assessments of the MIIs. The metrics provided in the 2020 DoD Metrics List and Final Metrics List10 are of value to the 5-year assessment process in that they provide key inputs on trends in the MII. As a 5-year strategic evaluation, the evaluation needs to examine trajectories toward a desired steady state of the MII. In some cases, this will entail trend analysis of the data collected by DoD and the institute. In other cases, additional topics that relate to DoD mission needs and national stature, as well as future plans and directions of the institutes, need to be considered.
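The trend analysis of annual metrics mentioned above can be sketched as a least-squares slope over the 5-year window; a positive slope indicates a trajectory toward the desired steady state. The metric name and values below are hypothetical examples, not actual MII data.

```python
# Illustrative sketch: fit an ordinary least-squares line to five years of an
# annual MII metric to characterize its trajectory. Values are hypothetical.

def trend_slope(values):
    """OLS slope of the values against year index 0..n-1."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# e.g., hypothetical technology transitions per year over a 5-year agreement
transitions = [1, 2, 2, 4, 5]
slope = trend_slope(transitions)   # positive slope suggests an improving trajectory
improving = slope > 0
```

A single slope is of course a coarse summary; a real review would look at several metrics together and at whether the level, not just the direction, is approaching the target steady state.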
Advanced Manufacturing Ecosystem
9 DoD Instruction 5000.84, “Analysis of Alternatives,” August 4, 2020.
10 As provided by ManTech through the Excel books: “Final DoD MII Performance Metrics_cleared” and “2020 MII Complete Metrics List_cleared,” both received from ManTech on December 11, 2020.
11 Executive Office of the President, 2017, National Security Strategy of the United States of America, Washington, DC, p. 55.
12 Fiscal Year 2020 Industrial Capabilities Report to Congress, OSD A&S Industrial Policy, January 2021. https://media.defense.gov/2021/Jan/14/2002565311/-1/-1/0/FY20-INDUSTRIAL-CAPABILITIES-REPORT.PDF.
quantum and cybersecurity, all of which are vital to national defense.13,14 A strong manufacturing sector not only ensures a ready supply of defense and commercial goods and services, but also ensures the integrity and safety of these goods, such as electronics and control systems.
Advanced manufacturing encompasses all aspects of manufacturing, including the ability to quickly respond to customer needs through innovations in product design and production processes and innovations in the supply chain. As manufacturing advances, it is becoming increasingly complex and knowledge-intensive, relying on diverse data streams and partner networks to accelerate the pace of new products and services delivered.
Advanced manufacturing ecosystems can provide speed in technology development while enabling partner networks to co-innovate around common challenges.15 This ecosystem approach requires deliberate coordination among and between various parties to solve shared challenges and meet shared objectives. In the case of smart manufacturing initiatives, the Deloitte and MAPI Smart Manufacturing Ecosystem Study identified four primary ecosystems that support advanced manufacturing,16 as shown in Table 3.1.
Given the pace of change, maintaining vibrant and resilient manufacturing ecosystems is of paramount importance to U.S. national security and DoD. Hence, establishing and growing manufacturing ecosystems is one of the three chartering principles of Manufacturing USA MIIs.
Since DoD MIIs vary in the nature, maturity, and intended use of their focused technology, each MII and its stakeholders need to identify the shared challenges and shared opportunities that their ecosystem wishes to develop, grow, and maintain.
TABLE 3.1 Four Primary Advanced Manufacturing Ecosystems and the Focus Areas of Shared Objectives
| Advanced Manufacturing Ecosystem | Focus Areas of Shared Objectives |
| --- | --- |
| Supply Chain Ecosystem | Source raw materials, calibrate supply to demand, facilitate storage and distribution of finished product to customer. |
| Production Ecosystem | Make products that meet customer requirements, quality standards, and cost margins. |
| Customer Ecosystem | Connect and engage with customers; enable customers to order, maintain, and service products. |
| Talent Ecosystem | Create pipelines for skills and roles that are needed to support smart (advanced) manufacturing. |
15 Deloitte, 2020, “Accelerating Smart Manufacturing: The Value of an Ecosystem Approach,” October, https://www2.deloitte.com/us/en/insights/industry/manufacturing/accelerating-smart-manufacturing.html; Deloitte, 2019, “Manufacturing Goes Digital: Smart Factories Have the Potential to Spark Labor Productivity,” Deloitte and MAPI Smart Factory Study, https://www2.deloitte.com/us/en/insights/industry/manufacturing/driving-value-smartfactory-technologies.html.
As public–private partnerships, MIIs provide DoD with an opportunity to broaden and strengthen DoD’s supply chains for advanced technologies through the ecosystem. The character of the ecosystem that best suits the needs of DoD will depend on the specific technology field of the MII. Each MII therefore needs to describe the advanced manufacturing ecosystem that it envisions building, based on its vision of the needs of DoD and what will ultimately be acquired by DoD acquisition and sustainment from the manufacturing community. The MII also needs to explain its role in growing and improving the ecosystem and state why no other entity can play that role better than it can. The health of the ecosystem of the MIIs is therefore an important aspect of the value of the institutes to DoD. Over time, it is natural for members to join and depart from the MII ecosystem based on their own interests. However, it is important for DoD to monitor industry support for the MIIs in the relevant fields. Strategically, DoD needs to continue to assess the health and growth of the ecosystem over time.
Possible assessment activities for health and growth of the ecosystem are as follows:
- Assess the MII’s plan and progress in development and growth of an ecosystem for its technology area.
- Assess what programs are in place to enable development of a robust (1) supply chain ecosystem, (2) production ecosystem, (3) customer ecosystem, and (4) talent ecosystem.
- Assess the distribution and number of partners from (a) industry, (b) economic development, (c) workforce development, (d) education, (e) government, (f) professional organizations, and (g) philanthropy.
- Assess evidence of MII impact on the robustness and resilience of DoD supply chains (e.g., multiple sources for technology, technology companies linked to system providers or government supply chain, sufficiently trained workers, foundation provided in emerging technology by institutes).
- Assess if effective marketing and communications systems are in place to attract new members, as well as support and retain existing members.
- Examine whether Tier 1 industry members and other DoD original equipment manufacturers (OEMs) use the MII in platform system-specific manufacturing technology development activities.
- Assess how the MII has helped summarize and promote sharing of common manufacturing processes in the form of sharable SOPs and helped SMEs adopt these best practices.
- Assess how the MII has supported standards development.
- Assess evidence that the MII user facilities are supporting regional and national needs and providing value to the consortium members (e.g., access by small- and medium-sized enterprise (SME) companies typical of lower-tier DoD supply chain members).
- Assess the level of state and local entities participating in economic development, EWD, etc.
- Assess the health of the MII ecosystem:
  - Have new industry collaborations emerged from MII activities?
  - How many jobs are directly attributable to efforts of the MII?
  - Are there new start-up companies that have been enabled by the MII ecosystem?
  - For MIIs that have a facility base, are there examples of new start-up companies incubating at the MIIs?
  - For MIIs that have a facility base, what is the blend of small, medium, and large companies that have used the facilities (e.g., are the user organizations technology creators or adaptors)?
- Assess evidence (e.g., market survey, etc.) of MII impact on global competitiveness.
One important reason for DoD’s support of MIIs is that they offer effective delivery of manufacturing technologies that support DoD needs. The OSD ManTech metrics list contains the topics that document the contribution of the MIIs to technology development and transition. An additional topic relevant to MII performance is the efficiency of technology transition. Determining the rate-limiting steps in developing an advanced technology is critical to addressing acceleration and maximizing impact. The MIIs are responsible for finding innovative approaches to address technology development challenges.

During an MII evaluation process, a key question that can be addressed is, What is limiting the rate of maturation of the technology or a resilient ecosystem for this advanced manufacturing field? Are there technology gaps, a lack of standards, inadequate equipment and facilities, a lack of qualified suppliers, insufficient demand, an untrained workforce, etc.? Once identified, the evaluation can focus on the MII strategy for addressing these rate-limiting issues and assess whether adequate resources are appropriately prioritized and aligned to efficiently and effectively close these gaps. The 5-year evaluation would ideally assess the trends in the effectiveness of the MIIs’ technology development and implementation activities. Also, as the MIIs mature, it is to be expected that the degree of engagement with DoD’s acquisition and sustainment communities and opportunities for technology insertion onto DoD systems and platforms will increase.
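Identifying the rate-limiting step can be made concrete with a trivially small sketch: given estimated durations for each stage of a transition pipeline, the bottleneck is the longest stage. The stage names and durations below are hypothetical, chosen only to show the pattern.

```python
# Illustrative sketch: locate the rate-limiting step in a technology transition
# pipeline from hypothetical median stage durations (in months).
# Stage names and durations are invented for illustration only.

stage_durations = {
    "technology maturation (TRL 4-6)": 18,
    "standards and qualification": 30,
    "supplier qualification": 14,
    "insertion into DoD program": 24,
}

bottleneck = max(stage_durations, key=stage_durations.get)
total_months = sum(stage_durations.values())
# The evaluation would then examine whether MII strategy and resources are
# prioritized against this bottleneck stage.
```

In practice the bottleneck may not be a single serial stage; the same exercise still forces the review to state where time is actually being spent.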
Possible assessment activities include the following:
- Assess evidence of technology maturation, technology scale-up, technology transfer, technology implementation, and trends in the data.
- Assess evidence of technology development acceleration (the transitioning efficiency).
- Assess evidence that DoD RDT&E, acquisition, and sustainment communities are actively engaged in the technology (and EWD) road-mapping activities.
- Assess evidence of MII engagement in addressing standards development and regulatory issues in support of DoD and national needs.
- Assess the extent of engagement with the DoD RDT&E, acquisition, and sustainment communities, which supports acceptance of the MII as a preferred technology provider.
- Assess how the MII has helped summarize and promote sharing of common manufacturing processes in the form of sharable SOPs and helped SMEs adopt these best practices.
Education and Workforce Development
EWD has been a core mission of the MIIs from the outset, based on an understanding that new production technologies will not be adopted unless there is a workforce ready to apply them. Currently, MIIs collect institute-specific data on the numbers of people trained and reached through their EWD programs. While numerical data provide a useful metric for MII performance, the committee has conducted a detailed review of the kinds of EWD programs undertaken by MIIs and developed a list of “best practices” that various MIIs are pursuing. Although these best practices are less amenable to metrics, evidence that they are being applied is an important indicator of program quality. It is also indicative of the reach, both potential and ongoing, of the MII’s EWD programs in improving workforce education and skills in the MII’s technology area. While no single institute is undertaking all of the program elements delineated below, the ability to carry out a number of them with quality is an important indicator of program strength. Questions seeking evidence from the MII of the application of best practices appear below, with a more detailed explanation of the need for and significance of each question set out in Appendix C.
Possible questions for information gathering on EWD issues are set out below, with indicators likely to be developed earlier in the MII’s term listed before later indicators:
- Is the MII adequately staffed with personnel experienced in EWD practices?
- Has the MII dedicated adequate funding resources to support significant EWD programs, from both its core and outside funding?
- Has the MII developed and circulated, with industry and education institution involvement, a detailed set of knowledge, skill, and assessment (KSA) elements and corresponding competencies in its technology area?
- Is the MII engaged in both mapping skill demand and in developing skill roadmaps, both regionally and nationally, and using these to inform its programs?
- Has the MII evaluated available credential offerings in its technology area and then worked with industry to fill credential gaps by developing or applying industry-recognized credentials?
- Are the MII’s education materials being developed with and used by its education and industry ecosystem?
- Has the MII formed significant programs for regional engagements around EWD needs?
- Has the MII developed online education materials in its technology area available to industry and education institutions?
- Has the MII engaged effectively with DoD depots, arsenals, shipyards, and/or defense contractors in its EWD programs or through efforts that reach veterans?
- In collecting data on numbers that the MII educated and trained, is there information on whether those educated entered positions or performed work based on the training?
- Is the MII engaging with DoD acquisition and sustainment to identify and support internal DoD EWD needs and opportunities? How many programs have been executed? Have the programs successfully met DoD’s needs?
Current and former MII stakeholders should be consulted in the development of this information. Supplemental material relevant to and discussing these questions appears in Appendix C.
Question 4: Is the Governance and Management Effective?
Similar to Question 3, the performance of the MII in establishing and operating the institute has been reviewed annually by DoD, and a section of the 2020 DoD Metrics list is directed at operations performance. The data collected from the annual reviews, along with the trends in those data, will be very useful to the JDMC evaluation committee performing the 5-year assessment. In addition, the JDMC review can address the MII’s progress toward achieving DoD’s strategic goals for the MIIs, as well as DoD’s overall mission needs for manufacturing technology. Examining the adequacy of MII leadership, as well as the business plans and processes that support the achievement of these mission needs, is relevant to assessing the effectiveness of the MII’s governance and management.
Possible assessments for governance and management are as follows:
- Provide evidence of leadership and organizational effectiveness in advancing and transitioning technology, EWD, and manufacturing ecosystem enhancement and use.
- Assess performance relative to MII Business Plan; examine how the MII business plan has evolved as the MII matures.
- Assess the MII’s strategy for engagement with DoD RDT&E, acquisition, and sustainment communities.
TABLE 3.2 Summary of Goals and Expected Outcomes for Evaluation Criteria

| Question | Goal | Expected Outcome |
| --- | --- | --- |
| 1. Is there a continuing need for the MII? | Assess the importance of the MII in the context of DoD’s overall needs in advanced manufacturing technologies. | Long-term view of continuing need, in the context of emerging needs of DoD agencies and military services. |
| 2. Is an MII the best alternative? | Assess whether the organizational characteristics of the MII still afford the best approach to achieving DoD’s goals in a specific technology field. | Evaluate strengths and weaknesses of available alternatives in the MII field of discipline. |
| 3. Has the MII performed well? | Assess the performance of the MII against DoD’s goals for the institutes. | Examine trajectories toward a desired steady state of the MII for (i) technology development, (ii) ecosystem development, and (iii) EWD. |
| 4. Is the governance and management effective? | Assess the performance of the MII in establishing and operating the institute. | Review trends in the annual operations performance metrics, progress toward achieving DoD’s strategic goals, and DoD’s overall mission needs for manufacturing technology. |
- Assess the percentage of budget (revenue and costs) allocated to technology programs, EWD, outreach, and operations.
- Assess funding levels and trends from Intellectual Property (IP).
- Assess funding levels and trends from DoD customers.
- Assess funding levels and trends from commercial customers.
- Assess funding levels and trends from non-DoD federal agencies.
- Assess level, quality, and trends of funding or cost share from non-federal sources (states, foundations, etc.).
- Assess the user facility utilization rate and the nature of the use and of the user group (e.g., government, OEM, SME).
- Assess plans for DoD cyber compliance.
- Assess partner retention.
- Assess funding level for technology transition and implementation at member facilities.
- Assess the level of utilization of the MII by government laboratories, universities, and other industry groups.
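As a minimal sketch of how the budget-share assessment in the list above might be supported, the following computes each activity’s share of total costs. The category names and dollar amounts are illustrative assumptions, not data from any actual MII.

```python
# Hypothetical MII costs by activity area, in $K (illustrative values only).
budget = {
    "technology programs": 18_000,
    "EWD": 3_500,
    "outreach": 1_200,
    "operations": 4_300,
}

# Total cost and each activity's percentage share of the budget.
total = sum(budget.values())
for activity, cost in budget.items():
    print(f"{activity:20s} {cost:8,d}  {cost / total:6.1%}")
```

Tracked year over year, the same share computation would also expose shifts in emphasis, for example a growing fraction of spending on EWD relative to operations.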
The goals and expected outcomes of the MII key objectives (advancing research and technology, establishing and growing a manufacturing ecosystem, and securing human capital) in the four critical areas of the evaluation framework are summarized in Table 3.2.
All of the topics in the Strategic Evaluation Criteria for DoD MIIs section above represent qualitative metrics that are directly relevant to the three key strategic objectives for the MIIs (advancing research and technology, establishing and growing a manufacturing ecosystem, and securing human capital). In addition, the trend analysis mentioned above, which will be performed on the data that OSD ManTech and the institutes collect on the MIIs over time, will serve as a quantitative assessment of progress against these key objectives. These quantitative measures are described below.
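One way such a trend analysis could be implemented is to fit a least-squares slope to each annual metric series, yielding an average rate of change per year. The metric chosen and the values below are illustrative assumptions only, not actual MII data.

```python
def linear_trend(years, values):
    """Least-squares slope of a metric over time (metric units per year)."""
    n = len(years)
    mean_y = sum(years) / n
    mean_v = sum(values) / n
    num = sum((y - mean_y) * (v - mean_v) for y, v in zip(years, values))
    den = sum((y - mean_y) ** 2 for y in years)
    return num / den

# Illustrative annual series: completed technology transitions per year.
years = [2018, 2019, 2020, 2021, 2022]
transitions = [4, 6, 7, 9, 12]

slope = linear_trend(years, transitions)
print(f"Average change: {slope:+.1f} transitions per year")  # prints +1.9
```

A positive slope sustained across review cycles would indicate the kind of progress trajectory the 5-year evaluation is meant to capture; the same function applies unchanged to ecosystem and EWD metric series.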
Advancing Research and Technology
As has been noted, OSD ManTech and the MIIs already measure and collect data on a number of metrics important to technology development and operations. These include tracking the number of projects being executed, progress toward meeting technical objectives in a timely manner, and the number of projects started or completed over a given time period. The institutes also categorize these metrics in total, by non-DoD government entity, by DoD service, by state or local government, and by commercial entity. In addition to the metrics currently gathered, the committee suggests two strategic additions: MIIs should develop a process to track the demonstrated impact of completed projects on cost, lead time, and/or platform performance enhancements, and MIIs should track their impact on the roadmaps for the technologies they are tasked with and how these impacts fit within the larger community contributions. The additional metrics suggested for the research and technology objective are:
- Describe the process used to track demonstrated outcome impact of completed projects on cost, lead time, and/or platform performance enhancements.
- Assess impact on the roadmaps for the relevant MII manufacturing technologies and how these MII contributions interface with the larger community contributions.
Establishing and Growing a Manufacturing Ecosystem
OSD ManTech and the institutes measure and collect data on a number of quantitative metrics relevant to their manufacturing ecosystems. These include tracking the number of successful technology transitions, the number of patents filed and used, the number of members on funded projects, the level of member cash versus in-kind contributions, the level of ecosystem participation, the number of DoD subject matter experts engaged, and the number of states involved and the level of state contributions. For the long-term strategic evaluation, the committee suggests that the institutes define the advanced manufacturing ecosystem that the MII and its stakeholders envision serving the needs of their community. The institutes should identify the ecosystem programs and partnerships required to establish robust and resilient DoD supply chains and should track and report trendlines in (1) the number and growth of successful technology transitions into relevant DoD industrial bases, (2) the growth in the number or percentage of the IP portfolio that has contributed to the competitiveness of the ecosystem, (3) the number of MII programs focused on establishing robust and resilient DoD supply chains, and (4) the growth in the number and value of regional/state/local economic development programs and partnerships focused on nurturing, growing, and maintaining a vibrant and resilient ecosystem. The institutes should also address how the ecosystem metrics have progressed or changed over time, to put in context their impact on technical development, talent development, production capability, standards development, regulatory requirements, customer engagement, and overall operations. The additional metrics suggested for the ecosystem objective are:
- Identify the ecosystem programs and partnerships required to establish robust and resilient DoD supply chains, and track and report trends in (1) the number and growth of successful technology transitions into relevant DoD industrial bases, (2) the growth in the number or percentage of the IP portfolio that has contributed to the competitiveness of the ecosystem, (3) the number of MII programs focused on establishing robust and resilient DoD supply chains, and (4) the growth in the number and value of regional/state/local economic development programs and partnerships focused on nurturing, growing, and maintaining a vibrant and resilient ecosystem.
- Address how the ecosystem metrics have progressed or changed over time, to put in context their impact on technical development, talent development, production capability, standards development, regulatory requirements, customer engagement, and overall operations.
Securing Human Capital
The institutes are currently required to collect quantitative data on items such as the number of students participating in institute projects or in institute internship or training programs, the number completing institute-related certificates or apprenticeships, the number of teachers or trainers completing institute training programs, the number of EWD projects and their funding (distinguishing longer-term and shorter-term projects), funding to the institutes from outside sources for EWD, and the level and depth of learning experiences for the different EWD participants identified. In addition to the extensive qualitative metrics above, the committee suggests additional metrics on whether those trained completed their programs and entered positions or performed work based on the training. The MII should be asked to develop trendlines of these data for the MII’s period of operations, applied against the workforce education goals set out in the MII’s skill roadmaps, to indicate progress in meeting strategic, overall skills training demands in the MII’s manufacturing technology sector. The additional metric suggested for the securing human capital (EWD) objective is:
- Can the Institute provide evidence of trends of EWD metrics data for the MII’s period of operations, applied against workforce education goals set out in the MII’s skill roadmaps, to indicate progress in meeting strategic, overall skills training demands in the MII’s manufacturing technology sector?
When these additional assessment topics are taken into account, it becomes clear that additional sources of input for the MII 5-year evaluation will be needed beyond those shown in Figure 2.2. The committee believes that the JDMC 5-year review would be significantly improved if inputs from the sources shown on the left in Figure 3.2 are obtained. Any data collection efforts will need to be designed in advance to adequately capture and evaluate quantitative data. The additions to the JDMC evaluation process are the following:
- An initial terms of reference-like agreement that specifies the nature, scope, schedule, and ground rules for the evaluation process, which is tailored to reflect the specific MII being reviewed. Tailoring would take into account the unique characteristics of individual MIIs, such as a foundry-style MII or a networking-style MII, an emerging/revolutionary technology MII or an evolutionary technology MII.
- The results of the survey of non-DoD stakeholders, which will provide an assessment of the health of the MII ecosystem. Examples of non-DoD stakeholders could include the NSF and Small Business Innovation Research (SBIR) communities feeding technology ideas to the MII, the stakeholders requesting or paying for results from the MII, the recipients of EWD training, and those hiring EWD-trained employees.
- The results of the JDMC review team’s assessment of the MII’s performance relative to the key evaluation questions.
- The JDMC review team’s assessment of the information provided at the site visit.