3
National Naval Responsibility for Naval Engineering
Mission and Process for Achieving Goals

A robust naval engineering science and technology (S&T) enterprise that supports the needs of the current and future Navy must perform its core functions effectively and efficiently, consistent with its mission and the expectations of a high-reliability research organization (Pelz 1956; Roberts 1990). The core functions include

• Establishing the research agenda and allocating resources,
• Identifying performers,
• Measuring outcomes and evaluating results,
• Maintaining connections among the wider naval engineering community, and
• Developing the requisite human capital to sustain the nation’s naval engineering capability.

While effective performance of these functions is necessary, it is not sufficient for success in complex, dynamic research enterprises (NRC 1999; National Academies 2005). In addition, a high-performance organization such as the Office of Naval Research (ONR) must clearly articulate its mission and goals; measure and reward performance against those goals; incentivize and educate participants about desired organizational performance; and develop a robust continuous process improvement activity that assesses organizational performance; communicates best practices and lessons learned; provides for systematic dissemination of goals, activities, and achievements; and assesses organizational, group, and individual performance over time (Roberts 1990; Grabowski and Roberts 1999). These challenges are compounded for research organizations whose missions involve interdisciplinary research, such as naval engineering.
In such research organizations, measures and metrics of performance need to address the degree of integration and interdisciplinary activity required for mission success (National Academies 2005; Porter et al. 2006). This chapter describes the core functions of the National Naval Responsibility for Naval Engineering (NNR-NE) and examines the NNR-NE’s interdisciplinary and integrative science and technology efforts. It also examines how well ONR performs those core functions and how effectively it achieves successful outcomes.

ONR and its NNR-NE initiative have multiple processes and procedures in place that the committee believes are meant to support both the core and the integrative functions. For example, ONR has developed a Naval S&T Strategic Plan (ONR 2009b) that outlines the S&T vision and key objectives in 13 naval focus areas. ONR also tracks and reports on a variety of metrics, including the number of refereed papers that grow out of the projects it funds, the number of students it supports, and the number of advanced degrees completed by the individuals its funding supports. However, the committee sensed that these individual processes and procedures were not integrated into a cohesive whole that would align NNR-NE’s research agenda, resources, activities, and incentive structure with its goals or with measurable objectives and outcomes.

The following sections describe each of the NNR-NE core functions and how ONR’s processes support the NNR-NE mission. In addition, alternative methods to enhance organizational, individual, research, and educational performance are presented.

ESTABLISHING THE RESEARCH AGENDA AND ALLOCATING RESOURCES

As discussed in Chapter 1, naval engineering was designated a National Naval Responsibility in a 2001 ONR memorandum that specified the purpose of the designation and the activities that were to constitute the NNR-NE (ONR 2001). ONR was already engaged in all or nearly all of the specified activities before the memorandum was issued. Rather than initiating new programs, the memorandum served as a declaration of policy: assigning the NNR designation indicated that (a) the listed activities deserve special priority in planning and budgeting at ONR because the identified S&T fields are critical to the Navy and no one else will support them and (b) management of these activities must be coordinated with the declared policy objective in mind.

The 2001 ONR memorandum set out the broad outlines of the organization’s research agenda, envisioning an NNR-NE set of disciplines focused on the “development of educated and experienced people, expansion of the knowledge base, and cultivation of a climate supportive of innovation.” It also called on ONR to “formulate and maintain investments” in these science and technology areas: ship design tools, ship structural materials, hydromechanics, advanced hull designs, ship propulsion, ship automation, and systems integration (ONR 2001).

ONR has regrouped the NNR-NE S&T areas as follows:

• Ship design tools;
• Structural systems;
• Hydromechanics and hull design;
• Propulsors;
• Automation, control, and system integration; and
• Platform power and energy.

Another category of activities that ONR includes within the NNR-NE definition is the University Laboratory Initiative, which concentrates on developing the future workforce and sustaining the education infrastructure for naval engineering. In the current grouping, ONR has combined hydromechanics and hull design into a single area; renamed the ship propulsion area as propulsors; added the power and energy area; and grouped automation, control, and system integration into a single area. The committee’s analysis used the categories listed above.

The overall scope of the NNR-NE research agenda is shaped to a large extent by the size of the budget devoted to NNR-NE research projects. In FY 2009, the Navy devoted $44.1 million to basic and applied research within the NNR-NE domain (Table 3-1), 3.4 percent of the Navy’s total $1.3 billion budget for basic and applied research (DON 2010, v, vii). The memorandum that established the NNR-NE did not set a preferred level of funding or share of the ONR budget for activities to be carried out under the initiative.

The specifics of the research agenda are reflected in the projects that have been grouped under the NNR-NE technical areas.
In presentations to the committee, ONR delineated its research agenda within these categories for FY 2009 by using a combination of specific examples of funded projects and summary tables showing the number of projects and the level of funding in each of the technical areas. Data on funding trends for projects in each area are provided in Table 3-1.

TABLE 3-1 ONR Outlays for NNR-NE Basic and Applied Research, by Technical Area, FY 2006–2009

| Technical Area | 2006 ($M) | 2007 ($M) | 2008 ($M) | 2009 ($M) | Total, 4 Years ($M) | Avg. Annual Outlay per Project ($K) |
|---|---|---|---|---|---|---|
| Automation, control, and system integration | 2.2 | 2.8 | 2.0 | 3.2 | 10.2 | 232 |
| — Basic | 1.6 | 1.8 | 1.1 | 1.8 | 6.3 | 233 |
| — Applied | 0.6 | 1.0 | 0.8 | 1.4 | 3.9 | 231 |
| Ship design tools | 2.4 | 3.4 | 3.0 | 3.0 | 11.9 | 165 |
| — Basic | 2.4 | 3.4 | 3.0 | 3.0 | 11.9 | 165 |
| — Applied | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | n/a |
| Hydromechanics and hull design | 7.2 | 7.1 | 7.7 | 8.7 | 30.7 | 101 |
| — Basic | 4.8 | 5.5 | 5.5 | 5.4 | 21.2 | 94 |
| — Applied | 2.4 | 1.6 | 2.2 | 3.3 | 9.5 | 121 |
| Platform power and energy | 20.2 | 13.7 | 20.6 | 18.7 | 73.3 | 852 |
| — Basic | 1.4 | 1.3 | 1.4 | 1.9 | 6.0 | 136 |
| — Applied | 18.8 | 12.4 | 19.2 | 16.8 | 67.3 | 1,601 |
| Propulsors | 2.0 | 2.1 | 2.0 | 2.4 | 8.5 | 105 |
| — Basic | 0.8 | 0.8 | 0.9 | 1.0 | 3.5 | 82 |
| — Applied | 1.2 | 1.4 | 1.0 | 1.4 | 5.0 | 131 |
| Structural systems | 6.5 | 6.9 | 4.7 | 8.1 | 26.2 | 133 |
| — Basic | 4.1 | 3.7 | 3.7 | 3.5 | 15.0 | 106 |
| — Applied | 2.4 | 3.2 | 1.0 | 4.6 | 11.2 | 203 |
| Total | 40.6 | 36.1 | 40.0 | 44.1 | 160.8 | 205 |
| — Basic | 15.1 | 16.6 | 15.7 | 16.6 | 64.0 | 115 |
| — Applied | 25.5 | 19.6 | 24.3 | 27.5 | 96.8 | 421 |

SOURCE: Tabulations of ONR 331 basic and applied research projects provided to the committee by ONR.

How much money ONR devotes to each of the NNR-NE S&T categories each year is a crucial factor in setting the research agenda. The 2001 memorandum establishing the initiative called on ONR Code 33 to “formulate and maintain investments in [all] seven key S&T areas in naval engineering.”
The memorandum was silent, however, on how funds should be apportioned among the areas.

ONR’s 2009 project list within the NNR-NE categories shows an investment profile with a large number of projects in hydromechanics and hull design ($8.7 million in FY 2009, or 19.7 percent of NNR-NE basic and applied research) and structures ($8.1 million, or 18.4 percent) and few in propulsors ($2.4 million, or 5.4 percent); ship design tools ($3.0 million, or 6.8 percent); and automation, control, and system integration ($3.2 million, or 7.3 percent).

Much of the $73 million in 2006–2009 platform power and energy funding was the result of a short-term initiative. The Navy’s fiscal year 2011 research and development (R&D) budget estimate reports a decline in all Navy applied research [Budget Activity (BA) 2] spending for power and energy in 2010. Applied research funding for the budget category “surface ship and submarine hull mechanical and electrical (HM&E)” declined from $79 million in FY 2009 to $46 million in FY 2010 (DON 2010, 135). The budget estimate document states that “the funding decrease from FY 2009 to FY 2010 is due to the completion of the energy and power technology initiative that accelerated research in the following Energy and Power efforts: Distribution/Control and Alternative Energy efforts, Energy Storage and Power Generation efforts and the Medium Voltage Direct Current (MVDC) architecture efforts in support of the Next Generation Integrated Power System (NGIPS) Roadmap efforts,” as well as the transition of some projects from applied research to the advanced technology development (BA 3) stage (DON 2010, 136). The Energy and Power Technology Initiative was a 5-year program, begun in 2002 throughout the Department of Defense (DOD), to coordinate R&D on energy-efficiency technology improvements (Taylor et al. 2010).

ONR sees the development of a balanced portfolio as important: “Assessing the state of the health of Naval Engineering disciplines unique to the Navy is critical to ensure a balanced portfolio” (J. Pazik, briefing, Sept. 2009). That said, the annual share of NNR-NE-designated projects and funding that goes toward each of the technical areas depends on a variety of factors. The question becomes how ONR decides on the amount of money to allocate to each of those categories. Determinants include the success of ONR program officers in negotiating for projects in the NNR-NE technical areas for which they are responsible.
It is not clear that program officers and ONR managers have used the NNR-NE designation consistently as a determining factor in allocating funds to projects or in measuring the relative strength of the proposals submitted by offerors in response to ONR’s broad agency announcements (BAAs).

The difficulty of planning and evaluating a basic research program should not be minimized. Outcomes often develop over years, and many important breakthroughs are unplanned. In developing its research portfolio, ONR appears to attempt to maximize outcomes by relying on highly qualified managers with authority for program decisions, the tracking of short-term output indicators, feedback on the results of earlier efforts, advice from the technical community, and direction from Congress and the Navy. However, ONR does not appear to apply these informal processes explicitly to the NNR-NE as a coordinated program with specified objectives. (For example, program officers apparently do not consider whether an activity falls within the definition of the NNR-NE in making program decisions.) Furthermore, these informal processes do not match the requirements for monitoring and evaluation contained in the 2001 memorandum establishing the NNR-NE, which include monitoring of ONR’s traditional output metrics for the NNR-NE as a unified initiative, strategic planning of the NNR-NE, monitoring of the health of the S&T enterprise supporting naval engineering, and annual reporting and periodic external review of the NNR-NE.

Moreover, because the NNR-NE is coordinated by an office that lacks direct authority over funding and award decisions outside of Code 33, NNR-NE designation generally does not determine in advance what share of the projects or funding will go toward each category. In addition, as discussed in a later section, NNR-NE program officers strive to identify the projects within their portfolios that most merit funding, even though individual program officers’ portfolios may include S&T areas that fall outside the NNR-NE purview. However, the committee could not determine whether anyone assumed responsibility for integrating research across NNR-NE functional areas or across naval weapons platforms.

Achieving balance in a research portfolio is a desirable goal and has been achieved in a number of research settings by using techniques such as the balanced scorecard method, which balances four perspectives to integrate quantitative and qualitative performance measures (Kaplan and Norton 1992).
Studies evaluating the validity and strength of balanced scorecard methods have shown strong links between client or sponsor satisfaction and organizational performance, as well as between client satisfaction and economic variables such as client or sponsor retention, revenue, and revenue growth (Ittner and Larcker 1998a; Frigo and Krumwiede 2000).

Conclusion: The committee could not identify a process by which NNR-NE mission area needs and research strategies were prioritized. In addition, the committee could not identify any systematic process by which ONR research funds were allocated according to NNR-NE mission area needs or prioritized research strategies. Instead, it appears that NNR-NE program officers fund research projects and principal investigators as opportunities arise, without an enterprisewide evaluation process that prioritizes and evaluates research project merit in a consistent manner across the NNR.

Conclusion: The committee did not find evidence that NNR-NE is measuring or achieving balance in its research portfolio, despite its stated balance goal. The committee found no metrics to measure or establish balance in a research portfolio, leading to questions about how such a portfolio could be balanced or could demonstrate balance. (An illustrative sketch of one candidate balance metric follows the recommendation below.)

Recommendation: ONR should establish an enterprisewide strategic planning and assessment process to develop a strategic plan for NNR-NE, link the plan to guiding goals and objectives, communicate those goals and objectives clearly throughout the naval research community, and evaluate and incentivize NNR-NE performance against the strategic plan and objectives. The NNR-NE strategic planning and assessment process should encompass all facets of the NNR-NE mission. It should include a process for NNR-NE research fund allocation that is aligned with mission area needs and priorities, so that resource allocation decisions are guided by a transparent, enterprisewide evaluation process that prioritizes and evaluates research project merit in a consistent manner across the NNR.
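
Neither the committee nor ONR defines a quantitative balance measure, so the following fragment is only a minimal sketch of what one candidate metric could look like: a normalized Herfindahl-Hirschman concentration index (HHI) computed over the six NNR-NE technical areas from the FY 2009 outlays in Table 3-1. The choice of index and the use of outlay shares as the unit of balance are illustrative assumptions, not ONR practice.

```python
# Illustrative only: ONR publishes no balance metric for the NNR-NE
# portfolio. This sketch computes one candidate measure, a normalized
# Herfindahl-Hirschman index (HHI) over the six technical areas, from
# the FY 2009 outlays reported in Table 3-1.

FY2009_OUTLAYS_MILLIONS = {  # Table 3-1, FY 2009 column
    "Automation, control, and system integration": 3.2,
    "Ship design tools": 3.0,
    "Hydromechanics and hull design": 8.7,
    "Platform power and energy": 18.7,
    "Propulsors": 2.4,
    "Structural systems": 8.1,
}

def normalized_hhi(outlays):
    """Concentration on [0, 1]: 0 = perfectly even split, 1 = one area."""
    total = sum(outlays.values())
    hhi = sum((v / total) ** 2 for v in outlays.values())
    n = len(outlays)
    return (hhi - 1 / n) / (1 - 1 / n)

total = sum(FY2009_OUTLAYS_MILLIONS.values())
for area, outlay in FY2009_OUTLAYS_MILLIONS.items():
    print(f"{area:45s} {outlay / total:6.1%}")
print(f"Normalized HHI: {normalized_hhi(FY2009_OUTLAYS_MILLIONS):.3f}")  # ~0.118
```

For these data the index is roughly 0.12 on a 0-to-1 scale, pulled upward mainly by the platform power and energy share; whether that level constitutes imbalance is precisely the kind of judgment that explicit, agreed-on metrics would allow ONR to make and defend.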

Recommendation: ONR should identify, utilize, and periodically reassess metrics to measure NNR-NE portfolio balance, in line with ONR’s stated goals and articulated mission needs. Once established, these metrics should be incorporated into an enterprisewide assessment and continuous process improvement program, as described in subsequent sections of this chapter.

IDENTIFYING PERFORMERS

ONR generally makes its research awards in response to BAAs.[1] A consolidated annual BAA pulls together instructions to potential research performers for submitting award requests for a large share of ONR’s projects, including those related to the NNR-NE. Most such awards are solicited through that consolidated BAA. For example, ONR released ONR BAA 10-001, Long Range BAA for Navy and Marine Corps Science and Technology, on September 18, 2009, with the expectation that it would remain open for 1 year. Proposals can be submitted at any time during the year (ONR 2009a).

Naval engineering research performers in the private sector include universities and industrial firms.[2] Research within the University Laboratory Initiative is conducted by universities. In allocating projects among university and industry performers, ONR relies heavily on its program officers’ assessments of research merit, relevance to Navy missions, the value of sustaining long-term relationships with productive principal investigators, and the need to develop promising new principal investigators. ONR reported to the committee that program officers are mindful of the need to balance the long-term value of continued investment in ongoing research against research breakthrough opportunities and shorter-term needs for research transitions in a constrained funding environment.

[1] ONR occasionally uses requests for information and requests for proposals to solicit research offerings. For example, Solicitation No. N00014-10-0001 requests proposals for a contractor to operate the Navy Metalworking Center and conduct research on technical projects related to metalworking. ONR also makes use of other instruments for support contracts.
[2] Basic research (Budget Activity 6.1) and applied research (Budget Activity 6.2) awards are usually provided as grants to universities and as contracts to industry. Advanced technology development (Budget Activity 6.3) is usually performed under contracts. See ONR 2009a, 3.

Federally funded R&D centers, such as RAND, the MITRE Corporation, and the Department of Energy’s national laboratories, are not eligible to receive awards under ONR’s consolidated BAA, although they may team with eligible partners. DOD laboratories, including the Navy’s own laboratories and warfare centers, are also precluded from bidding directly.

ONR publishes on its website a list of technology areas in which it is interested, together with the names of and contact information for the program officers who handle those areas. The BAA urges offerors to contact the program officer whose technology portfolio best matches their fields of interest before they develop their proposals. Program officers are responsible for evaluating the technical proposals submitted in their technical areas. As stipulated by the BAA, award decisions are “based on a competitive selection of proposals resulting from a scientific and cost review.” Box 3-1 lists the criteria used to evaluate proposals submitted under the 2010 BAA.

BOX 3-1 Evaluation Criteria for ONR’s 2010 BAA

1. Overall scientific and technical merits of the proposal;
2. Potential Naval relevance and contributions of the effort to the agency’s specific mission;
3. The offerors’ capabilities, related experience, facilities, techniques, or unique combinations of these, which are integral factors for achieving the proposal objectives;
4. The qualifications, capabilities, and experience of the proposed principal investigator, team leader, and key personnel who are critical to achieving the proposal objectives; and
5. The realism of the proposed costs and availability of funds.

SOURCE: ONR 2009a, 21.

The BAA indicates that Factors 1 through 3 (the technical factors) are of equal weight and that those technical factors are significantly more important than Factor 5, cost realism.
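
The BAA thus specifies an ordering rather than numbers: Factors 1 through 3 weigh equally and dominate, and cost realism matters least (the weight of Factor 4 is left unstated). As a minimal sketch of how an evaluator’s scoring sheet might encode that ordering, the fragment below assigns illustrative weights and a 1-to-5 rating scale; the specific weights, the scale, and the proposal name are assumptions for illustration and are not part of the BAA, which calls for a qualitative scientific and cost review rather than a numeric formula.

```python
# Hypothetical scoring sheet for the Box 3-1 criteria. The weights are
# one arbitrary assignment consistent with the BAA's stated ordering:
# Factors 1-3 equal and dominant, Factor 5 (cost realism) least.
from dataclasses import dataclass, field

WEIGHTS = {
    "technical_merit": 0.25,     # Factor 1
    "naval_relevance": 0.25,     # Factor 2
    "offeror_capability": 0.25,  # Factor 3
    "key_personnel": 0.15,       # Factor 4 (weight unspecified in the BAA)
    "cost_realism": 0.10,        # Factor 5, least important
}

@dataclass
class Proposal:
    title: str
    ratings: dict = field(default_factory=dict)  # criterion -> 1-5 rating

def weighted_score(proposal: Proposal) -> float:
    """Combine per-criterion ratings using the assumed weights."""
    return sum(w * proposal.ratings[c] for c, w in WEIGHTS.items())

offer = Proposal(
    title="Hull-form drag reduction study",  # hypothetical example
    ratings={"technical_merit": 5, "naval_relevance": 4,
             "offeror_capability": 4, "key_personnel": 4,
             "cost_realism": 3},
)
print(f"{offer.title}: {weighted_score(offer):.2f} of 5.00")
```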

One of the key inputs to the NNR-NE R&D process is knowledge of Navy needs and mission areas. ONR program officers often work as intermediaries among Navy laboratories, the academic and industrial research community, and other stakeholders. Such an integrative role is critical to the success of the NNR-NE initiative. The committee noted, however, that links from designated NNR-NE projects to the operational Navy community were not well articulated, nor could the committee identify a systematic mechanism for communicating Navy operational needs to the program officers managing these projects. The committee concludes that no formal process exists within ONR for regular review of Navy mission needs relevant to its S&T planning for new projects with NNR-NE designation or for determination of allocation plans for funding to performer organizations.

ONR’s performer evaluation process, including that for its NNR-NE portfolio, differs from that of some other government research sponsors in that it does not include evaluation of basic research proposals by external peer reviewers. External review of proposals can be a valuable tool for government agencies that fund basic research, whose impact on future capabilities and systems may not become apparent for decades. Organizations that use external scientific peer review for most or all of the basic research they fund include the National Science Foundation (NSF), the National Institutes of Health (NIH), and the Office of Research and Evaluation of the National Institute of Justice (DHS 2009). Within DOD, the Air Force Office of Scientific Research employs a peer review process using review panels that typically include two reviewers from other DOD offices and one from outside DOD (Sharp 2007).

In contrast, the Defense Advanced Research Projects Agency (DARPA) generally does not bring external experts into its evaluation process. Instead, it relies on its cadre of program officers, who typically rotate into the organization from positions outside government and serve at DARPA for only a few years, thus ensuring a fresh flow of expertise and perspective. The committee understands that in recent years ONR’s program officers have stayed for substantially longer periods.

Supporters of ONR’s proposal evaluation process argue that the community of scientists with relevant expertise, particularly in the naval engineering fields, is small, making it hard to find outside technical experts to serve as external peer reviewers.
They might also point out that this committee’s assessment constitutes an external peer review of the NNR-NE’s overall program and thus serves as an implicit review of the award choices made by NNR-NE program officers.

However, the committee found sound reasons to consider bringing external scientific peers into the evaluation of proposals. Bringing outside experts into the proposal evaluation process can help an organization sustain competition and avoid parochialism. It can also help build a cohort of outsiders familiar with and interested in the particular areas of research. In the NNR-NE’s case, bringing experts from other government organizations into the proposal review process might help forge and strengthen the partnerships that ONR aspires to develop.

Observers have found that external assessments like the one conducted by this committee can be useful in helping government research organizations improve the merit and relevance of the research they fund and develop plans for the future (Lyons and Chait 2009). Because such reviews are aimed at the organizational level, however, they lack the immediate impact on funded projects of external scientific reviews of proposals. The Navy Warfare Centers use peer-review evaluation, with external reviewers encouraged, for proposal selection in certain programs. Box 3-2 describes examples of the use of peer review of project proposals by research organizations within DOD and at other federal agencies. Box 3-3 summarizes the conclusions of a 2002 National Research Council (NRC) study of approaches to organizing cooperative research on naval engineering, conducted at ONR’s request, concerning the value of peer review in the research programs it examined as models.

ONR leadership has formed a similar opinion about the merits of external review in the monitoring of projects that have already been selected for funding and is establishing a peer-review process. The process described to the committee involves assembling a panel of three to five external technical experts who review a project’s progress in the second (and potentially third) year of execution. The objective of these panels is to assess the efficacy of the ongoing project and to make recommendations to the program officer for continuation or termination. Unfortunately, this process does not appear to achieve all of the benefits that can accrue through early participation of peers in project selection.

… Navy laboratories and academic research institutions and in operational Navy settings.

INTEGRATING NAVAL ENGINEERING S&T

As discussed earlier, the committee suggests that ONR needs to take additional steps to enhance its organizational and management practices in setting performance goals and evaluating results. This is especially critical for research organizations such as ONR with significant multidisciplinary programs and related challenges.

The committee found several examples of interdisciplinary and integrative research in the NNR-NE portfolio. In its commissioned papers and workshops, the committee found additional evidence of integrative and interdisciplinary naval engineering projects, such as the integrated composite mast (Hackett 2010), and it found a number of materials, hydrodynamics, and ship structures programs. However, the committee concluded that these projects resulted from the efforts of individual program officers or industry representatives who, for personal or professional reasons, engaged in interdisciplinary research and played a key role in developing such programs, rather than growing out of systematic ONR processes that fostered interdisciplinary or integrative research.

Recommendation: As part of its enterprisewide strategic planning process, ONR should establish a culture of interdisciplinary and integrative research within and around the NNR-NE S&T enterprise and should establish processes that foster, encourage, and incentivize interdisciplinary and integrative research. The NNR-NE interdisciplinary and integrative research objectives should be established as part of the strategic planning processes and should include assessment, benchmarking, and continuous process improvement components.

DEVELOPING HUMAN CAPITAL AND REVITALIZING NAVAL SHIP SYSTEMS ENGINEERING

The 1990s were a period of great change within DOD and the Department of the Navy, precipitated by the collapse of the Soviet Union, the end of the cold war, and the desire to capitalize on the so-called “peace dividend.”
One result was a substantial downsizing of the Navy organizations previously responsible for ship design and acquisition, accompanied by the outsourcing of these services to industry. According to the General Accounting Office, “DoD performed this downsizing [from 1989 to 2002] without proactively shaping the civilian workforce to ensure that it had the specific skills and competencies needed to accomplish future DoD missions” (GAO 2004, 7).

During that decade, the Department of the Navy in general, and NAVSEA in particular, saw a deep reduction in the human capital required to design, develop, acquire, deploy, and maintain the naval fleet. NAVSEA headquarters alone saw its cadre of highly experienced naval ship design engineers shrink from about 1,200 in 1992 to fewer than 300 in 2005 (Keane et al. 2009, 47). Concerns about the naval acquisition workforce were articulated by then Secretary of the Navy Donald Winter in a 2007 speech before the Navy League: “There has been a steady erosion in domain knowledge within the Department of the Navy over the past several decades, resulting in an overreliance on contractors in the performance of core in-house functions” (Winter 2007).

Secretary Winter went on to say that while “the Department’s level of technical expertise associated with naval architecture and design is relatively high, our knowledge of the shipbuilding process is short of what it has been in the past, and what it needs to be in the future. Our challenge is to understand how to integrate design and production technology into an acquisition process that industry can execute. This requires a deep knowledge of systems engineering and a profound understanding of the acquisition process. Systems engineering is key to ensuring that each ship is configured to optimize the fleet” (Winter 2007).

Secretary Winter also discussed the steps necessary to correct the deficiencies in naval ship acquisition, and in the workforce in particular, saying that “the Navy needs to provide knowledgeable program oversight. Hiring top-quality people who have experience with large shipbuilding programs is essential. The ability to assign an experienced and capable team must be a precondition to a program’s initiation. Finding and developing the people we need is easier said than done, and it will take time to rectify this problem, but we cannot ignore the leverage that can be obtained by putting the right, experienced and prepared people in the right positions” (Winter 2007).

The need to develop the requisite human capital and to revitalize naval ship systems engineering has been clearly recognized by the Navy leadership as a key goal. Today, efforts exist not only to protect and maintain the mission-critical competency areas but also to develop them for the present and the future. The development and monitoring of the health of naval engineering human capital have been actively pursued within NAVSEA with tools such as the Human Capital Digital Dashboard (Tropiano 2005), which provides an objective assessment of the following:

• Alignment of engineers with the technical authority chain of command;
• Availability and adequacy of technical documentation, including specifications, standards, tools, and processes;
• Workforce demographics, including age and levels of education;
• Workforce skills, including experience, certifications, and other special abilities;
• Workforce health metrics, including assessments of leadership skills, mission capability, and technical documentation;
• Problem areas, such as critical vacancies, anticipated retirements, and substandard assessments; and
• Long-term health actions in these areas.

Developing the Navy’s next generation of naval engineering leaders is a challenging problem. During the 1990s, as a result of changes in acquisition policy, preliminary and contract design for Navy ships, which NAVSEA had previously performed in-house, began to be contracted out to shipbuilders. In addition, the rate of new ship acquisition declined in this period compared with that of the previous decade. The contraction of the NAVSEA headquarters ship design staff noted above was a consequence.

This problem is being addressed on several fronts. One initiative was the creation of the Center for Innovation in Ship Design (CISD) in 2002 by NAVSEA, ONR, and the Naval Surface Warfare Center (NSWC). CISD was tasked in 2006 “to develop a Human Capital Strategy (HCS) for Ship Design Acquisition Workforce Improvement. The Ship Design Management HCS will ensure a highly experienced warship design workforce to sustain NAVSEA as the nation’s leader in naval ship design” (Keane et al. 2009, 46). The committee noted that this focused program has in large part sustained the core competencies that are essential to rebuilding the naval ship systems engineering and acquisition workforce.

The need to train, develop, and continuously refresh the naval ship systems engineering workforce and technology base was articulated in previous studies (NRC 2000; TRB 2002; U.S. Department of Commerce 2001). It was widely discussed in the naval engineering professional journals (ASNE 1992) and in academic settings (Chryssostomidis et al. 2000). These writings served to identify the “failure of government and industry research and development (R&D) organizations to stimulate the education, innovation, and competitiveness improvements needed to support the U.S. shipbuilding industry. These reports highlight the significant role the Department of Defense must play in leading the R&D investment stimulus for the cooperative development of innovative, cost and labor saving technologies by the U.S. shipbuilding industry and the supporting academic institutions. Additionally, each subsequent report has continued to identify the areas of education, innovation and competitiveness as problematic in the U.S. shipbuilding industry” (ONR 2001, 1–2).

The naval engineering human capital pipeline is illustrated in Figure 3-1. The pipeline begins with the kindergarten through 12th grade (K-12) pool of students in science, technology, engineering, and mathematics (STEM). Those high school students who enter universities and colleges and graduate with a bachelor of science degree will enter the general engineering workforce in the tens of thousands annually, while thousands will continue on for advanced degrees. Each year, no more than a few thousand new graduates (and in some years probably fewer than a thousand) at all degree levels will enter the naval engineering enterprise workforce. Of those graduates who do enter naval engineering, a small number each year will leave the workforce to pursue a higher degree, motivated by their experience in naval engineering. A select few will stay on in academia to educate the next generation of naval engineers. The ever-present demand signal for graduates is driven by the natural progression of scientists and engineers through their careers and their eventual attrition from the workforce through either a career change or retirement. Supporting this pipeline for the development of naval engineers is the infrastructure of primary and secondary schools, colleges and universities, government research activities, private-sector research institutions, and university research centers.

FIGURE 3-1 Naval engineering value stream and pipeline. The diagram traces research funding from ONR, NAVSEA, and industry to faculty, research projects, and students; K-12 STEM students feed undergraduate programs, whose tens of thousands of BS graduates enter the general workforce, with thousands continuing to master’s and PhD study; progressively smaller numbers (thousands, hundreds, tens) enter the naval enterprise at each degree level, some leave naval engineering or return to school, and a few remain in academia to teach. (SOURCE: National Shipbuilding Research Program 2009. Printed with permission from the National Shipbuilding Research Program, from the Shipbuilding Engineering Education Consortium Viability and Operational Concepts Final Report, June 16, 2009.)
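
The committee’s observation in the next paragraph, that the study’s hiring-demand estimate appears inconsistent with its employment estimate, can be checked with simple steady-state arithmetic: a stable workforce must hire roughly its size divided by the average career length each year. The sketch below runs that calculation for a range of assumed career lengths; the career-length values are illustrative assumptions, and only the two workforce figures come from the cited study.

```python
# Back-of-the-envelope pipeline arithmetic. Only NAVAL_ENGINEERS and
# ANNUAL_GRADUATES come from the National Shipbuilding Research Program
# (2009) study cited in the text; career lengths are assumptions.

NAVAL_ENGINEERS = 15_000  # estimated sector employment (NSRP 2009, 21)
ANNUAL_GRADUATES = 200    # accredited naval engineering graduates/yr (NSRP 2009, 30)

def replacement_demand(workforce: int, career_years: float) -> float:
    """Annual hires needed to hold a stable workforce constant."""
    return workforce / career_years

for career in (25, 30, 35, 40):  # assumed average career lengths, years
    demand = replacement_demand(NAVAL_ENGINEERS, career)
    print(f"{career}-year careers: ~{demand:,.0f} hires/yr; "
          f"shortfall vs. graduates: ~{demand - ANNUAL_GRADUATES:,.0f}")
```

Replacement demand on the order of 400 to 600 per year is far below the study’s 2,000-per-year estimate (which would imply either rapid growth or very short careers) but still several times the roughly 200 accredited graduates produced annually, so the qualitative conclusion of a supply gap holds under either reading.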

The National Shipbuilding Research Program’s 2009 study of the naval engineering workforce, conducted for NAVSEA, concluded that demand for hiring of entry-level naval engineers by NAVSEA, U.S. shipbuilders, and the supporting industries is about 2,000 per year, while graduates of accredited naval engineering programs total only about 200 annually (National Shipbuilding Research Program 2009, 30). This demand estimate appears inconsistent with the report’s estimate of total employment of naval engineers in these sectors of 15,000 (National Shipbuilding Research Program 2009, 21), as the sketch above suggests. Any excess of demand over supply must be filled by hiring and training engineers from other specializations.

Developing a robust naval engineering pipeline is critical to the development of a robust naval engineering enterprise. To date, NNR-NE efforts in naval engineering S&T workforce development have been sporadic and inadequately supported. ONR has been designated the lead agency for STEM efforts for the Department of the Navy; however, these responsibilities are treated as an ancillary rather than a core functional responsibility.

Outreach programs have been successful in reaching students and creating interest in STEM education and in potential naval and maritime careers. ONR supports SNAME efforts to deploy the SeaPerch program nationally and to develop ways to expand and promote it as part of ONR’s NNR-NE outreach. Professional technical societies such as SNAME and ASNE appear to be well positioned to provide leadership and support for these outreach initiatives. However, the societies’ ability to perform this outreach is limited by their modest numbers of volunteers and funding for professional staff relative to the broad K-12 population.

Recommendation: ONR should reinvigorate its efforts to develop the 21st century naval engineering workforce, including improving outreach to underrepresented groups. ONR’s lead role for STEM activities should be strengthened and incorporated into its enterprisewide strategic planning processes, and performance metrics for workforce development and STEM achievements should be identified, measured, incentivized, and included in ONR’s assessment, benchmarking, and continuous process improvement activities.

ONR should consider additional approaches to increase the efficacy of the workforce development and STEM initiatives, including the following:

• Targeting specific populations in a geographic region with a professional connection to naval engineering activities (e.g., a local naval architecture university, shipbuilder, or naval facility);
• Expanding funding and volunteer support for outreach programs through collaborative efforts among government activities, industry, and professional societies (e.g., the Junior Engineering Technical Society); and
• Leveraging NAVSEA funding under the Naval Engineering Education Center Consortium to support SeaPerch and other initiatives.

REFERENCES

Abbreviations

AFOSR  Air Force Office of Scientific Research
ASNE   American Society of Naval Engineers
DHS    Department of Homeland Security
DON    Department of the Navy
GAO    General Accounting Office or Government Accountability Office
NRC    National Research Council
NSF    National Science Foundation
ONR    Office of Naval Research
TRB    Transportation Research Board

AFOSR. 2007. Proposer’s Guide to the AFOSR Research Programs. Aug.
ASNE. 1992. Preserving Our Naval Engineering Capability. Naval Engineers Journal, Vol. 104, No. 4, July, pp. 11–13. Revised May 1998.
Bond, T. C. 1999. The Role of Performance Measurement in Continuous Improvement. International Journal of Operations and Production Management, Vol. 19, No. 2, pp. 1318–1334.
Brown, M. 1996. Keeping Score: Using the Right Metrics for World Class Performance. American Management Association, Washington, D.C.
Bukowitz, W., and G. Petrash. 1997. Visualizing, Measuring and Managing Knowledge. Research and Technology Management, Vol. 40, No. 4, pp. 24–31.
Chang, L., and B. Birkett. 2004. Managing Intellectual Capital in a Professional Service Firm: Exploring the Creativity–Productivity Paradox. Management Accounting Research, Vol. 15, pp. 7–31.
Chen, J., Z. Zhu, and H. Xie. 2004. Measuring Intellectual Capital: A New Model and Empirical Study. Journal of Intellectual Capital, Vol. 5, No. 1, pp. 195–212.
Chryssostomidis, C., M. Bernitsas, and D. Burke, Jr. 2000. Naval Engineering: A National Naval Obligation. Massachusetts Institute of Technology Ocean Engineering Design Laboratory, May.
DHS. 2009. Developing Technology to Protect America. Science and Technology Directorate. National Academy of Public Administration, Washington, D.C.
Doerry, N. 2010. Transitioning Technology to Naval Ships. Paper commissioned by the committee, June.
DON. 2010. Department of the Navy Fiscal Year (FY) 2011 Budget Estimates: Justification of Estimates: Research, Development, Test and Evaluation, Navy: Budget Activity 1–3. Feb.
Eccles, R. 1991. The Performance Measurement Manifesto. Harvard Business Review, Jan.–Feb., pp. 131–137.
Eccles, R., and P. Pyburn. 1992. Creating a Comprehensive System to Measure Performance. Management Accounting, Oct.
Edvinsson, L. 1997. Developing Intellectual Capital at Skandia. Long Range Planning, Vol. 30, No. 3, pp. 366–373.
Edvinsson, L., and M. Malone. 1997. Intellectual Capital: Realizing Your Company’s True Value by Finding Its Hidden Brainpower. Harper Business, New York.
Frigo, M. L., and K. R. Krumwiede. 2000. The Balanced Scorecard: A Winning Performance Measurement System. Strategic Finance, Jan., pp. 50–54.
GAO. 2004. DOD Civilian Personnel: Comprehensive Strategic Workforce Plans Needed. Report GAO-04-753. June.
GAO. 2006. Stronger Practices Needed to Improve DOD Technology Transition Processes. Report GAO-06-883. Sept.
Grabowski, M., and K. H. Roberts. 1999. Risk Mitigation in Virtual Organizations. Organization Science, Vol. 10, No. 6, Nov.–Dec., pp. 704–721.
Hackett, J. P. 2010. Composites Road to the Fleet: A Collaborative Success Story. Paper commissioned by the committee, June 18.
Hagan, J. 2010. Human Systems Integration/Crew Design Process Development in the Zumwalt Destroyer Program: A Case Study in the Importance of Wide Collaboration. Paper commissioned by the committee, June 8.
Ittner, C. D., and D. F. Larcker. 1998a. Are Nonfinancial Measures Leading Indicators of Financial Performance? An Analysis of Customer Satisfaction. Journal of Accounting Research, Vol. 36, pp. 1–35.
Ittner, C. D., and D. F. Larcker. 1998b. Innovations in Performance Measurement: Trends and Research Implications. Journal of Management Accounting Research, Vol. 10, pp. 205–238.
Joia, L. 2000. Measuring Intangible Corporate Assets: Linking Business Strategy with Intellectual Capital. Journal of Intellectual Capital, Vol. 1, No. 1, pp. 68–84.
Kaplan, R., and D. Norton. 1992. The Balanced Scorecard: Measures That Drive Performance. Harvard Business Review, Vol. 70, No. 1, pp. 71–79.
Kaplan, R., and D. Norton. 1996a. The Balanced Scorecard. Harvard Business School Press, Boston, Mass.
Kaplan, R., and D. Norton. 1996b. Linking the Balanced Scorecard to Strategy. California Management Review, Vol. 39, No. 1.
Kaplan, R., and D. Norton. 1996c. Using the Balanced Scorecard as a Strategic Management System. Harvard Business Review, Vol. 74, No. 1, pp. 75–85.
Kaplan, R., and D. Norton. 2001. The Strategy-Focused Organization. Harvard Business School Press, Boston, Mass.
Kaplan, R., and D. Norton. 2004. Strategy Maps: Converting Intangible Assets into Tangible Outcomes. Harvard Business School Press, Boston, Mass.
Keane, R. G., Jr., H. Fireman, J. J. Hough, and K. Cooper. 2009. A Human Capital Strategy for Ship Design Acquisition Workforce Improvement: The U.S. Navy’s Center for Innovation in Ship Design. Naval Engineers Journal, Vol. 121, No. 4, pp. 45–67.
Lyons, J. W., and R. Chait. 2009. Strengthening Technical Peer Review at the Army S&T Laboratories. Center for Technology and National Security Policy, National Defense University, Washington, D.C., March.
Melnyk, S. A., D. M. Stewart, and M. Swink. 2004. Metrics and Performance Measurement in Operations Management: Dealing with the Metrics Maze. Journal of Operations Management, Vol. 22, pp. 16–29.
National Academies. 2005. Facilitating Interdisciplinary Research. National Academies Press, Washington, D.C. http://www.nap.edu/catalog/11153.html. Accessed Dec. 8, 2010.
National Shipbuilding Research Program. 2009. Shipbuilding Engineering Education Consortium (SEEC) Viability and Operational Concepts Final Report. June 16.
NRC. 1999. Evaluating Federal Research Programs: Research and the Government Performance and Results Act. National Academy Press, Washington, D.C.
NRC. 2000. An Assessment of Naval Hydromechanics Science and Technology. National Academy Press, Washington, D.C.
NRC. 2004. Accelerating Technology Transition: Bridging the Valley of Death for Materials and Processes in Defense Systems. National Academies Press, Washington, D.C.
NSF. 2011a. FY 2012 Budget Request to Congress. Feb. 14. http://www.nsf.gov/about/budget/fy2012/pdf/fy2012_rollup.pdf.
NSF. 2011b. Proposal and Award Policies and Procedures Guide. Jan.
ONR. 2001. Memorandum: National Naval Program for Naval Engineering. Oct. 22.
ONR. 2009a. Long Range BAA for Navy and Marine Corps Science and Technology. BAA 10-001. Sept. 18.
ONR. 2009b. Naval S&T Strategic Plan: Defining the Strategic Direction for Tomorrow. Feb. http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA499909&Location=U2&doc=GetTRDoc.pdf.
Pelz, D. C. 1956. Some Social Factors Related to Performance in a Research Organization. Administrative Science Quarterly, Vol. 1, No. 3, Dec., pp. 310–325.
Porter, A. L., J. D. Roessner, A. S. Cohen, and M. Perrault. 2006. Interdisciplinary Research: Meaning, Metrics and Nurture. Research Evaluation, Vol. 15, No. 3, Dec., pp. 187–195.
Roberts, K. H. 1990. Some Characteristics of One Type of High Reliability Organization. Organization Science, Vol. 1, No. 2, pp. 160–176.
Roberts, K. H., and D. M. Rousseau. 1989. Research in Nearly Failure-Free, High Reliability Organizations. IEEE Transactions on Engineering Management, Vol. 36, No. 2, May, pp. 132–139.
Ruegg, R. 2007. Quantitative Portfolio Evaluation of U.S. Federal Research and Development Programs. Science and Public Policy, Dec., pp. 723–730.
Sharp, W. 2007. Research Managers Skillfully Navigate, Execute the Basic Research Funding Process. Wright–Patterson Air Force Base News, updated June 28.
Tan, K., and K. Platts. 2003. Linking Objectives to Action Plans: A Decision Support Approach Based on Cause–Effect Linkages. Decision Sciences, Vol. 34, No. 3, pp. 569–593.
Tan, K., K. Platts, and J. Nobel. 2004. Building Performance Through In-Process Measurement: Toward an “Indicative” Scorecard for Business Excellence. International Journal of Productivity and Performance Management, Vol. 53, No. 3, pp. 233–244.
Taylor, J., J. L. Price, and D. K. Phelps. 2010. Energy and Power Community of Interest: Energy and Power S&T Overview. Presented at 46th AIAA–ASME–SAE–ASEE Joint Propulsion Conference and Exhibit and 8th International Energy Conversion Engineering Conference, Nashville, Tenn., July 28.
TRB. 2002. Special Report 266: Naval Engineering: Alternative Approaches for Organizing Cooperative Research. National Academies, Washington, D.C.
Tropiano, M., Jr. 2005. Human Capital Digital Dashboard: NAVSEA’s Future Method of Measuring Community Health. Defense AT&L Magazine, Nov.–Dec.
U.S. Department of Commerce. 2001. National Security Assessment of the U.S. Shipbuilding and Repair Industry. May.
U.S. Department of Energy. 1995. How to Measure Performance: A Handbook of Techniques and Tools. Defense Programs, Special Projects Group (DP-31).
U.S. Department of Energy. 2001. Establishing an Integrated Performance Measurement System. Performance-Based Management Special Interest Group.
Winter, D. 2007. Remarks at Sea–Air–Space Exposition. Navy League, April 3.
Yeniyurt, S. 2003. A Literature Review and Integrative Performance Measurement Framework for Multinational Companies. Marketing Intelligence and Planning, Vol. 21, No. 3, pp. 134–142.