3
Review of the Navy’s Analytical Processes and Methods

THE COMMITTEE’S APPROACH

The committee framed its review of the Navy’s efforts to implement capabilities-based planning (CBP) by asking about (1) the Navy’s conceptual framework for CBP, (2) its analytic framework, (3) its explicit attention to future building blocks, and (4) its implementation at the level of personnel and organizations.

Conceptual Framework for Capabilities-Based Planning

Basic Ideas

As discussed in Chapter 2, the CBP approach is fundamentally about planning under uncertainty by emphasizing flexibility, robustness, and adaptiveness, while doing so within an economic framework. That is, choices must be made about how much risk of various types is tolerable, how to exploit opportunities for efficiency and effectiveness, and how to live with the budget that is finally decided upon by national authorities. The capabilities in question are to be outputs—measures of the ability actually to execute tasks, missions, and operations. These capabilities should also be conceived as joint capabilities, even though in some instances a particular joint capability may effectively be a Service capability (e.g., undersea surveillance).

“Planning under uncertainty” is not a mere phrase; it is the essence of CBP, which recognizes that the United States cannot reliably predict how its military forces will be used—against whom, for what purpose, and in what circumstances. Nor can it reliably predict precisely how conflict situations will evolve or how well each system and operation will work. Hedging is essential. But hedging is costly. How much is enough? And what framework is used to judge?

Simple Tests

Some simple tests can indicate whether a Service is applying CBP. One is whether the emphasis is on achieving capabilities rather than, as in prior periods, on platforms and weapons systems. A second test is whether options for achieving capabilities are joint and whether trade-offs cross Service boundaries where appropriate. A third test is whether risk is considered in its various dimensions. And, last but not least, are assessments accomplished within an economic framework, which includes identifying funding sources for additions that would otherwise defy the fiscal guidance?

Analytic Framework

Given a conceptual framework, an organization also needs a suitable analytic framework to conduct CBP, preferably one that is widely understood and that provides for the following:

- An understanding of capability needs at the mission or operation level;
- An understanding of aggregate capability needs (for theater and multitheater challenges);
- The development and assessment of options for providing needed capabilities, including options that maintain the overall funding level specified by fiscal guidance; and
- The assessment of options and trade-offs in an integrative portfolio-management structure suitable to Chief of Naval Operations (CNO)-level review.

Because issues arise at different levels (e.g., strategic, campaign, and mission), the analytic framework must be hierarchical, with a clear logic trail from the high-level constructs down to those in which one can see the critical components and subcomponents of capability that make operations successful.
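The hierarchical logic trail called for above can be illustrated with a small sketch. The capability names, the scores, and the weakest-link roll-up rule below are all hypothetical illustrations, not the Navy’s actual decomposition or scoring method:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Capability:
    """A node in a hypothetical capability decomposition tree."""
    name: str
    score: Optional[float] = None        # leaf assessment, 0 (poor) to 1 (good)
    children: list = field(default_factory=list)

    def rollup(self) -> float:
        """Aggregate conservatively: a branch is only as strong as its weakest child."""
        if not self.children:
            return self.score if self.score is not None else 0.0
        return min(child.rollup() for child in self.children)

# Illustrative three-level decomposition (invented names and scores)
sea_strike = Capability("Sea Strike", children=[
    Capability("Time-Sensitive Targeting", children=[
        Capability("Detect moving land targets", score=0.3),
        Capability("Engage moving land targets", score=0.5),
    ]),
    Capability("Strike Support", children=[
        Capability("Persistent surveillance", score=0.7),
    ]),
])

print(f"{sea_strike.name}: {sea_strike.rollup():.1f}")  # weakest-link roll-up
```

A real framework would carry richer aggregation rules (weighted, scenario-dependent), but even this toy version shows the property the text demands: a top-level judgment remains traceable to the leaf-level assessments beneath it.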
The relationships among levels of analysis cannot merely be asserted on the basis of implicit assumptions; they must be derived from thoughtful, explicit analysis with conscious trade-offs.

Consideration of Future Building Blocks

In domains in which one seeks flexible, adaptive, and robust capabilities, effective solutions typically depend on developing appropriate capabilities as building blocks. These exist in the realms of systems and platforms, organization, and operations. Planning for the future in an uncertain era of dynamic change implies rethinking—and probably transforming—those building blocks. For the Navy, this will likely mean new strike groups, different concepts of manning, and new joint operations or new ways of conducting old ones.

Implementation—Moving Toward First-Class Analysis

Finally, conducting CBP well will require first-class analysis. Achieving this objective involves institutional issues and has major implications for staffing, organization, and reward systems.

With the background presented in the preceding subsections as a framework, the remainder of this chapter addresses the issues in the same order and provides the committee’s assessments and recommendations.

THE CONCEPTUAL FRAMEWORK

Does the Department of the Navy have a sound, top-level conceptual framework to guide capabilities-based planning? To address this question, the committee drew primarily on the following documents and briefings: (1) the “Sea Power 21 Series” articles from U.S. Naval Institute Proceedings,1 (2) Naval Transformation Roadmap 2003,2 and (3) a set of briefings presented to the committee by the Office of the Deputy Chief of Naval Operations (DCNO) for Warfare Requirements and Programs (N70),3 the Assessments Division of the Office of the DCNO for Resources, Requirements, and Assessments (N81),4 the Naval Air Systems Command (NAVAIR),5 and the Office of the DCNO for Manpower and Personnel (N1).6 The relationships of these Navy CBP efforts to the approaches of the Office of the Secretary of Defense (OSD) and the Office of the Joint Chiefs of Staff (OJCS) are discussed in Chapter 4. This assessment addresses separately the Navy’s broad strategic approach, its system-level analysis, and its analysis at the mission and operational level, as in the development of Program Objective Memorandums (POMs) responsive to strategic guidance.

Broad Strategic Approach

The committee concludes that the Department of the Navy has done a creditable job in laying out a broad strategic approach and has gone on to delineate sensibly the special responsibilities that the maritime Services have in national strategy and joint operations. The Navy’s approach is organized at the top level in terms of Sea Shield, Sea Strike, Sea Basing, and the enabling “glue” of FORCEnet, as indicated in Figure 3.1.7 These are supported by what are termed Sea Trial, Sea Warrior, and Sea Enterprise. Were the planning to stop with this top level, it would produce little more than good viewgraphs, but the Navy has put considerable effort into ensuring that all of the important functions of the department are mapped into this structure and that useful decompositions exist down to meaningful levels of detail (see the next subsection). Such breakdowns are always imperfect because of crosscutting factors, but the committee was satisfied that the structure largely makes sense. The structure will probably change over time as the Navy gains experience with the decomposition and makes adjustments, but the approach is sensible.

1. ADM Vern Clark, USN, Chief of Naval Operations. 2002. Sea Power 21 Series—Part I, “Projecting Decisive Joint Capabilities,” U.S. Naval Institute Proceedings, October; VADM Mike Bucchi, USN, and VADM Mike Mullen, USN. 2002. Sea Power 21 Series—Part II, “Sea Shield: Projecting Global Defensive Assurance,” U.S. Naval Institute Proceedings, November; VADM Cutler Dawson, USN, and VADM John Nathman, USN. 2002. Sea Power 21 Series—Part III, “Sea Strike: Projecting Persistent, Responsive, and Precise Power,” U.S. Naval Institute Proceedings, December; VADM Charles W. Moore, Jr., USN, and LtGen Edward Hanlon, Jr., USMC. 2003. Sea Power 21 Series—Part IV, “Sea Basing: Operational Independence for a New Century,” U.S. Naval Institute Proceedings, January; VADM Richard W. Mayo, USN, and VADM John Nathman, USN. 2003. Sea Power 21 Series—Part V, “FORCEnet: Turning Information into Power,” U.S. Naval Institute Proceedings, February; VADM Mike Mullen, USN. 2003. Sea Power 21 Series—Part VI, “Global Concept of Operations,” U.S. Naval Institute Proceedings, April; VADM Alfred G. Harms, Jr., USN, VADM Gerald L. Hoewig, USN, and VADM John B. Totushek, USN. 2003. Sea Power 21 Series—Part VII, “Sea Warrior: Maximizing Human Capital,” U.S. Naval Institute Proceedings, June.
2. ADM Vern Clark, USN, Chief of Naval Operations; and Gen Michael Hagee, USMC, Commandant of the Marine Corps. 2004. Naval Transformation Roadmap 2003: Assured Access and Power Projection from the Sea, Department of the Navy, Washington, D.C.
This said, the committee notes that the approach is quite different from that being used in OSD and by the Joint Staff (in Chapter 4, see the discussion of the Joint Capabilities Integration and Development System (JCIDS)), which involves functional capability areas identified as focused logistics, battlespace awareness, force application, force protection, command and control, network-centric operations, training, and force management. It is important that the Navy have clear mappings from its decomposition to the Department of Defense’s (DOD’s) functional capability areas if it is to participate and compete effectively in overall DOD planning. The mapping issue is nontrivial because the Navy capabilities, natural in an operations-oriented decomposition, depend on a number of the functional capabilities in JCIDS.

The committee is also convinced that at the highest levels of the Navy and the Marine Corps there is a commitment to jointness—not merely lip service, but a recognition that jointness is a fundamental aspect of overall transformation for the new era in which the United States finds itself. At that highest level, as reflected in the core documents, it is appreciated that warfighting will almost always need to be joint in the future. Even under those circumstances, however, enormous responsibilities will continue to devolve upon the maritime commanders.

Finally, the higher-level documents all reflect a commitment to flexibility, adaptiveness, and robustness. This is perhaps not surprising, since the Navy and Marine Corps have traditionally emphasized these qualities to a greater extent than have the Army and Air Force, which became more captive to planning for particular war scenarios.

FIGURE 3.1 Top-level components of Sea Power 21. SOURCE: CAPT Terry McKnight, USN, N70, “Naval Capabilities Development Process,” presentation to the committee, July 27, 2004, Woods Hole, Mass., slide 10.

3. CAPT Terry McKnight, USN, N70, “Naval Capabilities Development Process,” presentation to the committee, July 27, 2004, Woods Hole, Mass.
4. LCDR Kenneth Masson, USN, N815, “Capabilities Based Planning,” presentation to the committee, July 27, 2004, Woods Hole, Mass.
5. Patrick McLaughlin, NAVAIR, “Naval Analytical Capabilities and Improving Capabilities-Based Planning,” presentation to the committee, July 28, 2004, Woods Hole, Mass.
6. Richard Robbins, N1Z, “N1 and Capabilities-Based Planning,” presentation to committee members, July 21, 2004, Navy Annex, Washington, D.C.
7. CAPT Terry McKnight, USN, N70, “Naval Capabilities Development Process,” presentation to the committee, July 27, 2004, Woods Hole, Mass., slide 10.
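The mapping issue raised above—tracing each Navy capability area to the DOD functional capability areas—lends itself to a simple completeness check. In the sketch below, only the eight JCIDS functional areas are taken from the text; the crosswalk entries themselves are invented for illustration and are not an official mapping:

```python
# Hypothetical crosswalk from Navy capability pillars to JCIDS functional
# capability areas. Any pillar with no valid mapping is flagged as a gap.
JCIDS_AREAS = {
    "focused logistics", "battlespace awareness", "force application",
    "force protection", "command and control", "network-centric operations",
    "training", "force management",
}

navy_to_jcids = {
    "Sea Shield": {"force protection", "command and control"},
    "Sea Strike": {"force application", "battlespace awareness"},
    "Sea Basing": {"focused logistics"},
    "FORCEnet":   {"network-centric operations", "command and control"},
    "Sea Warrior": set(),   # deliberately unmapped, to show the check firing
}

def unmapped(mapping):
    """Navy areas with no intersection with the JCIDS functional areas."""
    return [area for area, targets in mapping.items()
            if not (targets & JCIDS_AREAS)]

print(unmapped(navy_to_jcids))
```

Even this toy check captures why the issue is nontrivial: an operations-oriented pillar typically touches several functional areas at once, so the crosswalk is many-to-many rather than one-to-one.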
In contrast, the committee was not persuaded that the translation of higher-level intentions into lower-level processes and practices is going well, as discussed below in the subsection “Operational Analysis for the Department of Defense and Office of the Chief of Naval Operations” and in the next major section, “The Analytic Framework.” First, however, work at the system-command level is discussed.

System-Level Analysis

The committee was generally impressed by the presentations made by representatives of naval organizations two levels down from the highest level. These presentations had been generated by the Navy’s Space and Naval Warfare Systems Command (SPAWAR), the Naval Sea Systems Command (NAVSEA), and the Naval Air Systems Command. Here the committee saw evidence of managerial rethinking about organization, process, and products to support CBP. For example, in the briefings cited, the committee saw reference not only to analytical work on capabilities but also to life-cycle costs and business-case assumptions. The Navy has even reorganized to operate what it calls a Virtual Systems Command to increase agility and integration.8 Illustrative discussion of the Virtual Systems Command’s analytical process reflected a systems-engineering perspective, with the inclusion of systems of systems and connections to mission-capability packages and the discrete “things” that end up being line items in budgets. It also suggested a determination not only to identify overlapping capabilities but also to distinguish between desirable and undesirable redundancy and to identify both capability gaps and trade-offs. At least in a quick-look review, this class of work appeared to be professional and responsive to the new paradigms of CBP. Whether the Virtual Systems Command will work out is, of course, something that only experience will show.

The committee also heard a briefing from the Navy Warfare Development Command (NWDC),9 which addressed rather extensive fleet-based experimentation to support near-term assessments closely related to filling recognized capability gaps (e.g., those against small-boat attacks).
This effort reflected a laudable Navy decision to reemphasize fleet-level experimentation and the accumulation of substantial empirical and analytical data. The experiments described, however, were all focused on the near term. Although all of them were clearly desirable and important, the committee was concerned that the effort might remain too exclusively concerned with near-term, incremental issues. The Navy leadership will wish to review issues of balance over time.

8. Patrick McLaughlin, NAVAIR, “Naval Analytical Capabilities and Improving Capabilities-Based Planning,” presentation to the committee, July 28, 2004, Woods Hole, Mass.
9. Wayne Perras, Technical Director, Navy Warfare Development Command, “What We Do/Who We’re Doing It For,” presentation to the committee, July 27, 2004, Woods Hole, Mass.

Operational Analysis for the Department of Defense and the Office of the Chief of Naval Operations

In contrast to the experience described above, the committee found many reasons for concern at the level between top-level guidance and systems command (SYSCOM)-level work. Here the committee observed severe disconnects between top-level intentions and reality in the ranks. When the committee asked about excursions and exploratory analysis around baseline assumptions, briefers reported that very little had been done. Thus, while some viewgraphs had been changed to be consistent with CBP, much of the ongoing work still had the problems of the previous era, particularly those surrounding point-scenario analysis. Although the problems seen by the committee may have been temporary, they appeared more likely to be chronic. If so, the Navy should recognize them as systemic, indicating deep-seated issues, and act accordingly. Corrective measures will take much more than top-level documents, because staff take their lead from the myriad actions and priorities expressed over time. These problems are discussed more fully in the next section.

THE ANALYTIC FRAMEWORK

Understanding Needs at Mission and Operation Levels

Description of Analytic Approach at Mission and Operation Levels

The Department of the Navy’s core documents include useful decompositions from high-level components (e.g., Sea Strike) down to meaningful levels of detail. It is always a matter of judgment how far to carry such breakdowns. As one goes into more detail, issues and tasks become increasingly well defined and challenges become more explicit. However, excessive decomposition also generates a morass of detail that is not useful for higher-level planning.10 Worse, it can introduce biases by “hard-wiring” the way in which missions and higher-level tasks are to be performed.
In capabilities-based planning, it is desirable to stop decomposing before that happens or, at least, to carry along alternative decompositions reflecting alternative concepts of operation.

As can be seen from Figure 3.2, the Department of the Navy’s objectives-to-challenge decomposition structure goes down three to four levels. For example, in the FORCEnet component (close to the bottom right), it goes down to the level of “Detect and Identify Targets/Moving Land Targets.” This level of specificity is useful for highlighting an important mission that is very different from other detect-and-identify missions, and one that the Navy has not traditionally emphasized. If the mission to engage moving land targets had been left implicit, it might not receive adequate attention.

By and large, the committee concludes that the decomposition shown in Figure 3.2 is suitable as a top-down depiction. In particular, it has enough detail that responsibilities for follow-up work can be assigned meaningfully. And, although there are scores of capability areas indicated (counting at the lowest level), the number is small enough to be managed. What matters, of course, is that for each of these capability areas the Department of the Navy do in-depth analysis to assess needs, capabilities, and improvement options. The committee could hardly review or assess that effort in a cursory review. However, Figure 3.3 illustrates how, for a large subset of the items in Figure 3.2, the Navy has sought to assess capabilities versus time. For example, in the highest bar (“Neutralize Submarine Threats”), the Navy’s assessment is that the ability to neutralize submarine threats will improve from poor (black) to marginal (gray) within the time period shown (roughly through 2020). In contrast, much better progress is projected for countering minefields (by what mechanism was not made clear to the committee). The assessments were the result of subjective warfighter estimates, informed also by the results of POM-06 campaign analyses and mission-level analyses. The process used to obtain the estimates was neither rigorous nor satisfactory to participants, but it was a systematic first effort that can be refined with time.

The analytical approach being employed, then, appears to be that of using the decompositions, examining needs and capabilities in each area, and projecting changes over time in high-level depictions.

10. Detailed decomposition is, however, valuable for defining the myriad detailed tasks that must be mastered, supported, and coordinated. All Services are required to prepare detailed decompositions, and the results are published as the Universal Joint Task List by the Joint Staff.
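The capability-versus-time assessments just described can be represented in a form that makes the committee’s central observation checkable: which areas are never projected to rise above marginal? The ratings below are illustrative stand-ins for the style of data in Figure 3.3, not the actual assessments:

```python
# Hypothetical capability-versus-time ratings in the spirit of Figure 3.3
# (illustrative data only). Each area carries projected ratings at
# successive epochs, roughly through 2020.
RANK = {"poor": 0, "marginal": 1, "good": 2}

projections = {
    "Neutralize submarine threats": ["poor", "poor", "marginal", "marginal"],
    "Counter minefields":           ["poor", "marginal", "good", "good"],
    "Defeat small-boat attacks":    ["marginal", "marginal", "marginal", "marginal"],
}

def never_good(projs):
    """Capability areas whose projection never reaches 'good'."""
    return [area for area, ratings in projs.items()
            if max(RANK[r] for r in ratings) < RANK["good"]]

for area in never_good(projections):
    print("review criteria and investment for:", area)
```

Run over the real Figure 3.3 data, such a filter would surface exactly the pattern that surprised the committee and would make the resulting question—are the criteria too strict, or the investments too small?—a routine part of the assessment cycle.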
That approach is reasonable and consistent with the need in CBP to go to the mission level rather than merely reporting the results of theater-level campaigns in particular scenarios. The committee was surprised by the results shown in Figure 3.3 (almost none of the assessments improve beyond marginal (gray) or poor (black), suggesting that a hard look at the criteria used would be appropriate), but the assessment process was at least a good beginning for something that can be much enriched over time. So far, so good. Unfortunately, the committee’s assessment was that many problems exist at the next level of analytical detail, as discussed below.

Assessment of Analytic Framework for Mission and Operation Levels

In assessing the Navy’s mostly implicit analytic framework, the committee drew on its experience and looked for generic problems that often beset analysis that is intended to, but actually does not, support capabilities-based planning. OPNAV will wish to review the situation when this report emerges, but the generic problems are as follows:

FIGURE 3.2 Decomposition of capability needs. SOURCE: CAPT Terry McKnight, USN, N70, “Naval Capabilities Development Process,” presentation to the committee, July 27, 2004, Woods Hole, Mass., slides 11-14. NOTE: SOF, Special Operations Force; CBRNE, chemical, biological, radiological, nuclear, explosives; C2, command and control; AFSB, afloat forward staging base; PNT, precision, navigation, and timing.


FIGURE 3.3 Projected capability versus time (roughly through 2020) for each of the capability areas. SOURCE: Adapted from CAPT Terry McKnight, USN, N70, “Naval Capabilities Development Process,” presentation to the committee, July 27, 2004, Woods Hole, Mass., slide 25. NOTE: PNT, precision, navigation, and timing; SOF, Special Operations Force. Key: poor, black; marginal, gray; good, white.

…tack.” The inner, dashed contour suggests that capabilities are strong for two axes—“Control seas” and “Maintain presence”; fairly strong for another—“Assure early access”; and not very strong for the others. One possible goal for future Navy capabilities would be to achieve the outer, solid contour. That would require additional emphasis on ballistic-missile defense, early access, and the ability to project force inland even in difficult circumstances.

General Attributes of Rigorous Analysis

Another major concern of the committee relates to the need for first-rate analysis to be rigorous, documented, transparent, and as objective as possible. Rigor is a matter of degree. High-level decisions on programs and budgets depend on analysis being approximately right and appropriately insightful, not highly precise. Nonetheless, viewgraphs do not constitute analysis, nor do viewgraphs plus assertions about how hard people have worked. Good analysts universally acknowledge that the discipline involved in writing down assumptions and working carefully through the logic—that is, generating documentation—is exceedingly important. The committee’s impression is that OPNAV-level analysis, by contrast with rigorous analysis, is ad hoc, undocumented, and rather opaque about key assumptions; that it tends to have an advocacy bias; and that it is constructed to focus only on particular issues (e.g., what might be needed to deal with certain bad-case naval scenarios). The committee did not see broad areas of choice with a hard-edged assessment of strengths and weaknesses being presented to the leadership. The OPNAV-level analysis is worthwhile in some respects, but it is not yet of the quality appropriate for senior decision making.
Choosing Among Options in a Portfolio Framework Suitable to Top-Level Needs

Developing the appropriate portfolio views is a complex undertaking that is highly dependent on the particular organization and decision context.

Strategic Planning Versus Operations Research

As suggested above, the current approach of OPNAV to analysis appears to be one of presenting charts that indicate adequacies and shortfalls by capability area (see, e.g., Figure 3.3) and presenting occasional operations-analysis charts illustrating particular points. For example, the committee was briefed on some interesting work by the Assessments Division of the Office of the DCNO for Resources, Requirements, and Assessments (N81) that examined “Scud-hunting” problems in more technical depth and with more operationally realistic assumptions than usual. As a result, different conclusions were suggested about the potential mix of satellites and unmanned aerial vehicles and about the speed needed for air-to-surface missiles. Such work is useful and has some of the features of portfolio-management-style work in that one can look critically at the various capability areas and see where the greatest shortcomings exist (subject to the appropriateness of the underlying assumptions). That process, in turn, can lead to suggestions about resource allocation. It is, nonetheless, an operations-analysis perspective rather than one ideally suited to resource allocation.

Assessment of the Navy’s Portfolio-Management-Style Analysis

Strategic-Level Portfolio Analysis. In the committee’s view, portfolio-management-style presentations for the CNO and his top leaders should often have a more strategic, top-down character and should more explicitly address economic issues. The committee did not see much economics-sensitive analysis, although some had been done in preparing the briefings that the committee was given. It seems, however, that the emphasis in the Navy’s CBP is exclusively on identifying shortfalls and finding ways to fund them. Notably absent, except at the systems command level, is the search for opportunities to accomplish missions effectively but at less cost, thereby freeing up funds for other purposes.

Zooming. Portfolio work should allow for zooming in on an area as desired, so that the basis for high-level charts can be examined in depth. It appears, however, that there is minimal rigor in the Navy’s current assessments and no systematic way to trace the assumptions and logic from a top-level portfolio view to deeper capabilities analysis in which assumptions and their consequences could be seen parametrically. Arguably, this type of ability requires a family-of-models approach. The Navy is working to establish such a family of models, for which the CNO has provided funds.
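The parametric, exploratory style of analysis called for here can be sketched as excursions around a baseline: instead of evaluating one point scenario, sweep the key assumptions and count where adequacy fails. The parameters and the toy adequacy model below are purely illustrative stand-ins for a real campaign- or mission-level analysis:

```python
import itertools

# Hypothetical exploratory analysis: sweep key scenario assumptions
# rather than evaluating a single baseline point case.
warning_days  = [5, 10, 20]      # strategic warning before hostilities
threat_level  = [0.5, 1.0, 1.5]  # multiplier on the baseline threat
basing_access = [True, False]    # regional bases available?

def mission_success(warning, threat, access):
    """Toy adequacy model standing in for a real campaign analysis."""
    effective = warning * (2.0 if access else 1.0)
    return effective / threat >= 10

cases = list(itertools.product(warning_days, threat_level, basing_access))
failures = [c for c in cases if not mission_success(*c)]
print(f"{len(failures)} of {len(cases)} excursions fail")
```

The value of the sweep is not the toy numbers but the shape of the failure region—here, short warning combined with denied basing—which is exactly the kind of insight a single point scenario cannot reveal.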
The architecture for what is needed, however, was not clear to the committee. It must include high-quality policy and systems analysis, not just more investment in big models and simulations. It should also be tied to real-world data, not just to simulation.19

Highlighting Risks. Highlighting types of risk is a key part of portfolio-style analysis. Examples of different types of risk include the following: technical risk (Will a system in development work?), program risk (Will the program slip in schedule or have cost overruns?), future technology risk (Is the base being laid, in research and development (R&D), for necessary future systems?), and strategic risk (Will the capability developments prove seriously inadequate because of changes in the strategic environment or national policy?).20

Adaptive Options. Strategic options should be explicitly adaptive and should hedge against their key assumptions’ proving to be wrong. One way for the Navy to do that is to consider seriously the broad range of scenario classes identified in the Strategic Planning Guidance (SPG).21 Another is to recognize that the SPG itself is a baseline, not a definitive roadmap into the future. Indeed, key assumptions of the SPG will likely change with administrations and with strategic developments in the world. Thus, the Navy’s planning should also consider how robust its program would be in the event of such changes. The Navy should have contingency plans for such possibilities as (1) a greatly increased emphasis on defending the homeland from missile attacks (e.g., from containerized missiles) and (2) much-greater-than-expected threats to aircraft carriers, even at rather long ranges. These possibilities are offered purely as examples.

It appears that current Department of the Navy work does not include true strategic options (e.g., adaptive options that hedge against events unfolding in unexpected ways). It is too formulaic and too slavishly responsive to CNO and DOD guidance, without providing feedback that might help reaffirm or adjust that guidance.

Implications for Personnel. Presenting broad, discerning, strategic-level analysis for the CNO requires a higher level of analysis than that characteristic of systems analysis or operations research. This broad analysis is in the realm of strategic planning and policy analysis.

19. National Research Council. 2004. The Role of Experimentation in Building Future Naval Forces, The National Academies Press, Washington, D.C.
Current personnel requirements for OPNAV analysis are predominantly limited to the capabilities and experience possessed by operations-research-oriented personnel (and even those requirements are often not met). Revised personnel requirements and personnel-policy changes would make it more plausible that those chosen for this work would be rewarded with promotion.

20. Portfolio methods are discussed in the following work and references therein: Paul K. Davis. 2002. Analytic Architecture for Capabilities-Based Planning, Mission-System Analysis, and Transformation, National Defense Research Institute, RAND, Santa Monica, Calif. Current applications to the Missile Defense Agency highlight these particular risks. Unpublished discussion of the subject arose in a summer 2004 workshop at the Naval War College sponsored by OSD’s Director of Net Assessment. Many portfolio-related discussions can, of course, be found in the business literature, some of which are relevant even though the DOD and military Services do not have simple “bottom lines” against which to measure everything. See, for example, Robert S. Kaplan and David P. Norton. 1996. The Balanced Scorecard: Translating Strategy into Action, Harvard Business School Press, Boston, Mass.
21. Department of Defense. 2004. Strategic Planning Guidance, Secretary of Defense Donald Rumsfeld, Washington, D.C. (draft). (Classified)
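The four risk types discussed above suggest a simple portfolio scorecard in which each capability area is scored on each risk dimension and the worst exposures are surfaced for leadership review. All of the area names and scores below are hypothetical, chosen only to show the mechanics:

```python
# Hypothetical portfolio risk scorecard (illustrative scores: 1 = low risk,
# 5 = high risk) across the four risk types discussed in the text.
RISK_TYPES = ["technical", "program", "future technology", "strategic"]

scorecard = {
    "Missile defense":       [3, 4, 2, 5],
    "Undersea surveillance": [2, 2, 3, 2],
    "Carrier protection":    [2, 3, 4, 4],
}

def worst_exposures(card, threshold=4):
    """(area, risk type, score) cells at or above the reporting threshold."""
    return [(area, RISK_TYPES[i], s)
            for area, scores in card.items()
            for i, s in enumerate(scores) if s >= threshold]

for area, risk, score in worst_exposures(scorecard):
    print(f"{area}: {risk} risk = {score}")
```

A real scorecard would rest on documented assessments rather than point scores, but the format illustrates the portfolio-style product argued for here: a compact view that shows where the program is most exposed, by risk type, across capability areas.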

The Need for Assimilating Capabilities-Based Planning Principles in Navy Analytic Processes

The committee based the following recommendation on the assessments in this section. The recommendation will remain applicable until Navy leaders, after reviewing the situation, discover that they have eliminated the disconnects referred to above between leadership intentions and day-to-day analysis.

Recommendation 1: The Chief of Naval Operations should reiterate principles of capabilities-based planning and ensure that they are truly assimilated in Navy analytic processes.

The criteria for implementing Recommendation 1 include the following:

The work accomplished should be joint and output-oriented, with the ability to actually execute operations as the output.
Successful CBP will require analysis over a broader scenario space, extensive exploratory analysis within specified scenarios, development of options both to solve capability problems and to achieve efficiencies, and portfolio-style assessments of those options at different levels of detail. The portfolio-style assessments should assist in making difficult trade-off decisions and should also address the various types of risk that Navy leadership must take into account.
Strategic options should be adaptive, because world developments and technological developments will undoubtedly force changes—the potential need for which is not much discussed in the DOD's Strategic Planning Guidance.

Although the committee did not discuss tools to support the types of analysis referred to here, it is quite aware that analytic organizations have trouble responding to the demands of good capabilities-based planning. The difficulties are rooted in excessive dependence on large, complex models and related databases; in management demands for detail; and in the ways in which analyses have been framed and conducted. Breaking these molds will not be easy.
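The kind of exploratory, portfolio-style assessment described above can be sketched very simply: candidate options are scored across a broad scenario space and compared for robustness (worst case) as well as average effectiveness, rather than being optimized against a single point scenario. The following is a minimal, purely hypothetical illustration; the option names, scenario classes, and scores are invented for this sketch and are not taken from the report or from any Navy analysis.

```python
# Hypothetical portfolio-style scoring sketch. All options, scenario
# classes, and effectiveness scores below are invented for illustration.

OPTIONS = ["baseline program", "added missile defense", "dispersed strike groups"]
SCENARIOS = ["traditional", "disruptive", "catastrophic", "nontraditional"]

# Illustrative effectiveness scores (0-10) per (option, scenario) pair.
SCORES = {
    ("baseline program", "traditional"): 8, ("baseline program", "disruptive"): 4,
    ("baseline program", "catastrophic"): 3, ("baseline program", "nontraditional"): 5,
    ("added missile defense", "traditional"): 7, ("added missile defense", "disruptive"): 6,
    ("added missile defense", "catastrophic"): 8, ("added missile defense", "nontraditional"): 5,
    ("dispersed strike groups", "traditional"): 7, ("dispersed strike groups", "disruptive"): 7,
    ("dispersed strike groups", "catastrophic"): 6, ("dispersed strike groups", "nontraditional"): 7,
}

def portfolio_summary(option):
    """Summarize an option across the whole scenario space: average
    effectiveness plus worst case, the latter a crude robustness measure."""
    scores = [SCORES[(option, s)] for s in SCENARIOS]
    return {"option": option,
            "average": sum(scores) / len(scores),
            "worst_case": min(scores)}

# Rank options by worst case first (robustness), then by average effectiveness.
ranked = sorted((portfolio_summary(o) for o in OPTIONS),
                key=lambda r: (r["worst_case"], r["average"]),
                reverse=True)
for row in ranked:
    print(f'{row["option"]:25s} avg={row["average"]:.2f} worst={row["worst_case"]}')
```

In this toy example, the option that is best in the traditional scenario is worst overall once the broader scenario space is considered, which is exactly the kind of insight exploratory, portfolio-style assessment is meant to surface. A real assessment would of course weigh many more dimensions, including cost and the several categories of risk discussed earlier.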
It will require a family-of-models approach that includes links to war-gaming and experimentation, but that must also include an often-ignored component: "smart," low-resolution modeling and analysis that can support exploratory analysis (grounded in higher-resolution work or empirical data when appropriate) and put a premium on higher-level insights rather than focusing on minutiae.

Recommendation 2: The Chief of Naval Operations and the Secretary of the Navy should ensure that the Navy invests in defining and developing the new generation of analytic tools that will be needed for capabilities-based planning.

Some of the attributes needed in tools include the following: agility in low-resolution modeling coupled with the ability to go into greater depth where needed (achievable with a sophisticated family of models and games); the ability to represent network-centric operations well (including publish-subscribe architectures, rather than node-to-node representations); and the ability to deal with challenges such as those that the OSD refers to as disruptive, catastrophic, and nontraditional scenarios.

The committee is aware that the CNO has funded new work on a family of models. It is quite possible, however, that the funds will quickly be exhausted in improvements to "big models" and databases, with little benefit for higher-level capabilities-based planning, as described above. The committee encourages a balanced use of funds, including the potential purchase or use of available off-the-shelf tools. It is not possible for the committee to make more detailed suggestions here without a more extensive study. The committee notes, however, that examples of the kinds of tools mentioned above have been developed and applied.

FUTURE BUILDING BLOCKS

Flexible, robust, and adaptive capabilities invariably stem from having "building blocks" that can be quickly assembled, tailored, and used in diverse ways. Such capabilities require suitable building blocks, an appropriate command-and-control system, doctrine, and "practice, practice, practice." The Naval Services have always been relatively good at such things. Carrier battle groups were building blocks tailored to their theater; today's strike groups have evolved a great deal since the Cold War, and the Navy is actively considering an even wider range of employment options. Marines have always done building-block planning, explaining the absence of a standard Marine Expeditionary Force. Building blocks come in different forms: equipment (e.g., platforms), organization (e.g., carrier strike groups), and operations (e.g., for conducting long-range air strikes or mounting a surprise assault by Marines). Building blocks are also hierarchical.
And, in today's world, networking allows more and quicker tailoring and adaptation, as well as leveraging of the capabilities of individual platforms, units, and suboperations.

Overall, the Department of the Navy appears to be addressing the building-block issues vigorously. Problems are likely to occur, however, such as that of allowing important future building-block innovations to slip away when funding becomes tight. For example, funding the full contingent of carrier strike groups and raising their readiness for rapid deployment (up to eight strike groups within a specified number of days) might come at the expense of more actively pursuing non-carrier strike groups or next-generation carriers that would be more difficult for a future adversary to attack.

To help reduce the likelihood of such problems, the Department of the Navy should conduct a review of future building-block options and focus on those designed to increase the range of decision options available to the top leadership. This could be accomplished as part of the actions suggested by Recommendation 1.

IMPLEMENTATION—MOVING TOWARD FIRST-CLASS ANALYSIS

The Office of the Chief of Naval Operations has only recently been reorganized, and much more extensive research would be needed to make any useful recommendations about further changes. Thus, the committee did not discuss organizational issues in great depth, instead commenting on problems that could be resolved with incremental changes. The assessment below touches on organizational problems, developing a first-rate analytical staff, the culture needed for such a staff, and links to the DOD and the other Services. Finally, it touches on the temporal issue of what can be done in the short term rather than in the long term.

Organization

The committee's conclusions about organizational problems are as follows. The logic for the responsibilities assigned to the Deputy Chief of Naval Operations (DCNO) for Warfare Requirements and Programs (N6/N7) on the one hand, and to the DCNO for Resources, Requirements, and Assessments (N8) on the other, is not entirely clear or persuasive, either to the committee or to some of the officers who briefed the committee and talked with its members.

To elaborate, the idea and use of competitive analysis and creative tension are fine, and the intention of generating alternative perspectives is excellent. However, the current competition between parts of N6/N7 and N8 does not appear to be helpful and involves high opportunity costs. Rather than having two alternative, ad hoc versions of any given issue and sets of undocumented analysis, it would be better to have a first-rate job done of objective analysis informed by alternative points of view. The analysis would be more nearly comprehensive, systematic, parametric, questioning of assumptions (even sacred cows), and transparent than today's. Alternative perspectives could be compared by juxtaposing their implications in analysis charts.
Consistent with this suggestion, the committee recommends more emphasis on solid, first-rate analyses by a single organization within OPNAV. These could spin off quick-response, ad hoc analyses as needed.

If the Chief of Naval Operations and the Secretary of the Navy (SECNAV) are to have a highly competent analytical organization, it is essential that the organization (1) report directly, or relatively directly, to the CNO/SECNAV, rather than being relegated to low levels in the Department of the Navy with layering to dilute its influence; (2) be institutionalized so that it cannot easily be disbanded at the whim of a future CNO or SECNAV (the dissolution of the Systems Analysis Division of the Office of the Chief of Naval Operations (OP-96) in the 1980s has long been viewed as a disaster); and (3) be closely linked to program builders. Whether these criteria can be met within the current OPNAV organization was not something that the committee could easily assess in the time available.

The Need for a Small, First-Rate Staff

Particularly for strategic-level analysis, the CNO needs products that can best be obtained by a small, first-rate staff that would include a number of exceptionally talented individuals and future leaders and would be connected well to first-rate outside research-and-analysis organizations. The ideal staff should be seen not just as a collection of operations research personnel, but as a multidisciplinary group with a mix of warriors, policy analysts, systems analysts, engineers, economists, and managers (perhaps with master's degrees in business administration). Further, this staff should include members with outstanding potential and promise (e.g., military who will reach flag rank).

Culture

Much is known about the cultural characteristics of first-rate analytical defense organizations, and the Navy can draw upon its own experiences over the years for examples of both good and bad practice.
Generically, however, the good characteristics include the following:

The ethic of getting the problem straight, even if doing so revisits guidance or assumptions;
Loyalty to the boss—the CNO—but also to the Secretary of Defense, the President, and the nation, rather than to Navy warfare areas, platforms, and so on;
Integrity;
The mind-set to think joint, but also the ability to do superb competitive innovation and analysis for the Department of the Navy;
The mind-set to seek broad, complete analyses rather than analyses to support a superior's talking points;
Respect and an energetic search for empirical and expert information, whether it is obtained from people in the field, through experiments, by augmentation of staff, or from other mechanisms;
Rigor in everything (but not always in numbers or precision);
A good process that includes (not always linearly) problem definition, identification of assumptions, a plan for analysis, appropriate tools, and so on;
A very high ratio of thinking and smart, simple analysis to model running and data analysis (but with subcontracts to specialists);22

22. For many years the OSD's Office of Systems Analysis, later the Office of Program Analysis and Evaluation, did not use any large and complex models, believing that it was instead essential to remain focused on higher-level issues and relatively reductionist (but not naive) analysis. Today, most analytical shops appear to be tilted far to the big-model extreme, to their detriment.

Dynamism, such as can be achieved by reasonable turnover rates and constant contact with "outsiders" and their ideas, whether elsewhere in the organization or in Federally Funded Research and Development Centers, professional societies, universities, or industry; and
The opportunity for self-motivated work that attempts to look beyond the current in-basket and conventional wisdom.

These characteristics can be developed and sustained within a top-notch analytic organization only if the leaders (i.e., the CNO and other key leaders in the Navy) instill and support them.

Links to the Department of Defense and the Other Services

For the Navy, it is important that its analytical shop(s) have good links within the DOD and particularly to the other Services, not just in required coordination meetings but also as a standard part of doing good, professional work on behalf of the nation.

Creating an Appropriate Analytic Organization

With the material provided in earlier subsections as background, the next question is how to go about creating the appropriate analytic organization. The committee's basic recommendation on this question is presented below, followed by a discussion of possible models for the Navy to use in addressing the issue.

Recommendation 3: The Chief of Naval Operations and the Secretary of the Navy should develop a clearly delineated concept of the Navy's future senior-level analytic support organization and define goals for its composition, including multidisciplinary orientation and officers appropriate for high positions.

Potential Models

At least two good models exist for creating and maintaining a highly competent organization to perform analysis and package options for choice. One is similar to the Navy's OP-96 model of the late 1960s and early 1970s and the Army's Office of the Assistant Vice Chief of Staff (AVCS) in the same period.
In these cases the organization in charge of the analysis also prepared the resource-allocation decision packages. The other model was developed by the Air Force. In this case, the position of Assistant Chief of Staff for Air Force Studies and Analyses (AF/SA) was created, with a responsibility limited to performing in-house, independent analyses on issues affecting Air Force operations and programs, current and future. The key
distinction was that OP-96 was part of the OPNAV organization responsible for providing resource-allocation packages to the CNO, whereas the AF/SA organization's only function was to do studies and analyses for the Chief of Staff, the Service Secretary, and other Air Force organizational elements requesting analyses (i.e., it did not have a direct resource-allocation responsibility for preparing Air Force programs and/or budgets).

An advantage of having the strong analytic arm working for (or with) those preparing resource-allocation decision packages is that the analysts are grounded in the reality of the issues and types of decisions that must be made each year. However, a potential disadvantage of a direct tie to the resource-allocation staff is that the analytic staff members can get so caught up in the immediate issues that they are not given the opportunity to build analytic capital and focus on larger, longer-term issues that may be much more important to the future of the Service (a matter of analysis production with available tools versus long-term analytic development in anticipation of essential issues). In addition, some view a very strong analytic arm in the resource-allocation organization as an overconcentration of power that may be misused and abused and thus be a detriment to the Service. This source of contention had developed in the Army, and the AVCS office was dissolved, in part because of criticisms regarding its power and effectiveness.

When at its peak effectiveness, the AF/SA office was viewed more as an honest broker of studies and analyses, with independence from functional program responsibilities, that would provide quality products to the Air Force staff and major commands. Those elements would then use the products in conjunction with their own work to prepare decision packages and advocacy positions for programs.
The decision maker (e.g., the Chief of Staff or the Service Secretary) could get a "second opinion" on the decision packages if needed, by asking the Assistant Chief of Staff for AF/SA whether AF/SA products were being misused. If the AF/SA team had not provided analyses on the issue, it could be asked to provide an independent assessment. A current analogy would be a request by the DOD or a military department acquisition authority for an independent cost estimate of an important acquisition program, or a request for an independent review of the program manager's cost estimate by the Cost Analysis Improvement Group or its Service equivalent.

A key criterion for success is that the CNO and the SECNAV get as close as possible to unbiased analysis and presentation of decision packages. Additionally, they need to have the means to get an independent assessment on important issues when needed. It is best to find out if there are any weak links in a package and to deal with those weaknesses before making a decision on, or recommendation for, a major commitment.

What model would be best for the Navy in the current environment is not clear to this committee at the present time (and, as with much capabilities-based planning, there may be more than one viable solution). The Navy could address
the issue in a subsequent tasking to an outside entity, it might do an in-house assessment, or it could make some decision based on available information. This committee believes that the CNO and SECNAV should expend the necessary time and effort to be assured that they are making the best choice for the Department of the Navy. The way in which they choose to obtain and organize their high-level decision preparation and analytic support could be the most important decision that they make regarding the possible future success of the Department of the Navy's efforts in capabilities-based planning and overall resource allocation for a decade or more.

In addition to creating the analytic organization, the Navy will also need to develop an increased supply of top-flight officers suitable for working in this organization and then moving on to operational assignments and flag rank. Developing this supply of officers will require looking at potential major changes of guidance and process in the personnel system, as well as related changes in the incentive structure.23

The following is worth noting: In the 1960s and 1970s, the Navy made a concerted effort to retour officers in a subspecialty ashore (such as in the planning, programming, and budgeting system (PPBS)) to improve their experience and qualifications in that area. A number of admirals came out of that program and served the Navy and several CNOs well with their experience and insights. In the 1980s, the Navy changed its policy and programmed officers with no prior experience into critical PPBS positions—a policy that continues today. The Navy would never consider assigning a captain to command at sea without extensive experience and proven performance in prior sea tours.
However, it quite often assigns captains to critical billets in PPBS—on the OPNAV staff, the SECNAV staff, and the Joint Staff—with little or no prior analytical and resource-allocation (e.g., PPBS) experience or training. Having a small, qualified analytic staff for the Navy will be very difficult to achieve, at least in the near term, until sufficiently experienced officers are developed.

The committee believes that the Navy needs to change some current manpower and personnel policies in order to enhance its ability to build a longer-term, high-quality OPNAV staff with enhanced potential for performing excellent capabilities-based planning and analysis. A key element of those changes should involve creating assignment patterns for future leaders that introduce such individuals early to the discipline of analytical thinking in a real-world context (e.g., the analysis for, preparation of, and review of the Navy Program Objective Memorandum and/or equivalent parts of the overall DOD program). Such assignment patterns should continue to expose such individuals throughout their careers to the world of analysis and trade-offs in which outcomes influence budgets and/or major programs.

23. In the past, some distinguished Navy four-star admirals have had Ph.D.s in "hard" disciplines as well as tours in Navy or DOD analysis.

The Temporal Issue: Near Term and Longer Term

There clearly exists a temporal issue with regard to obtaining the type of analytical support that the committee believes the top-level Navy decision makers should have. Actually obtaining such support would take time even if a decision to do so were made immediately. Thus, the committee suggests the following:

Recommendation 4: In the short term, the Chief of Naval Operations and the Secretary of the Navy should go outside their organizations to sharpen concepts and requirements, drawing on the external community of expert practitioners in analysis. Also, they should augment their in-house analytical capabilities in the short term by drawing on Intergovernmental Personnel Act assignments (and other individuals who could take leave from their home organizations), Federally Funded Research and Development Centers, and national and other nonprofit laboratories.