3
Examples of Processes Employed by Government and Industry for Providing Capability Planning and Analysis

INTRODUCTION

This chapter addresses part 2 of the terms of reference for this study: “Review various analytical methods, processes and models for large scale, complex domains like ISR [intelligence, surveillance, and reconnaissance] and identify best practices.” The chapter also discusses government and industry capability planning and analysis (CP&A)-like processes and associated tools, with the aim of identifying attributes and best practices that might be applied to the Air Force ISR CP&A process. The following sections provide brief descriptions of the processes and tools used by several organizations to showcase salient attributes and illustrate best practices of each. Appendix C contains descriptions of additional organizational processes and tools that do not appear in this chapter.1 At the end of the chapter, Table 3-4 correlates the findings in this chapter with best practices.

EXAMPLES OF GOVERNMENT PROCESSES FOR PROVIDING CAPABILITY PLANNING AND ANALYSIS

The scope of Air Force responsibilities to provide global, integrated ISR capabilities across strategic, operational, and tactical missions is extraordinarily broad

_______

1 The descriptions of the individual organizations’ CP&A-like processes and tools vary considerably and are the result of the intent to provide an unclassified report. Much of the information provided to the committee during its data gathering was classified or otherwise not releasable to the public.



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.

FIGURE 3-1 Analytic underpinnings of intelligence, surveillance, and reconnaissance (ISR) force sizing for the Army. NOTE: Acronyms are defined in the list in the front matter. SOURCE: LTG Richard Zahner, Deputy Chief of Staff, G-2, Headquarters, U.S. Army. "Military Intelligence Rebalance." Presentation to the committee, November 9, 2011.

and complex. Although the organizational processes described below apply, for the most part, to arenas with smaller scope and less complexity, each process was reviewed with the goal of identifying best practices and tools that the Air Force might consider incorporating into its own CP&A process.

U.S. Army

The U.S. Army developed a strategy to rebalance the Army Military Intelligence (MI) Force after a decade of intense ISR system development and deployment in support of operations in Iraq and Afghanistan.2 This protracted period at war resulted in many system deployments accomplished with great urgency as Quick Reaction Capabilities (QRC), depicted in Figure 3-1. The overarching strategy for Army Intelligence is to optimize core intelligence capabilities in support of Brigade Combat Teams (BCTs) and division and corps full-spectrum operations on a sustained Army Force Generation (ARFORGEN) cycle.3 Thus, the Army's approach relies principally on its own organic ISR capability rather than on Air Force or national capabilities. Looking to the future, the MI rebalance is intended to determine which capabilities are enduring, using a Doctrine, Organization, Training, Materiel, Leadership and Education, Personnel, and Facilities (DOTMLPF) assessment.4

FIGURE 3-2 The Army's intelligence, surveillance, and reconnaissance (ISR) requirements and information density generation in past and in present and future threat environments. SOURCE: LTG Richard Zahner, Deputy Chief of Staff, G-2, Headquarters, U.S. Army. "Military Intelligence Rebalance." Presentation to the committee, November 9, 2011.

Figure 3-2 identifies the Army's ISR requirements and information density generation for past threat environments compared with those for present and future threat environments. Cold War requirements were hierarchical and focused on the operational level, whereas contemporary requirements are networked, with a tactical focus. Additionally, for the most part, the Army has recently faced a benign air threat, as coalition forces enjoyed air superiority in Iraq and Afghanistan. This led the Army to focus ISR support more toward tactical units, which are at present and can be expected in the future to prosecute much of the fight.

In support of these decentralized and networked operations, Army Intelligence devised the Integrated Sensor Coverage Area (ISCA) construct, featuring three distinct ISR mission sets, shown in Figure 3-3: (1) Persistent Area Assessment (PAA), (2) Mission Overwatch (MO), and (3) Situation Development (SID).

_______

2 U.S. Army. A Strategy to Rebalance the Army MI Force--Major Themes and Concepts. Available at http://www.dami.army.pentagon.mil/site/G-2%20Vision/nDocs.aspx. Accessed February 29, 2012.
3 LTG Richard Zahner, Deputy Chief of Staff, G-2, Headquarters, U.S. Army. "Military Intelligence Rebalance." Presentation to the committee, November 9, 2011.
4 Ibid.

FIGURE 3-3 The Army's Integrated Sensor Coverage Area (ISCA) construct that defines three intelligence, surveillance, and reconnaissance (ISR) mission sets. SOURCE: LTG Richard Zahner, Deputy Chief of Staff, G-2, Headquarters, U.S. Army. "Military Intelligence Rebalance." Presentation to the committee, November 9, 2011.

The ISCA construct integrates all ISR collection in support of a maneuver battalion's mission, with sensors and platforms dynamically tailored by the BCT and battalion, synchronized by the unit's operational cycle. The "hourglass chart" (Figure 3-4) depicts how the ISCA construct is incorporated into the ISR capability development process, which translates the strategy into requirements and the requirements into sensors and platforms, with sensor and platform attributes that are summarized in Figure 3-5.

What is distinctive about this process is that the Army investment strategy to deliver ISR capabilities begins with a threat-and-environment-based definition of ISR requirements. These are then deconstructed into mission requirements and subsequently into three ISR mission sets, with variable sensing requirements. These sensing requirements are then translated into sensors and platforms that allow force-development options to be evaluated in a holistic, needs-based manner that is quantifiable, repeatable, transparent, and easy to explain. The selection of appropriate sensor and platform attributes provides a set of relevant and consistent criteria for defining and assessing both sensor and system performance, which, in turn, informs acquisition decisions. Characteristics applicable to airborne collection assets are summarized in Box 3-1.
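The translation from mission sets to sensing requirements to sensor-platform pairings lends itself to a simple, repeatable scoring scheme of the kind the process describes. The sketch below illustrates the idea in Python; the attribute names, thresholds, and candidate systems are invented for illustration and are not the Army's actual data.

```python
# Illustrative sketch (hypothetical data): deconstructing ISCA mission sets into
# sensing requirements and scoring candidate sensor/platform pairings against
# them, so options can be compared in a quantifiable, repeatable way.

# Sensing requirements per ISCA mission set (invented attribute thresholds).
MISSION_REQUIREMENTS = {
    "PAA": {"endurance_hr": 12, "resolution_m": 1.0},  # Persistent Area Assessment
    "MO":  {"endurance_hr": 6,  "resolution_m": 0.5},  # Mission Overwatch
    "SID": {"endurance_hr": 3,  "resolution_m": 0.3},  # Situation Development
}

# Candidate sensor/platform aggregations (also invented).
CANDIDATES = [
    {"name": "EO sensor on long-endurance UAS",   "endurance_hr": 20, "resolution_m": 0.4},
    {"name": "SAR sensor on medium-altitude UAS", "endurance_hr": 10, "resolution_m": 0.8},
    {"name": "FMV sensor on manned aircraft",     "endurance_hr": 4,  "resolution_m": 0.2},
]

def satisfies(candidate, reqs):
    """A pairing covers a mission set if it meets endurance and resolution thresholds."""
    return (candidate["endurance_hr"] >= reqs["endurance_hr"]
            and candidate["resolution_m"] <= reqs["resolution_m"])

def coverage(candidate):
    """Fraction of mission sets a candidate covers: a simple, repeatable score."""
    met = [m for m, reqs in MISSION_REQUIREMENTS.items() if satisfies(candidate, reqs)]
    return len(met) / len(MISSION_REQUIREMENTS), met

for c in CANDIDATES:
    score, met = coverage(c)
    print(f"{c['name']}: covers {met} (score {score:.2f})")
```

Because the thresholds and the scoring rule are explicit, the same evaluation can be rerun unchanged as candidates or requirements evolve, which is the transparency-and-repeatability property the text attributes to the Army process.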

FIGURE 3-4 The Army's intelligence, surveillance, and reconnaissance (ISR) capability development process. SOURCE: LTG Richard Zahner, Deputy Chief of Staff, G-2, Headquarters, U.S. Army. "Military Intelligence Rebalance." Presentation to the committee, November 9, 2011.

FIGURE 3-5 The Army's Integrated Sensor Coverage Area (ISCA) functions and intelligence, surveillance, and reconnaissance (ISR) mission requirements. SOURCE: LTG Richard Zahner, Deputy Chief of Staff, G-2, Headquarters, U.S. Army. "Military Intelligence Rebalance." Presentation to the committee, November 9, 2011.

BOX 3-1
Characteristics of the Aerial Layer Construct

1. Sensors must be optimized to support Integrated Sensor Coverage Area (ISCA)-related information collection (Intelligence, Surveillance, and Reconnaissance [ISR]) operations of Persistent Area Assessment (PAA), Situation Development (SID), and Mission Overwatch (MO).
2. The appropriate multiple-intelligence sensor array must be resident on dedicated ISR platforms to meet intelligence requirements associated with unified land operations (formerly, full-spectrum operations).
3. Intelligence sensors must be assigned (and possess the resolution requirements) to platforms that possess the endurance to support the specific requirements of the ISCA concept--PAA, SID, and MO.
4. Intelligence captured by these sensors must be accessible by forward-deployed processing, exploitation, and dissemination (PED) elements (in the case of Intelligence 2020 concepts, in the PED Company of the Military Intelligence [MI] Pursuit and Exploitation Battalion, the PED detachment of the MI Brigade, and the proposed PED element located in the Aerial Exploitation Battalion co-located at the Corps Headquarters). In turn, these PED "platforms" are linked in the Intelligence Readiness Operations Capability (IROC) network, which will provide intelligence overwatch for deployed units as well as expand analytical and intelligence exploitation opportunities.

Finding 3-1. The U.S. Army's ISCA construct uses a process that links requirements analysis with force development and portfolio management in a way that helps synchronize planning and execution. Keys to this linkage are the ISCA analytical underpinnings and the methodology that enables sensor-platform aggregations. Additionally, the ISCA construct uses measured performance to inform acquisition decisions in a manner that lends transparency, responsiveness, and repeatability.

U.S. Navy

The overall U.S. Navy (USN) requirement-generation process is governed by Secretary of the Navy Instruction 5000.2E and defines a capabilities-based approach to developing and delivering technically sound, sustainable, and affordable military capabilities.5 The process is implemented by means of the Naval Capabilities Development Process (NCDP), the Expeditionary Force Development System (EFDS), and the Joint Capabilities Integration and Development System (JCIDS) to identify and prioritize capability gaps and integrated DOTMLPF solutions. The Chief of Naval Operations (CNO) is the user representative for executing actions to identify, define, validate, assess the affordability of, and prioritize required mission capabilities through JCIDS, allocating resources to meet requirements through the Planning, Programming, Budgeting, and Execution System (PPBES). For ISR capabilities and requirements, Navy N2/6 coordinates with the Office of the CNO (N81) throughout the process.

FIGURE 3-6 Department of the Navy requirements and acquisition, two-pass, six-gate process. SOURCE: Paul Siegrist, N2N6F2 ISR Capabilities Division. Personal communication to the committee, March 6, 2012.

The NCDP creates the Integrated Capabilities Plan, which translates strategic guidance and operational concepts to specific warfighting capabilities. The Navy uses two flag-level forums--the Naval Capabilities Board and the Resources and Requirements Review Board--to review and endorse all JCIDS proposals and documents. In translating requirements to operational capability, the Navy employs the two-pass, six-gate process depicted in Figure 3-6. This process ensures alignment between service-generated capability requirements and systems acquisition.

_______

5 USN. 2011. Department of the Navy Implementation and Operation of the Defense Acquisition System and the Joint Capabilities Integration and Development System. September 1. Available at http://nawctsd.navair.navy.mil/Resources/Library/Acqguide/SNI5000.2E.pdf. Accessed April 13, 2012.
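A gated process of this kind can be thought of as an ordered sequence of reviews, each with entry criteria that must all be satisfied before a program advances. A minimal sketch follows; the criteria strings are condensed paraphrases invented for illustration, not the Navy's official gate checklists.

```python
# Minimal sketch of a gated review sequence. Criteria are invented paraphrases;
# the actual Navy gate entry/exit criteria are far more detailed.

GATES = [
    ("Gate 1", ["ICD ready for joint review", "AOA study plan endorsed"]),
    ("Gate 2", ["AOA preferred alternative approved", "total ownership cost estimated"]),
    ("Gate 3", ["initial CDD and CONOPS approved", "full funding certified"]),
    ("Gate 4", ["system development strategy approved"]),
    ("Gate 5", ["EMD RFP ready for release"]),
    ("Gate 6", ["EMD contract awarded", "initial baseline review complete"]),
]

def run_gates(completed):
    """Advance through gates in order; stop at the first gate whose criteria are not all met."""
    passed = []
    for gate, criteria in GATES:
        if all(c in completed for c in criteria):
            passed.append(gate)
        else:
            break  # a program cannot skip ahead of an unsatisfied gate
    return passed

done = {"ICD ready for joint review", "AOA study plan endorsed",
        "AOA preferred alternative approved", "total ownership cost estimated"}
print(run_gates(done))  # passes Gates 1 and 2, halts at Gate 3
```

The point of the sketch is the ordering constraint: streamlining for urgent needs changes how quickly criteria are satisfied, not the sequence in which gates are cleared.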

It supports Program Objective Memorandum development as well as urgent need and rapid development in streamlined, tailored implementations. A brief description of this process follows:

Gate 1 reviews and grants authority for the Initial Capabilities Document submission to joint review, validates the proposed analysis of alternatives (AOA) study guidance, endorses the AOA study plan, and authorizes continuation to the Material Development Decision.

Gate 2 reviews AOA assumptions and the total ownership cost estimate, approves the AOA preferred alternative, approves the creation of the Capabilities Development Document (CDD) and Concept of Operations (CONOPS), approves the initial Key Performance Parameters and Key System Attributes, reviews program health, and authorizes the program to proceed to Gate 3 prior to Milestone A.

Gate 3 approves the initial CDD and CONOPS; supports the development of the service cost position; reviews technology development and system engineering plans; provides full funding certification; validates requirements traceability; considers the use of new or modified command, control, communications, computers, and intelligence (C4I) systems; reviews program health; and grants approval to continue to Milestone A.

Gate 4 approves a formal system development strategy and authorizes programs to proceed to Gate 5 or Milestone B.

Gate 5 ensures readiness for Milestone Decision Authority (MDA) approval and release of the formal Engineering and Manufacturing Development (EMD) Request for Proposal to industry, provides full funding certification, and reviews program health and risk.

Gate 6 follows the award of the EMD contract and satisfactory completion of the initial baseline review, assessing the overall health of the program. Reviews at Gate 6 also endorse or approve the Capabilities Production Document, assess program sufficiency and health prior to full-rate production, and evaluate sustainment throughout the program life cycle.

In summary, the Navy's process, when applied to urgent needs, involves streamlining and tailoring requirements, assessing options more rapidly than the normal process allows, and expediting technical, programmatic, and financial decisions as well as procurement and contracting.

Finding 3-2. The U.S. Navy's capability-based process is collaborative across the Department of the Navy and is synchronized with the PPBES and system acquisition life cycles. The process can be streamlined to address urgent needs. The process deals largely with naval requirements; utilizes existing PCPAD

(planning and direction, collection, processing and exploitation, analysis and production, and dissemination)/TCPED (tasking, collecting, processing, exploitation, and dissemination) architectures; and connects with other ISR enterprise providers through the Office of the Under Secretary of Defense for Intelligence (OUSD[I]).

Office of the Under Secretary of Defense for Intelligence

The mission and vision of the OUSD(I) require an integrated approach to ISR across the Department of Defense (DoD): a global and horizontally integrated DoD intelligence capability consisting of highly qualified professionals and skilled leaders employing advanced technologies dedicated to supporting the needs of the warfighter and the Director of National Intelligence (DNI).6 The OUSD(I) for Portfolios, Programs and Resources (PP&R) oversees the development and execution of a balanced portfolio of military and national intelligence capabilities.7 Toward this end, the Battlespace Awareness (BA) portfolio builds the ISR investment strategy by balancing capabilities across TCPED, as shown in Figure 3-7.8

As shown, the OUSD(I) process leading from national-level strategy to budget decisions involves numerous organizations and staffs. Capability needs are derived from national-level defense and intelligence guidance and strategy. These needs are translated into an ISR investment strategy, with a portfolio of programs constructed and shaped to provide an optimal mix of capabilities for TCPED and analysis, given political, budgetary, and national security realities. Success depends on an understanding of top-level priorities, knowledge of ISR requirements and system capabilities, open communication (transparency), and effective collaboration among the participants.

Within the OUSD(I), the Director, Battlespace Awareness and Portfolio Assessment (BAPA) has responsibility for assessing and recommending the optimal mix of BA capabilities to the warfighter. Figure 3-8 shows a number of key activities conducted by the BAPA staff to support portfolio development and their relationship to the PPBES process.

_______

6 DoD. 2005. "Under Secretary of Defense for Intelligence (USD(I))." Directive 5143.01. Available at http://www.fas.org/irp/DoDdir/DoD/d5143_01.pdf. Accessed February 28, 2012.
7 DoD. 2010. "Fiscal Year 2011 Budget Estimates. Office of the Secretary of Defense (OSD)." Washington, D.C.: Department of Defense. Available at http://comptroller.defense.gov/defbudget/fy2011/budget_justification/pdfs/01_Operation_and_Maintenance/O_M_VOL_1_PARTS/OSD_FY11.pdf. Accessed February 28, 2012.
8 Col Anthony Lombardo, Deputy Director, ISR Programs, Agency Acquisition Oversight, Office of the Under Secretary of Defense (Intelligence). "OUSD(I) Overview." Presentation to the committee, October 7, 2011.
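Constructing a portfolio that balances capabilities across TCPED under fiscal constraints is, at its core, a constrained selection problem. The toy sketch below picks the affordable program mix with the highest total value, breaking ties toward broader TCPED coverage; all program names, costs, and values are hypothetical, and real portfolio assessment weighs many more factors (risk, schedule, policy).

```python
# Toy sketch of balancing a portfolio across TCPED segments under a budget cap.
# Programs, costs, and values are invented for illustration.
from itertools import combinations

PROGRAMS = [
    # (name, TCPED segment, cost, capability value)
    ("Sensor upgrade",     "collecting",    4, 7),
    ("Ground station mod", "processing",    3, 5),
    ("Analyst tools",      "exploitation",  2, 4),
    ("Comms relay",        "dissemination", 3, 4),
]

def best_mix(budget):
    """Exhaustively pick the affordable subset with the highest total value,
    preferring mixes that touch more TCPED segments on ties."""
    best, best_key = (), (-1, -1)
    for r in range(len(PROGRAMS) + 1):
        for mix in combinations(PROGRAMS, r):
            cost = sum(p[2] for p in mix)
            if cost > budget:
                continue
            value = sum(p[3] for p in mix)
            breadth = len({p[1] for p in mix})  # distinct TCPED segments covered
            if (value, breadth) > best_key:
                best, best_key = mix, (value, breadth)
    return [p[0] for p in best], best_key[0]

mix, value = best_mix(budget=9)
print(mix, value)
```

Exhaustive search is fine at this toy scale; a real portfolio of hundreds of programs would call for integer programming or heuristics, but the shape of the trade, value maximized against a budget with coverage across the TCPED chain, is the same.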

FIGURE 3-7 Office of the Under Secretary of Defense for Intelligence (OUSD[I]) strategy to budget involves numerous organizations and staffs. NOTE: Acronyms are defined in the list in the front matter. SOURCE: Col Anthony Lombardo, Deputy Director, ISR Programs, Agency Acquisition Oversight, Office of the Under Secretary of Defense (Intelligence). "OUSD(I) Overview." Presentation to the committee, October 7, 2011.

BAPA develops the ISR roadmap every 2 years, as directed by Congress. The roadmap provides information on the current ISR portfolio, its ability to meet national and defense intelligence strategies, and how the portfolio will change to remain relevant and maximize capability. Individual systems are addressed, but the overarching goal is to evaluate the portfolio and an integrated architecture.

The Consolidated Intelligence Guidance gives both the DNI and OUSD(I) programmatic and budgetary guidance for programs and budgets that fall under the National Intelligence Program (NIP) and Military Intelligence Program (MIP), and it provides strategic priorities, program guidance, and areas in which to assume risk. It also directs studies when necessary to help resolve programmatic questions and uncertainties.

A significant amount of analysis underpins the portfolio assessment process. The analysis comes in various forms, from major studies with cross-community participation, to Capability Area Deep Dives (CADDs), which are relatively short, intense assessments of specific issues led by the BAPA staff. Assessment efforts feed focus area teams, which are organized by domain (i.e., sea, air, and space) and help frame BA portfolio issues that need resolution.

The OUSD(I) for PP&R recognizes that current processes for prioritizing needs and analyzing risk should be improved in order to address acknowledged

FIGURE 3-8 Office of the Under Secretary of Defense for Intelligence (OUSD[I]) and the Planning, Programming, Budgeting, and Execution System (PPBES) process. NOTE: Acronyms are defined in the list in the front matter. SOURCE: Col Anthony Lombardo, Deputy Director, ISR Programs, Agency Acquisition Oversight, Office of the Under Secretary of Defense (Intelligence). "OUSD(I) Overview." Presentation to the committee, October 7, 2011.

shortfalls, which include the following: (1) little consideration of trade-offs among cost, schedule, and performance; (2) no prioritization across portfolios and little to no risk analysis; (3) the overly bureaucratic and time-consuming nature of the processes; and (4) the impact on shaping the force. The OUSD(I) for PP&R also seeks a more dynamic and iterative process throughout a program's life cycle--one that will revisit validated requirements when necessary and adjust to strategy shifts and changes in the threat, considerations that are very timely.9 Additionally, the OUSD(I) for PP&R has recommended changes to the Joint Requirements Oversight Council (JROC) and Functional Capabilities Boards (FCBs).10

Although OUSD(I) does not have a "standard" modeling and simulation (M&S) tool kit per se, it leverages tools and Systems Engineering and Technical Assistance (SETA) support developed by Federally Funded Research and Development Centers, contractors, the services, and the Office of the Secretary of Defense (OSD) on a case-by-case basis to address specific questions (see Box 3-2). For example, the Satellite Tool Kit (STK) and the Satellite Orbit Analysis Program (SOAP) have been used primarily to help leadership and decision makers visualize overhead ISR systems and evaluate

_______

9 Ibid.
10 Ibid.

FIGURE 3-11 TASC's solution to analyzing complex domains begins with a layered, iterative approach, segregating and describing architectures within and across missions. NOTE: Acronyms are defined in the list in the front matter. SOURCE: Doug Owens, Manager, Enterprise Analysis, Defense Business Unit, TASC. "An Enterprise Approach to Capability-Based Analysis: Best Practices, Tools, and Results." Presentation to the committee, January 5, 2012.

various COTS/government off-the-shelf (GOTS) and TASC tools to the analysis of large-scale, complex domains.15 The TASC solution to analyzing complex domains begins with a layered, iterative approach, segregating and describing architectures within and across missions, as shown in Figure 3-11. In this way, TASC described a quantifiable analysis across complex domains, informed by affordability, and with traceability from requirements to decision outcomes.

The basic premise of the TASC approach is that complex domains of capability can be analyzed from different perspectives with tailored models and tools appropriate for each perspective, and the various segments of the analysis are integrated to provide traceability of cause and effect for the combined total impact, as shown in Figure 3-12.16 For the ISR mission area, those perspectives include the following: sensor and collection platform performance; the network topology and connectivity that enable the overall ISR mission; the command and control of the various assets; the communications capabilities and allocations; the vulnerabilities of the information architecture for the command, control, communications, and computers (C4) capabilities that enable ISR; the processes for TCPED information in

_______

15 Doug Owens, Manager, Enterprise Analysis, Defense Business Unit, TASC. "An Enterprise Approach to Capability-Based Analysis: Best Practices, Tools, and Results." Presentation to the committee, January 5, 2012.
16 Ibid.

FIGURE 3-12 TASC's approach to Multi-Resolution Analysis (MRA) integrates various perspectives. SOURCE: Doug Owens, Manager, Enterprise Analysis, Defense Business Unit, TASC. "An Enterprise Approach to Capability-Based Analysis: Best Practices, Tools, and Results." Presentation to the committee, January 5, 2012.

support of operations; and the manpower and infrastructure through which the ISR missions are accomplished.17 MRA enables the examination of each of these elements within an integrated capability context using an interactive, iterative flow through the analysis. Multi-criteria methods are then correlated to cost estimating and program risk analysis, cost profiling, organization assessments, and six-sigma process improvement. The use of full-spectrum analytics within an integrated, interactive process combines the science of systems engineering and systems integration with decision making.

TASC executes full-spectrum, cross-domain, multi-resolution analyses using a variety of GOTS, COTS, and custom tools to address the command, control, communications, computers, intelligence, surveillance, and reconnaissance (C4ISR) enterprise. For example, GOTS/COTS tools like ADIT (Advanced Data Integration Toolkit) for multi-intelligence data fusion analysis, GeoViz for geolocation performance analysis, STK for tracks and orbital coverage analysis, SEAS (System Effectiveness Analysis Simulation) for ISR mission effects and CONOPS development, JIMM (Joint Integrated Mission Model) for integrated operations analysis and detailed constructive analysis, and other tools provide the physics-based analysis for the quantification of capability impacts.

A sample of TASC custom tools and processes for ISR capability analysis is shown in Table 3-2. Among them are ANII™ and a-MIND™, which allow the

_______

17 Ibid.

TABLE 3-2 Sample of Tools, with Products and Benefits of Each, Employed by TASC

JFORCES: Joint Force Operational Readiness Combat Effectiveness Simulator (TASC). Category: Mission Utility. Tool product: stochastic and deterministic simulation. Benefits: provides robust analysis of total capability; archives every element of a simulation to allow analysts to create new metrics to explore issues without re-executing the simulation.

a-MIND™ with ANII™ Process: automated Mission Impact of Network Design (TASC). Category: Processing, Exploitation, Dissemination (PED) Analysis (communications, data integration, decryption, language translation, data reduction). Tool product: statistical relational models; mission impact of network design; cyber impacts on mission effectiveness; cyber mitigation options. Benefits: reduces analysis cycle time by means of rapid diagnostic evaluation of a network or architecture for mission impacts, identifying alternative structures from potentially millions of options; quantifies the correlation of networks to missions.

TPAT: TCPED Process Assessment Tool, built in ExtendSim (TASC). Category: General simulation. Tool product: visualization of process. Benefits: provides a quick reference point to identify bottlenecks and inefficiencies; enables rapid exploration of process options.

CACI: Collection Architecture Capability Influence (TASC). Tool product: inference modeling (infers potential changes in outcome or effects from possible variations in metric results). Benefits: reduces architecture costs by identifying key elements and indifferent elements, allowing capability development and selection to focus on critical pieces.

Tasking to Value (T2V): Geospatial Modeling Environment (TASC). Category: Simulation. Benefits: increased operational reality and ability to assess ISR operations effectiveness.

continued

OCR for page 47
66 C a pa b i l i t y P l a n n i n g and A na ly s i s to O p t i m i z e A i r F o r c e ISR TABLE 3-2 Continued Tool Name Category Tool Product Benefits SMART: Integrated Decision Multi-attribute Utility Analysis Reduces decision Strategic Multi-Attribute Aides complexities and provides Resource Tool real-time decision-maker (TASC) interaction with metric data to explore trade space of options. QATO: Excel-based statistical model Rapid correlation of Quick Automated Tool for metrics and cost profiles, Optimization reducing programming (TASC) trades and impact analysis of program decisions. MIATI: Asset allocation Reduces time and cost Multi-theater Integrated of analysis by quickly Allocation Tool for ISR narrowing the trade space (TASC) of asset management. H-BEAM with MESA Architecture Analysis Technical performance Consolidated display of Process metrics of system and family architecture effects and Horse Blanket Enterprise of systems effectiveness. contributing elements. Architecture Methodology Assessment of system impacts (TASC) to mission effects as scoping analysis for subsequent detailed Mission Utility Analysis (MUA). CERA: Financial and Business System and family of systems Develop credible Cost Estimating and Risk Analytics cost estimates and profiles; estimates and profiles for Analysis Process correlation of costs to metric systems and families of (TASC) performance from PCA, ANIITM, capability. Assess risks of or MUA programmatic changes on capability effects. NOTE: For a more complete list of tools used by both government and industry, see Appendix C in this report. SOURCE: TASC. 2012. Written communication. Industry and Government ISR Tools and Processes. Response to inquiry from the committee. 
analysis of interconnected capabilities and associated cyberspace vulnerabilities in a C4ISR information architecture; Strategic Multi-Attribute Research Tool (SMART), which supports metric-driven decision analysis, including uncertainty and cost- benefit trade-offs; Collection Architecture Capability Influence (CACI), which supports risk analysis in collection architectures; and the Mission Engineering and Systems Analysis (MESA) process paired with the Horse Blanket Enterprise Architecture Methodology and visualization tool (H-BEAM), which graphically traces capability across an enterprise architecture, from strategic guidance and requirements, to systems, to architecture options, to capability impacts.
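The metric-driven decision analysis that SMART supports can be illustrated with a minimal weighted-sum multi-attribute utility sketch. The attributes, weights, option values, and single-attribute utility functions below are invented for illustration and are not TASC's actual model.

```python
def multiattribute_utility(option, weights, utilities):
    """Weighted-sum multi-attribute utility for one option.

    `weights` map each attribute to its importance (summing to 1), and
    `utilities` map each attribute to a single-attribute utility function
    returning a score between 0 and 1.
    """
    return sum(w * utilities[attr](option[attr]) for attr, w in weights.items())

weights = {"coverage": 0.5, "cost": 0.3, "risk": 0.2}
utilities = {
    "coverage": lambda x: x / 100.0,          # percent of target deck covered
    "cost": lambda x: 1.0 - min(x, 10) / 10,  # notional $B; cheaper is better
    "risk": lambda x: 1.0 - x,                # 0 (low risk) to 1 (high risk)
}
options = {
    "arch_A": {"coverage": 80, "cost": 6.0, "risk": 0.3},
    "arch_B": {"coverage": 95, "cost": 9.0, "risk": 0.5},
}
scores = {name: multiattribute_utility(o, weights, utilities)
          for name, o in options.items()}
print(scores)  # arch_A scores 0.66, edging out the costlier, riskier arch_B
```

Varying the weights and re-scoring is exactly the kind of real-time trade-space exploration with metric data that the SMART description emphasizes.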

In summary, TASC described to the committee the inherent challenge in trying to analyze system and information architectures concurrently within and across missions to plan for total capability effects. Specifically, the networked architectures are extremely complex, and the TASC solution is a layered analytic discipline to provide quantifiable analysis informed by affordability. TASC maintains that MRA manages this complexity while maintaining traceability of effects through engineering analysis, family-of-systems and architecture trade-offs, networked information and integrated C4 for ISR, mission utility effects, and decision and costing analysis. Further, MRA provides multiple views for decisions on system technical performance parameters, network connectivity and information vulnerabilities, family of capabilities, concepts of operations, policy, total capability versus cost trade-offs, operations planning, and asset allocation.18

Finding 3-5. TASC's capability-based assessment process employs MRA, which in turn allows the complexity of ISR to be handled in a straightforward, transparent, tailorable, scalable, repeatable manner, incorporating a suite of tools that are optimized for a specific purpose. Such an approach can support a wide range of decisions and decision time lines.

RadiantBlue, Inc.

RadiantBlue, Inc., is a specialized provider of information technology development, consulting, and program support services for the DoD and the intelligence community.19 As with TASC, RadiantBlue implements the mission utility analysis and physics-based capability and architecture assessment phases of an MRA process using its BlueSim tool, an ISR high-fidelity simulator with agile software that easily accommodates new assets, payloads, and requirements scenarios. Figure 3-13 illustrates a typical BlueSim model with various payload types and relevant vehicle subsystems for the ISR trade space.
RadiantBlue has used BlueSim to perform detailed analysis, including analysis in the following areas: space and air, sensor performance, flight profiling, attitude and orbitology, communications, TPED (tasking, processing, exploitation, and dissemination), collection satisfaction, force sizing, architecture, and visualization. The BlueSim simulator allows the integrated analysis of space, air, and ground systems, across integrated IMINT and SIGINT payloads, with cyberspace effects, against classic portfolios of ISR targets, or target decks, and vetted DoD scenarios.

18 Ibid.
19 More information on RadiantBlue's mission is available at http://www.radiantblue.com/about/. Accessed February 28, 2012.
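Collection satisfaction, one of the analysis areas listed above, can be illustrated with a minimal sketch: a greedy, priority-ordered scheduler that reports what fraction of a target deck's collection requirements fits within a fixed daily capacity. This is a toy stand-in for the far richer scheduling and access modeling a simulator like BlueSim performs; the target names, priorities, and capacity are hypothetical.

```python
def collection_satisfaction(target_deck, daily_capacity):
    """Fraction of the target deck satisfied given a fixed daily look capacity.

    Targets are served greedily in priority order; a target is satisfied
    only if all of its required looks fit in the remaining capacity.
    """
    remaining = daily_capacity
    satisfied = 0
    for tgt in sorted(target_deck, key=lambda t: t["priority"]):
        if tgt["looks_required"] <= remaining:
            remaining -= tgt["looks_required"]
            satisfied += 1
    return satisfied / len(target_deck)

deck = [
    {"name": "tgt1", "priority": 1, "looks_required": 4},
    {"name": "tgt2", "priority": 2, "looks_required": 3},
    {"name": "tgt3", "priority": 3, "looks_required": 5},
]
print(collection_satisfaction(deck, daily_capacity=8))  # tgt1 and tgt2 fit: 2 of 3
```

Sweeping `daily_capacity` (a proxy for force size) over such a metric is the shape of the force-sizing trades described in the text.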

FIGURE 3-13 RadiantBlue's BlueSim accurately models the full depth of key subsystem and payload aspects of intelligence, surveillance, and reconnaissance (ISR) collection. NOTE: Acronyms are defined in the list in the front matter. SOURCE: Larry Shand, President, RadiantBlue, Inc. "RadiantBlue Modeling and Simulation Capabilities." Presentation to the committee, January 5, 2012.

Government users have employed BlueSim across the full range of analysis, from large architecture studies to detailed collection planning studies. Likewise, industry users have employed BlueSim for diverse applications, including ISR architectures, system performance, predictive simulations, and detailed system and payload studies. Table 3-3 describes the tools in the RadiantBlue tool set as well as their products and benefits. The following sections describe RadiantBlue's study process (shown in Figure 3-14).20

Essential Study Documents in Place, Study Kickoff, Study Trade-Space Definition

The first three steps illustrated in the RadiantBlue process description define the details of the requested study, including study tasks, the study trade space, and desired outcomes, as well as a set of study messages and themes to guide the development of study output products. The trade space details the systems, payload variations, architectures, CONOPS, target decks, and analysis vignettes simulated

20 RadiantBlue. February 7, 2012. Written communication to the committee.

TABLE 3-3 Description of Tools Employed by RadiantBlue Tool Set, with Products and Benefits

- BlueSim. Category: Mission Utility; Tool: Simulation. Product and Benefits: Complete ISR mission effectiveness and utility, measured in terms of classic ISR mission utility parameters (points and areas per day) or lower-level military/tactical utility parameters such as the number or percentage of key enemy behaviors detected.

- BlueSim supporting models. Category: Physics-Based Capability; Tool: Simulation. Product and Benefits: Highly detailed physics-based models that simulate key ISR payload, system, and architecture features. These supporting phenomenology tools provide detailed inputs to BlueSim to enable mission- and architecture-level analysis with rigorous physics-based underpinnings.

- BlueSim. Category: Architecture Analysis; Tool: Simulation. Product and Benefits: Combines large architecture analysis and accurate physics-based modeling. Unique features: integrated cross-system tipping and cueing, the ability to model graduations between unified and stovepiped tasking systems, integrated dynamic tasking, and the ability to model intelligence gained through collection (information model).

- BlueSim Ground Model. Category: Processing, Exploitation, Dissemination (PED) Analysis; Communications; Data Integration; Decryption; Language Translation; Data Reduction; Tool: Simulation. Product and Benefits: Enables PED analysis either combined with or separate from the ISR collection modeling; includes features to model cyber and input/output (IO) effects such as intermittent outages, random packet loss, link degradation, node failure, and automatic communications. Includes all of the key ISR wideband network node types, including routing, processing, storage area network (SAN) storage, and exploitation.

NOTE: For a more complete list of tools used by both government and industry, see Appendix C in this report. SOURCE: RadiantBlue. 2012. Written communication. Response to inquiry from the committee.
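The degraded-link effects the BlueSim Ground Model captures (intermittent outages, random packet loss) can be sketched with a small Monte Carlo toy. This is not RadiantBlue's model; the loss probability, outage window, and product counts are invented for illustration.

```python
import random

def delivered_fraction(n_products, loss_prob, outage_windows, seed=1):
    """Monte Carlo sketch of product dissemination over a degraded link.

    Each product sent at time step t is dropped outright if t falls in an
    outage window, and otherwise lost with probability `loss_prob`.
    """
    rng = random.Random(seed)
    delivered = 0
    for t in range(n_products):
        in_outage = any(start <= t < end for start, end in outage_windows)
        if not in_outage and rng.random() >= loss_prob:
            delivered += 1
    return delivered / n_products

# Clean link vs. one with 5% packet loss and an outage from t=20 to t=40.
print(delivered_fraction(100, 0.0, []))          # 1.0
print(delivered_fraction(100, 0.05, [(20, 40)]))
```

Comparing such metrics with and without degradation is the shape of the cyber-effects analysis the table describes, scaled down to a few lines.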

in various combinations to create the raw quantitative data that will serve as the basis of the technical analysis to be conducted in the study. Also part of the detailed study definition are the measures of performance (MOPs), measures of effectiveness (MOEs), and measures of utility (MOUs) that are used to quantify trade-space performance and allow analysis thread comparisons as the study matures.

FIGURE 3-14 RadiantBlue process description. The figure charts the study flow from essential documents in place, study kickoff, and study trade-space definition, through preliminary trade-space execution, internal model and software assessments, engineering technical exchange meetings, software integration and test, and progress review, to production trade-space setup and execution, study product refinement, final study product completion, and customer delivery, with multiple iterations of these process steps with customer involvement as required; a case-run matrix tracks the status of each simulation case. SOURCE: RadiantBlue. 2012. Written communication to the committee.

Preliminary Trade-Space Execution, Internal Model Assessment, Internal Software Assessment

With the study trade space fully defined, RadiantBlue then uses an extensive pre-existing library of ISR system models, target decks, and vignettes to allow a study team to take existing, simulator-ready data and build significant portions of a trade space to begin runs immediately. These three parallel activities (trade-space, model, and software assessments) provide analytical products that drive the next several process steps in an iterative and collaborative manner with the larger study team.

Engineering Technical Exchange Meetings, Software Integration and Test, and Progress Review

Each of the analytical products is subjected to internal model assessments and internal software reviews in which the preliminary trade-space run data results are reviewed, and a new version of the simulator is provided. Then a progress review is

conducted with the prime contractor or government point of contact (POC) during which the preliminary trade-space results are discussed, sample products are reviewed, and the status with respect to model additions and changes is reviewed. This is also an opportunity for the customer POC to refine or alter the direction of the study on the basis of these preliminary results and/or programmatic, financial, or political developments outside the study team. At the end of this process step, the study team is put on a refined vector for where to take the study in terms of priorities and trade-space definition.

Production Trade Space, Study Products, Final Study Products

With the refined study direction vector, updated executable(s), and updated models, the study team then sets up the full production trade space that includes all of the key models, target decks, vignettes, CONOPS, and software features. As the simulation data emerge from the trade-space runs, the MOP, MOE, and MOU products can be developed to address the messages and themes that were defined in the study kickoff phase. Technical measures can be refined or replaced as needed, and these then feed modifications to the messages and themes as required. In conjunction with refining technical measures, detailed low-level analysis of the simulator output data is conducted to make sure that the macro-level trends that are emerging are supported by coherent physics- and technology-based micro-level system behaviors. The final study products are then formulated to meet the desired study outbriefing plan and the internal needs of the study team. These final study products are typically developed collaboratively and iteratively with the prime contractor or government POC and other key members of the larger study team as required.
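A production trade space of the kind described above is, at bottom, a full-factorial case-run matrix over the study's factors. The enumeration can be sketched mechanically; the factor names and levels below are illustrative only, loosely echoing the scenario, ground-model, payload, and constellation variations in RadiantBlue's case naming, and are not drawn from an actual study.

```python
from itertools import product

def build_case_matrix(factors):
    """Enumerate a full-factorial trade-space case-run matrix.

    One case per combination of factor levels; `factors` maps each factor
    name to its list of levels.
    """
    names = list(factors)
    return [dict(zip(names, levels)) for levels in product(*factors.values())]

factors = {
    "scenario": ["Afghanistan", "Iran"],
    "ground_model": ["GBOFF", "GBON"],
    "payload": ["MQ4", "MQ445"],
    "constellation": ["1Ball", "2Ball"],
}
cases = build_case_matrix(factors)
print(len(cases))  # 2 * 2 * 2 * 2 = 16 cases
print("_".join(cases[0].values()))  # Afghanistan_GBOFF_MQ4_1Ball
```

Each generated case then becomes one simulator run whose outputs feed the MOP, MOE, and MOU products; tracking run status per case is what the matrix in Figure 3-14 does.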
Summary

RadiantBlue works collaboratively and iteratively with the customer to refine the details of the desired study.21 Continued collaboration with customer subject-matter experts serves to detail and enhance the understanding of the trade space, including systems, payload variations, architectures, CONOPS, target decks, analysis vignettes, MOPs, MOEs, and MOUs. With the trade space clearly defined, RadiantBlue conducts analysis of the trade space through multiple simulation runs of scenarios and vignettes. Iterative sessions between the customer and RadiantBlue serve to refine tools and scenarios, ultimately leading to study results and

21 RadiantBlue. February 7, 2012. Written communication to the committee.

completion.22 Finally, RadiantBlue's process (using the BlueSim simulator) requires iterative customer engagement and collaboration between operators and analysts and is supported by a large, pre-existing model library of air and space systems.

RadiantBlue provided for the committee a second industry example of how a very complex set of assets and vignettes can be evaluated iteratively through an MRA process that is thoroughly documented for transparency, accuracy, and repeatability and can be tailored and scaled to customer desires.23 Both TASC and RadiantBlue identified analysis approaches that are responsive to their customers' needs by taking full consideration of ISR assets and trade-offs across the enterprise, spanning air, space, and, to a lesser extent, cyber effects. Most helpful is the approach of pairing physics-based, layered analysis tools with cost-estimating and risk-analysis trade-offs, along with cost projections over various planning horizons (e.g., Analysis of Alternatives and Program Objective Memorandums), when implementing full-spectrum MRA.

Finding 3-6. RadiantBlue's modeling, simulation, and analysis capability focuses on the physics-based capability and architecture analysis and mission utility analysis found in MRA. The BlueSim tool, combined with RadiantBlue's methodology, has been used to successfully support trade-space studies of various ISR and processing, exploitation, and dissemination (PED) architectures.

CONCLUDING THOUGHTS

In reviewing the government and industry CP&A-like processes described in this chapter, the committee found that multiple tools, including both commercial-off-the-shelf and proprietary tools, are utilized effectively across government and industry for modeling, simulation, and analysis, and that "one size does not fit all."
Second, none of the non-Air Force CP&A-like processes reviewed adequately addresses the emergent challenges posed by the cyberspace domain. Third, most of the non-Air Force CP&A-like processes reviewed do not adequately deal with the complexity of PCPAD, which, in turn, can affect cost, performance, and schedule. This latter issue can also result in capabilities that are not end to end and can contribute to information and data that cannot be shared, correlated, or fused by users or customers. Finally, the objective of considering a wide range of government and industry CP&A-like processes was to gain insight into potential best practices to incorporate into this study's overall recommendations. Table 3-4 maps findings to these best practices.

22 More information on RadiantBlue's methodology is available at http://www.radiantblue.com/solutions/software-development/. Accessed February 28, 2012.
23 Congressional professional staff members who spoke with the committee identified RadiantBlue as the best modeling organization at the architecture level.

TABLE 3-4 Best Practices and Corresponding Findings

- Process includes consideration of "enterprise" ISR systems and/or capability. (Findings 3-3, 3-4, 3-5, and 3-6)
- Process is transparent, responsive, scalable, and repeatable. (Findings 3-1, 3-4, 3-5, and 3-6)
- Process is underpinned by multi-resolution-like analysis, modeling, and simulation. (Findings 3-4, 3-5, and 3-6)
- Process is collaborative and links planning, acquisition, and operations. (Findings 3-1 and 3-2)
- Process is informed by operational metrics. (Finding 3-1)
- Process incorporates network/PCPAD/TCPED architectures and cyberspace considerations. (Findings 3-2, 3-3, 3-4, 3-5, and 3-6)

NOTE: Acronyms are defined in the list in the front matter.