Analysis in the U.S. Intelligence Community: Missions, Masters, and Methods
The intelligence establishment of the United States is a vast enterprise with more than a dozen agencies, roughly 100,000 employees (Sanders, 2008), and a budget larger than the gross domestic product of many nations.1 Approximately 20 percent of the employees are analysts,2 a category that subsumes photo interpreters, those who interpret intercepted signals, specialists on foreign military systems, and a number of other specialists in addition to those who analyze political, economic, societal, and other security-related developments. All are members of the intelligence community (IC), but their missions, customers, professional identities, and organizational cultures are to a substantial extent determined by the agency (or agency component) to which they are assigned.3 They work on different kinds of problems for diverse sets of institutional and individual customers. The diversity of missions and masters has resulted in a pluralistic structure with sensible—if not always optimal—divisions of labor and professional specialization.
1. The National Intelligence Program budget for fiscal year 2008 was $47.5 billion (Office of the Director of National Intelligence, 2009).
2. The approximate percentage of analysts is based on the number of analysts listed in the Analytic Resources Catalog and the total number of military and civilian U.S. government personnel working in the IC (Sanders, 2008).
3. For descriptions of IC organizations and their primary missions, see Members of the Intelligence Community at http://www.dni.gov/members_IC.htm [accessed December 2009], 2009 National Intelligence: A Consumer’s Guide at http://www.dni.gov/IC_Consumers_Guide_2009.pdf [accessed December 2009], and An Overview of the United States Intelligence Community for the 111th Congress at http://www.dni.gov/overview.pdf [accessed December 2009].
This essay is intended to set the stage for the discipline- and field-specific essays of the other contributors. It seeks to identify key characteristics of the IC and to explicate, albeit in abbreviated fashion, why the IC is organized as it is and how mission, expectations, and structure empower and constrain the work of individuals, agencies, and the IC as a whole.
ANALYTIC MISSION OF THE INTELLIGENCE ENTERPRISE
The mission of intelligence analysis is to evaluate, integrate, and interpret information in order to provide warning, reduce uncertainty, and identify opportunities. Providing insight on trends, the political calculus of particular foreign leaders, or the way problems are perceived by people outside the United States is often more helpful to decision makers than is the presentation of additional “facts” or speculation about “worst case” possibilities.4 Discovering that a country is cheating on a treaty commitment may be less important than providing insight into why it is doing so.5 Ferreting out all details of an adversary’s new weapon system may be less useful than finding a vulnerability that can be exploited. Prompting decision makers to rethink their own assumptions and preliminary judgments may be more beneficial to the national security enterprise than providing definitive answers to specific questions.6
Intelligence, especially analytic support, is useful to decision makers in direct proportion to the degree to which it is timely, targeted, and trusted by those who receive it. Thorough examination of all relevant factors and how they interact is seldom possible within the real-world decision timelines of U.S. officials, and getting it completely right is often less important than providing useful information and insights to the right people at the right time. Even data-rich and methodologically brilliant analytic products may contribute little to the national security enterprise they are supposed to support if they are prepared without understanding the knowledge, timelines, and objectives of officials working on the issue.7
In addition to being factually accurate, intelligence analysis must be—and be seen to be—both objective and germane to the needs of those for whom it is intended. The importance of tailored support is one of the reasons the U.S. intelligence enterprise has so many different and somewhat specialized components. Oversimplifying greatly, the 16 constituent agencies—with 19 analytic components counting the National Intelligence Council, National Counterintelligence Executive, and National Counterterrorism Center—exist because each serves a distinct, and in some respects unique, set of customers and missions. Each has developed expertise and analytic tools to meet the needs of its primary customers. Those customers have confidence in the work performed by “their” intelligence unit because they know the people and routinely find the work it produces to be more useful than that provided by analysts elsewhere who perforce are less well attuned to the specific intelligence requirements of the parent department.8
FORM FOLLOWS FUNCTION
Legacy arrangements whereby individual and institutional customers rely primarily on analysts and agencies that look at issues and intelligence through lenses keyed to their own mission requirements are logical and often sufficient to meet core requirements. Indeed, the approach adopted by the Office of the Director of National Intelligence (ODNI) in 2005 and implemented thereafter has sought to preserve and build on the best features of a de facto federated system of intelligence support. That approach made it easier to take advantage of complementary skills, achieve more rational divisions of labor, and improve the overall performance of the analytic community by improving the performance of all analysts and each of the analytic components.9
This approach deliberately eschewed institutional consolidation and the formation of country- and/or issue-specific centers intended to “rationalize” organization charts and lower institutional barriers to information exchange and collaboration, because the ODNI judged that the potential gains from co-locating analysts working on similar problems would be less than the probable loss of insight and trust resulting from proximity to particular customers.10 Rather than consolidating analysts, the ODNI approach sought to preserve and enhance the advantages of analytic boutiques (e.g., the Marine Corps Intelligence Activity and the State Department’s Bureau of Intelligence and Research) that were able to provide tailored support while making it easier for them to contribute to, and benefit from, the work of colleagues elsewhere in the IC. Furthermore, the approach aimed to reduce the autarky and isolation of analysts by facilitating knowledge of, access to, and collaboration with colleagues and counterparts in other components of the intelligence enterprise. The notional “model” for the analytic enterprise was more like Radio Shack’s networking of widely dispersed affiliates located near their customers than Walmart’s distribution of standardized goods through megastores located far from people previously served by neighborhood shops.
PARAMETERS AND PRESSURES AFFECTING ANALYTIC PERFORMANCE
Implementation of the blueprint summarized above has begun, and the initial results suggest it is both workable and worthwhile. The results also demonstrate, however, that several additional challenges must be understood and addressed to minimize unnecessary duplication while providing more accurate, insightful, and useful analytic support to the IC’s large, diverse, and demanding customer base.11 The task is complicated and compounded by the explosive growth of requirements and escalating expectations of customers, overseers, and the attentive public. Simply stated, in addition to their many other challenges, IC analysts must contend with more requirements from more customers, and must answer more difficult questions more quickly and with greater precision than ever before. Moreover, they must do so while coping with exponentially increasing volumes of information (for further discussion, see Fingar, 2011b). Each of these interconnected challenges warrants both explication and illustrative examples of its implications for the analytic enterprise.
10. The call for formation of subject-specific centers was made, i.a., in The 9/11 Commission Report (National Commission on Terrorist Attacks, 2004, pp. 411–413). Preservation of multiple analytic components that had evolved independently in a context that made it difficult to rely on work done by colleagues in other components—because of impediments to knowing precisely who did what, the expertise of analysts elsewhere, or how responsive they would be to requests for assistance—also preserved unnecessary as well as appropriate duplications of effort. It also perpetuated cultural differences, bureaucratic rivalries, and other organizational pathologies (in this volume, see Zegart, Chapter 13; Tinsley, Chapter 9; and Spellman, Chapter 6). Knowing more about the capabilities, staffing, and missions of each component was a requisite for identifying which capabilities were redundant and which could be eliminated without risking a single point of failure or jeopardizing the ability of the IC to obtain multiple independent analyses of critical issues. Reducing and realigning independent capabilities was postponed until more was known about individual and aggregate strengths and weaknesses.
11. These are the personal observations of a participant observer. I made many of the decisions incorporated into the approach summarized here and closely monitored their implementation, but the judgments about their efficacy are largely subjective and impressionistic.
In the years since the demise of the Soviet Union, and especially since the attacks of 9/11, “national security” has been redefined, often implicitly, in ways that require radically different approaches to analysis, the way analysts engage with one another, and the missions they support. Once limited almost exclusively to concerns about military, diplomatic, and political/ideological threats to “American national interests,” national security now subsumes concerns about the geopolitics of energy, global financial flows, spread of infectious disease, and the safety of individual American citizens anywhere on the globe.12 Expansion of the concept and concerns of “national security” has also expanded the scope (i.e., number and variety) of institutions and individuals who desire or demand analytic support from the IC.13 Because intelligence support has long been treated as a “free good,” there are few constraints on what customers can request or what members of Congress expect to be provided.14
The proliferation of customers and topics on which the IC was expected to acquire information, develop expertise, and deliver analytic insights raised questions about how to do so. The default setting was for new customers to go to the Central Intelligence Agency (CIA) because its mandate was to support all national security customers, and the CIA initially accepted the new requirements. Rather quickly, however, customers and intelligence analysts rediscovered the value of proximity and tasking authority that had spawned the creation of so many different analytic components. Simply stated, the U.S. government faced, at least implicitly, the question of whether to replicate the old approach of creating new specialized units co-located with customers, or to develop better ways to frame requirements and tap expertise without creating new units. In other words, the IC had to find a way to provide boutique-like service and attention to customers without creating new bureaucratic units or substantially increasing the number of analysts.
12. The broader scope of questions addressed to the IC is illustrated by the titles of unclassified reports published by the National Intelligence Council during the past decade. They include: The Impact of Climate Change to 2030: Commissioned Research and Conference Reports (National Intelligence Council, 2009), Strategic Implications of Global Health (National Intelligence Council, 2008b), SARS: Down But Still a Threat (National Intelligence Council, 2003), and Global Humanitarian Emergencies: Trends and Projections 2001–2002 (National Intelligence Council, 2001).
13. Perhaps the clearest example of this expansion is the creation of the Homeland Security Council by the George W. Bush Administration and the subsequent incorporation of “domestic” agencies into the restructured National Security Council undertaken by the Obama Administration. It is also reflected in the redefinition of “national intelligence” in the Intelligence Reform and Terrorism Prevention Act of 2004 (Section 1012).
14. Members of the Intelligence Oversight Committees in both the Senate and the House of Representatives have raised questions about the appropriateness of devoting intelligence resources to nontraditional issues and customers, but members who sit on committees with responsibility for the nontraditional issues and agencies generally take the opposite view. For examples of debate over the proper scope of topics to be addressed by the IC, see the blog, Kent’s Imperative (n.d.), http://kentsimperative.blogspot.com/ [accessed May 2010]. For an example of disagreement among members of Congress, see Congressional Record–House (2007).
In addition to coping with a wider range of requirements from a larger and more diverse set of customers, intelligence analysts had to address many questions that were inherently more complex than most of those that had become routine during the Cold War. One dimension involved the shift of focus from the national level (e.g., what does Moscow or Cairo want?) to subnational and nongovernmental organizations and groups (e.g., is the basis for the insurgency political, tribal, economic, religious, or something else?). Addressing such questions requires both greater and different kinds of expertise and analytic techniques than were sufficient in the past. These challenges are further compounded by shorter deadlines—to be useful now, analytic insights often must be provided in days or hours rather than weeks or months—and the demand for more “actionable intelligence” (i.e., information that can be used to disrupt a terrorist plot, prevent the delivery of chemical precursors, or freeze bank accounts being used for illicit purposes).15
More numerous and more complex issues require use of more and different types of information. Much of the required information is readily available to anyone at little or no cost; other types of information can be acquired, if at all, only through clandestine methods. Knowing what to look for, where to seek it, and how to provide guidance to collectors have become much more demanding aspects of an analyst’s job than in the days when much of the job entailed evaluating and explaining secrets and other bits of information collected and disseminated “because it was obtainable” rather than because it addressed high-priority analytic questions. Moreover, the dramatic increase in publicly available information over the past two decades, and the extraordinary capabilities of new methods of technical collection and data storage, have greatly increased the size of the information haystack. It probably does contain more “needles” than before, but they are often much harder to find.
WHAT ANALYSTS DO: INDIVIDUAL AND COLLECTIVE RESPONSIBILITIES
Every analyst’s job is multifaceted and in some respects unique, but all entail core responsibilities and employ—or should employ—the same high standards of analytic tradecraft. The challenge, and it is a significant one, is for every individual and the analytic community as a whole to strike the right balance when allocating time and effort to each component of the job. This cannot be achieved by assigning arbitrary priorities or percentages of time. The generic tasks summarized below are—or should be—complementary, but they are more often characterized as zero-sum, with a bias for addressing what is current at the expense of what might be more important. This is a long-standing lament, but most proposals to alleviate competing demands do not go beyond calling for more long-term strategic analysis and less attention to current issues (e.g., Russell, 2007; Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction, 2005).
A portion of every analyst’s job involves answering questions. Sometimes the questions are posed in the course of a meeting and may require both an immediate answer and a longer and more considered response. One’s ability to provide confident answers with adequate levels of detail is a function of one’s expertise and ability to anticipate what the customer or meeting is likely to require; the adequacy of the response is, in part, a function of the degree to which those present have confidence in the analyst.16 Sometimes the most important “answers” are the ones provided by an analyst to questions that customers should have asked, but did not.17 To be useful, the analyst needs to find out what his or her customers “know,” what they are trying to accomplish, and what approach is being used to formulate and evaluate policy options. Questions that are more difficult to address include those that come to an analyst indirectly, with little or no information on why the question was asked.18 The objective in all cases is to provide more than “just the facts.” Good analytic tradecraft requires providing information on context, patterns, the quantity and character of intelligence germane to the subject, and other insights likely to help customers to understand the issues that prompted the query (Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction, 2005).19 Three keys to providing timely and useful answers are (1) command of one’s portfolio, (2) knowledge of where to go for data and help to interpret what it means, and (3) practice of good tradecraft even on routine or quick turnaround matters.
Every analyst has a responsibility to monitor developments and trends in his or her portfolio in order to determine where they seem to be headed and whether they might “threaten” American interests or the viability of approaches being considered or implemented by those they support. Analysts should also be alert to potential opportunities for policy intervention to mitigate or capitalize on what is taking place. For most analysts, most of the time, the focus should be on providing strategic warning—informing customers what could happen far enough in advance to allow deliberation and the formulation of policies to encourage what appears desirable and to thwart or mitigate unfavorable or dangerous developments. But no policy maker likes to be surprised. Too often, their expectations and demands for “warning” are conveyed or interpreted as demands to be informed or alerted about any development that might be made known to colleagues and counterparts, or about which they might be asked questions by Congress or the media. This desire for “no surprises” often skews the work of analysts too far in the direction of “current intelligence” that amounts to little more than duplicative and ill-informed commentary on developments that, in the grand scheme of things, are not all that important (e.g., Russell, 2007).
Monitor and Assess Current Developments and New Information
The ability to provide warning of what lies over the horizon, around the bend, or behind a tree requires continuous and close monitoring of developments that might affect places, problems, people, or policy maker requirements in every analyst’s portfolio. This dimension of the analyst’s job involves more than just evaluating, assessing, interpreting, and transmitting the latest fruits of collection efforts.20 Many analysts feel overwhelmed because they attempt to—and cannot—“read everything” that collectors push at them and they know is available in unclassified materials (“open source” in the argot of the IC). The days when an analyst could, or could be expected to, read everything are long gone. It would be counterproductive and fruitless to try to solve the problem by narrowing the scope of portfolios and adding more analysts.21 What is required is better understanding of complex problems, not a large contingent of analysts who know more and more about less and less.
To perform this part of the job, analysts must begin with a clear (or as clear as their relationship with customers allows) understanding of what customers are working on, worry about, and want to know.22 Armed with this knowledge, and the analyst’s own subject matter expertise and understanding of the issues and dynamics involved, the analyst can narrow the scope of his or her search and analysis efforts to what are thought to be key drivers, key indicators, and key developments germane to the concerns of customers and, as importantly, to their own ability to understand what is happening, why, and where events appear to be headed. There are obvious advantages to divisions of labor with fellow analysts and increasing opportunities to work together via collaborative tools such as A-Space and other capabilities to access and assemble data and to garner insights from colleagues. However, at the end of the day, each analyst is responsible for identifying and interpreting information germane to the interests of his or her customers that might affect their understanding of the situation and ability to achieve their objectives.23
Building Expertise and Strategic Analysis
Observations—and criticism—that analysts devote too much time to “current intelligence” often lament that too little time is spent on “strategic analysis.” Many prescribe corrective measures that include setting up a separate staff to conduct long-range studies or assigning all “current intelligence” work to a small staff so that most analysts can engage in strategic analysis (e.g., Treverton and Gabbard, 2008). From my perspective, both the diagnosis and the prescriptions are somewhat off the mark. The IC certainly can do a lot better in the way it monitors and reports breaking developments (what Secretary Powell correctly referred to as “the news”). Yet that does not obviate the need for the vast majority of analysts to address issues already or soon to be on the agendas of those they support; if they cannot do that, the IC will not meet the requirements and expectations of those it serves.
Second, although many proclaim the need for more strategic analysis, I have found the “market” for such work to be both small and episodic. The so-called “tyranny of the inbox” is a bigger problem for policy makers than for analysts, and the needs of customers drive the process. Perhaps policy makers should think more about the long-term future, but few do so on more than an intermittent basis, and all tend to have less interest in long-term issues as they spend more time on the job. One can lament or decry the situation, but it is difficult for officials to think about how events might play out after their term of office while piranhas are working on their legs.24 Intelligence is fundamentally a support function; it exists to provide information and insight that will help customers to perform their assigned missions in the national security enterprise (for further discussion, see George and Rishikof, 2011). Analysts can, should, and do regard reminding customers of long-term trends and strategic implications of current decisions as an important part of their job, but they must do so within the parameters of trust, temporal pressures, and the agendas of those they support. The alternative is to be regarded as unhelpful or irrelevant (e.g., Treverton, 2008).
Rather than focusing on structural solutions such as creating strategic analysis units, or on changing the behavior and expectations of decision makers, the most useful proposals to improve analytic support begin from the premise that providing useful insights and context when addressing “current” issues requires both deep expertise and understanding of strategic trends and long-term dynamics. The implication is that “every” analyst not only should—but also must—continuously examine “strategic” questions to enhance his or her ability to provide better daily support to the national security enterprise. One can imagine multiple ways to combine current and strategic work, but the key is continuous integration of insights from the strategic dimension into what the analyst carries in his or her head and contributes to the policy-making process through oral and written assessments and projections. A State Department colleague once likened the process to continuously updating the “elevator briefing” that an analyst should be prepared to deliver in the time it takes to accompany a key customer from his or her office to the basement of the building. Such a briefing would summarize what was new, what it seemed to mean, and how it affected trends and strategic concerns.
Analysis of Topics Assigned in Accordance with Agency Production Plans
The job elements described above assume and require regular interchange between analysts and customers. They involve a high degree of contingency because analysts must adapt and respond to changing requirements, the serendipity of events, and the fruits of collection efforts. The degree to which analysts focus on or are consumed by these job elements is a function of where they work, what accounts they follow, whom they support, and a number of other situational factors.25 For some analysts, these tasks are all-consuming, but a subset must also devote time and attention to topics assigned in accordance with agency or IC production plans. Other analysts, probably the majority, are able and expected to devote most of their time to production intended to close intelligence gaps, illuminate new issues, or satisfy internally or externally mandated requirements to update information on leadership biographies, military orders of battle, developments in foreign science and technology, foreign direct investment in particular countries or industries, and other such issues. Some of this work is crucial and contributes directly to the work of other analysts; some of it requires more effort than may be warranted to produce information and insights of interest to only a small number of people who may or may not have any reason or ability to act on that information.26
Contribute to Community and Collaborative Products
Contributing to the President’s Daily Brief and participating in the production of National Intelligence Estimates and other formally “coordinated” assessments impose heavy demands on analysts, but for all but a tiny number, the duration is short and the frequency is very occasional. Meeting the standards and procedural requirements of the IC’s flagship products takes a great deal of work, but only a small percentage of all analysts write or make significant contributions more than a few times per year, if that. When they do, it is all-consuming for a short period; how easy or difficult it is depends on how efficiently and effectively they perform the other tasks in their job jars. Nevertheless, the high standards, obvious importance, and requirement to look closely at sources, assumptions, alternative hypotheses, and other facets of good tradecraft make this job element more important than the infrequency of individual participation would suggest. Expectations and enforcement of high standards in this arena exert an upward pull on the quality of work done by analysts in other venues and by the IC as a whole.27
Efforts to integrate the IC and to forge a “community of analysts” who collaborate without regard to parent agency have added new dimensions to the analyst’s job.28 One new element is the increased requirement for consultation and coordination in the production of items for the President’s Daily Brief and briefing materials prepared by the National Intelligence Council for the National Security Council and other high-level meetings (for additional detail, see Fingar, 2011a). Though intended to require minimal time commitment on the part of those asked to comment on or coordinate most products, the importance of the products causes most analysts to take this task seriously and to invest as much time as they believe is necessary to “get it right.” As a result, many will identify this dimension of their job as more time-consuming than it probably is.
Bottom-up, analyst-initiated collaboration is an even more significant new element in the job jar. It takes many forms, ranging from informal collaboration with colleagues within and beyond an analyst’s home agency to produce better products to the use of wiki, blog, and other collaborative tools in Intellipedia (the IC’s classified version of Wikipedia) and A-Space (for additional information on Intellipedia, see Calabresi, 2009; for A-Space, see Shaughnessy, 2008). Analysts in and influenced by the “digital generation” find these tools helpful, but using them to share information, enhance understanding, build “living documents,” and perform other analytic tasks has taken the IC in new directions that have no roadmaps, few standards, and only limited understanding on the part of most managers.29 Whether these increasingly used forms of collaboration are a boon or a burden, actual or perceived, is still a matter of dispute. It is not uncommon to hear complaints from both analysts and managers that analysts “must” spend too much time collaborating with colleagues or using new analytic tools.30
Provide Guidance to Collectors
The existence and utility of the National Intelligence Priorities Framework (NIPF) notwithstanding, analysts play the primary role in translating customer needs into guidance for collectors.31 The IC collection system is both vast and nimble. It can be tweaked to go after specific topics and targets, and collectors do their best—which is a substantial effort—to meet the ever-changing panoply of needs given to them by analysts. Analysts do not—or should not—simply relay questions from customers. They translate such requests by asking themselves—and their colleagues—what the question is intended to illuminate, what kind of information would produce the greatest understanding of the underlying problem, and where collectors should look to find that information. The formal process for translating information needs into guidance to collectors is still a work in progress and is still more cumbersome than it should be, but analysts are and will remain the key to its success.32
The schematic summary above somewhat obscures the extent to which all of these job elements are interrelated and constitute a continuum rather than a set of compartmented activities. It also omits activities, such as updating databases, that occupy significant portions of some analysts’ time. That said, many analysts and commentators speak as if the different elements were in zero-sum competition and lament that certain ones constrain what can be done to address others. As one might imagine, there is a natural tendency to decry and exaggerate the amount of time that must be allocated to tasks an individual finds more difficult or less rewarding than those on which he or she would rather spend time. All of the elements are important and interconnected, and efforts to make analysis more accurate, more useful, and more efficient will have the greatest impact if they address all of the elements in the job jar as parts of an integrated whole rather than as specialties that can be compartmentalized and assigned to discrete groups of analysts.
MAKING NECESSITY A VIRTUE LEADS TO A BETTER WAY OF DOING BUSINESS
In the past—and here the past is as recent as the immediate post-9/11 period during which there was tremendous growth in the IC budget and the number of analysts—the standard response to increased demands was to add people and/or create new analytic components.33 To improve information sharing, reduce “cultural” barriers to collaboration, and consolidate work on important issues, the 9/11 Commission recommended and the Intelligence Reform and Terrorism Prevention Act of 2004 (IRTPA) endorsed the creation of specialized “centers.” The IRTPA also gave statutory authority to the National Counterterrorism Center (NCTC) that had been created four months earlier by Executive Order (National Commission on Terrorist Attacks upon the United States, 2004; Section 1021 of the 2004 IRTPA).
The newly established ODNI made a conscious decision not to adopt that approach. By mid-2005, calls to integrate and rationalize the IC and competing demands for “more analysts” made it impractical and imprudent to create and staff new analytic units to support new missions and new customers. New units would have had to be either too small to achieve critical mass on any issue or so large that duplication of effort would have been inevitable. Moreover, the start-up problems of the NCTC and the fact that no agency abolished or significantly downsized its own counterterrorism unit when NCTC was established underscored previously learned lessons about distancing analysts from their primary customers and providing what looks like one-size-fits-nobody analytic support (see DeYoung, 2006; Whitelaw, 2006; for additional analysis, see Fingar, 2011a).34 The ODNI
was determined to find a better way to organize and integrate IC analytic capabilities.
To address the need to bring new types of expertise to bear on new problems for new customers without creating new units or adding significantly to the analytic workforce, the ODNI set out to discover whether such expertise existed anywhere in the IC, where this expertise was considered critical to the performance of core missions, and where it was vestigial or serendipitous.35 This effort also revealed how strong or weak the analytic community was in each area, now and when factoring in projected retirements and other forms of attrition. Using loose and subjective criteria, the ODNI set out to determine where the IC had sufficient expertise (if it could be harnessed effectively), where gaps existed in specific agencies and in the IC as a whole, and where there was potential to “grow” expertise by mentoring across agency boundaries.
Developing the “better way” is still a work in progress, but the principal building blocks of the approach are relatively clear and experience to date provides an empirical basis for adjustments and improvement. The first building block was to identify with a fair degree of precision what each of the component analytic elements did (i.e., the missions and customers they supported, the areas of expertise they had developed, and the kinds of assessments they produced). This inventory revealed less redundancy than many assumed, especially when one examined specific areas of focus subsumed under broad rubrics such as “China” or “missiles.” Yet it also indicated that many agencies had developed small elements to address subjects tangential to their core missions because they did not know where relevant expertise could be found elsewhere in the IC, could not “task” analysts elsewhere to provide necessary input, or could not have confidence in the quality of the work done by people they did not know and could not evaluate on their own.36 This mapping exercise also revealed that most components judged that they lacked a “critical mass” of expertise on all but
a small number of topics. When combined, these agency-by-agency mapping exercises provided a reasonably complete picture of the customers and activities supported by the IC and a first-cut approximation of duplication and deficiencies.
The second building block was to inventory the skills and experience of the analysts themselves by reinvigorating and making enrollment mandatory in the Analytic Resources Catalog (ARC).37 A primary objective of the ARC was to map what analysts know, individually and collectively. An assumption confirmed by the ARC data was that expertise on many subjects is deeper than organizational and staffing charts would suggest because analysts retain knowledge from previous assignments even as they assume new responsibilities. The mapping exercise, in conjunction with demographic data using years of experience as a proxy for age and similar ways to avoid running afoul of privacy laws, also revealed areas where expertise was concentrated in particular age cohorts (e.g., a disproportionate percentage of those working a given subject had 20 or more years of experience, suggesting an upcoming problem of simultaneous retirements with no suc-
cessors in the pipeline).38 One objective of this inventory of expertise was to make it easier for analysts to find potential collaborators and for analytic managers to find persons with the skills and experience needed to address subjects beyond the competence of their own agency. Stated another way, the goal was to be able to harness the totality of expertise in the analytic community, not just that of persons currently occupying particular billets (e.g., “Southeast Asia terrorism” or “Andean economics”).
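The cohort-concentration check described above amounts to simple grouping arithmetic. As a purely illustrative sketch—the record layout, field names, and thresholds here are assumptions for exposition, not features of the actual ARC—it might look like:

```python
from collections import defaultdict

# Hypothetical sketch of the cohort analysis described in the text: given
# catalog records of (subject, years_of_experience), with years of service
# standing in as a proxy for age, flag subjects where a disproportionate
# share of analysts falls in the senior cohort and retirements could
# therefore cluster. Subject labels and cutoffs are illustrative only.
def retirement_risk(catalog, senior_years=20, threshold=0.5):
    """Return subjects where the share of analysts with at least
    `senior_years` of experience meets or exceeds `threshold`."""
    by_subject = defaultdict(list)
    for subject, years in catalog:
        by_subject[subject].append(years)

    flagged = {}
    for subject, cohort in by_subject.items():
        senior_share = sum(1 for y in cohort if y >= senior_years) / len(cohort)
        if senior_share >= threshold:
            flagged[subject] = round(senior_share, 2)
    return flagged
```

For example, a catalog in which two of three analysts on a subject have 20 or more years of experience would flag that subject with a senior share of 0.67, signaling the succession problem the text describes.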
The exercise described above made clear that the IC had more expertise than suggested by staffing patterns if it could find a way to tap what people already knew, even if that knowledge was from previous assignments, and if the IC found a way to enable analysts to collaborate at a distance. The goal was to facilitate voluntary formation of “virtual” teams with the advantages of proximity to key customers and synergistic benefits from collaboration.39 Realizing the potential benefits inherent in this vision required overcoming a number of technical, policy, and cultural obstacles.40 Some have been surmounted; others have yet to be tackled. It also made clear, however, that the IC did not have and was unlikely ever to have enough people with
sufficient expertise to cover all of its missions in the short time frames that had become the norm. It was imperative to find ways to develop continuing relationships with scholars, journalists, think tank researchers, diplomats, and others with deep knowledge of subjects of interest to policy makers and essential to the analytic mission of the IC.41 This had to involve more than just compiling a list of “experts on everything.” Indeed, one objective was to make the incorporation of information and insights from outside experts a regular part of each analyst’s job in order to raise the level of individual and corporate expertise in the IC. A second was to be able to use the outside expert as a sounding board for ideas and as a source of guidance on where to look for answers to specific questions. A third objective was to nurture these relationships so they could be activated immediately in the event of a crisis or extremely short-fuse requirements. The advantages are obvious, but they are not sufficient to overcome concerns, many of them legitimate, about interchange with people outside the IC.42 The proposed arrangements also raise important questions about deference to authority figures, protection of sources and methods, and other methodological concerns.
Perhaps the most important characteristic of the analytic workforce is its youth. Any plans to improve the quality of analytic products must give proper attention to the fact that more than 50 percent of analysts have joined the IC since 2001. A second characteristic is that the age distribution of the other 50 percent is skewed toward the retirement end of the scale, largely because the hiring freezes, downsizing, rightsizing, and organizational turmoil of the 1990s limited intake and caused many younger analysts to seek employment elsewhere, often in firms that do contract work for the IC. These demographics create a number of challenges (e.g., the need to use and capitalize on the expertise of senior analysts now serving in managerial positions and to pull junior analysts up the learning curve faster than would normally have been the case in the IC).43 They also mean that more formal training is required
to compensate for the brevity of on-the-job learning through observation of how more senior analysts practice their craft.
Those are the downsides of demography, but there are a great many upsides as well. For example, the cohort that has joined in the past 7 to 8 years is extremely talented and exceptionally well trained in the disciplines its members pursued in graduate school (and most of the new analysts do have graduate training, many from leading universities). They are also members of the “digital generation” and completely at home in environments requiring collaboration at a distance, sharing and providing information to trusted interlocutors, experimenting with analytic tools, searching the Internet and classified databases, and performing other such tasks (Palfrey and Gasser, 2008). They routinely communicate with friends across institutional boundaries and expect to do the same in their professional lives. Persuading them to adopt new techniques and to work differently from the generations they are succeeding is easy. What is less easy—but essential—is developing modern-day means to vet information, exercise quality control on products developed using wikis and blogs, and maintain the requisite security safeguards when dealing with persons outside of the IC.44 In doing so, IC leaders must diligently adapt policies and procedures developed for a different time, different types of problems, and different generations to suit the capabilities and expectations of this youthful workforce.
WILL AND ABILITY TO ADAPT
The IC as a whole and the analytic community in particular are neither broken nor bad, but they can and want to be better. They want to be better for the right reasons: to ensure the security of our country, the safety of our fellow citizens, and the success of policies to protect American interests and promote American ideals. The majority of analysts are new to the IC, but they are not new to analysis. As a group, they represent and reflect the best training available in America’s best universities. Their seniors, in both age and position, are among the most knowledgeable subject matter experts in their fields. Most of them feel a strong sense of professional responsibility to move successors up the learning curve as rapidly as possible. Top analytic managers “get it” and are (mostly) eager to do what is necessary to transform the way analysis is done in the IC in order to satisfy burgeoning requirements, support new missions, realize the full potential of the analytic workforce, and retain the talented people who have joined the IC in the past decade and will leave if they feel inhibited by hoary traditions that no longer make sense to them or match their expectations of the intelligence profession. Getting it right will not be easy or quick, but conditions for sustained improvement have never been better.
Boudreaux, R. 1994. Yeltsin fires chemical warfare chief. Los Angeles Times. April 8. Available: http://articles.latimes.com/1994-04-08/news/mn-43642_1_chemical-weapons [accessed May 2010].
Calabresi, M. 2009. Wikipedia for spies: The CIA discovers Web 2.0. Time. Wednesday, April 8. Available: http://www.time.com/time/nation/article/0,8599,1890084,00.html [accessed October 2010].
Central Intelligence Agency. 1998. Statement by the Director of Central Intelligence regarding the disclosure of the aggregate intelligence budget for fiscal year 1998. Press release, March 20. Available: https://www.cia.gov/news-information/press-releases-statements/press-release-archive-1998/ps032098.html [accessed May 2010].
Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction. 2005. Report to the President of the United States. March 31. BookSurge, LLC. Available: http://www.gpoaccess.gov/wmd/pdf/full_wmd_report.pdf [accessed April 2010].
Congressional Record–House. 2007. Proceedings and debates of the 110th Congress, first session, 153: 77—Part II, May 10, pp. H4895–H4896. Available: http://www.gpo.gov/fdsys/pkg/CREC-2007-05-10/pdf/CREC-2007-05-10-pt2-PgH4881-3.pdf [accessed October 2010].
DeYoung, K. 2006. A fight against terrorism and disorganization. Washington Post, August 9. Available: http://www.washingtonpost.com/wp-dyn/content/article/2006/08/08/AR2006080800964_pf.html [accessed May 2010].
Director of National Intelligence. 2007a. Intelligence community directive (ICD) 203: Analytic standards. June 21. Available: http://www.dni.gov/electronic_reading_room/ICD_203.pdf [accessed May 2010].
Director of National Intelligence. 2007b. Intelligence community directive (ICD) 204: Roles and responsibilities for the National Intelligence Priorities Framework. September 13. Available: http://www.dni.gov/electronic_reading_room/ICD_204.pdf [accessed May 2010].
Director of National Intelligence. 2007c. Intelligence community directive (ICD) 206: Sourcing requirements for disseminated analytic products. October 17. Available: http://www.dni.gov/electronic_reading_room/ICD_206.pdf [accessed May 2010].
Director of National Intelligence. 2008. Intelligence community directive (ICD) 205: Analytic outreach. July 16. Available: http://www.dni.gov/electronic_reading_room/ICD_205.pdf [accessed May 2010].
Fingar, T. 2006. DDNI/A [Deputy Director of National Intelligence for Analysis] addresses the DNI’s information sharing conference and technology exposition. Intelink and Beyond: Dare to Share. Denver, CO, August 21. Available: http://www.dni.gov/speeches/20060821_2_speech.pdf [accessed May 2010].
Fingar, T. 2007. Remarks and Q&A by the Deputy Director of National Intelligence for Analysis & Chairman. National Intelligence Council at the ODNI Open Source Conference. Washington, DC, July 17. Available: http://www.dni.gov/speeches/20070717_speech_3.pdf [accessed May 2010].
Fingar, T. 2011a. Office of the Director of National Intelligence: Promising start despite ambiguity, ambivalence, and animosity. In R. Z. George and H. Rishikof, eds., The national security enterprise: Navigating the labyrinth. Washington, DC: Georgetown University Press.
Fingar, T. 2011b. Reducing uncertainty: Intelligence analysis and national security. Stanford, CA: Stanford University Press.
George, R. Z., and H. Rishikof (Eds.). 2011. The national security enterprise: Navigating the labyrinth. Washington, DC: Georgetown University Press.
Gordon, M. R. 1990. Beijing avoids new missile sales assurances. The New York Times. March 30. Available: http://www.nytimes.com/1990/03/30/world/beijing-avoids-new-missile-sales-assurances.html [accessed May 2010].
Hackman, J. R., and M. O’Connor. 2004. What makes for a great analytic team? Individual versus team approaches to intelligence analysis. February. Available: http://www.fas.org/irp/dni/isb/analytic.pdf [accessed May 2010].
Kelly, M. L. 2007. Intelligence community unites for “Analysis 101.” National Public Radio, May 7. Available: http://www.npr.org/templates/story/story.php?storyId=10040625 [accessed May 2010].
National Commission on Terrorist Attacks Upon the United States. 2004. The 9/11 Commission report. New York: W.W. Norton. Available: http://www.9-11commission.gov/report/911Report.pdf [accessed April 2010].
National Intelligence Council. 2001. Global humanitarian emergencies: Trends and projections 2001–2002. Available: http://www.dni.gov/nic/special_globalhuman2001.html [accessed May 2010].
National Intelligence Council. 2003. SARS: Down but still a threat. Available: http://www.dni.gov/nic/special_sarsthreat.html [accessed May 2010].
National Intelligence Council. 2008a. Global trends 2025: A transformed world. Available: http://www.dni.gov/nic/PDF_2025/2025_Global_Trends_Final_Report.pdf [accessed May 2010].
National Intelligence Council. 2008b. Strategic implications of global health. Available: http://www.dni.gov/nic/PDF_GIF_otherprod/ICA_Global_Health_2008.pdf [accessed May 2010].
National Intelligence Council. 2009. The impact of climate change to 2030: Commissioned research and conference reports. Available: http://www.dni.gov/nic/special_climate2030.html [accessed May 2010].
Nuclear Threat Initiative. 2007. China’s chemical and biological weapon-related exports to Iran. Available: http://www.nti.org/db/China/cbwiran.htm [accessed May 2010].
Office of the Director of National Intelligence. 2009. DNI releases budget figure for 2009 National Intelligence Program. October 30. Available: http://www.dni.gov/press_releases/20091030_release.pdf [accessed January 2010].
Palfrey, J., and U. Gasser. 2008. Born digital: Understanding the first generation of digital natives. New York: Basic Books.
Partnership for Public Service. 2009. The best places to work in the federal government 2009. Available: http://data.bestplacestowork.org/bptw/index [accessed May 2010].
Russell, R. L. 2007. Sharpening strategic intelligence. New York: Cambridge University Press.
Sanders, R. 2008. Conference call with Dr. Ronald Sanders, associate director of national intelligence for human capital. August 27. Available: http://www.asisonline.org/secman/20080827_interview.pdf [accessed April 2010].
Shaughnessy, L. 2008. CIA, FBI push “Facebook for Spies.” CNN.com/technology. September 5. Available: http://edition.cnn.com/2008/TECH/ptech/09/05/facebook.spies/ [accessed May 2010].
Steele, J. 2008. Maliki drops the mask: With his tough stance on U.S. withdrawal, Sunni militias and the Kurds, Iraq’s leader risks doom. The Guardian, September 5. Available: http://www.guardian.co.uk/commentisfree/2008/sep/05/iraq.middleeast [accessed December 2009].
Time. 2008. Time’s best inventions of 2008. October 29. Available: http://www.time.com/time/specials/packages/completelist/0,29569,1852747,00.html [accessed May 2010].
Tiron, R. 2007. Afghanistan officials question drug-eradication, nomination. The Hill. Available: http://thehill.com/business-a-lobbying/2261-afghanistan-officials-question-drug-eradication-nomination [accessed May 2010].
Treverton, G. F. 2008. Intelligence analysis: Between “politicization” and irrelevance. In R. Z. George and J. B. Bruce, eds., Analyzing intelligence: Origins, obstacles, and innovations (pp. 91–106). Washington, DC: Georgetown University Press.
Treverton, G. F., and C. B. Gabbard. 2008. Assessing the tradecraft of intelligence analysts. The RAND Corporation, National Security Research Division. Available: http://www.rand.org/pubs/technical_reports/2008/RAND_TR293.pdf [accessed May 2010].
Voice of America. 2009. Afghan election poses policy dilemmas for U.S. August 3. Available: http://www.voanews.com/english/2009-08-03-voa34.cfm [accessed May 2010].
Whitelaw, K. 2006. The eye of the storm. U.S. News and World Report 141(7):48–52.