The Intelligence Community (IC) has always needed a workforce that is responsive, flexible, effective, and well equipped to learn and adapt to change. This workforce will surely be influenced in the coming decade by factors that will drive change across most industries, including “ubiquitous high-speed mobile internet; artificial intelligence (AI); widespread adoption of big data analytics; and cloud technology” (World Economic Forum, 2018, p. vii). The opportunities discussed in this report are ways in which these technologies and other developments can bring fundamental changes in the way intelligence analysis is conducted. In a decade or less, for example, analysts may have the capacity to obtain sophisticated analysis of a months-long narrative stream on social media sites, compare it with activities from that period identified through geospatial imaging, and develop a graphical representation of the intersections between the two—as part of a day’s work.
To leverage these opportunities, the analytic workforce will need new skills: developments in such areas as network science, complex systems models, statistics, and data analytics of all kinds will likely add new methods and tools to the analyst’s toolbox.1 In areas in which intelligence analysts are expert—qualitative analysis of text and narrative, for example—new developments such as improved quantitative methods for text analysis,
including methods for analyzing social media, offer possibilities that may not yet have been integrated into common practice within the IC.
The analytic workforce already reflects diverse and valuable technical and academic skills and experience. Today, intelligence analysts typically join the workforce with specific disciplinary subject matter knowledge; Box 8-1 suggests the range of expertise analysts bring to the job. As analysts mature in their careers, they gain deep regional expertise (e.g., China, Russia, the Middle East) and become functional experts in such topics as fighter aircraft, naval systems, nuclear issues, missile technology, counterterrorism, counterproliferation, or counterintelligence. As discussed in Chapter 4, day-to-day analysis is typically core task-based, with analysts culling volumes of classified and unclassified intelligence information to build analyses associated with their region of interest or the function being surveilled. Analysts need critical thinking skills, and they also need strong writing and communication skills because the quality of their writing is vital to the effectiveness of the briefings they provide.
The IC will need to continue building on the expertise of the analytic workforce as it integrates new tools into the workflow, and may need to compete with the private sector for individuals with the expertise needed, though this is not a new challenge. Analysts of the future will need to build on the skills they have always had, including technical skills, domain-specific knowledge, social intelligence, strong communication skills, and the capacity for continued learning (Dawson and Thomson, 2018). But they will also need to function in new ways. In particular the innovations in the integration of human–machine systems discussed in Chapter 7 are likely to provide new working environments and methodologies that expand analysts’ capacity and productivity. Survey results indicate that human–machine systems of some sort are likely to become increasingly common, and that the percentage of task hours performed by machines is likely to increase across industries (Schwartz et al., 2017; World Economic Forum, 2018). Research in contexts other than the IC has emphasized that the rapid influx of AI and automation into the workplace has intensified the demand for such “human” skills as complex problem solving and social skills (Agrawal et al., 2018). As discussed in Chapter 7, the optimal integration of human and machine skills for intelligence analysis will require careful research; it will also require close attention to the management and structure of the workplace and the workforce.
As in any large organization, the agencies of the IC pay attention to means of identifying, recruiting, and selecting individuals likely to excel as intelligence analysts; providing training and using other means to develop their skills and abilities; obtaining optimal performance from the workforce; and retaining effective employees. Researchers in the fields of industrial-organizational psychology and human resource development have produced a robust body of work on ways to pursue most of these objectives. Applying this work in the IC context, however, requires translational research on how well-established findings apply in the unique context of intelligence analysis.
This chapter provides a review of the state of the foundational research in industrial-organizational psychology and human resource development that is relevant to the evolving needs of the IC. It highlights developments in a few areas that hold particular promise for supporting the IC in meeting four key workforce challenges: selecting applicants likely to be effective in analytic roles, retaining effective analysts, developing skills on the job through both formal training and informal learning, and providing support for a potentially stressed and fatigued workforce.
The research base on workforce selection is robust. Researchers in psychology and other fields have been working for more than a century to develop empirically based approaches to selecting those workers most likely to be effective, long-term employees in a particular context. They have explored the use of psychological measures to predict how well individuals will perform aspects of particular jobs. They also have developed psychological tests of cognitive abilities, personality traits, interests, values, and work styles, as well as other selection tools, such as interviews, situational judgment tests, and assessment centers (see Ployhart et al. for a review of work on the development of selection systems and Sackett et al. [2017b] for a review of work on the study of individuals’ personal attributes that influence job performance).
This work has provided the basis for approaches with a strong record of effectiveness in predicting job performance and improving the quality of the selected workforce. It has led to the development of measures commonly used in personnel selection that include tests of knowledge, skill, and abilities; personality inventories; structured interviews; situational judgment tests; and job samples and simulations (National Academies of Sciences, Engineering, and Medicine, 2018b). Here we explore advances in measurement tools designed to predict the performance of potential employees, outcomes of interest in evaluating the effectiveness of a system for selecting new hires, approaches to evaluating the effectiveness and utility of a selection system, and the implications of these findings for the IC.
Testing to predict how well an individual is likely to perform in a particular job has obvious utility for employers, including IC agencies that must sort through large numbers of applicants. While the well-established and well-researched selection methods outlined above are obvious starting points for thinking about selection in the IC context, new approaches are emerging, particularly new testing formats made possible by computer technology that have expanded the kinds of information that can be collected and assessed. For instance, researchers have investigated how well information about such psychological constructs as the ability to address multiple competing demands in an environment can be extracted from individuals’ performance in carefully designed computer games. Likewise, big data analytics (see Chapter 2) have made it possible to develop predictive algorithms based on large numbers of individual bits of data that can be used in the selection process.
These big data approaches have led to the emergence of people analytics as an approach to addressing human resource issues such as recruitment, selection, and retention and predicting satisfaction and performance (Bersin, 2015). With people analytics, individual words or phrases used in a resumé or interview can be evaluated for predictive relationships with the outcome(s) of interest (e.g., subsequent job performance) (see the discussion below). Digitally captured information such as voice patterns, perhaps collected during interviews, can also be examined. Information about job applicants gleaned from online information (e.g., social media) can also be used in the selection process—either in an informal, ad hoc way, as in the case of a hiring manager who looks online for information about a job candidate, or using more systematic, automated approaches.
The emergence of these approaches has led firms such as Google to create a people analytics division (Bock, 2015). Google was among the first to recognize the potential for using big data to infer individuals’ attributes (such as personality traits), as well as their knowledge capital (such as specific skills). More recently, as part of Project Aristotle, Google used people analytics to understand drivers of individual and team performance. Digital trace data captured passively from the workforce’s use of social media and enterprise social media channels can also be used to assess individuals’ social capital, based on their position in the network. The result has been a growing interest in relational analytics (i.e., the interplay and interactions among people) (Leonardi and Contractor, 2018). A key remaining challenge is the development of techniques for aggregating various digital trace activities (such as posting a message, “liking” a post, “following” a person or project, or “@mentioning” a person in a post) across multiple enterprise social media platforms (those developed and used for communication and social interaction within organizations, such as e-mail, Slack, Jive, and IBM Connections) to provide meaningful relational data, such as who trusts whom, who seeks advice from whom, and who is likely to be an innovator or influencer or leave the organization.
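As a concrete illustration of this aggregation challenge, the sketch below rolls hypothetical trace events from multiple platforms into a single weighted interaction network and computes a crude influence score. The event types, their weights, and the names and platforms are illustrative assumptions only, not features of any system discussed here.

```python
from collections import defaultdict

# Illustrative weights for different trace activities; a real system
# would need to validate such weights empirically (an assumption).
EVENT_WEIGHTS = {"message": 1.0, "mention": 2.0, "like": 0.5, "follow": 1.5}

def build_interaction_network(events):
    """Aggregate (source, target, event_type, platform) records into a
    weighted directed edge map, regardless of originating platform."""
    edges = defaultdict(float)
    for source, target, event_type, _platform in events:
        edges[(source, target)] += EVENT_WEIGHTS.get(event_type, 0.0)
    return edges

def influence_scores(edges):
    """Weighted in-degree as one crude proxy for influence."""
    scores = defaultdict(float)
    for (_source, target), weight in edges.items():
        scores[target] += weight
    return dict(scores)

# Hypothetical events drawn from two enterprise platforms.
events = [
    ("ana", "ben", "mention", "slack"),
    ("cal", "ben", "like", "jive"),
    ("ben", "ana", "message", "slack"),
    ("cal", "ben", "follow", "jive"),
]
edges = build_interaction_network(events)
scores = influence_scores(edges)
# ben accumulates 2.0 + 0.5 + 1.5 = 4.0; ana accumulates 1.0
```

Weighted in-degree is only one of many possible relational measures; inferring constructs such as trust or advice-seeking would require considerably richer modeling.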
While much work is ongoing in the development of these new approaches, many questions remain. Much of the work is proprietary, conducted by firms marketing services to organizations; relatively little has been subjected to the traditional peer review process. In some cases, marketing claims go beyond what can be supported by solid research. Some of these new approaches involve applying different analytic methods to information collected in traditional ways (e.g., interviews, resumés), while others involve new types of information (e.g., evaluation of candidates based on data gleaned from social media). These latter approaches raise significant ethical and privacy issues. With traditional approaches, for example, the candidate chooses whether to provide information (e.g., to submit a resumé, to participate in an interview or test), but this may not be the case when online information is extracted.
Privacy concerns arise in the security clearance process as well. Employment in the IC is contingent on a potential employee’s ability to successfully obtain a security clearance at a level that is appropriate for the position. The requirements for enhanced security review include collection of information from “government, publicly available and commercial data sources, consumer reporting agencies, social media and such other sources.”2 In other words, the IC can collect publicly available information on U.S. citizens for the purposes of employment. The number of possible data points collected in such searches can easily stretch into the billions, so computational algorithms are used to identify relevant information for both prospective and current employees. Bias and other issues may affect the output of such algorithms, so their use for personnel matters must be considered with care (see Chapter 7).
The selection approaches we have discussed offer both opportunities and challenges. There are tensions between the behavioral scientist’s orientation toward careful measurement of well-defined constructs (e.g., combining multiple items into scales, with careful evaluation of the contribution of each item to the scale score) and the application of the big data approach, where it may be difficult to identify how and why the algorithm produces scores that appear to predict performance. The possibility that conscious or unconscious bias against certain subgroups (e.g., based on racial, ethnic, or gender identity) may affect outcomes is a significant issue, particularly in the case of algorithmic approaches of a “black box” nature (see Chapter 7). While these new approaches merit attention, they also warrant a cautious and critical stance (Society for Industrial and Organizational Psychology, 2018). Novel approaches may or may not prove superior to tried-and-true, established selection procedures.
Industrial-organizational researchers use the performance of employees once hired (i.e., outcomes) as the basis for evaluating how well a system for selecting new hires has functioned. Recognizing the limitations of definitions of job performance, researchers have recently focused on developing a more nuanced understanding of what characterizes effective performance. Once viewed as an undifferentiated phenomenon, performance is best understood in terms of various components that contribute to an employee’s effectiveness. Researchers have identified such measurable elements of job performance as (1) performing core tasks; (2) being a “good citizen” of the organization (i.e., behaving in a way that benefits the organization by contributing to its social and psychological environment, such as persisting to complete a time-consuming job, providing personal support to coworkers, or representing the organization in a professional manner [Sackett and Walmsley, 2014]); and (3) avoiding various forms of counterproductive behavior (e.g., theft, sexual harassment, violations of company policy, drug and alcohol abuse). Specifying such elements has provided the basis for developing both tools for identifying facets of performance that are of greatest interest in a particular situation and predictors of these facets (Rotundo and Sackett, 2002; Campbell et al., 1993).

2 Consolidated Appropriations Act of 2016. Public Law No. 114-113, § 11001, 129 Stat. 673 (2015) (p. 673).
Cognitive ability testing is an example of how this new insight into the multidimensionality of job performance has influenced thinking and practice in personnel selection. Before this differentiated perspective emerged, it was generally accepted that cognitive ability tests were the best single predictor of job performance. Now, however, it has become clear that while such tests forecast certain aspects of performance well, they are of little to no value in forecasting others. A meta-analysis of studies on the relationship between cognitive ability and various aspects of job performance revealed strong relationships for task performance, more modest relationships for citizenship, and a near-zero relationship for counterproductive behavior (Gonzalez-Mulé et al., 2014).
These developments in the evaluation of potential job performance make it possible for organizations to identify the facets of performance in which they have the greatest interest and design selection systems accordingly. One likely outcome of this approach is a more elaborate selection system, with different predictors targeting different facets of performance. These developments have clear utility for the IC. For example, although Intelligence Community Directive (ICD) 203, issued in 2007, set standards regarding the production and evaluation of intelligence analysis and analytic products in order to improve the critical thinking and writing skills of analysts across the IC (Miles, 2016), some observers note that one of the largest issues in analysis is the hiring of new employees who lack adequate critical thinking and writing skills (Gentry, 2015). While the committee did not have access to complete information on selection practices within the agencies of the IC, the input we received indicated that the primary focus is on selecting for the cognitive aspects of the analyst’s job: obtaining, evaluating, and synthesizing information and presenting one’s findings, commonly in written form.
The cognitive demand on analysts is surely considerable—the analytic workforce performs widely varying functions in an often fast-paced, unpredictable, high-stress, and largely classified environment (Pirolli et al., 2004), as demonstrated by the discussion of global challenges and the role of the
analyst in Chapters 3 and 4, respectively. Analysts are also under persistent pressure not to miss critical signals (Connors et al., 2004). When a surprise occurs, it is often considered to be an “intelligence failure,” most often a failure in analysis; the terrorist attacks of September 11, 2001, and the claim that Iraq possessed weapons of mass destruction are two examples often cited (Johnston, 2005; Lowenthal and Marks, 2015).
In addition to core tasks associated with their fields of expertise and assigned accounts, analysts must coordinate and collaborate with peers and supervisors. They must carefully weigh the best and most accurate way to present complex assessments in situations that may be momentous, urgent, and politically charged. And they must shepherd an analytic report through a review process in which they must defend their analysis while being open to useful changes. These are but a few examples of a broader view of the analyst’s job, with implications for the attributes to be assessed in a selection system. It is critical to clarify from the outset which facets of performance are of interest, and to design a selection system accordingly.
Perfect prediction of performance is an unattainable ideal. Performance is affected by a large number of factors—such as the support and mentoring available from supervisors and peers within the workplace and life circumstances (e.g., marital difficulties, child- or eldercare issues)—some of which are unknowable when individuals are hired and all of which can significantly influence a worker’s performance. Thus the effectiveness of a selection system is best viewed in terms of improvement over a baseline level: a well-developed selection system might, for example, allow an employer to move from having 50 percent of hires meet expectations to having 70 percent do so. Without knowing what percentage of the IC analytic workforce meets expectations or what the workforce’s rate of turnover is, the committee assumes that improvement in this regard is an objective.
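This baseline-improvement logic can be made concrete with a short simulation in the spirit of the classic Taylor–Russell approach from the selection literature; the sketch below is our illustration, not a method prescribed here. It assumes predictor scores and later performance are bivariate normal and defines “meeting expectations” as performing above the applicant median.

```python
import math
import random

def simulated_success_rate(validity, selection_ratio, n=200_000, seed=7):
    """Monte Carlo estimate of the share of selected hires who 'meet
    expectations' (true performance above the applicant median), assuming
    a bivariate-normal link between predictor score and performance."""
    rng = random.Random(seed)
    applicants = []
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)  # observed predictor score
        # Latent performance correlates with x at the stated validity.
        y = validity * x + math.sqrt(1.0 - validity ** 2) * rng.gauss(0.0, 1.0)
        applicants.append((x, y))
    applicants.sort(reverse=True)  # highest predictor scores first
    hired = applicants[: int(n * selection_ratio)]
    return sum(1 for _x, y in hired if y > 0.0) / len(hired)

# With a worthless predictor the success rate stays near the 50 percent
# base rate; a validity of 0.5 with a 30 percent selection ratio raises
# it to roughly 70-75 percent in this simulation.
baseline = simulated_success_rate(validity=0.0, selection_ratio=0.3)
improved = simulated_success_rate(validity=0.5, selection_ratio=0.3)
```

The gain depends on the base rate and the selection ratio as well as validity, which is why effectiveness is best stated relative to a baseline.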
Both initial development costs and per-person administration costs for sophisticated selection systems with multiple components can be substantial, but these costs must be evaluated relative to potential benefits. What is the value to the organization of, say, a 20 percent increase in the number of high-performing employees? Utility analysis is an approach used by industrial-organizational psychologists and human resources specialists to systematically balance multiple possible benefits and costs. Utility models express the benefits of selection systems using a variety of metrics, from percentage improvement in the proportion of hires reaching a specified performance threshold to increase in the dollar value of performance per employee per year resulting from the implementation of a selection system. Pushback against the initial costs of developing a selection system is not
uncommon, and utility analysis is a useful tool for identifying benefits an improved workforce can bring. (See Cascio and Cascio and Scott for discussion of utility analyses applied to selection systems.)
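One widely used utility model in the industrial-organizational literature, the Brogden–Cronbach–Gleser formula, expresses this balance of benefits and costs directly. The sketch below applies it with purely hypothetical figures.

```python
def bcg_utility(n_hired, tenure_years, validity, sd_performance_dollars,
                mean_predictor_z, n_applicants, cost_per_applicant):
    """Brogden-Cronbach-Gleser utility estimate:
    gain  = N_hired * T * r_xy * SD_y * z_bar (dollar value of better hires)
    cost  = N_applicants * cost per applicant assessed."""
    gain = (n_hired * tenure_years * validity
            * sd_performance_dollars * mean_predictor_z)
    cost = n_applicants * cost_per_applicant
    return gain - cost

# Hypothetical figures: hire 50 analysts from 500 applicants; average
# tenure 5 years; validity 0.4; SD of performance value $20,000 per year;
# mean standardized predictor score of those selected 1.09; $300 per
# applicant to administer the selection system.
delta_u = bcg_utility(50, 5, 0.4, 20_000, 1.09, 500, 300)
# Net gain is roughly $2.03 million over the hires' tenure.
```

Even a modest validity can yield a large net benefit when many hires are made and performance differences carry substantial dollar value, which is why initial development costs are best judged against this payoff.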
Researchers studying selection have recently focused on troubling findings that are difficult to reconcile with expectations. For example, elaborate job simulations have been developed for use in evaluating job candidates, yet a meta-analytic comparison found that a brief multiple-choice test of cognitive ability had greater predictive power than a lengthy (e.g., 1- to 2-day) set of simulation-based assessments known as an assessment center (Schmidt and Hunter, 1998). A recent study has shed new light on this unexpected finding. Sackett and colleagues (2017a) note that meta-analytic comparisons of the validity of two predictors are questionable unless both predictors are compared using the same job and the same facet(s) of job performance. They found that in head-to-head comparisons in which both predictors were used with the same sample with the goal of predicting the same criteria, the assessment center had twice the predictive power of the short ability test. This evidence supports the utility of complex job simulations.
There is strong evidence that the quality of selection decisions in virtually any organization can be improved with the use of scientifically based selection systems. Many organizations still rely on more casual and informal approaches to selection, using, for example, some form of resumé screening to reduce the applicant pool to a manageable size and then basing decisions primarily on interviews. The committee did not have access to information that would indicate where IC entities would fall on a continuum from informal to sophisticated selection approaches. Individuals who made presentations to the committee3 did, however, describe the selection strategies used in some IC agencies and offered their suggestions for improvement. Among the attributes suggested for inclusion in a selection system were critical thinking skills, the ability to write, and the ability to prepare effective briefings; teamwork skills; openness to feedback; curiosity; a proactive personality; an orientation toward learning goals; the courage to deliver an unwanted message when necessary; the intellectual humility to consider that one’s conclusions may be in error; and a willingness to stand one’s ground under pressure.
3 The committee held a workshop on workforce issues pertaining to national security on January 24, 2018; a summary of the workshop presentations and discussions can be found at http://sites.nationalacademies.org/DBASSE/BBCSS/DBASSE_183502. See also http://sites.nationalacademies.org/DBASSE/BBCSS/DBASSE_178412 for white papers on this topic.
The research literature on employee turnover and retention is vast and nuanced. This is a topic that attracts a broad range of scholars, from psychologists to labor economists, and research has focused on issues affecting individual employees as well as their collective workgroups and organizations. Here we summarize findings from key research domains that could be applied to the job of the intelligence analyst, ranging from classic findings to modern advances.
Although employee turnover has been an active research topic since the early 20th century, early investigations were largely atheoretical: only
in the 1950s did researchers begin to articulate formalized theories about why people leave their jobs. The first attempt at a comprehensive theory of turnover was offered by James March and Nobel Prize winner Herbert Simon (1958) in their seminal book Organizations. The authors posited that turnover is influenced by (1) the desirability of movement (e.g., job dissatisfaction makes other jobs appear more attractive) and (2) ease of movement (e.g., perceptions of job opportunities).
The March–Simon model sparked both research on predictors of turnover and efforts to develop conceptual models of its determinants. Perhaps most notably, Mobley (1977) identified intermediate psychological processes that could explain how employees arrive at the decision to leave their jobs. Mobley proposed that turnover decisions result from a chain of cognitions and emotions in a process of (1) evaluating one’s job, (2) experiencing (dis)satisfaction, (3) thinking of quitting, (4) considering the costs and benefits of quitting, (5) developing an intention to search for alternatives, (6) following through with a search, (7) evaluating alternatives, (8) comparing alternatives with one’s present job, (9) developing an intention to quit or stay, and (10) choosing to quit or stay. Although he posed this as a linear process, Mobley noted that the strict progression of options need not occur in all cases and that choices might also be influenced by nonjob factors (e.g., relocating because of family members), the availability of unsolicited opportunities, or impulsive decisions to quit. Elaborations on this model included more processes and predictors of turnover. Mobley and colleagues (1979), for example, emphasized the role of expected utility judgments (i.e., evaluations regarding the likelihood that opportunities will provide desired outcomes) in turnover decisions.
The many predictive variables identified during the early days of research on turnover have been summarized in several meta-analyses over the past two decades. Of particular note is the work of Griffeth and colleagues (2000), who showed that turnover is not reliably predicted by such individual characteristics as cognitive ability, education, training, tenure, or demographic factors, but is predicted by variables from the work context. These authors found, for example, that employees were less likely to leave their jobs when they reported more job satisfaction, supervisory satisfaction, coworker satisfaction, role clarity, participation, and organizational commitment. Turnover also was less likely among employees with more tenure (but only in samples with an average age under 40), as well as among those who exhibited high performance within a year of the performance assessment. On the other hand, employees were more likely to leave their jobs when they reported greater role overload, role conflict, overall stress, alternative job opportunities, comparisons of alternatives with their present
jobs, and turnover intentions/cognitions (e.g., search intentions, intention to quit, thinking of quitting, withdrawal cognitions, expected utility of withdrawal). Withdrawal behaviors such as lateness and absenteeism were also predictive of turnover.
Research on employee turnover underwent a transformation in response to a highly influential paper by Lee and Mitchell (1994) describing an “unfolding” model of turnover. In contrast with earlier researchers, who focused on job attitudes and utility judgments as predictors of turnover, these authors used qualitative methods to describe the types of events that precipitated workers’ job searches. They described how disruptive events they called “shocks” can lead employees to quit and identified four paths, or sequences of events and cognitions, that can lead to stay/quit decisions. One path is similar to the one established in the earlier turnover research (e.g., Mobley, 1977), in which job satisfaction and searches for alternatives predict turnover decisions. The other three paths are initiated by shocks that prompt workers to evaluate their options and determine which option best matches their mental image of a preferred job situation. Path 1 is activated by an event that prompts action on a preexisting turnover plan (e.g., a pregnancy prompts a worker to quit in order to devote time to parenting, consistent with an existing plan to quit upon becoming pregnant). Path 2 is activated when a negative event prompts a reevaluation of the worker’s fit with the job, which leads to quitting if there is a misfit (e.g., a request to falsify financial records prompts a worker to quit because of misalignment between the worker’s and manager’s morals). Here the “shock” is so severe that the worker quits before initiating a search for alternatives. Finally, Path 3 is activated when an event prompts a worker to actively seek out and evaluate alternative job opportunities; if the search turns up an alternative that fits with the worker’s preferred job image, the worker is inclined to quit his or her current job.
After introducing their unfolding model, Mitchell and colleagues (2001) again changed the direction of the field by shifting the focus from determinants of leaving to determinants of staying. They introduced the concept of “job embeddedness,” which represents a collection of work and nonwork factors that encourage people to remain in their jobs. They proposed three general categories of embeddedness mechanisms that can
predict staying: (1) links (i.e., connections with people, teams, and groups, both at and outside of work); (2) fit (i.e., workers’ compatibility with their job, organization, and community); and (3) sacrifices (i.e., things one would have to give up if one took another job). The authors also posited that embeddedness helps predict both turnover intentions and voluntary turnover, even after accounting for the predictive power of job attitudes, job alternatives, and job search behaviors.
Job embeddedness can be combined with the unfolding model for a more comprehensive explanation of job turnover. According to Hom and colleagues (2012), a simple stay/leave dichotomy is not always useful because “everyone eventually leaves; no one stays with an organization forever” (p. 831). They identified different types of people who stay and who leave and broadened the range of turnover criteria that researchers should consider. In this paradigm, it is not enough simply to know whether an employee quits. It is also valuable to know (1) whether the employee wants to leave or stay, (2) whether the employer is exerting pressure for the employee to leave or stay, and (3) whether extrinsic factors (e.g., pay, benefits, perks) exert pressure for the employee to leave or stay. The combination of these three factors, along with the decision to stay or quit, gives rise to four general “proximal withdrawal states”: (1) enthusiastic stayers, (2) reluctant stayers, (3) reluctant leavers, and (4) enthusiastic leavers (i.e., the type of leaver who is of interest in most historical turnover research). These general categories can be further broken down to understand workers’ motives. Workers from different categories are motivated by different forces, demonstrate different types of attitudes and behaviors in response to their work situations, take different amounts of time to exit the organization, and exit for different reasons (e.g., another job, retirement, disability). This model emphasizes that not all turnover is bad turnover: some employees exit an organization as a result of pressure from the organization to leave, in reaction to a poor performance appraisal, or because of low perceived fit. Similarly, not all turnover, such as that due to retirement, disability, or familial relocation, is avoidable. 
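The four proximal withdrawal states amount to a small decision table. The sketch below collapses the several pressure factors into a single stay/leave preference, a simplifying assumption rather than the full model.

```python
def withdrawal_state(wants_to_leave: bool, actually_leaves: bool) -> str:
    """Classify an employee into one of the four general proximal
    withdrawal states (Hom et al., 2012), reducing the desire and
    pressure factors to one overall preference (our simplification)."""
    if actually_leaves:
        return "enthusiastic leaver" if wants_to_leave else "reluctant leaver"
    return "reluctant stayer" if wants_to_leave else "enthusiastic stayer"

# A worker who would prefer to go but stays (e.g., held in place by
# benefits or family constraints) is a reluctant stayer.
state = withdrawal_state(wants_to_leave=True, actually_leaves=False)
```

In the full model, each cell is further differentiated by the source of the pressure (employer versus extrinsic factors), which is what allows the taxonomy to distinguish, for example, avoidable from unavoidable exits.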
By clarifying whether turnover is functional/dysfunctional and avoidable/unavoidable, the taxonomy of proximal withdrawal states is useful for making sense of how, why, and when employees leave an organization.
A relatively recent development in research on turnover is the study of “turnover contagion,” which occurs when an individual’s turnover is influenced by that of coworkers. Felps and colleagues (2009) proposed
that workers’ job search behaviors can be influenced by coworkers through social comparison processes. When workers observe colleagues updating resumés, searching for job postings, or taking time off from work for interviews, they can be provoked to begin engaging in their own turnover cognitions (e.g., “If so-and-so is trying to leave, should I start looking for other opportunities, too?”). In support of the theory of turnover contagion, Felps and colleagues (2009) found that coworkers’ job search behaviors predicted individual turnover above and beyond individual job attitudes, coworker job attitudes, and other individual- and group-level predictors. In terms of the unfolding model, seeing coworkers engage in job search activities could be viewed as a “shock” that initiates an individual’s own evaluation of present and alternative opportunities.
The notion that turnover is driven by unmet expectations has motivated a large body of work on ways to give potential new employees a realistic job preview to limit inaccurate expectations. Work on such previews has explored the association between their use and both affective and behavioral consequences on the job, including turnover. Meta-analyses of the relevant research have identified small but statistically reliable associations between the use of previews and lower turnover (Earnest et al., 2011; Phillips, 1998; Premack and Wanous, 1985). Experiments in the use of previews have been conducted in a variety of settings with varying degrees of fidelity, so it is particularly valuable to note that the association between previews and turnover is slightly stronger in field than in laboratory studies (Earnest et al., 2011). While previews have shown benefit when presented both before and after hiring (Earnest et al., 2011; Phillips, 1998), the most recent meta-analytic evidence suggests that the effect of posthire previews is slightly stronger (Earnest et al., 2011).
Although it has been presumed that previews are effective because they increase the likelihood that new hires’ expectations will be met, recent work suggests that the primary mechanism may be perceptions of employers’ honesty. Earnest and colleagues (2011) meta-analytically tested met expectations, role clarity, perceptions of honesty, and attraction as mediators between previews and turnover. These researchers found that previews had a significant effect on turnover through perceptions of honesty, but not through the other mechanisms.
We have focused thus far on research examining what prompts individuals to stay or leave. But researchers have also explored the issue on the aggregate level, examining “collective turnover,” where the unit of analysis is the group or organization (Hausknecht, 2017). Here the outcome is the rate of turnover in the group or organization, and aggregate-level variables (e.g., human resources practices) are examined as predictors of turnover rates. In their meta-analysis, Hancock and colleagues (2017) found less collective turnover in units with high-commitment human resources systems (i.e., those whose practices increase employees’ sense of investment in the organization); more personnel changes; and lower turnover intentions. Less turnover also is seen in units with stronger perceptions of cohesiveness and teamwork, management and leadership quality, satisfaction, commitment, climate and culture, and justice and fairness.
Keeping employees in the organization once they have joined is a focus of the retention literature, but a related important issue is whether the pool of recruits persists through the application process or drops out along the way. Observers have noted that while the IC identifies and attracts highly qualified individuals, many candidates opt not to join the IC workforce because of the lengthy and difficult hiring process (Miles, 2016). A prospective employee may wait as long as 20 weeks after submitting an application before receiving approval for a position at the Office of the Director of National Intelligence (ODNI).4
Despite decades of research on turnover and a large volume of published findings on the subject, it remains unclear whether those findings generalize to the IC. Many turnover studies are conducted in high-volume employment settings (e.g., fast food restaurants, retail stores, call centers), and meta-analyses on turnover have not explored effects at the level of specific industries/occupations/jobs with confidence because of the limited availability of data points. Thus, the relationships described above may or may not generalize to the IC setting. However, the literature can certainly provide a set of testable hypotheses that could be examined in the IC context.
Further developing existing knowledge and skills and acquiring new ones are key objectives for most workers, including intelligence analysts. IC leadership has identified “improved opportunities for achieving and maintaining mastery in their chosen career fields” as important for IC leaders and staff and for maintaining and sustaining “current and emerging mission success” (Office of the Director of National Intelligence, 2014, p. 7). These opportunities include acquiring new knowledge and skills to increase effectiveness in one’s current role, as well as to advance to a broader career path. Organizations commonly invest effort in helping workers acquire and develop knowledge and skills that are relevant to their current roles or progression to new roles within the organization (rather than learning that would help individuals pursue a change of career and/or organization).
Organizations foster learning by providing training, whether formal (e.g., classroom-based) or informal (e.g., offered by mentors), as well as
by supporting autonomous learning, in which the individual identifies the need for new knowledge and/or skills and a mechanism for acquiring them. All of these types of learning will be increasingly critical for intelligence analysts as new technologies become available and must be quickly assimilated into their workflow and as the nature of technologically based conflict evolves. Indeed, the analyst’s job by its very nature requires constant adaptation and learning.
Formal training, which is provided by most organizations, has been studied extensively (see, e.g., Aguinis and Kraiger, 2009; Salas and Cannon-Bowers, 2001). However, informal training is increasingly widespread: according to one study, it accounts for up to 75 percent of learning in organizations (Bear et al., 2008). Formal training is common for new employees who may not be fully ready to step immediately into the job, but is also used to help employees remain current and/or become more effective as the job and work environment change. This type of formal training may be a one-time occurrence (e.g., a change in technology requires updating skills), but the nature of the work may call for continuous learning, an ongoing process of updating and developing skills. Some employees are predisposed to orient themselves toward continuous learning, while others are not. A continuous learning orientation is also influenced by organizational values, and organizations use various measures to create incentives for and otherwise support such an orientation.
The IC has identified continuous learning as an important component in developing the analytic workforce. ODNI’s Human Capital Vision 2020 (Office of the Director of National Intelligence, 2014) identifies continuous learning as one of three focus areas for the IC, with the stated goal that “leaders and staff will have improved opportunities for achieving and maintaining mastery in their chosen career fields, resulting in a workforce better able to maintain and sustain current and emerging mission success” (p. 7).
Organizations and researchers have focused on formal means of providing continuous learning, but autonomous learning, also termed “informal” or “self-directed” learning, is an emerging research frontier (see Ellingson and Noe, 2017). Among the questions researchers have begun to explore is which organizational factors facilitate autonomous learning. For example, Tannenbaum and colleagues (2010) identify as a key factor an organizational climate that signals to employees that learning is a valued activity. A considerable body of research documents the importance of this factor with respect to formal training, and while the same might be hypothesized for autonomous learning, more empirical research on this linkage is needed. Other key organizational factors identified by Tannenbaum and colleagues (2010) include the opportunity to obtain feedback, tolerance of error as employees attempt to use new skills, support and encouragement from supervisors and peers, and the allocation of time for autonomous learning.
Tannenbaum and colleagues (2010) also identify individual factors that facilitate autonomous learning. One such factor is learning motivation, which has multiple facets—motivation to improve and learn (i.e., setting learning goals), motivation to take action in pursuit of these goals, motivation to seek feedback, and motivation to reflect upon one’s learning experiences (see NASEM [2018a] for more on motivation and learning). Other facets include an internal locus of control (i.e., to what extent a person attributes outcomes to his or her own efforts and abilities), higher self-esteem, a feedback-seeking orientation, and a learning goal orientation (as opposed to a performance orientation, which does not facilitate the needed exploration and trial and error).
Other research delves more deeply into motivational processes underlying autonomous learning. For example, Vancouver and colleagues (2017) explored the application of self-regulatory theories5 to goal setting in autonomous learning. The authors highlight the importance of the concept of self-efficacy, which encompasses an individual’s assessment of (1) her current capability for the task in question, (2) the potential level of capability she might attain with investment in learning, (3) the likelihood that she will reach this potential, and (4) her ability to regulate the processes needed for this learning to take place. Other researchers focus on the distinction between a current state (e.g., current skill level) and a desired state (e.g., desired skill level). The central mechanism in these theories is a feedback loop between the current and desired states, with the discrepancy between the two driving action.
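The discrepancy-driven feedback loop at the heart of these theories can be sketched numerically. The parameter values below are illustrative only and are not drawn from the cited studies:

```python
# Minimal sketch of a self-regulatory feedback loop: the discrepancy
# between a desired and a current skill level drives learning effort,
# which in turn shrinks the discrepancy. Values are illustrative only.

def simulate_learning(current: float, desired: float,
                      learning_rate: float = 0.3, steps: int = 10) -> list[float]:
    trajectory = [current]
    for _ in range(steps):
        discrepancy = desired - current          # the driving signal
        current += learning_rate * discrepancy   # effort proportional to gap
        trajectory.append(current)
    return trajectory

levels = simulate_learning(current=2.0, desired=8.0)
# effort (the per-step gain) is largest early on and decays as the
# current state approaches the desired state
assert levels[1] - levels[0] > levels[-1] - levels[-2]
```

The loop never quite reaches the desired state; in richer self-regulatory models, the learner also revises the desired state itself, for example when self-efficacy judgments change.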
Self-regulatory frameworks have been used to examine a variety of questions relevant to autonomous learning. While much of the research examines pursuit of a single goal, an increasing number of researchers are recognizing that in many settings, individuals face competing goals. Autonomous learning in an organization is a classic example because time devoted to new learning is time not spent on current tasks. The process used to allocate effort in the face of competing goals is illuminated by a combination of research (e.g., Schmidt and DeShon, 2007) and computational modeling (Vancouver et al., 2010). This work has revealed that initial effort commonly is allocated to the task that is characterized by the greatest discrepancy between the current and desired states, and that this allocation shifts as performance deadlines approach. Halper (2015) provides another example, explicitly in the learning context, by examining the choice of whether to allocate learning effort to further build a strength or address a weakness. She found that if individuals were going to be evaluated in both areas, learning effort would focus on the weakness, whereas if individuals
could choose the area in which to be evaluated, they would choose the area of strength and allocate their learning effort to that area.
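Under the stated assumption that effort flows each period to the goal with the largest remaining discrepancy, the allocation dynamic described above can be sketched as follows; the update rule and numbers are illustrative and are not taken from the cited models:

```python
# Sketch of effort allocation across competing goals: each period, effort
# goes to the goal with the greatest remaining discrepancy between current
# and desired state (cf. Schmidt and DeShon, 2007; Vancouver et al., 2010).
# The update rule and values here are illustrative only.

def allocate_effort(goals: dict[str, tuple[float, float]],
                    periods: int = 6, gain: float = 2.0) -> list[str]:
    """goals maps name -> (current, desired); returns the focus per period."""
    state = {name: current for name, (current, _) in goals.items()}
    targets = {name: desired for name, (_, desired) in goals.items()}
    focus_log = []
    for _ in range(periods):
        # pick the goal with the greatest current-vs-desired discrepancy
        focus = max(state, key=lambda g: targets[g] - state[g])
        focus_log.append(focus)
        state[focus] += gain                     # progress on the chosen goal
    return focus_log

log = allocate_effort({"current tasks": (5.0, 10.0), "new learning": (0.0, 10.0)})
# early effort flows to the larger gap ("new learning"), then attention
# begins to alternate as the two discrepancies converge
```

A fuller model would also weight discrepancies by deadline proximity, reproducing the finding that allocation shifts as performance deadlines approach.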
As discussed in Chapter 7, fundamental capacity limits affect all humans. The tempo and complexity of intelligence analysis, including the need to assess large amounts of information, can tax or overload the processing capacity of the human agent. Mental and physical fatigue can in turn markedly reduce attention, leading to impaired performance and decreased efficiency (Grandjean, 1979). It is worth noting that these forms of fatigue—mental and physical—are at least partially independent; mental fatigue as manifested by performance deficits, for example, can occur in the absence of muscular or physiological fatigue (e.g., Van Cutsem et al., 2017). The ability to recognize the cause of and possibly mitigate cognitive decline related to mental fatigue would have clear benefits for the intelligence analyst, who often works under substantial time pressure and whose work may have life-and-death implications. There are many ways to reduce mental fatigue (e.g., promoting sleep, improving diet, increasing exercise), but the focus here is on methods based in neuroscience (including psychopharmacology), as they have the greatest potential to bring new benefits to the IC in the coming decade.
The study of neuroergonomics integrates theories and principles of ergonomics6 and neuroscience to provide insights into brain function and behavior in a variety of settings, including at work (Parasuraman, 2011). The field of ergonomics originally focused on ways to increase productivity, particularly within physical work environments, but has expanded to encompass interactions between humans and the work ecosystem and ways to improve human well-being and overall ecosystem performance (Karwowski et al., 2003). One of the most widely studied topics in ergonomics is mental—not physical—workload (Wickens and McCarley, 2008).7 Although neuroergonomics and neuroscience are distinct fields, studies and assessments in neuroergonomics rely heavily on current neuroimaging methods (see Box 8-2).

6 The terms “ergonomics” and “human factors” are synonymous and used interchangeably.
Researchers who work on resource theory have also contributed to understanding of the importance of cognitive workload for workforce performance. Study in this area has shown that, for tasks that are not highly automatic, performance is directly proportional to the use of attentional resources. Multiple cognitive resources are available and may be recruited in varying combinations when two or more tasks, such as driving and texting, are performed simultaneously (Wickens, 1984, 2002; see also Wickens and McCarley, 2008; Mehta and Parasuraman, 2013). Attentional distractions can be external (e.g., from the environment) or internal (e.g., mind wandering), and both can hinder vigilance and performance, particularly in tasks requiring sustained attention.

7 Mental workload that is too high or too low can cause human–system performance to decline, so workload assessments are conducted on existing systems and during the design phases of new systems, such as those described in Chapter 7.
Cognitive workload is directly related to human agents’ vigilance and their cognitive fatigue in work environments. Vigilance on critical tasks decreases with time spent on the task because cognitive resources get depleted (Davies and Parasuraman, 1982; Warm et al., 2008). A number of mitigation strategies can be used to address this decline in vigilance, depending on the nature of work tasks and settings. While loss of vigilance can be mitigated by reduced work hours and more frequent breaks, for example, these strategies may not be appropriate for time-sensitive intelligence analysis.
The use of “cueing” has been shown to improve performance on tasks that require vigilance (such as detecting particular phenomena), while also reducing or eliminating the decline in vigilance associated with time spent on the task (Wiener and Attwood, 1968). Cueing allows human agents to manage their information-processing (cognitive) resources by reducing the time during which they must remain vigilant. With cueing, the agent is prompted to monitor a display for the arrival of a signal, whereas without cueing, the agent must monitor the display at all times in anticipation of a signal, thereby using more cognitive resources. Cueing may therefore be an appropriate strategy for mitigating cognitive load on such tasks as monitoring input from human–machine teaming or input from sensors and other sources (see Chapter 7).
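The resource-conserving logic of cueing can be made concrete with a toy comparison of required monitoring time, using entirely assumed numbers:

```python
# Toy illustration of why cueing conserves attentional resources: without
# cueing the agent must monitor the display for the whole shift; with
# cueing the agent monitors only during cued windows. Numbers are assumed.

def monitoring_time(shift_minutes, cue_windows=None):
    """Minutes of active monitoring required of the agent (floats)."""
    if cue_windows is None:                       # no cueing: always vigilant
        return shift_minutes
    return sum(end - start for start, end in cue_windows)

shift = 480.0                                     # a hypothetical 8-hour shift
cued = [(60.0, 75.0), (200.0, 230.0), (400.0, 420.0)]

print(monitoring_time(shift))                     # 480.0 minutes vigilant
print(monitoring_time(shift, cued))               # 65.0 minutes vigilant
```

The savings depend entirely on the reliability of the cue: a cue that misses signals transfers risk rather than workload, which is why cueing is best suited to monitoring feeds where machine detection is dependable.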
The use of a visual or auditory warning signal to refocus the human agent has been studied extensively in the context of driving. This work has shown, for example, that auditory warnings improve performance in drivers experiencing fatigue (and thus cognitive decline) (Lin et al., 2009, 2010). In these studies, however, the auditory feedback is provided after cognitive lapses (e.g., nonresponsiveness to simulated lane departures), and such delays in warning could have catastrophic results depending on the task at hand. To implement more timely warnings, Wang and colleagues (2014) proposed a smartphone-based system that would first detect and track the human agent’s cognitive state and then deliver an arousing signal should cognitive fatigue be detected. Other researchers are pursuing mobile brain imaging techniques for workload and fatigue assessment. Gramann and colleagues (2011) identified two critical requirements of such systems: (1) robust mobile sensor technology to measure brain activity and (2) powerful computational software to process and analyze the data. The committee recognizes the potential security issues posed by wireless and mobile technologies in classified environments, but nevertheless notes the possibilities afforded by these technologies, which will continue to be refined and will likely be implemented in non-IC workforces over the next decade.
Another possible approach to mitigating decreased vigilance and cognitive fatigue is the use of noninvasive8 brain stimulation. Noninvasive stimulation of the peripheral parasympathetic vagus nerve, for example, yields a specific pattern of brain activation and has been shown to have mood-elevating effects (Hein et al., 2013; Kraus et al., 2013). In this regard, the U.S. Department of Defense (specifically the Defense Advanced Research Projects Agency [DARPA]) has sponsored a program, Targeted Neuroplasticity Training, aimed at determining whether peripheral nerve stimulation can enhance learning and the development of cognitive skills. Research has also indicated that noninvasive transcranial direct current stimulation of the brain improves performance in professional athletes (e.g., Okano et al., 2015). Noninvasive stimulation techniques have been used as well to modulate brain activity in order to improve performance on cognitive or motor tasks and to reduce or eliminate normally occurring performance declines in tasks requiring vigilance (Coffman et al., 2014). Follow-up work (e.g., Nelson et al., 2014) has explored questions about the most effective timing for applying noninvasive stimulation techniques and other questions, some in a military context (Nelson et al., 2016).
This line of research suggests that enhancement of attentional performance in the absence of sleep deprivation or other fatigue-inducing factors can be effective, but additional research is needed to examine the long-term effectiveness of noninvasive brain stimulation as a strategy for mitigating decreased task vigilance and cognitive fatigue. Further, because noninvasive brain stimulation methods require the application of either magnetic pulses or a small electric current to the scalp, careful consideration of the ethical use of such devices to improve work performance is needed.
Pharmacological solutions to cognitive fatigue have also been investigated. Examples include an uningested mouth rinse containing caffeine and maltodextrin to counter mental fatigue (Van Cutsem et al., 2018) and modafinil (a weak dopamine reuptake inhibitor and neuropeptide activator) to keep sleep-deprived individuals awake (Fernández et al., 2015). Further research on the efficacy of such approaches in an IC context is needed.
Physiologically based methods for cognitive enhancement are not limited to the neuroscience/neuroimaging and pharmacological domains. Any procedure capable of enhancing ongoing neural processing may have performance-enhancing effects in some domains or contexts. Further, neuroscience-based interventions do not necessarily require sophisticated instrumentation. One ongoing line of research concerns the effects of the circadian rhythm on behavior and cognition, an active area of investigation in the basic science community as well as the military (see, e.g., the Military Health System Research Symposium [MHSRS], August 2017). Even relatively simple manipulation of aspects of the environment, such as illumination levels and light frequencies, may hold potential for enhancing performance (Zakerian et al., 2018).

8 Noninvasive in this context refers to a technique that entails no incisions or insertions in the body.
Likewise, researchers have noted a number of possible cognitive benefits of automation in the workplace, including a reduction in workload, fatigue, and stress induced by high-stakes situations. Automation can free attentional resources that can be allocated to other tasks, decrease the occurrence of human errors, and improve data monitoring and analytic capabilities (Breton and Bossé, 2003). The use of adaptive automation (such as that described in Chapter 7) may reduce the cognitive load for the human agent, particularly during emergencies, time-sensitive tasks, and other circumstances that increase task load. In adaptive automation, the workload assigned to human and machine agents becomes flexible to allow for greater use of automation during conditions of high task load (e.g., emergencies) and less use during normal operations (Lintern, 2012). Researchers have investigated neuroergonomic assessments and other measures that could be used to incorporate such adaptive systems into the workflow (Byrne and Parasuraman, 1996; Ting et al., 2010) by assessing the agent’s state in real (or near real) time. The accuracy rates of statistical and machine learning techniques (e.g., discriminant analysis, artificial neural networks, Bayesian networks, fuzzy logic) implemented in real time for this purpose has been approximately 62–85 percent (Baldwin and Penaranda, 2012; Berka et al., 2004; Borghetti et al., 2017; Ting et al., 2010; Wang et al., 2011).
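The core loop of an adaptive-automation system can be sketched in a few lines. The simple threshold "classifier" below stands in for the statistical and machine learning techniques cited above; the feature names, weights, and thresholds are all assumed for illustration:

```python
# Sketch of an adaptive-automation loop: a real-time estimate of operator
# workload decides how much of the task is delegated to the machine.
# The threshold classifier is a stand-in for the ML techniques cited in
# the text; feature names, weights, and cutoffs are assumptions.

def classify_workload(heart_rate: float, eeg_engagement: float) -> str:
    """Crude workload estimate from two hypothetical physiological features."""
    score = 0.6 * (heart_rate / 100.0) + 0.4 * eeg_engagement
    if score > 1.0:
        return "high"
    if score < 0.6:
        return "low"
    return "normal"

def automation_level(workload: str) -> str:
    # more automation under high task load, less during normal operations
    return {"high": "machine handles routine subtasks",
            "normal": "shared control",
            "low": "manual with machine monitoring"}[workload]

state = classify_workload(heart_rate=115.0, eeg_engagement=0.9)
print(state, "->", automation_level(state))       # high workload -> delegate
```

A deployed system would replace the hand-set weights with a trained model and would need to cope with the 62–85 percent accuracy range reported above, for example by smoothing classifications over time before changing the automation level.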
Despite the evidence of potential benefits, the utility of automation in the workplace depends on a number of factors, such as the nature of the tasks that are automated, the human oversight required, the age and experience of the workforce, and the accuracy of the automation. Automation can even have a counterintuitive effect on human workload: while it decreases workload, it can also increase workload because the automation itself must be monitored, a task added to the human's responsibilities (Parasuraman, 1987). Other research indicates that task performance, situational awareness, and workload effects depend on the level of automation and the length of time a task is automated (Kaber and Endsley, 2004).9 When designing complex human–machine systems as a mitigation strategy for cognitive fatigue, then, it is important to evaluate both human and machine abilities and limitations, as well as the appropriate levels of automation. Not doing so can lead to safety, security, and performance issues (Habib et al., 2017).

9 In dual-task situations, low levels of automation lead to improved performance, and intermediate levels of automation lead to improved situational awareness on the primary task. But when a greater percentage of the primary task is automated, the human agent is able to shift resources, which leads to improved performance on the secondary task.
Neuroergonomic, pharmacological, and other neuroscience-based methods, as well as biologically based interventions, for assessing attention and arousal, workload and workload capacity, and stress and fatigue hold potential for application in the IC. However, this area is in need of further research.
Biological interventions for cognitive enhancement, be they behavioral, electrical, mechanical, or pharmacological (e.g., nootropic agents), may have the potential for negative or adverse outcomes. Any such approach to performance enhancement must be considered from the standpoint of the risk–reward trade-off. Some of the approaches mentioned here have been studied extensively for safety. For example, transcranial brain stimulation has been the subject of numerous safety reviews that have shown no serious adverse events in tens of thousands of patients and subjects across the health spectrum (Antal et al., 2017; Bikson et al., 2016). Other interventions, however, need to be carefully evaluated for efficacy and safety.
Clear guidelines for the ethical application of such methods in the IC workforce are needed. Moreover, workers’ acceptance of and compliance with their use will depend on input from the intended users into the design of methods for monitoring cognitive workload/fatigue and potential interventions (including adaptive systems) (Mehta and Parasuraman, 2013).
Both relatively settled and emerging findings from the fields of industrial-organizational psychology and human resource development suggest the value of significant modifications and adaptations in the management of the analytic workforce, the selection of candidates, and the way organizations and workflow are structured. While the committee was unable to gather systematic information about current analytic approaches across the IC, it is likely that agencies make use of some of these methods. Thus at least some IC agencies may rightly conclude that they have already incorporated the ideas and approaches discussed in this chapter. It is also likely, however, that at least some agencies have not implemented these ideas and approaches at the scale and with the complexity that are possible, and so would be well served by a more in-depth understanding of the benefits of the existing and emerging research presented here.
The research highlighted in this chapter includes some ideas and approaches that could be integrated immediately into the intelligence analysis workforce, as well as others that will be increasingly applicable as they evolve. The findings, methods, and tools described in this chapter can be used to strengthen the analyst workforce and prepare it to meet emerging security challenges, make optimal use of available technologies, and collaborate with others who are instrumental to analytic success (including machines). The committee sees both (1) opportunities that can be exploited regardless of whether or how quickly innovations in methods for sensemaking and human–machine interactions may occur, and (2) opportunities to transform the workforce to compete with the intelligence efforts of other nations as they also innovate and exploit new technologies.
Translational research is needed, however, to identify specific ways the IC can take advantage of these opportunities. Moreover, the nature of analytic work is a moving target. In the next 10 years, new work challenges, new analytic technologies, and new work practices can be expected to emerge, and new collaborations will become necessary. Selection practices, training regimes, and teamwork requirements will need to be adapted to the new work requirements that will result. Therefore, additional research will be needed to translate the opportunities discussed in this chapter accordingly.
This chapter has reviewed three domains—employee selection, training, and retention—in which well-developed bodies of knowledge can support effective translational research for the IC environment. It has also reviewed a fourth domain—mitigating cognitive fatigue—in which research is still at a relatively early stage, and in which more work is needed and ethical challenges must be addressed before conclusions about applicability to the IC can be drawn. Systematic attention to all of these issues will be essential for the IC to ensure that its workforce is optimized for the future.
Important questions also remain for which research has not yet provided clear answers. An important example is how the IC can most profitably recruit potential employees likely to be interested in—and to succeed at—intelligence analysis. Questions about, for example, the factors that may attract or repel promising young people, and the generational differences that may call for an array of approaches, have also not yet been systematically addressed. Similarly, the use of gaming and simulation in a training context may hold promise for intelligence analysis, but no body of research sufficient to guide the IC in this area currently exists. Answers to these and other questions will be important to the intelligence analysis enterprise moving forward, and the IC will need a mechanism for monitoring ongoing developments in potentially applicable research.
CONCLUSION 8-1: A range of personal attributes—including skills in critical evaluation, writing and presentation, and teamwork; openness to feedback; and a continuous learning orientation—contribute to successful job performance as an intelligence analyst. To strengthen its capacity to select individuals well suited to work as an intelligence analyst, the Intelligence Community (IC) would benefit from
- regularly updating its assessment of the facets of the analyst’s job performance that are of greatest value to the IC and the attributes most useful for selection of personnel for intelligence analysis roles;
- having the capacity to measure a broad range of attributes for use in selecting individuals who possess those attributes; and
- evaluating the predictive power and potential ethical implications of such assessment devices as digital games, gleaning information
about candidates from social media, and using machine learning approaches to extract information from interviews and resumés and develop scoring algorithms.
CONCLUSION 8-2: A large body of social and behavioral sciences research identifies individual and organizational factors linked to employee retention, including employees’ attitudes and engagement, unit cohesiveness, and leader quality, but these factors have not been examined in the Intelligence Community (IC) context. Translational work examining the role of these potential influencing factors could aid in managing retention in the IC.
CONCLUSION 8-3: A systematic review of the degree to which the organizational culture within the agencies of the Intelligence Community supports both organizationally directed training and autonomous learning could provide valuable information for promoting these means of enhancing the skills of the analytic workforce. This review could focus on practices that promote such a culture, including
- opportunities for workers to receive feedback;
- tolerance for error as employees attempt to use new skills;
- support and encouragement from supervisors and peers; and
- allocation of time for autonomous learning.
CONCLUSION 8-4: Emerging research indicates that developing tools and methods could be used to assess and mitigate issues related to the effects of work in the high-stress environment of intelligence analysis, including cognitive fatigue, reduced attention, impaired performance, and decreased efficiency. Possibilities include the application of neuroergonomics (e.g., cueing, visual or auditory warning signals, automation); neuroscience (e.g., noninvasive brain stimulation); and neuropharmacology. The development of effective and safe tools and methods ready for implementation would require (1) research on the utility and applicability of these methods in the Intelligence Community environment, and (2) careful consideration of safety and ethical issues related to their use.
CONCLUSION 8-5: To fully benefit from research findings relevant to the development of an optimal analytic workforce, the Intelligence Community (IC) would need to invest in research and evaluation to guide their application in the context of intelligence analysis. Translating key insights about the selection, training, and retention of, and support for, the IC analytic workforce will itself require a team approach in which members of the IC, social and behavioral sciences (SBS) researchers, applied scientists, and others collaborate to adapt the approaches discussed here to the IC context and assess their effectiveness.
Agrawal, D., Bersin, J., Lahiri, G., Schwartz, J., and Volini, E. (2018). Introduction: The rise of the social enterprise. Deloitte Insights, March 28. Available: https://www2.deloitte.com/insights/us/en/focus/human-capital-trends/2018/introduction.html [December 2018].
Aguinis, H., and Kraiger, K. (2009). Benefits of training and development for individuals and teams, organizations, and society. Annual Review of Psychology, 60, 451–474. doi:10.1146/annurev.psych.60.110707.163505.
Antal, A., Alekseichuk, I., Bikson, M., Brockmöller, J., Brunoni, A.R., Chen, R., Cohen, L.G., Dowthwaite, G., Ellrich, J., Flöel, A., Fregni, F., George, M.S., Hamilton, R., Haueisen, J., Herrmann, C.S., Hummel, F.C., Lefaucheur, J.P., Liebetanz, D., Loo, C.K., McCaig, C.D., Miniussi, C., Miranda, P.C., Moliadze, V., Nitsche, M.A., Nowak, R., Padberg, F., Pascual-Leone, A., Poppendieck, W., Priori, A., Rossi, S., Rossini, P.M., Rothwell, J., Rueger, M.A., Ruffini, G., Schellhorn, K., Siebner, H.R., Ugawa, Y., Wexler, A., Ziemann, U., Hallett, M., and Paulus, W. (2017). Low intensity transcranial electric stimulation: Safety, ethical, legal, regulatory and application guidelines. Clinical Neurophysiology, 128(9), 1774–1809. doi:10.1016/j.clinph.2017.06.001.
Austin, J.T., and Vancouver, J.B. (1996). Goal constructs in psychology: Structure, process, and content. Psychological Bulletin, 120(3), 338−375. doi:10.1037/0033-2909.120.3.338.
Baldwin, C.L., and Penaranda, B. (2012). Adaptive training using an artificial neural network and EEG metrics for within- and cross-task workload classification. Neuroimage, 59(1), 48–56. doi:10.1016/j.neuroimage.2011.07.047.
Bear, D.J., Thompson, H.B., Morrison, C.L., Vickers, M., Paradise, A., Czarnowsky, M., Soyars, M., and King, K. (2008). Tapping the Potential of Informal Learning: An ASTD Research Study. Alexandria, VA: American Society for Training and Development.
Berka, C., Levendowski, D.J., Cvetinovic, M.M., Petrovic, M.M., Davis, G., Lumicao, M.N., Zivkovic, V.T., Popovic, M.V., and Olmstead, R. (2004). Real-time analysis of EEG indexes of alertness, cognition, and memory acquired with a wireless EEG headset. International Journal of Human–Computer Interaction, 17(2), 150–171. doi:10.1207/s15327590ijhc172_3.
Bersin, J. (2015). The geeks arrive in HR: People analytics is here. Forbes, February 1. Available: https://www.forbes.com/sites/joshbersin/2015/02/01/geeks-arrive-in-hr-people-analytics-is-here/#24d8463373b4 [December 2018].
Bikson, M., Grossman, P., Thomas, C., Zannou, A.L., Jiang, J., Adnan, T., Mourdoukoutas, A.P., Kronberg, G., Truong, D., Boggio, P., Brunoni, A.R., Charvet, L., Fregni, F., Fritsch, B., Gillick, B., Hamilton, R.H., Hampstead, B.M., Jankord, R., Kirton, A., Knotkova, H., Liebetanz, D., Liu, A., Loo, C., Nitsche, M.A., Reis, J., Richardson, J.D., Rotenberg, A., Turkeltaub, P.E., and Woods, A. (2016). Safety of transcranial direct current stimulation: Evidence-based update 2016. Brain Stimulation, 9(5), 641–661. doi:10.1016/j.brs.2016.06.004.
Bock, L. (2015). Work Rules: Insights from Inside Google That Will Transform How You Live and Lead. Boston, MA: Twelve.
Borghetti, B.J., Giametta, J.J., and Rusnock, C.F. (2017). Assessing continuous operator workload with a hybrid scaffolded neuroergonomic modeling approach. Human Factors, 59(1), 134–146. doi:10.1177/0018720816672308.
Braboszcz, C., and Delorme, A. (2011). Lost in thoughts: Neural markers of low alertness during mind wandering. Neuroimage, 54(4), 3040–3047. doi:10.1016/j.neuroimage.2010.10.008.
Breton, R., and Bossé, E. (2003). The cognitive costs and benefits of automation. In The Role of Humans in Intelligent and Automated Systems (RTO-MP-088). France: Research and Technology Organisation/North Atlantic Treaty Organisation. doi:10.14339/RTO-MP-088.
Byrne, E.A., and Parasuraman, R. (1996). Psychophysiology and adaptive automation. Biological Psychology, 42(3), 249–268. doi:10.1016/0301-0511(95)05161-9.
Caffier, P.P., Erdmann, U., and Ullsperger, P. (2003). Experimental evaluation of eye-blink parameters as a drowsiness measure. European Journal of Applied Physiology, 89(3-4), 319–325.
Campbell, J.P., McCloy, R.A., Oppler, S.H., and Sager, C.E. (1993). A theory of performance. In N. Schmitt, and C.W. Borman (Eds.), Personnel Selection in Organizations (pp. 35–70). San Francisco, CA: Jossey-Bass.
Cascio, W.F. (1991). Costing Human Resources: The Financial Impact of Behavior in Organizations (3rd ed.). Boston, MA: Kent.
Cascio, W., and Scott, J. (2017). The business value of employee selection. In J.L. Farr and N.T. Tippins (Eds.), Handbook of Employee Selection (pp. 226–248). New York: Taylor and Francis.
Chun, M.M., Golomb, J.D., and Turk-Browne, N.B. (2011). A taxonomy of external and internal attention. Annual Review of Psychology, 62, 73–101. doi:10.1146/annurev.psych.093008.100427.
Coffman, B.A., Clark, V.P., and Parasuraman, R. (2014). Battery powered thought: Enhancement of attention, learning, and memory in healthy adults using transcranial direct current stimulation. Neuroimage, 85(Pt. 3), 895–908. doi:10.1016/j.neuroimage.2013.07.083.
Connors, E.S., Craven, P.L., McNeese, M.D., Jefferson, T., Jr., Bains, P., and Hall, D.L. (2004). An application of the AKADAM approach to intelligence analyst work. Proceedings of the Human Factors and Ergonomics Society 48th Annual Meeting, 627–628. Available: https://pdfs.semanticscholar.org/93ff/40b52a54d919bbe3295c4590fced7730dc9a.pdf [February 2019].
Davies, D.R., and Parasuraman, R. (1982). The Psychology of Vigilance. London, UK: Academic Press.
Dawson, J., and Thomson, R. (2018). The future cybersecurity workforce: Going beyond technical skills for successful cyber performance. Frontiers in Psychology, 9, 744. doi:10.3389/fpsyg.2018.00744.
De Rivecourt, M., Kuperus, M.N., Post, W.J., and Mulder L.J.M. (2008). Cardiovascular and eye activity measures as indices for momentary changes in mental effort during simulated flight. Ergonomics, 51(9), 1295–1319. doi:10.1080/00140130802120267.
Earnest, D.R., Allen, D.G., and Landis, R.S. (2011). Mechanisms linking realistic job previews with turnover: A meta-analytic path analysis. Personnel Psychology, 64(4), 865–897. doi:10.1111/j.1744-6570.2011.01230.x.
Ellingson, J.E., and Noe, R.A. (2017). Autonomous Learning in the Workplace. New York: Routledge.
Felps, W., Mitchell, T.R., Hekman, D.R., Lee, T.W., Holtom, B.C., and Harman, W.S. (2009). Turnover contagion: How coworkers’ job embeddedness and job search behaviors influence quitting. Academy of Management Journal, 52(3), 545–561. doi:10.5465/AMJ.2009.41331075.
Fernández, A., Mascayano, F., Lips, W., Painel, A., Norambuena, J., and Madrid, E. (2015). Effects of modafinil on attention performance, short-term memory and executive function in university students: A randomized trial. Medwave, 15(5), e6166. doi:10.5867/medwave.2015.05.6166.
Gentry, J. (2015). Has the ODNI improved U.S. intelligence analysis? International Journal of Intelligence and Counterintelligence, 28(4), 637–661.
Gonzalez-Mulé, E., Mount, M.K., and Oh, I.-S. (2014). A meta-analysis of the relationship between general mental ability and nontask performance. Journal of Applied Psychology, 99(6), 1222–1243. doi:10.1037/a0037547.
Gramann, K., Gwin, J.T., Ferris, D.P., Oie, K., Jung, T.P., Lin, C.T., Liao, L.D., and Makeig, S. (2011). Cognition in action: Imaging brain/body dynamics in mobile humans. Reviews in the Neurosciences, 22(6), 593–608. doi:10.1515/RNS.2011.047.
Grandjean, E. (1979). Fatigue in industry. British Journal of Industrial Medicine, 36(3), 175–186.
Griffeth, R.W., Hom, P.W., and Gaertner, S. (2000). A meta-analysis of antecedents and correlates of employee turnover: Update, moderator tests, and research implications for the next millennium. Journal of Management, 26(3), 463–488. doi:10.1177/014920630002600305.
Habib, L., Pacaux-Lemoine, M.-P., and Millot, P. (2017). A method for designing levels of automation based on a human–machine cooperation model. IFAC-PapersOnLine, 50(1), 1372–1377. doi:10.1016/j.ifacol.2017.08.235.
Halper, L. (2015). Continuous Learning: Choosing and Allocating Resources to Strengths and Weaknesses (Electronic Thesis or Dissertation). Available: https://etd.ohiolink.edu/!etd.send_file?accession=ohiou1427828223&disposition=inline [February 2019].
Hancock, J.I., Allen, D.G., and Soelberg, C. (2017). Collective turnover: An expanded meta-analytic exploration and comparison. Human Resource Management Review, 27(1), 61–86. doi:10.1016/j.hrmr.2016.06.003.
Hausknecht, J.P. (2017). Collective turnover. Annual Review of Organizational Psychology and Organizational Behavior, 4(1), 527–544. doi:10.1146/annurev-orgpsych-032516-113139.
Hein, E., Nowak, M., Kiess, O., Biermann, T., Bayerlein, K., Kornhuber, J., and Kraus, T. (2013). Auricular transcutaneous electrical nerve stimulation in depressed patients: A randomized controlled pilot study. Journal of Neural Transmission, 120(5), 821–827. doi:10.1007/s00702-012-0908-6.
Hom, P.W., Mitchell, T.R., Lee, T.W., and Griffeth, R.W. (2012). Reviewing employee turnover: Focusing on proximal withdrawal states and an expanded criterion. Psychological Bulletin, 138(5), 831–858.
Jiao, K., Li, Z., Chen, M., Wang, C., and Qi, S. (2004). Effect of different vibration frequencies on heart rate variability and driving fatigue in healthy drivers. International Archives of Occupational and Environmental Health, 77(3), 205–212.
Johnston, R. (2005). Analytic Culture in the U.S. Intelligence Community: An Ethnographic Study. Washington, DC: Center for the Study of Intelligence.
Kaber, D.B., and Endsley, M.R. (2004). The effects of level of automation and adaptive automation on human performance, situation awareness and workload in a dynamic control task. Theoretical Issues in Ergonomics Science, 5(2), 113–153. doi:10.1080/1463922021000054335.
Kanfer, R. (1990). Motivation theory and industrial and organizational psychology. In M.D. Dunnette and L.M. Hough (Eds.), Handbook of Industrial and Organizational Psychology (Vol. 1) (pp. 75−170). Palo Alto, CA: Consulting Psychologists Press.
Karwowski, W., Siemionow, W., and Gielo-Perczak, K. (2003). Physical neuroergonomics: The human brain in control of physical work activities. Theoretical Issues in Ergonomics Science, 4(1-2), 175–199. doi:10.1080/1463922021000032339.
Keebler, J.R., Ososky, S., Taylor, G., Sciarini, W.L., and Jentsch, F. (2010). Neuroethics and neuroergonomics: Protecting the private brain. In T. Marek, W. Karwowski, and V. Rice (Eds.), Advances in Understanding Human Performance Neuroergonomics, Human Factors Design, and Special Populations. Boca Raton, FL: CRC Press.
Kraus, T., Kiess, O., Hösl, K., Terekhin, P., Kornhuber, J., and Forster, C. (2013). CNS BOLD fMRI effects of sham-controlled transcutaneous electrical nerve stimulation in the left outer auditory canal—a pilot study. Brain Stimulation, 6(5), 798–804. doi:10.1016/j.brs.2013.01.011.
Lee, T.W., and Mitchell, T.R. (1994). An alternative approach: The unfolding model of voluntary employee turnover. Academy of Management Review, 19(1), 51–89. doi:10.5465/AMR.1994.9410122008.
Leonardi, P., and Contractor, N.S. (2018). Better people analytics: Measure who they know, not just who they are. Harvard Business Review, November–December. Available: https://hbr.org/2018/11/better-people-analytics [December 2018].
Lin, C.T., Chiu, T.T., Huang, T.Y., Chao, C.F., Liang, W.C., Hsu, S.H., and Ko, L.W. (2009). Assessing effectiveness of various auditory warning signals in maintaining drivers’ attention in virtual reality-based driving environments. Perceptual and Motor Skills, 108(3), 825–835.
Lin, C.T., Huang, K.C., Chao, C.F., Chen, J.A., Chiu, T.W., Ko, L.W., and Jung, T.P. (2010). Tonic and phasic EEG and behavioral changes induced by arousing feedback. Neuroimage, 52(2), 633–642.
Lintern, G. (2012). Work-focused analysis and design. Cognition, Technology, and Work, 14(1), 71–81. doi:10.1007/s10111-010-0167-y.
Lowenthal, M., and Marks, R. (2015). Intelligence analysis: Is it as good as it gets? International Journal of Intelligence and Counterintelligence, 28(4), 662–665.
March, J.G., and Simon, H.A. (1958). Organizations. Oxford, UK: Wiley.
Mehta, R.K., and Parasuraman, R. (2013). Neuroergonomics: A review of applications to physical and cognitive work. Frontiers in Human Neuroscience, 7, 889. doi:10.3389/fnhum.2013.00889.
Miles, A.D. (2016). The U.S. Intelligence Community: Selected Cross-Cutting Issues. Washington, DC: Congressional Research Service.
Mitchell, T.R., Holtom, B.C., Lee, T.W., Sablynski, C.J., and Erez, M. (2001). Why people stay: Using job embeddedness to predict voluntary turnover. Academy of Management Journal, 44(6), 1102–1121. doi:10.2307/3069391.
Mobley, W.H. (1977). Intermediate linkages in the relationship between job satisfaction and employee turnover. Journal of Applied Psychology, 62(2), 237–240.
Mobley, W.H., Griffeth, R.W., Hand, H.H., and Meglino, B.M. (1979). Review and conceptual analysis of the employee turnover process. Psychological Bulletin, 86(3), 493–522.
National Academies of Sciences, Engineering, and Medicine (NASEM). (2018a). How People Learn II: Learners, Contexts, and Cultures. Washington, DC: The National Academies Press. doi:10.17226/24783.
NASEM. (2018b). Workforce Development and Intelligence Analysis for National Security Purposes: Proceedings of a Workshop. Washington, DC: The National Academies Press. doi:10.17226/25117.
National Research Council. (2011). Intelligence Analysis for Tomorrow: Advances from the Behavioral and Social Sciences. Washington, DC: The National Academies Press. doi:10.17226/13040.
Nelson, J.T., McKinley, R.A., Golob, E.J., Warm, J.S., and Parasuraman, R. (2014). Enhanced vigilance in operators with prefrontal cortex transcranial direct current stimulation. Neuroimage, 85(Pt. 3), 907–917. doi:10.1016/j.neuroimage.2012.11.061.
Nelson, J., McKinley, R.A., Phillips, C., McIntire, L., Goodyear, C., Kreiner, A., and Monforton, L. (2016). The effects of transcranial direct current stimulation (tDCS) on multitasking throughput capacity. Frontiers in Human Neuroscience, 10, 589. doi:10.3389/fnhum.2016.00589.
Office of the Director of National Intelligence. (2014). Human Capital Vision 2020. Available: https://www.dni.gov/files/documents/CHCO/US_IC_Human_Capital_Vision_2020_Strategy%202020_5_March_2014_U.pdf [December 2018].
Okano, A.H., Fontes, E.B., Montenegro, R.A., Farinatti, P.T., Cyrino, E.S., Li, L.M., Bikson, M., and Noakes, T.D. (2015). Brain stimulation modulates the autonomic nervous system, rating of perceived exertion and performance during maximal exercise. British Journal of Sports Medicine, 49(18), 1213–1218. doi:10.1136/bjsports-2012-091658.
Parasuraman, R. (1987). Human–computer monitoring. Human Factors, 29(6), 695–706. doi:10.1177/001872088702900609.
Parasuraman, R. (2011). Neuroergonomics: Brain, cognition, and performance at work. Current Directions in Psychological Science, 20(3), 181–186. doi:10.1177/0963721411409176.
Phillips, J.M. (1998). Effects of realistic job previews on multiple organizational outcomes: A meta-analysis. Academy of Management Journal, 41(6), 673–690. doi:10.2307/256964.
Pirolli, P., Lee, T., and Card, S.K. (2004). Leverage Points for Analyst Technology Identified through Cognitive Task Analysis. Palo Alto, CA: Palo Alto Research Center.
Ployhart, R.E., Schmitt, N., and Tippins, N.T. (2017). Solving the supreme problem: 100 years of selection and recruitment at the Journal of Applied Psychology. Journal of Applied Psychology, 102(3), 291–304. doi:10.1037/apl0000081.
Premack, S.L., and Wanous, J.P. (1985). A meta-analysis of realistic job preview experiments. Journal of Applied Psychology, 70(4), 706–719.
Rotundo, M., and Sackett, P.R. (2002). The relative importance of task, citizenship, and counterproductive performance to global ratings of job performance: A policy-capturing approach. Journal of Applied Psychology, 87(1), 66–80. doi:10.1037/0021-9010.87.1.66.
Sackett, P.R., and Walmsley, P.T. (2014). Which personality attributes are most important in the workplace? Perspectives on Psychological Science, 9(5), 538–551.
Sackett, P.R., Shewach, O.R., and Keiser, H.N. (2017a). Assessment centers versus cognitive ability tests: Challenging the conventional wisdom on criterion-related validity. Journal of Applied Psychology, 102(10), 1435–1447. doi:10.1037/apl0000236.
Sackett, P., Lievens, F., Van Iddekinge, C.H., and Kuncel, N.R. (2017b). Individual differences and their measurement: A review of 100 years of research. Journal of Applied Psychology, 102(3), 254–273. doi:10.1037/apl0000151.
Salas, E., and Cannon-Bowers, J.A. (2001). The science of training: A decade of progress. Annual Review of Psychology, 52(1), 471–499.
Schmidt, A.M., and DeShon, R.P. (2007). What to do? The effects of discrepancies, incentives, and time on dynamic goal prioritization. Journal of Applied Psychology, 92(4), 928–942.
Schmidt, F.L., and Hunter, J.E. (1998). The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 85 years of research findings. Psychological Bulletin, 124(2), 262–274. doi:10.1037/0033-2909.124.2.262.
Schwartz, J., Collins, L., Stockton, H., Wagner, D., and Walsh, B. (2017). The future of work: The augmented workforce. Deloitte Insights, February 28. Available: https://www2.deloitte.com/insights/us/en/focus/human-capital-trends/2017/future-workforce-changingnature-of-work.html [December 2018].
Society for Industrial and Organizational Psychology. (2018). Principles for the Validation and Use of Personnel Selection Procedures (5th ed.). Cambridge, MA: Cambridge University Press.
Tannenbaum, S.I., Beard, R.L., McNall, L.A., and Salas, E. (2010). Informal learning and development in organizations. In S.W.J. Kozlowski and E. Salas (Eds.), Learning, Training, and Development in Organizations (pp. 303–332). New York: Routledge.
Ting, C.H., Mahfouf, M., Nassef, A., Linkens, D.A., Panoutsos, G., Nickel, P., Roberts, A.C., and Hockey, G.R.J. (2010). Real-time adaptive automation system based on identification of operator functional state in simulated process control operations. IEEE Transactions on Systems, Man, and Cybernetics. Part A: Systems and Humans, 40(2), 251–262. doi:10.1109/TSMCA.2009.2035301.
Tippins, N.T., and Macey, W.H. (2007). Consortium studies. In S.M. McPhail (Ed.), Alternative Validation Strategies: Developing New and Leveraging Existing Validation Evidence (pp. 233–251). San Francisco, CA: Jossey-Bass.
Van Cutsem, J., Marcora, S., De Pauw, K., Bailey, S., Meeusen, R., and Roelands, B. (2017). The effects of mental fatigue on physical performance: A systematic review. Sports Medicine, 47(8), 1569–1588. doi:10.1007/s40279-016-0672-0.
Van Cutsem, J., De Pauw, K., Marcora, S., Meeusen, R., and Roelands, B. (2018). A caffeine-maltodextrin mouth rinse counters mental fatigue. Psychopharmacology, 235(4), 947–958.
Vancouver, J.B., Halper, L.R., and Bayes, K.A. (2017). Regulating our own learning: Stuff you did not realize you needed to know. In J.E. Ellingson and R.A. Noe (Eds.), Autonomous Learning in the Workplace (pp. 95–116). New York: Routledge.
Vancouver, J.B., Weinhardt, J., and Schmidt, A.M. (2010). A formal, computational theory of multiple-goal pursuit: Integrating goal-choice and goal-striving processes. Journal of Applied Psychology, 95(6), 985–1008. doi:10.1037/a0020628.
Wang, Z., Hope, R.M., Wang, Z., Ji, Q., and Gray, W.D. (2011). Cross-subject workload classification with a hierarchical Bayes model. Neuroimage, 59(1), 64–69. doi:10.1016/j.neuroimage.2011.07.094.
Wang, Y.T., Huang, K.C., Wei, C.S., Huang, T.Y, Ko, L.W., Lin, C.T., Cheng, C.K., and Jung, T.P. (2014). Developing an EEG-based on-line closed-loop lapse detection and mitigation system. Frontiers in Neuroscience, 8, 321. doi:10.3389/fnins.2014.00321.
Warm, J.S., Parasuraman, R., and Matthews, G. (2008). Cerebral hemodynamics and vigilance. In R. Parasuraman and M. Rizzo (Eds.), Neuroergonomics: The Brain at Work (pp. 146–158). New York: Oxford University Press.
Wickens, C.D. (1984). Processing resources in attention. In R. Parasuraman and R. Davies (Eds.), Varieties of Attention (pp. 63–101). Orlando, FL: Academic Press.
Wickens, C.D. (2002). Multiple resources and performance prediction. Theoretical Issues in Ergonomics Science, 3(2), 159–177. doi:10.1080/14639220210123806.
Wickens, C.D., and McCarley, J.S. (2008). Applied Attention Theory. London, UK: CRC Press.
Wiener, E.L., and Attwood, D.A. (1968). Training for vigilance: Combined cueing and knowledge of results. Journal of Applied Psychology, 52(6, Pt.1), 474–479. doi:10.1037/h0026444.
World Economic Forum. (2018). The Future of Jobs Report. Available: http://www3.weforum.org/docs/WEF_Future_of_Jobs_2018.pdf [December 2018].
Zakerian S.A., Yazdanirad, S., Gharib, S., Azam, K., and Zare, A. (2018). The effect of increasing the illumination on operators’ visual performance in the control-room of a combined cycle power plant. Annals of Occupational and Environmental Medicine, 30, 56. doi:10.1186/s40557-018-0267-3.