Industry is leading the way in many data analytics advancements. While there are many similarities between the personnel analytics of the Office of the Under Secretary of Defense (Personnel & Readiness), referred to throughout the report as P&R, and those of industry—for example, both are intently interested in retaining specific employee skill sets and in optimizing the performance of the overarching organization—many characteristics unique to P&R make it difficult to apply industry methods directly. The contractual nature of servicemembers’ employment and their lengthy training periods are notably different from what is seen in industry. The size of the Department of Defense (DoD) workforce eclipses that of even the largest industry employers, magnifying challenges that industry already faces, such as developing and scaling personnel analytics software. P&R also faces legislative constraints that can make it difficult to implement desired changes quickly. Accordingly, P&R can benefit from examining industry’s lessons learned and analytical techniques, but it must generally adapt them to fit its unique environment.
The committee’s examination of commercial-sector human resource analytics had two parts: products and practice. For products, the committee heard from a sampling of companies offering personnel analytics products (also called talent analytics or workforce analytics): IBM Kenexa, Cornerstone OnDemand, and Workday. While it is unlikely that these and similar products in their current form offer off-the-shelf solutions to DoD
personnel and readiness analysis problems, there are some useful trends and insights to be garnered from this sector. On the practice side, the committee met with Google and Intel, both of which are considered to be at the forefront of applying data-intensive approaches to workforce issues, and it also researched the experiences of other companies applying data analytics to human resources (HR). While the commercial state of the art is quickly evolving, there are important lessons that may apply to DoD. The following sections describe some of the commercially available products and some of the HR analytics practices being employed in large companies.
From what the committee observed, current personnel analytics tools do not seem a good match for DoD needs, for at least two reasons. First, they are largely targeted at operational decision making related to HR rather than at policy-level analysis. They tend to focus on decisions around hiring, career paths, evaluation, and training and retention of individual employees rather than on general policy questions (although Cornerstone OnDemand was deriving useful insights from the combined employment data of the companies it serves). Second, the targeted market segment is not a good match. Analytics companies are typically oriented toward solving specific problems facing small- to medium-sized companies. For example, Cornerstone OnDemand focuses on initial hires of hourly workers. This segment has high turnover, which leads to a multiplicity of hiring decisions on the part of employers as well as the need to sharpen those decisions. Also, many of these jobs are easy to “instrument”—that is, to automatically collect data related to job performance, such as time cards or call-center completion rates. That market segment contrasts with active servicemembers and civil service employees in DoD, for whom turnover, while a concern, is much less rapid.
The development of personnel analytics products is taking place against a broader landscape of evolving enterprise resource planning tools. Those tools started out as process-centric, with different offerings for different functions (e.g., manufacturing, finance, hiring, benefits), then moved toward integrated suites of tools that span functions. Most recently, the products have become more data driven, as users try to leverage the large amounts of business data captured from internal and external sources. The commercial focus, however, seems to be on making products easier to use and easier to own, rather than on developing novel analytical methods.
On the easier-to-use front, current product features include multiple sample visualizations of analytic results (to aid in choosing an appropriate visualization), automatic highlighting of interesting results, and “wizard” interfaces that help import data and set up analyses. Cornerstone
OnDemand emphasized the importance of integrating analytics into existing platforms (e.g., viewing attrition predictions in the same interface used to review employment applications). That point was echoed by Workday, which said it was better if the user could explore analytics results in the context of normal tools (even if the analysis itself required a larger platform) rather than having to switch to a different tool. That advice merits consideration by DoD: Look for ways the results of data analysis can be integrated into the everyday workflow of the intended consumers. The committee also heard an estimate that only 4 percent of companies were getting past descriptive and predictive analytics in the personnel domain to prescriptive analytics, and the main way to get beyond that point is to broaden the base of those who can make use of the tools or results (by, for instance, hiring more managers). That broadening, in turn, depends on easier-to-use products.
On the easier-to-own front, vendors are making products available in the cloud rather than installing them on-site. In fact, the business model for all of Workday’s products is based on cloud delivery. The advantage on the consumer side is decreased information technology cost, as there is no need for staff to install, update, and operate the product. On the producer side, updates are easier to roll out and can be released to all customers at once, avoiding the need to support multiple versions of a product at the same time. Presumably, the vendor also sees lower costs for support personnel that can be passed on to the customer. However, there are unique security and privacy considerations associated with cloud-based storage. This strategy may make sense for some DoD applications: where security and privacy can be managed appropriately, a single remotely accessible server holding both data and analytic tools could reduce the cost of tool and data support relative to a mode of operation in which analysts download data and install tools locally.
Another strategy the committee learned about from vendors is to build the tools from various products on a common software platform that provides shared functionality, such as data storage and the user interface. Workday had a particularly flexible platform based on a triple store (Rohloff et al., 2007) that facilitated the addition of new kinds of information.
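To make the triple-store idea concrete, the sketch below (a minimal illustration, not Workday’s actual implementation) shows how representing every fact as a subject–predicate–object triple lets new kinds of information be added without any schema change:

```python
# Minimal triple-store sketch: every fact is a (subject, predicate,
# object) triple, so a new kind of attribute can be added at any time
# without a schema migration. Illustration only; not Workday's design.

class TripleStore:
    def __init__(self):
        self.triples = set()  # {(subject, predicate, object), ...}

    def add(self, subject, predicate, obj):
        self.triples.add((subject, predicate, obj))

    def query(self, subject=None, predicate=None, obj=None):
        """Return triples matching the pattern; None acts as a wildcard."""
        return [
            t for t in self.triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)
        ]

store = TripleStore()
store.add("emp:1001", "hasTitle", "Analyst")
store.add("emp:1001", "worksIn", "Finance")
# A brand-new kind of information is just another predicate:
store.add("emp:1001", "attritionRisk", 0.12)

print(store.query(subject="emp:1001", predicate="attritionRisk"))
```

Because the “schema” is just the set of predicates in use, extending the data model is a data operation rather than a software change, which is the flexibility the committee heard described.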
Overall, it appears that current personnel analytics products would be more applicable at the operational functions level of the personnel offices of the various Service branches (in aid of recruiting, training, retention, or billeting) than at the policy-setting level of P&R.
The committee’s investigation of the literature and selected interviews reveals that a few companies are leading the way in applying data-intensive
approaches to personnel operations and policy. Many companies have built up specialized groups for personnel analytics and are trying to base decisions on the actual conditions in their own companies rather than on boilerplate “best practices” or prior research (Derose, 2013). The range of conditions considered is quite broad; examples include structuring the interview process, identifying the attributes of successful (and unsuccessful) managers, setting family leave policy, challenging common knowledge within the organization, determining the key factors that affect retention, studying how patterns of communication affect unit performance, and charting effective career paths. The increasing ease with which personnel data can be archived across time has allowed more longitudinal studies to identify the most reliable indicators of employee behavior. For example, Intel found that the answers to eight survey questions, plus the time accrued at the company, sufficed to accurately predict retention.
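A model of the kind the Intel example describes can be sketched as a simple logistic scoring of the survey responses plus tenure. The features, weights, and threshold below are invented for illustration; they are not Intel’s actual questions or coefficients, and in practice the weights would be fit to historical retention data:

```python
import math

# Hypothetical retention model in the spirit of the Intel example:
# eight survey responses plus tenure feed a logistic score.
# All numbers here are invented for illustration.

def retention_probability(survey_answers, tenure_years, weights, bias):
    """P(stay) = sigmoid(w . x + b), where x = 8 survey answers + tenure."""
    features = list(survey_answers) + [tenure_years]
    score = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-score))

# Illustrative coefficients (would be estimated from longitudinal data).
weights = [0.4, 0.3, 0.2, 0.2, 0.1, 0.1, 0.3, 0.2, 0.15]
bias = -5.0

# An employee's eight answers (1-5 scale) and 6 years at the company.
p = retention_probability([4, 5, 3, 4, 4, 5, 3, 4],
                          tenure_years=6, weights=weights, bias=bias)
print(f"Predicted probability of staying: {p:.2f}")  # about 0.95 here
```

The point of the example is the small feature set: once longitudinal data identify a handful of reliable indicators, the scoring itself can be very simple.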
One of the highest profile studies in the personnel analytics domain was Project Oxygen at Google (Bryant, 2011). There had been significant doubt in Google about the value of engineering managers, and at one point the company had gone so far as to eliminate project managers because it had no proof that they had had any impact. Project Oxygen started by assembling a large base of observations on managers from existing sources, such as performance reviews and feedback surveys. Those were supplemented by hundreds of hours of interviews with managers (which required hand coding). Analysis of the data revealed eight behaviors of good managers, such as expressing interest in employee success and well-being and helping with career development. It also revealed pitfalls, such as spending too little time on managing and communicating. (Interestingly, technical expertise ranked last on the list of eight.) The results were used to revise employee-training practices and institute coaching for low-performing managers.
Some of the trends the committee observed and advice it received on personnel analytics are highlighted below.
The committee learned of multiple instances where data analytics was used to confirm (or refute) the common knowledge in an organization or to challenge the status quo (the latter was colorfully referred to as “poking the bear”). Two specific myths the committee heard had been refuted by analysis of actual data were that (1) employees at headquarters are promoted faster and (2) job candidates who have held multiple jobs in the last 5 years or who have been unemployed for a prolonged period are less likely to stay in a new position. In reality, employee attrition and retention are much more heavily influenced by colleagues (think of the damage that can be done by a “toxic” coworker).
In other cases, data analytics suggested alternative personnel strategies. For example, observing unusual but successful job moves suggested alternative career pathways for some employees. Another study revealed that a small segment of the workforce was happy to work on a series of short-term projects, leading to a pilot program for “internal freelancers,” who are assigned to projects for a few months at a time and who can tolerate breaks in employment.
Combination of Skills
The committee saw that employees in Personnel Analytics groups (sometimes called People Analytics) often possess a variety of skills and backgrounds. In addition to data scientists, there are experts (often PhDs) in HR domains such as psychology and organizational theory as well as staff (generally MBAs) focused on different HR functions: hiring, benefits, and training. A particular study or analysis might originate with an MBA interacting with a functional group to figure out a current issue or question. The appropriate domain-area expert might design a study to answer the question, deciding which existing data would be applicable and what new information would need to be collected. A data scientist might extract and prepare the data and run the appropriate analyses, which would be interpreted by the domain expert and then be communicated back, perhaps through the MBA, to the HR client group.
The committee observed the importance of putting analysis results in a form that an end user can understand. It was stressed that correlations and trends were not in and of themselves generally actionable by decision makers. The results needed to be explicable before decision makers felt comfortable recommending changes in procedures or policies based on them. Especially when machine learning is involved, where the computational techniques can be somewhat opaque, there is a need to investigate further to develop cause-and-effect explanations. Also, it helped to invest group leaders in the findings by briefing them first and then having them present the results to the rest of the group. The committee also learned that, in order to maintain high response rates to information-gathering mechanisms, it was important for employees who had answered surveys and responded to such internal communications to know that their input was being noted and acted upon.
Launching a Personnel Analytics Effort
Building a successful personnel analytics capability is not trivial. In addition to finding appropriate staff in the face of stiff competition for data scientists in particular, developing the appropriate data collection enterprise takes time and organizational will. The committee observed the need for substantial foundational work within the organization on what data mean. In one case, for example, simply coming up with a uniform definition of “full-time equivalent” took over a year of discussion among business units (engineering, sales, finance, etc.). Some of the committee’s interviewees also stressed that, when starting a data analytics effort, the first projects needed to be credible—initial efforts needed to show success and value.
Not every study requires huge amounts of data and sophisticated methods. For example, a simple spreadsheet-based model was sufficient to do a multiyear forecast of workforce shape (number of employees at each level) from current employee counts and rates of promotion, hiring, and retention. It revealed the company would shortly become top-heavy with managers if current practice continued, and it led to changes in promotion rates and policies on external hires at upper levels.
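The spreadsheet-style forecast described above can be sketched as a simple cohort model: each year, a fraction of each level is retained, a fraction is promoted to the next level, and external hires are added. The levels, counts, and rates below are invented for illustration:

```python
# Sketch of the workforce-shape forecast described above: project
# headcount per level forward from current counts and annual rates of
# retention, promotion, and external hiring. All numbers are invented.

def forecast(headcount, retention, promotion, hires, years):
    """Project headcount per level over `years`.

    headcount : current counts per level, junior (index 0) to senior
    retention : fraction of each level retained each year
    promotion : fraction of each level promoted to the next level each year
    hires     : external hires added to each level each year
    """
    levels = len(headcount)
    counts = list(headcount)
    for _ in range(years):
        promoted = [counts[i] * promotion[i] for i in range(levels)]
        nxt = []
        for i in range(levels):
            stay = counts[i] * retention[i] - promoted[i]  # retained, not promoted
            inflow = promoted[i - 1] if i > 0 else 0.0     # promoted from below
            nxt.append(stay + inflow + hires[i])
        counts = nxt
    return counts

# Three levels: staff, manager, senior manager (illustrative rates).
final = forecast(
    headcount=[1000, 200, 50],
    retention=[0.90, 0.95, 0.97],
    promotion=[0.05, 0.04, 0.0],
    hires=[120, 10, 2],
    years=5,
)
print([round(c) for c in final])
```

With these illustrative rates, the staff level shrinks while the manager levels grow each year, reproducing the kind of “top-heavy” drift the study revealed; the same arithmetic fits comfortably in a spreadsheet.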
Nontraditional Data Sources
All the analytics organizations the committee studied were looking at extending their data assets beyond what they could obtain from internal business-data processing systems. For example, text analytics capabilities (discussed in more detail in Chapter 4) are reaching a level of maturity such that they are suitable for preparation and analysis of unstructured text data.
Many in industry are noting the value of natural experiments in the analytics domain. For example, studying the performance of different teams over time revealed dimensions that were highly predictive of team success, such as how long a team has been together and who is leading it.
Bryant, A. 2011. Google’s quest to build a better boss. The New York Times. March 12. http://www.nytimes.com/2011/03/13/business/13hire.html.
Derose, C. 2013. How Google uses data to build a better worker. The Atlantic. October 17. http://www.theatlantic.com/business/archive/2013/10/how-google-uses-data-to-build-a-better-worker/280347/.
Rohloff, K., M. Dean, I. Emmons, D. Ryder, and J. Sumner. 2007. An evaluation of triple-store technologies for large data stores. Pp. 1105-1114 in On the Move to Meaningful Internet Systems 2007. OTM 2007 Workshops. Berlin Heidelberg: Springer-Verlag.