The Department of Defense (DoD) currently employs over 3 million personnel (DoD, 2015a; Defense Civilian Personnel Advisory Services, “DoD Demographics as of Sept 30, 2014”) and operates and maintains over 800 military bases around the world (Vine, 2015). To manage DoD resources efficiently and use them effectively in meeting mission requirements, it is critical to obtain and understand data concerning the availability and readiness of both servicemembers and civil service employees, the condition of available and appropriate equipment, and the operating status of DoD installations. The Under Secretary of Defense (Personnel & Readiness) supports DoD’s mission by advising on and developing policies and plans, as well as by monitoring the readiness of deployable combat and noncombat units. The Defense Manpower Data Center (DMDC) and its predecessor organization, the Manpower and Research Data Analysis Center, were created with the primary mission of supporting the information management needs of the Office of the Under Secretary of Defense (Personnel & Readiness), referred to throughout the report as P&R. Among other things, DMDC collects and collates data from the military Services to inform P&R and DoD in their decision making. Before the creation of DMDC in the early 1970s, few cross-Service databases existed to support policy or program decisions. Typically, when elements within the Office of the Secretary of Defense required cross-Service analysis, contractors requested data from the Services and assembled custom databases for those analyses. With the creation of DMDC, however, common data formats could be established across the Services in advance, and the data could be routinely provided for storage and maintenance, allowing analyses to be performed in a much more timely manner.
This chapter describes the data that DoD and P&R collect and the analytical capabilities available to these organizations, including the role of relevant federally funded research and development centers (FFRDCs). A brief description of the structure of P&R data follows. How these data and capabilities inform P&R decision making is then discussed. The findings and recommendations regarding DoD data capabilities can be found in Chapter 7 of this report.
Data Available to P&R
Under the supervision of the Defense Human Resources Activity (DHRA), DMDC collects personnel, training, financial, and other data for DoD. DMDC manages information relating to the retirement, health care, and other needs of personnel and their families. The center also collects and maintains much of the data collected by the Services and coordinates an overall database of 35 million individual records on servicemembers, employees, contractors, retirees, and family members.1 DMDC operates in five areas—(1) decision support; (2) entitlements, benefits, and readiness reporting; (3) personnel identification, validation, and authentication; (4) enterprise integration; and (5) survey management—with data largely provided by the Services (DMDC, 2014). There are similar systems for civil service employee data, maintained as the Defense Civilian Personnel Data System,2 and for the Reserve Component, maintained as the Reserve Components Common Personnel Data System (DoD, 2011). These record systems provide basic demographic data and a history of the individual’s assignments; for military personnel, they include educational information and cognitive, physical, and moral aptitude test results.
DMDC is responsible for validating and issuing the Common Access Card (CAC), the identification card for servicemembers, civil service employees, and some contractors, which allows access to buildings and other spaces, as well as computer networks and systems (DMDC, 2016a). On average, 2.8 million CACs are issued annually, with approximately 3.4 million CACs in circulation (DHRA, 2009). DMDC also administers the following databases:
1 For more information about DMDC, see https://www.dmdc.osd.mil/appj/dwp/dmdc_overview.jsp.
- Defense Enrollment Eligibility Reporting System (DEERS). As a user-input database of personnel and benefits information on servicemembers, their families, and DoD civil service personnel and contractors, DEERS is used to determine who is eligible to receive benefits, decide who is to be issued CACs, help detect fraud and abuse in benefits programs, and answer other personnel and readiness questions. Enrollment in DEERS is a prerequisite for eligibility in many medical and dental benefits (such as TRICARE) (DMDC, 2016b).
- Joint Personnel Adjudication System (JPAS). JPAS tracks security clearances for all DoD personnel. It is divided into two systems: the Joint Adjudication Management System (JAMS) and Joint Clearance and Access Verification System (JCAVS). JAMS is responsible for recording changes in eligibility for security clearances, while JCAVS allows employers to view current security clearance statuses (Defense Security Service, 2016).
- Automated Continuous Evaluation System (ACES). ACES provides automated assessment of eligibility of DoD personnel to access classified information. This information includes responses to official questionnaires, personnel administrative data (e.g., demographic, employment, financial, educational, citizenship, and contact information), and criminal justice records. The automation of the database has streamlined the security clearance process, saving DoD both cost and time (PERSEREC, 2016).
- Defense Biometric Identification System (DBIDS). DBIDS is the current DoD identification system that combines biometrics and barcodes to identify and grant DoD personnel access to buildings and other secure locations, as well as to distribute departmental equipment and vehicles. A fingerprint or hand geometry scan is captured and then matched against the barcode reading before verification is complete (Department of the Navy, Chief Information Officer, 2005).
- Defense Incident-Based Reporting System (DIBRS). Designed to meet DoD requirements for statutory reporting, DIBRS monitors, tracks, and organizes law enforcement information on crimes of interest and criminal investigations (DTIC, 2014).
- Emergency Evacuation Tracking and Repatriation System. This system tracks persons who were evacuated while abroad owing to emergency situations and helps to recover the costs of such relocations (DPCLD, 2009).
Important additional sources of data are surveys undertaken by P&R, whether by DMDC acting for P&R, by a firm with which it contracts, or by one of its outside research organizations in support of the tasks it has been
asked to undertake. For more than 15 years, DMDC has undertaken both one-off and ongoing surveys, the latter of which ask standardized questions at close intervals (e.g., once a year or even more frequently for military personnel). Recent examples include the Survey of Active Duty Spouses,3 Gender Relations Surveys (for active duty,4 reserve component,5 and Service academies6), Survivor Experience Survey,7 Workplace and Equal Opportunity Survey of Reserve Component Members,8 and the QuickCompass of Financial Issues.9 Besides eliciting demographic data from the respondents, these surveys seek to gather reports on specific phenomena (such as sexual assault) and on the outlook of the target personnel community toward their responsibilities and continued willingness to undertake them (such as questions about stress or intention to reenlist).
P&R also has available a variety of databases that derive from its management responsibilities. Prominent among these are the data sets it maintains on health care, whether provided in military facilities (overseen by the Defense Health Agency) or purchased from the private sector via the TRICARE contract program, which now accounts for well over half of all medical care that DoD finances. Likewise, in administering the Military Entrance Processing Stations (MEPSs)—which, as the name indicates, test incoming enlisted personnel—P&R accumulates test data for all potential military recruits.
Additional databases were established over time to collect essential data to address specific challenges. One such challenge is the possibility of operating more economically by using civil service employees in positions now filled by military personnel, who carry higher lifetime costs for DoD. Executive agencies are required to conduct an annual inventory of the commercial and the inherently governmental activities being performed by federal employees.
4 The website for the Gender Relations Surveys for active duty servicemembers is https://www.dmdc.osd.mil/appj/dwp/rest/download?fileName=WGRA1201_TabsVolume.pdf&groupName=pubGenderActive, accessed January 5, 2016.
5 The website for the Gender Relations Survey for the reserve component is https://www.dmdc.osd.mil/appj/dwp/rest/download?fileName=WGRR1201_TabsVolume.pdf&groupName=pubGenderReserve, accessed January 5, 2016.
8 The website for the Workplace and Equal Opportunity Survey of Reserve Component Members is https://www.dmdc.osd.mil/appj/dwp/rest/download?fileName=WEOR1101_Tab-Volume.pdf&groupName=pubOpSurResComp, accessed January 5, 2016.
9 The website for the Quickcompass of Financial Issues Surveys is https://www.dmdc.osd.mil/appj/dwp/rest/download?fileName=QCFIA1301_TabVolume.pdf&groupName=pubFinSurAD, accessed January 5, 2016.
This report, referred to as the “Inherently Governmental and Commercial Activities Inventory,” looks at every position in DoD from the standpoint of whether it needs to be filled by a servicemember or a civil service employee.
Likewise, concerns in both Congress and the Executive Branch over the accuracy of reporting on the readiness of military units to carry out their assignments led in the last decade to establishing the Defense Readiness Reporting System (DRRS). A federation of existing databases, DRRS requires commanders—starting at the four-star level of the Combatant Commands—to periodically rate the ability of their units to carry out every mission-essential task in the operational plans, with the scoring responsibility cascading down the chain of command to the brigade level in the Army, for example, and analogously in the other Services (U.S. Army, 2011). To illuminate the reasons for readiness conditions, the system allows the user to query the supporting databases in the federated system (e.g., the database on equipment condition).
Beyond the large databases of this sort, P&R either assembles or has access to a variety of more limited databases, typically created to deal with a specific responsibility. Sometimes these are in the form of reports, whether recurring or one-time; however, a time series can be constructed only to the extent that the necessary archive of past submissions has been maintained. One of the most important recurring sources of data is the submission required annually for DoD’s future resource planning process, the Program Objective Memorandum, usually known by its acronym, POM.
The Army Analytics Group and DMDC implemented a Person-Event Data Environment (PDE) in 2006 in an attempt to centralize portions of DoD health, military service, and demographic data into an electronic repository, with improved data security and accessibility (Vie et al., 2013). This capability was designed to make it easier to analyze topics such as unit readiness, recruitment trends, retention rates, and other key issues involving DoD personnel. The PDE is a first attempt to connect these disparate data.
While the data held within the PDE are neither classified nor secret, the PDE does hold sensitive information, such as medical records and fitness-for-duty reports (Vie et al., 2013). Therefore, maintaining the confidentiality of the data is critical. The PDE aims to do this by requiring researchers to apply for access to the data, explain what analyses are being conducted with the data, and access the PDE through a secure remote connection.
The committee met with PDE users from the FFRDCs, who noted that while the PDE vision holds promise for improving data access, some practical deficiencies exist, including a slow and complicated approval process for researchers trying to access the system; limited computational power, memory, and tools; concerns about the quality and comprehensiveness of the available data; difficulty uploading and merging external data, along with uncertainty over the ownership of uploaded data; and the extensive reviews required to access some personally identifiable data and to export analysis results.
Most of the data sources described were constructed originally for administrative purposes, not for policy analysis or research. The data elements in the sets are usually defined by the needs of the administrative process (e.g., billing records for purchased health care, which do not necessarily document the episode of illness or the patient’s health status), and the data structure typically reflects the administrative process being supported. For example, to create a data file that tracks retention of military personnel, DMDC’s records (which are typically monthly snapshots of Services personnel data) can be linked longitudinally, revealing whether an individual serving at month t is still serving at month t + 1. Similarly, to evaluate the effectiveness of the Services’ policies in granting waivers to entry standards for applicants who have disqualifying incidents in their background checks, P&R contracted for a study that combined recruit entry data with subsequent disciplinary actions, promotions, and other performance indicators (Putka et al., 2003; Putka, 2004). As these examples illustrate, the policy analyst must typically use more than one file and be willing to use proxy indicators for the underlying variables of interest in order to carry the inquiry forward.
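The snapshot-linking approach described above can be sketched in a few lines. This is a hypothetical illustration, not DMDC's actual schema: each monthly snapshot is reduced to a set of anonymized member identifiers, and retention is the fraction of month-t members still present at month t + 1.

```python
# Hypothetical sketch of longitudinal linkage of monthly personnel snapshots.
# Identifiers and data values are invented for illustration.

def retention_rate(snapshot_t: set, snapshot_t1: set) -> float:
    """Fraction of members present at month t who are still present at t + 1."""
    if not snapshot_t:
        return 0.0
    retained = snapshot_t & snapshot_t1  # members appearing in both snapshots
    return len(retained) / len(snapshot_t)

month_t = {"A001", "A002", "A003", "A004"}
month_t1 = {"A001", "A003", "A004", "A005"}  # A002 separated; A005 is a new accession

print(retention_rate(month_t, month_t1))  # → 0.75
```

In practice the linkage would also carry demographic and assignment fields so that retention can be related to characteristics such as occupation or pay grade, as the text describes.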
The principal exceptions to this generalization are some of the data provided in response to a specific data call (e.g., for multiyear budget development) and the data collected by surveys. These efforts usually respond to an analytic issue that motivated the request or the collection of data. Nonetheless, those issues may be quasi-administrative in nature and driven by an immediate administrative need. For example, how does the beneficiary population view a specific benefit? Or what are the earnings from employment of military retirees? The data call or survey may not be designed to gather the variables that might explain why the benefit generates the observed response or the variables that explain the choices military retirees make in seeking and accepting civil employment.
Federal staff and staff at the relevant FFRDCs are the principal users of all the data sources that P&R manages. DMDC’s personnel records, for example, were used by P&R to answer Secretary Rumsfeld’s questions about who was bearing the burden of military deployments to Iraq and Afghanistan during the extended operations in those countries. His concern reflected his interest in the all-volunteer force and a conviction that those burdens should be shared equitably. DMDC’s data revealed a pattern of
utilization that failed to meet his standard and led to extended discussion with military leaders about corrective actions (Baiocchi, 2013). Based on these discussions, Army and Marine Corps leadership took action that resulted in an improvement to the balance of personnel used in deployments (CRS, 2012).
There are 10 independent FFRDCs that work for DoD,10 three of which conduct analysis and data collection principally related to personnel and readiness and utilize DMDC data: the National Defense Research Institute (NDRI), which is operated by the RAND Corporation; the Institute for Defense Analyses (IDA); and the Center for Naval Analyses (CNA).
NDRI analyzes issues related to human resources, including those relating to force management, readiness, support, and health care. It recently conducted, for example, a study on sexual assault and harassment that was based on a survey of 560,000 servicemembers (DoD, 2015b). It also recently reviewed personnel systems to examine how DoD matches its personnel to positions.
IDA examines defense policy and force planning in its Strategy, Forces, and Resources Division. In particular, it assesses issues related to organizational effectiveness and human capital management, as well as force structure and military capability alternatives. It recently completed a study, for example, of how to encourage personnel to learn to speak foreign languages and how to make sure that servicemembers who already have language and cultural skills are assigned to the right locations. In this study, IDA argued that the military should not try to develop or to identify servicemembers with such skills but should instead contract out for these skills. It also conducted a study to project the psychological health care needs of active duty servicemembers.
CNA conducts research that evaluates workforce management and military readiness. Its researchers have, for example, developed simulation tools to project what positions are needed, how those positions will be staffed, and how the personnel in the positions are to be trained and educated. CNA also drafts congressionally mandated reports on servicemember characteristics (DoD, 2016).
Advisory panels to the government also draw on DMDC data. The Military Compensation and Retirement Modernization Commission used these data sources for crafting an alternative approach to military retirement and for testing its ability to replicate the current profile of military
10 The 10 DoD FFRDCs are divided into R&D laboratories (Lincoln Laboratory, Software Engineering Institute, and Institute for Defense Analyses Communication and Computing Center), systems engineering and integration centers (The Aerospace Corporation and MITRE National Security Engineering Center), and study and analysis centers (CNA, IDA, RAND Arroyo Center, RAND NDRI, and RAND Project Air Force).
retention, and thus the experience levels the military would enjoy. Those estimates were produced by the FFRDCs that work for DoD, in this case, RAND’s NDRI. Likewise, IDA used the data sources of the Defense Health Agency to estimate for the Military Compensation and Retirement Modernization Commission the likely financial impact of a revised approach to the military health benefit.
DoD can and does turn to other analysis organizations besides the FFRDCs, including large for-profit consulting firms. In doing so, however, it must be especially vigilant about conflict of interest issues and analytic independence, which the FFRDCs are explicitly structured to protect. These firms may likewise employ the same data sources described here. For example, the Human Resources Research Organization (HumRRO) developed a cost-performance trade-off model for P&R to use in estimating the effects of recruiting budgets on the quality and job performance of new recruits (McCloy et al., 1992).
Groups with an interest in DoD policies and their implications may also seek to use these data sources. For example, the Services often use DMDC data in their Service-specific personnel research and analysis efforts. Independent scholars may also apply for access to these data sources.
As might be imagined, release of these data beyond the originating office is governed by the applicable statutes and federal policies, including those intended to protect human subjects. The federal staff enjoys the greatest access, as do advisory panels. The FFRDCs, by the terms of their charters and the restrictions placed upon them (such as specific policies for avoiding organizational conflicts of interest), may also be granted a high level of access, although not normally to personally identifiable information (records are often anonymized before being released to the FFRDCs). Although the FFRDC data requests respond to taskings from DoD, the granting of access to data may involve considerable negotiation and delay. Beyond the FFRDCs, sharing of these data with other organizations is more limited, although the Army and DMDC’s PDE represents an effort to respond to requests for data in a manner that balances those requests against both privacy and government concerns.
FFRDCs house their own data in some cases, but usually such data collections are limited in scope and relate to ongoing or long-term studies. The majority of data utilized by FFRDCs are housed within DMDC and are often used in conjunction with external data such as those from the VA and the Bureau of the Census. Researchers are attempting to use the DMDC’s PDE to access and utilize data but are encountering challenges, discussed earlier in this chapter and in Chapter 7.
Mission-Critical Decisions in Need of Improved Data Analytics
As mentioned in Chapter 2, P&R decision making is focused in six areas, each of which is discussed in this section:
- Ensuring DoD can recruit, train, motivate, and retain the necessary numbers of qualified personnel;
- Creating incentives that guide DoD to an optimal mix of personnel;
- Ensuring DoD creates a force that is ready to carry out directed actions;
- Influencing DoD’s decisions that affect the shape of military careers;
- Ensuring the Services supporting DoD’s personnel are properly structured and provided; and
- Anticipating and responding to sensitive behavioral issues.
Ensuring DoD Can Recruit, Train, Motivate, and Retain the Necessary Numbers of Qualified Personnel
P&R must ensure that DoD can recruit, train, motivate, and retain the necessary qualified personnel. For DoD civil service employees, P&R works in partnership with the Office of Personnel Management in accordance with Title 5 of the U.S. Code. However, there is a particular subset of DoD-employed civil service employees who are managed entirely within the department—for example, highly qualified experts and nonappropriated fund employees.11
P&R is responsible for ensuring the success of the all-volunteer force. Among other things it makes recommendations on pay policy and fringe benefits (which must be codified in statute) and on the setting of standards (physical, mental, and moral) for military service, and it participates in decisions on the host of factors that make up what one might call the “social compact” between DoD and those who choose to serve it.
For analysis of personnel issues—the recruit, train, motivate, retain function—P&R can turn to the Active Duty Military Personnel Master File maintained by DMDC, the Reserve Components Common Personnel Data System, and the Defense Civilian Personnel Data System. These record systems provide basic demographic data, educational information, and a history of individual pay grades and assignments. For civil service employees, the record includes training received, performance ratings and awards, and
11 Nonappropriated fund employees are civilian employees who are paid from nonappropriated funds (10 U.S. Code § 1587).
disciplinary actions. The military personnel records include entrance exam results (including physical and cognitive aptitude test results) and military occupational specialty (defined for civil service employees by the position held and by occupational series). Because these records are continuously maintained, they can be used to analyze cohort behavior (e.g., retention of individuals from one year to the next or retirement from the civil service) and to relate those behaviors to characteristics such as gender, race, education, employment history, and income.
For a broader behavioral picture, P&R can turn to surveys undertaken by DMDC, principally of military personnel, both active and reserve components. For over 15 years, these surveys have been regularly fielded with standardized questions focused on retention and satisfaction with military life, plus usually at least one other topic of immediate interest. At longer intervals, surveys tailored to a major issue are fielded (e.g., employment of military spouses, earnings of military retirees, or sexual assault). Such focused surveys may also be undertaken by outside research organizations, including the FFRDCs, at the behest of P&R or one of the Services, which will also conduct their own surveys.
Contemporary budget pressures may be responsible for the marked reduction in the number and frequency of DMDC surveys beginning in 2010, relative to 2003-2009. This decrease makes the surveys less useful as a predictive or leading-indicator tool. The Status of Forces survey of active duty personnel, for example, was typically taken two or three times a year in the earlier period. DMDC has also dropped its pursuit of QuickCompass surveys, which were intended to provide more rapid results than a standard survey instrument. Data analytics may turn out to be an economical substitute for the earlier survey approach.
Offsetting these reductions, DMDC launched its first longitudinal survey, the 2010 Military Family Life Project, focused on military family life and intended to measure the well-being of military spouses and families over time. This study consisted of four surveys given over a 2-year period, with every active duty spouse surveyed at least once to create a matched couple file. The Military Family Life Project also measured how families cope with relocations and deployments.
P&R and the military Services also use data from publicly available sources (e.g., the Employment Cost Index or the unemployment data published by the Department of Labor) to gauge the attractiveness of military service relative to civilian opportunities. P&R may collect more detailed data itself (e.g., its annual survey of prevailing wages, used to set blue-collar wages for both DoD and the federal government as a whole) or engage others to do so (e.g., tracking the propensity of American youths to volunteer for military
service, ongoing since 197512). Many of the data sources used to formulate decisions are also used to track and ensure that the decisions are robust.
P&R occasionally partners with other agencies to ensure its interests are reflected in the data they collect. Perhaps the most notable example is the National Longitudinal Survey of Youth 1979 (NLSY79), in which DoD financed an expanded military sample of the Department of Labor initiative, using it to create national norms for the Armed Services Vocational Aptitude Battery (ASVAB), the cognitive entrance examination for military service, and other purposes.13
Creating Incentives That Guide DoD to an Optimal Mix of Personnel
P&R helps create the incentives that guide DoD to an optimal mix of personnel, for which the headline issue is the mix of servicemembers, civil service employees, and contractors who staff the enterprise (see DoD Directive 1100.4). Because settling on the best mix of personnel types can engender considerable controversy, it is all the more important that the data on which these decisions are based be viewed as accurate and definitive. P&R’s influence on overall personnel mix is often indirect, through budget decisions or planning rules that channel choices appropriately.
In contrast to the robust set of data sources available to P&R for the analysis of the “recruit, train, motivate, and retain” function, fewer data are available to analyze the “mix of personnel” issue. An important resource is the Fully Automated System for Classification (U.S. Army, 2012). P&R and allied offices (especially in the military Services) supplement this data source by commissioning analyses of specific personnel trade issues, often by the FFRDCs. Because of the interest in what the private sector should (or should not) do, these analyses have created considerable literature on public–private competitions (long an interest of the Office of Management and Budget under its Circular A-76).14 That literature concludes that the
12 The methodology changed between 1999 and 2001; see http://www.dtic.mil/dtic/tr/fulltext/u2/a416458.pdf.
13 For a brief summary of NLSY79, see http://www.users.nber.org/~kling/surveys/NLSY79.html.
14 OMB Circular A-76, Performance of Commercial Activities (05/29/2003), including technical corrections (OMB Memorandum M-07-02 (10/31/2006) and OMB Memorandum M-03-20 (08/15/2003)): https://www.whitehouse.gov/sites/default/files/omb/assets/omb/circulars/a076/a76_incl_tech_correction.pdf.
Examples of the “outsourcing” or “competitive sourcing” literature include Frances P. Clark et al., “The Impact of Large Multi-Function/Multi-Site Competitions,” CNA, CRM D0008566.A2-Final, August 2003; Edward T. Morehouse, Jr., “Overview of Competitive Sourcing and Privatization, Framing the Issues for Army Environmental Cleanup,” IDA, D-2359, February 2000; and Beth J. Asch and John D. Winkler, “Ensuring Language Capability in the Intelligence Community: What Factors Affect the Best Mix of Military, Civilians, and Contractors?” RAND, TR-1284-ODNI, 2013.
competitions typically generated savings for the federal government, often because a revised but streamlined government operation won (OMB, 2003). Government organizations often commission studies to estimate what staffing is required to undertake the assigned tasks.15
The private sector has invested heavily in relevant decision-making research areas. The problem of determining staffing requirements to undertake projects or tasks has been studied using predictive (statistical) analytics; for example, Hu et al. (2007) and Cao et al. (2011) apply statistical methods to historical data on business service engagements in order to infer the levels of staffing skills that have led to successful engagements. The optimal composition of the workforce over time has been considered through a combination of predictive analytics and prescriptive (optimization) analytics; this combination includes optimal decisions with respect to recruiting, retention, and training or retraining over time to realize the optimal workforce composition and address skill and talent shortages and surpluses; refer to Cao et al. (2011) and the corresponding examples in Appendix D. Lastly, the problem of sourcing—that is, choosing among different suitable resources to satisfy demand for a particular project or task—has been studied as a specific instance of a general class of dynamic resource allocation problems and addressed through approaches based on prescriptive analytics (refer to Gao et al., 2013, and the corresponding example in Appendix D). Hoffmann et al. (2012) and Dietrich et al. (2014) describe the use of data analytics to address various aspects of workforce and talent management in organizations.
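The sourcing problem described above can be illustrated with a deliberately simple sketch: filling demand for a task from several resource pools at minimum cost. The pool names, costs, and capacities are invented for illustration; real formulations (as in the cited literature) add time dynamics, skill constraints, and uncertainty, and would typically use a full optimization solver.

```python
# Toy instance of the sourcing/allocation problem: choose among resource
# pools (names, unit costs, and capacities are hypothetical) to satisfy
# demand at minimum total cost.

def fill_demand(pools, demand):
    """Greedily assign demand to the cheapest pools first.

    pools: list of (name, unit_cost, capacity) tuples.
    Returns (allocation dict, total cost). Greedy is optimal here because
    costs are linear and there are no cross-pool constraints.
    """
    allocation, total = {}, 0.0
    for name, cost, cap in sorted(pools, key=lambda p: p[1]):
        take = min(cap, demand)
        if take > 0:
            allocation[name] = take
            total += take * cost
            demand -= take
    if demand > 0:
        raise ValueError("insufficient capacity to meet demand")
    return allocation, total

pools = [("contractor", 120.0, 40), ("civilian", 90.0, 25), ("military", 150.0, 50)]
alloc, cost = fill_demand(pools, 50)
print(alloc, cost)  # 25 civilian + 25 contractor: 25*90 + 25*120 = 5250.0
```

Prescriptive-analytics treatments replace this greedy rule with linear or dynamic programming once constraints couple the pools over time.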
Research in this area began in the 1970s. White (1970) studied workforce and talent management using models of mobility in organizations, including the notion of vacancy chains. Then, Bartholomew (1973) developed mathematical models of social phenomena and applied stochastic processes to workforce planning. Vajda (1978) also considered mathematical aspects of workforce planning. Gael (1988) described methods of analyzing jobs to meet specific situations and objectives, including job evaluation, wage incentives, job design, affirmative action, employee performance measurement, data collection techniques, and job diagnosis.
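The stochastic workforce-planning models in the tradition of Bartholomew (1973) can be sketched as a small Markov chain: members move between grades with fixed annual probabilities, the shortfall in each row is attrition, and recruits enter at the bottom grade. The grades, rates, and recruit stream below are illustrative assumptions, not estimates from any DoD data.

```python
# Minimal Markov-chain workforce projection in the spirit of Bartholomew (1973).
# All grades, transition rates, and headcounts are hypothetical.

GRADES = ["junior", "mid", "senior"]

# TRANSITION[i][j]: probability a member in grade i is in grade j next year;
# rows need not sum to 1 -- the remainder is attrition (leaving the system).
TRANSITION = [
    [0.70, 0.15, 0.00],  # junior: 70% stay, 15% promote, 15% leave
    [0.00, 0.75, 0.10],  # mid:    75% stay, 10% promote, 15% leave
    [0.00, 0.00, 0.80],  # senior: 80% stay, 20% leave
]

def project(headcount, recruits, years):
    """Apply the transition matrix yearly, adding recruits at the junior grade."""
    state = list(headcount)
    for _ in range(years):
        nxt = [0.0] * len(GRADES)
        for i, n in enumerate(state):
            for j, p in enumerate(TRANSITION[i]):
                nxt[j] += n * p
        nxt[0] += recruits  # accessions enter at the junior grade
        state = nxt
    return state

print([round(x) for x in project([1000, 400, 200], recruits=250, years=1)])
# → [950, 450, 200]
```

Vacancy-chain models such as White's (1970) invert this logic, propagating openings downward rather than people upward, but the transition-matrix bookkeeping is the same.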
Ensuring DoD Creates a Workforce That Is Ready to Carry Out Directed Actions
P&R is responsible for helping to ensure that DoD creates a workforce that is ready to carry out the actions the President directs. While that
15 See, for example, Gerald E. Cox, “Improving Cost Estimates in the Force Mix Allocation Process for the Active and Reserve Components,” CNA, DRM-2012-U-003418-Final, February 2013; Thomas H. Barth et al., “Staffing Cyber Operations,” IDA, NS D-5472, May 2015; and Terrence K. Kelly et al., “Stabilization and Reconstruction Staffing: Developing U.S. Civilian Personnel Capabilities,” RAND, MG-580-RC, 2008.
responsibility is generally measured in terms of agreed indicators of unit performance (e.g., as instantiated in the DRRS), it also includes responsibility when units or individuals do not perform as expected. Its responsibility for readiness outcomes meant that P&R would play a significant role in the decision regarding which units would carry out operations in Afghanistan and Iraq (the so-called deployment orders process). Readiness also embraces the physical health of the workforce, with issues ranging from vaccination compliance to the stockpiling of medicines against pandemic disease.
DoD traditionally measured readiness through inputs to the Status of Resources and Training System (SORTS), which relied first on empirical measures (e.g., personnel fill) and then on an overall readiness judgment by commanders. The latter, of course, is subject to the same weaknesses as any judgmental score (such as manipulation in response to bureaucratic pressures),16 and the former fails to address whether the unit can carry out its assigned missions. In the 1990s, Congress insisted on a review of readiness assessment bias, which helped justify the implementation of the revised system, DRRS (Tillson et al., 2000). While DRRS does include SORTS data, it focuses on the mission-essential tasks associated with each operational plan and asks commanders—starting with the most senior and cascading down the chain—to evaluate the ability of their units to carry out those tasks. This process provides finer resolution and more nuanced data than were previously available. At the same time, DRRS provides access to other DoD data files that facilitate judging the reasonableness of those evaluations, the causes of any shortcomings, and the potential for remedial action.
Influencing DoD’s Decisions That Affect the Shape of Military Careers
P&R plays a role in DoD’s decisions that affect the shape of military careers—for example, promotion criteria and policies. The basic structure is prescribed by Congress through a combination of statutory authority and direction (e.g., the Defense Officer Personnel Management Act) and limits placed on certain choices (e.g., grade ceilings). Under Declarations of National Emergency, which have been in effect continuously since September 2001, Congress has given DoD broad authority to waive many of the statutory limitations, with the result that P&R must, at a minimum,
16 In 1999, the Washington Post published an article titled “Two Army Divisions Unfit for War,” which reported that both the 10th Mountain Division and the 1st Infantry Division had received the lowest possible rating for unit readiness. In response to inquiries from Congress and the White House, Army authorities said the primary reason for this rating was the peacekeeping mission in the Balkans, which required up to half of their troops. The Army went on to state that the two divisions in question were “more ready to fight than the new evaluation suggests.”
advise on how that waiver authority should be used and often actually administer it.
Some Secretaries of Defense take an especially active interest in shaping military careers, enhancing the role that P&R is expected to play. During Secretary Rumsfeld’s tenure, DoD proposed and persuaded Congress to make a number of statutory changes, including loosening restrictions on the length of military careers. Secretary Carter is likewise taking a strong interest in personnel issues, tasking his Acting Under Secretary of Defense (Personnel & Readiness) to consider sweeping changes in order to better manage DoD’s talent and to attract new talent to its ranks. Although the shaping of military careers is largely governed by statute, P&R is the steward of the process by which statutory changes are proposed. Change may arise from the concerns of the military departments or Congress, or from the agenda of the Secretary of Defense or the President. Again, it is the personnel master data files, supplemented by survey responses and special studies, that give P&R insight into potential statutory solutions and their likely effects.
Special studies can play a particularly important role in this regard. In the early 2000s, RAND’s Aligning the Stars report was used to support the argument for lengthening military careers in order to capitalize more fully on the experience of long-term personnel (Harrell et al., 2004). As is often the case, the study drew on several databases and models to conduct its analysis. It utilized three databases to generate overall historical patterns, provide detailed information on common sequences of jobs, and generate inputs for modeling: the General and Flag Officer (G/FO) database maintained by the Directorate of Information Operations and Reports (DIOR),17 the G/FO database maintained by DMDC,18 and the Joint Duty Assignment Management Information System (JDAMIS).19 The analysis was then conducted using two independent models (a steady-state system dynamics model and a more detailed entity-based model) and was subsequently validated on a third model to demonstrate that keeping very senior officers in service longer would not erode promotion opportunity
17 The G/FO database maintained by DIOR is an aggregation of the General and Flag Officer Roster, an exhaustive list published monthly by DIOR. It tracks all active and reserve General and Flag Officers, maintaining information such as rank, specialty, service, job title, and unit.
18 The G/FO database maintained by DMDC contains the personnel history (starting in September 1975) of all officers promoted to the rank of O-7 on or after January 1, 1990. Fields include name, date of birth, service, rank, occupational code, unit identification code, and unit address. The RAND report notes that the data in the DMDC database were more complete than those in the DIOR database but lacked a job title and unit name, which made it infeasible to use the DMDC data to perform the filtering of positions at the center of RAND’s analysis.
19 JDAMIS is a relational database containing data on “joint” positions and the officers who have served in at least one of those positions.
for the next cohort, provided prompt decisions were made about which officers truly merited that opportunity. In its later action, Congress left in place the looser age limits needed to implement the change but revoked some of the enhanced pension rights awarded to the most senior officers. The RAND report helped put to rest the fear that such a policy change would clog the promotion system. While other controversies made it difficult to secure and implement change, DoD did succeed in loosening some of the statutory restrictions.
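The kind of steady-state flow calculation performed by models such as those used in the RAND study can be sketched in a few lines. The grades, promotion fractions, separation fractions, and accession numbers below are entirely hypothetical and are not drawn from that study; the sketch only illustrates how annual promotion and separation rates jointly determine steady-state grade inventories, which is the quantity such analyses examine when asking whether longer careers would clog promotions.

```python
# Toy steady-state grade-flow model (all rates hypothetical).
# Each year a fraction of each grade promotes up, a fraction
# separates, and the remainder stays in grade.
grades = ["O-7", "O-8", "O-9"]
promote = {"O-7": 0.25, "O-8": 0.15, "O-9": 0.0}
leave = {"O-7": 0.20, "O-8": 0.25, "O-9": 0.40}
accessions = {"O-7": 100.0, "O-8": 0.0, "O-9": 0.0}

inv = {g: 0.0 for g in grades}
for _ in range(200):  # iterate yearly flows until inventories settle
    nxt = {}
    for i, g in enumerate(grades):
        stay = inv[g] * (1 - promote[g] - leave[g])
        promoted_in = inv[grades[i - 1]] * promote[grades[i - 1]] if i > 0 else 0.0
        nxt[g] = stay + promoted_in + accessions[g]
    inv = nxt

print({g: round(v, 1) for g, v in inv.items()})
# {'O-7': 222.2, 'O-8': 138.9, 'O-9': 52.1}
```

Lowering a grade's separation rate (modeling longer careers) raises that grade's steady-state inventory, and the model then shows how much the promotion fraction into it must fall to stay within a fixed grade ceiling, which is precisely the promotion-opportunity question the full models address with far richer detail.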
Good data are especially important under a Declaration of National Emergency, which is how DoD currently operates, because many of the statutory restrictions on military personnel management (e.g., limits on the number of personnel in the more senior grades) can be waived in that circumstance. The personnel master files contain information on personal characteristics, such as name, Social Security number, date of birth, gender, race, ethnic group, and education, as well as information on military characteristics, such as service, pay grade, months of service, and duty occupation (GAO, 2005). These data on servicemembers, which have been actively collected since 1971, give P&R the ability to judge whether requests for waivers are justified.
A different sort of waiver, likewise linked to career shaping, involves the statutory requirement that promotion to general or flag officer (i.e., one star and above) requires a certain amount of “joint” experience (essentially, experience in a non-Service-specific position, with the Joint Staff or a combatant command) (DoD, 2014). That experience is routinely credited when the billet is designated, but an issue arises when the content might meet the spirit of the statute even if the billet lacks such a designation. P&R is empowered to approve such a substitution and must judge based on descriptive material submitted by the military Service.
The issue of career shape is a central element of the current Secretary of Defense’s Force of the Future initiatives, described in Chapter 2. Analogous issues related to career shaping and talent management arise in the private sector: recruiting, training, retaining, managing, promoting, compensating, and developing critical skills and talent across different parts of an enterprise (such as the military and civil service sectors of DoD). Other analogous issues include the sourcing of demand for DoD projects and tasks via servicemembers, civil service employees, and alternative options, as well as the allocation of human capital resources. The lessons learned and the combinations of predictive (statistical) and prescriptive (optimization) analytics developed in these areas, some described above and others described in Appendix D, may help address P&R requirements with respect to career shaping and talent management.
Ensuring That Programs Supporting DoD’s Personnel Are Properly Structured and Provided
P&R is responsible for ensuring that various services supporting DoD personnel are properly structured and provided. Health care for military personnel and their families, and for military retirees and their families, is the leading example. Others include the availability of household goods, the education of military children overseas, and the opportunity for healthy leisure activity (mostly through nonappropriated fund activities). This responsibility is reflected not only in the Assistant Secretary for Health Affairs reporting to the Under Secretary, but also in an entire office focused on “Military Community and Family Policy.”
The responsibility for providing key services, especially health care, drives some of P&R’s most extensive data collection and analyses. P&R is responsible for the data systems that record all inpatient and outpatient care delivered in military facilities, as well as care purchased from private providers, including civilian hospital and nonhospital care to eligible dependents and retirees (through the Civilian Health and Medical Program of the Uniformed Services, commonly known as TRICARE) (Jansen, 2014). P&R is interested in the health status of its personnel and the productivity of its facilities—for instance, whether vaccinations are up to date and how its clinicians compare with civilian practitioners. DoD is now developing a revised electronic health record for the care it delivers, which must be sharable with the Department of Veterans Affairs (DoD, 2015c).
One significant source of health data originated from concerns that arose during the first Persian Gulf War about the effects of deployment on long-term health outcomes (“Gulf War Syndrome”). To improve its ability to assess the health effects of future exposures, DoD initiated the Millennium Cohort Study, in which individuals are followed longitudinally and asked questions about, for example, post-traumatic stress disorder (PTSD) and marital status. Scholars have used the data from the survey to examine questions related to health, sexual harassment, and employment (NRC, 2014).
Data sets assembled by P&R in the course of managing services extend well beyond health issues. For example, because it is responsible for MEPS, P&R oversees the administration of ASVAB and maintains the test results. In similar fashion, P&R is responsible for operating the schools for military children overseas (and, for historical reasons, in certain parts of the United States) through its DoD Educational Activity, which uses standardized test scores, such as TerraNova Assessment tests, to monitor the progress of students.
Anticipating and Responding to Sensitive Behavioral Issues
P&R must both anticipate and respond to sensitive behavioral issues. These include long-term issues, such as the incidence of tobacco use and alcohol abuse, and urgent issues, such as sexual assault or constraints on religious expression. P&R’s responsibility for dealing with issues arising from personal behavior engenders a number of specialized databases assembled both to gauge the extent of a problem and to monitor the progress of policy actions taken. For example, the concern with sexual assault led P&R to conduct four major surveys of active duty troops, in 2002, 2006, 2010, and 2012; reserve components were surveyed similarly in 2004, 2008, and 2012; and DoD commissioned an independent study of the frequency and magnitude of unwanted sexual contacts (RAND, 2014).
P&R’s survey instruments are designed to provide immediate answers, often by tabulating the frequency of responses after weighting the data to reflect the underlying population. A number of key questions are asked consistently over time so that trend analyses can be constructed (e.g., for satisfaction with military life, family satisfaction, intentions to remain in military service, stress). The instruments may also be used to estimate variables needed for specific policy purposes (e.g., estimates of family income to gauge the effect of policy changes). But as is the case with the sexual harassment/sexual assault surveys, the instruments are also designed to facilitate deeper analyses over a longer period of time. Typically, predictive models are constructed to explain the observed results as a function of (1) respondent background and (2) respondent behavior hypothesized to influence the observed results. These, in turn, are often employed for prescriptive purposes (e.g., to redesign elements of the compensation package).
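The weighting step described above can be sketched minimally. The records, strata, and shares below are invented for illustration; the sketch shows post-stratification reweighting, in which each response is scaled by the ratio of its stratum's population share to its sample share before frequencies are tabulated, so that an overrepresented group does not distort the estimate.

```python
# Hypothetical survey records: (stratum, satisfied_with_military_life).
responses = [
    ("enlisted", True), ("enlisted", False), ("enlisted", True),
    ("officer", True), ("officer", True),
]

# Invented population shares; officers are overrepresented in
# this tiny sample (2 of 5) relative to the population (18%).
population_share = {"enlisted": 0.82, "officer": 0.18}
sample_counts = {"enlisted": 3, "officer": 2}
n = len(responses)

# Post-stratification weight = population share / sample share.
weight = {s: population_share[s] / (sample_counts[s] / n)
          for s in population_share}

weighted_yes = sum(weight[s] for s, yes in responses if yes)
total_weight = sum(weight[s] for s, _ in responses)
print(round(weighted_yes / total_weight, 3))  # 0.727
```

The unweighted satisfaction rate here would be 4/5 = 0.8; downweighting the uniformly satisfied but overrepresented officers pulls the population estimate down to about 0.73. Production survey estimation adds nonresponse adjustments and variance estimation, but the core reweighting idea is the same.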
Baiocchi, D. 2013. Measuring Army Deployments to Iraq and Afghanistan. RAND Corporation. http://www.rand.org/content/dam/rand/pubs/research_reports/RR100/RR145/RAND_RR145.pdf.
Bartholomew, D.J. 1973. Stochastic Models for Social Processes. 2nd ed. London: Wiley.
Cao, H., J. Hu, C. Jiang, T. Kumar, T.-H. Li, Y. Liu, Y. Lu, S. Mahatma, A. Mojsilovic, M. Sharma, M.S. Squillante, and Y. Yu. 2011. OnTheMark: Integrated stochastic resource planning of human capital supply chains. Interfaces 41(5):414-435.
CRS (Congressional Research Service). 2012. Troop Levels in the Afghan and Iraq Wars, FY2001-FY2012: Costs and Other Potential Issues. Washington, D.C.
Defense Security Service. 2016. “Joint Personnel Adjudication System (JPAS).” http://www.dss.mil/diss/jpas/jpas.html. Accessed March 31, 2016.
Department of the Navy, Chief Information Officer. 2005. “The Defense Biometrics Identification System.” CHIPS. October-December. http://www.doncio.navy.mil/chips/ArticleDetails.aspx?ID=3181.
DHRA (Defense Human Resource Activity). 2009. “Fiscal Year 2010 Budget Estimates.” May. http://comptroller.defense.gov/Portals/45/Documents/defbudget/fy2010/budget_justification/pdfs/01_Operation_and_Maintenance/O_M_VOL_1_PARTS/DHRA.pdf.
Dietrich, B.L., E.C. Plachy, and M.F. Norton. 2014. Analytics Across the Enterprise: How IBM Realizes Business Value from Big Data and Analytics. Indianapolis: IBM Press.
DMDC (Defense Manpower Data Center). 2014. “The Strategic Planning Process at DMDC.” https://www.dmdc.osd.mil/appj/dwp/documents.jsp.
DMDC. 2016a. “DMDC Overview.” https://www.dmdc.osd.mil/appj/dwp/dmdc_overview.jsp. Accessed March 31, 2016.
DMDC. 2016b. “Personnel Data.” https://www.dmdc.osd.mil/appj/dwp/personnel_data.jsp. Accessed March 31, 2016.
DoD (Department of Defense). 2011. “Department of Defense Instruction: Reserve Component Common Personnel Data System (RCCPDS).” Number 7730.54. Issued May 20. Under Secretary of Defense (Personnel & Readiness). http://www.dtic.mil/whs/directives/corres/pdf/773054p.pdf.
DoD. 2014. “Department of Defense Instruction: DoD Joint Officer Management (JOM) Program.” Number 1300.19. Issued March 4. http://www.dtic.mil/whs/directives/corres/pdf/130019p.pdf.
DoD. 2015a. 2014 Demographics: Profile of the Military Community. http://download.militaryonesource.mil/12038/MOS/Reports/2014-Demographics-Report.pdf.
DoD. 2015b. Department of Defense Annual Report on Sexual Assault in the Military. Washington, D.C.: Department of Defense.
DoD. 2015c. “DoD Awards Contract for Electronic Health Records.” U.S. Department of Defense News article. July 29. http://www.defense.gov/News-Article-View/Article/612714.
DoD. 2016. Population Representation in the Military Services, Fiscal Year 2014. Office of the Under Secretary of Defense (Personnel & Readiness). https://www.cna.org/pop-rep/2014/. Accessed February 15, 2016.
DPCLD (Defense Privacy and Civil Liberties Division). 2009. “Non-combatant Tracking System (NTS) & Evacuation Tracking and Accountability System.” System of Record Notices (SORNs): Office of the Secretary of Defense and Joint Staff (OSD/JS): DMDC 04. http://dpcld.defense.gov/Privacy/SORNsIndex/DOD-Component-Notices/OSDJS-Article-List/.
DTIC (Defense Technical Information Center). 2014. “Department of Defense Instruction: Defense Incident-Based Reporting System (DIBRS).” Number 7730.47. January 23. http://www.dtic.mil/whs/directives/corres/pdf/773047p.pdf.
Gael, S. 1988. Job Analysis Handbook for Business, Industry and Government. New York: Wiley.
GAO (Government Accountability Office). 2005. Military Personnel: Reporting Additional Servicemember Demographics Could Enhance Congressional Oversight: Report to Congressional Requesters. GAO-05-952. Washington, D.C.
Gao, X., Y. Lu, M. Sharma, M.S. Squillante, and J.W. Bosman. 2013. Stochastic optimal control for a general class of dynamic resource allocation problems. Pp. 3-14 in Proceedings of the IFIP WG 7.3 Performance Conference. Vienna, Austria, September 24-26.
Harrell, M.C., H.J. Thie, P. Schirmer, and K. Brancato. 2004. Aligning the Stars: Improvements to General and Flag Officer Management. Santa Monica, Calif.: RAND Corporation. http://www.rand.org/pubs/monograph_reports/MR1712.
Hoffmann, C., E. Lesser, and T. Ringo. 2012. Calculating Success: How the New Workplace Analytics Will Revitalize Your Organization. Boston: Harvard Business Review Press.
Hu, J., B.K. Ray, and M. Singh. 2007. Statistical methods for automated generation of service engagement staffing plans. IBM Journal of Research and Development 51(3):281-293.
Jansen, D.J. 2014. Military Medical Care: Questions and Answers. Congressional Research Service. https://www.fas.org/sgp/crs/misc/RL33537.pdf.
McCloy, R.A., D.A. Harris, J.D. Barnes, P.F. Hogan, D.A. Smith, D. Clifton, and M. Sola. 1992. Accession Quality, Job Performance, and Cost: A Cost/Performance Trade-off Model. FR-PRD-92-11. Alexandria, Va.: Human Resources Research Organization.
NRC (National Research Council). 2014. The Context of Military Environments: An Agenda for Basic Research on Social and Organizational Factors Relevant to Small Units. Washington, D.C.: The National Academies Press.
OMB (Office of Management and Budget). 2003. “Circular No. A-76 Revised.” May 29. https://www.whitehouse.gov/omb/circulars_a076_a76_incl_tech_correction/.
PERSEREC (Defense Personnel and Security Research Center). 2016. “PERSEREC Initiatives.” http://www.dhra.mil/perserec/currentinitiatives.html. Accessed March 31, 2016.
Putka, D.J. 2004. “An Evaluation of Moral Character Waiver Policy: Database Documentation.” FR-04-17. Alexandria, Va.: Human Resources Research Organization.
Putka, D.J., C.L. Noble, D.E. Becker, and P.F. Ramsberger. 2003. “Evaluating Moral Character Waiver Policy Against Servicemember Attrition and In-service Deviance Through the First 18 Months of Service.” FR-03-96. Alexandria, Va.: Human Resources Research Organization.
RAND. 2014. The RAND Military Workplace Study: Sexual Assault and Sexual Harassment in the U.S. Military. Santa Monica, Calif.: RAND Corporation.
Tillson, J.C.F., R.J. Atwell, J.R. Brinkerhoff, W.R. Burns, Jr., M. Burski, J. Castillo, M. Diascro, R. Fabrie, W.D. Freeman, M.R. Lewis, C. Lyman, and L. Morton. 2000. Independent Review of DoD’s Readiness Reporting System. IDA P-3569. Alexandria, Va.: Institute for Defense Analyses.
U.S. Army. 2011. “Defense Readiness Reporting System-Army Procedures: Commander’s Unit Status Reporting Procedures.” Army Pamphlet 220-1. November 16. http://ssitoday.armylive.dodlive.mil/files/2014/03/DA-PAM-220_1.pdf.
U.S. Army. 2012. “Fully Automated System for Classification (FASCLASS).” http://cpol.army.mil/library/permiss/38.html.
Vajda, S. 1978. Mathematics of Manpower Planning. New York: Wiley.
Vie, L.L., K.N. Griffith, L.M. Scheier, P.B. Lester, and M.E.P. Seligman. 2013. The Person-Event Data Environment: Leveraging big data for studies of psychological strengths in soldiers. Frontiers in Psychology 4:934.
Vine, D. 2015. Where in the World Is the U.S. Military? Politico Magazine, July/August. http://www.politico.com/magazine/story/2015/06/us-military-bases-around-the-world-119321.
White, H.C. 1970. Chains of Opportunity: System Models of Mobility in Organizations. Boston: Harvard University Press.