THE NATIONAL ACADEMIES
Advisers to the Nation on Science, Engineering, and Medicine
National Academy of Sciences
National Academy of Engineering
Institute of Medicine
National Research Council
Commission on Behavioral and Social Sciences and Education
Committee on the Youth Population and Military Recruitment
June 16, 2000

ADM Patricia A. Tracey
Deputy Assistant Secretary of Defense for Military Personnel Policy
Room 3E767, The Pentagon
Washington, DC 20310-4000
Dear Admiral Tracey:
As you know, the National Research Council is conducting a 4-year study on military advertising campaigns and long-term planning in the Department of Defense (DoD) and the several Armed Services. The recent shortfall in military recruiting has prompted DoD to examine changes in youth attitudes and consider new strategies for attracting youth to the military. In response to a request from the Office of the Assistant Secretary of Defense (OASD), the Committee on the Youth Population and Military Recruitment was formed to examine the demographic trends, cultural characteristics, attitudes, and educational attainments of American youth in order to help military planners improve recruiting for the military (the committee's task statement and committee roster are attached).
One aspect of the committee's charge is to periodically evaluate surveys and interpretive reports provided by contractors. At our first meeting, OASD asked us to review and comment, by June 2000, on the data collection and analyses of the Youth Attitude Tracking Study (YATS), a survey administered by the Defense Manpower Data Center (DMDC) that measures youth attitudes toward military service.
This letter is based on the committee's evaluation of materials provided by DMDC, including YATS trend and focused reports, questionnaires, data from 1998 and 1999, and discussions with individuals who have examined YATS data. We examined (1) the need for further analysis of YATS data, (2) methodological issues associated with various approaches to data collection, (3) the current administration of YATS, (4) YATS item content and areas for new questions, and (5) accessibility of YATS data to DoD policy makers and decision makers. We begin with a brief description of the YATS survey followed by our findings and recommendations.
2101 Constitution Avenue, NW, Washington, DC 20418  Telephone (202) 334-3027  Fax (202) 334-3584  national-academies.org
BACKGROUND: YATS FRAMEWORK

YATS is a computer-assisted telephone interview (CATI) of 10,000 young American men and women between 16 and 24 years of age. The survey has been administered annually since 1975. To limit the interview length to 30 minutes, the survey questions are grouped into sections and asked of subsets of the survey participants. That is, the questionnaire is partitioned such that different respondents are asked different subsets of items within a block of items on a given topic. The 1999 YATS questionnaire covers a wide range of issues, including propensity to enlist. The overall complexity of the questionnaire is evident in Table 1, which lists and briefly describes the major questionnaire sections.

TABLE 1 Contents of YATS

Background Items: Gender, age, previous military service, education (a number of questions), employment status (a number of questions), future plans and unaided propensity (a number of questions), and aided propensity (a number of questions).

Military Benefits and Incentives: One question on the competitiveness of military pay and a number of questions about the attractiveness of educational incentives, the prohibition of smoking, and terms of enlistment.

College Programs: A number of questions about the service academies, ROTC, and sources of money for college.

Current Events I: Questions about combat interest and Kosovo.

Current Events II: Questions about the likelihood of acceptance by the military and career options in the military.

Junior ROTC: Questions about participation in high school programs.

Civilian vs. Military Perception: Questions concerning the importance of 26 issues, including money for education, development of self-discipline, opportunity for travel, working as part of a team, getting experience preparatory for a career, working in a high-technology environment, etc.

Media Habits and Internet Use: Numerous questions on use of television, radio, newspapers, magazines, and the Internet.

Advertising Awareness: General questions concerning recall of any military advertising, recall of any advertised services, etc.

Slogan Awareness: Aided recall of advertising theme lines.

Advertising Response and Information Seeking: Questions concerning readership of direct mail pieces and actions taken to call for information or contact recruiters.

Influences I: Questions concerning persons with whom the study participant discussed the possibility of military service and the attractiveness of military service.

Influences II: Questions concerning personal sources of information, military-related movies, and perceptions of military life.

Background I: Questions concerning the ASVAB test, MEPS test, and high school grades.

Background II: Questions concerning parents' education, marital status of the study participant, etc.

YATS data are used by the Office of the Under Secretary of Defense (Personnel and Readiness), military recruiting commands, and all of the Armed Services and their advertising agencies to track trends in youth attitudes, understand the effect of these trends on recruiting, and evaluate the effectiveness of recruiting programs. Information about YATS data and analyses is disseminated through briefings, presentations, and reports. The most recent analysis, Youth Attitude Tracking Study 1998: Propensity and Advertising Report (Wilson et al.,
2000), provides information on the demographics of the youth population, propensity for military service, reasons for joining or not joining the military, and the effect of recruiting efforts (including advertising). In addition, Bruce Orvis and his colleagues at RAND have evaluated youth and recruiting issues using YATS data (Orvis et al., 1996).

FINDINGS AND RECOMMENDATIONS

The recommendations below represent the collective judgment of the committee after careful evaluation of YATS and its existing analytic products, and after discussions with people who have done extensive analyses of the YATS data. As our work continues, we may offer other recommendations.

Further Data Analyses

Further analyses of the YATS data could yield valuable insights about the causes of current recruiting shortfalls and possible remedial actions. Most of the recommended analyses involve studying potential causal links[1] between the propensity to enlist and various youth and societal factors, although other types of analyses may also yield important insights regarding youth attitudes and behaviors that could aid recruitment strategies.

[1] Care must be exercised in drawing conclusions about causality when correlational data are used.

We recommend that DoD consider comprehensive bivariate and multivariate analyses that attempt to relate trends in the propensity to enlist to possible causal factors, such as trends in youth attitudes and values, trends in demographic characteristics of youth, trends in youth influencers (such as family members with military experience), trends in military recruiting resources (e.g., number of recruiters, advertising, and enlistment incentives), trends in military operations and conflicts, and trends in such exogenous variables as civilian pay, unemployment rates, and college demand and incentives. It would also be valuable to examine differences in propensity by respondents' planned occupational field.

According to DoD, existing studies and analyses of YATS tend to focus on snapshot tabulations, single-variable trends, and relationships among, at most, two or three variables of interest. Standing alone, univariate and bivariate analyses can generate misleading and inappropriate conclusions from the YATS data, and sole reliance on such analyses is not appropriate for the kinds of information needed by DoD. We recommend more comprehensive multivariate analyses that might use several different techniques, including cross-sectional multiple regression, multivariate time-series regression, and even combined time-series cross-sectional regression insofar as the data permit these kinds of analyses. Analyses should also use alternative definitions of propensity (e.g., those definitely planning to enlist) and different age groups that might generate closer relationships to actual enlistment behaviors. In addition, since there is a perception that private industry provides better training in some kinds of jobs and that the military might be the only place to obtain training in other kinds of jobs (e.g., combat), it would be useful to understand more about how propensity differs across the occupational choices of youth. As part of this work, it might also be appropriate to conduct factor analyses of some of the youth attitude and value data, such as reasons for joining/not joining, reasons for increased/decreased interest, and the importance of various career attributes or life goals. Not only might this help identify broader themes in and structures of youth attitudes and values, and whether these themes and structures change over time, but it would also serve as a useful data reduction technique that would aid in the construction of multivariate models.

We recognize that there are some limitations in carrying out further analyses of the YATS data. First, these analyses, as with all work on YATS, would focus on propensity rather than actual enlistment behavior, and although propensity is related to enlistment, the correspondence is not one-to-one. Second, there are some serious data limitations, including changes over time in the attitudes surveyed and in question wording, and significant amounts of missing data, most of which arise from the practice of asking questions only of randomly selected subsets of respondents. Under the current structure, the questionnaire is partitioned across participant subgroups such that different respondents are asked different subsets of items within a block of items on a given topic. This last feature makes cross-sectional multivariate analyses (e.g., regression or factor analysis) very difficult due to inadequate numbers of observations.

Some of these problems might be overcome by factor analyzing pairwise correlation matrices or by aggregating data into higher-level units of analysis (e.g., geographic regions). However, the two primary options for factor analysis of partitioned data are problematic. One option is analysis of a pairwise correlation matrix, where each correlation is based on the subsample responding to the particular pair of items. The other option is to impute missing values for those respondents who were not asked particular items. In both cases, the observed patterns can be affected by the implicit assumptions that these approaches make about the responses of individuals whose responses are missing. Furthermore, because the questionnaire is partitioned in this manner, it is difficult to determine how measures of values and interests relate to propensity to enlist among various subgroups within the participant sample.

In planning further analyses of the YATS data, we recommend that some attention be given to already completed analyses based on other data sets, such as the Monitoring the Future surveys at the University of Michigan Survey Research Center. Some of those analyses are similar to what we are recommending, and they may offer useful leads on the more promising approaches that might be applied to the YATS data. In spite of some understandable limitations in the data, the committee believes that further secondary analyses of YATS can offer greater insight into some of the reasons for changes in propensity to enlist, which in turn can be tested and validated by similar analyses using other surveys (such as Monitoring the Future) or by analyzing actual enlistment trends.

We recommend that a procedure be established (e.g., an advisory board) to periodically review and evaluate the adequacy of analytic approaches to YATS data.
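To make the recommended multivariate approach concrete, the sketch below fits a cross-sectional multiple regression of propensity on several candidate factors at once. All variable names, coefficients, and data are hypothetical (synthetic), so the example illustrates only the technique, not actual YATS results:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical explanatory factors (synthetic data, names assumed):
# local unemployment rate, a civilian pay index, and the number of
# family members with military experience ("influencers").
unemployment = rng.normal(5.0, 1.0, n)
civilian_pay = rng.normal(100.0, 10.0, n)
influencers = rng.poisson(1.0, n).astype(float)

# Synthetic propensity built from known coefficients plus noise, so
# the regression has a ground truth to recover.
propensity = (0.8 * unemployment - 0.05 * civilian_pay
              + 0.3 * influencers + rng.normal(0.0, 0.5, n))

# Cross-sectional multiple regression: propensity on all factors at once.
X = np.column_stack([np.ones(n), unemployment, civilian_pay, influencers])
coef, *_ = np.linalg.lstsq(X, propensity, rcond=None)
print(np.round(coef[1:], 2))  # estimates near (0.8, -0.05, 0.3)
```

The point of fitting all factors jointly, rather than one bivariate relationship at a time, is that each coefficient is then adjusted for the others, which is exactly what single-variable tabulations cannot do.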
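The pairwise-correlation option discussed above can be sketched as follows. The item names are hypothetical and the data synthetic; the point is only to show how pairwise-complete correlations are computed when, by design, no respondent answers every item:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 300

# Three synthetic attitude items (names are hypothetical); the first
# two share a common latent factor, the third is unrelated.
latent = rng.normal(size=n)
items = pd.DataFrame({
    "money_for_education": latent + rng.normal(0.0, 1.0, n),
    "self_discipline": latent + rng.normal(0.0, 1.0, n),
    "travel": rng.normal(size=n),
})

# Simulate the partitioned questionnaire: each respondent is randomly
# asked only two of the three items, so no one answers all three.
for i in range(n):
    items.iat[i, rng.integers(0, 3)] = np.nan

# Pairwise-complete correlations: each entry uses only the subsample
# that answered both items in the pair (pandas drops NaNs pairwise).
pairwise_corr = items.corr()
print(pairwise_corr.round(2))
```

Note that each off-diagonal entry here rests on a different subsample, which is precisely why a factor analysis of such a matrix inherits implicit assumptions about the respondents who were not asked each pair.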
Methodology

In considering further analyses of the YATS data, we recognize that the current methodology for collecting YATS data involves procedures that necessarily limit the ability to apply multivariate techniques. Most of these limitations arise from the fact that YATS is administered only once a year and must gather enormous amounts of data from one large sample. As noted above, many of the questionnaire items are asked of only subsamples of the total survey sample. Despite this careful effort, the questionnaire is nevertheless daunting. The introductory boilerplate is quite formal and, for some survey participants, it may take 2 minutes or more before the first question is asked. Also, within the body of the questionnaire, many of the questions are both lengthy and conditional. For example, question Q545A consists of 66 words and asks survey participants to speculate on how military enlistment as a whole might change if a particular change were made in the terms of service.

Changes in design should be guided by the analytic goals and models established by DoD and its contractors who study and use YATS data. The questions asked and the grouping of the questions should be consistent with these analytic goals.

We recommend that DoD consider improving YATS methodology by implementing changes in data collection and sampling techniques. We understand that DoD is already planning to administer the YATS surveys more frequently to improve the timeliness and responsiveness of results. This change provides opportunities for other methodological changes that could enhance the usefulness of the YATS data. For example, while every YATS survey will probably contain a common core of questions, especially those relating to propensity to enlist and various background questions, other questions do not need to be asked at every survey administration. This approach would enable gathering complete data from all respondents on the block of items in question without placing an undue time burden on respondents. It is likely that certain types of information, particularly attitude and value items, do not need to be assessed on a continuous basis. We recommend that DoD consider alternating blocks of in-depth questions across administrations, so that certain attitude items (e.g., life goals) are assessed in one or two administrations during the year but not in others.

There are other methodological issues that might arise from more frequent (e.g., monthly) administrations of YATS. For example, sample sizes will probably be smaller, thereby introducing greater sampling error, especially when studying subgroups defined by demographic characteristics, such as race, gender, and age. Some of these problems can be overcome with careful nonproportional sampling designs (i.e., oversampling some demographic groups), and some consideration might be given to somewhat longer survey intervals with larger sample sizes (e.g., every 2 months). Another possibility is to conduct monthly surveys of time-sensitive information (propensity, advertising-related questions) and less frequent surveys (yearly or biannual) with larger samples for items that are less time sensitive (general attitudes and values, views of the military, etc.).
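A minimal sketch of the nonproportional-sampling idea follows, with assumed population shares and propensity rates: oversampling a small subgroup stabilizes its estimate, and population-share weights then recover an unbiased overall figure:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical population: group B is a small demographic group with a
# higher true propensity rate (all shares and rates are assumptions).
pop_share = {"A": 0.9, "B": 0.1}
true_rate = {"A": 0.10, "B": 0.30}

# Nonproportional design: give group B half of the interviews so its
# subgroup estimate is stable despite its small population share.
sample = {g: rng.binomial(1, true_rate[g], 500) for g in ("A", "B")}

# Post-stratified overall estimate: weight each subgroup mean by its
# population share to undo the oversampling.
overall = sum(pop_share[g] * sample[g].mean() for g in ("A", "B"))

# The naive pooled mean would be biased toward group B's higher rate.
naive = np.concatenate([sample["A"], sample["B"]]).mean()
print(round(overall, 3), round(naive, 3))
```

The design choice here is the usual survey trade-off: interviews are allocated to where subgroup precision is needed, and design weights restore population-level estimates.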
We recommend that whenever a survey is designed to partition the questions so that not all questions are asked of all respondents, consideration should be given to randomly assigning interrelated blocks of information to the same subgroups, such as asking one subgroup all life/career goal questions, another subgroup the slogan recognition questions, another subgroup the Service-specific questions, and so forth.[2] Consideration should also be given to maintaining sufficient sample size and content within a block of relevant questions so that multivariate analysis can be conducted without serious missing data problems.

[2] Each respondent should be given multiple blocks of items, with each block paired with every other block for some subset of respondents, to allow cross-block comparisons.

A Portfolio of Surveys

We recommend that a portfolio of surveys at different time intervals replace the current annual YATS administration. A variety of approaches might be considered. One approach, continuous tracking, offers the benefit of revealing month-to-month changes that might be related to changes in program activities by the military and to related changes in economic and social conditions. It would provide more timely data that could be used analytically in an attempt to better understand the effects of various information campaigns and communication techniques. Although the week-to-week sample sizes would be smaller for continuous tracking, the total sample size would build over time and enable useful tracking of appropriate moving averages for key variables. For some issues, specific studies could be designed to provide detailed perspectives on the interests, motives, role models, and perceptions of American youth as they relate to their life choices and the possibility of choosing military service. For other issues, one-on-one in-depth interviews about the tradeoffs seen in various career and life choices could be used to reveal the specific criteria used to make choices and the value structure underlying those choices.

7a. We recommend that DoD consider using a continuous tracking survey methodology for such issues as propensity to enlist, advertising awareness, awareness of direct response campaigns, involvement in high school activities, and perceptions of the military.

7b. We recommend that DoD consider conducting specific national surveys every 2 years on such issues as values relating to careers, family life, consumption, lifestyle, leisure, education, interest in information technology, and public service.

7c. We recommend that DoD consider conducting specific, smaller-scale studies as needed to examine issues such as trade-off analyses of specific "offers" with respect to combinations of terms of service, educational benefits, and pay, among other things. Further, small-scale studies are recommended to examine (1) postponement of focus or commitment and a sense of direction in life, and (2) perspectives on family life and community involvement.

7d. We recommend that DoD consider using in-depth qualitative studies to offer insight into the decision-making processes of potential recruits.
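The moving-average idea behind continuous tracking can be sketched as follows, using an assumed weekly wave size and an invented trend; the smoothed series pools small weekly samples into a more stable estimate:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical continuous tracking: 52 weekly waves of about 200
# interviews each, with a slowly drifting true propensity rate.
weeks = 52
true_rate = 0.12 + 0.03 * np.sin(2 * np.pi * np.arange(weeks) / weeks)
weekly_estimate = rng.binomial(200, true_rate) / 200

# A 12-week moving average pools the small weekly samples into a
# smoother trend line, trading some timeliness for stability.
window = 12
moving_avg = np.convolve(weekly_estimate, np.ones(window) / window,
                         mode="valid")
print(len(moving_avg))  # 52 - 12 + 1 = 41 smoothed points
```

The window length is the tunable trade-off: a longer window suppresses more sampling noise but responds more slowly to real shifts in propensity, such as those following an advertising campaign.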
Finally, communication from the military reaches today's youth in a larger context, and it would be useful to understand the other voices (e.g., universities, corporations) and to examine how communication from the military is viewed in that larger context. Although other organizations operate in other markets and decision-making realms, military communication is competing for attention and will inevitably be seen by youth as fitting or not fitting the contemporary context. We recommend that DoD consider examining communication strategies through a specific study of how a variety of organizations communicate with youth.

Item Content

The committee examined the content of the YATS survey to identify areas in which additional useful information might be collected. We understand that the overall length of the survey and the attendant administration costs are important considerations; therefore, we addressed strategies for deleting as well as adding items. We recommend that linkages to the propensity to enlist established by prior analyses be used as a basis for retaining or deleting items.[3]

[3] Some items with no link to propensity might be retained if they provide information about targeted groups and methods to facilitate communication with them.

To support the development and selection of alternative communication strategies, we recommend that DoD expand the coverage of (1) perspectives on service to the country, (2) understanding of the mission of the military, and (3) life values and motives. Since college is a major competitor to the military, we recommend that DoD consider including questions comparing the benefits attributed to the military and to a college education. We recommend that items involving attribute importance (i.e., the importance of traits associated with military or civilian jobs) be reconsidered, because an importance rating (how important a trait is to a person) provides no information on whether the attribute is viewed as positive or negative.

We recommend that DoD review the content of current survey items and consider adding questions that identify the characteristics of three career choices (military, work, and education) and measure youths' evaluations of these characteristics. For example, questions could be added on total compensation (e.g., pay, health benefits, 401K plans, stock options, value of education and training), pay-for-performance plans, recognition programs, currency of technology, job security, career progression, work/family balance, etc.
Accessibility of YATS Data

Accessibility and timeliness are especially important in view of the more frequent survey intervals; much of the advantage of more frequent surveys will be lost if the information is not made available more quickly to researchers and policy makers. Since YATS interviewing is conducted using CATI techniques, the raw data are available in computer form as soon as the interviews are completed. After a reasonable period for validity checks and creation of summary variables, DoD should make the database available on-line so that interested users can retrieve information, such as summary statistics, simple tabulations, and cross-tabulations. In addition, it would be especially useful to let users retrieve information from the current survey and compare it with similar information from earlier surveys. Although the software for such comparisons will be somewhat more involved than that needed to provide access to a single survey, this capability seems especially important in view of the more frequent periods of administration.

We recommend that DoD consider doing more to improve the accessibility of YATS data to policy makers and their advisers, thereby improving the timeliness of the information. We recommend that DoD consider making the YATS and related databases available on-line, so that people with proper clearance could access the database for basic results. DoD should also consider being more proactive in making YATS and related data available to the research community. In doing so, confidentiality needs to be ensured.

We appreciate the opportunity to examine the YATS methodology and analyses and hope that our recommendations provide useful input to your efforts to improve YATS. We would be pleased to discuss these recommendations with you and your staff at your convenience.
Sincerely yours,

Paul Sackett, Chair
Committee on the Youth Population and Military Recruitment

Attachments: Statement of Task and Membership Roster

REFERENCES

Orvis, B.R., N. Sastry, and L.L. McDonald
1996 Military Recruiting Outlook: Recent Trends in Enlistment Propensity and Conversion of Potential Enlisted Supply. Santa Monica, CA: RAND.

Wilson, M.J., J.B. Greenlees, T. Hagerty, D.W. Hintze, and J.D. Lehnus
2000 Youth Attitude Tracking Study 1998: Propensity and Advertising Report. Arlington, VA: Defense Manpower Data Center.