
Youth Employment and Training Programs: The YEDPA Years (1985)

Chapter: Commissioned Papers: Knowledge Development Under the Youth Employment and Demonstration Projects Act, 1977-1981

Suggested Citation: "Commissioned Papers: Knowledge Development Under the Youth Employment and Demonstration Projects Act, 1977-1981." National Research Council. 1985. Youth Employment and Training Programs: The YEDPA Years. Washington, DC: The National Academies Press. doi: 10.17226/613.

COMMISSIONED PAPERS

Knowledge Development Under the Youth Employment and Demonstration Projects Act, 1977-1981

Richard F. Elmore*

INTRODUCTION

In July and August of 1977, Congress passed and President Carter signed the Youth Employment and Demonstration Projects Act (YEDPA). The law substantially increased authorizations for two existing youth employment programs, the Job Corps and the Summer Youth Employment Program (SYEP). It added three new programs: Youth Community Conservation and Improvement Projects (YCCIP), the Youth Employment and Training Programs (YETP), and the Young Adult Conservation Corps (YACC). It also authorized a large-scale demonstration of strategies designed to encourage high-risk youths to stay in school, using guaranteed work as an incentive--the Youth Incentive Entitlement Pilot Projects (YIEPP). (Table 1 summarizes the target groups and activities included within these programs.)

In the fiscal year immediately prior to the passage of YEDPA, federal outlays for youth employment programs were about $955 million (Hahn, 1979). Over the next four fiscal years, 1978-1981, about $8 billion was spent on programs and about $500 million on research and development addressed to youth employment, serving about 1.5 million youths annually (see Tables 2 and 3). YEDPA was administered by a newly created Office of Youth Programs (OYP), which was located in the Employment and Training Administration (ETA) of the U.S. Department of Labor (DOL) and which relied on a large number of independent contractors, as well as state, local, and other federal agencies.

Included in the legislation authorizing YEDPA was a broad charge "to explore methods of dealing with the structural unemployment problems of the Nation's youth" and "to test the relative efficacy of different ways of dealing with these problems in different local contexts."

*Richard F. Elmore is at the Graduate School of Public Affairs, University of Washington.

TABLE 1 Federal Youth Employment Programs, 1977-1981

Job Corps
  Target group: 16-21; economically disadvantaged
  Activity: Residential centers; education, skill training, work experience, counseling, health care; stipends; centers administered in cooperation with other federal agencies, state and local governments, and non-profit organizations

Summer Youth Employment Program (SYEP)
  Target group: 14-21; economically disadvantaged
  Activity: Work in public or private, nonprofit agencies; some educational enrichment

Youth Community Conservation and Improvement Projects (YCCIP)
  Target group: 16-21; unemployed; preference to economically disadvantaged
  Activity: Work in community-planned public projects

Youth Incentive Entitlement Pilot Projects (YIEPP)
  Target group: 16-19; economically disadvantaged; 17 selected sites--7 large cities, 10 smaller cities
  Activity: Two-year demonstration; guarantee of minimum wage, part-time work during school year, full-time work during summer; contingent on satisfactory performance in school and work

Young Adult Conservation Corps (YACC)
  Target group: 16-23; unemployed
  Activity: Work for up to 1 year on conservation, public projects; by statute, 70 percent administered through interagency agreements with Departments of Interior and Agriculture, 30 percent through formula to states

Youth Employment and Training Programs (YETP)
  Target group: 14-21; at least 85 percent economically disadvantaged
  Activity: Classroom or on-the-job training, work experience, pre-employment skills; administered through local prime sponsors; 22 percent set-aside for cooperative programs with local educational agencies

TABLE 2 Outlays for Federal Youth Employment Programs, Fiscal 1978-1981 (in millions of dollars)

Program      1978    1979    1980    1981
Job Corps     280     380     470     465
SYEP          670     660     721     769
YCCIP          61     103     122       0
YIEPP        (March 1978-August 1980: 218)
YACC          139     273     234     174
YETP          294     556     695     719

SOURCE: Data from U.S. Department of Labor (1980a, 1981, 1982).

This charge was backed by substantial discretionary authority and money, granted to the secretary of labor and delegated to the Office of Youth Programs, to conduct research, demonstration, and evaluation activities around the structural unemployment problems of youths. This effort was described with the arresting phrase "knowledge development."

The youth employment knowledge development effort was remarkable in many respects: It was one of the largest short-term investments in social research and development ever undertaken by the federal government. Its scale and complexity dwarfed any research and development effort undertaken by the Department of Labor before or since. It generated a large body of research, evaluation, and practical knowledge, which is only now being sorted, assessed, and assimilated into policy and practice. It coincided with a sharp surge in political attention to youth employment problems, creating many opportunities for connecting research with policy. And it galvanized a broad-based national constituency of policy makers, researchers, and local practitioners, at least for a short time, around the problems of youth employment. These features argue for a retrospective look at the process.

This paper analyzes the conduct of the knowledge development effort, from its origins in congressional decision making, to its design, implementation, and results. The paper addresses five main questions: (1) What congressional and executive expectations shaped the knowledge development process? (2) How was the knowledge development process designed? (3) How was the process organized and managed? (4) How has the process influenced policy and practice in youth employment? (5) What lessons can be learned from the process that might shape future large-scale research and development efforts in social policy?

TABLE 3 Participation in Federal Youth Employment Programs, Fiscal 1978-1981 (headcounts)

Program      1978        1979      1980      1981
Job Corps    72,000      85,000    103,800   114,400
SYEP         1,009,300   888,000   800,000   774,000
YCCIP        28,700      38,500    43,000    38,400
YIEPP        (March 1978-August 1980: 72,000)
YACC         51,900      67,200    68,000    68,000
YETP         359,200     413,600   463,000   393,700

SOURCE: Data from U.S. Department of Labor (1980a, 1981, 1982).

Because the knowledge development effort was vast and the scope of this analysis is modest, the answers to these questions are necessarily tentative and incomplete. But, even with these limitations, the paper provides an occasion to examine the broader consequences of large-scale investments in social research and development. It also provides a political and organizational complement to more methodologically oriented reviews of the evidence.

The YEDPA knowledge development effort raises questions that have run through federal investments in social research and development since at least the mid-1960s. Prior to that time--during the New Deal, for example--the unspoken assumption was that public money spent on remediation of social problems was effective simply by having been spent. In the 1930s, the National Youth Administration's employment programs and the Civilian Conservation Corps were assumed to have accomplished their purposes when federal money was transferred through public employment to unemployed young people. In the 1960s, this view began to shift markedly. It was no longer adequate justification for public expenditures simply to pass money from the government to individuals who needed it; public expenditures had, in some way, to contribute to the long-term solution of basic structural problems in society--poverty, unemployment, crime, delinquency, and the like. Some argued that even this was not sufficient justification, proposing instead that expenditures be based on comparisons of their net returns to society, rather than just on their relationship to social problems. This shift in perspective coincided with a marked increase in federal social expenditures and at least a five-fold increase in federal expenditures on social research and development (Rein and White, 1977).

The dominant theme of social research and development in the 1960s and 1970s was the seemingly straightforward matter of "finding out what works" (Rivlin, 1971). The dominant analytic model was a combination of experimental method and economic analysis. Experimental method, which in its most rigorous form prescribes pre- and post-measurement coupled with random assignment of subjects to treatment and control groups, would provide the means of disentangling the effects of social programs from the effects of various other factors. Economic analysis would provide the means of attributing value to program effects and of assessing their net social benefit.
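The experimental/analytic model just described reduces, at its core, to a simple computation. The sketch below is a hypothetical illustration, not drawn from the paper or from any YEDPA evaluation: the earnings distribution, the true effect, the program cost, and the helper post_program_earnings are all invented for the example.

```python
import random
from statistics import mean

# Hypothetical sketch of the experimental/analytic model described above:
# random assignment, a difference-in-means effect estimate, and a
# net-benefit calculation. All figures are invented.
random.seed(1)

def post_program_earnings(treated: bool) -> float:
    """Simulate one youth's post-program annual earnings (synthetic)."""
    baseline = random.gauss(6000, 1500)   # the "various other factors"
    true_effect = 400 if treated else 0   # assumed true program effect
    return baseline + true_effect

# Random assignment of 2,000 applicants to treatment and control groups.
applicants = list(range(2000))
random.shuffle(applicants)
treatment, control = applicants[:1000], applicants[1000:]

treated_mean = mean(post_program_earnings(True) for _ in treatment)
control_mean = mean(post_program_earnings(False) for _ in control)

# Because assignment is random, the difference in group means estimates
# the program's effect, disentangled from other influences on earnings.
estimated_effect = treated_mean - control_mean

# The economic-analysis step: weigh the estimated effect against cost.
cost_per_participant = 350                # invented program cost
net_benefit = estimated_effect - cost_per_participant
print(f"Estimated effect: ${estimated_effect:,.0f} per participant")
print(f"Net benefit: ${net_benefit:,.0f} per participant")
```

Nothing in the sketch depends on the particular numbers; the point is the division of labor the text describes, with randomization isolating the effect and economic analysis attaching a value to it.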

As the custodian of expertise and money in this effort, "the federal government should take the leadership in organizing, funding, and evaluating systematic experiments with various ways of delivering education, health, and other social services . . . trying out new methods in various places under various conditions" (Rivlin, 1971:86-87). The evidence emerging from these systematic experiments would inform public policy decisions. The underlying assumptions were that (1) knowledge of effects and net benefits was a key determinant of public policy decisions; (2) systematic knowledge, when marshaled in support of decisions, would be used by policy makers; and (3) better knowledge meant better decisions and more value to society from social expenditures.

This analytic model produced some notable successes--well-conceived and well-implemented experiments in income maintenance, health insurance, and housing subsidies--and some notable embarrassments--vaguely conceived and erratically implemented experiments in educational vouchers, compensatory education, and educational performance contracting, for example. The analytic model also began to find its way into federal evaluation requirements that accompanied categorical grants and federally sponsored research and demonstration activities under the umbrella of operating programs. In education, for example, a national Dissemination Review Panel was established to review evaluations of exemplary programs for methodological rigor and results and to "validate" those programs for broad-scale dissemination. Parallel patterns developed in delinquency prevention, employment, and mental health.

This analytic model had no sooner become a fixture of federal policy than experience began to surface a number of problems:

· Timing. Social experiments and evaluations, especially the well-conceived ones, took so long to produce results that they usually answered questions no longer being asked by policy makers. The demands of the political process were out of synch with what the experimental method could produce.

· The Nature of the Treatment. Most of the notable successes with social experimentation were with policies involving relatively simple cash transfers (income maintenance, health insurance, and housing subsidies). Most of the notable failures were with policies involving complex changes in existing organizations, the creation of new organizations, or the delivery of highly individualized services.

· Organizational Complexity. The large-scale accumulation of knowledge about social problems turned out to require orchestrating competing political demands, marshaling money and expertise behind policy questions, and constructing organizations to deliver services and do research. The skills necessary for these activities were more akin to the skills of management than the skills of scientific inquiry. People who have the required management skills do not necessarily have the skills, interest, or commitment to scientific inquiry, and vice versa.

· Implementation. Complex interventions have to be implemented before they can be tested. Implementation requires skill and commitment from people whose main interest is in delivering services, not in "finding out what works." Implementation also requires organizational and administrative capacity--people and institutions who are ready to apply their practical knowledge to the problems raised by policy makers. These practical concerns often came as a shock to social scientists whose main concerns were methodological and theoretical.

· Variability and Robustness. The cash-transfer experiments seemed to produce findings that were robust from one setting to another, if not from one experiment to the next. Evaluations and experiments requiring complex organizational solutions and individualized services produced findings that were highly sensitive to setting--differences among sites were consistently greater than differences among treatments across sites (see the sketch following this list).

· Small, Ambiguous Effects. The social interventions of the mid-1960s were justified politically in rhetoric that suggested broad-scale reform of society. The actual results of experiments and program evaluations showed small, often inconclusive effects. The interventions worked with some groups and not others; the effects were sometimes so small as to be of questionable practical significance; important sources of variation (site-to-site differences, for example) were often not included in the design of evaluations; and effects often did not persist over time.

· Methodological Uncertainty. Better-designed, more rigorous, more analytically sophisticated experiments and evaluations did not reduce the uncertainty and conflict surrounding policy decisions. Indeed, they often aggravated it. Serious discussions of important policy questions often got sidetracked into arcane debates over methodological decisions, analytic assumptions, and statistical techniques, leaving the intended consumers of the results confused. The most frequent conclusions of policy research were recommendations for more research. The research community seemed reluctant to apply the same benefit-cost calculus to its own work that it applied to social policy.

· Conflict Between Science and Practice. As the application of social science to social policy making proceeded, the breach widened between people who delivered services, on the one hand, and the people who conducted experiments and evaluations, on the other. Practitioners argued that the quantitative findings of rigorous research and evaluation were too abstract to be of any practical use, too insensitive to practical problems, and that experimentation and evaluation were expensive ornaments hung on social programs for the benefit of social scientists. Social scientists argued that, without a scientific basis, practice could not be justified to the public and that resistance to systematic analysis stemmed from the professional's usual hostility to external scrutiny.
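The variability-and-robustness problem lends itself to a small numerical illustration. The simulation below is hypothetical, in the same synthetic terms as the earlier sketch (site baselines, sample sizes, and the treatment effect are all invented); it shows how site-to-site differences can dwarf the treatment-control difference within any one site, even when the treatment "works" identically everywhere.

```python
import random
from statistics import mean

# Hypothetical illustration of the variability/robustness problem: the same
# treatment, applied across sites whose local conditions differ, produces
# outcome levels that vary far more between sites than the treatment-control
# gap within any site. All figures are invented.
random.seed(2)

TRUE_EFFECT = 300      # assumed within-site effect of the program
SITE_SPREAD = 900      # spread of site-level baselines (local labor markets)

site_gaps, site_controls = [], []
for site in range(8):
    baseline = random.gauss(6000, SITE_SPREAD)
    control = mean(random.gauss(baseline, 1500) for _ in range(250))
    treated = mean(random.gauss(baseline + TRUE_EFFECT, 1500) for _ in range(250))
    site_controls.append(control)
    site_gaps.append(treated - control)
    print(f"site {site}: control mean {control:8,.0f}   treatment gap {treated - control:+6,.0f}")

print(f"spread of control means across sites: {max(site_controls) - min(site_controls):,.0f}")
print(f"spread of treatment gaps across sites: {max(site_gaps) - min(site_gaps):,.0f}")
```

In the evaluators' terms, the between-site spread in outcome levels comes out several times larger than the spread in treatment effects, which is the pattern the bullet above describes.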

It was impossible to engage in large-scale policy research, experimentation, or evaluation in the 1970s--or the 1980s, for that matter--without confronting these problems in one form or another. They were part of the baggage of systematic inquiry in the service of policy making.

Out of these misgivings there began to emerge a different, more tempered view of the connection among systematic inquiry, policy, and practice. The utility of experimental method and economic analysis came to be defined in narrower terms. Rigorous social experimentation required relatively strong theory, analytic skill, time, money, and organizational capacity--conditions that could not be met in all instances. Social scientists began to acknowledge an explicit tradeoff between internal validity (the ability to distinguish treatment effects) and external validity (the ability to generalize effects beyond an experiment). The degree of experimental control required for a precise estimate of effects was to some degree inconsistent with the ability to transfer the treatment from an experimental setting to a practical operating environment. Policy analysts began to speak with more respect about the "ordinary knowledge" (Lindblom and Cohen, 1979), or practical understanding, necessary to make complex decisions and to get from an analytic result to a prescription for action. Views about the relationship among systematic inquiry, policy, and operating decisions became more elaborate and less hard-edged.

Systematic inquiry, even when it met rigorous methodological standards, was rarely brought to bear on clearly specified decisions--legislative, budgetary, or administrative. But systematic inquiry did have longer-term, more diffuse effects on the conventional wisdom that policy makers used to define problems, on the way organizations were structured, on the directions administrators pushed their organizations, and on the way practitioners handled day-to-day problems in providing services. The shifts, in other words, were less a repudiation of the experimental/analytic model and more a domestication of it to political, organizational, and operating realities.

The Department of Labor had, by the mid-1970s, accumulated considerable capacity and experience in economic analysis and evaluation, although its experience with large-scale experimentation was more limited. The department's analytic functions in the employment and training area were the responsibility of the Office of the Assistant Secretary for Policy Evaluation and Research (ASPER) and, within the Employment and Training Administration, the Office of Policy Evaluation and Research (OPER). The Policy Evaluation and Research budget of DOL was consistently around $35 million a year between 1976 and 1980; over $20 million was in earmarked appropriations and about $15 million in discretionary funds (apart from YEDPA). The varied collection of state and local government agencies and community-based organizations that delivered employment and training services under the Comprehensive Employment and Training Act (CETA) had become acclimated to a relatively high level of federally required evaluation, but no less resistant to its costs and inconveniences. An array of external research and evaluation organizations had developed around DOL-sponsored evaluations, as well as a large array of university-based research organizations. Not much of this capacity for research and analysis, however, was focused specifically on youth employment--a matter that would become important with the passage of YEDPA.

The youth employment knowledge development effort commenced at a time, then, when federal investment in analysis, research, and evaluation related to employment had been relatively high for a number of years, when methodological and analytic sophistication were on the rise, when major uncertainties were surfacing about the role of systematic inquiry in the service of policy making, and when the administrative structure for employment programs had become acclimated to, if not totally accepting of, evaluation. The uncertainties that characterized policy analysis, research, and evaluation generally at that time were necessarily part of any effort to apply systematic inquiry to the youth employment problem. Among the questions growing out of this larger context are the following:

· What constitutes "useful" knowledge? Is the utility of knowledge, and hence the value gained from investment in systematic inquiry, to be judged in strictly methodological and quantitative terms--that is, are "useful" results measures of specific impacts to which no alternative causal explanation can be offered under the methodological conventions of social science? Or is the utility of results more a matter of practical use--that is, "useful" results are those that are perceived to be helpful in solving political, administrative, and practical problems?

· What should be the relationship between the delivery of services and the discovery of effects? Is it the primary responsibility of government agencies to deliver services to people, consistent with the political decisions of elected officials? Or is their primary responsibility to "find out what works," consistent with the economic criterion of positive net benefit? Is it possible to accommodate the delivery of services and the measurement of effects within a single organizational structure? Should the delivery of services be constrained by the methodological conditions necessary to identify effects, or should methodological conditions be constrained by the practical necessities of delivering services?

· What are the political and organizational correlates of successful accumulation of knowledge? If the accumulation of knowledge about social problems requires orchestrating competing political demands, marshaling money and expertise behind policy questions, and constructing organizations to deliver services and do research, then how do we distinguish between better and worse ways of doing these things?

· What payoffs should we expect from large-scale research, demonstration, and evaluation activities? Should the payoffs be on the order of "resolutions to the problem of youth unemployment"? Or is it sufficient to offer solutions, on the order of "ways to reduce the high school dropout rate" or "ways to impart employment skills," that offer constructive solutions to practical problems, but little hope of solving the overall problem?

The analysis that follows is divided into five main sections: (1) expectations about knowledge development on the part of Congress and the executive branch; (2) design of the knowledge development effort; (3) organization and management of the effort; (4) influence of the effort on policy and program; and (5) guidance for the future that might be gained from the knowledge development effort.

POLITICAL EXPECTATIONS

Shortly after the inauguration of President Carter in January 1977, several representatives of the new administration were summoned to a meeting on the Senate side of the Capitol Building. The Carter appointees were Bill Spring, from the White House Domestic Policy Staff; Ray Marshall, secretary of labor; Ernest Green, assistant secretary for employment and training; and Nik Edes, deputy undersecretary for legislative and intergovernmental affairs at the Labor Department. From the Senate was assembled a rare array of senior members. Among the Democrats were Henry Jackson, Washington; Hubert Humphrey, Minnesota; Edward Kennedy, Massachusetts; Alan Cranston, California; Harrison Williams, New Jersey; Gaylord Nelson, Wisconsin; and Jennings Randolph, West Virginia. Among the Republicans were Jacob Javits, New York, and Robert Stafford, Vermont.

According to Nik Edes and Richard Johnson, then staff director of Nelson's Senate Employment Subcommittee, the Senators delivered a simple message to the Carter appointees: A youth employment bill would be introduced in the Senate immediately, with or without administration support. The administration could collaborate or be left behind.

The reasons for the pressure were political. According to Johnson, "There were youth proposals coming from all over the Senate; they could have gone to [the] public works, interior, or labor [committees]. Javits sensed that the whole thing was about to come apart. Whichever committee got to the floor first with a proposal would get credit. He decided it was time to call a meeting. He told the administration, 'We're about to produce a piece of legislation. If you want in, now is the time'." Spring, recently transplanted from the Senate Labor and Public Welfare Committee staff to the White House, recalls, "It was a rescue effort by the Labor and Public Welfare Committee to maintain its jurisdiction. Javits [ranking member of the Labor and Public Welfare Committee] saw that Jackson [chair of the Interior Committee] and Randolph [chair of the Public Works Committee] were about to move, and understood that if something didn't happen quickly they were going to lose it."

Within weeks, Edes, Johnson, and Spring had drafted a proposal incorporating the special programs of the key Senators. This particular selection of staff was not accidental. Edes and Spring, representing the Carter administration, had only weeks before been members of the Senate staff--Edes working for Senator Williams, chair of Labor and Public Welfare, and Spring working for Senator Nelson, chair of the Employment Subcommittee. According to Spring, "It was the old Senate staff pulling together around an issue. There really wasn't an administration position, because the Carter White House hadn't gotten organized."

Congressional Perspective

According to Johnson, "As far as the Senators were concerned, the ideas were simple. You needed to have better cooperation between the schools and the employment and training system. You needed to do something about dropouts. You needed to provide opportunities for kids in trouble with the schools to do useful, productive work and prepare themselves for employment."

Jackson, Humphrey, and Randolph came of age politically in the New Deal. Their ideas about what young people needed were consistent with the New Deal view of employment in the service of conservation and public works. YACC and YCCIP were manifestations of this view. Nelson and Williams had a large political stake in maintaining their committee's jurisdiction over employment policy and assuring that the federal employment and training structure provided adequate access for youths. YETP was the solution to that problem. Javits's special interest was in the connection between the schools and the employment and training system. On the strength of Javits's interest, a provision was drafted requiring that a proportion of YETP funds (originally 15 percent, later 22 percent) be allocated to projects jointly involving local education agencies (LEAs) and CETA prime sponsors.

For the Carter administration the top domestic priority was dealing with persistent inflation and rising unemployment. Youth employment, per se, was not part of their early agenda. As one congressional staff member said, "They didn't have any hip-pocket proposals on youth employment coming out of the transition, so it was relatively easy for them to buy into whatever the Senate had to offer." On January 31, 1977, Carter proposed a $20 billion emergency economic stimulus package, composed of supplemental budget requests for fiscal 1977, to cover the 18-month period from April 1977 to September 1978. The package contained an $8 billion addition to public service employment (PSE), $4 billion for public works jobs, over $5.5 billion in aid for local governments, and $1.5 billion for unspecified youth employment programs.

The administration's original intent was to implement its youth program administratively, without new legislative authority. Senate aide Richard Johnson said, "We told them, 'You can't do that on Capitol Hill, legislators want to pass legislation and get some visibility'." So on March 9, the administration followed with a youth employment message, containing a proposal that had been worked out jointly with the Senate. It requested authority for three new youth programs--YACC, YCCIP, YETP; it provided a set-aside for joint school-prime sponsor projects; and it provided that half the YETP funds would be distributed by formula to prime sponsors and the other half used to fund "innovative and experimental" programs at the discretion of the secretary of labor or his designee.

Explaining the purpose of the discretionary funding, Richard Johnson argued, "We [the Senate] had always been inclined to put rather generous discretionary funding into the employment programs because we recognized that the formulas for distributing money sometimes resulted in problems getting the money to the right constituencies." For the Senate, in other words, discretionary funding was a way of adjusting formula-funded programs to special geographical, political, or organizational interests.

In retrospect, according to DOL's Nik Edes, "leaving the House out of the early negotiations was a major tactical error." The administration's affinity for working with the Senate was understandable. Two key actors for the administration, Edes and Spring, were former Senate staff. Also, according to Spring, "The House got left out because the internal politics of the Senate were so delicate we were afraid we'd lose the whole thing if we tried to accommodate the interests of House members too."

When the House got wind of a Senate-administration proposal, they decided to move on their own youth employment bill. The late Carl Perkins, chair of the House Education and Labor Committee, and Augustus Hawkins, newly designated chair of the Employment Subcommittee, introduced the administration's youth employment bill on April 6 and then developed an alternative proposal. According to a member of the House staff, "The White House didn't have a lot of experience in these things; they said, 'Wait a minute, you can't develop your own bill; we already have a bill in.' We went ahead with our own proposal."

The Senate and House approaches differed in several respects. First, whereas the Senate proposal created new programs focused on youths, the House proposal amended Title III of CETA, a general grant of discretionary authority to the secretary of labor for research and demonstration projects. The Senate saw itself as initiating new, more or less permanent, youth employment programs. The House, by contrast, saw itself as initiating demonstration projects that would form the basis for later, more permanent programs. In a House staff member's words, "our philosophy was 'let a thousand flowers bloom,' and then come back after we'd had some experience and decide what was promising."

Another key Senate-House difference was the House's Youth Incentive Entitlement Pilot Projects (YIEPP). YIEPP was important politically to the House proposal because it originated from the Republican side. According to Nat Semple, minority staff to the House Employment Subcommittee, the idea had its origins long before 1977. "Marvin Esch [former Republican member from Michigan] liked to think in big terms. He had kids of his own and he was concerned about how to get kids to stay in school and how to get schools to respond to kids who might not be the greatest academically. We had a long discussion one evening after work over hamburgers and beer at the Hawk and Dove [a Capitol Hill eatery] and I started sketching out the ideas for entitlement on the tablecloth. The problem was how to get the Republicans to buy off on some kind of a job guarantee. We struck on the idea of a contract. The kids would have to agree to stay in school in return for a job. There would be rules and structure. We weren't offering something for nothing. The whole idea was basically very conservative: keep kids in school, promote the work ethic, make kids contract for benefits, etc. It had a lot of appeal to the Republicans." Esch ran for the Senate from Michigan in 1976 and lost. Semple took the entitlement proposal to Ronald Sarasin, moderate Republican from Connecticut, and Sarasin agreed to sponsor it.

A common element of all proposals was an initial 1-year authorization. The entire CETA package was due to expire in 1978. Everyone anticipated that youth employment would be integrated into CETA when the 1978 reauthorization occurred. Rather than putting youth programs on a different authorization schedule than the rest of CETA, there was substantial agreement that the new youth programs should be authorized for 1 year and then taken up again in 1978 with the reauthorization of CETA. In explaining why the proposal was couched in the language of demonstration projects, rather than new programs, the House Committee Report said, "The Committee approach allows for learning as much as we can in order that when CETA is reauthorized next year, the Committee will have a better idea as to what type (or types) of program(s) actually work" (U.S. Congress, House Committee on Education and Labor, 1977:4, also 9; hereafter House Report).

While the Senate version stressed bold, New-Deal-like solutions to the problems of youth employment, the hallmark of the House version was "learning what works." At several points in its report on the youth bill, the House Committee referred to the uncertainty of expert opinion about youth employment (House Report, 1977):

Of all the witnesses that appeared before the Committee, not one had a definitive answer as to what would solve the problem of chronic youth unemployment. All agreed that a variety of methods should be tested and the educational system should be linked with whatever approach is finally agreed upon.

But if the committee was emphatic about "finding out what works," it was for the most part strategically vague about what that meant. YIEPP was the most clearly specified of the House proposals, and it left considerable ambiguity. The committee's advice about what it meant by "finding out what works" was couched in the following terms (House Report, 1977):

In placing a major emphasis . . . on innovative and demonstration programs, the Committee intends that a broad range of activities be tested . . . to learn what works to remedy the structural nature of the youth employment problem and to meet the employment and training needs of specific target groups in the youth population. These activities include outreach, counseling, activities promoting education to work transition, labor market information, attainment of high school equivalency, job sampling, on-the-job training, supportive services, programs to overcome sex-stereotyping in job development and placement, outreach programs for minorities and women and other activities designed to improve the employability of youth.

This laundry list was indicative of the uncertainty that characterized both expert opinion and political judgment about the youth employment problem and its solutions in 1977 (see Hahn, 1979).

There seemed to be consensus that the youth unemployment rate was too high, but little agreement on what an acceptable rate would be. There was consensus that unemployment was borne disproportionately by minority, especially black, youths, but little understanding of the relative importance of skills, basic education, family background, and discrimination in predicting minority youth unemployment. There was consensus that the time was ripe for political action, but little confidence in past solutions to youth unemployment and little specific agreement on what would constitute success.

Asked whether a more detailed analysis of the youth employment problem and its solutions might usefully have preceded a multi-billion dollar demonstration effort, one congressional staff member replied, "Are you kidding? When you get the kind of political weight behind a proposal that this one had--Jackson, Humphrey, Randolph, Javits, Hawkins--you don't say, 'Give us a couple of years and we'll come up with a proposal.' You move. Right now! You go with what you have and try to make sense of it as you go along." Bill Spring, a veteran of employment legislation as a Senate staff member, observed, "We were coming out of a period extending from the old S-1 [a federal manpower bill] in 1960, through the Economic Opportunity Act of 1965, up to CETA, in which we had spent huge sums of money on work experience for the unemployed and disadvantaged." The evidence was now pretty clear that work experience had the smallest impact of anything that had been tried. But it wasn't clear what would work better.

The House's uncertainty was well founded. Nat Semple had a further explanation of how the youth proposals took shape. "When you ask most adults what to do about any problem with young people, they generalize from their own experience and from what they think is good for kids. Congressmen and Senators are no different. Some of them thought kids ought to be out working up a healthy sweat in the country, some thought they ought to be doing useful public deeds around town, some thought they ought to be staying in school, some thought they ought to be getting useful training to prepare them for jobs. The bill was an amalgamation of all the adult ideas about what's good for kids."

In the end, the Senate conceded reluctantly to the House's demonstration approach. In the Conference Report, which stated the terms of compromise between the House and Senate versions, the Senate accepted the House's language stating that the purpose of the law was the "establishment of pilot, demonstration and experimental programs to test the efficacy of different ways of dealing with the problem of youth unemployment," but stipulated that the statement of purpose also contain language "specifying that a variety of employment and training programs, as well as demonstration programs, are authorized" (U.S. Congress, 1977:35; hereafter Conference Report). As is usually the case, the Congress skirted conflict between two alternative purposes by opting for both.

While the Congress was strategically vague on the issue of "learning what works," it was more specific on a number of other issues. Both the House and Senate strongly emphasized the need to pay attention to in-school youths and the lack of coordination between the CETA system and the educational system at the local level.

The Senate took the view that good in-school programs and strong CETA-education cooperation were "preventive medicine" against the more difficult problem of what to do about school dropouts (U.S. Congress, Senate Committee on Human Resources, 1977:10; hereafter Senate Report). The Senate saw the set-aside for CETA-education cooperation as the solution to this problem. The House observed that "perhaps the greatest weakness of most of the youth employment proposals that have been introduced in the current session is their failure to place any emphasis on in-school youth or on encouraging out-of-school youth to . . . [return to] school" (House Report, 1977:10). The House saw YIEPP as a way of speaking to this problem.

Another issue that acutely concerned both the Senate and the House was the wage and job displacement problem for adult workers that was associated with youth employment measures. Both versions included language requiring the payment of prevailing wages, rather than the minimum wage, to youths filling an existing position. The final version contained language encouraging prime sponsors to take the initiative in developing new and restructured job classifications, in cooperation with labor organizations, to accommodate youths. The Conference Report stressed that the wage standards in the law "seek to promote the interests of both youths and currently employed workers and to engage prime sponsors, employers, and labor organizations in a cooperative effort to expand opportunities" (Conference Report, 1977:40).

A related issue that did not arise explicitly in the youth employment bill, but lay behind it, was the youth subminimum wage. The idea of offering employers exemptions from the minimum wage for hiring youths had long been a popular conservative proposal for addressing youth unemployment. It was, however, anathema to labor organizations and liberal legislators, who saw it as a mechanism for eroding the minimum wage and promoting youth displacement of adult workers. The Senate and House versions dealt with the issue by diverting attention away from it. In the words of a House staff member, "A major advantage of the House bill was that it temporarily defused the youth subminimum wage. Earlier in 1977 a youth subminimum amendment to the Fair Labor Standards Act had failed by one vote in the House. The big advantage of the House bill was that it gave Republicans something constructive to vote for without raising the youth subminimum again."

Two final congressional concerns were maintenance of constituencies and intragovernmental coordination. The Department of Labor, in the administration of employment and training programs, had, partly by congressional request and partly by its own initiative, developed a broad network of working relations with a very diverse array of organizations. The CETA system was, of course, based on prime sponsors--units of state and local government charged with responsibility for administering federal employment and training funds. Prime sponsors, and the state and local government interest groups representing them (e.g., National Governors Conference, National Conference of Mayors, National Association of Counties), were expected to play a key role in any new program.

Prime sponsors, however, delivered only a fraction of CETA-funded services; the remainder were delivered by contractors, some locally based community groups, some affiliated with national organizations (e.g., Urban League, Opportunities Industrialization Centers), many of which had been in existence since the emergence of federally funded employment programs in the late 1950s. These community-based organizations (CBOs) were, and still are, an important part of the political constituency for federal employment and training. Their interests were expected to be represented in any new programs. In addition to these state and local constituencies, DOL also maintained working relations with a number of other federal agencies through a variety of congressionally mandated, cooperatively administered programs.

Congress expected all these working relations, plus the newly mandated cooperative arrangements with local educational systems, to be carried over into the administration of youth employment programs. These expectations were stated in explicit statutory language. The governors were given their own set-aside of 5 percent of total YEDPA funding for exemplary projects and coordinating activities at the state level. The secretary of labor was charged with implementing "cooperative arrangements with educational agencies," including "secondary schools, postsecondary educational institutions, and technical and trade schools." There was a directive to "give special consideration" to community-based organizations, such as "Opportunities Industrialization Centers, the National Urban League, SER-Jobs for Progress, Community Action Agencies, union-related organizations, employer-related non-profit organizations, and other similar organizations." There were instructions to "consult, as appropriate, with the Secretary of Commerce, the Secretary of Health, Education, and Welfare [later Health and Human Services], the Secretary of Housing and Urban Development, the Secretary of Agriculture, the Director of the ACTION Agency, and the Director of the Community Services Administration." The rationale for these instructions was partly political--assuring that key constituencies would be included--and partly administrative--assuring that DOL would orchestrate the efforts of diverse federal agencies. In the words of Senate aide Richard Johnson, "The idea was that someone needed to pull together the pieces around a common theme of youth employment."

YEDPA, then, embodied a special convergence of congressional interests. It authorized bold new programs, but only for one year and only as part of the general CETA authority for demonstration projects. It gave DOL a broad charge to "find out what works" and substantial discretionary resources to do it, but tempered that grant with a 1-year authorization, limited guidance about what to focus on, and a reminder that it was delivering services at the same time it was running research and demonstration projects. It reminded DOL of its responsibilities for maintaining good relations with federal, state, and local constituencies in the process of mounting new programs. It clearly signaled that the Congress expected increased attention to in-school youths and to the connection between employment and training programs and local educational systems.

By singling out youth employment for special attention, though, Congress was significantly shifting its expectations for the employment and training delivery system. In 1973, with the passage of CETA, the Congress had altered the mode of delivery for employment and training programs by shifting from categorical to block grants. This shift in federal policy meant changing from a system in which the federal government gave grants directly to local service deliverers (community-based organizations, for example) to a system in which federal funds went to state and local officials, who exercised substantial administrative control over the allocation of federal funds to local deliverers. In simple terms this meant a dramatic expansion in the administrative complexity of the employment and training system. It put a premium on the indirect management of local delivery through state and local government organizations with their own political constituencies.

Under the previous system, youths were singled out for attention by categorical programs, notably the Neighborhood Youth Corps and the Job Corps. After CETA, the Job Corps and the Summer Youth Employment Program remained separately authorized, but the expectation was that state and local governments would make their own decisions about the appropriate mix of youth and adult programs within broad guidelines set by the federal government. An important part of the rationale for the change was that state and local governments knew more about the special needs of their areas than did the federal government, and therefore, they should exercise wide discretion in the use of federal funds.

With the passage of YEDPA, CETA prime sponsors saw a significant shift in federal policy, which many interpreted correctly as a "recategorization" of federal employment and training programs. While the youth programs brought new money, they also brought increased federal program requirements, reduced flexibility, political stresses entailed in focusing on one target group when others were perceived as equally needy, and, with time, a more active management role from the federal Office of Youth Programs. As indicated in Table 4, youths continued to participate at a relatively high rate in "regular" CETA programs at the same time they were receiving greater attention through YEDPA. This led many state and local administrators to believe that young people were receiving a disproportionate share of federal funds. Coupled with this recategorization, prime sponsors were also confronted with the demands of mounting large public service employment programs (see Table 4), participating in other DOL-initiated research and demonstration activities, and responding to increased DOL demands for better information on local decisions and their effects. Under the structure of indirect management, cooperation of state and local governments was a key element in the success of any federal venture, but singling youths out for special attention did not inspire unqualified state and local support.

TABLE 4 Participants Under Age 22 in CETA Programs Other than YEDPA, 1977-1981 (in percentages)

Program                                               1977  1978  1979  1980  1981

Title I, employment training (1973-1978)                52    49   [moved to Title II-B&C]
Title II-B&C, employment training                       --    --    48    48    45
Title II, public service employment (structural)        60    21    --    --    --
Title II-D, public service employment (structural)      --    --    23    36    26
Title VI, public service employment (cyclical)          64    21    22    24    24

SOURCE: Data from U.S. Department of Labor (1978-1982).

Executive Branch Perspective

Within the executive branch at the federal level, expectations for the new youth employment effort were quite modest. YEDPA was seen as a congressional initiative. The Carter administration was happy to accommodate it, especially insofar as it dovetailed with the President's economic stimulus package. But, between January 1977 and the early spring of 1979, the administration had bigger fish to fry. The administration's top domestic priorities were, first, reducing unemployment and inflation, and second, reforming the welfare system.

An important feature of the economic stimulus program was publicly subsidized employment. The economic stimulus package provided for an increase in CETA-funded public service employment (PSE) from about 300,000 to 725,000 people. At its peak, in spring 1978, over 750,000 people were in PSE positions. In 1979, public service jobs were reduced to under 600,000, but the rapid buildup and the high turnover of PSE participants administered a severe shock to the CETA system (U.S. Department of Labor, 1978 and 1979; hereafter DOL). The shock had two important effects. First, it focused a large amount of attention at the local level on finding public sector jobs to create employment. Second, and perhaps more importantly, it created a permanent and indelible notion among the public and politicians that CETA was a public employment program, not an employment and training program. After 1977, the fate of CETA would hang on the uses and abuses of PSE, not on its less-visible training programs.

The hallmark of the Carter welfare reform package was the use of employment to reduce welfare dependency. Called the Program for Better Jobs and Income, the proposal promised "to provide a work or training opportunity for an employable adult in every needy family that includes a child under age 18" (DOL, 1978:123; see also Lynn, 1981a). This objective was to be achieved by coupling welfare benefits with work requirements and by relying on the employment and training system to absorb large numbers of welfare beneficiaries. Quite apart from the political difficulties of selling such a proposal and the practical difficulties of administering it, the welfare reform proposal posed a gargantuan job of interdepartmental coordination at the federal level. The Department of Health, Education, and Welfare (HEW), with its prickly and combative Secretary Joseph Califano, considered itself to be the custodian of federal welfare policy. The Department of Labor (DOL) saw in the Carter proposal an opportunity to become a central actor in a large new policy area. The White House Domestic Policy Staff was saddled with the role of orchestrating this complicated bureaucratic minuet. The Carter welfare reform proposal eventually failed to get congressional approval, but not before it had consumed more than two years of the time of top policy staff and political leadership within the administration.

The net effect of these two domestic priorities on the youth employment effort was, first, to push YEDPA into the background within DOL and the executive branch generally, and second, to create an intense competition between YEDPA-funded activities and other CETA activities at the local level. This condition persisted until early 1979, when the tide began to turn. During the 1978 CETA reauthorization debate in Congress, there was an ugly and protracted discussion of local misuses of public service employment funds, which had serious repercussions for CETA and its political supporters. In the words of Bill Spring, White House Domestic Policy Staff member, "We came within an inch of losing the whole thing."

In an effort to refocus attention on the positive side, the Department of Labor began increasingly to emphasize its youth employment efforts. By late 1978, the Carter welfare reform proposal had gotten bogged down in a tangle of interdepartmental, congressional, and interest group fights that eventually led to its demise. At that point, in search of a domestic initiative that would serve to focus positive attention on the administration in the 1980 election, the White House staff turned to youth employment. A staff member within the executive branch, who was a persistent critic of DOL's youth employment activities, recalls a meeting in the fall of 1978. "There was a proposal floating for still more money for youth employment, and I was making the usual arguments against it with Bill Spring when, as I recall, Bert Carp [a Domestic Policy Staff member] walked into the room and said, 'Youth employment is going to be the administration's number one domestic priority in the 1980 election.' At that point, I knew the discussion was over."

In late 1978, the President appointed the Vice President's Task Force on Youth Employment, chaired by then-Vice President Walter Mondale, and charged it with developing a new youth initiative.

In roughly 18 months, then, youth employment moved from being a back-burner, congressionally initiated enterprise to being a top domestic priority in the President's bid for reelection, and the administration's only new domestic initiative. These shifting expectations were to have significant effects on the administration of YEDPA.

DESIGN OF KNOWLEDGE DEVELOPMENT EFFORT

The administration lost little time in responding to the congressional youth initiative. In July 1977, before YEDPA had been signed by the President, Secretary Marshall created the Office of Youth Programs (OYP) within the Employment and Training Administration (ETA) and appointed Robert Taggart to be its administrator. The new OYP was allocated 49 positions to carry out its charge; of these, 16 were existing Job Corps positions, and 27 of the remaining 33 positions were "mandatory hires," or transfers from other parts of DOL, over which the new administrator had no control. This left Taggart with 6 positions to fill. Of the total OYP positions, 14 were allocated to research, demonstration, and evaluation, and the remainder were allocated to program administration.

The magnitude of the task confronting Taggart was extraordinary. The final terms of the congressional charge involved roughly doubling the size of the Job Corps, as well as enriching the program's education and training components; increasing the Summer Youth Employment Program; launching the Youth Incentive Entitlement Pilot Projects, a $200-million-plus, multi-site demonstration; launching three new operating programs--YCCIP, YACC, and YETP; and, most importantly, deciding how to use the discretionary funding allocated to the secretary under the terms of YEDPA. It was from the last of these--discretionary funding--that the "knowledge development agenda" grew.

The YEDPA funding formula was of byzantine complexity: it began by taking the total appropriation for YIEPP, YCCIP, and YETP and dividing it into three parts: 15 percent went to YIEPP; 15 percent to YCCIP; and the remaining 70 percent to YETP. YACC was funded separately. Of the 70 percent allocated to YETP, three-quarters went by formula to prime sponsors; the remaining one-quarter, after some small deductions for special allocations to states, and set-asides for migrant workers and native Americans, went to the secretary for discretionary allocations. Depending on appropriation levels and how one defined "discretionary," this formula would deliver between $300 and $500 million in discretionary money to OYP in fiscal 1978, 1979, and 1980. The low end of this range included only those funds authorized for discretionary allocation under the formula; the high end included YIEPP, which had to be used for a specific programmatic purpose, but which could be allocated at the discretion of the secretary.
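The mechanics of the formula are simple enough to restate as a short calculation. The sketch below (in Python) is illustrative only: the $400 million combined appropriation and the 5 percent rate for the small deductions and set-asides are hypothetical round numbers, not figures from the statute or the plans.

    # Illustrative sketch of the YEDPA allocation formula described above.
    # Assumptions (not source figures): a $400 million combined appropriation
    # for YIEPP, YCCIP, and YETP, and a 5 percent deduction covering the
    # special state allocations and set-asides.

    def yedpa_allocations(appropriation, deduction_rate=0.05):
        yiepp = 0.15 * appropriation      # 15 percent to YIEPP
        yccip = 0.15 * appropriation      # 15 percent to YCCIP
        yetp = 0.70 * appropriation       # remaining 70 percent to YETP
        formula_grants = 0.75 * yetp      # three-quarters of YETP to prime sponsors
        residual = 0.25 * yetp            # one-quarter of YETP, before deductions
        discretionary = (1 - deduction_rate) * residual  # secretary's funds
        return yiepp, yccip, formula_grants, discretionary

    yiepp, yccip, formula_grants, discretionary = yedpa_allocations(400e6)
    for label, amount in [("YIEPP", yiepp), ("YCCIP", yccip),
                          ("YETP formula grants", formula_grants),
                          ("Secretary's discretionary", discretionary)]:
        print(f"{label}: ${amount / 1e6:.1f} million")

With these hypothetical inputs, the formula yields $60 million each for YIEPP and YCCIP, $210 million in YETP formula grants to prime sponsors, and roughly $66 million in narrowly defined discretionary funds per year.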

300 "discretionary" became synonymous with the knowledge development agenda. This change in emphasis was the result of Taggart's initiative, not the explicit direction of Congress. While Congress's intent was to "find out what works," and dis- cretionary funds were clearly to play a major role in meeting that intent, it was by no means a foregone conclusion that the use of discretionary funds would be organized around a centrally administered research and development agenda. The House's expectations were that YIEPP would be run as a multi-site demonstration and that the remainder of YEDPA would, in a House staff member's words, be allocated on the principle of "let a thousand flowers bloom." According to Senate aide Richard Johnson, "Knowledge development was Bob Taggart's method for bringing some sort of order out of the collection of programs he had to administer. In fact, we had an embarrassing interlude with Bob right after the bill passed when word got back to the Hill that he was calling YEDPA a 'disorganized hodge-podge' of programs--a little insensitive to the Members' interests. To his credit, though, he seized the initiative. He saw the discretionary money as an opportunity to be innovative and systematic, and pull things together under a larger strategy. That was all right by us." Centralizing Control of YEDPA in OYP Within DOL, it was far from a foregone conclusion that OYP would control all program operations, research, development, and evaluation activities associated with YEDPA. There were at least three alter- natives to this model. The usual approach would have been for the secretary to delegate operating authority to OYP and responsibility for research, development, and evaluation to the Office of Policy Evaluation and Research in the Employment and Training Administration or jointly to OPER and to the Office of the Assistant Secretary for Policy Evaluation and Research in the Office of the Secretary. Another alternative would have been to allocate the bulk of the discretionary money to prime sponsors under a series of large-scale grant competitions, and then to require the recipients of those grants--state and local agencies--to develop research and evaluation plans and relations with research and evaluation organizations as part of their projects. A third option might have been for OYP and OPER jointly to develop plans for a limited number of large-scale demonstra- tions or social experiments, along the lines suggested by YIEPP, to manage those projects jointly, and to contract with external organiza- tions to evaluate the projects. The decision to locate all responsibility for program operations, research, development, and evaluation in OYP was taken at Taggart'S initiative. "No money is ever really 'discretionary,"' Taggart said. "It's all got to be used to serve a variety of missions--political, administrative, and research. The question was how much control would we exercise over the discretionary money and whether we would divide it up within the Department. There were a number of people who wanted to

"No money is ever really 'discretionary,'" Taggart said. "It's all got to be used to serve a variety of missions--political, administrative, and research. The question was how much control would we exercise over the discretionary money and whether we would divide it up within the Department. There were a number of people who wanted to put it all out by administrative formula to prime sponsors; even Entitlement could have been sent out on a modified formula basis. I got a quick sign-off [from the secretary] immediately tying it all up [in OYP]."

Taggart's aggressiveness in seizing control of the discretionary funds did not endear him to others in DOL, but neither did it create serious bureaucratic problems. "The scale of this thing [YEDPA] was unlike anything the department had ever done," Taggart argues. "ORD [the Office of Research and Development, the unit within OPER with responsibility for demonstrations] had a budget of around [$20] million a year. We were talking about putting something like $200 million out in the first year. The existing structure just wasn't designed to do that."

While others within DOL perceived Taggart as aggressive, brash, and abrasive, they did not actively oppose his design to control discretionary funding after the secretary's approval and, in fact, assisted him in certain ways. Howard Rosen, then head of ORD, and Seymour Brandwein, then head of OPE (Office of Policy Evaluation, the other branch of OPER), worked with Taggart. Rosen helped OYP in contracting for outside services; Brandwein offered advice on research questions and negotiating the DOL bureaucracy. One ORD staff member, speculating about why Rosen did not fight for more control of YEDPA discretionary funds, said, "I think Rosen . . . saw YEDPA as more of an institutional capacity, delivery, building effort and thus didn't see it as a proper ORD effort."

The same relationship held with ASPER, in the Office of the Secretary. Robert Lerman, then an ASPER staff member, recalls, "ASPER was much more preoccupied with welfare reform, PSE, and [the] Humphrey-Hawkins [full employment proposal] than with youth employment; the youth programs were less than highest priority, and because of that Taggart was given much freer rein."

Taggart used another device to solidify his position within DOL. He contracted with OPER and ASPER to carry out pieces of the knowledge development agenda. ASPER was given money to conduct basic research on the nature of the youth employment problem, which it used to contract for a number of studies. OPER, in addition to helping with contracting, was given funds to extend two major longitudinal surveys to provide more detailed coverage of youth problems and to fund other youth-related activities. OPER, on its own, also conducted an extensive evaluation of the Job Corps and an assessment of YEDPA implementation by local prime sponsors.

By establishing these relationships, Taggart deflected any oversight that OPER and ASPER might have conducted of his research and development activities. According to ASPER's Lerman, "Our relationship with OYP tied us up a little bit. It's kind of hard to fully oversee another operation when you can't even spend your own money. We were all understaffed and that worked to Bob's advantage. Besides, Bob is a doer; he doesn't wait, he acts. He just took control and pushed ahead, and no one was there to tell him otherwise." These sentiments were endorsed by an OPER administrator, who said, "We were reluctant to take on new responsibilities beyond our capacity or to get into wrangling with Taggart, with whom [many of us] agreed anyway."

Taggart's tactics for dealing with ASPER and OPER also avoided a chronic organizational problem that DOL had been grappling with as early as 1973. ASPER was staffed mainly by economists, who usually took short-term appointments of two to four years, often on leave from academic appointments, and whose main interest was the application of economic analysis to policy decisions at the departmental and presidential level. OPER, on the other hand, was staffed mainly by career civil servants whose background was in employment programs and whose main interest was program monitoring and evaluations aimed at improving operations. These institutional loyalties tended to reinforce mutual stereotypes within the department, not always accurate, that ASPER was populated by "academic economists" and OPER by "program people." Another characterization of the difference, offered by an OPER administrator, was that the "academics believed that conceptualizing an evaluation was the key issue . . . with little regard for feasibility or . . . implementation." The split resulted in delay and disagreement around the planning for the national evaluation of CETA in 1974 (Hargrove and Dean, 1980). A major effect of Taggart's move to centralize program operations, research, development, and evaluation in OYP, then, was to avoid a major source of past institutional conflict within DOL.

Another possible source of external scrutiny over OYP might have been the Office of Management and Budget (OMB) in the Executive Office of the President, which monitors the research and evaluation activities of federal agencies. The OMB examiner for YEDPA, a critic of OYP's knowledge development efforts, explains why OMB exercised little influence or oversight: "I came on board just after YEDPA passed. It was clear that Taggart was unwilling to take any outside advice he didn't agree with. No one within DOL was willing to try to corral him. It was a clear case of institutional default. From OMB's point of view, if the money comes to the agency from Congress on a set-aside basis, we have no direct way to reach it. We stay out of the secretary's internal business and focus on the budget and the President's program. We are not in a position to tell people in the departments what kind of research to do; we can try to cajole and persuade, but we don't have much influence. One thing is for sure, though. Allowing Taggart to grab control of the discretionary money was a very significant decision; once that happened, we all lost our ability to influence what was going on."

Taggart, not surprisingly, saw the stakes differently. "We had enormous resources, basically no staff, multiple objectives, and very little time. People in the department didn't pay much attention to us; they were consumed by PSE. We ended up being the only program in the department combining policy, research, and operations. The law said, first, get the money to people in the right places; second, achieve some kind of coordination between federal agencies--take a leading role; and third, do something about the relationship between schools and employment programs. The research focus was my way of exercising administrative control.

My view of the purpose was to establish an institutional base for youth employment programs and make it work to serve youths. You had to achieve large-scale institutional change, and the way to do that was to put the money out there and then use monitoring, evaluation, demonstration, and research to pull the system along." In other words, centralized control was Taggart's way of putting research into the service of management. "Finding out what works" was useful only insofar as it was instrumental in building a structure of institutions focused on youth employment.

Taggart was young--in his thirties--relatively inexperienced as an administrator, very ambitious, and possessed of strong ideas about the role of research in policy making and administration. His major experience before coming to OYP had been as a researcher, having worked with Sar Levitan on a number of studies of federal employment and poverty programs. From his prior work and his early experience in DOL, he evolved some working principles. One of these was that all program effects are marginal. "Whatever we deliver as a program is one of many factors operating on kids, and not the most important one at that. The best you can expect is a 10 percent effect. You can never separate participant, site, and program effects." Another principle was deep skepticism about employment research and the researchers who produce it. "The problem with the research community is that they don't know substance, institutions, and procedures--most employment research is useless if you need to figure out what to do [with it]." This skepticism about research was matched by an equal skepticism about the competence and knowledge of the people who operate employment programs: "You can't rely on practitioners to find the . . . ." Another principle, which evolved with experience, was that the content of the program was less important in determining outcomes than the skill with which the program was implemented--"Everything's good that's done good."

Together, these principles comprised an instrumental view of the relationship among research, evaluation, management, and policy. The purpose of research and evaluation was not simply, or even primarily, to inform policy decisions. It was to create a management structure, a structure for judging and rewarding performance, for developing programs, for dispensing money and assistance, and for weeding out ineffective practices and replacing them with effective ones. You couldn't evaluate until the institutional structure was there to develop and implement a program. Whatever was implemented was highly dependent on the limited skills of the people who worked in the delivery system. The function of research and evaluation was, first, to create a management structure and, second, to nudge local administrators by stages into better performance. From this perspective, separating the research, development, and evaluation purposes of YEDPA from its programmatic purposes would have been unthinkable. For Taggart, research, development, and evaluation were, primarily, tools of management and, secondarily, mechanisms for systematic inquiry or policy making.

The Knowledge Development Plans

The first attempt to design the research, development, and evaluation component of YEDPA was embodied in the 1978 Knowledge Development Plan, written soon after Taggart became administrator of OYP. The term "knowledge development" is credited to Joe Seiler, a veteran of OPER and an assistant to Taggart (DOL, 1980e:9). This first knowledge development plan was, in Taggart's words, a "seat-of-the-pants" document, crafted from two sources. "First, we took the law and broke it into pieces that were consistent with congressional intent. Next, I gave my best reading of the issues in youth policy that had developed since the 60s."

The body of the 1978 plan closely followed the structure of YEDPA and its funding formula. It included descriptions of research, development, and evaluation activities to be undertaken in each of the mandated programs--YACC, YIEPP, YCCIP, and YETP--and a careful statement of how those activities would correspond to congressional expectations. The plan also contained the first list of eight cross-cutting research issues (Table 5). This list later evolved into 15 questions (Table 6) and, in the 1979 plan, into 16 issues. The first plan was mute on the practical question of how those broad, cross-cutting questions would be answered by the specific studies taking place under each congressionally mandated program. The 1979 plan made the conceptual connections between broad issues and specific studies clearer by organizing specific studies around broader issues, keyed to time lines (DOL, 1980c). But throughout the plans there was no explicit discussion of who would draw disparate studies together and how that would be done.

The fact that these issues were left unspecified, however, did not mean that Taggart had no solutions to them. One solution was to contract with an outside research organization, the Center for Employment and Income Studies (CEIS; later consolidated into the Center for Human Resources) at Brandeis University, to help OYP exercise lateral influence over the design of evaluations in separate projects and to help synthesize results. But the primary solution, for Taggart, was that he understood connections among pieces of the design. On this matter, Taggart is unapologetic. "I'm the only one who knows how the pieces of the process fit together because I'm the one who designed it."

The practical problems of mounting a large-scale research and development enterprise were another major theme in the knowledge development plans--problems of management, organization, time, and methodology. The main management problem was how to mount good demonstration programs without "locking resources into an operational mode such that it would be difficult to transfer them in the future to approaches which prove more effective." Another management problem was that YEDPA was intended to provide jobs as well as research on what works, which created "a tradeoff between careful research design and rapid implementation to maximize economic impacts." The main organizational problem was limited staff at the federal, regional, and local levels and "resources scattered over myriad projects."

TABLE 5 Research Questions, 1978 Knowledge Development Plan

1. Does school retention and completion increase the future employability of potential dropouts and the disadvantaged, and is subsidized employment a mechanism for increasing school retention and completion?

2. Can the school-to-work transition process be improved? This involves several related questions. Are new institutional arrangements feasible and warranted? Will increased labor market information and assistance expedite the transition? Can employer subsidies and other private sector approaches create new transition routes?

3. Work experience has become the primary emphasis of youth programs. Jobs are to be "useful" and "meaningful," i.e., having both a worthwhile output and an impact on future careers. Are the jobs productive? Which ones are most "meaningful" and how can they be identified?

4. Does structured, disciplined work experience have as much or more impact on future employability than other human resource development services or a combination of services and employment?

5. Are there better approaches or delivery mechanisms for the types of career development, employment, and training services which are currently being offered?

6. To what extent are short-run interventions and outcomes related to longer-term impacts during adulthood? Put in another way, how do public interventions affect the maturation and development process?

7. What works best and for whom? This is a perpetual and critically important question of matching services with needs. To answer this, it is first necessary to develop a set of performance or outcome standards which determine what does and does not work. The second step is to try to determine who realizes these benefits under which programs and approaches.

8. What are the costs of fully employing youths? Unemployment rates for youths are of questionable meaning because of the substantial number of "discouraged" individuals who are outside the labor force but would be attracted to minimum-wage jobs. Others are working less than the desired number of hours. It is important to determine the extent of the job deficit and the costs of eliminating it.

SOURCE: U.S. Department of Labor (1980b).

TABLE 6 Knowledge Development Research Questions, 1979-1980

1. Does school retention and completion increase the future employability of potential dropouts and the disadvantaged, and are employment and training services linked to education an effective mechanism for increasing school retention and completion?

2. Can the school-to-work transition process be improved? This involves several related questions. Are new institutional arrangements feasible and warranted? Will increased labor market information and assistance expedite the transition? Can new transition routes be created?

3. Given the fact that work experience has become the primary emphasis of youth programs, are the jobs productive, which ones are most "meaningful," and how can they be improved?

4. Does structured, disciplined work experience have as much or more impact on future employability than other human resource development services or a combination of services and employment, i.e., should public policy emphasize straight work experience, combinations of work and training and other services, or should training, education, and supportive services be emphasized?

5. Are there better approaches and delivery mechanisms for the types of career development, employment, and training services which are currently being offered?

6. To what extent are short-run interventions and outcomes related to longer-term impacts on employability during adulthood? Put in another way, how much can public interventions redirect the developmental process?

7. What works best for whom? What performance or outcome standards are best to determine what does and does not work for youths? Which youths with what characteristics benefit from which programs and approaches?

8. What is the universe of need for youth programs? What is the cost of fully employing youths? How many would take jobs if they were available and how many hours of employment do they require?

9. What approaches and procedures can be used to involve the private sector in employment and training efforts and to increase the placement of the participants in private sector jobs? How effective are those approaches in accessing new jobs and providing better career tracks for youths? Are they preferable to public sector approaches?

10. What is the best mix of enrollees in terms of age and income status? Will poor youths benefit from interaction with nondisadvantaged youths or with older persons? Is targeting achieved and is it a worthwhile notion?

11. What arrangements can be made to increase the duration of employment and training interventions and to assure that participants realize lifetime benefits? Will youths demonstrate the commitment and consistency to make these long-term investments pay off?

12. What strategies are most important at different points in the lives of youths? Must training be delayed until greater maturity is achieved? Are employment and training programs a way of inducing maturity?

13. How can separate youth programs be better integrated to improve administration and to provide more comprehensive services to youths? To what extent are the programs already integrated at the local level?

14. How do the problems of significant youth segments differ, including those of migrants, rural youths, the handicapped, offenders, young women with children, runaways, and the like? Are special needs groups and special problems better handled by mainstreaming or by separate programs for those groups?

15. How can the lessons from knowledge development activities best be transferred to improve existing youth programs? How can the institutional change process be promoted? What are the learning curves on new programs and how much can they be expected to improve with time?

SOURCE: U.S. Department of Labor (1980a).

The main problem with time initially was that, with the exception of YACC, YEDPA programs were authorized only through October of 1978. This meant that, while many research questions required long-term studies, commitments could only be made for one year. After October 1978, when YEDPA was reauthorized through 1980, there were additional demands to provide timely results, through the summer and fall of 1979, for the policy development effort operating under the Vice President's Task Force on Youth Employment, which culminated with a proposal to the Congress in January of 1980. The main methodological problem was how to devise studies that could provide verifiable results within the constraints of program, organization, and time (DOL, 1980c).

A sense of how time constraints drove design decisions can be gleaned from the 1979 Knowledge Development Plan (DOL, 1980b:111):

Three of the four major YEDPA programs--YETP, YCCIP, and YIEPP--are authorized only through fiscal 1980. It is anticipated that by that time many of the critical issues underlying youth policy will be resolved to a greater degree so that major decisions can be made. For recommendations to be formulated and legislation passed by the end of fiscal 1980, these must be based on results which will be available at the latest by fall of 1979.

The . . . schedule for the implementation of 1979 discretionary activities makes it quite apparent that there will only be limited information from these projects by this time. Even on a rapid implementation schedule, most will not complete a design and contracting until the end of the first quarter of fiscal 1979. The results of the first half year's operations can hardly be tabulated and analyzed by the end of 1979 and only interim process findings will be available reflecting mainly start-up difficulties. Most of the information yield for the end-of-1979 decisions will have to come from projects implemented in fiscal 1978. Here too, the findings are limited to early results and developments rather than long-term impacts.

A less ambitious and committed person than Taggart would have concluded from this analysis that congressional and executive expectations for results from YEDPA were simply incompatible with time and resource constraints. Taggart did not draw this conclusion: Whatever could be produced, would be produced.

Key Design Features

As the knowledge development strategy evolved, certain design features emerged. Among these were (1) complexity in the range of issues, program activities, research projects, and products; (2) relatively heavy emphasis, in early phases, on process information, rather than outcome data; (3) wide variability in research design, method, and type of results from one knowledge development activity to another; and (4) major changes over time.

The knowledge development plans were complex largely because the congressional and executive expectations that accompanied YEDPA were complex. Granting the complexity of expectations, though, a common theme among both Taggart's harshest critics and strongest allies was that he did little to control the complexity of the enterprise. Robert Lerman, ASPER staff member, recalls that in late 1978, when Taggart convened a conference at Reston, Virginia, to discuss knowledge development efforts, "It struck me that the plan just had too many questions. He [Taggart] listened carefully to people's reservations and he thought about the problems they raised, but early on he bought into a big multi-demonstration view of what he was doing, which didn't accommodate much to clarity in design. Things didn't seem to have a clear logical structure to them."

Andrew Hahn, who worked with Leonard Hausman in the Brandeis Center for Employment and Income Studies as part of the technical assistance function of OYP, recalls, "Len Hausman argued that the first plan was too complex. He said it could be organized around three aspects of the youth labor market--labor supply, or how to affect the skills and attributes that kids bring to employers; labor demand, or how to influence employers' demand for kids through various kinds of incentives; and intermediary linkages, or how to smooth out the transition to work." Taggart resisted this advice.

"There were probably two reasons why he resisted," Hahn continues. "First, he had a hard time prioritizing. Bob always thought in long, complicated lists, rather than in simpler frameworks. Also, it may have been that he deliberately wanted an inelegant design; by keeping the agenda complex, he maintained control and kept [Arnold] Packer [assistant secretary for policy, evaluation, and research], Brandwein, Rosen [from OPER], and the White House off his back." The net result of this complexity, says Hahn, who is a strong advocate of the knowledge development process, was "to encumber the design with a huge number of second- and third-order questions that were often brilliant and insightful but totally confusing to anybody but Bob."

The second design feature was that the early results of knowledge development were heavily weighted toward descriptions of the process by which projects were developed and implemented in various sites. This characteristic caused considerable friction between Taggart and, among others, the Office of Management and Budget. The OMB examiner for YEDPA, reflecting a characteristic institutional bias of OMB, said, "I expressed very strong reservations from the beginning about how it was developing, particularly about the large amount of money being spent on studies that didn't produce outcome data. OMB likes to see good, strong impact evaluations. Taggart didn't see it that way. He had his political constituency to protect. So most of the stuff he produced was very uninteresting to us."

The reasons for the emphasis on process data were fourfold. First, OYP was under pressure to produce results, but lacked the time to mount programs, let them mature, and then measure their effects. The next best thing, from OYP's perspective, was to build into the knowledge development process extensive information about the process by which programs were implemented. Or in Taggart's words, "In the first year and a half, our problem was how to do research when what was actually going on was start-up and implementation." Second, the relatively heavy emphasis on process information in the early stages was consistent with Taggart's view of research and evaluation as an instrument of management control. Process information may not have been useful to OMB in making government-wide allocation decisions, but it was valuable intelligence to Taggart in his attempt to create and manage a youth employment delivery system. Moreover, by asking for process information, Taggart was communicating that he placed a high priority on creating an infrastructure to mount, administer, and evaluate youth programs.

Third, the wide variability in design, method, and type of results from one knowledge development project to another meant that there was no straightforward way to bring specific results to bear on cross-cutting questions. Separate projects were contracted through a variety of organizational arrangements, discussed below, and decisions about the research and evaluation design for each project were worked out, case by case, by Taggart and the OYP staff. While there was an overall "design," in the sense that individual projects were related to a broader set of policy questions, there was no mechanism for assuring that the design decisions of one project were consistent with those of other projects or with some overall set of methodological criteria.

In some instances--YIEPP, for example--design decisions were argued through with the responsible organization in a detailed way and with an explicit analysis of their methodological consequences. In other instances--typically, joint projects with other federal agencies--design decisions were allowed to evolve according to the preferences of the responsible organizations, or were not addressed explicitly at all. In still other instances--the creation of large-scale data bases or the adaptation of existing data bases to youth questions, for example--there was a high level of delegation to the responsible organization, with review and comment by OYP.

Variability, then, was partly a function of the complexity of the issues, programs, projects, and products that the knowledge development plans focused on and the organizations that were used to translate those plans into action. But variability was also a function of Taggart's own lack of enthusiasm for consistency and rigor in methodology. He did not believe that focusing on methodological rigor and consistency, at the expense of other objectives, would pay off, either in new knowledge or in better programs. "People complain, after the fact, on a study-by-study basis, about things like the lack of adequate comparison groups," Taggart argues. "S---! At least we had comparison groups in a lot of studies. That was more than anyone else had done on that scale before. We introduced as much methodological rigor as we could, even though I believed, and still do, that it wouldn't work. What you're doing, when you apply fancy research methods to projects like the ones we had early in the program, is researching ineffectuality, not intervention. The fact of the matter is that people don't know what to do. All you're discovering is that they don't."

For Taggart, the primary questions were developmental, not methodological. "None of this research will yield anything if people don't know what they're doing. What you need to do is to give each part of the delivery system a piece of the action, use monitoring and evaluation to generate competence, pick up the threads running through the system to get a broad understanding of what makes effective programs, and then get states and locals to make decisions about who gets what and get them to monitor and enforce." Questions of capacity, organization, and management were prior to questions of design; research methods were instrumental to the development of a delivery system.

Finally, the overall design of the knowledge development effort and the design of specific studies changed markedly over time. For example, the YIEPP demonstration started by testing the effect of a fully subsidized work guarantee on school attendance, school completion, and short-term employment. About halfway into the demonstration, the design was changed to accommodate variable subsidies, on the expectation that Congress would want to know whether a less expensive program would have positive effects. Also, it became clear after YIEPP commenced that assuring school attendance through employment guarantees was not necessarily a clear benefit to young people if the school program was not adapted to their needs. Well into the demonstration, then, attention shifted to providing better educational programming for YIEPP participants.

Another example of a significant design shift was the introduction in 1979 of the Consolidated Youth Employment Program (CYEP) demonstration in nine prime sponsor areas. The purpose of CYEP was to test the consolidation of YETP, YCCIP, and SYEP grants into a single grant directed at multiple purposes. The project was based on congressional and executive expectations that the separate youth programs authorized under YEDPA would be consolidated in the 1980 reauthorization (DOL, 1980c:162-163).

The YIEPP example indicates how shifts in design can be stimulated by external expectations and by discoveries of weaknesses in program design. The CYEP example shows how design is a function of the political agenda. In both instances, though, changes in design raise the issue of whether it is better to stick to a single, well-specified set of projects for as long as it takes to get results, or whether designs should be adjusted to external changes and internal discoveries. A strictly methodological view would argue for holding projects constant until results are clear, since "finding out what works" depends on delivering a uniform treatment and controlling for alternative explanations of program effects. A developmental view, however, argues for making adaptations whenever they are required to improve program design and adapt to changing expectations. The knowledge development plans clearly embodied the developmental view.

Design, then, meant two distinctly different things in the knowledge development plans. First, it meant accommodating congressional, executive, and institutional interests involved in the youth employment problem in some sort of overall scheme and using that scheme to develop an institutional base for youth programs. Second, it meant, in the more conventional methodological sense, designing specific projects to deliver specific results on specific issues. Methodological questions were clearly instrumental to institutional development. There is a third meaning of design, which was not explicitly represented in the knowledge development plans. That is the integration of specific findings into some overall set of cross-cutting questions. The lack of this kind of design was the result of Taggart's strongly centralized view of his role and of the complexity of the issues incorporated into YEDPA.

ORGANIZATION AND MANAGEMENT

The magnitude of the organization and management problems confronting Taggart and his OYP staff in the fall of 1977 has already been sketched: launch three new national programs (YCCIP, YETP, YACC), launch one national demonstration program (YIEPP), expand and enrich two existing youth programs (Job Corps and SYEP), and allocate over $200 million in discretionary research and development funds. At a minimum, launching new programs would entail writing the basic rules that would govern state and local administration, or in the case of YACC, negotiating the necessary interagency agreements that would result in other agencies writing the basic rules. Taggart estimates that he wrote, or supervised the writing of, about 40,000 pages of program guidelines in the first year.

Launching a nationwide demonstration, like YIEPP, is an exercise in politics, administration, and research. The political element comes from the fact that, unlike the formula grant programs, only a limited number of localities--17 eventually, in the case of YIEPP--can participate. Which localities apply and which are eventually selected are matters of considerable political sensitivity. Administratively, the problem is how to get state and local organizations, mainly in the business of delivering employment and training services, to agree to participate in fixed-term research and development efforts. The research problem is devising and implementing a design that will answer policy questions within the operating constraints imposed by the existing delivery system. Allocating discretionary money, as we have seen, would entail defining questions responsive to congressional and executive interests, elaborating those questions into plans for discrete projects, and turning those plans into operating programs and designs.

Another way of illustrating the magnitude of the organization and management problems posed by YEDPA is to focus on the organization and management problems involved in the allocation of discretionary funds. If the average discretionary project, defined as one attempt to mount, operate, and evaluate an idea in one site, were to cost $500,000 over the course of two-and-a-half years, there would be roughly 1,000 projects. If one were to assume that OYP would have 20 full-time staff available to focus exclusively on discretionary activities--an extremely generous assumption, given the office's other responsibilities--each staff member would have responsibility for roughly 50 projects. Moreover, this example takes account only of the oversight necessary to mount, operate, and evaluate projects. It does not include the effort necessary to mount broad-scale data collection across projects, to oversee the reporting of data, and to synthesize the results of disparate projects into general conclusions.
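This back-of-the-envelope arithmetic can be restated in a few lines. The sketch below (in Python) simply reproduces the figures in the preceding paragraph; the $500 million total is an assumed round number at the high end of the discretionary range cited earlier, not a budget figure.

    # Span-of-control arithmetic for OYP's discretionary portfolio.
    # Assumptions (from the text, not budget data): roughly $500 million
    # in discretionary funds, an average project cost of $500,000, and a
    # generous 20 full-time staff devoted to discretionary activities.

    discretionary_funds = 500_000_000
    average_project_cost = 500_000
    full_time_staff = 20

    projects = discretionary_funds / average_project_cost
    projects_per_staff_member = projects / full_time_staff

    print(f"Implied number of projects: {projects:,.0f}")                 # about 1,000
    print(f"Projects per staff member: {projects_per_staff_member:.0f}")  # about 50

Even before counting cross-project data collection and synthesis, a caseload of this size implies that each staff member could give an average project only a few working days of attention per year.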

313 maintain a stable career work force, not to meet peak-load demands of large-scale research and development projects. Hence, OYP was initially staffed by career employees transferred from other DOL programs, and Taggart's requests for additional staff were met with the reply that OYP's needs had been met. Another explanation is that the DOL budget office and OMB deliberately used staffing as a way of showing their disapproval of Taggart's abrasive, autonomous, highly political style of management and the lack of methodological clarity in the knowledge development design. One executive branch budget analyst said, "That's how a number of actors within DOL and OMB got at Taggart. None of us could effec- tively control his financial resources, but we could [control! his employment resources. Both the department and OMB gave his requests each year for more staff very short shrift--even though it was manifestly clear he was way understaffed." Furthermore, the analyst argued, "Taggart never submitted a clear, workload-based research design that we could use to evaluate his requests." OMB's formal position on Taggart's requests was that knowledge development should be staffed by OPER, and that OYP staff should focus on program operations. Behind this formal position, though, was a strong distaste for Taggart's unabashed empire building, which was shared by DOL budget staff. A budget analyst observed, "Note that when [Tim] Barnicle [Taggart's successor as head of OYP] took over he immediately got an OYP [personnel! increase--that's because he knew how to play ball. . . . There are many weapons in bureaucratic warfare." Whatever the explanation, OYP's staff was a serious, some would say fatal, constraint on its ability to mount the knowledge development effort. These constraints, coupled with the congressional charge to forge federal interagency connections and to rely on community-based organizations, quickly led Taggart to "management by remote control" or "indirect management" of the knowledge development effort (see Salamon, 1981~. In Taggart's words, "It takes as much time to process a $5 million contract as it does a $100,000 contract." Given a choice between managing thousands of contracts in the hundreds of thousands of dollars or dozens of multi-million dollar contracts, there was no contest in Taggart's mind. The basic plan was to get the money out of OYP as quickly as possible in a series of large chunks; to use existing organizations, or to create new ones, outside OYP/DOL to manage discrete pieces of the knowledge develop effort; and to create capacity, also outside OYP, to monitor, assist, and manage relationships among the pieces. In its basic form, this organizational scheme was not unlike the modern corporate conglomerate. It was a collection of free-standing enterprises, each with one or more "product lines," each with its own set of projects, clients, and outputs, held together by contractual relations with the center. The function of the center was not to manage projects, clients, and outputs, but to see that the constituent enterprises were following through on their contractual obligation to manage those things themselves.
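The span-of-control arithmetic above can be made explicit. What follows is a minimal back-of-envelope sketch using only the paper's own illustrative figures (a $500,000 average project, roughly 1,000 projects, 20 staff); it is not actual OYP budget accounting:

```python
# Back-of-envelope reconstruction of the paper's span-of-control estimate.
# All figures are the paper's illustrative assumptions, not actual OYP data.
avg_project_cost = 500_000   # dollars per project over ~2.5 years
implied_projects = 1_000     # "roughly 1,000 projects"
staff_available = 20         # the "extremely generous assumption"

implied_total_funds = avg_project_cost * implied_projects
projects_per_staff = implied_projects / staff_available

print(f"Implied multi-year discretionary total: ${implied_total_funds:,}")  # $500,000,000
print(f"Projects per staff member: ~{projects_per_staff:.0f}")              # ~50
```

The $500,000 and 1,000-project figures together imply a multi-year discretionary total of roughly $500 million, an order of magnitude consistent with the "over $200 million" cited for the initial allocation.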

As with all forms of organization, this one had its characteristic strengths and weaknesses. Its main strength is that it reduced the span of control at the center by roughly a factor of 10--from potentially thousands of separate projects to, as it turned out, something over 100. The main weaknesses of this organizational scheme are, first, that its success depends almost totally on strong management capacity in its constituent enterprises and, second, that the structure itself contains no obvious solution to management failures in constituent enterprises. When problems develop in the pieces of a corporate conglomerate, central management either replaces the management of those enterprises or sells them outright. These solutions are less feasible in the public sector. More importantly, though, the constituent pieces of a conglomerate--public or private--are relatively immune to central control of their internal operations, even when they are poorly managed. The slightest increase in management control from the center can create an enormous overload of central management. For this reason, among others, corporate management has tended more recently to move away from conglomerates and toward organizational schemes that permit "tight" central management of finance and output targets, coupled with "loose" central management of internal organization and operations (Peters and Waterman, 1982).

Taggart's strategy of indirect management depended on pulling at least five distinctly different types of organizational arrangements into a single conglomerate structure. Table 7, drawn from OYP's knowledge development projects for fiscal 1978 and 1979, illustrates these organizational arrangements.

[Table 7, listing selected knowledge development projects for fiscal 1978 and 1979 by organizational arrangement, appears here in the original; its contents are not legible in this reproduction.]

Organizational Arrangements

Intermediaries

The use of intermediaries was an outgrowth of DOL's prior experience with the Manpower Demonstration Research Corporation (MDRC). The brainchild of a federal interagency task force, aided by Ford Foundation support, MDRC had designed, implemented, and evaluated a national demonstration of supported work as a solution to welfare dependency (Lowry, 1979). Because of the extremely short time lines involved in launching YIEPP, MDRC emerged as the most likely candidate to manage that demonstration. If the MDRC model could work with the entitlement project, Taggart reasoned, why not try it with others? Hence, in November 1977, the Corporation for Public/Private Ventures (CPPV) was established to handle demonstrations of private sector youth employment; in January 1978, Youthwork was established to handle exemplary in-school employment programs; and in May 1978, the Corporation for Youth Enterprises (CYE) was established, through an interagency agreement with the Community Services Administration (CSA), to manage demonstrations of youth-run enterprises.

The role of intermediaries in knowledge development has to be understood in connection with Taggart's instrumental view of research and his developmental view of how to approach the youth employment problem. The task was not just to demonstrate that certain programs operating in certain settings could work, but more importantly, to create a large-scale constituency of organizations committed to making certain programs work in certain settings. Later, Taggart (1980:15) would describe his purposes as follows:

The aim of the involvement strategy was to build up the expertise for the involved groups and institutions to provide assistance in the replication of specific proven models under an incentive grant structure, as well as intensive technical assistance on specific substantive components of youth programming. . . . Without this institutional infrastructure, there would be no capacity to deliver whatever "solutions" emerged from the knowledge development process.

The investments in intermediaries, then, were partly investments in research and development and partly investments in institutional capacity. Andrew Hahn, from Brandeis, puts the problem this way: "When you go to do research and demonstration in the educational system, you've got a lot of established organizations who can create curriculum, train, test, and evaluate. Before YEDPA, there was no capacity like that in the youth employment area, just a collection of small entrepreneurs and a big employment and training delivery system focused mainly on adult programs."

Intermediaries were a short-term capital investment in a longer-term problem of institutional capacity. They were also a high-risk investment. With the exception of MDRC, none of the intermediaries existed prior to YEDPA, nor were they managed or staffed by people who had experience in similar settings. Consequently, as one might expect, CPPV, Youthwork, and CYE made their early decisions on an opportunistic, trial-and-error basis that produced a predictable mix of successes and errors (Lowry, 1979). Youthwork, for example, recruited its staff disproportionately from the education community, giving it little credibility with CETA prime sponsors. When this became clear, the organization adjusted, but lost precious time in the process. All the intermediary organizations, with the exception of MDRC, had difficulty attracting and holding qualified research specialists, and this fact showed up in the quality of their initial plans. CPPV managed to recruit qualified staff, but its relative inexperience in management created start-up problems. Youthwork had a high level of internal turnover in its first two years, which undermined its ability to develop research expertise. CYE was slow in developing and never managed to attract and hold strong research staff.

Interagency Agreements

Interagency agreements were an outgrowth of congressional expectations that DOL would "pull together the pieces" of the federal government around the youth employment problem. The portfolio of interagency projects was substantial and reflected a number of agendas. The ACTION projects focused on youth community service, consistent with the objectives of YCCIP. The CSA projects also focused on youth community service, but with a strong emphasis on operations through local community organizations spawned under the federal antipoverty program of the 1960s. The HEW projects focused on connections between local employment and training programs and secondary or postsecondary institutions--a key congressional concern. The Department of Energy project focused on drawing disadvantaged youths into new careers in the energy field.

The interagency projects returned little in the way of structured research and evaluation, for reasons that are relatively clear. While OYP often referred to its federal collaborators as playing the role of prime sponsors, the fact was that OYP could exercise virtually no control over the projects they administered after the interagency agreements were signed. The agencies were neither creatures of DOL--as prime sponsors were--nor full-fledged contractors--as intermediaries were. They were free-standing federal agencies with independent authority. Hence, if they lacked the capacity to do systematic research, or if they disagreed with the demands that research and evaluation imposed on their discretion, there was little OYP could do to force their cooperation. Moreover, since the purpose of interagency agreements, from the congressional point of view, was to cement internal relations within the federal government, it was not necessarily in the interests of OYP to provoke embarrassing interagency conflicts that would be difficult to explain to Congress.

Intraagency Projects

Intraagency projects, as noted above, served an important internal objective by stabilizing OYP's relationship with ASPER and OPER. But two equally important additional purposes of these projects were, first, to develop a basic research constituency for youth employment among academics and, second, to assure that youth employment issues were adequately addressed in established longitudinal data bases, like the National Longitudinal Survey and the Continuous Longitudinal Manpower Survey. In their own way, the intraagency projects were among the most successful in the knowledge development process. They involved relatively low-cost, finite, well-defined tasks; they could build on established institutional capacity (e.g., the National Bureau of Economic Research, the National Council on Employment Policy); and they had relatively self-explanatory payoffs. But for all their appeal in specificity and feasibility, these projects were not very valuable in political terms. Better research and more complete data about youth employment were useful in dealing with Congress only if it could also be demonstrated that DOL was "doing something" about the problems it was documenting.

External Staff Support

External staff support was a direct outgrowth of limited staff capacity within OYP. The key organization was the Brandeis Center for Employment and Income Studies (CEIS). CEIS had an intentionally broad and ambiguous charge: "(1) To provide technical research design guidance to the national array of experimental and demonstration projects implemented under YEDPA, and (2) to develop administration processes for the retrieval, dissemination and policy utilization of research findings and other knowledge development products of these discretionary projects" (DOL, 1980b:278). These responsibilities overlapped those of a number of other organizations, including the Educational Testing Service's (ETS) development of a Standard Assessment System (SAS), the design and evaluation functions of the intermediaries, and a number of other individual projects with their own evaluations.

But if CEIS's role was ambiguous in a formal sense, its practical function was much less so. CEIS staff were the only source of "lateral intelligence" in the complex array of organizations spawned by OYP. All the other organizations were producing "vertical intelligence," in the sense that they were assigned projects with specialized target groups and particular programs. As noted above, this meant that the design of the knowledge development effort, if it was to exist at all, depended on the ability to draw cross-cutting conclusions from disparate projects. In its evaluation consulting and technical assistance role, CEIS was not just trying to improve the quality of project evaluations (a difficult task by itself); it was also gathering intelligence about what the developing delivery system looked like across a variety of localities and projects. CEIS also performed the function of convening periodic conferences to review design decisions, interim results, and practical lessons. In the absence of these activities, there was no formal mechanism for getting people involved in the knowledge development process to talk to each other about their results. While the lateral intelligence function is hard to specify in formal terms, and while one could argue that under the best of circumstances it would have been performed inside OYP, it was a practical necessity, given OYP's staff capacity and the organizational complexity of the knowledge development effort.

Another important external support function was provided by the Educational Testing Service's Standard Assessment System. The initial idea behind SAS was plausible. ETS would develop a single battery of instruments, composed of measures of client background characteristics, educational measures, and employment measures, which would be administered to a large sample of YEDPA demonstration project participants, before and after their participation, and would generate a data base that could be used to analyze effects across sites and programs. This battery of instruments would then be administered by prime sponsors as part of the routine requirements that accompany YEDPA-funded demonstration projects. The results would be collected, compiled, and analyzed by ETS, but also made available to others for special studies.
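The cross-site analysis the SAS data base was meant to support can be illustrated with a minimal sketch. The record layout, sites, and scores below are invented for illustration and do not reflect the actual SAS instruments:

```python
# Pooled pre/post participant records of the kind SAS was designed to
# generate, tagged by site and program so gains can be compared across both.
from collections import defaultdict

records = [
    {"site": "A", "program": "YETP",  "pre": 42.0, "post": 49.5},
    {"site": "A", "program": "YCCIP", "pre": 38.0, "post": 41.0},
    {"site": "B", "program": "YETP",  "pre": 45.0, "post": 47.5},
]

# Group pre/post gains by (site, program) cell and report the mean gain.
gains = defaultdict(list)
for r in records:
    gains[(r["site"], r["program"])].append(r["post"] - r["pre"])

for (site, program), cell in sorted(gains.items()):
    print(f"site {site}, {program}: mean gain {sum(cell) / len(cell):+.1f}")
```

A single instrument battery administered before and after participation is what makes such cell-by-cell comparisons possible; without it, each project's measures would be incommensurable across sites and programs.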

By most estimates, the SAS was less than a complete success. One problem had to do with conflicting expectations about its use. "People got very confused about the purpose of the SAS," said CEIS's Andrew Hahn. "ETS was never intended to be the evaluator. The idea was to create a large data base and make ETS the repository." But because ETS was cast in the role of developing the SAS, explaining it to program administrators, and collecting data, it became identified as an evaluator, whether that was its role or not. This led to a second problem, which was conflict between ETS and program operators over the use of the SAS. "Program people hated it," Hahn recalls, "and it was very difficult to get their cooperation in administering it." A third problem was ETS's lack of experience with the employment and training system, a problem it shared with a number of other educational organizations that got involved in YEDPA projects. Schools are relatively acclimated to periodic testing, even though principals and teachers resist it. They are also accustomed to ETS as a prominent institution in the testing field. Most delivery-level organizations in the employment and training system had had little or no experience with testing and saw no particular reason to cooperate. ETS's experience did little to prepare it for these practical problems. A final problem was disagreement over the design and content of the survey. In retrospect, a number of people saw gaps in the data and in ETS's analysis of it, but those gaps were not clear when initial designs were presented and approved. Taggart explains, "I delegated the design of the SAS internally and never reviewed the actual content of the instruments before they went out. That was about the only thing I didn't review in detail. When the first results came back, I went through the ceiling--they were not what we needed at all--and read the riot act to the staff and ETS. From that point on, we had a constant battle to try and turn it around."

Constituency Support

Constituency support projects were designed to make good on Congress's expectation that client groups, community-based organizations, and intergovernmental constituencies would be involved and consulted in the implementation of YEDPA programs. The mayors', counties', and governors' associations were important in maintaining political support for any future youth employment activities, since they were the host governments for CETA prime sponsors. In addition, they had a strong incentive to resist separation of youth programs from other employment and training activities, because the separate youth programs constituted a "recategorization" of CETA and a retreat on the initial broad grant of discretion that accompanied CETA. Putting these organizations to work identifying exemplary youth programs in their jurisdictions, even if it did not produce much in the way of research, was a useful way of giving them some ownership in YEDPA.

Constituency and client groups, like the Urban League and SER, were key to development of a youth employment system in two respects. They were the national interest groups for their local community organizations and therefore wielded some political influence in Washington. In addition, though, their local organizations were often the only legitimate route of entry into the minority community. Where prime sponsors and local school systems were identified with the dominant local political system, community-based organizations were an important alternative way of reaching communities that might not be well connected to that system. None of the constituency support projects was distinguished for its research and development value, but their political value was apparent. For Taggart (1980:10), the payoff for investing in constituency support lay more in the creation and management of a delivery system than in the research it generated:

Rather than passively reacting to the pressure of interest groups, an active and conscious involvement strategy was adopted from the outset which sought to identify the complete range of institutions that could and should be involved, their areas of possible comparative advantage and interest, and then to utilize these institutions in structured demonstration projects where their effectiveness could be tested. The knowledge development plan was, in a sense, a protective system; to get funding, institutions had to adapt to the design and structure of demonstration approaches. The overlay of research requirements and outside evaluation agents was a disciplining force, serving a monitoring and management function which would not otherwise have been possible given limited staffing in the Office of Youth Programs.

The range of projects described in Table 7, as wide as it is, constitutes only a very small sample of the total universe of activities funded under YEDPA discretionary authority. Hence, it does not, by itself, give a complete picture of either the variety of projects or the complexity of the organization and management problems confronted by Taggart and the OYP staff. The selection of 27 projects in Table 7 is a small fraction of the 127 projects listed as knowledge development activities for fiscal 1978 and 1979. Roughly 30 OYP staff are listed as project monitors, but 15 of those oversaw 100 projects, which often amounted to an individual responsibility for $20-$30 million. Even if the staff had been well prepared for their research and development roles, their ability to attend to detailed project decisions would have been severely limited.

It is also important to note that while discretionary funds were used to finance knowledge development activities, they were not exclusively, or even primarily, used for research and evaluation. In fact, by DOL's estimate, 88 percent of total discretionary funds was spent on employment and training services, 3.3 percent on basic research, 0.6 percent on evaluation of regular programs, 5.3 percent on evaluation of demonstration projects, and 2.8 percent on technical assistance to program operators. About 30 percent of discretionary funds, outside YIEPP, was directed through other agencies of the federal government to interagency projects. Close to 78 percent of discretionary funds was allocated to prime sponsors, 18 percent directly to community-based organizations, and 6 percent to an assortment of other organizations, including schools and private employers (DOL, 1980b; also DOL, 1981). The message these figures drive home is that, regardless of how important research and evaluation were to the mission of knowledge development, service delivery was the main activity performed with discretionary money, and the organizations performing that service delivery were not, by their nature, sympathetic to research and evaluation.

Organization and management problems were not limited to the organizational alliances spawned by discretionary funding. There were at least two other spheres that demanded active and continuous attention. One of these spheres was the CETA delivery system, represented by prime sponsors and their local contractors. The other was what might be called the "policy system," composed of actors outside OYP who were consumers of knowledge development products, some of whom were involved in developing the administration's youth employment proposal.

The CETA Delivery System

The CETA delivery system was both the operating base for most YEDPA activities and the major source of practical implementation problems faced in the knowledge development process. It would have been possible, at least in theory, to run demonstration projects outside the CETA system, by creating "hot-house" projects designed, run, and evaluated by researchers. This approach was antithetical both to the legislative charge that accompanied YEDPA and to Taggart's strategy for using knowledge development as a management tool. By insisting that the knowledge development process be run through the existing delivery system, Congress and OYP achieved a degree of involvement and practical experience that would otherwise have been impossible, but they also bought all the political and administrative problems that accompany that system.

Prime sponsors represented units of local government; the administrators of local prime sponsors, as well as their local constituent organizations, were a political force in their communities and in Congress. When knowledge development imposed demands on local prime sponsors that they considered to be unreasonable, and when OYP refused to concede, prime sponsors had alternative routes of political access through which to get the results they wanted. Erik Butler, a former administrator of youth programs in Boston and later executive director of the Vice President's Task Force on Youth Employment, said, "When MDRC came to talk to us about the Entitlement project, they were talking research, while we were talking program. The issue was how to accommodate their interests and ours." Marion Pines, a nationally visible employment and training administrator from Baltimore, took her complaints about the reporting and administrative demands of the entitlement project directly to Congress, making a plea for more local control over design decisions. This tension between national objectives and local political and administrative realities was played out in a number of settings on a number of issues.

Taggart cites OYP's attempt to improve the Summer Youth Employment Program (SYEP) as one of his most illuminating confrontations with the CETA delivery system. "Summer Enrichment," as it came to be known, was an important component of Taggart's strategy for using knowledge development to influence the quality of youth programs. As is clear from Tables 2 and 3, SYEP accounted (and still does) for a large proportion of both outlays and participants in federal youth employment programs. Since the mid-1960s, it had come to be regarded cynically by politicians and administrators largely as an income support program or, in the language of the street, "fire insurance." Taggart's approach was to focus on developing better jobs for the summer program and adding an educational component. "In the first round, we tried to rewrite the program requirements to include better monitoring of jobs and an education component, assuming that if we asked for them and made it explicit in evaluation requirements we would get it. We completely misjudged the capacity of prime sponsors. What we discovered was that there was rot at the bottom of the pyramid. The problem was bad management; they wouldn't have known how to do it even if they had wanted to. So we put some discretionary money behind it, got some intermediaries involved in creating and implementing programs, and focused on the problem of poor management at the local level. Over time, we began to see results. But the problem was that we wasted a year finding out that the delivery system couldn't respond to the demands we were putting on it."

The problem of local management capacity also surfaced in YIEPP, where MDRC was required to make heavy demands on local prime sponsors to implement a specialized information system to track entitlement participants, and in the implementation of the Standard Assessment System, where prime sponsors and other administrators of discretionary projects often balked at the additional effort required to administer the complex battery of instruments. Taggart's approach to both the political entrepreneurship and management capacity problems was to rely heavily on external support staff and intermediaries, both to buffer OYP from political pressure and to deliver much-needed advice.

The Policy System

The policy system posed another set of problems. Taggart's strategy of creating an extensive network of external organizational alliances also meant that the network had to be consulted, reinforced, and accommodated. The major problem with asking for advice from your constituents, of course, is that you often get it. And, more often than not, it comes in the form of contradictory messages.

The most prominent example of consultation was the conference convened by the Brandeis center in October 1978 at Reston, Virginia. The conference occurred early in the knowledge development process, a year after the passage of YEDPA. Its nominal purpose was to bring the research and development organizations involved in the process together to discuss common methodological problems and to develop a familiarity with the overall objectives of the process. As one might expect, the conference turned out to be a collection of distinct presentations on specific projects, followed by discussions in which participants argued positions based as much on their institutional and methodological biases as on their interest in knowledge development.

Arnold Packer, DOL assistant secretary for policy, evaluation and research, commenting on a presentation of the entitlement research design, argued, "We must be sure when we're spending the public's money that we have a scientifically solid approach and the ability to accept or reject specific hypotheses which are recognized as the ones that are implicit in what policy makers are doing and thinking about in such programs." Policy relevance and methodological rigor, it seems, were equally important. But so was timeliness. "Will the results be available by the time legislation must be drafted next year?" Packer asked. "January 1980 is the scheduled date for submitting administrative recommendations. When the budget goes up a year from this January, if we are going to ask for any money to continue youth programs, the legislation has got to accompany the budget" (DOL, 1980e:25). How MDRC, local entitlement projects, and OYP were to accommodate this schedule was their problem, not the problem of the administration's policy planners.

John Palmer, an employment specialist from the Brookings Institution, took advantage of the occasion to counsel moderation on methodological questions. Commenting on YETP discretionary projects, he said, "Some of the discussion seems to suggest that we're going to be able to vary components or individual elements of these program structures and see what difference it makes. I'm dubious that that's going to be possible in most cases. I just don't think that the methodology or the resources that are being brought to bear are going to permit that to happen. You're just not going to get effective answers to those questions in the strict research sense. You're going to get important answers out of the more qualitative analyses that are being done and from the hunches that have been made." Palmer continued by counseling attention to what he called "first order questions," such as, "Was it feasible simply to mount and execute the program under the design conditions we are trying to accomplish? Who is being served? Are we reaching the target population? Is it working, in some sense, at that level?" (DOL, 1980e:56). These questions were several notches below those considered important by other participants.

Donald Nichols, an ASPER staff member, took strong exception to the lack of methodological rigor he observed in many knowledge development projects. "I want to emphasize the need for consistency across these various programs," he argued. "We want to strive to bring about some kind of consistency, so that we can not only make comparisons within each of these projects, but so we'll also be able to make pretty good comparisons of one approach against another." He continued, "A feature common to most of the demonstrations being discussed . . . is that they are not experiments with random assignment [to] groups and the like.
They lack the pure classical experimental approach in that all the results are hedged ahead of time. Some of the researchers sound a great deal more like advocates rather than scientists. It may well be that advocates run better programs . . . but it's probably not a good model for getting research results on something you might think you could replicate on a large scale" (DOL, 1980e:58).

Vernon Briggs, from the Cornell Labor and Industrial Relations faculty, issued a rebuttal. "I feel we're missing what is perhaps the greatest contribution of these programs in the discussion over classical research design. It seems to me that the major overriding focus behind all this is changing institutions in desired directions." He continued, "I think, when you consider the whole range of employment training programs, the results of the research and the demonstrations are used by a number of different actors and, depending on where those actors sit, they have different agendas. I would go as far as to say that I think that a typical legislator would probably think more in terms of whether services are actually delivered than in terms of subtle assessment of impacts" (DOL, 1980e:60-61).

Othello Poulard, director of the Washington, D.C.-based Center for Community Change, delivered an even more fundamental critique of rigorous research. "As a practitioner, I can say there isn't that much mystery, as might be suggested, with further discoveries in uncovered truth. I wish that were the problem. It would be easy if the accumulation of a few more facts would provide the remedy. But the residual of so many basic societal patterns and attitudes, political stances and the like, seem to be so obviously at the heart of the matter. . . . I wish there could be 'advocate' research. I don't think that bastardizes research at all. It tempers it. It is too risky, too hazardous to just assume that it is appropriate, let alone judicious, to take the pure researchers' approach. If the attitude behind the process is one that is devoid of passion and commitment, that is not a virtue" (DOL, 1980e:61-62).

One of the more impassioned versions of this argument was made by Robert Schrank of the Ford Foundation. "Large sums of money have been allocated for a massive quantitative evaluation effort," he argued, "but no one is asking what the pitfalls of such research might be, or whether it is even appropriate to what we are trying to study." He continued, "The object of the research is a network of youth programs," not the production of research results. If research focuses attention on measurable results, at the expense of producing long-term effects on institutions, he argued, "the objective social science research model may turn out to be more of a burden than a beacon for policymakers." He went on to note "a terrible tension between doing objective evaluation and trying to make a program succeed" that worked against long-term solutions and in favor of short-term results. He also observed that one effect of doing evaluations of War on Poverty programs that were not effectively institutionalized was to reinforce the notion that "the social problems we were attempting to solve were intractable," rather than that the right solutions hadn't been tried (DOL, 1980e:36-40).

In essence, then, when Taggart consulted his policy research constituency for advice about how to do knowledge development, he got back a faithful representation of the prevailing disagreements within that constituency over the methods, content, uses, and practical consequences of policy research, which were outlined at the beginning of this paper. There was pressure for timely results that would inform policy, but little understanding of what policy makers actually wanted to know and less appreciation for how long it would take to find out. There was pressure for methodological rigor, but no agreement on whether there were treatments that were compatible with the experimental model or whether experimentation was compatible with the commitment necessary to make programs work. There was advice to stick to the business of research and avoid the pitfalls of advocacy, but no advice about how to mount programs in a complex political and administrative setting without advocacy. There was counsel to respect the tension between dispassionate research and commitment to particular programs, but no concrete organizational solutions for how that tension could be resolved. There was advice about the dangers that accompany premature tests of program effects, but no clear understanding of when new programs should be evaluated.

No doubt, the participants in the Reston conference believed they had delivered a clear message to Taggart about the direction knowledge development should take. The overall effect, though, was to reproduce the general lack of agreement and to strengthen Taggart's resolve in pursuing the strategy he had chosen. For all its defects, Taggart's strategy at least had tentative solutions to the problems of large-scale policy research.

A far more serious set of policy system problems was posed by the Vice President's Task Force on Youth Employment. As noted above, sometime in the fall of 1978, the administration seized on youth employment as its major social policy issue for the 1980 campaign, having run into difficulty with welfare reform. The Task Force was as ambitious a policy development exercise as ever takes place in the federal government. It involved a significant central staff, led first by Tom Glynn, former director of planning and budget for ACTION, and later by Erik Butler, a director of youth programs from Boston and a researcher/practitioner at Brandeis. The Task Force drew on the policy staff of the Domestic Council, including Bill Spring and Kitty Higgins, a DOL staff member on detail to the White House, and on outside consultants, including Peter Edelman, former director of youth services for New York State. It involved extensive interagency consultations between the Departments of Education and Labor. It served as the locus for wide consultations around the country with business, labor, education, and employment leaders and practitioners. And it resulted in the presentation of a major piece of legislation to the Congress in January 1980. The Task Force's budget, amounting to $1,027,485 over fiscal 1978 and 1979, was financed from YEDPA discretionary funds (DOL, 1980b).

Beyond the fact that the Task Force was funded from OYP's budget, Taggart and OYP had a larger interest in its work. If the President's proposal passed Congress, it would set the structure of youth programs for the foreseeable future. Taggart saw the longer-run stakes of the Task Force's work and focused a large amount of his energy, between early 1979 and January 1980, on drafting the DOL side of the proposal. In doing so, he attempted to draw whatever lessons were available from the first year's experience with YEDPA and from the conventional wisdom emerging from sustained attention to the problem of youth employment. From this review, Taggart deduced a few relatively straightforward principles that were to shape both the administration's youth initiative and subsequent changes in federal employment and training policy. In part, these principles were as follows (DOL, 1980a:86-93):

· Standards. Everyone involved in the employment and training enterprise should be held to mutually agreed, self-imposed standards, or benchmarks, of performance. Trainees who do not meet performance expectations should be moved out of programs to make room for those who are willing to try. Employers should be willing to provide structured and demanding activities in the workplace. Training organizations should be willing to set performance standards for themselves and their clients.

· Sequenced Activities. Programs should begin at the level of competence of entering trainees and should follow a sequence of structured steps designed to move trainees into unsubsidized employment.

· Targeted Resources. Funding formulas and administrative decisions should reflect the difference between high-cost, intensive services for high-risk youths and low-cost, less-intensive "transitional" services for more mainstream youths. The highest priority should be highly targeted, concentrated programs for the neediest.

· Consolidated Programs. The array of categorical youth programs initiated by YEDPA should be consolidated into a single program structure.

· Employment-Education Collaboration. The early efforts at better coordination between prime sponsors and local educational systems did not produce widespread changes, but the objective is an important one for federal policy.

· Institutional Comparative Advantage. Some organizations, notably community-based organizations, have a comparative advantage in reaching high-risk youths, although they vary widely in capacity to deliver services. Their role should be strengthened.

· Local Accountability. The federal program structure should encourage attention to measurable, quantitative outcomes, rather than to implementing complex regulations on program content.

Other, more specific, lessons emerged from reviews of the preliminary YEDPA evidence performed by Erik Butler and Jim Darr (Butler and Darr, 1979). These lessons focused on a review of program-by-program results but generally followed the same themes. In short, the Task Force and the administration's youth initiative forced a telescoping of the larger knowledge development process into a short period. Many of the longer-term institutional development and research objectives, while they continued to be important within OYP, were pushed aside in the policy-making arena in the interest of developing an administration proposal.

329 If the the Carter youth initiative had passed Congress, and if Carter had won reelection in 1980, the knowledge development process might have continued to pursue "he longer-term institutional develop- ment, and the research objectives it contained might have received attention. But these things didn't happen. The Carter youth initiative passed the House in August 1980, but failed to reach the floor in the Senate. Carter lost his reelection bid. In the early months of the Reagan administration, federal employment and training policy underwent a major change, including the elimination of most YEDPA programs, with the passage of the Job Training Partnership Act. The Demise of Knowledge Development Long before the introduction of the Carter youth initiative, though, there was evidence that the knowledge development process was beginning to come apart in certain critical ways. First, the organizational network created at the beginning of the program began to require management from the center that OYP was hard-pressed to provide. In the words of a DOL observer, "It was one thing to get all those contracts negotiated, written, and signed in the first place, and quite another to deal with the problems that surfaced when the organizations started to have problems, not to mention turning the whole system over when the contracts needed to be renewed or terminated." Second, the politics around the Vice President's Task Force began to take its toll on Taggart, politically and physically. By early 1980, it had become clear that youth employment was the only game in town for those interested in affecting domestic policy. With the election approaching, activity around the administration's proposal became feverish. There were predictable tensions between the Task Force and Taggart over details of the administration's proposal and the mechanics of assembling it. Taggart worked around the clock for months drafting a proposal and selling it within the administration. At one point before the administration's proposal had been sent to Congress, one participant remembered, "When Taggart's recommendations weren't incorporated fully into the administration bill, he circulated his own version around town and on the Hill. This, needless to say, did not endear him to the Task Force people." In the end, the employment provisions of the administration's proposal were largely determined by Taggart. But the costs of this political manuevering were reckoned in the loss of sustained attention to the management of the knowledge development process. The unraveling of the knowledge development process began in March 1980, when Taggart resigned his position as OYP administrator. After leaving OYP, Taggart worked independently, with foundation funding, assembling research results on employment and training programs. He then established the Remediation and Training Institute, a private nonprofit organization, again with foundation funding to provide assistance to local employment and training operators. Taggart was replaced by Tim Barnicle, a regional DOL administrator from Boston, from March 1980 to January 1981. After that, OYP was run in the early

330 months of 1981 by Richard Gililand, a former DOL regional adminis- trator, who was transferred to the job by the then Assistant Secretary for Employment and Training, Albert Angrisani. After the passage of the JTPA, GYP was disbanded. Some of its staff were reassigned to various other parts of the agency, some left the department in a series of reductions in force, and "knowledge development" ceased to exist. Though the activity called knowledge development ceased to exist, many of the contracts negotiated as part of the knowledge development strategy were still outstanding. Some contracts extended into 1982. In an effort to bring some order and closure to these contracts, the Brandeis Center, in late 1981, compiled a list of unfinished projects, along with recommendations for their disposition. They identified about 120 incomplete discretionary projects, of which all but a few required additional DOL action to close them out. They also examined data collection activities under ETS's Standard Assessment System, and found 20 of 48 sites in which data were incomplete. The posture of Reagan appointees in the Department of Labor toward these unfinished projects, by most accounts, ranged from indifference to outright hostility. "When the new administration arrived," one insider recalled, "an immediate freeze was put on all time extensions and refundings of the ongoing research efforts. They wouldn't even let the entitlement research be completed until an editorial appeared in the Washington Post that created a Congressional uproar. It is in the nature of research that repeated time extensions and some cost overruns occur. By systematically refusing extensions and [by] disbanding the youth office, most research was halted in its tracks. Without a program officer to follow up for final reports and without time extensions, even where new funds were not required, and without any hope of new research money coming from the department lots of these committed academics just went off in search of other grants. Others didn't have the funds to analyze the data they had collected. . . . "No one ever bothered to follow up. In fact, the atmosphere was very hostile to research and other discretionary projects. We were instructed to call grantees and tell them they couldn't publish papers unless the department cleared them. Such an instruction contradicted the actual language encouraging publication in the grants themselves." A career DOL employee with extensive experience in evaluation observed that after the change in administrations the contracts were "technically" administered, "but no longer with knowledgeable staff or any sense of high-level attention." Andrew Hahn, from Brandeis, is more pointed: "Our posture was that the taxpayers had already paid for the research and they should have the benefit of the results. We tried to lay out what was necessary to close it [knowledge development] off with the best results possible. It became clear, though, that finishing was not a high priority." The Reagan administration's interest in dissociating itself from the program of an earlier administration was understandable. Its means of dissociating itself so was less under- standable to people with a strong interest in research and evaluation. The demise of knowledge development, then, was a compound of manage- ment and politics. The management problems stemmed directly from strategic decisions about the purposes of knowledge development and the

331 organizational form necessary to carry out those purposes. Knowledge development began to come apart organizationally when the problems of conglomerate organization became clear and no solutions were forthcoming from Taggart and his staff. The complex system of intermediaries, external staff, intraagency and interagency agreements was predicated on the accurate assumption that GYP, by itself, could not manage an enterprise of the scale required. The system was a way of dispersing responsibilities among a variety of organizations, while at the same time maintaining a central agenda. Taggart could, in the early stages, by sheer force of personality and intellectual energy, give this system some coherence through his use of research as a management tool. The chief vulnerability of this kind of system, though, is that when the constituent parts begin to have problems, the center is ill equipped to solve them. And when the center becomes overloaded, as it did, with the problems of the constituent parts and with external pressures of the policy agenda, the system begins to shake itself apart into individual projects. The center depends on the constituent pieces to make the system work. But GYP was not able to generate enough capacity in its constituent pieces fast enough, or uniformly enough, to relieve pressure on the center. The political causes of the demise of knowledge development lie, ironically, in the close connection between research and policy. Congress, or at least the House, expected useful answers to the question "What works for whom?" in time for the COTS reauthorization in 1978. The Carter administration, when it finally turned its attention to youth employment, expected support for a new domestic initiative. Taggart's research, program, and policy constituencies expected methodological rigor, sensitivity to administrative constraints, and firm answers to policy questions. How the contradictions among these demands were to be reconciled was a problem his constituents happily left to Taggart. From the beginning, knowledge development was expected to inform policy--early and often. It was not to be a long-distance run with a single finish on some remote horizon; it was to be a series of sprints with the finish lines dictated by political landmarks. Taggart did little to discourage, and much to encourage, this view; it reflected his own belief in the active relationship among policy, program management, and research. The consequences of this close connection between policy and research, however, were twofold. First, as the 1980 election approached, the demands of managing the conglomerate were overwhelmed by the demands of policy making. It wasn't enough for Taggart to focus on making an unwieldy-organization work; he had also to focus on influencing administration policy. Second, when the political agenda shifted, with the election of Ronald Reagan, the ground was cut from under the program. ~ NFLUENCE ON POLICY AND PRACTICE Despite the unfinished work, and the ignominious end, the knowledge development process generated a large volume of research on the subject

of youth employment. One tangible product of this output is the published collection of knowledge development reports, begun before Taggart left and continued through 1980, which contains the planning documents, evaluations, and basic research reports completed before 1981. The complete collection comprises more than 70 reports, from 100 to 400 pages each, color coded by topic, in the form they were received as final products from research contractors, with short introductions explaining how their content relates to the overall set of questions around which the knowledge development process was designed. The idea behind this method of dissemination was to make as much of the knowledge development research as possible quickly available to policy makers, policy analysts, researchers, and practitioners and to synthesize it later. CEIS, at Brandeis, was to play the role of pulling the pieces together around common themes. Because of the abrupt way the process was terminated, the product of the federal investment became those undigested reports, and the Brandeis synthesis continued later under private funding.

OYP began mailing the reports to potential consumers in mid-1980. In some quarters, notably Capitol Hill, this approach had the opposite of its intended effect. Some people, it appears, preferred their research in smaller bites. Nat Semple, former House minority staff member, said facetiously, "We came in one morning and they backed a dump truck up to the building and unloaded a ton of reports. The stuff just wasn't useful."

The same people who ridicule the way the knowledge development reports were disseminated, however, are generally complimentary of the background materials and "lessons from experience" papers that accompanied the Carter administration's youth initiative. These summaries of the first two years of knowledge development were, in the view of Hill staff, written in language understandable to legislators, addressed to issues considered important on the Hill, and generally responsive to questions that arose in consideration of the youth initiative.

Reinforcing this perception from the Hill is the perception of those who worked on the Carter youth initiative. Domestic Policy Staff member Bill Spring argues, "I think there is broad agreement among those of us who worked in the White House that the Youth Initiative was probably the best-run policy-making exercise in the Carter years. It got the right information to the right people, it forced the education folks to talk to the employment and training folks, it forged a broad consensus on how to get at an important problem. You have to say that none of that would have been done in the same way without the knowledge development process to back it up."

Most of the supporting documents for the youth initiative would not be considered "research," in the strictest sense of that term, although they were often couched in the language of "what works." They more often took the form of recommended standards, criteria, funding mechanisms, and institutional arrangements emerging from practical experience. Around these operational issues, a consensus began to emerge in about 1980 that spans partisan loyalties and institutional affiliations. This consensus has had a significant influence on

federal policy. It forms the basis for much of the statutory language and administrative structure surrounding the Job Training Partnership Act (JTPA). Robert Gutman, the Senate majority staff member who took the leading role in drafting JTPA, was also involved in the drafting of YEDPA and in discussions of the Carter youth initiative, in which the same issues surfaced. A wide range of people, from Taggart to the Brandeis staff to the Vice President's Task Force staff to congressional staff, claim credit for influencing the content of JTPA. This consensus is an important indication that YEDPA and its attendant policy activities created an occasion for rethinking the legislative and administrative structure surrounding federal youth employment programs. The consensus has been described in a number of ways in a variety of documents (Taggart, 1981; Hahn and Lerman, 1983; National Council on Employment Policy, 1983, n.d.), but it includes at least the following basic elements:

· Focus on High-Risk Youths. Limited federal resources, the indeterminacy of aggregate unemployment statistics for youths, and the seriousness of the problems faced by economically disadvantaged high school dropouts all argue for a strategy more highly focused on what Taggart calls "the leftovers"--young people excluded from conventional routes to education and employment.

· Deemphasize Income Maintenance, Emphasize Employment. Employment programs should have employment objectives, and income maintenance programs, income maintenance objectives. Training stipends and wage subsidies should be set to encourage unsubsidized employment and reward performance, rather than to provide income.

· Deemphasize Work Experience, Emphasize Basic Skills and Job Training. Work experience should no longer be used as the catch-all solution for the unemployed. It seldom leads to longer-term employment unless it is linked in some systematic way to education and training.

· Use Individualized and Sequenced Programs. Many of the failures of employment and training programs stem from mismatches between the competencies of the trainees and the content of the programs. Programs should be designed to provide basic education, training, and job entry in a sequence and combination that matches the individual's requirements, with intermediate benchmarks to gauge performance along the way.

· Require Performance Standards for Both Individuals and Programs. Employment is the expected outcome of employment programs; therefore, people in the employment and training system should be evaluated and rewarded on the basis of their performance in securing long-term, unsubsidized employment. Intermediate benchmarks are important, but employment outcomes are essential.

· Reward Performers. The absence of positive rewards for both individuals and program operators leads to a focus on the lowest common denominator of participants. Programs should select from the most disadvantaged and reward those who succeed in meeting expectations.

· Use Mainstream Institutions. The isolation of education and training for the disadvantaged from mainstream institutions, especially employers, compounds problems of access. Heavier reliance on apprenticeships and employer-based training decreases barriers to entry.

· Invest in Capacity. The employment and training system has been among the most unstable of domestic service systems. It needs a stable base of federal support and a professional constituency to do its job.

· Expand Incrementally. The employment and training system has been buffeted by a series of dramatic shifts in policy since it was established. New policies should be allowed to mature and develop before they are dramatically altered.

None of these principles seems particularly revolutionary, nor, for that matter, particularly "scientific" or counterintuitive. But measured by the distance between the conventional wisdom of 1976 and that of 1985, rather than by the tenets of social science method, they constitute a profound shift. This shift is attributable largely to the fact that YEDPA focused the attention of researchers, practitioners, and policy makers for a time on connecting practice with policy.

Another indication of the influence of knowledge development is the general perception among policy staff and their bosses that the YIEPP demonstration, the largest and most visible of the knowledge development activities, was successful. Most staff cite the findings of the MDRC reports on entitlement as evidence that well-designed and well-managed programs can have an impact on high-risk youths, that despite start-up difficulties in some settings it is possible to mount a program based on the entitlement principle, that requiring youths to manifest certain commitments and competencies as a condition of support is workable, and that educational institutions must modify their programs to make the entitlement principle work. The fact that the entitlement demonstration did not lead to a full-scale national program is not troubling to most insiders. Nat Semple, the staff person who had probably a larger stake in the entitlement demonstration than most, says, "The political realities of 1981 were that you weren't going to get anything through Congress with the word 'entitlement' in it. Also, there was one serious design flaw in the entitlement program that required attention: the fact that when the subsidy ran out, a lot of employers just gave kids pink slips. The demonstration, though, had effects well beyond the evaluation. It legitimized the idea that standards were fair and effective."

Another view of the influence of knowledge development comes from the Brandeis staff, who remain the single repository of knowledge development products, the main synthesizers of that evidence, and one of the few organizations created by that process that still exist. Andrew Hahn describes the influence this way: "Before YEDPA and knowledge development, people who worked in the youth employment field basically had no common professional identity. One effect of knowledge development was to put a large infusion of resources behind the creation of a professional constituency for youth programs. That is the first step in raising the standards in the field to the point where, as in education, you can start to expect people to perform effectively." Brandeis's current, privately funded work is training and technical assistance for youth practitioners; the network it uses to deliver these services is constructed from people and organizations involved in YEDPA and the knowledge development effort.

Taggart takes yet another view of the influence of knowledge development. By 1979-1980, he had evolved a longer-run strategy for using research and technical assistance to raise the quality of youth employment programs, consistent with his pessimism about the ability of practitioners to discover more effective techniques by themselves. The idea was that the evaluations of discretionary knowledge development projects and the activities of intermediaries would produce a list of discrete program options. Taggart uses the term "cookie cutter programs" to describe these options. Different settings would require different combinations of options, given their youth populations, mix of organizations, and employment problems. The intermediaries would function as technical assistance agents, under contract with localities, to deliver pieces of a program. With the demise of the network of organizations created by the knowledge development process, this mechanism failed to materialize. But Taggart's own activities now focus on the use of computer technology to construct education and training programs for high-risk youths from the available body of packaged curricula. Among his clients are local councils created under JTPA.

Against this relatively sanguine view of the influence of knowledge development is arrayed a more pessimistic view, which takes its point of departure from assessments of the methodological quality of knowledge development activities and their payoff in terms of scientifically verifiable results. Michael Borus, a researcher from Rutgers, has reviewed research on employment programs for high-risk youths, including the Neighborhood Youth Corps, the Job Corps, and a number of discretionary knowledge development projects under YEDPA. He found serious methodological flaws in most impact evaluations of these programs--including low response rates, lack of adequate comparison groups, and rudimentary development of treatments--and little evidence, outside of the Job Corps, of positive effects. He concludes that no progress has been made in creating effective programs and that, because of a lack of methodological and substantive sophistication, policy makers and program administrators continue to make the same mistakes with each new initiative. His recommended solutions include evaluating only fully implemented programs and using "true experimental designs," carefully designed and implemented data collection instruments, benefit-cost analyses of program effects, and planned variations in program design (Borus, 1984).

Between the sanguine view of policy- and program-oriented researchers, on the one hand, and the unrelenting skepticism of the academic research community, on the other, lies a vast gulf of misunderstanding, disagreement, and conflict over what constitutes "useful" knowledge. For policy staff and program-oriented researchers, knowledge is useful when it helps to solve immediate problems that legislators, administrators, and practitioners think need solving. Knowledge comes in many forms--logic, insight, operating skill, political intelligence, and empathy, to name a few--only one of which is social science research. Creating new knowledge depends on a prior investment in programs and institutions to deliver them. Research methods are useful only insofar as they are instrumental in solving

problems; when they get in the way of institution building and problem solving, they should be modified. For the social scientists, knowledge is useful only when it can be verified and replicated with known levels of certainty. Methodological rigor is a prior condition for any useful knowledge. Problem solving, whether in policy or in practice, is meaningless unless it involves the systematic accumulation of replicable research over time.

As noted at the outset of this paper, these disagreements have been rehearsed with monotonous regularity in virtually every large-scale policy research effort since the 1960s, with only a modest recognition of the common ground between the two views. The arguments are complicated by ominous attributions, on both sides, of political agendas and personal ambitions. Policy and program enthusiasts are accused by social science researchers of "advocacy" (as if it were possible to be an effective practitioner without being an advocate) and of using public funds to further private agendas (as if social scientists did not benefit from doing research on program ineffectiveness). Social scientists are accused by policy and program advocates of being chronically in opposition to whatever the prevailing conventional wisdom is and of putting their own peculiar tools of the trade ahead of the interests of regular folks (as if advocates never did the same thing).

These debates are inevitable, in some cases useful, and almost always amusing. But they often do not shed much light on the larger questions of how to make judgments about the investment of the public's money in large-scale research and development enterprises, such as the youth employment knowledge development effort. These larger questions often broach the diffuse and difficult subjects of political, organizational, and management strategy--subjects in which neither social scientists nor policy advocates believe they have a comparative advantage. Yet, as this analysis makes clear, large-scale research and development enterprises succeed or fail based not on people's fervor, commitment, or methodological orthodoxy, but on how skillfully they make strategic decisions.

GUIDANCE FOR THE FUTURE

At the outset, I posed four broad questions raised by the youth employment knowledge development effort: What constitutes "useful" knowledge? What should be the relationship between the delivery of services and the discovery of effects? What are the political and organizational correlates of successful accumulation of knowledge? And what payoffs should we expect from large-scale research, demonstration, and evaluation efforts? In the tentative answers to these questions lie whatever guidance the knowledge development effort has to offer future policy makers, administrators, and researchers.

Useful Knowledge

Running through the knowledge development process is a tension between knowledge acquired through social science and knowledge based on practical insight--a tension between science and ordinary knowledge. When House members asked the Department of Labor to "find out what works," they stated their concerns as a potpourri of questions and problems. Some of those questions, such as how to solve structural unemployment among young people, implied sophisticated, long-term research. Others--the effects of specific training and job-search activities, for example--implied shorter-term project evaluations. The congressional mandate took account neither of the vast differences among those questions nor of the time and resources required to answer them.

The questions specified by the House were, from a research perspective, exceedingly vague. They provided little guidance for what Congress meant by "finding out what works." Assuming that Congress meant rigorous research when it said "find out what works" probably overstates the sophistication of Congress's concern. Congress was more interested in generating a variety of practical activities addressed to youth employment than in setting the conditions for rigorous social research. On this score the House and Senate agreed. The objective was to launch a wide variety of activities and see if they could survive administratively and politically. In the words of a House staff member, "finding out what works" meant "let a thousand flowers bloom," not the conduct of rigorous research. When members of Congress said "find out what works," they had in mind nothing more complicated than demonstrating whether new programs could be instituted administratively and whether young people could find useful work in the process of participating in them. Larger, more sophisticated research questions were embedded in this basic concern, but they were not central to Congress's thinking. With certain routine qualifications, the answer to the questions posed by Congress, after three years of research and demonstration, was "yes." Knowledge of this kind is far from trivial, even though it does not meet many social scientists' theoretical or methodological standards.

Congress had other important items on its agenda beyond finding out what works. Distributive politics--by age, by region, by constituency group, by federal agency, and by level of government--was Congress's major concern. The legislative language and history of YEDPA manifested far more attention to the distribution of money among competing interests than to discovering solutions to youth unemployment. Making the CETA system more responsive to the problems of youths was another agenda item. By targeting youths for special concern, Congress was, in effect, telling the Department of Labor and the CETA system that they had not paid adequate attention to the problems of youths. Still another agenda item was using federal funds to make the schools and the employment and training system work more closely together. From the point of view of certain members, the gap between schools and CETA-funded organizations was inexcusable and should be closed. Each of these items brought with it a collection of problems that Taggart and his staff had to solve in the implementation of YEDPA.

Failing to address these items would have meant failing to respond to the manifest concerns of Congress. One can argue that Congress was irresponsibly vague, that it failed to provide the necessary guidance in structuring a research agenda, and that it undermined the possibility of finding out what works by loading too many other items on the agenda. But these arguments all miss an essential point: congressional action requires coalitions; coalition politics requires vagueness and multiple agenda items. In some instances, as YIEPP illustrates, the demands of coalition politics and the demands of rigorous research are not incompatible. One cannot expect them to be compatible in all, or even most, instances. Ordinary knowledge of politics, in other words, should shape our sense of what we can feasibly expect of Congress in setting the initial conditions of large-scale research on social problems.

Ordinary knowledge of administration also played an important role in the knowledge development process. Federal employment and training programs are administered through units of state and local government, which are in some senses autonomous, but which also assume the delegated authority of the federal government to make contracts for the delivery of services. When a shift in policy creates new demands on that system, these units are entitled to ask a host of practical questions about the consequences of those demands. How should new programs be meshed with existing delivery structures? How should the competing demands of services for youths and adults be sorted out administratively, organizationally, and politically at the local level? If local employment and training programs are supposed to be coordinated with local educational systems, what is acceptable evidence of coordination and how have other jurisdictions responded to the requirement? If young people are to be given clear expectations of performance as a condition for participation in employment programs, what constitutes satisfactory performance and what happens to those young people who do not meet expectations?

Again, these questions are relatively far removed from the conventional social science questions about Treatment A and Treatment B, but they describe knowledge that plays an important role in addressing Congress's concerns about whether new programs can be made to work administratively. Moreover, since the administrative structure is composed not just of functionaries working under contract to the federal government, but also of governors, mayors, legislators, council members, and the like, who are elected officials in their own right, these people are entitled to answers.

If we probe far enough into the administrative structure, we eventually reach the people who call employers to ask if they would be willing to hire a young person, who teach reading, multiplication, and long division to 18-year-old dropouts, who try to find housing for a young man who is sleeping in his car, and who try to find child care for a young woman who is about to leave the welfare rolls and start working as an orderly in a nursing home. These people ask a different order of question. If we add another section to our remedial General Equivalency Diploma course, who will we

get to teach it? If we are expected to get rid of kids whose attendance and academic performance are poor, how do we keep our enrollment at a high enough level to meet our placement objectives? Is there a way to combine the teaching of elementary math with training in the use of common measurement tools? Is it okay to send a bright, but poor and neglected, kid to the academic program at the local community college rather than to a job placement--will it count against our placement results? These questions are also somewhat removed from the Treatment A versus Treatment B questions of social scientists. But if someone cannot answer them, it is highly unlikely that the designs set in motion by Congress will be translated into employment for disadvantaged young people, or that the application of research methods to employment programs will yield information useful to policy makers.

What constitutes "useful knowledge," then, depends on where you stand in the complex system of relationships that operates on the youth employment problem. From this premise, three conclusions follow: First, only a small part of what the system as a whole regards as useful knowledge meets the social scientist's definition of useful knowledge. Second, ordinary knowledge, in the form of answers to practical questions about whether things can be done, is a precondition for more sophisticated forms of knowledge, like that resulting from social experiments. And third, if political and administrative systems fail to accumulate ordinary knowledge, they will, with absolute certainty, fail to accumulate scientific knowledge.

The notion that social problem solving requires the faithful application of social science methods to policy decisions, then, is not so much wrong as it is incomplete. Social science deals in a kind of knowledge that is derivative of, and dependent on, other kinds of knowledge. Failing to distinguish between ordinary knowledge and scientific knowledge, and failing to understand the role that ordinary knowledge plays in the creation of scientific knowledge, is the single largest problem with social science in the service of policy making. As an exercise in the creation and codification of ordinary knowledge, the knowledge development process was a qualified success--at least in the eyes of people who regard ordinary knowledge as important. As an exercise in the application of social science methods to the problem of youth employment, it was less successful, but by no means a complete failure.

Whatever its other defects, the knowledge development process did reflect, in its design and execution, the distinction between ordinary knowledge and scientific knowledge. Taggart observed that the application of social science methods to early YEDPA projects was "researching ineffectuality, not intervention." He observed later that, for all its defects, the knowledge development effort produced more social science on employment questions than any previous federal intervention. His understanding of the limits of the existing delivery system led him to take a skeptical view of the possibilities for experimentation and to focus on creating the prior conditions for scientific knowledge. On the one hand, this focus resulted in what seemed, from the point of view of social science, a disproportionate investment in activities that did not produce "results" in the form of clear treatment-control

comparisons. On the other hand, the focus seems far more troubling to social scientists than it does to other actors in the process, including the Congress, which authorized the program to start with.

At a minimum, then, it seems that future large-scale employment research and demonstration projects should begin with a frank acknowledgment that experimentation is the final stage of some larger effort to codify ordinary knowledge, not the first step in finding out what works. Doing research and demonstration projects involves a large-scale investment over a long period of time in creating a conventional wisdom, translating it into structures and beliefs and behavior, and then (after a fashion) subjecting it to some sort of rigorous empirical test. Beyond this minimum condition, it seems reasonable to promote actively the notion that different levels of knowledge are required to mount large-scale research and demonstration projects, and that doing research is only one way of gathering the necessary knowledge. Simple expedients are often the most effective: practitioners' workshops, regularly scheduled congressional visits to pilot projects, and head-to-head discussions among administrators, practitioners, and researchers. All of these, and more, occurred in the knowledge development process. Whether, in retrospect, they are understood as legitimate parts of knowledge development is problematical. When the results of the knowledge development process are culled for "hard" conclusions about what works, these parts of the process are often lost.

Delivering Services and Discovering Effects

Another important tension running through the knowledge development process is that between delivering services to constituents and tracing the long-term benefits of those services for disadvantaged youths and for society at large. Most descriptions of YEDPA begin with the statement that its purpose was to find out what works in getting high-risk, disadvantaged youths into the labor market. As we have seen, this is not so much an inaccurate reading of the intent of Congress as it is an incomplete one. Certain members of the House had a genuine interest in finding out what works, but that interest was also rooted in a politically motivated desire to restrain the Senate's enthusiasm for spinning out new programs. Most key senators thought they knew what to do and saw YEDPA as the vehicle for doing it. The compromise between the House and Senate incorporated both the House's tentativeness and the Senate's commitment to specific solutions. More importantly, though, Congress's charge to DOL made clear that the new resources were to be deployed to support the network of constituencies that had grown up around employment and training programs. If DOL failed in that mission, the issue of "what works?" would be moot, since there would be no political constituency to support youth programs in the next round of congressional debate. While finding out what works was an important purpose of YEDPA, delivering services to political constituencies, state and local jurisdictions, employment and training organizations, and disadvantaged youths was instrumental to that

purpose. Research and development without a political constituency is of little practical use to elected policy makers.

Most of the money spent on knowledge development was not spent on research. It was spent on providing jobs, training, and education to disadvantaged youths. Most of the decisions about which organizations would receive YEDPA discretionary funds were not based on the proven research capacity of those organizations, or even on the expected payoff of the funds in research results. In fact, most organizational recipients were chosen on the basis of the constituencies they represented. Within the vast collection of projects that knowledge development comprised were a limited number of projects chosen explicitly for their research value--some on the basis of congressional intent, some on the basis of OYP's policy research agenda. It was in this limited array of projects that the research payoff of knowledge development was to occur. One can argue about whether the research agenda was well formulated, whether the right projects were chosen and developed in the right ways, whether the proportion of constituency-based projects was too large, or whether the right organizations were represented in the constituency-based projects. But it is difficult to argue with the fact that most of what goes on in research and development activities of the scale represented by YEDPA consists of delivering services to constituents, not doing research. It is also difficult to argue with the fact that creating political constituencies is an important part of the process of getting from research to a change in policy.

This intimate connection between delivering constituent services and discovering effects did not elude Congress, nor did it elude Taggart when he deployed YEDPA discretionary money. It did, however, seem to elude many of the social scientists and policy analysts who criticized the knowledge development effort. The confusion between "advocacy" and "research" troubled some, as did the raggle-taggle quality of the research in many of the demonstration projects. Anxious to show that social science could deliver clear, policy-relevant guidance, they failed to see that the delivery of services was driving research, not vice versa.

There is a vicious paradox in the use of social science rhetoric to justify social intervention. YEDPA is described as an attempt to find out what works, when in fact it was an attempt to deliver services to constituents while at the same time finding out what works. Because many people, even the politically sophisticated who presumably know better, accept that the primary purpose was to find out what works, the "mere" delivery of services becomes tainted. It is not enough to get the money out to the right people and to get the right organizations involved in searching for solutions to the problems of disadvantaged youths. If the delivery of services does not add significant new knowledge to social science, or provide solutions to the problem of structural unemployment, it is a failure. Anything short of significant new social science knowledge is just pork barrel. There is nothing wrong with aspiring to significant new social science knowledge, or to long-term solutions to structural unemployment. The problem occurs when, aspiring to these things, we conclude that merely providing jobs, training, and education to disadvantaged youths, and merely building a

professional constituency with an interest in providing those services, means that policies have failed. When this happens, the gulf between science and politics widens irreparably.

The fact that we find it easy to discredit interventions that merely deliver services, but difficult to find scientifically valid solutions to chronic social problems, may mean that we have gotten too sophisticated in using the rhetoric of social science to justify social intervention. Until the "solutions" come along, we may simply need to do a better job of delivering services. Rather than arguing that large-scale social interventions will result in solutions to chronic problems, we may want to say that, while we are working on the chronic problems, we intend to see that some number of disadvantaged young people get access to jobs, training, and education. If we fail at the more ambitious task of finding scientifically valid solutions, we have at least succeeded in delivering services and in creating a constituency committed to the search for solutions.

In practical terms, researchers and policy makers alike should moderate their use of social science rhetoric to justify social intervention. Finding out what works, in the scientific sense, requires a long-term investment in practical knowledge as well as research. If that investment is not possible, then we should not expect to find solutions to chronic social problems. In the meantime, merely delivering services may be the best we can do.

Political and Organizational Correlates

Laurence Lynn (1981b), in his book Managing the Public's Business, argues that the alleged failures of public management are as much a result of poorly framed policies as of incompetent administrators. The initial conditions set for public servants often make their success unlikely. There is probably no better illustration of this argument than YEDPA. DOL was given four new youth programs to implement. It was directed to expand two existing programs dramatically, and it was given a large amount of discretionary money to find out what works for disadvantaged youths--all with a 1-year authorization. The programs were reauthorized in 1978, but by that time the Carter administration had launched the Vice President's Task Force on Youth Employment, with instructions to produce a new youth employment policy by the following year. The pressure mounted within the administration to produce results that simply were not there. By 1980, as the YEDPA research and demonstration agenda was beginning to produce results, the presidential election brought a reversal of the mandate under which YEDPA was launched. Each of these events can be explained by the logic of electoral politics. Electoral politics is what makes policy research possible. But, against this background, it should surprise no one that the results of YEDPA knowledge development fell short of expectations.

In a practical sense, there was little anyone in DOL could do to control the volume or the pace of the political demands under which they were operating. No Secretary of Labor in his right mind would tell the leading members of the U.S. Senate on both sides of the aisle that

they should scale back their ambitions. DOL faced the choice of participating in the authorization of YEDPA and sharing in the credit, or not participating and getting the program anyway. Nor would a sensible secretary discourage the President from making his department's program a central domestic initiative in the next campaign. There were, strictly speaking, no solutions to the problem of external demands on YEDPA, only adaptations. These adaptations carried a high cost, both to the delivery system and to the production of useful knowledge.

The political lesson from YEDPA is relatively clear, although probably not very helpful. The scale of the enterprise was incompatible with the pace of external demands. A research and demonstration effort, without the complex structure of operating programs, could have produced modest, short-term results within the amount of time available. A number of new operating programs could have been launched, with limited payoff in terms of new research and development. But both demands together were incompatible with the time and institutional capacity available. It is instructive that the entitlement demonstration, the one piece of the knowledge development effort that had a relatively clear mandate, a finite research agenda, and a considerable amount of institutional research capacity behind it, came the closest to meeting congressional and executive expectations. It is also instructive that the Job Corps, the federal youth program with the greatest institutional maturity, the longest history of trial and error (in both the political and experimental sense), and the most sustained evaluation, is the example that most policy makers reach for when they try to define successful employment policy. The more diffuse the mandate, the more complex the research agenda, and the less well defined and mature the institutional capacity of the delivery system, the more difficult it is to deliver services and do research on them. The fact that the knowledge development effort produced as much as it did is testimony to the ability of many people to operate under heavy expectations and unreasonable time constraints.

On the organizational side, two main facts stand out: the lack of capacity within DOL to manage an effort of the scale required by the YEDPA mandate, and the lack of explicit consideration of organizational alternatives to the one finally chosen. The lack of capacity is as much a commentary on the nature of federally managed programs as on the qualifications of DOL/OYP staff. There were limits to how much research expertise one could expect people with essentially programmatic backgrounds to bring to their jobs. But even with the best-qualified federal staff, running a large-scale federal research and development program is an exercise in indirect management. The programs are administered by people whose main interest is in delivering services; the research and evaluation are done by people whose main interest is in doing research. The job of the federal administrator, in this set of relationships, is to mediate conflicting interests and to use financial and regulatory incentives to get other people to do their jobs. As Taggart can testify, this is devilishly difficult work for which few people are equipped by experience or training.
The more complex the system of administrative relationships, the more skill required to manage it, and the less uniform one can expect the results to be.

In other words, "lack of capacity" can mean both lack of qualified staff and lack of direct control. Taggart's administrative strategy for dealing with limited capacity was to create capacity in other organizations and manage them from the center. It was well suited to OYP's capacity, in both senses of the term. But it had the weakness of all such strategies--it was vulnerable to variability at the periphery. Some external alliances worked well, because they were well organized and well staffed; others did not. If there are too many cases of the latter, the system becomes difficult to manage from the center. The solutions to this problem lie either in working on a much smaller scale--an alternative not really available under YEDPA--or in generating more capacity on the periphery--something that takes time to do.

The lack of an explicit consideration of organizational alternatives to the one that evolved is not unusual in federal agencies. No one in the executive branch specializes in thinking about alternative ways to organize complicated undertakings. DOL and other executive actors with an interest in YEDPA were preoccupied with larger issues at the beginning of the effort. Taggart was not the sort either to pose alternatives or to stand back and wait while others did. He did what he considered necessary: he consolidated program operations, research, and evaluation in OYP. From Taggart's point of view this was the best solution. It is not clear, however, that it was the best solution from the point of view of DOL, Congress, or the executive branch. Neither is it clear that any of the alternatives for dispersing YEDPA authority among other DOL units would have worked any better. The lesson is not that there was a better way to organize knowledge development. The lesson is, rather, that the decision of how to organize such an effort is probably the most important high-level executive decision that cabinet-rank officials face. It merits careful analysis. It did not get that analysis in this instance.

Payoffs

A few conclusions about the expected payoffs of large-scale research and development efforts like YEDPA follow from this analysis. The first is that, especially when solutions to chronic social problems involve changes in existing institutions or the creation of new ones, ordinary knowledge is a prior condition to the creation of scientific knowledge. Administrators and practitioners need to know what to do, or what to do differently, in the most practical sense, before they can begin to act in systematically different ways. Legislators need to know whether programs can be administered and whether benefits can be delivered before they can make judgments about whether broader social problems can be solved. Social science methods, by themselves, do not deliver this knowledge. Investing in useful knowledge, then, entails investing as much in simple information, practical intelligence, and networks of communication as in research and evaluation. Second, there is a serious danger in justifying new policies on the basis that they will increase our knowledge of how to solve chronic problems, rather than merely delivering services to constituencies and individuals. If the

problems turn out to be resistant to social science inquiry, as they usually do, the failure of research discredits the delivery of services. Third, there is little anyone can do to limit the effect of shifts in the political environment on large-scale research and demonstration efforts; but if the complexity of the enterprise is inconsistent with the time constraints imposed by shifting political priorities, the blame for failures should be shared equally by elected officials and administrators. Fourth, one element of large-scale research and development efforts that is subject to executive control is their organization. Initial decisions about how to organize large-scale efforts should be subjected to explicit analysis and high-level executive scrutiny: What capacity is required? What organizations have the required capacity? What capacity needs to be developed? What incentives are available for mobilizing that capacity?

ACKNOWLEDGMENTS

I would like to acknowledge the assistance of Charles Betsey, study director for the Committee on Youth Employment Programs, and Robinson Hollister, chair of the committee, in preparing this paper. Members of the committee provided useful comments and criticism at an early stage of the research. Special thanks also to Bernard Anderson, Gordon Berlin, Michael Borus, Seymour Brandwein, Erik Butler, David Cohen, Fred Fischer, Tom Glynn, Andrew Hahn, George Iden, Arnold Packer, Dan Saks, Nat Semple, and Bill Spring for comments on earlier drafts, though they bear no responsibility for the final product. This paper is based in part on a series of interviews with participants in YEDPA program development and policy-making processes. Their willingness to discuss events and share insights contributed greatly to the paper.

REFERENCES

Borus, M.
1984 Why Do We Keep Inventing Square Wheels? What We Know and Don't Know About Remedial Employment and Training Programs for High School Dropouts. Unpublished paper. Manpower Demonstration Research Corporation, New York City.

Butler, E., and J. Darr
1979 Lessons from Experience: An Interim Review of the Youth Employment and Demonstration Projects Act of 1977. Center for Public Service. Waltham, Mass.: Brandeis University.

Hahn, A.
1979 Taking stock of YEDPA: the federal youth employment initiatives, Part I. Youth and Society 2(2):237-261.

Hahn, A., and R. Lerman
1983 The CETA Youth Employment Record. Washington, D.C.: U.S. Department of Labor.

Hargrove, E., and G. Dean
1980 The bureaucratic politics of evaluation: a case study of the Department of Labor. Public Administration Review 40(March/April):150-159.

Lindblom, C., and D. Cohen
1979 Usable Knowledge: Social Science and Social Problem Solving. New Haven, Conn.: Yale University Press.

Lowry, J.H., and Associates
1979-1980 Determining the Viability of Intermediary Non-Profit Corporations for Youth Programming. 4 vols. Chicago, Ill.: James H. Lowry and Associates.

Lynn, L.
1981a The President as Policymaker: Jimmy Carter and Welfare Reform. Philadelphia, Pa.: Temple University Press.
1981b Managing the Public's Business. New York: Basic Books.

National Council on Employment Policy
1983 Back to Basics Under JTPA. Washington, D.C.: National Council on Employment Policy.
n.d. Investing in America's Future. Alexandria, Va.: Remediation and Training Institute.

Peters, T., and R. Waterman
1982 In Search of Excellence. New York: Harper and Row.

Rein, M., and S. White
1977 Policy research: belief and doubt. Policy Analysis 3:239-271.

Rivlin, A.
1971 Systematic Thinking for Social Action. Washington, D.C.: The Brookings Institution.

Salamon, L.
1981 Rethinking public management: third-party government and the changing forms of government action. Public Policy 24(3):255-276.

Taggart, R.
1980 Youth Knowledge Development: The Process and Product. Unpublished paper.
1981 A Fisherman's Guide: An Assessment of Training and Remediation Strategies. Kalamazoo, Mich.: Upjohn Institute for Employment Research.

U.S. Congress
1977 Conference Report 95-456. June 22, 1977.

U.S. Congress, House of Representatives, Committee on Education and Labor
1977 Youth Employment Innovative Demonstration Projects Act of 1977. Report 95-314. June 22, 1977.

U.S. Congress, Senate, Committee on Human Resources
1977 Youth Employment Act. Report 95-173. May 16, 1977.

U.S. Department of Labor
1978 Employment and Training Report of the President. Washington, D.C.: U.S. Department of Labor.
1979 Employment and Training Report of the President. Washington, D.C.: U.S. Department of Labor.

1980a Employment and Training Report of the President. Washington, D.C.: U.S. Department of Labor.
1980b Knowledge Development Activities, Fiscal Years 1978-1979. Office of Youth Programs, Youth Knowledge Development Report 1.2. Washington, D.C.: U.S. Department of Labor.
1980c Knowledge Development Agenda. Office of Youth Programs, Youth Knowledge Development Report 1.1. Washington, D.C.: U.S. Department of Labor.
1980e Proceedings of an Overview Conference. Office of Youth Programs, Youth Knowledge Development Report 1.3. Washington, D.C.: U.S. Department of Labor.
1981 Employment and Training Report of the President. Washington, D.C.: U.S. Department of Labor.
1982 Employment and Training Report of the President. Washington, D.C.: U.S. Department of Labor.
