The Education of Public Health Professionals in the 20th Century
Over the past 50 years or more, many reports and conference proceedings have discussed the nation’s system of public health education. In general, these tend to deplore the state of public health education and the inadequate preparation of the public health “workforce.” Recently, Kristine Gebbie crisply summed up the contemporary state of the discussion in her editorial, “The Public Health Workforce: Key to Public Health Infrastructure.”1 A longer version of the argument2 joins a series of recent publications and manifestos on the problems of public health education.3,4 These in turn appear to derive some of their general framework from the rather unflattering view of public health encapsulated in the Institute of Medicine’s report of 1988 on The Future of Public Health.5 Briefly characterized, these various analyses assert that public health departments are poorly staffed, and that many of the people working in them lack the specific skills, qualifications, and abilities they need to fulfill their responsibilities of protecting the public health. The faculty members of public health schools, for their part, are busy doing research, and training students to do research, but they are failing to turn out the highly educated labor pool needed to adequately staff the public health departments of the future. Phrased another way, the “theory” of public health as taught in the academy does not cohere tightly with its “practice” as performed in state and local health departments. Public health “leadership” is said to be needed to connect the fragmented pieces by taking the knowledge produced in the schools and applying it in the “laboratory” of people’s lives.
Within schools of public health, most faculty members are scientists and researchers with a Ph.D. degree. Few have any work experience outside of academia, much less in city or state health departments. Not surprisingly, they have little interest in becoming engaged with the practical work of public health agencies. Many, especially in the laboratory-centered disciplines, have little knowledge of, or interest in, politics or policy, or they regard politics as merely some distasteful contaminant of an otherwise orderly search for knowledge. Even social and behavioral scientists are often more interested in their statistical methodologies than in the messy arts of organization, advocacy, and policy-making. They shy away from the popular media, television cameras, news magazines, street demonstrations—among the various modes of informing, shaping, and challenging public opinion—as perhaps undignified and definitely distracting. Nor are they often to be found in the schools, clinics, churches, and community organizations of the decaying sections of the cities in which they work.
From the point of view of the faculty of public health schools and programs, there is little time for the multiplicity of things they are already being pressured to do. To be required to raise the best part of one’s own salary, and to write grants to cover research assistants, secretaries, students, equipment, or other research needs, focuses the mind admirably. All other activities become luxuries. To be successful in the research funding world requires associated and time-consuming commitments: to read the work of one’s colleagues, to review other people’s grant applications, to publish on a regular basis, to participate in academic and professional meetings, to have pieces of one’s time scattered across other people’s projects in case one’s own project lacks sufficient funding. None of this allows much leisure for intellectual or political activities that are not directly related to the research agenda, such as exploring the messy world of community organizations or writing for popular, as opposed to scientific, journals. It is only on rare occasions and more or less by accident that schools of public health harbor public intellectuals or effective public advocates for the public’s health.
If schools of public health have become mainly research institutes, where students learn the art of preparing grant proposals and writing scientific articles, what about the local departments of public health? In general, these are staffed by people with little public health training—people who learn the processes and problems of public health on the job. Some have scientific, medical, nursing, or engineering degrees that may be relevant to their work, but the matching of credentials to tasks is often haphazard. Certainly, there is no assumption that all members of a local health department will be graduates of an accredited school of public health. Salaries in public health are low and political pressures are often strong; many public health departments survive in a more or less permanent state of crisis, coping with the last budget cut and waiting for the next one. Their contact with the schools of public health is likely to be sporadic—a lecture series here and there, an occasional joint project.
If there is indeed something lacking in the structure and processes of public health education, then, from the historian’s perspective, it is useful to find out when the problem started. Has it always been thus? How did this state of things come to pass? What forces are responsible for the peculiar disjuncture between schools of public health and the departments of public health where the work of public health gets done? In order to explore these questions, we need to examine the two general phases of public health education in America: first, the phase of private funding by the great philanthropies, when independent schools of public health were first created; and second, the period of federal and state funding. Although there is overlap between these two phases, it seems reasonable to date the first as 1914–1939, and the second as 1935 to the present. Phase two also included the wartime programs in public health funded by the armed services.
After the war, as in other sectors of the economy, there was a long era of postwar expansion, with smaller bumps and recessions along the way. Overall, funding for public health education has been on an upward trajectory but the development has been uneven; wavelike patterns of expansion and retrenchment make for instability and great difficulty in planning. If health departments have often lurched from crisis to crisis, schools of public health have accustomed themselves to an often erratic funding cycle, with sudden infusions of funds for special areas of concentration, political shifts and cutbacks, and the giving and taking away again of grants and training funds. The miracle of it all is that so many excellent and talented students pass through, are educated, and receive credentials, before emerging into the intersecting worlds of government agencies, voluntary associations, foundations, academia, international organizations, and managed care companies.
THE FOUNDING OF SCHOOLS OF PUBLIC HEALTH
The first independent schools of public health in the United States were funded and nurtured by the Rockefeller Foundation. Rockefeller philanthropies were by far the largest and most important in terms of their influence on public health education, so I will focus on them here, but it is notable that other foundations, such as Commonwealth, Kellogg, and Milbank, were also extremely involved in and supportive of public health education during the interwar years. Not until 1935 did the federal government provide any significant level of funding for public health education.
To set the context for the recurring struggles over public health education, it may be helpful to note that medical schools had proliferated throughout the 19th century because they were economically advantageous to both faculty and students. A few faculty members could get together, create a medical school, and charge tuition; assuming the fees were not too high, nor the entrance requirements too strict, the students would come. Then as now, medical students were making a wise investment in their future earnings. Schools of nursing, by contrast, were created by hospitals that needed a well-trained and well-behaved labor force to staff their wards; the hospitals thus had an economic interest in creating their own diploma schools. Once the nursing profession was more fully established, universities found that women students (or their families) were willing to pay tuition as an investment in a respected female career. In the case of public health, however, by the later 19th century, when cities and states were calling for public health officers, there were no established career patterns. Public health leaders were generally people like Hermann Biggs or Josephine Baker—physicians whose lucrative private medical practices allowed them to devote themselves to the public’s health as a largely voluntary activity. The rank and file of public health officers were simply practicing physicians who could be called out in times of crisis to assist in coping with epidemic diseases, but who were otherwise fully involved in caring for their own patients. Municipalities employed a variety of health inspectors and street cleaners, but these were largely untrained and often unreliable workers, many of whom obtained their positions through political patronage.
It was thus the leaders of the Rockefeller philanthropies who, in the early 20th century, set themselves the task of creating a public health profession. The Rockefeller officers became involved in public health education because of their experience with the hookworm eradication campaign in the southern United States. The hookworm eradication campaign was part of a massive program to modernize the South—besides building railroads and factories, the representatives of northern capital would raise the productivity of the rural southern workforce by eliminating the “germ of laziness.”6 This was a perfectly logical approach because hookworm infestation produces anemia and thus decreases the population’s ability to work; a healthier workforce would indeed be more productive.
Members of the Rockefeller Sanitary Commission’s staff had initially assumed that they could rely on public health officers in the southern states to help carry out their program. But to their distress, they found these part-time health officers displayed little interest in or dedication to the task. Rural southern physicians disliked the northern Yankees, resented being ordered about, and generally refused to believe that hookworm was a serious problem. Wickliffe Rose, the architect and organizer of the Rockefeller Sanitary Commission, came to believe that a new profession was needed—separate from medicine—composed of men and women who would devote their whole careers to the control of disease. Rose insisted that there must be two professions: medicine, for treating disease at an individual level, and public health, for controlling disease and promoting health at a population level.
Rose turned to Abraham Flexner whose “Flexner Report” of 1910 had been central to the reorganization of American medical education.7 Flexner was then head of the General Education Board, the Rockefeller organization responsible for education programs. Flexner was involved in a struggle to make medical school professors “full-time” faculty—to separate teaching and research from private practice so that professors would be able to devote their entire attention to their academic pursuits. To Rose, the problem of part-time health officers appeared in a similar light: public health practitioners should be “full-time” so that they would devote their whole attention to the needs of public health and not be distracted by the demands of private practice.
Flexner found that Rose’s concerns were widely shared by prominent leaders in public health. Indeed, the Massachusetts Institute of Technology and Harvard University had already put together an impressive curriculum for training health officers in communicable diseases, sanitary engineering, preventive medicine, demography, public health administration, sanitary biology, and sanitary chemistry.8 Students generally entered with professional degrees—they could be engineers or physicians—and completed a two- or three-year course of additional study before receiving a certificate in public health. The combined program graduated a small number of highly trained health officers each year.
Hearing about the interest of the General Education Board, and hoping for some of the Rockefeller largesse, several universities submitted competing proposals for a school of public health. Harvard University naturally thought that the project could best be entrusted to it, and had in mind an expanded School for Health Officers. Charles-Edward A. Winslow, however, argued in favor of a school in New York City that would focus on training public health nurses, sanitary inspectors, and health officers for small towns—the rank and file of the profession, not just the most highly educated elite. Wickliffe Rose agreed that one or two schools could be established and asked Abraham Flexner to organize a planning conference for October 1914.9
Columbia University now submitted a plan for a school—combining medical, engineering, and social science courses—to be established in New York. The Columbia plan especially emphasized the social and political sciences, in contrast to the more usual emphasis upon biological sciences and sanitary engineering. In the discussions that followed, three competing conceptions of public health emerged: the engineering or environmental approach, the sociopolitical, and the biomedical. In the end, the biomedical approach would dominate, with sociopolitical and environmental concerns relegated to a very subsidiary role.
Wickliffe Rose asked Abraham Flexner to consult with medical school professors, members of the newly formed United States Public Health Service, the medical departments of the army and navy, state and city health departments, registrars of vital statistics, representatives of life insurance companies, and health managers of large industries. Flexner, however, preferred to rely on the advice of a few trusted friends and never consulted most of these varied experts. Instead, he brought together a group of 20: 11 public health representatives and 9 Rockefeller trustees and officers for a one-day meeting on October 16, 1914. The decisions made during that conference would shape public health education for the next 25 years.
First was the question of the types of practitioners for whom training was needed. Hermann Biggs, the health commissioner of New York state, declared that there were essentially three classes of public health officers. The “health officials of the first class,” were those with executive authority such as city and state health commissioners. The health officials of the “second class” were the technical experts in specific fields: bacteriologists, statisticians, engineers, chemists, and epidemiologists who would run health department programs and conduct research. The “third class,” the “subordinates” or “actual field workers,” were the local health officials,
factory and food inspectors, and public health nurses. Members of this last and most numerous group would be the “foot soldiers” in the war against disease.
The most difficult question was whether the “first class” officials had to be medical men. If public health were to become a full-time career, was it reasonable to suppose that physicians would be willing to give up their independence to become salaried employees? As a consequence of the Flexner reforms in medical education, physicians’ incomes were rising sharply, so it was hardly a propitious time to expect a large influx of doctors into public health. But William Henry Welch of Johns Hopkins brushed these concerns aside, stating—as it would turn out, with excessive optimism—that physicians would be eager for the “splendid opportunity” of education in public health. Hermann Biggs argued in vain that the requirement of a medical degree was unrealistic, for most of those present at the meeting believed that only medically qualified health officers would be able to gain the cooperation of medical men in the community. Already, the potential for conflict between medical men and public health officers was evident to these experienced observers but the proposed solution—to make public health officers medical men—would prove ineffective. It did not address the real source of the conflict and ignored the looming contradiction between the interests of the majority of the medical profession, engaged in fee-for-service private practice, and a new minority group of salaried public health doctors.
At the October conference, Wickliffe Rose laid out a carefully articulated vision of the future of public health education. At the center he placed a scientific school, well endowed for research. This school would belong to a university but be independent—specifically, it would not be a department of a medical school. Students attending the school would be selected from across the country and its graduates would be carefully placed in strategic positions throughout the United States. This central scientific school would be linked to simpler schools of public health to be established in every state; these state schools would focus on teaching rather than on research. The state schools would in turn be affiliated with medical schools and with state health departments and would offer short training courses for health officers already in the field. Following the pattern of the agricultural extension courses and farm demonstration programs that the Rockefeller Foundation had already used to modernize agriculture in the southern states, they would offer extension services for rural health education.10 Both central and state schools would teach public education methods and seek to extend public health information to the entire population. The central school would take the whole country as its
“field of operations,” sending out “an army of workers” to demonstrate the best methods of public health, and bringing back their practical experience to be “assembled and capitalized” at the center of operations.11
Rose and Welch were given the task of writing up this draft plan to be mailed to the meeting participants for their criticisms and suggestions. Rose now outlined a memorandum entitled “School of Public Health,” and Welch countered—at the last possible minute—with a plan for an “Institute of Hygiene.”12 Because of Welch’s perhaps unconscious procrastination, there was no time to circulate this document to the meeting participants before its official presentation to the General Education Board; although Rose himself had not had time to review the draft, it was presented as the “Welch-Rose Report.” As I have previously argued, Welch’s version of the plan was more oriented to scientific research than was Rose’s more practice-oriented model; Welch’s version dropped almost all mention of Rose’s system of state schools, practical demonstrations, and extension courses.13 Enthusiastic paragraphs about the need for an army of public health nurses and special inspectors had been eliminated; instead, Welch dwelled happily on the development of “the science of hygiene in all its branches” that would be the focus of the central school of public health. He dropped Rose’s phrases about the divergent aims of medicine and public health and instead suggested that the new school of public health should be close to a good teaching hospital.
Some of the participants at the October conference and other public health leaders complained that Welch’s version of the report was closer to the German than to the English conception of public health. In other words, the focus on research largely ignored public health practice, administration, public health nursing, and health education. The medical side of public health was emphasized to the virtual exclusion of its social and economic context; no mention was made of the political sciences or of the need to plan for social or economic reforms. Public health was to be biomedical, not social in orientation. Abraham Flexner, who greatly admired Welch, brushed aside all such objections and subtly maneuvered the decision-making process towards Welch’s ideas and the selection of Johns Hopkins University as the site of the first endowed school of public health. The Johns Hopkins School of Hygiene and Public Health opened its doors to its first class of students during the influenza epidemic of 1918. Only later did the Rockefeller officials agree to provide funding for other schools of public health, most notably at Harvard and Toronto.
Wickliffe Rose’s grand conception of a network of state schools with extension agents fanning out into the countryside, major emphases on public health education, short courses and extension courses to upgrade the skills of health officers in the field, and demonstrations of best practices in public health was not implemented by the Rockefeller Foundation—although much would later come into being, albeit in a more haphazard and less carefully planned fashion. For most of the Rockefeller men of that era, it made sense to start at the top, create one or two elite schools of public health, and let the rest flow from the center. Had the emphasis on modernization and increasing worker productivity that had been characteristic themes of the hookworm eradication program been maintained as the central motive and justification for public health campaigns, perhaps other private interests would have helped bankroll the rest of Rose’s initial vision. But as history turned out, it would take the crisis of the Depression and the creative responses of the New Deal to impel the next major leap forward in public health education.
The first schools of public health (Johns Hopkins, Harvard, Columbia, and Yale) tended for the most part to follow the model set by the Hopkins school. They were well-endowed private institutions with high admission standards; they favored medical graduates, and often admitted rather distinguished mid-career people already experienced in public health. In the 1920s and early 1930s, the curricula of the schools tended to be heavily weighted toward the laboratory sciences: bacteriology, parasitology, immunology, and what was called “physiological hygiene,” along with instruction in epidemiology, vital statistics, and public health administration. The main emphasis was on infectious diseases, with some attention to nutrition (biochemistry), water quality, and occupational hazards. In the 1920s, little was attempted in the way of field practice but this was, perhaps, relatively unimportant as so many of the students were already experienced practitioners. The Rockefeller Foundation gave fellowships to medical graduates around the world who were interested in studying public health, so that from the beginning, the schools tended to have an international flavor. The Foundation would later use these graduates to help establish schools of public health in Brazil, Bulgaria, Canada, Czechoslovakia, England, Hungary, India, Italy, Japan, Norway, the Philippines, Poland, Rumania, Sweden, Turkey, and Yugoslavia.
The Rockefeller Foundation also tried to convince the schools to establish programs of field training. Using the model of medical school education, the students, they argued, should learn to practice in the community much as medical students learned their art in the wards of a hospital. Johns Hopkins under Welch had been reluctant to pay much attention to practical training but in the 1930s, with additional funding from the Rockefeller Foundation, Hopkins did establish the Eastern Health District, consisting of a study population of about 100,000 people
living in the neighborhoods around the School of Hygiene. These families were intensively studied through a house-to-house health census every three years; as a local newspaper described the population, “They are, by all odds, the most interrogated, surveyed, investigated, and card-indexed citizens of Baltimore—and probably of the 48 states, Alaska, Hawaii, Puerto Rico, and the Philippines.”14 Many of the Hopkins doctoral students wrote their dissertations on some aspect of the health of this population.
By 1930, the first schools of public health were turning out a small number of graduates with a sophisticated scientific education. The schools, however, were doing little or nothing to turn out the large numbers of public health officers, nurses, and sanitarians needed across the nation. In 1932, the American Public Health Association established a Committee on Professional Education chaired by Waller S. Leathers, Dean of the Vanderbilt Medical School, which included many of the then leading names in public health circles, such as Thomas Parran, W.G. Smillie, Allen Freeman, and Huntington Williams, among others. This committee prepared 20 reports on the educational qualifications of 15 professional specialists, and ultimately distributed some 250,000 copies of these reports.15 The aim of this very considerable effort was to inform state and local health departments about the types of employees they should be seeking and the kinds of qualifications appropriate for each, with the idea of creating national standards that, if used by the multiplicity of local health departments, could create some degree of uniformity across the nation.
FEDERAL FUNDING FOR PUBLIC HEALTH EDUCATION
A major stimulus to the further development of public health education came in response to the Depression, with the New Deal and the Social Security Act of 1935. The Social Security Act expanded financing of the Public Health Service and provided federal grants to the states to assist them in developing their public health services. Federal and state expenditures for public health actually doubled in the decade of the Depression.
Federal law required each state to establish minimal qualifications for health personnel employed through federal assistance, and recommended at least one year of graduate education at an approved school of public health. For the first time, the federal government provided funds, administered through the states, for public health training. Overall, the states budgeted for more than 1,500 public health trainees, and the existing
training programs were soon filled to capacity. As a result of the growing demand for public health credentials, several state universities began new schools or divisions of public health and existing schools of public health expanded their enrollments.
In 1936, the American Public Health Association reported that 10 schools offered public health degrees or certificates requiring at least one year of residence; of these, the largest were Johns Hopkins, Harvard, Columbia, and Michigan.16 Also offering degrees in public health were the universities of California at Berkeley, Massachusetts Institute of Technology, Minnesota, Pennsylvania, Wayne State, and Yale. By 1938, more than 4,000 people, including about 1,000 doctors, had received some public health training with funds provided by the federal government through the states. The economic difficulties of maintaining a private practice during the Depression had pushed some physicians into public health; others were attracted by the availability of fellowships or by increased social awareness of the plight of the poor and of their need for public health services. In 1939, the federal government allotted over $21 million for public health programs: $8 million for maternal and child health, $9 million for general public health work, and $4 million for venereal disease control.
Of course, many students and health departments desired the most efficient and least time-consuming process of credentialing they could find. The market favored programs that could produce the largest numbers of graduates in the least amount of time. When there were not enough places in schools of public health to supply the need, many colleges and universities opened public health departments and programs, some offering training courses of just a few months’ or even a few weeks’ duration. Engineering programs turned out sanitary engineers by the score. Summer sessions in public health nursing at Berkeley, Michigan, Minnesota, Columbia, Syracuse, Western Reserve, and several other universities produced over 3,000 graduates annually. These short programs offered a variety of diplomas and certificates in public health; by 1939, 45 institutions were offering 18 different degrees, certificates, and diplomas in public health. Of these 45, 10 were independent schools of public health, 20 were colleges and universities offering programs in public health nursing, and 12 were engineering colleges offering programs in sanitary engineering.
Despite a great expansion of public health training facilities, there were still far from enough graduates to meet the demand. Federal training funds were now allotted to California, Michigan, Minnesota, Vanderbilt, and North Carolina to develop short courses for the rapid training of
public health personnel. These short courses were recognized as emergency measures until the schools were able to develop more adequate graduate educational programs. Perhaps not surprisingly, the faculty of the founding schools of public health generally disapproved of this rush to short training courses. At Harvard, when the Social Security Act was passed in 1935, the faculty immediately understood that there would be a demand for short courses and decided to resist. They unanimously stated that “short courses should not be instituted or standards lowered, no matter what the situations we are asked to meet.”17 To emphasize their concern about maintaining high academic standards, the faculty promptly raised admission standards.18
The tremendous push in the late 1930s toward training larger numbers of public health practitioners was also a push toward practical training programs rather than research. Public health departments wanted personnel with one year of public health education: typically, the M.P.H. generalist degree. If they could not attract public health practitioners with this credential, they settled for a person with a few months of public health training. Ideally, they also wanted people who understood practical public health issues rather than scientific specialists with research degrees. Thus, public health education in the 1930s tended to be practically oriented, with considerable emphasis on fields such as public health administration, health education, public health nursing, vital statistics, venereal disease control, and community health services. In this period, too, many schools developed field training programs in local communities where their students could get a taste of the practical world of public health and a preparation for their roles within local health departments. The 1930s were thus the prime years of community-based public health education.
In 1939, the Rockefeller Foundation decided to evaluate the status and future of public health education. The Scientific Directors of the International Health Division selected Thomas Parran, the Surgeon General, and Livingston Farrand, recently retired President of Cornell University, to study the schools of public health in the United States and Canada.19 Parran and Farrand estimated that about 300 public health physicians and between 2,000 and 4,000 public health nurses would be needed each year to staff public health departments. They also noted an increasing demand for sanitary engineers, epidemiologists, statisticians, and other types of
specialists. Parran and Farrand recommended increased support for the schools of public health at Hopkins, Harvard, and Toronto, mainly to sustain research in the core public health disciplines. They also recommended that regional schools of public health be established in the West (suggesting California at Berkeley), the Midwest (Michigan), and the South (Vanderbilt). Such regional schools, they emphasized, should be oriented to practical training rather than to research.
THE WAR YEARS
Not surprisingly, the proliferation of short training programs continued throughout the war years. The armed services wanted physicians, nurses, and sanitarians with at least a minimal amount of training in tropical diseases, parasitology, venereal disease control, environmental sanitation, and a variety of infectious diseases. For the burgeoning industrial production areas at home, industrial hygiene was in demand; for areas with military encampments, sanitary engineering and malaria control were very urgent concerns. In this period, the Office of Malaria Control in War Areas, the forerunner of the Centers for Disease Control and Prevention, was created. Schools of public health and public health training programs changed their educational programs to meet the various needs of the armed services as rapid training programs turned out large numbers of health professionals with a smattering of specialized education in high-priority fields. The research-oriented schools of public health, such as Hopkins and Harvard, maintained their research programs largely by recruiting foreign students—many of them from Latin America—to staff their laboratory and field programs; in those years, Johns Hopkins was said to resemble an outpost of Latin America. The North American students all wanted quick training programs before going to their war posts at home and abroad.
Deans of the leading schools of public health were no doubt anxious about the future direction of public health education—were all these short training programs going to threaten the long-term standards and standing of the best public health education? In 1941, representatives from Columbia, Harvard, Johns Hopkins, Michigan, North Carolina, Toronto, and Yale met to organize the Association of Schools of Public Health, “to promote and improve the graduate education and training of... professional personnel for service in public health.” The representatives clearly disapproved of many of the new rapid training programs and limited membership in the Association to schools giving graduate degrees. They argued the need for an accreditation mechanism to establish standards of public health education but realized that this goal would have to wait until after the war. The Association had no formal authority over licensing—there has never been any clear agreement over public health credentials—but it claimed a certain moral authority in representing the most highly developed schools of public health.
THE POST-WAR YEARS: TOWARD ACCREDITATION
In 1946, the Committee on Professional Education of the American Public Health Association took over the job of monitoring the standards of public health education. William Shepard, then Third Vice-President of the Metropolitan Life Insurance Company, energetically chaired the committee. Shepard complained about profit-making public health training courses of dubious quality; at least one school was offering public health degrees by correspondence, its “faculty” consisting of several authors of leading texts on public health who were entirely unaware of their “appointment.”20 At least a dozen universities were in the process of establishing schools of public health, some of them with no new faculty— merely using existing faculty as part-time teachers. Proprietary schools, complained Shepard, constituted a “dark period” in the development of a profession—marking the moment when demand for trained people exceeded supply. Given the large demand for public health personnel and the relatively sparse supply, the APHA Committee saw its task in part as differentiating between good and poor candidates and as stemming the tide of poorly-trained “incompetents.”
The Committee on Professional Education also created a plan for the accreditation of schools of public health, financed in its earliest years by the Commonwealth Fund. Thanks to studies by Haven Emerson and Martha Luginbuhl,21 the Association was able to estimate how many full-time public health personnel were needed in the nation, the replacement rate of existing public health officers, and therefore the number of schools of public health that were really needed—Shepard estimated in 1946 that between 5 and 10 additional schools of public health would be necessary to provide the public health workforce for the nation.
The difficulty with instituting a system of licensing and credentialing was the low salaries involved in most public health positions. With the war and the depression behind, public health positions were failing to attract the most highly-qualified candidates. Physicians, in particular, showed little enthusiasm for public health appointments. The attractions of private and hospital practice far outpaced the appeal of public health agencies. There seemed little point in attempting to impose any form of licensing when the number of jobs so outstripped the number of available candidates, and public health positions for the most part were regarded
as financially undesirable. The Committee’s answer to this structural problem was to urge “a comprehensive public relations program under expert direction,” which would lead to increased public recognition and thus, perhaps, to higher salaries.
With funding from the U.S. Public Health Service, the Committee now set up a kind of public health employment agency in an attempt to match vacant positions in public health with job candidates. In 1947, the “Vocational Counseling and Placement Service” listed some 688 available public health positions and 164 candidates looking for employment—a ratio of 4 available jobs per candidate. The ratio of available physician positions to physician candidates was 7 to 1—meaning that every physician graduating from a school of public health could have his or her pick of public health jobs and that most would perforce go to doctors without any specialized public health training.22 The Committee on Professional Education also made great efforts to recruit candidates into public health, conducting 376 office interviews in the course of the year. With funds from the Children’s Bureau, the Public Health Service, and the National Foundation for Infantile Paralysis, they set up a “Merit System Unit” to prepare “modern, objective types of examinations” as a way of helping health and personnel officers identify qualified candidates for their openings.
A survey of schools of public health in 1950 found them overcrowded and underfunded, lacking key faculty members, lacking classroom and laboratory space, and lacking necessary equipment.23 All were suffering from high levels of financial stress. The schools were under pressure to provide more practical training but the Deans argued that they needed a 70 percent increase in full-time faculty to expand the “applied” fields of instruction. They also stated that they could double the number of students enrolled if they had the necessary financial support for staff, basic operating funds, and construction. The applied fields most frequently in demand were public health administration, environmental sanitation, maternal and child hygiene, industrial hygiene, mental health, medical care organization, public health economics, public health nursing, and health education.24
Given this context, it seems hardly surprising that the criteria for accreditation of schools of public health as implemented at mid-century seem undemanding by current standards. The physical facilities required, for
example, were defined (in their entirety) as “lecture rooms, seminar rooms, and adequate laboratory facilities for the teaching of subjects in the field of microbiology, including microscope, culture media, apparatus, etc.; for the teaching of vital statistics, including calculating machines for student use, and apparatus for chart-making, with tabulating machinery available for demonstration purposes; and for the teaching of sanitary engineering, including laboratory facilities for the examination of water and sewage and for the demonstration of the basic principles of hydraulics.”25
For accreditation, the faculty of a school of public health had to consist of at least eight full-time professors. The school had to have “practical autonomy” such that the public health faculty effectively controlled all degree requirements. The faculty fields most frequently listed by schools of public health in 1953 were, in order, public health practice, microbiology, epidemiology, sanitation, physiological hygiene, vital statistics, biochemistry or nutrition, industrial hygiene, parasitology, public health nursing, health education, maternal and child health, social and economic problems, and mental hygiene. Between 1947 and 1953, the average number of faculty in accredited schools of public health grew from 13 to 19, an increase of nearly 50 percent. The mean ratio of students per faculty member was 4.5, a ratio that was justified by the need for many diverse disciplines and the “intimate personal contact between teacher and pupil in seminars and in field work.”26 Every accredited school was required to have a library consisting of at least 3,000 volumes in the fields of public health and 50 current periodicals.
Perhaps the most interesting part of the accreditation of schools of public health was the evaluation of practical training and fieldwork. Schools had to be located close to local public health services that could be used for “observation and criticism” and these public health services had to be of sufficiently high quality “to make such observation fruitful.”27 Indeed, all the accredited schools reported some sort of functional association with county or city health departments. The Columbia school, for example, shared a building with one of New York City’s District Health Centers; the school selected the District Health Officer from a list, provided by the Department of Health, of those eligible for appointment. Johns Hopkins had the Eastern Health District, which was jointly operated by the City Health Department and the school. The School of Hygiene and Public Health, thanks to funds provided by the Rockefeller Foundation, paid the salaries of the District Health Officer and several staff members.
In Michigan, teams of public health students were sent out to the surrounding county health departments. Each team member spent time working with their corresponding county health worker, handling mail and telephone calls, and getting the “feel” of the work in progress. Later, the students received weekly reports from the corresponding member of the county staff and held regular meetings to discuss the progress of the county’s health program. The Kellogg Foundation supported this program by paying 10 percent of the county health department’s entire budget. North Carolina’s Department of Field Training worked with local health departments in training, consultation, and the provision of educational materials. The recently formed school at Pittsburgh worked with the Pittsburgh Health Department in organizing the work of the Arsenal Health Center, along the lines of the Eastern Health District of Baltimore. Similarly, the Harvard school used its field training program in the Whittier Street Health Unit of the Boston Health Department to train public health, medical, nursing, and social science students. Toronto had its field training in the East York-Leaside Health Unit, with a population of 60,000. The Toronto school of public health paid the salary of the health officer and contributed directly to the budget of the unit. The Department of Public Health at Yale provided surveys of town and city health programs in Connecticut at the request of local health departments. Each year, the students and faculty completed one such survey and presented their results to the local authorities.
In 1951–52, the schools of public health collectively registered 950 students, of whom over 500 were candidates for the M.P.H., 100 for M.S. or M.A., and 100 for the M.S. in Hospital Administration. With the G.I. Bill, the numbers of physicians training in schools of public health had risen sharply for a few years immediately after the war but then began to fall again in 1949.28 In their place, the schools were admitting increasing numbers of engineers, nurses, health educators, and other students qualified by a bachelor’s degree plus experience in public health. Furthermore, 40 percent of all M.P.H. students were from foreign countries and only 16 percent of the United States students were “new recruits” to public health.
Many of the schools offered a vast array of courses: Columbia, for example, offered 127 courses and Michigan almost matched this record with 120 courses. In general, the schools seemed to offer almost as many courses as they had students. The main areas of the curriculum were public health practice, sanitation, vital statistics, and epidemiology, standard offerings in all the schools; most also offered environmental fields and microbiology.
In the immediate post-war period, many of the schools of public health were involved in curricular reviews and imaginative planning of core courses. For a few years, the concepts of social medicine, social epidemiology, and the ecology of health generated considerable interest. Iago Galdston, Secretary of the New York Academy of Medicine, organized a conference on social medicine in 1947, later publishing the papers as Social Medicine: Its Derivations and Objectives.29 The conferees examined some of the ideas of John Ryle, the first professor of social medicine at Oxford University, and added their own thoughts about the “ecology of health” and the “epidemiology of health.” The general concept was that although bacteriology was adequate for understanding many of the infectious diseases, study of the chronic diseases required an understanding of the relationship of health to the physical, social, and economic environment.
These radical ideas prompted faculty in schools of public health to develop new core courses that emphasized the social and economic context of health problems. From now on, they said, the technical skills of bacteriological and epidemiological analysis would have to be embedded within a larger vision of public health. They criticized pre-war curricula as focusing too narrowly on laboratory studies of disease organisms and paying too little attention to the social environment. At Harvard, for example, the epidemiologist John E. Gordon declared that “most important of all is to incorporate within the general fabric of public health a more adequate emphasis on social and economic factors....”30 Harvard instituted two core courses, one on “Human Ecology” and the other on “Community Organization,” designed to “orient the public health program to the framework of modern society” by discussing such matters as “the problem of food supply in relation to world population” and “the influences of industry and transportation on human health.”31 The department of public health administration also offered a series of lectures and seminars on “the history of the public health movement” and “the cultural, social, and economic forces bearing on the evolution of the science of public health.”32 Similarly, Columbia reorganized its curriculum around a single required course covering such topics as “the community and its needs,” “the evaluation of health status,” “the factors which influence the causation and control of disease,” and “public health as a community service.” At Pittsburgh, Thomas Parran had decided that the curriculum should be organized around “the systematic presentation of illustrative topics which deal with the interrelation of man and his total environment and with the political, economic, and social framework within which the health officer must work.”33 Yale’s core course on “Principles and Practice of Public Health” was similarly organized around a series of interdisciplinary seminars running throughout the academic year. Winslow commented approvingly that the 11 schools of public health constituted “eleven experimental laboratories in which new pedagogic approaches are constantly being devised.”34
The overall impression of the accredited schools of public health in 1950 was that they were doing a good job of preparing public health practitioners through courses and fieldwork, that the numbers of faculty and students were growing, and that curricular and research innovations seemed promising. The main complaints of the schools seemed to be lack of funding to pay faculty, expand space, and purchase equipment. One other problem, now as earlier, was the fact that the schools of public health attracted few physicians.35 Instead, the schools were accepting an ever-higher proportion of students without health professional training. Winslow and others made a virtue of necessity, arguing that the many different types of students gave public health its unique character:
. . . public health is not a branch of medicine or of engineering, but a profession dedicated to community service which involves the cooperative effort of a dozen different disciplines. The fact that doctors and dentists and nurses and engineers and health educators and microbiologists and statisticians and nutritionists sit together in our schools and take the same degrees is of incalculable importance. It is based on bold assumptions; but it has worked. It provides the only sure basis for true cooperative community service in the future. It constitutes one of the most significant contributions of the United States to the basic philosophy of public health.36
BIOMEDICAL FUNDING IN THE POST-WAR ERA
The war had demonstrated the success of an organized federal effort in financing scientific research; the wartime Committee on Medical Research could point to many successes: the development of atabrine, an effective new treatment for malaria, the therapeutic use of blood derivatives such as gamma globulin, and most notably, the production of huge stocks of the “miracle drug,” penicillin. After the war, responsibility for the wartime projects still underway was transferred to the Public Health Service and the National Institute of Health (which became the National Institutes of Health in 1948). In the post-war period, the budget of the National Institutes of Health grew from $180,000 in 1945, to $4 million in 1947, to $46.3 million in 1950, to $81 million in 1955, to $400 million in 1960. The budget continued to grow dramatically, especially under the influence of Mary Lasker and Florence Mahoney as wealthy and persuasive lobbyists, and James Shannon, the forceful and impressive Director of NIH between 1955 and 1968.
In 1944, Thomas Parran, the Surgeon General, had drawn up a grand 10-year plan for his agency, the Public Health Service. Parran envisioned a remarkably complete health service, including public health and medical care, as well as health professional education and medical research:
When peace returns, this country should so reorganize and develop its health resource that there will be available to everyone in the population all health and medical services necessary for the preservation and promotion of health, the prevention of disease, and the treatment of illness.... It is believed that the use of public funds is fully justified in developing the physical plant for health, in training professional personnel, in supporting both public and private medical and scientific research of broad public interest, and in reducing the individual financial burden resulting from catastrophic illness or chronic disability.
The principle is accepted that no one in the United States should be denied access to health and medical services because of economic status, race, geographic location, or any other non-health factor or condition. It is a duty of governments—local, State, or Federal—to guarantee healthful living conditions and to enable every person to secure freedom from preventable disease.37
Only part of this grand vision was to be realized. Because of the hostility and deep pockets of the American Medical Association and their allies, neither the comprehensive expansion of the public health service nor the institution of national health insurance would prove politically
possible. Thomas Parran himself was relieved of his position as Surgeon General and replaced by the more malleable Leonard Scheele. There was no lack of money to spend. In 1946, the Hospital Survey and Construction Act, or Hill-Burton program, was passed to finance the construction of community hospitals, initially providing $75 million a year for five years, and eventually pouring $3.7 billion into new hospital construction. The Hill-Burton program was strongly supported by the American Hospital Association and the American Medical Association; it provided new facilities for medical practice without threatening in any way the method of paying for health services. Indeed, Hill-Burton had a specific provision prohibiting federal involvement in setting hospital policy.38 The system of Veterans Administration hospitals was also greatly expanded and tied in more closely to local medical schools.
Scheele had earlier been associate director of the National Cancer Institute and was now, as Surgeon General, responsible for the National Institutes of Health. Like hospital construction, medical research had many friends and seemingly no enemies. Cancer and heart institutes had been the first, mental health and dental institutes followed, and then came a succession of other special institutes targeted toward a specific disease (diabetes, arthritis), body part (eye, kidney), or stage in the life cycle (child health, aging). The institutes grew and grew wealthy; they also gave away most of their funds to universities and medical schools in the form of research grants. Because the medical schools and the American Medical Association had opposed the direct provision of federal funds to medical education—nursing an avid suspicion of any form of governmental intervention or control—the NIH research grants proved a politically acceptable way of funneling money to the medical schools. No federal bureaucrats were deciding the dollar amounts given to a particular school: grants were awarded on the decisions of peer review committees composed of non-federal experts in the particular field of research. Liberals, conservatives, medical school deans, and researchers were all happy with the system, and members of Congress were pleased to bankroll such a popular and uncontroversial program.39
Schools of public health would have had no objection whatsoever to direct federal funding—assuming only that it were relatively generous. But public health schools were generally lumped in with medical schools
(and later with health professional education) when it came to setting federal policy, so they had to compete with medical schools for research grants—in a grant system dominated by powerful medical school professors. The historic funders of schools of public health, the great foundations, were well aware of the increasingly important role of the federal government in financing medical research and education. Some of their officers were perhaps disappointed with the achievements of the early schools of public health, especially in their failure to spread the preventive point of view throughout medical education; in any case, they now directed their interest toward building departments of preventive medicine and community medicine within medical schools. The Pan American Health Organization, which had sent so many Latin American students to North American schools during the war years, now came to believe that training in the United States was not very relevant to the problems of developing countries, and argued that international students were best trained in countries with similar health problems, culture, and climate.40
Adding to the woes of schools of public health was the period of deepening conservatism from about 1948 through the late 1950s. The mood in government and on campuses changed in the atmosphere of the Cold War. McCarthyism associated any advocacy of public health agendas or national health insurance with “socialized medicine” and identified this in turn with socialism or Communism. When Thomas Parran, who had been ousted as Surgeon General, took over as Dean of the new Pittsburgh School of Public Health, he was attacked as a “Communist” who favored socialized medicine and compulsory health insurance.41 (The Mellon Trustees who had financed the school pored over Parran’s past speeches and publications and decided that the charges were unfounded.) In the late 1940s and early 1950s, many of the most articulate and outspoken public health leaders were under attack, silenced, or losing their positions and their influence.
A DEEPENING CRISIS: PUBLIC HEALTH SCHOOLS AND DEPARTMENTS IN THE 1950S
In the early 1950s, schools of public health were attempting both to maintain educational standards and to admit increasing numbers of students, in spite of the fact that most students were unable to finance
their own education, state governments only reluctantly provided minimal funding, the foundations had lost much of their enthusiasm for financing public health education, and international agencies were questioning the value of American schools for their international students. Schools of public health were all complaining that they lacked sufficient funds for operating expenses and faculty salaries. We need to understand the suffering of the schools in the context of the growing conservatism of the country during the early years of the Cold War, growing popular suspicion of government programs, and seething hostility to even such cost-effective public health measures as the fluoridation of water supplies. We also need to see the schools of public health in the context of a massive expansion in funding for biomedical research as an uncontroversial way to pour money into the health enterprise in the post-war era.
It is hardly surprising that the schools of public health all settled on essentially the same survival strategy, which they pursued with greater or lesser enthusiasm, and with greater or lesser reluctance, depending on the orientation and interests of their faculty and deans. They would apply for research grants and use the research funds to pay the salaries of additional faculty members, on the grounds that new faculty could spend some of their time teaching and some of their time on funded research. In 1950, on average across schools of public health, faculty spent 40 percent of their time on teaching, 40 percent on research, 10 percent on administration, and 10 percent on service. Averages, however, are misleading because they mask the wide variation between schools of public health and even between different departments within a particular school. What happened was that, if the faculty of a particular department was devoted mainly to teaching or to “service” (public health practice), the numbers of faculty stayed stable or gradually declined. If the department was devoted to research, and was reasonably successful at funding that research, the department grew, added more people, consumed more space and equipment, published a steady stream of research papers and reports, and generally gave the impression of being a dynamic and productive place. Size begat size, growth begat growth, and research success bred research success. Over time, the results could be dramatic, with some schools and departments growing at an impressive rate and others appearing moribund. A few schools, especially Hopkins and Harvard, grew large and prosperous. Between them, Hopkins and Harvard had 40 percent of all faculty involved in research, trained most of the faculty for smaller schools, and generally dominated the field. Smaller or less prosperous schools did their best to emulate the research ideal, to garner their own grant funds, and to grow their own faculty.
Robert Korstad, in his history of the North Carolina School of Public Health, has effectively shown how this dynamic played out in the development of that school.42 In 1935, the school began as a Division of Public Health in the Medical School, using the new federal funding provided by the Social Security Act; in 1940, it became an independent school of public health, with the eminent Milton Rosenau as its first Director. The school received a small appropriation from the university, some funds from the Public Health Service, and tuition from students. Rosenau recruited part-time faculty from the State Board of Health, obtained part-time teaching assistance from various members of the medical school faculty, and himself taught epidemiology. The Public Health Service supported two faculty members: a professor of public health administration and a professor of sanitary engineering. At first, the school offered a three-month course for public health officers, then developed programs in venereal disease control, public health nursing, and health education—all practice-oriented subjects. The school offered short training courses for armed services personnel and also took in foreign students during the war.
After the war, Edward McGavran, described as a “dyed-in-the-wool field man,” became Dean of the school. The Kellogg Foundation supported a large field training program, including short courses, in-service training, supervised field experiences, apprenticeship training, and residencies. McGavran was an enthusiast for public health practice but struggled with the North Carolina state legislature, which resisted expenditures on the grounds that it wished only to support students from North Carolina, whereas the school was admitting students from all over the South, and many international students as well. Meanwhile, the legislature appropriated funds that, combined with federal support under the Hill-Burton program, were sufficient to build a hospital and expand the medical school. The University also built schools of nursing and dentistry. But while buildings were going up all over campus, the school of public health lacked classroom and laboratory space. McGavran lacked operating funds, teaching staff and teaching assistants, administrative staff, and the ability to give raises and replace key personnel. The school of public health paid salaries well below those of the other schools on campus and below the “market value” of persons qualified to fill the positions. Furthermore, the University refused to maintain the field training programs, which were admittedly expensive undertakings in terms of staff time and travel.
McGavran was a determined public health advocate who defined public health as “the scientific diagnosis and treatment of the body politic.”43 He believed that public health practitioners should be able to provide analyses of the economy, the political power structure of the community, and the forces determining the acceptance or rejection of progressive change and development. He faced an uphill battle: the Korean War and the increasingly conservative texture of the times favored narrow scientific solutions to health problems rather than a broad social and political understanding of public health. By the mid-fifties, Korstad delicately notes, there was “a perceptible tension between solidarity and individualism” in the school of public health.44 The Public Health Service and the National Institutes of Health provided categorical grant funding to selected faculty but very little funding for core public health activities.
McGavran tried to hold the faculty together but found it was an impossible task, with the growing pressures for individual entrepreneurial activity, the increasingly uneven development of departments, and the rewards available to those who were successful in obtaining external funding.45 The department of biostatistics, successful in obtaining research and teaching funds, grew dramatically. So did parasitology and experimental medicine (later renamed environmental sciences and engineering), although McGavran complained that the latter was really an “institute of research” entirely separate from the real work of a school of public health. Epidemiology also thrived under the leadership of John Cassel. But other departments fared poorly: mental health had only one faculty member for several years and, when that individual left, had no faculty at all. The large field training program, which in the early 1950s had engaged the total faculty and all of the students for one day a week at four field centers within a 50-mile radius of the school, was eliminated. The enterprise had been exhilarating, time-consuming, and expensive. “But it was a superb experiment,” said McGavran, “and for two brief years the School of Public Health demonstrated to students, practitioners, and ourselves that there was a public health team.”46
Thus, even a Director who strongly favored field training and distrusted departments devoted to research was unable to resist the pressures favoring research over practical training. The North Carolina school did receive money from the Hill-Rhodes training funds at the end of the 1950s, and the 1960s ushered in an era of growth with increasing research funds and increasing faculty salaries. Successful department chairs built up their faculty by bringing in faculty members on grant (soft) money and then trying to get them hired on state (hard) money. There were battles over space—the people getting research grants constantly needed more space, more laboratories, more offices, and were taking them away from the departments that were slow-growing or static. In the 1960s, many of
the non-research faculty, such as the women who had led the public health education department through the 1950s, simply left.
A later self-study of the North Carolina school pointedly noted that relationships with local communities and the state had deteriorated “as departments were concerned with the federal dollar and were worshiping the idols in Washington and Bethesda.”47 Many faculty members felt no particular obligation to health agencies at the state or county level as shown by their complete lack of interest in the activities of the North Carolina Public Health Association. Faculty members whose careers centered on research were reluctant to spend time training local health workers. In return, the state legislature offered the school little support. As a result of these dynamics, all the service-oriented departments that had failed to grow in over a decade of federal support—the departments of health administration, health education, maternal and child health, mental health, public health nursing, and public health nutrition—were bundled into a single department of community health practice and administration.
The same dynamics were at work in other schools of public health. The available funding—and the faculty members who were suited by education, experience, and personality to succeed in the research system—shaped the institutions and drove their priorities. At Johns Hopkins in the late 1940s and early 1950s, the epidemiology department was completely dominated by laboratory-focused polio research generously funded by the Foundation for Infantile Paralysis. The work of David Bodian and others at the Hopkins school certainly played an essential role in laying the scientific basis for a successful polio vaccine; the point here is that other unfunded, or underfunded, activities were allowed to slide. Thus the Eastern Health District, which had been the pride and joy of the epidemiology department in the 1930s, expired quietly in the early 1950s. According to a survey of recent M.P.H. graduates in 1955, the increased emphasis on research was also hurting the quality of teaching. A sub-committee of the admissions committee, concerned that M.P.H. applications were falling, reported back: “The complaint was made that the staff was more concerned with research and affairs outside the school than with teaching, that lectures were hastily prepared and frequently dull.”48
In this environment, graduate students who helped the professor with his research were of more interest than M.P.H. students, who merely absorbed rather than produced research results. At Hopkins, Elmer McCollum, the professor of chemical hygiene (later biochemistry),
had started the practice of insisting that all his students must work on some aspect of his nutrition studies. These all involved feeding experimental rats different combinations of carefully prepared foodstuffs— adding or eliminating one specific substance at a time—and then measuring the effects of each diet on the weight and health of the rats. The labor force of students who participated in the rat nutrition studies produced a vast number of research papers, most of them co-authored with the professor. This industrial mode of research organization was easily adaptable to other forms of laboratory research and, in time, to other quantitative public health disciplines.
The system of research funding, however, did not work well for field research, public health practice, public health administration, the social sciences, history, politics, law, anthropology, or (at least at this juncture) economics. So within the schools of public health in the 1950s, the laboratory sciences tended to thrive, whereas public health practice and other non-quantitative disciplines suffered. Intellectually, and in the curriculum, there was a state of uneven development. The community-based orientation of the 1930s had disappeared and the field training programs all essentially collapsed.
The Hopkins M.P.H. students who had been queried in 1955 had asked for more instruction in the history, theory, principles, and philosophy of public health.49 They complained of the required microbiology course: “the laboratory work was too detailed, too mechanical and too unproductive in developing the student’s thinking.”50 One student suggested “the general principles of public health administration, field studies in public health, and social medicine and medical care be combined in one comprehensive required course, using the Eastern Health District and the Medical Care Clinic of the Hospital as a joint administrative practice unit for this purpose.”51 In general, the Hopkins students and alumni asked for more attention to problems of chronic diseases, mental illness, and medical care organization; they expressed a desire for a better understanding of social and economic issues, and they wanted a clear overall vision or philosophy of public health.
By the mid 1950s, schools of public health were being pulled in different directions. Much of the rhetoric of change suggested that, as the biological sciences had been needed to solve the problems of infectious disease, so the social sciences were needed to solve the problems of the chronic diseases. Thus the Dean of the Hopkins school, Ernest L. Stebbins, urged the faculty of schools of public health not to shut themselves up in their laboratories but to be actively involved in service to their local community. “Knowledge of the natural history, the basic etiology, and means of prevention of heart disease,” he contended, “may come from sociologic studies rather than from the biological laboratory.”52 A committee of the faculty, popularly termed the “Crystal Ball Committee,” suggested new areas of research more relevant to the major health problems of the day: epidemiological and field studies of cancer and chronic diseases, epidemiological studies of mental illness, research into the social determinants of illness, child development studies, health promotion methods, medical care organization, accident prevention, and research on radiation hazards.53 But the Committee also stated that they did not favor “a marked expansion of the school activities into these areas if it means that the basic science program would undergo a fundamental change.”54 In other words, they knew what the problems were and what new types of research should be done but they also didn’t want to change.
As the Hopkins faculty struggled with their crystal ball, the financial situation of the school was worsening. A new Development Committee, chaired by environmental engineer Abel Wolman, spent two years studying the problem and then concluded that the school should abandon its M.P.H. program entirely. Instead, Hopkins would focus on its doctoral programs leading to the Dr.P.H. and the Sc.D. or Ph.D. degree.55 Doctoral students were research students; their education did not take away from the research program, but fueled it. Admission to the Doctor of Public Health degree would be restricted to those who already held a doctoral degree in the medical, biological, or health sciences. Only a few students who found it impossible to remain at the school long enough to complete their doctorate would be allowed to terminate their academic work with an M.P.H. degree. Describing this as a program of “advanced post-graduate education,” the Development Committee report explained: “Admittedly, the admission policy is designed to eliminate students who either have not had medical training or who are strongly deficient in the biological or health sciences.”56 Such students could and should be trained at “other institutions.”
As the Hopkins faculty—against the advice of their own Dean—withdrew into their laboratories, they further distanced themselves from the problems of local health departments. And the health departments were in a sorry state. In the 1950s, federal grants-in-aid to the states for public health programs steadily declined with the total dollar amounts falling from $45 million in 1950 to $33 million in 1959. Given inflation, this represented a dramatic decline in purchasing power.57 Public health departments were caught in a downward spiral. Lacking funds, they couldn’t bring in new people or begin new programs; lack of new people and programs gave them an aura of failure and irrelevance. Health departments ran underfunded programs with underqualified people who answered to unresponsive bureaucrats. When state legislators wanted to start new programs, they tended to overlook the dull and unimaginative state health departments, regarded as backwaters for those who could not succeed in the private sector. Public health officials were expressing “frustrations, disappointments, dissatisfactions, and discontentments” said John W. Knutson in his Presidential Address to the American Public Health Association in 1957.58 As Jesse Aronson, director of local health services in New Jersey, explained:
The full-time health officer is frequently, because of inadequate budget and staff, limited in his activities to a series of routine clinical responsibilities in a child health station, a tuberculosis clinic, a venereal disease clinic, an immunization session, and communicable disease diagnosis and treatment. He has little or no time for community health education, the study of health problems and trends, the initiation of newer programs in diabetes control, cancer control, rheumatic fever prophylaxis, nutrition education, and radiation control. In a great many areas the health officer position has been vacant year after year with little real hope of filling it. In these situations, even the pretense of public health leadership is left behind and local medical practitioners provide these services on an hourly basis.59
Between 1947 and 1957, the numbers of students being trained in schools of public health fell by half. Alarmed, Ernest Stebbins of Johns Hopkins and Hugh Leavell of Harvard, representing the Association of Schools of Public Health, walked the halls of the United States Congress
to urge its members to support public health education. They found an especially sympathetic audience in Senator Lister Hill and Representative George M. Rhodes, and in 1958, Congress enacted a two-year emergency program authorizing $1 million a year in federal grants to be divided among the accredited schools of public health.
The First National Conference on Public Health Training in 1958 noted that these funds had provided 1,000 traineeships and had greatly improved morale in public health agencies. The Conference further requested appropriations for teaching grants and construction costs for teaching facilities, and urged that faculty salary support be provided for teaching. Their report concluded with a stirring appeal to value public health education as vital to national defense:
The great crises of the future may not come from a foreign enemy… “D” day for disease and death is every day. The battle line is in our own community. To hold that battle line we must daily depend on specially trained physicians, nurses, biochemists, public health engineers, and other specialists properly organized for the normal protection of the homes, the schools, and the work places of some unidentified city somewhere in America. That city has, today, neither the personnel nor the resources of knowledge necessary to protect it.60
President Eisenhower signed the Hill-Rhodes bill, authorizing $1 million annually in formula grants for accredited schools of public health and $2 million annually for five years for project training grants; between 1957 and 1963 the United States Congress would appropriate $15 million to support public health trainees. The worst of the crisis was over. In the 1960s, Lister Hill would continue to champion the cause of the schools of public health in the Senate and John E. Fogarty became their main supporter in the House. The Congress raised the ceiling on the formula grants, provided grants-in-aid for training to state health departments, and authorized special training grants, fellowships for faculty development, and construction grants for schools of public health.
New Life in the Sixties
The federal government now began to reverse the damage that had been done to public health by providing traineeships, formula grants, and project grants to develop new curricular areas. The downward trend in public health enrollments was halted; in 1960, student enrollments again began to climb. The Association of Schools of Public Health happily discussed the “ferment” in schools of public health around the new, or newly recognized, problems of chronic illness, mental disorder, air pollution, medical care organization, aging, injuries, and radiation hazards. The new federal funds provided some basic operating costs but also encouragement to explore targeted areas of research and training. New schools of public health were created at the University of California, Los Angeles, and in Puerto Rico, and many schools expanded their previously cramped facilities. In 1963, the federal government doubled the ceiling on formula grants and also began offering construction grants to schools of public health.

TABLE D-1 Federal Support for Schools of Public Health
This was an exciting time for the schools; between 1960 and 1964, the total number of applicants to schools of public health more than doubled; the number of faculty members increased by 50 percent; the average space occupied increased by 50 percent; and the average income of the schools more than doubled.61 New faculty appointments were made in such fields as medical care organization, social and behavioral sciences, public health administration, human ecology, radiation sciences, population studies, and international health.
The newly created Agency for International Development (AID) encouraged schools of public health to develop international health training programs whose students would become “ambassadors of American science” abroad.62 By 1965, the whole country seemed to have become concerned about the “population explosion,” and the United States Congress was voting money to provide technical assistance, often in the form of contraceptives, to the developing world.
The passage of Medicare and Medicaid legislation in 1965 generated
considerable excitement in schools of public health. State health agencies were concerned about being able to monitor and evaluate medical care services and wanted the schools of public health to provide the scientific basis for rational decision-making in health services delivery. They also wanted the schools to provide training for medical care administrators and financial managers. In 1966, a Special Study Commission of the Association of Schools of Public Health estimated that 6,220 new positions in medical care administration required graduate-level educational preparation.63 The United States Public Health Service curtailed its usual grant application procedures to provide quick funding to schools of public health willing to provide short courses in health services administration. As in the 1930s, short courses would be developed to meet the urgency of the national need.
In the context of the Civil Rights movement and the demand for more community participation in health care, education, and other sectors of civil life, the Kennedy administration supported the movement away from mental hospitals and toward community mental health centers, run on an outpatient basis. Community mental health centers were financed by the federal government and locally controlled, thus largely bypassing the states. Many of the other programs of the 1960s and 1970s would be created as independent ventures, thus directly or indirectly weakening the role of the states and of state health departments. In the year before he died, Kennedy began developing an anti-poverty program and, after his assassination, President Johnson expanded this into the “War on Poverty.”64 As part of this general effort, the Office of Economic Opportunity (OEO) helped to start 100 neighborhood health centers and the Department of Health, Education, and Welfare (HEW) supported another 50.65 The aim of these health centers was to provide comprehensive primary care services and to encourage community participation in running the organizations. The centers were, however, dependent on public funds for their survival, and an ambitious plan to build 1,000 centers across the country was never realized.
In the generally progressive social ferment of the 1960s, a strong environmental movement developed around the catalyst provided by publication of Rachel Carson’s Silent Spring in 1962.66 Earth Day in 1970 attracted some 20 million Americans in demonstrations against assaults on nature; by 1990, Earth Day brought out 200 million participants in 140 countries.67 Within the federal government, the environmental movement spurred the creation of the Environmental Protection Agency (EPA) and passage of the Clean Air Act of 1970. At the same time, labor mobilization and public distress over the toll taken by industrial accidents and mining disasters prompted the creation of the Occupational Safety and Health Administration (OSHA) and the National Institute for Occupational Safety and Health (NIOSH).
Environmental protection agencies, like the neighborhood health centers and the community mental health centers, were organizationally independent of state health departments, although they were clearly important agencies for the public’s health. Questions of definition now became more problematic: public health in the broad sense included many of the activities and responsibilities of a wide variety of agencies, while the work of departments of public health represented only one aspect of the whole, public health as narrowly defined. At the federal level, public health was also losing administrative focus. The formation of the Department of Health, Education, and Welfare in 1953 had reduced the visibility and centrality of the Public Health Service; further reorganizations and changes continued to diminish its role. By 1975, it was clear that the Surgeon General no longer functioned as the head of the Public Health Service. Instead, the Office of the Assistant Secretary of Health had been strengthened, and the main health agencies, including the National Institutes of Health, the Food and Drug Administration, and the Center for Disease Control, reported directly to him. The Surgeon General had become a figurehead, a spokesperson without direct line authority.
Throughout the 1960s and early 1970s, schools of public health thrived with federal funding available for both teaching programs and research. In 1960, there were 12 accredited schools of public health in the United States; 8 more were added between 1965 and 1975. Between 1965 and 1972, student enrollments again doubled, with the large majority being candidates for the M.P.H. degree. The trend to admit more students who were not physicians, and more students without prior experience in public health, continued. Whereas in 1946–1947, 61 percent of all students admitted to schools of public health for the M.P.H. were physicians, by 1968–1969, physicians constituted only 19 percent of M.P.H. candidates.68 Many schools admitted students fresh from their undergraduate degrees.
Graduate Programs in Other Schools of the University
Along with the growth in the accredited schools of public health came a rapid growth in other forms of public health and health services education. Some of these were graduate programs in a variety of university departments and in schools of engineering, medical schools, schools of business administration, schools of nursing, schools of social work, and schools of education and communication. They were offering degrees in such fields as environmental health, health management and administration, nutrition, public health nursing, and health education. Somewhat to the distress of accredited schools of public health, most employers did not distinguish between accredited and non-accredited programs.69 By 1975, there were some 43 graduate programs in health administration offered in schools of public or business administration and 15 graduate programs in nutrition offered by departments of home economics, education, and human development. More than 30 nursing schools offered graduate programs in public health nursing and community nursing. In addition, all nurses enrolled in baccalaureate programs received some public health education; associate degree programs and diploma programs generally did not provide this. About 30 schools of education or allied health offered graduate health education programs and at least 59 technical and engineering schools and departments of environmental sciences offered graduate training in environmental health.
In addition to this flourishing of programs across university campuses, there had been a dramatic growth of junior and community colleges. By the mid 1970s, some 69,000 students were enrolled in various allied health programs.70 Universities were setting up popular baccalaureate programs in health administration, environmental engineering, health education, and nutrition. Some 58 academic units offered four-year undergraduate programs in environmental engineering; 25 colleges offered undergraduate degrees in community health education, 75 in school health education, and 83 in nutrition.
Schools of public health were, at best, ambivalent about undergraduate education in public health. Several schools of public health (Berkeley, UCLA, North Carolina, Michigan, and Puerto Rico) had earlier offered undergraduate degrees but tended to phase these out in the 1960s; some, however, were adding new programs in response to perceived manpower needs. As the Milbank Commission Report noted in 1975, public health education was a growth industry with no apparent end in sight. But the system was fractured: although 5,000 graduate degrees in public health were awarded each year, approximately half of higher education for public health was occurring outside of accredited schools of public health. Were schools of public health still needed?
THE THREATENED WITHDRAWAL OF FEDERAL FUNDS
Evidently, President Richard Nixon thought not, for in 1973, he recommended terminating federal support for schools of public health and the discontinuation of all research training grants, direct traineeships, and fellowships. This sent shockwaves through a system that had grown dependent on a steady flow of federal funding for its basic support. The strain of the funding cutback threats is reflected in the papers from a Macy Foundation-funded Conference held at the Rockefeller Foundation’s Study and Conference Center in Bellagio, Italy, in 1974. In the volume published from that conference, Cecil Sheps, then Vice Chancellor of the University of North Carolina, noted that leading schools of public health were wondering “seriously and agonizingly” about their future.71 The participants offered a generally gloomy assessment of public health education. According to Russell Nelson of the Johns Hopkins Medical Institutions, corridor talk at his campus said that public health was dead. At Hopkins, moves to absorb the School of Public Health into the medical school had been held back mainly because the medical school faculty were unenthusiastic.72 Herbert Longnecker, the President of Tulane University, gave voice to his medical school’s position when he said, “I think I am correct in stating that the record of fundamental scientific contributions of schools of public health is minor.”73 John C. Hume, now Dean of the Johns Hopkins School of Public Health, spoke about the changes that he had experienced over 20 years as a consequence of the patterns of federal support for biomedical research. The once cohesive nature of the school had been lost, he said: there was little shared conversation, and no coherent teaching program. The autonomy and independence of departments and faculty did encourage initiative but also resulted in isolation and fragmentation. 
Instead of a unified school of public health, the departments constituted “a series of mini-schools with limited interests.” Hume noted that his major problem as Dean was to cope with the fiscal tides—the waxing and waning of federal enthusiasm for particular topics. In the 1960s, for example, population studies had been elevated in importance with the influx of new funding, but by the end of the decade, this interest had largely evaporated.74
Representatives of all the schools of public health appeared to agree with J. Thomas Grayston of the University of Washington’s relatively new and rapidly expanding School of Public Health and Community Medicine: “The greatest immediate challenge to the School of Public Health and Community Medicine is the uncertainty of federal funding brought about by the administration’s announced intention to end, or greatly curtail, federal support for the training of public health manpower, coupled with a similar proposal to decrease support for research training.”75 The one student representative at the conference, identified as recent graduate Frank C. Ramsey, stated the students’ distress with an educational system focused on soft money:
The financing of the school I attended is such that the departmental heads and faculty members are mainly responsible for raising money. Most of the funds come from federal sources and virtually all of them go into research. The heads of departments with popular programs find it easier to raise funds than is the case with heads of departments with less research-oriented programs. The grant system influences the school’s organization, function, and orientation...[it] places constraints on the type of professionals employed and the work performed...[among the students] there was a fairly general belief that solutions to societal problems were being sacrificed on the altar of scientific research.76
Some of the threatened funding cuts were restored, but the trend in the 1970s was toward ever more reliance on targeted research funding, thus exacerbating the problems to which Ramsey had referred. In 1976, the Milbank Memorial Fund issued its extensive report, Higher Education for Public Health.77 The Milbank Commission, chaired by Cecil Sheps, asked the usual questions: Why was there not a closer relationship between professional education and professional practice? Should education change or should the practice model? Could departments of community medicine in medical schools serve some of the functions of schools of public health?
And, most sharply: Had schools of public health become so dependent on federal funds that “their policies and programs are determined by dollars available and they no longer control their own destiny?”78
In place of a public health educational system that Cecil Sheps described as “chaotic, wasteful, and dysfunctional,” the Commission proposed what they considered a more rational structure.79 This sounded rather like an updated version of the original Wickliffe Rose design of 1914. There would be a three-tiered system of public health education. Schools of public health should educate people at the highest level to assume leadership positions; they should train the public health executives who must have a broad knowledge of the entire field and be able to function within the full range of the knowledge base for public health.
Next, programs in graduate schools should prepare the large number of professionals engaged in providing clearly differentiated specialty services, e.g., public health nurses, health educators, and environmental health specialists. Third, although Commission members were uncertain about the value of baccalaureate programs, they might provide some of the “trained entry-level personnel.”80 The Commission defined the “three elements of the knowledge base generic to public health” as:
Epidemiology and Biostatistics
Social Policy and the History and Philosophy of Public Health
Management and Organization for Public Health
Their report also listed a series of “cognate fields”: clinical sciences, biomedical sciences, environmental sciences, social sciences, management sciences, law, and ethics that might well be provided by other departments of the university. The schools of public health should focus on the three core curricular areas and should receive basic core support from the federal government for doing so. They should also serve as regional resources by assisting faculties in medical and other health-related schools to develop teaching programs and research in public health. Different schools would serve as national centers of excellence for specific fields but “should avoid setting up special programs in every new area simply because funding is available.”81 Instead, faculty should become involved in the operation of community health services in areas relevant to their areas of academic responsibility, thus offering supervised field experience for aspiring public health practitioners. In general, the Commission proposed that schools of public health become smaller and more focused
on broad research plans rather than grasping at every funding opportunity. Nor should they do basic laboratory work that could as well be done in a medical school; instead, they should recognize and value their unique interdisciplinary character and craft research plans that drew upon these strengths and were relevant to the regions and communities in which they were located.
The Milbank Commission Report offered faint praise for the system of research driven by changing federal funding priorities: “This is not always bad, as it sometimes results in research that is realistically related to the needs and interests of the nation.”82 By implication, schools would do better if their faculty could design their own research within a broad framework established by the needs of public health in practice. Indeed, Sheps urged faculty to take strong advocacy positions as “academic freedom, like all liberties, is bound to atrophy unless exercised.”83
The specific recommendations of the Milbank Commission had little impact. No dramatic redesign of public health education could work when the underlying forces driving the system continued unabated. Indeed, under President Ronald Reagan, the pressures intensified. In 1981, his administration consolidated numerous federal health programs into two block grants, cut the total funds by 25 percent, and gave the remainder to the states to make their own decisions about how best to slash their programs.84 Meanwhile, the AIDS epidemic, largely ignored by the White House, spread across the land. As reductions in federal funding decimated many public health programs, leaving Medicaid dollars to dominate the field, local health agencies spent much time and energy providing basic health services for the poor.
Twelve years after the Milbank Commission Report, the Institute of Medicine issued its own landmark report, The Future of Public Health.85 This documented the bleak landscape of many public health departments across the country. Half of the state boards of health had disappeared; important programs had been taken away from health departments; and public health was “in disarray.” The prose of this report was often vivid: “The most frequent perception of the health department by legislators and citizens was of a slow and inflexible bureaucracy battling with chaos, fighting to meet crises, and behaving in an essentially reactive manner.... Just getting through the day is the only real objective of the senior administrator.”86
The focus of the IOM report was on public health practice, but it also offered a number of recommendations for schools of public health, urging them to provide educational programs more closely targeted to the needs of practitioners. Schools of public health should establish firm practice links with state and local health departments so that more faculty members could undertake professional responsibilities in those agencies, conduct relevant research, and train students in practice settings. Like the Milbank report, the Institute of Medicine report urged schools of public health to serve as resources to government at all levels in the development of public health policy, to assist other types of institutions in educating public health practitioners, and to take better advantage of such university resources as schools of business administration and departments of the physical, biological, and social sciences. Unlike the Milbank report, the Institute of Medicine committee asked schools of public health to provide short training courses and continuing education opportunities for public health practitioners. It also suggested that schools offer undergraduate courses in public health to attract recruits into the field. In summary, the task, as the committee defined it, was “to assist the schools in developing a greater emphasis on public health practice and to equip them to train personnel with the breadth of knowledge that matches the scope of public health.”87 The report especially highlighted the need for short courses to upgrade the skills of “that substantial majority of public health professionals who have not received appropriate formal training” and to ensure that all public health practitioners became aware of new knowledge and techniques. Nothing was said about designing a single, rationally organized system of public health education.
In the years since the Institute of Medicine’s report, the public health educational system has continued to expand at an accelerated pace. There are currently 31 accredited schools of public health and 45 accredited community health programs.88 The Council on Education for Public Health estimates that the total number of accredited schools and programs may well double within the next ten years. The most dramatic growth is occurring outside the established schools of public health. Close to 40 percent of the nation’s accredited medical schools now have operational M.P.H. programs or are currently developing a graduate public health degree program. New specializations are emerging, such as human genetics, management of clinical trials, and public health informatics. Many schools and competing organizations are involved in distance learning programs that offer the possibility of fulfilling the long-recognized need to bring public health education to the homes and offices of the public health workforce. The Internet also offers the possibility of bringing public health education to populations across the country and around the world; indeed, health information sites are among the most popular and frequently visited of all Web applications.
Is this a system badly in need of rational reconstruction, or is it simply a system of dynamic, if sometimes messy, innovation—an academic marketplace evolving rapidly to meet the country’s needs? Although it is not within the purview of the historian to answer such a question, it may be important to note one significant fact. Previous efforts to design truly effective systems of public health education generally foundered because of lack of political will, public indifference, or paucity of funds. Since September 11, 2001, however, the context has changed dramatically. With public health riding high on the national agenda and an abundance of funds being promised, perhaps there is now an opportunity, as there has not been for a very long time, to shape a future system of public health education that addresses the problems that have been so often described and analyzed.