Drug Administration (FDA) has regulatory authority over plasma collection establishments, blood banks, and all blood products. The Centers for Disease Control and Prevention (CDC) has responsibility for surveillance, detection, and warning of potential public health risks within the blood supply. The National Institutes of Health (NIH) supports these efforts through fundamental research.
AIDS emerged as a threat to the safety of the blood supply in the early 1980s because of a unique confluence of events. Medical breakthroughs in cardiac surgery and other areas resulted in greater use of whole blood and its components. A new treatment for hemophilia, home infusion of AHF concentrate, grew rapidly and significantly improved the health and increased the life span of individuals with hemophilia. In addition, much of the medical community, as well as the country as a whole, believed that epidemics of infectious disease were a thing of the past. There were also many changes occurring in the government and society, such as a presidential mandate to lessen the regulatory role of government and increased public awareness that the homosexual population was enduring stigmatization and discrimination (Bayer 1983).
As evidence for blood-borne transmission of AIDS accumulated in 1982 and 1983, the Public Health Service confronted a very difficult problem. On the one hand, the U.S. blood supply was barely adequate to meet the urgent needs of day-to-day patient care. On the other hand, there was growing evidence that blood transfusion could transmit a disease that was proving fatal for many, although both the magnitude of the risk and the prognosis were still unknown. An examination of the efforts of the Public Health Service and others to cope with this problem provides a remarkable window into the making of public policy under duress and uncertainty.
The syndrome that came to be called AIDS was first noticed in homosexual men in 1981, but within a year epidemiologic evidence suggested that AIDS might also be a threat to recipients of blood and blood products. Several blood banks, blood collection agencies, and blood product manufacturers (i.e., plasma fractionators) took some actions to increase blood safety, for example, educating and screening donors to exclude known high-risk groups, terminating plasma collection from prisons, and, as early as January 1983, encouraging autologous donations to reduce the risk of infection. Yet thousands of individuals and members of their families became infected before a blood test for HIV was implemented in 1985.
Perhaps no other public health crisis has given rise to more lasting anger and concern than the contamination of the nation's blood supply with HIV. In response, blood recipients and individuals with hemophilia who were infected during this period, their families and their physicians, and public and private officials with responsibility for blood safety have asked a series of questions: Could this tragedy have been averted? What institutions, policies, or decision processes, had they been in place in the early 1980s, could have helped to