2
The Malaria Threat and Need for a Vaccine
GLOBAL MALARIA PROBLEM
Malaria is an infection caused in humans by four species of the Plasmodium genus of parasitic protozoans (P. falciparum, P. vivax, P. ovale, and P. malariae), transmitted by many species of anopheline mosquitoes. Plasmodium falciparum causes the most severe disease and accounts for the overwhelming majority of malaria deaths. Recent estimates of the annual number of clinical malaria cases worldwide range from 214 to 397 million (WHO, 2002; Breman et al., 2004), although a higher estimate of 515 million (range 300–660 million) clinical cases of P. falciparum in 2002 has been proposed (Snow et al., 2005). Annual mortality (overwhelmingly from P. falciparum malaria) is thought to be around 1.1 million (WHO, 2002; Breman et al., 2004). Malaria deaths are believed to account for 3 percent of the world’s total disability-adjusted life years (DALYs) lost, and 10 percent of DALYs lost in Africa (Breman et al., 2004). In developing countries malaria is currently believed to be the third most common cause of death among children less than 60 months of age, after respiratory infections and diarrheal diseases.
Almost half of the world’s population lives in areas where they are exposed to the risk of malaria (Hay et al., 2004), and increasing numbers of visitors to endemic areas are also at risk. Residents of endemic areas develop clinical immunity to the disease through repeated exposure, but this immunity takes years to develop and wanes rapidly once a resident leaves the endemic area. Adults in endemic areas are frequently infected but rarely suffer symptoms. Pregnant women experience renewed susceptibility, especially during the first pregnancy. Thus the burden of the disease in highly endemic areas falls mainly on young children and pregnant women. Malaria also significantly increases the risk of childhood death from other causes (Snow et al., 2004).
The amount spent worldwide on malaria research and development is not commensurate with malaria’s contribution to the global burden of disease. The Malaria R&D Alliance (2005) estimated that in 2004 malaria accounted for about 46 million DALYs lost, yet only US$288 million was spent worldwide on research and development. This amounts to only about US$6.20 per DALY, significantly lower than the amounts spent per DALY on tuberculosis (TB) ($10.88) and human immunodeficiency virus/acquired immunodeficiency syndrome (HIV/AIDS) ($24.26) in 2004.
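The spending-per-DALY figures above are simple ratios: annual research dollars divided by annual DALYs lost. A back-of-envelope check, using only the rounded figures quoted above (the result differs slightly from the report's US$6.20, which presumably reflects unrounded DALY totals):

```python
# Back-of-envelope check of R&D spending per DALY lost.
# Inputs are the rounded 2004 figures quoted in the text; the small gap
# versus the reported US$6.20 comes from rounding the DALY total.

def spend_per_daly(spend_millions_usd: float, dalys_lost_millions: float) -> float:
    """Annual R&D spending (US$ millions) divided by annual DALYs lost (millions)."""
    return spend_millions_usd / dalys_lost_millions

malaria = spend_per_daly(288, 46)
print(f"Malaria: ${malaria:.2f} per DALY")  # ~$6.26 with rounded inputs
# Reported comparison values for 2004: TB $10.88/DALY, HIV/AIDS $24.26/DALY
```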
MILITARY MALARIA PROBLEM
This will be a long war, if for every division I have facing the enemy, I must count on a second division in the hospital with malaria, and a third division convalescing from this debilitating disease.
—General Douglas MacArthur, 1943
Malaria has persisted as a formidable problem—indeed a veritable scourge—for the U.S. military throughout its history. Tables 2-1 and 2-2 list major U.S. military actions, deployments, and overseas exercises in which malaria posed a meaningful threat. Some of these actions involved U.S. forces that were on standby and never deployed; nonetheless, the malaria threat was present.
TABLE 2-1 Major U.S. Military Actions, Deployments, or Overseas Exercises in Locations with a Malaria Threat

| Location | Year | Threat | Morbidity and Mortality |
| --- | --- | --- | --- |
| Civil War (Union) | 1861–1865 | P. vivax, P. falciparum | 1.3 million cases, 10,000 deaths^a |
| Panama Canal | 1904–1914 | P. vivax, P. falciparum | 1906 malaria rate 1263/1000/year; 1913 malaria rate 76/1000/year^b |
| WWI | 1914–1918 | P. vivax | Estimated 5000 cases overseas; 1917: 7.5/1000/year in United States^c |
| WWII | 1939–1945 | P. falciparum, P. vivax | 600,000 cases, mostly in Pacific theater; in some areas of the South Pacific malaria rates were 4000/1000/year (4 cases per person per year) (Downs et al., 1947) |
| Korean War | 1950–1953 | P. vivax | Malaria rate 611/1000/year; 3000 cases in troops returning to United States^d |
| Vietnam War | 1962–1975 | P. falciparum, P. vivax | 100,000 cases^e; 1.7/1000 case fatality rate; hospital admissions 27/1000/year; 1965 malaria rate for U.S. Army forces: 98/1000/year; 1970: 2222 cases (mostly P. vivax) treated in United States |
| Panama | 1988–1989 | P. falciparum | Action primarily in Panama City |
| Persian Gulf | 1991 | P. vivax | Few cases in northern Iraq, Kurdish area |
| Somalia | 1992–1994 | P. falciparum, P. vivax | 48 cases; 243 cases in forces on return home^f (CDC, 1993) |
| Nigeria | 2001 | Chloroquine-resistant P. falciparum | Special forces: 7 cases (2 deaths) in 300 men |
| Afghanistan | 2003 | P. vivax, chloroquine-resistant P. falciparum | 8 cases in 725 Ranger task force members^g (Kotwal et al., 2005) |
| Liberia | 2003 | P. falciparum | U.S. Marines: 80/290 (28% attack rate), with 40 Marines evacuated by air to Germany |
| Iraq War | 2003– | P. vivax | Few cases |

^a Records for the Confederate forces were difficult to find (probably not kept). One example in South Carolina was 42,000 cases in 18 months in 1862–1863. (Malaria was endemic in the United States until the late 1940s.)
^b The 1913 drop in the malaria rate was due to control measures enforced by Colonel Gorgas.
^c Malaria rate for troops training and/or garrisoned in southern states.
^d At one point there were 629 cases/week in troops returning home.
^e Some operational areas were intense: in the Ia Drang Valley (1966) the malaria rate was 600/1000/year, the equivalent of 2 maneuver battalions rendered inoperative.
^f In Bardera in 1993, where malaria is hyperendemic: 53/490 cases in Marines.
^g Attack rate (June–September 2002) 52.4/1000/year.
TABLE 2-2 Other Limited U.S. Military Actions/Deployments (Actual or Standby) in Locations with a Malaria Threat (1990 Onward)

Where possible, malaria casualty data are included in the tables. Notably, since the Vietnam War, U.S. military actions abroad have (with the exception of the Iraqi operations) been increasingly small, short, and intense, and have taken place in geographic areas with significant malaria threats (Tables 2-1 and 2-2). To use a preventive medicine phrase, the malaria problem is most often “local and focal,” as was the experience of the U.S. Marines in Liberia in 2003.
Prior to World War II, malaria caused significant morbidity and mortality in the American Civil War and the Spanish-American War and threatened U.S. strategic interests in Panama. At the end of World War I malaria was a problem among U.S. troops garrisoned and training in southern states. During this period the U.S. Army Medical Department distinguished itself through the leadership of three remarkable individuals: MAJ George Sternberg, MAJ Walter Reed, and MAJ William Gorgas. Their combined successes (especially Gorgas’s in Panama) led to today’s U.S. military operational malaria strategy: control, prevent, and treat (Ockenhouse et al., 2005).
During World War II this basic malaria strategy was put to the test of global warfare. Unfortunately, vector control has limited application on the rapidly changing battlefield (although the introduction of dichlorodiphenyltrichloroethane [DDT] in 1944 was a late success [Harper et al., 1947]). Treatment was problematic given the shortage of quinine that
persisted until the introduction of Atabrine (quinacrine) in 1943–1944, after which prophylaxis became a possibility. Prevention was limited to personal countermeasures (topical repellents and bed nets) as there was no malaria vaccine.
The strategic shortage of quinine caused by the Japanese blockade, together with intelligence that German scientists had developed new synthetic antimalarial drugs (Sontochin and Resochin), led the U.S. military and the Allies to focus on antimalarial drug discovery and development. A program for chemotherapeutic research was launched in the United States in 1941 that involved strong collaboration among the armed services, scientific institutions, university laboratories, and pharmaceutical firms (WHO, 1986). Although too late for the World War II effort, this exceptionally coordinated alliance produced an arsenal of new antimalarial drugs that included chloroquine, primaquine, and pyrimethamine. Over the next 20 years many experts, military and civilian, came to believe that the world malaria situation could be controlled and that malaria might even be eradicated. This confidence was supported by the success of primaquine in treating relapsing P. vivax malaria in U.S. troops returning home from the Korean War.
In 1960 resistance to chloroquine was reported in South America and Southeast Asia. Soon cases of chloroquine-resistant P. falciparum malaria were being reported in U.S. military personnel in South Vietnam, and a new generation of antimalarial drugs was needed to protect and treat U.S. forces. The U.S. Army Research Program on Malaria was launched in 1963 as part of the Walter Reed Army Institute of Research (WRAIR) (WHO, 1986). By 1974, 26 new drugs (or combinations) had been developed, 11 of them completing clinical trials, with mefloquine as the flagship response to chloroquine-resistant P. falciparum. Fansidar (sulfadoxine/pyrimethamine) became available when it was licensed in the United States in 1983, but it was little used by the military for prophylaxis because of the risk of adverse effects and because mefloquine became available in the late 1980s. Even as mefloquine was arduously moving through Food and Drug Administration (FDA) approval, in vitro resistance was reported in 1989 by Army scientists working in Thailand. Reports of clinical failures soon followed, and by the mid-1990s clinical failure rates had reached 50 percent in Southeast Asia.
In the early 1970s, pioneering clinical trials by academic investigators (Clyde, 1975) supported by the U.S. Army Medical Research and Materiel Command (USAMRMC), and by Naval Medical Research Center (NMRC) investigators collaborating directly with academic investigators (Rieckmann et al., 1979), showed that immunizing volunteers with hundreds of bites from irradiated, parasite-infected mosquitoes protected the subjects from challenge with infected Anopheles. Seizing the initiative from these early insights, in the
early 1980s both the WRAIR and the NMRC had nascent but productive malaria vaccine efforts. This work focused on understanding mechanisms of protective immunity in human and animal models. Then the specter of rapidly evolving multidrug-resistant P. falciparum malaria, combined with advances in parasite culture and molecular biology, led the malaria vaccine community to produce the first human recombinant circumsporozoite protein (CSP) and synthetic peptide malaria vaccines (Ballou et al., 1987; Herrington et al., 1987). These early clinical trials were followed by a resurgence of interest in the irradiated sporozoite immunization model and a search for immunologic correlates of protection. The stage was set, and both the WRAIR and NMRC programs began in earnest to develop a military vaccine (Heppner et al., 2005; Richie and Saul, 2002).
In 2003, the critical need for a military malaria vaccine and for newer antimalarial drugs to overcome escalating multidrug-resistant P. falciparum and the toxic side effects of mefloquine was dramatically illustrated when a Marine expeditionary unit put 290 men ashore in Liberia on a peacekeeping mission. Within 2 weeks the unit suffered a 28 percent P. falciparum attack rate (80 of 290 men), with 40 Marines evacuated by air to the military regional medical center in Landstuhl, Germany. One must anticipate that operational scenarios similar to the Liberia deployment will continue to occur elsewhere in Africa and in other malarious regions of the world.
Currently troops sent to endemic areas are expected to take malaria prophylactic drugs as instructed and to use personal protective measures such as mosquito nets and insecticide-impregnated uniforms. Compliance with chemoprophylaxis is notoriously low, especially when concerns about adverse effects surface and the risks of malaria are not well understood (as in Liberia in 2003). The rapid emergence of malaria drug resistance and the dwindling number of options for chemoprophylaxis make this a risky strategy to rely on. Personal protective measures are not 100 percent effective on their own, and insecticide resistance is an additional threat to the continued effectiveness of impregnated materials. Both chemoprophylaxis and mosquito net availability depend on supply chains that may not be fully operative in combat and emergency deployment situations.
A highly effective vaccine that could be given to U.S. personnel before departure for a malaria-endemic region is a much-needed solution and would be far more reliable than the partially effective methods of chemoprophylaxis and personal protection.
Recommendation 2.1: The Department of Defense (DoD) should markedly enhance its research and development efforts to produce
malaria vaccines suitable for DoD needs. Malaria is a severe ongoing threat for U.S. military personnel deployed to malaria-endemic areas of the world, and current malaria prevention and control methods are indisputably inadequate.
COST AND TIME NEEDED TO PRODUCE A VACCINE
Assuming that a successful vaccine can be developed and produced, what are the likely costs of this task? These were estimated in a 2000 report from an independent committee of experts, chaired by Franklin Top, M.D., that was convened to make recommendations on improving the DoD acquisition process for vaccines (Top et al., 2000). The report, Department of Defense Acquisition of Vaccine Production (referred to here as the “Top report”), examined the feasibility of vaccine production for defense against biological agents, but its findings are also relevant to naturally occurring diseases. They included cost estimates for vaccine development and production, and some summary findings are reproduced in Table 2-3.
The research and development costs estimated by the Top report for discovery through production and licensure of a single vaccine were $300 to $400 million in year-2000 dollars, with clinical trials estimated to represent 30–40 percent of the total development cost. The additional costs listed in Table 2-3 represent what would be required if the DoD were actually to produce a number of vaccines in-house (estimated by the Top report at 8 different vaccines, requiring approximately 2,500 skilled individuals). The concept here for a government-owned, contractor-operated facility is for full vaccine production, not just pilot-lot production.
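The clinical-trial share quoted above implies a dollar range that is easy to make explicit. A rough sketch using only the rounded figures from the Top report (pairing low with low and high with high is an illustrative simplification, not part of the report):

```python
# Implied clinical-trial cost range, from the Top report's rounded figures:
# total R&D of $300-400 million, with trials at 30-40 percent of the total.
# Pairing the range endpoints low-with-low/high-with-high is an assumption
# made here for illustration.

def share_range(total_low: float, total_high: float,
                frac_low: float, frac_high: float) -> tuple:
    """Dollar range implied by applying a fractional range to a cost range."""
    return (total_low * frac_low, total_high * frac_high)

low, high = share_range(300, 400, 0.30, 0.40)
print(f"Implied clinical-trial cost: ${low:.0f}M to ${high:.0f}M")
# i.e., roughly $90M to $160M of a $300M-$400M development program
```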
The estimates of the Top report are compatible with those of Greco, who used a $300 million estimate, of which the majority ($210 million)
TABLE 2-3 Industry Benchmark Cost Estimates for Vaccine Production

| Element | Cost/Product (in $ millions) |
| --- | --- |
| Research and development | 300–400 |
| Facility capital costs | 370 initial^a |
| Additional production, labs, and support | 75–115^b |
| Manufacturing operations and maintenance | 30–35 per year |

^a First three vaccines.
^b For each vaccine beyond initial 3 to 4.
SOURCE: Top et al., 2000.
was needed for the late development phase (Greco, 2004). The average time required for new vaccine development was estimated at 10 years (Struck, 1996).
By far, the two major direct costs in a decade-long vaccine development program that results in licensure of a final product are the clinical trials (mainly phase 2 and 3) and the construction and furbishing of a manufacturing facility where the vaccine can be produced following licensure. Depending on the specific vaccine, considerable costs may also be incurred early on in process development, to learn how to manufacture the vaccine economically and consistently. The biologics license application submitted to the FDA contains extensive information characterizing the product and summarizes the safety, immunogenicity, and efficacy data from well-designed clinical trials. The application also documents that the manufacturing processes yield a product that is consistent in its relevant characteristics and in the clinical acceptability and immunogenicity of different lots, and it describes the features of the facility that will manufacture the vaccine. The costs of constructing and furbishing a manufacturing facility vary greatly from one vaccine to another, depending not only on the characteristics of the specific vaccine but also on the maximal number of doses the facility is expected to produce.
In summary, the estimated cost of achieving a high likelihood of developing a successful new vaccine starts at a minimum of roughly $300 million, spent over at least 10 years. Although the cost and time commitments seem enormous, they must be balanced against the current expense of preventing and treating malaria and against the ineffectiveness of current prevention and control methods, which has led to high casualty numbers in many deployments (see Table 2-1).
Recommendation 2.2: The DoD should formally acknowledge the high cost of developing any new vaccine and the fact that the Military Infectious Diseases Research Program (MIDRP) Malaria Vaccine Program is severely underfunded in relation to the goal. To increase the probability of success, this discrepancy needs to be rectified.