Soldier Health and Performance
Many developments in biotechnology will initially be intended for medical applications. Over the next 25 years, the pharmaceutical and biotechnology industries will make enormous investments in technologies to translate information uncovered by genomics into knowledge of disease pathways and targets; that knowledge will then be used to develop novel therapeutic compounds and vaccines. This chapter examines developments in therapeutics and genomics that are expected to be important to the Army as a whole, as well as to individual soldiers.
Based on current trends, vaccines and therapies will soon be tailored to suit individual soldiers. New technologies may lead to dramatic increases in the development of new drugs and vaccines and to equally dramatic reductions in the time and cost of developing them. For the Army, these reductions in time and cost will provide opportunities to develop therapeutics and vaccines against diseases that are not of commercial importance but are endemic to areas where forces may be deployed.
If current trends continue, predicting and designing drugs to augment individual performance will be possible. Unlike previous wars in which dozens of soldiers were needed for each kilometer of front, in future wars there may be only one or two soldiers per kilometer of front or 10 to 15 soldiers per square kilometer in shifting combat zones with no fronts (Rhem, 2000). In these wars, fewer warfighters will have increased responsibility for each battle. With advances in our understanding of biological processes, the effects of combat stress might be mitigated and the survivability of warfighters increased.
The term genomics was first used to refer to information of interest to industry about DNA, including sequence information. The term functional genomics has come to refer to information about what genes do, especially information about RNA and protein products of genes, often called proteomics. In this report, all of these are included in genomics, and this whole area of research is called genomic biology.
In principle, genomics provides a means of identifying, in any cell, tissue, or organism, all of the important genes and regulatory regions in the DNA, all of the mRNAs, and all of the proteins in different states of cell and organ function. Genomics has transformed the science of biology by enabling the discovery of new links between protein structure and function (see Box 7-1).
Almost as a by-product, genomics also provides means of identifying differences among different individuals at the level of their DNA, RNA, proteins, and other expressed molecules and of determining the significance of these differences. This information will be valuable for correlating differences with particular outcomes, gaining insight into the biological mechanisms caused by or affected by these changes, and suggesting grounds for subdividing, or stratifying, populations such as soldiers. Therapies and enhancers could even be tailored for individual soldiers to accomplish specific ends.
Genomics Information-Gathering Techniques
For practical purposes, DNA information does not change during the life of the organism. By definition, this information can be strongly or weakly predictive of behavioral characteristics (see section below on prediction and enhancement of soldier performance). By contrast, mRNA and proteins change in response to events inside and outside the organism and can be used to predict events that occur over long (year to year) or short (hour to hour) time frames. Genomics information can be gathered through techniques involving DNA, RNA, and proteins.
One means of gathering information about DNA is to sequence it, which means determining the nucleotide sequence
Determining Function Through Protein Structure
The central dogma of genomics is that sequence determines structure determines function. Sequence applies to DNA sequences, and structure applies to the structure of the proteins encoded by the DNA. As more and more gene sequences are identified, more information is becoming available for deriving the functions of gene products.
One way to determine the function of genes ought to be by protein structure, because the overall fold of a protein should correlate with, at the very least, its biochemical function. Thus, proteins possessing a Walker ATP-binding motif should bind and hydrolyze ATP or some other nucleoside triphosphate; proteins with an EF-hand should bind calcium ions and use this binding as a conformational switch, and so forth.
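The motif-based reasoning above can be sketched in a few lines of code. The following is a minimal illustration of scanning a protein sequence for the Walker A (P-loop) ATP-binding motif using its commonly cited consensus pattern, [AG]-x(4)-G-K-[ST]; the example sequence is invented for illustration, and real motif scanning uses curated pattern databases and profile methods rather than a single regular expression.

```python
import re

# Consensus pattern for the Walker A (P-loop) ATP-binding motif:
# [AG]-x(4)-G-K-[ST], written as a regular expression.
WALKER_A = re.compile(r"[AG].{4}GK[ST]")

def find_motifs(sequence, pattern=WALKER_A):
    """Return (start, matched_substring) pairs for each motif hit."""
    return [(m.start(), m.group()) for m in pattern.finditer(sequence)]

# Hypothetical protein fragment containing a Walker A motif.
seq = "MSDNAGPSGSGKTAREYLVQAA"
print(find_motifs(seq))  # → [(5, 'GPSGSGKT')]
```

A hit like this suggests, but does not prove, that the protein binds and hydrolyzes a nucleoside triphosphate; the inference from fold or motif to biochemical function is exactly the correlation the text describes.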
This reasoning is the basis for the new Structural Genomics Initiative, a concerted effort by dozens of structural biology laboratories around the world to determine at least one three-dimensional structure—by either nuclear magnetic resonance (NMR) spectroscopy or, more commonly, by x-ray diffraction methods—for every type of polypeptide chain fold. Reasoning that the availability of a “fold library” would enable the structures of any homologous protein to be built by simple modeling using the representative structure from the library as a template, proponents of this initiative hope to determine 2,000 to 5,000 structures a year until the catalog of folds is complete.
Some computational biologists are attempting to solve protein structures by predicting them directly from the amino acid sequence. These efforts are of two main types: (1) recognizing the fold from homology with proteins of known structure and (2) folding of linear polypeptide sequences in silico to produce a three-dimensional model of the protein. The former technique, which is already well established, constitutes a large part of what is commonly termed bioinformatics. The technique of folding a protein sequence directly into its structure, however, has a checkered history. Most “successful” attempts have produced atomic models that only grossly resemble actual structures, with no possibility of the model being useful for drug design or functional studies. These methods involve using empirical energy potential functions to manipulate a set of atomic coordinates in a computer to attain a state of lower empirical potential energy. The underlying assumption is that the global minimum of this function will be the correctly folded structure.
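The energy-minimization idea, and its central weakness, can be shown in miniature. The toy model below (an invented one-dimensional double-well "energy landscape", not a real protein force field) minimizes an empirical potential by naive gradient descent; which minimum is found depends entirely on the starting point, illustrating why reaching the global minimum of a rugged landscape is so hard.

```python
# Toy illustration of empirical-potential minimization (assumed, invented
# potential -- not a real force field). The double well below has its global
# minimum near x = -1.04 and a higher local minimum near x = +0.96.

def energy(x):
    return (x**2 - 1) ** 2 + 0.3 * x  # tilted double-well potential

def gradient(x, h=1e-6):
    # Central-difference numerical derivative of the energy.
    return (energy(x + h) - energy(x - h)) / (2 * h)

def minimize(x, step=0.01, iters=5000):
    # Naive gradient descent: slide downhill from the starting point.
    for _ in range(iters):
        x -= step * gradient(x)
    return x

left = minimize(-2.0)   # starts in the left basin: finds the global minimum
right = minimize(+2.0)  # starts in the right basin: trapped in a local minimum
print(left, right, energy(left) < energy(right))
```

A real polypeptide has thousands of coupled degrees of freedom and an astronomically rugged landscape, so the trapping seen here becomes the dominant failure mode of direct folding simulations.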
Simulations of protein folding require enormous amounts of computer time even though the time scales simulated are generally much shorter than the time actually required for a protein chain to fold in vivo. Recent efforts have focused on the development of massively parallel supercomputers, such as IBM’s Blue Gene computer, to simulate folding on more nearly physiological time scales. However, it is unlikely that this will provide a simple, near-term solution to the problem of direct structure prediction.
Improving algorithms for recognizing protein-protein association sites from sequence information seems to be worthwhile, as is the development of improved empirical potential-energy parameters for proteins. In the short term, however, experimental determination of protein structure will almost certainly greatly outpace de novo structure prediction, at least for proteins with no obvious homology to proteins with known structures.
Important work being done in this area and enabling technologies include the development of functional databases; computational tools that analyze three-dimensional structures for small-molecule and protein-binding sites; and computational approaches to the recognition of catalytic motifs in protein folds. Determining the structures of several thousand proteins per year is a formidable challenge, but the methods to meet the challenge are nearly in hand. Lack of access to synchrotron radiation, the high cost of multiuser facilities, and the need for new materials are the main obstacles. Increasing access to synchrotron radiation will require equipping new beam lines for high-throughput protein crystallography and providing personnel to operate crystallographic beam lines.
A central facility for the large-scale cloning, expression, and purification of proteins from human cells and pathogens could serve as a resource for the entire structural biology community. Computational tools that can solve the phase problem in protein crystallography and automated electron-density map interpretation will also be necessary. Because not all proteins can be crystallized, high-throughput NMR initiatives may be helpful for increasing the rate of production of new structures; especially interesting would be using x-ray and NMR techniques in complementary ways.

Source: Petsko, 2000.
of contiguous stretches of DNA up to the entire genome of an organism. Now that the DNA sequences of many organisms, including humans, are known, a treasure trove of biological knowledge, including insights into evolutionary history and biology, has been revealed. The implications have barely begun to be realized.
The cost of sequencing has dropped tenfold in the past five years to about $0.20 per base pair in 2000, and the development of enabling technologies is still accelerating. At that rate, the cost will have dropped another five orders of magnitude by 2025. By that estimate, it will cost $1 to sequence 500,000 base pairs and $6,000 to sequence an entire human genome. In other words, by 2025, comparisons of complete DNA sequences between individuals and reference sequences may have become routine.
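The extrapolation above is simple compound arithmetic, sketched below with the constants stated in the text (roughly $0.20 per base pair in 2000, a tenfold cost drop every five years, and a human genome of about 3 billion base pairs).

```python
# Extrapolating the sequencing-cost trend described in the text.
cost_per_bp_2000 = 0.20                     # dollars per base pair in 2000
years = 2025 - 2000
orders_of_magnitude = years / 5             # tenfold drop every 5 years -> 5 orders
cost_per_bp_2025 = cost_per_bp_2000 / 10 ** orders_of_magnitude  # $2e-6 per bp

print(cost_per_bp_2025 * 500_000)           # → 1.0  ($1 per 500,000 base pairs)
print(cost_per_bp_2025 * 3_000_000_000)     # → 6000.0  (whole human genome, ~3 Gbp)
```

The projection assumes the historical trend continues unbroken for 25 years; like any straight-line extrapolation, it is a rough planning figure, not a forecast.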
At the moment, although it is not economically feasible to determine complete sequences of human individuals, it is possible to identify differences among DNA sequences by a
number of methods. These differences are referred to as polymorphisms, or nucleotide polymorphisms. If two DNA sequences differ at a single position, the difference is referred to as a single nucleotide polymorphism, or SNP (pronounced snip) (see Box 7-2).
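Conceptually, detecting a SNP is a positional comparison of two sequences. The sketch below assumes the two sequences are already aligned and of equal length; real pipelines must first align the reads and handle insertions and deletions, and the example sequences are invented.

```python
# Minimal sketch: locating single-nucleotide differences (SNPs) between two
# pre-aligned DNA sequences of equal length.

def find_snps(seq_a, seq_b):
    """Return (position, base_a, base_b) for every single-base difference."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned to equal length"
    return [(i, a, b) for i, (a, b) in enumerate(zip(seq_a, seq_b)) if a != b]

reference  = "ATGGCTTACGA"
individual = "ATGGCTAACGA"   # differs from the reference at one position
print(find_snps(reference, individual))  # → [(6, 'T', 'A')]
```

Each such position, if it varies across a population, is a candidate SNP marker of the kind described in Box 7-2.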
Information about polymorphisms is generated by a growing list of technologies, ranging from (1) PCR with selected sets of oligonucleotide primers to (2) differential hybridization to (3) photolithographically synthesized arrays of oligonucleotides on chips to (4) ligation chain reaction to (5) the use of cycles of polymerization from an oligonucleotide primer, resulting in DNAs of different lengths that can be analyzed by time-of-flight mass spectrometry.
SNPs and other nucleotide polymorphisms are especially valuable markers for identifying function or behavior expressed by an individual because they can be easily and accurately detected. SNPs and other genetic markers have advantages over phenotypic markers because it is easier to identify many different markers in DNA from two different individuals than to determine an equivalent number of genetically determined phenotypic traits, such as eye or hair color. The Army commonly uses DNA as an aid in the identification of soldier remains; in the future it may become possible to use distinguishing characteristics of DNA as a biometric for secure access (e.g., to conclusively identify soldiers authorized to use classified information).
Although these technologies are too new to establish a trend line with the same confidence as for sequencing, it seems inevitable that the cost of determining a set of SNPs in an individual (i.e., genotyping the individual) will also drop
Single Nucleotide Polymorphisms
The easiest way to introduce the concept of single nucleotide polymorphisms (SNPs) is to think of the different meanings of the word gene. To molecular biologists, for the last 50 years, genes have been stretches of genomic DNA that encode the proteins that are components of cells and organisms. This definition of a gene (i.e., something that encodes one of the component parts) has largely superseded an older definition based on work by Mendel and made rigorous by Morgan, Sturtevant, Muller, and others in the early twentieth century. In the older definition, a gene was a unit of heredity (i.e., something that encodes a detectable property of an organism). That property is called a trait or, nowadays, a phenotype. The older definition of a gene only makes sense operationally: one can determine whether a trait is present (score a phenotype) only if the phenotype comes in at least two distinguishable configurations—tall or short, blue-eyed or brown-eyed, sickle cell or non-sickle cell, dimple or no dimple. Classical genetics depends on these scorable differences.
Different forms of a gene that give distinct phenotypes are referred to as alleles, and genes are said to exist in different allelic states. For example, say that a man has inherited a trait, pattern baldness, from his father or mother. The relevant gene might encode a serum testosterone receptor found in hair follicles, and the man may have inherited the pattern baldness allelic form of the gene, which might, for example, encode a form of the testosterone receptor that causes the hair follicles to die after exposure to serum testosterone for 20 years.
The fact that one has pattern baldness means that one has inherited DNA from one’s mother or father that carries the pattern baldness allelic form of the gene. Because the pattern baldness allele was located on a chromosome next to many other genes, if one has the pattern baldness allele from one’s father, one is also likely to have the same allelic forms of genes near that allele in one’s father. Those traits thus cosegregate and are said to be linked.
The cosegregation of traits (linkage) also occurs in the population as a whole because the human species is still young. Most estimates suggest that modern humans emerged no more than 80,000 years ago, only 3,200 generations removed from a founder population in Africa. The small number of generations means that an individual who has a specific allele also has a high probability of having other specific alleles at nearby genes. So the presence or absence of a particular marker, such as a SNP, is a strong predictor of the allelic state of nearby genes. This fact allows us to look at the SNP and infer the state of the neighboring genes, rather than having to isolate the neighboring genes and sequence their DNA. The closer together two locations on the genome (loci) are on the DNA, the greater the chance that allelic forms of them are linked (will tend to be inherited together).
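The effect of the species' youth on linkage can be made quantitative with a deliberately simplified model: if two loci recombine with probability r per generation, the chance that a chromosome segment carrying both alleles passes through g generations unbroken is roughly (1 - r)^g. The recombination fractions below are assumed round numbers chosen for illustration.

```python
# Simplified single-lineage model of linkage persistence (assumed model;
# real linkage-disequilibrium decay also depends on population structure).

def prob_still_linked(r, generations):
    """Chance a segment spanning two loci is never split by recombination."""
    return (1 - r) ** generations

GENERATIONS = 3_200  # ~80,000 years at ~25 years per generation (from the text)

tight = prob_still_linked(0.0001, GENERATIONS)  # very closely spaced loci
loose = prob_still_linked(0.01, GENERATIONS)    # loci ~100x farther apart
print(tight)  # ~0.73 -- association largely preserved over 3,200 generations
print(loose)  # ~1e-14 -- association essentially erased
```

This is why a SNP predicts the allelic state of its close neighbors but says almost nothing about genes far away on the chromosome.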
An allelic variant that can be easily detected and defined (scored) is called a marker. The use of DNA polymorphisms as genetic markers has advantages over using phenotypic markers based on a trait of the organism. The main advantage is that DNA is chemically uniform, so one can score, say, 5,000 different markers in two different individuals much more easily than one can determine the same number of individual genetically determined phenotypic traits (eye color, curly hair or straight hair, dimple in chin or no dimple).
Information on DNA polymorphisms for a population enables one to examine traits and outcomes in the population and determine which, if any, of those traits and outcomes are correlated with particular SNPs. Correlation is evidence that allelic variants of genes responsible for those traits and outcomes are located near the SNP.

Source: Brent, 2000.
rapidly until it becomes cheaper to sequence an entire genome than to score a set of SNPs. Thus, comparisons of SNP differences among individuals, which are already being made on a large scale, will increase dramatically.
Genes are transcribed into messenger RNA (mRNA), and the different mRNAs are then translated into proteins, which do the work of the living organism. There is a general positive correlation between the identity and number of mRNAs present in a sample (an extract from a cell, tissue, organ, or organism) and the number of the encoded proteins present in that sample.
Unlike proteins, different mRNAs have similar chemical properties. Therefore, populations of them can be converted by identical manipulations into complementary DNA (cDNA), amplified (by PCR), and detected (by their ability to hybridize to oligonucleotides or longer pieces of DNA immobilized, for example, on chips). The determination of the identity and number of different mRNAs in a sample is often referred to as transcription profiling, or gene-expression monitoring (GEM).
Many GEM methods can generate the same information. The best technology at present, Affymetrix photolithographic oligonucleotide arrays, would cost about $1,000 to measure 1,000 different mRNAs, or $1 per mRNA. Motorola and Corning have announced competing products that should cut this cost by a factor of 10. That cost will continue to drop rapidly, and it is fair to anticipate that in 10 or 15 years measuring the expression of as many human genes in a sample as one chooses, up to the entire set of 30,000 or so genes, will cost only a few dollars. The committee believes that widespread monitoring of gene expression using mRNA techniques will make it possible to gather data on human responses and, eventually, apply the information to situations in near real time.
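The kind of analysis a GEM measurement supports can be sketched simply: compare mRNA abundance in a baseline sample with abundance after some exposure and flag the genes whose expression changed. The gene names and counts below are invented for illustration; real profiling involves normalization and statistical replication.

```python
import math

# Hypothetical mRNA counts for three genes, before and after an exposure.
baseline = {"hsp70": 120, "il6": 40, "actb": 1000}
exposed  = {"hsp70": 960, "il6": 640, "actb": 1050}

def log2_fold_changes(before, after):
    """log2 ratio of expression (after / before) for each measured gene."""
    return {g: math.log2(after[g] / before[g]) for g in before}

changes = log2_fold_changes(baseline, exposed)
up_regulated = {g for g, fc in changes.items() if fc > 1.0}  # more than 2x up
print(up_regulated)  # → {'hsp70', 'il6'}: candidate stress-response signature
```

Scaled from three genes to all 30,000 or so, and from hours to near real time, this is the comparison that would let expression patterns serve as early warnings of physiological events.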
Impact of Genomics on the Prevention and Treatment of Disease
Genomics will certainly alter the treatment and prevention of diseases (pharmacogenomics). Future therapies will depend on a complete understanding of the genetic biology of individuals. With genetic profiling, the effect of individual genetic elements on disease and fitness can be determined. Genome sequences will be used to reveal inborn (germ-line) mutations present at birth and somatic mutations that have occurred over the life of an individual. Mutations that may predispose an individual to certain diseases, cancers, or environmental sensitivities will be identified. Many illnesses will come to be viewed as phenotypic manifestations of genetic differences.
Genomics-based treatment will become the norm. At the DNA level, there is already a wealth of information that can predict whether a person will respond well to a therapeutic technique. For example, people who carry certain allelic variants of cytochrome P450 CYP2D6 do not convert codeine into morphine and do not benefit from the drug; people with amplifications in CYP2D6 metabolize codeine so well that standard doses are ineffective (Sindrup and Brosen, 1995). Similar DNA-polymorphism-based stratifications of patients will help identify subgroups of military personnel that will benefit from, or be adversely affected by, particular drugs. At the RNA and protein level, expression monitoring will be developed to the point that changes in the patterns of expression of particular genes in patients with particular genetic makeups will provide early warnings of therapeutic or toxicological outcomes (e.g., after receipt of a vaccine or exposure to a chemical or biological threat agent).
Most of these genomic capabilities will be developed by the civilian sector for medical conditions and diseases that are of economic interest in the affluent countries of the industrialized world. Thus, the military will not always be able to rely on the market to generate capabilities for military applications. For example, if the military is interested in DNA, RNA, or protein information that can be used in chemical and biological defense toxicology or in guiding malaria prophylaxis or therapy, it will have to conduct that research itself.
The Army has good reason to take advantage of the knowledge genomics will provide and is in a good position to further the state of the art. For example, the recent anthrax vaccination program undertaken by the Army was for a controlled population against an agent that is not addressed by general public health services. The Army was assigned to manage the administration of vaccinations to 2.4 million members of the military services and to monitor progress in each service (DOD, 1998). The immunization program consists of a series of six inoculations over an 18-month period, followed by an annual booster. The cost of vaccinating the total force over a six- to seven-year period was estimated to be about $130 million (USAF, 1998).
This and similar vaccination programs could provide a unique opportunity for use of genomic techniques to improve prophylaxis. Administration of a program on the scale of the anthrax program requires extensive monitoring and followup and could provide a perfect test environment for monitoring responses to vaccination in a large population. Advancing the understanding of the genetic basis of different responses to this vaccine would advance the prescription of drugs and vaccines by genotype.
Most vaccinations can cause adverse effects in small numbers of people receiving the vaccine. Anxiety about the adverse effects of a single vaccine (for example, those that occurred during the swine flu vaccination program in the 1970s) often leads to a general anxiety about all vaccinations. During the course of the anthrax program, a small number of service members refused to be inoculated.
Insofar as the military mirrors the larger society, such
social concerns are an important factor in successful vaccination programs. The public discussion of the Gulf War syndrome revealed a widespread distrust of the protection provided to American service personnel shipped to strange lands and subjected to foreign risks. Whether justified or not, the anxieties are real. Anxieties about vaccination in both the military and the general population are likely to persist. A long-term approach to ameliorating this concern could include the systematic collection of genomic data to monitor the responses of participants in vaccination programs.
Assuming that issues relating to personal privacy can be overcome, the Army has a unique opportunity to collect and use genomic data. For example, if the Army collected genotypic information on individuals who receive a vaccine and compared that information with later information about the success of the immunization or adverse effects, it could determine if genetic markers could be correlated with the vaccine’s ineffectiveness. Even weakly predictive information could be useful. For example, information that a given individual was not likely to be easily immunized by a vaccine could be used to recommend a different kind of prophylaxis, such as additional booster doses of the vaccine, increased clinical surveillance, or auxiliary prophylactic therapy, such as antibiotics in addition to the vaccination.
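The marker-versus-outcome comparison described above amounts to an association test on a contingency table. The sketch below computes an odds ratio for immunization failure with and without a hypothetical SNP marker; all counts are invented, and a real study would add confidence intervals and significance testing.

```python
# Hedged sketch of genotype/outcome association for a vaccination program.
# All counts are hypothetical.

def odds_ratio(marker_fail, marker_ok, nomarker_fail, nomarker_ok):
    """Odds of immunization failure with the marker vs. without it."""
    return (marker_fail / marker_ok) / (nomarker_fail / nomarker_ok)

# Hypothetical cohort of 1,000 genotyped vaccinees:
#                    failed    immunized
# marker present       30         170
# marker absent        20         780
ratio = odds_ratio(30, 170, 20, 780)
print(round(ratio, 2))  # → 6.88: carriers far likelier to fail to seroconvert
```

An elevated odds ratio of this kind, even if only weakly predictive for any one individual, is exactly the signal that could justify extra booster doses, closer clinical surveillance, or auxiliary prophylaxis for marker carriers.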
The Army is also in a position to add this information to its research base. It would be of great value to the Army, and possibly to society as a whole, to increase epidemiological monitoring of troops deployed abroad and to add genomic information to such studies. In collaboration with academic and industry researchers, the Army could contribute to a better understanding of vaccination and to the development of vaccines tailored to individual genotypes.
Prediction and Enhancement of Soldier Performance
Combat effectiveness can be increased by enhancing the performance of individual soldiers. Because genomics information offers clues to improving human performance, it could provide the Army with means of increasing combat effectiveness.
Prediction of Performance and Outcomes
Genomic techniques will enable the measurement of differences at the cellular level in DNA, mRNA, and expressed proteins. These measurements could potentially lead to predictions of differences in performance among soldiers, as well as the direction and tailoring of individual therapies and augmentations. In the future, genomic methods could be used to screen or supplement physical tests for qualities such as strength, endurance, marksmanship, or the ability to function when deprived of sleep; these predictions could be used to help in assigning individuals to perform appropriate tasks.
The Army has long used physical characteristics, such as vision or physical size, to subdivide soldier populations and has used the outcome of physical and mental tests to determine who is qualified to pilot helicopters or to join elite combat units. Experience has shown that some individuals cope with the horrors of combat better than others, but the basis for these differences is not understood. Many police and fire departments use psychological screening tests to evaluate the ability of individuals to deal with stress. Army experts believe that psychological screening may be used to improve soldier performance in close-combat units (Rhem, 2000). Genomic methods may eventually be used to supplement these performance-based tests.
Intelligent decisions based on genomic information can only be made with a clear understanding of what biological information does and does not indicate. Predicting the response of an individual will always be highly uncertain. Stratification, particularly when the outcomes can result in inclusion or exclusion, should be based on performance criteria rather than genomic criteria. If genomic criteria are used to supplement other selection tools, the criteria should have a mechanistic explanation whenever possible, be highly predictive of performance, and be continually reviewed and revised.
The use of genomic data to predict differences in performance among individuals requires a careful consideration of ethical issues. A good example is the sickle-cell trait, for which the underlying mechanism is well understood. The sickle-cell trait is caused by a point mutation difference (a SNP) in hemoglobin. The point mutation causes differences in the biophysics of the hemoglobin that, among other things, affect how it binds oxygen in low-oxygen environments. If someone is homozygous for the mutant hemoglobin, he or she will not do as well in low-oxygen environments. A DNA test that measures the presence of this mutation will be measuring the cause of the difference in oxygen utilization and will be highly predictive. Other SNPs linked to the sickle-cell trait will be less predictive. Here, the facts that the mechanism is well understood and that a direct test for the trait is available determine what constitutes a fair exclusionary criterion. Measuring for a less predictive SNP might result in the exclusion of qualified individuals.
Emerging genomics tools offer the Army an outstanding opportunity to improve the understanding of the biological bases of differences that affect military outcomes. Because of its mission orientation, the Army may also be in a unique position to benefit from genomic monitoring. In all cases, broad ethical issues, such as privacy and equal treatment, must be addressed, and other issues may arise as we learn more about ourselves than we wish to know or than we wish others to know. Assuming that these issues can be resolved, the committee believes the Army should use genomic monitoring tools to further its purposes.
Physicians distinguish between therapy (restoring a function to normal) and enhancement (boosting a function above the norm) and have traditionally been uneasy with the ethics of enhancement (a good discussion can be found in Kramer, 1993). For the military, therapy, enhancement, and augmentation may all be desirable. As long as social norms of acceptable drug use are observed, the Army should welcome drugs that could ease the adjustment to another time zone or to longer periods without food or sleep; the Air Force should welcome a drug that could increase the G-force a pilot can endure before blacking out; and the Navy should welcome a drug that could ease motion sickness. To be acceptable, the drug technologies must be both safe and reversible. Guaranteeing soldiers that they will be able to return to their original physiological profile (excluding normal wear and tear) will be very important.
In general, performance-enhancing drugs are not likely to be a focus for development by the private sector. The pharmaceutical industry spends hundreds of millions of dollars to develop new drugs targeted against diseases in affluent countries. The markets for performance enhancers, including the military, are smaller, and it is not realistic to think that the industry, much less the military, will be able to spend large sums on development.
Fortunately, radically cheaper and faster approaches to drug discovery are emerging. These include the use of combined genomic information with modern chemistry to generate new drugs and the use of many kinds of genomic information to streamline testing in animals and humans. Therefore, if the military wishes to realize the promise of enhanced performance that advances in biology, genomics, and chemistry can provide, it will have to make common cause with the other constituencies trying to reduce the time and lower the cost of drug development.
The Army can take advantage of commercial improvements in gene-expression-monitoring techniques to monitor threats to soldiers, as mirrored in gene-expression responses to external stimuli, and to provide new methods of improving soldier training and performance. The Army should optimize gene-expression-monitoring techniques for soldier applications, especially for the detection of target threat molecules through toxicological genomics.
In the next 15 years, static genomic (DNA) information will be used to target individuals and to direct therapies tailored to be safe and effective for their genotype and to suggest differences in performance. Dynamic genomic (mRNA and protein) information will be used to predict differences in performance, track changes in individuals, and provide early warning of positive and negative health events. Complex ethical and privacy issues will have to be addressed, but the committee believes that guidelines for use of genomics technologies will evolve as the benefits become more firmly established.
Research in genomics is an important way for the Army to ameliorate the risks associated with placing susceptible individuals in harm’s way. The Edgewood Chemical Biological Center has in-house facilities that are uniquely situated and equipped to pursue meaningful genomics research. The Army should develop prophylactic tactics and protocols for rapidly detecting pathogenic or cytotoxic agents to which soldiers may be exposed and for which both high and low levels of exposure could have long-term consequences for their health and performance. The Army should monitor developments in genomics and take advantage of advances to improve its screening tools.
The Army should become an early user of genomic data. It should develop predictors of individualized immune responses to vaccines so that the vaccines can be tailored to genotypes. The Army should lead the way in laying the groundwork for the open, disciplined use of genomic data to enhance soldiers’ health and improve their performance on the battlefield.
TRENDS IN DRUG DEVELOPMENT
The advent of genomics, screening techniques based on a mechanistic understanding of molecular and cellular biology, the widespread use of structural biological information, and improvements in medicinal chemistry and computational chemistry have dramatically improved the processes leading to discovery and development of new therapeutic compounds. Biotechnology will be enormously prolific in the next 10 to 25 years. The following developments will impact the Army:
cheaper means of discovering and testing new therapeutic compounds
lower costs of manufacturing through improvements in conventional production methods and the adoption of new ones
Currently, the development of a new therapeutic compound requires 7 to 12 years and costs more than $400 million per product. Drug development is a complex, multistep process beginning with an investment in research leading to an understanding of the biochemistry and function of a putative compound (discovery research), followed by production of a small quantity of material, assessments of toxicity in cell and animal studies (preclinical studies), development of process-scale manufacturing methods, applications to regulatory authorities for permission to conduct clinical trials, clinical trials, and posttrial evaluation.
Protein Therapeutic Compounds
Until the 1980s, the discovery and characterization of proteins (e.g., hormones, antibodies, derivatives of cell-surface receptors, and other protein molecules) as drugs were rare. Today, several dozen therapeutic protein compounds are approved for sale in the United States and Europe, and hundreds more are in development. These include hormones, cytokines, growth factors, monoclonal antibodies, and therapeutic enzymes; most of these compounds are derived from genetically engineered organisms. Protein therapeutic compounds have been effective against diseases as varied as diabetes (insulin), cystic fibrosis (DNAse), hemophilia (Factor VIII), some cancers (interleukin-2, α-interferon, humanized anti-Her2 antibody), and hematopoietic deficiencies (erythropoietin). Most of these compounds either replace or supplement proteins with known functions (e.g., erythropoietin, insulin, Factor VIII) or bind to and/or affect known, well-studied receptors or infectious agents (e.g., anti-Her2/Neu antibody).
Genomics has greatly facilitated the discovery of new protein therapeutic compounds. In addition to the conventional pharmaceutical industry, specialized companies, such as Human Genome Sciences, Millennium Pharmaceuticals, Amgen, Genentech, and Genetics Institute, are making wholesale use of genomic methods to identify potential therapeutic compounds. Of the tens of thousands of human genes, 5,000 to 10,000 may encode secreted proteins, and a significant fraction of these probably have biological effects that could be useful in some therapy (e.g., B-lymphocyte stimulator [BlyS] and osteoprotegerin [Opg1]). Thus, although all secreted proteins may be discovered and characterized by 2015, the development of therapeutic applications will probably take longer.
Protein therapeutics is not restricted to secreted proteins. Most existing therapeutic compounds (both small-molecule and protein) act by enhancing or inhibiting the function of a cell-surface receptor; therefore, it is widely believed that novel cell-surface receptors represent a promising class of therapeutic targets. It is estimated that approximately 20,000 human genes encode cell-surface receptors. One of the most common ways of affecting a response through a cell-surface target is to use a monoclonal antibody that binds to the receptor (see Box 7-3).
Protein Expression and Production
The majority of existing protein therapeutic compounds and vaccines are derived from genetically engineered bacterial, yeast, or mammalian cells grown under well-defined conditions. The manufacturing methods involve careful purification to ensure that the protein is free of impurities. For human proteins that have large therapeutic effects from small amounts of protein, production is not the primary cost driver. For proteins that must be administered in large amounts, or over a period of many years, the cost of production is a major factor. Recently, lower cost production methods have been devised, including expression of transgenic proteins in the milk of goats and cows, in chicken eggs, and even in plants. The abundance of new proteins will accelerate trends in the commercial sector toward improving protein expression, characterization, and purification. The result will be faster and less costly development of novel protein therapeutic compounds.
Small-Molecule Therapeutic Compounds
Small-molecule therapeutic compounds dominate the drug market. Although protein therapeutic compounds, such as monoclonal antibodies, represent a rapid route to safe and effective products against novel targets, they have several significant drawbacks. Most significantly, proteins are not generally available orally and must be administered by injection. This limits their therapeutic use to proteins directed against truly life-threatening conditions (e.g., to treat diabetes), for which daily injections are acceptable, or to treatment modalities in which less frequent injections can be beneficial (e.g., anti-TNF-α to treat rheumatoid arthritis).
Hundreds of small-molecule therapeutic compounds (e.g., aspirin) derived from synthetic compounds and natural products are on the market. Most of these are designed to treat medical problems with significant markets in affluent countries, such as pain, inflammation, high blood pressure, blood clots, depression, schizophrenia, cancer, diabetes, and Alzheimer’s disease. The Army does not and should not support research in these areas. However, the Army does need treatments for infectious diseases, such as malaria, that could have an adverse impact on military operations.
The Army should be very interested in developing small-molecule drugs for indications other than infectious diseases, such as drugs to ameliorate shock caused by blood loss. As sequencing and structural information on all proteins becomes available, the discovery and design of small-molecule compounds that interact with specific novel targets will improve. The pharmaceutical industry will continue to invest heavily in these developments in the next 10 years, and the Army should track these developments closely.
Countering Chemical and Biological Threats
As the executive agent for the DOD Chemical Biological Defense Program, the Army is responsible for the development and acquisition of therapeutic compounds that could be used to counteract the effects of biological and chemical warfare agents, a “market” area that is not likely to be addressed by the commercial sector. In-house expertise will be needed to take advantage of developments in genomics and other areas that might enable leveraging of limited resources to produce vaccines, identify protein therapeutic compounds, identify targets for small-molecule drugs, and
In 1975, Georges Kohler and Cesar Milstein fused cells derived from mouse B lymphocytes, which secrete antibodies, to mouse myeloma tumor cells, which can grow indefinitely in culture (a, at right). Fused cells, called hybrid myelomas or hybridomas (b, below), can grow in cell culture and produce large quantities of antibodies. Populations of cells derived from a single founder cell are clonal; therefore, the chemically identical antibodies they secrete are called monoclonals. Monoclonal antibodies are useful diagnostic agents and, increasingly, protein-based drugs. Unfortunately, humans mount an immune response to monoclonal antibodies from mice, known as the human antimouse antibody (HAMA) response; therefore, current practice is to generate “human” monoclonal antibodies. One way to do this is by “humanizing” murine monoclonals by mutating the amino acids in the framework that differ between mice and humans. Another way is to derive monoclonal antibodies from the spleens of transgenic mice whose immune systems produce human antibodies. A number of in vitro “display” methods can be used to create large libraries of human antibodies or antibody fragments that can be screened against a wide range of targets. Source: Olson, 1986.
otherwise accelerate the development of drugs. For example, a reasonable step would be for the Army to assemble a database of target molecules and make this database available to academic and commercial laboratories so that, if they chose, they could identify compounds that act on those targets. Given the rapid developments in chemistry and computational structural biology, it may become possible in this decade for the Army itself to identify the most important compounds in some cases.
Closer interaction with the commercial sector might enable the Army to make better use of existing compounds. For example, an antiviral compound that was not developed
by the commercial sector to treat flu symptoms because of the potential side effects might merit evaluation by the Army for counteracting pathogens on the battlefield. Moreover, there are now more than 10 immunomodulatory cytokines (e.g., interleukins, such as IL-1 through IL-18) approved for use in humans. In principle, the proper combinations of these would offer new means of modulating immune responses to existing or new infectious diseases. The Army will be the main driver for clinical research on diseases of interest mostly to the military.
Finally, the Army should also monitor developments in commercial production. Through outsourcing, the United States may become dependent on other countries for the production of many critical pharmaceutical intermediates. U.S. advantages in a conflict could easily be compromised if the Army must depend on sources from countries with which the United States is not allied for the production of critical materials. For example, the committee is aware of only one remaining facility in North America that produces intermediates for beta-lactam antibiotics. Although it probably does not make sense for the Army to invest in infrastructure for the production of these compounds, it would be logical for the Army to establish a database of facilities and associated capabilities available to use in case of an emergency. This database could also be extended to include other key upstream and downstream aspects of the pharmaceutical industry, such as the status of clinical trials.
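The facilities database recommended above could be organized around a simple structured record per facility. The sketch below is purely illustrative; the field names and example values are assumptions, not a proposed standard.

```python
# Hypothetical record layout for an emergency manufacturing-capabilities
# database, as suggested in the text. All field names are illustrative.
from dataclasses import dataclass, field


@dataclass
class FacilityRecord:
    name: str
    country: str
    products: list                 # e.g., critical pharmaceutical intermediates
    processes: list                # e.g., fermentation, chemical synthesis
    capacity_kg_per_year: float
    regulatory_status: str         # e.g., current inspection/approval status
    clinical_trial_links: list = field(default_factory=list)  # upstream/downstream info


# Example entry (fictional facility) covering the beta-lactam case cited above.
rec = FacilityRecord(
    name="Example Plant",
    country="USA",
    products=["beta-lactam antibiotic intermediates"],
    processes=["fermentation"],
    capacity_kg_per_year=50000.0,
    regulatory_status="inspected",
)
print(rec.country)
```

A structure like this would let planners query, for example, all facilities producing a given intermediate inside allied countries.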
Examples of Army-Industry Cooperation
Recently, DOD was able to leverage commercial development by gaining FDA approval to use ciprofloxacin (CIPRO®) to treat people exposed to aerosolized anthrax (Bacillus anthracis). This approval was based on safety and efficacy data from a DOD animal study. Bayer, the manufacturer, applied for the approval after extensive discussions with DOD, CDC, and the FDA (Inglesby et al., 1999). Without treatment, anthrax is 99 percent lethal to unprotected people (DOD, 1998). Although penicillin and doxycycline have been approved to treat anthrax, there are reports that strains of the bacteria have been engineered to resist these antibiotics. A Working Group on Civilian Biodefense, comprising 21 representatives of academic, government, public health, military, and emergency management organizations, including the U.S. Army Medical Research Institute of Infectious Diseases, was organized through the Johns Hopkins Center for Civilian Biodefense Studies. The group recommended early antibiotic treatment with ciprofloxacin or other fluoroquinolone therapy as the first line of treatment, and the FDA approved its use in this circumstance.
Cooperation has come to be expected in the medical community but is less common in Army relationships with the commercial high-technology industry. The Army could take advantage of commercial research and development through cooperative research and development agreements (CRADAs), which would provide the Army with some of the benefits of teaming with industry. But CRADAs cannot provide industry with the same guarantees of future business as traditional teaming arrangements within the industry.
The DARPA Unconventional Pathogens Countermeasures (UPC) Program is an example of government funding for applications of interest to the military that has resulted in potentially fruitful research. Organizations funded by DOD for this program are using state-of-the-art or revolutionary technologies to address chemical-biological defense requirements. Like past DARPA projects, this research may also have great commercial potential.
Many of the future biotechnology applications of interest to the Army in the fields of genomics and drug discovery are outgrowths of biomedical research sponsored by another government agency, the National Institutes of Health (NIH). The Army should work closely with the NIH to build on the existing research base and should consider cosponsoring research, both to leverage NIH relationships with industry and to ensure that Army needs are met. The Army is currently cosponsoring nonmedical research with DARPA on the Future Combat Systems Program, and similar arrangements with other government agencies, such as NASA or the National Science Foundation, should be investigated.
The examples of successful government-industry-academic collaborations above underscore the potential gains to the Army that can be realized from close interactions with the biotechnology industry. In addition to the DARPA UPC model and the traditional, direct-contract model (e.g., contracts to produce doses of vaccine), intermediate arrangements, such as CRADAs for joint research, would be effective in areas where private sector investment is driving research in new and emerging areas.
The following mechanisms might also be used to leverage private-sector technology developments:
contracts that allow participants to use commercial practices and retain intellectual property
government funding to mitigate the technical risk of producing selected prototypes
teaming relationships, which often lead to cooperative agreements between participants
These mechanisms can allay industry’s reservations about working with government, including noncommercial accounting requirements, potential government audits, special regulations governing government contracts, restrictions on trade, infringements on intellectual property rights, the risk of procurement violations, or possible negative press if the biotechnology is perceived to be used for weapons.
Improved Technologies for Drug Delivery
With micro/nanofabrication technologies, devices and components could be manufactured from a constantly expanding array of materials with unprecedented precision (see
section in Chapter 6 on miniaturization of biological devices). Considerable attention has been focused on using these technologies for the development of novel, advantageous methods of delivering therapeutic and, possibly, performance-enhancing drugs (NIH BECON, 2000).
Several different types of nanotechnology drug-delivery systems are under investigation to introduce drugs via implanted devices and by precise intravascular injection. Research is also being conducted on micromachined needle arrays for the delivery of drugs through the skin (Brazzle et al., 2000). This technology is expected to lead to significant progress in the transdermal delivery of proteins and small molecules. Lin et al. (1998) have proposed portable systems for reconstituting and delivering drugs on the battlefield.
Implantable microsystems for the controlled release of biotechnological molecules are also under development (Desai et al., 1999). The drug-delivery chip presented by Robert Langer and associates at MIT (Santini et al., 1999) can deliver multiple drugs in a pulsatile, remotely controlled or automatically regulated manner (Figure 7-1). This chip could be used to deliver insulin and health-promoting drugs, as well as performance-enhancing drugs or threat-mitigating drugs on demand.
Several approaches to the delivery of therapeutic molecules have led to currently available pharmaceutical products. These include biodegradable particles, liposomes, and direct conjugation of drugs to immunological molecules. To overcome the limitations of these approaches, several groups are engaged in developing the microtechnological and nanotechnological foundations of intravascular delivery mechanisms (Nashat et al., 1998).
Trends and Barriers
Improved transdermal delivery and implantable, controlled-release microtechnologies are expected to be ready for commercialization in the short term. Implantable devices capable of self-activation through a feedback loop combining sensing and the release of a therapeutic agent could be available in 5 to 10 years. These devices will require the
development of sensing and drug-release mechanisms and the technology to integrate them.
Extraordinarily difficult biological barriers will have to be overcome for the development of effective, biologically targeted, intravascular drug-delivery systems. Micro/ nanofabrication scientists and life scientists will have to work together in the development of delivery technologies based on biologically inspired methods.
The high-throughput screening of molecules that are candidates for therapeutic drugs may be accomplished through new nanotechnological concepts as well as through other procedures. Micro/nanomachined templates might be used for the realization of cell culture environments that mimic the complexity of the biological environment (Bhatia, 1999). In addition, cell microenvironments may be instrumented to provide real-time measures of cellular responses to physical or chemical stimuli. These concepts may also provide the foundations for tissue-engineering constructs.
Technologies for Invasive Interfaces
Invasive interfaces (i.e., implanting a material in the body) may be necessary for sampling and analyzing molecules in the body. Micro/nanofabrication techniques are being used to develop a rejection-free method of transplanting biological cells for therapeutic use (Desai et al., 1999). Therapeutic drugs for insulin-dependent diabetes, chemotherapeutics, and local analgesics are driving commercial investments. The development of compatible biomaterials will also be a necessary part of the solution.
A recently developed microtechnological biocapsule (Figure 7-2) is more efficient than previous methods of isolating the therapeutic agent from the immune system response (immunoisolation), is more biologically benign, and is nondegradable. This technology may have particular specialized applications for the Army. For example, biocapsules could be used to implant cells that have been genetically altered to produce particular molecules. In this way, the biocapsule would become an implantable drug factory that could deliver performance-enhancing or therapeutic molecules for desired time periods.
The concurrent development of a fundamental understanding of cellular biology may interface with, and eventually supplant, such technology platforms, although it is difficult to predict exactly how. In their own way, cells may be considered to perform as biocapsules, and it may prove to be easier to reimplant an individual’s own cells, or work with engineered cells, to achieve the same goals.
The nanopores in the cell transplantation biocapsule shown in Figure 7-2 are obtained by a combination of photolithography and sacrificial-layer technologies. Nanopore technology may also prove to be instrumental in the development of breakthrough instrumentation, such as ultrarapid sequencers. For other biological and nanotechnological constructs, however, much more work will have to be done.
Scalable fabrication methods will be necessary for commercial viability; without them, the private sector is not likely to invest energetically in the type of nanotechnologies described here. Private-sector interest in nonbiological nanotechnologies (e.g., nanometer-sized holes, carbon nanotubes) seems certain. But advances in biological nanotechnology (e.g., programmable molecular assemblers that may be necessary for implants) may not even be possible.
Basic research in nanoscience is now being funded by the Army as one of its Strategic Research Objectives (DA, 1998). The committee suggests that the Army adjust its focus on nanoscience to include the interfacing of biology with materials science to influence the direction of developments in biological nanotechnology. The lack of nanotechnology with a biological component is a barrier to important developments in sensor capabilities and can only be addressed through a broad-based effort by industry, government, and the military.
Implants and Biocompatibility
Protective packaging of an implantable sensor and associated actuators may or may not be possible because the sensor will have to be in direct contact with the biological milieu, which may generate adverse reactions from the body, lead to the rapid deterioration of the sensor, or both. Such complications have stalled the development of implantable glucose sensors to monitor therapy for diabetics for many years.
Like other biological material developments, current research to enhance the biocompatibility of silicon is focused on surface modifications that minimize unwanted, nonspecific protein adsorption. Leaders in the field include Frances Ligler and Bruce Gaber at the Naval Research Laboratory, Micronics, Inc. (a Seattle-based company), and Miqin Zhang at the University of Washington (Zhang et al., 1998). All of these researchers are attempting to covalently immobilize protein-resistant groups (e.g., alcohols, polyethylene glycol) following the principles developed by Mrksich and Whitesides (1996). Engineering protein adsorption on surfaces is crucial not only for in vivo implantable devices, but also for microfluidics and nanofluidics in laboratory devices. For instance, nonspecific protein adsorption on silicon-oxide islands spontaneously formed on silicon surfaces may foul microfluidic and nanofluidic devices. Recent developments in the field are discussed in Desai et al. (1999).
Implant materials may interact negatively with the biological milieu in a number of ways that collectively define the notion of biocompatibility. These include chemical or mechanical instability, release of harmful degradation products, cell toxicity, inflammatory reactions, formation of thrombi and emboli upon contact with blood, and the introduction of potentially carcinogenic alterations in nucleic acids. Drug-delivery implants may also fail if the effective release of the therapeutic payload is compromised by the scar tissue naturally produced by the recipient organism as a reaction to both the implantation surgery and the implant material.
Protein-implant and cell-implant interactions are determinants in these failure modes. For instance, the adsorption of plasma proteins is the priming event for the involvement of platelets and the clotting cascade that leads to the formation of thrombi and emboli. The activity of phagocytic cells at the implant site is believed to trigger a chain of events that leads to the formation of a fibrotic capsule around the implant. Ensuring biocompatibility is frequently a matter of engineering protein and cell resistance at implant interfaces, either directly on the implant material or indirectly by a biocompatible coating on the implant structure.
Biocompatibility is such a serious problem with semiconducting materials that direct contact between semiconductors and the biological milieu is not even mentioned in comprehensive references (e.g., Black, 1999; Horbett and Brash, 1995), even though semiconductors comprise the core technology of electrical stimulation devices such as pacemakers, wearable defibrillators, and pain-control and seizure-control implants. In these devices, the semiconducting materials are not directly exposed to the biological environment but are contained in a silicone-lined casing.
Solving the problem of biocompatibility, which is now considered a collection of difficult engineering problems, has been slow. A possible solution might eventually be found with the “biological” engineering of cells, but new ground will have to be broken in this field.
Biocompatibility issues will have to be overcome to enable the development of implant devices, such as biocapsules, to monitor soldier health or deliver antidotes to toxic agents. Such devices could increase soldier survivability, as well as unit combat effectiveness. The challenge for micro/nanofabrication technologists and life-science experts will be to collaborate on the development of biologically inspired methods and materials to improve drug-delivery technologies. The Army should monitor progress in implant research, drug-delivery technologies, and alternatives that could meet its needs.
Somatic Gene Therapy As an Alternative to Implanted Devices
By 2025, it is likely that somatic gene therapy will be developed to the point that it can be used to direct the synthesis of protein therapeutics in individual soldiers, thus obviating the need for implantable devices. For example, gene therapy agents could be transfected into cells by bombarding a patch of skin with DNA-coated pellets from a gene gun. As the cells are sloughed off, expression of the therapeutic protein would naturally cease but could be renewed by another application of the agent. By 2025, reliable and robust means of delivering DNA constructs to other cell types will also become available. In fact, much or all of the technology implanted into the individual soldier will probably be derived from the individual’s own cells rather than from fabricated devices.
Barriers to Development of Therapeutics and Vaccines
The use of genomic information to improve the protection of military personnel, and even to help direct their tasks and training, will require that the Army educate itself about the limitations and potentials of these techniques and, in some cases, will require that complex ethical issues be addressed. In addition, the fabrication of devices that can come into contact with the interior of the body also faces formidable technical challenges. Controlled delivery of therapeutic molecules might be possible someday by “devices” that do not have fabricated components but consist solely of engineered cells reimplanted in the body or even self-assembled after injection. For the present, however, the Army can address the barriers to immunization and drug development using proven developments in genomics and biotechnology.
The challenges facing the Army’s research on vaccines should be considered in the context of the larger society. During the many decades the Army has been engaged in the development of vaccines, it has developed close working relationships with industry and has made substantial progress. The Army has the expertise and experience to define and address its needs for the development of vaccines. In fact, in some cases, the Army is the world authority. The Army also understands commercial-sector strengths and weaknesses and has worked well with industry.
Currently, the Army is actively involved in the development of vaccines against malaria (Plasmodium vivax), diarrheal diseases (including Rotavirus), flavivirus (including dengue), and rickettsia and is actively pursuing research on hemorrhagic fever viruses (e.g., filoviruses like Ebola) and other highly lethal viruses (Hoke, 2000). The Army also supports significant programs in the development of adjuvants (see Appendix D). In general, the Army has tried to cooperate with, and coordinate its research with, commercial partners. But the interests of commercial companies, whose research budgets are much larger than the Army’s, are not necessarily compatible with the Army’s objectives. Most of the barriers to the development of vaccines of interest to the Army can be attributed to these business realities.
The global vaccine industry is dominated by six big companies, all of which also make therapeutics. The companies are Merck, SmithKline Beecham (soon to become GlaxoSmithKline), Wyeth-Ayerst (now part of American Home Products), Pasteur Mérieux (once part of Rhône-Poulenc Rorer, which is now part of Aventis), Bristol-Myers Squibb, and a relative newcomer, Chiron (a biotechnology company in the process of transforming itself into a fully integrated vaccine and pharmaceutical company).
For these big companies, research and development costs for a single vaccine can easily top $200 million, and the time from program start to marketing may be as long as 12 years. To recoup these costs, a potential vaccine must have peak revenues of at least $200 million per year and must earn revenue for 10 years. Therefore, the target price for a single dose or course of vaccinations is more than $50, and usually more than $100. These economic realities all but preclude commercial investments in vaccines of interest to the Army.
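As a rough check on these figures, the break-even arithmetic can be sketched as follows. The $200 million R&D cost, 12-year development time, and 10 revenue years come from the text; the 10 percent discount rate and the annual dose volume are illustrative assumptions.

```python
# Back-of-envelope break-even sketch for vaccine development economics.
# Figures from the text: ~$200M R&D, ~12 years to market, ~10 revenue years.
# Assumed for illustration: 10% discount rate, 2 million courses sold per year.

def required_annual_revenue(rd_cost, years_to_market, revenue_years, discount_rate):
    """Level annual revenue whose discounted value covers the compounded R&D cost."""
    # Value of the R&D outlay compounded forward to the launch date.
    rd_at_launch = rd_cost * (1 + discount_rate) ** years_to_market
    # Present value (at launch) of $1 per year for `revenue_years` years.
    annuity_factor = sum(1 / (1 + discount_rate) ** t
                         for t in range(1, revenue_years + 1))
    return rd_at_launch / annuity_factor


rev = required_annual_revenue(200e6, 12, 10, 0.10)
courses_per_year = 2e6  # assumed market size
print(f"required annual revenue: ${rev / 1e6:.0f}M")
print(f"implied price per course: ${rev / courses_per_year:.0f}")
```

Under these assumptions the required revenue comes out near $100 million per year, and the implied price per course is on the order of $50, consistent with the target prices quoted above.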
Although a number of small companies are working on vaccines, they usually try to form corporate partnerships with one of the larger companies, which supplies funding. In exchange, the larger partner usually receives rights to market the successful product. Barriers to the entry of new firms in the vaccine business include the vast specialized expertise required to conduct human testing, market the product, and defend the product against product liability claims.
Liability claims have reinforced the conservative practices of the vaccine industry. In a small number of people, vaccinations have adverse effects. The best example of this is the swine flu vaccination program in 1976. In February of that year, the CDC confirmed that an influenza outbreak at Fort Dix had been caused by the swine-type influenza A virus. Subsequently, the Department of Health, Education, and Welfare, concerned about a major flu epidemic similar to the epidemic in 1918, recommended that the federal government vaccinate all Americans, and more than 40 million people were vaccinated. However, the program was suspended following reports that people in more than 10 states had developed Guillain-Barré syndrome (GBS). By January 1977, more than 500 cases and 25 deaths had been reported (Langmuir, 1979). Despite the lack of a definitive biological explanation for the association of the swine flu vaccine and GBS, there was strong evidence of a causal relationship, which led to millions of dollars in lawsuits (Laitin and Pelletier, 1997). The result of this incident was a decrease in public confidence in vaccination programs and a dampening of the enthusiasm of the pharmaceutical industry to develop vaccines.
In general, because of concerns about safety and product liability, the technology to create new vaccines has not evolved as rapidly as the underlying science. The impact of this conservatism has been a gap between new biological knowledge and capabilities and the availability of new, cost-effective vaccines. Recombinant subunit vaccines coming onto the market are based on 1970s technology. The regulatory, liability, and testing environment has inhibited the development of vaccine technology to the point that current knowledge in genomics has had little impact on the development and production of vaccines.
Drug Development Barriers
Given the current state of drug discovery and development, the time and cost of producing a new drug could potentially be greatly reduced. Genomic technologies could be used to reveal targets; high-throughput screening or computational structural biology and chemical informatics could suggest inhibitors; a combination of computerization and mechanization could improve compounds; the collection of genomic information from animal studies, including new transgenic animal models, could accelerate preclinical development and toxicity studies; and genomic information could allow much smaller targets and, perhaps, single-step human clinical trials with near-real-time monitoring of beneficial and adverse events.
The pharmaceutical industry is subject to some of the same business pressures and liability problems as the vaccine industry, although the number of companies involved in the oligopoly is larger and the industry as a whole is immensely profitable. Nevertheless, radically new ideas are likely to depend on the entry of new firms into the market. The Army has very little leverage over the business landscape but can, by targeted spending of its own research funds and by working with other government and regulatory agencies, encourage new companies that will make use of the new technologies, thus creating an environment more responsive to Army concerns.
The Army should take a particular interest in developing small-molecule drugs to ameliorate shock caused by blood loss. As sequencing and structural information on all proteins becomes available, the discovery and design of small-molecule compounds that interact with specific novel targets will improve, and the Army should track these developments closely.
The Army, and the country as a whole, are becoming increasingly dependent on foreign sources for many critical therapeutic materials. Even though it may be prohibitive for the Army to invest in manufacturing infrastructure, the Army should develop and maintain a database of global manufacturing capabilities, including the biology, processes, and equipment to produce critical therapeutic materials. This database should include other key upstream and downstream aspects of the pharmaceutical industry, such as the status of clinical trials.
Although federal and state regulation of research and development in therapeutics affects both military and civilian products, in exceptional circumstances, national defense needs should be given special priority and provided a legal basis for action. In urgent cases, the Army simply cannot wait while developers strive to meet the extremely high (> 99.99 percent) effectiveness demanded by federal regulators and civilian consumers. Delays could be critical when antidotes, quasimedical devices, and other biotechnology products are needed to meet immediate military requirements.
The government should define and certify special processes for the development and approval of biotechnology applications to meet exceptional Army and other defense needs. DOD must have the ability to identify exceptional requirements and expedite the development of products with the potential to benefit soldiers confronted with an urgent threat or special need.
Developments in cell biology, immunology, molecular genetics, and genomics have led to new concepts that could greatly improve the safety and efficacy of vaccines and reduce the time and lower the cost of vaccine development and production. As the pace of genomics advances quickens, the Army will be hard pressed to take advantage of the many opportunities to provide better vaccines. Reducing the time involved in clinical trials should be a high priority.
The Army should build on its strengths in vaccine development and fund new technological approaches, including genomics developments, DNA vaccines, cell-based vaccines, and monoclonal antibodies. It should also explore using transgenics to shorten the clinical-trial phase for defining toxicity and using pharmacogenomics to shorten the time for Phase III clinical trials, which involve large populations and are difficult, expensive, and prolonged.