THE CURRENT THOUGHT ON BIOTERRORISM: THE THREAT, PREPAREDNESS, AND RESPONSE
David R. Franz
Southern Research Institute
I have spent the last 15 years thinking about biological warfare. Unfortunately, over the last few months the topic has become extremely popular. This morning I will offer a broad perspective on biological terrorism, with the understanding that, for this audience, biology is a subdivision of chemistry.
The last 60 years, as evidenced by the former U.S. and Soviet programs, have been the modern era of biological warfare. The agents studied in the two nations' programs were very similar, and the workhorses of both were the zoonotic agents, those transmissible from animals to humans (see Figure D.1). The physical and infectious properties of biological agents are not all the same: they differ in how they act; whether they cause illness or death; their stability during growth, production, weaponization, storage, and dissemination; the number of organisms required to cause illness; and how infectious they are. The only seriously contagious biological agents, smallpox and plague, were actually weaponized by the Soviets and loaded into refrigerated nosecones of ballistic missiles intended to blanket the United States.
The fundamental differences between chemical and biological agents are very important as we look to the future. Chemical agents are volatile and dermally active, and human contact with a chemical agent is immediately apparent. Biological agents, by contrast, are not volatile, are not dermally active, and have a delayed onset of disease. These properties place serious constraints on biological agent delivery: because the agents are nonvolatile, they are most effectively delivered as respirable aerosols, but preparing agents in that form is a complicated task. Difficulties also arise because distribution of the agent in the air depends completely on meteorology outdoors and airflow patterns indoors.
Biological agents also present problems for antiterrorist law enforcement officials. Whereas chemical weapons can cause fairly immediate death, biological weapons give no immediate sign of exposure, and there are no tools to detect exposure to a biological agent before the onset of sickness. Medical doctors are unfamiliar with the exotic agents used in biological weapons, and the flu-like symptoms common in the early stages of agent-induced illness often lead to misdiagnosis. Not only does misdiagnosis raise the probability of serious health problems or even death for those infected, but these factors also feed an enormous psychological fear of biological agents. Other problems include the lack of "universal" vaccines, the lack of antiviral drugs, social and political issues surrounding prophylaxis, and the difficulty of forensics.
Biological terrorism is a unique threat because of its dual-use nature, evolving technologies, and political factors. The production of vaccines, for instance, requires the growth and subsequent inactivation of viruses or bacteria. One veterinary vaccine facility in Russia is capable of producing 12 metric tons of foot-and-mouth disease virus in a single run. It has been admitted that during World War II, the mission of this facility was to produce 240 metric tons of variola virus, which causes smallpox, for use as a weapon. A similar dual-use situation exists with technology like crop dusters: it is difficult to know whether pesticides are being sprayed or biological weapons are being tested. Biotechnology such as genetic modification adds yet another dimension to the prophylaxis/vaccine dilemma.
Political issues over the last 10 years have had a profound effect on the threat of biological terrorism. As the value of the Russian ruble fell and the country continued in its decline, between 30,000 and 40,000 Russian scientists and engineers, formerly employed at Ministry of Defense weapons facilities, lost their jobs. Perhaps these highly skilled scientists, who had families to feed and rent to pay, simply switched careers, but some were certainly recruited by Syria, Libya, Iran, and North Korea.
It is only in the last 4 or 5 years that the American public has become aware of the perceived terrorist threat. It is actually not significantly different from the threat during the Cold War. To produce an event that causes the death of thousands in this nation, the same agents selected during the Cold War must be used for the same reasons (ease of production, ease of distribution, rate of infection). Most likely, state sponsorship is required to produce the agents in the kind of formulation that would be effective for such a scenario. The exception is the use of a highly contagious agent like smallpox, which would not need to be weaponized.
Agricultural terrorism remains a possibility. The threat is not to the human body: animal disease agents are not human pathogens, and we could safely eat infected meat. The threat is to the economy. In 1997, one infected and contagious piglet in Taiwan decimated the island's pork production capacity, costing $5 million in initial damages and eventually $14 billion in lost revenue.
Where do the risks truly lie? The highest concern lies with highly contagious viruses like smallpox. Such a virus could cause the greatest damage; however, a smallpox attack is the least likely scenario because the only legal supplies of the virus are in Atlanta and Novosibirsk. Illegal supplies probably do exist but are difficult to obtain. Foreign animal disease viruses have the next greatest potential for devastation. Next come the classical agents we normally think of, which are fairly difficult to deliver effectively, and then the hundreds of more mundane agents, like Salmonella, that cause illness but not death. In the future we may also need to beware of genetically engineered agents.
Bioterrorism may occur in the United States today because we cannot be beaten with conventional methods. We do not know how much “brain drain” occurred in the former Soviet Union, and the dual-use nature will never allow the problem to be dealt with through regulation. In the event of an attack, it would be ideal to have a universal detector that would simply indicate “yes” or “no” to the
presence of any biological agent. The next steps would be to identify the attack, the agent, and the people involved, then to neutralize, decontaminate, and remediate the attack site. All of this would ideally be accomplished within 24 to 48 hours, and without any public panic.
Detection is an area in which chemists have been involved for quite a while. Detecting biological agents is much more difficult than was believed at the beginning of the Gulf War. Although anthrax, with an infectious dose of 100 organisms per liter, is relatively easy to detect, many agents, like Q fever, have an infectious dose of only 10 organisms, requiring sensors that can find 10 organisms in 100 liters of air.
Currently, biological agent exposure in an infected person can be detected only by measuring the antibody response (unless methods like nasal swabbing are used, which have a high rate of false negatives, especially for agents that require very few organisms for infection). However, it takes a few days for the antigen to circulate in the blood and for the immune system to respond (see Figure D.2), by which point it is almost too late to treat the infected individual. Chemical and biochemical research needs to focus on tools that allow us to identify exposed individuals before the onset of clinical disease.
The question remains, how do current times compare with the Cold War era? It is now clear that there are people in the world willing to bring harm to civilians. Medical doctors must add new diseases to their differentials. The public now has
a better understanding of the threat through the experience of the anthrax letters, and funding for counter-bioterrorism will increase. The nation's list of vulnerabilities is undeniably different. However, the technical difficulty of bioterrorism, the importance of meteorology, the difficulty of intelligence, detection and response (public health), deterrence and preparedness (law enforcement), and education on related issues have not changed at all since the end of the Cold War.
The solution comes as a two-pronged approach: education and a strong technical base. We have to educate people to be aware of the risk. Any developed nation in the world could produce biological weapons, and we, as scientists who speak the worldwide language of science, have an opportunity to prevent this. Our nation also must have a strong technical base to be ready for both the expected and the unexpected. This will improve our surveillance, diagnostics, and communication (data integration). So many unanswered questions remain regarding biological agents, the immune system, detection, and the like. The time is ripe for basic research.
The current bioterrorism situation is about more than just science and gadgets. I think it's reflected by the flags that I see on my street and on your street, and by the flags I now see on the lapels of business suits that I didn't see before. And I think it's about the American spirit: whether as Americans or as scientists, we have a very important job to do.
WHAT CAN THE INDUSTRIAL CHEMICAL COMMUNITY CONTRIBUTE TO THE NATION'S SECURITY?
Scott D. Cunningham
Today I am speaking on behalf of larger chemical companies. While I can only relate what I have seen at DuPont, my experiences are most likely typical.
The first response of the chemical industry to the events of September 11 was to donate materials needed in the recovery effort. Then it began to think about its own vulnerability. Generally, the chemical industry enjoys a good safety record because of a necessarily heightened consciousness of the many hazardous materials it handles. Recent events, however, have shown that technically knowledgeable people are willing to die in attacks against U.S. interests, and this is what causes concern in the industry. Companies began to rethink nearly everything, from infrastructure to checking identification at the door. They have changed capital expenditure plans, logistics, operations, and research and development. Industry members are now using the momentum of the September attacks to identify weak points in their systems and to reengineer operations. For example, storage tanks have become a major concern as targets of terrorist attacks, causing companies to redesign processes to minimize the accumulation and storage of hazardous intermediates and products.
As the resident HAZMAT authorities in many areas, chemical company employees found themselves responding to numerous community requests. There were so many calls for assistance that there was a shortage of trained responders to handle them. People began to think about communication devices, new systems, and new ways of doing things. Questions were asked like “What did I learn and how do I do better next time? When a call goes out, how do you mitigate the damage? What systems should be blocked off when entering a potentially contaminated building?” The experience gained through helping communities allowed chemical companies to improve their response procedures.
Chemical companies also received calls for many perceived antiterrorist materials such as filters, Tyvek, Nomex, Evacuate, and Kevlar, and pharmaceutical companies received calls for vaccines. The industry began to investigate who was buying these products and whether the products would meet the customers' needs. Materials scientists began to contemplate faster and different production methods, to combine products, and to define new uses for older products. Companies asked, "What do these things mean to the world? How can we make them better and protect our people and our food?" People responded creatively; for example, pharmaceutical companies are now working on faster vaccine production and thinking about new production paths through plants or bacteria.
In the corporate boardroom, the volition to act runs high, not to make a profit, but to help the national security effort. In fact, the chemical industry has a history
of responding to national needs, evidenced by its production of black powder in the War of 1812, explosives and synthetic materials during the world wars, and materials for the space program. For this national effort, though, it is not clear whether there are technology solutions, and it is uncertain what the chemical industry can do to help.
The chemical industry has a proven track record of safety, and has helped companies to make their production plants safer. The industry also has global presence and experience, providing decision tree analysis and disaster planning around the world, including prevention, detection, damage mitigation, and mediation. The chemical industry is capable of integrating multiple sciences to produce a large amount of product at low cost. The industry has great experience with sensors, and is the largest designer, user, and consumer of sensors for risk avoidance and for quality control. These sensors have already been integrated into production plants, but not yet into office building monitoring.
Many chemical companies produce decontamination agents. DuPont, for instance, makes chlorine dioxide, the material used to decontaminate the Hart Senate Office Building after the anthrax contamination. DuPont did not perform the decontamination itself, however, because it did not know the effect of that product on a building, its infrastructure, computers, papers, or even the carpet. Although DuPont makes both the carpet and the decontaminant, the company has never investigated the interaction of the two products. This is the kind of new, integrated thinking that is called for.
The chemical industry contacts supply chains, such as those for construction materials, automotive materials, and food, at multiple points. We can clean, disinfect, and genetically engineer seed. We can provide advanced coatings and packaging materials, as well as systems to detect bacterial contamination and spoilage. Can we make supply chains safer?
The industry wants to help the nation to get security systems where they are needed, at the right times and the right levels, but to do so industry must have more information about specific needs. The chemical industry brings expertise in product development, management, and integration. It wants to be useful. It wants to help mitigate risk. However, it has been very difficult for companies to get a clear idea of what the needs really are. Confusion about what is necessary does not inspire executives to spend employee time, money, and energy to solve a problem that has not been completely identified.
According to the Technical Support Working Group and the General Accounting Office, the chemical industry is focused on the short term. Certainly parts of it are. But perhaps 20 percent of research and development funding is focused on what the world may need next, things that are interesting but uncertain. Yet there needs to be a better interface between what the Defense Advanced Research Projects Agency (DARPA) produces and what industry does. There is a desire to spend more research and development money on terror mitigation, but there needs to be a better way to integrate the programs and systems that DARPA, the National
Science Foundation, and industry are funding, especially as the developer of each small component attempts to protect its own intellectual property.
The terrorist attacks have had a significant impact on the chemical industry. After looking to their own internal safety, companies want to contribute to the homeland defense effort. There is a lot to be gained from industry's scientific knowledge, culture of safety, role in society, science technology, and manufacturing. The chemical industry is willing to help, but obvious solutions are beyond any individual group or company. The process of formulating and coordinating contracts, objectives, and partnerships is underway, but it has only just begun. Getting all the groups and pieces lined up and together is going to take national will, a bigger sense of urgency, or some greater force. Perhaps organizations like the National Academies can help in this process.
THOUGHTS AND QUESTIONS ON COUNTERING THE TERRORIST THREAT
Richard L. Garwin
Council on Foreign Relations, New York
As a resident of IBM's Watson Research Center, and especially as a senior fellow of the Council on Foreign Relations in New York, I'd like to share with you some of the thoughts on national security and homeland defense that I have had since September 11, 2001.
Unfortunately, terrorism and counterterrorism are enormous subjects that are not only vitally important but also urgent. A terrorist attack can come in many forms; the most worrisome of these is airborne bioterrorism. Anthrax distributed upwind of a city is a serious threat that could kill 100,000 people or more. More frightening still is an agent like smallpox that is not only infective but contagious. Past experience with smallpox has demonstrated its capacity to cause tens of millions of deaths around the world.
Another tremendous threat is nuclear explosives, either stolen or improvised. A modern nuclear weapon could kill many millions of people. Even an improvised weapon in New York, on the level of the first-generation weapon used in World War II, could kill a million people. Additionally, there would be complete destruction of a 10-square-kilometer region and hundreds of thousands of people would be exposed to lethal fallout within the first hour. The devastation, however, is limited in extent, unlike a bioterrorism event, placing nuclear weapons second on the list of worrisome scenarios.
Bioattacks on food production need no explanation after the outbreak of foot-and-mouth disease in England. Since foot-and-mouth disease does not affect humans, the effects were primarily economic. Attacks on agricultural products would likely be just as damaging.
Judging from the sarin attacks in the Japanese subway, chemical agent attacks on humans have proven rather difficult. Biological toxins, chemical agents that are often overlooked, may also be difficult to use, but used correctly they can wreak much more havoc: inhaling a single microgram of botulinum toxin is deadly, compared with a lethal ingested dose of about 1 milligram for sarin and other agents. Widespread devastation may also occur as a result of explosive attack on chemical plants or on chemicals in transit, as seen in the accidents in Italy and in Bhopal, India.
Also on the list of potential methods of terrorist attack is radiological attack. Relatively few additional people would die of cancer in the long term, but the short-term psychological damage would be great.
Finally, the threat of a calculated explosive attack on structures still looms large after the three airplanes hit the World Trade Center and the Pentagon.
Through the Council on Foreign Relations, other scientists and I met with New York Governor Pataki's public safety officials to discuss specific counter-terrorism
measures for New York state. We discovered that links between the scientific community and state officials barely exist despite the knowledge that this type of communication is essential.
Collective protection using filtered air was one topic of discussion at the New York meetings. In the event of an anthrax attack, spores will drift into a building or house even with tightly closed windows. This problem can be alleviated with high-efficiency particulate air (HEPA) filters, which have been in use since the Manhattan Project. These filters are currently used in a number of places, such as hospitals. In some buildings HEPA filters can easily be substituted for regular filters, and their cost is not prohibitive.
Inside air needs to be exchanged for outside air only about once every half hour, but the recirculating air can be passed through a filter every few minutes; doing so would reduce the amount of biological agent reaching a person inside a building by a factor of 10. Using a HEPA filter in place of a normal filter reduces the agent dose by another factor of 10. Maintaining positive pressure within the building, so that no unfiltered air leaks in from outside, improves the protection level still further.
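The effect of filtered recirculation can be sketched with a simple one-zone, well-mixed, steady-state model. The exchange rate, recirculation rate, and filter efficiencies below are illustrative assumptions, not measured values.

```python
# Single-zone steady-state model (illustrative). Outdoor agent leaks in at
# q_exchange air changes per hour; indoor air passes through a filter of
# the given efficiency q_recirc times per hour:
#   dC_in/dt = q_exchange*(C_out - C_in) - q_recirc*efficiency*C_in
# Setting dC_in/dt = 0 gives the indoor/outdoor concentration ratio.

def indoor_outdoor_ratio(q_exchange, q_recirc, filter_efficiency):
    """Fraction of the outdoor agent concentration present indoors."""
    return q_exchange / (q_exchange + q_recirc * filter_efficiency)

# Outside-air exchange every half hour -> 2/h; recirculation through the
# filter every few minutes -> ~20/h (assumed values).
normal = indoor_outdoor_ratio(2.0, 20.0, 0.90)    # ordinary filter
hepa = indoor_outdoor_ratio(2.0, 20.0, 0.9997)    # HEPA: 99.97% at 0.3 um
print(f"ordinary filter: {normal:.3f}  HEPA: {hepa:.4f}")
```

In this toy model the ordinary filter already yields roughly the factor-of-10 reduction quoted above, and the residual indoor dose is dominated by unfiltered leakage, which is precisely why positive pressure, closing off that leakage path, matters so much.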
Countering explosives was another topic of discussion in New York. We need to be able to detect the presence of hundreds of grams of explosives on aircraft passengers and in bags. The simple act of bag matching is not sufficient, because today's terrorists will take their explosive-loaded bag onto an aircraft for a suicide mission. Explosives in vehicles, which can cause enormous problems if detonated at a choke point in the highway system or near a building, also need to be detected. Luckily, to do damage in this type of scenario, a large amount of explosive material is needed, which is hard to secrete in an ordinary car.
These are some of the issues facing us that are being discussed around the nation. Scientists and engineers can provide options and contribute to a rational decision-making process, helping to make our nation safer for its residents.
VULNERABILITY OF PUBLIC WATER SUPPLIES
Rolf A. Deininger
The University of Michigan
Water supply systems are vulnerable to destruction and contamination. This is nothing new and has its roots in antiquity. A late director of the FBI called this to everyone's attention, 1 and his prescription for defensive measures was gates, guards, and guns. He also recognized the threat from a disgruntled insider and advised background checks on all employees. In 1970, the World Health Organization (WHO) published a booklet on "Health Aspects of Chemical and Biological Weapons." One appendix of that publication deals with the sabotage of water supplies and discusses agents, scenarios, and expected outcomes of a contamination. A new release of this booklet is expected in 2002.
The major elements of a water supply system are the raw water source (lake, river, reservoir, or groundwater aquifer), the water treatment plant, the pumping stations, and the distribution system consisting of pipes and intermediate storage reservoirs.
Dams on rivers can be destroyed by explosives, leading not only to loss of the source but possibly to serious damage and loss of life from flooding. Pumping stations are not easily replaced and may require considerable repair time, since replacement pumps are not usually stored on site and may take a long time to order. A partial destruction is not too serious, since there is usually spare capacity allowing somewhat reduced service. Destruction of pipelines is easily accomplished but can be fixed quickly: pipe breaks occur routinely in distribution systems as a result of earthquakes and other natural disasters, and utilities are well prepared for such events. There is a great deal of redundancy in a distribution system, and several key elements may be taken out of service without removing the ability of the system to serve consumers at a reduced rate. The key to a higher level of security is therefore the redundancy of the system, which should be investigated for each particular system.
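The closing point about redundancy can be made concrete with a toy connectivity check: given a hypothetical pipe network, which single pipe failures cut any consumer off from the plant? The layout and node names below are invented for illustration; a real analysis would start from the utility's actual network data.

```python
# Toy redundancy check for a water distribution network: which single pipe
# failures cut some consumer off from the plant? Layout is invented.
from collections import deque

def reachable(source, pipes, removed=frozenset()):
    """Set of nodes reachable from `source` over undirected pipes not in `removed`."""
    adj = {}
    for a, b in pipes:
        if (a, b) in removed or (b, a) in removed:
            continue
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    seen, queue = {source}, deque([source])
    while queue:
        for nxt in adj.get(queue.popleft(), ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# A looped system: the plant feeds two mains that join at C; D hangs off a spur.
pipes = [("plant", "A"), ("plant", "B"), ("A", "C"), ("B", "C"), ("C", "D")]
consumers = {"A", "B", "C", "D"}
single_points = [p for p in pipes
                 if not consumers <= reachable("plant", pipes, frozenset({p}))]
print(single_points)  # the spur to D is the only single point of failure
```

The looped mains survive any single break, while the unlooped spur does not, which is exactly the kind of system-specific redundancy audit the text recommends.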
A water supply system may be contaminated at the raw water source, at the water treatment plant, or in the distribution system. Contamination of the raw water source is easily accomplished since it is usually at a location far from the service area. It is not, however, a very effective measure due to the large quantity of water involved. A contamination at the water treatment plant would be more effective, but increases the likelihood of detection since a water treatment plant is staffed around the clock. In addition, the treatment processes will reduce the contaminants by one or two orders of magnitude. Thus the distribution system is
1J. E. Hoover. 1941. Water Supply Facilities and National Defense. Journal of the American Water Works Association 33(11): 1861-1865.
Suggested Reading on Water Safety
1. Berger, B. B. and A. H. Stevenson. 1955. Feasibility of biological warfare against public water supplies. J. AWWA. 47(2): 101-110.
2. Clark, R. M. and R. A. Deininger. 2000. Protecting the Nation's Critical Infrastructure: The Vulnerability of U.S. Water Supply Systems. Journal of Contingencies and Crisis Management. 8(2): 73-80.
3. Delineon, G. P. 2001. The Who, What, Why and How of Counter Terrorism Issues. J. AWWA. 93(5): 78-85.
4. New York Times, Dec. 31, 2001. Files Found: A Computer in Kabul Yields a Chilling Array of al Qaeda Memos.
5. World Health Organization. 1970. Health Aspects of Chemical and Biological Weapons. Annex 5. Sabotage of Water Supplies. pp. 113-120.
the most likely candidate for an attack, especially at distant locations. For this scenario the technology of treatment at the water plant is irrelevant. To be effective, a contaminant must be tasteless, odorless, and colorless; if these criteria are not met, consumers will recognize a problem and avoid using the water.
There are many agents that can cause serious health consequences or death when introduced into a water system. This has been summarized well by Burrows. 2 A rough classification of the agents might be chemical warfare agents, biological agents (protozoa, bacteria, viruses), toxins, and the large group of toxic industrial chemicals (TICs). Chemical warfare agents are normally deployed through the aerosol route and contamination of the water is a secondary effect. They are not a credible threat, and can be removed in the treatment processes. 3
The most dangerous agents are the biological agents and the toxins. Protozoa, for example Cryptosporidium parvum, have contaminated a water supply system with serious consequences. In 1993, the supply of Milwaukee was contaminated, 100 people died, and over 400,000 became ill. Although this protozoan causes serious health effects in the very young, the very old, and the
2W. D. Burrows and S. E. Renner. 1999. Biological Warfare Agents as Threats to Potable Water. Environmental Health Perspectives 107(12): 975-984.
3National Research Council. 1995. Guidelines for Chemical Warfare Agents in Military Field Drinking Water. Washington, D.C.: National Academy Press.
immunocompromised population, it is not a credible threat to the majority of the population. Among the most dangerous biological agents are Bacillus anthracis, Shigella, Vibrio cholerae, Salmonella, and Yersinia pestis. For these agents, infectious doses are not very well known, nor is their survival in distribution systems that carry a disinfectant residual. Data are available only from mice and primates, and are usually expressed as an LD50 (lethal dose where 50 percent of the exposed die). There are serious questions whether this is a proper criterion, and perhaps an LD10 or even lower percentage is appropriate.
The previously mentioned 1970 WHO publication sets out the calculations for contaminating a water supply. It assumes an infectious dose of 10^6 organisms in a glass of water (200 mL). Consider a medium-size city reservoir of 10 million gallons (40 million liters), or about 200 million glasses. The number of organisms necessary would be 200 × 10^12. Freeze-dried bacteria contain about 10^11 bacteria per gram, so 2,000 g would be sufficient, if they could be mixed into the water, to contaminate the entire reservoir. On the other hand, the lethal dose of botulinum toxin is about 1 microgram; thus 200 g (about half a pound) would be enough to contaminate the reservoir such that each glass would be lethal. These are extremely dangerous agents, and the only line of defense is to maintain a chlorine residual in the distribution system.
TICs are threats to water systems, but they are not as problematic as chemical warfare or biological agents. Consider cyanide, with a lethal dose of 25 mg: contaminating the reservoir mentioned above would require 25 × 200 × 10^6 mg, or roughly 5,000 kg (5 tons). Dimethyl mercury is lethal at 400 mg; more than 50 tons would be required. Of course, it is possible to contaminate a smaller reservoir, or several reservoirs simultaneously.
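The dose calculations above reduce to simple arithmetic; the sketch below reproduces them using only the figures quoted in the text (required mass = dose per glass × number of glasses in the reservoir).

```python
# Back-of-the-envelope reservoir contamination arithmetic (figures from
# the talk; this is unit conversion, not a risk model).
RESERVOIR_L = 40e6                    # 10 million gallons ~ 40 million liters
GLASS_L = 0.2                         # one 200-mL glass
glasses = RESERVOIR_L / GLASS_L       # about 200 million glasses

# Biological agent: 10^6 organisms per glass; 10^11 organisms per gram
grams_bacteria = glasses * 1e6 / 1e11          # 2,000 g of freeze-dried bacteria

# Botulinum toxin: ~1 microgram (1e-6 g) per lethal dose
grams_toxin = glasses * 1e-6                   # 200 g

# Toxic industrial chemicals: lethal dose in mg per glass, result in kg
kg_cyanide = glasses * 25 / 1e6                # 5,000 kg (5 tons)
kg_dimethyl_mercury = glasses * 400 / 1e6      # 80,000 kg

print(grams_bacteria, grams_toxin, kg_cyanide, kg_dimethyl_mercury)
```

The dimethyl mercury figure of 80,000 kg is consistent with the "more than 50 tons" quoted in the text.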
We are not defenseless. There are many small steps that can be taken to harden water facilities against an attack. More timely monitoring of the water quality is imperative. At the moment, monitoring of surrogate parameters is possible, but we need to develop on-line direct measurement techniques of pathogens and toxins.
MICROFLUIDICS: DEVELOPMENT, APPLICATIONS, AND FUTURE CHALLENGES
Andrea W. Chow
Caliper Technologies Corporation
The recent development of microfluidics technology has mainly been driven by the need to miniaturize, integrate, and automate biochemical analyses to increase speed and throughput and reduce costs. These advantages can provide significant benefits to many applications for national security and homeland defense, including rapid detection and analysis of biological and chemical warfare agents and facilitating the development of therapeutics against these agents. Although the technology is still in a relatively early stage of development, a few commercial microfluidics products are now available and are beginning to demonstrate the technology's inherent advantages. This paper will review a number of current applications of microfluidics and outline some future challenges for new applications for use in national security and homeland defense.
In many microfluidic devices, a microchannel network is microfabricated in a substrate (glass, quartz, or plastic, for example), and the channel plate is bonded to another piece of substrate with access wells matching the ends of the microchannels. This allows buffers and reagents to be supplied through the wells (see Figure D.3). The two most common means of controlling the flow of fluids and reagents in the microchannels are electrokinetic and pressure-driven flow. These driving forces can be applied to the reagent reservoirs through the instrument interface to the chip, with no active fluidic control element fabricated on the chip. 4 In addition, a number of laboratories are developing on-chip active pumps and valves for flow control. 5 Using these driving forces with on-chip or off-chip fluid control, it is possible to emulate many functions on chip, including flow valves, dispensers, mixers, reactors, and separation process units. For reactions such as the polymerase chain reaction (PCR) that require elevated temperatures, precision temperature control of the fluid in a microchannel has also been demonstrated. 6
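To give a feel for the scale of electrokinetic transport, a one-line estimate: with a typical electroosmotic mobility for an aqueous buffer in a glass channel and a moderate applied field (both values below are rough textbook assumptions, not Caliper specifications), plug flow moves at about a millimeter per second.

```python
# Order-of-magnitude electroosmotic flow estimate (assumed typical values).
mu_eo = 5e-8          # electroosmotic mobility, m^2/(V*s), glass + aqueous buffer
E = 300e2             # applied field, V/m (i.e., 300 V/cm)

v = mu_eo * E         # plug-flow velocity, m/s
channel_mm = 30.0     # hypothetical channel length, mm
transit_s = channel_mm / (v * 1000)
print(f"{v * 1000:.1f} mm/s; ~{transit_s:.0f} s to traverse {channel_mm:.0f} mm")
```

At these assumed values a reagent traverses a 30-mm channel in tens of seconds, which is why serial injection of many samples through one separation channel is practical.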
One important area of technology development is the world-to-chip interface. The requirements for an effective interface are ease of fabrication, low dead volume, ease of automation, no sample biasing from the reagent source, and compatibility with existing sample storage formats. One such interface for high
4A. R. Kopf-Sill, A. W. Chow, L. Bousse, and C. B. Cohen. 2001. Creating a Lab-on-a-Chip with Microfluidic Technologies. Integrated Microfabricated Biodevices, M. J. Heller and A. Guttman, eds. New York: Marcel Dekker.
5P. Gravesen, J. Branebjerg, and O. S. Jensen. 1993. Microfluidics—A Review. Journal of Micro-mechanics and Microengineering 3: 168-182.
6M. Kopp et al. 1998. Chemical Amplification: Continuous-Flow PCR on a Chip. Science 280: 1046-1048.
throughput screening uses a low-dead-volume fluidic coupling of a capillary to a microchannel to enable automated sampling from microtiter plates, a standard format used by pharmaceutical companies for compound storage and bioassay analysis.
Microfluidics-based products for research and high-throughput experimentation now exist in the commercial market. Applications such as genomic analysis, clinical diagnostics, and home-care products are in various stages of development as well. For the research instrumentation market, the Agilent 2100 instrument and LabChip™ DNA sizing application were the first microfluidics products, introduced in 1999. 7 In the DNA sizing application, gel electrophoresis is performed in the microchip for up to 12 samples placed in the reagent wells. The samples are serially injected into the gel-filled separation channel, and a fluorescent intercalation dye is used to stain the DNA fragments. As the fragments move down the separation channel by electrophoresis, a laser placed at the end of the channel excites the fluorescence of the intercalated dye, and a photodiode detector measures the fluorescence intensity of each DNA fragment as a function of transit time. Using calibration markers that co-elute with each sample, the analysis software can determine the DNA size and weight fraction of each fragment accurately and automatically.
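The sizing step described above amounts to interpolating a fragment's transit time against the co-eluting calibration markers. The sketch below shows one way to do this, assuming log(size) varies roughly linearly with migration time over the calibrated range; the marker values are invented for illustration.

```python
# Hypothetical fragment sizing from electrophoretic transit time using
# co-eluting calibration markers (marker values invented).
import math

markers = [(30.0, 100), (45.0, 500), (60.0, 2000)]  # (transit s, base pairs)

def size_from_time(t):
    """Piecewise log-linear interpolation between the flanking markers."""
    for (t0, s0), (t1, s1) in zip(markers, markers[1:]):
        if t0 <= t <= t1:
            frac = (t - t0) / (t1 - t0)
            log_size = math.log10(s0) + frac * (math.log10(s1) - math.log10(s0))
            return round(10 ** log_size)
    raise ValueError("transit time outside calibrated range")

print(size_from_time(45.0))  # lands exactly on the 500-bp marker
```

Real instruments fit more sophisticated mobility models, but the principle, calibrating each run against internal markers so sizing is insensitive to run-to-run drift, is the same.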
For diagnostics, microfluidics has been useful for significantly shortening the analysis time for molecular fingerprinting. The combination of Bacteria BarCodes' rep-PCR technology with Caliper's LabChip™ technology has been shown in a feasibility study to provide a much quicker turnaround time than conventional slab gel electrophoresis for bacterial strain identification in samples. The rep-PCR technology is based on the discovery that repetitive sequences are interspersed throughout the DNA of all bacteria studied to date. The spacing between these repetitive sequences varies among bacterial isolates because their DNA differs. After DNA is extracted from bacterial cells, reagents for PCR are added, along with Taq DNA polymerase and primers that bind to a repetitive sequence. Because these repetitive sequences are highly conserved in all bacteria studied, one set of universal primers can be used on almost every isolate.
During the amplification cycles of the PCR, the Taq DNA polymerase copies the DNA between the primers. Because numerous areas throughout the bacterial chromosome are amplified, many potential genetic differences can be detected. Figure D.4 shows a chip analysis of two samples of bacterial DNA amplified by rep-PCR. The gel electrophoresis results clearly show that the bacteria samples are different strains, as there are band differences appearing at approximately
576, 769, and 1295 base pairs (marked by arrows). This is one example of an existing microfluidic application that can be used for rapid diagnostics following a bioterrorism attack.
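The strain comparison behind Figure D.4 can be illustrated with a short sketch. The band lists and the 5% sizing tolerance below are invented for illustration; a Dice coefficient then scores how alike two fingerprints are, with shared bands counted as matches.

```python
TOL = 0.05  # assumed 5% sizing tolerance for calling two bands "the same"

def match_bands(a, b, tol=TOL):
    """Greedily pair bands from lane a with bands from lane b within tolerance."""
    matched, unused = [], list(b)
    for x in a:
        hit = next((y for y in unused if abs(x - y) <= tol * x), None)
        if hit is not None:
            matched.append((x, hit))
            unused.remove(hit)
    return matched

def dice(a, b, tol=TOL):
    """Dice similarity: 2 * matches / (bands in a + bands in b)."""
    m = len(match_bands(a, b, tol))
    return 2 * m / (len(a) + len(b))

# Hypothetical fingerprints; they differ at several positions, so the
# similarity falls well below 1.0 and the strains are called distinct.
strain_a = [240, 576, 769, 980, 1295, 1600]
strain_b = [240, 640, 980, 1100, 1600]
print(f"similarity = {dice(strain_a, strain_b):.2f}")
```

Production fingerprinting software applies more sophisticated curve-based comparisons, but the band-matching idea is the same.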
Microfluidic technologies are likely to play an important role in high throughput screening of drugs and antidotes.8 Enzymes and peptides can be mixed in a microfluidic reaction channel. Fluorescence intensity measurements can be used to determine whether the addition of a potentially pharmacologically interesting compound actually has an inhibitory effect on the enzyme. One advantage of using this technology over conventional methods is the reduction in reagent usage, which is especially important for proteins that are difficult to produce or purify. Other advantages are speed, higher data quality, and improved reproducibility due to the increased level of automation. Such technology has possible application to antidotes for chemical and biological weapons.
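The inhibition readout described above reduces to a simple calculation against two plate controls. A minimal sketch, with invented fluorescence values for the positive (uninhibited enzyme) and negative (no enzyme) controls:

```python
def percent_inhibition(f_cmpd: float, f_pos: float, f_neg: float) -> float:
    """Percent inhibition of the enzyme by a test compound, where f_pos is
    the uninhibited-control signal and f_neg is the background signal."""
    return 100.0 * (f_pos - f_cmpd) / (f_pos - f_neg)

# Illustrative controls in arbitrary fluorescence units.
f_pos, f_neg = 1200.0, 200.0
for name, f in [("compound A", 450.0), ("compound B", 1150.0)]:
    print(f"{name}: {percent_inhibition(f, f_pos, f_neg):.0f}% inhibition")
# prints: compound A: 75% inhibition
#         compound B: 5% inhibition
```

A screen simply runs this scoring over thousands of compounds and flags those above an inhibition threshold for follow-up.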
For other national security and homeland defense applications, specific requirements will no doubt impose new technological challenges and motivate new innovations in the field of microfluidics. Future challenges include developing suitable world-to-chip interfaces for sample collection (from air or liquid sources), devising methodologies to ensure statistically meaningful sampling (especially when the sample concentration is very low), integrating functions for on-chip sample preparation including sample stacking, understanding and mitigating undesirable surface interactions, developing generic detection methods for high sensitivity and specific detection of a wide range of analytes, and executing effective functional and system integration to yield robust and reliable microfluidics products.
8S. A. Sundberg. 2000. High-throughput and Ultra-high-throughput Screening: Solution- and Cell-Based Approaches. Analytical Biotechnology 11: 47-53.
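One of the challenges listed above, statistically meaningful sampling when the concentration is very low, can be quantified. If organisms are randomly dispersed in a liquid, the number captured in a small sampled volume is Poisson distributed, so the chance of catching at least one is 1 − exp(−cV). The concentration and volume below are assumed values for illustration.

```python
import math

def p_capture(c_per_ml: float, v_ml: float) -> float:
    """Probability of capturing at least one organism: Poisson mean = c * v."""
    return 1.0 - math.exp(-c_per_ml * v_ml)

def volume_for_confidence(c_per_ml: float, p: float) -> float:
    """Sample volume (mL) needed to capture >= 1 organism with probability p."""
    return -math.log(1.0 - p) / c_per_ml

c = 10.0    # assumed: 10 organisms per mL
v = 0.001   # a 1-microliter on-chip sample
print(f"P(capture) for 1 uL at 10/mL: {p_capture(c, v):.3f}")
print(f"Volume for 95% confidence: {volume_for_confidence(c, 0.95):.2f} mL")
```

The numbers make the design tension concrete: a microliter-scale chip sample at this dilution almost always contains nothing, so dilute samples must be concentrated or large volumes processed before the chip can say anything statistically meaningful.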
AFTER SEPTEMBER 11: AN EXPANDED AGENDA FOR SCIENCE AND SCIENTISTS
Ralph J. Cicerone
University of California, Irvine
Tonight I want to give you some largely personal thoughts and observations. One message is that the real edge that the United States has in all of its challenges is science and technology. It's been that way for a long time, over 50 years now. I will argue that we have had previous large crises during which agendas changed, and we have risen to the challenges. Previously and once again, the United States' advantage is science and technology. There's great potential and great responsibility for the chemical sciences community because of that. Our focus is on chemical topics, but it could equally well be communication technology.
Let me go back into the past a little and talk about some other major shifts that have occurred, changes probably as big as the one since September 11. Let's go back to the World War II period, skip ahead to the interval after World War II to the end of the Cold War, and then discuss the period after the end of the Cold War. Pronounced shifts occurred in our attitudes toward our government and the private sector and in how science serves them. To close, I will talk about the post-September 11 period and some questions that have arisen. None of us know exactly what all of our bearings will be, so I will address some of the questions that I think all of you have on your minds. The topics that you've been talking about at this workshop are a good representative sampling of all the things we have to think about, many of which we have not thought about before.
World War II
Until World War II, the size and scope of the United States' science research effort were both very small. Those of you who have read about Vannevar Bush will know part of this story. Vannevar Bush (an MIT electrical engineer) helped to convince the United States government in the early stages of the Second World War that it should broaden its approach: instead of just issuing specific contracts to certain government-corporate labs or to a few universities for specific tasks, the United States should take a more long-term view, with more grant-like support to corporate and university labs.
The successes of science and technology were quite great, including the Manhattan Project and the development of radar at MIT with some related work at Columbia University, developments which, along with their British counterparts and Allied fighting forces, led to victory in the Second World War.
Now, it's hard to realize just how little scientific research was being supported in the United States up until the middle of World War II, but it was minuscule. There were a few people who were convinced that basic and long-term
research, not just that applied to specific projects, was worthwhile, and they held sway after the Second World War. The Office of Naval Research had become convinced to support pure mathematics even during the war, and to this day the Department of Defense maintains and supports some of the strongest basic research in the country.
After World War II and the Cold War
After World War II it was seen that the successes of science and technology in the war were so great, and the threats of the emerging Cold War that were beginning to be felt in the late 1940s were so large, as to encourage the country to create the National Science Foundation (NSF) and some national laboratories. New grant programs were spawned at the NSF, at the Department of Energy (then known as the Atomic Energy Commission, among other names), and at the Department of Defense (previously the Department of War). Today's National Institutes of Health, I think, probably would not be what they are without the model of NSF. Also during this time, the National Aeronautics and Space Administration was created, largely because of the threat that the Soviet Union's Sputnik represented.
Subsequently, basic scientific research made great contributions to our nation's economic strength and its security. Some wonderful examples are described in a series of publications called “Beyond Discovery: The Path from Research to Human Benefit” produced by the National Academies. Each brochure chronicles some discovery in basic science and how it has had applications beyond what anybody could have imagined. Often as a side benefit, many new products emerged from federal government support (for example, communications satellites) and a series of new discoveries was made.
One example is the development of the Global Positioning System (GPS), which came out of early basic work on atomic clocks and hydrogen masers and led to timing devices that no one could ever have anticipated. These timing devices have enabled positions to be measured with tremendous accuracy and precision using instruments on Earth-orbiting satellites, with applications from national defense to personal safety and convenience. Similarly, cochlear implants that enable hearing arose from early fundamental research in anatomy, electrophysiology, and information theory. Developments in laser physics have led to new types of eye surgery, the science of polymers has led to new fibers and artificial human skin tissue, and basic research in genetics has led to many beneficial applications in medicine and agriculture. In this post-World War II, Cold War period, basic long-term research was understood to be good for the country, without any need to ensure immediate applications to specific defense projects.
In the Cold War there were also some specific needs that energized basic research. For example, prior to communications satellites, using the ionosphere to reflect radio waves was the only way to communicate around the world at the
speed of light. In World War II, military powers experimented with the Luxembourg effect (heating the ionosphere) to jam communications and to prevent the opponent from communicating quickly around the curvature of the Earth. Such phenomena and their challenges generated a lot of research; in this case, some pretty good physical chemistry research had to be conducted to understand plasma in the upper atmosphere, in particular, ion-molecule reactions. Understanding of collective phenomena in physics was also gained in this same pursuit.
This kind of rapid and robust progress had not occurred before the Second World War. The key point is that research came to be seen as necessary for national security. A scientist who was very eminent in national policy told me that there has never been any real belief in the value of basic research for its own sake. Instead, it has been supported by the government (since the late 1940s) because of national security or more recently, to develop the economy. It was the need of national security that kept the research engine going.
More broadly, this new, widespread faith in science and technology and its successes on the military front led to the creation of something called the National Defense Education Act of 1958. The NDEA did not simply send people to graduate school in physics and chemistry and mathematics and engineering. Instead, there were NDEA graduate fellowships in fields like history, geography, and the foreign languages, and it provided support for science and math teachers in public schools. All of these good things were driven by national security needs. The perceived need for a strong defense capability and national security provided support for all of higher education.
Basic research in corporate labs also flourished after World War II. Just think of the fantastic capabilities and first-rate science that was associated with corporate labs like those of Bell Labs, IBM, Exxon central R & D, Eastman Kodak, Polaroid, Xerox, and others. Notice that I said “was associated,” because those labs (that have survived at all) are now much smaller and are focused on shorter term projects. During the Cold War, some classified research and development went on in those labs also, and national security was a central priority. The phrase “good enough for the government” originated in the Second World War when factory workers and researchers had to produce something that was good enough for the government. In my lifetime, however, “good enough for the government” has been a put-down.
Confidence in government also characterized the time after World War II and during the Cold War. Over the history of our country it has been seen that one of the legitimate roles for the federal government is to provide for the national defense. There's been a lot of disagreement about many other tasks but not over this one. Military and security needs carried a lot of development throughout the post-World War II period. Of course, the Vietnam War led to less confidence in the government and caused the termination of classified research projects at universities.
1989 Until September 11, 2001
Now we come to the end of the Cold War and another large shift. Not only did science and technology carry the United States through the Second World War and the Cold War period stably and securely, but in the meantime scientific research and development were growing the U.S. economy. There are a number of economists (growth theory economists) who have quite a bit of evidence to show that more than half the economic growth in the United States since 1945 is due to scientific research and development (R & D). There is some argument among economists over how much is attributable to basic research, but if you lump together all research and development, well over half the growth in the U.S. economy is due to scientific R & D, and again, government sponsorship is right at the core of it.
Until the end of the Cold War there was a great deal of confidence and faith in government and in science because of its achievements such as its role underlying economic growth. Winning the Cold War was great, but it also turned out that nobody (that I know) saw it coming or knew what to do next. The end of the Cold War and the fall of the Berlin Wall in 1989 left everybody in a state of surprise. We didn't know what to anticipate.
Without the threat of a 10-foot-tall enemy who had launched Sputnik and had nuclear weapons just like ours and was out to destroy us, we turned away from science and higher education. The end of the Cold War undercut the support for science and for higher education in general. The National Defense Education Act fell into disrepair. Even scientific research, especially basic research, was under great threat and question. I would like to quote Roy Schwitters, who was the head of the Superconducting Supercollider (SSC) project in Texas. When Congress pulled the plug on the SSC, Roy was quoted in a newspaper as saying that the SSC would be just the first of all the big science projects after the end of the Cold War that would be subjected to very intense scrutiny. No longer would the undercurrent of trust and respect for science and higher education carry projects like that. I think Roy turned out to be correct. (Of course, there were other issues with the SSC.)
At about the same time, the end of the Cold War, Japan not only emerged as a high-tech economic competitor for the United States, but also as a winner. It was only 10 to 12 years ago that we thought we had lost the whole show to Japan in terms of R & D leading to high-tech and high-value products and the knowledge industry. Do you remember? And because we no longer had the large military and ideological enemy to fear, the public was asking science and the government, “What have you done for me lately? And by the way, you're too expensive.” At about the same time, the globalization of the economy arrived and our corporations no longer had the kind of edge that they had had historically. Exposure to low labor cost all around the world undercut prices here and made it very difficult for our leading corporations to maintain robust central R & D labs
that could do long-term research as well as product development. This was a big change, and the agenda for science changed again. Now, instead of providing for the national security and having faith in basic research to provide eventual long-term growth, there was less threat to the national security, and the need of corporations for commercializable products was immediate. Those powerful corporate research labs that used to do basic long-term research vanished.
The good news is that many of those corporations still do marvelous product development and they are holding their own in the world, but they do not do as much long-term research as they did pre-1989. In addition, the national laboratories that were created after World War II and during the Cold War were told to forget about weapons and to do something useful, to create products and do commercially relevant research. Thus the agenda for the national labs also changed. In fact, government's role in general was reduced. The private sector was supposed to take over. The private sector was featured and respected, and the government was less respected.
American corporations have done well post-1989, even though new developments, achieved at great cost, are imitated within months. As with any new high-value product, the knock-off and imitation industry takes over. Our corporations can stay ahead of the game only because of the research and development that they can draw from. This is true in electronics, computers, memory devices, pharmaceuticals, aerospace, and so forth. But once again, because of the globalized economy, the corporations can never again afford to maintain the central R & D labs that could focus on long-term basic research that was not product driven.
The diminution of long-term basic research in corporations opened the doors for universities to take on the role. The national labs were certainly capable of contributing, but they were not encouraged much to do so. At the same time universities were being given a different agenda, especially in science, and that was now not just to teach and train the next generation of people and to do research, but also to spur the local economy, the regional economy, and the national economy.
The 170 or so research universities around the country are under tremendous expectations to take on an additional role: to provide a real boost and a driving force for the economy. This is feasible because our dual role of teaching and exposing students to up-to-date technologies in our labs makes them ever more ready to go out into industry. And we can also do research on campuses without paying the wages of expensive researchers, as the corporate R & D labs had to do, because graduate students and postdoctoral fellows do some of the research.
Did the universities actually rise to this challenge? I'm not sure. I don't know of much quantitative evidence, but generally I think you'd have to say basic research in the United States is still very strong.
Another characteristic of the post-Cold War period, 1989 to about now, but pre-September 11, is that corporate/university partnerships were supposed to
flourish. We had laws enacted like the Bayh-Dole Act, whose purpose was to spawn the commercialization of university research, to permit university faculty members and researchers to patent things along with their universities even if they had been supported by the federal government. This is what Birch Bayh and Bob Dole provided for. This has been a success in terms of obtaining more patents and more commercialization.
At the same time, though, there was less support for government in general. And the expenditure of government funds on research was now being questioned, just as Roy Schwitters said it would be, and we have challenges like the Government Performance and Results Act, GPRA, which many of you have spent some time trying to understand. All federal agencies that support research now have to write a report every year and convince the Office of Management and Budget that they are being productive, that they are accountable, that there is no duplication, no waste, and no repetition, and that this is all helping the country in a productive way, preferably today.
This questioning came out of the era when there was less threat to national security, more emphasis on competing with Japan, and more emphasis on the private sector. People were asking, “How much research is enough?” People like Ralph Gomory (a mathematician now at the Sloan Foundation) tried to take that question seriously. Ralph and his colleagues through the Committee on Science, Engineering, and Public Policy (COSEPUP) proposed a reasonable goal for the United States: we must be excellent in some fields and while we cannot be excellent in all fields, we have to be good enough in every field to be able to recognize large breakthroughs. If we are really substandard in some field, then presumably we wouldn't be good enough to recognize big breakthroughs elsewhere. COSEPUP produced such assessments on mathematics and materials science. The point of this story is that there were real efforts and incentives to try to justify science on grounds other than the national security, usually competitiveness and commercialization.
Post-September 11, 2001
Since September 11, 2001, a lot has changed. Suddenly we are aware of new threats to our security. They're everywhere, and the topics of this workshop represent many of them. Water supplies, biological agents, exotic explosives, transport of airborne pathogens, security of chemical plants, use of nuclear waste for harm, civil structures, and so forth. Science and technology, however, are once again our advantage. And I'm very confident that the agenda of scientists and the agenda of the public and its attitudes toward science will change favorably if we do our work.
It is only through science and technology that we can leverage our otherwise small numbers of people. Science and technology must help us to anticipate, prevent, and/or mitigate destructive attacks.
How can we mobilize to be most effective? This workshop is a great start. I was very impressed when I heard Alice Gast give a Board on Chemical Sciences and Technology briefing back in October because, in fact, chemistry is the central science, and it will be in the nation's response.
What shifts are just beginning now and what is the expanded agenda for science and technology? The challenge to us as scientists is to capitalize on our available knowledge and to develop new knowledge and substances and instruments from it, and also to develop new knowledge where it's needed for the conventional military and for antiterrorism in general.
As we face this new and expanded agenda, we will all have to answer some questions and develop some new patterns. No one is sure exactly how this will happen, but one initial reaction after September 11 is to look to the government. That is what people are doing right now. Why? Well, who were the heroes on September 11 and since? They were the firefighters, policemen, policewomen, emergency workers, and the military, all public employees. So there is renewed confidence in the role of government. I don't know how long that's going to continue, however.
There was over-reliance on the private sector, from 1989 to September 11, 2001. One obvious example is the privatization of airport security operations that occurred in the previous 15 years.
While we don't know how this is going to play out, much of the necessary research and development that we're going to need in the coming years is unlikely to be generated by profit motives, at least initially. The initial development, inventorying, and coordination we have to do is unlikely to be taken over by a profit-making company. Perhaps the initial tendency to turn to the government will continue for a while. However, I believe that we need the strengths of the public and private sectors and that total reliance on one or the other is dangerous.
How do we inventory all relevant knowledge and coordinate the R & D? To start, the National Research Council (NRC) is really stepping in with the BCST at the front of it, and the newly reorganized Office of Science and Technology Policy is active, too.
This job is hard. Probably just today some of you are hearing about developments and interests from colleagues you didn't know you had who are doing things that you had no idea anybody was doing. That's the hard part of it.
But the good news is that the job is hard because our R & D enterprise is so vast. There is so much going on out there in university labs, national labs, and corporate labs, that just doing the inventory of all the potentially applicable science and technology that's out there is going to be difficult.
How do we coordinate this work to go forward from where we are when there are many disciplines involved and also many entities? That's going to be tough. Can we change the missions of the federal agencies rapidly enough and responsibly, for example, without giving up other essential parts of their missions? We need people in charge of the federal agencies and the state governments who
are very quick and responsive. With the very small incentives we give for people to serve in government, this is a real handicap. So I think it's going to require a lot of volunteer work from companies and universities, public agencies, and the NRC to make it happen.
Who will do the work? This is a very serious question. As the years pass, and this post-September 11 condition persists, which I think it will, we will need more holders of graduate degrees, for example. One of the many threats out there is that it might become harder for foreign students to enroll in U.S. graduate programs or even to maintain their enrollment, depending on political attitudes. And right now the students enrolled in these programs in the United States are mostly noncitizens, especially in engineering.
So my question is, who will do the work? How can we attract more U.S.-born students into graduate study in science and engineering if it becomes necessary, which it might? We really need ideas. Will it happen naturally? Some of us are in science and engineering because of our reaction to Sputnik and what our teachers told us in reaction to Sputnik. Reactions like that can occur again.
There is also a strong need for people with advanced capabilities in foreign languages and deep understanding of history, other cultures, and some social sciences. The National Defense Education Act seems to have been needed even at the depths of the Cold War, 1958 to 1970, which I think is roughly when it ended. We should reinvent or reinvigorate the NDEA, to include graduate study in languages, history, international affairs, and similarly valuable fields.
Another question that may arise is, will classified research grow again, and what will be our attitudes toward it, for example, on university campuses? Faculty members who grew up in the 1960s may not want to see classified research again in universities. And there are fundamental issues, too. The obligation and necessity for faculty members to conduct free inquiry with the ability to publish freely is a major principle that we hold.
What about the old question, how much research is enough? Will it come back? How will we answer this question? I want to say strongly that when we try to coordinate our R & D effort, we should not start with the view that we should eliminate all overlap and repetition in science funding. A colleague who served on the National Science Board in the mid-1970s came to believe that overlap and repetition is needed to assure that all the necessary research gets done.
So when we talk about federal agencies starting to coordinate with each other, let's all speak up and say that effective coordination does not mean the elimination of all repetition and overlap. To do so would guarantee that all the research that's necessary does not get done. The current reality, however, is that all federal agencies are still coping with GPRA. How can we remind everyone of the need for basic research even when its potential applications are not evident now? Other forces also continue at play that make the job even harder. Research and development must continue to provide economic competitiveness and profits and
satisfy the immediate needs for devices and substances to assure the safety and stability of society.
There are many, many examples of the fruits of basic research, and I think the NAS publication series “Beyond Discovery” is something that we have to get out into the hands of journalists and the general public so that they understand. We have to make these facts known much more widely, that the results of basic research from NSF, DOE, and NIH are very potent and important.
To conclude, I don't know how to answer these questions, but I do think that we can strongly assert that science and technology are our advantage against terrorism. More narrowly, I think the results in Afghanistan to date demonstrate the role of science and technology quite clearly, and I hope that people are catching on. We don't know yet how to use all of our resources here at home, but we do know that science and technology have a lot to offer, and chemistry in particular will once again be central.
We can also be proud of the entire scientific endeavor. The scientific enterprise in this country, indeed the world at large, represents some of the best attributes of human civilization. Physicist Robert Wilson, the founding director of Fermilab, spoke around 1969 about basic research at the lab: “This new knowledge has all to do with honor and country, but it has nothing to do directly with defending our country except to make it worth defending.”
A SKEPTICAL ANALYSIS OF CHEMICAL AND BIOLOGICAL WEAPONS DETECTION SCHEMES
Donald H. Stedman
University of Denver
One joy of being around for a long time is that you can become awfully skeptical. You see things that are supposed to be new, wonderful, and just about to happen, when suddenly a vast amount of money is given to someone else, you see the results, and you say, “I could have spent one-tenth of the money for ten times the result.”
There have been many quite useful discoveries in chemistry and chemical engineering over the years that have been applied to detection. The first example is plasma chromatography, otherwise known as ion mobility spectrometry. In the 1980s this technique became the method of choice for detecting chemical warfare agents, and it was used by soldiers in Desert Storm with unfortunate results. Official reports indicate that the rate of false alarms for these instruments was so high that soldiers became desensitized to real hazards. One infantry battalion eventually turned its alarms off. Much of Gulf War Syndrome may well have resulted from ion mobility spectrometry being oversold as a detection technique.
Fourier-transform infrared spectroscopy (FTIR) is a very good tool that has been used successfully both in laboratories and the field. With enough optical path length, chemicals can be detected at parts per billion levels in only a few minutes with good resolution. Long-path FTIR has been used successfully in Utah where chemical weapons are destroyed. Nevertheless, FTIR has been oversold as a long-range, look-ahead detection tool. At White Sands, the Defense Threat Reduction Agency dropped large bombs on bunkers containing chemical agent simulants to determine the extent to which the agents were destroyed or dispersed. FTIR was used to measure the dispersed agents. Amazingly, passive FTIR would only work if the sky was blue and the ground warm, providing temperature contrast, or if readily detected SF6 were added to the simulant.
Mass spectrometry (MS) is highly selective. The ability to perform further tandem mass spectrometry (MS/MS) analysis when a compound is detected, to confirm the detection, virtually eliminates false positive and false negative alarms. But MS/MS analysis must be completely automated for the average GI to be able to perform it. A clever hand-held chemical and biological mass spectrometer has been developed that weighs only 4.3 pounds. The problem with the unit is production of the necessary vacuum, which requires 35 amps at 24 volts, roughly 840 watts. Thus, battery-operated portable mass spectrometry is not yet available.
Chromatographic methods have included development of element-specific atomic emission, flame photometric, and flame chemiluminescent detectors. For example, a flame chemiluminescent phosphorus detector has been suggested for
chemical weapon detection and does work for simulants. However, since ordinary flame photometry is sufficiently sensitive, it makes little sense to measure chemiluminescence, which requires hydrogen for the flame, a vacuum pump on the tube, and an ozone generator among other equipment.
Mike Sailor of the University of California, San Diego, has recently developed an element-specific fluorine detector to be used as a portable nerve gas sensor. What makes his instrument so different is that it was presented to the scientific community (at an American Chemical Society meeting) before being put into the hands of soldiers. This provides an opportunity for peer review and for corrections to the technology, if needed, to ensure that the instrument is useful, that money is not wasted, and that lives are not endangered.
The National Institute of Justice has put together multivolume compendiums of instrumentation relevant to chemical and biological weapons detection. However, none of these books contains a critical review of the effectiveness of the technologies. One instrument included in the publication is a portable, handheld, ion mobility spectrometry chemical agent monitor with moderate to high selectivity, but only when used in open spaces, far from vapor sources such as smoke, cleaning compounds, and fumes. This would seem to make it useless on the battlefield. Another listed chemical agent monitor has a “below 5% false positive rate.” With one false positive in every 20 readings, no one could reasonably act upon an alarm.
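The base-rate arithmetic behind that last point can be sketched quickly; the prevalence figure below is a hypothetical illustrative value:

```python
# Why a 5% false positive rate makes alarms hard to act on: when real
# events are rare, almost every alarm is false. The prevalence used
# here is a hypothetical illustrative value.

def prob_real_given_alarm(prevalence, sensitivity, false_pos_rate):
    """Bayes' rule: P(real event | alarm)."""
    p_alarm = sensitivity * prevalence + false_pos_rate * (1 - prevalence)
    return sensitivity * prevalence / p_alarm

# Even a perfect detector (100% sensitivity) with a 5% false positive
# rate, facing one real release per 10,000 sampling events:
p = prob_real_given_alarm(prevalence=1e-4, sensitivity=1.0, false_pos_rate=0.05)
print(f"{p:.1%}")   # roughly 0.2% of alarms reflect a real event
```

In other words, at realistic event rates, 499 out of every 500 alarms from such a monitor would be false, so operators rationally learn to ignore it.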
There are many barriers to and challenges for instrumentation commercialization. It seems that most often, the parties involved are working at cross-purposes. The (often academic) inventor is never satisfied and always wants to continue to improve his or her invention. The instrument company only wants to build and sell the product. They would also like to have planned obsolescence to guarantee future sales. The salesperson has to sell what he has now, while convincing the customer that it will meet all his needs. The customer wants a product that is inexpensive and will last forever.
In the meantime, Congress wants the military to solve the chemical and biological detection dilemma, and military field commanders do, in fact, need the detection problem to be solved. Although the military can write technical specifications for detection equipment, the contract still goes to the lowest bidder, who may not actually be able to deliver the specified product. The soldier is left with what is given him, whether or not it is actually a working, useful tool.
Perhaps scientists need to educate the public that it is impossible to have a perfect instrument. If zero false negatives are required for a detector, then there must be some amount of false positive responses. This does not mean that science has failed: risk is inherent in everything.
Conversely, the risks taken by the soldiers that now have Gulf War Syndrome were unnecessary. Beta testing of new technologies should have been done and should still be done before those new technologies enter the field. Because of classification and other problems, there is no peer review, leaving no one to stop
disasters before they happen. While an individual scientist advising the military that the $30 million they just spent on an instrument was wasted might be shooting himself in the foot, a large organization like the National Academies or a large chemical company might more easily and truthfully review an expensive military product without repercussions.
The skepticism of the chemical community can truly provide a service to the nation.
OVERVIEW OF REAL-TIME SINGLE PARTICLE MASS SPECTROMETRY METHODS
Kimberly A. Prather
University of California, San Diego
Today I am going to speak about some of the continuous aerosol mass spectrometry methods that are currently in use. These methods are primarily used for atmospheric chemistry measurements related to human health such as pollution remediation, although now national security and homeland defense applications are starting to evolve. Most aerosol analysis methods are not real-time analyses; bridging the gap between on-line and off-line technologies is a challenge that is being addressed.
Typically, sample collection for aerosol analysis is accomplished by pulling particle-containing air through a filter. Collection times for obtaining enough sample to analyze are typically long. This fact precludes the observation of short-term variations, and also makes it possible for sample integrity to be lost during the long period between collection and analysis. How representative the collected sample is of actual atmospheric compounds is an additional question.
Mass spectrometry is an excellent analytical tool because it has extremely high sensitivity. With time-of-flight mass spectrometry (TOF-MS), if the ions are focused properly, it is theoretically possible to obtain up to 100 percent detection efficiency. A wide range of analytes can be detected, from very small organic or inorganic molecules to proteins and macromolecules, and mass spectrometry can offer both molecular weight and structural information on species in single particles very rapidly. TOF-MS can also be rather rugged and can be taken into the field.
My research focuses on aerosol particles between 0.1 µm and 10 µm. Bacteria, though most often seen with optical devices, fall within this size range where detection by mass spectrometry is ideal. Single particles down to 12 nm have been detected by mass spectrometry so that viruses, though considerably smaller than bacteria, can also be analyzed (see Figure D.5).
The instrument in my laboratory uses laser desorption ionization with a Nd:YAG laser and a TOF-MS. The particles are drawn into the instrument on a continuous basis and undergo a supersonic expansion when they pass through the inlet nozzle. During the expansion, the particles pick up different speeds that are a function of their size. They then pass through two scattering lasers. The time it takes the particle to travel between the two lasers can be correlated with particle size, allowing the particle size to be determined precisely. Knowing the particle speed and position, it is possible to time its arrival at the center of the spectrometer with a Nd:YAG laser pulse (266 nm). The pulse is able to desorb ionized species from the particle, which can then be analyzed by the spectrometer.
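The timing logic of that sizing-and-firing scheme can be sketched as follows; the dimensions and transit time are hypothetical illustrative values, not the actual instrument geometry:

```python
# Sketch of two-laser aerodynamic sizing and pulse timing: particle
# speed comes from the transit time between the two scattering lasers,
# and the desorption pulse is scheduled to meet the particle at the
# spectrometer center. All dimensions here are illustrative.

LASER_SPACING_M = 0.06   # distance between the two scattering lasers
CENTER_DIST_M = 0.10     # second laser to the spectrometer center

def particle_speed(transit_time_s):
    """Speed inferred from the two-laser transit time; a calibration
    curve then maps speed to aerodynamic particle size."""
    return LASER_SPACING_M / transit_time_s

def desorption_delay(transit_time_s):
    """Delay after the second laser at which to fire the Nd:YAG pulse."""
    return CENTER_DIST_M / particle_speed(transit_time_s)

t = 200e-6                          # a 200-microsecond transit time
print(particle_speed(t))            # ~300 m/s
print(desorption_delay(t) * 1e6)    # ~333 microseconds
```

The key design point is that one measured interval (the transit time) yields both the particle size and the firing schedule for the desorption laser.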
When ionized species are desorbed from the particle, care must be taken that not every species is fragmented; given a mixture of bacteria, fragmentation will cause all of the bacteria to look the same. However, gentle desorption and ionization at long (i.e., infrared) laser wavelengths, followed by detection and then further fragmentation and detection in a tandem MS system, can yield more detailed information in a useful form. There are also technologies such as scanning mobility particle sizers currently available to allow particle size selection without using optical or aerodynamic methods. These methods enable size selection down to 3 nm.
Another important technology in the national security and homeland defense arena is ion trap secondary ion mass spectrometry. Many chemical warfare agents are not volatile and tend to condense on particle surfaces. Research at the Idaho National Engineering and Environmental Laboratory has used this technology to analyze mustard agent on the surface of soil particles down to a surface coverage of 0.07 monolayers.
The development of matrix-assisted laser desorption ionization (MALDI) has advanced the entire field of mass spectrometry. To use this ionization method, the sample is mixed into a matrix that absorbs the laser wavelength extremely well (approximately 10,000:1 matrix:analyte) and the mixture is placed on a solid substrate. Absorption of the laser causes the matrix to explode, ejecting the intact, nonvolatile molecules of interest into the gas phase. Proton exchange or alkali metal attachment occurs in the gas plume and the ionized species can be detected.
The technique does require sample preparation, and reproducibility is often poor, although the results can be coupled with other analytical techniques for more reliable sample identification.
Ideally, scientists would like to be able to perform laser desorption and analysis directly, but typical laser wavelengths cause fragmentation of bacteria and other particles. Due to the low energy produced by infrared lasers, however, bacterial fingerprints can indeed be obtained as shown by researchers at Lawrence Livermore National Laboratory. It is also possible to detect much larger species, an impossible task with earlier technology. Infrared laser desorption techniques are undergoing constant improvement.
It is desirable to have hand-held, lightweight, and rugged instrumentation to take into the field. Whether the operator is a scientist or a soldier, analytical technology must become portable. For time-of-flight mass spectrometry, this is an especially difficult challenge, as it is the long flight tubes that enable the high resolution to be obtained. However, Robert Cotter of the Johns Hopkins School of Medicine has developed an off-line TOF-MS instrument with a flight tube that is only 3 inches long. The technique, coupled with MALDI for off-line analysis, is successful when used for particles of high mass, such as spores. Spores and other heavier particles travel more slowly, so the shorter flight tube is sufficient to obtain measurements for analysis.
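The reason a 3-inch tube suffices for heavy particles falls out of the ideal time-of-flight equation; here is a sketch in which the tube length and accelerating voltage are illustrative assumptions, not Cotter's actual design parameters:

```python
# Why a short flight tube works for heavy particles: ideal TOF drift
# time scales as sqrt(m/z), so massive ions move slowly and arrive at
# well-separated times even over a few inches. Tube length and
# accelerating voltage below are illustrative assumptions.
import math

E_CHARGE = 1.602e-19   # elementary charge, C
AMU = 1.661e-27        # atomic mass unit, kg

def flight_time(mass_amu, charge, accel_volts, tube_len_m):
    """Ideal TOF drift time: t = L * sqrt(m / (2 * z * e * V))."""
    m_kg = mass_amu * AMU
    speed = math.sqrt(2 * charge * E_CHARGE * accel_volts / m_kg)
    return tube_len_m / speed

TUBE = 0.076  # roughly a 3-inch flight tube
t_small = flight_time(100, 1, 20e3, TUBE)   # a small organic ion
t_big = flight_time(1e6, 1, 20e3, TUBE)     # a megadalton biomarker
print(t_big / t_small)   # 100x longer flight time: sqrt(1e6 / 100)
```

A species 10,000 times heavier drifts 100 times longer, so high-mass analytes like spore biomarkers remain resolvable even in a very short tube.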
In conclusion, there have been advances in the recent past in both on-line aerosol technology and off-line analytical techniques that show great promise for activities such as the analysis of spores. The next logical step is for chemists and chemical engineers to merge the two types of technologies.
NEW APPROACHES TO DECONTAMINATION AT DOE
Mark D. Tucker
Sandia National Laboratories
Today I will offer you information on the decontamination efforts in the Department of Energy Chemical and Biological National Security Program. In doing so, I will give a brief background of decontamination, touch on some key elements of decontamination, and present some of my own ideas on the future directions and critical needs for decontamination.
There are two main activities related to the decontamination of a site contaminated by a chemical or biological agent. A first response by firefighters, HAZMAT units, the National Guard, and other emergency personnel cannot be fully planned for and has the objective of rapidly achieving decontamination levels safe enough for treatment and evacuation of casualties. First responders are not attempting to obtain an agent-free site in every corner and crevice. Facility restoration is the second main decontamination activity and is accomplished by a variety of government agencies and possibly private industry. Facility restoration is a planned and controlled event; the amount of damage incurred by the decontaminating agent is minimized.
In the past, decontamination efforts were focused on neutralizing or killing chemical and biological agents, while little thought was given to reusing equipment at an attack site or to environmental effects. This led to the development of harsh neutralizing formulations such as super tropical bleach, a very toxic and corrosive mixture of sodium hydroxide and calcium hypochlorite. In the last five or six years, a new decontamination concept has emerged that tries to save and reuse equipment and restore facilities. This requires decontamination formulations that are more environmentally and people friendly, in other words, nontoxic and noncorrosive. Many of the historic decontamination formulations that are still in use today are being replaced.
The Department of Energy/National Nuclear Security Administration's Chemical and Biological National Security Program was initiated in fiscal year 1997 with the objective of developing technologies and methodologies to respond to domestic terrorist attacks that use chemical and biological agents. Of the four main thrust areas in this program, the first is detection and has the goal to develop technologies to detect both chemical and biological agents. The second thrust, biofoundations, is a basic science program that analyzes characteristics of microorganisms in order to develop better biological detectors. The third thrust uses modeling and simulation to examine the fate and transport of chemical and biological agents. Modeling and simulation can offer information on how far an agent will disperse and where restoration efforts should be focused, given, for example, an attack in a subway station. The final thrust is decontamination and remediation, which aims to develop effective and safe (nontoxic and noncorrosive)
formulations and technologies to rapidly restore civilian facilities after a terrorist attack. One simple but elusive goal is to create a single formulation to be used by first responders that destroys all chemical AND biological agents. A second objective is to address the complex issues associated with final decontamination and remediation of a facility including “how clean is clean” and convincing the public that a facility is safe.
Three types of projects are funded under decontamination and remediation: methodology (deployment and use), new nontoxic formulations and technology, and field verification. In the area of methodology, two projects, one at Oak Ridge National Laboratory and one at Lawrence Livermore National Laboratory, are focusing on how clean is safe. Several nontoxic formulations have been developed, including the Sandia Decon Foam, which is active against chemical and biological agents, and the L-Gel, which travels through air ducts. A plasma jet technology for decontamination of sensitive equipment has also been built. Mock offices at the Dugway Proving Ground were contaminated with anthrax surrogate, and four novel emerging decontamination technologies were tested there as part of the field verification and testing program. Data from this test was used for technology evaluations and selections for the decontamination and remediation of the Capitol Hill office buildings.
I have been working on the Sandia Decon Foam formulation (see Figure D.6). Generally, foam is quicker to deploy and react with chemical and biological agents than a water- or fog-based decontaminant, and it has very low logistic support and water demand. The Sandia Decon Foam has very low toxic and corrosive properties and provides a single decontamination solution for both chemical and biological agents.
Decontamination of the Capitol Hill office buildings was a learning experience for our nation. The Sandia Decon Foam was one of many decontaminants used in the cleanup efforts. It became clear that no single technology is universally effective and that a suite of decontamination technologies is needed. It is also necessary to develop better methods to decontaminate sensitive equipment and objects such as electronics, computers, copiers, and valuable paintings. Consequently, facilities and standard methods are required for product testing.
Decontamination programs exist in other federal agencies as well. For example, the U.S. Army Soldier and Biological Chemical Command in Edgewood, Maryland, is focusing on military needs. They are looking at solution, oxidation, and enzymatic chemistry for liquid-type decontamination formulations, especially for decontamination of sensitive equipment. Their developing technologies include supercritical carbon dioxide, solvent wash systems, thermal approaches, and the plasma jet, which is a cooperative project with the Department of Energy. They are also conducting large-scale tests with solid-phase chemistry focused on decontamination of bulk quantities of chemical and biological agents, in case, for example, an entire military base needed to be decontaminated. Most of these types of projects are funded by the Department of Defense.
In conclusion, I think future work should focus on several areas of decontamination. First, we need noncorrosive decontamination formulations and technologies for use on sensitive equipment and items as well as methods to decontaminate hard-to-reach places like air ducts. Fundamental issues like the number of anthrax spores necessary for infection and how clean is clean should be investigated. We also need more extensive lab and field data, and large-scale testing of decontamination formulations and methodologies, especially coordination and integration with sensors and modeling and simulation. Ideally, if we could tie all these things together, effort could be directed at the problem areas without wasting time on areas that remain clean. Personnel decontamination is also a main issue. With each anthrax incident, real or perceived, a personnel
decontamination effort goes into effect, putting people through portable showers and spraying them with bleach, for example. This may not be a realistic measure, as the Tokyo subway incident showed: almost everyone who sought care hopped in a cab and went to a hospital on their own, thus contaminating the hospitals. Sociological issues are important as well. First responders destroy evidence when they deploy foam; coordination with forensic and criminal investigators must occur. What evidence will be admissible in court? Finally, regulatory issues will also come into play in establishing standard test methods to determine the efficacy of, and give approval to, sensors and decontamination formulations for chemical and biological agents.
HOW INTEGRATION WILL MAKE MICROFLUIDICS USEFUL
Stephen R. Quake
California Institute of Technology
You've already heard about the basics of microfluidics and the lab-on-a-chip concept from Andrea Chow. So today, I'm going to talk about the gadgets based on microfluidics that my research group is making. I hope that this talk sparks your imagination and helps you to envision ways that microfluidics can help in the area of national security and homeland defense.
Over the last 10 years or so, scientists and engineers have been focusing on miniaturizing specific functions of lab equipment rather than miniaturizing the entire lab. This is because they had no way to integrate all of the pieces; the plumbing was lacking. More specifically, plumbing is controlled by valves, which are extremely difficult to miniaturize. To solve this problem, several years ago my research group developed a soft microvalve system using multilayer soft lithography.
If you have two layers of orthogonally arranged pipes and fill the bottom layer with fluid, a pneumatic pressure applied to the top pipe will cause the bottom pipe to pinch closed (see Figure D.7). This creates a valve. The valves are made of a soft, inexpensive polymer that allows the use of low actuation forces on the valves and yields small footprints. Pumping in this system is based on peristalsis, enabling the movement of multiple nanoliters per second through valves 100 µm wide and 10 µm deep. A nanopipette is built into the system, with different size plungers allowing different amounts of fluid to be introduced.
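The peristaltic action can be sketched as a controller cycling three in-line valves; the six-state pattern below is the standard three-valve peristalsis sequence, and the surrounding code is a hypothetical sketch, not the Quake group's software:

```python
# Peristaltic pumping with three in-line microvalves: cycling the
# open/closed pattern below pushes a bolus of fluid forward through the
# flow channel (1 = control line pressurized, valve closed; 0 = open).
# The controller scaffolding here is a hypothetical sketch.

PERISTALSIS_CYCLE = [
    (1, 0, 1),
    (1, 0, 0),
    (1, 1, 0),
    (0, 1, 0),
    (0, 1, 1),
    (0, 0, 1),
]

def pump(cycles, set_valves):
    """Drive the valves through `cycles` full peristalsis cycles;
    `set_valves` applies one (v1, v2, v3) pneumatic state."""
    for _ in range(cycles):
        for state in PERISTALSIS_CYCLE:
            set_valves(state)   # on hardware, also dwell a few ms here

log = []
pump(1, log.append)
print(len(log))   # 6 valve states per cycle
```

Each pass through the six states squeezes one small bolus forward, so flow rate is set simply by how fast the cycle is run.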
A number of devices have been made based on this microvalve system. One of the first devices built was a cell sorter on a chip. By manipulating nanoliters of fluid, different strains of fluorescent E. coli were introduced, sorted according to their fluorescence, recovered from the chip, and cultured.
My research group felt that high-throughput protein crystallization on a chip would be a useful challenge to meet. We first determined that it was possible to grow protein crystals on a chip, but only under the same conditions that allowed crystal growth in bulk experiments. The next step was screening for protein crystallization conditions, a difficult process: success seems to owe more to sheer numbers and probability than to rational design, and because only small amounts of protein are available, mixing and metering during the screening process are difficult to achieve. The microfluidic, microvalve system we have developed is able to overcome these obstacles.
The mixing ability of microfluidic systems was tested using a rotary pump, a circular channel with inputs and outputs that can be peristaltically pumped, opened, and closed. It was found that after only a few minutes of active mixing (due to pumping), a uniform mixture of particles is obtained that would have taken hours to achieve by diffusion. This is also useful for accelerating diffusion-
limited reactions; a 60-fold enhancement in the kinetics of some assays has been measured. The rotary pump has also been used as a 12 nL polymerase chain reaction system.
In the electronics industry, a Pentium computer chip has hundreds of millions of transistors but only about a hundred pins in and out; if each transistor had to be addressed individually, such a chip would be impossible. Microfluidic systems are similarly easy to control: n fluid lines can be controlled by only 2 log2 n control lines. Additionally, the pressure required to actuate a valve depends on the width of the control line, so by choosing the pressure carefully, a thinner fluid line can be closed while the wider fluid lines, seeing insufficient pressure, remain open. This idea has been used to develop microfluidic systems that can screen enzymatic libraries and perform in vitro transcription and translation of DNA to protein in approximately 30 minutes.
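The addressing scheme can be sketched in a few lines; this is a hypothetical illustration of binary multiplexed valve control, not actual device firmware:

```python
# Binary multiplexed valve addressing: give each flow channel a binary
# address and each address bit a complementary pair of control lines,
# so n channels need only 2 * log2(n) control lines. Hypothetical
# illustration, not device firmware.
import math

def control_lines_needed(n_channels):
    """Control lines required to address n flow channels."""
    return 2 * math.ceil(math.log2(n_channels))

def lines_to_pressurize(channel, n_channels):
    """For each address bit, pressurize the control line matching the
    COMPLEMENT of that bit; this closes every channel except `channel`.
    Returns (bit position, which line of the pair) tuples."""
    bits = math.ceil(math.log2(n_channels))
    return [(b, 1 - ((channel >> b) & 1)) for b in range(bits)]

print(control_lines_needed(1024))   # 20 control lines address 1024 channels
print(lines_to_pressurize(5, 8))    # channel 5 = 0b101
```

The logarithmic scaling is the point: a thousand-channel chip needs only twenty pneumatic connections to the outside world, just as a processor needs far fewer pins than transistors.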
I am sure that this technology is useful in many ways to chemists and chemical engineers and can help in the arena of national security and homeland defense.
BIOSYNTHETIC ENGINEERING OF POLYKETIDE NATURAL PRODUCTS
C. Richard Hutchinson
This morning I'd like to address basic medicinal chemistry, and specifically, how to manipulate drugs from natural sources. Biosynthetic engineering has recently been exploited in antibacterial drug discovery, but can also be applied to anticancer drugs, antiviral drugs, and others.
Kosan Biosciences was formed almost 6 years ago, founded on an interest in polyketides, microbial metabolite-based drugs. Polyketides have many diverse chemical structures including erythromycin, which will be mentioned again later. These chemicals include fused-ring aromatic compounds, compounds decorated with sugars, and compounds with large stretches of double bonds. Each of these compounds has different biological activities and utilities, but they are all made in nature by very similar biochemistry.
This biochemistry resembles how long-chain fatty acids are made. Acyl-coenzyme A substrates can be carboxylated, reduced, dehydrated, or otherwise changed from the original molecule. Each of these biochemical activities is accomplished one at a time, by single-function enzymes that are produced by an organism and collectively organized to make a particular kind of long-chain fatty acid. Polyketides use enzymes like this in much more complicated ways than required to make fatty acids. While biological organisms may use only a handful of activities, with modern genetic methods biochemists can combine genes for any or all of the activities to create a complex array of thousands of natural products.
Imagine a set of genes in the chromosome of a bacterium. The protein products of these genes have individual active sites (domains) and are grouped in modules. Each module can have a different number of active sites and hence a different function: joining bonds, selecting and loading substrates, and carrying the substrates, for example. Not every module is used for every synthesis; this characteristic imparts the synthetic flexibility and diversity that creates the eventual polyketides. Modules are strung together in very large proteins to take simple substrates the organism provides, assemble them in a sequential manner, and produce a polyketide. The enzyme then continues to produce polyketides and accumulate the product (see Figure D.8).
For Kosan and others in the pharmaceutical industry, the intent is to learn enough about these enzymes from a structure-function viewpoint so they can be manipulated. Because polyketides are built by starting from one point and continuing sequentially along the pathway dictated by protein structure, a biochemist can trace through a molecule and make structure-function predictions for the assembly enzymes with reasonable accuracy. Using recombinant DNA methods,
it is possible to remove a single module or a specific domain within a module, place it somewhere else in the protein, and make a product with a different structure and functionality.
This technique has been used in the attempt to create new antibacterial agents that target drug-resistant pathogens. For example, erythromycin is a very well known antibacterial drug that has been used for approximately 50 years for a variety of infections, primarily in the lung. A polyketide synthase produces erythromycin and has six modules, each with a certain number of active sites. By manipulating the active sites or modules of the polyketide synthase, the alkyl groups could be taken out or changed, the oxidation state of the hydroxyls or the ketone could be changed, or the length and size of the lactone ring could be changed. Although many variations of the synthesis were attempted, none developed a better antibacterial drug than the existing erythromycin. The experiment was valuable, though, because it established a paradigm for structure-function relationships.
Similarly, we could create a mutant in part of the polyfunctional protein and allow the synthetic substrate to be accepted by the functional parts and carried through to produce a completely different compound. The benefit of this type of manipulation is that synthetic procedures that are very difficult for chemists to do
can be accomplished easily. These processes have allowed scientists to test a large number of macrolide antibiotics and to develop lead compounds for a new type of drug called ketolides, which has activity against usually drug-resistant pathogens.
To summarize, polyketides are simply bacterial products that include erythromycins and chemically derived ketolides. New erythromycins and polyketides can be made by genetic engineering of the polyfunctional giant proteins called polyketide synthases or by taking products of the engineered microbial metabolism and further modifying them. These new antibiotics can be designed more insightfully because of increased knowledge of the structure-function relationships by which such antibiotics bind to bacterial ribosomes. This allows treatment of drug-resistant organisms that are of importance in community-acquired infections, pathogenic diseases, and especially unique bacterial pathogens that have previously been ignored because they had never been a threat. Although we cannot predict drug efficacy, drug production is more approachable using the technology of microbial metabolism modification and structure-function information for bacterial ribosomes.
CHALLENGES IN RAPID SCALE-UP OF SYNTHETIC PHARMACEUTICAL PROCESSES
Even though synthetic organic chemistry is a very old scientific pursuit, I believe that today most medications needed for homeland defense will be created by means of synthetic organic chemistry. This type of research and development is done by industrial organic chemists and chemical engineers, and there are many opportunities to involve academic scientists to increase the speed of development.
The pharmaceutical industry can improve by better partnering with academia to focus on solving the most pressing problems. In general, there is a lack of understanding of automation and parallel experimentation among chemistry graduates. Engineering students also need training in modeling and computation, specifically in the areas of predicting physical and transport properties, computational fluid dynamics, specific process operations, and control theory. Such training facilitates obtaining scale-up parameters from small-scale experiments, better known as “micro-piloting,” which is of utmost importance. Further development in the areas of in-line analytical and microchip technology would be useful for rapid drug development.
In a national emergency, the pharmaceutical industry could be called upon to bring a medication to market, to a commercial scale, very quickly. Normally, the complex scale-up process does not operate on such a short timeline. Assuming that we can partner with government agencies to ensure compliance in a streamlined manner, assuming we have better predictive toxicology, and assuming that the development of the dosage form (capsule, tablet) is not an issue, there remain a number of barriers to the scale-up process to create the active pharmaceutical ingredients.
Often in a medicinal laboratory, synthetic routes are linear instead of convergent and may contain many steps with low yields and operating conditions not favorable to scale-up (for example, strong exotherms, mass transfer limitations, and gums/heavy slurries). Reactions involving chiral molecules may have low selectivity for the medicinally active isomer. In addition, the reagents used may be toxic, and using large amounts of toxic agents for a scale-up is not an acceptable solution. Some laboratory materials are not readily available in large quantities.
Scale-up to 100-g quantities for animal testing, which must precede human studies, typically takes 3 to 8 months. Then, production must be scaled up to commercial quantities. Traditionally, the entire process takes 2 to 3 years—too long to be useful in a national emergency.
Much of the knowledge needed for manufacturing a pharmaceutical is related to the last step in the process, including understanding the bulk active ingredient, its impurity profile, chiral purity, and crystal form. To cut down the entire scale-up
process to only 6 months would require some of this knowledge to be investigated from the beginning, in parallel with other information. A 6-month timespan may still seem slow, but that is one-fourth the time normally taken by industry.
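The arithmetic of that compression is simple; here is a toy sketch with hypothetical task durations, not figures from the talk:

```python
# Toy comparison of a sequential vs. a parallelized development
# timeline. All durations (months) are hypothetical illustrative
# values, not figures from the text.

tasks = {
    "define synthetic pathway": 8,
    "develop pathway into a process": 7,
    "optimize process parameters": 6,
    "characterize bulk active (impurities, chirality, crystal form)": 6,
}

sequential_months = sum(tasks.values())  # each task waits for the last
parallel_months = max(tasks.values())    # fully overlapped work

print(sequential_months)   # 27 months, roughly the traditional timeline
print(parallel_months)     # 8 months, the floor if everything overlaps
```

Real programs fall between these bounds, since some steps genuinely depend on earlier results; the gain comes from moving end-of-process characterization work to the start.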
The scale-up process can be broken into smaller steps. First, the synthetic pathway must be defined. This includes determining how to make the final compound in a safe, environmentally sensitive, and affordable manner. This may require thousands of experiments and an equally large number of analytical tests. The second step involves developing the pathway into a process. Issues in this stage include ensuring the right chiral and chemical purities, which requires hundreds of experiments. The last stage is performing hundreds more experiments to learn about process parameters and optimization (see Figure D.9).
To minimize the time required for these three steps, chemists need to appreciate the process-related issues: not only how an elegant structure can be made, but also how it can be made efficiently. They should also know chemical engineering concepts like kinetics and continuous processing so that these ideas can be incorporated into the synthesis process. Chemists must be comfortable with automation—with the tools they need for the job. Too often, automated systems sit underutilized in chemical laboratories. Chemical engineers, for their part, often lack in-depth knowledge of organic chemistry and particular spectroscopic methods, which are essential in modern industry; it is impossible to function in the pharmaceutical industry without understanding what the chemists are saying. It would also be advantageous for engineers to have hands-on industrial experience with taking a product from the concept stage to the commercialization stage. The educational system can address many of these problems.
As mentioned earlier, parallel and automated methods are a key part of making the scale-up process faster. Automation technology already exists to provide good agitation, heating, reflux, inert headspace, sample filtration, and sample taking in a variety of vessel sizes. However, there remain challenges in the automated handling of solids and in the use of on-line spectroscopic instrumentation on very small samples, which would aid the quick determination of process optimization, parameters, kinetics, partition coefficients, and the like. It is essential that the automated technology be user friendly, so that chemists can focus on the chemistry and not on how to use the equipment.
Continuous systems will also become an important topic for rapid scale-up during a national emergency. First, continuous reactions allow the equipment to be considerably smaller than that needed for a batch reaction while providing equal or greater productivity. Continuous systems can much more easily handle highly reactive, unstable ingredients. They can also better handle hazardous reagents: because the reagents are made in situ and used immediately, they never accumulate.
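A back-of-envelope calculation shows why a small flow reactor can outproduce a much larger batch vessel. All the numbers below are hypothetical, chosen only to make the comparison concrete.

```python
def batch_volume_per_day(vessel_l, cycle_h, hours=24):
    """Volume processed per day by a batch vessel, given a fixed
    fill/react/empty cycle time in hours."""
    return vessel_l * (hours // cycle_h)

def continuous_volume_per_day(reactor_l, residence_h, hours=24):
    """Volume processed per day by a flow reactor: the volumetric
    flow rate is reactor volume divided by residence time."""
    return (reactor_l / residence_h) * hours

batch = batch_volume_per_day(1000, 8)       # 1000 L vessel, 8 h per batch
flow = continuous_volume_per_day(10, 0.05)  # 10 L reactor, 3 min residence
print(batch, flow)  # -> 3000 4800.0
```

Under these assumptions a 10 L flow reactor processes more material per day than a 1000 L batch vessel, and with only a few liters of hazardous material in the system at any instant.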
Use of continuous systems will require chemists and chemical engineers to work together more closely. It will also require better prior knowledge of the physical and transport properties of reagents; consequently, our predictive and modeling capabilities must improve.
Currently, after each step in the synthesis process, a sample is analyzed to approve or reject the operation. Not only is this inefficient for the industry in general, it is also a stumbling block for rapid scale-up. Controls and measurements of the process need to be made in situ, which is again an issue that requires chemists and chemical engineers to work together toward a solution. It may, in fact, involve the lab-on-a-chip technologies and microreactors presented by other speakers in this workshop.
To integrate everything into a continuous process, accelerated reaction kinetics are needed; a reaction that takes 20 hours to complete is difficult to run continuously. However, microwave chemistry, ultraviolet chemistry, laser stimulation, sonication, and other technologies can accelerate proven but slow reaction chemistries. Research in this area is only beginning.
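The leverage available from simply running hotter (as microwave superheating permits) can be estimated from the Arrhenius equation, k2/k1 = exp(-Ea/R · (1/T2 − 1/T1)). The activation energy below is an assumed, typical value, not data from any particular reaction.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def rate_ratio(ea_j_mol, t1_k, t2_k):
    """Arrhenius rate ratio k2/k1 for the same reaction at two
    temperatures, given an activation energy in J/mol."""
    return math.exp(-ea_j_mol / R * (1 / t2_k - 1 / t1_k))

# Assume Ea = 80 kJ/mol; raise the temperature from 25 degC to 60 degC.
speedup = rate_ratio(80_000, 298.15, 333.15)
time_h = 20 / speedup  # a 20-hour reaction shortened accordingly
print(round(speedup, 1), round(time_h, 2))  # roughly a 30-fold speedup
```

Even this modest 35-degree increase takes the hypothetical 20-hour reaction down to well under an hour, which is the regime where continuous processing becomes practical.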
Another area that is lacking is the chemistry of solids. In pharmaceuticals, the active drug is often made into a tablet or some other solid form. Unknowns remain in the areas of solids flow, compressibility, and compaction, and the physical incompatibility of various inert ingredients is still poorly understood. The engineering community has a great opportunity to help in this area.
In conclusion, research is needed to improve automated tools, the handling of solids, modeling and computational tools, and in-line analytical technology. To speed the scale-up of pharmaceuticals in a national emergency, these issues must be addressed by chemists and chemical engineers together.