Panel V: Capturing New Opportunities
Petkevich & Partners, LLC
Mr. Borrus observed that today’s booming economy is due, in many ways, to the fruits of past innovations by the private sector and past government-industry partnerships. As Dr. Moore mentioned, many of today’s innovations in electronics can be traced back 50 years to the invention of the transistor at Bell Labs. The conference’s examination of the current state of research in biotechnology and computing identified innovations and partnerships between government, industry, and universities that stretch back over several decades.
In this panel, the objective is to look at issues in biotechnology and computing that are presently subjects of intense scrutiny. The economic impacts of these topics are perhaps a decade or so away, but it is important nonetheless to make sure that current research needs are attended to, and that the public policy environment encourages further development. Each of the panelists, Mr. Borrus said, is uniquely qualified to provide insights into the possibilities of today’s technologies, and what we need to do today to take full advantage of them in the future.
COMPUTING AND THE HUMAN GENOME
National Center for Biotechnology Information
Dr. Boguski said that until 5 or 10 years ago, biologists restricted their studies to in vivo work (observing the body or other living organisms) or in vitro work (the test tube). Increasingly, however, biologists are spending all of their time working in silico, that is, making discoveries entirely on computers using data from gene sequences.
In June 1999, a federal advisory committee urged the National Institutes of Health to train biologists in computing. The advisory panel, chaired by David Botstein of Stanford University and Larry Smarr of the University of Illinois at Urbana-Champaign, recommended that NIH establish up to 20 new centers, at a cost of $8 million per center per year, to teach computer-based biomedical research. This amounts to $160 million per year out of NIH’s budget for training computational biologists.
Citing a warning in Nature magazine that “It’s sink or swim as a tidal wave of data approaches,” Dr. Boguski said the need for computational biologists is urgent. The information landscape in genomics research is vast, and Dr. Boguski said he would describe that landscape and the staffing needs that must be fulfilled to negotiate it successfully.
One measure of the growth of scientific information is the growth of the MEDLINE database at the NIH’s National Library of Medicine. The database contains over 10 million peer-reviewed journal articles and is growing by 400,000 articles per year. If one looks at the subset of molecular biology and genetics articles, the growth rate is considerably faster than that for the biomedical literature as a whole. Another measure of the growth in biological information is the growth in the number of DNA sequences. Rapid DNA sequencing technology was invented in 1975, but until it was automated in 1985 and the Human Genome Project took off, the growth of DNA sequence data was modest. Since the early 1990s, however, the growth rate has been extremely steep.
Comparing the growth in articles about DNA sequences with the growth in the number of DNA sequences themselves, Dr. Boguski identified a serious gap for the biomedical research community. From 1975 through 1995, more papers were published about DNA sequences than there were sequences. Since 1995, roughly five years after the inception of the Human Genome Project, an enormous gap has opened up. There are now more genes than articles about those genes. The challenge for biology today is to find ways to bridge the gap between the number of genes being discovered and techniques to classify and understand their function.
One technology that seeks to bridge the gap is functional genomics. Functional genomics refers to
The development and application of global (genome-wide or system-wide) experimental approaches to assess gene function by making use of the information and reagents provided by the genome projects. It is characterized by high-throughput or large-scale experimental methodologies combined with statistical and computational analysis of the results. The fundamental strategy in a functional genomics approach is to expand the scope of biological investigation from studying single genes or proteins to studying all genes or proteins in a systematic fashion.
The Human Genome Project was initially planned to last 15 years and conclude by 2005. Technological and competitive forces have accelerated that timetable, and by May 2000 a 90-percent draft sequence of the human genome will be in the GenBank database. To date, 800 million bases of DNA have been produced and deposited in GenBank, out of 3 billion bases to be sequenced, so the project is between 25 and 30 percent complete. These data are available free of charge on the Internet; the taxpayers have paid for them, and Dr. Boguski’s job is to ensure that anyone can access them.
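The completion figure Dr. Boguski cites follows directly from the two numbers in the talk, as a quick arithmetic sketch shows:

```python
# Percent of the human genome sequenced, using the talk's figures:
# 800 million bases deposited in GenBank out of a 3-billion-base genome.

bases_in_genbank = 800_000_000
genome_size = 3_000_000_000

percent_complete = 100 * bases_in_genbank / genome_size
print(round(percent_complete, 1))  # ~26.7, i.e. between 25 and 30 percent
```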
Computational biology enables biologists to make new kinds of inferences about genes using computing power. In the past, biology has been largely an observational science. More recently, biologists have classified their observations and put them into databases. However, advances in computing capability allow scientists to take a gene sequence, infer the protein sequence using the genetic code, and then, using modeling techniques, infer the protein structure and function.
For the past 15 years, biologists have been engaged in the comparative analysis of genes. Approximately 40 years ago, scientists discovered that, at the molecular level, life is very similar. When biologists discover a new gene and want to know its function, they search the genome database for homologues: genes whose function is known and from which the function of the newly discovered gene can be extrapolated. The basic insight is that what is true for yeast, whose genome has been sequenced, is true for humans. In fact, the cell nuclei of yeast and humans are so similar that one can take a gene out of yeast, replace it with a human gene, and the yeast will thrive as if nothing had happened. Furthermore, biologists have advanced from comparing genes to comparing the entire genomes of different organisms. In its coding sequences, the human genome is 85 percent similar to those of rodents, that is, to mice and rats.
The theory behind this is molecular evolution. When biologists discover a human gene, compare it to databases of mouse or yeast genes, and find that the genes are so similar that the similarity can be no accident, this is evidence of evolution at work. The increasing processing power of modern computers is what makes these comparisons possible.
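The homology search described above can be caricatured in a few lines of code. Real tools (such as BLAST) use fast local-alignment heuristics over enormous databases; the sketch below only scores matching positions on equal-length sequences, and every sequence, gene name, and annotation in it is invented for illustration:

```python
# Toy homology search: given a newly discovered DNA sequence, find the
# known gene in a small mock database whose sequence is most similar,
# and extrapolate its function from that best match.

def percent_identity(a: str, b: str) -> float:
    """Fraction of positions at which two equal-length sequences agree."""
    assert len(a) == len(b)
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / len(a)

# Mock database of genes with known functions (hypothetical data).
known_genes = {
    "yeastA": ("ATGGCATTAGCATTAGGC", "DNA repair"),
    "yeastB": ("ATGCCCGGGTTTAAACCC", "cell-cycle control"),
}

query = "ATGGCATTAGCCTTAGGC"  # newly discovered sequence (hypothetical)

best = max(known_genes, key=lambda g: percent_identity(query, known_genes[g][0]))
score = percent_identity(query, known_genes[best][0])
print(best, known_genes[best][1], round(score, 2))  # yeastA DNA repair 0.94
```

The inferred function ("DNA repair") comes entirely from the best-matching known gene, which is exactly the extrapolation step Dr. Boguski describes.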
The Power of the Internet
The Internet has also been a crucial tool for discovery in genomics. One researcher was involved in an 18-year research project whose final answer was compressed into a five-minute search on the Internet. The researcher was searching for the gene that causes a rare disease called ataxia telangiectasia. When he finally discovered the sequence of the gene, he did not know its role in the body. After a five-minute Internet search of the databases of other organisms, he was able to determine that role by comparison, and hence the cause of the disease.
Gene Mapping Milestones
In 1996, the first large-scale mapping of the human genome, containing over 15,000 genes, was published in an article by 105 authors from 18 different research institutions in five countries on three continents. This collaborative effort signals a sea change in how biological research is conducted, and indeed the paper resembles the type of collaboration usually seen in particle physics. Scientists have not yet identified many of the functions of these mapped genes. This is why functional genomics technologies are so important. Even with many gene functions unidentified, the current map of the human genome has been instrumental in accelerating the cloning of hundreds of genes for diseases prevalent in humans.
The so-called “obesity gene,” leptin, is an example of the uses of genomic data. When this gene was first cloned, its sequence itself was uninformative; it was not related to anything in existing databases. Therefore, the next level of sophisticated analysis, protein threading, was needed to discover its function. Using protein threading, a type of modeling technique, it was discovered that leptin was similar to a hormone called Interleukin-2. It was not at all expected that the obesity gene would function like this hormone, so this work initially was greeted with skepticism. Eventually, the structure of the protein was determined, entirely in silico, and the notion that leptin functioned like Interleukin-2 was shown to be correct. Since this discovery, an entirely new pathway in obesity research has opened, one that would not otherwise have been undertaken.
An example of a functional genomics technology is DNA microarray analysis, which enables researchers to monitor the activity of approximately 20,000 genes on a single chip. For example, researchers can see how a gene’s behavior changes when a drug is introduced—or perhaps how a gene does not respond to the drug. This would tell a doctor whether to modify dosage or change the drug entirely in order to ensure that the therapy targets what is intended. Computer-enabled functional genomics technology can enable such an experiment to consider tens of thousands of genes. Eventually, researchers want to consider millions of genes, and this will require new visualization and simulation technology.
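A microarray readout of the kind described above might be screened for drug-responsive genes roughly as follows. This is a deliberately minimal sketch: the gene names, expression values, and two-fold cutoff are all hypothetical choices for illustration, not part of the talk, and real analyses use replicate measurements and statistical tests rather than a single threshold:

```python
# Flag genes whose measured activity changed by at least a chosen fold
# threshold between a baseline sample and a drug-treated sample.

FOLD_CUTOFF = 2.0  # illustrative threshold

baseline = {"geneA": 100.0, "geneB": 40.0, "geneC": 55.0}
treated  = {"geneA": 410.0, "geneB": 12.0, "geneC": 60.0}

responders = {}
for gene, before in baseline.items():
    after = treated[gene]
    fold = after / before
    if fold >= FOLD_CUTOFF:
        responders[gene] = "up"        # turned on by the drug
    elif fold <= 1.0 / FOLD_CUTOFF:
        responders[gene] = "down"      # turned off by the drug

print(responders)  # geneC changed too little to be flagged
```

Scaling this loop from three genes to the 20,000 on a chip, and eventually to millions, is where the visualization and simulation technology mentioned above becomes necessary.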
Another example of functional genomics experiments is in vitro simulation of wound healing. Biologists take the cells responsible for healing, place them in a culture, and treat them with a serum. Biologists then observe which genes are “turned on or off” in response to a wound. That process is then studied and simulated in a test tube using functional genomics technologies. Because the government developed the techniques, databases, and other elements needed to conduct such experiments, it is all freely available on the World Wide Web. This has stimulated additional paths of inquiry in the scientific community.
Dr. Boguski pointed out that NIH did not start the Human Genome Project; rather, it was the Department of Energy (DOE) that did so, as part of its charge to study the biological effects of ionizing radiation. The idea was to sequence the human genome and then observe the changes in gene sequences that the introduction of radiation induced. One of the pioneers of the project at DOE, Charles DeLisi, said in 1988 that “the anticipated advances in computer speed will be unable to keep up with the growing [DNA] sequence databases and the demand for homology searches of the data.” Luckily, said Dr. Boguski, he was wrong. Likening the growth of gene data to Moore’s Law,1 Dr. Boguski said the amount of gene data doubles about every 18 months. For the foreseeable future, there is sufficient computer power for the searches and structure predictions that he discussed.
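Dr. Boguski’s 18-month doubling figure implies roughly a hundredfold growth per decade, as a quick calculation shows (the 10-year horizon is an arbitrary choice for illustration):

```python
# Back-of-the-envelope growth model for the gene-data "Moore's Law":
# a doubling every 18 months compounds exponentially over time.

DOUBLING_MONTHS = 18

def growth_factor(months: float) -> float:
    """How much the data volume grows over the given number of months."""
    return 2 ** (months / DOUBLING_MONTHS)

# Over 10 years the data doubles 120/18 ~ 6.7 times: about a 100x increase.
print(round(growth_factor(120)))  # 102
```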
There are research areas in which supercomputing power will be needed. Dr. Ray Winslow at Johns Hopkins University’s Department of Biomedical Engineering is simulating heart failure by studying five different genes. Using partial differential equations, Dr. Winslow simulated irregular electrical activity in the heart and represented it in a three-dimensional simulation of the beating heart. For five genes, this took two CPU days on a 16-processor Silicon Graphics computer. If one were to study 50 genes or more, the computational requirements would be dramatically higher. Biologists understand the growing need for high-performance computers. Dr. Boguski said that grant applications increasingly request funds for new computers.
In closing, Dr. Boguski said that with the completion of the genome mapping by next May, the ethical, legal, and social implications (ELSI) of these breakthroughs will grow. It is impossible at this time to anticipate all of these so-called ELSI issues, but the need to face them will grow with our understanding of the human genome.
Questions From the Audience
Dr. William Long asked Dr. Boguski whether software development was keeping pace with advances in understanding of the human genome. Dr. Boguski said that, even though the human genome has 3 billion base pairs, 97 percent of that is so-called “junk DNA,” repetitive DNA that has no apparent biological function. This simplifies the computational problem in that the volume of data is not as vast as it seems. Dr. Boguski did emphasize, however, that improvements in software would be critical. New search algorithms will be necessary to search expanding databases filled with genomic information.
Dr. Boguski was asked how well biologists were able to model physically and chemically the role of gene expression in driving the “protein factories” within cells. Dr. Boguski responded that this has been accomplished for small
pathways in bacteria. Modeling how a fertilized egg develops over 9 months into a baby is quite a ways off, but Dr. Boguski said it would be possible in the future. Today, biologists receive lots of data from functional genomic tools. The only way biologists will be able to make sense of all this data is to develop quantitative dynamic models of systems and simulate evolving biological systems in a computer.
Dr. Boguski was asked what his “wish list” to Congress would consist of in order to advance his field. Dr. Boguski responded that one colleague in biomedical engineering, Ray Winslow, said that his field needed high-performance computers, databases, and scientific visualization tools. Other colleagues said they need larger computer equipment budgets and larger grants. Dr. Boguski said it was a bit bothersome that on many grant applications, the salary for a computer consultant exceeds that of the principal investigator.
Sandia National Laboratories
Dr. Romig said that the driver behind nanoscience is miniaturization, and there are two schools of thought on how to achieve it. One holds that scientists should simply take the components in use today and make them smaller and smaller. The other holds that scientists should build from the bottom up; that is, build systems from the atomic level, perhaps using biological systems or materials, to create very small devices. Dr. Romig said he thought that neither approach would become dominant but rather that the two would converge. At Sandia National Laboratories, the two approaches are mixed in creating new materials and devices.
As background, Dr. Romig described Sandia as a multi-program national laboratory whose principal mission is to help protect the national security. This includes ensuring the reliability of the nation’s nuclear weapons through the stockpile stewardship program, ensuring the efficiency and reliability of the nation’s energy supply, protecting the environment through remediation programs, and developing sensing and monitoring devices to detect environmental problems. It is program need in each of these areas that drives Sandia’s interest in developing nanotechnologies.
Dr. Romig said that the terms “nanoscience” and “nanotechnology” are often used interchangeably. In fact, “nanoscience” may be used in order to obtain money when the funding agency is interested in fundamental research, while “nanotechnology” may be used when a funding agency wants more applied research. To the extent that a distinction between the two terms exists, nanoscience often refers to nanocrystals or quantum dots, layered structures, and
hierarchical assemblies. Nanotechnology refers to technologies such as photonics or magnetics, biomedical or bioremediation, and integrated microsystems. These technologies are integrated into small system-level applications.
When talking about the growing links between the biological and physical sciences, Dr. Romig said the discussion often focuses on how the physical sciences will help biology. There is an important flip side, perhaps further in the future, in which biology serves as a catalyst to advances in the physical sciences. Within the next 10 to 20 years, Dr. Romig said he is quite confident that a number of materials and devices in the market will be manufactured using biological methods, even though the final product is inorganic.
An important example of nanoscience in action, said Dr. Romig, is in building layered structures. Breakthroughs in layered structures in the 1980s using nanotechnology have had broad-based applications in gallium arsenide semiconductors, which in turn have had major impacts on the wireless communications industry. Nanotechnology has also contributed to enhancing magnetic behavior in materials and improving surface hardness. Using nanocrystals, scientists can finely target light emissions and create super-capacitors. Other “nano” applications include nanocomposite membranes that enable separations, thin films that can be used in sensors, and materials that can capture certain pollutants and impurities.
With respect to energy efficiency, Dr. Romig described two examples in which nanotechnology will make significant contributions. Displaying a graph showing the magnetic response of samarium cobalt and iron multilayers, Dr. Romig said that the magnetic flux density changes as the microstructure of the material is changed. In practice this means that it is possible to create equivalent magnetic effects with a lower volume of material and with greater strength. The net result is greater efficiency in electrical applications, such as small motors.
At Sandia, work has been done involving implanting aluminum and oxygen in a nickel alloy that creates aluminum oxide precipitates at the surface. Alloys such as this do not usually have good “wear properties” on the surface, but the creation of nanometer-size aluminum oxides greatly strengthens the alloy. The result is greatly reduced friction and wear.
Turning to optical properties, Dr. Romig discussed vertical cavity surface emitting lasers (VCSELs). The typical laser—called a slab laser—consists of a piece of gallium arsenide that emits a single laser beam. In the case of a VCSEL, a quantum well can be built vertically into the surface of the gallium arsenide that enables lasers to be in very close proximity to one another. For example, a semiconductor chip of just a few millimeters square can be constructed that emits a number of laser beams. Among the applications is optical communication within electronic packages. This sort of optical communications enables higher-bandwidth communications and, in military systems, allows more reliable communications because the optical system is immune to electrical disruption.
Dr. Romig’s second example was the photonic lattice, which emanates from work begun in the early 1980s at Bell Laboratories. At Bell, the technology was used to diffract microwaves; using nanotechnology, the same approach can be used to diffract light. In one case, a diffraction grid has been built to bend light 90 degrees with 100 percent efficiency in a silicon environment. An important application is development of optical interconnects on silicon chips. Within 10 years, Dr. Romig said that it should be possible to use photonic lattices to produce laser light from silicon. This could have revolutionary impacts in optical computing.
Semiconductor Nanocrystals in Biological Systems
At Lawrence Berkeley Laboratory, nanocrystalline materials have been injected into living cells and, under the right conditions, the cells have emitted light. So far this has been done only as a demonstration that nanocrystals can be injected into cells without damaging them. Applications are off in the distant future, but it opens up the possibility of tagging cells and watching where they go.
With respect to Sandia’s work on microsystems, Dr. Romig said that with respect to microelectronics, miniaturization accompanied by expanding capability has been the hallmark of the industry for years. However, a microprocessor in principle is very much the same today as it was in the 1970s, only far more powerful. Over the course of the next decade, Sandia scientists believe that a revolution will occur that will enable more than just computation to be done on silicon. Silicon-based technologies will be able to sense physical things, such as acceleration and temperature, act on that data with micromechanical devices, and communicate to the outside world using optical signals. This is a “system on a chip” but a much broader notion of that concept than people discuss today (which usually refers to combining processing and memory or analog or digital functions on a chip).
To make this happen, nanotechnologies will be key, but a number of issues must be addressed to make nanotechnologies viable in practical applications. In microelectromechanical systems (MEMS), tiny gears may be used, but they must be able to withstand wear and tear. One must also be concerned about how these devices respond to humidity or other conditions in the environment. An important design issue to take into account is that devices at the micro-scale are affected by factors at the nano-scale. Gravity and inertia influence normal mechanical devices.
At the scale of a few microns, however, surface forces dominate inertial forces, so phenomena such as “stiction” become more important, and lubrication at a small scale therefore becomes an issue. Finally, a single dislocation can affect performance at the nano-scale, whereas in macrosystems it is usually a network of dislocations that affects performance. Dr. Romig pointed out that computational tools can now model the impact of a single dislocation on a particular material under development.
Another application of nanoscale science on technological development at Sandia is the tunneling transistor. The CMOS technology presently used in semiconductor manufacturing, as one tries to shrink it down smaller and smaller, runs into a physics wall having to do with quantum mechanical tunneling—the walls of materials are so thin that electrons pass through them. There are ways, Dr. Romig said, that a structure can be built to take advantage of the tunneling, so that a transistor works because of the quantum tunneling, rather than malfunctioning. This technology may allow Moore’s Law to be extended.
The Future: Nanotechnology in the Biosciences
Collaborative research between scientists at Sandia and Harvard University is developing ways to do rapid DNA sequencing. The technique uses an organic membrane, drills a hole through it using a virus, puts an electrolyte on either side, and then places DNA on one side. Scientists then apply a potential across the membrane, which pulls strands of DNA through the hole, and the “correlation factor” enables the DNA sequence to be determined. Sequencing one strand at a time through a single pore is not very efficient, but scaling up this process’s speed could have exciting applications. Dr. Romig said that if one could sequence 1,000 base pairs of DNA per second through a pore developed using this technique, it would take about a month and a half to sequence the entire human genome. If a structure could be built using 1,000 pores, the human genome could be sequenced in a day. This technique would not be used for the human genome, since that sequence is nearly complete, but it could be helpful in countering biological warfare: it could very quickly identify the agent in a biological attack, which in turn could be used to rapidly develop a vaccine.
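The throughput estimates in this passage follow from straightforward arithmetic on the talk’s round numbers (1,000 base pairs per second per pore, a 3-billion-base genome):

```python
# Time to sequence the human genome through nanopores, per the estimate
# in Dr. Romig's talk. Parallel pores divide the time proportionally.

GENOME_BASES = 3_000_000_000
BASES_PER_SECOND_PER_PORE = 1_000
SECONDS_PER_DAY = 86_400

def days_to_sequence(pores: int) -> float:
    seconds = GENOME_BASES / (BASES_PER_SECOND_PER_PORE * pores)
    return seconds / SECONDS_PER_DAY

print(round(days_to_sequence(1)))        # 35 days: about a month and a half
print(round(days_to_sequence(1000), 2))  # 0.03 days: well within a day
```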
Future Challenges for Nanotechnology
Fabrication on a large scale will be one challenge for nanotechnology in the coming years. Many of the nanotechnologies Dr. Romig discussed have been fabricated once in a laboratory. To have widespread impact in the defense or commercial sectors, it will be necessary to develop methods to manufacture these technologies at a significant scale. Specifically, the challenges have to do with synthesizing, manipulating, and assembling devices at the atomic scale. In some cases, nanotechnology amounts to building an entire chemistry lab at the size of a chip. Another challenge is computational, both in aiding in the design of systems and then in processing all the data that nanotechnologies can collect.
To make further development of nanotechnologies a reality, advanced diagnostics and modeling are crucial. Within the Energy Department’s complex of facilities, the national laboratories provide advanced photon and neutron sources, as well as electron microscopy centers, such as the one at Berkeley. Sandia has developed something called an “atom tracker” that enables measurements at very small scales with localized probes; the atom tracker actually allows a scientist to trace an atom on a particular surface. Finally, high-end computing helps scientists understand what nanotechnologies do. The passage of drugs through biomembranes, for example, calls for first-principles models, models that scientists would like to link so that they can understand how an entire system works. Doing this is very computationally intensive.
As scientists and engineers accelerate research on nanotechnologies, Dr. Romig said that multidisciplinary partnerships would be needed. The federal government will have a prominent role, as a funding entity and as an owner of national labs. Universities and the private sector will play important roles as well. What will be most crucial is synergy across universities, the federal government, and the private sector.
Questions From the Audience
Dr. Romig was asked whether there were any commercial products on the market using nanotechnologies. In response, Dr. Romig said that the thin-layered structures in cell phones and pagers use nanotechnology. As for micromachine technologies—micron scale as opposed to atomic scale for nanotechnologies—air bags in cars and a number of engine controls are examples. Dr. Romig was asked about the status of nanotechnology research initiatives in other countries. Dr. Romig responded that the Europeans and the Japanese are every bit as aggressive in nanotechnologies as the United States. Most scientists would consider the United States to be slightly ahead of its competitors in nanotechnologies; with respect to micromachines, neither the Europeans nor the Japanese have shown a great deal of interest in this area.
DEFENSE INTERESTS AND APPLICATIONS
Naval Research Laboratory
Dr. Coffey said that his focus would be on the military applications of nanotechnologies, which fall broadly into the following areas:
Fires and Targeting—identifying where enemy fire is coming from and how to eliminate it;
Command and Control—the systems needed to run military operations;
Survivability and Sustaining Forces; and
Chemical and Biological Defense.
Without focusing exclusively on biotechnology and computing, Dr. Coffey suggested that the following technologies over the next 10 or 15 years would have the most dramatic impacts on the military:
Nanoelectronics and Microelectromechanical Systems (MEMS);
Computers, Displays, Networks, and Advanced Algorithms;
Engineered Biomolecular and Chemical Systems;
High-Energy Density Batteries and Fuel Cells;
Advanced Lightweight Materials, Structures, Composites;
Micro-Air Vehicles and Systems; and
Multi-Spectral and Hyper-Spectral Sensors and Processors.
In thinking about the biotechnology and computing industries, Dr. Coffey pointed out that a variety of scientific communities, from the biosciences, to materials, to computers and information technologies, to electronics are increasingly interconnected. From the Department of Defense’s (DoD) point of view, the confluence of these technologies holds the most promise.
To illustrate this confluence and what it means to DoD, Dr. Coffey used micro-air vehicles and systems as an example. First, the vehicle is not a traditional airplane; indeed, it looks more like a biological system than an airplane. It is also small: only 15 centimeters across, with a weight of 100 grams and a payload of 15 grams. Its range is 10 kilometers at most, and its flight endurance is between 20 minutes and one hour. Payloads for such vehicles would include radar jammers, small visual cameras (now fully capable, at the one-ounce level, of taking and transmitting video), uncooled infrared cameras, and chemical/biological sensors. The key technologies for such vehicles are MEMS, ultra-low Reynolds number aerodynamics, wing-flapping aerodynamics, advanced materials, and artificial intelligence.
One prototype vehicle is the Micro Tactical Expendable (MITE), whose mission is radar jamming. Because disabling radar takes very little power, this small vehicle flies out to the radar tower, lands on it, and emits enough power to jam the radar. New materials make such a vehicle possible, as do advances in motor technology that enable motors small enough, and sufficiently powerful, for such a mission. The MITE is functioning today and can disable radar in military missions.
The next evolution of the MITE will allow a number of these devices to work in a swarm—to act in concert as they enter the battlefield and identify and label a number of tanks. The vehicles need enough intelligence to tag vehicles spectrally and to communicate among themselves. Using today’s technology, such devices would have a range of only one kilometer.
A more advanced version of this technology would involve teaching the devices to behave collectively, which would require advanced communications and sensor technology. Individually, the vehicles may be less sophisticated than those used today, but collectively they would comprise far more sophisticated systems than we presently have. Even with a communications range of a kilometer, a swarm of such devices could transmit a large amount of information. But they would have to be very intelligent, able, for instance, to regroup and complete the mission should a substantial number be destroyed by the enemy.
With respect to detecting biological or chemical warfare hazards, a sensor is placed on the small vehicle that then flies into the hazardous area and transmits back what it has found. In this example, biology, electronics, computers, and materials all come together to make the sensor work. A biologically based sensor on a small vehicle will detect the agent, become fluorescent in a specific way depending on what the agent is, and transmit the identified agent using a fiber optic link to the command center. This yields very rapid detection of a particular agent. MEMS will enable this technology to be scaled down even further.
The military has also developed biosensors based on neurons. Neurons have been taught to grow on electronic substrates, and their vitality is monitored. Essentially, this system is analogous to the canary in the mineshaft: when these sensors detect certain things, their electrical patterns change in accordance with what they have found.
Another application of biology in military systems is in “biofouling control.” In one example, Dr. Coffey said, the Navy has developed a material that allows a coating that might be used on amphibious vehicles to repel growths that would be harmful to the coating. The result is an environmentally sound way to lengthen the life of coatings used on ships and submarines.
In summary, Dr. Coffey said that as the battlefield demands on the military grow, it will increasingly turn to biological systems for solutions, but that these systems will rely on advances in computing and electronics technologies. Such technologies give military systems the lightweight characteristics that are increasingly important, as well as the communications capability that is also required.
A questioner asked Dr. Romig about the intellectual property regime with respect to nanotechnologies being developed at Sandia. Dr. Romig said that Sandia recognizes the value of intellectual property in terms of possible royalties. Sandia and other DOE labs are also carefully thinking through the right intellectual property arrangements in partnerships. If the intellectual property rights for products from DOE labs are not carefully specified, then partnerships with industry can be poisoned, even though such partnerships can be very mutually beneficial.
Charles Wessner asked panelists to discuss what their agencies need from policymakers to better fulfill their missions. Making an analogy with the Semiconductor Industry Association’s technology roadmap, which identifies technology “show stoppers,” Dr. Wessner asked panelists to point out any “show stoppers” in their fields. Dr. Coffey said that in terms of facilities, the DoD is in reasonably good shape. Computer technology must advance, he added, to enable the sorts of things he discussed with respect to micro-air vehicles. The Defense Department will go to any source, public or private, to find the necessary computing capability. In terms of DoD’s technical contributions, Dr. Coffey said that the Department could probably be most helpful in the interface between the physical sciences and biology.
Dr. Romig said that the DOE labs are resource-constrained at present. The Defense program budget, which is a funding source for the labs, is always under close scrutiny. The labs themselves have adopted a dual production and laboratory role, and production requirements have placed great strain on research budgets. While noting that most laboratories will always make a case for more money, Dr. Romig said that the best way, in his experience, to stimulate collaboration across disciplines is through research money. If the goal is to encourage collaboration between biology and the physical sciences, there must be some targeted funding toward this end, regardless of which agency funds it.
In closing, Mr. Borrus said that at the frontiers of science and technology, whether in the biosciences or nanotechnology, scientific inquiry is a highly social enterprise increasingly enabled by cross-disciplinary fertilization and applications that permit technology to flow in many directions. Cross-institutional collaborations—among government, industry, and universities—are necessary. For the broader project on Government-Industry Partnerships for the Development of New Technologies, it is important to understand which partnerships work, which do not, and why. This should be of help to the entire process of discovery and breakthrough, one that is always surprisingly fragile. From a policy perspective, this means support for fundamental science, and development of the enabling technologies, tools, methods, and partnerships that create products and processes that impact our economy and society. If past generations had been more timid in their support for computing and biotechnology, Mr. Borrus said, he doubted very much that the United States would be enjoying the economic boom it does today. We should be no more timid, he said, in moving ahead.