Panel IV: Information Technology: New Opportunities-New Needs
Office of Congressman Sherwood Boehlert
In opening the discussion, Mr. Goldston said that having heard about some of the possibilities of the biotech revolution, it was time to discuss public and private initiatives to push the frontiers of information technology in ways that support the biotech industry. The panel would offer insights into current efforts to address some of the information technology challenges, especially when it comes to biotechnology, and how public policy can best provide a supportive environment to respond to these challenges.
BIOFUTURES FOR MULTIPLE MISSIONS
Defense Advanced Research Projects Agency
Dr. Alexander said that the Defense Advanced Research Projects Agency (DARPA) has a great deal of interest in biotechnology, computer science, information technology, and microsystems (i.e., microelectronics, photonics, optoelectronics, sensors and actuators, and the microsystems made of these things). DARPA’s interest is driven by its identification of these systems as important to the nation’s future security. This means moving a whole range of technologies forward, and the relatively recent recognition that biology can play a role offers
great potential for the Department of Defense. DARPA has been involved in the emergence of new fields (e.g., computing in the 1960s and materials in the 1970s), and the agency believes it can play a similar supporting role in new developments in biotechnology and microsystems. DARPA plans to move broadly into the area of biotechnology and microsystems, as opposed to focused technology initiatives that are often how DARPA pursues technology development.
The combined thrust of biology, chemistry, physics, and information technology is therefore very exciting for DARPA. But it requires true collaboration among disciplines on new problems, not simply the juxtaposition of physicists and biologists in the same lab. The objective is to find the places where the fields are strongest and the likelihood for meaningful impact on important problems is the greatest. The hope is to foster synergy that creates a whole greater than the sum of the parts. Moreover, there are already “interesting things brewing” at the intersection of biology and information technologies—a number of foundations, for example, are funding research in this area. DARPA wants to help the field attain a critical mass so that research programs can really take off. DARPA also understands that many young people are enthusiastic about the field, and by investing early, the agency can help develop a pipeline of talent in the field.
Current DARPA Initiatives
DARPA already has activities under way at the intersection of biotechnology and computing. The “electronic dog’s nose” initiative seeks to understand, at a high level of sophistication, how dogs sniff out explosives. The objective is to develop electronic devices that can perform this task as well as dogs.
DARPA is also working in the area of controlled biological and biomedical systems. This involves interfacing directly with living creatures, such as insects, and altering them at the larval stage. For a certain type of wasp, for instance, exposure to specific vapors in the larval stage will enable it to detect explosives when it develops into an adult. DARPA is also considering having insects carry electronic chips, so that their hunting patterns become search algorithms for DoD sensors.
DARPA is also funding R&D in tissue-based biosensors. The agency is exploring whether cells and tissues can be used to detect toxins in the environment. This can enable certain cells to determine whether something is dangerous and alert people to the danger before it is too severe. Other initiatives involve DNA computing, microfluidics (i.e., how to move organic materials around on a chip), and diagnostics for biological warfare agents.
These DARPA programs are focused technology initiatives, but the agency also believes that it has to look at some fundamental scientific questions as well. DARPA has labeled this area of inquiry “Biofutures.” Exploring how biology
interacts with microsystems and information technology is the cornerstone of the effort.
With respect to biology and microsystems, microsystems technology offers biology the ability to create interesting interfaces at the micron and nanometer level. For example, microsystems allow biologists to measure a number of things that have been difficult or impossible to measure until now. Single sensing systems will be able to measure optical properties, cell adhesion, temperature, and other cell properties. The challenge for biologists will be to integrate into silicon- and machine-based microsystems the biological materials suited to asking biological questions. For microsystems technologists, biology offers tremendous opportunities; for example, biology offers very reliable functions from fundamentally unreliable parts. A problem will be how to do engineering in biology. The field of bioengineering already exists, but taking it to the next level of abstraction will be no easy task.
In terms of biology’s interaction with information technology, many of the challenges have to do with the complexity of systems that integrate the two technologies. Information sciences allow biologists to scan databases with lots of information—genomic databases are good examples—but the complexity of the biological systems on which data is being collected is beyond the capabilities of information sciences. Dr. Alexander, a physicist by training, said she managed a biology group at DARPA several years ago and she was struck by the cultural gap between the two disciplines. Physicists use mathematical tools for research, but these tools are presently not capable of addressing the inherently complex systems in the biological world. At the same time, each field could benefit the other greatly. Information security is a huge concern today, and biological systems may offer new ways to address such problems. Immunology, for example, is about identifying external threats to a biological system and responding; these are exactly the characteristics we want in security for information systems.
Future DARPA Initiatives
In preparing to address fundamental questions in biology and information sciences, DARPA has talked with people in industry, universities, and the policy community to determine emerging needs. The result has been an expansion of efforts to develop underlying tools to enable fruitful collaboration between the two disciplines. Such efforts will be less focused than a typical DARPA initiative, because of the uncertainties involved in an upstream undertaking. DARPA plans to release a “research announcement” in the near future with the goal of beginning the first round of research grants by the Spring of 2000.
DARPA will also hold a large conference to further refine research ideas with the hope of developing a focused research agenda at the intersection of biology and information technology. That conference is scheduled for June
2000. DARPA plans to partner with the National Science Foundation and the National Institutes of Health in this process, and the agency will also reach out to the technology community in developing these workshops.
In conclusion, Dr. Alexander asked for the engagement of participants at the conference. DARPA has no labs of its own, and must therefore work with the scientific and technological communities to be effective. She challenged those in attendance to develop excellent research proposals as DARPA expands its program in biology and information technologies.
MEETING THE NEEDS, REALIZING THE OPPORTUNITIES
Dr. Horn posed the following scenario in opening his remarks: a patient walks into a doctor’s office, describes his symptoms, and then responds to several questions asked by the doctor. The doctor scrapes some DNA off the patient’s cheek, places it in an analyzer, and quickly gives the patient a precise diagnosis of his problem. Next, the doctor prescribes drugs that are tailored to the patient’s DNA to treat the illness. Dr. Horn said that this scenario was not science fiction, but rather likely to be upon us with surprising speed.
In a few years, Dr. Horn said, we will have completed sequencing the human genome. However, we are still a long way from taking a sequence of amino acids and turning it into the fundamental structure of the proteins that make up the human body. The movement from sequencing genes to having biologically useful information with which to treat humans is an information technology challenge. In the not-too-distant future, Dr. Horn predicted, we will be able to provide the kind of diagnosis and treatment options for patients that he described in his scenario.
Developments in information technology hold great promise for turning advances in biotechnology into practical and operational ways to improve patients’ lives. Using the analogy of disk drives in the computer industry, Dr. Horn pointed out that in 1954 IBM built the first disk drive, a sizeable piece of equipment that stored data at a cost of $10 million per gigabyte. Today, it costs $10 to store a gigabyte of information in a much smaller device. By 2020, one terabyte of storage will cost just a few cents. A terabyte is the amount of information the brain can store, but the processing speed of the brain is far faster than a single personal computer, or even a number of them linked together. In 10 years, high-end supercomputer processors will have the operational power of the human brain—10 petaflops. The amount of change in computational capabilities will be truly breathtaking over the next decade, Dr. Horn said. Changes in computing will have profound impacts on every facet of our lives.
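The storage figures Dr. Horn cites imply a striking compound rate of decline. The sketch below, which uses only the round numbers quoted above (illustrative figures from the talk, not independent data), computes the implied annual rate.

```python
# Implied rate of decline in disk-storage cost, using the round
# figures quoted in the text ($10 million/GB in 1954, $10/GB in 1999).
# These are illustrative numbers from the talk, not independent data.

cost_1954 = 10_000_000.0  # dollars per gigabyte, 1954
cost_1999 = 10.0          # dollars per gigabyte, 1999
years = 1999 - 1954       # 45 years

factor = cost_1954 / cost_1999    # total decline: a factor of 1,000,000
annual = factor ** (1.0 / years)  # implied annual decline multiple

print(f"total decline: {factor:,.0f}x over {years} years")
print(f"implied annual decline: about {annual:.2f}x per year")
```

A millionfold decline over 45 years works out to roughly a 36 percent price drop every year, sustained for four and a half decades.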
The Economic Impacts of Computers
Dr. Horn noted that the Chairman of the Federal Reserve, Alan Greenspan, testified before Congress in May that computers—by enhancing productivity—were contributing greatly to U.S. economic growth. As we enter a fundamentally information-based economy, digital information will become increasingly important to everyone. Forrester Research estimates that by 2003, $1.3 trillion worth of goods and services will be purchased over the Internet worldwide. That is 5 percent of the world economy; for the United States, electronic commerce will amount to 10 percent of the economy.
Companies will also realize tremendous efficiencies through the use of digital technology. Taking an example from IBM, Dr. Horn said that in 1999 his company purchased about $12 billion worth of goods and services over the Internet. By using the Internet, IBM estimates that it will save $260 million—a substantial contribution to IBM’s bottom line.
The Bioinformatics Challenge
The bioinformatics challenge is to make sense out of the vast amount of information available from the Human Genome Project. As of mid-1999, IBM had 1.4 million pages of information on the World Wide Web. Dr. Horn said that he probably really needed to read at least three of them, but was hard pressed to know which ones. In the world of bioinformatics, the challenge is to manage huge amounts of data; indeed, today the Human Genome Project involves a great deal of data, but a meager amount of information. Turning this data into usable information will be the paramount challenge for biology in the next century.
Information technology, said Dr. Horn, will be the language of biology in the twenty-first century, much like mathematics became the language of physics beginning in the late nineteenth century. Without information technology, it will be impossible to engage in meaningful biological research. To think that we will train future researchers for the bio-pharmaceutical industry through a few bioinformatics institutes, argued Dr. Horn, dramatically underestimates the importance of information technology to the field.
The Role of the Government
Given the growing importance of information technology to biology, the magnitude and nature of government support must be carefully thought out. Dr. Horn listed three elements that would be important in the future:
Support for basic research will continue to be important for biology and the federal government has traditionally provided the bulk of this support.
This research will be most valuable if it is multidisciplinary, that is, at the boundaries of information technology, biology, and physics.
New bridges must be built between basic research and industry. In an information-based economy, technical information of commercial relevance flows rapidly and often freely around the globe. Corporations with strong ties to universities are well positioned to absorb such information and thereby capture economic benefits. Indeed, this is how economic value is created and captured by a region or country. Government programs, such as the ones funded by the National Science Foundation, the Advanced Technology Program, and the Defense Advanced Research Projects Agency, can help speed up the flow and absorption of these ideas.
Government can assist in building the information infrastructure. Today, the government is the primary funder of computers with incredible computational power, such as the teraflop computer at Sandia National Laboratories. As we advance toward the petaflop computer, government can aid in the research and development of such new computing power. With a petaflop computer, scientists will be able to do fundamental protein folding to determine a protein’s structure and function nearly in real time.
Using what is called “deep computing” at IBM—computers with enormous memory and processing speeds—Dr. Horn predicted that it would be a matter of roughly two decades until computers can process the hundreds of thousands of proteins encoded by the human genome to determine their structure. For medical purposes, the human genome will then be fully understood. Thus over the next 10 to 100 years, the combination of information technology and biology will revolutionize society.
Commenting on the challenges of improving the quality and quantity of cross-disciplinary contacts between biologists and computer scientists, Dale Jorgenson asked whether trying to train computer scientists in biology would be a wise strategy. Dr. Horn responded that the conference had heard about possible difficulties in teaching biologists the quantitative skills necessary for computer science. Without passing judgment on whether those difficulties would prove insurmountable, Dr. Horn said he believed that biologists in the future will have to become skilled “information workers” able to handle databases and computer software. Even today, he said, a biology graduate student has to be adept at finding, manipulating, and synthesizing information. It is not so much a matter of training biologists in hard mathematics as of ensuring that biologists can use information technology in a reasonably sophisticated way.
Dr. Alexander observed that 15 to 20 years ago, materials science faced these same issues in its nascent stages. As the field developed, physicists, chemists, and engineers each played a role, but it was unclear which discipline would take the lead in training. Eventually, materials science emerged as a discipline, and it attracted a different brand of person from those drawn to physics or chemistry. If the intersection of computer science and biology pays off as anticipated, Dr. Alexander believed that a new hybrid field would attract a different brand of person than has traditionally been drawn to biology or computer science.
NEW INFORMATION TECHNOLOGY INITIATIVES
National Economic Council
Mr. Kalil said he would offer a case study of one of President Clinton’s science and technology initiatives, and do so in the context of the imbalance in support for biomedical research versus funding for research in the physical sciences and engineering. NIH’s budget increased by $2 billion in FY 1999 and should increase by approximately $2.3 billion in FY 2000. In contrast, one does not see Members of Congress “falling on their swords” in defending research funding for physics, chemistry, computer science, and engineering. The Clinton Administration, therefore, has been looking for ways to increase support for research in physical sciences, computer science, and engineering.
To do this, the administration proposed a 28 percent increase in information technology (IT) research for FY 2000. This was to support research in three areas:
Fundamental research for information technology to improve software reliability and network technology (i.e., developing networks able to support billions of information devices). Other areas include exploring the ability to visualize large quantities of data, real-time foreign language translation, and DNA computing.
Tera-scale computing for civilian science and engineering. The government spends significant sums for high-end computing for defense-oriented stockpile stewardship. But these machines are not available on the civilian side, and tera-scale computing will support a variety of modeling and simulation projects.
The ethical, legal, and social aspects of the information revolution. A program such as this exists for the Human Genome Project, but nothing analogous exists for information technology. Although this area had the smallest funding request, the topic is nonetheless extremely important.
The Rationale for Increased IT Spending
One reason for the Clinton administration’s requested increase in research funding for IT has its origins in the President’s Information Technology Advisory Committee (PITAC). That panel, headed by Irving Wladawsky-Berger of IBM, consisted of a number of computer scientists. While it may not come as a great surprise that a group of computer scientists recommended an increase in computer science research funding, PITAC’s distinguished membership made its findings well worth taking seriously.
A second reason is that the administration believes that historically the payoffs to research funding for information technology have been very high. This year marks the thirtieth birthday of ARPANET, and many innovations in IT—not just in networks—have come about because of DARPA, NSF, and other agencies. Computer time-sharing, reduced instruction set computing (RISC) technology, the first graphical Web browser, and many search engine companies (which came out of the government’s digital library initiative) have their origins in government funding. Looking across investments in these activities, it was easy to conclude that the returns had been very high.
The third reason for the administration’s interest is that leadership in IT is critical for economic growth. Over the last several years, information technology has accounted for over one-third of U.S. economic growth. Jobs in this sector pay roughly 80 percent more on average than other jobs. The United States has been the center of innovation in IT, and the administration wants to help maintain U.S. leadership.
The fourth reason for increasing investment in information technology is national security. U.S. national security relies on technological superiority; a cursory inspection of U.S. doctrine shows phrases such as “dominant battlefield space awareness,” indicating a strong reliance on IT.
The fifth reason is that the administration believes that advances in information technology will aid discovery in all disciplines—from chemistry to engineering. More and more disciplines employ modeling and simulation, which rely greatly on IT. Taking an idea of the National Academy of Engineering president, Dr. Wm. Wulf, Mr. Kalil noted that “collaboratories,” in which scientists, large databases, and remote scientific instruments are all linked by high-speed computers, depend on IT to be successful, and may provide a model for research with teams that are geographically separated.
Preparing a technically trained workforce was the sixth reason that the administration requested an increase in funding for information technology R&D. Industry has been vocal in expressing its concern that the lack of a trained workforce is a major constraint on economic growth. Because university professors usually hire a team of graduate students to conduct grant-funded research, increasing the supply of research dollars will have payoffs in terms of training in addition to research outcomes.
Finally, information technology will lead to a number of economic and societal transformations that are important. From electronic commerce, to telemedicine, to electronic government, IT is an important enabler for a variety of developments that can improve people’s lives.
The Clinton administration experienced “a bit of a mixed bag” when it came to having its IT funding requests fulfilled. On the one hand, the administration had bipartisan support for much of its proposal; House Science Committee Chairman Sensenbrenner introduced legislation authorizing the administration’s request for the agencies under his committee’s jurisdiction. There was also widespread industry support for the initiative. PITAC strongly supported it and TechNet, one of the leading high-technology political organizations, sent a letter signed by 37 CEOs from the technology industry advocating the administration’s initiative. The Association of American Universities and the Council of Scientific Society Presidents, in addition to several trade organizations, also endorsed the proposal. The scientific community had some concerns, however, about the size of the increase for IT relative to other disciplines.
In the end, President Clinton’s initiative received approximately $126 million of the $146 million requested for the NSF. Of the $100 million requested for the Defense Department, $60 million will be funded; $40 million for DARPA will not be appropriated. The Department of Energy received none of what the administration requested, primarily because the department receives substantial funding for high-end computing in modeling and simulation for stockpile stewardship. Congress chose not to fund civilian and defense high-end computing in the Energy Department.
In conclusion, Mr. Kalil said that the administration regarded the budgetary outcomes as a useful beginning, and he added that the administration hopes to do better next year. Information technology research is critically important for the United States, and the Clinton administration is committed to building on efforts to increase research funding in this area.
A member of the audience asked Mr. Kalil how the administration planned “to do better” in procuring more funding for its information technology initiative next year. Mr. Kalil responded that one likely approach would be to collapse all IT research funding into one package. This year, confusion arose over the base level of spending versus the increment in the new initiative; this did not help the administration’s cause. Most important, Mr. Kalil said, will be making the case individually to Members—early and often—about the importance of research funding for IT.
Mr. Kalil was asked if he could further explain why the Energy Department received no funding from this initiative. Mr. Kalil responded that the administration has to do a better job of explaining why high-speed computing is important for civilian purposes, not only stockpile stewardship. He noted that the Energy Department has a number of research projects under way—from climate change, to materials by design, to developing more efficient combustion engines—that require high-end computing capabilities.
Dr. Trimble provided an overview of the Global Positioning System (GPS), something he regarded as a very successful government-industry partnership. The government did not fund the commercial end of GPS, but it did fund important elements in basic research and infrastructure, which facilitated the commercial development of GPS. There were also important multidisciplinary aspects to GPS. It may not seem like it today, but 30 years ago hardware, software, and electronics were very separate areas of inquiry. Multidisciplinary research at universities brought those fields close together.
Most importantly, the government provided an early market for the GPS, as the U.S. Geological Survey required that government surveys be done using this technology. In his industry and others, observed Dr. Trimble, government’s role in providing an early market has tapped into the country’s entrepreneurial spirit.
These three components—basic research, university support, and providing an early market—have been important in other fields, such as the Internet and increasingly the Human Genome Project. In biotechnology and computing today, it is important to push universities in the direction of multidisciplinary research and then “let industry mine” this for commercial purposes. With respect to computing, Dr. Trimble emphasized that the research community remains “hungry” for high-speed computing, and the NSF should continue to support and expand development of high-speed computers for the university research community.
As a long-time observer of the technology industry from the university setting, Dr. Rosenbloom said he would focus on turbulence in the information technology industries in the United States. If this conference had been held 10
years ago, the discussion would have focused on a very different set of topics and actors than is the case today. IBM is the only company represented today that would have been on a similar agenda 10 years ago.
An extraordinary rate of change is the hallmark of the computer and IT industries. This gives rise to what Dr. Rosenbloom called the “paradox of Moore’s Law.” Moore’s Law, of course, predicts steady and remarkable improvement over time in the processing power and cost of computing devices. However, leaders in the computing and information industries consistently fail to accurately predict or prepare for the changes that are brought about by Moore’s Law. It is difficult for leaders in any industry to cope effectively with revolutionary technological change.
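The scale of change behind this paradox is easier to feel with a quick compounding sketch. The doubling period used below is the commonly quoted 18 months, an assumption for illustration rather than a figure from the panel.

```python
# Compounding sketch of Moore's Law over a decade.
# The 18-month doubling period is a commonly quoted figure,
# assumed here for illustration; it is not from the panel text.

doubling_months = 18
years = 10

doublings = years * 12 / doubling_months  # doublings per decade
growth = 2 ** doublings                   # cumulative improvement factor

print(f"{doublings:.1f} doublings -> roughly {growth:,.0f}x improvement in {years} years")
```

Roughly a hundredfold improvement per decade helps explain why the industry's leading firms a decade apart look so different: planning assumptions made at the start of such a period rarely survive to its end.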
Dr. Rosenbloom said that the quantitative improvements in computing power have been accompanied by qualitative changes in industries. Twenty-five years ago, the information industry could easily have been described; it consisted of IBM and “The Bunch” —Control Data, Burroughs, Univac, NCR, and Honeywell—as well as AT&T, semiconductor firms, and a few firms making minicomputers. “The Bunch” has been wiped out, IBM nearly was, companies such as Digital Equipment have been bought out, and AT&T exists only as a brand name. The ranking of semiconductor companies has changed dramatically, even though many of the same semiconductor firms survive in some form today.
Public policy must recognize that the information technology industries are extremely dynamic. As we develop policies for the future, they should be robust enough to withstand the tremendous turbulence that characterizes the information technology industry. Ken Flamm commented on this turbulence earlier, Dr. Rosenbloom said, and one lesson he took away from Dr. Flamm’s discussion is that government statistics often cannot keep pace with the change brought about by turbulence. Only recently have government statisticians come to recognize that a software firm such as Microsoft was conducting R&D and that government statistics must account for it. And companies such as Dell and Cisco were on no one’s radar screen in 1989, yet they are certainly important companies today. Dr. Rosenbloom argued for caution in relying on data to identify trends in the face of such turbulence.
If so much effort is going to be devoted to encouraging a fusion of biotechnology and computing, Dr. Rosenbloom said, we should expect changes in both industries’ structures comparable to what has occurred in the computing industry in the last decade. Dr. Rosenbloom said that he hoped new IT initiatives at DARPA and other agencies would be fully funded. If history is any guide, handsome payoffs will occur, and the companies exploiting them are likely to be ones whose names we have not yet heard. Dr. Rosenbloom urged policy makers in attendance to take into account turbulence in the IT arena when formulating policy. The pharmaceutical companies, for instance, have been remarkably stable even as their methods for drug discovery have changed. Dr. Rosenbloom conjectured that the future development of computational biology might have a
profound impact on the industrial structure of pharmaceuticals. Within a decade, the pharmaceutical sector may be as turbulent as the information technology sector is today.
A questioner asked whether any of the panelists were aware of important developments elsewhere in the world in biotechnology and computing, and their growing interdependency. Dr. Horn responded that prowess in information technologies is certainly not the sole province of the United States. He said that the United States is probably the best country at adopting new technologies, whatever their origin, and the United States spends the largest share of Gross National Product on IT. Led by Silicon Valley, the United States is best at capitalizing on new technologies, but Dr. Horn cautioned that the rest of the world is not far behind.
Mr. Kalil said that, in terms of foreign competition in IT, Europe is particularly strong in mobile and wireless telephony. The unique advantage of the United States lies in its strong university research base, the venture capital industry, and a culture that encourages entrepreneurship and tolerates risk. Here involvement in a failed start-up is seen almost as a “badge of honor,” Mr. Kalil said, whereas in other countries that sort of failure is a black mark on someone’s career. The turbulence that Dr. Rosenbloom spoke about is in fact an asset in the United States, argued Mr. Kalil, and perhaps not one that we appreciated in the late 1980s and early 1990s.
Dr. Alexander said that one of the strengths of the U.S. academic system is its flexibility relative to other countries. The U.S. system encourages junior faculty to start new lines of inquiry, and although things may not always move as quickly as some would like, the situation in the United States is better than elsewhere.
Dr. Jorgenson asked panelists about two policy areas that had not been discussed, standard setting and antitrust. Mr. Kalil responded that the entire issue of standards is “very tricky.” One can point to the European example of the GSM standard for wireless communications as a stunning success; by gathering European companies into a room to agree on a standard, wireless penetration rates have taken off. It is easy to ask why the U.S. government did not do the same thing. In the specific U.S. context, the wireless industry was split on the appropriate standard. Qualcomm said it should be CDMA and a number of other carriers said it should be TDMA. The Federal Communications Commission asked: On what basis could the agency make the decision? Why should the FCC believe its judgment is better than the market? The FCC chose not to choose in this case, because there was no policy principle to which the FCC could appeal to make a choice.
That said, Mr. Kalil continued, there are examples in which that approach failed. The Japanese government’s promotion of an analog high-definition tele-
vision standard stimulated a lot of investment, but it has gone nowhere. In the United States, one part of the government mandated that a standard called OSI be used for data interfaces. Another part of the government ignored OSI and adopted TCP/IP, which has become the standard for the Internet. Because of conflicting examples, it is difficult to conclude that a “top down” government-imposed approach to standards is the correct approach. When industry wants to work on a standard, government can be supportive in important ways, such as providing funding for test beds.
In the area of competition policy, Mr. Kalil said that promoting competition in telecommunications has been one area in which the government has made sound choices. Because of the Telecommunications Act of 1996, competitive local exchange carriers—so-called CLECs—are now capitalized at $30 billion. CLECs, competitors to the Baby Bells, are rolling out new services and challenging incumbents to respond in kind in terms of service price and quality.
Dr. Trimble acknowledged that the European standard for wireless communications appears to be a success, but added that competition in wireless telephony is far from finished. In general, Dr. Trimble thought it unwise for government to “pick winners and losers” in standards situations, because government is likely to favor the low-risk option and established companies. The entrepreneurial push in the United States has in fact fostered much innovation and wealth creation in this country.
Dr. Kathy Behrens observed that by the end of 1999, over $30 billion—perhaps close to $37 billion—in institutional venture capital will have been invested in the United States. More than 80 percent will go to information technology and Internet firms, while only $1.1 billion will be invested in biotechnology. This, she said, will certainly contribute to the continued turbulence that Dr. Rosenbloom discussed.
Asked whether the computing and semiconductor sectors would see continued rapid rates of innovation, Dr. Horn said that when it comes to lithography, he expected to see a slowing in the rate of advance in that field within 5 years, at least using traditional lithography technology. At the system level, however, there is room for substantial improvement in packaging technology, compilers, and additional software functions. This will more than compensate for the slowdown at the core silicon level. Thus, Dr. Horn said computing power would increase at a rate greater than exponential over the next 10 years.
Jim Turner of the House Science Committee suggested that the turbulence in the pharmaceutical industry that Dr. Rosenbloom predicted might not occur because the industry is subject to regulation by the Food and Drug Administration. Mr. Turner asked whether panelists had considered whether advances in computing power and bioinformatics might in fact lower the cost of regulation, and thereby create a climate in which more market entry into pharmaceuticals would be possible. If that came about, Mr. Turner suggested that perhaps venture capital would migrate more toward a less regulated biotech industry, and less so toward the unregulated Internet sector. The end result might be enormous payoffs to human welfare as new drugs and treatments come to market faster.
Dr. Horn discussed several technologies that might lessen regulatory burdens. Many pharmaceutical companies have very heterogeneous databases; it is costly to transfer information across them, and new database technology under development at IBM and elsewhere will address this. Emerging computing technologies will allow computational chemistry to estimate in vivo impacts of drugs. This will not obviate the need for clinical trials, but it will provide regulators excellent data on a drug’s impact in a more timely way.
Dr. Alexander commented that the cost of bringing a new drug to market is between $150 million and $500 million—an enormous barrier to entry. One reason we need such large and costly human clinical trials for new drugs is the incredible genetic variability of human beings. As this variability is better understood through genomics research, it will be feasible to conduct trials on a smaller scale and in a far less costly manner. These advances will not be on-line for another 15 to 20 years, but when they are, the process will be shorter and cheaper.