Panel I: Biotechnology Information Technologies: The Need for a Diversified Federal Research Portfolio
Mr. McFadden opened the panel by observing that biotechnology and computing are two of the most dynamic and penetrating areas of technology. The purpose of Panel I is to compare and contrast technology development in these two areas as it has been affected by government-industry interaction. In computing and semiconductor technology, the government was an early instigator, broadly sponsoring technology development and purchasing many of the early products of the computing and semiconductor industries. Over the past several decades, the role of government in the development of information technology has diminished relative to commercial activity. Although more narrowly focused, government-industry collaboration in information technology still accounts for a substantial portion of infrastructure development and, in a variety of ways, has contributed to significant technology developments.
The government’s contribution in the life sciences has been even more substantial, and has increased steadily in the last decade. The government’s financing and shaping of R&D in pharmaceuticals and biotechnology has had a central impact on these technologies, an impact that has been further accentuated by the government’s role in regulating emerging products in these sectors. An understanding of how government-industry collaboration has contributed to technology in both areas is essential to a rigorous assessment of government-industry partnerships. As part of this assessment, it is also necessary to examine whether the government is striking the right balance across these two sectors in its R&D
portfolio. This is an important issue for Congress, Mr. McFadden emphasized, and one that Rep. Boehlert raised in his remarks.
To kick off the discussion on the funding of R&D in the biotechnology and computing industries, Mr. McFadden said that the symposium was fortunate to have two men who have made landmark contributions in their respective industries. Both have been innovators and leaders of technology enterprises. Gordon Moore is co-founder of Intel Corporation and serves as its Chairman Emeritus, in addition to serving as Chairman of the Board of Trustees at the California Institute of Technology. His knowledge of semiconductor technology is so intimate that he has characterized the technology’s path in a few words, which have famously become known as Moore’s Law.1 Articulating simple but profound insights about technology has been the hallmark of Gordon Moore’s career.
Edward Penhoet is Dean of the School of Public Health at the University of California at Berkeley. He is also a co-founder of Chiron Corporation, a leading biotechnology company, and was, for many years, its chief executive officer. Dr. Penhoet combines insight into technology with achievement in technology development. He has a commitment to promoting multidisciplinary research in the biotechnology and computing industries, and, under his leadership, Berkeley’s Health Sciences Initiative has become a major effort to promote such research.
THE VIEW FROM THE SEMICONDUCTOR INDUSTRY
Dr. Moore said his remarks would focus on his perceptions of how the semiconductor industry—a key input to computers—has developed over his 40-year career watching the industry evolve and contributing to its growth. The birth of the computing industry was driven by the Army’s need for ballistics tables during World War II. Calculating the trajectory of a projectile was extremely time consuming; it took a skilled person approximately 20 hours to calculate the one-minute path of a projectile.
Origins of the Computer and the Transistor
At the University of Pennsylvania, the Moore School of Engineering took on the task of trying to reduce the time needed for this calculation. The differential analyzer, invented by Vannevar Bush, was available by World War II and could reduce that time to 15 minutes. At the Moore School, J. Presper Eckert and
John Mauchly built an electronic computer using vacuum tubes that could calculate the one-minute trajectory in 30 seconds. This was a major breakthrough in an important area for the military. As a result, the military financed the computer in its early days and helped spur the industry.
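The successive gains described here can be tallied with a short illustrative calculation (a sketch added for clarity, using only the figures quoted above):

```python
# Illustrative arithmetic only: speedup factors implied by the quoted
# times for computing a one-minute projectile trajectory.

HUMAN_S = 20 * 3600      # ~20 hours for a skilled human computer
ANALYZER_S = 15 * 60     # ~15 minutes on Bush's differential analyzer
ENIAC_S = 30             # ~30 seconds on Eckert and Mauchly's machine

print(f"analyzer vs. human: {HUMAN_S // ANALYZER_S}x")   # 80x
print(f"ENIAC vs. analyzer: {ANALYZER_S // ENIAC_S}x")   # 30x
print(f"ENIAC vs. human:    {HUMAN_S // ENIAC_S}x")      # 2400x
```

A three-orders-of-magnitude speedup on a problem of direct military importance makes clear why the Army was willing to finance the machine.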
The transistor had a different development path. It was invented at Bell Laboratories in late 1947 and early 1948 as the outcome of a search for solutions to problems in the telephone system. In particular, the transistor addressed the unreliability of amplifiers in applications such as undersea cables. A solid-state amplifier of the kind the transistor made possible was thought to offer important technological advantages. Bell Labs had a preeminent position in U.S. technology at this time; as a regulated monopoly, the Bell System was able to maintain and build the laboratories through the rate structure. Even with the advantages of rate regulation, approximately 48 percent of Bell Labs’ budget was government supported from 1949 to 1959. Even so, the transistor was developed using Bell Labs’ own funds.
One of the most important developments for the commercial semiconductor industry, continued Dr. Moore, was the antitrust suit filed against Western Electric in 1949 by the Department of Justice. It resulted in a consent decree in 1956 that required Western Electric to license all of its patents to any domestic company royalty-free. This included the early transistor patents. For future patents in semiconductors, Western Electric was required to license them to domestic firms at “reasonable royalty rates.” Under these conditions, Bell Labs essentially became a national industrial research facility.
This allowed the merchant semiconductor industry “to really get started” in the United States, said Dr. Moore. If one looks at the path of silicon semiconductors, there is a direct connection between the liberal licensing policies of Bell Labs and people such as Gordon Teal leaving Bell Labs to lead semiconductor development at Texas Instruments and William Shockley leaving to found, with the support of Beckman Instruments, Shockley Semiconductor in Palo Alto. This, noted Dr. Moore, started the growth of Silicon Valley.
The Role of the Government
Up to this point in the story, most of the financing of transistor and semiconductor development had come from private commercial interests. However, the military “really liked the transistor,” Dr. Moore added, and each of the branches provided some support for the early R&D. One result of the military’s interest was that semiconductor technology was driven to silicon, not germanium; silicon was functional in a temperature range that suited the military’s weaponry needs. The military services actually competed with one another in supporting various research efforts. The Army Signal Corps, for example, put out specifications for about a dozen different types of transistors, and thereby encouraged the industry to compete vigorously “to jump through technological
hoops to realize the specifications.” As it turns out, none of the specifications ever became important to the commercial industry.
The government R&D funds did, however, redirect the priority of universities and this created long-lasting benefits to the semiconductor industry. Semiconductor technology is about getting electrical signals to function properly using chemical properties. At the time, in most universities, chemists knew relatively little about electricity, and electrical engineers knew relatively little about chemistry. Government R&D resources funded a number of programs in universities that helped build the necessary bridges across disciplines.
As Michael Borrus points out in Competing for Control, very few of the innovations in the semiconductor industry came about because of funding from the Department of Defense. Semiconductor technology was advancing so rapidly that people in the commercial sector were in the best position to push the frontiers of the technology.2 Further, Mr. Borrus made the point that companies pursue research and product development where anticipated returns are highest, and that government support may or may not be helpful in advancing these goals.
From Dr. Moore’s perspective, flexibility and timing are critical elements to successful technological development. A government contract takes a long time to negotiate, and often the goal is obsolete by the time the contract is completed. A government contract commits resources with very little flexibility; such contracts tie up key personnel for a year or two without the ability to redirect them if ongoing developments suggest that other research paths are more promising. Moreover, government contracts impose a significant administrative burden. For these reasons, most players in the mainstream semiconductor business took very few government contracts during the 1950s, and when the industry did accept such contracts, it did so reluctantly.
However, the government made one very important contribution to the industry: it served as a market for high-priced semiconductor devices. The cost of the early silicon devices was so high that the consumer demand was insufficient to support volume production. But the military, with its unique requirements, was willing to pay significant premiums for devices. This provided the industry significant resources for R&D that eventually enabled low-cost devices suitable for the commercial market. The existence of the military market in the United States, Dr. Moore said, was far more important than military contracts. For example, at Fairchild Semiconductor, a company Dr. Moore co-founded, the first 100 transistors were sold at $150 apiece to a military program at IBM.
The Minuteman I and Minuteman II missile programs were also very important. Minuteman I had a large demand for silicon semiconductors and the missile required high device reliability; meeting those reliability specifications significantly enhanced the industry’s technological capabilities. Among other technology contributions, Minuteman I helped establish planar technology, a technique for manufacturing integrated circuits that the industry has been building on ever since. In Minuteman II, the military made a strong commitment to integrated circuit technology, and the demand for integrated circuits in large volume greatly contributed to a wide variety of improvements in integrated circuit technology.
The Growth of Commercial Markets
By the mid-1960s, the importance of government programs to the mainstream industry had declined. Commercial market forces were by then sufficient to drive technology development. The “insertion time” of new products into military systems was so long that by the time a device was ready for the military, it was obsolete. As a result, the military’s needs began to be met by a few specialized companies, or by the in-house capacity of the major defense contractors. This meant that the commercial semiconductor industry developed relatively independently.
Also during the 1960s, government support of R&D in the universities, funded mainly by DoD and NSF, increasingly migrated to “non-mainstream projects” such as non-silicon semiconductors. This research investigated interesting physical phenomena, but it had little bearing on the mainstream commercial industry, which was silicon-based. The semiconductor industry benefited from the students trained under the contracts, although companies often had to retrain them a bit to acclimate them to silicon technologies. The research itself, however, was of relatively little direct use to the industry.
As a result, the industry created the Semiconductor Research Corporation (SRC) in the early 1980s, a collective effort by semiconductor firms to address unmet R&D needs. The industry decided to pool the funds it was then putting into universities, augment them a bit, and support research relevant to the growing commercial semiconductor industry. The general rationale was to provide incentives for university professors to conduct market-relevant research. It was thought, Dr. Moore said, that given a choice between two equally interesting research projects, most professors would choose to pursue the one with practical application. By trying to capture “mind share” within the university environment, the industry hoped to channel more R&D effort to areas of use to the commercial semiconductor sector.
These efforts have been successful, Dr. Moore said. Today, through the SRC, industry funds $30 million per year in university R&D, and over time the SRC has helped ensure that a substantial portion of semiconductor research conducted in universities is relevant to industrial needs. In sum, Dr. Moore considered the SRC to be a very successful program.
The Challenges of the 1980s
The 1980s were easily the most difficult decade the U.S. semiconductor industry has faced, said Dr. Moore. By virtually every measure, Japanese firms were doing a better job of manufacturing semiconductors than U.S. firms. Using the same equipment, Japanese firms would maintain production 98 percent of the time, while U.S. firms would maintain production 85 percent of the time. U.S. firms would produce 20 wafers per hour, while Japanese firms would produce 40 wafers per hour on the same equipment. A disadvantage in manufacturing technology is extremely alarming in an industry such as semiconductors, in which volume and reliability are crucial determinants of competitiveness.
The U.S. semiconductor industry chose cooperation as a means to address common problems, and the industry sought government support to help address these common problems in manufacturing technology. An important catalyst to cooperation was the National Cooperative Research Act (NCRA), which was passed in 1984 and allowed firms to talk to one another about research problems. This was very important, Dr. Moore said, because prior to that, the potential antitrust ramifications were simply too great to allow firms to discuss common research challenges. As a result, semiconductor firms’ R&D efforts went in separate directions, with a tremendous amount of duplication.
The Role of SEMATECH
Aided by the NCRA, and as a result of studies by the Defense Science Board and the Semiconductor Industry Association (SIA), the industry established SEMATECH in 1987 to address competitiveness issues that the semiconductor industry faced. Initially, the industry planned for SEMATECH to work on process technology and build a fabrication facility for the industry. That turned out not to be the right approach, Dr. Moore recalled, because it was expensive, duplicated a variety of internal efforts in the industry, and required the sharing of proprietary information that individual firms were not willing to release to competitors. After several iterations of mission statements, SEMATECH settled upon a mission that involved working closely with semiconductor equipment manufacturers, a sector on which the semiconductor industry was very dependent and in which quality had been suffering.3
The initial government commitment to SEMATECH was $100 million per year for five years to be matched equally by industry; that commitment was renewed after the initial five-year period. After an additional three years, the industry decided in 1996 to forgo government funding. In total, the government contributed $850 million to SEMATECH from 1988 to 1996 with industry matching that figure. Since the termination of government funding, industry has continued to support SEMATECH at approximately $130 million per year. Obviously, Dr. Moore observed, the industry is quite satisfied with the SEMATECH program, otherwise it would not tax itself at that level to support it. The consortium, Dr. Moore said, “certainly has helped us [the U.S. semiconductor industry] regain our leadership in the semiconductor industry.” Now, when one looks at various measures of manufacturing capability, the United States semiconductor industry is equal to any in the world.
SEMATECH was also a good investment for the U.S. government. Currently, each quarter, Intel alone returns more in taxes to the U.S. government than the entire amount of federal funding that went to SEMATECH. Neither Intel’s current corporate health, nor that of other SEMATECH members, can be attributed to the consortium, but it is a fact that the U.S. semiconductor industry was losing significant market share at a rapid rate when SEMATECH was founded. SEMATECH was an important part of the industry’s turnaround.
The Industry’s Technology Roadmap
Shortly after SEMATECH’s founding, a presidential commission, the National Advisory Committee on Semiconductors (NACS), was established to make additional recommendations on how to address the competitiveness challenges facing the U.S. semiconductor industry. The chairman of NACS was Ian Ross, head of Bell Labs. Among the NACS recommendations was the development of a “technology roadmap” by the semiconductor industry. The SIA took up the task of developing the roadmap, which asked what the industry needed to do over the next 10 years in terms of research to maintain its historical rate of progress. The roadmap broke the various components of future challenges into pieces, assembled the necessary expertise in the industry, and laid out a research path.
The roadmap has been an excellent vehicle for coordinating research within the industry—laying the track, if you will, ahead of the locomotive to ensure steady progress in the semiconductor industry. The roadmap has been updated four times since 1988 and it is now essentially continuously updated. One reason that the roadmap has been useful to the semiconductor industry is that the technology path is fairly predictable; the industry seeks to make devices smaller and smaller, and ever more complex. The roadmap, Dr. Moore added, is fairly broad, cuts across a wide swath of the industry, and has aided in directing government R&D investment in semiconductor technology.
Future Research Paths
Even with the industry’s resurgence today, cooperation remains an important theme for the industry. Current research in extreme ultraviolet (EUV) lithography technology is an example. EUV lithography is the industry’s preferred technology for moving beyond optical lithography to make small structures. EUV technology originated in the Star Wars program of the 1980s, but it is now showing promise to be a commercially viable alternative to optical lithography in the foreseeable future. The EUV research program is moving at a rapid pace, but much work remains to be done in developing appropriate x-ray light sources, specialized lasers, and mirrors whose specifications exceed those of the Hubble Space Telescope. EUV lithography is a very complex system to develop, and it is a prime example of how future technology development in the semiconductor industry will draw on a wide range of science and technologies.
Directly targeted government programs, Dr. Moore said, did not have nearly the impact on the semiconductor industry’s development that government purchases of advanced devices did. Key advances, such as the original invention of the transistor, came about from broad government support of science and technology. Narrowing the scope of R&D support has not proven to be successful. As the EUV example suggests, the industry remains dependent on a wide variety of advances in physics, chemistry, materials, metrology, and other areas. Improvements in computing power underpin advances in most of these areas, and government can play a very positive role in pushing the frontiers of computing power.
For example, with respect to high-performance computing, Dr. Moore observed that, much to his surprise, the market is largely the government. The government needs high-performance computing for purposes as diverse as weather forecasting, weapons development, and the space program. There is a limited commercial market for very high-end computers. University researchers, for their part, would love to have access to high-performance computers. It is therefore important that government continue to play its role as a sophisticated and demanding buyer of high-performance computers.
The semiconductor industry, with its origins in the United States, remains the archetypal high-technology industry, and it has evolved into an important industry both domestically and worldwide. Much of the industry’s success can be traced to broad support for science and technology by the U.S. government, whether through R&D funds, support for semiconductor R&D in universities, or government purchases of high-end devices. As in the past, cooperation within industry and between industry and government will play a role in maintaining the industry’s rapid pace of technological development. The relationship between government
and industry in semiconductors may offer some lessons to the biotechnology industry, or may offer interesting contrasts. The remainder of the conference presents a great opportunity for the two industries to learn from one another, and discover possible new synergies.
THE VIEW FROM THE BIOTECHNOLOGY INDUSTRY
University of California at Berkeley and Chiron Corporation
Dr. Penhoet opened his remarks by saying that, unlike the computer and semiconductor industries, the biotechnology industry has been almost entirely dependent on the government for basic research support. In a sense, the history of the biotechnology industry to date has involved the commercialization of technologies that were funded almost entirely by the National Institutes of Health (NIH). Partnership, therefore, has been a major theme of the biotechnology story—partnership between government, industry, and universities. Much of the technology commercialized by industry was developed in the university setting using NIH grants.
That situation is changing dramatically today and that change, Dr. Penhoet said, is what he would concentrate his remarks upon. In particular, the technology base has been broadened in the biotechnology sector, and today a number of different technologies affect the biotechnology industry.
Origins of the Biotechnology Industry
The first 20 years of the biotechnology industry were driven substantially, if not exclusively, by advances in molecular biology. The field was launched in the 1970s by two inventions: recombinant DNA, which has fueled much of what we think of today as biotechnology, and the techniques that allowed the production of monoclonal antibodies. The first wave of biotechnology products can be traced fairly directly to the application of these two inventions to a wide variety of uses in health care.
As we move into the late 1990s and the next millennium, the dominance of these two technologies is being challenged, or perhaps aided, by a wide variety of new technologies that are at least as important as recombinant DNA and monoclonal antibodies. The breadth of the new technologies, which are important both for biotechnology and pharmaceuticals, points to the need for a diversified federal research portfolio. To convey the need for adjustments in the federal R&D portfolio, Dr. Penhoet said he would concentrate on the University of California at Berkeley’s Health Sciences Initiative.
Berkeley Health Sciences Initiative
The Berkeley Health Sciences Initiative combines a wide variety of scientific disciplines with an aim toward addressing health problems. The initiative is a working demonstration of the proposition that advances in the health sciences today require a multidisciplinary approach.
At Berkeley, the health sciences initiative involves the construction of a new building that will house scientists from a wide variety of disciplines, such as chemistry, biology, physics, engineering, and others. The objective is to use physical proximity as a catalyst for collaboration. The molecular engineering group, for example, will bring together molecular biologists, chemists, engineers, and physicists. Collaboration across these disciplines, Dr. Penhoet said, will address fundamental problems in the health sciences.
Collaboration in the Berkeley initiative is also designed to address practical problems in the health sciences field. Recalling Dr. Moore’s observation that academicians turned increasingly to practical problems in semiconductors in the 1970s (driven by the creation of the SRC), Dr. Penhoet said that faculty at Berkeley today have a growing interest in pursuing research with practical applications. One reason for the focus on practical applications is that the links between university laboratories and the private sector have been well developed over the past 20 years.
With respect to specific technologies, magnetic resonance imaging (MRI) is among the most important tools in the biosciences, as it allows scientists to see structures within the human body. The current technologies are both expensive and invasive. At Berkeley, a group of scientists is working together to develop new imaging technology that may enable an MRI without the magnet, which makes the technology expensive and the experience of having an MRI rather uncomfortable.
Robotic surgery and microelectromechanical systems (MEMS) are technologies that Berkeley’s engineers are pursuing aggressively in cooperation with biologists. Many drugs produced using biotechnology require injection at very specific time intervals, placing the burden on patients to adhere to schedules. Technologies such as MEMS can measure metabolism on a real time basis, and then regulate the release of appropriate doses of drugs at appropriate times. The development of sensing and delivery devices demonstrates the benefits of collaboration between engineers and the pharmaceutical industry.
The latest generation of drugs draws on the field of structural biology, for which molecular biology now supplies the raw material: pure proteins for structural analysis. Structural analysis, in turn, has been aided by tools such as powerful x-ray sources that enable the crystal structure of a protein to be determined very quickly. Improved computational power now allows diffraction patterns generated by the x-ray beam to be reduced to structures in a very short period of time.
The first protein to be analyzed by x-ray crystallography was myoglobin, with hemoglobin following shortly thereafter. Those projects each took 20 years to complete. Today, due to improved computing power, we are able to determine a protein’s structure within days of having the crystal. All of this greatly quickens the pace of drug development and indeed the knowledge base in the field of structural biology is growing at a breathtaking rate. In these examples, tremendous strides in biology and health are being driven by contributions from physics (x-ray light sources) and computer science. The fields of multimedia and informatics are also driving innovation in biotechnology, simply because they improve the quality and quantity of information exchanges among researchers.
Berkeley has also built a second facility to house a number of different disciplines as a way to spur collaboration. The second facility focuses on neuroscience, genomics, molecular biology, public health, and cancer research. Cancer research now encompasses many fields. Whereas cancer treatment once was primarily chemotherapy, it now draws on advances in immunology and diagnostic techniques.
Neuroscience is making rapid strides in understanding how the brain functions and develops, and these strides are aided by many disciplines. Geneticists, psychologists, vision specialists, computer scientists, and other disciplines are working together to understand the complex system that the brain is. At Berkeley, the neuroscience group defines its mission as understanding the brain from molecular level to behavior.
The Demand for Computing Power
Within the biotechnology world, progress in genomics depends heavily on advances in computing power. The human genome has approximately 3 billion nucleotides and an estimated 100,000 genes. Simply collecting and sorting the information in the human genome is an enormous computational undertaking. As we begin to understand what genes mean to any living organism, the next challenge is to determine how genes are regulated and how they interact with each other to affect the metabolism of cells or the functioning of the brain. Diagnostic tools are beginning to emerge that depend on understanding the human genome. To make these tools operational is, however, a massive computational undertaking.
The most widely used genetic analysis technique today is the so-called gene chip, and this perhaps best illustrates the convergence of biology and computing. Gene chips are made by spotting individual genes on small chips, and then “interrogating those chips with laser beams.” With the readout from the chip, it is possible to understand the expression of every gene on the chip. The technology was invented by Layton Reid, and his idea originated after he attended a semiconductor lithography workshop. It occurred to Reid that one could use lithography techniques to make specific gene sequences directly on silicon chips.
Today, as many as 20,000 different gene tags are on a single chip. Gene chip technology is, therefore, a hybrid of biology, chemistry, and semiconductor manufacturing processes. Gene chips also require huge computational resources: analyzing the data means comparing the signals from as many as 20,000 gene tags against one another, on the order of 20,000 × 20,000 combinations.
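The computational load alluded to here can be made concrete with a back-of-the-envelope sketch (added for illustration; the 20,000-tag figure is the one cited above). Examining each gene tag’s signal against every other tag’s scales quadratically with the number of tags:

```python
# Back-of-the-envelope: pairwise comparisons among n gene tags grow as
# C(n, 2) = n * (n - 1) / 2, i.e., quadratically in n.

def pairwise_comparisons(n_tags: int) -> int:
    """Number of unordered pairs among n_tags gene tags."""
    return n_tags * (n_tags - 1) // 2

n = 20_000  # tags per chip, per the figure cited above
print(f"{n:,} tags -> {pairwise_comparisons(n):,} pairwise comparisons")
```

Roughly 200 million comparisons per chip, before any higher-order interactions are considered, which is why gene-chip analysis became a driver of demand for computing power.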
In the future, Dr. Penhoet said, progress in the health sciences will require that researchers address challenges that cut across standard disciplinary boundaries, as opposed to the more traditional approach of advancing narrow academic disciplines. That said, each discipline, if it is to contribute to progress in the health sciences, will have to develop in parallel in order to have maximum impact. Agreeing with the remarks of Rep. Boehlert, Dr. Penhoet said there is no rational way to know a priori what the optimum balance in research spending is at any given point in time. It is difficult to predict the importance of any given field 10 years into the future. Nonetheless, it is clear that convergence among the fields of physics, chemistry, biology, and computing will continue. Broad support of science is, therefore, very important to sustaining progress.
While acknowledging the importance of NIH funding to his field and industry, Dr. Penhoet expressed his belief that future advances in biotechnology will require a multidisciplinary research approach. A wide range of fields, from mathematics and statistics, to biology, physics, and chemistry work in concert today to develop new solutions to health problems. Cooperation across disciplines, concluded Dr. Penhoet, as well as cooperation between government, industry, and universities, hold the key to biotechnology’s future.
A questioner asked Dr. Moore why university R&D on semiconductors was disconnected from industry needs in the early years and whether the research that universities pursued during this period resulted in any long-term impacts on either the semiconductor or other industries.
Dr. Moore responded that early university research on semiconductors created a number of important developments. Gallium-arsenide semiconductors, for instance, were an outgrowth of some of this research, and such semiconductors are used widely today in cell phones and other applications. Other specialized devices were developed as well. It is not that the university research of this era was not important, Dr. Moore said, but it was not directly relevant to an industry that today has almost $150 billion in revenue worldwide and that is almost exclusively silicon based.
A questioner asked Dr. Penhoet whether there are any risks associated with encouraging interdisciplinary training at the graduate level. Dr. Penhoet responded by acknowledging the risk of training a generation of generalists if the trend toward multidisciplinary research went too far. Collaboration across disciplines tends to work best when the individuals involved are highly skilled in their particular fields. At Berkeley, the co-housing is designed not to train generalists, but to make students aware of the opportunities to apply work in their disciplines to other fields.
Dr. William Spencer of SEMATECH asked Dr. Penhoet about the relative roles of government and private funding in developing the computational, software, and mathematical tools for Berkeley’s multidisciplinary initiatives. Dr. Penhoet responded that there is a hybrid of government and private funding for these tools. The Computational Biology program at Berkeley, which draws on biostatistics, statistics, molecular biology, and mathematics, is perhaps the most active in pursuing a variety of funding sources. There is a dearth of talent in this field, which presents a challenge in training undergraduates in computational biology. NIH has provided some financial support to encourage more training in the field, and the bulk of funding in computational biology comes from public sources.
A questioner asked Dr. Moore why the semiconductor industry grew in the United States and not in Europe, wondering specifically whether demand from the government for high-end devices played a role, or whether a greater entrepreneurial spirit in the United States was the reason. Dr. Moore noted that the United States was better prepared to exploit a new technology such as semiconductors because it has a culture that encourages the formation of small firms. “New technologies are not easily exploited by large existing companies,” Dr. Moore said. Large established firms initially pursued semiconductor technology, but they eventually dropped out of the industry; new entrants were the important players. The fact that the semiconductor was invented in the United States, Dr. Moore added, also gave U.S. companies a huge advantage in pursuing business opportunities. The availability of the government market—willing to buy large quantities of devices at high prices—also propelled the industry in the United States.
Dr. William Long of Business Performance Associates said that EUV technology received support from the Advanced Technology Program, specifically in the development of the mirrors necessary to successfully deploy EUV technology. Dr. Long asked Dr. Moore where he thought EUV technology would be without government support for the lasers, mirrors, or other supporting technologies.
Dr. Moore said that it is unlikely that EUV technology would have developed—or at least developed at the pace that it has—without initial support from the Star Wars program. Only because some of the basic research was conducted during the Star Wars program has EUV today become a viable lithography alternative for the industry. Other approaches, such as electron beam, short-wavelength, and shadow x-ray techniques, might have garnered most of the industry’s R&D support without the Star Wars support for EUV. The existence in the national laboratories of basic research on EUV has greatly enhanced its attractiveness to industry.
A questioner observed that the amount of data being generated by the Human Genome Project is growing extremely rapidly. He asked Dr. Moore whether computer storage would keep pace with the growth of genome data.
Dr. Moore observed that the growth of storage capacity has maintained an exponential rate for approximately 40 years, a truly remarkable fact. Dr. Moore said he expected this growth to continue for several more years, as there remains room for more advances using traditional techniques. Dr. Penhoet commented that biology students at Berkeley spend about 25 percent of their time in front of the computer manipulating databases. Comparative biological research can now be conducted using databases alone, underscoring the need for improvements in storage capacity and computer technology.
Dr. Kathy Behrens observed that Dr. Moore had cited numerous examples of government-industry and intra-industry interactions in the development of the semiconductor industry. Dr. Behrens asked Dr. Moore if there were specific examples that might be useful today in sustaining the advances we have witnessed in the past 30 to 40 years.
With respect to EUV, Dr. Moore cited the industry’s relationship with Lawrence Livermore National Laboratory as a government-industry collaboration that has been highly beneficial to both parties. Lawrence Livermore has brought tremendous assets to the development of EUV, and the semiconductor industry has invested $100 million annually to support the program. The EUV partnership, Dr. Moore observed, is probably the largest Cooperative Research and Development Agreement in place. In the computing area, the government needs to continue to support high-performance parallel computing. Such computers have widespread scientific applications, but only the government can afford them.
A member of the audience remarked how accurate Moore’s Law has proven to be, and asked Dr. Moore how much longer we could expect to see the law hold. Dr. Moore responded that the fact that materials are made of atoms would eventually hinder progress; problems at the atomic level would come into play within a decade. But that would not signify the end of progress; larger chips, advances in interconnection technology, and other advances will continue to improve the capacity of semiconductors.