Future R&D Environments: A Report for the National Institute of Standards and Technology

H
Trends in Science and Technology

Patrick Young

CONTENTS
INTRODUCTION
TRENDS IN INFORMATION TECHNOLOGY
  Computers, Lithography, and Thin Films
  Data Density
  Flat Displays and Printed Circuitry
  The Internet
  Imaging
TRENDS IN MATERIALS SCIENCE AND TECHNOLOGY
  Micro- and Nanoscale Fabrication
  Photonics
  Superconductors
TRENDS IN ENERGY, ENVIRONMENT, AND THE PLANET
  Energy
  Environment and Ecology
  Atmospheric Sciences and Climatology
  Earthquake Studies
TRENDS IN BIOMEDICAL AND AGRICULTURAL SCIENCES
  Genomics and Proteomics
  Neuroscience
  Drug Development
  Agrobiotechnology
BLENDING THE PHYSICAL AND THE BIOLOGICAL
  Biotech and Materials Science
  Robot Engineering
  Catalysts
DISCUSSION AND CONCLUSION

INTRODUCTION

Any attempt to predict the future carries risks. Yet, in forecasting the scope of technology advances over a span of 5 to 10 years, the past is prologue to the future. During the first decade of the 21st century, three enabling technologies—computers, communications, and electronics—will continue and accelerate the information revolution. These technologies, in turn, will be supported by innovations across a spectrum of supporting technologies, such as advanced imaging, nanotechnology, photonics, and materials science. Important developments will emerge as well in such areas as sensors, energy, biomedicine, biotechnology, and the interaction of the physical and biological sciences. Many challenges, however, confront those seeking to turn the potential of these technologies into practical applications. How rapidly progress comes will depend on innovative ideas, the economy, and the vision of industrial and political leaders.

TRENDS IN INFORMATION TECHNOLOGY

In about 10 years, the semiconductor industry will require a replacement technology for the photolithography process now used to make electronic chips if it wants to keep improving their performance. Two advanced fabrication approaches are under investigation at a time when the industry is also in transition from aluminum to copper for circuit lines and is seeking ways to further shrink the size of transistors. Computer makers expect to vastly increase data storage over the next decade, using new magnetic and optical techniques that include magnetic thin films, near-field optics, holography, and spintronics. Scientists will also continue the quest for higher-resolution and flatter display panels, pursuing such approaches as organic light-emitting diodes and electronic "paper." New and refined imaging techniques will enable cutting-edge studies in the physical and biological sciences.

Computers, Lithography, and Thin Films

Chipmakers must confront the time, probably toward the end of the current decade, when traditional lithography techniques no longer meet their needs. The Semiconductor Industry Association's periodic technology road map makes clear that the decade ahead will require unprecedented changes in the design of chips and their materials if the industry is to continue its traditional rate of doubling computing power about every 18 months.
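To put that pace in perspective (a back-of-the-envelope figure added here for scale, not a number from the report), sustaining an 18-month doubling across the 10-year horizon of this forecast implies roughly a hundredfold gain in computing power:

\[
2^{\,10\ \mathrm{yr}/1.5\ \mathrm{yr}} = 2^{6.7} \approx 100.
\]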

Potential approaches to improved performance include a new chip fabrication technology, improved circuitry, and new types of transistors.

Shrinking the size of microcircuits, the hallmark of the semiconductor industry, has relied in considerable part on the reduction in the light wavelength used for optical lithography. Line widths of 200 nanometers are now in use, but by the end of this decade or soon after, lithography as we know it will likely reach its physical limits. The industry is now pursuing two advanced lithography techniques as potential replacements for the current technology. Each of the two—extreme ultraviolet light (EUV) and projection electron-beam lithography (also known as SCALPEL, for scattering with angular limitation projection electron-beam lithography)—has its advantages and limitations. EUV provides radiation at 13.4 nanometers and uses a complex mirror system rather than lenses for focusing it on the photoresist. The process, however, will require defect-free masks—an extraordinarily difficult challenge—and, most likely, some type of chemical post-processing of the photoresist material to ensure proper depth of the etching. SCALPEL forsakes photons for high-energy electrons, which poses problems in developing masks that are both thin enough for the exposure process and capable of withstanding its high heat. Solutions to this problem exist, but they require complex processing to achieve the finished chip.

Faster Speeds

In an effort to ensure faster computing speeds, chipmaking is in a fundamental transition as it moves away from the historic combination of aluminum circuit lines coated with an insulator, or dielectric, usually silicon dioxide. Following the lead of IBM, companies are substituting copper for aluminum. The choice of an improved dielectric—needed to prevent cross talk as line widths continue to narrow—is far less settled. IBM has chosen Dow Chemical's SiLK aromatic hydrocarbon, but other companies, including Novellus Systems and Dow Corning, offer competing dielectrics, and other new insulating materials are under investigation.

Innovative ways to shrink transistors remain high on the research agenda. One approach would reduce the thickness of the gate insulator, which is currently about 2 nanometers. Achieving chips with 100-nanometer lines will likely require shrinking the insulator to 1 nanometer, only four times the width of a silicon atom. Several other options for getting around the insulator problem are open. One is to reduce the gate length, which would increase speed without having to shrink the insulator layers. This might be accomplished with silicon-on-insulator transistors, in which the insulator is placed under the transistor rather than burying the transistor in the silicon substrate. Another approach, the double-gate transistor, would place one gate atop another and reduce the gate length by half for the same oxide thickness.

The switch to copper wires required new advances in plasma processing of microcircuits, which, in turn, allowed chipmakers a wider selection of materials. Further improvements will be necessary, however, for generating such things as diamond thin films for use in flat-panel displays. Improvements in the deposition of organics and thin films are vital to improving the performance of the next generation of electronic devices. Today, the material to be deposited restricts the choice of deposition technology used. A more universal deposition technique would offer a significant advantage.

Nonvolatile RAMs

Recent years have seen a surge of interest in developing inexpensive, fast, durable, and nonvolatile random access memories and in the use of a solid-state technology called magneto-electronics to replace volatile and nonvolatile semiconductor memories and mechanical storage. Most of this work is focused on giant magnetoresistance (GMR) and magnetic tunnel junction technology. GMR materials have the advantage of a strong signal, nonvolatility, and compatibility with integrated circuit technology. Magnetic tunnel junction devices have the potential to serve as nonvolatile memories with speeds comparable to those of today's dynamic random access memories (DRAMs). Much of the research on magneto-electronic memories has emphasized hybrid devices that utilize a magnetic memory and semiconductor electronics. But one start-up company has gone all-metal, developing a magnetic RAM in which the memory arrays and the electronics are all made of GMR materials. The device is based on electron spin rather than electric charge.

Data Density

More bits per square inch is almost a mantra in the computer data-storage field. But as with photolithography, traditional magnetic and optical storage techniques are approaching their own physical limits, which will require innovative solutions to overcome. In 1999, IBM forecast that it would achieve an areal magnetic storage density of 40 gigabits per square inch (Gb/in²) by the middle of this decade. Beyond this density, magnetic storage encounters the instability of superparamagnetism, a phenomenon in which the magnetic orientation energy of a bit becomes comparable to the surrounding thermal energy. As a result, magnetic bits flip spontaneously at normal operating temperatures. In May 2001, IBM announced it had developed a new coating for its hard disk drives that bypasses the problem—a three-atom-thick layer of ruthenium sandwiched between two layers of magnetic material. IBM predicted that the coating would enable a storage density of 100 Gb/in² by 2003.
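The scale of the problem can be made concrete with a standard stability estimate (added here for context; the specific numbers are illustrative rather than taken from the report). For a magnetic grain with anisotropy energy density $K_u$ and volume $V$ at temperature $T$, the expected time before its magnetization flips spontaneously is

\[
\tau = \tau_0 \exp\!\left(\frac{K_u V}{k_B T}\right), \qquad \tau_0 \sim 10^{-9}\ \mathrm{s},
\]

so holding a bit for about 10 years requires $K_u V / k_B T$ of roughly 40 or more. Packing in more bits by shrinking the grain volume $V$, without a compensating increase in anisotropy, drives the exponent down until bits begin to flip at normal operating temperatures, which is the superparamagnetic instability described above.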

Optical storage faces its own physical barrier—the diffraction limit, at which the size of the optical bits is limited by the wavelength of light used to record them. Solving the two problems will require new materials, structures, and recording technologies.

Magnetic Storage

Today's magnetic storage devices, such as hard disk drives, record data on tiny tracks of cobalt-chromium alloy crystals. One approach being explored to increase density in these devices involves creating a thick film of copolymer plastic, burning holes in it as small as 13 nanometers, and filling them with magnetic materials. Because this technique could yield 12 trillion magnetic "posts" or "wires" over a single square centimeter, each separated from the others by plastic, it could result in magnetic storage densities significantly higher than the 40-Gb/in² limit that superparamagnetism imposes on conventional systems. Indeed, it might someday boost data density into the terabit range.

Another way to improve density would be to replace the cobalt-chromium crystals used in storage media with iron-platinum particles, which have stronger magnetism and could be made as small as 3 nanometers. However, until last year, when IBM scientists succeeded, no one could produce uniform grains of the metal crystals. Uniform grains with greater magnetic strength should enable data densities up to 150 Gb/in² and, perhaps, even into the terabit range. Both approaches, however, face a number of challenging development and scaling issues before they are ready for the market, and they will require new read and record technologies as well if they are to reach their full potential.

Optical Storage

Near-field optics—which exploits the fact that light placed very near an aperture smaller than its wavelength can pass through the hole—may provide one way to vastly expand the data density of optical storage. Using a very small aperture laser, Lucent scientists have recorded and read out optical data at a density of 7.5 Gb/in², and they have speculated that with apertures 30 nanometers in diameter, data density could reach 500 Gb/in². Lucent has licensed its technology for commercial development.
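A quick geometric estimate (an illustrative calculation, not a figure from the report) shows why aperture size translates so directly into storage density. If each recorded bit occupies a square cell of pitch $p$, the areal density is

\[
D \approx \frac{1}{p^{2}} = \left(\frac{2.54\times 10^{7}\ \mathrm{nm/in.}}{p}\right)^{2}\ \mathrm{bits/in}^{2},
\]

so a pitch of 35 to 40 nanometers, that is, a 30-nanometer aperture plus a small guard band between bits, corresponds to roughly 400 to 500 Gb/in², in line with the figure quoted above.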

Holography offers the potential for high storage densities and data transfer at billions of bits per second for two reasons. First, unlike traditional magnetic and optical systems, holography can store data throughout the entire medium rather than simply on the surface. Second, holography allows recording and reading out a million bits at once rather than one bit at a time.

For example, InPhase Technologies, a Lucent spin-off, is commercializing technology developed at Bell Laboratories. It uses two overlapping beams of light—the signal beam, which carries data, and the reference beam. The two beams enter the storage medium at different angles, create an optical interference pattern that changes the medium's physical properties and refractive index, and are recorded as diffractive volume gratings, which enables readout of the stored data. Encoding data consists of assembling "pages" of 1 million bits represented by 1's and 0's and sending them electronically to a spatial light modulator. This device is covered with pixels, each about 10 square micrometers, which can be switched rapidly to match the content of each page. When a signal beam passes through the modulator, its pixels either block or pass light, depending on whether they are set as a 1 or a 0, and the laser beam carries the message of that specific page.

Bringing competitive holographic systems to market that exceed the traditional storage technologies will require a number of advances in recording materials, including improvements in optical clarity, photosensitivity, dimensional stability, and uniform optical thickness, as well as innovations in spatial light modulators, micromirrors, and component-systems integration.

The demonstration that information can be stored on and nondestructively read from nanoclusters of only two to six silver atoms, announced earlier this year by Georgia Institute of Technology researchers, opens another potential approach to increasing data density. The Georgia Tech team exposed a thin film of the silver nanoclusters to blue light in the shape of the letter L. Two days later, they exposed the nanoclusters to green light, which caused them to fluoresce in the L pattern. Whether such nanoclusters can be shaped into compact arrays and handle read-write operations at the speeds of today's computers remains a question for further study.

Electron Spin

Spintronics could lead to information storage on the same chips that process data, which would speed up computation. Data processing is based on the charge carried by electrons; data storage has relied on magnetism or optics. However, electrons also have spin, and electron spin is harnessed in magnetic storage. Spintronics seeks to manipulate electron spin in semiconductor materials for data storage and perhaps quantum computing. The key lies in devising semiconductor materials in which spin-polarized electrons will function. Recent developments in spin polarizers and the synthesizing of magnetic semiconductors suggest this problem can be managed. However, making a marketable product will require ferromagnetic semiconductors that operate at room temperature—a demand not easily fulfilled.

Flat Displays and Printed Circuitry

Organic light-emitting diodes (OLEDs), a technology that offers more design flexibility and higher resolution than traditional LEDs, are now coming to market. However, these devices are limited in size, so they are not yet practical for such things as monitors and television screens. OLEDs rely on small-molecule oligomers or thin films of larger semiconducting polymers for their illumination. A polymer semiconductor, for example, is deposited on a substrate, inserted between electrodes, and injected with electrons and holes (the absence of electrons). When holes and electrons recombine, they emit light. The technology is expected to one day replace cathode-ray tubes and liquid-crystal displays. Advocates emphasize several advantages of light-emitting polymer-based products, including greater clarity, flatter screens, undistorted viewing from wider angles, higher brightness, and low drive voltages and current densities, which conserve energy. OLEDs for alphanumeric use and backlights for liquid-crystal displays are currently entering the marketplace. The coming decade will probably see innovations in OLED production, materials, performance, and scale. Advances in all these areas are needed to bring to market such envisioned products as high-sensitivity chemical sensors, roll-up television screens, wide-area displays, and plastic lasers.

Electronic Paper

This technology may change the configuration and the way we use portable electronic devices such as cell phones and laptops, as well as reinvent how newspapers, books, and magazines are "printed" and read. The vision is of a lightweight, rugged, flexible, and durable plastic that combines the best of wood-pulp-based paper and flat-panel displays. In a sense, the earliest versions of the vision are available today. E Ink Corp. markets a simple version of electronic paper for large-area displays. Gyricon Media, Inc., a Xerox Corp. spin-off, plans to market a precursor electronic paper for similar uses later this year.

The design of electronic paper differs markedly from the electronic displays of today. Instead of cathode-ray tubes or liquid-crystal displays, silicon circuits, and glass, electronic paper would utilize electronic "inks," plastic "paper," and flexible, bendable circuitry. A joint venture by Lucent Technologies and E Ink unveiled a prototype last fall, a device containing 256 transistors, each of which controls a single pixel. A thin layer of ink made of white particles suspended in a black fluid is placed between two electrodes. Switching a pixel on or off causes the white particles to move forward or backward and make the pixel appear black or white. To form an electronic paper, inks must be laminated along with their drive circuitry into flexible sheets. This poses a problem because plastics typically cannot withstand the high temperatures needed for manufacturing conventional silicon circuits. Moreover, the surfaces of plastics are rougher than those of glass or silicon, which can adversely affect viewing. So making electronic paper a viable commercial product will require a number of technological developments.

Electronic paper and innumerable other products would benefit from the ability to simply print electronic circuits rather than go through the stressful and complex process used to make chips. The goal is to fabricate transistors, resistors, and capacitors as thin-film semiconductor devices. Ways to do this at low cost and in large volume using standard printing processes or inkjet printers are in development. Working with funds from the Advanced Technology Program, for example, Motorola has teamed with Dow Chemical and Xerox in a 4-year effort to develop novel organic materials and techniques for printing electronic devices.

The Internet

Rarely, if ever, has a technology changed society as rapidly and unexpectedly as the Internet and the World Wide Web did in the 1990s. The coming decade will also see rapid changes in the way the world communicates and transmits data. The Internet was both revolutionary and evolutionary, a dual process that continues with the next-generation Internet, or Internet II. Initiated at a conference in October 1995, Internet II involves a collaboration of more than 180 universities and a multitude of federal agencies and private companies to develop an advanced communications infrastructure. The major goals are to create a cutting-edge network for the research and education communities, enable new Internet applications, and ensure that new services and applications are transferred to Internet users at large. Designers envision high-speed, low-loss, broadband networks capable of allowing such bit-dense activities as real-time research collaborations and telemedicine consultations of unsurpassed clarity. Internet II encompasses several major new Internet protocols, and, as the original Net did, it will introduce new phrases to the language, such as GigaPOP—the term used for its interconnection points between users and the providers of various services. Among the many innovations needed to enable Internet II are new network architectures, advanced packet data switch/routers, multiplexers, and security and authentication systems.

Internet Vulnerability

The growth of the Internet has stimulated the study of communications networks to understand their general properties and the physical laws that govern their behavior. Physicists at the University of Notre Dame ran a computer simulation of two possible network configurations. In the first, each node had about the same number of connections to other nodes in the network. In the second, the number of connections varied greatly from node to node: most nodes had only a few links, while a small number of highly connected hubs tied the network together. The second configuration represents the type of connectivity found on the Internet. On the basis of their findings, the researchers concluded that Internet-like systems are largely invulnerable to random failure but very open to damage by deliberate attack. A better understanding of the Internet's behavior and its effect on Internet vulnerability is important for national security, communications within and among businesses, and the flow of e-mail and e-commerce.
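The contrast the Notre Dame group observed can be illustrated with a small simulation. The sketch below is added for illustration only: it is not the original study's code, and the library (the open-source networkx package) and all parameters are choices made here. It compares how much of a network stays connected when nodes are removed at random versus when the best-connected nodes are removed first, for a homogeneous random network and for a hub-dominated, Internet-like one.

```python
# Illustrative sketch: robustness of two network types to failure vs. attack.
# Assumes the networkx package; sizes and removal fractions are arbitrary toy values.
import random
import networkx as nx

def largest_component_fraction(g):
    """Fraction of the remaining nodes that lie in the largest connected piece."""
    if g.number_of_nodes() == 0:
        return 0.0
    biggest = max(nx.connected_components(g), key=len)
    return len(biggest) / g.number_of_nodes()

def after_removal(graph, fraction, targeted):
    """Copy the graph and remove a fraction of its nodes, randomly or by degree."""
    g = graph.copy()
    k = int(fraction * g.number_of_nodes())
    if targeted:
        # Deliberate attack: delete the most highly connected nodes (the hubs) first.
        ranked = sorted(g.degree, key=lambda pair: pair[1], reverse=True)
        victims = [node for node, _ in ranked[:k]]
    else:
        # Random failure: delete nodes chosen uniformly at random.
        victims = random.sample(list(g.nodes), k)
    g.remove_nodes_from(victims)
    return g

random.seed(0)
n = 2000
homogeneous = nx.erdos_renyi_graph(n, 4.0 / n, seed=0)    # similar degree everywhere
internet_like = nx.barabasi_albert_graph(n, 2, seed=0)    # a few hubs, many small nodes

for name, g in [("homogeneous", homogeneous), ("internet-like", internet_like)]:
    for mode, targeted in [("random failure", False), ("targeted attack", True)]:
        frac = largest_component_fraction(after_removal(g, 0.05, targeted))
        print(f"{name:13s} {mode:15s} largest component: {frac:.2f}")
# Typical outcome: the internet-like network shrugs off random failures but loses
# far more of its connectivity when its hubs are singled out for removal.
```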

The introduction of the Advanced Encryption Standard algorithm last year could ensure that information encrypted with it and sent over the Internet cannot be decoded by anyone who intercepts it, at least for the next several decades. The challenge now is to ensure the security of the encryption process so that the specific information needed to decode messages remains known only to those who should know it. This is primarily an issue of people and policy. However, NIST is working on techniques called key-management protocols that will help enforce the security of the encryption and decoding process.

A major challenge, one that could have a significant impact on Internet reliability and speed from the user's viewpoint, lies in resolving the so-called last-mile bottleneck. This is the connection between the desktop terminal and the Internet service provider. Technical advances in optical communications, some of them associated with developing Internet II, will shorten this last mile stretch by stretch, improve network communications, and perhaps even solve the bottleneck in the coming decade.

Imaging

Imaging has served as a vital impetus to discovery across the spectrum of science for several centuries, and it will remain so in the 21st century. Advances in imaging at the nano- and molecular scales by various techniques have contributed significantly to the understanding and exploitation of materials and processes. As science seeks to understand and control nature at its smallest scales, the need for new and improved imaging techniques—more sensitive, more specific, sharper in detail—takes on new urgency. New approaches and refinements of old, reliable methods will certainly emerge in the coming decade. Femtosecond lasers, for example, are opening a new era of investigation, ranging from biochemical reactions to fundamental studies of quantum mechanics. Optical microscopy, the oldest of the imaging sciences, and imaging holography could find new uses. Improvements in synchrotron-radiation resolution promise sharper images of such things as chemical-bond orientation, individual magnetic domains, solid-state reactions, catalysts, and the surfaces of semiconductors.

Femtosecond Imaging

Refinements in femtosecond imaging and its application to new areas promise greater understanding in a broad range of disciplines, from cell biology to materials science. The laser-based technique already has demonstrated, for example, that DNA is not a rigid molecule but is capable of considerable motion. Currently, laser pulses of 5 femtoseconds can be achieved. The discovery that

Future R&D Environments: A Report for the National Institute of Standards and Technology fast x-ray pulses can be generated when ultrashort laser pulses are reflected from the boundary between a vacuum and a plasma suggests the potential for a new femtosecond imaging approach. Researchers envision opportunities such as observing the biochemical reaction of new drugs, precisely defining the transport of electrons within DNA, and even gaining a greater understanding of quantum theory through the application of femtosecond imagery. Light Microscopy The original microscopy has yet to reach its limits of usefulness, especially in areas such as biotechnology, biomedical science, and medical diagnostics. By integrating advances from several fields, including optics, robotics, and biochemistry, researchers are developing interactive light microscopy techniques to examine the contents and dynamics of cells and tissues. For example, the National Science Foundation is funding development of the automated interactive microscope, which couples advanced fluorescence-based light microscopy, image processing, and pattern recognition to a supercomputer. Researchers are also exploring deblurring techniques to sharpen the images yielded by innovative light microscopes. Near-field scanning optical microscopy takes advantage of the fact that light shined through a tiny nanoaperture can cast an illumination spot the size of the hole. Spot sizes in the range of 50 to 20 nanometers have been obtained. German researchers have reported using a single molecule as a light source. In theory, such a light source could illuminate a spot approximately 1 nanometer across. One current approach to developing nondestructive imaging on the nanoscale combines two established technologies—scanning probe microscopy and molecular spectroscopy. The aim is to harness the high spatial resolution offered by scanning probes and molecular spectroscopy’s chemical specificity to explore the chemical details of nanometer structures. Holography, too, offers a potential means of imaging at the nanoscale. Working with three partners and money from the Advanced Technology Program, nLine Corp. seeks to develop a holographic system capable of imaging defects on the bottoms of deep, narrow features of semiconductor chips. These features include trenches, contacts, and the spaces between interconnects, where depth-to-width ratios run as high as 30 to 1. Algorithms In many instances, the key to improved imaging will be new algorithms and software packages, through which, for example, researchers can obtain enhanced image quality, create extremely accurate three-dimensional images from the observations of different devices, automate image correction, and gain more interactive capabilities with images. Synchrotron radiation facilities of increased intensity and more precise focus will advance observations in protein structure,

Future R&D Environments: A Report for the National Institute of Standards and Technology surface science, and molecular structure. New variations and refinements of scanning probe and atomic force microscopy can be expected to improve imaging of the physical and biological worlds, helping to solve issues in biochemistry and nanostructure synthesis and fabrication, and to advance the quest for biomolecular devices such as high-density processors, optical-communications elements, and high-density storage media. TRENDS IN MATERIALS SCIENCE AND TECHNOLOGY Fabrication at the micro- and nanoscale level will yield a number of new devices and products, ranging from exquisite sensors to tiny walking robots and automated labs-on-a-chip. Key to such advances is understanding how to control the materials used and the development of new molecular manipulation and micromachining tools. Advancements in photonics will help meet the demand for greater bandwidth for communications, and innovations in photonic-integrated systems will expand their use for signal processing. Both high- and low-temperature superconductors pose challenges and promise commercial applications over the next decade, including in the distribution of electric power and as ships’ engines. Creation of new materials will have effects throughout society, and the versatility of polymers makes them a particularly attractive target for research. Self-assembly, by which molecules form into structures on their own, also has gained increasing attention. Micro- and Nanoscale Fabrication Emerging micro- and nanominiaturization techniques promise to transform the typically planar world of these scales into three dimensions and to enable new devices and technologies of scientific, industrial, economic, and security import, including a host of new sensors, walking microrobots, nanomotors, and new polymers. Understanding how to control the chemical composition, physical properties, and configuration of materials at the molecular level is a key element in devising nanoscale building blocks for assembly into working devices and machines. Achieving this knowledge and integrating it into products requires an interdisciplinary effort by chemists, physicists, materials scientists, and engineers. MEMS Microelectromechanical system (MEMS) devices have gone from relatively simple accelerometers to devices that enabled the successful flight of twin tethered experimental communications satellites, each of which is only 12 cubic inches and weighs 0.55 lb. The challenge now is to improve MEMS techniques and develop new ways to do three-dimensional microfabrication of things such as metallic coils. MEMS devices—already in commercial use as sensors and actua-

Future R&D Environments: A Report for the National Institute of Standards and Technology depths. A major challenge to sequestering the greenhouse gas is to find ways to reduce the cost of such storage to about $10 a ton, a goal the Department of Energy has set for 2015. Climate Patterns Sorting out how to best inhibit the harmful effects of human activities on the atmosphere will take on a new urgency in the coming years. The problem has important international economic and social implications—such as coastal submersion from rising sea levels, increased storms and flooding, and disruptions in agriculture—that reinforce the need for excellent science. The Bush administration’s emphasis on the use of fossil fuels for electricity generation and combustion engines does not bode well for reducing greenhouse gases in the near term. Studies of ice cores from Greenland and the Antarctic continue to elucidate evidence that Earth goes through periodic temperature shifts short of full ice ages. Recent data indicate that this warming—cooling cycle occurs poles apart: When it grows colder in the Arctic, the southern polar region warms and vice versa. Work correlating rises and falls in Earth’s temperature with ocean circulation patterns may further explain the intricate interconnection that shifts the climate on scales of hundreds and thousands of years. Climate changes of shorter scales, such as the phenomena known as El Niño and La Niña, present a challenge of even greater human immediacy. El Niño is a movement of warm surface water from the western Pacific to the eastern Pacific off South America. It is propelled by the Southern Oscillation, an unstable interaction between the ocean and the atmosphere. During an El Niño, the trade winds weaken and the warm water rushes east and releases more water into the air, which results in heavy rainfall over Peru and Ecuador. Far to the west, places such as Indonesia and Australia suffer droughts. La Niña is the reverse phase of the oscillation. So expansive is the area of warm Pacific water that it affects climate over much of the globe in both phases. The two strongest El Niños in more than 100 years occurred in 1982 and 1997, and some scientists believe this intensification was a result of global warming. Earthquake Studies The introduction and refinement of plate tectonics provided a unifying theory that in the broadest terms explained the pattern of earthquakes and volcanic activity observed globally, such as the so-called ring of fire around the rim of the Pacific Ocean. Yet knowing generally where a quake might strike does not say when, and plate tectonics itself did little to provide the information needed to accurately predict temblors by precise time and location.

Future R&D Environments: A Report for the National Institute of Standards and Technology Earthquakes occur when stress builds within rock, primarily along faults beneath the surface, to the point the rock breaks and moves. The coming decade, with near certainty, will not yield ways to pinpoint earthquakes in time and space. However, one can predict a better understanding of the buildup of stress, the transfer of stress along faults, and the complex ways the shockwaves are released when rock snaps propagate through the lithosphere and along the surface. This information will have implications for emergency planning and for improving the design and construction of earthquake-resistant structures. Earthquake Behavior Geoscientists in recent years have advanced their understanding of earthquake behavior in several ways. For example, the notion of fault-to-fault communication has shown that faults pass stress quite effectively at the time of a quake from one segment to another. During an earthquake, considerable stress is released as shockwaves, but some of the fault’s stress also is transferred to adjacent segments. By determining the epicenter of a quake and the direction in which the rock broke—strike slip or thrust—seismologists can calculate how much stress was shifted to adjoining rock and where. This ability allows an informed estimate of where the fault will break next, but the question of when remains unanswered. A better determination of timing could emerge from the new ability—made possible by more powerful computers—to pinpoint and observe clusters of thousands of microquakes along a fault over time. Seismologists believe these microquakes, which are magnitude 1 to 3, represent the progressive failure of a fault. However, what triggers the fault to finally break and release its pent-up energy also remains unknown. Currently, several teams are observing clusters of microquakes on the San Andreas fault near Parkfield, California, where a moderate earthquake has been expected to occur for well over a decade. TRENDS IN BIOMEDICAL AND AGRICULTURAL SCIENCES Sequencing the genomes of humans and other species, coupled with proteomics and advances in bioinformatics, will reveal the genes related to diseases, alter the way physicians practice medicine, and have a major impact on the development of new drugs. The growing understanding of the complex activity inside cells will provide equally important insights, as witnessed by research in the neurosciences. Among the emerging findings is that brain cells may be capable of regenerating themselves, and there is a better understanding of proteinprotein interactions, such as those of hormones and their receptors. Biotechnology will play an increasingly important role in developing human drugs, but public resistance in places to genetically modified plants may slow its role in agriculture.

Future R&D Environments: A Report for the National Institute of Standards and Technology Genomics and Proteomics The mapping and sequencing of the human genome, now nearing completion, marks a historic point in biology and the beginning of equally exciting discoveries as scientists make increasing sense of the jumbled A’s, T’s, G’s, and C’s (adenine, thymine, cytosine, and guanine) that make up its genes. This effort will see scientists reporting important data gleaned from the genome, such as the role of specific genes, the identification of genes that are linked to diseases, and a better understanding of the timing of gene expression. These efforts will require new or improved assay systems, automated processing equipment, computer software, and advances in computational biology. Other Genomes Of considerable importance to understanding the human genome are the continuing efforts to sequence the genomes of other creatures, including the mouse. Many genes are conserved; that is, the same gene exists in many species, and the functions of certain of these genes have been discovered in species other than humans. By matching, say, the mouse and human genomes, a gene with a known function in the mouse can be pinpointed in humans. During the next 7 years, progress along the frontlines of human genomics should identify most genes associated with various diseases and indicate how a malfunctioning gene relates to the ailment, which would open new windows to therapy. Genomics will change many current approaches in medicine and related health sciences. One area likely to see radical change is toxicology. The field has traditionally relied on animals—such as rats, mice, rabbits, dogs—to gauge the toxicity of substances. But genomics research has led to an emerging field known as toxicogenomics. In it, researchers apply a suspected toxin to DNA placed on glass to observe any effects it may have on gene expression. Proteomics Genes carry the codes for proteins, which do the actual work within an organism. The genomic revolution has ushered in the age of proteomics, an even more difficult challenge in which the goal is nothing less than understanding the makeup, function, and interactions of the body’s cellular proteins. Indeed, proteomics poses a more complex puzzle than genomics because there are far more proteins than genes. This is so because the messenger RNA that transports the code from a gene for transcription into a protein can be assembled in several ways, and a protein in a cell also can be modified by such processes as phosphorylation and glycosylation. Proteomics, unlike the traditional study of one protein at a time, seeks to pursue its goals using automated, high-throughput techniques. The development

Future R&D Environments: A Report for the National Institute of Standards and Technology of new technologies is a major goal. Areas of investigation include which genes the different cell types express and their proteins, the characteristics of these proteins, studies to determine their interactions, understanding signal transduction, and determining the nature of protein folding and the exact three-dimensional structure of proteins. As a rule, proteins that interact tend to work together, such as antibodies and their receptors. Identifying a protein active in some disease process offers a potential target for intervention. For this reason, the proteomics quest will be led and dominated by the quest to discover new drugs. A key element in the success of genomics and proteomics has been and will continue to be bioinformatics, a field that in the broadest sense weds information technology and biology. Bioinformatics provides the computer systems and strategies needed to organize and mine databases, examine the relationship of polymorphisms to disease, ascertain the role of proteins and how they interact with other proteins, and inform the search for screening tests and therapeutic drugs. Once basically a data management tool, bioinformatics is now important in the analytical functions needed to support continuing progress in genomics and proteomics—not simply cataloging and warehousing data, but turning it into useful insights and discoveries. All this will require the development of new algorithms and computer strategies. Neuroscience Neuroscience provides but one example of the opportunities and advances that will accrue through a greater understanding of the extraordinarily complex activities within and among cells. The last three decades have brought a burst of knowledge about the brain—the routes of neurons throughout the organ, an understanding of neurotransmitters, synapses, and receptors, mappings of responses to various stimuli, and the linking of genes to biochemical networks, brain circuitry, and behavior. Physicians can now track the progress of neurological diseases in their patients with imaging techniques, and new insights into the processes of memory, learning, and emotions have emerged. One can expect the brain to yield far more of its secrets in the coming decade with the introduction of new probes, new imaging technologies, and improved bioinformatics techniques that integrate findings not only from neuroscience but from other research disciplines as well. Nerve Death Consider, as an example, neurodegeneration, which is a key element in ailments such as Alzheimer’s, Parkinson’s, and Huntington’s diseases, multiple sclerosis, glaucoma, and Creutzfeldt-Jakob disease and its variant, commonly called mad cow disease. Evidence suggests a genetic component for each of these diseases, different degenerative mechanisms for each, yet with some similarities

among them that may yield insight into more than one disease. In Alzheimer's, for instance, a genetic defect in the nucleus affects the cell's mitochondria, its major source of energy, and appears to play a role in triggering apoptosis, or programmed cell death. The question now being raised is whether the same or a similar mechanism may play a role in the optic-nerve degeneration of glaucoma.

The ability to regenerate central nervous system neurons could provide a major advance in treating neurodegenerative diseases and paralyzing spinal cord injuries. One target is the biochemical cascade of apoptosis, which ultimately ends in the death of a cell. Preliminary evidence now suggests that it is possible to interrupt the cascade at several points and that doing so may stop the process of cell death and perhaps return the cell to normal functioning. Another target of opportunity is stem cells, partially developed cells that transform into the various cells of the body. Stem cells taken from embryos and fetuses and transplanted into patients once appeared to be the only stem-cell approach capable of replacing dead neurons. However, ethical and moral questions have slowed investigations of the uses of embryonic and fetal stem cells. Recently, several animal studies indicated that stem cells from adults can be reprogrammed to form specific tissues, including, perhaps, neurons. Studies to date are essentially observational, and many key questions remain. What mechanisms determine what type of cell a stem cell will become? Which stem cells enter the brain and become neuronlike? What signals attract them? Scientists will devote long hours during the next few years to deciphering these biological codes and seeking to apply the answers in therapy.

Self-Repair

The brain itself might one day be stimulated to repair itself. A maxim for years held that central nervous system neurons did not and could not regenerate. Experiments during the last several years in animals, including mice, birds, and primates, have challenged that basic assumption. Some evidence suggests that apoptosis can, in certain circumstances, stimulate stem cells in the brain to form new neurons, and that these neurons form the same connections with other neurons as the dead cells did. Adult stem cells have also been stimulated to form new heart muscle in mice.

Although the 1990s were proclaimed the decade of the brain, the potential for advances in neuroscience during this decade is even greater. Improved techniques for imaging and mapping the biochemical pathways of the brain, more powerful bioinformatics tools, and more interaction among scientists working in different areas of neuroscience should reveal considerably more information about the brain. One can expect a better understanding of the neurodegenerative diseases, the functioning and interactions of dendrites, synapses, and ribosomes, the mechanisms of neurotransmitter production and transport, internal cellular signaling, and learning and memory.

Drug Development

Biotechnology allows the transfer of new genes into animals and humans, the culturing of plants from single cells, and the development of new drugs and diagnostic tests. Knowledge gleaned from the human genome and wedded to biotechnology techniques will play an increasingly important role in drug development, gene-therapy treatments, and clinical immunology. For example, the Food and Drug Administration has approved nine genetically engineered monoclonal antibodies for treating several diseases, including cancer. Biotechnology will develop human antibodies and other proteins that can be harvested for therapeutic uses. Clinical diagnostics will remain a mainstay of biopharmaceuticals as companies begin uniting genomics with microarray (gene-chip) technology to develop new diagnostic tests, including ones that can assess an individual's risk of developing a genetic disease.

Microarrays

DNA microarrays will prove an even more essential element in drug development than they are currently as researchers exploit the sequencing of the human genome. These arrays contain thousands of separate DNA sequences on a plate roughly the size of a business card. They enable the analysis of a sample of DNA to determine whether it contains polymorphisms or mutations. DNA microarrays, which are usually prepared by robotic devices, can be used to diagnose various diseases, identify potential therapeutic targets, and predict toxic effects, and one day they will allow customized drug regimens for individual patients.

Computer modeling and combinatorial chemistry, which enable the rapid creation of thousands of chemical entities and their testing for potential biological activity, hold the promise of new and safer drugs and, perhaps, lower development costs. Companies are devising and using computer models that assess a compound's absorption, distribution, metabolism, excretion, and toxicity characteristics. The aim is to greatly reduce the number of drug candidates that go to animal testing and clinical trials, only to prove unacceptable as human therapeutics. Tapping the mapped human genome will reveal many new targets that, combined with computer modeling, will allow drug firms to design specific drugs more rationally, test them at less cost, and feel more confident about them when the companies take them to human trials.

Receptor Activity

Another area of considerable challenge lies in finding ways to control the interactions of hormones and their receptors and other protein-protein relationships. These interactions present potential drug targets, but devising methods to

Future R&D Environments: A Report for the National Institute of Standards and Technology control them will require many technological advances in areas such as synthesis, analysis, computational chemistry, and bioinformatics. Pharmaceutical companies are also focusing on better ways to deliver drugs, in part as a way to extend their patents. One quest is for better ways to deliver proteins orally. Most proteins used now as therapeutics must be injected, and with a greater number of proteins entering the medical armamentarium in the next 10 years, solving the problems of oral delivery has taken on new urgency. The various drug-delivery methods in development include new inhalation approaches, extended time-release injections, and transdermal administrations. One example of a new transdermal device consists of a tiny pump attached to the skin by an adhesive pad. Pressing a button pushes a needle just below the skin and delivers the drug at a constant rate. Its developers envision that it will be used at first as a way to deliver pain medications. Agrobiotechnology Societal pressures may slow innovations in agricultural biotechnology in the next few years, but research will continue, propelled in part by a simple statistic: The world’s population is predicted to increase from 6 to 8 billion by 2030. With one-third more humans to feed, people may have no choice but to accept genetically engineered foods. Among the genetically modified plants currently in fields are insect-resistant corn and herbicide-resistant soybeans, corn, and canola. The potential benefits of genetically modified crops include fewer environmental problems from pesticides and herbicides, increased yields, enhanced nutrition, drought resistance, and even the production of the building blocks of polymers and the remediation of polluted soils, sediments, and aquifers. However, there are unresolved questions about potential risks as well, which have led to opposition to genetically altered foods, particularly in Europe and Japan. These concerns include the possibilities that a gene-altered plant will become an invasive species and cause ecological damage; that plants producing pesticidal proteins may harm nontargeted organisms and/or have an adverse effect on species that feed on the targeted pests; and that new viruses may evolve in virus-resistant plants. Resolving these issues, as well as the lingering controversy over human safety, poses an immediate challenge for researchers and will be necessary for the acceptance of genetically altered plants. In many ways, the future of food produced by biotechnology—at least in the near term—depends on persuading the public of the solid science behind it. Plant Genomes Announced last December, the first sequencing of the genome of a higher plant—the weed Arabidopsis thaliana—and the nearly completed mapping of the

Future R&D Environments: A Report for the National Institute of Standards and Technology rice genome mark major advances along the way to understanding plant behavior and the opportunities to exploit that knowledge. The rice genome carries particular import for the world’s food supply because an estimated 4 billion people will depend on rice as their dietary staple in the year 2030. The A. thaliana sequencers predict that the plant has about 25,500 genes, and they have assigned tentative functions to around 70 percent of them. As in the human genome, the complete plant genome enables researchers to compare its DNA sequence with DNA sequences from other plants to identify key genes in cash crops. Plant researchers have set a goal of understanding the function of all plant genes by the end of the decade, which would greatly enhance biotechnologists’ ability to generate new genetically altered forms. Those data would then serve as the basis for a virtual plant, a computer model that would enable the simulation of plant growth and development under different environmental conditions. BLENDING THE PHYSICAL AND THE BIOLOGICAL For centuries, a sharp demarcation separated the physical and biological sciences, breached only by an occasional discipline such as biophysics. Today, interdisciplinary research is the norm in many industrial laboratories, and not just in the physical or the biological sciences. To a growing degree, research teams may now include representatives from both. Researchers seek to translate biomolecular recognition into useful nanomechanical devices. Geoscientists are exploring genomics for useful clues to solving problems. Cooperative efforts by biologists and engineers seek to create new robots. Some scientists wonder whether deciphering biosignaling in cells will lead to applications in computer science, and others ponder whether the emerging discoveries of brain science will revolutionize information technology. One can expect a greater breaching of the traditional barriers between physical and biological research and a strengthening of biophysical research during this decade—in industry, government, and academic laboratories. Biotech and Materials Science One promising area is the interaction of biotechnology and materials science. Biological systems have been used to create two- and three-dimensional inorganic nanoscale structures and assemble gold and semiconductor nanoparticles on a DNA template. Such work aims at goals like nanoscale wires, mechanical devices, and logic elements, as well as creating organic-inorganic compounds. The potential from utilizing the knowledge and skills of biotechnologists and materials scientists includes creation of new molecular switches and transistors, nanosensors, catalytic devices, and opto-electronic components. IBM researchers have demonstrated that molecular recognition between a piece of DNA and its complementary strand can translate into a mechanical response, namely, the bend-

Future R&D Environments: A Report for the National Institute of Standards and Technology ing of a nanocantilever. Researchers trying to develop biocompatible materials for use in human replacement parts draw on new findings from a spectrum of disciplines, including molecular and cellular biology, genetics, polymer and surface science, and organic chemistry. Robot Engineering Today, robot building depends almost as much on biologists and neuroscientists as it does on engineers and computer scientists. Robot builders seek insights from the animal kingdom in order to develop machines with the same coordinated control, locomotion, and balance as insects and mammals. The purpose is not to create a robot that looks like a dog (although that has been done and marketed), but to build one—for battlefield use or planet-surface exploration, say— that can walk, creep, run, leap, wheel about, and role over with the same fluid ease as a canine. To do this requires not simply electrical wiring and computer logic, but also a deep understanding of insect and mammalian mobility, which, in turn, requires the input of zoologists, entomologists, and neurophysiologists. What is emerging are some general principles about the complexity of animal mechanics and control. For now, bioinspired robots are mostly creatures of the laboratory. However, one would expect continued development and application of these robots throughout this decade and a backflow of insights to biologists and neurophysiologists as they observe the development of bioinspired machines. Catalysts As the number and understanding of enzyme-crystal structures grow, so does interest in utilizing this knowledge to synthesize new catalysts. Researchers envision harnessing such bioinspired catalysts for green chemistry—through the environmentally benign processing of chemicals—and for use both as new drugs and in their production. The effort is in a formative stage, and although chemists have synthesized enzyme-mimicking catalysts, a great deal of attention is focused on deciphering protein structure and its role in enzymatic behavior. Evidence indicates, for example, that an enzyme is not simply a rigid structure on which catalysis occurs, but that the process actively involves the entire protein. A greater understanding of enzymes has exposed the complexity confronting attempts to develop enzyme mimics as well as totally new ones. Questions remain, however, about protein structure and enzymatic mechanisms and how enzymes work in unison within cells. Many answers should emerge in the next few years through biochemistry, protein crystallography, molecule-by-molecule studies of enzymes, and bioinformatics. Many other examples exist of the melding of the physical and biological sciences. Researchers at Bell Laboratories are trying to exploit both self-assem-

Future R&D Environments: A Report for the National Institute of Standards and Technology bly and the natural electrochromatic properties of the protein bacteriorhodopsin to develop new flat-panel displays. Unraveling the intricate nature of signaling within cells—the focus of a mini-Human Genome Project called the Alliance for Cellular Signaling—holds clear implications not only for basic biology, clinical medicine, and the pharmaceutical industry but also, potentially, for the computer sciences. Some geoscientists look to the sequencing and analysis of a variety of genomes to aid them in understanding the coevolution of life and Earth and the soft-tissue structure of creatures long extinct. Neuroscientists no longer view the brain as a three-pound biological computer but as an even more complex system that organizes thinking, learning, and memory. Understanding these processes holds significant meaning for computer science and information technology as well. DISCUSSION AND CONCLUSION Richard Smalley remarked in a private conversation 6 years ago that the 21st century would be the century of nanoscience. Certainly science at the micro- and nanoscale will play important roles in maintaining the economic competitiveness of the United States in computer chips and storage media, telecommunications, optical devices, display technology, biotechnology, biomedical science, and drug development. At the current time, industry funds more than half of the research most likely to have an economic impact during the coming decade. That situation probably will not change. The current administration appears to view tax policy as a more effective way to stimulate the economy—including its research component—than federal support for research. Although this philosophy might shift with the 2004 or 2008 election, the percentage of research funds provided by the federal government is unlikely to suddenly surge. Two trends will clearly influence U.S. applied research in science, engineering, and technology in the coming decade. One is the growth of interdisciplinary research. Many projects today require the integration of expertise from several disciplines, team research, and a willingness by scientists and engineers to work with others outside their own discipline to bring them to a successful fruition. The second trend relates to the increasing quality of scientific work emerging from foreign laboratories, particularly in Canada, Europe, and Japan. This work will challenge—as well as inform—U.S. science and technology and may affect its domination in areas such as computers, nanoscience, and biotechnology. Beyond that challenge, international treaties, standards, and regulations will affect U.S. competitiveness. For example, although the United States is not a party to the Cartagena Protocol on Biosafety, its provisions will govern U.S. companies whenever they trade with any country that ratifies it. The degree to which research will advance during the next 5 to 10 years depends in part on economic, political, and international factors beyond the scope

of this report. Although precise predictions cannot be made as to which specific efforts will yield unusually significant results over the next 5 to 10 years, the breadth of the technologies likely to yield extraordinary advances can be identified. However, true breakthroughs, by their nature, are unexpected, unanticipated, and unpredictable.