Future R&D Environments: A Report for the National Institute of Standards and Technology (2002)

Appendix H: Trends in Science and Technology

INTRODUCTION

Any attempt to predict the future carries risks. Yet, in forecasting the scope of technology advances over a span of 5 to 10 years, the past is prologue to the future. During the first decade of the 21st century, three enabling technologies—computers, communications, and electronics—will continue and accelerate the information revolution. These technologies, in turn, will be supported by innovations across a spectrum of supporting technologies, such as advanced imaging, nanotechnology, photonics, and materials science. Important developments will emerge as well in such areas as sensors, energy, biomedicine, biotechnology, and the interaction of the physical and biological sciences.

Many challenges, however, confront those seeking to turn the potential of these technologies into practical applications. How rapidly progress comes will depend on innovative ideas, the economy, and the vision of industrial and political leaders.

TRENDS IN INFORMATION TECHNOLOGY

In about 10 years, the semiconductor industry will require a replacement technology for the photolithography process now used to make electronic chips if it wants to keep improving their performance. Two advanced fabrication approaches are under investigation at a time when the industry is also in transition from the use of aluminum to copper for circuit lines and is seeking ways to further shrink the size of transistors. Computer makers expect to vastly increase data storage over the next decade, using new magnetic and optical techniques that include magnetic thin films, near-field optics, holography, and spintronics. Scientists will also continue the quest for higher-resolution and flatter display panels, pursuing such approaches as organic light-emitting diodes and electronic “paper.” New and refined imaging techniques will enable cutting-edge studies in the physical and biological sciences.

Computers, Lithography, and Thin Films

Chipmakers must confront the time, probably toward the end of the current decade, when traditional lithography techniques no longer meet their needs. The Semiconductor Industry Association’s periodic technology road map makes clear that the decade ahead will require unprecedented changes in the design of chips and their materials if the industry is to continue its traditional rate of doubling computing power about every 18 months. Potential approaches to improved performance include a new chip fabrication technology, improved circuitry, and new types of transistors.
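
As a rough worked example (the time horizons are simply those of this report, not figures from the road map), a doubling time of about 18 months compounds as follows:

```latex
% Compounded growth implied by a doubling time of roughly 18 months
\[
\text{growth over } t \text{ years} \;=\; 2^{\,12t/18},
\qquad
2^{12\cdot 5/18}\approx 10\times \ \text{(5 years)},
\qquad
2^{12\cdot 10/18}\approx 100\times \ \text{(10 years)}.
\]
```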

Shrinking the size of microcircuits, the hallmark of the semiconductor industry, has relied in considerable part on the reduction in the light wavelength used for optical lithography. Line widths of 200 nanometers are now in use, but by the end of this decade or soon after, lithography as we know it will likely reach its physical limits. The industry is now pursuing two advanced lithography techniques as potential replacements for the current technology. Each of the two—extreme ultraviolet light (EUV) and projection electron-beam lithography (also known as SCALPEL, for scattering with angular limitation projection electron-beam lithography)—has its advantages and limitations. EUV provides radiation at 13.4 nanometers and uses a complex mirror system rather than lenses for focusing it on the photoresist. The process, however, will require defect-free masks—an extraordinarily difficult challenge—and, most likely, some type of chemical post-processing of the photoresist material to ensure proper depth of the etching. SCALPEL forsakes photons for high-energy electrons, which poses problems in developing masks that are both thin enough for the exposure process and capable of withstanding its high heat. Solutions to this problem exist, but they require complex processing to achieve the finished chip.

Faster Speeds

In an effort to ensure faster computing speeds, chipmaking is in a fundamental transition as it moves away from the historic combination of aluminum circuit lines coated with an insulator, or dielectric, usually silicon dioxide. Following the lead of IBM, companies are substituting copper for aluminum. The choice of an improved dielectric—needed to prevent cross talk as line widths continue to narrow—is far less settled. IBM has chosen Dow Chemical’s SiLK aromatic hydrocarbon, but other companies, including Novellus Systems and Dow Corning, offer competing dielectrics, and other new insulating materials are under investigation.

Innovative ways to shrink transistors remain high on the research agenda. One approach would reduce the thickness of the gate insulator, which is currently about 2 nanometers. Achieving chips with 100-nanometer lines will likely require shrinking the insulator to 1 nanometer, only four times the width of a silicon atom. Several other options are open as well. One is to reduce the gate length, which would increase speed without having to shrink the insulator layer further. This might be accomplished by silicon-on-insulator transistors, in which the insulator is placed under the transistor rather than burying the transistor in the silicon substrate. Another approach, the double-gate transistor, would place one gate atop another and reduce the gate length by half for the same oxide thickness.

The switch to copper wires required new advances in plasma processing of microcircuits, which, in turn, allowed chipmakers a wider selection of materials. Further improvements will be necessary, however, for generating such things as diamond thin films for use in flat-panel displays. Improvements in the deposition of organics and thin films are vital to improving the performance of the next generation of electronic devices. Today, the material to be deposited restricts the choice of technology used. A more universal deposition technique would offer a significant advantage.

Nonvolatile RAMs

Recent years have seen a surge of interest in developing inexpensive, fast, durable, and nonvolatile random access memories and in the use of a solid-state technology called magneto-electronics to replace volatile and nonvolatile semiconductor memories and mechanical storage. Most of this work is focused on giant magnetoresistance (GMR) and magnetic tunnel junction technology. GMR materials have the advantage of a strong signal, nonvolatility, and compatibility with integrated circuit technology. Magnetic tunnel junction devices have the potential to serve as nonvolatile memories with speeds comparable to those of today’s dynamic random access memories (DRAMs). Much of the research on magneto-electronic memories has emphasized hybrid devices that utilize a magnetic memory and semiconductor electronics. But one start-up company has gone all-metal, developing a magnetic RAM in which the memory arrays and the electronics are all made of GMR materials. The device is based on electron spin rather than electric charge.

Data Density

More bits per square inch is almost a mantra in the computer data-storage field. But as with photolithography, traditional magnetic and optical storage techniques are approaching their own physical limits, which will require innovative solutions to overcome. In 1999, IBM forecast that it would achieve an areal magnetic storage density of 40 gigabits per square inch (Gb/in.²) by the middle of this decade. Beyond this density, magnetic storage encounters the instability of superparamagnetism, a phenomenon in which the magnetic orientation energy becomes comparable to the surrounding thermal energy. As a result, magnetic bits flip spontaneously at normal operating temperatures. In May 2001, IBM announced it had developed a new coating for its hard disk drives that bypasses the problem—a three-atom-thick layer of ruthenium sandwiched between two layers of magnetic material. IBM predicted that the material would enable a storage density of 100 Gb/in.² by 2003.
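
The superparamagnetic limit can be made concrete with the standard Néel–Arrhenius estimate; the 10-year retention target below is an illustrative textbook assumption, not a figure from this report.

```latex
% Thermal stability of a magnetic grain: anisotropy energy K_u V vs. thermal energy k_B T
\[
\tau \;=\; \tau_0 \exp\!\left(\frac{K_u V}{k_B T}\right),
\qquad \tau_0 \sim 10^{-9}\ \text{s}.
\]
% For a bit to survive roughly 10 years (about 3 x 10^8 s) at operating temperature,
% K_u V / k_B T must exceed roughly ln(3 x 10^8 / 10^-9), i.e., about 40.
% Shrinking the grain volume V without raising the anisotropy K_u eventually
% violates this condition, and the bits flip spontaneously.
```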

Optical storage faces its own physical barrier—the diffraction limit, at which the size of the optical bits is limited by the wavelength of light used to record them. Solving the two problems will require new materials, structures, and recording technologies.

Magnetic Storage

Today’s magnetic storage devices, such as hard disk drives, record data on tiny tracks of cobalt-chromium alloy crystals. One approach being explored to increase density in these devices involves creating a thick film of copolymer plastic, burning holes in it as small as 13 nanometers, and filling them with magnetic materials. Because this technique could yield 12 trillion magnetic “posts” or “wires” over a single square centimeter, each separated from the others by plastic, it could result in magnetic storage significantly higher than the 40-Gb/in.² limit imposed by superparamagnetism on conventional systems. Indeed, it might someday boost data density into the terabit range.
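
A back-of-the-envelope conversion (assuming one bit per post, an assumption not stated in the report) shows why this approach points toward terabit densities:

```latex
% 12 x 10^12 posts per cm^2, one bit per post, converted to bits per in^2
\[
12\times10^{12}\ \tfrac{\text{bits}}{\text{cm}^2}
\times 6.45\ \tfrac{\text{cm}^2}{\text{in}^2}
\;\approx\; 7.7\times10^{13}\ \tfrac{\text{bits}}{\text{in}^2}
\;\approx\; 77\ \text{Tb/in}^2,
\]
% roughly three orders of magnitude beyond the ~40 Gb/in^2 superparamagnetic
% limit cited above for conventional media.
```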

Another way to improve density would be to replace the cobalt-chromium crystals used in storage media with iron-platinum particles, which have stronger magnetism and could be made as small as 3 nanometers. However, until last year, when IBM scientists succeeded, no one could produce uniform grains of the metal crystals. Uniform grains with greater magnetic strength should enable data densities up to 150 Gb/in.² and, perhaps, even into the terabit range.

Both approaches, however, face a number of challenging development and scaling issues before they are ready for the market, and they will require new read and record technologies as well if they are to reach their full potentials.

Optical Storage

Near-field optics—which exploits the fact that light placed very near an aperture smaller than its wavelength can pass through the hole—may provide one way to vastly expand the data density of optical storage. Using a very small aperture laser, Lucent scientists have recorded and read out optical data at a density of 7.5 Gb/in.², and they have speculated that with apertures 30 nanometers in diameter, data density could reach 500 Gb/in.². Lucent has licensed its technology for commercial development.
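
For context, the far-field diffraction limit that near-field optics sidesteps scales with the wavelength; the numbers below use a red laser wavelength and a numerical aperture near unity as illustrative assumptions.

```latex
% Far-field spot size is bounded by diffraction; near-field spots are set by aperture size
\[
d_{\min} \;\approx\; \frac{\lambda}{2\,\mathrm{NA}},
\qquad
\lambda = 650\ \text{nm},\ \mathrm{NA}\approx 1
\;\Rightarrow\; d_{\min}\approx 325\ \text{nm},
\]
% whereas a 30-nm aperture held within the near field writes marks an order of
% magnitude smaller than the far-field wavelength limit allows.
```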

Holography offers the potential for high storage densities and data transfer at billions of bits per second for two reasons. First, unlike traditional magnetic and optical systems, holography can store data throughout the entire medium rather than simply on the surface. Second, holography allows recording and reading out a million bits at once rather than one bit at a time.
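
The throughput advantage of page-wise readout is easy to quantify; the page rate below is an assumed round number chosen for illustration, not a figure from the report.

```latex
% Parallel (page-at-a-time) readout: bits per page x pages per second
\[
10^{6}\ \tfrac{\text{bits}}{\text{page}}
\times 10^{3}\ \tfrac{\text{pages}}{\text{s}}
\;=\; 10^{9}\ \tfrac{\text{bits}}{\text{s}}
\;=\; 1\ \text{Gb/s},
\]
% so even a modest page rate of a thousand pages per second reaches the
% billion-bits-per-second regime described above.
```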

For example, InPhase Technologies, a Lucent spin-off, is commercializing technology developed at Bell Laboratories. It uses two overlapping beams of light—the signal beam, which carries data, and the reference beam. The two beams enter the storage medium at different angles and create an optical interference pattern that changes the medium’s physical properties and refractive index; the pattern is recorded as a diffractive volume grating, which enables readout of the stored data.

Encoding data consists of assembling “pages” of 1 million bits represented by 1’s and 0’s and sending them electronically to a spatial light modulator. This device is coated with pixels, each about 10 square micrometers, which can be switched rapidly to match the content of each page. When a signal beam passes through the modulator, its pixels either block or pass light, depending on whether they are set as a 1 or a 0, and the laser beam carries the message of that specific page.
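
A minimal sketch of the page-encoding step, assuming a hypothetical 1,024 × 1,024-pixel modulator (the dimensions and function names are illustrative, not taken from any vendor's interface):

```python
import numpy as np

PAGE_SHAPE = (1024, 1024)  # ~1 million pixels, one data bit per pixel (assumed geometry)

def bytes_to_page(data: bytes) -> np.ndarray:
    """Unpack a byte stream into a binary 'page' for a spatial light modulator.

    A 1 means the pixel passes the signal beam; a 0 means it blocks it.
    """
    bits = np.unpackbits(np.frombuffer(data, dtype=np.uint8))
    page = np.zeros(PAGE_SHAPE, dtype=np.uint8).ravel()
    page[: bits.size] = bits[: page.size]      # truncate or zero-pad to fit one page
    return page.reshape(PAGE_SHAPE)

def page_to_bytes(page: np.ndarray) -> bytes:
    """Reassemble the byte stream from a detected page (ideal, noise-free readout)."""
    return np.packbits(page.ravel()).tobytes()

if __name__ == "__main__":
    message = b"holographic storage writes a million bits in one exposure " * 1000
    page = bytes_to_page(message)
    assert page_to_bytes(page).startswith(message)
    print(f"page shape {page.shape}, bits set: {int(page.sum())}")
```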

Bringing to market holographic systems that can compete with and exceed traditional storage technologies will require a number of advances in recording materials, including improvements in optical clarity, photosensitivity, dimensional stability, and uniform optical thickness, as well as innovations in spatial light modulators, micromirrors, and component-systems integration.

The demonstration that information can be stored on and nondestructively read from nanoclusters of only two to six silver atoms, announced earlier this year by Georgia Institute of Technology researchers, opens another potential approach to increasing data density. The Georgia Tech team exposed a thin film of the silver nanoclusters to blue light in the shape of the letter L. Two days later, they exposed the nanoclusters to green light, which caused the nanoclusters to fluoresce in the L pattern. Whether such nanoclusters can be shaped into compact arrays and handle read-write operations at the speeds of today’s computers remains a question for further study.

Electron Spin

Spintronics could lead to information storage on the same chips that process data, which would speed up computation. Data processing is based on the charge carried by electrons; data storage has relied on magnetism or optics. However, electrons also have spin, and electron spin is harnessed in magnetic storage. Spintronics seeks to manipulate electron spin in semiconductor materials for data storage and perhaps quantum computing. The key lies in devising semiconductor materials in which spin polarized electrons will function. Recent developments in spin polarizers and the synthesizing of magnetic semiconductors suggest this problem can be managed. However, making a marketable product will require ferromagnetic semiconductors that operate at room temperature—a demand not easily fulfilled.

Flat Displays and Printed Circuitry

Organic light-emitting diodes (OLEDs), a technology that offers more design flexibility and higher resolution than traditional LEDs, are now coming to market. However, these devices are limited in size, so they are not yet practical for such things as monitors and television screens.

OLEDs rely on small-molecule oligomers or thin films of larger semiconductor polymers for their illumination. A polymer semiconductor, for example, is deposited on a substrate, inserted between electrodes, and injected with electrons and holes (the absence of electrons). When holes and electrons recombine, they emit light. The technology is expected to one day replace cathode ray tubes and liquid-crystal displays. Advocates emphasize several advantages of light-emitting polymer-based products, including greater clarity, flatter screens, undistorted viewing from greater angles, higher brightness, and low drive voltages and current densities, which conserve energy. OLEDs for alphanumerical use and backlights for liquid-crystal displays are currently entering the marketplace.

The coming decade will probably see innovations in OLED production, materials, performance, and scale. Advances in all these areas are needed to bring to market such envisioned products as high-sensitivity chemical sensors, roll-up television screens, wide-area displays, and plastic lasers.

Electronic Paper

This technology may change both the configuration of portable electronic devices such as cell phones and laptops and the way we use them, as well as reinvent how newspapers, books, and magazines are “printed” and read. The vision is of a lightweight, rugged, flexible, and durable plastic that combines the best of wood-pulp-based paper and flat-panel displays. In a sense, the earliest versions of the vision are available today. E Ink Corp. markets a simple version of electronic paper for large-area displays. Gyricon Media, Inc., a Xerox Corp. spin-off, plans to market a precursor electronic paper for similar uses later this year.

The design of electronic paper differs markedly from the electronic displays of today. Instead of cathode-ray tubes or liquid-crystal displays, silicon circuits, and glass, electronic paper would utilize electronic “inks,” plastic “paper,” and flexible and bendable circuitry. A joint venture by Lucent Technologies and E Ink unveiled the prototype last fall, a device containing 256 transistors, each of which controls a single pixel. A thin layer of ink made of white particles suspended in a black fluid is placed between two electrodes. Switching a pixel on or off causes the white particles to move forward or backward and make the pixel appear black or white.

To form an electronic paper, inks must be laminated along with their drive circuitry into flexible sheets. This poses a problem because plastics typically cannot withstand the high temperatures needed for manufacturing conventional silicon circuits. Moreover, the surfaces of plastics are rougher than those of glass or silicon, which can adversely affect viewing. So making electronic paper a viable commercial product will require a number of technological developments.

Electronic paper and innumerable other products would benefit from the ability to simply print electronic circuits rather than go through the stressful and complex process used to make chips. The goal is to fabricate transistors, resistors, and capacitors as thin-film semiconductor devices. Ways to do this at low cost and in large volume using standard printing processes or inkjet printers are in development. Working with funds from the Advanced Technology Program, for example, Motorola has teamed with Dow Chemical and Xerox in a 4-year effort to develop novel organic materials and techniques for printing electronic devices.

The Internet

Rarely, if ever, has a technology changed society as rapidly and unexpectedly as the Internet and the World Wide Web did in the 1990s. The coming decade will also see rapid changes in the way the world communicates and transmits data.

The Internet was both revolutionary and evolutionary, a dual process that continues with the next-generation Internet, or Internet II. Initiated at a conference in October 1995, Internet II involves a collaboration of more than 180 universities and a multitude of federal agencies and private companies to develop an advanced communications infrastructure. The major goals are to create a cutting-edge network for the research and education communities, enable new Internet applications, and ensure that new services and applications get transferred to Internet users at large. Designers envision high-speed, low-loss, broadband networks capable of allowing such bit-dense activities as real-time research collaborations and telemedicine consultations of unsurpassed clarity. Internet II encompasses several major new Internet protocols, and, as the original Net did, it will introduce new phrases to the language, such as GigaPOP—the term used for its interconnection points between users and the providers of various services. Among the many innovations needed to enable Internet II are new network architectures, advanced packet data switch/routers, multiplexers, and security and authentication systems.

Internet Vulnerability

The growth of the Internet has stimulated the study of communications networks to understand their general properties and the physical laws that govern their behavior. Physicists at the University of Notre Dame ran a computer simulation of two possible network configurations. In one, each node had about the same number of connections to other nodes in the network. In the second configuration, the number of connections varied greatly from node to node: most nodes had only a few links, while a small number of highly connected hubs dominated the network’s connectivity. The second configuration represented the type of connections found on the Internet. On the basis of their findings, the researchers concluded that Internet-like systems are largely invulnerable to random failure but highly vulnerable to deliberate attack on their most connected nodes. A better understanding of the Internet’s structure and behavior, and of how that structure affects its vulnerability, is important for national security, communications within and among businesses, and the flow of e-mail and e-commerce.
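
A small simulation in the spirit of that study can be sketched with the standard `networkx` graph package; the network sizes and the 5 percent removal fraction below are arbitrary choices for illustration, not the parameters the Notre Dame group used.

```python
import random
import networkx as nx

def giant_component_fraction(graph: nx.Graph) -> float:
    """Fraction of nodes remaining in the largest connected component."""
    if graph.number_of_nodes() == 0:
        return 0.0
    largest = max(nx.connected_components(graph), key=len)
    return len(largest) / graph.number_of_nodes()

def remove_nodes(graph: nx.Graph, fraction: float, targeted: bool) -> nx.Graph:
    """Delete a fraction of nodes, either at random (failure) or by highest degree (attack)."""
    g = graph.copy()
    k = int(fraction * g.number_of_nodes())
    if targeted:
        victims = [n for n, _ in sorted(g.degree, key=lambda kv: kv[1], reverse=True)[:k]]
    else:
        victims = random.sample(list(g.nodes), k)
    g.remove_nodes_from(victims)
    return g

if __name__ == "__main__":
    n = 2000
    homogeneous = nx.erdos_renyi_graph(n, p=4 / n, seed=1)   # nodes have similar degree
    scale_free = nx.barabasi_albert_graph(n, m=2, seed=1)    # a few hubs dominate, like the Internet
    for name, g in [("homogeneous", homogeneous), ("scale-free", scale_free)]:
        for targeted in (False, True):
            survived = giant_component_fraction(remove_nodes(g, 0.05, targeted))
            mode = "attack " if targeted else "failure"
            print(f"{name:11s} {mode}: giant component keeps {survived:.0%} of nodes")
```

Run repeatedly, the scale-free graph shrugs off random failures but fragments badly under targeted removal of its hubs, while the homogeneous graph responds similarly to both.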

The introduction of the Advanced Encryption Standard algorithm last year could assure that information encrypted by it and sent over the Internet cannot be decoded by anyone who intercepts it, at least for the next several decades. The challenge now is to ensure the security of the encryption process so that the specific information needed to decode messages remains known only to those who should know it. This is primarily an issue of people and policy. However, NIST is working on techniques called key-management protocols that will help enforce the security of the encryption and decoding process.
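
As a small illustration of the point that the algorithm itself is sound while key handling is the weak link, the sketch below uses the AES-GCM construction from the widely used Python `cryptography` package (a modern convenience wrapper, not something described in this report); everything stands or falls on keeping `key` secret and managing it properly.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Encrypting with AES is the easy part; the hard part, which key-management
# protocols address, is generating, distributing, storing, and retiring `key`.
key = AESGCM.generate_key(bit_length=256)   # must remain secret and be handled securely
aesgcm = AESGCM(key)

nonce = os.urandom(12)                      # unique per message; never reused with the same key
plaintext = b"measurement data in transit over the Internet"
ciphertext = aesgcm.encrypt(nonce, plaintext, None)

# Anyone intercepting `ciphertext` without `key` learns essentially nothing;
# anyone holding `key` recovers the message immediately.
recovered = aesgcm.decrypt(nonce, ciphertext, None)
assert recovered == plaintext
```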

A major challenge, one that could have a significant impact on Internet reliability and speed from the user’s viewpoint, lies in resolving the so-called last-mile bottleneck. This is the connection between the desktop terminal and the Internet service provider. Technical advances in optical communications, some of them associated with developing Internet II, will shorten this last mile stretch by stretch, speed network communications, and perhaps even solve the bottleneck in the coming decade.

Imaging

Imaging has served as a vital impetus to discovery across the spectrum of science for several centuries. This fact will remain true in the 21st century.

Advances in imaging at the nano- and molecular scales by various techniques have contributed significantly to the understanding and exploitation of materials and processes. As science seeks to understand and control nature at its smallest scales, the need for new and improved imaging techniques—more sensitive, more specific, sharper in detail—takes on new urgency. New approaches and refinements of old, reliable methods will certainly emerge in the coming decade. Femtosecond lasers, for example, are opening a new era of investigation, ranging from biochemical reactions to fundamental studies of quantum mechanics. Optical microscopy, the oldest of the imaging sciences, and imaging holography could find new uses. Improvement in synchrotron-radiation resolution promises sharper images of such things as chemical-bond orientation, individual magnetic domains, solid-state reactions, catalysts, and the surfaces of semiconductors.

Femtosecond Imaging

Refinements in femtosecond imaging and its application to new areas promise greater understanding in a broad range of disciplines, from cell biology to materials science. The laser-based technique already has demonstrated, for example, that DNA is not a rigid molecule but is capable of considerable motion. Currently, laser pulses of 5 femtoseconds can be achieved. The discovery that fast x-ray pulses can be generated when ultrashort laser pulses are reflected from the boundary between a vacuum and a plasma suggests the potential for a new femtosecond imaging approach. Researchers envision opportunities such as observing the biochemical reactions of new drugs, precisely defining the transport of electrons within DNA, and even gaining a greater understanding of quantum theory through the application of femtosecond imagery.
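
To put the 5-femtosecond figure in perspective, the spatial extent of such a pulse is simply the speed of light times its duration:

```latex
% Length of a 5-fs light pulse in vacuum
\[
c\,\Delta t \;=\; (3\times10^{8}\ \text{m/s})\times(5\times10^{-15}\ \text{s})
\;=\; 1.5\times10^{-6}\ \text{m}
\;=\; 1.5\ \mu\text{m},
\]
% only about two optical wavelengths long, which is why such pulses can freeze
% molecular motion on the time scale of chemical-bond vibrations.
```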

Light Microscopy

The original microscopy has yet to reach its limits of usefulness, especially in areas such as biotechnology, biomedical science, and medical diagnostics. By integrating advances from several fields, including optics, robotics, and biochemistry, researchers are developing interactive light microscopy techniques to examine the contents and dynamics of cells and tissues. For example, the National Science Foundation is funding development of the automated interactive microscope, which couples advanced fluorescence-based light microscopy, image processing, and pattern recognition to a supercomputer. Researchers are also exploring deblurring techniques to sharpen the images yielded by innovative light microscopes.
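
One family of deblurring techniques alluded to above is iterative deconvolution; the sketch below uses the Richardson–Lucy routine from scikit-image, with a synthetic test image and an assumed Gaussian point-spread function standing in for a real microscope.

```python
import numpy as np
from scipy.signal import convolve2d
from skimage import restoration

rng = np.random.default_rng(0)

# Synthetic "specimen": a dark field with a few bright point-like features.
truth = np.zeros((128, 128))
truth[tuple(rng.integers(16, 112, size=(2, 20)))] = 1.0

# Assume the microscope's point-spread function is an 11 x 11 Gaussian blur.
y, x = np.mgrid[-5:6, -5:6]
psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
psf /= psf.sum()

# Simulate the blurred, slightly noisy image the instrument would record.
blurred = convolve2d(truth, psf, mode="same") + 0.01 * rng.standard_normal(truth.shape)
blurred = np.clip(blurred, 0, None)

# Richardson-Lucy deconvolution iteratively re-estimates the unblurred image.
restored = restoration.richardson_lucy(blurred, psf, 30)

print(f"peak of blurred image : {blurred.max():.3f}")
print(f"peak after deblurring : {restored.max():.3f}")
```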

Near-field scanning optical microscopy takes advantage of the fact that light shined through a tiny nanoaperture can cast an illumination spot the size of the hole. Spot sizes in the range of 20 to 50 nanometers have been obtained. German researchers have reported using a single molecule as a light source. In theory, such a light source could illuminate a spot approximately 1 nanometer across.

One current approach to developing nondestructive imaging on the nanoscale combines two established technologies—scanning probe microscopy and molecular spectroscopy. The aim is to harness the high spatial resolution offered by scanning probes and molecular spectroscopy’s chemical specificity to explore the chemical details of nanometer structures. Holography, too, offers a potential means of imaging at the nanoscale. Working with three partners and money from the Advanced Technology Program, nLine Corp. seeks to develop a holographic system capable of imaging defects on the bottoms of deep, narrow features of semiconductor chips. These features include trenches, contacts, and the spaces between interconnects, where depth-to-width ratios run as high as 30 to 1.

Algorithms

In many instances, the key to improved imaging will be new algorithms and software packages, through which, for example, researchers can obtain enhanced image quality, create extremely accurate three-dimensional images from the observations of different devices, automate image correction, and gain more interactive capabilities with images. Synchrotron radiation facilities of increased intensity and more precise focus will advance observations in protein structure, surface science, and molecular structure. New variations and refinements of scanning probe and atomic force microscopy can be expected to improve imaging of the physical and biological worlds, helping to solve issues in biochemistry and nanostructure synthesis and fabrication, and to advance the quest for biomolecular devices such as high-density processors, optical-communications elements, and high-density storage media.

TRENDS IN MATERIALS SCIENCE AND TECHNOLOGY

Fabrication at the micro- and nanoscale level will yield a number of new devices and products, ranging from exquisite sensors to tiny walking robots and automated labs-on-a-chip. Key to such advances are an understanding of how to control the materials used and the development of new molecular-manipulation and micromachining tools. Advances in photonics will help meet the demand for greater bandwidth for communications, and innovations in photonic-integrated systems will expand their use for signal processing. Both high- and low-temperature superconductors pose challenges and promise commercial applications over the next decade, including the distribution of electric power and ship propulsion. Creation of new materials will have effects throughout society, and the versatility of polymers makes them a particularly attractive target for research. Self-assembly, by which molecules form into structures on their own, also has gained increasing attention.

Micro- and Nanoscale Fabrication

Emerging micro- and nanominiaturization techniques promise to transform the typically planar world of these scales into three dimensions and to enable new devices and technologies of scientific, industrial, economic, and security import, including a host of new sensors, walking microrobots, nanomotors, and new polymers. Understanding how to control the chemical composition, physical properties, and configuration of materials at the molecular level is a key element in devising nanoscale building blocks for assembly into working devices and machines. Achieving this knowledge and integrating it into products requires an interdisciplinary effort by chemists, physicists, materials scientists, and engineers.

MEMS

Microelectromechanical system (MEMS) devices have gone from relatively simple accelerometers to devices that enabled the successful flight of twin tethered experimental communications satellites, each of which measures only 12 cubic inches and weighs 0.55 lb. The challenge now is to improve MEMS techniques and develop new ways to do three-dimensional microfabrication of things such as metallic coils. MEMS devices—already in commercial use as sensors and actuators—are being developed as micromachines and microrobots. Especially promising to the MEMS field is the advent of laser-based micromachining and other innovations to supplement the photolithography-chemical etching process used originally. Laser techniques now in laboratory development can micromachine metals and fabricate devices of smaller dimensions—a few micrometers today and, perhaps, nanoscale devices tomorrow. Swedish researchers have fabricated a walking silicon microrobot, as well as a microrobotic arm that uses conjugated-polymer actuators to pick up, move, and place micrometer-size objects.

At the nanoscale level, the ability to manipulate atoms, molecules, and molecular clusters has revealed often unexpected and potentially useful properties. Now scientists are seeking to exploit these findings to create nanowires, nanomotors, macromolecule machines, and other devices. Nanomotors, for example, will be needed to power many envisioned nanodevices, including switches, actuators, and pumps. The challenge is to devise ways to convert chemical energy into power that enables nanodevices to perform useful tasks and to find ways to control and refuel the tiny motors.

Nanotubes

Carbon-based nanotubes continue to attract interest because of the potential of their physical and electrical properties. They are stronger and tougher than steel, can carry higher current densities than copper, and can be either metals or semiconductors. Uses envisioned for them range from tiny switches to new composite materials capable of stopping high-velocity bullets. Bringing such nanometer applications to fruition, however, especially for electronics, will require new methods to reliably and economically mass-produce nanotubes and control their characteristics, as well as ways to structure and organize them.

Without scanning probe microscopy, nanotechnology would remain largely a concept. Even so, there is a need for faster techniques to control, image, and manipulate materials and for new ways to make molecular-scale measurements. One approach to greater specificity would combine the spatial resolution of scanning probe microscopy with the chemical specificity of molecular spectroscopy.

Sensors

Microfabrication and MEMS devices will play increasingly important roles in the development and manufacture of high-tech sensors. The increased potential for terrorist attacks and the threat of chemical or germ warfare have spurred efforts by the civilian and military sectors to detect explosives, deadly chemicals, and biological agents. A single lab-on-a-chip can detect and identify a number of different substances, a capability useful to medical and environmental monitoring as well as national security. Beyond today’s technology, researchers funded by the Defense Advanced Research Projects Agency (DARPA) are developing small, lightweight, easy-to-use MEMS-based biofluidic microprocessors capable of monitoring a person’s blood and interstitial fluid and comparing readings with an assay reference chip. Such devices could not only give early warning of exposure to chemicals or biological agents but also monitor general health, medication usage, and psychological stress.

The coming decade should see other significant advances in medical microsensors. For example, English scientists are developing a camera-in-a-pill that can be swallowed to examine the gastrointestinal tract. The device, currently 11 × 30 millimeters, contains an image sensor, a light-emitting diode, telemetry transmitter, and battery. Several improvements are needed before it can complement or replace current endoscopy tools, including orientation control and a more powerful battery that will enable it to image the entire gastrointestinal tract, from ingestion to elimination. And a Michigan company, a winner in the 2000 Advanced Technology Program competition, is trying to commercialize technologies developed at the University of Michigan. It hopes to create implantable wireless, batteryless pressure sensors to continuously monitor fluids in the body. Potential beneficiaries include patients with glaucoma, hydrocephalus, chronic heart disease, or urinary incontinence.

Artificial or electronic noses are starting to find a home in industry. These devices typically consist of an array of polymers that react with specific odors and a pattern-recognition system to identify an odor by the change it produces in a polymer. To date, artificial noses have been used mainly in the food industry to augment or replace existing means of quality control, but potential uses include quality control in the pharmaceutical, cosmetic, and fragrance industries. Expanding the uses of electronic noses will require sensors with greater sensitivity and specificity and more advanced algorithms to improve performance.
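
The pattern-recognition half of an electronic nose can be sketched with a generic classifier from scikit-learn; the four-polymer sensor array, the odor classes, and the noise level below are invented for illustration.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)

# Hypothetical responses of a 4-polymer sensor array (fractional resistance change)
# to three odor classes; each class has a characteristic response "fingerprint".
fingerprints = {
    "fresh":   np.array([0.10, 0.80, 0.30, 0.05]),
    "spoiled": np.array([0.70, 0.20, 0.60, 0.40]),
    "solvent": np.array([0.30, 0.30, 0.10, 0.90]),
}

def sample(odor: str, n: int) -> np.ndarray:
    """Simulate n noisy sensor-array readings of a given odor."""
    return fingerprints[odor] + 0.05 * rng.standard_normal((n, 4))

X_train = np.vstack([sample(o, 50) for o in fingerprints])
y_train = np.repeat(list(fingerprints), 50)

# The classifier learns each odor's fingerprint from labeled training readings.
model = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)

unknown = sample("spoiled", 1)          # a new reading from the array
print("identified odor:", model.predict(unknown)[0])
```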

Photonics

Photonics underpins optical communications. Because photons are more effective carriers of information than electrons, the demand for new ways to harness light to communicate and store data will intensify, driven by the worldwide need for greater bandwidth. The development of wavelength division multiplexing has opened a progressive revolution in data transmission, storage, and processing that could match that of electronics in the 20th century. Although most photonic circuits today are analog, the development of low-cost photonic integrated systems will enable uses beyond communications, including signal processing and exquisite sensors.

Photons travel at the speed of light and do not interact with one another, which all but eliminates cross talk and interference. However, logic functions require some interaction, and researchers have sought ways to process information electronically and transmit it optically. One promising approach to integration is the use of smart pixel devices, which combine electronic processing circuitry with optical inputs and/or outputs and can be integrated into two-dimensional arrays. Researchers are investigating several approaches to applying smart pixels in high-speed switching, as interconnects, and in flat-panel display applications.

Electro-optical Polymers

The increase in optical-fiber capacity also creates the need to speed information in and out of the fibers. One new approach to speeding this information flow uses polymers containing organic chromophores, which are molecules involved in color. Evidence suggests that embedding chromophores can yield electro-optical polymers that have higher speeds—as high as 100 gigahertz—and require lower voltages than present-day electronic modulators. However, chromophores tend to align in ways that reduce the effect of an applied field, a problem that needs resolution.

Researchers are also exploring holography to create three-dimensional photonic crystals for use in ultrasmall waveguides and other optical-communications applications. So far, however, the thickness of the crystal layers is limited to about 30 micrometers, and more advances in processing will be needed to obtain the larger photonic crystals needed for communications applications.

Significant improvements in key photonic devices seem a certainty in the next decade, including in-fiber optical filters (known as fiber Bragg gratings), in-fiber amplifiers, and fiber lasers. Fiber Bragg gratings, for example, are the building blocks of wavelength division multiplexing and essential elements for the next generation of all-optical switches and networks. In-fiber amplifiers depend on doping the core of optical fibers with rare-earth ions, and refining the doping process should optimize various amplifier designs. Fiber lasers can sustain high power densities and can currently generate pulses of 100 femtoseconds. The development of polymer lasers holds significant potential for speeding communications.
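
The operating principle of a fiber Bragg grating reduces to a single resonance condition; the numerical values below are typical textbook values for silica fiber at telecommunications wavelengths, not figures from this report.

```latex
% Bragg condition for an in-fiber grating: reflected wavelength vs. grating period
\[
\lambda_B \;=\; 2\, n_{\mathrm{eff}}\, \Lambda,
\qquad
n_{\mathrm{eff}} \approx 1.45,\ \Lambda \approx 535\ \text{nm}
\;\Rightarrow\; \lambda_B \approx 1.55\ \mu\text{m},
\]
% so writing gratings with slightly different periods into the same fiber lets a
% wavelength-division-multiplexed link add or drop individual channels.
```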

One challenge to developing new photonic devices is that models currently used to characterize materials require unrealistic computer time. As a result, prototype devices often must be built for testing. A great need exists for new algorithms to address issues such as simulation, analysis, alignment tolerance, and increasing the yields of photonic components.
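
For a sense of why characterizing photonic structures is computationally heavy, here is a deliberately minimal one-dimensional finite-difference time-domain (FDTD) sketch in normalized units; it is not any particular commercial tool, and the grid size, pulse, and dielectric slab are arbitrary illustrative choices. Real device models extend this scheme to three dimensions, dispersive materials, and millions of grid cells, which is where the prohibitive run times come from.

```python
import numpy as np

# 1-D FDTD in normalized units: E and H leapfrog in time on a staggered grid.
nz, nt = 400, 1000                 # grid cells and time steps (illustrative sizes)
ez = np.zeros(nz)                  # electric field
hy = np.zeros(nz)                  # magnetic field
eps = np.ones(nz)                  # relative permittivity: vacuum everywhere ...
eps[250:300] = 4.0                 # ... except a dielectric slab (a crude "device")

source_pos = 50
for t in range(nt):
    # Update H from the curl of E, then E from the curl of H (Yee scheme, Courant number 0.5).
    hy[:-1] += 0.5 * (ez[1:] - ez[:-1])
    ez[1:] += 0.5 / eps[1:] * (hy[1:] - hy[:-1])
    # Inject a Gaussian pulse as a soft source.
    ez[source_pos] += np.exp(-((t - 60) / 20.0) ** 2)

print(f"field energy proxy after {nt} steps: {float(np.sum(ez**2 + hy**2)):.3f}")
```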

Superconductors

In recent years, applications of high-temperature superconductors have moved beyond the realm of laboratory sensors. The U.S. Navy, for example, has awarded a contract for the design of a 25,000-horsepower superconductor motor to power its next generation of destroyers. The ability of these superconductors to transmit electricity with essentially zero resistance will in and of itself guarantee continued efforts to understand the phenomenon, develop new superconducting materials, and apply them.

Opening up applications and a large commercial market for superconductor wires and cables will require increasing the current-carrying capacity of high-temperature superconductors, and the coming years will witness further efforts to resolve this challenge. German researchers have achieved a sixfold increase in current-carrying capacity at 77 K by manipulating the makeup of a yttrium-barium-copper oxide superconductor. The researchers replaced some yttrium ions at grain boundaries with calcium ions. The temperature 77 K, the boiling point of nitrogen, is the point above which many believe superconductors can be widely commercialized. This discovery suggests similar chemical tinkering may raise the current-carrying capacity of other high-temperature superconductor compounds.

Superconducting Plastics

Electrically conducting plastics were discovered a quarter-century ago, but only now have superconductor plastics come to the fore. Superconductor polymers have been problematic because plastics carry current as the result of doping them with impurities, but superconductivity requires an ordered structure that does not disrupt electron flow. An international team that included Bell Labs researchers reported earlier this year that it had observed superconductivity in the polymer poly(hexylthiophene). The researchers placed the polymer inside a field-effect transistor, injected it with holes, and achieved superconductivity at 2.35 K. The discovery should pave the way for a new approach to developing and applying superconducting polymers.

The discovery of an intermetallic superconductor with a critical temperature of 39 K, announced earlier this year, opens another area of opportunity. Although low in operating temperature compared with today’s high-temperature superconductors, the magnesium diboride compound can carry three times as much current, weight for weight, because it is a lighter material, and it can be cooled by closed-cycle refrigeration instead of liquid helium coolants.

Nanotubes

Low-temperature superconductors continue to demonstrate resilience and the potential for significant development. An ongoing Department of Energy program supports efforts to move these materials into commercial use. In one demonstration project, a consortium of companies is readying superconductor cable to serve 14,000 inner-city Detroit customers with electricity. Experimentally, nanotubes—which at room temperature can be insulators, semiconductors, or metals—have added to their reputation as physical chameleons with the discovery that they can become superconductors. European researchers suspended a number of single-wall nanotubes between two superconductor electrodes and found they became superconductors and ultrasmall Josephson junctions at between 0.3 and 1.0 K, depending on the sample.

New Materials and Self-Assembly

Materials science underpinned much of the technological advancement of the 20th century and will continue to do so in this decade. The discipline involves understanding the structure and physical properties of materials and finding ways to alter them for useful purposes. The coming years will see accelerated design of materials whose performance capabilities are dictated by specific needs. The applications of materials science range across modern life, from new structural materials, to sensing devices, to implantable biocompatible replacement parts for humans. As they have in recent decades, innovations in materials will enable a spectrum of new products and technologies.

Polymers

These long-chain molecules have made important contributions to technology and will continue to do so (see Flat Displays and Printed Circuitry, above). Polymers, depending on their composition, can be as rigid and strong as steel, as floppy and stretchable as rubber bands, or somewhere in between. Synthesizing polymers with specific properties, sizes, and shapes, however, remains a significant challenge. Despite substantial progress in controlling the architecture and functions of polymers, considerably more needs to be accomplished in understanding these materials and in developing ways to easily tailor their configurations and properties to meet specific needs—whether as new materials for an old product or a new material for an envisioned use. Japanese researchers, for example, have used the probe tip of a scanning tunneling microscope to fashion conjugated polymers into nanowires. Researchers at the Massachusetts Institute of Technology are developing a shape-memory polymer with properties and applications superior to those of the shape-memory metal alloys now available.

Composite Materials

By chance, guesswork, and science, humans have created alloys for more than 4000 years. Composite materials, in the form of straw-based bricks, go back roughly 7000 years. As scientists develop a deeper understanding of how to structure and process molecules in ways that fine-tune their physical properties, new alloys and composite materials will emerge. In Earth-orbit experiments, researchers have developed a nanocomposite in which magnetic nanoparticles move freely inside cavities that form within the material. Potential applications include tiny compasses, gyroscopes, switches, and, perhaps, microtransformers. Today’s alchemists rely on computers—for modeling, computational chemistry, and computational physics—as well as on sensitive imaging studies and nondestructive testing. Even better tools are needed as the science surges forward.

Self-Assembly

One area likely to progress rapidly over the next decade is that of molecular self-assembly to inexpensively produce atomically precise materials and devices. Harnessing this phenomenon will provide a critical element in fabricating new materials, nanomachines, and electronic devices. Some examples of self-assembling materials in development include smart plastics that assemble themselves into photonic crystals, spinach-based opto-electronic circuits for possible use in logic devices and ultrafast switches, and inorganics and organics that self-assemble between two electrodes to form transistors. Earlier this year, English scientists reported what they called the equivalent of catalytic antibodies for synthetic chemists—a dynamic solution that enables molecules to arrange themselves into the best combination to bind to a specific target. Self-assembly has important applications in the biomedical sciences. One technique, for example, uses a system in which molecules assemble into countless combinations that are quickly tested for their ability to bind to receptors.

A great deal remains unknown about the process of molecular self-assembly or how to utilize it for producing new materials or new applications. Even more challenging is the creation of materials that self-assemble from two or more types of molecules.

TRENDS IN ENERGY, ENVIRONMENT, AND THE PLANET

Fuel cells will enter the marketplace shortly as part of the power systems of automobiles and for use as portable electric generators, and they may soon provide power for smaller devices such as cell phones and laptop computers. Solar cells and nuclear power plants will both receive greater attention in the next decade, but significant problems remain to be solved for both. Hormone disrupters, environmental pollutants, global warming, and ozone depletion are and will remain issues of public and international concern, as will ecosystems and their preservation. Studies should also reveal whether large-scale sequestering of carbon dioxide is practical, as well as more about long-term patterns of climate change and shorter-term episodes, such as El Niño. Although precise earthquake prediction will remain but a goal, geoscientists should gain a greater understanding of the mechanics and behavior of quakes, and this knowledge should inform the earthquake engineering of buildings and other structures.

Energy

Energy is again a national issue, and the Bush administration’s energy policy, while emphasizing increased production, includes interest in conservation, efficiency, and innovation. The century’s first decade will likely see increased research related to nuclear power and ways to store electricity in large quantities, but the technologies most likely to have significant impact in the energy area exist today. One can predict with near-certainty that no magic bullet will emerge to cure the nation’s energy ills.

Fuel Cells

Discovered in 1839, fuel cells long remained little more than a potential source of clean and efficient energy. Today, they are poised to become a commercial commodity. The 1990s saw a resurgence of interest in fuel-cell technology, largely in proton-exchange membrane (PEM) fuel cells for use as nonpolluting vehicle power plants. PEM fuel cells split hydrogen atoms into electrons and protons; the electrons drive an external circuit, and the protons combine with oxygen to produce water and heat. The first production-model fuel-cell cars should be available in 2004, and some automakers expect to be selling 100,000 fuel-cell-equipped vehicles a year by 2010. The viability of fuel cells for vehicles resulted from a number of technological advances that improved fuel processing at lower costs, and further improvements will come. Current automotive fuel cells convert the hydrogen in fossil fuels such as methanol to electrical energy. Ultimately, the makers of PEM fuel cells expect to replace hydrocarbons with pure hydrogen. Two potential energy sources for producing the needed hydrogen from water are high-efficiency solar cells and nuclear power plants.
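
The chemistry described above can be summarized by the standard hydrogen fuel-cell half-reactions (a textbook summary, not taken from this report):

```latex
% Proton-exchange membrane fuel cell: hydrogen is split at the anode, water forms at the cathode
\begin{align*}
\text{anode:}\quad   & \mathrm{2H_2 \;\longrightarrow\; 4H^+ + 4e^-} \\
\text{cathode:}\quad & \mathrm{O_2 + 4H^+ + 4e^- \;\longrightarrow\; 2H_2O} \\
\text{overall:}\quad & \mathrm{2H_2 + O_2 \;\longrightarrow\; 2H_2O} + \text{electricity} + \text{heat}
\end{align*}
% The protons cross the polymer membrane; the electrons are forced through the
% external circuit, which is where the useful electrical work is extracted.
```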

Solid oxide fuel cells (SOFCs) are in development as sources of electric power, and at least one company expects to market a model for home use later this year. SOFCs essentially work in the reverse manner of PEMs. They add electrons to oxygen, and the oxygen ions then react with a hydrocarbon to produce electricity, water, and carbon dioxide. Their potential applications include supplying electricity to individual buildings, groups of buildings, or entire neighborhoods. Conventional SOFCs operate at about 1000 °C, which causes materials stress and poor stability. Efforts are under way to find ways to improve material reliability and reduce operating temperatures.

Solar Cells

Although solar cells have a small niche in today’s energy production, their low conversion rate of sunlight to electricity—about 20 percent at best—remains a stumbling block to their greater use. Significant advances in solar-cell materials are urgently needed. One company, Advanced Research Development, has combined two polymers, polyvinyl alcohol and polyacetylene, to produce solar cells that it contends could reach a conversion rate at which nearly 75 percent of solar energy becomes electricity. German researchers have invented a photoelectrolysis cell that they believe can, with improvements, convert 30 percent of solar energy into electricity to produce hydrogen.

Nuclear

The renewed interest in fission-driven power plants preceded the Bush administration, but the administration’s policies and the current shortages of electricity will add impetus to the relicensing of some existing plants and perhaps interest in building new plants before the end of the decade. However, although the nuclear industry has developed new designs for power plants, issues of materials, safety measures, radioactive-waste disposal, and the negative public view of nuclear power remain formidable barriers to bringing new plants online. The revival of interest in nuclear plants, however, suggests the need to pursue work in these areas, particularly ways to improve disposal techniques for nuclear plant wastes, because some states limit on-site storage within their boundaries, and the federal government has yet to open a national storage site.

Environment and Ecology

Environmental problems did not start with the industrial age, but industrialization did exacerbate them. Understanding and countering environmental threats to human health and Earth’s flora, fauna, air, and water will challenge researchers across many disciplines in the coming decade. Progress in environmental science will depend, in part, on new technologies that enable faster evaluation of chemicals, advances in computational biology, new animal models, better databases, and a clearer definition of the interaction of environmental factors and genes.

Environmental Pollutants

One growing concern is the question of hormone disrupters, or hormone mimics, which are environmental chemicals that evidence suggests may interact with the endocrine systems of humans and animals to cause birth defects and several cancers. A key element in investigating the issue is the need to develop reliable, short-term assays to identify hormone-disrupting chemicals.

Occupational exposure, safe water, and metal contaminants also remain a concern. Although many workers nationwide encounter chemical and biological agents in the course of their employment, the level and risk of exposure for the vast majority remains unknown or poorly defined. Pollutants contaminating water supplies and the use of chlorine pose unresolved scientific and policy questions. “Safe” levels of such water contaminants as arsenic and the impact of the by-products of chlorination on human illness remain unknown. Metals in the environment—especially lead, mercury, arsenic, cadmium, and chromium—pose health threats and problems in the remediation of industrial and hazardous waste sites. New approaches to removing contaminants are needed, and a number of them, including biosurfactants, should reach the market in the next few years.

Genomics’ Role

The mapping of the human genome will play a significant role in understanding how environmental factors affect human health and the genetic susceptibility of people to various chemical pollutants and infectious agents. The first challenge is to identify genes and/or specific polymorphisms that interact with these agents, as well as metabolic, nutritional, and behavioral factors. These genes include those that influence metabolism, detoxification, cell receptors, DNA repair, immune and inflammatory responses, oxidation pathways, and signal transduction. From this effort should come ways to prevent and intervene in disease processes, both at the individual and population level.

Concerns about pollution in general are contributing to green chemistry, which is the development of environmentally safer chemical processes. One area likely to see a quiet revolution is the development of new solvents that are less volatile and therefore less likely to reach the atmosphere and do harm. Ionic liquids, which have a low vapor pressure at room temperature, are one example. Another approach seeks to alter conventional solvents in such ways that they retain their desirable activity yet are environmentally benign or less harmful.

Ecosystems

As important as ecosystems are to food production, sequestering carbon, salinity control, and biodiversity, they remain poorly understood. The public recognition of their importance, however, has led to efforts to restore some ecosystems to their original vitality, and to some frustration. The history of a specific ecosystem is often lost through the system’s years of alteration; this lost knowledge includes the interactions of such factors as water flow and the species of plant life that once flourished there. Moreover, attempts to restore traditional grasses, shrubs, and trees to lure back specific wildlife may fail because of a lack of understanding about what is essential for a successful interaction between the two. If ecosystem restorations are to succeed as intended, many new details of their complexity must be uncovered.

The harm from importing alien species has been documented for more than a century: the disappearance of the American chestnut, the spread of Dutch elm disease, the invasion of the zebra mussel, and other economically disastrous ecological events. Attempts to block the arrival of alien species also require ways to eliminate or control those that establish themselves. Achieving success in this effort seems unlikely in the next 5 to 10 years.

Atmospheric Sciences and Climatology

Humankind’s influence on the atmosphere and the world’s climate will gain further attention worldwide this century. The seasonal Antarctic ozone hole may have reached a natural limit, but the Arctic region—where ozone thinning occurs to a lesser extent than over the southern polar region—is of greater concern because a far larger population is potentially susceptible to the harmful effects of the Sun’s ultraviolet radiation. New evidence suggests that 10- to 20-micrometer particles of frozen water, nitric acid, and sulfuric acid form in the stratosphere over the Arctic in winter. These particles—known as polar stratospheric cloud (PSC) particles—have perhaps 3000 times the mass of previously known PSC particles and provide a surface on which chlorine and bromine convert from inactive to active molecules that destroy ozone. This discovery illustrates the many unknown effects that gases and submicrometer particles released into the atmosphere—and their interactions—can have. Probing these questions will require not just more measurements and more sensitive sampling and analytical techniques but also new, testable theories that apply atmospheric chemistry, fluid mechanics, and thermodynamics to the problem. Aerosol particles pose a particular challenge because, unlike gases, which can remain in the atmosphere for decades, particles in the micrometer range typically last but a few days.

Global Warming

The vast majority of scientists who have assessed the issue now regard global warming, and the human contribution to it through the burning of fossil fuels, as real. This view was reinforced in a recent report to President Bush by the National Academy of Sciences/National Research Council.1 Climate simulations reported last year, for example, compared two sets of runs: one driven by natural contributions to global warming alone, such as volcanic activity and variations in solar radiation, and one driven by natural contributions plus those from human activities. The results indicated that had natural factors alone prevailed, atmospheric warming in the 20th century would have stopped about 1960 and a cooling trend would have followed. Combining human and natural contributions, however, produced a continued warming pattern up to the present.
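
A toy illustration of the attribution logic described above (entirely schematic; the forcing values and climate sensitivity below are invented, not taken from the simulations the report describes) is to drive a simple linear temperature response with natural forcing alone versus natural-plus-human forcing and compare the trajectories:

```python
# Schematic attribution exercise: compare warming driven by natural forcing alone
# with warming driven by natural plus human (greenhouse gas and aerosol) forcing.
# All numbers are invented for illustration; real studies use full climate models.

decades = list(range(1900, 2001, 20))

# Hypothetical global-mean radiative forcings, in watts per square meter.
natural = {1900: 0.00, 1920: 0.05, 1940: 0.15, 1960: 0.10, 1980: 0.00, 2000: -0.05}
human   = {1900: 0.05, 1920: 0.15, 1940: 0.30, 1960: 0.60, 1980: 1.10, 2000: 1.70}

SENSITIVITY = 0.5  # assumed warming (degrees C) per watt per square meter

print("decade   natural only   natural + human   (temperature anomaly, degrees C)")
for year in decades:
    t_natural = SENSITIVITY * natural[year]
    t_combined = SENSITIVITY * (natural[year] + human[year])
    print(f"{year}     {t_natural:+.2f}          {t_combined:+.2f}")

# In this toy run the natural-only anomaly levels off and then declines after
# about 1960, while the combined forcing keeps the anomaly rising to the present.
```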

Aside from reducing the burning of fossil fuels, several other approaches are under investigation to limit the release of carbon dioxide into the atmosphere. One of these would inject carbon dioxide produced by large combustion plants underground into salt-brine-containing rock to sequester it. Another possibility, but one that requires far more study, is to dissolve carbon dioxide in the ocean depths. A major challenge to sequestering the greenhouse gas is to find ways to reduce the cost of such storage to about $10 a ton, a goal the Department of Energy has set for 2015.

1. National Research Council, Committee on the Science of Climate Change, Climate Change Science: An Analysis of Some Key Questions, 2001.

Climate Patterns

Sorting out how to best inhibit the harmful effects of human activities on the atmosphere will take on a new urgency in the coming years. The problem has important international economic and social implications—such as coastal submersion from rising sea levels, increased storms and flooding, and disruptions in agriculture—that reinforce the need for excellent science. The Bush administration’s emphasis on the use of fossil fuels for electricity generation and combustion engines does not bode well for reducing greenhouse gases in the near term.

Studies of ice cores from Greenland and the Antarctic continue to provide evidence that Earth goes through periodic temperature shifts short of full ice ages. Recent data indicate that this warming-cooling cycle occurs poles apart: when it grows colder in the Arctic, the southern polar region warms, and vice versa. Work correlating rises and falls in Earth’s temperature with ocean circulation patterns may further explain the intricate interconnections that shift the climate on scales of hundreds and thousands of years.

Climate changes on shorter scales, such as the phenomena known as El Niño and La Niña, present a challenge of even greater human immediacy. El Niño is a movement of warm surface water from the western Pacific to the eastern Pacific off South America. It is propelled by the Southern Oscillation, an unstable interaction between the ocean and the atmosphere. During an El Niño, the trade winds weaken and the warm water rushes east, releasing more moisture into the air and bringing heavy rainfall to Peru and Ecuador. Far to the west, places such as Indonesia and Australia suffer droughts. La Niña is the reverse phase of the oscillation. So expansive is the area of warm Pacific water that it affects climate over much of the globe in both phases. The two strongest El Niños in more than 100 years occurred in 1982 and 1997, and some scientists believe this intensification was a result of global warming.

Earthquake Studies

The introduction and refinement of plate tectonics provided a unifying theory that in the broadest terms explained the pattern of earthquakes and volcanic activity observed globally, such as the so-called ring of fire around the rim of the Pacific Ocean. Yet knowing generally where a quake might strike does not say when, and plate tectonics itself did little to provide the information needed to accurately predict temblors by precise time and location.

Earthquakes occur when stress builds within rock, primarily along faults beneath the surface, to the point that the rock breaks and moves. The coming decade, with near certainty, will not yield ways to pinpoint earthquakes in time and space. However, one can predict a better understanding of the buildup of stress, of its transfer along faults, and of the complex ways in which the shock waves released when rock snaps propagate through the lithosphere and along the surface. This information will have implications for emergency planning and for improving the design and construction of earthquake-resistant structures.

Earthquake Behavior

Geoscientists in recent years have advanced their understanding of earthquake behavior in several ways. For example, the notion of fault-to-fault communication has shown that faults pass stress quite effectively at the time of a quake from one segment to another. During an earthquake, considerable stress is released as shockwaves, but some of the fault’s stress also is transferred to adjacent segments. By determining the epicenter of a quake and the direction in which the rock broke—strike slip or thrust—seismologists can calculate how much stress was shifted to adjoining rock and where. This ability allows an informed estimate of where the fault will break next, but the question of when remains unanswered.
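
The calculation referred to here is commonly expressed in the seismological literature as the change in Coulomb failure stress resolved onto a neighboring fault segment; the formula below is standard background rather than something stated in this report:

\[
\Delta \mathrm{CFS} = \Delta\tau + \mu' \, \Delta\sigma_n
\]

The first term is the change in shear stress in the slip direction of the receiving segment; the second is the change in normal stress (positive when the fault is unclamped) scaled by an effective friction coefficient, often taken near 0.4. Segments where the change is positive have been nudged closer to failure, which underlies the informed estimates of where a fault may break next.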

A better determination of timing could emerge from the new ability—made possible by more powerful computers—to pinpoint and observe clusters of thousands of microquakes along a fault over time. Seismologists believe these microquakes, which are magnitude 1 to 3, represent the progressive failure of a fault. However, what triggers the fault to finally break and release its pent-up energy also remains unknown. Currently, several teams are observing clusters of microquakes on the San Andreas fault near Parkfield, California, where a moderate earthquake has been expected to occur for well over a decade.

TRENDS IN BIOMEDICAL AND AGRICULTURAL SCIENCES

Sequencing the genomes of humans and other species, coupled with proteomics and advances in bioinformatics, will reveal the genes related to diseases, alter the way physicians practice medicine, and have a major impact on the development of new drugs. The growing understanding of the complex activity inside cells will provide equally important insights, as witnessed by research in the neurosciences. Among the emerging findings is evidence that brain cells may be capable of regenerating themselves, along with a better understanding of protein-protein interactions, such as those of hormones and their receptors. Biotechnology will play an increasingly important role in developing human drugs, but public resistance in some places to genetically modified plants may slow its role in agriculture.

Genomics and Proteomics

The mapping and sequencing of the human genome, now nearing completion, marks a historic point in biology and the beginning of equally exciting discoveries as scientists make increasing sense of the jumbled A’s, T’s, G’s, and C’s (adenine, thymine, guanine, and cytosine) that make up its genes. This effort will see scientists reporting important data gleaned from the genome, such as the role of specific genes, the identification of genes that are linked to diseases, and a better understanding of the timing of gene expression. These efforts will require new or improved assay systems, automated processing equipment, computer software, and advances in computational biology.

Other Genomes

Of considerable importance to understanding the human genome are the continuing efforts to sequence the genomes of other creatures, including the mouse. Many genes are conserved; that is, the same gene exists in many species, and the functions of certain of these genes have been discovered in species other than humans. By matching, say, the mouse and human genomes, a gene with a known function in the mouse can be pinpointed in humans. During the next 7 years, progress along the frontlines of human genomics should identify most genes associated with various diseases and indicate how a malfunctioning gene relates to the ailment, which would open new windows to therapy.

Genomics will change many current approaches in medicine and related health sciences. One area likely to see radical change is toxicology. The field has traditionally relied on animals—such as rats, mice, rabbits, and dogs—to gauge the toxicity of substances. But genomics research has led to an emerging field known as toxicogenomics, in which researchers apply a suspected toxin to DNA placed on glass to observe any effects it may have on gene expression.
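
One way to picture the downstream analysis (a minimal sketch, not the method of any particular laboratory; the gene names, intensities, and threshold are invented) is to compare the expression level of each gene on the array between exposed and control samples and flag those whose expression shifts markedly:

```python
import math

# Hypothetical microarray intensities (arbitrary units) for a handful of genes,
# measured in control and toxin-exposed samples.
control = {"CYP1A1": 120.0, "GSTM1": 300.0, "TP53": 95.0, "ACTB": 1000.0}
exposed = {"CYP1A1": 980.0, "GSTM1": 310.0, "TP53": 40.0, "ACTB": 1020.0}

def log2_fold_change(exposed_value, control_value):
    """Log2 ratio of exposed to control expression for one gene."""
    return math.log2(exposed_value / control_value)

THRESHOLD = 1.0  # |log2 fold change| > 1 corresponds to more than a 2-fold change

for gene in control:
    lfc = log2_fold_change(exposed[gene], control[gene])
    if abs(lfc) > THRESHOLD:
        direction = "up" if lfc > 0 else "down"
        print(f"{gene}: {direction}-regulated by the exposure (log2 FC = {lfc:+.2f})")
```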

Proteomics

Genes carry the codes for proteins, which do the actual work within an organism. The genomic revolution has ushered in the age of proteomics, an even more difficult challenge in which the goal is nothing less than understanding the makeup, function, and interactions of the body’s cellular proteins. Indeed, proteomics poses a more complex puzzle than genomics because there are far more proteins than genes. This is so because the messenger RNA that carries the code from a gene for translation into a protein can be assembled in several ways, and a protein in a cell can also be modified by such processes as phosphorylation and glycosylation.
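
A back-of-the-envelope count (the numbers are purely illustrative, not drawn from the report) shows how quickly the protein inventory can outrun the gene count. A single gene with three splice variants, two independent phosphorylation sites (each occupied or not), and one optional glycosylation could in principle give rise to

\[
3 \times 2^{2} \times 2 = 24
\]

distinct protein forms, and real genes can carry many more sites of modification than that.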

Proteomics, unlike the traditional study of one protein at a time, pursues its goals using automated, high-throughput techniques, and the development of new technologies is itself a major goal. Areas of investigation include which genes different cell types express and which proteins result, the characteristics of those proteins, their interactions, signal transduction, and the nature of protein folding and the exact three-dimensional structure of proteins. As a rule, proteins that interact tend to work together, such as antibodies and their receptors. Identifying a protein active in some disease process offers a potential target for intervention. For this reason, the search for new drugs will lead and dominate the proteomics effort.

A key element in the success of genomics and proteomics has been and will continue to be bioinformatics, a field that in the broadest sense weds information technology and biology. Bioinformatics provides the computer systems and strategies needed to organize and mine databases, examine the relationship of polymorphisms to disease, ascertain the role of proteins and how they interact with other proteins, and inform the search for screening tests and therapeutic drugs. Once basically a data management tool, bioinformatics is now important in the analytical functions needed to support continuing progress in genomics and proteomics—not simply cataloging and warehousing data, but turning it into useful insights and discoveries. All this will require the development of new algorithms and computer strategies.
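
As a concrete, if greatly simplified, illustration of relating polymorphisms to disease (the genotype counts below are invented for the example), the core of such an analysis is often a contingency-table test comparing genotype frequencies in cases and controls:

```python
from scipy.stats import chi2_contingency

# Hypothetical counts of the three genotypes of one polymorphism (AA, Aa, aa)
# among disease cases and healthy controls.
cases    = [240, 410, 150]
controls = [320, 400,  80]

chi2, p_value, dof, expected = chi2_contingency([cases, controls])
print(f"chi-square = {chi2:.1f}, degrees of freedom = {dof}, p = {p_value:.2e}")

# A small p-value suggests the genotype distribution differs between cases and
# controls, i.e., the polymorphism may be associated with the disease -- subject
# to confounding, multiple-testing corrections, and replication in other cohorts.
```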

Neuroscience

Neuroscience provides but one example of the opportunities and advances that will accrue through a greater understanding of the extraordinarily complex activities within and among cells. The last three decades have brought a burst of knowledge about the brain—the routes of neurons throughout the organ, an understanding of neurotransmitters, synapses, and receptors, mappings of responses to various stimuli, and the linking of genes to biochemical networks, brain circuitry, and behavior. Physicians can now track the progress of neurological diseases in their patients with imaging techniques, and new insights into the processes of memory, learning, and emotions have emerged. One can expect the brain to yield far more of its secrets in the coming decade with the introduction of new probes, new imaging technologies, and improved bioinformatics techniques that integrate findings not only from neuroscience but from other research disciplines as well.

Nerve Death

Consider, as an example, neurodegeneration, which is a key element in ailments such as Alzheimer’s, Parkinson’s, and Huntington’s diseases, multiple sclerosis, glaucoma, and Creutzfeldt-Jakob disease and its variant, which is linked to bovine spongiform encephalopathy (mad cow disease). Evidence suggests a genetic component and a distinct degenerative mechanism for each of these diseases, yet with enough similarities among them that insight into one may illuminate another. In Alzheimer’s, for instance, a genetic defect in the nucleus affects the cell’s mitochondria, its major source of energy, and appears to play a role in triggering apoptosis, or programmed cell death. The question now being raised is whether the same or a similar mechanism plays a role in the optic-nerve degeneration of glaucoma.

The ability to regenerate central nervous system neurons could provide a major advance in treating neurodegenerative diseases and paralyzing spinal cord injuries. One target is the biochemical cascade of apoptosis, which ultimately ends in the death of a cell. Preliminary evidence now suggests that it is possible to interrupt the cascade at several points and that doing so may stop the process of cell death and perhaps return the cell to normal functioning.

Another target of opportunity is stem cells, undifferentiated cells that can develop into the various cell types of the body. Stem cells taken from embryos and fetuses and transplanted into patients once appeared to be the only stem-cell approach capable of replacing dead neurons. However, ethical and moral questions have slowed investigations of the uses of embryonic and fetal stem cells. Recently, several animal studies indicated that stem cells from adults can be reprogrammed to form specific tissues, including, perhaps, neurons. Studies to date are essentially observational, and many key questions remain. What mechanisms determine what type of cell a stem cell will become? Which stem cells enter the brain and become neuronlike? What signals attract them? Scientists will devote long hours during the next few years to deciphering these biological codes and seeking to apply the answers in therapy.

Self-Repair

The brain itself might one day be stimulated to self-repair. A maxim for years held that central nervous system neurons did not and could not regenerate. Experiments during the last several years in animals, including mice, birds, and primates, have challenged that basic assumption. Some evidence suggests that apoptosis can, in certain circumstances, stimulate stem cells in the brain to form new neurons, and these neurons form the same connections with other neurons as the dead cells. Adult stem cells have also been stimulated to form new heart muscle in mice.

Although the 1990s were proclaimed the decade of the brain, the potential for advances in neuroscience during this decade is even greater. Improved techniques for imaging and mapping the biochemical pathways of the brain, more powerful bioinformatics tools, and more interaction among scientists working in different areas of neuroscience should reveal considerably more information about the brain. One can expect a better understanding of the neurodegenerative diseases; the functioning and interactions of dendrites, synapses, and ribosomes; the mechanisms of neurotransmitter production and transport; internal cellular signaling; and learning and memory.

Drug Development

Biotechnology allows the transfer of new genes into animals and humans, the culturing of plants from single cells, and the development of new drugs and diagnostic tests. Knowledge gleaned from the human genome, wedded with biotechnology techniques, will play an increasingly important role in drug development, gene-therapy treatments, and clinical immunology. For example, the Food and Drug Administration has approved nine genetically engineered monoclonal antibodies for treating several diseases, including cancer. Biotechnology will yield human antibodies and other proteins that can be harvested for therapeutic uses. Clinical diagnostics will remain a mainstay of biopharmaceuticals as companies begin uniting genomics with microarray (gene-chip) technology to develop new diagnostic tests, including ones that can assess an individual’s risk of developing a genetic disease.

Microarrays

DNA microarrays will prove an even more essential element in drug development than they are currently as researchers exploit the sequencing of the human genome. These arrays contain thousands of separate DNA sequences on a plate roughly the size of a business card. They enable the analysis of a sample of DNA to determine whether it contains polymorphisms or mutations. DNA microarrays, which are usually prepared by robotic devices, can be used to diagnose various diseases, identify potential therapeutic targets, and predict toxic effects, and one day they will allow drug regimens to be customized for individual patients.

Computer modeling and combinatorial chemistry, which enable the rapid creation of thousands of chemical entities and their testing for potential biological activity, hold the promise of new and safer drugs and, perhaps, lower development costs. Companies are devising and using computer models that assess a compound’s absorption, distribution, metabolism, excretion, and toxicity characteristics. The aim is to greatly reduce the number of drugs that go to animal testing and clinical trials only to prove unacceptable as human therapeutics. Tapping the mapped human genome will reveal many new targets that, combined with computer modeling, will allow drug firms to design specific drugs more rationally, test them at less cost, and proceed to human trials with greater confidence.
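
As a simplified sketch of the kind of early in silico filtering alluded to here (the compounds and property values are invented; real absorption and toxicity models are far more elaborate), a first computational pass often screens candidates against rule-of-thumb property limits such as Lipinski's "rule of five" before any animal testing:

```python
from dataclasses import dataclass

@dataclass
class Compound:
    name: str
    mol_weight: float      # daltons
    log_p: float           # octanol-water partition coefficient
    h_bond_donors: int
    h_bond_acceptors: int

def passes_rule_of_five(c: Compound) -> bool:
    """Lipinski's rule of five: a crude screen for likely oral bioavailability."""
    return (c.mol_weight <= 500
            and c.log_p <= 5
            and c.h_bond_donors <= 5
            and c.h_bond_acceptors <= 10)

# Hypothetical candidates from a combinatorial library.
library = [
    Compound("cmpd-001", 342.4, 2.1, 2, 5),
    Compound("cmpd-002", 687.9, 6.3, 4, 12),
    Compound("cmpd-003", 451.0, 4.8, 1, 7),
]

shortlist = [c.name for c in library if passes_rule_of_five(c)]
print("Candidates passing the filter:", shortlist)
```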

Receptor Activity

Another area of considerable challenge lies in finding ways to control the interactions of hormones and their receptors and other protein-protein relationships. These interactions present potential drug targets, but devising methods to control them will require many technological advances in areas such as synthesis, analysis, computational chemistry, and bioinformatics.

Pharmaceutical companies are also focusing on better ways to deliver drugs, in part as a way to extend their patents. One quest is for practical oral delivery of proteins. Most proteins now used as therapeutics must be injected, and with a greater number of proteins entering the medical armamentarium in the next 10 years, solving the problems of oral delivery has taken on new urgency. The various drug-delivery methods in development include new inhalation approaches, extended time-release injections, and transdermal administration. One example of a new transdermal device consists of a tiny pump attached to the skin by an adhesive pad. Pressing a button pushes a needle just below the skin and delivers the drug at a constant rate. Its developers envision that it will be used at first to deliver pain medications.

Agrobiotechnology

Societal pressures may slow innovations in agricultural biotechnology in the next few years, but research will continue, propelled in part by a simple statistic: The world’s population is predicted to increase from 6 to 8 billion by 2030. With one-third more humans to feed, people may have no choice but to accept genetically engineered foods.

Among the genetically modified plants currently in fields are insect-resistant corn and herbicide-resistant soybeans, corn, and canola. The potential benefits of genetically modified crops include fewer environmental problems from pesticides and herbicides, increased yields, enhanced nutrition, drought resistance, and even the production of the building blocks of polymers and the remediation of polluted soils, sediments, and aquifers. However, there are unresolved questions about potential risks as well, which have led to opposition to genetically altered foods, particularly in Europe and Japan. These concerns include the possibilities that a gene-altered plant will become an invasive species and cause ecological damage; that plants producing pesticidal proteins may harm nontargeted organisms and/or have an adverse effect on species that feed on the targeted pests; and that new viruses may evolve in virus-resistant plants. Resolving these issues, as well as the lingering controversy over human safety, poses an immediate challenge for researchers and will be necessary for the acceptance of genetically altered plants. In many ways, the future of food produced by biotechnology—at least in the near term—depends on persuading the public of the solid science behind it.

Plant Genomes

The first sequencing of the genome of a higher plant—the weed Arabidopsis thaliana—announced last December, and the nearly completed mapping of the rice genome mark major advances along the way to understanding plant behavior and the opportunities to exploit that knowledge. The rice genome carries particular import for the world’s food supply because an estimated 4 billion people will depend on rice as their dietary staple in 2030. The A. thaliana sequencers predict that the plant has about 25,500 genes, and they have assigned tentative functions to around 70 percent of them. As with the human genome, the complete plant genome enables researchers to compare its DNA sequence with DNA sequences from other plants to identify key genes in cash crops. Plant researchers have set a goal of understanding the function of all plant genes by the end of the decade, which would greatly enhance biotechnologists’ ability to generate new genetically altered forms. Those data would then serve as the basis for a virtual plant, a computer model that would enable the simulation of plant growth and development under different environmental conditions.

BLENDING THE PHYSICAL AND THE BIOLOGICAL

For centuries, a sharp demarcation separated the physical and biological sciences, breached only by an occasional discipline such as biophysics. Today, interdisciplinary research is the norm in many industrial laboratories, and not just in the physical or the biological sciences. To a growing degree, research teams may now include representatives from both. Researchers seek to translate biomolecular recognition into useful nanomechanical devices. Geoscientists are exploring genomics for useful clues to solving problems. Cooperative efforts by biologists and engineers seek to create new robots. Some scientists wonder whether deciphering biosignaling in cells will lead to applications in computer science, and others ponder whether the emerging discoveries of brain science will revolutionize information technology. One can expect a greater breaching of the traditional barriers between physical and biological research and a strengthening of biophysical research during this decade—in industry, government, and academic laboratories.

Biotech and Materials Science

One promising area is the interaction of biotechnology and materials science. Biological systems have been used to create two- and three-dimensional inorganic nanoscale structures and assemble gold and semiconductor nanoparticles on a DNA template. Such work aims at goals like nanoscale wires, mechanical devices, and logic elements, as well as creating organic-inorganic compounds. The potential from utilizing the knowledge and skills of biotechnologists and materials scientists includes creation of new molecular switches and transistors, nanosensors, catalytic devices, and opto-electronic components. IBM researchers have demonstrated that molecular recognition between a piece of DNA and its complementary strand can translate into a mechanical response, namely, the bending of a nanocantilever. Researchers trying to develop biocompatible materials for use in human replacement parts draw on new findings from a spectrum of disciplines, including molecular and cellular biology, genetics, polymer and surface science, and organic chemistry.

Robot Engineering

Today, robot building depends almost as much on biologists and neuroscientists as it does on engineers and computer scientists. Robot builders seek insights from the animal kingdom in order to develop machines with the same coordinated control, locomotion, and balance as insects and mammals. The purpose is not to create a robot that looks like a dog (although that has been done and marketed), but to build one—for battlefield use or planet-surface exploration, say—that can walk, creep, run, leap, wheel about, and roll over with the same fluid ease as a canine. To do this requires not simply electrical wiring and computer logic, but also a deep understanding of insect and mammalian mobility, which, in turn, requires the input of zoologists, entomologists, and neurophysiologists. What is emerging are some general principles about the complexity of animal mechanics and control. For now, bioinspired robots are mostly creatures of the laboratory. However, one would expect continued development and application of these robots throughout this decade and a backflow of insights to biologists and neurophysiologists as they observe the development of bioinspired machines.

Catalysts

As the number and understanding of enzyme-crystal structures grow, so does interest in utilizing this knowledge to synthesize new catalysts. Researchers envision harnessing such bioinspired catalysts for green chemistry—through the environmentally benign processing of chemicals—and for use both as new drugs and in their production. The effort is in a formative stage, and although chemists have synthesized enzyme-mimicking catalysts, a great deal of attention is focused on deciphering protein structure and its role in enzymatic behavior. Evidence indicates, for example, that an enzyme is not simply a rigid structure on which catalysis occurs, but that the process actively involves the entire protein. A greater understanding of enzymes has exposed the complexity confronting attempts to develop enzyme mimics as well as totally new ones. Questions remain, however, about protein structure and enzymatic mechanisms and how enzymes work in unison within cells. Many answers should emerge in the next few years through biochemistry, protein crystallography, molecule-by-molecule studies of enzymes, and bioinformatics.

Many other examples exist of the melding of the physical and biological sciences. Researchers at Bell Laboratories are trying to exploit both self-assembly and the natural electrochromic properties of the protein bacteriorhodopsin to develop new flat-panel displays. Unraveling the intricate nature of signaling within cells—the focus of a mini-Human Genome Project called the Alliance for Cellular Signaling—holds clear implications not only for basic biology, clinical medicine, and the pharmaceutical industry but also, potentially, for the computer sciences. Some geoscientists look to the sequencing and analysis of a variety of genomes to aid them in understanding the coevolution of life and Earth and the soft-tissue structure of creatures long extinct. Neuroscientists no longer view the brain as a three-pound biological computer but as an even more complex system that organizes thinking, learning, and memory. Understanding these processes holds significant meaning for computer science and information technology as well.

DISCUSSION AND CONCLUSION

Richard Smalley remarked in a private conversation 6 years ago that the 21st century would be the century of nanoscience. Certainly science at the micro- and nanoscale will play important roles in maintaining the economic competitiveness of the United States in computer chips and storage media, telecommunications, optical devices, display technology, biotechnology, biomedical science, and drug development. At the current time, industry funds more than half of the research most likely to have an economic impact during the coming decade. That situation probably will not change. The current administration appears to view tax policy as a more effective way to stimulate the economy—including its research component—than federal support for research. Although this philosophy might shift with the 2004 or 2008 election, the percentage of research funds provided by the federal government is unlikely to suddenly surge.

Two trends will clearly influence U.S. applied research in science, engineering, and technology in the coming decade. One is the growth of interdisciplinary research. Many projects today require the integration of expertise from several disciplines, team research, and a willingness by scientists and engineers to work with others outside their own discipline to bring the projects to fruition. The second trend is the increasing quality of scientific work emerging from foreign laboratories, particularly in Canada, Europe, and Japan. This work will challenge—as well as inform—U.S. science and technology and may affect U.S. dominance in areas such as computers, nanoscience, and biotechnology. Beyond that challenge, international treaties, standards, and regulations will affect U.S. competitiveness. For example, although the United States is not a party to the Cartagena Protocol on Biosafety, its provisions will govern U.S. companies whenever they trade with any country that ratifies it.

The degree to which research will advance during the next 5 to 10 years depends in part on economic, political, and international factors beyond the scope of this report. Although precise predictions cannot be made as to which specific efforts will yield unusually significant results over the next 5 to 10 years, the breadth of the technologies likely to yield extraordinary advances can be identified. However, true breakthroughs, by their nature, are unexpected, unanticipated, and unpredictable.
