The role of digital technology in the evolving knowledge society is comparable to that of the railroad during the Industrial Revolution (Attali, 1992). With the aid of information-technology “tracks”—high-speed computer and telecommunications systems—now interconnecting so much of the world, reaching into the marketplace, government, and our homes and lives, we often learn about events virtually as soon as they occur, and we can process that information in myriad increasingly useful ways.
This extensive network is bringing peoples and cultures together and creating new social dynamics in the process. It is leading to the formation of closely bonded, widely dispersed communities of people united by their interest in doing business or in sharing experiences and intellectual pursuits. New forms of knowledge accumulation are developing, as are computer-based learning systems that open the way to innovative modes of instruction and learning (Brown, 2000). And new models of libraries are exploiting vast amounts of digital data in physically dispersed computer systems that can be remotely accessed by users over information networks (National Research Council, 2000a).
A major frontier over the next one or two decades is certain to be the “user interface” for complex information systems. How can it become a more natural environment that transcends limitations of keyboard, mouse, and screen—moving toward an immersive environment in which attributes of human face-to-face exchange can essentially be captured? Ultimately, “virtual environments,” in which we respond to lifelike simulations that are replete with artificially created sights, sounds, and other stimuli, may liberate us from physical restrictions; current targets for application include medicine as well as distance education (Olsen, 2000; Young, 2000b).

“We are going to have a huge shift in the way people access information. . . . Billions of people worldwide are suddenly able to afford basically the same access that we in this room typically enjoy.”

—Stuart Feldman, Workshop on the Impact of Information Technology on the Future of the Research University, January 22-23, 2001, Washington, D.C. (programs.researchchannel.com)
Information technology thus presents significant opportunities for those in the higher-education enterprise who seek new and better approaches to teaching and learning, research, and public service (National Education Association, 2000; Carr, 2000b; Mendels, 1999). However, the effective use of knowledge in such forms may well require a rethinking of many current assumptions about education in general and the research university in particular (Hanna, 2000; Wulf, 1995).
This chapter is intended to provide an overview of information-technology advances that the panel expects to see over the next decade. Two caveats should be kept in mind. First, the focus of the chapter is on anticipated hardware advances. Yet equivalent advances will be necessary in software development. We face major challenges in cracking the “complexity barrier” in software and developing software systems that diagnose, repair, and protect themselves (National Research Council, 2002 and 2000b). Today’s large, complex, and critical information systems may involve hundreds of thousands of computers, be based on millions of lines of code, and operate almost continuously, making them more difficult to design and maintain, and vulnerable in unexpected ways. For example, a growing number of large-scale projects have either been cancelled without being deployed or have experienced significant problems in service (National Research Council, 2000b).
The second caveat is not to confuse technological feasibility with commercial and social reality. Changes in technology will be enormous over the next 10 years (not to mention the next 20), and the rate of change is increasing. But individuals, as well as social institutions like the university, cannot change their behaviors quickly. The fact that we are approaching a time in which communication and access to a great deal of the world’s information will be possible in an instant and at near-zero cost does not necessarily mean that education, or other sorts of “knowledge work,” will change at the same rate. While information technology does hold out the promise of enabling advances in the educational, research, and service functions of the research university, realizing this promise will require time and considerable institutional adaptation.
One of the university’s greatest challenges, in fact, will be managing the great discrepancy between technological and
institutional change and exploiting the new technological capabilities as best it can, while recognizing that the ability to retrieve data is not the same as knowledge, and technical facility is not the same as wisdom. Such challenges are addressed in Chapter 3.
AN EXTRAORDINARY EVOLUTION
It is difficult to appreciate just how quickly information technology is evolving. Five decades ago ENIAC, one of the earliest computers, stood 10 feet tall, stretched 80 feet wide, included more than 17,000 vacuum tubes, and weighed about 30 tons. Today you can buy a musical greeting card with a silicon chip that is 100 times faster than ENIAC (Huey, 1994). Moreover, the time between such improvements is rapidly shrinking. A $1,000 notebook computer now has more computing horsepower than a $20-million supercomputer of the early 1990s.
This extraordinary pace of information-technology evolution is not only expected to continue for the foreseeable future but could well accelerate. For example, the newest supercomputers are capable of performing over 35 trillion calculations per second (Normile, 2002; Reuters, 2002). And computers yet a thousand times faster are currently under development for applications such as the analysis of protein folding (McDonald, 2001).
For the first several decades of the information age, the evolution of digital technology followed the trajectory predicted by “Moore’s Law”—a 1965 observation by Gordon Moore, who later co-founded Intel, that the density of transistors on a chip doubles every 18 months or so, thereby making it twice as powerful as before (or, alternatively viewed, half as costly). Although this “law” was intended to characterize silicon-based microprocessors alone, it turns out that almost every aspect of digital technology has advanced at an exponential rate, with some technologies moving forward even faster (Wulf, 1995). For example, disk areal density—the number of bits per square inch that can be put on a disk—has been doubling every 12 months in recent years and is expected to continue at that rate over the near future.6
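The compounding behind these doubling times is easy to check. The short sketch below, which uses only the doubling periods quoted above, shows that an 18-month doubling yields roughly a 100-fold gain per decade, while a 12-month doubling yields about a 1,000-fold gain:

```python
# Growth factor over one decade implied by a fixed doubling period.
def decade_factor(doubling_period_years):
    doublings = 10 / doubling_period_years
    return 2 ** doublings

moore = decade_factor(1.5)  # transistor density: doubling every 18 months
disk = decade_factor(1.0)   # disk areal density: doubling every 12 months

print(f"Moore's Law pace: ~{moore:.0f}x per decade")   # ~102x
print(f"Disk-density pace: {disk:.0f}x per decade")    # 1024x
```

This is the arithmetic behind the "100 to 1,000" per-decade range discussed later in the chapter.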
In recent years, information technology has been most dramatically driven not by the continuing increase in computing power but rather by the extraordinary growth of bandwidth—the rate at which we can transmit digital data (Feldman, 2001). In the mid- to late 1980s, 300 bit-per-second modems were in wide use; now the local-area networks in our offices and homes communicate at 10-100 megabits per second, and the backbone systems linking regional networks together typically run at gigabit-per-second speeds. With the rapid deployment of fiber-optic cables and optical switching, terabit-per-second networks are just around the corner (Kahney, 2000). According to one market forecast of the next five years, fiber-optic cable will be installed throughout the world at an equivalent rate of thousands of miles per hour, despite the severe spending slump afflicting the telecommunications industry at the time this report was being prepared.7 Meanwhile, researchers are already experimenting with moving data at speeds of petabits per second.

Box 2-1: Prefixes Used in this Report

Mega- = 10^6, or a million
Giga- = 10^9, or a billion
Tera- = 10^12, or a trillion
Peta- = 10^15, or a quadrillion
IBM reports success in the lab with communications in the 8- to 10-petabit-per-second range, and plans are already being made to move such bandwidths into the marketplace (McGarvey, 1999). Some Internet service providers expect to be employing them in perhaps three to five years for their internal traffic. For global communications, intercontinental bandwidth has recently increased from a relatively sclerotic 45 megabits per second to 88-100 gigabits per second, made possible by new fiber-optic cable laid under the major oceans.8
From the average user’s point of view, the exponential rate dictated by Moore’s Law will drive increases of 100- to 1,000-fold in computing speed, storage capacity, and bandwidth every decade. At that pace, today’s $1,000 notebook computer will, by the year 2020, have a computing speed of 1 million gigahertz, a memory of thousands of terabytes, and linkages to networks at data-transmission speeds of gigabits per second.
Put another way, that notebook computer will have astounding processing and memory capacities, yet its electronics will be so tiny as to be almost invisible, and it will communicate with billions of other computers and devices through wireless technology and global networks—what Lucent (Bell) Laboratories calls a “global communications skin” (Lucent, 2000).
AN INTERNET-DRIVEN ECONOMY
While hardware advances are occurring on a clear trajectory, predictions about future applications are more problematic—they have almost always proven to be either too optimistic or too pessimistic. Still, we can be sure that the nature of human interaction with the digital world—and with other humans through computer-based networks—is evolving. New screen displays, such as one that places nine megapixels on the equivalent of a two-page spread, provide resolution noticeably better than that of paper. It is no longer a question of enduring mediocre “I’ll put up with this screen” resolution, but one of superlative “I would really like to have it” quality. Advances are being made in other products as well. Thin, readable, and flexible electronic books, for example, are considered “in-the-bag” technology for broad commercialization over the next few years, as are “computers on a wristwatch” and “knowledge in your pocket.” The Apple iPod, for instance, already packs a 20-gigabyte drive the size of a quarter.
All the while, we are moving beyond the simple text interactions of electronic mail and electronic conferencing, first to graphical user interfaces (e.g., the Mac or Windows) and then to voice and video; next-generation interfaces may use retinal displays, in which lasers paint images directly on the retina of the eye to portray 360-degree immersive environments. With the rapid development of sensors and robotic actuators, touch and action at a distance—already a reality in robot-assisted surgery—may soon be generally available as well.
Thus the world of the user could be marked by increasing technological sophistication. With virtual reality, individuals may routinely communicate with one another through simulated environments, or “telepresence,” perhaps delegating their
own digital representations—“software agents,” or tools that collect, organize, relate, and summarize knowledge on behalf of their human masters—to interact in a virtual world with those of their colleagues. As communications technology increases in power 100-fold (or more) each decade, such digitally mediated human interactions could take place with essentially any degree of fidelity desired.
Predictions like these may seem like fantasy, but consider the record: the penetration of digital technology into our society has proceeded at a remarkable pace. In less than a decade, the Internet has evolved from a relatively obscure research network to a commercial infrastructure now actively utilized by 61 percent of U.S. households and essentially all of our schools and businesses (Gartner Group, 2001). On the global level, the Internet already connects hundreds of millions of people with one another, and estimates are that by the end of the decade this number could grow into the billions—a substantial fraction of the world’s population.9 Such growth is expected to continue despite, or perhaps as a result of, the recent rude awakenings of e-business investors to the realities of the marketplace.
More uncertain than the technological trajectory is the status of the business environment, which will greatly influence when advanced capabilities reach the marketplace. Specific forecasts should be treated with skepticism. For example, forecasts of the 2004 worldwide e-commerce market made in early 2001 ranged from $1.4 trillion to $10 trillion (Butler, 2001). In addition, the overall U.S. and world economies experienced significant slowdowns during the year prior to publication of this report. Still, even revised market forecasts predict continued growth in information-technology-related industries, and this growth is expected to accelerate as business conditions improve. Although the exact pace is difficult to predict, the clear trend is that much of the growth in business-to-business commerce will be Internet-driven.
Access to computers and the Internet, and the ability to use this technology, are thus becoming increasingly important to full participation in our nation’s economic, political, and social life. Furthermore, the transition from phone links to broadband—and, eventually, fiber optics—will transform the current drippy faucet of modem connectivity into a deluge of gigabits per second flowing into our homes, schools, and places of work.
According to several estimates, by 2004 the number of Internet-enabled devices in the world, including mobile phones, personal digital assistants (e.g., Palm Pilots), and other devices, will approach or exceed one billion, and these devices will be “asymptotically cheap”—costing only tens, not thousands, of dollars—and inexorably getting cheaper yet (Feldman, 2001; In-Stat/MDR, 2002). Put another way, over the next decade we could move from “giga” technology (in terms of computer operations per second, storage capacities, and data-transmission rates) to “tera” and then “peta” technology—petabit networks, petabyte databases, and petaflop (quadrillion floating-point operations per second) computing for those applications that need it. Computer servers will number in the billions, digital sensors in the tens of billions, and software agents in the trillions.
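The giga-to-tera-to-peta progression is the same compounding restated: each metric-prefix step is a factor of 1,000, so the time per step follows directly from the per-decade growth factors quoted earlier in this chapter. A short sketch:

```python
import math

# Years needed to gain one metric-prefix step (a factor of 1,000)
# at a given per-decade growth factor.
def years_per_prefix_step(factor_per_decade):
    return 10 * math.log(1000) / math.log(factor_per_decade)

print(years_per_prefix_step(1000))  # at 1,000x per decade: 10 years per step
print(years_per_prefix_step(100))   # at 100x per decade: 15 years per step
```

At the faster quoted rate, each prefix step takes about a decade, which is why a single decade could carry us from giga to tera, and the next from tera to peta.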
In effect, we will evolve from “e-commerce,” “e-government,” and “e-learning” to just about “e-everything” as digital devices increasingly become the primary interfaces not only with our environment but with other people.