1 U.S. Leadership in Information Technology

Information technology is central to our economy and society. The United States has held a commanding lead in this arena, a lead that we must maintain. Meeting the challenges posed by rapid, worldwide change will continue to require our best efforts to advance the state of the art in computing and communications technology. Now, as in the past, our ability to lead requires an ongoing strong program of long-term research. The federal government has supported such research for 50 years with great success. Today, the High-Performance Computing and Communications Initiative (HPCCI) is the multiagency cooperative effort under which most of this research is funded. For this reason, any discussion of the HPCCI must be grounded in an understanding of the role and nature of information technology, the information industry, and the nation's research program in computing. These issues are the subject of this first chapter of the report.

INFORMATION TECHNOLOGY IS CENTRAL TO OUR SOCIETY

Computers affect our lives enormously. We use them directly for everyday tasks such as making an airline reservation, getting money from an automated teller machine, or writing a report on a word processor. We also enjoy many products and services that would not be possible without digital computing and communications.

The direct use of computing is growing rapidly. Personal computers are already pervasive in our economy and society: in the United States over 70 million are installed, and between one-fifth and one-third of U.S. households have a computer.1 The increasing popularity of personal computing is but the tip of the iceberg; education, communication, medicine, government, manufacturing, transportation, science, engineering, finance, and entertainment all increasingly use digital computing and communications to enhance our lives directly by offering improved goods and services.
Indirectly, computing and communications are used to make many products cheaper and better. Without computers connected by communication networks, designing the newest U.S. jetliner, the Boeing 777, within acceptable cost and time constraints would have been impossible. Advanced hardware and advanced software working together substituted computer models for expensive and time-consuming physical modeling. The design of complex plastic molded parts, now routinely used for many products, depends on computer simulation of plastic flow and computer control of die making. Automobile engines rely on embedded computers for fuel economy and emission control, and doctors use computer-assisted tomography (CAT) scanners to
see inside the body. Computers help us understand and tap Earth's resources: our oil is found by computer analysis of geologic data. Interconnected computer systems underlie our entire financial system, enabling electronic funds transfer and services such as home banking. Digital communication extends to a large and rapidly increasing number of businesses, educational institutions, government agencies, and homes.

Originally devices for computation and business data processing, computers now are tools for information access and processing in the broadest sense. As such, they have become fundamental to the operation of our society, and computing and communications have come to be labeled widely as "information processing." Remarkably, given its already enormous direct and indirect impact, information technology is being deployed in our society more rapidly now than at any time in the past.2 If this momentum is sustained, then digital technology and digital information (the digital information revolution) will offer a huge range of new applications, create markets for a wide variety of new products and services, and yield a broad spectrum of benefits to all Americans.

INFORMATION TECHNOLOGY ADVANCES RAPIDLY

The information industry improves its products with amazing speed. For several decades, powered by federal and industrial research and development (R&D) investments in computer science, computer engineering, electrical engineering, and semiconductor physics, each dollar spent on computation, storage, and communication has bought twice the performance every 18 to 24 months. Over the course of each decade, this sustained rate of progress results in a 100-fold improvement, as Figure 1.1 shows for processor speed and disk storage capacity. With continued investment, we can sustain this rate of progress for at least the next decade.
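The compounding arithmetic behind the 100-fold figure can be checked directly. A minimal sketch (the function name is ours; the 18- to 24-month doubling period is the report's own rule of thumb):

```python
# Cumulative price/performance improvement when capability doubles
# every `doubling_months` months (18 to 24 months, per the text).
def improvement_factor(years, doubling_months=18):
    doublings = years * 12 / doubling_months
    return 2 ** doublings

# A decade of 24-month doublings gives 2**5 = 32-fold improvement;
# 18-month doublings give roughly 100-fold, matching Figure 1.1.
print(improvement_factor(10, 24))  # 32.0
print(improvement_factor(10, 18))  # ~101.6
```

The "100-fold per decade" claim thus corresponds to the faster end of the stated doubling range.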
Such rapid improvement is possible because of the nature of information and of the technologies required to process it: integrated circuits, storage devices, and communications systems (Box 1.1). Significant improvements in hardware performance in turn make it feasible to create the software required for computers to do new things: electronic and mechanical design, desktop publishing, video editing, modeling of financial markets, creation of digital libraries, and the practice of telemedicine, for example.

FIGURE 1.1 Increase in performance per dollar of processor speed and disk storage from 1989 to 1994, shown on a semilog scale. (The right-hand graph uses a linear scale to emphasize the compound effect of successive doublings.)
BOX 1.1 WHAT DRIVES THE PROGRESS IN INFORMATION TECHNOLOGY?

• Integrated circuits improve rapidly. Computers are made from integrated circuits, each component of which gets half as high and half as wide every 5 years, with the result that the same area can hold four times as many components at the same cost. Also, each component runs twice as fast, and the circuit chips get bigger. The physical limits to this progress are still far off.

• New designs take advantage of advances in integrated circuits. Today's microprocessors and memories are made from very large-scale integrated (VLSI) circuits. Although modern VLSI microprocessors and memories may have 10 million components, they are actually designed in no more time than the integrated circuits of a decade ago that had only 100,000 components. Advances in designs and design tools (in our ability to master complexity) have been, and will continue to be, essential to taking advantage of advances in integrated circuit technology.

• It is cheap to make more devices, and the same integrated circuit foundries can make many different devices. The marginal cost of building more computers is small, because the cost of raw materials is low and the components are mass-produced. Further, although an integrated circuit foundry may be expensive to build, it can make many different products, just as a printing press can print many different books. Because the same process is used over and over again, improvements in this process have enormous effects on product cost and quality.

• New designs can quickly become products. A new digital system is usually built in an existing foundry that operates directly from a digital description of the design. Investing in prototypes is not necessary, because it is possible to simulate a product accurately and automatically from the design.

• Dramatic system advances enable dramatic application advances.
The fact that computing power per dollar doubles every 18 months means that capabilities can migrate from the high end to the consumer rapidly and predictably. It is R&D investment at the high end that creates these capabilities. Today's desktop workstation was the supercomputer of a mere decade ago. Today's solutions to the problems of large-scale parallelism will enable us to solve tomorrow's mass-market problems of on-chip parallelism.

Rapid progress has produced successive waves of new companies in diverse areas related to information technology and its applications: integrated circuits, computer hardware, computer software, communications, embedded systems, robotics, video on demand, and others. Many of today's major hardware and software firms, including Sun Microsystems and Microsoft, both with revenues of more than $4 billion per year (Computer Select, 1994), did not exist 15 years ago, and none except IBM and Unisys were in the information business 40 years ago. Many of these new companies have drawn on ideas and people from federally funded research projects.

RETAINING LEADERSHIP IN INFORMATION TECHNOLOGY IS VITAL TO THE NATION

The U.S. lead in information technology has brought benefits that are clearly valuable to the nation:

• The jobs and profits generated by the industry itself. The information technology industry is big: revenues attributable to hardware and software sales plus associated services were on the order of $500 billion in 1993.3 Due to the limitations of what is actually counted in
any given estimate, this figure may be conservative. The jobs and profits that the information technology industry complex delivers are worth careful preservation.

• The advantages that other sectors gain from early access to the best information technology, ahead of our international competitors. Learning how to use computers for new tasks sooner allows firms to capture new market segments and compete more efficiently in old ones. U.S. competitiveness in engineering, manufacturing, transportation, financial services, and a host of other areas depends on U.S. competitiveness in information technology (CSTB, 1994b, 1995).

• The benefits that our citizens gain from information technology in their daily lives. Benefits are evident in education, medicine, finance, communication, entertainment, information services, and other areas. The lives of Americans are improved by 24-hour banking, improved fuel economy in automobiles, noninvasive medical diagnosis, and hundreds of uses of computers for generating film sequences and as the basis for computerized games. The reach and impact of such applications are increasing rapidly.

Our lead in information technology is fragile, and it will slip away if we fail to adapt. Leadership has often shifted in a few product generations, and a generation in the information industry can be less than 2 years. As a nation we must continue to foster the flow of fresh ideas and trained minds that have enabled the U.S. information technology enterprise as a whole to remain strong despite the fate of individual companies. International competition is fierce, and it is likely to increase. In the mid-1980s Japan and Europe recognized the strategic importance of information technology and began investing massively in the Fifth Generation and Esprit efforts, respectively.
Although these efforts did not yield new technologies to rival our own, their very significant investments in research and technology infrastructure are laying the foundation for future challenges.4 Today, through the efforts of government, academia, and industry, our nation retains its lead and continues to enjoy the benefits associated with it. Although many factors contributed, the committee believes that federal investment in the advancement of information technology has played a key role. Indeed, for nearly 50 years federal investment has helped to train the people and stimulate the ideas that have made today's computers and many of their applications possible. Federal support early in the life cycle of many ideas has advanced them from novelties to convincing demonstrations that attract private investment to products and services that ultimately add to the quality of U.S. life.

THE FEDERAL INVESTMENT IN COMPUTING RESEARCH HAS PAID RICH DIVIDENDS

In the 1940s and 1950s, much of the federal investment in computing research was for defense (Flamm, 1988). Technical needs such as fire control and intelligence needs such as cryptography and mission planning required great computing power. Since the early 1960s the federal government has invested more broadly in computing research, and these investments have profoundly affected how computers are made and used, contributed to the development of innovative ideas and training of key people, and led to the kinds of advances sampled in Table 1.1. Figure 1.2 shows the corresponding timelines for progress from research topics to commercially successful applications. Notable among the government efforts stimulating many of these advances have been the Advanced Research Projects Agency's (ARPA's)
VLSI program, an initiative of the past decade (Box 1.2), and federal sponsorship of research in laying the groundwork for the now-ubiquitous field of computer graphics (Box 1.3) and for sophisticated database systems (Box 1.4).

TABLE 1.1 Some Successes of Government Funding of Computing and Communications Research

Timesharing
  Goal: Let many people use a computer, each as if it were his or her own, sharing the cost.
  Unanticipated results: Because many people kept their work in one computer, they could easily share information. Reduced cost increased the diversity of users and applications.
  Today: Even personal computers are timeshared among multiple applications. Information sharing is ubiquitous; shared information lives on "servers."

Computer networking
  Goal: Load-sharing among a modest number of major computers.
  Unanticipated results: Electronic mail; widespread sharing of software and data; local area networks (the original networks were wide-area); the interconnection of literally millions of computers.
  Today: Networking has enabled worldwide communication and sharing, access to expertise wherever it exists, and commerce at our fingertips.

Workstations
  Goal: Enough computing to make interactive graphics useful.
  Unanticipated results: Displaced most other forms of computing and terminals; led directly to personal computers and multimedia.
  Today: Millions in use for science, engineering, and finance.

Computer graphics
  Goal: Make pictures on a computer.
  Unanticipated results: "What you see is what you get."
  Today: Almost any image is possible. Realistic moving images made on computers are routinely seen on television and were used effectively in the design of the Boeing 777.

"Windows and mouse" user interface technology
  Goal: Easy access to many applications and documents at once.
  Unanticipated results: Dramatic improvements in overall ease of use; the integration of applications (e.g., spreadsheets, word processors, and presentation graphics).
  Today: The standard way to use all computers.

Very large-scale integrated circuit design
  Goal: New design methods to keep pace with integrated circuit technology.
  Unanticipated results: Easy access to "silicon foundries"; a renaissance in computer design.
  Today: Many more schools training VLSI designers; many more companies using this technology.

Reduced Instruction Set Computers (RISC)
  Goal: Computers 2 to 3 times faster.
  Unanticipated results: Dramatic progress in the "co-design" of hardware and software, leading to significantly greater performance.
  Today: Millions in use; penetration continues to increase.

Redundant Arrays of Inexpensive Disks (RAID)
  Goal: Faster, more reliable disk systems.
  Unanticipated results: RAID is more economical as well: massive data repositories ride the price/performance wave of personal computers and workstations.
  Today: Entering the mainstream for large-scale data storage; will see widespread commercial use in digital video servers.

Parallel computing
  Goal: Significantly faster computing to address complex problems.
  Unanticipated results: Parallel deskside server systems; unanticipated applications such as transaction processing, financial modeling, database mining, and knowledge discovery in data.
  Today: Many computer manufacturers include parallel computing as a standard offering.

Digital libraries
  Goal: Universal, multimedia (text, image, audio, video) access to all the information in large libraries; an essential need is tools for discovering and locating information.
  Unanticipated results: Pending development.
  Today: Beginning development.

From this record of success we can draw some important conclusions:

• Research pays off for an extended period. The federal investment and the payoff, including the spawning of numerous corporations and multibillion-dollar industries, have been continuous for decades.

• Unanticipated results are often the most important results. Information sharing is an unanticipated result of timesharing; what-you-see-is-what-you-get displays and hypermedia documents are unanticipated results of computer graphics; electronic mail is an unanticipated result of networking.

• The fusion of ideas multiplies their effect. Distributed systems, such as automated teller machine networks, combine elements of timesharing, networking, workstations, and computer graphics. Personal digital assistants, the emerging generation of truly portable computers, combine these elements with new networking and power-management technologies.

• Research trains people. A major output of publicly supported research programs has been people. Some develop or create a new concept and start companies to commercialize their knowledge. Others form a pool of expertise that allows new or existing companies to move quickly into new technologies.

• The synergy among industry, academia, and government has been highly effective.
The flow of ideas and people between government-sponsored and commercial programs is suggested in Figure 1.2.
BOX 1.2 AN EXAMPLE OF A SUCCESSFUL FEDERAL R&D PROGRAM: THE ARPA VLSI PROGRAM

The ARPA VLSI program began in the late 1970s. This program, inspired by the ground-breaking work of Carver Mead and Lynn Conway, envisioned that integrated circuit technology could be made available to system designers and that this would have tremendous impact. The program funded academic research activities as well as the Metal Oxide Semiconductor Implementation Service (MOSIS). MOSIS provided low-cost, fast-turnaround VLSI fabrication services to the research community; established by ARPA, it was expanded and had its access broadened by the National Science Foundation.

The ARPA VLSI program is widely regarded as having been a tremendous success. Among its notable achievements are the following:

• Developed the concept of the multichip wafer, which allowed multiple designs to share a single silicon fabrication run. Together with tools developed to assemble the designs and provide services for digital submission of chip designs, this capability made the concept of a low-cost, fast-turnaround silicon foundry a reality. Several companies were formed based on these ideas, with VLSI Technology Inc. being the best known.

• Stimulated development of the Geometry Engine and Pixel-Planes projects, which used the capabilities of VLSI to create new capabilities in low-cost, high-performance three-dimensional graphics. The Geometry Engine project formed the basis of Silicon Graphics Inc. Pixel-Planes technology is licensed to Ivex and Division.

• Stimulated development of Berkeley UNIX, which was funded to provide a research platform for the VLSI design tools. This version of UNIX eventually became the basis for the operating system of choice in workstations, servers, and multiprocessors. UNIX went on to become the most widely used vendor-independent operating system, with the code developed at Berkeley being key to this development.
• Accelerated understanding of the importance of low-cost, high-quality graphics for VLSI design, inspiring the creation of the Stanford University Network (SUN) workstation project. Together with the UNIX development from Berkeley, this technology formed the basis for Sun Microsystems.

• Developed two of the three RISC experiments, the Berkeley RISC project and the Stanford MIPS project, which were major parts of the VLSI program inspired by the possibilities of VLSI technology. These technologies formed the basis for many RISC designs, including those of MIPS Computer Systems (now owned by Silicon Graphics Inc.) and Sun Microsystems.

• Sponsored extensive developments in computer-aided design (CAD) tool design. These led to revolutionary improvements in CAD technology for layout, design rule checking, and simulation. The tools developed in this program were used extensively both in academic research programs and in industry. The ideas were developed into commercial implementations by companies such as VLSI Technology, Cadnetix, and, more recently, Synopsys.

Overall, the ARPA VLSI program was a landmark success, not only in creating new technologies and revolutionizing the computer industry, but also in forming the basis for major new industrial technologies and a number of companies that have become major corporations.*

* Interestingly, the success of the ARPA VLSI program stands in sharp contrast to that of the Department of Defense VHSIC program, which was based entirely in industry and is generally regarded as having had only modest impact, either in developing new technologies or in accelerating technology.
FIGURE 1.2 Government-sponsored computing research and development stimulates creation of innovative ideas and industries. Dates apply to horizontal bars, but not to arrows showing transfer of ideas and people. Table 1.1 is a companion to this figure.
BOX 1.3 AN EXAMPLE OF THE IMPACT OF FEDERAL R&D SUPPORT IN ESTABLISHING A FIELD: COMPUTER GRAPHICS

In the mid-1960s, using a computer on loan from the Air Force and financial support from the Central Intelligence Agency, the computer graphics group at Harvard University demonstrated a prototype virtual reality system. This work contributed significantly to the technological and personnel foundation for the Evans and Sutherland Corporation, which subsequently provided computer-based equipment for training pilots, equipment that is used today by the U.S. military and by commercial pilots the world over.

In the early 1970s the University of Utah was host to a leading program in computer graphics. Dave Evans went there to found a computer science department, and realizing that his department could not be all things to all people, he specialized in computer graphics. ARPA provided the main research support. At that time nearly all pictures of three-dimensional objects were drawn with lines only. The resulting images appeared to be of wire frames; they were not very realistic. The Utah group worked mainly on techniques for increasing the realism by omitting parts of objects that were hidden behind other parts and by shading the surfaces of the objects. The resulting pictures were much more realistic.

Two key developments had particularly significant impact. First, Watkins and others, following a suggestion of Evans, developed a set of incremental techniques for computing what parts of an object were hidden. The key observation was that two parts of the image must be nearly the same if they are close together. When some part of an image has been computed, nearby parts are easier to compute than they would be if computed in isolation. Second, Gouraud and Phong and others developed incremental algorithms for shading solid surfaces.
Prior to their work the best images appeared to be made with flat surfaces; each surface was painted a single shade according to the angle between it, the light, and the observer. Many workers sought methods for representing objects with curved surfaces, but it was then and is still difficult. Instead, Gouraud invented a trick. He painted each surface a gradually changing shade in such a way that at the joints between surfaces they had the same shade. With no demarcation line, the human eye thinks the resulting surface is smooth even though it is made of little flat plates. Phong went a step further, computing highlights as if the surface were curved. Gouraud shading and Phong shading are in standard use everywhere today.

It is particularly interesting to note that when government support started, no one knew that these technologies were possible, and the people who made the key discoveries were not yet involved. The vast implications of computer graphics (what-you-see-is-what-you-get document creation systems, scientific visualization, the entertainment industry, virtual reality) were of course totally unforeseen at the time that this fundamental research was undertaken. In addition to the specific developments cited above, an essential contribution was the many individuals whose training in universities benefited from ARPA support. A few of the more prominent are John Warnock of Adobe Systems ($300 million per year), Jim Clark (formerly) of Silicon Graphics Inc. ($1 billion per year), Henry Fuchs of the University of North Carolina, and Ed Catmull of Pixar. Many others carried the knowledge to companies and academic institutions throughout the nation.

• Even for defense applications, supporting research on strategically motivated but broadly applicable computing and communications technology has clearly been the right approach. In the past, many defense applications and requirements presaged commercial applications and requirements.
Today, commercial computer systems and applications often find use in defense environments.

• Research and development take time. At least 10 years, more often 15, elapse between initial research and commercial success. This is true even for research of strategic importance. And it is true in spite of the rapid pace of today's product development, as indicated in the timeline for recent commercial successes such as RISC and windows (see Figure 1.2).
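The trick Gouraud invented, as described in Box 1.3, is simple enough to sketch in a few lines. This is a schematic one-dimensional illustration of the idea, not code from any of the systems discussed; the function names and values are ours:

```python
# Gouraud shading in one dimension: instead of painting a whole face a
# single flat shade, compute a shade at each vertex and interpolate
# linearly between them, so adjacent faces meet at exactly the same
# value and no demarcation line is visible.
def gouraud_scanline(shade_left, shade_right, width):
    """Linearly interpolated shades for `width` pixels across a span."""
    if width == 1:
        return [shade_left]
    step = (shade_right - shade_left) / (width - 1)
    return [shade_left + i * step for i in range(width)]

# Two adjacent faces that share a vertex shade of 0.5 join seamlessly:
left_face = gouraud_scanline(0.2, 0.5, 4)
right_face = gouraud_scanline(0.5, 0.8, 4)
assert abs(left_face[-1] - right_face[0]) < 1e-9  # same shade at the joint
```

Phong's refinement interpolates the surface normal rather than the shade itself, then evaluates the lighting per pixel, which is why it can produce highlights as if the surface were curved.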
BOX 1.4 FEDERAL R&D SUPPORT PROPELS DATABASE TECHNOLOGY

The database industry generated about $7 billion in revenue in 1994 and is growing at a rate of 35 percent per year. All database industry leaders are U.S.-based corporations: IBM, Oracle, Sybase, Informix, Computer Associates, and Microsoft. In addition, there are two large specialty vendors, both also U.S.-based: Tandem, selling over $1 billion per year of fault-tolerant transaction processing systems, and AT&T-Teradata, selling about $500 million per year of data mining systems. Beyond these well-established companies, there is a vibrant group of small companies specializing in application-specific databases, object-oriented databases, and desktop databases.

A very modest federal research investment, complemented by a modest industrial research investment, led directly to U.S. dominance of this market. It is not possible to list all the contributions here, but three representative research projects that had major impact on the industry are highlighted.

1. Project Ingres started at the University of California, Berkeley, in 1972. Inspired by Codd's landmark paper on relational databases, several faculty members (Stonebraker, Rowe, Wong, and others) started a project to design and build a system. Incidental to this work, they invented a query language (QUEL), relational optimization techniques, a language binding technique, and interesting storage strategies. They also pioneered work on distributed databases. The Ingres academic system formed the basis for the Ingres product now owned by Computer Associates. Students trained on Ingres went on to start or staff all the major database companies (AT&T, Britton Lee, HP, Informix, IBM, Oracle, Tandem, Sybase). The Ingres project went on to investigate distributed databases, database inference, active databases, and extensible databases.
It was rechristened Postgres, which is now the basis of the digital library and scientific database efforts within the University of California system. Recently, Postgres spun off to become the basis for a new object-relational system from the start-up Illustra Information Technologies.

2. System R was IBM Research's response to Codd's ideas. His relational model was at first very controversial; people thought that the model was too simplistic and that it would never perform well. IBM Research management took a gamble and chartered a small (10-person) effort to prototype a relational system based on Codd's ideas. That effort produced a prototype, System R, that eventually grew into the DB2 product series. Along the way, the IBM team pioneered ideas in query optimization, data independence (views), transactions (logging and locking), and security (the grant-revoke model). In addition, the SQL query language from System R was the basis for the ANSI/ISO standard. The System R group went on to investigate distributed databases (project R*) and object-oriented extensible databases (project Starburst). These research projects have pioneered new ideas and algorithms. The results appear in IBM's database products and those of other vendors.

3. The University of Wisconsin's Gamma system was a highly successful effort that prototyped a high-performance parallel database system built of off-the-shelf system components. During the 1970s there had been great enthusiasm for database machines, special-purpose computers that would be much faster than general-purpose systems running conventional databases. These research projects were often based on exotic hardware like bubble memories, head-per-track disks, or associative random access memory. The problem was that general-purpose systems were improving at a rate of 50 percent per year, and so it was difficult for exotic systems to compete with them.
By 1980, most researchers realized the futility of special-purpose approaches, and the database machine community switched to research on using arrays of general-purpose processors and disks to process data in parallel. The University of Wisconsin was home to the major proponents of this idea in the United States. Funded by government and industry, researchers prototyped and built a parallel database machine called Gamma, whose hardware base was an early Intel scalable parallel machine. That system produced ideas and a generation of students who went on to staff all the database vendors. Today, the highly successful parallel database systems from IBM, Tandem, Oracle, Informix, Sybase, and AT&T all have a direct lineage from the Wisconsin research on parallel database systems. The use of parallel database systems for data mining is now the fastest-growing component of the database server industry. The Gamma project evolved into the Exodus project at Wisconsin (focusing on an extensible object-oriented database). Exodus has now evolved into the Paradise system, which combines object-oriented and parallel database techniques to represent, store, and quickly process huge Earth-observing satellite databases.

SOURCE: James Gray and others for the Computing Research Association; reproduced with permission.
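The partitioned, general-purpose-hardware approach described above can be illustrated in miniature. This is a schematic sketch of the idea, not Gamma's actual code; all names and data are ours, and the per-partition step that here runs in a plain loop would run on a separate processor with its own disk:

```python
# Shared-nothing parallel query processing in miniature: rows are
# hash-partitioned across nodes, each node aggregates its own partition
# independently, and a final step merges the per-node partial results.
def hash_partition(rows, n_nodes, key):
    """Assign each row to one of `n_nodes` partitions by hashing `key`."""
    parts = [[] for _ in range(n_nodes)]
    for row in rows:
        parts[hash(row[key]) % n_nodes].append(row)
    return parts

def local_sum(rows, field):
    # On a real parallel database machine, this runs on the node that
    # owns the partition, scanning only its local disk.
    return sum(row[field] for row in rows)

def parallel_sum(rows, n_nodes, key, field):
    parts = hash_partition(rows, n_nodes, key)
    partials = [local_sum(p, field) for p in parts]  # conceptually parallel
    return sum(partials)

sales = [{"cust": "a", "amt": 10}, {"cust": "b", "amt": 20},
         {"cust": "c", "amt": 5}]
assert parallel_sum(sales, 4, "cust", "amt") == 35
```

Because each partition is scanned independently, adding nodes divides the work per node, which is what let arrays of commodity processors outrun the exotic database machines of the 1970s.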
CONTINUED FEDERAL INVESTMENT IS NECESSARY TO SUSTAIN OUR LEAD

What must be done to sustain the innovation and growth needed for enhancing the information infrastructure and maintaining U.S. leadership in information technology? Rapid and continuing change in the technology, a 10- to 15-year cycle from idea to commercial success, and successive waves of new companies are characteristics of the information industry that point to the need for a stable source of expertise and some room for a long-term approach. Three observations seem pertinent.

1. Industrial R&D cannot replace government investment in basic research. Very few companies are able to invest for a payoff that is 10 years away. Moreover, many advances are broad in their applicability and complex enough to take several engineering iterations to get right, and so the key insights become "public" and a single company cannot recoup the research investment. Public investment in research that creates a reservoir of new ideas and trained people is repaid many times over by jobs and taxes in the information industry, more innovation and productivity in other industries, and improvements in the daily lives of citizens. This investment is essential to maintain U.S. international competitiveness.

Of course, industrial R&D also contributes to the nation's pool of new ideas, but a company may postpone exploiting its ideas if they disturb existing business. A good example is the evolution of RISC processors shown in Figure 1.2. RISC was invented by John Cocke, an IBM researcher, but IBM made no RISC products for a decade. Only after the ideas were embraced and extended in government-sponsored work at universities did industry adopt them, and this adoption was initiated by young companies, including Sun Microsystems, and start-ups, including MIPS.
Now, a decade later, IBM is one of the leaders in exploiting RISC technology, but the cost to IBM of this delay has been significant. Firms have regularly failed to adapt to change, as evidenced by the departure from the computer business of GE, RCA, Honeywell, Philco, Perkin-Elmer, Control Data, and Prime; the folding together by merger of other manufacturers; and major downsizing at IBM and DEC. It often is easier for a start-up to form, raise venture capital, and succeed than for an established firm to abandon a currently successful direction in favor of a new approach just when the old approach is most financially successful. Even in a vigorous industrial R&D climate, then, federal investment in research is necessary, both for its long-term focus and for its ability to incubate ideas to the point of clear commercial viability. But the need for federal investment in research is further compounded by the fact that industrial R&D is already under stress. In the computing hardware and software sector, for example, although a small number of new R&D enterprises have been launched, most notably by Microsoft, there has been a general consolidation of resources by companies such as IBM and DEC, including an apparent reduction in their research effort or at least a greater emphasis on short-term R&D, a change in emphasis that is evident to insiders and close observers but not easy to document.5 The industry-wide level of R&D as a percentage of sales has also been brought down by the tendency of low-price vendors, such as Dell and Gateway, to ride on the research conducted by others. The trend toward reduced industrial R&D appears also in the telecommunications industry. The 1984 divestiture of AT&T led to a smaller Bell Laboratories and to the creation of Bell Communications Research (Bellcore), a shared research facility for the seven regional Bell holding companies. Recent deregulation has encouraged a reduction of basic research at both AT&T and Bellcore.
Lacking significant research capability at its individual service companies, the cable television industry depends on research done by its
hardware vendors and its shared CableLabs. Although more new technology has been deployed in telecommunications since deregulation in the early 1980s, and although in both computing and communications there are more companies selling products now than there were 15 years ago, today's sales are based on yesterday's research and do not guarantee a sufficient foundation for tomorrow's sales. Competition in an industry can promote technological growth, but competition alone is not the source of innovation and leadership. Because of the long time scales involved in research, the full effect of decreasing investment in research may not be evident for a decade, but by then, it may be too late to reverse an erosion of research capability. Thus, even though many private-sector organizations that have weighed in on one or more policy areas relating to the enhancement of information infrastructure typically argue for a minimal government role in commercialization, they tend to support a continuing federal presence in relevant basic research.6 2. It is hard to predict which new ideas and approaches will succeed. Over the years, federal support of computing and communications research in universities has helped make possible an environment for exploration and experimentation, leading to a broad range of diverse ideas from which the marketplace ultimately has selected winners and losers. As demonstrated by the unforeseen applications and results listed in Table 1.1, it is difficult to know in advance the outcome or final value of a particular line of inquiry. But the history of development in computing and communications suggests that innovation arises from a diversity of ideas and some freedom to take a long-range view. It is notoriously difficult to place a specific value on the generation of knowledge and experience, but such benefits are much broader than sales of specific systems. 3.
Research and development in information technology can make good use of equipment that is 10 years in advance of current "commodity" practice. When it is first used for research, such a piece of equipment is often a supercomputer. By the time that research makes its way to commercial use, computers of equal power are no longer expensive or rare. For example, the computer graphics techniques that are available on desktop workstations today, and that will soon be on personal computers and set-top boxes, were pioneered on the multimillion-dollar supercomputers of the 1960s and 1970s. Part of the task in information technology R&D is to find out how today's supercomputers can be used when they are cheap and widely available, and thus to feed the industries of tomorrow. The large-scale systems problems presented both by massive parallelism and by massive information infrastructure are additional distinguishing characteristics of information systems R&D, because they imply a need for scale in the research effort itself. In principle, collaborative efforts might help to overcome the problem of attaining critical mass and scale, yet history suggests that there are relatively few collaborations in basic research within any industry, and purely industrial (and increasingly industry-university or industry-government) collaborations tend to disseminate results more slowly than university-based research. The government-supported research program (on the order of $1 billion for information technology R&D) is small compared to industrial R&D (on the order of $20 billion; Coy, 1993), but it constitutes a significant portion of the research component, and it is a critical factor because it supports the exploratory work that is difficult for industry to afford, allows the pursuit of ideas that may lead to success in unexpected ways, and nourishes the industry of the future, creating jobs and benefits for ourselves and our children. 
The industrial R&D investment, though larger in dollars, is different in nature: it focuses on the near term (increasingly so, as noted earlier) and is thus vulnerable to major opportunity costs. The increasing tendency to focus on the near term is
affecting the body of the nation's overall R&D. Despite economic studies showing that the United States leads the world in reaping benefits from basic research, pressures in all sectors appear to be promoting a shift in universities toward near-term efforts, resulting in a decline in basic research even as a share of university research (Cohen and Noll, 1994). Thus, a general reduction in support for basic research appears to be taking place. It is critical to understand that there are dramatic new opportunities that still can be developed by fundamental research in information technology, opportunities on which the nation must capitalize. These include high-performance systems and applications for science and engineering; high-confidence systems for applications such as health care, law enforcement, and finance; building blocks for global-scale information utilities (e.g., electronic payment); interactive environments for applications ranging from telemedicine to entertainment; improved user interfaces to allow the creation and use of ever more sophisticated applications by ever broader cross sections of the population; and the creation of the human capital on which the next generation's information industries will be based. Fundamental research in computing and communications is the key to unlocking the potential of these new applications. How much federal research support is proper for the foreseeable future, and to what aspects of information technology should it be devoted?7 Answering this question is part of a larger process of considering how to reorient overall federal spending on R&D from a context dominated by national security to one driven more by other economic and social goals. It is harder to achieve the kind of consensus needed to sustain federal research programs associated with these goals than it was under the national security aegis.
Nevertheless, the fundamental rationale for federal programs remains (Cohen and Noll, 1994, p. 73): That R&D can enhance the nation's economic welfare is not, by itself, sufficient reason to justify a prominent role for the federal government in financing it. Economists have developed a further rationale for government subsidies. Their consensus is that most of the benefits of innovation accrue not to innovators but to consumers through products that are better or less expensive, or both. Because the benefits of technological progress are broadly shared, innovators lack the financial incentive to improve technologies as much as is socially desirable. Therefore, the government can improve the performance of the economy by adopting policies that facilitate and increase investments in research.
TODAY THE HPCCI IS THE UMBRELLA FOR MOST GOVERNMENT-SPONSORED COMPUTING AND COMMUNICATIONS RESEARCH
Today, the HPCCI is the umbrella sheltering most government-sponsored computing and communications research. The HPCCI is thus responsible for sustaining the nation's lead in information technology. It centers on rising performance as the driver for progress across a wide range of technologies. "High performance" means bringing a large quantity of computing and communications to bear on one problem. It is far broader than "supercomputing," which was the focus of early public policy in this area. It is also a moving target: the threshold for what is considered "high performance" advances as ever-increasing performance levels become more broadly available. The supercomputer of this generation is the group server of the next generation and the personal computer of the generation after that. The same is true for communications: today's leading edge is tomorrow's mainstream. Focusing research on the leading edge of performance accelerates the broad availability of what starts out as limited-access technology, in the following ways:
• By advancing the hardware and software systems themselves. Many techniques now used to build mainstream computers and their software were originally developed for high-performance computing: specialized floating-point processing, pipelining, multiprocessing, and multiple instruction issue, among others;
• By creating new applications today so that they will be mature when the hardware that can run them is cheap and ubiquitous. It can take much longer to develop and fully exploit a new application than to build a new computer. Overcoming this lag is one of the drivers of work on the Grand and National Challenge concepts; and
• By attacking problems that would otherwise be beyond reach for several years, thus speeding up the development of new fields of science, engineering, medicine, and business. National access to machines with 100 to 1,000 times the memory and speed of researchers' desktop machines allows them to make qualitative jumps to exploring frontier research problems of higher dimensionality, greater resolution, or more complexity than would otherwise be possible.
Fundamental but strategic research under the HPCCI, which now encompasses most of the academic computing research sponsored by the federal government, creates a strong pull on the computer science and engineering research community, the user community, and the hardware, software, and telecommunications vendors. For example, it was evident to individuals in the computing and communications research community that as VLSI circuit technology developed, it would favor computing structures based on the large-scale replication of modest processors, as opposed to the small-scale replication of processors of the highest attainable individual performance. (The highest-speed circuits are expensive to design, produce, maintain, and operate.)
This vision of high-performance computing and communications based on parallelism brought two major technical challenges: (1) interconnection and memory architecture, that is, how to unite large numbers of relatively inexpensive processors into systems capable of delivering truly high performance; and (2) programming, that is, how to program such collections of processors to solve large and complex problems. In the networking arena, the obvious issues of large-scale, widespread use and high-speed transmission were compounded by the added problems of connecting heterogeneous systems and achieving high reliability. The technical and economic imperatives that led to the HPCCI are discussed in some detail in Appendix A. The HPCCI was, and continues to be, an appropriate thrust. As Chapter 2 discusses, the HPCCI can boast a broad range of very significant accomplishments. However, the committee sees an unhealthy dependence of our nation's critical leadership in information technology on the fate of a single initiative. First, not all important computing research is focused on high performance, although the politics of funding have encouraged researchers and agencies to paint everything with an HPCCI brush. Second, we cannot afford to cripple computing research if the nation's attention and resources turn away from any single goal. At the beginning of the HPCCI in 1992, its increasing momentum made association with it attractive and helped the initiative attain both intellectual and financial critical mass. The rise of the National Information Infrastructure initiative, though, underscores how changeable the federal funding process may be. We must move toward a more mature approach in which a substantial focus on goals of obvious national importance is combined with a diversified program of long-term exploration of important research problems that support the strategic information technology industry. We can change the HPCCI's name, we can change its orientation, but we must move forward.
Continuing the momentum of this successful initiative is essential to ongoing U.S. prosperity and leadership in information technology.
NOTES
1. Microcomputers (personal computers) are defined as computers with a list price of $1,000 to $14,999; see CBEMA (1994), pp. 60-61. Forrester (1994, pp. 2-3) estimates the share of households with PCs at about 20 percent, based on its survey of households and Bureau of the Census data. Forrester's model accounts for retirements of older PCs and for households with multiple PCs. This is a lower estimate than the Software Publishing Association's widely cited 30 percent share. Building on an unusual fourth-quarter sales surge, almost 7 million PCs were sold for residential use in 1994 (Markoff, 1995). According to another source, fourth-quarter 1994 U.S. PC shipments, at 5.8 million units, were 32 percent higher than in the corresponding quarter of 1993, feeding a 27 percent surge in worldwide PC shipments for 1994 in total (Carlton, 1994).
2. According to Roach (1994, p. 12), "IT [information technology] expenditures now comprise fully 45 percent of total business outlays on capital equipment, easily the largest line item in Corporate America's investment budget and up dramatically from a 20% share seen as recently as 1980 .... But does the technology story have staying power? We believe that the new dynamics of technology demand should continue to power the U.S. economy throughout the 1990s. Indeed, the technology-capital-spending link is an integral element of the productivity-led recovery scenario that lies at the heart of our basic macro call for the United States [emphasis in original].
In this light, IT is the principal means by which businesses can improve upon the efficiency gains first derived from cost-cutting, facilitating the transition between slash-and-burn downsizing and the rebuilding eventually required for sustained competitive prowess. Without increasing emphasis on technology, the economy gets hollowed out. The good news is that the long-awaited technology payback suggests this darker scenario won't come to pass." 3. See U.S. DOC (1994); the Department of Commerce utilizes data from the U.S. Bureau of the Census series, the Annual Survey of Manufactures. It places the value of shipments for the information technology industry at $421 billion for 1993. This number omits revenue from equipment rentals, fees for after-sale service, and markups in the product distribution channel. It also excludes office equipment in total. It includes computers, storage devices, terminals, and peripherals; packaged software; computer program manufacturing, data processing, information services, facilities management, and other services; and telecommunications equipment and services. See also CBEMA (1994); CBEMA values the worldwide 1993 revenue of the U.S. information technology industry at $602 billion. In addition to including office equipment, it shows larger revenues for information technology hardware and telecommunications equipment than does the Department of Commerce. 4. In addition, European and Japanese manufacturers have significant sales in computer-related products such as telecommunications switches, semiconductors, and even high-performance computing equipment. 5. See Rensberger (1994) and Corcoran (1994). Also Coy (1993) indicates that data on individual information technology companies show 1992 R&D spending as a percent of sales ranging from 1 to 15 percent, concentrated at the lower end of that range; industry segment statistics fall in the same range. 
The figures suggest that despite the "high-tech" image of the industry, less R&D is conducted than many believe, and many firms capitalize on research conducted by others. 6. See, for example, CSPP (1994), pp. 1-2. A broad argument for a federal role in support of basic research in critical technologies, including computing and communications, is presented in a Council on Competitiveness (1991) report. 7. A point of reference can be found in a mid-1994 document from the Electronics Subcommittee of the National Science and Technology Council (NSTC, 1994a). It calls attention to U.S. dependence on a healthy electronics industry and speaks of efforts to work with industry to "develop a roadmap for electronics that will illuminate gaps in government-sponsored research and infrastructure efforts," focusing on "information products that connect to information networks, including the National Information Infrastructure (NII)."