Page 315

39
The National Information Infrastructure: A High-Performance Computing and Communications Perspective

Randy H. Katz, University of California at Berkeley
William L. Scherlis, Carnegie Mellon University
Stephen L. Squires, Advanced Research Projects Agency

Abstract

Information infrastructure involves more than the building of faster and larger computers and the creation of faster and better communications networks. The essence of information infrastructure is a diverse array of high-level information services, provided in an environment of pervasive computing and computer communications. These services enable users to locate, manage, and share information of all kinds, conduct commerce, and automate a wide range of business and governmental processes. Key to this is a broad array of rapidly evolving commonalities, such as protocols, architectural interfaces, and benchmark suites. These commonalities may be codified as standards or, more likely, manifest as generally accepted convention in the marketplace.

Information technology has become essential in sectors such as health care, education, design and manufacturing, financial services, and government service, but there are barriers to further exploitation of information technology. Pervasive adoption of specific service capabilities, which elevates those capabilities from mere value-added services to infrastructural elements, is possible only when value can be delivered with acceptable technological and commercial risk, and with an evolutionary path rapidly responsive to technological innovation and changing needs. Private- and public-sector investment in national information infrastructure (NII) is enabling increased sectorwide exploitation of information technologies in these national applications areas. Although the private sector must lead in the building and operation of the information infrastructure, government must remain a principal catalyst of its creation, adoption, and evolution.

This paper explores the barriers to achieving NII and suggests appropriate roles for government to play in fostering an NII that can be pervasively adopted. The main locus of government activity in research and early-stage technology development is the federal High Performance Computing and Communications (HPCC) program. This program is evolving under the leadership of the National Science and Technology Council's Committee on Information and Communications.

Introduction

Information technologies are broadly employed in nearly all sectors of the economy, and with remarkable impact. Nonetheless, there are still enormous unrealized benefits to be obtained from effective application of information technology, particularly the intertwining of multiple distributed computing applications into national-scale infrastructural systems. In many sectors, including health care, education and training, crisis management, environmental monitoring, government information delivery, and design and manufacturing, the benefits would have profound significance to all citizens (as suggested in IIS, 1992, and Kahin, 1993). These sectors of information technology application have been called national challenge (NC) applications.

The pervasiveness and national role of the NC applications prevent them from developing dependency on new technologies, even when those technologies offer important new capabilities. This is so unless the risks and costs are manageable, and there is a clear trajectory for growth in capability and scale,



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.





Page 316

and that growth is responsive to new technologies and emerging needs. For this reason, while computing and communications technologies have taken hold in numerous specific application areas within these sectors, in most cases the challenge remains for advanced information technology to take on a significant sectorwide infrastructural role.

The NII, in the simplest terms, consists of available, usable, and interoperable computing and communications systems, built on underlying communications channels (the bitways) and providing a broad range of advanced information technology capabilities (the services). These services provide the basis for a wide range of use (the applications), ranging in scale up to national challenge applications. A key point is that NII is far more than communications connectivity; indeed it is generally independent of how communications connectivity is supplied.

Generally speaking, infrastructural systems consist of ubiquitous shared resources that industry, government, and individuals can depend on to enable more productive and efficient activity, with broadly distributed benefit (Box 1). The resources can include physical assets, such as the national air-traffic control system or the nationwide highway system. The resources can also include key national standards, such as the electric power standards, trucking safety standards, railroad track structure, and water purity standards. The resources are ubiquitous and reliable to an extent that all participants can commit to long-term investment dependent on these resources. This also implies a capacity for growth in scale and capability, to enable exploitation of new technologies and to assure continued value and dependability for users. The value can be in the form of increased quality and efficiency, as well as new opportunities for services.

It is therefore clear that a critical element of NII development is the fostering of appropriate commonalities, with the goal of achieving broad adoptability while promoting efficient competition and technological evolution. Commonalities include standard or conventional interfaces, protocols, reference architectures, and common building blocks from which applications can be constructed to deliver information services to end users. A fundamental issue is management and evolution, and in this regard other examples of national infrastructure reveal a wide range of approaches, ranging from full government ownership and control to private-sector management, with government participation limited to assurance of standard setting.

The Clinton Administration, under the leadership of Vice President Gore, has made national information infrastructure a priority (Gore, 1991; Clinton and Gore, 1993), as have other nations (examples: NCBS, 1992, and Motiwalla et al., 1993). The NII vision embraces computing and communications, obviously areas of considerable private investment and rapid technological change. The definition of the government role in this context has been ongoing, but several elements are clear. At the national policy level, the NII agenda embraces information and telecommunications policy, issues of privacy and rights of access, stimulation of new technologies and standards, and early involvement as a user (IITF, 1993). The federal High Performance Computing and Communications (HPCC) program, and the advanced information technologies being developed within it, play a key role in addressing the research and technical challenges of the NII.

In this paper we examine several aspects of the conceptual and technological challenge of creating information infrastructure technologies and bringing them to fruition in the form of an NII built by industry and ubiquitously adopted in the NC applications sectors. Topics related to telecommunications policy, intellectual property policy, and other aspects of information policy are beyond our scope. In the first section below, we examine the federal government's role in fostering NII technologies and architecture. We then analyze the relationship between high-performance technologies and the NII and describe our three-layer NII vision of applications, services, and bitways. This vision is expanded in the next two sections, in which the NC applications and the technologies and architectural elements of the services layer are discussed. We present the research agenda of the federal High Performance Computing and Communications program in the area of information infrastructure technologies and applications, followed by our conclusions.

Page 317

Role of the Federal Government in Information Infrastructure

New technologies are required to achieve the NII vision of ubiquitous and reliable high-level information services. Many of the envisioned NII services place huge demands on underlying computing and communications capabilities, and considerable energy is being applied in industry, government, and research to creating these new capabilities. But there is more to the NII than making computers faster, smarter, and more widely connected together. Creation of national infrastructure entails delivery of services with sufficient reliability, ubiquity, and freedom from risk that they can be adopted sectorwide in national applications. The challenge to achieve this is considerable in any infrastructural domain and particularly difficult in information infrastructure. These goals usually involve rigorous standards and stability of technology, which appear all but precluded by the extremely rapid evolution in every dimension of information technology.

In the development of other kinds of national infrastructure, government has had a crucial catalytic role in fostering the broad collaboration and consensus-building needed to achieve these goals, even when industry has held the primary investment role in creating the needed technologies and standards. In the case of national information infrastructure, it is manifestly clear that it should not, and indeed cannot, be created and owned by the government. But the catalyzing role of government is nonetheless essential to bring the NII to realization.

The government has an enormous stake in the NII as a consequence of its stake in the national challenge applications. Information infrastructure technologies play a critical role in the federal government's own plan to reengineer its work processes (Gore, 1993). Vice President Gore draws an analogy between the NII and the first use of telegraphy:

Basically, Morse's telegraph was a federal demonstration project. Congress funded the first telegraph link between Washington and Baltimore. Afterwards, though, after the first amazing transmission, most nations treated the telegraph and eventually telephone service as a government enterprise. That's actually what Morse wanted, too. He suggested that Congress build a national system. Congress said no. They argued that he should find private investors. This Morse and other companies did. And in the view of most historians, that was a source of competitive advantage for the United States.

Government fostered the technology through initial demonstrations and encouragement of private investment. But the U.S. telecommunications infrastructure has been built with private funds. And analogously, the NII implementation must be a cooperative effort among private- and public-sector organizations.

What are the specific roles for government? Addressing this question requires understanding how the NII differs from other major federal research and development efforts. The following characteristics summarize the differences:

• The scale of the NII is so huge that government investment, if it is to have an impact, must be designed to catalyze and stimulate investment from other sources rather than subsidize creation of the NII itself. The NII will emerge as an aggregation of many distinct entities that compete to provide products and services. Of course, rudimentary elements of the NII not only are in place but also constitute a major sector of the economy.

• The NII will provide long-term infrastructural support for applications of national importance, such as health care and education. Decisionmakers in these application sectors cannot put new technologies in a pivotal role, even when they offer important new capabilities, unless the risks and costs of adoption are manageable. Adoption risk issues include, for example, scale-up, ability to evolve gracefully, competition and mobility among suppliers, and commitment to internal systems interfaces.

Page 318

• The NII will support applications by delivering common services well above the level of simple access to telecommunications. These services can include, for example, mechanisms to protect intellectual property and user privacy, support for information management and search, and support for managing multimedia objects. The high level of services will continue to evolve in capability and power, but there is nonetheless an essential requirement to achieve elements of commonality in the system interfaces through which these services can be delivered by NII providers to application users. (This issue is elaborated on below.)

• The NII depends on advanced computing and computer communications technologies whose evolution is, in large measure, the result of continued government research investment. Government basic research investment continues to be a primary source of the ideas and innovations that stimulate U.S. industry, sustain a high level of competitiveness in the market, and provide a national competitive advantage.

These considerations yield a four-pronged strategy for government investment in research and development related to the NII:

• Research and new technology creation;
• Interoperability, commonality, and architectural design;
• Application demonstration and validation; and
• Aggressive early use.

The first of these elements, research, is clear. Government has a traditional role as farsighted investor in long-term, high-risk research to create new concepts and technologies whose benefits may be broadly distributed. In the case of the NII, the government needs to invest both in problem-solving research, to fulfill the promise of today's vision, and in exploratory research to create new visions for tomorrow. Government investment in research and development can support the rapid and continual transition of new NII capabilities into commercialization and adoption. Basic research can yield paradigmatic improvements with marketwide benefits. Intensive discussions among leaders from academia, industry, and government have been under way to develop a view of the technical research and development challenges of the NII (Vernon et al., 1994).

The second element involves stimulating commonalities within the NII that can achieve economies of scale while simultaneously creating a foundation for a competitive supply of services. Interface and protocol commonalities foster conditions under which the risks of entry for both users and creators of technology are reduced. We use the term commonality because it is more inclusive than the conventional notion of standards. It covers routine development of benchmarks, criteria, and measures to facilitate making choices among competing offerings. It also encompasses the definition of common systems architectures and interfaces to better define areas for diversity and differentiation among competing offerings. Common architectural elements help both developers and users decouple design decisions. Of course, inappropriate standards can inhibit innovation or predispose the market to particular technological approaches. A critical issue for the NII is the speed of convergence to new conventions and standards. In addition, conventions and standards must themselves enable rapid evolution and effective response to new technology opportunities. These are familiar issues in the realm of conventionalization and standards generally, but they are also among the most fundamental considerations in achieving new high-level NII services, and they are in need of specific attention.

Demonstration, the third element, involves government sponsorship of testbeds to explore scalability and give early validation to new technology concepts. Testbeds can span the range from basic technologies coupled together using ad hoc mechanisms, to large-scale integration projects that demonstrate the utility of services for applications in a pilot mode. These latter integration experiments can bootstrap full-scale deployments in applications areas.

Page 319

BOX 1 Information Infrastructure: Shared Resources

The analogy between information infrastructure and the interstate highway system has helped bring the concept of the NII into the popular consciousness. The analogy is apt, and not only because of the role Vice President Gore's father played in drafting the legislation that led to the interstate highway system. The fundamental commercial utility of the highway system (and the railroads) is the substitution of transportation for local production, enabling new economies of scale and resource sharing in many industries. Economies of scale in manufacturing industries can be such that the cost of large-scale remote manufacture combined with transport can be significantly less than the cost of local production. The highways are infrastructural in that they make entire industries more efficient, resulting in better value for the customer and expanded markets for producers. It is this value that justifies the public investment in the infrastructure.

More importantly, it explains why public investment and policy support are necessary stimuli for infrastructure development: because shared infrastructure provides advantage to all who use it, there is no particular competitive incentive for specific infrastructure users (producers or consumers) to invest directly in its creation. On the other hand, expanded markets benefit all users, and so user investment in infrastructure is justified if it is distributed equitably. For example, public highways may receive their initial funding from bond issues combined with direct taxpayer support, with operating costs and loan service costs funded by users through tolls and licensing.

The highway system infrastructure is sustained through a continually evolving set of interrelated infrastructure elements, such as roadbed engineering standards, speed limits, roadside amenities, interstate trucking regulations, tolls, and routing architecture. National-scale users are able to rely on this structure and thus commit the success of their enterprises to its continued existence without expanding their risks.

Information infrastructure development involves an analogous set of elements. Network protocols, application interfaces, interoperability standards, and the like define the class of mechanisms through which value is delivered. Value comes from high-level services such as electronic mail, information services, remote access, and electronic commerce, all supporting a rich variety of information objects. Reliance and commitment from national-scale users depend on breadth and uniformity of access, common architectural elements, interoperability, and a reasonably predictable and manageable evolution. The importance of common architectural elements to infrastructural utility must not be understated. Rail-gauge standardization is a canonical example. But commitment to common architectural elements must also include commitment to a process for evolving them. Achieving the right balance is a principal challenge to creating an adoptable national information infrastructure.

Finally, acting in the interest of government applications, the government can take a proactive role as a consumer of NII technologies to stimulate its suppliers to respond effectively in delivering information infrastructure that supports government applications. Possible government applications include systems for government information, crisis response, and environmental monitoring. The gigabit testbeds in the HPCC program offer a model for research partnerships among government, industry, and academe and represent a resource on which to build prototype implementations for national applications. Each testbed is cost-shared between government and the private sector and embraces the computer and telecommunications industries, university research groups, national laboratories, and application developers.
The key function of the testbeds is to experiment with new networking technology and address interoperability and commonality concerns as early as possible.

Relationship Between High-Performance Technologies and the NII

The federal HPCC program supports the research, development, pilot demonstration, and early evaluation of high-performance technologies. HPCC's focus in its initial years was on the grand

Page 320

challenges of science and engineering, with a strategy of developing a base of hardware and software technologies that can scale up to large-scale processing systems, out to wide-area distributed systems, and down to capable yet portable systems (FCCSET, 1994; CIC, 1994). These scalable technologies will contribute strongly to the NII, as will the legacy of cooperation among government, industry, and academia. These can greatly accelerate the establishment of an evolvable information infrastructure architecture, with testbed development, protocol and architecture design, interoperability experiments, and benchmarking and validation experiments. This legacy has helped facilitate adoption of HPCC-fostered technologies by independent users by significantly reducing their costs and risks of adoption.

This twofold HPCC stimulus, of research and cooperation, combines with a program emphasis on demonstration, validation, and experimental application to create a framework for government technology investment in the NII. For this reason, HPCC was expanded in FY 1994 to include a new major program component, Information Infrastructure Technology and Applications (IITA), focusing directly on creation of a universally accessible NII, along with its application to prototype NC applications. (These activities are described in more detail in the section below titled "The Federal HPCC Program and the NII.")

Each of the other HPCC program activities contributes to IITA. For example, emerging large-scale information servers designed to provide information infrastructure services are based on HPCC-developed, high-performance systems architectures, including architectures based on use of advanced systems software to link distributed configurations of smaller systems into scalable server configurations. The microprocessors used in these large-scale systems are the same as those found in relatively inexpensive desktop machines. High-performance networking technologies, such as communications network switches, are increasingly influenced by processor interconnection technologies from HPCC. Networking technologies are also being extended to a broad range of wireless and broadcast modalities, enhancing mobility and the extent of personal access. Included in this effort are protocols and conventions for handling multimedia and other kinds of structured information objects.

The NII can be viewed as built on a distributed computing system of vast scale and heterogeneity of an unprecedented degree. HPCC software for operating systems and distributed computing is enhancing the interoperability of computers and networks as well as the range of information services. The software effort in the HPCC program is leading to object management systems, methodologies for software development based on assembly of components, techniques for high-assurance software, and improvements to programming languages. These efforts will contribute to the development and evolution of applications software built on the substrate of NII services.

Three-Layer National Information Infrastructure Architecture

Within the HPCC community, a much-discussed conceptual architecture for the national information infrastructure has three major interconnected layers: national challenge applications, supported by diverse and interdependent NII communication and computation services, built on heterogeneous and ubiquitous NII bitways (see Figure 1). Each layer sustains a diverse set of technologies and involves a broad base of researchers and technology suppliers, yielding a continuously improving capability for users over time. By delivering utility to clients in the layers above through common mechanisms or protocols, a rapid rate of evolution of capability can be sustained in a competitive environment involving diverse suppliers. Thus, developments in each of these layers focus both on stimulating the creation of new technologies and on determining the common mechanisms or protocols (the commonality) through which that capability can be delivered. For example:

• The keys to scaling up in national challenge applications are often in the choice of common application-specific protocols. For example, manufacturing applications require shared representations for product and process descriptions to support widespread interoperability among design systems and tools.

Page 321

• Services such as multimedia multicast can be provided to developers of application capabilities through proper adherence to common protocols. With well-designed protocols and interfaces, rapid growth in multimedia capability and capacity can be delivered to end users and applications developers without requiring major reengineering of whole applications-level systems. Services are also interdependent and themselves evolve in this manner.

• The diverse bitways technologies deliver communications capability in a uniform manner through use of standard protocols, such as SONET/ATM. This has the effect of insulating developers of NII services from the details of the rapidly evolving communications technologies used to deliver information capabilities to end users and applications.

This architecture addresses directly the challenge of scale-up in capability, size, and complexity within each of the three layers. Ongoing validation of concepts can be achieved, in each layer, through large-scale testbed experimentation and demonstration conducted jointly with industry, users, and suppliers of new technologies and information capabilities. If the evolution of the NII architecture proceeds as envisioned, the result will be the integration of new capabilities and increased affordability in the national challenge applications.

Each layer supports a wide range of uses beyond those identified for the specific national challenge applications. For example, generalized NII service and bitway technologies can also support applications on a very small scale, extensions of existing services, ad hoc distributed computing, and so on. The national challenge applications are described in more detail in the next section, and the issues addressed by the services layer are discussed in the succeeding section titled "Services." Bitways technologies are well covered in other sources, such as Realizing the Information Future (CSTB, 1994), and are not discussed here.

Figure 1. A model three-layer architecture for the NII. Bitways provide the communications substrate, the applications layer supports the implementation of the NCs, and the services layer provides the bridge between communications and information. (a) Without a common services layer, each NC sector might reimplement similar application-enabling services from scratch, yielding overly expensive stovepipe systems. (b) A common services layer coupled to toolkits for building applications.
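The layering sketched in Figure 1 can be made concrete in code. The sketch below is purely illustrative; every class and function name in it is our own invention, not part of any NII specification. It shows the design principle the paper describes: an application written only against a stable services interface is insulated from the bitway beneath it, so the communications substrate can be swapped without reengineering the layers above.

```python
# Illustrative sketch of the three-layer model: applications, services, bitways.
# All names here are hypothetical; the paper defines no concrete APIs.
from abc import ABC, abstractmethod


class Bitway(ABC):
    """Bitways layer: a communications substrate that moves raw bytes."""

    @abstractmethod
    def send(self, destination: str, payload: bytes) -> str:
        """Deliver the payload; return a transcript line for illustration."""


class SonetAtmBitway(Bitway):
    """One possible substrate (e.g., SONET/ATM)."""

    def send(self, destination: str, payload: bytes) -> str:
        return f"[SONET/ATM] {len(payload)} bytes -> {destination}"


class WirelessBitway(Bitway):
    """Another substrate; applications never see the difference."""

    def send(self, destination: str, payload: bytes) -> str:
        return f"[wireless] {len(payload)} bytes -> {destination}"


class MessagingService:
    """Services layer: adds value (named, text-level delivery) above the
    bitways and exposes one stable interface to applications."""

    def __init__(self, bitway: Bitway) -> None:
        self.bitway = bitway

    def deliver(self, user: str, text: str) -> str:
        return self.bitway.send(user, text.encode("utf-8"))


def health_records_app(service: MessagingService) -> str:
    """Applications layer: written only against the service interface,
    with no knowledge of which bitway carries the traffic."""
    return service.deliver("clinic-42", "patient record update")


# Swapping the bitway changes nothing in the service or the application.
print(health_records_app(MessagingService(SonetAtmBitway())))
print(health_records_app(MessagingService(WirelessBitway())))
```

The point of the sketch is the dependency direction: the application depends only on `MessagingService`, which depends only on the abstract `Bitway`, mirroring how common protocols and interfaces at each layer boundary let technologies below evolve independently of the clients above.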

OCR for page 315
Page 322 National Challenge Applications Numerous groups have developed lists of critical applications, characterized by the potential for a pervasive impact on American society and exploitation of extensive communications and information processing capabilities. For example, in 1993 the Computer Systems Policy Project identified design and manufacturing, education and training, and health care as the national challenges (CSPP, 1993). A more exhaustive list has been developed by the Information Infrastructure Task Force, representing the union of much of what has been proposed (IITF, 1994). Among these NC applications are the following: • Crisis Management: Crisis management systems exploit information technology to ensure national and economic security through various kinds of crises. This is accomplished by providing timely data collection and intelligence fusion, advanced planning tools, rapid communications with defense forces spread around the globe, and a command and control ability to respond quickly to crises. The same basic capabilities can be deployed on a smaller scale to respond to local emergencies, such as devastating hurricanes, earthquakes, or fires. • Design and Manufacture:These systems integrate engineering design with product manufacturing, to reduce the time to create new products, to lower production costs, and to increase product quality. In a wider sense, a pervasive design and manufacturing system should couple suppliers to their customers throughout the production chain. Goals are more responsive product design, manufacture, and just-in-time warehousing and product delivery. • Education and Training: These systems provide access to online instructional and research materials, anywhere and anytime, as well as more direct communication among students and educators. Once created and made accessible, instructional materials may be reused and evolved by instructors around the country. 
For example, educational use of the information infrastructure can enable distance learning, where students in remote locations can gain access to specialized instruction. Training could exploit simulation coupled with remote access to actual apparatus.

• Environmental Monitoring: These systems integrate data from ground, airborne, and space-based sensors to monitor (and potentially respond to) environmental changes. They may be used to discover a nuclear accident in progress or oncoming climatic effects such as smog conditions, or be exploited for longer-term studies such as climate change.

• Government Information Delivery: Citizens have a right to ready, low-cost access to government information that they have already paid for, including economic statistics, trade information, environmental and land use information, and uniform one-stop shopping for government services such as veterans' and social security benefits.

• Health Care: These systems use information technologies to improve the delivery of health care by providing ready access to patient records, remote access to medical expertise, support for collaborative consultations among health care providers, and rapid, paperless claims adjustment that can help reduce health care costs.

Two additional applications sit at the interface of the national challenges and the underlying service layer: digital libraries and electronic commerce. In a sense, these are fundamental enablers for information access and electronic exchange of value and will be used extensively by virtually all of the other NC applications described above.

• Digital Libraries: A digital library is a knowledge center without walls, accessible from anywhere through networked communications. These systems are leading to significant advances in the generation, storage, and use of digital information of diverse kinds.
Underlying services and technologies include advanced mass storage, online capture of multimedia data, intelligent information location and
filtering, knowledge navigation, effective human interfaces, system integration, and prototype and technology demonstration.

• Electronic Commerce: Electronic commerce integrates communications, data management, and security services to allow business applications within different organizations to interchange information automatically. Communications services transfer the information from the originator to the recipient. Data management services define the interchange format of the information. Security services authenticate the source of information, verify the integrity of the information received by the recipient, prevent disclosure of the information to unauthorized users, and verify that the information was received by the intended recipient. Electronic commerce applies and integrates these infrastructure services to support business and commercial applications, including financial transactions such as electronic bidding, ordering, and payments, and exchange of digital product specifications and design data.

In each of these applications there is an unmet challenge of scale: How can the service be made ubiquitously available with steadily increasing levels of capability and performance? The applications communities depend on information technology for solutions but face scaling barriers; hence the NII goal of crossing the threshold of ubiquity. In the absence of common architectural elements, such as interfaces, methods, and modules, it may be possible to demonstrate prototype solutions to specific applications problems through monolithic stovepipes. But these solutions may not provide any means to cross this threshold of pervasiveness and dependability.

Services Overview

As we have noted, information infrastructure is more than bandwidth, switching, and ubiquitous communications access. It is (1) the common service environment in which NC applications are built.
All applications share generic service needs: human interfaces (e.g., graphical user interaction, speech recognition, data visualization), application building blocks (e.g., planning subsystems, imaging subsystems), data and process management (e.g., search and retrieval, hyperlink management, action sequencing), and communications (e.g., IPC, mobile computation). Also, the engineering of applications requires (2) tools in the form of development environments, toolkits, operational protocols, and data exchange and action invocation standards from which service solutions can be combined, integrated, and reused. Finally, the engineering of applications becomes more efficient (as is already occurring for shrink-wrap software running on personal computers) in the presence of (3) a marketplace of reusable subsystems; in this manner, applications systems can be assembled from competitively acquired subsystems rather than built directly from the raw material of lines of code.

We elaborate briefly on some of the elements of the common service environment:

• Tools, Libraries, and Databases: There already exist major, complex software systems that implement portions of the national challenge applications. For example, large collections of computer-aided design (CAD) software are already used extensively in engineering design domains. Similarly, relational and object-oriented database management systems provide extensive capabilities for structured data storage, indexing, and management. Diverse sets of software tools and subsystems can be integrated into coherent applications development environments to form the development base with which to assemble the national challenge applications. Similarly, diverse libraries of program components and databases of data elements can be composed and integrated into the development environment.
• Composition and Integration Frameworks: Toolkits already exist in certain specific domains to assist in the composition and integration of tools, libraries, and databases. For example, the CAD
Framework Initiative (CFI) accomplishes this by providing interface specifications for tool-to-tool communications and tool-to-database communications. In addition, the CFI has developed prototype implementations of these capabilities. These can form the basis of value-added and commercially supported packages and software toolsets. Commercial vendors of applications software for desktop computers are developing a variety of frameworks (such as CORBA, OLE, OpenDoc, and others) for integration of software applications. Users expect that commercial pressures will eventually result in some degree of integration of these various frameworks. This issue of multiple standards is discussed further below.

• Building Block Object Sets: The commonality that characterizes many of the service needs of the national challenge applications naturally allows an evolving shared market of software objects (that is, actions, operations, and protocols as well as data structures) to emerge that can be reused across multiple application development efforts. For example, a schedule object, which provides operations for allocating limited resources to critical tasks, could be used as a component of several different applications.

• Application Customized Objects: Leveraging the evolving building block object sets, we expect the objects from which the applications are implemented to be customized and extended for the application at hand. For example, though there is much in common in terms of command, control, communications, and intelligence (C3I) between an intensive care unit and an environmental spill, we would expect the details of sensor integration, strategies for alerting, and demands for real-time response to be somewhat different. The elements of the underlying object base will need customization for their use in specific national challenge applications. The degree of commonality across applications, which we hope is large, remains to be discovered.
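The schedule object mentioned above can be made concrete with a short sketch. The Task and Schedule classes and the greedy allocation policy below are illustrative assumptions, not part of the original proposal; the point is only that one building block object can be reused, with customization, by several NC applications.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    demand: int    # units of the limited resource this task needs
    priority: int  # higher value = more critical

@dataclass
class Schedule:
    """A reusable 'schedule object': allocates a limited resource to critical tasks."""
    capacity: int
    allocations: dict = field(default_factory=dict)

    def allocate(self, tasks):
        # Greedy policy (an assumption): fund the most critical tasks first.
        for task in sorted(tasks, key=lambda t: t.priority, reverse=True):
            if task.demand <= self.capacity:
                self.allocations[task.name] = task.demand
                self.capacity -= task.demand
        return self.allocations

# The same building block could serve a crisis-management system
# (ambulances) or a manufacturing system (machine hours):
triage = Schedule(capacity=10)
plan = triage.allocate([Task("fire", 6, 3), Task("flood", 5, 2), Task("drill", 2, 1)])
print(plan)  # {'fire': 6, 'drill': 2} -- 'flood' exceeds the remaining capacity
```

A domain-specific application would extend this generic object (for example, with real-time constraints or alerting strategies) rather than rebuild it from scratch.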
Considerations in Constructing the National Information Infrastructure

Common architectural elements. The national challenge applications obtain service capabilities delivered through common protocols or interfaces (known commercially as APIs, or application programming interfaces). Though service capabilities may evolve rapidly, to the benefit of users, they are delivered through particular interfaces or protocols that evolve more slowly. This insulates the client architecturally from the rapid pace of change in implementations on the one hand, and on the other enables the client to exploit new capabilities as soon as they appear, so long as they are delivered through the accepted interface. A competitive supply of services hastens the process of convergence to common protocols and evolution from them.

Industry standards, stovepipes, and risk. We have asserted that commonality among the protocols, interfaces, and data representations used in the services layer of the NII will be critical for its success. To the extent that emerging or evolving industry-standard commonalities are replaced by ad hoc or proprietary stovepipe approaches for the national challenge areas, applications developers place themselves at risk with respect to delivery of capability and future evolution paths. In particular, in return for complete ownership or control of a solution, they may give up the opportunity to ride the growth curves of rapidly advancing underlying technologies, such as multimedia, digital libraries, and data communication. The challenge of the national challenge applications is how the applications constituencies can have both control of applications solutions and participation in the rapid evolution of underlying technologies. Government, supported by research, can invest in accelerating the emergence of new common architectural elements, and in creating technologies that reduce the risk and commitment associated with adoption of rapidly evolving standards.
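The architectural insulation described above can be sketched in code. The DocumentStore interface and both implementations are invented for illustration; the pattern they demonstrate is that a client engineered against a stable commonality is untouched when the implementation behind it evolves.

```python
import zlib
from abc import ABC, abstractmethod

class DocumentStore(ABC):
    """A stable service interface (the 'commonality'); implementations evolve freely."""
    @abstractmethod
    def put(self, key: str, text: str) -> None: ...
    @abstractmethod
    def get(self, key: str) -> str: ...

class InMemoryStore(DocumentStore):
    """First-generation implementation."""
    def __init__(self):
        self._data = {}
    def put(self, key, text):
        self._data[key] = text
    def get(self, key):
        return self._data[key]

class CompressedStore(DocumentStore):
    """A later, improved implementation; clients need no changes to adopt it."""
    def __init__(self):
        self._data = {}
    def put(self, key, text):
        self._data[key] = zlib.compress(text.encode())
    def get(self, key):
        return zlib.decompress(self._data[key]).decode()

def archive_report(store: DocumentStore) -> str:
    # Client code is written against the interface only.
    store.put("report", "quarterly figures")
    return store.get("report")

# Either implementation can be swapped in behind the same interface.
assert archive_report(InMemoryStore()) == archive_report(CompressedStore())
```

The client "rides the curve" of implementation improvement without reengineering, which is the property the text claims for common architectural elements.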
Evolution of commonalities. Accepted protocols naturally manifest a certain stickiness independent of their merit, because they become a stable element in determining systems structure and

develop associated transition costs and risks. The history of TCP/IP and OSI is a good example of this well-known phenomenon, as is the recent introduction of de facto standards relating to the World Wide Web (URLs and HTML). In particular, research and government can take a leading role in establishing new commonalities that foreshadow industry standards.

Rapid evolution and multiple standards. There are numerous standards presently in use for image representation. Most, but not all, are open standards; several are proprietary or otherwise encumbered. Regardless of the degree of acceptance of any one of these standards, the pace of change is such that it would be foolish for a major software application developer to lock itself into accepting or producing images according to just one of them. Indeed, most major software applications building blocks accept multiple such standards, thus increasing the robustness of the client applications with respect to either the technical characteristics or the market acceptance of any one of the particular standards for bitmaps. In addition, tools are readily available for converting among the various representations for images. Thus, from the standpoint of applications architecture, a robust design can be created that does not depend on the fate of any one of the many standards, but rather on the evolution of the entire suite. The multiple commonalities emerge as customers and producers seek frameworks for competition in service niches. However, experience suggests that over time multiple related standards may begin to coalesce, as the commercial focus (and margins) moves to higher levels of capability and the differential commercial advantage of any specific standard diminishes or even evolves into a liability. Anticipation of this process can yield robust, scalable designs for major applications even when there is volatility in the markets for the subsystems they depend on.

Competition and layering.
With the right approach to standards and infrastructural subsystems, diverse underlying technologies can evolve into common, shareable, and reusable services that can be leveraged across multiple NC applications. Alternative implementations of a frequently used service, such as display window management, eventually will lead to the identification of best practices that can be embodied in a common services layer—for example, for human interfaces. And robust designs of the applications layers above will enable this rapid evolution to be accepted and indeed exploited. (Consider, for example, the rapid rate of release of new versions of World Wide Web browsers, the huge multiplicity of platforms they run on, and the rapid rate of evolution of the many multimedia and other standards they rely on. The Web itself, however, evolves at a slower rate and is not invalidated by these changes in particular niche services. The standards on which the Web is based evolve even more slowly.) The conclusion we draw is that simultaneous evolution at multiple layers is not only possible but also needs to be an explicit architectural goal if ubiquity is to be attained at the applications level.

Concerning layers. Services depend on other services for their realization. For example, a protocol for microtransactions will likely rely on other protocols for encryption and authentication. This enables a microtransaction system not only to be designed independently of the particular encryption and authentication services, but also to sustain later upgrade of (or recompetition for) those services in a robust manner. In spite of this dependency, services are not organized rigidly into layers as in, for example, the seven-layer OSI model. The term "layering" is instead meant to suggest that services naturally depend on other services, but the exact interdependencies can change and evolve over time.
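The microtransaction example can be sketched as follows. All of the class names here are invented, and the XorCipher is a deliberately trivial stand-in (not real cryptography); the sketch shows only how a service can depend on abstract lower-layer services that remain independently upgradable or recompetable.

```python
from abc import ABC, abstractmethod
from typing import Optional

class Cipher(ABC):
    @abstractmethod
    def seal(self, message: str) -> str: ...

class Authenticator(ABC):
    @abstractmethod
    def verify(self, user: str) -> bool: ...

class XorCipher(Cipher):
    # Stand-in cipher (NOT real cryptography): demonstrates substitutability only.
    def seal(self, message):
        return "".join(chr(ord(c) ^ 0x2A) for c in message)

class ListAuthenticator(Authenticator):
    def __init__(self, known_users):
        self._known = set(known_users)
    def verify(self, user):
        return user in self._known

class MicrotransactionService:
    """Designed against the Cipher/Authenticator abstractions, not any implementation."""
    def __init__(self, cipher: Cipher, auth: Authenticator):
        self._cipher, self._auth = cipher, auth
    def pay(self, user: str, cents: int) -> Optional[str]:
        if not self._auth.verify(user):
            return None
        return self._cipher.seal(f"{user}:{cents}")

svc = MicrotransactionService(XorCipher(), ListAuthenticator(["alice"]))
assert svc.pay("alice", 25) is not None   # authorized payment goes through
assert svc.pay("mallory", 25) is None     # unknown user is refused
```

Because the dependencies are injected rather than hard-wired, replacing the encryption or authentication layer later does not disturb the microtransaction design, which is exactly the robustness the text argues for.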
The commonalities through which services are delivered thus form a set of multiple bottlenecks in a complex and undulating hourglass (using the analogy of CSTB, 1994).

Service classification. A consequence of the above argument is that the success of the overall NII does not depend on achievement of a particular master infrastructural architecture. But it must be emphasized that it does depend strongly on the emergence of a broad variety of infrastructural service architectures designed with scale-up, and indeed ubiquity, in mind. Ubiquity (as suggested in the comments
above on multiple standards) lies in the appearance of representatives of a set of related commonalities, and not in any particular protocol or component. This also suggests that there is no ultimately correct layering lurking in the soup of services, but rather multiple candidates and arrangements. Without commonalities there is no national information infrastructure, but the need for specific all-encompassing commonalities is mitigated to the extent that technologies and tools for interoperability are available. That is, suites of related evolving commonalities can be supported to the extent that conversion and interoperability tools are available. The issue devolves into finding the right balance in this equation.

The government thus can employ a mixed strategy in fostering national challenge applications through infrastructural commonalities. It can stimulate development of new services, creation and evolution of new architectural commonalities, and development of readily available technologies of interoperability. Direct research and development is the most effective way to stimulate new service capabilities and associated commonalities. The government can also exploit its own market presence (though the leverage is less), taking an activist role in industry forums for conventionalization (informal emergent commonalities) and standards (formalized commonalities).

An illustrative layered model. One possible service taxonomy, elaborated below, classifies generic services into four categories: human interfaces, applications building blocks, data and process management, and communications. Human interface services include window managers (e.g., Motif, NextStep), tools for speech handling and integration (generation as well as recognition), handwriting recognition, data visualization packages, toolkits for audio and video integration, and so on.
Applications building blocks include planning packages, scheduling packages, data fusion, collaboration support, virtual reality support, and image processing and analysis. Data and process management services consist of capabilities for configuration management, shared data spaces, process flows, data integration, data exchange and translation, and data search and retrieval. Communications services include ubiquitous access through various communications mechanisms (e.g., wireless as well as wired connections into the bitways), mobility services to support users as they move through the points of connection into the network, interprocess communications and remote procedure call mechanisms to support distributed processing, and trust mechanisms such as authentication, authorization, encryption, passwords, and usage metering. The service layers themselves evolve as new underlying technologies appear that provide new functionality or better ways of doing things.

A construction kit can support the assembly and evolution of applications based on the service suite. Elements of this kit, also elaborated below, could include software environments for developing applications, evolution of standard operational and data exchange protocols, software toolkits and software generators for building or generating well-defined portions of applications, and frameworks for integrating tools and data into coherent, interoperable ensembles.

The value of a common services layer is indicated conceptually by Figure 2. In Figure 2(a), the lack of a common services infrastructure leads to stovepipe implementations, with little commonality among the service capabilities of the various national challenges. In Figure 2(b), a common set of services is leveraged among the national challenges, aided by a collection of toolkits, integration frameworks, and applications generators.
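The data exchange and translation services listed above rest on conversion tools among representations. A minimal sketch (the format names and converters are invented for illustration) routes every conversion through one common intermediate form, so that n formats need 2n converters rather than one per pair of formats.

```python
# Hypothetical translation service: every format converts through a single
# common intermediate representation (a list of fields), so n formats need
# 2n converters rather than n*(n-1) pairwise ones.
to_common = {
    "csv": lambda s: s.split(","),   # comma-separated source
    "ssv": lambda s: s.split(";"),   # semicolon-separated source
}
from_common = {
    "csv": lambda fields: ",".join(fields),
    "ssv": lambda fields: ";".join(fields),
}

def translate(data: str, src: str, dst: str) -> str:
    fields = to_common[src](data)    # lift into the common form
    return from_common[dst](fields)  # lower into the target format

record = "katz,scherlis,squires"
assert translate(record, "csv", "ssv") == "katz;scherlis;squires"
# Adding a new format means registering two converters, not one per peer format.
```

This is the same structural argument the text makes about multiple coexisting standards: a suite of related formats is sustainable so long as conversion tools keep the suite interoperable.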
Information Enterprise Elements

Commonalities usually (but not always) emerge in the presence of a diversity of evolving implementations. A commonality in the form of a protocol is an abstraction away from the details of implementation that allows utility or value to be delivered in an implementation-independent manner to the service client. This suggests a threefold analysis for service capabilities: utility of some kind, delivered through a particular commonality such as a protocol, abstracting away the details of the diversity of implementations. Of course, the commonalities themselves evolve; they just evolve more slowly.

Figure 2. Technical challenges in building a national information infrastructure.

Figure 3 shows examples of elements for each of the three layers of the national information infrastructure architecture. In the figure, the three columns indicate the following:

• Utility: Each service provides specific value to users or clients. For example, the bitways are intended to provide ubiquitous data communications, and in a manner such that designers of applications need not know whether the communications links are through fiber, wireless, or some combination of links. The client needs only an abstract rendering of the characteristics of the aggregate link.

• Commonality: A common protocol or API creates a framework for delivery of utility. Clients engineer to this framework (and its expected evolution), thereby insulating themselves from the underlying implementation details. Diversification of technology occurs behind the protocol, enabling the technologies to be made accessible to clients with acceptable risk and cost-effectiveness, and also lowering entry barriers both for new users of the technologies and for new sources of capabilities. This is the essence of the principle of open architecture. For example, transport protocols for bitways provide users of communications services with a means to access the service independently of the particular choices of underlying component technologies.

• Diversity: These are the areas of implementation technology where innovation, rapid technological growth, and diversity of supply are essential to cost-effective delivery of increasing levels of
capability. For example, a competitive supply of fiber-optic connectivity is needed to provide ubiquitous access to high-performance bitways. Also, continued improvements in optical and wireless communication improve the affordability of high-performance mobile communication.

Figure 3 shows examples of these concepts for each of the layers of the NII conceptual architecture. This organization focuses attention on two critical issues, alluded to in the foregoing, that must be addressed in the design of service commonalities:

• Scalability: Testbeds and other mechanisms provide means to assess the degree of scalability of new service concepts and protocols. They also address the extent of dependencies among services. Scalability, for infrastructure, necessarily includes the potential for pervasive acceptance. Protocols that are proprietary or otherwise encumbered have a lesser chance of being accepted, because of the degree of technological and programmatic risk associated with them. But, as always, there is commercial advantage in being the first to introduce a successful open protocol, so the incentive persists for commercial introduction of commonalities, even when they are fully open.

• Legacy: There are two aspects of the legacy issue, a constraint and a goal. The first is the legacy we inherit, which constrains our architectural design decisions in fundamental ways. The second is the legacy we bequeath in the form of commonalities from which later architectures must evolve.

Opportunities for competition are naturally sought by service clients, and a diversity of implementations indicates success in this regard. At the level of bitways, for example, the pace of change is rapid, and there are wide-ranging approaches for achieving a given capability (e.g., physical media may consist of optical fiber, land mobile wireless radios, or laser communications).
The challenge for the application developer is how to exploit the continuing innovation while remaining insulated from continuous change; the client wants to ride the curves of growth while avoiding continual reengineering. One conclusion to draw from this analysis is that research must focus not only on creation and demonstration of new kinds of service capability, but also on the scientific and technological aspects of architectural design: designing and evaluating candidates for protocol and API definitions, looking at both the supplier and client perspectives.

The Federal HPCC Program and the NII

Overview

In FY1994, the federal HPCC program was extended with a new responsibility: to develop Information Infrastructure Technology and Applications (IITA) to demonstrate prototype solutions to selected national challenge applications using the full potential of the rapidly evolving high-performance communications and information processing capabilities. The details of the program's evolving goals and research plans are in its annual reports to Congress (FCCSET, 1994; CIC, 1994).

With the incorporation of IITA within its research agenda, the HPCC program is advancing key NII-enabling technologies, such as intelligent system interfaces, real environments augmented with synthetic environments, image understanding, language and speech understanding, intelligent agents aiding humans in the loop, and next-generation data and object bases for electronic libraries and commerce. This is being coupled with a vigorous program of testbed experimentation that will ensure continued U.S. leadership in information processing technologies. IITA efforts are designed to strengthen the HPCC technology base, broaden the markets for these technologies, and accelerate industry development of the NII. Federal HPCC agencies are working closely with industry and academia in pursuit of these objectives. These objectives are to be accomplished, in part,

Figure 3. Examples to illustrate the concepts of diversity, commonality, and utility.

by accelerating the development of readily accessible, widely used, large-scale applications with significant economic and social benefit. The HPCC program's original focus on enhancing computing and communications capabilities is thus extended to address a broader set of technologies and applications that have an immediate and direct impact on critical information capabilities affecting every citizen.

As we have described in the previous section, the development of such applications is predicated on (1) creating the underlying scalable computing technologies for advanced communication services over diverse bitways, effective partitioning of applications across elements of the infrastructure, and other applications support services that can adapt to the capabilities of the available infrastructure; and (2) creating and inserting a richly structured and intelligent service layer that will significantly broaden the base of computer information providers, developers, and consumers while reducing the existing barriers to accessing, developing, and using advanced computer services and applications.

In parallel with these activities, a more effective software development paradigm and technology base must also be developed, since full-scale implementations in support of the national challenges will be among the largest and most complex applications ever implemented. This will be founded on the principles of composition and assembly rather than construction, solid architectures rather than ad hoc styles, and more direct user involvement in all stages of the software life cycle. The entire technology base developed in this program, including services and software, will be leveraged across the national challenges, leading to significant economies of scale in development costs.
The intended technical developments of IITA include the following:

• Information Infrastructure Services: These are the collection of services provided to applications developers and end users that implement a layered architecture of increasing levels of intelligence and sophistication on top of the communications bitways. Services provide a universally available, network-aware, adaptive interface on which to construct the national challenge applications, spanning communications-based services at the low end to intelligent information processing services at the high end. These services include network support for ubiquitous access, resource discovery in a
complex distributed network environment, and intelligent support services that can negotiate and adapt to the service quality needs of the application. Information infrastructure services also include system software and services that implement pervasive privacy, security, and trust mechanisms for the information infrastructure; persistent object bases with which to build large-scale data repositories; reliable computing technologies to support the mission-critical nature of the infrastructure; and defensive software organized to protect the infrastructure from intrusion and attack.

• Systems Development and Support Environments: This area consists of the enabling technologies to develop and support large, complex information systems that exploit a national-scale information infrastructure. Fundamental to this activity is the use of that infrastructure in the software development and support process. Virtual organizations consisting of end users, contractors, and management will work together synergistically to develop software systems that are easy to use, that can be adapted through use to fit human needs and changing requirements, and that enhance end-user productivity, all despite the complexity of the underlying infrastructure. To achieve these goals, the focus is on software architectures, component prototyping, software composition, libraries of reusable and reliable software modules, end-user tailoring, intelligent documentation and online help, machine learning, and scalable compiler and interpreter technology.

• Intelligent Interfaces: Many of the national challenge applications require complex interfacing with humans or with intelligent control systems and sensors. In addition, these applications must be able to understand their environment and react to it.
Technology in this area consists of high-level, network-capable applications building blocks for real-time planning and control, image processing and understanding, human language technology, extensive use of intelligent computer-based agents, and support technologies for more effective human-computer interaction.

• National Challenges: The concept of national challenge applications has already been described above. It is important to distinguish between the implementation of operational systems and the use of challenging applications testbeds to demonstrate the value of high-performance technologies as well as to drive their continued evolution. The government's research and development role is to focus on the latter; the private sector has primary responsibility for the former.

Each of the three technology areas (the first three bullets above) is discussed in additional detail in the following subsections, which include a sampling of technical subtopics. The national challenges have already been summarized in a prior section.

Information Infrastructure Services

Services provide the underlying building blocks upon which the national challenge applications can be constructed. They are intended to form the basis of a ubiquitous information web usable by all. A rich array of interdependent services bridges the gap between the communications bitways and the application-specific software components that implement the national challenges.

• Universal Network Services: These are extensions to the existing Internet technology base to support much more widespread use by a much larger number of users. They include techniques for improved ease of use, plug-and-play network interoperation, remote maintenance, exploitation of new last-mile technologies, management of hybrid/asymmetric network bandwidth, guaranteed quality of service for continuous media streams, and scale-up of network capabilities to dramatically larger numbers of users.
• Integration and Translation Services: These services support the migration of existing data files, databases, libraries, and programs to new, better integrated models of computing, such as object-oriented systems. They also provide mechanisms to support continued access to older legacy forms of data as the models evolve. Included are services for data format translation and interchange as well as
tools to translate the access portions of existing programs. Techniques include wrappers that surround existing elements with new interfaces; integration frameworks that define application-specific common interfaces and data formats; and mediators that extend generic translation capabilities with domain knowledge-based computations, permitting abstraction and fusion of data.

• System Software Services: These include operating system services to support complex, distributed, and time- and bandwidth-sensitive applications. The services support the distribution of processing across processing nodes within the network; the partitioning of application logic among heterogeneous nodes based on their specialized capabilities or on considerations of asymmetric or limited interconnection bandwidth; guaranteed real-time response to applications for continuous media streams; and storage, retrieval, and I/O capabilities suitable for delivering large volumes of data to great numbers of users. Techniques include persistent storage, programming language support, and file systems.

• Data and Knowledge Management Services: These services include extensions to existing database management technology for combining knowledge and expertise with data, including methods for tracking the ways in which information has been transformed. Techniques include distributed databases; mechanisms for search, discovery, dissemination, and interchange; aggregation of base data and programmed methods into objects; and support for persistent object stores incorporating data, rules, multimedia, and computation.

• Information Security Services: These services provide support for the protection of the security of information, enhanced privacy and confidentiality for users of the infrastructure, protection of intellectual property rights, and authentication of information sources within the infrastructure.
Techniques include privacy-enhanced mail, methods of encryption and key-escrow, and digital signatures. Also included are techniques for protecting the infrastructure (including authorization mechanisms and firewalls) against intrusion attacks, such as worms, viruses, and Trojan horses. • Reliable Computing and Communications Services: These include system software services for nonstop, highly reliable computer and communications systems that can operate without interruption. The techniques include mechanisms for fast system restart such as process shadowing, reliable distributed transaction commit protocols, and event and data redo logging to keep data consistent and up-to-date in the face of system failures. System Development and Support Environments These provide the network-based software development tools and environments needed to build the advanced user interfaces and the information-intensive NC applications. • Rapid System Prototyping: These consist of the tools and methods that enable the incremental integration and cost effective evolution of software systems. Technologies include tools and languages that facilitate end-user specification, architecture design and analysis, component reuse and prototyping; testing and online configuration management tools; and tools to support the integration and interoperation of heterogeneous software systems. • Distributed Simulation and Synthetic Environments: These software development environments provide the specialized underlying support mechanisms for the creation of synthetic worlds, which can integrate real as well as virtual objects, in terms of both their visual as well as computational descriptions. Methods include distributed simulation algorithms; geometric models and data structures; tools for scene description, creation, and animation; and integration of geometric and computational models of behavior into an integrated system description. 
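The wrapper technique mentioned under Integration and Translation Services can be illustrated with a brief sketch. Everything here (the fixed-width record layout, the field names) is invented for illustration; a real wrapper would target an actual legacy format:

```python
# Hypothetical sketch of a "wrapper": a legacy fixed-width record is
# surrounded with a modern, dictionary-like interface without changing
# the underlying data. Layout and field names are invented.

LEGACY_LAYOUT = [           # (field name, start offset, end offset)
    ("patient_id", 0, 8),
    ("name",       8, 28),
    ("dob",        28, 36),  # stored as YYYYMMDD text
]

class LegacyRecordWrapper:
    """Exposes a new interface over an unmodified legacy record."""

    def __init__(self, raw: str):
        self._raw = raw

    def __getitem__(self, field: str) -> str:
        for name, start, end in LEGACY_LAYOUT:
            if name == field:
                return self._raw[start:end].strip()
        raise KeyError(field)

    def to_dict(self) -> dict:
        """Mediator-style translation into a generic interchange form."""
        return {name: self[name] for name, _, _ in LEGACY_LAYOUT}

# A legacy record, built with padding to match the fixed-width layout.
record = LegacyRecordWrapper("00001234" + "Doe, Jane".ljust(20) + "19550412")
```

New code can then work with `record` through the wrapper's interface while the legacy file format stays untouched, which is the point of the technique.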
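The redo-logging technique named under Reliable Computing and Communications Services can likewise be sketched in a few lines. This is a toy, in-memory illustration (a real system would write the log durably before applying each update):

```python
# Minimal sketch of event/data redo logging: every update is appended
# to a log before being applied, so a restarted system can rebuild
# consistent state by replaying the log. All names are illustrative.

class RedoLogStore:
    def __init__(self):
        self.log = []    # durable storage in a real system; a list here
        self.data = {}   # volatile state, lost on a crash

    def put(self, key, value):
        self.log.append(("put", key, value))  # write-ahead: log first...
        self.data[key] = value                # ...then apply the update

    def recover(self):
        """Rebuild state after a crash by replaying the redo log."""
        self.data = {}
        for op, key, value in self.log:
            if op == "put":
                self.data[key] = value

store = RedoLogStore()
store.put("balance", 100)
store.put("balance", 250)
store.data.clear()   # simulate a crash that loses volatile state
store.recover()      # replay restores the last consistent state
```

The same log-then-apply discipline underlies the distributed transaction commit protocols the text mentions, though those add coordination across nodes.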
• Problem Solving and System Design Environments: These environments provide techniques that support the software and system design process through automated tools, with particular emphasis on flexible, tailorable tool configurations that let organizations adapt their support environments to their needs. Examples include efficient algorithms for searching huge planning spaces; more powerful and expressive representations of plans, operators, goals, and constraints; and efficient methods for scheduling and resource allocation. The effects of uncertainty, as well as of goal interactions, must also be taken into account.

• Software Libraries and Composition Support: These software tools and methods support the development of common architectures and interfaces to increase the potential for reuse across multiple underlying models of computation, diverse programming languages, and software components offering varying degrees of assurance. Important elements of this area include the development of the underlying methodology, data structures, data distribution concepts, operating system interfaces, synchronization features, language extensions, and other technology needed to construct scalable library frameworks.

• Collaboration and Group Software: These tools support group cooperative work environments that span time as well as space. Methods include shared writing surfaces and live boards, version and configuration management, support for process and task management, capture of design history and rationale, electronic multimedia design notebooks, network-based videoconferencing, document exchange, and agents serving as intermediaries to repositories of relevant multimedia information. The technology should make it possible to join a conference in progress and be brought up to date automatically by assistants (agents) with memory.

Intelligent Interfaces

Advanced user interfaces will bridge the gap between human users and the emerging national information infrastructure.
A wide range of new technologies that adapt to human senses and abilities must be developed to provide more effective human-machine communication. The IITA program must deliver high-level user interfaces that satisfy the many different needs and preferences of the vast numbers of citizens who will interact with the NII.

• Human-Computer Interface: This supports research in a broad range of technologies, and their integration, to allow humans and computers to interact effectively, efficiently, and naturally. Developments in this area include technologies for speech recognition and generation; graphical user interfaces that allow rapid browsing of large quantities of data; user-sensitive interfaces that customize and present information for particular levels of understanding; language corpora for experimental research; and human-machine interaction via touch, facial expression, gesture, and so on. The new IITA emphasis is on integration, real-time performance, and demonstration of these new communication modalities in multimedia, multisensory environments.

• Heterogeneous Database Interfaces: This supports development of methodologies to integrate heterogeneously structured databases composed of multiformatted data. To support NII information dissemination, users need to be able to issue a query that is broadcast to the appropriate databases, with a timely response returned and translated into the context of the user's query. Multiformatted data may range from ASCII text, to numerical time series, to multidimensional measurements, to time series of digital imagery. Also of critical importance is the integration of metadata with the data and its accessibility across heterogeneous databases.

• Image Processing and Computer Vision: This activity supports research in making images, graphics, and other visual information a more useful modality of human-computer communication. Research areas include all aspects of theory, models, algorithms, architectures, and experimental systems, from low-level image processing to high-level computer vision. Methodologies of pattern recognition will be further developed to allow automated extraction of information from large databases, in particular digital image data. The new IITA emphasis is on integration, scalability, and demonstration of easy access to and usability of visual information in real-world problems.

• User-centered Design Tools/Systems: This consists of work on models and methodologies leading to interactive tools and software systems for design and other user-centered activities. User-friendly tools that combine data-driven and knowledge-based capabilities are one area for new research. The new IITA emphasis is on supporting the development of ubiquitous, easy-to-use, and highly effective interactive tools.

• Virtual Reality and Telepresence: This consists of research that will provide tools and methods for creating synthetic (virtual) environments that allow real-time, interactive human participation in the computing/communication loop. Such interaction may be through sensors, effectors, and other computational resources. The IITA focus is on creating shared virtual environments that can be accessed and manipulated by many users at a distance in support of national challenge application areas.

Summary and Conclusions

Much of the discussion of the national information infrastructure has taken place at the applications level or at the level of the bitways. Various groups, including Congress and the Clinton administration, have identified candidate NC applications, while others have addressed making the various existing and emerging communications infrastructures interoperable. This discussion suggests a shift in focus to the services layer. The right collection of capabilities at this level of the infrastructure will have an extraordinary impact on a wide range of applications. We have cataloged many of the key technology areas needed for the services layer of the NII: information infrastructure services, system development and support environments, and intelligent interfaces.
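One of the cataloged services, the broadcast query described under Heterogeneous Database Interfaces, can be sketched concretely. All source names, formats, and translation functions below are invented for illustration; a real federation would dispatch over a network to live databases:

```python
# Hedged sketch of a broadcast query over heterogeneous sources: one
# query is fanned out to differently formatted databases, and each
# native answer is translated back into the user's context.

def query_all(term, sources):
    """Broadcast `term` to every source; translate and merge answers."""
    results = []
    for name, records, translate in sources:
        for rec in records:
            row = translate(rec)          # per-source translation step
            if term.lower() in row["title"].lower():
                row["source"] = name      # note where the answer came from
                results.append(row)
    return results

# Two hypothetical sources with different native record formats.
ascii_db = [("Weather atlas", 1994), ("Road atlas", 1990)]
imagery_db = [{"caption": "Atlas imagery", "year": 1993}]

sources = [
    ("ascii",   ascii_db,   lambda r: {"title": r[0], "year": r[1]}),
    ("imagery", imagery_db, lambda r: {"title": r["caption"], "year": r["year"]}),
]

hits = query_all("atlas", sources)
```

The per-source `translate` functions play the mediator role: they carry just enough format knowledge to map each native record into the common form the user sees.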
The further development of these technologies, and their integration into coherent and robust service architectures incorporating the principles of utility, diversity, and commonality described here, will be a major challenge for the information technology research community in the coming years.

Cost-shared sponsorship of pilot demonstrations and testbeds is a key role for government in accelerating the development of the NII. In each NC application area, opportunities exist to demonstrate early solutions, including the potential for scaling up. We suggest that in the exploration of commonality and conversion issues, testbeds can also help address the fundamental issue of ubiquity.

The scale of the enterprise, and the fundamental opportunities being addressed, necessitate cooperation among industry, government, and academia. We have suggested appropriate roles and approaches to cooperation, with emphasis on the roles of government and research. This is predicated on the assumption that government, in addition to sponsoring key basic research, has a crucial catalytic role in working with all sectors to meet the challenge of scaling the national applications up to the point of ubiquity and reliance.

Acknowledgments

The ideas expressed in this paper have been influenced by discussions with colleagues at DARPA, especially Duane Adams, Steve Cross, Howard Frank, Paul Mockapetris, Michael St. Johns, John Toole, Doyle Weishar, and Gio Wiederhold. Our ideas have also benefited from extensive discussions with participants in the HPCC program from a diverse collection of federal agencies: Howard Bloom (NIST), Roger Callahan (NSA), Y.T. Chien (NSF), Mel Ciment (NSF), Sherri de Coronado (NIH), Ernest Daddio (NOAA), Norm Glick (NSA), Steve Griffin (NSF), Dan Hitchcock (DOE), Paul Hunter (NASA), Jerry Linn (NIST), Dan Masys (NIH), Cherie Nichols (NIH), Walter Shackelford (EPA), and Selden Stewart (NIST).
