Keeping the U.S. Computer Industry Competitive: System Integration

4 The Next Tier: Building Systems of Systems

BROADENING PERSPECTIVE

In discussions of systems integration, even the most intent listeners can develop a kind of information-age vertigo. One must adjust not only to the dizzying pace of technological advance, but also to the political, social, regulatory, economic, interindustry, and international issues that swirl around emerging applications of information networking technology. Disorientation is almost inevitable when, for example, one tries to determine the roles of communication carriers and service providers, an amalgam of local telephone monopolies and emerging competitors, long-distance carriers, equipment makers, and suppliers of information services that operates in a complex, multi-jurisdictional regulatory environment. In short, the technological complexity inherent in systems integration is subsumed by other types of complexity that arise as the scope of networking grows from small—a single firm—to intermediate—a group of firms—to large—an entire nation and beyond. Perhaps the most salient question at this stage in the evolution of distributed networked computing is, Where do we want to go from here? Building on the previous one, this chapter addresses the question by describing the necessary attributes of information networks as they evolve into "systems of systems" and, as has been predicted, as virtually all types of business and organizational activity go "on line." It also outlines some of the issues arising from the convergence of industries wrought by the digitalization of information.
SYSTEMS AS COMPONENTS

"What people thought of as a system just a few years ago is a small component in what people are thinking of as systems today," said Michael Taylor, central systems engineering manager at the Digital Equipment Corp. "We all recognize that one person's system is another person's component." This evolution in the concept and actual embodiment of information systems implies an ever expanding sphere of users and applications. It also implies that no information system will ever be a finished product. Whether hardware, software applications, sources of information, or services, new components can always be added. Consequently, there may never be a definable end point in the evolution. And if there is, it is not discernible from today's vantage point. For comparison, consider the continuing development of the nation's telephone system. In 1876, Alexander Graham Bell devised the method by which human speech and other sounds can be converted directly into electrical current and back again. In the succeeding 115 years, the United States, pursuing the social goal of "universal access," built the world's best telephone network, an accomplishment made possible by waves of innovation set in motion by Bell's invention. However, the telephone network is hardly complete. Today it is looked upon as a component of emerging nationwide and global information systems composed of subsystems as vast as the U.S. telephone network and as small as an individual computer, telephone, or other personal computing or communication device. "Network-based systems integration is only emerging," said Mark Teflian, vice president and chief information officer at Covia, the nation's second-largest travel reservation system. Truly integrated networks not only shuttle data from one location to another, he said, but also enable assimilation of information, or learning.
According to Teflian and other colloquium participants, most of today's networked systems do not achieve the levels of integration necessary to help the user transform data into information and information into knowledge. In a sense, organizations that have effectively linked information technologies and people have scaled a plateau that becomes a staging area for an ascent to a higher mountain. These organizations, said Robert Martin, Bellcore vice president for software technology and systems, have managed to create network-based systems and, collectively, they face the formidable task of integrating those systems. In Martin's view, at the peak of the mountain they and others must climb is a "national information networking infrastructure." The challenge for today, Martin, Teflian, and others said, is to build a foundation that supports successively higher levels of integration within and among information systems and, in so doing, increases the accessibility,
utility, and organizational or personal value of information. Participants identified some of the transcending features that all emerging systems should possess in order to achieve systems integration on a national scale.

Integratable Components

The fundamental building blocks of an integrated system of systems, said Martin, are communications and computing devices and software components that "can be plugged together" to meet a customer's needs today and still accommodate the incorporation of new technology. A major issue, he said, is the "design of integratable components for yet-to-be-defined projects." "I think the challenge for the future," Martin explained, "is to spread [today's] networks across the country so that we can do an inter-enterprise automation and fundamentally change the nature of the system to where we build systems on the fly based on components that already exist across a distributed network." Connectivity and interoperability would not eliminate the need for customized system integration services. The need for specialized applications to solve complex problems and to address unique organizational needs would remain, even with a "highly networked distributed base," Martin said.

Simplicity, Transparency, and Functionality

"Heterogeneity is going to be the byword of systems integration for the foreseeable future," Alfred Aho said. According to the Bellcore assistant vice president, the biggest sources of this heterogeneity are not the peculiarities of different types of hardware and software, but rather the people who use, build, and maintain systems of information technology. More important than the physical connectivity of components and networks, Aho said, are the connections and processes that integrate people so that they can easily use and manipulate information and network-based services.
The underlying hardware, software, and networks that constitute a system should be "transparent" to people, he maintained. An element crucial to achieving simplicity and transparency, he added, is a "single-system look and feel," a consistent user interface that would eliminate the frustration that people experience today as they move from application to application. "I would also advocate," Aho continued, "that we want to be able to develop functionality with which the end user can construct new services and be able to solve problems with the systems. We would like this functionality to be delivered in such a way that a person who is in a particular business can understand it. He [should] not have to be a computer scientist or an electrical engineer to be able to invoke this kind of functionality."
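Martin's vision of components that "can be plugged together" into systems built on the fly, delivered through the kind of consistent interface Aho advocates, can be illustrated with a small sketch. Everything below—the registry, the component names, and the composition function—is a hypothetical illustration, not a description of any system discussed at the colloquium:

```python
# A toy service registry: components register themselves behind one
# uniform interface (a function from text to text), so a composite
# "system" can be assembled on the fly from whatever components the
# network currently offers.
from typing import Callable, Dict

REGISTRY: Dict[str, Callable[[str], str]] = {}

def component(name: str):
    """Register a component under a network-visible name."""
    def wrap(fn: Callable[[str], str]) -> Callable[[str], str]:
        REGISTRY[name] = fn
        return fn
    return wrap

@component("spellcheck")
def spellcheck(text: str) -> str:
    # Stand-in for a real remote service.
    return text.replace("teh", "the")

@component("uppercase")
def uppercase(text: str) -> str:
    return text.upper()

def assemble(*names: str) -> Callable[[str], str]:
    """Compose registered components into a new service on the fly."""
    def pipeline(text: str) -> str:
        for n in names:
            text = REGISTRY[n](text)
        return text
    return pipeline

service = assemble("spellcheck", "uppercase")
print(service("teh network"))  # THE NETWORK
```

The point of the sketch is as much Aho's as Martin's: the user composes new services from existing components through one consistent interface, without needing to know how—or where on the network—each component is implemented.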
Selectable Speed, Reliability, and Security

Some users of information technology will require the most advanced networking capabilities available, but most will not need noise-free, gigabit transmission rates. Flexible transport of information—whether in the form of data, sound or voice, or graphics—and, therefore, flexible pricing schemes for use of transmission services are so important that Martin and others1 consider them to be defining attributes of information networking.

Consistent Architecture and Efficient Methods for Distributed Development of Systems

The goal of facilitating collaboration among geographically dispersed groups and among people with different skills and expertise is a major impetus for information networking. Not fully appreciated, however, is the high level of collaboration among dispersed parties required to integrate groups of systems and, ultimately, to create a cooperative environment. The "recursive" nature of the integration process, said Aho, presents special difficulties that undermine efforts to get systems to work together. These difficulties arise from the fact that parts of the whole—the systems that will eventually be the components of some larger system—are developed by groups that have different perceptions of what the entire assemblage should be like and what it should do. "If you look at some of the major integrated systems we build today," said Larry Druffel, director of the Software Engineering Institute at Carnegie Mellon University, "you see that there are companies distributed across the country, if not across the world. If people [at different sites] do not have the same view of the overall architecture, we can be pretty sure that we are going to see major disasters when they come together." Eventually incompatibilities can be overcome, Druffel added, and subsystems "can be glued together," but often at the cost of system efficiency.
Druffel and Aho described a consistent underlying architecture as a cornerstone of efforts to achieve higher levels of integration among today's information systems and tomorrow's. Unfortunately, said Aho, "Architecture, I think, is something that we all talk about but understand very little."

Network Intelligence

In addition to collaboration, the perceived benefits of timely capture and effective presentation of information also drive efforts to achieve higher levels of integration. From the perspective of a business, information reduces uncertainty, and the quicker a firm can gather and process information, the quicker it can identify and respond to market opportunities. Teflian
of Covia made the point this way: "Information equals money, and timely information equals more money." Therefore, he added, the point of sale should be the point of information capture. To exploit the advantages that rapid capture and analysis of information create, Teflian predicted, an increasing number of businesses in all sectors of the economy will develop information systems that enable on-line transaction processing (OLTP). Eventually, he further predicted, real-time information retrieval and interactive computing will become the dominant mode of computer usage, inside and outside of business. Because of stiffening global competition and the resultant shortening of product life cycles, most firms will view their products and services as "perishable goods," he said, and OLTP systems will be critical to seizing marketing opportunities in narrowing time frames. The American Express information network provides an example. There is "an absolute real-time requirement," explained Albert B. Crawford, the company's executive vice president for strategic business systems. "As you use your credit card, whether you are in Bangkok or Boston, we have to get that transaction to Phoenix and do some computations to see if you are an authorized credit risk or not or a suitable credit risk and back out to that point of sale in less than 20 seconds, and [transmission over] the network consumes eight to ten seconds of that." The anticipated transition to OLTP systems creates the need for more intelligent networks, those that are capable of responding to a request without requiring the user to attend to confusing and time-consuming procedural details. Today, for example, there are no complete dictionaries or directories that describe and locate information resources accessible by computer.
An intelligent network equipped with a comprehensive set of these aids, according to Teflian, would be able to respond to requests automatically or with only minimal human interaction. Users need not concern themselves with the specifics of where a desired piece of information or an application resides on a network. Nor should differences in computer languages and protocols within a system interfere with communication. In Teflian's view, the network should be able to do all the necessary translations.

Effective Presentation of Information

Making information easily and instantaneously accessible creates the need for tools that manage flows of information and help users digest responses to their requests. While the human senses can take in billions of bits of data per second, the brain processes information at the much slower rate of about 50 bits per second. Therefore, information must be organized in a form and on a scale that permit people to assimilate and manipulate information as they receive it, Teflian said. One simplifying attribute already mentioned is
a consistent user interface, which eliminates the problem of adapting to appearances or other sensory cues that are unique to each application. Artificial intelligence tools that relegate routine tasks to computers, information-filtering and information-prioritizing methods, and so-called "knowbots" that essentially act as information-gathering assistants are in various stages of research and development, and some are already commercially available.2 The primary aims of these and other efforts, said Druffel, should be to foster human understanding, rather than to overload users with information. Moreover, he advised, user interfaces should be designed to accommodate new functionality. Introduction of speech recognition and other capabilities that are in store should not necessitate a major revamping of the interface and existing applications.

AN INDUSTRY OF INDUSTRIES

The convergence of computing and communications and the expanding web of information networks are mirrored by a corresponding convergence of several major industries—principally, computers, communications, publishing, and broadcasting. All are involved in handling information, which in the past existed in distinct forms. But once information is reduced to binary bits, text, graphics, and audio become complementary parts of a multimedia whole, and the boundaries separating once-distinct industries fade. Equip a personal computer with the hardware for reading a compact disc and providing sound, and it becomes a means of audio and visual entertainment. With two-way communication capabilities, a television encroaches on the domains of the telephone and the computer. Get a "smart phone," and you have a communications device that takes over the tasks of the computer—for example, home banking or conducting other on-line transactions.
These present-day examples are only early manifestations of technological forces driving the creation of a large, hybridized information industry. The actions of this industry in formation will steer U.S. society into the information age.3 At this stage, it is not entirely clear how this convergence of industries and the anticipated networking of society will proceed. A few colloquium participants suggested that the infrastructural melding of networks and systems is proceeding as fast as it can. Most were dissatisfied with the pace, with some proffering that the more strategic, government-coordinated efforts under way in Europe and Japan will prove more productive than the piecemeal market-driven efforts undertaken in the United States. Whether the comparatively slow response of the United States to infrastructural issues marks a period of watchful waiting or one of indecision is an important question.
SHOULD THE MARKET DECIDE?

Several participants suggested that building a nationwide information infrastructure was too important a matter to be delegated entirely to a large and varied collection of firms responding to market cues and pursuing their own economic self-interest. "If we are strictly market driven, short-term driven, that is a problem," maintained Mischa Schwartz, Charles Batchelor Professor of Electrical Engineering at Columbia University. Focusing on the short term yields quick responses to immediate market needs, he said, but it diverts attention and resources from building the capabilities and services that may be in demand at the start of the next decade. Robert W. Lucky, executive director of the Communications Sciences Research Division at AT&T Bell Laboratories, offered a similar perspective. "There is nobody worrying about putting the nation together" on an integrated national network, he said. "I do not understand, for example, why there is no national data network. Why is there no telephone book for computer users? . . . There is nobody responsible for this. There is nobody who cares to do it. There is nobody who is delegated to do it. There is no motivation for anybody to do it, and yet we would all like it." Economic incentives do not exist for services that fall in the category of the public good, Lucky maintained. But if people are not willing to pay for the kinds of services that Lucky described, the counterargument goes, then perhaps the nation doesn't really need them or other types of services that a national network could provide. "Making general public knowledge available to the masses would be nice," said Teflian, "but who pays for that?" Moreover, sidestepping the market could introduce new problems, he said.
"Integrating systems within a company to improve its productivity so that it can compete in the world marketplace is one thing," Teflian explained. "Integrating systems on a national basis, thus affecting intercompany markets, is quite another."

COMMUNICATIONS: A "HALF-REGULATED" INDUSTRY

Perhaps most controversial among the issues surrounding the proper roles of government and the market in the evolution of information networks are those concerning the regulation of communications utilities. Especially contentious are federal, state, and local rules pertaining to the seven Regional Bell Holding Companies and the roughly 1,300 smaller companies that provide local exchange service. By virtue of their monopoly in local markets, these telephone companies must obtain the approval of state and local utility commissions to raise rates, and the fees they charge for connecting calls to the long-distance network are set by the Federal Communications Commission (FCC). In addition, the federal court rulings that detailed AT&T's divestiture of local telephone service forbade local telephone companies from directly engaging in manufacturing, owning local information services, and providing long-distance service. The current situation is very fluid and, often, very confusing, given that actions taken at several levels—the courts, the FCC, the U.S. Congress, and state legislatures and utility commissions—can alter competitive conditions within markets for communication and information services. Bills and rule changes are pending at all levels.4 Major changes, in turn, are often subject to appeals. Within the current regulatory environment and with the proliferation of alternative communications technology, many companies have found that they can save money by bypassing the local telephone network, and they have built their own links to long-distance haulers. Alternative service providers have also emerged within local markets. Some offer fiber-optic connections to long-distance carriers and to other sites within the same city. With a recent FCC ruling, third-party service providers can now connect to the switches of local telephone companies, spawning competition for customers in the same exchange area. The result of this state of affairs, according to Teflian, is a "partially regulated industry." "Half regulation," he said, has created disincentives to private initiatives that would contribute to the building of a national information network. Using local implementation of integrated services digital network (ISDN) service as an example, Teflian noted that the dispersal of regulatory authority will likely result in decisions that create "discontinuities" among service areas. Large organizations that require nationwide ISDN service, he explained, may be forced to find methods to accommodate these local differences.
The challenge, Teflian added, is either to develop a regulatory approach that assures uniformity at the local level or to pursue "higher levels of integration" that bridge local discontinuities. Martin of Bellcore suggested that these problems stem from uncertainties arising in the aftermath of the breakup of AT&T in 1984. "The United States, I think, is still suffering in post-divestiture confusion," he said, "and I believe something must be done. The best way to characterize that postdivestiture confusion is that, since divestiture was announced, there have been no new national services introduced in the United States, not a one." Numerous private information services, most serving specialized business and legal markets (through their own or third-party packet-switched networks), have been established, however. Many have not been successful. The largest—and among the newest—of existing services is Prodigy, targeted at household computer owners. Created by a partnership between Sears and IBM with an investment of more than $500 million, Prodigy offers more than 400 data services, including home banking and shopping and information retrieval, and has about 1 million subscribers.5
Colloquium participants did not dwell on specific issues arising from the complex and highly fragmented regulatory environment in which the information industry finds itself. Many of these issues pit industry against industry, each trying to preserve or carve out a domain of business activity. Rather, several participants pointed out that such issues prove that government policies directly influence the competitive behavior of firms. "We do have a telecommunications policy in the United States," Martin said. "It is enacted every day by many agencies distributed within the U.S. government and across the states, and so there is a policy. The only question is what that policy should be."

THE HIGH PERFORMANCE COMPUTING AND COMMUNICATIONS PROGRAM: A STEP TOWARD AN ADVANCED NATIONWIDE INFORMATION NETWORK

Although many colloquium participants suggested that the path of the nation's migration into the information age is still unmarked, virtually all were encouraged by the federal government's newly begun High Performance Computing and Communications (HPCC) program, an initiative jointly conceived by representatives of government, industry, and universities. "It is a good way to get lots of people excited about the [potential of advanced networking], to test a lot of techniques and technologies, and to show the way as to what needs to get done," explained Irving Wladawsky-Berger, IBM's assistant general manager of development and quality. If the approaches and technologies demonstrated in the HPCC program show promise, he added, the risk of moving on to the next tier of innovation and information technology applications will be reduced. Businesses will have the incentive and some of the know-how to go on to "build the next version and the version after that."
As planned, the program has four major components: (1) research and development work on high-performance scalable parallel computing systems capable of sustaining trillions of operations per second on large problems (accounting for 25% of total funding); (2) developing software technology and algorithms for advanced applications and networking (41%); (3) developing a National Research and Education Network (NREN) with, ultimately, a data transmission rate of several billion bits per second (14%); and (4) investing in basic research and human resources to address long-term national needs for more skilled personnel, enhancement of education and training, and development of materials and curriculum (20%).6 Most discussion at the colloquium focused on the NREN, a three-phase project to be jointly funded by the federal government and industry. During the first phase, existing networks linking government, university, and industry researchers would be upgraded and interconnected. The second phase,
much of which has already been installed, would develop a 45-million-bits-per-second backbone—the "long-distance" lines of the network—permitting user-to-user transmission rates of 1.5 million bits per second for the 200 to 300 U.S. research institutions expected to be linked by the network. The third phase, involving research followed by deployment beginning later in the decade, would support an aggregate transmission rate of perhaps 3 billion bits per second. The goals of the first two phases are achievable with commercially available technology (indeed, much progress has already been made); the biggest challenges lie in the development of software and supporting standards for managing a large multi-organization network. The final phase, however, will entail a major technological transition. As a committee of the CSTB has noted,7 the high-performance, high-traffic network envisioned will require a "clean-sheet approach," an appraisal also used to describe the challenges posed by the less ambitious broadband ISDN, or BISDN. In this high-speed distributed computing environment, even the speed of light becomes a constraint, posing the problem of propagation delays. Technical issues, the committee pointed out, arise in many areas, including network control, layered architectures, communication protocols, switching, routing, multiplexing, and processor interfaces. In addressing these and other issues, colloquium participants pointed out, the NREN project can clear the technical obstacles that stand in the way of building an advanced information infrastructure for the United States. David Farber, professor in the Department of Computer and Information Sciences at the University of Pennsylvania, elaborated on some of the challenges. When you are operating at a gigabit [rate] across country, there are a remarkable amount of bits stuck in the line being transmitted.
So the normal approach to protocols that we have developed over the years may or may not work. That is a research question. If they work at a gigabit, will they work at the 2 gigabits that we will probably have in another few years? Will they work at 10 gigabits? Can we understand what will happen in the future? Big, open question. If you proceed to come up with a protocol that works, . . . then can any operating system we have tolerate data rates of that speed? Will Unix survive when you put a gigabit down its throat, even if you can get it in? Most likely not. This litany goes on and on and on, and most everything you look at gets affected by trying to stuff that much data at that speed down the throat of modern technology. Farber suggested that when computers can exchange data at gigabit speeds, a network takes on the characteristics of a massively parallel, or multiprocessor, computer. The network becomes a "pile of computers that all think they are on a common, multi-processor environment and communicate via shared memory." The optical fiber linking the network functions like a
high-speed bus within an individual computer—the set of wires, or bidirectional data highway, that carries signals throughout the machine. Implicit in Farber's model is the need for network intelligence. "There is no way that massively networked computers can operate without a communications system that understands what, in fact, it is communicating across," he said. "This gigabit technology," he told the colloquium, "is the beginning, I believe, of a take-off point in communications. I think there is a gap now between the 'traditional' slow-speed communications of below 100 megabits, and the future communications of multiple gigabits that people are talking about. I believe the [forthcoming technology] will have a substantial and profound impact on the computer field and offer opportunities that I do not think we are very good at understanding right now." Farber noted that the NREN's benefits will take time to unfold. "In about three years," he said, "we will understand better what the issues are, exactly what the implications are, and, maybe, whether it is in fact justified as an investment for science. All of us believe it is or we would not be spending our time working toward it and we would not have the funding to do it. While the government is putting in a substantial amount of money, a large part of the burden is shared by industry, which is contributing a phenomenal amount of resources, lines, engineering talent, and research talent. So it is a very interesting vehicle, and it may tell us how we want to do cooperative work in the future in this country." To the extent that the NREN project and the entire HPCC program succeed in exposing and developing these opportunities, the United States will have a clearer technical route into the information age. The NREN will serve, in part, as an important test bed for new technologies, applications, and network operation and management policies.
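Farber's remark about bits "stuck in the line" is what networking texts call the bandwidth-delay product. The sketch below quantifies it; the transcontinental distance and the speed of light in fiber are assumed round figures for illustration, not numbers from the colloquium:

```python
# Bandwidth-delay product: how many bits occupy a long-haul link at
# any instant. Assumed figures: a ~4,000 km coast-to-coast fiber path
# and light in fiber at ~2e8 m/s (roughly two-thirds of c in vacuum).

def bits_in_flight(rate_bps: float, distance_m: float,
                   propagation_m_per_s: float = 2e8) -> float:
    """Bits in transit on the link at one instant (one-way)."""
    delay_s = distance_m / propagation_m_per_s
    return rate_bps * delay_s

DISTANCE_M = 4_000_000  # assumed transcontinental path, in meters

for gbps in (1, 2, 10):
    bits = bits_in_flight(gbps * 1e9, DISTANCE_M)
    print(f"{gbps:>2} Gbit/s: {bits / 1e6:.0f} Mbit in flight "
          f"({bits / 8 / 1e6:.1f} MB)")
# At 1 Gbit/s the one-way delay is 20 ms, so 20 Mbit (2.5 MB) of
# data are on the fiber before the first bit even arrives.
```

Megabytes of data are thus committed to the line before any acknowledgment can return, which is why protocols that stop to acknowledge each packet—designed for slower links—"may or may not work," as Farber puts it, at gigabit rates.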
Perhaps even more importantly, it is likely to set precedents for many key economic, regulatory, and political issues surrounding the future development of our national communication and information needs.8 Careful analysis of such "initial conditions," then, is to be encouraged as their influence can extend for many years into the future.

NOTES

1. Dertouzos, Michael L. 1991. "Building the Information Marketplace," Technology Review, January, pp. 29–40.

2. The term "knowbots" was originally coined by the Corporation for National Research Initiatives of Reston, Virginia; see Anthes, Gary H. 1991. "Let Your 'Knowbots' Do the Walking," ComputerWorld, May 13, p. 17.

3. Computer Science and Technology Board, National Research Council. 1990. Keeping the U.S. Computer Industry Competitive: Defining the Agenda, National Academy Press, Washington, D.C., pp. 26–27.
4. See, for example, Bradsher, Keith. 1991. "Judge Allows Phone Companies to Provide Information Services," New York Times, July 26, p. A1.

5. Shapiro, Eben. 1991. "A Service Is Dropped by Prodigy," New York Times, May 17, p. C3; and "Can Prodigy Be All Things to 15 Million PC Owners?" New York Times, June 2, p. F4. Konsynski, Benn R., and F. Warren McFarlan. 1990. "Information Partnerships—Shared Data, Shared Scale," Harvard Business Review, September–October, p. 116.

6. Committee on Physical, Mathematical, and Engineering Sciences, Federal Coordinating Council for Science, Engineering, and Technology, Office of Science and Technology Policy. 1991. Grand Challenges: High Performance Computing and Communications, Supplement to the President's Fiscal Year 1992 Budget, p. 8.

7. Computer Science and Technology Board, National Research Council. 1988. Toward a National Research Network, National Academy Press, Washington, D.C., pp. 25–35.

8. See, for example, Dertouzos, 1991, "Building the Information Marketplace," pp. 29–40; Gilder, George. 1991. "Into the Telecosm," Harvard Business Review, March–April, pp. 150–161; and Weingarten, Fred. 1991. "Five Steps to NREN Enlightenment," EDUCOM Review, Spring, pp. 26–30.