Keeping the U.S. Computer Industry Competitive: System Integration

3 Enabling Technology: Communications and Software

Before our very eyes the now ubiquitous computer is undergoing a profound metamorphosis from a largely stand-alone, all-in-one machine to a gateway to a world of information technology and services. Indeed, for many people today, the computer is as much a communications device as it is a machine that performs high-speed mathematical and logical calculations. Not coincidentally, with each new wave of technology, the now ubiquitous digital telephone becomes ever more like the computer. As this convergence of computing and communications technology proceeds, it is not entirely clear whether the distinction between the two will have any meaning. Driving this convergence is the as yet unabating miniaturization of the electronic components that make up integrated circuits. As the density of the components on integrated circuits has increased at exponential rates, the speed and power of computers and other digital equipment have improved tremendously, while their costs have moved steadily downward. For example, workstations have been almost exclusively the tools of scientists and engineers. Fast and able to run several applications simultaneously, these high-performance machines have been too expensive for the average personal computer user. But workstation prices are dropping rapidly, and, by the end of the decade, some people predict, the equivalents of today's high-end machines may sell for well below $1,000, making the technology affordable for large numbers of people. Already, a well-equipped personal computer based on an Intel 80386 processor running UNIX can be purchased for about $3,000, or about the cost (in current dollars) of the much more primitive IBM PCs in the early 1980s.
The predictable and high rate of progress in hardware has fueled rising expectations for distributed information networks of immense capacity, capability, flexibility, and reach. These expectations, however, are constrained by technical, regulatory, and economic challenges on national and international levels. The principal obstacles lie in the areas of telecommunications and software technology, commonly referred to as the "bottlenecks" that impede progress toward large-scale information networks offering widely available services. Simply stated, the technical problems are these: First, the information processing capabilities of computers greatly exceed the capabilities of public telecommunication carriers to transmit data in its many forms between remote computers; data arrive in trickles rather than in the torrents that are needed to support real-time, multimedia interactive computing. Second, advances in hardware have outraced the ability of software designers and computer programmers to develop applications that exploit the full capabilities of new devices. The gap between potential performance and the actual functionality supplied by software applications remains large, and, at the same time, software accounts for a large and growing expense in the development of information technology. Third is a combined telecommunications and software problem. Software of high reliability and quality is a critical component of efforts to improve the transmission capacity and capabilities of telecommunications carriers, as evidenced by the fact that software may account for as much as 80 percent of the cost of a telecommunications system.1,2 Building an infrastructure and implementing the associated services necessary to achieve the seamless connection of information networks on national and global scales pose large, complex challenges for software development.
Finally, it is not enough to develop technological solutions to the impediments that prevent high-speed, high-performance networking. There must also be agreement between the telecommunications and computer industries on how those solutions should be translated into standards. Moreover, to promote global networking, standards development within nations should complement the work of international standards-setting bodies. Although telecommunications and software issues loom large, technical challenges also face manufacturers of integrated circuits and the other hardware components of information technology. For example, in addition to new software, faster and more powerful microprocessors, perhaps using photonic technology, will be needed to achieve the high switching speeds necessary for rapid transmission between computers of large volumes of data in multimedia forms. Moreover, as communications performance improves, computers and other digital equipment, whose capabilities are now constrained by slow data transmission, will eventually be hard pressed to assimilate data arriving at rates exceeding a billion bits per second, as envisioned for the National Research and Education Network (NREN), a
component of the recently begun federal High Performance Computing and Communications (HPCC) program. "Multi-gigabit networks represent a change in kind, not just degree, from today's networks," a committee of the Federal Coordinating Council for Science, Engineering, and Technology has explained. "For example, consider that in a coast-to-coast communication at three gigabits [3 billion bits] per second there are at any instant 'in flight' nearly nine megabytes [million bytes, or groups of eight bits of data], which is more than the memory of most personal computers and workstations."3 At the colloquium, most of the discussion focused on issues related to software and communications. Participants also addressed some of the challenges confronting the HPCC program and, in particular, the NREN.

COMMUNICATIONS SURVEY

A rule of thumb, based on today's networking experiences with traditional technology, is that the speed and ease of communication decrease as a function of distance. The fastest data flows occur within a computer, via the data bus, or the set of wires that shuttle signals between the microprocessor and circuit boards containing memory and controlling input and output devices. In local area networks (LANs) linked by coaxial cable, the typical rate for transmitting data between computers is 10 million bits per second. Data communication between LANs linked by a wide area network slows to a figurative snail's pace. The maximum data-carrying capacity of a typical copper wire telephone line is 64,000 bits per second, but modems generally do not even make full use of this limited capacity, with most transmitting and receiving data at rates between 2,400 and 19,200 bits per second. Specialized data transmission services—called T1 and T3—offered by telephone companies can greatly improve this rate, to 1.5 million bits per second for T1 and 45 million bits per second for T3.
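The committee's "nearly nine megabytes in flight" figure follows from simple arithmetic. A minimal sketch, assuming a roughly 4,800-kilometer coast-to-coast span and signal propagation of about 200,000 kilometers per second in optical fiber (both figures are assumptions for illustration, not taken from the report):

```python
# Rough illustration of the FCCSET committee's "data in flight" arithmetic
# for a coast-to-coast gigabit link. Distance and propagation speed below
# are assumed values, not figures from the report.

def bits_in_flight(rate_bps, distance_km, signal_km_per_s=200_000):
    """Bits launched into the link before the first bit arrives."""
    propagation_s = distance_km / signal_km_per_s   # one-way delay
    return rate_bps * propagation_s

bits = bits_in_flight(3e9, 4_800)             # 3 gigabits/s, coast to coast
print(bits / 8 / 1e6, "megabytes in flight")  # roughly 9 megabytes
```

At 3 billion bits per second over a 24-millisecond one-way path, about 72 million bits are in transit at any instant, which is the committee's nine megabytes.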
But even at a rate of 45 million bits per second, correspondence between remote computers is too slow to support interactive, data-intensive tasks, especially sophisticated graphics applications. Consider, for example, that one still image contains between 20 million and 200 million bits of information; in digital form, a typical pair of chest X-rays contains the information equivalent of four volumes (text) of the Encyclopedia Britannica.4 Also consider the data-carrying capacities necessary for the following uses of information technology: real-time graphics applications run on remote supercomputers require rates of 50 million to more than 700 million bits per second per user; real-time, cooperative computer-aided-design systems require 1.5 million bits per second per user; and television-quality video and audio require 45 million bits per second per channel, or less than half the rate for digital high-definition television (without data compression techniques).5,6,7
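The mismatch can be made concrete with a short sketch computing how long one still image would take to cross the services surveyed above. The image size uses the high end of the chapter's cited 20- to 200-million-bit range; the link labels and the 9,600-bps modem rate (within the 2,400 to 19,200 range quoted earlier) are illustrative choices, not the report's:

```python
# Hypothetical sketch: seconds to transmit one still image over the
# transmission services surveyed in the chapter. Link names are mine.

links_bps = {
    "modem at 9,600 bps": 9_600,
    "copper phone line at 64,000 bps": 64_000,
    "T1 at 1.5 million bps": 1_500_000,
    "T3 at 45 million bps": 45_000_000,
}

image_bits = 200_000_000   # high end of the cited per-image range

for name, rate in links_bps.items():
    seconds = image_bits / rate
    print(f"{name}: {seconds:,.1f} seconds")
```

Even the fastest commercial service here needs several seconds per image, which is why the chapter concludes that 45 million bits per second cannot sustain interactive graphics.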
Introducing the need to integrate textual, graphical, and audio information for multimedia presentations magnifies the need for high-speed transmission, as does increasing the number of users on the network. Fiber-optic technology figures prominently in proposed measures to enhance transmission capacities and to integrate multimedia forms of data. Indeed, the fiber-optic lines that handle about 95 percent of long-distance telephone calls in the United States transmit information at a rate of about 2 billion bits per second. With the information-carrying capacity, or bandwidth, of fiber-optic cable increasing a hundredfold over the past decade, bandwidths of a trillion bits per second may be not only achievable but also commercially practical in the next decade. Moreover, the declining cost of optical fiber has made it competitive with copper wire for even short-distance communication links. In terms of maintenance and upkeep, fiber offers economic advantages over copper wire. Some local telephone companies are running fiber from their central switching offices to remote distribution terminals that handle up to 1,000 lines and, from there, to streetside pedestals that link to individual homes and most businesses. The cost of completing the final stretch, from curbside to home, remains prohibitive.8 Thus it is not clear when fiber will be extended to residences and businesses; nor is it certain whether local telephone companies will be the primary actors. Cable television companies, which have strung coaxial cable to 60 percent of U.S.
homes and could easily connect nearly 30 percent more,9 appear eager to exploit fiber-optic technology, and some have announced their intentions to run fiber to the home.10 Incentives for completing this important last step, whether done by telephone companies, cable companies, or others, are dependent not only on perceptions of the market for services made possible by fiber-optic connections, but also on regulations governing the behavior of segments of the communication industry. (Regulatory issues are discussed in the next chapter.)

Integrated Services Digital Network Service

What is clear, however, is that many of the basic tools exist for sending and receiving multimedia information. Moreover, a world vision of high-speed communication networks already exists in the form of integrated services digital network (ISDN) service.11 A slowly evolving concept consuming the attention of industry, national, and international standards-making bodies since the mid-1970s, ISDN is essentially a telecommunications view of a public information infrastructure. It will allow users and carriers to aggregate, or multiplex, transmissions of data in graphical, textual, and audio forms, permitting multimedia communication between users on a network.
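The multiplexing idea described above, interleaving several media streams onto one carrier and splitting them apart at the far end, can be sketched minimally. This is my own illustration of the concept, not a description of how ISDN frames are actually structured:

```python
# Minimal sketch of time-division multiplexing: several media streams are
# interleaved in fixed turns onto one carrier, then recovered in order.
# Stream contents and frame labels are invented for illustration.

def multiplex(streams):
    """Round-robin interleave equal-length streams of frames."""
    return [frame for group in zip(*streams) for frame in group]

def demultiplex(carrier, n_streams):
    """Recover the original streams by taking every n-th frame."""
    return [carrier[i::n_streams] for i in range(n_streams)]

voice = ["v1", "v2"]
text = ["t1", "t2"]
graphics = ["g1", "g2"]

line = multiplex([voice, text, graphics])
print(line)                  # ['v1', 't1', 'g1', 'v2', 't2', 'g2']
print(demultiplex(line, 3))  # the three streams, recovered intact
```

The point of the sketch is only that one physical channel can carry audio, textual, and graphical traffic simultaneously, which is the aggregation ISDN promises.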
In its first and current manifestation, however, ISDN offers data transmission at a rate far less than 1 percent of the 150 million bits per second needed for real-time interactive multimedia applications. So-called narrowband ISDN (NISDN) service can carry 144,000 bits per second via two 64,000-bit-per-second channels and one 16,000-bit-per-second channel. Communications are delivered through a standard telephone socket and are parceled out to telephone and computer, which are linked by wire. With this setup, remote computer users can converse by telephone and simultaneously exchange alphanumeric information, which appears on the screens of their terminals. In the United States, about 0.5 percent of equipped access lines supported NISDN as of 1990; many of these were installed to serve the private networks of large companies, partially reducing their need for separate transmission lines for voice and data communications.12 Although the simultaneous linkage of telephone and computer afforded by the initial ISDN offering is significant, data transmission speeds do not accommodate many types of applications of information technology, and they are greatly exceeded by service offerings already available, such as T3. Moreover, users of ISDN service may have to invest in new equipment interfaces. (For example, an ISDN adapter for a personal computer, with accompanying software, costs about $2,200.) In essence, early offerings of ISDN service enhance the capabilities of the telephone but leave untapped the potential of the computer as a tool for sharing, processing, and presenting information in multimedia forms. Successive upgrades in ISDN service are in the offing, however. From 144,000 bits per second, rates would improve first to 1.54 million bits per second (24 channels, each carrying 64,000 bits per second), a service already used by some businesses.
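The channel arithmetic above can be checked directly. The B and D channel names are standard ISDN usage supplied here as an assumption, since the chapter does not name the channels:

```python
# Sketch of the narrowband ISDN rate arithmetic described in the text.
# Channel names (B, D) are standard ISDN terms, not from this chapter.

B_CHANNEL = 64_000    # bearer channel, bits per second
D_CHANNEL = 16_000    # signaling/data channel, bits per second

basic_rate = 2 * B_CHANNEL + D_CHANNEL   # the "2B+D" narrowband offering
print(basic_rate)                        # 144000 bits per second

primary_rate = 24 * B_CHANNEL            # the 24-channel upgrade path
print(primary_rate)                      # 1536000, i.e., about 1.54 million

# Narrowband service versus the broadband multimedia threshold:
print(basic_rate / 150_000_000)          # well under 1 percent
```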
From there, rates would move progressively higher, to the 150-million-bit-per-second threshold for multimedia applications and upward to gigabit-per-second speeds. The migration to the threshold rate (i.e., 150 million bits per second), which marks the transition to broadband ISDN (BISDN), has been frustratingly slow for users awaiting high-speed distributed computing networks that are interoperable, accommodate a broad range of interactive applications, and are easy to negotiate. These users want BISDN to be available now rather than at the turn of the century or beyond. As a result, many are forging their own solutions to their needs for high-speed communication and incurring the expense of building private networks. One argument for proceeding with NISDN contends that the initial service is a necessary stepping-stone to broadband public networks. For example, while large corporations appear to have an insatiable appetite for communications bandwidth, many smaller businesses and most households may see little need for digital information services, much less a single access point to those services, which ISDN would provide. If a reasonable
level of demand is established for NISDN applications, then telecommunications carriers and manufacturers of telecommunications equipment would be encouraged to invest in the technologies required to progress to BISDN. Perhaps an even more compelling argument is the growing status of ISDN as an international standard interface for global networking. Government-owned communications monopolies in Japan and Europe seem to be more convinced of this argument than are U.S. telecommunications carriers and information-service providers. As a result, implementation of supporting ISDN standards has been proceeding more rapidly in Japan, Singapore, and Europe than in the United States. However, in early 1991, major U.S. computer and communications firms endorsed a set of standard specifications for ISDN service.13 The agreement could lead to conversion of about half of subscriber lines to NISDN service by the end of 1994.14

Toward Broadband Networks

ISDN's modest beginnings might be likened to a narrow river that over time broadens into a major waterway supporting cargo-carrying vessels of every size and function and, by means of its tributaries, offering access to any port desired. Rather than on geological time scales, however, the transformation from today's constricted narrowband communication networks into broadband arteries with immense data-carrying capacity is expected to take place over the span of a few decades. Fiber-optic technology offers the means for achieving this transformation, but the evolution requires more than glass cables serving as conduits for multigigabit streams of data. In its early stages, the evolution may largely entail pushing the limits of current technology, but moving to gigabit-per-second rates and beyond will require revolutionary changes in computing and communication networks.
"As we move into gigabit networks, however, we must take a 'clean sheet' approach to many of the systems issues," writes Leonard Kleinrock, professor of computer science at the University of California, Los Angeles. "The critical areas to be considered include switching technology, processor interfaces, protocols, connection-oriented communications, routing, layered architectures, and coexistence with carrier environments." 15 Whither OSI? Some of the challenges that Kleinrock has identified could require a major revamping of the International Standards Organization's Open Systems Interconnection (OSI) architecture. Essentially a composite reference model for building information systems, OSI was devised to promote the development of industrywide standard protocols to create a common communications environment for distributed systems. 16 (See Box 3.1, "A Simple
Box 3.1 A Simple Protocol Analog

"The diplomatic use of the term 'protocol,' as a code of etiquette and precedence derived from the Greek roots 'to glue together,' comes close to its meaning in computer networks. The messages that flow between computers follow an established set of rules (etiquette) in proper sequence to glue the network into a cooperating community. The following human analog tries to capture the concept of a protocol hierarchy or stack. "An American chief executive officer (CEO) wishes to complete a business transaction with his or her counterpart (peer) in Japan. The American CEO, representing the 'application layer,' composes thoughts in a manner that a Japanese peer will understand (peer protocol), and dictates a letter to a secretary (presentation layer). The secretary converts the communication from one format (voice) to another format (written), representing the service the secretary (lower layer) provides to the boss (higher layer). The secretary then puts the letter in an envelope and puts the Japanese CEO's address on the envelope, thus making a session on behalf of a higher-layer entity. The letter is mailed (passed down to a lower-layer entity) to the U.S. Postal Service, a reliable 'datagram' transport mechanism. The post office passes the letter to regional collection centers (switching centers of the network layer) and then on to the destination post office via the routing information in the letter's address, usually the ZIP code (control information). The passing is handled by bundling many different letters with similar ZIP codes in bags carried by truck, plane, or ship to the destination (the physical media, lowest layer). The process now repeats in reverse—post office to secretary to Japanese CEO—physical to network to transport to presentation to application layer.
The Japanese secretary sees to it that the English letter is translated into Japanese, the destination's presentation layer. "At each layer of the protocol hierarchy, there is a peer protocol that understands the rules (etiquette) of its peers; the letters are formatted the same, the envelopes are addressed the same, the mailbags are labeled in an agreed-upon format, and so on throughout the process. Furthermore, each layer provides a service to its higher layer, and interface protocols express the service requests. Figure [3.1] shows this hierarchy of layers: the popular Open System Interconnect (OSI) model of a seven-layer hierarchy." SOURCE: Computer Science and Technology Board, National Research Council. 1988. Global Trends in Computer Technology and Their Impact on Export Control, National Academy Press, Washington, D.C., pp. 110–111.
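The layering idea in the box, each layer wrapping the message from the layer above and the peer layer unwrapping it in reverse order, can be sketched in a few lines. This is my own illustration of the encapsulation concept, not an implementation of any OSI protocol:

```python
# Minimal sketch of protocol layering: on the sending side each layer adds
# its own header around the message; on the receiving side the peer layers
# strip the headers in reverse order, recovering the original message.

layers = ["application", "presentation", "session", "transport",
          "network", "data link", "physical"]

def send(message):
    for layer in layers:                   # down the sender's stack
        message = f"[{layer}|{message}]"   # each layer wraps the payload
    return message

def receive(frame):
    for layer in reversed(layers):         # up the receiver's stack
        assert frame.startswith(f"[{layer}|") and frame.endswith("]")
        frame = frame[len(layer) + 2:-1]   # peer layer strips its header
    return frame

wire = send("complete the transaction")
print(receive(wire))                       # original message restored
```

As in the CEO analogy, the outermost wrapper is the physical layer, and each peer layer understands only the header its counterpart added.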
FIGURE 3.1 Open System Interconnect (OSI) and DOD models with representative protocols. SOURCE: Computer Science and Technology Board, National Research Council. 1988. Global Trends in Computer Technology and Their Impact on Export Control, National Academy Press, Washington, D.C., p. 112.
Protocol Analog.") OSI divides the workings of a network into seven layers, each composed of one or more protocols. The bottom two layers—"physical" and "data link"—specify transmission rates, signaling conventions, and other protocols for managing communications media. At the top, or "applications" layer, are the protocols for electronic mail and other applications that can be conducted on the network. OSI embodied more than 100 international standards as of 1989,17 and the number continues to grow. Most of the protocols, however, attend to the transfer of data; very few pertain to high-level collaborative applications that can be performed over a network. Robert Martin of Bellcore was one of several colloquium participants who suggested that rather than fostering information networking, standards are becoming so unwieldy that they may eventually deter its evolution. "The thing I worry most about is the number and the inherent complexity," he said. "We are not smart enough to manage within this environment." Alfred Aho, also of Bellcore, had a similar lament. "By the time you look at the total amount of software required to develop the system according to the OSI world," he said, "you are talking about millions of lines of code. If we move forward into the future, we are going to have components of the systems we are building today for years into the future. Who is going to maintain that old software? Who is going to understand it? What kind of return on investment are we going to get for people who look after this old software and make fixes to it when the systems malfunction?" One of the penalties paid for the "openness" that OSI is intended to foster may be exacted in the form of reduced speed, a problem explained by Hisashi Kobayashi, dean of engineering and applied science at Princeton University.
Gigabit transmission rates could overwhelm current and emerging communication protocols designed to support the OSI reference model, which has guided standards development for more than a decade. "It is still too early to say," he explained, "but people are beginning to realize that the OSI layered architecture will be inefficient for a network based on optical technology. Communication links will no longer be a bottleneck, but most likely processing power [of switches] will be." Processing power at network switches could be especially taxed when data flows approach gigabit-per-second speeds. In a typical packet-switched network, the switch must process three layers of OSI protocols before sending data on to its intended destination. The intelligence required to process these "heavyweight" protocols causes switching delays of 50 to 100 milliseconds, which are tolerable when data is traveling at a rate of 64,000 bits per second. As rates improve to hundreds of millions of bits per second and higher, the delays could create the equivalent of rush-hour traffic jams. In Kobayashi's view, protocols for broadband networks will have to be simpler to enable fast response times and to avoid overtaxing the processing capabilities of switches, as well as
of computers communicating on the network. "Lightweight" protocols, which may trade some of the functionality of OSI standards for greater speed, should be a focus of research, he suggested, adding that some of the tasks now performed by software may be more efficiently carried out by hardware. In fact, switching hardware itself may have to change. Processing and transmission demands may eventually exceed the speed and capabilities of silicon-based integrated circuit electronic devices. Faster gallium-arsenide semiconductors would yield substantial improvements in switching performance, but in the long term, photoelectronic, photonic, and perhaps even superconducting devices may be the sources of the revolutionary advances in switching technology that many think will be needed to meet future networking demands.18 Moreover, current protocols assume that errors and data losses are most likely to originate in the network, and procedures have been devised to guard against such problems. But with a high-speed optical-fiber network, Kobayashi explained, "the traditional notion of error due to noise in the channel is less relevant than the possible loss of packaged information due to the finite memory space in the buffer," where data awaiting transmission is temporarily stored. Also problematic, he said, are network control and management. Today, feedback mechanisms are used to ease network congestion: A message is sent to the sending terminal, which then slows its transmission or waits until the bottleneck is cleared. But with gigabit transmission rates, streams of data packets already will be in transit by the time the network can alert the sending terminal to its traffic problems. This means that, instead of reactive or feedback control, methods of "predictive or proactive control" are likely to be needed.
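The scale of the problem can be sketched with back-of-the-envelope arithmetic: by the time a "slow down" message crosses the network, the sender has already poured out a flood of data. The 50-millisecond round trip below is an assumed figure chosen to match the switching-delay range cited above:

```python
# Back-of-the-envelope sketch (my own, not from the report) of why feedback
# congestion control breaks down at gigabit rates: data keeps arriving for
# the entire time it takes the congestion warning to reach the sender.

def bits_sent_before_feedback(rate_bps, round_trip_s):
    """Bits already committed to the network before the sender can react."""
    return rate_bps * round_trip_s

# At 64,000 bits per second, a 50-millisecond warning lag is tolerable:
print(bits_sent_before_feedback(64_000, 0.05))       # 3200.0 bits
# At one gigabit per second, the same lag strands 50 million bits in transit:
print(bits_sent_before_feedback(1e9, 0.05) / 1e6)    # 50.0 million bits
```

This is the arithmetic behind Kobayashi's call for predictive rather than reactive control.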
"In other words, you have to predict the congestion a few milliseconds or nanoseconds ahead of time and then take some corrective action to limit the transmission of your information," Kobayashi explained. Integrating Communications Media In recent years, wireless communication services have proliferated, placing heavy demands on the already crowded frequency bands of the radio spectrum used for, among other things, radio, television, satellite-to-ground telephone services, cellular and mobile telephones, and pagers. Pending uses for the over-the-air communication channels include computer-to-computer radio connections and personal communications networks. With some exceptions, these services stand apart from those offered over copper wire and fiber-optic cable—public telephone service and data communications. For example, the fast-growing cellular telephone industry, which has
seen its subscribership increase dramatically to 5.3 million in 1990,19 and the public switched network are evolving as separate islands rather than as complementary components within a larger communications environment. Observers have called for a reexamination of communication options, with the ultimate aim of integrating now-fragmented wireless and wire-line services into a coherent framework for internetworking. Some would go even further. For example, there have been suggestions for transmitting television signals by means of fiber-optic cable, rather than over the air. This possibility is created by proposals for an all-digital high-definition television (HDTV) standard for the United States. (The U.S. Federal Communications Commission is evaluating six proposals for an HDTV transmission standard, four of which are all-digital systems. The commission is expected to make its selection in 1993.) The large segment of the spectrum now consumed by private and public television stations would be freed for other uses, and at the same time, the new means of delivering video entertainment might create levels of consumer demand sufficient to justify extending fiber-optic cable to U.S. households, completing all the connections necessary for a nationwide broadband communications infrastructure.

SOFTWARE

Part and parcel of many issues and obstacles confronting telecommunications carriers and service providers are those related to the design, development, and maintenance of software. As Aho of Bellcore suggested with his predictions of problems associated with OSI protocols, complex standards are underlain by complex software.
Even in a single self-contained network, general management and housekeeping tasks, such as routing, addressing, error checking, and protocol implementation, may account for more than 90 percent of the data traffic.20 The concerns raised by Aho are part of a set of issues collectively referred to as the "software problem." The manifestations of the problem are several. Advances in hardware have outraced the ability of software designers and computer programmers to develop applications that exploit the capabilities of new devices. New applications are often beset with errors, and, in the opinion of many observers, they are difficult to use. Modifying old programs and databases to work with new hardware or new software is time consuming and expensive, exacerbating a backlog of new applications awaiting development. Applications developed to work with one operating system do not easily transfer to another. Consequently, one frequently cited testimonial to technological progress—that today's personal computers are equivalent to the mainframes of less than a decade ago—is diminished by the fact that available software takes far less than full advantage of the
raw processing power of the modestly priced machines now sitting on millions of desks. The trend toward open systems, improvements in tools to aid designers and programmers, and other developments have paid dividends. Nonetheless, software design and development have proven especially resistant to efforts to transform these activities into a structured engineering discipline, a transformation that, many believe, would contribute directly to progress in distributed network computing. To many, the process is a curious combination of art and craft, particularly during the early stages of conceptualization and design, and of laborious writing of programming code.21 "Software may be the limiting factor," said Larry Druffel of Carnegie Mellon University's Software Engineering Institute. "But remember the other side of that: it is the enabling factor as well. . . . Obviously, if we are going to do systems integration [on a national scale], we really have to deal with the software issues because that is, after all, what allows us to change things after they are built. That is what it is all about." Colloquium participants examined some of the hurdles that must be overcome to improve software design and development and to manage the increasing complexity inherent in integrating information systems.

Measures of Complexity and Performance

Even in small-scale projects, according to Aho, systems integration poses the challenge of transforming abstractions, such as improved customer service or product quality, into concrete functions, usually embodied in software applications. Consider one of today's popular user interfaces, a graphical model of the desktop: It is an abstraction that transforms a computer screen into the most common of work environments.
In the process of creating systems of systems, each successive layer of integration introduces new abstractions underlain by higher levels of complexity, Aho explained. It becomes increasingly difficult, he added, to assess the performance and effectiveness of systems. Consequently, integration strategies that work well in the context of a single system may be undermined by unanticipated interactions and problems that arise as the scale of internetworking grows. "I am a great fan of looking at problems in a systems context," Aho said. "So, the one thing I would advocate to my fellow researchers is that when you do your work, do the work in a systems context. Ask, does this scale up to medium-sized systems and to large-sized systems? We are making substantial investments in [integrating systems], and we would like to be able to design systems that are not the [equivalents] of Three Mile Island." The design of integrated systems, he suggested, would benefit greatly
from measures that could help integrators and researchers define the complexity and scale of their undertakings. Similarly, Aho advocated developing measures for gauging the performance of systems, such as their cost-effectiveness, how easily they can incorporate new technology and accommodate new applications, and how they respond to accidents and failures. Implicit in Aho's remarks is the importance of an interdisciplinary perspective. "My theory is that, if you do not look at the sum of the performance measures and the underlying architecture, then there will be untoward events in the future, which will cause the systems to behave adversely [and very badly]," Aho explained. "We would like to have some kind of performance guarantees in the running of systems that they will respond gracefully to adverse inputs. We have lots of examples of systems that have behaved badly under certain conditions." One method being used with increasing effectiveness in the marketplace to measure the quality of systems integration, software, and hardware is the Service-Level Agreement (SLA), a contract between the providers of services and the users. The SLA specifically lists measurable attributes such as "up-time," response time for user transactions, cost per user hour, and other quantitative items that can be used by both the provider and the consumer to evaluate the quality of service. Definition of an SLA also helps in setting expectations for the system at the beginning of the development and integration process.

Software Architecture

The traditional definition of architecture—the art and science of designing and erecting buildings—continues to dominate thinking in systems integration, according to Druffel of Carnegie Mellon University.
That is, systems designers tend to think in terms of configurations of hardware, just as building architects translate their concepts into specifications for components made of iron and steel. "That is kind of what we have been dealing with over the years," Druffel said. "First, we have a hardware architecture and then the software guys have to figure out some way to make all that work. As we think about systems architectural issues, we really need to think about the complementary software architectures. From a software perspective, I would like to know, for example, the functional view, the control structures, the expected behavior, how [system] states are stored and communicated, and what the dependencies are." Druffel and others underscored the need for standards (see below, "Software Standards") and other mechanisms that foster a shared perspective on software architecture and better communication among dispersed groups
involved in developing systems software. At their current stage of development, object-oriented programming techniques, CASE tools, and "layered abstractions," such as the OSI model, represent only a partial response to the need, Druffel said, and, in some cases, they interject additional confusion. "Each of these approaches reflects a different notion about what services it provides," he explained. "If you are going to write software that tries to interconnect them, it gets rather messy." "A good example of the problems you run into when people are building tools or building components that are later going to be integrated," he added, "are those that arise from different views of data ownership." Often, it is not clear which group within a distributed computing environment has responsibility for database control and management, or several groups may assume authority. In either case, Druffel said, configuration management—essentially, organizing the construction of complex software from separate pieces—and control of software versions can be problematic. Moreover, system components may be designed by people who differ in their views of data ownership, resulting in different models of how the overall system and its pieces should function. When the time comes to integrate these differently conceived components, difficulties often arise. The need for a shared architectural view is magnified by the goal of developing systems that can evolve with technology, a recurring colloquium theme. However, this goal can confound software development. "When we are talking about software," Druffel said, "we are talking about components that are going to be developed probably without complete knowledge of how they are going to be used, and we are talking about systems that have to evolve.
That is, they are going to be composed of units without a complete understanding of what that end system may eventually be, which adds a little bit of complication to the issue." The added criterion of evolvability presents software developers with the task of planning for all technological possibilities. At best, developers can only approximate future developments. So rather than being a rigid scaffolding that greatly restricts future options for expansion, software architecture must have a high degree of flexibility. Adoption of new technology and new applications should not require razing the previous system and starting anew. "We know that over the next 10 years," Aho said, "progress in microelectronics is going to make computer chips faster, memories bigger, and bandwidths greater, and we should not forget about algorithms and other procedures that will be making our software faster. Are systems designers going to be able to accommodate these improvements?" Aho suggested that Japanese methods of incremental quality improvement may be applicable to systems design. "Proof of progress over time," he said, should be a hallmark, but operationalizing a model of quality
improvement poses a "very difficult challenge for the systems design community."

Software Standards

Standards are an essential element of a coherent software architecture. Unfortunately, promulgation of common conventions and widely accepted platforms for linking applications and other software elements is as prone to delays and other complications as is standards making in the telecommunications area. In fact, development of software-related standards to achieve true interoperability of applications may be a more formidable task, beset by more interdependencies and greater levels of detail. To convey this complexity, Druffel provided a small list of topics awaiting internetworking standards: data bindings, language bindings, bindings between standards, network management, and security. Performing an application on a distributed computing network may invoke standards in all these areas, as well as additional ones. Moreover, each area presents its own peculiar set of issues. For example, a standard that is intended to bridge the incompatibilities between different programming languages must not only serve as a translator but also resolve other disparities. "It is not just a matter of syntactically communicating between two languages," Druffel explained. "They really do have different run-time expectations. Under one language, a default may mean something entirely different from what it means in the other language. These kinds of issues have to be dealt with as well." Consider also all the eventualities and interdependencies that must be addressed to enable group collaboration on multimedia documents that combine, for example, voice input, graphics, digitized photographs, handwritten notes, spreadsheet tabulations, and keyboard input.
The capability to exchange heterogeneous types of information between points anywhere along a network and to combine the elements of this diverse compilation according to the whims of users will require an array of standards and common conventions. But if these standards require users to perform laborious routines and entail time-consuming steps, the benefits of collaboration will be diluted.

Converging Software Interests

Since networking, in general, and interoperability, in particular, are the major motivations for standards making in these and many other technical areas, distinctions between what is a computing standard and what is a communications standard have become almost artificial. However, the landscape of national and international standards making is divided between the
two realms and then subdivided among industry domains within each. (See the appendix below for a brief description of major standards-making bodies.) Although some consolidation of activities and interests has occurred in recent years, it has not matched the rates of consolidation occurring between the communications and computer industries and between the information services and applications supported by those industries. "Over the last 20 years," Kleinrock of UCLA has maintained, "the innovations in data networking have come from the data-processing industry, and not from the carriers. This is in spite of the fact that the data-processing solutions have used the underlying carrier plant to establish their data networks. As we move into the broadband era, it is essential that these two merged industries cooperate in providing service to the user community."22 Many of the obstacles that stand in the way of advanced networking will require software solutions. For example, it has been estimated that the scale of software supporting the next generation of switching systems in the evolution of ISDN will be 10 times greater than that supporting the current generation.23 Given the magnitude of the need for reliable, high-quality software to accomplish complex networking tasks, collaborative development of software by representatives of the computer and communications industries, it has been suggested, may be the most efficient way to address the need.24 No matter what label is assigned to a networking issue—communications or computing—the ultimate aim is to meet the needs of the same large set of users. The overriding goal of standards makers and of software developers who build on those standards, Druffel reminded, should be simplicity and ease of use—"to simplify rather than to increase the complication."
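Druffel's earlier point about language bindings, that a default may mean entirely different things under different languages, can be sketched in a few lines. The example below is purely illustrative (the field names and convention labels are invented): a binding layer between two components must resolve run-time semantics explicitly rather than merely translate syntax.

```python
# Hypothetical illustration: two components disagree about what an
# absent field means. In one (modeled on languages whose numerics
# default to zero) a missing value is implicitly 0; in the other,
# missing means "never set." The binding must choose semantics.

NUMERIC_FIELDS = {"retry_count", "timeout_ms"}  # invented field names

def bind_record(record, convention):
    """Normalize a record for a target component's default convention."""
    bound = dict(record)
    for field in NUMERIC_FIELDS:
        if field not in bound:
            # 'zero-default' targets expect a concrete 0; 'explicit-unknown'
            # targets expect None so downstream code can detect "unset."
            bound[field] = 0 if convention == "zero-default" else None
    return bound
```

The point is not the mechanism, which is trivial here, but that every such disparity must be cataloged and resolved somewhere, which is why bindings standards are among the topics Druffel lists as still awaiting agreement.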
APPENDIX: STANDARDS MAKING AT A GLANCE

The most wonderful thing about standards is that there are so many from which to choose. —Original author unknown

Worldwide, more than 7,000 professionals working in some 250 subcommittees of standards-setting bodies are involved in promulgating, testing, and formalizing standards for information technology.25 These numbers attest both to the enormous size of the task and to the welter of detail that must be addressed in enabling electronic access to information. Within this domain of national and international standards making, much of the activity focuses on protocols for communicating within and across networks. The following discussion provides an overview of the organizations involved in setting standards for information networking.26 Because the complexity and the scale of the effort are often not appreciated, it might be instructive at the outset to give some indication of the extensive body of information- and communication-related standards that already exists. At its last quadrennial meeting in 1988, the International Telecommunications Union (ITU), one of three major international bodies that accredit standards for telecommunications and information technology, affirmed or adopted nearly 1,600 standards that were documented in nearly 20,000 pages of text.27 The ITU is an organization of the United Nations. The standards-making arena is actually a composite of many different playing fields. Individual firms—typically, hardware and software manufacturers and communications carriers and other major service providers—represent the smallest domain of activity, followed by trade associations, user groups, and other groups that may form to back a single standard, developed collectively or selected, perhaps, from several offered by member organizations. In the formal, or de jure, standardization process, these organizations propose standards that are submitted for national accreditation. Such proposals are usually assigned to subcommittees of the private, nonprofit American National Standards Institute (ANSI), the principal promulgator of U.S. standards. ANSI may form a special subcommittee to develop a standard for formal approval, or it may assign the task to a so-called secretariat. ANSI secretariats, usually professional or trade associations with expertise in the area (e.g., the Institute of Electrical and Electronics Engineers, the Electronic Industries Association), may provide administrative support to subcommittees.
For example, a number of communications-related standards, including several concerning ISDN, have been developed by the T1 committee sponsored by the Exchange Carriers Standards Association. At the international level, ANSI serves as the U.S. arm of the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC), which accredit international standards for information technology and its applications. For example, the OSI reference model and many of its supporting standards, such as those for transferring files over a network or for computer-integrated manufacturing, were developed under the auspices of the ISO. Along with those submitted by other nations, ANSI-accredited standards are reviewed by the ISO/IEC Joint Technical Committee, which develops standards for formal approval by its parent bodies. International telecommunications standards, such as those pertaining to electronic mail and directory services, as well as more traditional standards like radio-spectrum allocation, are the product of a separate process, dictated by the international treaty that formed the International Telecommunications Union (ITU). Technical specifications supporting the implementation of the ISDN standards are being developed by ITU's Consultative Committee for International Telephony and Telegraphy (CCITT). U.S. telecommunications standards are introduced to the ITU for international approval by the State Department, which serves as the representative of U.S. industry and is advised by ANSI. The federal government also wields considerable influence in the promulgation and adoption of standards. One source of this influence is the government's huge purchasing power. For example, the federal government has stipulated that it will only purchase information technology products that adhere to standards supporting the OSI architecture, specifically those defined by the Government Open Systems Interconnection Profile (GOSIP), published as Federal Information Processing Standard 146. Another source of influence is the National Institute of Standards and Technology (or NIST, part of the Department of Commerce). Its Computer Systems Laboratory (CSL) participates in over 85 national and international voluntary standards activities (e.g., the North American ISDN Users' Forum, the U.S. National Committee for CCITT, and the IEEE Standards Board and committees). A recent independent review of the CSL's activities observed the following:

For many long years, the CSL and its predecessor, the Institute for Computer Sciences and Technology (ICST), have waged a long and lonely, but effective, battle on behalf of federal government agencies whenever agency interests were opposed to private market segmentation interests of a variety of vendors in the computer and communication industry. Suddenly, CSL finds itself in a world where both users and vendors are discovering that good standards are in their interest, at least for those areas where innovation is not too lively.
This shift represents a major environmental change for CSL and it requires a rethinking of the successful approaches of the past decade or so.28

However, in light of these profound environmental changes that open-systems standards portend, the review panel concluded that the CSL's budget is "much too small to address effectively even a small fraction of the topics relevant to CSL's mission."29 Despite the realities of this budget situation, NIST is extensively involved in promoting information technology standards. Straddling the formal standardization process are de facto and ad hoc initiatives that seek to establish a particular product or implementation as a standard on the basis of its strong position in the market. The IBM PC, Microsoft's MS-DOS operating system, and Adobe's PostScript page-description language for laser printers are examples of products that have become de facto standards. Users are increasingly trying to leverage their purchasing power to hasten the adoption of standards without waiting for cumbersome formal
standardization processes to reach agreement on particular specifications or for the market to cull the array of competing alternatives. Examples include the federal government, as in the case of its GOSIP OSI standards, and the group of manufacturers that organized the Information Technology Requirements Council. This latter organization, together with its highly visible Manufacturing Automation Protocol/Technical and Office Protocols (MAP/TOP) Users Group, has recently merged with the Corporation for Open Systems (COS), a trade association whose goal is to bring users and vendors together to promote the utilization of open-systems technology. The British-based X/Open Company Ltd. is another organization that combines vendors and users to advance standardization and interoperability, using, like COS, product testing and certification.30 In January 1991, the User Alliance for Open Systems (which includes representation from over 30 major system users, including such well-known organizations as Eastman Kodak, DuPont, General Electric, and NASA) also joined COS to leverage their mutual interests in the implementation of common requirements for open systems.31 Overall, there are at least 20 user alliances devoted to accelerating the standards-setting process. Growing appreciation of the importance of standards and the increasing number of initiatives it has generated are welcome developments, expected to free users from the pitfalls of proprietary solutions to their information technology and communications needs. But the proliferation of standards-making activities by ad hoc groups that have grown impatient with formal mechanisms introduces a new wrinkle, according to Michael Taylor of the Digital Equipment Corp. "A complication, of course, is that it is no longer absolutely clear what standards body is responsible for what," he said.
"So, in some sense, we have both an opportunity to do things faster and a new problem, which is mediating among this more complex and disparate group of standards bodies."

NOTES

1. The major consequences of even "minor" software errors in such systems were dramatically illustrated in the July 1991 outages of telephone networks in several large metropolitan areas, including Washington, D.C., and Los Angeles (Andrews, Edmund L. 1991. "String of Phone Failures Perplexes Companies and U.S. Investigators," New York Times, July 3, p. A1). See also National Research Council. 1989. Growing Vulnerability of the Nation's Public Switched Networks, National Academy Press, Washington, D.C.
2. Hargrave, Andrew. 1987. "Communications: Towards the 21st Century," supplement to Scientific American, October, p. T14.
3. Committee on Physical, Mathematical, and Engineering Sciences, Federal Coordinating Council for Science, Engineering, and Technology, Office of Science and
Technology Policy. 1991. Grand Challenges: High Performance Computing and Communications, Supplement to the President's Fiscal Year 1992 Budget, p. 54.
4. Kleinrock, Leonard. 1991. "ISDN—The Path to Broadband Networks," Proceedings of the IEEE, Vol. 79, No. 2, February.
5. Ferguson, Charles H. 1989. "HDTV, Digital Communications, and Competitiveness: Implications for U.S. High Technology Policy," VLSI Memo No. 89-506, Massachusetts Institute of Technology, Cambridge, Mass., February.
6. Wright, Karen. 1990. "The Road to the Global Village," Scientific American, March, pp. 83–94.
7. Herbst, Kris. 1989. "Getting Graphic," Network World, July 31, pp. 30, 32, and 45.
8. Feder, Barnaby J. 1991. "Optical Fiber (Almost) at Home," New York Times, March 24, p. F-6.
9. Slutsker, Gary. 1991. "Divestiture Revisited," Forbes, March 18, p. 124. Coy, Peter, and Mark Lewyn. 1991. "The Baby Bells Learn a Nasty New Word: Competition," Business Week, March 25, p. 100.
10. For example, Time-Warner Inc. has announced that it will run optical fiber to between 5,000 and 10,000 homes in a section of New York City. Initially, the company will offer customers 150 channels and two-way communication features, but the system could be expanded to 600 channels. With additional hardware, the fiber-optic system could also provide telephone service. (Kneale, Dennis. 1991. "Time Warner Plans Cable-TV System with 150 Channels," Wall Street Journal, March 8, p. B-10.)
11. Much of the discussion in this section is based on Kleinrock, Leonard. 1991. "ISDN—The Path to Broadband Networks," Proceedings of the IEEE, Vol. 79, No. 2, February, pp. 112–117.
12. U.S. Department of Commerce, National Telecommunications and Information Administration. 1991. The NTIA Infrastructure Report: Telecommunications in the Age of Information, NTIA Special Publication 91-26, U.S. Government Printing Office, Washington, D.C., October.
13. Keller, John J. 1991.
"Standards Set for Data-Voice Phone Service," Wall Street Journal, February 26, p. B-7.
14. U.S. Department of Commerce, National Telecommunications and Information Administration, 1991, The NTIA Infrastructure Report: Telecommunications in the Age of Information.
15. Kleinrock, 1991, "ISDN—The Path to Broadband Networks."
16. Kahn, Robert E. 1987. "Networks for Advanced Computing," Scientific American, October, pp. 141–142.
17. Gantz, John. 1989. "Standards: What They Are. What They Aren't," Networking Management, May, p. 33.
18. Kleinrock, 1991, "ISDN—The Path to Broadband Networks," p. 116; and Optoelectronics Technology Research Corporation. (no date). Key Technology for the 21st Century, Tokyo, p. 1.
19. Northern Business Information. 1990. "U.S. Cellular Markets," December, p. 55.
20. Gantz, 1989, "Standards: What They Are. What They Aren't," p. 32.
21. Computer Science and Telecommunications Board, National Research Council. 1991. Intellectual Property Issues in Software, National Academy Press, Washington, D.C.
22. Kleinrock, 1991, "ISDN—The Path to Broadband Networks," p. 115.
23. Kobayashi, T., former president and chief executive officer of the NEC Corp., as quoted at World Telecommunications Forum, Singapore, May 1985. (Cited in Hargrave, 1987, "Communications: Towards the 21st Century," p. T14.)
24. Ibid.
25. Gantz, 1989, "Standards: What They Are. What They Aren't," p. 27.
26. For a detailed discussion of relevant standards-related issues, readers may wish to consult Crossroads of Information Technology Standards, National Research Council (National Academy Press, Washington, D.C., 1990).
27. Dorros, Irwin. 1990. "Can Standards Help Industry in the United States to Remain Competitive in the Market Place?" P. 35 in Crossroads of Information Technology Standards, National Research Council. Gantz, 1989, "Standards: What They Are. What They Aren't," p. 28.
28. National Research Council. 1991. An Assessment of the National Institute of Standards and Technology Programs: Fiscal Year 1990, National Academy Press, Washington, D.C., pp. 303–304.
29. National Research Council, 1991, An Assessment of the National Institute of Standards and Technology Programs: Fiscal Year 1990, p. 304.
30. Verity, John W., et al. 1991. "Computer Confusion: A Jumble of Competing, Conflicting Standards Is Chilling the Market," Business Week, June 10, pp. 72–77.
31. User Alliance for Open Systems. 1991. Overcoming Barriers to Open Systems Information Technology, Corporation for Open Systems, McLean, Va., January 27.
4 The Next Tier: Building Systems of Systems

BROADENING PERSPECTIVE

In discussions of systems integration, even the most intent listeners can develop a kind of information-age vertigo. One must adjust not only to the dizzying pace of technological advance, but also to the political, social, regulatory, economic, interindustry, and international issues that swirl around emerging applications of information networking technology. Disorientation is almost inevitable when, for example, one tries to determine the roles of communication carriers and service providers, an amalgam of local telephone monopolies and emerging competitors, long-distance carriers, equipment makers, and suppliers of information services that operates in a complex, multi-jurisdictional regulatory environment. In short, the technological complexity inherent in systems integration is subsumed by other types of complexity that arise as the scope of networking grows from small—a single firm—to intermediate—a group of firms—to large—an entire nation and beyond. Perhaps the most salient question at this stage in the evolution of distributed networked computing is, Where do we want to go from here? Building on the previous one, this chapter addresses the question by describing the necessary attributes of information networks as they evolve into "systems of systems" and, as has been predicted, as virtually all types of business and organizational activity go "on line." It also outlines some of the issues arising from the convergence of industries wrought by the digitalization of information.