Applications of Integrated Systems: Evolution in Concept and Practice
Ask a dozen practitioners what systems integration is, and you will probably get as many different definitions (see Box 2.1). Ask that same group what systems integration is intended to accomplish, and the answers are likely to be more uniform. The standard response is "solutions" or "application solutions." For people who are not steeped in the field, however, that answer prompts a fundamental question: What's the problem in the first place?
Typically, the set of needs and problems that an integrated computing and communications system addresses is unique to an organization. At snack food maker Frito-Lay, for example, the challenge was to develop an information system that helped the company leverage the advantages and efficiencies that accrue to its large size and yet enabled the national firm to maneuver flexibly in local markets, where it sells the 5 billion packages of snack food that generate annual revenue of $4.5 billion. The company's integrated system, widely used as an example of an effective corporate information system, links all parts of its operations. The system has allowed Frito-Lay "to be decentralized in its marketplace activities," explained Charles S. Feld, vice president for management information systems, "and at the same time, leverage our whole manufacturing and logistics system. We have been able to do that through the use of information and technology, primarily by providing information to the people that have to make decisions lower in the organization." (See Box 2.2.)
MOTIVATIONS FOR SYSTEMS INTEGRATION
Despite the great variability in the issues and needs organizations seek to resolve with applications of information technology, there are at least five general categories of motivations for investing in systems integration.
Box 2.1. Definitions of Systems Integration
"Fulfilling a practical objective through the assemblage of diverse component technologies and disciplines that are critical to each other's success. It is a teaming of technology components that results in high synergy."
—Jeffrey M. Heller, Senior Vice President, Electronic Data Systems
Systems integration is "process innovation—to simplify basic business operations, to compress the time they require, and to narrow the gap between the product or service and the customer."
—W. James Fischer, Managing Partner, Andersen Consulting
"People building upon existing components to satisfy a customer's need."
—Robert L. Martin, Vice President for Software Technology and Systems, Bell Communications Research
"Effective integration implies a system-level architecture that permits the integration, or connection, of system components and permits later integration of unplanned components. Effective integration also implies an integrating mechanism that permits components to share data. It implies an overall model that permits the user to understand what the system is doing. It implies a constant user interface. It also requires integration of the functions of the applications that the system supports."
—Larry E. Druffel, Director, Software Engineering Institute, Carnegie Mellon University
"It started with technology—putting bits and pieces together—and grew into managing information. It has grown into understanding processes and now, I believe, it is getting into understanding the human element of what we are trying to accomplish."
—Michael Taylor, Central Systems Engineering Manager, Digital Equipment Corporation
"Solving a problem efficiently, recursively; giving disparate components a single-system look and feel. . . We should keep [integrated information systems] simple so that we can maintain them and use them, and we should keep them affordable so they will deliver the greatest value to the largest number of people."
—Alfred V. Aho, Assistant Vice President, Bell Communications Research
Box 2.2. The Frito-Lay Information System
If an executive at Frito-Lay headquarters near Dallas wants to know how his company's products are faring on supermarket shelves in Boston, or the price of a corn futures contract at the Chicago Board of Trade, or the fuel efficiency of the company's delivery fleet, that information is instantly available. In fact, current information on virtually every aspect of the snack food maker's operations—manufacturing, purchasing, warehousing, distribution, marketing, sales, management, and research—is easily retrieved with the company's information system and presented in the level of detail desired.
Widely cited as a model of an effective corporate information system, Frito-Lay's computer network has established itself as the company's most important strategic and competitive tool. Executives say it is a requirement for business survival in the 1990s.
A national company competing against local and regional snack food manufacturers for shelf and display space in more than 400,000 stores, the subsidiary of Pepsico Inc. has used its comprehensive intelligence to transform itself into a "micromarketer" that enjoys the economies of scale that accrue to a multibillion-dollar enterprise.
"We learned how to handle the volume, we learned how to handle the speed over the years," explained Charles S. Feld, Frito-Lay's vice president for management information systems. "But what has happened to us is, our marketplace has gotten very complex. Boston is now very different from Chicago, very different from California; supermarkets are different from convenience stores; and products are differentiated by flavors and bag sizes. The world is no longer one size or one color of jeans. Everybody wants diversity. So we have had to figure out a way to leverage our size and prowess in the marketplace and still be able to compete on a very targeted basis."
The company's 10,000-person sales force provides the information that is key to Frito-Lay's nimble performance in local markets. Equipped with handheld computers, a sales person keys in orders during sales calls and furnishes customers with a printout, an on-the-spot sales receipt with tax, discounts, and promotions included. At the end of each day, the sales force electronically transmits sales, orders, and other information to the headquarters mainframe computer. The next morning, the sales people link up again with the mainframe to receive the day's routing and scheduling information.
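The daily cycle described above is a classic store-and-forward batch architecture: orders accumulate locally on each handheld, are uploaded in one nightly transmission, and routing instructions flow back the next morning. The following sketch illustrates the pattern only; the class and field names (`Handheld`, `Mainframe`, `Order`) are hypothetical and do not describe Frito-Lay's actual system.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Order:
    store_id: str
    product: str
    quantity: int
    unit_price: float

    def total(self) -> float:
        return self.quantity * self.unit_price

@dataclass
class Handheld:
    """Hypothetical model of a sales rep's handheld terminal."""
    orders: List[Order] = field(default_factory=list)

    def key_in(self, order: Order) -> str:
        """Record an order and return an on-the-spot receipt line."""
        self.orders.append(order)
        return (f"{order.store_id}: {order.quantity} x "
                f"{order.product} = ${order.total():.2f}")

    def end_of_day_upload(self, mainframe: "Mainframe") -> None:
        """Batch-transmit the day's orders, then clear local storage."""
        mainframe.receive(self.orders)
        self.orders = []

@dataclass
class Mainframe:
    """Hypothetical headquarters system aggregating nightly uploads."""
    ledger: List[Order] = field(default_factory=list)

    def receive(self, orders: List[Order]) -> None:
        self.ledger.extend(orders)

    def morning_route(self, store_ids: List[str]) -> List[str]:
        # A real system would optimize the route; this sketch just
        # returns the stops in a deterministic order.
        return sorted(store_ids)
```

The design choice worth noting is that the handheld never needs a live connection during the sales call: all headquarters-bound data moves in one nightly batch, which is what made the scheme workable over the dial-up links of the era.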
With the daily-updated information from the field, the company can track performance in precise detail, down to the sales movement of an individual product in a single store. As a result, managers say they know more about local marketing conditions than their competitors do, and they can devise sales strategies accordingly.
Before the networked information system was introduced, the company's marketing strategy consisted of two or three national initiatives, formulated at the top of the organization. Now, with the benefit of detailed knowledge of local markets, middle managers generate and execute some 300 sales-building ideas each year. While top management oversees these local and regional marketing initiatives, the shift in tactical responsibilities permits executives to focus more on the company's strategic direction.
Executives at the 60-year-old company, which employs 26,000 people, cite effective teamwork across the organization as the primary advantage of the information system. Benefits have been realized in many forms. Electronic data entry by the sales force eliminated time-consuming paperwork, saving between 30,000 and 50,000 hours each week. The company was able to consolidate 400 sales routes even as its annual sales volume increased to $4.5 billion, from $3.7 billion. Improved tracking of product movement reduced the number of "stales" (products that have exceeded their shelf lives) from 2 percent of sales volume to less than 1 percent, resulting in annual savings of $39 million.
Designed to accommodate changing needs and new applications, the information system continues to evolve and increase its strategic and operational value to the company.
Feld attributes the success of the system to three enabling factors: "a business proposition worthy of the investment" in information technology, systems integration skill, and "the will to see the job through during tough times."
For many organizations, experiences with information technology have not lived up to expectations. American business invested billions in computing and communications technology during the 1980s.1 At the start of the decade, the nation's inventory of computer terminals totaled about 4 million; as of today, some 75 million IBM-compatible personal computers alone have been sold, and half of all U.S. office workers have a computer on their desks.2 Many firms realized significant benefits. But a large number did not, or at least returns to their investments were not commensurate with initial expectations. One major disappointment was the negligible improvement in the productivity of the service sector (although gains were made in the quality and diversity of service output), which has been estimated to account for about 85 percent of the nation's total stock of information technology items.3
Hindsight offers some valuable lessons. The most obvious, of course, is that merely possessing technology, regardless of its capabilities, does not
translate automatically into an organizational asset. As simple as it may seem now, this lesson is actually the product of evolutionary changes in technology and, most important, in understanding the role of that technology.
The proliferation of information technology products and vendors has produced the need for connectivity and interoperability. Many organizations own heterogeneous collections of computing and communication equipment, purchasing, for example, one vendor's machines for engineers, another's for administrative support staff, and yet another's for managing large databases and on-line transaction-processing activities. Because of incompatibilities in operating systems and other vendor-specific peculiarities, dissimilar machines, and the people who use them, have functioned in isolation. Exchanging information between these computing islands entailed laborious translation procedures or, worse yet, manual reentry of information.
This state of affairs led to user demand for connectivity, a means to let unlike systems perform at least rudimentary tasks such as exchanging files. But connectivity was not enough. Users also wanted their hardware and software to be interoperable, that is, to make applications, information, and peripheral devices easily accessible to any computer, regardless of who made it and what operating system it used.
An installed base of information technology has to accommodate new technology and new capabilities. The need to combine the old with the new is a perennial source of headaches for managers of information systems. Firms that have invested vast sums in information technology cannot simply jettison that investment and start anew with each successive wave of commercial innovation. Nor would they want to even if they could afford it. The databases and applications embedded within existing information systems are often described as the "corporate jewels," strategically important assets that are integrally related to the firms' operations. Moreover, the value of existing information and programs can be increased greatly when integrated with the capabilities of new hardware and software.
Advances in technology, combined with growing appreciation of what can be accomplished with that technology, have prompted firms to search for new applications and sources of competitive advantage. Although the computer—or, more appropriately, the ever-growing family of digital technologies—may not yet merit the title of universal machine, growing appreciation of its potential is inspiring organizations to apply the technology in new ways and, in so doing, pursue new business opportunities. Just as important, new capabilities in computing and communications are motivating individual firms and groups of firms—suppliers and customers and even industries—to reevaluate their entire way of doing business.
In an increasingly global economy, firms must rely on telecommunications and information technology to manage and coordinate their operations and to stay abreast of international competitors. The ability to communicate and transmit large volumes of data nearly instantaneously facilitates closer linkages with foreign subsidiaries, suppliers, and customers, but it also telescopes the time organizations have to respond to changes in international markets and to the actions of competitors. If, for example, a manufacturing firm's competitors can place electronic orders with foreign suppliers or change the specifications for a part and transmit the new design immediately to a collaborator located half the world away, then that firm must also have the same capabilities just to keep pace with the competition.
None of these motivations stands entirely apart from the others. Collectively, they are driving businesses and other organizations to use their information technology innovatively and effectively. Moreover, information networking technology is so intricately related to the broader phenomenon of the growing interdependency among regional, national, and global economies that the importance of its role can only be expected to increase.
DISTRIBUTED NETWORKED COMPUTING: EVOLUTION IN UNDERSTANDING
The steady stream of complementary innovations in computer and communications technology provides the building blocks of systems integration projects. Components of hardware and software are integrated into distributed computing networks, which many firms view as their "central nervous systems," the means to coordinate all elements of their operations into a synergistic whole. The result is a sprouting of electronic and optical-fiber connections that link the information age's equivalents to the pools of neurons of varying size and function that make up the central nervous system and work in concert with the brain. Centers of activity range from individual computers on a network to local area networks (LANs), which connect computers at single sites, to wide area networks (WANs) that link LANs or individual machines across a region, a nation, or the world.
While networks of machines and devices are the ostensible manifestations of the trend toward distributed computing and communications, the most significant connections, according to colloquium participants, are those between people and organizational units using linked devices. It is at the level of the worker that systems integration and distributed computing should have their greatest impact, many asserted.
This expectation is markedly different from the notions of automation that have heavily influenced computer applications since the 1960s. In the
early days of commercial computing, companies used the technology to automate "simple stand-alone processes," explained Gerard R. Weis, senior vice president at Sears Technology Services Inc. "We would do things like capture data, keypunch it, and report on it simply to compress the time from the time we got the data until we produced a report and to obtain operational cost savings by reducing the number of people who manually produced those reports."
In the following decade, as Weis and other participants recounted, firms began to link automated processes within some units of their business. The result was what has been described as archipelagos of automation created from islands of automation.
"The 1980s," said Weis, drawing on his own company's experiences, "saw us take two divergent paths. . . . On the business side, we focused on data integration and data-based management so that we began to tie together information in the various lines of businesses. On the technology side, we focused on pushing down the cost of running those systems and on making sure that the infrastructure would support the data integration that was going on at the applications level."
What many firms have learned during this 30-year evolution is that automating business as usual did not tap the most significant competitive advantages that can be achieved with information technology, explained W. James Fischer, managing partner for technology services at Andersen Consulting. Whatever the gains inherent in this approach of assigning computing technology to its most obvious uses—preparing payrolls, budgets, and inventories and performing other number-crunching tasks—they were likely to be short-lived advantages because such applications are readily available to all competitors, he said.
In contrast to the past pattern of responding incrementally and, often, in piecemeal fashion to the growing capabilities of information technology, firms in the current decade may use the technology to redefine themselves. That is what Weis foresees happening at Sears. "[W]e have to reassess our business processes and our culture," he said, "and figure out then how to make the business run differently and to exploit technology in fostering that change."
SYSTEMS INTEGRATION AS "PROCESS INNOVATION"
Structural and cultural change is a formidable challenge for any firm, public agency, or other type of organization. Yet for most organizations, maintaining the status quo will likely mean that they will not realize the most significant advantages afforded by the technology, contended Fischer, whose responsibilities at Andersen Consulting include devising a comprehensive firm-wide view for applying information technology.
"[S]ystems integration," he said, "really ought to be about the business of process innovation," which, by definition, necessitates change. Process innovation entails changing the ways companies "perform their standard business functions, changing the way they manufacture, changing the way they distribute, changing the way their orders are taken, and changing the way they sell their product," he explained.
Ultimately, Fischer said, the aim is to "simplify the business": to reduce the time it takes to perform key activities and to narrow the gaps between the personnel and the functions that support those activities. Within manufacturing firms, this requires erasing barriers between design, engineering, production, marketing, and distribution. Fischer also maintained that the new cooperative links forged by systems integration should extend outside the organization and tighten relationships with suppliers and customers.
Mark Teflian, vice president and chief information officer at Covia, which operates the world's second-largest computerized airline reservation system, offered a different yet complementary conceptual framework for appraising the transformational role of networked information technology. Global competition and the rapid diffusion of technology across international borders, he said, have shortened the competitive life of most products and, consequently, collapsed the time a firm has to recover costs and generate profits that support succeeding cycles of innovation and product introductions. Increasingly, Teflian predicted, firms will regard their products as perishable goods with limited shelf lives. Seizing short-lived marketing opportunities and optimizing pricing strategies will require timely capture of information at the point of sale and rapid response to changing market conditions.
The growing value of timely information, Teflian asserted, will spur wide adoption of on-line transaction-processing (OLTP) systems. Pioneered by airlines that developed computerized reservation systems, OLTP systems provide computer users on a network with simultaneous real-time access to shared databases. Whenever necessary, users can retrieve information, change it, and enter new information, thereby updating the databases and providing others on the network with the most current information available. According to Teflian, such on-line systems offer firms the means to broaden and deepen their intelligence, a fundamental requirement for rapid and informed decision making.
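The defining operation in an OLTP system is the read-modify-write cycle executed atomically against a shared database, so that two users acting on the same record at the same moment cannot both succeed on the basis of stale information. The sketch below illustrates the idea with a reservation-style example using an in-memory SQLite database; the table name, flight code, and `reserve` helper are hypothetical, chosen only to echo the airline origins of OLTP.

```python
import sqlite3

# In-memory database standing in for the shared reservation store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE flights (flight TEXT PRIMARY KEY, seats_left INTEGER)")
conn.execute("INSERT INTO flights VALUES ('AA100', 2)")
conn.commit()

def reserve(conn: sqlite3.Connection, flight: str) -> bool:
    """Atomically take one seat if any remain; return True on success.

    The UPDATE's WHERE clause performs the check and the decrement as a
    single statement, so two concurrent callers cannot both be granted
    the last seat on the basis of the same stale read.
    """
    cur = conn.execute(
        "UPDATE flights SET seats_left = seats_left - 1 "
        "WHERE flight = ? AND seats_left > 0",
        (flight,),
    )
    conn.commit()
    return cur.rowcount == 1  # 1 row changed means a seat was taken
```

With two seats available, the first two calls to `reserve(conn, "AA100")` return `True` and the third returns `False`; every user querying the table afterward sees the same up-to-date count, which is precisely the "most current information available" property the text describes.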
ASSIMILATING INFORMATION AND ENABLING PEOPLE
As customers, software and hardware manufacturers, and system integrators reassess the roles and uses of information technology, people emerge as the most critical element and as the element most resistant to the organizational changes that systems integration fosters. Integrating people—helping them assimilate information, create, collaborate, and, in sum, work
more productively—is the highest-order task of systems integration. Ultimately, the success or failure of systems integration is determined not by "groups of technology," Teflian contended, but by how effectively networks help people "process and assimilate information." Overcoming incompatibilities between computer operating systems and other systems-engineering hurdles is a "minor problem," he said, compared with the challenge of adapting the technology to ensure that it is truly an enabling tool for the people who use it.
Systems integrators and their customers have devoted most of their attention to technical issues, which, obviously, if not resolved will impede effective use of information technology. But such issues should not obscure the fact that the true measure of information technology's value is its impact on human and organizational performance, advised Max Hopper, American Airlines senior vice president for information systems. To illustrate his point, Hopper offered as an example his company's collaboration with the French national railroad to develop a computer reservation system, called Reserv-rail. Through the systems integration project, the railroad is taking a 20-year "leap" in technology, he said, but the physical deployment of the enabling computer and communications equipment represents only a secondary part of the process. "I do not think it is a piece of hardware," he explained. "I mean, it did not make a damn bit of difference whether the computer that is used is a PC [personal computer] or a supercomputer. It . . . really relates to changing the way the company does business. That is where, I think, there is skill [needed]."
That skill, however, is at a nascent stage of development. "I think we are just starting to understand the fact that the computing system has to be driven by the human system," said Michael Taylor, central systems engineering manager at the Digital Equipment Corp. "We are starting to think in terms of a new paradigm that says you start with the people, the way those people need to do work. Forget all this technology. That will come later. But look at the people, look at what those people have to do, and see what you can do to make them more productive."
With this perspective on using information technology to make workers more effective, the purview of systems integration expands greatly. Into a domain largely devoted to solving detailed technical problems enter issues intricately related to cultural notions of work. For example, networking is expected to enable greater cooperation and interaction among workers, but it is not at all clear how to foster a truly collaborative, computer-supported work style and to capture the anticipated productivity benefits.
ACHIEVING EFFECTIVE SYSTEMS INTEGRATION
Faced with a task of such breadth and complexity, systems integrators and their customers may be tempted to dissect the economic, technological,
organizational, and cultural problems and issues they confront into ever-smaller parts. But Feld of Frito-Lay warned against this tendency. "We need to think about [information systems]," he said, "in much longer time frames and from a much higher mountain. . . . Breaking down the problem to smaller elements is going in the wrong direction because you cannot see what is happening. You have got to be able to step above it" and view the system as a whole.
Yet, the whole is a composite creation—a one-of-a-kind assembly of many people and many individual pieces of hardware and software—built by interdisciplinary and often geographically separated teams. Coordination is essential and difficult. In such a complex undertaking, the system-wide perspective that Feld advocated can easily be lost. Moreover, methodological tools to guide systems integration projects and help ensure congruence and complementarity among supporting tasks and products are at a rudimentary stage. In fact, colloquium participants generally agreed that building large integrated systems from heterogeneous collections of hardware and software remains more an imperfect art than a structured scientific or engineering discipline.
Numerous pitfalls are inherent in the process and, consequently, it is not uncommon for an information system to fall short of expectations held at the beginning of a systems integration project. For example, more than a few anecdotal accounts describe projects that greatly exceeded their budget or, worse yet, were abandoned after considerable investments of money and time. Consider the experiences of the federal government, the largest user of systems integration services. In 1990 the General Accounting Office reported that the government spends $20 billion annually to improve its 53,000 computer systems. But the watchdog agency found that "attempts to modernize the government's information systems have produced few successes and many costly failures."4
In addition, problems arise after information systems are up and running. Equipment problems, programming errors, and other disruptive events cause network failures, resulting in annual losses estimated to range from $600,000 to $3 million for firms with large systems.5,6 Moreover, unauthorized use of networks and other security abuses have resulted in large, but untabulated, losses of money and information.7
Acknowledging the difficulties and complexities that can undermine the aims of system integration, colloquium participants identified some of the key attributes of efforts to build effective information systems, as well as the essential features of those systems.
Understanding the Organization and the Application Area
From the various definitions of systems integration offered by participants, one might deduce that effectiveness is variously perceived by the
beholders—the organizations and people who will use the information system. For example, Jeffrey M. Heller, senior vice president at Electronic Data Systems, described an effective systems integration project as one that fulfills a "practical objective through the assemblage of diverse component technologies and disciplines that are critical to each other's success." At a general level, this definition seems straightforward enough, but it can become exceedingly complex at the operational level of an individual organization. "The formulation of customer requirements," Heller said, may be the most important and least appreciated aspect of systems integration. Added Feld of Frito-Lay, "You have to have a business proposition that is worthy of the investment" in an integrated system of information technology.
Translating a business plan into an integrated set of hardware, databases, and applications, according to Fischer of Andersen Consulting, requires an approach that encompasses each of the main elements common to every company: the overriding business strategy, the technology and the operations supporting that strategy, and the skill levels of the work force. "These four elements must be synchronized, or important synergies will be lost," Fischer said. "If any of you had the opportunity to review systems that were designed in the past 20 years, you would find that none of them reflects an understanding of all four elements."
An essential element of this understanding, according to several participants, is comprehensive knowledge of the particular attributes of the industry in which a client company is competing and of the unique characteristics of that customer's business. Customized applications developed without this knowledge are not likely to satisfy a customer's information needs, nor to provide the competitive advantages that the firm was seeking from its investment in technology. For U.S. systems integration firms aiming to compete in foreign markets, advised Ivan Selin, thorough familiarity with the application area as practiced in target countries may be the most important determinant of exporting success. (See Box 2.3.)
Increasingly, integrators are recognizing the need to be intimately familiar with application areas, a recognition that has motivated strategic partnerships with consulting organizations expert in the strategic issues of a particular industry. In addition, there is growing appreciation of the multidisciplinary nature of systems integration and the concomitant need for collaborative teamwork throughout the development process. The need for expertise in computers, software, communications, and the application domain remains critical. But some integrators are choosing to broaden their perspective and are now augmenting their teams with anthropologists, social scientists, and other specialists who can, for example, address issues related to the design of user interfaces and to how people adjust to collaborative work environments.
Box 2.3. Thoughts on Exporting
Systems integration has the "makings of a great export industry," Ivan Selin, then under secretary for management in the U.S. State Department (he is now chairman of the Nuclear Regulatory Commission), told the colloquium. For U.S. firms to preserve and build on their commanding position in the emerging international market, Selin emphasized, they must execute a lesson already learned in the domestic market: To build effective integrated information systems, firms must be intimately familiar with the characteristics of the foreign industries and companies they are working with. He expanded on this point:
First, of course, are the obstacles to any kind of high-technology export. You have to know the language, you need entree to the customers, you may need local partners, [and so on]. . . . Then there are some additional obstacles specific to integrated systems. First of all, by definition, we're talking about major applications. We're talking about a lot of time, a lot of effort, a lot of customized software, a lot of industry-specific standard software, and a lot of hardware. Before you can sell one of these major applications you really need to know the industry as practiced in the target country, not just as practiced in the United States. Depending on the industry, there are major differences from country to country.
But you also need to know the company, because, again, these are not products off the shelf; they're specific to a particular company's applications, and these are hard to know. Even if you devote a lot of time and effort to understand international banking or, say, trade documentation, when you go to Japan you find out that documents are done differently from the way they are done in the United States. The companies are different, the culture is different, what they are willing to share from one company to another is different. . . .
On top of that, . . . you are talking about doing applications that are deeply involved in the "innards" of how the company operates, and so companies tend to be very reluctant to bring in outside firms to do really mainline systems. It's one thing to buy something off the shelf or out of a catalogue; it's another to trust an outsider to come in and develop a system that will be central to the operation of that company for a long time to come. On the one hand, the company is terribly dependent on the outsider to provide the system successfully; on the other hand, that outsider is going to walk away with a lot of inside information, and in many societies and many companies that is something that is given up very reluctantly. Finally, you need to have experts in the country to which you are exporting. It's not enough to have very good sales representatives and very good maintenance people—you need to have the people who have a fair share of the information that was necessary to develop the system in the first place.
''To be successful," said Albert B. Crawford, executive vice president for strategic business systems at American Express Travel Related Services, "a systems integration contractor must demonstrate relevant experience in multiple disciplines and in successfully managing extremely large, complex projects." However, finding people with the requisite mix of skills to achieve that level of performance is becoming increasingly difficult for the systems integration industry.
Recognizing Essential Features of Information Systems
The rapid advance of computer and communications technology underlies an ever-changing set of user needs. New capabilities create new business opportunities and, at the same time, open the door to new competition, necessitating a change in business strategies and operations and, consequently, in the information systems on which firms depend. As a result, businesses want to be able to capitalize on new technology offerings, but without having to start from scratch with each new round of innovation or each newly identified information need.
Thus a key attribute of an integrated information system is flexibility, the ability to evolve to accommodate unforeseen technologies and information needs and to be adapted with relative ease to address new competitive challenges. Such flexibility, however, has not been a hallmark of information systems.
In the past, said Taylor of the Digital Equipment Corp., integrators "created some very innovative, very creative, very unique solutions to the business problems of the era. The bad news was that with this very creativeness and uniqueness, we caused difficulties. . . . Companies [now] looking at these integrated solutions of a few years ago find that, because they are unique, they are difficult to migrate forward, they are difficult to evolve as the underlying technology changes, and what, at the time, was a competitive advantage for the firm could well become a competitive disadvantage if indeed that solution is rigid and inflexible and geared to a way of doing business that is no longer in tune with the way that corporation now wants to do business."
These experiences point to the need for a consistent systems architecture assembled with modular building blocks—hardware, software, database, and communications platforms with flexible linkages, or interfaces. Standardization and the move toward open systems are yielding modular products that, in effect, can be bolted on or plugged into existing systems and yet are malleable enough to link to tomorrow's technology.
"From a technology perspective," Taylor explained, "we are taking a
much more architectural approach. That says you need building blocks, you need to innovate within those building blocks but you need to retain that architectural framework so that the whole evolution is not impeded by the fact that everything is interconnected to everything else."
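Taylor's building-block idea can be sketched in code: components interact only through a stable interface, so any one block can be replaced without disturbing the rest of the system. The sketch below is illustrative only; the class and component names are invented and do not come from any product discussed at the colloquium.

```python
from abc import ABC, abstractmethod

class DataStore(ABC):
    """Stable architectural interface: the rest of the system depends
    only on this, never on a particular vendor's product."""
    @abstractmethod
    def get(self, key: str) -> str: ...
    @abstractmethod
    def put(self, key: str, value: str) -> None: ...

class InMemoryStore(DataStore):
    """One interchangeable building block; a disk- or network-backed
    store could be plugged in without touching application code."""
    def __init__(self):
        self._data = {}
    def get(self, key):
        return self._data[key]
    def put(self, key, value):
        self._data[key] = value

class OrderSystem:
    """Application code written against the interface, so the store can
    be swapped as the underlying technology changes."""
    def __init__(self, store: DataStore):
        self.store = store
    def record_order(self, order_id, item):
        self.store.put(order_id, item)

system = OrderSystem(InMemoryStore())
system.record_order("A-100", "corn chips")
print(system.store.get("A-100"))  # corn chips
```

The point is not the trivial storage logic but the seam: innovation happens inside a block while the interface, the "architectural framework," stays fixed.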
The growth of information networking, however, has outpaced standards development and the computer industry's migration to open systems. Many organizations that are pioneering applications of information technology still bear the risk of being locked into proprietary systems that may restrict future options for connectivity and interoperability.8 For now, these organizations must choose from among competing platforms, hoping that their selections will become the nationally and internationally accepted and implemented standards.
If a firm is large enough, it can try to dictate its architectural requirements to prospective suppliers, as American Express Travel Related Services has done in the area of communication components. According to Crawford, who oversees development of the subsidiary's global information system, American Express has stipulated a set of standard interfaces that equipment suppliers must provide.
American Express, like many other firms, is now endeavoring to build an underlying architecture for its global information system, a framework on which to combine old and new technology and build coherent solutions rather than a patchwork of partial solutions. Like a well-conceived plan that includes contingencies for uncertainty, a good systems architecture is expansive and adaptable. But, Crawford explained, it also exerts controls to ensure that new equipment, databases, and applications achieve the desired levels of connectivity and interoperability.
Frito-Lay's new corporate information system provides an example of the benefits that accrue to an architecture that can evolve with changing needs. "One of the fundamental design criteria" in Frito-Lay's layered architecture, explained Feld, who directed the development of the company's system, is the ability to accommodate change. For example, only minor database adjustments are required if the firm revamps its employee pay structures or even if it restructures the organization. "We have 32 areas of the country now," Feld said. "If we wanted to drop down to 28, it is a weekend database reorganization." Because of the disaggregated nature of the information system, major alterations in one area do not generate unwanted and unforeseen changes in others, he explained.
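Feld's weekend reorganization is possible when regional structure is stored as data rather than hard-coded into applications. A minimal sketch of the idea, with invented store and area identifiers (Frito-Lay's actual schema is not described in the source):

```python
# Area assignments live in one mapping; applications look up an area
# rather than embedding regional logic, so redrawing 32 areas into 28
# touches only this mapping.
sales_areas = {
    "store_0001": "area_07",
    "store_0002": "area_07",
    "store_0003": "area_31",
}

def reorganize(mapping, merges):
    """Collapse areas according to a merge plan, e.g. area_31 -> area_07,
    leaving unaffected assignments unchanged."""
    return {store: merges.get(area, area) for store, area in mapping.items()}

new_map = reorganize(sales_areas, {"area_31": "area_07"})
print(new_map["store_0003"])  # area_07
```

Because every application reads the mapping instead of carrying its own copy of the regional structure, the reorganization is a data change, not a reprogramming effort.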
Data Communication Capabilities
Because of heavy network and internetwork traffic or because of their need for specialized computing and communication capabilities, such as the transmission of large graphics files, firms are paying as much attention to their data-transmission capabilities and information-retrieval times as they are to such perennial issues as processing power and memory and storage capacity. Some have invested in building private information infrastructures, often at considerable expense. For example, when General Motors (GM) initiated its massive program to automate its manufacturing operations, about half of its outlays for computer systems went for communications-related equipment and software.9 Today, corporate spending for private networks accounts for more than half of all spending for communications networks in the United States.10
But in building their high-speed networks, firms are entering into murky waters. The unsettled state of high-level communications protocols necessary for exchanging information and applications within and between networks means today's choices could complicate efforts to satisfy tomorrow's networking needs if alternative conventions emerge as the industry standards. Moreover, even the largest firms are discovering that, as the trend to internetworking and interenterprise cooperation proceeds, they can no longer entirely bypass public telecommunications carriers, an amalgam of international, national, regional, and local utilities, or the growing number of third-party suppliers of enhanced telecommunications and information services. This increases the number of interfaces and, therefore, potential bottlenecks that must be negotiated for internetworking applications, while increasing the vulnerability of a firm's communications and information system to security violations and technical failures.
As internetworking grows, the potential for theft of data, other security abuses, and accidents also increases. For example, technical problems in one network can cause deterioration of service in connected networks. Internal safeguards developed by managers of private networks will no longer suffice, but effective responses to new risks have yet to be developed.
"Five years ago," explained Weis of Sears, "we were less concerned about the outside world, and we had pretty good security means that we could implement privately within the company." But today, as Sears makes greater use of public voice and data communication networks, its information system has become more vulnerable, he said—"There are some policy issues that need to be resolved." For example, automatic number identification, which provides the telephone number of a person who gains access to an information system by means of a modem, would permit managers to audit usage. The technology, however, has raised privacy issues that have resulted in lawsuits.
Better security, said Fischer of Andersen Consulting, is "an extremely severe challenge for the future." With more and more companies billing, ordering, and handling other tasks through electronic data interchange (EDI) and as electronic links to consumers increase, he added, "You can see how the possibilities for security violations go up exponentially. So it is a problem that we see as very, very key for the future of the industry."
Network Management and Reliability
"Nobody has yet given us the capable tools for network management of the scale, type, and variety that we need," complained Crawford of American Express. As a result, American Express, in collaboration with IBM, is developing its own system for network management and control. As part of the development effort, American Express has determined what data it needs for effective network administration, and it now requires prospective suppliers of network management products to satisfy those data needs.
The shortcomings of network management tools are widely acknowledged, the subject of numerous articles in the field. Suppliers of those tools are struggling to meet the need and, in the process, capture a share of a rapidly growing market (variously estimated to be a few hundred million dollars in revenues per year). What users hope these products will provide are a simple, unified means for monitoring the performance of an entire network and alerting managers to failures or deterioration in performance; straightforward methods for analyzing and pinpointing the causes of problems; and a comprehensive set of tools for responding to problems and rerouting traffic around trouble spots. Management issues extend beyond the need to keep networks up and running, however. Also important, for example, are tools for controlling software distribution, preventing the introduction of unlicensed programs (which may contain viruses and other problems), and maintaining data integrity.
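The monitoring-and-alerting role users want from these tools can be illustrated with a toy poller. Everything here is hypothetical: the threshold, the node names, and the stand-in probe (a real tool would query devices via a management protocol such as SNMP).

```python
THRESHOLD_MS = 200  # alert when response time degrades past this

def check_nodes(probe, nodes):
    """Probe each node and return a list of alerts, so a manager sees
    failures and performance deterioration across the whole network
    in one place."""
    alerts = []
    for node in nodes:
        latency = probe(node)
        if latency is None:
            alerts.append((node, "unreachable"))
        elif latency > THRESHOLD_MS:
            alerts.append((node, f"slow: {latency} ms"))
    return alerts

# A stand-in probe built from canned data; invented node names.
fake_latency = {"router-a": 12, "router-b": 350, "gateway": None}
print(check_nodes(fake_latency.get, fake_latency))
```

Even this sketch shows why users ask for unified tools: the value lies in one consolidated view of many heterogeneous devices, which is exactly what fragmented per-vendor tools fail to provide.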
Ease of Use and Effective Presentation of Information
Simplicity, according to colloquium participants, is the ultimate determinant of an information network's effectiveness. If information technology is not easy to use, if information is not easy to access, select, and share, and if applications are not easily mastered, then an information and communication system fails to accomplish its primary function of enabling people.
For the average user, first impressions—that is, interactions with the user interface—can be lasting ones. Rather than breeding the familiarity that fosters greater use and exploration of network applications, several participants noted, differences in user interfaces can cause considerable frustration and confusion, dissuading employees, for example, from using electronic mail features or entering information into a database on potential customers. A networked information system, maintained Robert L. Martin, vice president for software technology and systems at Bell Communications Research, should have a "common look and feel." An underlying consistency in the appearance and functionality of applications and databases not only hastens learning and, perhaps, lowers expenditures for training employees who are expected to use the network, but it also makes it easier to tap the functionality of suites of applications.
With the emergence of multimedia applications that combine information in all its textual, graphical, and audio forms, new types of interfaces—those that, for example, respond to voice commands or eye and hand movements—will also be introduced. These offerings will greatly enhance the utility of information systems. But, said Larry E. Druffel, director of the Software Engineering Institute at Carnegie Mellon University, integrating new interface technology should not require revamping the information system and its underlying applications and databases.
Of course, the utility and diversity of applications and complementary tools determine what users can actually do with an information system. Today, most organizations have a backlog of ideas for new customized applications awaiting development. Ideally, workers or groups of workers should be able to build their own applications to help them accomplish their tasks. New organizational software that allows groups to work together is a step in this direction. It strives to create what, in essence, is a flexible programming environment that enables users to create their own applications. Such software often includes programming tools and prepackaged bits of programming code, or objects, that help users create new applications and databases by combining the components of existing ones. With significant advances over the next 10 to 15 years, suggested Weis of Sears, these programming aids may enable business professionals to "specify, in a nonprocedural way, the business functions that they want to perform and then turn those rules or nonprocedural statements into a system" that performs the desired applications. Today, however, the gap between promise and reality is large. Even for the most experienced systems integrators and software designers and programmers, it remains exceedingly difficult to develop an application or suite of applications that, at the implementation stage, works according to plans and expectations.
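Weis's vision of turning nonprocedural business statements into a running system resembles what rule-driven tools attempt: the user states conditions and outcomes, and an engine supplies the control flow. A minimal, entirely hypothetical sketch:

```python
# Each rule is declarative data: a condition and an action, with no
# procedural control flow. A generic engine turns rules into behavior.
rules = [
    {"when": lambda order: order["amount"] > 1000, "then": "require approval"},
    {"when": lambda order: order["rush"], "then": "expedite shipping"},
]

def apply_rules(order, rules):
    """Return the actions triggered by an order; the business user
    edits the rule list, never this engine."""
    return [r["then"] for r in rules if r["when"](order)]

print(apply_rules({"amount": 2500, "rush": True}, rules))
# ['require approval', 'expedite shipping']
```

The gap Weis describes lies in scaling this idea: real business functions involve conflicting rules, shared data, and exceptions that simple engines handle poorly.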
A major unsolved problem in this area is in the statement of requirements in a format understandable to the end users and also to the developers. These two groups usually come from very different job backgrounds and frequently share very little common terminology. Methods such as "rapid prototyping," which allow end users to observe the "look and feel" of the final systems before committing to full-scale development, have been very successful in defining expectations.
ISSUES AND RESEARCH NEEDS
Like the process of systems integration itself, the issues and challenges that will shape the evolution of distributed, networked computing must be assessed from several interrelated perspectives. As a rapidly growing industry with a potentially large international market, the systems integration industry must attend to matters that will affect its competitive status. The businesses and other organizations that are now the primary users of systems integration services face challenges in making the most advantageous use of their information technology. Finally, as the web of interconnected computing and communication devices grows, the entire economy and all of U.S. society become affected, introducing a more encompassing set of needs and issues.
Today's experiences with networked information systems are testimony to the advantages to be reaped on scales small and large. But they are also testimony to the tremendous challenges positioned between the reality and the promise of information networking. One assessment of the current reality is this: "We have now reached a stage of uncontrolled chaos in the marketplace of data processing and data communications. Multivendor systems are almost universal, and the inability of the elements in this heterogeneous environment to interwork is legion."11
Many organizations have mastered these difficulties and are realizing significant benefits, but many are still struggling to take full advantage of their integrated systems of information technology. More important, most organizations, daunted, perhaps, by the prospect of sizable investments and the mire of technical issues, are not even pursuing many of the advantages afforded by information networking. At the colloquium, representatives of system integration firms and organizations that are major users of information technology offered their perspectives on the issues and research questions that stand in the way of effective use and wide-scale adoption of an information network by U.S. businesses. In subsequent chapters, many of these same issues are discussed in societal and global contexts.
Adapting the Installed Base of Information Technology
Information technology is a large and growing portion of the capital stock of U.S. businesses. A major challenge, therefore, is what many in the systems integration industry call migration, a "forward-engineering" process that enables owners of information technology to preserve and build on their installed bases of hardware, software, and information. Given the additional complexity this imposes on the already complex process of systems integration, it sometimes may seem that razing the existing system and starting anew is preferable to the often massive restructuring that moving to distributed, networked computing entails.
"Is our historical investment [in information technology] a plus or a minus?" asked Heller of Electronic Data Systems. The answer, according to other colloquium participants, is probably both, with the relative balance between asset or hindrance hinging on progress in developing methodologies for forward engineering and the reuse of software and databases (see section immediately below).
Unlike firms in Asia and, to a lesser but still significant degree, those in Europe, said Fischer of Andersen Consulting, U.S. organizations "have an enormous installed base of packaged knowledge, of business logic, of systems across the country. We cannot come in with a clean sheet of paper and say, 'Point B is where I want to be; A is where we are. Let's just do it.' We have got to have a way to recover design, to recover the logic out of the existing systems we have. . . . We need to be able to accumulate what has been done over the past 30 years with the automation projects, be able to collect that knowledge and that information in a repository, and be able to forward-engineer it to new solutions."
The issues involved transcend systems engineering, the act of connecting isolated devices and applications into networks. Improved systems engineering techniques are needed to strengthen the competitive status of the U.S. systems integration industry, according to Fischer and others. But far more challenging, they stressed, are the difficulties involved in restructuring existing technology and knowledge so that this base both enhances and is enhanced by computing and communication capabilities.
Research on forward engineering and migration strategies is under way, much of it under the sponsorship of the U.S. Department of Defense. Thus far, however, the returns to these efforts have been limited. "Current reengineering technology," said Barry Boehm, director of the Defense Advanced Research Projects Agency (DARPA), "tends to take unstructured, outmoded ADP [automated data processing] systems and turn them into structured, outmoded ADP systems."
Software Design, Development, and Reuse
Every digital device is programmable. Software, therefore, is the glue that links the vast array of digital devices into a network, and through applications, it is the primary determinant of the network's value to an organization. It also represents the major expense of an integrated information system, much of it stemming from the customized programming involved and from the cost of maintaining and upgrading applications. In addition, poor software design and programming errors are not infrequent causes of network failures.
A recurring theme throughout the colloquium was the need to improve methods for software design and development to make the process more efficient and to make information systems more reliable. This is a long-recognized need, but, participants pointed out, its importance grows with the complexity of networked systems and with the size of the potential losses incurred when these systems fail.
Part of the answer rests with tools and techniques for reusing and interchanging components of existing software, which are also essential for efforts to forward-engineer the installed base of information technology. "Often, the need to upgrade or change hardware, software, or communications [technology]," explained Heller of EDS, "requires companies to consider the necessity of reprogramming their existing applications to take advantage of the new advances in computer technology. To reduce this necessity, new tools—modeling, CASE [computer-aided software engineering], object-oriented techniques, and so on—need to be analyzed for their viability." Thus the ability to carry forward such things as the design rationale inherent in the software is an example of a subtle but important consideration in building large systems that must accommodate future developments.
"Today," Heller added, "application building is far too expensive."
Boehm of DARPA concurred: "[I]t is time to start building software component by component rather than construction by construction."
Methods for adapting and reusing existing programs in new applications and for restructuring old data are improving, some participants noted, citing progress in object-oriented programming as an example. Nonetheless, significant problems remain. Positive steps in this direction are continuing enhancement of CASE tools for computer-aided program development, object-oriented techniques for using preprogrammed bits of software, formal techniques for verifying whether programming code performs according to specifications, and nonprocedural languages that permit users to write their own programs without using a formal programming language. For now, it is uncertain whether one method or a combination of these methods will yield improvements on the scale that is needed.
Several panelists suggested that object-oriented techniques hold considerable promise in achieving interoperability among the functions within separate applications and in salvaging past programming work. CASE tools, in contrast, they observed, have yet to yield the promised productivity benefits. Indeed, in a recent survey of users of CASE tools, more than a third reported that programming productivity had not increased.12
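The reuse panelists attributed to object-oriented techniques comes largely from inheritance and composition: a new application extends a tested class rather than rewriting it. A small illustrative example (the class names are invented for this sketch):

```python
class Report:
    """Existing, tested component carried forward from earlier work."""
    def __init__(self, rows):
        self.rows = rows
    def total(self):
        return sum(self.rows)

class AuditedReport(Report):
    """A new application reuses Report unchanged, adding only the
    behavior that differs: an audit trail around the computation."""
    def total(self):
        value = super().total()
        print(f"audit: total computed = {value}")
        return value

print(AuditedReport([10, 20, 12]).total())
```

The salvage value is that `Report` itself is never modified, so every other application built on it is unaffected by the extension.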
In the meantime, Japanese software firms and, recently, European companies are concentrating on manufacturing-style approaches to software development. Approaches vary, but quality-assurance methods and automation are common denominators.
Methodologies to Guide Organizational Change
Full assimilation of an integrated information system often implies a corresponding integration of the business organization and its processes. Typically, this may entail flattening the organizational hierarchy, revamping the scope of business, and linking units that were once functionally isolated. Such sweeping change is not accomplished easily. But experiences of the last two decades demonstrate that the competitive advantages gained by automating the status quo evaporate quickly.
Many systems integration firms use formal methods for analyzing how firms use and communicate information internally and externally. On the basis of such analyses and close consultation with their clients, integrators develop an architectural plan for "defining technology requirements," said Heller of EDS, and for "blending . . . investments in computing and communication [equipment and software] and in structuring data into information that is meaningful in business terms." Nonetheless, he added, the process is a "little bit fuzzy."
"There are a lot of disciplines required," he said. "Not all of these disciplines are strictly technical or technology-based. But it takes a multidisciplinary team approach, in our experience, to deliver to a customer, to meet his requirements in even small areas that are well bounded, to say nothing of a full business."
Because each organization is unique, it would be unreasonable to assume that systems integration and the implementation of changes in business operations and worker relationships can be reduced entirely to a formal methodology. Still, better tools are needed, according to Fischer of Andersen Consulting. "We need some packaged or agreed-to methods and tools to help us figure out how to go about redesigning the business process," he said.
It can be helpful to view the systems integration process in a total life-cycle perspective, one that includes "change" as a built-in attribute. One model suggested had five steps: first, identifying the need for change from the current operation; second, defining and documenting the requirements for the "new" system; third, implementing and integrating the components of the new system; fourth, making the transition from the current system to the new system with acceptable risk to the business; and fifth, maintaining the new system, including providing for quality improvements and upgrades (which takes us back to the first step).
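The five-step model can be expressed as a cycle, making explicit that the fifth step, maintenance, feeds back into identifying the next needed change:

```python
from itertools import cycle

LIFE_CYCLE = [
    "identify need for change",
    "define and document requirements",
    "implement and integrate components",
    "transition with acceptable risk",
    "maintain, improve, and upgrade",
]

steps = cycle(LIFE_CYCLE)  # the fifth step loops back to the first
for _ in range(6):         # six iterations show the wrap-around
    print(next(steps))
```

Treating change as the normal case, rather than an exception, is the model's central point.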
Standards

When the topic of standards arises in a group of hardware manufacturers and software publishers, a heated debate can be expected to ensue. In general terms, one group will argue that standards can prematurely freeze technology and dampen innovation, while the other group extols the merits of easy connectivity and interoperability for users and, consequently, the market-enlarging effects of standards. No such debate occurred at the colloquium. Suppliers of systems integration services and users of information technology appeared united in their support for standardization of hardware and software interfaces and of communication protocols.
"Standardization does not imply a static situation," said Heller of EDS. "Standards must be continually reviewed and modified as requirements change and technology advances."
Added Crawford of American Express, "International standardization and so-called open systems will add further impetus to systems integration."
While espousing the need for standardization, several speakers were critical of how U.S. industry, the federal government, and users of information technology have participated in the standards-setting process. U.S. computer and communications firms are active in national and international standards organizations, but too rarely do they act from an industrywide perspective. Although increasing numbers of users are becoming more active in standards issues, sometimes crafting their own standards, most remain passive observers.
Crawford advised users to "be proactive" in standards issues. Two staff members in his division work full-time to advance American Express's positions on international telecommunications standards, and another small group of workers concentrate on regulatory and standards activities in the United States.
Several colloquium participants suggested that the federal government could play an instrumental role in coordinating U.S. industry's participation in international standards activities and monitoring developments in foreign countries. They pointed out that Japan and the European Community have taken a more comprehensive and forward-looking view of standards than have U.S. government and industry.
Careful monitoring of international activities, said Hopper of American Airlines, is necessary because of the impact of standards on the globalization of information technology and because of the potential for nations to devise standards that serve to protect domestic industries and restrict international competition. "I am a believer that standards can only help us," he said. "We should embrace them and not resist their inevitability, but with the caveat that we have to guard against standards" that are designed to be barriers to international competition.
Other facets of standardization are discussed in following chapters (see Chapter 3 appendix on standards making).
Data Communication Capabilities
As already noted, large U.S. businesses have invested enormous sums to build their own high-speed communication links to connect networks. Many have chosen to bypass, as much as they can, the public telecommunications carriers because of insufficient data-carrying capacity,13 discontinuities in the service offerings of local and regional carriers, and cost savings they can achieve with their own networks. These firms have, in effect, built their own information infrastructure, an unaffordable option for most organizations that could benefit from high-speed internetworking capabilities. Thus most of the approximately 700,000 private networks in the United States are information outposts linked by the data-transmission equivalent of one-lane highways.14
As discussed in the next chapter, the prospect of broadband Integrated Services Digital Network service, which initially would offer transmission rates of more than 150 million bits per second, is viewed as one potential remedy to this infrastructural deficiency.
Personnel and Training

Many colloquium participants suggested that the growth of the U.S. systems integration industry could be constrained by shortages of qualified personnel. While the entire computer sector confronts scarcities of talent in key science and engineering disciplines, the needs of systems integrators may be the most difficult to satisfy, at least through the traditional channel of universities.
"Systems integration demands a special mix of expertise with emphasis on organizational, consulting, and management skills—on top of demonstrated technical expertise," Crawford explained. "There is no way that recent graduates can acquire those skills through education alone."
Nonetheless, participants maintained that U.S. universities have an important education and training role to play. Unfortunately, departments of computer science and engineering and other units that concentrate on computer-related topics have not included systems integration in their domains. "The best way to get the academic community to address the introduction of systems integration into the educational system is to entice the academics to engage in appropriate research," said Druffel of the Software Engineering Institute at Carnegie Mellon University. "We must develop within the research community an appreciation for the importance of the problem and its validity as a research topic. This also implies the availability of funding, and so the major research funding agencies must be convinced [of the need]."
Currently, many systems integration firms invest heavily in the training of their employees. EDS and Andersen Consulting, for example, spend the equivalent of about 10 percent of their annual revenues on training programs. The need for these programs will likely remain strong, according to Heller.
"Retraining will be one of the key strategies of systems integration firms for meeting their technical personnel needs," Heller said. However, the challenge of equipping people with the requisite skills and knowledge may become more difficult, he said. "Technical positions in a systems integration firm require technical aptitude, and many people undergoing training will not be successful," he explained. At the same time, the field is becoming more complex. "The technical curricula a company uses for training entry-level science and engineering graduates will likely require modification in pace and content," Heller said. Moreover, imparting technical skills to people with nontechnical backgrounds "will require much patience and leadership attention."
NOTES
1. For a broad range of historical statistical information see The Computer, Business Equipment, Software and Services, and Telecommunications Industry: 1960–1996, Industry Marketing Statistics Committee, CBEMA, Washington, D.C., 1987.
2. Gantz, John. 1987. "Systems Integration: Living in a House of Our Own Making," Telecommunication Products + Technology, May, p. 35. Gantz, John. 1991. "Double Trouble," The Economist, January 12, p. 63. Depke, Deirdre A., and Richard Brandt. 1991. "PCs: What the Future Holds," Business Week, August 12, pp. 58–64.
3. Wright, Karen. 1990. "The Road to the Global Village," Scientific American, March, p. 84.
4. General Accounting Office. 1990. Meeting the Government's Technology Challenge, GAO/IMTEC-90-23, February, p. 4.
5. Verity, John W. 1990. "Taming the Wild Network," Business Week, Oct. 8, p. 144. Dauber, Steven M. 1991. "Finding Fault," Byte, March, p. 207.
6. For example, in July of 1991 what eventually turned out to be a "minor" software problem led to massive failures of telephone networks in several large metropolitan areas, including Washington, D.C., and Los Angeles (Andrews, Edmund L. 1991. "String of Phone Failures Perplexes Companies and U.S. Investigators," New York Times, July 3, p. A1).
7. Computer Science and Telecommunications Board, National Research Council. 1991. Computers at Risk: Safe Computing in the Information Age, National Academy Press, Washington, D.C., p. 7.
8. For example, the major airline reservation networks and service providers are now going through an expensive process of replacing their widely used but proprietary airline protocol, the Airline Link Control (ALC) with its 6-bit-per-character structure, in favor of standard packet-switching communication technology widely available on the open market (see Crockett, Barton. 1990. "Airline Reservation Nets Finally See Fit to Dump Outdated Protocols," Network World, September 24, p. 9).
9. In fact, GM defined a new standard (Manufacturing Automation Protocol, or MAP) for the specific needs of this application. MAP is now used widely in plant-floor automation systems. See Kaminski, Michael A. 1990. "The Users' Viewpoint on Standards-Based Communications," Crossroads of Information Technology Standards, National Academy Press, Washington, D.C., p. 11.
10. Dorros, Irwin. 1990. "Calling for Cooperation," Bellcore Exchange, November–December, p. 7.
11. Kleinrock, Leonard. 1991. "ISDN—The Path to Broadband Networks," Proceedings of the IEEE, Vol. 79, No. 2, February, p. 112.
12. Brandt, Richard. 1991. "Can the U.S. Stay Ahead in Software?" Business Week, March 11, p. 104.
13. Public telephone companies now offer two types of enhanced service for transmission of digital data. T1 service transmits data at the rate of 1.5 million bits per second, sufficient for simultaneous transmission of voice communication and textual and numerical data; T3 service offers a transmission rate of about 45 million bits per second, which accommodates only rudimentary real-time graphics applications.
14. Gilder, George. 1991. "Into the Telecosm," Harvard Business Review, March–April, pp. 150–161.