Improving Decision Making About Information Technology

Analyses at the macroeconomic, industry, firm, and activity levels have attempted to assess the impact of using IT at those levels. At the enterprise level a number of studies and articles have indicated (1) shortcomings in how some companies have managed IT, particularly when they first began investing in IT, and (2) a lack of correlation between the amount a company invests in IT and its return on assets, returns to shareholders, or profits per employee.

To understand more thoroughly why companies have had such variable experience with their investments in IT, the committee asked experienced executives a series of structured questions (see Appendix D). The questions were intended to probe for any important problems executives had encountered in implementing IT systems, how they had attacked those problems, and what issues remained. While often admitting previous and current management problems, many executives indicated they had made significant improvements in their management practices and were currently modifying their processes to achieve further effectiveness. This chapter summarizes some of the interesting problems and solutions encountered.

Paul Strassmann, a former executive of the Xerox Corporation and director of defense information at the U.S. Department of Defense, analyzed the inconsistent relationship between investment in IT and service firm performance between 1977 and 1987 (Figure 5.1). He concluded, "A computer is only worth what it can fetch at an auction. IT has value only if surrounded by appropriate policy, strategy, methods for measuring results,
INFORMATION TECHNOLOGY IN THE SERVICE SOCIETY

[Figure 5.1 (first panels): scatter plots of bank performance against expenditure on information technology as a percentage of revenue and against cost of computer systems per employee (dollars per year). Banks plotted include Bank of Boston, BankAmerica, Security Pacific, First Interstate, Bankers Trust, Wells Fargo, J.P. Morgan, Chase Manhattan, Mellon, Chemical, First Chicago, Manufacturers Hanover, Norwest, and Citicorp.]
[Figure 5.1 (second panel): scatter plot of employee-level performance against cost of information technology per employee (dollars per year).]

FIGURE 5.1 Investment in information technology and service performance: an inconsistent relationship. SOURCE: Reproduced with permission from Strassmann, Paul. 1990. The Business Value of Computers, Information Economics Press, New Canaan, Conn., p. xviii. Copyright 1990 by Information Economics Press.

project controls, talented and committed people, sound organizational relationships, and well-designed information systems.... The productivity of management is the decisive element in whether a computer helps or hurts."

The committee concurs in this assessment. Box 5.1 provides a further illustration from software applications development. As Figure 5.1 suggests, standard measures of output make it appear that some companies' investments in IT have paid off well, whereas those of other companies with comparable opportunities have not. More often than not, good management has made the difference. Generally, problems with obtaining a payoff from investment in IT, when they exist, lie not in the capacity of the technology but in the planning and implementation of systems. Good management can overcome many technological deficiencies, while poor management can prevent returns from otherwise productive technologies. The latter has often happened, for example, when corporate managers have left decisions to technical experts who were not knowledgeable about the strategic factors, operating variables, or organizational dynamics involved in implementation. A variety of studies (listed in Appendix A) have set forth many of the common shortcomings in the management of IT.
BOX 5.1 The Impact of CASE Tools on Productivity

Experience drawn from software engineering highlights the importance of management. Computer-aided software engineering (CASE) tools such as programming environments and debuggers were originally sold and marketed as tools that could increase the productivity of programmers by an order of magnitude. However, a variety of studies indicate that the use of CASE tools makes far less difference to programmer productivity than does the choice of the particular individuals to do the programming job; when CASE tools are not used, programmers are unproductive, but the use of CASE tools is not a guarantee that programmers will be productive. It turns out that when a programming project is competently managed, the impact of CASE tools is more easily observed.

SOURCE: Bill Curtis, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, Pa.

COMMON PROBLEM AREAS IN THE MANAGEMENT OF INFORMATION TECHNOLOGY

The following section addresses some of the identified problem areas in the management of IT and summarizes observations about them made during the committee's interviews.

Lack of Competition

One claim is that service companies have been protected by regulations and geography from domestic and foreign competition more than their manufacturing counterparts.2 Hence they have not been forced to compete and respond to changes as quickly. However, this explanation is incomplete. Interviews verified that these two factors might have slowed potential changes in regulated airlines, health care, banking, and communications companies. Yet no professional service, retail, wholesale, entertainment, or international banking company mentioned these factors as relevant.
Still, there is no question that more recent instances of restructuring in service industries have coincided with increased deregulation, cross-border trade, foreign direct trade, and depressed domestic markets. These factors may help to explain previous slow responses to change in some types of companies but as a full explanation seem inconsistent with the facts that (1) U.S. service companies have consistently been among the first to adopt new technologies, (2) other countries (particularly in Europe and Japan) have been even more sheltered in key service industries like communications, transportation, distribution, and finance, and (3) U.S. service companies' performance in most areas has compared very favorably with that of the companies of other industrial countries. (See Chapter 1.)

Inadequate Planning and Follow-up

Inadequate planning and follow-up have undoubtedly been significant problems in implementing IT projects for many service-sector firms. Although at the time of the committee's interviews many respondents indicated that their procedures for planning deployments of IT were comparable to those used for investments in other technology, they often candidly admitted that they had experienced earlier problems. Early investments were often pursued to acquire short-term savings in labor by merely automating existing practices rather than to generate longer-term gains or strategic potentials. Investments in duplicative or incompatible programs sometimes occurred. In many instances, the costs associated with development and ongoing support required for major new information systems were underestimated. This was especially true for the maintenance, the updating of software, and the retraining costs associated with such systems.

As was noted in Chapters 3 and 4, introduction of IT frequently leads to a variety of job changes that may have far-reaching implications across an entire company. Often, the nature and scale of these implications have not been fully explored as part of planning for IT. As a result, job changes were made less smoothly, and changes in performance evaluation and reward systems unnecessarily lagged deployment of IT systems. Some respondents cited problems of underestimating incremental usage created by new systems that quickly exceeded acceptable utilization ratios in data centers, later leading to unexpected costs for further central processing power.
Unanticipated costs for support and additional computing power increased both total investments and the bureaucracies to operate data centers and thus lowered long-term returns. As Roger Ballou, president, Travel Related Services Group (USA), American Express, noted,

One place an awful lot of people get into trouble is by not looking at the fully loaded costs of technology investment. If you analyze the costs of just bringing up the system and its ability to offset existing paper reporting, it can look very effective. But when you look at the full ramifications in terms of CPU [central processing unit] utilization on an ongoing basis, staffing support to provide help-desk functions for it, and so on, it can change both the cost and seeming returns on the investment. We recently built an on-line reporting system using existing software. We did have to pay a few hundred thousand dollars for disk drives and things like that. With programming, the total investment was probably several million. However, the ongoing operating costs of this system are probably running at $300,000 to $500,000 a month. If you are not careful, you also keep using available CPU time incrementally, and all at once you have to buy a large new chunk of CPU capacity for the next project.
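The fully loaded cost arithmetic Ballou describes can be made concrete with a small sketch. The figures below are round illustrative numbers taken loosely from his example (several million up front, $300,000 to $500,000 a month in operations), not actual American Express data.

```python
# Illustrative fully-loaded cost comparison for an IT project. All figures
# are hypothetical round numbers for illustration only.

def fully_loaded_cost(upfront, monthly_operating, months):
    """Total cost of ownership: upfront investment plus ongoing operations."""
    return upfront + monthly_operating * months

# Naive view: the system "costs" only its upfront investment.
upfront = 3_000_000          # programming plus hardware ("several million")
naive_cost = upfront

# Fully loaded view over three years: CPU utilization, help-desk staffing, etc.
monthly_operating = 400_000  # midpoint of the $300,000-$500,000 range cited
three_year_cost = fully_loaded_cost(upfront, monthly_operating, 36)

print(f"Upfront only:      ${naive_cost:,}")
print(f"Fully loaded (3y): ${three_year_cost:,}")
# With these assumptions, one year of operations alone already exceeds
# the entire upfront investment.
```

Even with generous assumptions, the ongoing operating stream dominates the capital outlay within the first year, which is exactly the distortion Ballou warns against when only the start-up cost is analyzed.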
Resistance and Inefficiencies in Work Practices

As noted in Chapter 3, the mere fact of access to desktop computing does not necessarily improve the effectiveness with which knowledge workers (i.e., the individuals in an organization who are responsible for interpreting and analyzing raw data and information and converting them to useful knowledge) can pursue their jobs. In many cases, knowledge workers have used IT to do more busy work without necessarily enhancing output: spreadsheets can be recalculated, presentations fine-tuned, or manuscripts revised more frequently with little noticeable benefit. In a related realm, respondents often questioned whether they had achieved sought-after communications benefits from more powerful desktop tools. Many said they had not yet found effective ways to stimulate or measure better communications through electronic mail or other computer communications systems.

A serious challenge underestimated by several of the companies interviewed comes from the long time that it takes to change traditional work practices and corporate cultures. When personnel are uncomfortable with a new work environment and lack clear direction, they often attempt to maintain their old procedures in parallel. Such employees often feel that their basic skills and organizational worth are being undermined. Knowledge workers, in particular, often seek new grounds to justify their presence and search out other things to do as work is removed from their queue. J. Raymond Caron, president of CIGNA Systems, illustrated these concerns when he noted,

When we began to automate some of our agents' activities, we thought we would have less work coming into our central offices for the underwriters to do and there would be a shift in the workload. That didn't happen, primarily because we underestimated the amount of work required to change traditional underwriting practice.
And without a design change in the overall process, we didn't really achieve our goals. We now also have hundreds of financial types who are wizards at spreadsheet work. You have to ask the question, "How many times do you have to display this kind of data and who cares?"

Corporate policies that consciously promote continuity in employment and job security may work against achieving institutional changes, although they do help address some of the employee concerns discussed in Chapter 4. As Daniel Schutzer, vice president, Citibank, said,

As a nation we have had a population boom and a need for full employment. We have had a reluctance until recently to lay people off, in part because we didn't have global competition until then. Now we're seeing a terrific downsizing as a result of the technology introductions of the past decade or two. This is a permanent restructuring. We'll never see those jobs come back again.
Excessive Project Scope

For many years, IT vendors and popular journals overemphasized the importance of large-scale "system solutions." Managers too often responded by seeking to install mega-projects with high visibility. Such projects generally have multiple objectives that must be reconciled and integrated across several divisions. Mega-projects tend to become very complex. And they often take inordinate amounts of time, investment risk, and political compromises to bring into being. As a result, even companies with well-established track records for innovative uses of IT have experienced difficulties with large-scale IT projects, as the examples of Federal Express (successful with Cosmos II, unsuccessful with Zap-mail) and American Airlines (successful with SABRE, unsuccessful with CONFIRM) testify. Citibank's Schutzer underscored these difficulties:

When something becomes a big magnet project, everybody becomes focused on it. They throw in everything but the kitchen sink that they think about but might or might not really need. The business people don't necessarily know how easy or hard something is to do, and they are encouraged to be more elaborate by programmers or by technical project engineers. Before you know it, you're designing for many more functions than you can possibly deliver or need. The technical guys in an effort to promote the project have usually given unrealistic schedules, which the business people may believe because they don't know enough about the technology, or delve deeply enough into the details. Then one of three things happens: (1) Everything just gets so complicated it blows up, and you get major business reverses. (2) People just get tired and kill it quietly. Or (3) you announce a premature success and switch the thing over too soon without adequate testing.
In any one of these scenarios, the result is a disaster and the hero whose ego is on the line usually dies.

Respondents explained why such mega-projects had been major sources of corporate IT investment inefficiency. First, because of their complexity, such projects cost more in time and investment terms. Second, by the time they were implemented, competitors might also have come out with similar systems, often at lower costs because of cheaper technologies developed during the innovator's prolonged development time. Third, because of delays, large programs also ran a much higher risk of not being matched to corporate or customer needs by the time they were implemented. When queried, most respondents said they now broke large-scale programs into a series of smaller projects and implemented these incrementally. (For details, see section below, "Compressing Project Scope and Payback Time.")
Technology-driven Investments in IT

Respondents reported that in earlier years, many purchases of IT were technology- or vendor-driven, rather than being determined by business needs or opportunities. At that time, many service firms had relatively little experience with advanced technologies and often lacked the expertise to articulate precisely their needs for IT or to challenge IT solutions proposed by vendors. Because of rapid advances in equipment, vendors usually could promise much greater technical capacity and flexibility at lower cost. Not wanting to lose competitiveness, top managers in service enterprises might agree to equipment purchases based on optimistic projections by lower-level technology champions and vendors. Frequently, important support systems like software, optical scanning, labeling, or materials-handling systems were inadequate, making it difficult to use the computers' full capabilities.

For example, many retail firms experienced major problems in using electronic scanning systems as a source of data for marketing, profitability analyses, customer micromarketing, and other service features at the customer interface. Existing software was too expensive to implement on mainframe computers; scanning equipment was not sufficiently accurate; bar-code labeling was not as comprehensive or readable as retail applications required. Only as needed subsystems became available (especially vendor-generated labeling and decentralized microcomputing power) was full implementation possible. Substantial improvements in IT, falling costs of IT, increasing customization of IT to meet user needs, diffusion of IT expertise throughout the ranks of senior management, and a much broader base of experience with IT have increased user sophistication substantially in recent years.
Difficulties in Software Development

Given the widespread availability of IT hardware and "shrink-wrapped" software, the only significant technological advantage that most innovators can keep as proprietary is software developed in-house. As critical as software is, however, software engineering is often very difficult to discipline. For a variety of reasons, software developers are often reluctant to take advantage of structured computer-assisted software engineering tools or macro programs (like METHOD/1)2 that would help to ensure the accuracy, completeness, and speed of their work. Meaningful metrics for tracking the output of software development have been particularly difficult to devise. Commonly used measures such as lines of code completed per day generally do not reflect either the complexity of a problem or the quality of the code, while function point systems are difficult to implement. As an
example of the problem, Jon d'Alessio, then staff vice president and chief information officer, McKesson Corporation, stated,

We don't have any formal measures of our own internal systems development productivity. We don't do function points or lines, or anything like that. There has been a great reluctance to do it by our systems staff. We have had major debates on what to measure, how to measure it. Basically, in the past we have had a culture of very creative people who were artists in building systems. We're trying to move it toward an engineering, disciplined approach. But artists don't like to be measured, and I'm not sure the engineers do either.

Difficulties in software development and engineering are not unique to the business community; software engineering remains a major challenge at the research frontiers of computer science.3 The biggest problem today in software development is the inability to produce software on a large scale. Software development managers have treated software too much like an art form whose process of creation cannot be improved through the application of sound design and engineering principles. However, developments in the 1980s suggest that understanding of how to build large software systems efficiently and well is improving, and companies are beginning to achieve significant returns on investments devoted to software development. To some extent, the knowledge being developed has been codified.4

Some companies like AT&T, Arthur Andersen, and Marriott have systematized particular aspects of software development. But respondents to the committee's interviews said that new techniques and metrics for ensuring quality and productivity in software development are still required. The need for better standards and diffusion of software management capabilities will intensify as user systems become larger and more interactive.
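As a rough illustration of the function point systems mentioned above, the sketch below computes an unadjusted function point count from the five standard IFPUG component types using their average-complexity weights; the component counts themselves are hypothetical.

```python
# Minimal sketch of unadjusted function point counting. The weights are the
# standard IFPUG average-complexity values; the tally below is hypothetical.

WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_logical_files": 10,
    "external_interface_files": 7,
}

def unadjusted_function_points(counts):
    """Weighted sum of the five IFPUG component counts."""
    return sum(WEIGHTS[kind] * n for kind, n in counts.items())

# Hypothetical tally for a small reporting system.
counts = {
    "external_inputs": 12,
    "external_outputs": 8,
    "external_inquiries": 5,
    "internal_logical_files": 4,
    "external_interface_files": 2,
}
print(unadjusted_function_points(counts))  # 162 unadjusted function points
```

Even this simple form hints at why such systems are "difficult to implement": classifying and counting the components consistently across projects is a judgment-laden exercise, which is one reason shops like McKesson's resisted the metric.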
These may be the crucial bottlenecks to future IT progress.

CRITICAL ISSUES IN THE MANAGEMENT OF INFORMATION TECHNOLOGY

Other issues critical to the management of IT were identified in the committee's interviews and in its own deliberations. Many of the basic themes are not new, but the emphasis needed within each has changed. The most pervasive themes for improving the use of IT relate to (1) developing genuine information and IT strategies focused on achieving competitive advantage, (2) implementing cross-functional reengineering and restructuring of processes and organizations, (3) actively involving users and customers throughout the processes of design and implementation of IT projects, (4) developing customer-driven measures of quality, (5) compressing the scope and payback time of projects, (6) improving postproject audits, (7)
carefully benchmarking processes and performance against those of outside sources, and (8) installing performance-evaluation and reward systems that are customer driven and that develop intellectual capital.

Information and IT Strategy Seeking Competitive Advantage

Most interviewed executives acknowledged the importance of having an IT strategy. However, the committee found that a majority of the firms contacted described their IT strategies as primarily components or extensions of divisional strategies. For the most part, programs were evaluated and prioritized as a part of divisional planning processes and subsequent procedures for allocating capital. Information technologies were regarded by a majority of the interviewed companies primarily as enablers for other desired divisional or corporate goals (e.g., cost reduction, new-product development, quality improvement). These companies' "IT strategies" were more long-range plans for investment in and installation of IT than explicit plans for integrating IT as part of a competitive positioning strategy. Some companies also had a separate corporate platform integration strategy with its own priorities. Further, some companies reported having a special information systems committee at the corporate level to review and coordinate individual strategic programs. Time horizons for IT plans varied considerably, but 80 percent of respondents operated on either a 3- or 5-year time horizon updated annually through capital budgets and specific operational plans keyed in at 6-month to 1-year intervals. (See Question Box 1 in Appendix D.) CIGNA Systems provides an interesting summary example of such practices. J. Raymond Caron noted,

We have eleven businesses within CIGNA, and we have a strategy for each. Division by division is the way we put the plans together. In essence each division first puts down its wants and needs.
Then the divisions prune this list by setting priorities in terms of capital availability and payoff. They draw a line in terms of what they agree to do and what they take off the table. In addition there is a corporate "platform integration" strategy. This is a kind of umbrella for all twelve systems in terms of data centers, computer platforms, communications, and applications. This largely has to do with CIGNA-Link, which is our PC LAN communications interconnection facility for all our businesses worldwide. Decisions about those are made at the corporate level, and we set overall guidance and direction at that level. As each division puts its strategies, plans, and budgets together, it is incumbent upon them to use those directions. The central infrastructure is budgeted and evaluated separately as a corporate investment, with a charge-back system based upon the amount of use that each area makes of a service.
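The usage-based charge-back mechanism Caron describes can be sketched in a few lines: a shared infrastructure cost is allocated to each business in proportion to its measured use of the service. The division names and figures below are hypothetical, not CIGNA data.

```python
# Sketch of a proportional usage-based charge-back for shared infrastructure.
# Divisions, cost, and usage figures are hypothetical.

def charge_back(total_cost, usage_by_division):
    """Allocate a shared cost to divisions in proportion to measured usage."""
    total_usage = sum(usage_by_division.values())
    return {
        division: total_cost * usage / total_usage
        for division, usage in usage_by_division.items()
    }

# Hypothetical monthly data-center cost and CPU-hours consumed per division.
allocations = charge_back(
    total_cost=900_000,
    usage_by_division={"life": 450, "property": 300, "health": 150},
)
for division, charge in sorted(allocations.items()):
    print(f"{division}: ${charge:,.0f}")
```

The design choice worth noting is that the central investment is still evaluated as a corporate asset; the charge-back only distributes its operating cost, so divisions feel the cost of their usage without each having to justify the infrastructure itself.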
A long-term plan for investing in IT and installing IT is an important beginning point, but a more strategic view (as has also been generated by CIGNA) is desirable. Many interviewed companies said that they had hired at top levels professionally trained IT managers familiar with the business. But with only a few exceptions, neither top managers nor the new IT managers described genuine information strategies that (1) would lead to distinctiveness or competitive preeminence for their companies or (2) provided specific quantitative targets to ensure that the set objectives were met. The committee believes that any information or IT plan is most likely to be successful when it is part of a well-integrated strategic plan that contains such elements. As applications of IT move from an emphasis on traditional cost-cutting toward more strategic concerns, a well-defined strategic process for integrating IT and customer needs becomes particularly important. Edward Hanway, president of CIGNA Worldwide, described his company's approach:

We are using a team-oriented approach and a lot of new software technologies that allow a "try it, test it, fix it" type of development effort as opposed to our historic approach, which was a long, involved, planning and document preparation process. We've been adamant in motivating these multifunctional teams, pushing them to realize that the challenge is not simply to save money. Efficiency is important, but the big issue is how successful we are with a customer or in the market.

Another critical component of any comprehensive information and IT strategy is engagement in that strategy by all levels of the organization. The mere involvement of chief executive officers, chief operating officers, and chief financial officers in developing the company's information strategy is not sufficient; true understanding and commitment are essential.
To exploit the potential of IT in facilitating strategic change requires consistent long-term support across all divisions. Only top management can provide the patience, consistency, and reward structures that make it possible to execute forward-looking strategies despite urgent operating pressures. These are complex tasks requiring top-level finance, human resource, and operating champions to lead needed changes. For many companies, ensuring such leadership may require making major changes in the selection and promotion criteria used for top managers themselves.

Cross-Functional Reengineering and Reorganization

Successful installation of IT increasingly requires that both information technologists and user groups look beyond their old functional boundaries. Both applications-specific expertise and familiarity with the latest IT are critical. Ensuring continuous integration between technologists and users
throughout planning and implementation both improves results and shortens design cycles. In the words of Edward Hanway,

By closely integrating the people who do the IS [information systems] work into the management structure of the business units, we have gotten terrific results in a much quicker time frame than we ever would have hoped for previously. Attitudinally, there has been a major change. Now, there is none of that age-old "we versus they," or "throw the requirements over the wall and expect us to deliver" mentality.... The productivity of IS people has increased dramatically as a result.

Changing interfunctional processes usually requires readjusting organizational structures across those functions. Major restructuring, in turn, involves significant risk and takes time to accomplish smoothly. As a result, large organizations often implement such changes incrementally. To reduce risks they frequently run a series of smaller trials or experiments with potential new organizational modes, develop coaches for these modes, create systematic ways to learn from their experiments, and then attempt to diffuse this learning throughout the firm. All this takes time and continuous management attention.

To exploit IT effectively in such cases, the committee thought it helpful for managers to develop an explicit transition plan involving certain key steps. Critical among these is how to "reskill" the organization for its new roles, particularly for the flexible, multiskilled, team-oriented structures that seem to be emerging. Neither public education nor company training programs have been particularly attentive in the past to meeting the needs for new skills. Many companies lack experience with new organizational and operating modes and may need to seek the necessary expertise outside their own boundaries.
As part of the process of change, companies often need new methods for evaluating the contributions of individuals and teams. Important among these are measures and incentives for their line managers to develop people (and systems) as "knowledge-based assets" (a topic discussed below in the section "Customer- and Knowledge-driven Performance Evaluation and Reward Systems"). Many respondents commented that the first performance gains from IT's use were often obtained by merely automating existing activities. But over 70 percent of respondents said that in today's environment, careful process analysis and reengineering are key to improving benefits from the use of IT. In its classic form, reengineering involves a series of rather well-understood steps:

1. Analyze the total process to determine whether it should be done at all, that is, whether its value added exceeds its costs;

2. Analyze each major step within the process to determine its value added versus its cost and whether it can be eliminated, simplified, or consolidated with another step;
3. At each major step, analyze each component activity, eliminating, consolidating, or simplifying those that remain;

4. Map the remaining component activities into a new system and analyze to see where further steps may be eliminated, consolidated, or simplified;

5. Install new organizational structures and systems specifically designed to implement the remaining activities as efficiently and effectively as possible; and

6. Follow up to ensure that the new process is operating as intended.

When ineffective processes are not reengineered, the application of IT to those processes simply results in performing ineffective processes more rapidly. Respondents often noted that important contributions to productivity and effectiveness through the use of IT frequently came less from the computerization of existing activities than from the transformation of a process itself.5 Successful automation is often targeted first toward those steps that are most onerous, complex, boring, or time consuming. By defining and analyzing the steps and outputs of these processes down to their smallest repeatable microelements, IT-based systems can be introduced that operate at lowest cost, yet still maximize flexibility, control quality, and capture essential information for future use. When these steps were carefully undertaken with user and customer participation, startling results were often reported. Roger Ballou of American Express described that company's process and gains as follows:

If you don't reengineer the work flow to take advantage of the technology, you're just doing the same inappropriate things quicker. People have a tendency not to think about what the technology will let them do once it's in; they just do what they're doing today. In a simple example, our business travel operations historically had something like 31 manual checks done to ensure accuracy, lowest fare, and things like that.
In two stages we automated about 75 percent of those quality assurance checks. To do this right we had to reengineer the whole work flow of the reservation process and ticketing functions in our offices. If we had just put the automation in, because of the 25 percent of checks that still have to be done manually, we probably would not have generated any substantial savings. But by designing a statistically valid sampling technique that could ensure accuracy levels above 98 percent on this manual 25 percent, we have been able to generate tremendous savings. We have a department of process engineering that provides the technical expertise for these tasks, but they do it in and with the offices affected. We call the process "brown paper reviews," because when they go into the offices they use big sheets of brown paper to lay out the process engineering work flows of all steps, working with counselors and with office personnel to get ownership of the solution.
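A sampling scheme of the kind Ballou describes can be sized with a standard binomial calculation. The sketch below is purely illustrative: the 98 percent accuracy target comes from the quote, while the confidence level, the margin of error, and the function name are assumptions.

```python
import math

def qa_sample_size(expected_accuracy=0.98, margin=0.01, z=1.96):
    """Number of manually checked items needed to verify an accuracy
    level, using the normal approximation to the binomial.
    z=1.96 corresponds to 95 percent confidence (an assumption here)."""
    p = expected_accuracy
    return math.ceil(z**2 * p * (1 - p) / margin**2)

# To confirm accuracy above 98 percent, within plus or minus one
# percentage point at 95 percent confidence:
n = qa_sample_size()
```

Under these assumed parameters, only a few hundred sampled transactions per period would be needed, which suggests why sampling the manual 25 percent can be so much cheaper than checking every item.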
178 INFORMATION TECHNOLOGY IN THE SERVICE SOCIETY

A cross-functional and strategic orientation to process development is critical. Without it functional groups are likely to introduce IT on an internal ad hoc basis without considering performance gains that might be obtained by looking outside their traditional organizational boundaries. Such an approach often has led to major suboptimizations. Because most organizations tend to grow initially around specialized functional expertise, over time a number of steps may be built into business processes merely to accommodate functional decision making and the power politics associated with functional structures. Appropriate reengineering can break down these structures and substitute cross-functional processes and team decision making, offering new opportunities and payoffs. Increasingly, improved communications and capabilities for data exchange enhance and enable these opportunities. At CIGNA Corporation, Melvin Ollestad, senior vice president, Claims, Employee Benefits Division, described how his corporation's cross-functional reengineering is specifically matched to customer needs:

The company has a Systems Development Methodology (SDM) for cross-functional design. In our division, first of all we now try to decide how we want to run the business, what is the best way to serve customers. We try to understand exactly what that means and to reengineer how we want to deliver each specific service. Then we put in place the technology to support us in doing that. We had become functionalized within the offices so that we had one group of people opening the mail, a second doing basic inputs to record a claim, a third group to do data entry, another group for the adjudication process, other specialists on various kinds of claims for reference purposes, and still another group that generally answered questions, phone calls, and so on.
The idea of putting together teams of people and dedicating them with appropriate IT support to customers appeared to be a good way to get a needed transformation. It has had the benefit of aligning us much better and more deeply with our customers. The customers like it a lot. The employees feel better about having control, being able to make decisions, and being heard. We also find that we need fewer management supervisory people, and the ones we do have are in a facilitating mode, which is more rewarding to them as people get used to the process.

Interviewed companies repeatedly reported that, when cross-functional reengineering is focused on specific problems, total processing times could drop from days or weeks to a few hours or minutes. Box 5.2 presents an example from the literature. From the customer's viewpoint, cross-functional structures also often prove to be much more effective with respect to quality, improving both efficiency and responsiveness to customers' needs. J. Raymond Caron of CIGNA Systems described that company's approach to cross-functional reengineering and the benefits received from that approach:
BOX 5.2 Reduction of Application Processing Time at the IBM Credit Corporation

The IBM Credit Corporation provides financing for potential customers of IBM computers, software, and services. Prior to reengineering of operations, a typical credit request took approximately 6 days from submission to approval. Several people handled each request: order takers, credit checkers, pricers, and the like. However, managers discovered that the actual processing time for the typical credit request was 90 minutes: for most of the 6 days, the credit request sat on someone's desk. The company reengineered its operations, replacing specialists in the different departments with a single generalist, a "deal structurer," who processed an entire credit request from beginning to end. For difficult applications that might require more expertise for proper processing, the deal structurer calls on a small pool of specialists. IBM Credit Corporation cut its average turnaround time from 6 days to 4 hours; in addition, it now processes 100 times as many deals as before with a slightly smaller work force.

SOURCE: Hammer, Michael, and James Champy. 1993. Reengineering the Corporation: A Manifesto for Business Revolution. HarperCollins, New York, May.

We now have two kinds of reengineering. We have functional reengineering and what we have termed strategic reengineering. In a functional reengineering setting, we work with a claims department, an underwriting department, or a marketing department, looking at the processes within the function. The more powerful and difficult approach is strategic reengineering. Strategic reengineering looks at the way a business is performed from the customer's viewpoint, and slices it across our very strong functions.
This approach clears things up very quickly in terms of what adds value, what does not add value, what gets in the way of meeting customer needs, and what the cost is of providing products and services. We try wherever possible to include in the effort customers, distributors, and all of the organizations within the business that are involved with the value chain, as well as suppliers. This type of reengineering takes a lot of effort to do because of our strong historical functional organization bias. But we find it yields the most results.

At BankAmerica, Martin Stein, vice chairman, noted,

One of the effects of IT is that we now have really two types of organizations. We have our traditional hierarchical, vertical organization, where we tend to centralize things like back-office functions. But IT has also
180 INFORMATION TECHNOLOGY IN THE SERVICE SOCIETY created a real operating organization which goes the other way, and that's based on cross-functional teams. Our COIN [Customer On-line Informa- tion Network] system has worked because people formed a cross-function- al design team to match this process where you couldn't tell users from technology people. The projects that succeed on a large scale appear to have those characteristics; the ones that fail don't. We're at the point now where if we don't see that type of informal cross-functional work team organization, we won't do the project. Carefully implemented, cross-functional process mapping and reengineering- along with use of self-directed teams~an result in the elimination of numer- ous steps and often two or three organizational layers involved in managing interfunctional processes. For companies interviewed by the committee, such approaches frequently resulted in real-time (computer-based) interactions and much-enhanced (personnel-based) responses to customers. For example, self- directed "800 number" teams supported by new IT systems often consolidated various order-processing, customer-inquiry, customer-response, field-dispatching, and trouble-shooting activities that had built up huge functional bureaucracies in the past. Such interfunctional applications were among the highest "quick- payoff' applications of IT reported in interviews. Traditional hierarchical management structures are difficult to reconcile with the needs of cross-functional teams. Successfully reengineered busi- ness processes often result in very flat, network, team, cluster, inverted, or other new organizational modes.6 Indeed, the full realization of benefits from using IT generally requires not just extensive investments in hardware but a complete overhaul of the firm's traditional organizations, systems, practices, and culture. 
Very few respondents said that they had an explicitly stated goal of flattening an enterprise's organizational structure by using IT. Such goals, announced publicly, might generate considerable resistance to change. However, the potential payoff is high enough to warrant conscious consideration and planning for organizational restructuring in any proposed process change. In fact, a majority (60 percent to 80 percent) of companies interviewed by the committee found that the use of IT had had an impact on their organizational structure (e.g., changing spans of control, facilitating organizational flattening, or encouraging use of self-directed teams). But very few had made a full transition to supporting their new organizational structures with both new customer-oriented measures of performance and new reward systems. Many said that they were experimenting with such changes now. Companies reported both increased centralization (usually in data centers or databases) and increased decentralization (usually in operations) resulting from their use of IT. (See Question Box 5 in Appendix D.)
Continuous User and Customer Involvement

Direct and intimate user involvement in the specification, design, and implementation of IT systems was a strong contributing condition for success in the vast majority (over 85 percent) of the companies interviewed by the committee. (See Question Box 6 in Appendix D.) This is not surprising, since the applications-specific knowledge needed for effective implementation is far more likely to reside in the minds of users (whose job it is to understand the application) than in the minds of information technologists (whose job it is to understand the technology).

Many of the companies interviewed by the committee found that external customers also have a valuable role to play during the planning and installation of IT-based systems. Unfortunately, relatively few companies directly involved such customers beyond early specification stages, i.e., in system design or implementation. Because customer needs ultimately define the nature and success of many applications, direct input and feedback from external customers can be very helpful in creating effective IT systems. As IT applications become more strategic, involving external customers may become even more important. Modern software and improved development practices can facilitate the kind of highly interactive prototyping that greatly increases both the quantity and quality of user input. The most effective user and customer involvement occurs when:

· Users and customers participate interactively. Since user and customer needs are often difficult to conceptualize and articulate, close user interaction with information technologists, or still better, interaction with possible prototypes, can help uncover hidden but real needs.
Focus groups and experimental facilities for testing consumers' responses are important ways to obtain input, but observing users interacting with a prototype can add important new insights. A vice president of a major airline described that company's approach:

We have consumer inputs for our system designs from the airports where we have implemented test systems. In addition, we have user inputs from the people who are actually working at the counter. For them, we use a prototyping technique where we actually are coding and developing right along with the users. As we put in a feature, they will recommend changes, redesign it, implement it, continually modify it, until we have reached something we all feel is satisfactory.

· Users and customers are involved continuously. Because many factors critically affecting usability are decided not in original specifications, but in design and implementation stages, many companies have found it very helpful to involve users and customers throughout implementation. As BankAmerica's Martin Stein stated:
If we can't get user commitment, no matter how important the system is, we won't do it. We try to do reengineering with each process change. But it is not a complete reengineering before installing the technology. It's almost an existential process. We reengineer it as we go along. One of the things we discovered is that if we are really smart, we get the benefits we say, but often it turns out that we don't get them from the places we thought we would. If you have a situation where there is a clear general idea of what the payoffs are and there are compelling economics, as you go along you get more perfect knowledge of where the gains are and can guide the process in those directions. Each reengineering iteration seems to refine it. We use interfunctional teams to let us design within these dynamics.

· Groupware technology facilitates collaboration. Collaboration among technologists, users, and customers located in geographically separated sites can be cumbersome and can make rapid changes and iterations difficult. Moreover, maintaining multiple versions of a large development system can become a logistics nightmare. Technology that provides computer support for cooperative work, called "groupware," can reduce the impact of physical dispersion, as well as provide for managing a system with multiple developers or multiple users. At Chase Manhattan, Craig Goldman, senior vice president and chief information officer, reported,

Using Lotus Notes my developers, working with customers, can develop applications in hours and days that once took months and years. These are very user-friendly applications, with a high degree of rapid prototyping. Now our customers can actually see the development taking place before their eyes. What this has done is to make them believers that things can be done.
They have wound up spending more time than they ever would have in the past working with the developers, interfacing with them, and actually developing the technology. We have come up with better input and better ideas from the key people using the technology, and also better products coming out the back end because there was greater involvement on the part of customers. In some of our areas, this involvement extends to surveys that go right out to our external customers.

Customer-driven Measures of Quality

One of the more important trends in managing IT is the attempt to develop better metrics to measure and manage quality from the customer's viewpoint. As in manufacturing, companies' financial measures of revenues or returns may provide poor gauges of the quality of output, especially in the short run. Several respondent companies had developed elaborate formal nonfinancial measures of service quality. The most straightforward of these involved internal engineering metrics. Since service quality is often produced in the same moment that the service is consumed, many
respondents had installed on-line (real-time) IT systems to ensure the delivery of crucial elements of quality that could not be achieved otherwise. Thus when financial service representatives pull up the file on a new product, they are constrained by numerous rules, limits, and procedures built into the software to ensure that all relevant data are checked and that no impermissible commitments are made. The customer is served faster and more accurately, yet the costs of internal processing and errors are reduced. Fast-food operations' electronic systems ensure that inventories do not go stale, staffing levels are maintained, cooking temperatures and cycles are correct, customer bills are properly itemized and added, and so on. Such systems allow relatively untrained people to perform tasks accurately that they previously could have performed only imperfectly, if at all. For more sophisticated professional activities, such as design or maintenance operations, architectural management, legal work, accounting audits, bioassays, real-estate evaluations, and investment banking, on-line IT systems have been implemented to ensure a thoroughness, consistency, and quality never before possible.

However, despite their utility, engineering metrics and IT systems that monitor the performance of internal operations can ensure only that internal operations are proceeding as designed, not that those internal operations are providing the customer with real value. Moreover, what the company regards as higher-quality output (more customized service, or a faster response) may not in fact be perceived by customers as more desirable, especially if they must pay more for it. Companies have often found upon checking that customers cared far more about reliability in delivery or pleasant personalities in contact people than about fast response times. They can discover this only by interviewing customers.
As a result, sophisticated companies are beginning to pay substantial attention to measures of quality that are customer-based. A surprising number (43 percent) of the companies interviewed by the committee had instituted such measures.7 Although companies try to collect as much data as possible through automated means, a complete evaluation procedure should also normally include some random visits, customer sampling, and personal observations as assessment tools. These are especially important in understanding certain significant dimensions of point-of-contact service performance like cheerfulness, creativeness, responsiveness, professionalism, or other key characteristics of personal service.

Service companies in particular have taken a leadership role in developing customer-oriented measures of quality. Quality in services often requires extensive interaction at the point of customer contact, is of prime importance to the customer, and has a high potential impact on future sales. As in manufacturing, there tends to be a strong positive correlation between service quality and lower cost. The elimination of errors in producing a
service decreases the costs of coordination and rework, customer service costs, and customer complaints (to say nothing of the unmeasured and possibly greater costs of losses of goodwill at the customer level). Two examples suggest the kinds of approaches companies take to improve levels of customer service while decreasing costs:

· McKesson has defined 42 "customer satisfactors" that it surveys externally and measures internally on a routine basis. A seven-page questionnaire goes to over 1000 customers every year, and updates are made quarterly on a smaller set of factors considered to be the most important. McKesson is now trying to link these "satisfactors" to its compensation-incentive systems. At the strategic level, McKesson also emphasizes five themes for competitiveness: customer-supplier satisfaction, people development, market positioning, relative net delivered cost, and innovation. For all of these factors, McKesson uses internal and external metrics to track its own and competitors' positions as perceived by customers.

· MCI, in addition to using on-line measures of technical quality, does in-depth customer surveys about twice a year and uses the results of other customer questionnaires administered at a lower level of detail (10 questions) monthly. In addition it makes extensive use of focus groups and other techniques to check its general image. Every customer with more than approximately $30,000 a month in billings is surveyed in detail once a year, either informally or through in-depth interviews. The in-depth interviews are done by an outside company. MCI does its own statistical analysis of the surveys that come in from the samples for residential customers. It also uses focus groups to get a more personalized feel for how those customers are responding to particular services. MCI uses formal measures of loss rates, geographically, by customer service center.
In its business communications division, MCI measures these quality factors by branch office at 132 branches. It also measures loss rates by customer segment and does an extensive employee satisfaction survey every 18 months, considering that to be an important factor in customer service.

Respondents reported on a number of other experiments aimed at measuring service quality at the customer level. Nevertheless, interviews indicated little direct use of customer-driven metrics in measuring the performance of specific departments. There was an even greater gap in converting such measures into performance incentives for contact groups. And at the time of the interviews, no companies had converted either the results of customer surveys of quality or data on customers' observations into useful financial measures of service performance. These are important areas for future management attention.
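One way to begin closing the gap just described, converting customer surveys into a usable performance measure, is a weighted composite score. The sketch below is a minimal illustration only; the quality dimensions, weights, and ratings are all invented, not taken from any company interviewed.

```python
# Minimal sketch: turn 1-5 customer survey ratings into a single
# 0-100 service-quality score. Dimensions and weights are invented.
WEIGHTS = {"reliability": 0.4, "courtesy": 0.3,
           "response_time": 0.2, "accuracy": 0.1}

def quality_score(survey):
    """Weighted average of 1-5 ratings, rescaled to 0-100."""
    raw = sum(WEIGHTS[dim] * rating for dim, rating in survey.items())
    return round((raw - 1) / 4 * 100, 1)

survey = {"reliability": 4.5, "courtesy": 4.0,
          "response_time": 3.5, "accuracy": 5.0}
score = quality_score(survey)
```

A score like this could be tracked by department or contact group over time; the harder problem the text notes, translating it into financial measures and incentives, remains a management design question rather than a computational one.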
Compressing Project Scope and Payback Time

Increasingly, companies face the dichotomy that while they seek to increase paybacks from IT through more strategic (usually more complex and longer-range) programs, the life cycle of each generation of technology is becoming shorter. To manage this anomaly, many interviewed companies said that they now proceed incrementally on large IT projects. These companies said that they consciously seek to break such projects down into smaller, more discrete segments, each of which can (1) be justified individually and (2) be integrated incrementally into an agreed-upon system architecture. The broad goals of the overall project and its general costs and benefits are analyzed. Then the output and input characteristics of each major module (and its needed interface standards) are established. These are used to discipline all subsidiary project designs. Then the program is broken down into definable smaller projects, each with finite timing and payoffs. As early projects are successfully implemented, they help to reduce the real and perceived risk on the total project. Early paybacks lower the present value of the total investment. Initial feedback from early projects can be used to guide those that come later in the sequence. Overall project management is simplified and more focused. And there is less political resistance to large-scale projects as early successes ease concerns.

To implement changes in large systems incrementally, companies indicated that they often developed and tested individual modules on a small scale or in a single operating division. As these projects proved their viability, they might then be integrated for testing with other successful projects on a local scale or be rolled out as discrete projects across various divisions.
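The financial logic of this incremental approach, that earlier paybacks are worth more in present-value terms, can be illustrated with a rough discounted cash flow comparison. All figures and the 10 percent discount rate below are assumed for illustration; they do not come from any interviewed company.

```python
def npv(rate, cash_flows):
    """Net present value of yearly cash flows, starting at year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Assumed example: a $3M program spent $1M per year for 3 years,
# returning $4.5M in total (figures in millions).
# Monolithic: no benefits until the whole program is complete.
monolithic = npv(0.10, [-1.0, -1.0, -1.0, 4.5])
# Incremental: each $1M module starts paying back $1.5M a year
# after it is built, so net flows arrive earlier.
incremental = npv(0.10, [-1.0, 0.5, 0.5, 1.5])
```

Both plans have identical undiscounted totals, but the incremental plan's NPV is higher simply because its paybacks arrive sooner, which is the point the interviewed companies were making.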
As a result of such practices, companies could achieve faster paybacks (by not having to wait for the entire program to be completed), and risks were reduced. Despite the rapid rate of improvement and turnover that abbreviates the life cycle of much IT equipment, only a minority (30 percent) of the interviewed companies said that they had a special "hurdle rate" for IT investments vis-a-vis other investments. (See Question Box 1 in Appendix D.) Instead, they adjusted for the relatively quick obsolescence of IT equipment by introducing faster depreciation rates into their calculations. As a prioritizing device, some companies sought 6- to 9-month payoffs on IT projects. Others noted that among successfully implemented projects, the time to actual payoffs rarely exceeded 3 years. For example, Chase Manhattan's Craig Goldman said, "In the data center arena, you have to pay project investments back in the year you make the investment. We are planning on reducing our absolute costs over the next 3 years, every year while we enjoy a 25 percent volume growth; there's a program to support it." At CIGNA Corporation, James Stewart, executive vice president and
chief financial officer, stated: "Increasingly we are shortening the planning time frame. We are focused on shorter-term paybacks rather than building galactic systems. The planning time frame has shortened from the traditional galactic 2 to 3 years down to 6 months."

Planning for payback within shorter periods helps manage technological risks. Fast and effective implementation can also have strong strategic significance. As McKesson's Jon d'Alessio said,

On the application side, you get the possibility of a competitive edge for a while. But the lead time for your corporation is not very long; you clearly don't get a sustainable edge because competitors can respond so quickly. How efficiently you implement, how effectively you do it, and how quickly and well you translate concepts into better customer service: those are the things that differentiate companies.

Many companies interviewed by the committee had further improved control by (1) demanding that divisional systems follow corporate-wide interface and software standards for compatibility and interoperability, (2) employing corporate-level allocation and follow-up processes for interdivisional strategic and infrastructure projects, and (3) using more systematic preproject analyses and postproject audits than in earlier years.

Postproject Audits

The committee found that virtually all interviewed companies reported using formal evaluation procedures for IT projects (which lent themselves to such analyses) before investments were made. (See Question Box 2 in Appendix D.) But postproject evaluations were less universally pursued. A majority (64 to 68 percent) did attempt audits on certain types of projects, notably cost-reduction and new-product programs. Audits for other types of projects were less frequent. (See Question Box 3 in Appendix D.) Postproject audits were often said to be erratic or spotty.
This was of some concern to both the committee and to many respondents, despite the fact that a large majority of respondent companies that had undertaken overall assessments claimed acceptable to high payoffs on their IT investments. Some examples illustrate various viewpoints:

· At the Travelers Companies, Larry Bacon, senior vice president, commented,

Do we do postinvestment audits consistently across the board? No, it's very spotty. Our decentralized style dictates that the divisions run their own businesses. Some divisions do audits very rigorously; others don't. On each project, however, we try to make sure that we do capture the intended benefits.
· At SuperValu, H.S. Smith, vice president, said,

We do cost-benefit analyses prior to the execution of each project. In the past we haven't really been very religious about auditing after the fact. We have audited our capital appropriations, but we don't capitalize software, so we have not audited the results of that.

· At Citibank, Daniel Schutzer said,

I think we are probably equally guilty with everyone else as far as keeping records and checking how well we succeeded on projects and whether we really did achieve expected benefits. We constantly check the milestones for the development itself: whether we delivered in 2 years for the $5 million proposed or whether we slipped and overran. But we don't really systematically ask: Is the project increasing revenue the way we thought? Is it reducing costs or personnel? Sometimes, if there is an immediate reduction of personnel, that measurement may be taken. But for some of the other kinds of incidental benefits (revenue increases, expense savings), it's not clear to me that we do a good job of following up.

Many had installed more rigorous procedures in the last few years. They thought that installing such procedures would undoubtedly force operating and information systems managers to concentrate more on specifying and achieving planned gains, and that this presumably would improve future measured performance gains from using IT. However, in the committee's interviews very few companies mentioned going to the next high-payoff step of systematically analyzing postproject audits to ascertain and catalog those success or failure patterns that could assist in selecting and managing future IT programs. There was little evidence that formal post hoc evaluations were directly used to guide future capital or program budgeting allocations.
Further, such appraisals were seldom used to evaluate line managers' performance or to set metrics for incentive plans. All these seem worthy considerations in improving future program management.

Benchmarking Against Specialized Outside Providers

Benchmarking examines how one's own performance of an activity compares to that of others performing the same activity. Benchmarking studies generally provide better information about business processes than about specific costs. Definitions of data and what is included in various cost categories vary widely among companies. These definitional problems are compounded by all of the usual problems about measuring service output. Consequently, comparisons of best-practice processes generally are more productive. Companies can make significant gains (1) by evaluating and modifying "best practices" observed externally and (2) by deploying their own best practices more widely internally. In addition to improving
many processes directly, comparisons with outsiders can also send signals to groups within the company being benchmarked that their performance can be checked against that of outside service groups and that they must keep up with competitive practices.

The committee found that benchmarking had generally been undertaken only relatively recently. Most interviewed companies benchmarked primarily against other competing peer firms. Only a few (less than 30 percent) mentioned benchmarking against internal "best-in-class" activities in their own firms or in noncompeting external firms. Fewer still benchmarked against specialized external service providers, like ADP Services, EDS, or ServiceMaster, which have widespread reputations for efficiency. Although outsourcing of data centers has become a $7 billion to $10 billion industry in the United States and Japan,8 few interviewed companies mentioned outsourcing as a direct result of their benchmarking studies. More often they updated, consolidated, or modified their internal processes themselves. A major exception was MCI. Richard Liebhaber noted:

Where I get my view of 5- or 10-year technology is by visiting vendors. People ask me how many development engineers I have working for MCI. I say I have 19,000, but they don't work for me; they work for 74 vendors. I view all those vendor engineering people as working for me. So, we go out into their laboratories, find out what they are doing, and influence what they are doing.

A particularly useful type of benchmarking can result when groups of companies voluntarily pool their own information and agree jointly to sponsor a consulting firm to undertake a detailed study of comparative practices. Data on relative performance are then fed back to individual firms.
Each firm's own data are specified for its internal use, but the identity of the remaining participants is disguised by normalizing size (or other distinguishing features) and identifying competing companies only as A, B, or C. For the companies interviewed, such practices offered useful relative calibrations, although not specific financial standards for service performance. For example,

· CIGNA Corporation used outside consultants to compare unit costs of its back office and data centers versus competitors' unit costs for such services (in terms of tape drives, databases, CPUs, and so on). This engineering cost-driven study did not address returns on investment or make specific comparisons with specialized outsiders such as EDS.

· BankAmerica compared its IT performance against that of other competitive institutions in terms of certain key measurements of effectiveness on an anonymous basis. It also used noncompeting peer groups (such as the member companies of the Research Board) for similar comparisons, but it had not specifically evaluated its overall investments in IT with respect to paybacks.
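The pooled, anonymized benchmarking arrangement described above (normalize each firm's figures by a size measure, then identify competitors only as A, B, or C) can be sketched in a few lines. The firm names, cost figures, and the unit-cost metric below are all invented for illustration:

```python
# Sketch of pooled benchmarking: normalize each firm's IT cost by a
# size measure (transaction volume), then relabel every firm except
# the recipient's own as A, B, C, ... All figures are invented.
def anonymized_benchmark(own_name, firms):
    """firms maps name -> (annual_it_cost, transactions).
    Returns size-normalized unit costs with competitors anonymized."""
    labels = iter("ABCDEFGH")
    report = {}
    for name, (cost, volume) in sorted(firms.items()):
        unit_cost = cost / volume            # the normalized metric
        key = name if name == own_name else next(labels)
        report[key] = round(unit_cost, 2)
    return report

firms = {"OurBank": (120e6, 40e6),
         "Rival1": (200e6, 50e6),
         "Rival2": (90e6, 45e6)}
report = anonymized_benchmark("OurBank", firms)
```

Each participant would receive a report like this one: its own position is explicit, the others are comparable but unidentifiable, which is what makes competitors willing to pool data in the first place.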
The following quotes summarize how some other successful companies approached benchmarking:

· At Chase Manhattan, Craig Goldman said:

We have done some very specific studies, both internally and with the help of external sources, to peg us vis-a-vis competition in a number of key areas. In one, we hired Nolan Norton. They took our data from five or six major facilities and compared them to a cross-section of other companies. There was a second study done by Price Waterhouse that said on an individual basis, each of those data centers was efficient and significantly more cost-effective than if we outsourced them. Overall we found we are close to being as efficient as we could get from an outsourcing contract today. By doing further consolidations, we will be more efficient. In addition, we hired Booz Allen to look at our major competitors and to give us some comparative data both on efficiency levels and relative performance trends. We have also made major strides in migrating to common platforms and systems, looked around the network, and picked up the modules from each sector that made the most sense.

· NationsBank's Patrick Campbell, senior vice president, Technology Planning, described that company's approach:

In Dallas, we have a very strong IT user community. For example, we have J.C. Penney's, Frito-Lay, American Airlines, Southwest Airlines, and so on. We have begun dialogues with representatives of these firms in "user communities" or "user groups." By exchanging information with noncompetitors, we can take pages out of their book just about every day and not have to reinvent the wheel. This benchmarking group is very selective about who can join. One of the rules is that the prospects need to be Dallas-based so we have close proximity. Basically, participants must be at the corporate office level and hold the position of senior technology planner on the company team.
None of the companies can compete directly with one another.

We also benchmark our internal operations against established outsourcing suppliers on a continuing basis. We do "best-in-class" analyses of their processes as well. We try to position ourselves between the outside vendor community, like AT&T or IBM, and our customers. One of the benchmarks to which we compare ourselves is the ability to provide IT services to our internal and external users at a competitive price. In other words, if AT&T can perform a service for 12 cents a minute, our gauge is to be less than 12 cents per minute. If we can do that, we are basically in a sound business position; we are not a net overhead cost the way many organizations are.

Benchmarking has received widespread attention only in recent years. Even so, the most common type of benchmarking appears to be a comparison of a firm's performance in a given activity to that of other peer companies. Comparisons to specialized service providers, to smaller firms, and to firms not in the same industry are much rarer. Since activities are relatively generic (as noted in Chapter 4), it should not matter to the benchmarking company whether the best-practice provider of a given activity is a peer company or an external specialized service provider. Including specialized external service providers in the comparison group can be especially useful, because such companies make their living by concentrating on an activity and providing it more efficiently or effectively than others.

Customer- and Knowledge-driven Performance Evaluation and Reward Systems

A company's prosperity in the long run is intimately linked to the way in which its reward structures are aligned with its corporate goals. The committee discussed in depth the question of whether, in corporations of the future, the management of intellect (or intellectual processes) and the capital embodied in knowledge-based assets will be the primary bases on which they compete. Even today, knowledge-based service activities such as research, design, product or process development, buying, trading, marketing, advertising, systems integration, software development, and logistics management contribute most of the value added in manufacturing enterprises. Whole service industries like consulting, accounting, financial services, the law, health care, entertainment, and many aspects of the communications field also depend on the value added by intellectual processes. The most valuable assets of firms in these industries typically lie in their technological and professional know-how, their flexible response and innovation structures, and their knowledge about customers and markets. These assets reside in the minds of individual staff members, in software programs, in information and management systems, and in the databases of the companies.
Indeed, the management of intellectual capital may well be a major factor in determining who survives and who does not in the coming years. To quote Walter B. Wriston from The Twilight of Sovereignty, "Information, in the words of Leon Martel, is 'rapidly replacing energy as society's main transforming resource.'"9

Some studies have suggested that the management-evaluation and incentive infrastructures of companies have not yet been adjusted to take full advantage of the opportunities that the use of IT offers.10 The committee's interviews support these contentions as they pertain to performance evaluation and reward structures. If a firm's competitive edge rests on its knowledge-based assets and its superior customer service, reward systems need to be able to measure such assets, to recognize individuals and teams whose work contributes to superior customer service, and to reward these people accordingly (Box 5.3 gives an example).
BOX 5.3 A Knowledge-based Reward System

In 1992, Salomon Brothers was planning to install an employee compensation system based on the knowledge that people bring to their firm. A new employee with no knowledge about the financial business receives a certain level of base pay. Employees are organized into teams that specialize in various products such as corporate bonds. To earn a raise, the employee must complete an assignment requiring a certain set of skills; as the employee masters a wider and wider variety of skills through progressively more difficult assignments, his or her compensation will increase. Salomon Brothers expects that employees trained under this new arrangement will complete transactions more quickly. But it expects its biggest payoffs from how decisions about new products and evaluations of new businesses are made.

SOURCE: Gabor, Andrea. 1992. "After the Pay Revolution, Job Titles Won't Matter," New York Times, May 17, Business Section, p. 5.

However, few respondents reported direct connections between (1) their customer-based measures of performance and quality and the incentives offered to those handling contacts with end customers (although many said they were currently experimenting with such arrangements) or (2) improvements in knowledge-based assets and the rewards given to managers. It is ironic that financial markets often reflect the value of intellectual assets (through a company's "Q value," i.e., its market value versus the replacement value of its physical assets) but that the company's own books and performance-evaluation systems rarely do. The value of such assets does not appear in published financial data or in the "asset" accounts used for internal controls. The omission of such factors in performance evaluation and reward systems could pose major long-term problems for service companies competing in a customer-driven, information-intensive era.
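The "Q value" mentioned above is simply the ratio of a company's market value to the replacement value of its physical assets; a ratio well above 1 implies that the market is pricing intangible, knowledge-based assets that the company's own books omit. A minimal sketch of the arithmetic, with hypothetical figures:

```python
def q_value(market_value, asset_replacement_value):
    """Ratio of a firm's market value to the replacement value of its
    physical assets (often called Tobin's Q).  Values well above 1
    suggest substantial intangible, knowledge-based assets."""
    return market_value / asset_replacement_value

# Hypothetical figures, $ millions
market_value = 4_200
replacement_value = 1_500

q = q_value(market_value, replacement_value)
implied_intangibles = market_value - replacement_value

print(f"Q value: {q:.2f}")                                     # 2.80
print(f"Implied intangible assets: ${implied_intangibles}M")   # $2700M
```

The gap between the two figures is exactly the value that, as the text notes, appears nowhere in published financial data or internal "asset" accounts.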
SUMMARY AND CONCLUSIONS

Although a large percentage of interviewed companies felt they had received adequate to high payoffs from using IT, there were a number of areas in which the committee thought managers could seek greater performance advantages. Principal among these were (1) developing and obtaining top management commitment to genuine information and IT strategies focused on gaining strategic advantage, (2) more extensive cross-functional reengineering and reorganization of processes affected by IT, (3) expanded
user and customer involvement in all aspects of the design and implementation of IT projects, (4) improved customer-driven measures of quality installed and in use, (5) an increased focus on shorter-term payoffs for IT investments within a long-term strategic framework, (6) better-developed postproject audits, (7) more external benchmarking and increased consideration of "best-practice" processes from outside specialist service groups, and (8) expanded use of customer- and knowledge-driven performance measurement and reward systems throughout the firm. Even the committee's sample of sophisticated respondent companies often needed improvement in these areas. Most of the problems respondents reported in achieving payoffs from investment in IT, when such problems existed, came not from overinvesting in IT but from management inadequacies in planning and implementing IT systems. Both Chapters 3 and 5 have highlighted some of the more interesting ways experienced managers have found to improve their success in using IT's potentials. Nevertheless, there is room for further improvement.

NOTES AND REFERENCES

1. Strassmann, P. 1990. The Business Value of Computers, Information Economics Press, New Canaan, Conn.

2. Roach, S. 1989. "Pitfalls of a New Assembly Line: Can Services Learn from Manufacturing?," Morgan Stanley, New York. Also, Roach, Stephen S. 1991. "Services Under Siege: The Restructuring Imperative," Harvard Business Review, September-October, pp. 82-91.

3. Computer Science and Telecommunications Board, National Research Council. 1992. Computing the Future, National Academy Press, Washington, D.C. Also, Computer Science and Technology Board, National Research Council. 1989. Scaling Up: A Research Agenda for Software Engineering, National Academy Press, Washington, D.C.

4. Humphrey, Watts S. 1989. Managing the Software Development Process, Addison-Wesley, Reading, Mass.
5. In these cases, IT itself is not irrelevant. IT often provides a key element in the new process.

6. A discussion of these organizational modes can be found in Quinn, James Brian. 1992. Intelligent Enterprise, Free Press, New York, Chapters 4 and 5.

7. Other studies indicate that many service institutions lack such formal feedback techniques for measuring the quality of services. For example, one study found that 70 to 90 percent of all banks were in this category. See Geisler, E., and A. Rubenstein. 1988. "Measurement of Efficiency and Effectiveness in the Selection, Usage, and Evaluation of Information Technology in the Services Industries," Joint Meeting of Institute for Illinois and Industry Information Council, August 31.

8. The National Academy of Engineering is currently studying some important aspects of outsourcing that will be discussed in a forthcoming National Academy of Engineering report, Preparing a Global Economy: A New Mission for U.S. Technology.

9. Wriston, Walter B. 1992. The Twilight of Sovereignty, Scribners, New York.

10. McKersie, R., and R. Walton. 1988. "Implementation of Information Technology: Human Resource Issues," MIT, Sloan School of Management, Management in the 1990s Program, Cambridge, Mass.