Panel V
What Have We Learned and What Does It Mean?

Moderator:

Carol A. Corrado, Federal Reserve Board of Governors


David F. McQueeney, International Business Machines

William J. Raduchel

Marilyn E. Manser, Bureau of Labor Statistics

Kenneth Flamm, University of Texas at Austin

Jack E. Triplett, The Brookings Institution


Dr. Corrado praised the value of the day’s discussions for economists like herself whose reason for attending had been to gain an understanding of the prospects for technology. She introduced Dr. Manser, head of the productivity group at the Bureau of Labor Statistics, as the only member of the panel who had not spoken in the course of the day’s program, then called upon Dr. McQueeney to begin.

Dr. McQueeney observed that while many of the day’s speakers had alluded to Moore’s Law, all had offered different views of how it applied to their industries, each of which was unique. He raised a question about the future of the information technology industry that he had often discussed with colleagues at IBM: “Are we going at some point to slow down the rate of innovation, or are we not going to slow it down but cross some ‘good-enough’ thresholds, so that some parts of the industry will become mature whereas other parts perhaps will not?” If it were the latter vision that turned out to be the more prescient, he stated, “it means we’ve picked the low-hanging fruit and filled up the valleys with rainwater, and perhaps the way you create value changes.” The industry had in the past been able to create value by applying microscopic, core-level technology at the bottom of the food chain and having the value trickle up to the top, the only place where it matters to customers; it was this, he noted, that might be in for a change.


A second point that had come up in the course of the day’s discussions was that business value was no longer contained in one place: in one country, at one site, or in one company. Through web services, a way of finding and executing business capabilities electronically on the web, any geographical connection between the place where one business process was executed and the place where another linked business process was executed had been completely broken. As an example, he cited companies’ using global resources to enable them to locate a help desk in a different part of the world from its technical operations. This phenomenon raised the question of whether value—to the customer, in the IT industry, or from the point of view of economic competitiveness—was any longer localized. “I think that it is not,” Dr. McQueeney declared.

A third issue identified by Dr. McQueeney was whether investment in the consumer electronics industry would drive future innovation on the commercial side of IT. He noted that IBM’s single highest-volume processor-chip business was in game systems, where the company was partnering with Sony and Toshiba. Since the microprocessor that gets the most design resources is the one that ends up the most efficient and most advanced, he said, even though IBM could charge more for a mainframe microprocessor, and might be able to make the best system because the system is very complex, there was “no way that that market could afford the engineering bill to make the best microprocessor.” Mainframe microprocessors 10 years hence would probably resemble processors developed for the “consumer-gaming/handheld part of the industry,” whose customer base was so much larger that the amount of development investment that could be made in it as a fraction of revenue was much higher. Including displays and graphics chips along with microprocessors, he said that the leisure-spending or disposable-income market provided “a tremendous source of investment for things that are at the bottom of the food chain for the commercial side of IT.” It furnished new ways to innovate that had not existed when the IT industry was rather self-contained in its investments.

Speaking next, Dr. Raduchel pointed out that the IT industry was not static and said huge changes could be expected in the future. Noting that the industry was “very supply-driven,” he rated the attempt to measure the contribution of IT to economic productivity as a very difficult challenge. Because advancement wasn’t “being driven necessarily by [the] customer” but by the fact that staying alive in the industry meant heading off competitors by putting out the best product possible, the process was characterized by technologists’ efforts to do their best within the laws of physics. Such a process “doesn’t lend itself well to hedonic price indexes” and, in general, posed obstacles to arriving at easy output measures.
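A hedonic price index, the technique Dr. Raduchel was referring to, is typically estimated from a regression of price on product characteristics. A stylized time-dummy formulation, offered here only as an illustration and not as any statistical agency’s actual specification, is

    \ln p_{it} = \alpha + \sum_k \beta_k x_{kit} + \sum_{\tau} \delta_{\tau} D_{i\tau} + \varepsilon_{it}

where p_{it} is the price of model i in period t, the x_{kit} are measured characteristics such as clock speed, memory, and storage, and the D_{i\tau} are period dummies; the quality-adjusted price change between adjacent periods is then exp(\delta_t - \delta_{t-1}). The difficulty he pointed to is that in a supply-driven industry the relevant characteristic set itself changes with each product generation, so the regression cannot simply be carried forward unchanged.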

Identifying a second challenge with the phrase “the network is the computer,” Dr. Raduchel observed that the industry was prone to “talking about computers as isolated devices,” which, he added, “they are not.” Networks were causing massive changes in how the systems that real-world people used were structured and built. Alluding to a STEP Board plan for a conference on telecommunications, he cautioned that developments in switches were just as important as those in the PC for overall delivery to customers. Pointing to an earlier reference to the change brought by dense wavelength-division multiplexing (DWDM), he called the financial sector’s failure to reckon with the potential invention of a technology that would increase the capacity of existing fiber-optic cable a thousandfold “the biggest misjudgment in the capital market in the history of the world.” “Every time you touch this problem, you begin to realize how big it becomes,” he observed. Moreover, none of it would be of any consequence unless software were brought into the picture, another major challenge. Software was often accused of being sloppy, but sloppy software might still be of great value in solving a problem. And sometimes the cause of its sloppiness, that many brains went into developing it, was also the key to its effectiveness.

Dr. Manser began by noting that a great deal of attention had been paid to the importance of the high-tech sector in explaining the productivity growth and the productivity speedup that occurred during the latter part of the 1990s. High-tech equipment affects labor-productivity growth in two ways: through the use of high-tech capital services (that is, the flow of services from the stock of high-tech equipment and software) throughout the economy by producers in all sectors, and through productivity improvements in the industries that produce high-tech equipment. BLS data for the non-farm business sector showed that, in combination, those two high-tech effects accounted for roughly two-thirds of the speed-up in labor-productivity growth that occurred in the latter part of the 1990s relative to the 1973–1995 period, a result she called “striking.”
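The two channels Dr. Manser described map onto the terms of a standard growth-accounting decomposition. In stylized form, and not as a statement of BLS’s exact methodology,

    \Delta \ln(Y/H) \approx s_{IT}\,\Delta \ln(K_{IT}/H) + s_{O}\,\Delta \ln(K_{O}/H) + \Delta \ln \mathit{MFP}

where Y/H is output per hour and K_{IT} and K_{O} are high-tech and other capital services weighted by their income shares s_{IT} and s_{O}. The first term captures the economy-wide use of high-tech capital services (capital deepening), while productivity improvements in the industries that produce high-tech equipment enter through the aggregate multifactor-productivity term, weighted by those industries’ share of output.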

BLS also produces labor-productivity and multifactor-productivity measures for industries in the U.S. economy at the three-digit classification level. Labor-productivity change is measured by relating changes in real output to changes in worker hours. Multifactor-productivity change, which is somewhat more complicated to measure, relates real output to changes not only in worker hours but also in capital services, intermediate purchases, and, in some cases, worker skills. For the period 1987–2000, the rate of growth of labor productivity in the U.S. non-farm business sector averaged 1.8 percent per year, which was generally regarded as strong. BLS calculates productivity using data from a variety of sources on, among other factors, revenues, prices, and labor hours. Within the four-digit industrial classifications that make up the computer and semiconductor area, the strongest measured labor-productivity growth was 42 percent per year for semiconductors and related devices, a rate she called “pretty phenomenal compared to the 1.8 percent for the economy as a whole.” The data also showed labor-productivity growth averaging 37 percent per year for electronic computers, 17 percent per year for computer peripheral equipment not elsewhere classified, and 15 percent per year for computer storage devices.
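In symbols, the two measures can be sketched as follows, again as a stylized illustration of the standard definitions rather than the Bureau’s published formulas:

    \Delta \ln \mathit{LP} = \Delta \ln Y - \Delta \ln H

    \Delta \ln \mathit{MFP} = \Delta \ln Y - \left( w_H\,\Delta \ln H + w_K\,\Delta \ln K + w_M\,\Delta \ln M \right)

where Y is real output, H worker hours, K capital services, M intermediate purchases, and the w’s are cost shares summing to one; adjustments for worker skills enter by quality-adjusting the hours term.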

Dr. Manser had pulled these figures together before attending the conference because she wanted to know whether what she heard from the day’s speakers would raise questions about the measures BLS had been using; controversy is constant in the measurement community over how the Bureau measures output in the high-tech sector. Her conclusion, particularly in light of the “phenomenal things that had happened in these industries over time,” was that nothing that had been said suggested the BLS numbers were overstating labor-productivity growth. While the way prices and real output are measured is very important to BLS in general, it is also important to use price measures that are consistent over a long period, because looking at productivity means looking at trends over time.

Turning to the question of workers, about which she noted little had been said in the course of the day, Dr. Manser said that BLS charted what it calls “labor composition,” which concerns the impact of changes in workers’ skills on productivity. BLS’s official series on labor composition broke out data only for the overall business sector and for the non-farm business sector. It would be important to examine the impact on productivity of changes in the composition of labor for the high-technology industries, she said, adding that she hoped this could be done in the context of work BLS was embarking on jointly with the Census Bureau, where a “very rich” new dataset had been developed, to expand understanding of labor composition.

Dr. Flamm said he would offer two quick comments, the first of which was that there had been a certain tension among some of the presentations. A number, Dr. McQueeney’s and Dr. Bregman’s among them, had focused on the problems of measuring technical advance and productivity in the large, complex systems used by large organizations. Dr. Flamm wondered whether the big volumes and big dollars in computer hardware and software were at that moment centered in the large-enterprise, complex-system market or in the small-to-medium-business and consumer market. If the market were conceived of as bifurcated in that way, then the measurement problems for many of the issues discussed would be far less arduous in the small-to-medium-business and consumer segment than in the market for large, complex systems, whose owners write their own code and construct their own storage systems. Dr. Flamm proposed the following as a characteristic to distinguish the two markets: in the small-to-medium-business and consumer segment the storage administrator was the user, whereas in the market made up of large, complex systems there were storage administrators who were not the users. He said that Dr. McQueeney had been unable to tell him which market was larger and that his own guess was that most of the dollars were actually in the simpler, stand-alone systems.

Dr. Triplett countered that the services industries were the big buyers.

But Dr. Flamm, insisting that he would like to see the numbers, reiterated that the problem of measuring big, complex systems was different from that of figuring out what the PC and the software used by his accountant cost and what the change had been over time.


His second comment was that perhaps one way of approaching the issue of how to figure out which characteristics were the right ones to measure, which was, he noted, one of the objectives of the day’s meeting, would be to set aside the notion that there was some fixed set of characteristics that would need to be documented consistently over time for everything. Following up an earlier statement by Dr. Silver, he suggested instead coming to grips with the fact that the relevance of characteristics might shift as technology advanced. “There are waves or generations of technology,” Dr. Flamm noted, “and there’s one set of characteristics that are appropriate for one wave or generation, but then the world shifts and everyone quickly forgets about the previous wave.” Once one technology has given way to another, the businesses involved are no longer concerned with collecting data for metrics that applied to the previous technology; economists and business statisticians still need those data, as does the government to the extent that it must enforce export controls. Concluding, he noted that export controls induced businesses as well to keep collecting the older data and suggested that persuading the government to take a list of characteristics drafted by the STEP Board as the basis of its export-control system might open up a very fertile source of data.

Dr. Triplett began his remarks by suggesting that the tension of which Dr. Flamm had spoken might have resulted from the variety of the professional groups represented among the conference presenters and participants. With the focus of both the presentations and the questions differing to a noticeable degree, it was possible that the bridge between technologists and economists had not been completed, but he judged the session to have been productive in any case.

Dr. Triplett said that hearing experts in the field say that software was very hard to measure had been a learning experience for him. “I thought I knew that when I came in here,” he observed, “but part of my concern was that I thought maybe it was hard to measure because I didn’t know enough about it. So the good news is that it wasn’t my fault.” The bad news, however, was that there had not been as much progress made measuring software as had been desired. He pronounced himself “not too sanguine” on significant results’ being obtained from analyses of the impact of software changes on customers, saying that in the complex world of the business environment it would be hard to hold constant all that would need to be held constant in order to perform the experiment of putting in a software innovation and seeing what its effect was. “I’d like to see somebody try it,” he said, “but if I had government research money to hand out, I’m not sure that would be one I’d think would be really profitable.” The idea had, in fact, put him in mind of a presentation he had seen on measuring the contribution of consulting firms to their clients’ profits—a “great idea,” but one that ignored the complexity of the world in general and, in particular, the manifold reasons for which firms hire consultants.

Dr. Triplett then turned to the benchmarks, saying that the “possibility of using real information on the cost of doing a job with different kinds of methodologies” had been discussed in the economics literature since at least the 1980s. A first problem with benchmarks was that in time-series comparisons, which are what interest economists, only the change in the cost of what has been done in the past can be benchmarked. “You can figure out what you did yesterday, and you can figure out the change in the cost or the speed of doing that today,” he said, but “what you can’t value is doing today what you couldn’t do yesterday.” In the history of the impact of the computer, the most significant gains have not come in cutting the time it takes to type a letter but in making possible things that either simply had not been feasible before or had not been feasible in the same way. It is the task itself that has changed.

A second problem with benchmarks was that many tasks must be combined in some way. Offering an example of an office-productivity benchmark, Dr. Triplett posited as a task the time taken to execute the “replace all” command in Microsoft Word (which replaces every instance of a given word with another word). Breaking activity down in this way meant gathering data on a great number of different tasks from which an aggregate had to be derived. But the aggregate was often arrived at simply by averaging across the tasks, in the absence of any valuations for them. And even if various sets of tasks were successfully benchmarked, it might be difficult to weight them across users with different requirements. While praising the potential of benchmarks, he cautioned that methods of obtaining benchmark information, which was not yet available in sufficient quantities, and of aggregating it still had to be thought through.
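The aggregation problem can be made concrete with a small numerical sketch. The tasks, timings, and weights below are invented purely for illustration; the point is that an unweighted average of task-level speedups and a value-weighted one can tell different stories, and that different users imply different weights.

    # Illustrative only: hypothetical task timings (seconds) on an old and a new system.
    old = {"replace_all": 4.0, "open_large_file": 12.0, "recalc_sheet": 30.0}
    new = {"replace_all": 1.0, "open_large_file": 6.0, "recalc_sheet": 25.0}

    def simple_average_speedup(old, new):
        """Unweighted mean of per-task speedups: 'averaging the tasks up'."""
        ratios = [old[t] / new[t] for t in old]
        return sum(ratios) / len(ratios)

    def weighted_speedup(old, new, weights):
        """Per-task speedups weighted by the value or frequency of each task to a user."""
        total = sum(weights.values())
        return sum((weights[t] / total) * (old[t] / new[t]) for t in old)

    # Two hypothetical users value the same tasks very differently.
    clerical = {"replace_all": 0.7, "open_large_file": 0.2, "recalc_sheet": 0.1}
    analyst = {"replace_all": 0.1, "open_large_file": 0.2, "recalc_sheet": 0.7}

    print(simple_average_speedup(old, new))      # about 2.4
    print(weighted_speedup(old, new, clerical))  # about 3.3
    print(weighted_speedup(old, new, analyst))   # about 1.6

Under these made-up numbers the per-task data are identical, yet the aggregate gain ranges from roughly 1.6 to 3.3 depending on whose valuations are used, which is precisely the weighting difficulty Dr. Triplett described.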

DISCUSSION

John Gardinier began the question period by commenting that he had recently retired and was trying to start a small business. He said he saw a significant shift on the horizon that would make a basic tenet of economic theory, that in a competitive market correct information is available immediately to all the players, more applicable to the real world. Making price comparisons during the previous week using the Internet, he had found prices for an identical computer peripheral spanning more than 100 percent; on airline fares he found a spread of a factor of five. “They used to be able to fool me,” he remarked. “They can’t fool me anymore.” Because this information was available, businesses would be under increasing pressure from better-informed consumers.

Dr. Flamm responded that a huge literature was developing on price dispersion on the Internet, dispersion that was staying constant or increasing.

Dr. McQueeney, noting that airlines had put more and more optimization into yield management, balancing profit against load, said “an arms race” was in progress between consumers armed with cheaptickets.com and the airlines armed with their own tools.

Iain Cockburn of Boston University observed that even as competitive pressure and technological development produced huge gains through the rapid expansion of capabilities and decline of prices, there was a cost: “Things become impossible that you used to do.” As an illustration, he cited the difficulty of retrieving data stored on half-inch tape or a 5.25-inch floppy disk. Display technology, in what he called a “striking difference,” seemed immune to obsolescence and its associated costs.

Dr. McQueeney recalled that, a month before, a group having lunch at his research lab had imagined being handed an 8-inch floppy disk containing winning lottery numbers for a drawing an hour away. The consensus at the table was that no one would have been able to collect the money, because no one could have found a reader in time.

Bill Long of Business Performance Research recounted having received two Web documents as part of a single STEP Board bulletin, one notifying him of the current meeting, the other of a workshop on research and development data needs, and said that the juxtaposition of the two had, over the course of the day, brought certain questions to mind. Alluding to personal frustration with the quality of data he had collected and worked with, he indicated that he had been intrigued when speakers talked about “the possible use of ROI calculations either by the seller or the buyer” of information technology, as well as about “payback data—‘we paid for it in six days or six months,’ or whatever.” He said, however, that the data referred to had sounded “very proprietary” and characterized one speaker’s attitude thus: “We wouldn’t tell you even if we had it, and I’m not sure I’m going to tell you whether we have it.” Noting that it was the job of many in the room to make sense of productivity gains and to determine what caused them, he asserted that success would depend on “some government agency collecting some data it’s not collecting now using a classification system that probably doesn’t yet exist.” He asked whether there had been any progress in that direction and what sort of progress might be expected over the next five to 10 years.

Dr. Corrado responded that comments throughout the day had provided fuel for the view that a broader definition of “business fixed investment” might be appropriate in describing the economy. “I don’t know if those of you who were speaking realize that you were supporting the view that R&D expenditures should be capitalized, that is, treated as business investment, in our national accounts,” she said. In view of the importance of innovation and technology in today’s economy, the measurement of R&D expenditures, and the question of whether some uses of employees’ time represent investment rather than inputs to current production, are important topics in productivity research. Although studies have expanded the production boundary to encompass R&D and related outlays, the precise scope of these expenditures, and their measurement in real terms, are not settled issues and remain open challenges.
