Mark Bregman of Veritas Software Corporation expressed concern that the metrics used by the industry may be on the way to irrelevance because of changes in customer behavior. Just as the amount of copper wire sold to make generator coils would now be an obsolete measure of the electrical power industry, he argued, “looking at microprocessors and disk drives as a way of measuring IT value really does cause an increasing amount of the cost to show up as G&A or overhead. When someone goes to an IBM and pays one fee for the whole service utility, they really are capturing all that in an investment in IT business value; when they buy chips and boards and assemble them into boxes and you only measure the cost of the chips and boards, all that other investment looks like overhead.” Dr. Bregman singled out, as one of the main challenges in looking at the information technology industry over a period of 30–50 years, the appropriate placement of an aggregation point that moves very rapidly. “It’s not just a matter of looking at the whole stack,” he stated.

Dr. Landefeld noted that statistical agencies like BEA struggle with this very problem. “We are trying to measure the value of investments in in-house software,” he explained, “but we can’t value it in terms of the value of the output—the cost of the inputs is the best we can do.”

Dr. Bregman pointed out that changing such definitions makes it hard to compare over time—which, he acknowledged, is the statistical agencies’ “whole game.”

Victor McCrary of the National Institute of Standards and Technology asked Dr. McQueeney what metrics the industry used to evaluate its scientists. Noting that researchers who work at the pre-development stage have traditionally been judged by their publications, presentations, and patents, he said his IT lab was seeking other ways to evaluate both short- and long-term research. He commented that notions of “economic value added” and “business value” have “worked back into the R&D community.”

Dr. McQueeney affirmed the importance of this issue, stating that IBM had worked “incredibly hard” on it for three decades, during which the marketplace had increasingly provided “inputs to the core research portfolio and core research deliverables.” But while IBM’s effort to align research with development and product was not new, over the previous five years “the influence of the customer has been reaching further and further back into our product plans, into our deep development, and, ultimately, into our research,” he said. Within the previous six months the company had made it known that key researchers would take part in consulting engagements in critical areas of research, particularly those concerning optimization and mathematics, “partly to deliver value out to the customers from our research lab, partly to bring a worldview of the bleeding edge of the customer environment back in to influence the portfolio.” IBM scientists, he said, now “understand that they’re expected to have world-class credentials and world-class standing within their technical communities but that the end goal is impact on the business and, therefore, on our customers.”


