The Economics of Software
William J. Raduchel
Dr. Raduchel noted that he was drawing from his many years of experience with software, which began when he wrote his first line of code as a teenager, and from his background as an economist to search for a good economic model for software. Yet he cautioned that his presentation would not answer all the questions it raised; achieving a thorough understanding of the problem, he predicted, would take years, maybe even decades. Sharing Dr. Jorgenson’s concern about the consequences stemming from the lack of a practical economic model, he noted that he would nonetheless attempt to bring the technology and the economics together in his presentation.
Dr. Raduchel characterized software as “the medium through which information technology [IT] expresses itself.” Software loses all meaning in the absence of the computer, the data, or the business processes. Nonetheless, it is the piece of IT that is becoming not only increasingly central but also increasingly hard to create, maintain, and understand. Positing that software is the world’s largest single class of either assets or liabilities, and the largest single class of corporate expenses, he argued that “the care and feeding of the systems [that] software runs…dominates the cost of everything.” In addition, software is, as
shown by the work of Dr. Jorgenson and a number of other economists, the single biggest driver of productivity growth.2
U.S. ECONOMY’S DEPENDENCE ON SOFTWARE
Projecting a satellite photograph of the United States taken during the electrical blackout of August 2003, Dr. Raduchel noted that a software bug had recently been revealed as the mishap’s primary driver. When “one line of code buried in an energy management system from GE failed to work right,” a vast physical infrastructure had been paralyzed, one example among many of how the U.S. economy “is so dependent in ways that we don’t understand.” Recalling the STEP Board’s February 2003 workshop “Deconstructing the Computer,” Dr. Raduchel described an incident related by a speaker from a company involved in systems integration consulting: When the credit card systems went down at a major New York bank, the chief programmer was brought in. After she had spent several minutes at the keyboard and everything had begun working again, “she pushed her chair slowly away from the desk and said, ‘Don’t touch anything.’
“ ‘What did you change?’ the head of operations asked.
“ ‘Nothing,’ she said. ‘I don’t know what happened.’
“At that point the CEO, who was also in the room, lost it: He couldn’t understand why his bank wasn’t working and nobody knew why,” Dr. Raduchel explained, adding: “Welcome to the world of software.”
2 Throughout the 1970s and 1980s, Americans and American businesses regularly invested in ever more powerful and cheaper computers, software, and communications equipment. They assumed that advances in information technology—by making more information available faster and cheaper—would yield higher productivity and lead to better business decisions. The expected benefits of these investments did not appear to materialize—at least in ways that were being measured. Even in the first half of the 1990s, productivity remained at historically low rates, as it had since 1973. This phenomenon was called “the computer paradox,” after Robert Solow’s casual but often repeated remark in 1987: “We see the computer age everywhere except in the productivity statistics.” (See Robert M. Solow, “We’d Better Watch Out,” New York Times Book Review, July 12, 1987.) Dale Jorgenson resolved this paradox, pointing to new data that showed that change at a fundamental level was taking place. While growth rates had not returned to those of the “golden age” of the U.S. economy in the 1960s, he noted that new data did reveal an acceleration of growth accompanying a transformation of economic activity. This shift in the rate of growth by the mid-1990s, he added, coincided with a sudden, substantial, and rapid decline in the prices of semiconductors and computers; the price decline abruptly accelerated from 15 to 28 percent annually after 1995. (See Dale W. Jorgenson and Kevin J. Stiroh, “Raising the Speed Limit: U.S. Economic Growth in the Information Age,” in National Research Council, Measuring and Sustaining the New Economy, Dale W. Jorgenson and Charles W. Wessner, eds., Washington, D.C.: National Academies Press, 2002, Appendix A.) Relatedly, Paul David has argued that computer networks had to be sufficiently developed in order for IT productivity gains to be realized and recognized in the statistics. See Paul A. David, Understanding the Digital Economy, Cambridge, MA: MIT Press, 2000.
Also see Erik Brynjolfsson and Lorin M. Hitt, “Computing Productivity: Firm-Level Evidence,” Review of Economics and Statistics 85(4):793-808, 2003, where Brynjolfsson and Hitt argue that much of the benefit of IT comes in the form of improved product quality, time savings, and convenience, which rarely show up in official macroeconomic data. Of course, as Dr. Raduchel noted at this symposium, software is necessary to take advantage of hardware capabilities.
FROM BUBBLE TO OUTSOURCING: SPOTLIGHT ON SOFTWARE
In the 1990s bubble, software created a new and important part of the economy with millions of high-paying jobs. The spotlight has returned to software because it is emerging as a key way in which India and China are upgrading their economies: by moving such jobs out of the United States, where computer science continues to decline in popularity as a field of study. Scheduled for later in the program was a discussion of “why it is in the United States’ best interest to tell this small pool of really bright software developers not to come here to work” by denying them visas, with the consequence that other jobs move offshore as well. Although the merits can be argued either way, Dr. Raduchel said, to those who are “worried about the country as a whole … it’s becoming a major issue.”
In the meantime, hardware trends have continued unchanged. Experts on computer components and peripherals speaking at “Deconstructing the Computer” predicted across-the-board price declines of 20 to 50 percent for at least 5 more years and, in most cases, for 10 to 20 years.3 During this long period marked by cost reductions, new business practices will become cost effective and capable of implementation through ever more sophisticated software. As a result, stated Dr. Raduchel, no industry will be “safe from reengineering or the introduction of new competitors.”
Characterizing a business as “an information system made up of highly decentralized computing by fallible agents, called people, with uncertain data,” Dr. Raduchel asserted that there is no difference to be found “at some level of reduction … between economics and information technology.” Furthermore, in many cases, a significant part of the value of a firm today is tied up in the value of its software systems.4 But, using the firm Google as an example, he pointed out that its key assets, its algorithms, do not show up on its balance sheet.5 “Software
is intangible and hard to measure, so we tend not to report on it,” he observed, “but the effect is mainly to put people in the dark.”
REDEFINING THE CONSUMER’S WORLD
Software is totally redefining the consumer’s world as well. There are now scores of computers in a modern car, and each needs software, which is what the consumer directly or indirectly interacts with. The personal computer, with devices like the iPod that connect to it, has become the world’s number-one device for playing and managing music, something that only 5 years before had been a mere speck on the horizon. With video moving quickly in the same direction, Dr. Raduchel predicted, the next 10 years would be wrenching for all consumer entertainment, and piracy would be a recurrent issue. A major factor driving piracy, he noted, is that “the entertainment industry makes money by not delivering content in the way consumers want.” Music piracy began because consumers, as has been clear for 15 years, want music organized as a mix of tracks on a single playlist, but that format does not suit the music industry’s established business models.
Anticipating an aspect of Monica Lam’s upcoming talk, Dr. Raduchel observed that software may look “remarkably easy and straightforward,” but the appearance is borne out in reality only to a certain extent. Among those in the audience at one of two Castle Lectures he had given the previous year at the U.S. Military Academy were numerous freshmen who had recently written programs 20 to 30 lines in length and “thought they really understood” software. But a program of that length, even if some have difficulty with it, is “pretty easy to write.” The true challenge, far from creating a limited piece of code, is figuring out how to produce software that is “absolutely error-free, robust against change, and capable of scaling reliably to incredibly high volumes while integrating seamlessly and reliably to many other software systems in real time.”
SOFTWARE: IT’S THE STACK THAT COUNTS
For what matters, rather than the individual elements of software, is the entirety of what is referred to as the stack. The software stack, which comprises hundreds of millions of lines of code and is what actually runs the machine, begins with the kernel, a small piece of code that talks to and manages the hardware. The kernel is usually included in the operating system, which provides the basic services and to which all programs are written. Above the operating system is middleware, which hides both the operating system and the window manager, the latter being what the user sees, with its capacity for creating windows, its help functions, and other features. The operating system runs other programs called services, as well as the applications, of which Microsoft Word and PowerPoint are examples.
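The layers described above can be listed bottom-up in a short sketch. This is illustrative only: the layer names are taken from Dr. Raduchel’s description, and the ordering is a simplification, not a formal architecture.

```python
# Bottom-up sketch of the software stack as described in the talk.
# Names and ordering are illustrative, not a formal specification.
SOFTWARE_STACK = [
    "hardware",          # not software itself, but what the kernel manages
    "kernel",            # small piece of code that talks to the hardware
    "operating system",  # provides basic services; programs are written to it
    "middleware",        # hides the operating system and the window manager
    "window manager",    # what the user sees: windows, help functions, etc.
    "services",          # other programs run by the operating system
    "applications",      # e.g., word processors and presentation tools
]

for level, layer in enumerate(SOFTWARE_STACK):
    print(f"{level}: {layer}")
```

The point of the sketch is that an application sits many layers above the hardware, so “the thing you actually use” is the whole list, not any single entry.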
“When something goes right or goes wrong with a computer, it’s the entire software stack which operates,” stated Dr. Raduchel, emphasizing its complexity. The failure of any given piece to work when added to the stack may not indicate that something is wrong with that piece; rather, the failure may have resulted from the inability of some other piece of the stack, which in effect is being tested for the first time, to work correctly with the new addition. A piece of packaged software is but one component of the stack, he said, and no one uses it on its own: “The thing you actually use is the entire stack.”
Finally, the software stack is defined not only by its specifications, but also by the embedded errors that even the best software contains, as well as by undocumented features. To build a viable browser, a developer would have to match Internet Explorer, as the saying goes, “bug for bug”: Unless every error in Internet Explorer were repeated exactly, the new browser could not be sold, Dr. Raduchel explained, because “other people build to those errors.” While data are unavailable, an estimate that he endorsed places defects injected into software by experienced engineers at one every nine or ten lines.6 “You’ve got a hundred million lines of code?” he declared. “You do the arithmetic.”
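The arithmetic Dr. Raduchel invites the reader to do can be made explicit. The figures below simply restate the estimate quoted above (one defect per ten lines, the low end of the “nine or ten” range) and are illustrative, not measured data:

```python
# Illustrative arithmetic for the defect estimate quoted above.
# Assumption (from the talk): experienced engineers inject roughly
# one defect every nine or ten lines of code; we use the low end.
lines_of_code = 100_000_000   # "a hundred million lines of code"
defect_interval = 10          # one injected defect every ten lines

injected_defects = lines_of_code // defect_interval
print(f"{injected_defects:,} defects injected")  # 10,000,000 defects injected
```

Even before testing removes most of them, the starting count is on the order of ten million defects for a stack of that size.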
With hundreds of millions of lines of code making up the applications that run a big company, and those applications resting on middleware and operating systems that in turn comprise tens of millions of lines of code, the average corporate IT system today is far more complicated than that of the Space Shuttle or Apollo Program. And the costs of maintaining and modifying software only increase over time, since modifying it introduces complexity that makes it increasingly difficult to change. “Eventually it’s so complicated,” Dr. Raduchel stated, “that you can’t change it anymore. No other major item is as confusing, unpredictable, or unreliable” as the software that runs personal computers.
THE KNOWLEDGE STACK OF BUSINESS PROCESSES
Opposite the software stack, on the business side, is what Dr. Raduchel called the knowledge stack, crowned by the applications knowledge of how a business actually runs. He ventured that most of the world’s large organizations would be unable to re-implement the software they use to run their systems today, in the absence of the skilled professionals whose knowledge is embedded in them. “Stories are legion in the industry about tracking down somebody at a retirement home in Florida and saying, ‘Here’s $75,000. Now, would you please tell me what you did 10 years ago, because we can’t figure it out?’ ” Systems knowledge is the ability to create a working system that operates the applications at the top of
the stack; computer knowledge amounts to truly understanding what the system does at some level, and especially at the network level. Because almost no individual has all this knowledge, systems development is a team activity unique in the world.
The industry has dealt with this real-world challenge over the past 30 years by making the software stack more abstract as one moves up so that, ideally, more people are able to write software. Those who can write at the kernel level of the stack number in the hundreds at best worldwide. For, as is known to those with experience in the software industry, the very best software developers are orders of magnitude better than the average software developer. “Not 50 percent better, not 30 percent, but 10 times, 20 times, 100 times better,” Dr. Raduchel emphasized. So a disproportionate amount of the field’s creative work is done by “a handful of people: 80, 100, 200 worldwide.”7 But millions of people can write applications in Microsoft Visual Basic; the results may not be very good in software terms, but they may be very useful and valuable to those who write them. The rationale for introducing abstraction is, therefore, that it increases the number who can write and, ideally, test software. But, since heightening abstraction means lowering efficiency, it involves a trade-off against computing power.
THE MAJOR COSTS: CONFIGURATION, TESTING, TUNING
The major costs involved in making a system operational are configuration, testing, and tuning. Based on his experience advising and participating in “over a hundred corporate reengineering projects,” Dr. Raduchel described the overall process as “very messy.” Packaged software, whose cost can be fairly accurately tracked, has never represented more than 5 percent of the total project cost; “the other 95 percent we don’t track at all.” Accounting rules that, for the most part, require charging custom and own-account software to general and administrative (G&A) expense on an ongoing basis add to the difficulty of measurement.
A lot of labor is needed, with 1 designer to 10 coders to 100 testers representing a “good ratio.” Configuration, testing, and tuning account for probably 95 to 99 percent of the cost of all software in operation. Recalling Dr. Jorgenson’s allusion to the downloading of patches, Dr. Raduchel noted that doing so can force changes. That, he quipped, is “what makes it so much fun when you run Microsoft Windows Update: [Since] you can change something down here and break something up here … you don’t know what’s going to not work after you’ve done it.” Moreover, because of the way software ends up being built, there’s no way around such trade-offs.
ECONOMIC MODELS OF SOFTWARE OUT OF DATE
As a result, software is often “miscast” by economists. Many of their models, which treat software as a machine, date to 40 years ago, when software was a minor portion of the total cost of a computer system and was given away. The economist’s problem is that software is not a factor of production like capital and labor, but actually embodies the production function, for which no good measurement system exists. “Software is fundamentally a tool without value, but the systems it creates are invaluable,” Dr. Raduchel stated. “So, from an economist’s point of view, it’s pretty hard to get at—a black box.” Those who do understand “the black arts of software” he characterized as “often enigmatic, unusual, even difficult people”—which, he acknowledged, was “probably a self-description.”
Producing good software, like producing fine wine, requires time. IBM mainframes today run for years without failure, but their software, having run for 30 years, has in effect had 30 years of testing. Fred Brooks, who led the development of IBM’s System/360 software, taught the industry that the only way to design a system is to build it. “Managers don’t like that because it appears wasteful,” Dr. Raduchel said, “but, believe me, that’s the right answer.” Specifications are useless, he said, noting that he had never seen a complete set, which in any case would be impractically large. The full specifications recently published for the Java 2 Enterprise Edition, the standard for building corporate information systems, stand slightly more than 1 meter thick. “Now, what human being is going to read a meter of paper and understand all its details and interactions?” he asked, adding that the content was created not by a single person but by a team.
Their inability to measure elements of such complexity causes economists numerous problems. A good deal of software is accounted for as a period expense; packaged software is put on the balance sheet and amortized. While programming is what people usually think of as software, it rarely accounts for more than 10 percent of the total cost of building a system. The system’s design itself is an intangible asset whose value grows and shrinks with that of the business it supports. The implementation also has value because it is often very difficult to find even one implementation that will actually work. Despite all this, Dr. Raduchel remarked, only one major corporation he knew of recognized on its books the fact that every system running must be replaced over time.
MAJOR PUBLIC POLICY QUESTIONS
Dr. Raduchel then turned to public-policy questions rooted in the nature and importance of software:
Are we investing adequately in the systems that improve productivity? Numerous reports have claimed that the enterprise resource planning and other systems put into place in the late 1990s in anticipation of the year 2000 have greatly improved the efficiency of the economy, boosting productivity and contributing to low inflation. But many questions remain unanswered: How much did we invest? How much are we investing now, and is that amount going up or down? How much should we be investing? What public-policy measures can we take to encourage more investment? There are no data on this, but anecdotal evidence suggests that investment has fallen significantly from its levels of 7 years ago. Companies have become fatigued and in many cases have fired the chief information officers who made these investments. “People are in maintenance mode,” said Dr. Raduchel. Noting that systems increasingly become out of sync with needs as needs change over time, he warned that “systems tend to last 7 years, give or take; 7 years is coming up, and some of these systems are going to quit working and need to be replaced.”8
Do public corporations properly report their investments and the resulting expenses? And a corollary: How can an investor know the worth of a corporation that is very dependent on systems, given the importance of software to the value of the enterprise and its future performance? Mining the history of the telecommunications industry for an example, Dr. Raduchel asserted that the billing system is crucial to a company’s value, whereas operating a network is “pretty easy.” Many of the operational problems of MCI WorldCom, he stated, arose “from one fact and one fact only: It has 7 incompatible billing systems that don’t talk to one another.” And although the billing system is a major issue, it is not covered in financial reports, with the possible exception of the management’s discussion and analysis (MD&A) in the 10-K.
Do traditional public policies on competition work when applied to software-related industries? It is not clear, for example, that antitrust policies apply to software, which develops so rapidly that issues in the industry have changed before traditional public policy can come into play. This “big question” was being tested in United States v. Microsoft.
Do we educate properly given the current and growing importance of software? What should the educated person know about software? Is sufficient training available? Dr. Raduchel noted that the real meat of the Sarbanes-Oxley Act is control systems rather than accounting.9 “I chair an audit committee, and I
love my auditor, but he doesn’t know anything about software,” he said. And if it took experts nearly a year to find the bug that set off the 2003 blackout, how are lay people to understand software-based problems?
What should our policy be on software and business-methods patents? Those long active in the software industry rarely see a patent on anything that was not invented 30 years earlier; many patents are granted because the prior art is not readily available. Slashdot.org, an online forum popular in the programming community, has a weekly contest for the most egregious patent. “If you read the 10-Ks, people are talking about [these patents’] enormous value to the company,” Dr. Raduchel said. “Then you read Slashdot and you see that 30 pieces of prior art apply to it.” The European Union is debating this issue as well.
What is the proper level of security for public and private systems, and how is it to be achieved? A proposal circulating in some federal agencies would require companies to certify in their public reporting the security level of all their key systems, particularly those that are part of the nation’s critical infrastructure. Discussed in a report on security by the President’s Council of Advisors on Science and Technology (PCAST) was the amount of risk to the economy from vulnerability in systems that people may never even have thought about. The worms that had been circulating lately, some of which had caused damage in the billions of dollars, were so worrisome because they could be used to crack computers in 10 million to 20 million broadband-connected homes and to create an attack on vital infrastructure that would be “just unstoppable,” said Dr. Raduchel, adding: “Unfortunately, this is not science fiction but a real-world threat. What are we going to do about it?”
What is happening to software jobs? Do we care about their migration to India and China? Is U.S. industry losing out to lower-cost labor abroad or are these jobs at the very tip of the value chain, whose departure would make other parts of high-tech industry hard to sustain in the United States? “Let me tell you,” Dr. Raduchel cautioned, “the people in India and China think it’s really important to get those software jobs.”
What export controls make sense for software? Taken at the margin, all software has the potential for dual use. As Dr. Raduchel noted wryly, “If you’re building weapons for Saddam Hussein, you still have to make PowerPoint presentations to him about what’s going on—apparently they were all full of lies, but that’s another issue.” More practically, however, export controls for dual-use software, such as controls on encryption, can help ensure that certain types of sensitive software are not used in ways detrimental to U.S. national security.
Should the building of source code by a public community—open-source code—be encouraged or stopped? Dr. Raduchel included himself among those who have come to believe that source code is the least valuable rather than the most valuable part of software; consequently, giving it away is actually a good strategy. However, some forces in the United States, primarily the vendors of proprietary software, want to shut open-source software down, while others, such as IBM, have played an important role in developing Linux and other open-source platforms.10 Dr. Raduchel recounted that, in a discussion with Federal Communications Commission Chairman Michael Powell, he had suggested that all critical systems be based on open-source software because it is more reliable and secure than proprietary software. Some believe the opposite: that if nobody knows what the software is, it will be more reliable and secure. Yet that position overlooks the view that open-source software is likely to have fewer bugs.11 Many, according to Dr. Raduchel, “would argue that open-source software is going to be inherently more reliable and secure because everybody gets to look at it.”
What liability should apply to sales of software? Licenses on shrink-wrapped software specify that the software is sold “as is”; that is, no warranty at all is provided. In view of the amount of liability being created, should that be changed? And, if so, what should be done differently? Dr. Raduchel called the central problem here the fact that, although a new way of writing software is probably needed, none has emerged, despite much research a couple of decades back and ceaseless individual effort. Bill Joy, one of the founders of BSD UNIX, had recently stated that all methods were antique and urged that a new way be found, something Dr. Raduchel rated “hugely important” as the potential driver of “a value equation that is incredibly powerful for the country.”
How are we investing in the technology that creates, manages, and builds software? Outside of the National Science Foundation and some other institutions that are stepping up funding for it, where is the research on it?
Pointing to the richness of this public-policy agenda, Dr. Raduchel stated: “I am not sure we are going to get to any answers today; in fact, I am sure we’re
not.” But he described the day’s goal for the STEP Board, including Dr. Jorgenson and himself, and for its staff as getting these issues onto the table, beginning a discussion of them, and gaining an understanding of where to proceed. With that, he thanked the audience and turned the podium back over to Dr. Jorgenson.