Throughout the 1970s and 1980s, Americans and American businesses invested steadily in computers that were ever more powerful and ever cheaper. In doing so, they assumed that advances in information technology would yield higher productivity and better business decisions. These expected benefits did not materialize, at least not in ways that were readily measured. The phenomenon came to be called the “computer paradox,” after Robert Solow’s 1987 remark that “we see the computer age everywhere except in the productivity statistics.”
By the mid-1990s, however, new data began to reveal an acceleration of growth accompanying a transformation of economic activity. This shift coincided with a sudden and substantial decline in the prices of semiconductors and computers. After 1995 there appears to have been a point of inflection: annual price declines accelerated abruptly from 15 to 28 percent. Over the same period, investment in computers exploded. The contribution to growth attributed to computers rose more than five-fold, to 0.46 percent per year in the late 1990s, and software and communications equipment contributed a further 0.30 percent per year from 1995 to 1998. Preliminary estimates through 1999 suggest further increases in all three categories.
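The arithmetic behind these figures can be checked on the back of an envelope; the sketch below uses only the numbers quoted above, while the pre-1995 baseline is inferred from the phrase “more than five-fold” and is not stated in the text.

```python
# Back-of-envelope check of the growth contributions cited in the text.
# All figures are in percentage points of annual GDP growth.

computers_late_1990s = 0.46   # computers, late 1990s (quoted in the text)
software_comms = 0.30         # software + communications equipment, 1995-1998

# "Rose more than five-fold" implies the pre-1995 computer contribution
# was below 0.46 / 5 -- an inference, not a figure from the text.
implied_pre_1995_max = computers_late_1990s / 5

# Combined information-technology contribution in the late 1990s.
combined = computers_late_1990s + software_comms

print(f"implied pre-1995 computer contribution: under {implied_pre_1995_max:.3f} pp/yr")
print(f"combined IT contribution, 1995-1998: {combined:.2f} pp/yr")
```

On these numbers, information technology as a whole accounted for roughly three-quarters of a percentage point of annual growth in the late 1990s, compared with less than a tenth of a point from computers before 1995.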
This period also coincides with the widespread adoption of the Internet, the emergence of the “dot-com” e-companies, a surge in venture capital investment and, in some quarters, predictions that the concept of business cycles no longer applied. The symposium reviewed in this volume does not focus on the transitory phenomena once identified with the New Economy. Indeed, the decline of the dot-com companies and the re-emergence of the business cycle were already apparent when the symposium was convened.
The New Economy referred to here addresses changes in the U.S. economy