from the Gulf War.28 It described how General Pagonis ran what many people called the most successful military logistics operation in history. When the general arrived in the theater of war he introduced a new rule: any person in his chain of logistics command could send a message on a three-by-five card to any other person in the chain of command, without fear of retribution, and this message would have to be answered in 24 hours. Thus, he allowed the lowliest private to send a note to the highest general, and he enforced this rule. The vast majority of notes concerned routine matters. Nevertheless, a tiny fraction carried messages that were of critical importance (General, the tires we’re getting don’t match the trucks we have. Is that a problem?), and these averted huge potential problems. Dr. Raduchel said, “Those of you who live in big corporations with lots of Generation-X employees will understand that this is a description of e-mail.”
Weak Signals, Diverse Sources
He closed with a story he had heard at the Santa Fe Institute about beehives and ant colonies as organisms. A researcher said that for the colonies to survive, the key issue was their ability to combine weak signals from diverse sources in ways that drive innovation. He conjectured that e-mail, in allowing weak signals from diverse sources to reach a person with decision-making ability, might drive innovation in a similar way. He concluded that the free flow of information is a powerful driver of “whatever this new economy is,” and that one goal of the symposium today was to help focus on such issues.
Dr. Greenstein focused on ways to sustain innovation in communications markets and on emerging policy issues that deserve attention in the United States. He defined an emerging issue as an issue for which clear analysis could lead to better decisions, in comparison to “just muddling through as usual.” As a representative of academia, he would bring a slightly different viewpoint to some of these issues.
Some Bottlenecks to Progress
Within communications he focused on the Internet, beginning with three factors “that are going to be bottlenecks to progress in the next five years.” The first
is that there are alternative modes for developing new infrastructure in communications, and these modes are treated asymmetrically by present legal structures. Second, the costs and benefits of restructuring communications industries are difficult to observe; therefore, revised policies for these industries will be difficult to formulate. Third, some core and long-established principles for the regulation of communication activities are being upended by current changes in technology, which raises new regulatory challenges.
He approached these three themes by looking backward five years and forward to the next five. In the last five years, we have gone through a period that has almost no precedent. Researchers have shown that the Internet emerged from a two-decade incubation period as a government-managed technology program and became a refined and malleable commercial operation whose features are relatively advanced.
Bringing the Internet into the Commercial World
The early Internet had some peculiar features. It was optimized to a non-commercial environment, with end applications generally assumed to be independent of transport mode. It also presumed a lack of gateways, which is now being questioned. The Internet accommodated the needs of the Academy as well as a number of other organizations that had helped to set its standards.
When this technology—in other words, TCP/IP-based applications—was thrown into a commercial environment, it forced an extensive retrofit of the existing communications infrastructure. The experience was not unlike the displacement of the telegraph by telephony, when two communications structures arose independently of each other. The Internet retrofit was straightforward because it was expedient to link dial-up modems with the existing public switched telephone network. This fostered rapid entry of customers and a huge demonstration effect. People quickly saw how easy it was to use and wanted to join. It also stimulated unprecedented and almost spontaneous growth of the communications infrastructure. The growth of the Internet represents a massively decentralized, unorchestrated investment activity. There is no precedent in communications for such a pervasive application of a relatively refined technology, although there is precedent in other realms of electronics.
Addressing the Last-Mile Problem
The business aspect of this new technology, however, was not refined, a consequence of its incubation in academic and government environments. That, said Dr. Greenstein, presents three problems. The first is the last-mile problem: how to wire or otherwise connect individual users and businesses from their homes or offices to the nearest Internet backbone. The earliest and still the most common way to do this is by dial-up Internet access through the telephone system. The United States fell into this mode largely because of a set of regulations, almost unique among nations, that use flat-rate pricing structures for local telephone services. The effect of these regulations, written long before the Internet was conceived, is to make Internet dialing cheap and expedient.
It is not obvious that dial-up service is the preferred route to the Internet, especially for the next generation of broadband communication. Indeed, there are now three competitors offering to help users span that last mile—telephone companies’ own services, Internet service providers using the telephone network, and cable companies that were formed with only television in mind; wireless access is coming as well. The problem with this particular competition is that all three modes are controlled by different regulatory regimes, for different reasons, and none was designed to cope with Internet issues.
Underlying this issue is this country’s ideal of promoting technology-neutral policies, and the current regulatory combination does not seem to support this. The ideal last-mile policy would begin by letting markets decide among a variety of alternatives, especially when there are large technological complexities. There is unintended favoritism, however, in current regulations. For example, common carrier regulation in telephony requires companies to provide service in high-cost areas. Internet service providers have an enhanced service exemption, and cable companies must comply with rules about content and distribution mix. In addition, all three regimes are regulated in different ways across the country. These variations in regulation make a huge difference in whether a given Internet business model can succeed, he said. This issue will arise many times in the years to come, as it did in the open-access debate over AOL’s acquisition of Time Warner.
The Effect of New Regulations on Business Models
Another looming difficulty is the likelihood of disruptions when information-intensive activities are restructured. Some 70 percent of home Internet use is now “free” or supported by advertising, which has had a large impact on media companies. If this unusual condition is changed by new regulations, it will affect many business models. He said that the benefits and costs of dislocation are difficult to measure, and this lack of data can lead to bad policy. For example, GDP statistics easily miss changes in distribution methods: they capture established channels well but new channels poorly. Price changes may be missed and quantities misstated. People are participating in new kinds of economic exchange that are not measured or that are valued in wrong ways. As a result, an increase in welfare can appear as a decline in GDP.
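The point that a welfare gain can register as a GDP decline can be made concrete with a small back-of-the-envelope sketch. The figures below are purely hypothetical and not from the source: suppose a consumer drops a paid subscription in favor of a “free,” advertising-supported service that is backed by a smaller advertising payment but that the consumer values more highly.

```python
# Hypothetical numbers, chosen only to illustrate the measurement gap
# described in the text (none appear in the source).
paid_subscription = 20.0   # monthly price of the old paid channel, counted in GDP
ad_revenue = 5.0           # advertising revenue behind the "free" channel
consumer_value = 25.0      # assumed value the consumer places on the new service

# Change in measured output: the ad payment replaces the subscription.
measured_change = ad_revenue - paid_subscription

# Change in the consumer's welfare: value received minus the price once paid.
welfare_change = consumer_value - paid_subscription

print(measured_change)  # -15.0: measured GDP falls
print(welfare_change)   # 5.0: yet the consumer is better off
```

Under these assumed numbers, measured output falls by 15 even though the consumer gains 5, matching the pattern the text describes.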
Intermediaries can provide valuable services in a time of transition between new possibilities and what the technical frontier allows. These are environments we do not measure well. For example, how do we measure the economic value that a customized technology contributes to the invention of a new business model? Nor do we measure well the value of adapting something to a unique situation. This could represent a bottleneck.
Legal and Regulatory Principles
Dr. Greenstein turned next to emerging challenges to basic legal and regulatory principles behind communications industries in the United States, beginning with three trends. First, a tradition inherited from common carrier regulation is that of a “bright line” between content and distribution. One reason for this line is that it reduces the danger of bottlenecks in information delivery. The thinking is that competitive delivery of information across multiple modes lessens worries about joint ownership of content and distribution. That worry has returned, however, because there are now bottlenecks in delivery as well as asymmetries in the cost of different modes of delivery.
Second, there is no agreement yet about how best to deliver and retrieve information to and from the household or business. Therefore, open-access rules are going to be reviewed continuously—the basic principles that affect the returns on investment in any kind of last-mile activity. Businesses are reluctant to invest in an environment without regulatory commitment.
Third, regulatory bodies at the national and state levels are accustomed to issues flowing to them at a certain rate. The present environment is bringing issues to them at a much faster rate and with greater frequency than they have ever seen, and they are not equipped to handle them. The amount of expertise necessary to make intelligent decisions is high, and this, in turn, raises new questions about appropriate discretion in governance at the agency level. Many dimensions of the next generation of communications infrastructure will be influenced by local regulatory decisions in various states.
Will the Commercial Markets Serve Everyone?
Dr. Greenstein turned to another assumption that will be challenged: that the communications infrastructure will continue to be virtually ubiquitous. An established principle in this country is that governments first allow the commercial markets to function freely and then promote services for those who are underserved. For the next five years, however, what will the commercial markets do with the Internet if they are left alone? Who will not be served? Dr. Greenstein said that he had examined these questions, and a lesson of the last five years is that commercial dial-up Internet access serves about 90 percent of the population without any help. This means that the remaining low-density areas can be targeted quite easily, with the help of some subsidies.
It is much harder to predict whether the digital divide will widen over the next five years. The quality of access varies among different groups and different geographic regions, and ease of access differs with training, education, and