

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




OCR for page 56
Panel I
New Technology Trends and Implications

INTRODUCTION

Mark B. Myers
The Wharton School, University of Pennsylvania

Dr. Myers opened the session by announcing that each presenter would have 20 minutes and that the question period would come after all had spoken. He then introduced the panel's first speaker, Mark Doms of the Federal Reserve Bank of San Francisco.

THE RECORD TO DATE: QUALITY-ADJUSTED PRICES FOR EQUIPMENT

Mark E. Doms
Federal Reserve Bank of San Francisco

Dr. Doms, thanking Dr. Jorgenson for the invitation and Dr. Wessner for organizing the conference, began by noting that he would use the terms "technological change" and "technological advances" interchangeably. At the outset,

he would tackle the question of why the link between technological change and prices is significant, so that he would then be able to explain how the advent of "some new gee-whiz technology" translates into prices and hence into consumer welfare. Thereafter he would talk about what the current official numbers said, about what other estimates were, and about the challenges of coming up with good prices for communications equipment and services--again, for the purpose of clarifying how a society benefits from innovations of the kind under consideration at the symposium. He would close by discussing the evolution of communications services and equipment.

Why do we care about prices? And why do we care about investment in communications? First, investment in communications has been substantial, but also very volatile, in the United States. As Dr. Jorgenson had said, one of the first-step products in the New Economy had been computers, but it was also true that the country's investment in communications equipment had been of about the same dollar magnitude as its investment in computers. From a GDP or national-accounts perspective, therefore, the two were pretty similar over the course of the 1990s and had continued to be similar in the current decade. Around $100 billion per year, representing a little over 10 percent of total equipment investment in the U.S. economy, was being spent on communications; Dr. Doms termed that "a fairly sizable chunk."

At the same time, there had been huge swings in U.S. investment in communications, making it one of the most volatile of all the components of GDP. During the past recession, investment in communications gear fell 35 percent from peak to trough, "just a very, very large number." As more years of data came in, making possible a backward glance, the recession of the early 2000s might be remembered as a "high-tech recession," Dr. Doms speculated, adding that "certainly what happened to communications played a major role in what happened in the high-tech sector."

Communications Investment and National Economic Performance

The other reason for interest in communications investment springs from the way in which it contributes to the performance of the U.S. economy. Those economists who monitor the national economy, whether they work for a statistical agency or for the Federal Reserve, look at measures of how many dollars are spent on communications in the United States every year. What makes their job hard is that a dollar spent today on communications is not the same as a dollar spent yesterday; in fact, there is a great deal of change. Dr. Doms observed that a computer costing $1,000 currently was a lot more powerful and a lot more useful than a computer that had cost $1,000 five or ten years before. The same was thought to hold true of communications gear; there had been enormous technological change, especially going back 25 years. At that time, most communications was done by landline telephone, a stark contrast to the diversity of means of

communication currently available. So the problem that must be surmounted in order to understand how communications affects GDP and productivity growth is translating a given amount that was spent in a past year into today's dollars. "We basically try to ask this question: If we have $100 billion today, what did that translate to in spending, say, four years ago?"

Economists are able to look at such trends over time--to make "intertemporal comparisons"--by using price indexes. To illustrate, Dr. Doms turned to the technological change that occurred in fiber optics between 1996 and 2001. During that period, there were tremendous advances in the amount of information that could travel down a strand of glass fiber, owing to increases both in the number of channels--that is, in the number of wavelengths that could be transmitted along a single fiber--and in the capacity of each channel. Depending on how this change is measured, on the point at which the measurement begins, and so on, "you basically get a doubling every year in the potential capacity of a single strand of glass fiber," he said. During this five-year period, the price of the gear used to transmit information over fiber fell, on average, 14.9 percent per year. Pointing out that the latter rate was clearly below the rate of increase in capacity, Dr. Doms underlined the importance of the lack of a one-to-one relationship between the change in technological capability and the price. "The intuition is that if you have the option to buy a car that's twice as fast as your current car, you will not value that new, fast car twice as much as your old car," he said, because a form of diminishing returns sets in. In a similar way, the price acts as an indicator of the value that society places on a technological change.
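The intertemporal comparison Dr. Doms describes can be sketched with a small calculation. This is an illustrative sketch, not his actual method: it assumes a constant annual price decline (using the 14.9 percent figure he cites for fiber-optic transmission gear, 1996-2001) and uses the resulting index to deflate today's nominal spending into an earlier year's dollars.

```python
def price_index(annual_decline, years):
    """Index level after `years` of a constant annual price decline.
    Base year = 1.0; a 14.9% yearly decline gives 0.851 ** years."""
    return (1.0 - annual_decline) ** years

def equivalent_past_spending(nominal_today, annual_decline, years_back):
    """Translate today's nominal spending into the earlier year's dollars:
    dividing by the index shows what the same quantity of gear would have
    cost at the earlier year's (higher) prices."""
    return nominal_today / price_index(annual_decline, years_back)

# Illustrative numbers from the talk: $100 billion of spending today,
# deflated over four years at a 14.9% annual price decline.
index_4yr = price_index(0.149, 4)                     # roughly 0.52
past_dollars = equivalent_past_spending(100e9, 0.149, 4)
print(f"index after 4 years: {index_4yr:.3f}")
print(f"equivalent spending 4 years earlier: ${past_dollars / 1e9:.0f} billion")
```

In other words, under the constant-rate assumption, $100 billion of today's gear would have cost roughly $190 billion at prices four years earlier, which is the sense in which a dollar spent today is "not the same" as a dollar spent yesterday.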
It was unfortunately probable, therefore, that the price indexes currently in use for looking at productivity and at GDP understated the true price declines that had occurred for communications equipment.

Prices for Computers vs. Communications Gear

According to the U.S. Bureau of Economic Analysis (BEA), whose information on prices comes mainly from the U.S. Bureau of Labor Statistics, prices for communications gear fell an average of 3.2 percent per year between 1994 and 2000. That stood in sharp contrast to what had happened to computer prices, which fell an average of 19.3 percent per year during the same years. "With all the innovations that happened in communications equipment, do we really think that the official number of 3.2 percent per year is accurate during this time period?" Dr. Doms asked, answering: "Probably not." Work done by him and others indicated, rather, that communications equipment prices fell on the order of 8 to 10 percent per year, about half as fast as prices for computers. This movement stood in contrast to that of most other prices in an economy where prices tend to go up, and which was then showing an inflation rate of between 1 and 2.5 percent.

Dr. Doms then turned his attention to the challenges of measuring prices. Although made using "traditional, standard methods and crude data," the estimates of 8 to 10 percent per year through 2000 for the drop in prices of communications gear represented a step in the right direction. Still, he acknowledged, more refinement was in order on that front. Additionally, it appeared that no one had a very clear idea of what had happened with regard to technological change in, and prices of, communications equipment from 2001 on. Computing such prices in order to see how much better off society was as a result of the technological changes was a very hard job demanding a large number of person-hours. "We had to purchase a lot of private-sector data," he recalled, describing the undertaking as "very expensive" and pointing out that statistical agencies such as BEA and the National Bureau of Economic Research were "very budget-constrained."

An important challenge, looking into the future, was in the very speed at which technology was changing. "We don't know what technology is going to emerge three months from now, a year from now, two years from now," Dr. Doms remarked, "and it's very hard for the statistical agencies to figure out what they should be following." Was WiFi going to take off any more than it had to date--was that going to be "the next great thing"? Just how quickly would fiber to the home take off--and what would be the effect on the equipment involved in that? Those studying the economy's welfare would like to know what is going to happen in the future so that they can start gathering the appropriate data, do the appropriate analysis, and construct the price indexes.

Communications Prices Past, Present, and Future

To illustrate the increasing difficulty of tracking both prices and technological change, Dr. Doms displayed a table comparing the landscape for communications and communications equipment 25 years earlier, currently, and in the future (see Figure 9).
A quarter-century back, when most of the money spent on telecommunications equipment went to switches for telephone centers, the industry was "a lot easier" to track: "We could see what happened when we went to digital switches." In the 1990s and into 2000, there was a movement away from spending on telephone switches and toward spending on a wide array of telecommunications technologies, in particular those connected to data, computer networking, and fiber optics. Following these developments was harder for the statistical agencies, especially in light of their budgetary problems. "Unless the statistical agencies get increased funding," he added, "in the future they are not going to be able to follow new, evolving trends very well."

Summarizing, Dr. Doms said that his efforts were aimed at improving understanding of how technology increases the economic performance of the country. The real terms in which GDP and productivity growth were discussed, he noted, were a tool used to control for what was happening to prices in the economy. The area of communications equipment and services was one in which, he believed, prices were "very much mis-measured," and hence this area's contribution to national economic performance was probably greatly understated.
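The gap between the official 3.2 percent annual decline for communications gear and the 19.3 percent decline for computers compounds dramatically over the 1994-2000 window. A rough check (an illustrative sketch, assuming six years of compounding at the constant annual rates quoted above, with 9 percent standing in for the 8-to-10-percent revised range):

```python
def cumulative_decline(annual_rate, years):
    """Total price decline after compounding a constant annual decline."""
    return 1.0 - (1.0 - annual_rate) ** years

years = 6  # 1994-2000

comms_official = cumulative_decline(0.032, years)  # official BEA series
computers = cumulative_decline(0.193, years)
comms_revised = cumulative_decline(0.09, years)    # midpoint of the 8-10% estimates

print(f"communications (official): {comms_official:.0%}")  # ~18% total decline
print(f"computers:                 {computers:.0%}")        # ~72% total decline
print(f"communications (revised):  {comms_revised:.0%}")    # ~43% total decline
```

The compounded numbers make the stakes concrete: whether communications prices fell by roughly a fifth or by nearly half over the period changes the measured real contribution of communications investment substantially.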

FIGURE 9 Evolution of communications and communications equipment.

                         25 years ago         Today                       Future
  Primary form of        Landline voice,      Data, cellular voice,       Data, video, ???
  communication          some data            landline voice, cable
  Major expenditure      Telephone            Computer network, fiber     Computer network,
  categories for         switching            optic, telephone            last-mile solutions,
  communications gear    equipment            switching, cellular,        wireless, ???
                                              cable TV
  Primary makers of      Ma Bell, Nortel      Lucent, Motorola, Cisco,    ???
  the gear                                    Nortel, Broadcomm, Juniper

TECHNOLOGY TRENDS, EMERGING STANDARDS, AND THEIR IMPACT

Jeffrey M. Jaffe
Lucent Technologies

Saying he planned to talk about networking technologies, Dr. Jaffe indicated that he would focus on areas where issues concerning standards and lack of clarity in regulatory policy were retarding progress. In some of these areas, the whole world was being held back, while in others the United States was being held back relative to the rest of the world. He hoped that exploring some of them would stimulate discussion, which might in turn lead to forward movement.

He complimented Dr. Raduchel on his presentation, which he said made clear that the dramatic changes that had taken place in communications over the previous couple of decades would continue for the next decade or two. He reiterated that he would focus on the issue of networking, which he judged "probably more complex than some of the end-user individual aspects." Policy initiatives, he added, needed to keep pace with the rapidly evolving technological realities.

Explaining why he had labeled the changes "dramatic," Dr. Jaffe pointed out that the voice network of the future would run over the Internet Protocol (IP). Since this technology has a different capability when it comes to voice quality, cost, and reliability, this would be a major change, he predicted. More remarkable changes were on the way.
For example, developments in sensor networks and personal networks could mean that cell phones would soon be sold with a built-in camcorder. "Think of having hundreds of millions of video broadcasters--basically everybody--broadcasting to the grandparents whatever is going on in their grandchildren's life, or personal video conferencing: Think of the demand of all that on the network." In addition to commercial impacts, there would be national-security impacts, he added, because sensor networks were also critical for homeland security. All these video cameras and sensor networks, he asserted, raised important issues concerning privacy and personal liberty. This means that the right standards and policy initiatives need to deal not only with the positive potentials of the new technologies but also with some of the potential downsides.

While wireless itself was "an absolutely wonderful technology," Dr. Jaffe said, its potential is obscured because there are numerous standards for it. Among these available standards are 3G1X, EVDO, EVDV, UMTS, HSDPA, 802.11, 802.15, 802.16, 802.20, 802.21, and OFDM, in addition to public-safety standards. The resulting confusion creates challenges both from a policy perspective and from the point of view of interoperability, he noted.

Contrasting Paces of Technology Development, Regulation

Meanwhile, telecommunications services were increasingly becoming blended, with voice, data, video--all media--becoming the same from the point of view of the technology. In response, the Third-Generation Partnership Program, one of the standards organizations, developed a standard--IP Multimedia Subsystem (IMS)--for dealing with these blended services and converged access. All these advances in networking and services were taking place in a regulatory environment that was increasingly concerned about infrastructure protection, disaster recovery, and emergency services.
One of the things that Lucent, as a vendor, worried about, he said, was its need to develop new products and to recognize new regulatory imperatives at a time when the regulatory imperatives were very slow to come out. "The innovators are getting out there with the innovations," Dr. Jaffe noted, expressing concern about the cost of retrofitting regulatory disciplines that are later applied to the system.

Against this background, Dr. Jaffe proposed to talk about six areas where he believed regulatory and standards issues appeared to be standing in our way:

1. Voice over IP (VoIP).
2. The new IMS services.
3. National Emergency Planning/First-Responder Networks. Major communications deficiencies in U.S. first-responder networks were discovered on 9/11.
4. FTTx. Fiber to the home or premises.
5. Government research funding for telecom. The industry's movement to a horizontal model and elimination of stovepipes (as it implemented the innovations of the past 30 or 40 years) had been extremely efficient for the consumer. But, from a research perspective, a major issue had arisen: Who was planting the

new seed corn for tomorrow? "That is something which I think that we as a country need to be concerned about," he stated.
6. Spectrum policy.

With the Third-Generation Partnership Program standardizing IMS, the voice-over-IP system was being built out. Inside the network was a sophisticated set of systems that handled the media control: Session Initiation Protocol (SIP), which was the signaling protocol for voice over IP, plus a variety of controllers, media servers, and application gateways (see Figure 10). Emergency planning, CALEA, disaster recovery, and E911 need to work seamlessly in this new environment.

FIGURE 10 3GPP/IMS provides next generation of blended services and VoIP. [Figure: a 3GPP/IMS routing engine connecting SIP application servers, media servers, a network switch, subscriber data, and a media gateway controller with a media gateway to the PSTN. Noted issues: regulatory clarity to ensure no hiccups in deployment; emergency planning, CALEA, and disaster recovery; security (SPIT, spam, authentication, denial of service); protocol diversity.]

Turning to security issues, he raised the problem of "Spam over Internet Telephony (SPIT)." He warned that while many in the audience might not have heard this term yet, they would likely make its acquaintance soon. This was because opening up a network to the Internet Protocol meant opening it up to misuse, "something that," he said, "we need to be concerned about from a regulatory point of view." He also called protocol diversity an issue, pointing out that those most expert in the signaling protocol for the next-generation network, IP, were

hackers. This was markedly at odds with the case of traditional voice networks, whose greatest experts comprised a closed society. While it is a good thing that we've opened it up, he said, we also need a thoughtful approach to dealing with the security issues.

Authentication: Left out of the Internet's Design

As an example of an approach that qualified as thoughtful, Dr. Jaffe cited the deliberation that led the European Union to decide that authentication was among its biggest issues. As authentication was not part of the Internet's design, with the Internet protocols it was easy for people to hide their identity. While the U.S. "Do Not Call" list had curbed telemarketing over the traditional voice network, 81 percent of email today is spam, he said. And with the simultaneous occurrence of three things--the addition of voice over IP, mobility, and SPIT--it would "be very easy for a user to spam every single cell phone in America with an SMS [Short Message Service] message." This, he noted, is "not a great thing to have in your network."

No one yet knew how to prevent this, and it would be particularly burdensome to those who had to pay when they received messages, as they would have to start paying for spam. Rather than decrying the technology, he said, he was pointing out the need to address the relevant policy issues and thereby to prevent such things from occurring. A variety of solutions to authentication--single sign-on, caller ID, public key infrastructure--had been available for some time, but their implementation had been very slow in terms of the reliability and the security of the backbone. Similarly, he said, it was important to start thinking about how to do the signaling. For example, should there be some out-of-band signaling even within the Internet Protocol? Should signaling packets be given priority over media packets? Were there ways of introducing diversity?
Network Capability and Privacy: A Trade-off?

Dr. Jaffe then enlarged his discussion of the basic voice network to consider the services that allow the viewing of TV programs on cell phones, as mentioned by Dr. Raduchel. To be able to deliver these "lifestyle" services well, he said, IMS was being designed around an approach that would feature: seamless control; data transparency, meaning that everything works even with different protocols and devices; immediacy, in that the user is always on the network; and nimbleness.

By its nature, however, this network would "understand" everything about the user, which in turn raises an important social issue: If the network knows so much about end users, what does that mean for privacy? Although this is a difficult problem, he said, "I think it is absolutely vital that we address it."

Dr. Jaffe discussed U.S. and European privacy models. The U.S. model allowed customers to trade off their privacy for enhanced service; the European model featured very strict laws governing the gathering and sharing of personal data. Most notable in the U.S. market model was the sharing of responsibilities: The government provided an overall architecture defining roles and responsibilities for network operators, network vendors, users, and so on; operators needed to obey those policies, and network vendors needed to provide technology to make it easy for users to specify their choices. He cited IBM's Hippocratic Database and Bell Labs' Privacy Conscious Framework as examples of vendors' efforts to fill the vacuum by defining approaches that allowed users to customize their privacy.

Improving Readiness for Physical, Cyber Attacks

Moving to the topic of emergency planning and first responders, Dr. Jaffe pointed to the recommendation of the 9/11 Commission that the nation be prepared to deal with simultaneous physical and cyber attacks. Also needed, he said, were trusted networks and trusted devices that could keep the government functioning in the event of emergency, and which could help first responders, whose numbers, according to some scenarios, might reach 5 million. "What we saw on 9/11 was that first responders couldn't communicate with each other if they were from different services, and that was within a single city," he recalled. "Contrast that with the broadband mobile communications capability which is available commercially."
What is currently available to protect the population is a system that is not interoperable and that has very low bandwidth. By contrast, current commercial systems provide total interoperability and very high bandwidth, offering real-time voice, video, and location services. This led him to put forward a "modest proposal": Make available to first responders the very low cost, very efficient system of wireless communications that had been developed for commercial needs and was principally provided by cellular vendors. In our federal system, such decisions are delegated to first responders in each locality, limiting the potential for a national interoperable system. That was a problem, he suggested, that the FCC might want to take on.

Emergency planning networks were going to have to evolve so that first responders could handle not only voice networks, currently their main function, but also sensor networks. The latter, networks of highly integrated micro-sensors, would be able to provide much useful information about potential physical attacks on infrastructure. While there is a great deal of technology going into developing the sensors, arriving at the right standards for taking all the sensor information

and feeding it into a national first-responder network is hard. "We desperately need standards," he stated, "and the country is not moving quickly enough."

On the subject of fiber to the home, Dr. Jaffe remarked that residential use of bandwidth had been on the same exponential curve for a long time (see Figure 11).

FIGURE 11 Residential access--Perspective. [Figure: residential access average peak data rate (kbps, log scale) vs. time, 1970-2020, rising from narrowband (dial-up) through broadband to ultrabroadband; successive gains come from more bits/Hz on the same spectrum and infrastructure, then more spectrum, then new infrastructure. Notes: we have barely begun to tap the potential of broadband in 2004; history indicates that we will reach 10-100 Mbps per household, but it will take decades.]

The presence of an "insatiable appetite for bandwidth" had become apparent over the previous 20 years: "No matter how much broadband we give to people, they're willing to use it up." Despite steady improvement in modems, narrowband had reached its limits. Various forms of broadband had replaced it, but there seemed little reason to believe that demand for bandwidth would not continue to increase. This provided an inducement to achieve the highest bandwidth over the longest distances, and--despite the virtues of DSL, cable-modem, and copper--there was no question that the best bit rate for distance was provided by fiber (see Figure 12).

Improving Broadband Capacity

But leaving aside fiber for a moment, Dr. Jaffe pointed to the major challenge the United States faces in broadband: The country had fallen to eleventh in the world--or perhaps, as his fellow panelist David Isenberg interjected, to fifteenth--in broadband capacity per capita. Building costs were a factor in this, he said, but there was another reason: While fiber tended to go into new construction, regulatory uncertainty appeared to be holding down the installation of fiber all the way

to the curb in conjunction with rehabilitation of existing buildings (see Figure 13). In view of this, Dr. Jaffe made a second proposal: that there be clear, consistent regulation encouraging near-term investment in a fiber infrastructure. The lack of regulatory clarity, and the accompanying uncertainty, erode some of our technology leadership.

FIGURE 12 Fiber infrastructure is the enabler. [Figure: distance (km) vs. bit rate (Mb/s) for twisted pair (xDSL, dedicated bandwidth), coaxial cable (cable modem, shared bandwidth), and multi-mode and single-mode fiber (FTTx) at 850, 1310, and 1550 nm, marking where old infrastructure suffices and where new infrastructure is needed.] NOTE: Plot assumes modulation at 1 bit/Hz.

For purposes of illustration, Dr. Jaffe provided the audience with an explanation of a hybrid integration technology created at Bell Laboratories. Fiber optics, he began, was a very expensive technology because all the components are discrete. The cost of that expensive technology "doesn't hurt you that much" when used in a metropolitan network or in a long-haul network, because it is being amortized against billions of usages. But using that same technology at the home is very expensive because the amortization is lost. Bell Labs had developed technology in which optical components were put onto silicon wafers, thereby achieving the efficiency of Moore's Law. According to a graph on a slide he displayed, the new technology could substantially reduce the cost if produced in large volume. However, slow regulatory change had meant that sufficient investment in the technology had not been forthcoming.

Is Fundamental Research a Casualty of Deregulation?

Before concluding, Dr. Jaffe addressed the topic of funding for research and development, one that he said was not only very important but also close to his heart.
Over the previous 25 years, the United States had substantially changed the structure of its telecommunications industry from a single, vertically-integrated company to numerous, horizontally-arranged companies. This was done for very

good reasons and with very good results, as it had substantially increased innovation and reduced its cost. Unfortunately, however, fundamental research had been forgotten in the process. For many years, a tax on the telephone bill had funded basic research in the telecommunications industry. After that, the venture model of the previous decade had provided a very effective substitute for underwriting research in communications: It had plowed billions of dollars, much of it focused on telecommunications, into numerous deals in the 1990s and early 2000s. Since then, however, this model had collapsed. With both models no longer relevant, Dr. Jaffe said, the United States needed a "means of cooperation across the partners in the industry to improve on the research situation." Europe, meanwhile, had adopted the explicit strategy of becoming more competitive in telecommunications and was implementing it, in part, through the European Framework Programs.

FIGURE 13 FTTP landscape. [Figure: FTTP is the most flexible solution for wired broadband services, offering enormous, scalable bandwidth and single-pipe integrated services (voice, data, video, ...), though the U.S. is generally trailing the rest of the world (11th in total broadband deployment). Two scenarios: "Greenfield" (new construction), where fiber runs all the way to the building and costs the same to install as copper access, so all new construction should have fiber installed even if not yet used; and "Rehab" (rebuilding existing copper access plant), where a small part of the access network (typically < 10 percent) is updated each year and "deep fiber" may run to the curb with copper for the last 100 to 1,000 feet to the building.]

Summarizing, Dr. Jaffe called VoIP the "voice technology of the future" but reiterated numerous policy issues: security, reliability, CALEA, E911, disaster recovery, diversity, and authentication.
He stressed that while new services would be enabled through the network's "knowing" a lot about the user, it was necessary to ensure that this was handled appropriately. He termed emergency planning inadequate and called for a nationwide solution based on interoperable high bandwidth. He indicated that the United States could no longer afford "to keep losing ground to the other countries of the world" in fiber to the home. And, finally, he pointed out that much of the innovation the country had seen over the

previous 20 years and would see over the ensuing 10 had resulted from its past leadership in basic research in telecommunications. With this in mind, he advocated reexamining the current U.S. research model.

FOUR FUTURES FOR THE NETWORK

David S. Isenberg
Isen.com

Dr. Isenberg began by apologizing for having interrupted Dr. Jaffe, although he explained that the occasion of the interruption, an allusion to the U.S. ranking in broadband per capita, was one of his "hot-button issues." The data showing the United States to be eleventh in the world were some three years old, and in the interim this country had been growing at 42 percent per year, while a number of the countries that had placed below it in those rankings had posted annual growth rates approaching 300 percent.

The International Telecommunication Union, in a study issued early in 2004 and reflecting 2003 data, had placed the United States thirteenth. But in making his own quick analysis of the three-year-old data, Dr. Isenberg had projected that the United States would fall within a year to last among the 15 nations considered. While he had not seen a listing of countries 16 through n, he said that he would "guarantee" that some on it were growing at triple-digit rates. He would, in fact, "not be a bit surprised" to find that the United States, number three in broadband per capita as recently as 2000, had been knocked out of the top 15. "So, for all the wonderfulness of the Communications Revolution and all the improvements we're seeing in this country," he declared, "we're in a disaster: We're losing our national leadership."

Taking Intelligence out of the Network

Originally, the title of Dr. Isenberg's talk was to be "The Rise of the Stupid Network." The "stupid network" was more or less the result of applying the "end-to-end principle," which states that "if you can do something in the middle of the network or at the edge of the network, do it at the edge."
Borrowing a formulation from Tim Bray, he said that the way to explain the principle to a telephone company was to say that "you want a fat pipe, you want it to be always on, and then 'get out of the way.'" This principle--take the intelligence out of the network and put it at the edge--had guided the Internet's success. Jerome Saltzer, David Reed, and David Clark had articulated the principle, which Dr. Isenberg said was currently "the key factor," in the late 1970s. While Dr. Raduchel and others might talk about digitization and packetization as important, and while these were indeed necessary, a packetized, all-digitized network could still be a vertically integrated, stovepiped, closed network. To open up the network the end-to-end principle is needed. Indeed, he asserted, the end-to-end principle had been directly responsible for all killer applications of the previous decade, some of which he listed: email, e-commerce, Web browsing, instant messaging, blogging, audio over IP, and Internet telephony. Not one among them had been invented by a stovepiped, vertically-integrated network provider like a telephone company or a cable company. Rather, each had been brought to market as an application on top of a stupid network. The N10 network allowed future applications to be discovered.

'Any Application over Any Network'

Considering the formulation "Any Application over Any Network," Dr. Isenberg shifted focus from the first part of the phrase to the second, the "flipside," as he termed it: "Over Any Network." He presented a second list comprising twisted pairs, CoAx, Cat 5/6, fiber, hybrid fiber wireless (HFW), licensed wireless, unlicensed wireless, new wireless modulation techniques, and new wireless architectures. And, he said, more physical layers and new architectures alike remained to be discovered. Evoking the image of an hourglass, he placed the "cornucopia of applications" in the top, the Internet Protocol at the middle, and any network in the bottom.

The result, said Dr. Isenberg, is physical diversity, which avoids dependence on one set of infrastructures such as SS7. The disadvantages of such dependence were illustrated by the notorious incident in 1990 when a switch generic that was missing a semicolon caused a lengthy interruption of telephone service during which tens of millions of calls were blocked. "Physical diversity is the only route to absolute network reliability," he stated, "and you only get physical diversity with the end-to-end network."
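The hourglass Dr. Isenberg evokes--a cornucopia of applications on top, the Internet Protocol at the waist, any network at the bottom--can be sketched in a few illustrative lines. The application and layer lists echo his own examples; the `send` function and its packet shape are hypothetical, not anything from the talk:

```python
# Minimal sketch of the "hourglass": many applications on top, one common
# waist (the Internet Protocol), many physical layers below. The lists echo
# the talk; send() and its packet shape are purely illustrative.

APPLICATIONS = ["email", "e-commerce", "Web browsing", "instant messaging", "VoIP"]
PHYSICAL_LAYERS = ["twisted pair", "CoAx", "Cat 5/6", "fiber", "unlicensed wireless"]

def send(application: str, payload: bytes, physical_layer: str) -> dict:
    """The network only carries IP packets; it knows nothing about the
    application, so the intelligence stays at the edge."""
    packet = {"protocol": "IP", "payload": payload}  # the narrow waist
    return {"app": application, "via": physical_layer, "packet": packet}

# Any application runs over any physical layer without the network changing:
for app in APPLICATIONS:
    for layer in PHYSICAL_LAYERS:
        assert send(app, b"hello", layer)["packet"]["protocol"] == "IP"
```

Adding a new application or a new physical layer changes nothing in the middle--which is the sense in which the stupid network "allowed future applications to be discovered."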
Again, it had not been telephone or cable companies that had developed the most effective of these networks: Internet, Ethernet, and unlicensed wireless. Moreover, future network technologies remained to be discovered.

Disrupting the Telco Business Model

Dr. Isenberg then offered a brief and, he said, somewhat oversimplified overview of how the N10 network disrupts the telephone company's business model (see Figure 14). Alerting the audience to what he termed the crux of his presentation, he explained: "The 'stupid' network, the end-to-end network, makes it impossible for the telephone company to sell anything--it is left with nothing to sell other than commodity connectivity." Describing the "old" model, he said that when a telco set up a call, it touched every element in every network. "This allowed the owner of Network C, for example, to introduce cool features so people would prefer it to bad old Network B, which didn't have the features." In the new, inter-networked model, it was the Internet Protocol's job to make all that was specific to a single network disappear and to permit only those things common to all networks to come to the surface. Since the Internet ignores whatever is specific about a single network, including the features that had formed the basis of competition, features lose relevance in an inter-networked world.

FIGURE 14 How end-to-end disrupts. [Diagram contrasting the telco model, in which the voice application is the product and monthly income subsidizes network and physical layers designed for voice, with the inter-networking model, in which any application (voice, data, video) produces income, the network layer is the Internet Protocol commons, the physical layer is non-specific, and the "big question" is the business or operating model for end-to-end connectivity.]

Coming at the same point from another angle, Dr. Isenberg said that in the stovepiped, vertically-integrated model a telephone or cable company sold the application and then subsidized the underlying layers with the application revenue. In an inter-networked model users still buy the application, which still produces income; this is something the industry knows how to do. But as the applications rest upon a commons, the Internet, a "big question" remains: "What is the business or operating or functional model to get the physical connectivity?"

Future of the Network: Four Scenarios

As a prelude to sketching four scenarios for the future of the network, Dr. Isenberg characterized the status quo. The last mile was no problem anymore, as customers had 100-megabit, even gigabit LANs in their homes. Nor was the price of technology a problem, as these LANs were available for $39 at office supply stores. Rather, the current hitch was located "in the middle of the network," at the level of access (see Figure 15): "I've got a gigabit in my Macintosh sitting over there useless, because I can't connect at a gigabit."
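The access bottleneck Dr. Isenberg describes is min-of-links arithmetic: end-to-end throughput is capped by the slowest hop in the path. A minimal sketch, with speeds echoing the shape of Figure 15 (the 1.5 Mbit/s access figure is an assumption for illustration, not from the talk):

```python
# End-to-end throughput is capped by the slowest link in the path.
# Speeds (Mbit/s) are illustrative, echoing Figure 15's shape: a gigabit
# customer LAN and a terabit backbone separated by a slow access link.
path = {
    "customer LAN": 1_000,        # gigabit LAN in the home
    "telco/cableco access": 1.5,  # assumed DSL-era access speed
    "backbone": 1_000_000,        # terabit backbone
}

bottleneck = min(path, key=path.get)  # the hop that limits everything
print(f"effective rate: {path[bottleneck]} Mbit/s, set by the {bottleneck}")
```

However fast the LAN and the backbone get, the middle term dominates--hence a gigabit Macintosh that cannot connect at a gigabit.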

FIGURE 15 Connectivity today. [Diagram: the customer's gigabit LAN reaches the terabit backbone only through the telco/cableco access link.]

Under the first of the four scenarios, competition as envisioned by the Telecommunications Act of 1996, a variety of pipes was to go into the home (see Figure 16). The result, as described by Dr. Isenberg: "Multiple players and everybody loses." The fundamental assumption of the 1996 Act was, as he characterized it, that thanks to the market's "magic hand," competition would do what regulation could not. But he and others, among them Roxanne Googin, believed that if the middle of the network was empty and the only thing occurring there was the movement of bits back and forth, one was dealing in a pure commodity and it was very hard to have anything to sell. Once a telephone company, or a fiber company for that matter, had sold one fiber to a user, it would never sell that same user another. For, with more technology coming next year, the user would be able to light the fiber twice as fast then, and even more technology would be coming the year after that. After the initial sale, therefore, the fiber vendor would be out of business. This gave rise to a paradox: In order to survive in the competitive world of the "best network," telephone companies would have either to cripple the network or to cripple competition.

Scenario two represented the future according to the telephone companies (see Figure 17). "It is today's 'official' future," said Dr. Isenberg, "if you get around the fact that when they say 'competition,' what they really mean is 'competition where I'm the competitor.'" While there would be a modicum of improvement, far less bandwidth would be available than technology would allow or than was available in other technologically advanced countries.
Besides a crippled network, this scenario featured crippled competition: Municipalities would not be allowed to compete, for example, and CLECs (competitive local exchange carriers) would have been driven out of business.

FIGURE 16 Scenario #1: Competition (as envisioned by 1996 Telecom Act). [Diagram: multiple players run pipes from the terabit backbone to the customer; everybody loses.]

FIGURE 17 Scenario #2: Telco-topia--the "official future." [Diagram: telcos--3G and the like--bring a little improvement, but with crippled technology and crippled competition.]

Entertaining a Forbidden Thought: Monopoly

Scenario three, which he called "rethinking 'natural monopoly,'" was "politically incorrect," Dr. Isenberg acknowledged, although he added: "what the hell." He averred that the Bell System had, for 50 years, given the United States what was arguably the world's best telephone system, albeit a vertically integrated one. He proposed, therefore, determining what the current natural monopoly was and whether something useful could be based on it (see Figure 18). Pointing out that monopolies in themselves are not illegal but only become illegal when they engage in certain behaviors, he posited that a monopoly might be crafted that was wisely regulated, well run, and public spirited.

FIGURE 18 Scenario #3: Re-regulation--rethinking "natural monopoly." [Diagram: a wisely-run, well-regulated monopoly "that gets it" connects customers to the terabit backbone.]

Under the fourth scenario, technology would become so good that customers would simply build and own the network (see Figure 19). "We'll go down to the Networks 'R' Us, buy a network device, plug it in, and be on the network." There were reasons to believe, dating back to Tim Shepherd's famous MIT thesis in 1995, that this was within the scope of current technology.

FIGURE 19 Scenario #4: Customer-topia. [Diagram: wired and wireless customer-owned networks take over the access business, connecting directly to the terabit backbone.]

Speculating on what a future telecommunications act might mandate, Dr. Isenberg concluded with the following questions: "Will we be locked into an 'everybody-loses' situation, or a 'too-little-too-late' situation, where the United States loses its global leadership? Or will we manage to come up with some kind of monopoly at the very lowest layers, perhaps a monopoly that just strings fiber but doesn't light it? Or will we encourage the kind of technology whereby we don't even need a company to run our networks for us?"

DISCUSSION

Cynthia de Lorenzi introduced herself as chief executive officer of PatriotNet, an independently owned Internet service provider (ISP), and as a representative of the Washington Bureau for ISP Advocacy (WBIA), an organization made up of "the abundant small ISPs who helped grow the Internet." She asked advice on what message she might take back from the symposium to her colleagues at WBIA, to the CLECs (competitive local exchange carriers) they worked with, and to others who, she said, "help drive this industry." Voicing the claim that the small ISPs were actually the doorway to innovation, she asked what their future was and whether it was "time for all of us independents to go away."

Dr. Isenberg stated that "if nothing else gets done and the current policy directions are carried out," the industry was moving towards a reverticalization in which it would make sense only for connectivity providers to be ISPs. Agreeing that small ISPs were "very pro-innovation," he said he shared her worry. Dr. Jaffe, recalling technical challenges he had outlined concerning quality, security, reliability, understanding of users, providing privacy, and so on, remarked that there were many open issues in the next generation of network.
Innovative service providers large and small would likely have a very important role--"introducing the necessary enhancements in an entrepreneurial way"--and he would, he said, encourage the small ISPs to look at such things as ways of providing "a better VoIP solution than the next guy." There would be a great deal of competition in that area, he predicted.

Transportation a Substitute for Bandwidth?

Jay Hellman, who introduced himself as "a real estate developer with too much of a technology education to think like one," indicated that the theme of his question would be the relationship between transportation and communication. Office buildings, which he had been in the business of constructing, and computers are not nearly as different as they appear, he postulated. Office buildings were invented as a tool for what was then a new kind of work: processing and communicating information. "The office building," he explained, "was an information processing factory, and the paradigm was paper-based manual labor." While it was a familiar fact that location is of paramount importance in real estate, he stated, "the technology that defines location is transportation." It was a point so obvious that one rarely thought about it: The easier it is to get there, the more valuable the location is.

Turning to communications, Mr. Hellman said that he had been led to start a telecommunications company by his frustration with the attempts of existing telecom companies to pass off DSL and cable modems as broadband. His conclusion, he said, had been that the last mile of the broadband network was neither copper nor CoAx; it was asphalt. "When you need bandwidth," he explained, "you get in your car and go there," stating that those in attendance had traveled to the symposium "for bandwidth." Although personal, face-to-face meetings were undeniably of value, they were overused, with the consequence that the transportation network was "in complete congestive overload." A possible factor in this was the principle enunciated by Dr. Jaffe: No matter how much bandwidth people have, they want more.

It was in light of the relationship between communication and transportation, Mr. Hellman said, that he hoped the panel might address the issue of regulation. Putting fiber into the home and making sure that it functions is a business, and it needs to be a profitable business. But, comparing it to the building of streets, he suggested that in the interest of ensuring a fair rate of return, it "ought to be a regulated business, almost like the real AT&T." Carrying the analogy further, he likened the duo of fiber and services to that of the public thoroughfare and such service companies as UPS and FedEx that use it to compete; it was desirable, he added, that the street be as accessible to as many people as possible. What did the panelists think, he asked, of the contention that the last mile of the network should be not competitive but regulated, yet that it needed to provide significant bandwidth and to be ubiquitous?
Networks in the Hands of Customers

Dr. Isenberg, acknowledging that Mr. Hellman was "onto something," named two "antidotes" to the kind of regulation he had spoken of: (1) technological decline and continuation of business as usual, and (2) the development of technologies that allow customers to own their own networks. He called Mr. Hellman a "pretty good example" of the latter, since he had started a telephone company after being unable to contract for the telecommunications services he needed. Dr. Isenberg hoped that the regulatory situation would, he told Mr. Hellman, "encourage people like you to do your own thing," adding that he was "on the right track as far as thinking about the larger, more generic solution." But regarding another of Mr. Hellman's points, that there was no such thing as too much bandwidth, Dr. Isenberg cautioned that telephone companies would take exception. "If they served up too much bandwidth, then they wouldn't have anything to sell," he stated, arguing that "telephone companies make their profit based on scarcity."

Toward More Sophisticated Price Indexes

Dave Wasshausen of BEA's national accounts staff registered his agreement with Dr. Doms that having good price indexes for high-tech communications equipment, computers, and software is vitally important to measuring real investment in the national accounts. While admitting that his colleagues generally used the Producer Price Index in their work, he noted that they were receptive to work on indexes being done by academics and in the private sector, and he pointed out that they had incorporated Dr. Doms's work into their price indexes for LAN equipment, switch gear, and other types of high-tech equipment. Such symposia as the present one were very important to him and his colleagues, as they wanted to learn more about how to measure such equipment. Finally, recalling Dr. Doms's allusion to the budget constraints under which statistical agencies found themselves, he agreed that BEA had to prioritize. From his own perspective, the highest priority at that moment was price indexes for software; this matter had been treated at the STEP Board's symposium of February 2004, which he planned to revisit.

Dr. Myers, returning to Dr. Jaffe's comments on the demise of basic research within corporations, observed that great basic research laboratories had been created and supported by monopolies. In addition to the AT&T and IBM monopolies, there had been a Xerox monopoly and, for a number of years, a monopoly held by DuPont. Given the contemporary consumer-oriented, market focus, he said, it was unlikely that a government monopoly would be created. "The only monopoly I envision occurring would be a 'market monopoly,'" he said, pointing to the "Wintel" monopoly reigning in personal computing as an example. Expressing his doubt that this monopoly had created the kind of basic research evoked by Dr. Jaffe, he asked for the latter's comments.

New Models for Basic Research

Dr. Jaffe praised the government for doing an "outstanding job" in funding fundamental research within the university system--a role identified over half a century before. As the nation moved toward having fewer natural monopolies, he said, it needed a way of funding basic research in the commercial sector, which "brings a different perspective than the university system." That challenge was currently being studied by a panel of the National Research Council's Computer Science and Telecommunications Board, whose report was due out early in 2005. Led by Bob Lucky, the panel was to consider the dimensions of telecommunications research and, in addition, the models that might be appropriate for it.

Dr. Myers expressed the STEP Board's appreciation to the panelists.
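The quality-adjusted price measurement that Dr. Doms and Mr. Wasshausen discuss can be illustrated with a toy matched-model calculation--one standard way to keep a changing product mix from masquerading as price change. This is a sketch of the general technique, not BEA's actual procedure, and every model name and price below is invented for illustration:

```python
# Toy matched-model price index: compare prices across periods only for
# models observed in both, so quality churn in the product mix does not
# masquerade as price change. All models and prices are invented.
from math import prod

prices_2003 = {"switch A": 1000.0, "router B": 800.0}
prices_2004 = {"switch A": 700.0, "router B": 600.0, "router C": 900.0}  # C is new

matched = prices_2003.keys() & prices_2004.keys()  # models present in both periods
relatives = [prices_2004[m] / prices_2003[m] for m in matched]
# Geometric mean of the matched price relatives (a Jevons-style index):
index = prod(relatives) ** (1.0 / len(relatives))

print(f"matched-model index, 2004 vs. 2003: {index:.3f}")  # below 1.0: prices fell
```

Hedonic methods, of the kind used for computers, go a step further by regressing price on characteristics such as speed and capacity, so that even models absent from one period contribute to the quality adjustment.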