6
Public Policy and Private Action

Introduction

The evolution of the national information infrastructure (NII) presents a number of problems, alternative solutions, and opportunities for private and public parties. With an emphasis on gathering private sector perspectives, this project did not set out to define appropriate roles for government. Contributors to the forum made clear that there is hardly a consensus among representatives of various industries about the role of government. Across the range of issues relating to information infrastructure there is evidence of imperfect performance both in markets and by government. The serious debate and commentary therefore center on which imperfect government actions are justified as remedies for imperfect markets. The workshop, forum, and set of white papers developed for the NII 2000 project, like the steering committee, mirror society in presenting divergent perspectives.

Regardless of political sentiments about its role in general, government at all levels will inevitably be a major player. Government agencies—at state and federal levels—participate in almost every information-related role pursued by the private sector—publisher, user, network manager, innovator. Governments have additional responsibilities by virtue of their constitutional obligations—as arbiter, regulator, convener, and even leader in the interest of equity and an efficient, productive society. The federal government has unique responsibilities with respect to the transnational issues arising in the global information infrastructure (GII) and advancing the national technology base through support for research and development.1 Support for research and development (R&D) can help to increase the options,2 lower the costs, and enhance the capabilities of technologies that can be brought to the NII marketplace.



Given the preponderance of private investment and the uncertainty surrounding information infrastructure markets, many people point to the importance of reducing government constraints on private decisions and investment. The big task for government, they would note, is careful but determined deregulation of telecommunications services. A second task is the studied avoidance of new regulations in response to problems arising in the new uses of technology. The concern is not only to avoid constraining competition, but also to minimize the kinds of actions to manipulate the system that telecommunications regulation has unintentionally fostered. As Robert Crandall of the Brookings Institution observed, "Once regulation becomes a system for politically redistributing income, and cross-subsidizing one service out of another, it becomes very difficult to allow competition and the entry of new technologies."3 Participants in this project pointed to both telecommunications regulation, per se, and the Modified Final Judgment decree arising from the AT&T Corporation antitrust suit as forces constraining entry.4

Although those inputs centered on existing telephone and cable businesses, other comments related to the challenge of how to foster entrepreneurialism, development of new information infrastructure industries, and market entry by smaller and nontraditional players. As Quincy Rodgers of General Instrument observed, smaller players have fewer resources, limiting their ability to enter certain markets successfully. Technological innovation and competition should be encouraged together, he and others argued, leaving to the marketplace the determination of which technologies deliver value to customers.5

Many others ask, How can so many industries and so many users expect to acquire enough shared knowledge to create consensus unless some legitimate public body, not competing in commercial markets, is willing to take a leadership role aimed at orderly evolution of the NII? Governments, perhaps uniquely, can try to encourage all parties (their own agencies included) to address cross-cutting issues cooperatively and constructively. The challenge is to effect government leadership that keeps options open, encourages innovation, and allows markets to work.

Whether the government role is seen as something to be minimized or something to be leveraged, it will reflect two realities. First, the intertwining of public and private investments and activities shaping the NII implies that consultation between government and private sector institutions will have to be effective at the operational as well as the strategic or policy level. This interaction will have to reflect the fact that public and
private investment and activity together are dominated by private sector elements. Second, the NII—and the GII of which it is a part—is an incredibly complex economic and technological system. Simply understanding its behavior is a major task; finding appropriate tools for addressing gaps and bottlenecks requires a level of knowledge of system behavior that no one industrial sector (or governmental unit) is likely to have. Examination of these two realities in turn is furthered by understanding how the Internet has gained in significance in both contexts.

Public-Private Engagement

The high level of collaboration among its suppliers and users—and the associated roles of public and private institutions—makes information infrastructure a hybrid of the principles guiding other (and not unrelated) forms of public infrastructure, including traditional public utilities and education, which present contrasting mixtures of public and private roles.6

The Internet provides a powerful illustration. It developed out of the mission responsibilities of several federal agencies plus state and regional interests in both feeder and backbone networks. The Internet has demonstrated that some commonality of vision can make a decentralized process more efficient and effective.7 However, there was no grand plan; the "bottom-up" character of the development, typical of most technology-driven changes that occur in a competitive environment, engendered vitality and legitimacy. The Internet developed largely as a result of research conducted in many institutions across the country and regional needs to interconnect state and private institutions and sources of information. The NSFNET, a principal Internet backbone commissioned by the National Science Foundation (NSF) and operated under cooperative agreement by a joint venture among IBM, MCI, and the nonprofit Merit Inc., was a proving ground for cooperation among government, industry, and academia in the planning, development, and operation of information infrastructure.8 Public and private research and investment have supported the creation of collections of valuable information as well as services to make them available. Thus the Internet illustrates government investment as a catalyst to private investment in both the demand and the supply sides of information infrastructure. For a discussion of emerging prospects, see "Government as User and Service Provider" below.

Unresolved is the appropriate role of government in assuring operational integrity of an NII that is increasingly diverse in its components, service offerings, and management. See observations from David Messerschmitt of the University of California at Berkeley in Box 6.1. In an Internet that is not a critical system for its users, anecdotal evidence of operational
failures may not be a cause for alarm and may even be a transient phenomenon, although it is also a cautionary indicator for the nation's information infrastructure in general, since mature infrastructure must be reliable. These operational failures also raise questions about whether some minimal common institutional structure or mechanisms for governance are needed to preserve system integrity, availability, and so on.

BOX 6.1 Network Control and Management

If we really have a goal of a national information infrastructure that incorporates some existing networks, such as the Internet, telephone networks, et cetera, [it will combine] existing networks [having] existing control structures (e.g., Signaling System 7 in the telephone network). The Internet does not have such a structure. But it is developing it for the purposes of providing quality-of-service guarantees. The question I have is, how are we going to develop some kind of standards or some way to ensure that all of these different pieces of this overall infrastructure interoperate at the control or signaling level? We tend to talk about standards in terms of [examples such as] MPEG, which involves the actual signals going through at one time. But how are these networks going to control each other? When I have to access a Mosaic server across the Internet from a TCI-run cable TV system, how does the addressing and quality of service information get transferred from the cable system to the Internet in some organized fashion?

—David Messerschmitt, University of California at Berkeley

Finally, the Internet also provides a vehicle for exploring the costs, benefits, and alternatives for equitable access to information infrastructure and associated services.9 See Box 6.2. Attorney Allan Arlow, Tora Bikson of the RAND Corporation, Richard Friedman of the University of Wisconsin Medical School, Allen Hammond of New York Law School, and other contributors expressed concern about the prospect of "information have nots," acknowledging that some segments of the population are not sharing—as a function of market forces alone—in the widening deployment of advanced services and sophisticated information appliances.10 The Internet, and more limited government service programs (e.g., Social Security) and commercial programs using dedicated or specialized systems, offer the opportunity for some experience through library- and school-based access and kiosks (with public terminals) that provide access in various public spaces (Jackson, 1995). They show both the potential for and limitations of public access arrangements, with limitations including the level of access achieved, the public nature—and therefore limited personal privacy afforded—of the access, and the limited ease of use and thus the training requirements.

BOX 6.2 Universal Service

Yesterday the term universal service did not differentiate between access and the service itself, because for a phone call the phone and the service were essentially the same thing. Now, or in the future, we are talking about two different things. One is equitable, if not universal, access to the network so that everybody can touch the network, even if that means walking to a library. A separate issue is whatever subsidy is appropriate to make sure that the people that "we the public" want to have certain services, have those services.

—Robert Powers, MCI Telecommunications Inc.

You cannot have an NII if you do not hook most of the people in the United States into it. At least you cannot have an affordable NII. … Nothing is more universal in the United States than television: 98 percent of all homes have television; 96 percent of all homes have telephones; 63 percent of all homes have cable. Good access to the American home depends on television.

—James McKinney, Advanced Television Systems Committee

Paying customers should be the largest contributor at first; educated middle- and upper-income individuals who are driving the use of technology today. Second, there should be free access to schools, libraries, and other public facilities providing basic access and exposure to the network infrastructure, which will build demand for the future. And finally, universality, the end goal, is not possible without a revenue stream to attract and hold investment.

—Michael Greenbaum, Bell Atlantic Corporation

Over the near-term (5- to 7-year) horizon of this project, public and institutional access may be the only access available to some segments of the population, given the constraints on expanding the quantity and quality of bandwidth capacity serving residences and small businesses. Because for many public access remains a second-best option compared to personal systems (e.g., telephones, televisions, and computers in residences),11 it is an area that warrants explicit monitoring and assessment, on behalf of those who cannot afford their own personal computers. Public access should be the focus of a testbed or other type of experimental program to explore technical, cost-sharing, and other dimensions of the issue.

NII Systems Issues

The central NII systems issue relates to promotion of an overall
architectural concept (see Box 6.3), something that many participants endorse in principle but find difficult to attain in practice. The steering committee endorses the ideal of architectural convergence while noting the pragmatic and not inconsistent expectation for multiple approaches. It distinguishes the question of objectives from that of process.

The process of developing NII architecture is not a neat one. At the forum Howard Frank of the Advanced Research Projects Agency (ARPA) asked explicitly, "Do you see any kind of process that could bring a convergence of architectural concepts?" The confounding of technological and business strategies makes a clean answer to that question elusive.

One element of such a process that emerged from the NII 2000 project is vision-setting discussions and forums, in which a central challenge is to remain apolitical. Realizing the Information Future (CSTB, 1994b) argued for the benefit of a common architectural framework, and the many briefings and presentations associated with that report's dissemination furthered discussion within individual industry and technical communities. Complementary public discussions have taken place over the past 2 years under a number of different auspices.12 The NII 2000 project built on that stream and emphasized cross-sectoral and cross-industry interaction. It also carried forward a leadership-via-convening approach that can foster development of common understanding and provide for interaction among disparate parties (multiple provider industries or providers and users). Such discussions can have an impact on the development of architecture without interfering with particular deployment activities, because interoperability can be achieved at a higher level in the architecture than that represented by the facilities that dominate investment decisions.

A more active process revolves around the surging interest in experimentation with Internet applications, which have provided practical, hands-on exposure to open architecture and internetworking. They may also be more broadly visible than the market trials of more closed systems ongoing among cable, telephone, and other companies. Accordingly, attention to the Internet grew over the course of the project, raising questions about what, if anything, the government should do to help the Internet evolve in the context of commercial operation, other than to transfer its management institutions carefully from government support to user (or investor) funding.

Any further process, presumably, should tip the business case toward a common architectural framework. The message of the NII 2000 project is that such a business case may emerge over time. How quickly and whether it can or should be accelerated remain subject to debate, with the notable exception of R&D (see "Technology Development Through R&D"), on the importance of which there were affirmation and consensus despite the project's emphasis on downstream deployment.

BOX 6.3 Architecture

Where does the NII fall on the continuum ranging from the decentralized model with all intelligence residing in the terminal, spanning the Internet model and the ability to develop and deploy new applications very quickly, … to the centralized model, which includes the old telephone system? … Is it possible to have a single model for the NII on this sort of continuum? … Is there one network, or is there instead an internetwork of at least several types of networks that are organized according to quite different paradigms?

I think that in some respects we are talking past each other, because some very different types of applications are being discussed. The two fundamental types of applications are (1) those that depend on database access, including information retrieval and many of the entertainment applications that involve accessing a central reserve of information (entertainment video, for example) and (2) communications—between two computers, two people, or between the computers of people. For applications based on database access, full connectivity is less critical, as can be seen in the case of companies like America Online and CompuServe that have made a business out of users subscribing to only one of these services. On the other hand, an application like Mosaic running on the Internet illustrates the value of full connectivity of users to a database. …

Is full, logical connectivity required within a national information infrastructure? If so, then interoperability, standards, and other associated issues become much more difficult to deal with. A related question is, Are we going to have a common infrastructure for both types of generic applications, database access, and communications, or are we going to have a proliferation of separate infrastructures for these two kinds of applications? If we have communications infrastructure that provides for full connectivity between users, then we inherently have simultaneously infrastructure that also serves for database access applications. …

It is not necessary that all near-term deployment provide all the capabilities represented in a strategic vision. Indeed, one critical aspect of such a vision is that it should be easy and cost-effective to add new technologies and capabilities to the NII as unanticipated applications and user needs emerge. If this flexibility is achieved, it is only necessary that near-term investments be compatible with a long-term strategic vision, and hence not preclude future possibilities or force later disinvestment and widespread replacement of infrastructure.

If people can access the Internet over their cable system and it is providing what they want, then over time the bandwidth of that Internet pipeline will grow, and the bandwidth of everything else will shrink, and gradually people will move toward the solution they like. That is a market force. Perhaps the Internet or its extension could serve [the necessary] function—which is not the task of bringing together many different things, but rather the process of starting with what we have and extending it in the direction we want to go.

—David Messerschmitt, University of California at Berkeley

Discussion among contributors from a range of telecommunications and content industries illustrated how many factors intertwine to complicate movement toward a more open architecture, as examined in Chapters 3 and 4. For example, Edward Horowitz, whose company, Viacom Inc., has left the cable business to focus on content generation and marketing, commented on the importance to content providers of unconstrained access by customers and complained about service providers as bottlenecks. Bell Communications Research's Padmanabhan Srinagesh, who has been studying the economics of internetworking, suggested that "there might be a trade-off between encouraging competition at the lower levels, at the level of bits in the networks, versus having an open architecture where the people who provide the bits also provide a standardized open interface that other people can use to compete with them for end users." But the National Cable Television Association's Wendell Bailey observed that the cable industry was built on control of content, albeit in an environment of local monopoly franchises: "We relish, as an industry, our right to be an editor."

Content publishers and customers may prefer the convenience of maximal openness, but conduit owners express concern that an obligation to carry any kind of content reduces them to common carriers and may therefore constrain their potential to generate profits through selection and differentiation. At the same time, if they are not common carriers they assume a heavy legal obligation with respect to the content they provide, even if they did not originate it. Perceptions of how common carriage has fared in conventional telecommunications contribute to reservations (relating to profitability, potential regulation, and business development prospects) expressed by some about the business implications of open architectures.

Another source of concern is uncertainty about how the market will evolve to support competition based on ownership of facilities and/or service supply—what forms of industry structure will be sustainable? As noted in Chapter 3, there were those at the forum who expressed concern about what might happen in telecommunications and information service markets if prices became chaotic. Others worry that open competition in an industry so bound to its fixed costs will produce effects akin to those experienced in the airline industry following deregulation. The low consumer prices that "price wars" might bring would be welcomed by many, but an industry shakeout and reconcentration might follow, which could either sustain or slow investment in deployment. One telephone company representative said that "there would be many starters but few finishers." See Box 6.4.

BOX 6.4 Competition's Darker Side?

In the long run, [technological and associated cost trends may give rise to] an industry with chronic excess capacity. An industry like that is very susceptible to price wars. If there are two or more broadband wires to the house providing essentially the same service, both economic theory and practical experience suggest that price is going to get competed down to marginal cost, leaving the producers with no way to cover their fixed investments in infrastructure. The result is bankruptcy, financial distress, et cetera.

So here are a few scenarios. One is to just let it happen. The problem is cutthroat competition and financial distress. After two or three repetitions, firms start losing interest in investing in the market, because there is no way to recover their infrastructure investment. They drop out, only a few major players remain, and local or maybe even national monopolies result.

Well, what about monopolies? Monopoly provision is better than no provision at all. Local monopolies, such as the local telephone service, the cable TV service, and so on can provide broadband service on a city-by-city basis. The problem is that you get the usual monopoly distortions in terms of excessive pricing, lack of innovation, and maybe even worse, the dreaded "R" word: the cause for regulation.

The third scenario is the "killer app" scenario, in which a very high bandwidth application becomes very popular and the revenues from selling the transport for such a "killer app" are sufficient to cover the infrastructure investment. But what is that application? As we have heard today [at the forum], nobody knows what the "killer app" is—two-way video, interactive virtual reality. … There is not going to be a market there if people aren't willing to go for the applications. So it is the usual chicken-and-egg problem.

Finally, the last scenario is vertical integration with the content providers. That seems to be the most popular solution. … But if the market for transport becomes very competitive in the future—if transport is a commodity business—then the money is going to have to flow the other way. The content will have to cross-subsidize the transport. Then the content is what is paid for, and the transport is thrown in for free; it is just a cost of delivering the content. A question is, Will the content providers really stick to their joint ventures once they have competing transport providers clamoring for their business? That remains to be seen.

—Hal Varian, University of California at Berkeley

While the steering committee heard a number of concerns about the possibility of closed systems hindering the development of the NII, it finds little reason for concern that closed systems will represent an issue that will require policy intervention, beyond traditional tools of competition policy such as antitrust. The position of most developers is that new applications will come into existence over the Internet, an open system, and that an application, if initially created in an open form, has a natural resistance to being bundled. In support of this tendency, the steering committee notes the success of the Internet itself, the evolutionary path by which the on-line service providers have become Internet access
providers, and the trends toward open systems and open protocols in general. If the open form is robust in terms of functions and institutions, it will survive. If it is not, that form should not be sustained by policy unless there is a compelling national need. Research, however, can produce results that may make open systems more attractive (e.g., by lowering perceived costs), potentially accelerating their implementation. Thus, the steering committee concluded that the government in particular need not and should not make any presumption that it must protect the open nature of mature products, but that it should move to foster an open, innovative environment in which new services and applications can occur. This approach includes encouraging the deployment of communications infrastructure that is general and flexible, removing regulatory barriers to innovation (for example, making spectrum for experiments easily and predictably available)13 and competition, and continuing to foster the success of the Internet through R&D and use in the delivery of public services.

Defining Roles for Government

The large volume of descriptive material and statements of opinion elicited at the workshop and the forum and in the white papers provide some insights into current and potential roles of government (at all levels) in policy making. They fall primarily into the following categories:

Regulation and other rule-setting influences on market entry and conduct;
Direct government use of and experimentation with information infrastructure in the delivery of public services; and
Exploration of systems issues and fostering of technology development through research and development to create options, standards to choose among the options, and systems-level consensus to guide the choice of standards.

Three other government roles that cut across or influence the above three policy areas include attention to international aspects (trade, regulation, and interconnection and interoperation), data and analysis relating to infrastructure trends, and convening of different parties and sectors. These areas are discussed in turn in the following sections.

Regulation, Rules, and Norms

Complete examination of issues related to regulation was beyond the scope of this project, but the body of inputs (see Box 6.5) underscored the

BOX 6.5 Regulation and Market Roles

Do the conditions exist in the regulatory environment that make it possible for every player sitting at this table to risk its capital on relatively equivalent terms?

—Gail Garfield Schwartz, Teleport Communications Group

Monopolies [established by regulation] become powerful forces for creating hostages in government and lead to the retardation of technological development. Those monopolies have led to a slowdown in television's introduction and spread in the 1950s, a substantial slowdown in cable television's introduction and spread in the 1960s and 1970s, and a substantial slowdown through regulation of the development of wireless communication systems in the 1970s, 1980s, and even into the 1990s.

—Robert Crandall, Brookings Institution

If a phone company today attempts to upgrade to ADSL [asymmetric digital subscriber line], it cannot do so until somebody on M Street has said that is a good idea. If a UHF station in Los Angeles says, look, we are tired of sending I Love Lucy to the masses, we want to take our 6 megahertz and use it for two-way digital paging, they may not do so at all. If they wish to take their 6 megahertz 24 hours a day and slice it into fragments of 5 minutes and sell it to all comers, that is illegal as well. … Nextel created a cellular network largely by understanding process, buying up radio dispatch licenses, and then getting them dezoned.

—Peter Huber, Manhattan Institute for Policy Research

In some states, cable operators are common carriers. So it is not a model that I would reject out of hand. … Because, in the 500-channel world, it is hard to see how you really are a lot different from a common carrier.

—Wendell Bailey, National Cable Television Association

The federal government should not go in and try to preempt the local governments. It could and should establish some overarching guidelines for acquiring sites so that it will at least facilitate the process. For PCS [personal communication service], we have build-out requirements. We cannot afford to sit around too long and try to get sites. We spent a lot of money to get the licenses. So there needs to be some help. There is certainly a sensitivity, however, at the local level that you cannot just have the federal government saying that the wireless entity can put up a tower or a base station anywhere it wants.

—Mary Madigan, Personal Communications Industry Association

[T]he problems can include … the actual absence of a process for handling the permitting. In some locations there are actual prohibitions where people say, "not within the jurisdiction here," but at the same time they want the service. Some, unfortunately, are viewing this as an opportunity to make a bonanza: "We want seven percent of your gross revenues, plus we want some fee per site." Some have even actually gone so far as to say, "if you want to locate it in certain areas, you can deploy as many as you like, but you have to pay this fee, and no more than two people per antenna," which is absurd.

—Robert Roche, Cellular Telecommunications Industry Association

International Issues

There was surprisingly little discussion of transnational issues from the industry experts. Although international issues were not a focus of this project, contributors generally assumed that international connectivity is a requirement, and experimentation with new services was reported to be taking place globally. The lack of concern expressed about incompatibilities in law and regulation, uncertainties about venues for legal accountability, and access to foreign markets perhaps reflected both the relative immaturity of information infrastructure developments overseas and the pragmatic, country-by-country approach of the international firms. As a practical matter many of the most important commercial networks, such as the SWIFT system that supports international banking, have been in place internationally for some years.

Among the more experimental developments, the Internet and the World Wide Web (Web) it supports are almost "the only game in town." Increasingly, commercial and recreational use of the GII can, technically, ignore national borders altogether, and that phenomenon warrants at least revisiting existing policies in such areas as export control, taxation, and enforcement of a wide range of rights and responsibilities. As Maria Farnon of Tufts University notes in her white paper, "Bodies like the ITU [International Telecommunication Union] have grown increasingly irrelevant with the introduction of new services such as the Internet and have seen their turf eroded by new organizations that do not necessarily have official government sanction." These new services and organizations have been embraced or at least tolerated in almost every nation on earth, setting an important precedent for the value to each nation of reasonably unrestricted international access. See Box 6.11 for Farnon's observations on the Internet as an international phenomenon.

Inputs to the NII 2000 project from satellite and smaller wireless system providers acknowledged the appeal of such technology for expanding global access in regions where wireline infrastructure is limited and the costs of building it are extremely high. Within individual countries, issues of open architecture, competition in telecommunications, and local equivalents of universal service are being debated; the G7 Ministerial Conference on the Information Society of February 199530 fostered cross-national discussion and experimentation; and the issue of international traffic settlements is one of many. Attorney Jonathan Band commented on the international attention now being devoted to system openness and harmonization of expectations for provisions in areas ranging from intellectual property protection to interoperability.

BOX 6.11 Politicoeconomic Implications of the Internet

Despite the clear intention of the industrialized world to foster the building of national backbones, and the gradual diffusion of connectivity in many developing countries, the traditional TO [state-owned telecommunications operator] structure, and the resulting legal and commercial models this fosters, remain serious obstacles to a truly international Internet. While technical difficulties can be overcome with resources from institutions such as the World Bank, NGOs [non-governmental organizations], and governments themselves, the traditional mind-set of control over the communications infrastructure and services is more difficult to displace. … [G]overnments have long justified their ownership of the telecommunications operator on the basis of national security reasons, and also derive significant political and revenue benefits from this ownership. Although this structure has been seriously undermined in the United States, the European Union, and parts of Asia, it remains strong elsewhere.

Ideally, an international network like the Internet should provide a protocol that is easily adapted to a wide variety of infrastructure development stages and should offer services that can be tailored to reflect the cultural, legal, and regulatory norms of every country. However, the model that the Internet has provided demonstrates that an international network will, by definition, still act to undermine many traditional structures that have evolved around the old TO system. Rather than seeking to impose old standards of behavior and control on the Internet, governments can best encourage the development of national information infrastructures by eliminating the inherent conflicts that exist between the new services and the domestic organization of telecommunications. This means introducing competition into all levels of service and allowing the market to drive pricing and standards.

SOURCE: Extracted from "How Do Traditional Legal, Commercial, Social, and Political Structures, When Confronted with a New Service, React and Interact?," a white paper contributed to the NII 2000 project by Maria Farnon.

Systems Data and Analysis for NII Assessment

Data issues are prosaic, mundane, and easily overlooked, but addressing them could be a simple and powerful way of improving private and public decision making. How can public or private entities aim to influence something without knowing the extent of an issue or the impact of a contemplated action? Again, the private sector has used market research data more or less effectively, but much of that information comes from proprietary and unpublished studies (yet another argument for public testbeds). Periodic assessments, using a mixture of public and private sector inputs as in the case of this project, may be a useful way for the government to gain information to be shared with all interested parties.

The AT&T divestiture has already detrimentally affected the
availability of telephony statistics.31 Cable and wireless statistics are tracked largely by trade associations and market researchers (raising various concerns about the quality and completeness of data), and the decentralization and commercialization of the Internet are among the factors that make measuring its dimensions difficult. The white papers by Reagan Moore and by Hans-Werner Braun and Kimberly Claffy of the San Diego Supercomputer Center argue that "[t]he NII continues to drive funding into hardware, pipes, and multimedia-capable tools, with very little attention to any kind of underlying infrastructural sanity checks." The authors go on to relate deployment of new technology to measurement problems: "With the transition to ATM [asynchronous transfer mode] and high speed switches, it will no longer even be technically feasible to access IP [Internet Protocol] layer data in order to do traffic flow profiling, certainly not at switches within commercial ATM clouds." Meanwhile, the difficulty of gaining fresh insight into application domains suggests that there would be a benefit to more systematic study of how technology is selected and used in different settings.

Government as Convenor

Complementing and informing other government roles, convening—beyond that already provided by professional, trade, and private standards-setting organizations—appears to be helpful in encouraging better assimilation of information infrastructure by all users and providers. The good offices of state and federal government can break down some of the barriers to use of the information infrastructure for better health services, more effective education, distribution of government information and benefits, and so on. Private actions may not be sufficient for rapid progress and resolution in some critical arenas (e.g., education and health care) characterized by imperfect markets. In cases such as the interstate delivery of health care services, where legal barriers impede changes in professional practice, policy changes are needed to create efficient domain-specific information services. The issues, needs, opportunities, and handicaps in such areas have been surveyed many times over the past few years by the Information Infrastructure Task Force (IITF), Congress, trade and professional organizations, and independent analysts.32 The experience of this project, including a workshop and forum in which domain representatives expressed both frustration about difficulties in communicating with infrastructure providers and gratitude for the opportunity for learning and exchange, suggests that continued government support for convening different parties, bridging user and supplier communities, would have value.

Whatever the federal government does to foster convening should
take into account the role of NII itself as an implicit, informal convening mechanism. As various contributors pointed out indirectly and directly, enhancements to the information infrastructure will allow people to coalesce both openly and privately, thereby changing the processes of government. Mixed expectations were voiced by one participant in connection with the use of information infrastructure to support electronic democracy. As Michael Greenbaum of Bell Atlantic Corporation speculated:

At its best it will give voice and cohesion to the underrepresented. At its worst it will enable terrorists and hate groups to act under the cover of anonymity. This medium is better suited than most to building constituency, rather than just mass communications. It should be approached with caution as a means for formal and informal referendum, through which activist groups might unduly influence representatives and undermine the deliberative aspect of a representative democracy.

Although not everyone will use the technology, the deeper information infrastructure becomes embedded in U.S. society, the more some degree of access will become a part of what it means to be an informed voter.

Conclusions

The NII 2000 project suggests that most in industries that supply information infrastructure would support a variety of roles for government, roles that are complex, often subtle, sometimes active. But these roles are, like the work of private firms, intertwined with private roles and subject to a broad range of uncertainties about future possibilities and problems. The steering committee heard broad support for the government contributing as an enlightened customer and participant in building the NII. In particular, it heard an expression of the need for continued government support of innovation in the form of support for university-based research and development, both basic and applied. But there was also general (albeit not unanimous) enthusiasm for accelerated deregulation—not only to speed market entry but also to allow licensed carriers the freedom to use their assigned spectrum innovatively—implying a reduction in governmental authority to direct NII evolution unilaterally.

The concerns expressed about the lagging of many domain-specific areas of application, despite the prospect for important economic drivers from the application industries, suggest that this enabling role for governments reaches into many areas of government responsibility and is important at the state as well as the federal level. Thus responsibility for constructive government participation will be decentralized, raising
questions about whether an overall systems view can or should be sustained in some continuing activity or organization.

The steering committee concluded that a consensus vision—however indistinct and subject to evolutionary modification—of the NII in the first decades of the 21st century should form its core but should not represent a constraint on all NII developments. Articulating and evolving national goals, and driving toward a national consensus, are important elements of the government role. It is a difficult role in which leadership—as opposed to unilateral action or definitions—can help to overcome the confusion and misunderstanding that typically accompany technically complex phenomena. There is general agreement that government can play a role in bringing diverse interests together to seek out consensus on values and objectives for the national capability and on architectural principles supportive of those values. If it is done right, that consensus will govern many of the public and private sector actions. Leadership and articulation stand in contrast to regulation or judicial action: not everything can be defined by rules. People in organized societies do things because there are corresponding norms and understandings. But we have already seen in many industries, including telecommunications, that if government defines too many rules, eventually all people do is try to work around them.

Reconciling divergent positions and priorities within the government may be a critical first step, since the government itself is as diverse as other sectors of society. For example, it may be the case that entirely different priorities would emerge from the Department of Defense (DOD) and the Social Security Administration; within the DOD, the different services have had a hard time achieving interoperability. The IITF has been an experiment in cross-agency coordination. While it has helped to articulate and explore many key issues, the question remains as to which part of the decentralized government will articulate the goals and guide the architecture (CSTB, 1994b).

This project itself represents a step by the federal government, acting through the IITF, to explore the extent of the consensus on which federal policy making and leadership can rest. What is the proper mechanism for pursuing this exploration? The project did not discuss specific institutional structures. However, it seems evident that something more long range and more centered in private sector participation is needed than the IITF and the former NII Advisory Council. The essential requirement is that both the information service providers and the information creators and users in the private commercial and not-for-profit sectors must be fully involved, along with relevant government bodies.

Second, the primary economic drivers of the NII will be found, the steering committee believes, within the most important domains of the
national life—health services, education, electronic commerce (including goods distribution, marketing, and retailing), and public safety and the like. (While an important factor, entertainment appears to be less of a driver than anticipated at this project's outset.) Because the ability of these domains to take full advantage of the NII is more limited by factors inherent within those domains than by shortcomings in the information network support, progress in creating these services will lie primarily in the hands of the professional groups and firms concerned. These responsibilities clearly fall primarily to public and private institutions outside the realm of telecommunications and high-performance computing, and they must be taken up by the relevant bodies in their own self-interest. However, the federal government and the states can, within their own operational missions, act to reduce the barriers to the creation of need-based demand for information infrastructure.

The NII 2000 project did not conclude (although it was hypothesized at the project's outset) that all information networks must be open and interoperable. Some mature and specialized applications can justify their unique application-specific architectures by the cost reduction they afford. But the powerful lesson to be learned from the Internet is the value of an open interface on which all kinds of new and mature services can be built, one that allows an experimental new service to look for users among the entire population with access to a digital network. Government should adopt policies intended to retain the power of a service like the Internet to be a testbed for innovations and a link among many resources for both information and its processing and distribution.

Both federal and state governments have the opportunity to increase their own efficiency and improve their services to the public through the development and use of the NII. A current initiative toward that end focuses on "reinventing government." These efforts should be undertaken under policy guidelines supporting the future evolution of the NII, as well as the best use of currently available facilities. In other words, the government should further the development of the core functions of the NII, using a "learn-and-change-by-doing" approach. This will place the government's purchasing power squarely behind the goal of national, consensus-based progress. In this endeavor, the federal and state governments must seek a concerted strategy, since each has the potential for strong influence in the evolution of the NII.

Perhaps the most important role government can play, especially at the federal level, is the continued vigorous support of advanced research into digital networking, with the objective of creating opportunities for the private sector to sustain the maximum amount of generality and flexibility at minimum cost. Much of this work can be carried out in support of government applications, and much can be carried out in collaboration
with private industry; much is appropriate to the university environment. It should be noted that different parts of the information industry are structured quite differently with respect to available R&D resources. Historically, the regulated telephony industry has been quite research-intensive. The cable TV industry and, more importantly, the domain-specific application industries do not have this tradition of research, depending instead on the vendors that support them; they would benefit, in particular, from partnership projects with government agencies.

Finally, as noted above, there is much political debate about the effect deregulation, plus a blizzard of anticipated technological and market innovations, will have on the structure, profitability, and competitiveness of the many segments of the information industry. There is a continuous role to be played in evaluating and then acting (or not acting). At times, these actions are taken to remove impediments, and at other times, to encourage changes in direction. This is no different from what any corporate manager faces: the need to continuously evaluate and to correct the course accordingly. But the government should not try to manage the evolution of industry structure, barring evidence of serious inequities or economic problems. It should keep its tools, such as the Federal Communications Commission and the National Telecommunications and Information Administration, available to deal with any situation that might arise in the future. Particularly important will be better data collection and analysis to reduce some of the guesswork and improve the bases for decision making in industry, government, and elsewhere on infrastructure development, selection, and use. In any case, the slower pace of deregulation and privatization in most other nations will require active engagement with those policies in the interest of achieving a vigorous and effective GII in which U.S. suppliers are not competitively disadvantaged.

Notes

1. See CSTB (1994b,d). Also, the federal government has pursued a variety of initiatives that support the evolution of information infrastructure technology, including those relating to high-speed and optical networks, digital libraries, mobile computing, and so on.

2. Government-funded networking research and development can explore architectural alternatives that cut across existing implementations, creating options for new configurations of the great range of available technologies.

3. See CSTB (1995b) for additional discussion.

4. Antitrust law and regulation, frequently misunderstood in any event, received little direct attention from project participants. Although, if anything, antitrust policy seeks to promote entry where a competitive market is sustainable, its effects may be most evident in this context as a factor affecting the composition
and conduct of consortia and alliances (such as those for joint personal communication service licenses), or in government investigations of major players such as Microsoft to look into the possibility of monopolization.

5. One possible form of government involvement that was not suggested or discussed was targeted investment downstream in production, as through investment incentives. For commentary on the difficulties and ramifications of attempts to direct the allocation of private capital, see Beltz (1991): [T]he HDTV debate has amply illustrated the hazards of high-technology politics and industry-led targeting. Rather than one industry voice to guide policy makers, there are many. Just as multiple technologies are involved in HDTV and the related final product markets, so also are there multiple interest groups, each with its own policy agenda and report documenting its critical importance. The range of interested parties includes TV programmers, TV equipment suppliers, cable companies, satellite companies, over-the-air broadcasters, domestic and foreign TV manufacturers, U.S. semiconductor manufacturers, telecommunications companies, and computer manufacturers—the list seems endless. Which industry voice should be followed? Whose agenda for the development of high-resolution systems, standards, and components should be chosen? Who should decide?

6. Public-private interaction in the context of railroad development is discussed in CNRI (1995a, p. 5), which observes, "Public sponsorship and joint public/private enterprise had characterized public works projects since construction started on the Erie Canal in 1817. Moreover, leaving aside the questions of the federal land grants for the moment, the rise of the railroads was accompanied by a relative decline in the value of federal investment."

7. This point is discussed at length in Realizing the Information Future: The Internet and Beyond (CSTB, 1994b).

8. See CSTB (1994b).

9. Note that with respect to the issue of equitable access there is confusion over both mechanisms and objectives. Freer market entry and competition are mechanisms that can lower prices, which can in turn support a broader set of customers. However, they are generally not sufficient to address the needs of either the truly disenfranchised (whose means are too limited to support even low competitive prices) or those living in (e.g., rural) areas that are costly to serve. The white paper by the Organization for the Protection and Advancement of Small Telephone Companies notes, for example, that some areas are characterized by longer subscriber loops and other impediments to digital service. Perhaps some cause for optimism can be read into recent analyses of AT&T, which, as noted in CNRI (1995b), suggest that consumer demand motivated the expansion of the scope of AT&T's service in conjunction with the technological and management factors addressed in more conventional analyses.

10. For example, forum participant Allan Arlow noted that the market success of different advanced network access interfaces will create a de facto definition of an "info have" that law or regulation could make de jure.
According to an editorial in the New York Times of September 5, 1995, entitled "The Information 'Have Nots'": The basic promise of the information age—that books, facts and figures will be widely disseminated over the telephone lines—will come to nothing unless public access to computers and telecommunications technology is broadly expanded. The public libraries should be one source of easy access. They are struggling to offer the services but are faltering because of high telecommunication costs. A recent survey of what the Department of Commerce describes as the "information have nots" revealed that about 20 percent of America's poorest households do not have telephones. Only a fraction of those who do will be able to afford the computers and related equipment that grant access to the information society. The children of those households start out at an obvious disadvantage, as do adults who could benefit from on-line training or avail themselves of on-line jobs search materials and so on. States that house a disproportionate number of the country's poor will need to take special care to secure broad access to avoid an information underclass.

11. Although inherently attractive to policy makers, the concept of making the national information infrastructure available via libraries and public access kiosks was roundly criticized in research conducted by U S West about a year ago. The research sampled hundreds of consumers in urban as well as rural settings—essentially all of them said they would be dissatisfied with public access. While public access seems workable for books (whereby consumers can buy their own favorites at a reasonable price) and database research (which few consumers do), respondents appeared to put access to the NII on a par with having television and telephones—they are not about to go to the local library to have it.

12. These include the federal inter-agency Information Infrastructure Task Force and the former associated National Information Infrastructure Advisory Council, the American National Standards Institute and the associated Information Infrastructure Standards Panel, the Cross-Industry Working Team organized under the auspices of the Corporation for National Research Initiatives, the Council on Competitiveness, EDUCOM, the Coalition for Networked Information and its member organizations, and others.

13. For example, David Messerschmitt commended the framework provided by the treatment of the industrial, scientific, and medical bands of spectrum, a zone where relatively little regulatory control is imposed on use as long as interference is minimized. During the January workshop, the Federal Communications Commission's Mike Marcus asked about the desirability of expedited approvals for testbeds and experiments analogous to those for fast-track Title III radio experiments, noting that the absence of such a process has been associated with delays for certain service trials.

14. Or sometimes sharing in the cost of nodes and transport; see the white paper by the Organization for the Protection and Advancement of Small Telephone Companies.

15. Telecommunications reform legislation could substantially alter the framework for federal regulation (changing the enabling statute(s)) and also affect state regulation to the extent that it provides for federal preemption. The process of deregulation is clearly in motion; the uncertainties prior to and during the NII 2000 project have related to timing and application or emphasis. One issue contemplated in legislative discussions is some kind of requirement to increase connectivity from schools. Thus, AT&T received considerable attention for announcing its own plans to offer "free Internet access and voice-messaging services to 110,000 public and private schools" at a cost of about $150 million over 5 years. Beginning in fall 1996, the package would include "free dial-up Internet access, browser software, and 100 free hours of use, with a 30 percent discount on service thereafter"; the voice-messaging service would be free for the first 3 months. See Naik (1995b).

16. For example, a month before the spring forum, Bell Atlantic announced it was suspending two of its video dial-tone applications (also known as "214s") then pending before the Federal Communications Commission. The regional Bell operating company (RBOC), which had proposed using hybrid fiber coaxial cable and asymmetric digital subscriber line systems in different areas, stated that changes in technology necessitated changes in its applications. On May 25th, the day after the forum and nearly a year after Bell Atlantic had originally filed its applications, the RBOC withdrew its 214s altogether.

17. These actions suggest questions about the desirability of and business case for the technology paths outlined in the filings and about whether the process itself will continue to be a requirement, now that telecommunications reform legislation has been passed. In these and other cases, the presence, absence, or degree of regulation may be cited as justification for or against action, but other factors clearly affect the desirability of a particular course of action.

18. Such as project contributors Peter Huber of the Manhattan Institute, David Messerschmitt of the University of California at Berkeley, and James McKinney of the Advanced Television Systems Committee.

19. For fuller consideration see such other CSTB reports as Realizing the Information Future (1994b), The Changing Nature of Telecommunications/Information Infrastructure (1995b), and Rights and Responsibilities of Participants in Networked Communities (1994d).

20. The Aspen Institute has recently published a report on such a concept (see Firestone and Schement, 1995).

21. The Information Infrastructure Task Force, through its Committee on Applications and Technology, has put considerable emphasis on convening people in major application areas, such as health care and education.

22. This point was argued by Ken Klingenstein of the University of Colorado at Boulder during the forum.

23. Similarly, David Messerschmitt noted that "transcoders already introduced in cellular telephony preclude privacy by end-to-end encryption," pointing to insufficient attention historically to privacy in the development of cellular networks.

24. The report of CSTB's project on national cryptography policy, requested by the Defense Authorization Act of 1994, is scheduled to be published in mid-1996.

25. They relate their testbed to four modes of regulatory interaction: a one-to-one exchange between entities (e.g., a private entity requesting information from an agency); a one-to-two relationship (e.g., for dispute resolution); a one-to-many relationship; and a many-to-many configuration (e.g., negotiated rulemaking).

26. The broad applicability of some government applications is consistent with a variety of ongoing initiatives—congressionally driven procurement reforms, the National Performance Review, and the Government Information Technology Systems working group, a sister to the Technology Policy Working Group (TPWG) under the Information Infrastructure Task Force, and efforts to harmonize technology development and exploration efforts, such as those led by the TPWG, with efforts focused on applying information infrastructure.

27. See the white paper by Oscar Garcia on behalf of the IEEE.

28. The concern with design was noted at the forum by Walter Wiebe of the National Science Foundation.

29. See Kahin and Abbate (1995) for an overview. See also Besen and Farrell (1994) and Farrell and Shapiro (1992).

30. More information is available on-line at http://www.ispo.cec.be (EC Information Society Project Office Webserver) under G7 Information Society Conference.

31. The basic problem is that activity devolved to multiple entities, only some of which were required to report key statistics to the government. Another problem is the broadening of the scope of concern from telephony to a larger mix of services within the NII. The need to resort to subjective or speculative market research estimates for a variety of communications and information services in the Department of Commerce's Industrial Outlook volumes is but one indicator.

32. Several such efforts are referenced in the bibliography.