Public Policy and Private Action
The evolution of the national information infrastructure (NII) poses a number of problems, alternative solutions, and opportunities for private and public parties. With an emphasis on gathering private sector perspectives, this project did not set out to define appropriate roles for government. Contributors to the forum made clear that there is hardly a consensus among representatives of various industries about the role of government. Across the range of issues relating to information infrastructure there is evidence of imperfect performance both in markets and by government. The serious debate and commentary therefore center on which imperfect government actions are justified as remedies for imperfect markets. The workshop, forum, and set of white papers developed for the NII 2000 project, like the steering committee, mirror society in presenting divergent perspectives.
Regardless of political sentiments about its role in general, government at all levels will inevitably be a major player. Government agencies—at state and federal levels—participate in almost every information-related role pursued by the private sector—publisher, user, network manager, innovator. Governments have additional responsibilities by virtue of their constitutional obligations—as arbiter, regulator, convener, and even leader in the interest of equity and an efficient, productive society. The federal government has unique responsibilities with respect to the transnational issues arising in the global information infrastructure
(GII) and advancing the national technology base through support for research and development.1 Support for research and development (R&D) can help to increase the options,2 lower the costs, and enhance the capabilities of technologies that can be brought to the NII marketplace.
Given the preponderance of private investment and the uncertainty surrounding information infrastructure markets, many people point to the importance of reducing government constraints on private decisions and investment. The big task for government, they would note, is careful but determined deregulation of telecommunications services. A second task is the studied avoidance of new regulations in response to problems arising in the new uses of technology. The concern is not only to avoid constraining competition, but also to minimize the kinds of actions to manipulate the system that telecommunications regulation has unintentionally fostered. As Robert Crandall of the Brookings Institution observed, "Once regulation becomes a system for politically redistributing income, and cross-subsidizing one service out of another, it becomes very difficult to allow competition and the entry of new technologies."3 Participants in this project pointed to both telecommunications regulation, per se, and the Modified Final Judgment decree arising from the AT&T Corporation antitrust suit as forces constraining entry.4 Although those inputs centered on existing telephone and cable businesses, other comments related to the challenge of how to foster entrepreneurialism, development of new information infrastructure industries, and market entry by smaller and nontraditional players. As Quincy Rodgers of General Instrument observed, smaller players have fewer resources, limiting their ability to enter certain markets successfully. Technological innovation and competition should be encouraged together, he and others argued, leaving to the marketplace the determination of which technologies deliver value to customers.5
Many others ask, How can so many industries and so many users expect to acquire enough shared knowledge to create consensus unless some legitimate public body, not competing in commercial markets, is willing to take a leadership role aimed at orderly evolution of the NII? Governments, perhaps uniquely, can try to encourage all parties (their own agencies included) to address cross-cutting issues cooperatively and constructively. The challenge is to effect government leadership that keeps options open, encourages innovation, and allows markets to work.
Whether the government role is seen as something to be minimized or something to be leveraged, it will reflect two realities. First, the intertwining of public and private investments and activities shaping the NII implies that consultation between government and private sector institutions will have to be effective at the operational as well as the strategic or policy level. This interaction will have to reflect the fact that public and
private investment and activity together are dominated by private sector elements. Second, the NII—and the GII of which it is a part—is an incredibly complex economic and technological system. Simply understanding its behavior is a major task; finding appropriate tools for addressing gaps and bottlenecks requires a level of knowledge of system behavior that no one industrial sector (or governmental unit) is likely to have. Examination of these two realities in turn is furthered by understanding how the Internet has gained in significance in both contexts.
The high level of collaboration among its suppliers and users—and the associated roles of public and private institutions—makes information infrastructure a hybrid of the principles guiding other (and not unrelated) forms of public infrastructure, including traditional public utilities and education, which present contrasting mixtures of public and private roles.6 The Internet provides a powerful illustration. It developed out of the mission responsibilities of several federal agencies plus state and regional interests in both feeder and backbone networks. The Internet has demonstrated that some commonality of vision can make a decentralized process more efficient and effective.7 However, there was no grand plan; the "bottom-up" character of the development, typical of most technology-driven changes that occur in a competitive environment, engendered vitality and legitimacy.
The Internet developed largely as a result of research conducted in many institutions across the country and of regional needs to interconnect state and private institutions and sources of information. The NSFNET, a principal Internet backbone commissioned by the National Science Foundation (NSF) and operated under cooperative agreement by a joint venture among IBM, MCI, and the nonprofit Merit Inc., was a proving ground for cooperation among government, industry, and academia in the planning, development, and operation of information infrastructure.8 Public and private research and investment have supported the creation of collections of valuable information as well as services to make them available. Thus the Internet illustrates government investment as a catalyst to private investment in both the demand and the supply sides of information infrastructure. For a discussion of emerging prospects, see "Government as User and Service Provider" below.
Unresolved is the appropriate role of government in assuring operational integrity of an NII that is increasingly diverse in its components, service offerings, and management. See observations from David Messerschmitt of the University of California at Berkeley in Box 6.1. In an Internet that is not a critical system for its users, anecdotal evidence of operational
BOX 6.1 Network Control and Management
If we really have a goal of a national information infrastructure that incorporates some existing networks, such as the Internet, telephone networks, et cetera, [it will combine] existing networks [having] existing control structures (e.g., Signaling System 7 in the telephone network). The Internet does not have such a structure. But it is developing it for the purposes of providing quality-of-service guarantees. The question I have is, how are we going to develop some kind of standards or some way to ensure that all of these different pieces of this overall infrastructure interoperate at the control or signaling level? We tend to talk about standards in terms of [examples such as] MPEG, which involves the actual signals going through at one time. But how are these networks going to control each other? When I have to access a Mosaic server across the Internet from a TCI-run cable TV system, how does the addressing and quality of service information get transferred from the cable system to the Internet in some organized fashion?
—David Messerschmitt, University of California at Berkeley
failures may not be a cause for alarm and may even be a transient phenomenon, although it is also a cautionary indicator for the nation's information infrastructure in general, since mature infrastructure must be reliable. These operational failures also raise questions about whether some minimal common institutional structure or mechanisms for governance are needed to preserve system integrity, availability, and so on.
Finally, the Internet also provides a vehicle for exploring the costs, benefits, and alternatives for equitable access to information infrastructure and associated services.9 See Box 6.2. Attorney Allan Arlow, Tora Bikson of the RAND Corporation, Richard Friedman of the University of Wisconsin Medical School, Allen Hammond of New York Law School, and other contributors expressed concern about the prospect of "information have-nots," acknowledging that some segments of the population are not sharing—as a function of market forces alone—in the widening deployment of advanced services and sophisticated information appliances.10 The Internet, along with more limited government service programs (e.g., Social Security) and commercial programs using dedicated or specialized systems, offers the opportunity for some experience through library- and school-based access and through kiosks (public terminals) that provide access in various public spaces (Jackson, 1995). They show both the potential for and limitations of public access arrangements, with limitations including the level of access achieved, the public nature—and therefore limited personal privacy afforded—of the access, and the limited ease of use and thus the training requirements.
BOX 6.2 Universal Service
Yesterday the term universal service did not differentiate between access and the service itself, because for a phone call the phone and the service were essentially the same thing. Now, or in the future, we are talking about two different things. One is equitable, if not universal, access to the network so that everybody can touch the network, even if that means walking to a library. A separate issue is whatever subsidy is appropriate to make sure that the people that "we the public" want to have certain services, have those services.
—Robert Powers, MCI Telecommunications Inc.
You cannot have an NII if you do not hook most of the people in the United States into it. At least you cannot have an affordable NII. … Nothing is more universal in the United States than television: 98 percent of all homes have television; 96 percent of all homes have telephones; 63 percent of all homes have cable. Good access to the American home depends on television.
—James McKinney, Advanced Television Systems Committee
Paying customers should be the largest contributor at first: educated middle- and upper-income individuals who are driving the use of technology today. Second, there should be free access to schools, libraries, and other public facilities providing basic access and exposure to the network infrastructure, which will build demand for the future. And finally, universality, the end goal, is not possible without a revenue stream to attract and hold investment.
—Michael Greenbaum, Bell Atlantic Corporation
Over the near-term (5- to 7-year) horizon of this project, public and institutional access may be the only access available to some segments of the population, given the constraints on expanding the quantity and quality of bandwidth capacity serving residences and small businesses. Because for many people public access remains a second-best option compared to personal systems (e.g., telephones, televisions, and computers in residences),11 it is an area that warrants explicit monitoring and assessment on behalf of those who cannot afford personal systems of their own. Public access should be the focus of a testbed or other type of experimental program to explore technical, cost-sharing, and other dimensions of the issue.
NII Systems Issues
The central NII systems issue relates to promotion of an overall architectural concept (see Box 6.3), something that many participants endorse in principle but find difficult to attain in practice. The steering committee endorses the ideal of architectural convergence while noting the pragmatic and not inconsistent expectation for multiple approaches. It distinguishes the question of objectives from that of process.
The process of developing NII architecture is not a neat one. At the forum Howard Frank of the Advanced Research Projects Agency (ARPA) asked explicitly, "Do you see any kind of process that could bring a convergence of architectural concepts?" The confounding of technological and business strategies makes a clean answer to that question elusive. One element of such a process that emerged from the NII 2000 project is vision-setting discussions and forums, in which a central challenge is to remain apolitical. Realizing the Information Future (CSTB, 1994b) argued for the benefit of a common architectural framework, and the many briefings and presentations associated with that report's dissemination furthered discussion within individual industry and technical communities. Complementary public discussions have taken place over the past 2 years under a number of different auspices.12 The NII 2000 project built on that stream and emphasized cross-sectoral and cross-industry interaction. It also carried forward a leadership-via-convening approach that can foster development of common understanding and provide for interaction among disparate parties (multiple provider industries or providers and users). Such discussions can have an impact on the development of architecture without interfering with particular deployment activities, because interoperability can be achieved at a higher level in the architecture than that represented by the facilities that dominate investment decisions.
A more active process revolves around the surging interest in experimentation with Internet applications, which have provided practical, hands-on exposure to open architecture and internetworking. They may also be more broadly visible than the market trials of more closed systems ongoing among cable, telephone, and other companies. Accordingly, attention to the Internet grew over the course of the project, raising questions about what, if anything, the government should do to help the Internet evolve in the context of commercial operation, other than to transfer its management institutions carefully from government support to user (or investor) funding. Any further process, presumably, should tip the business case toward a common architectural framework. The message of the NII 2000 project is that such a business case may emerge over time. How quickly and whether it can or should be accelerated remain subject to debate, with the notable exception of R&D (see "Technology Development Through R&D"), whose importance drew affirmation and consensus despite the project's emphasis on downstream deployment.
BOX 6.3 Architecture
Where does the NII fall on the continuum ranging from the decentralized model with all intelligence residing in the terminal, spanning the Internet model and the ability to develop and deploy new applications very quickly, … to the centralized model, which includes the old telephone system? … Is it possible to have a single model for the NII on this sort of continuum? … Is there one network, or is there instead an internetwork of at least several types of networks that are organized according to quite different paradigms?
I think that in some respects we are talking past each other, because some very different types of applications are being discussed. The two fundamental types of applications are (1) those that depend on database access, including information retrieval and many of the entertainment applications that involve accessing a central reserve of information (entertainment video, for example) and (2) communications—between two computers, two people, or between the computers of people. For applications based on database access, full connectivity is less critical, as can be seen in the case of companies like America Online and CompuServe that have made a business out of users subscribing to only one of these services. On the other hand, an application like Mosaic running on the Internet illustrates the value of full connectivity of users to a database. …
Is full, logical connectivity required within a national information infrastructure? If so, then interoperability, standards, and other associated issues become much more difficult to deal with. A related question is, Are we going to have a common infrastructure for both types of generic applications—database access and communications—or are we going to have a proliferation of separate infrastructures for these two kinds of applications? If we have communications infrastructure that provides for full connectivity between users, then we inherently have simultaneously infrastructure that also serves for database access applications. …
It is not necessary that all near-term deployment provide all the capabilities represented in a strategic vision. Indeed, one critical aspect of such a vision is that it should be easy and cost-effective to add new technologies and capabilities to the NII as unanticipated applications and user needs emerge. If this flexibility is achieved, it is only necessary that near-term investments be compatible with a long-term strategic vision, and hence not preclude future possibilities or force later disinvestment and widespread replacement of infrastructure.
If people can access the Internet over their cable system and it is providing what they want, then over time the bandwidth of that Internet pipeline will grow, and the bandwidth of everything else will shrink, and gradually people will move toward the solution they like. That is a market force. Perhaps the Internet or its extension could serve [the necessary] function—which is not the task of bringing together many different things, but rather the process of starting with what we have and extending it in the direction we want to go.
—David Messerschmitt, University of California at Berkeley
Discussion among contributors from a range of telecommunications and content industries illustrated how many factors intertwine to complicate movement toward a more open architecture, as examined in Chapters 3 and 4. For example, Edward Horowitz, whose company, Viacom Inc., has left the cable business to focus on content generation and marketing, commented on the importance to content providers of unconstrained access by customers and complained about service providers as bottlenecks. Bell Communications Research's Padmanabhan Srinagesh, who has been studying the economics of internetworking, suggested that "there might be a trade-off between encouraging competition at the lower levels, at the level of bits in the networks, versus having an open architecture where the people who provide the bits also provide a standardized open interface that other people can use to compete with them for end users." But the National Cable Television Association's Wendell Bailey observed that the cable industry was built on control of content, albeit in an environment of local monopoly franchises: "We relish, as an industry, our right to be an editor."
Content publishers and customers may prefer the convenience of maximal openness, but conduit owners express concern that an obligation to carry any kind of content reduces them to common carriers and may therefore constrain their potential to generate profits through selection and differentiation. At the same time, if they are not common carriers they assume a heavy legal obligation with respect to the content they provide, even if they did not originate it. Perceptions of how common carriage has fared in conventional telecommunications contribute to reservations (relating to profitability, potential regulation, and business development prospects) expressed by some about the business implications of open architectures. Another source of concern is uncertainty about how the market will evolve to support competition based on ownership of facilities and/or service supply—what forms of industry structure will be sustainable?
As noted in Chapter 3, there were those at the forum who expressed concern about what might happen in telecommunications and information service markets if prices became chaotic. Others worry that open competition in an industry so bound to its fixed costs will produce effects akin to those experienced in the airline industry following deregulation. The low consumer prices that "price wars" might bring would be welcomed by many, but an industry shakeout and reconcentration might follow, which could either sustain or slow investment in deployment. One telephone company representative said that "there would be many starters but few finishers." See Box 6.4.
While the steering committee heard a number of concerns about the possibility of closed systems hindering the development of the NII, it
BOX 6.4 Competition's Darker Side?
In the long run, [technological and associated cost trends may give rise to] an industry with chronic excess capacity. An industry like that is very susceptible to price wars. If there are two or more broadband wires to the house providing essentially the same service, both economic theory and practical experience suggest that price is going to get competed down to marginal cost, leaving the producers with no way to cover their fixed investments in infrastructure. The result is bankruptcy, financial distress, et cetera. So here are a few scenarios.
One is to just let it happen. The problem is cutthroat competition and financial distress. After two or three repetitions, firms start losing interest in investing in the market, because there is no way to recover their infrastructure investment. They drop out, only a few major players remain, and local or maybe even national monopolies result.
Well, what about monopolies? Monopoly provision is better than no provision at all. Local monopolies, such as the local telephone service, the cable TV service, and so on can provide broadband service on a city-by-city basis. The problem is that you get the usual monopoly distortions in terms of excessive pricing, lack of innovation, and maybe even worse, the dreaded "R" word: the cause for regulation.
The third scenario is the "killer app" scenario, in which a very high bandwidth application becomes very popular and the revenues from selling the transport for such a "killer app" are sufficient to cover the infrastructure investment. But what is that application? As we have heard today [at the forum], nobody knows what the "killer app" is—two-way video, interactive virtual reality. … There is not going to be a market there if people aren't willing to go for the applications. So it is the usual chicken-and-egg problem.
Finally, the last scenario is vertical integration with the content providers. That seems to be the most popular solution. … But if the market for transport becomes very competitive in the future—if transport is a commodity business—then the money is going to have to flow the other way. The content will have to cross-subsidize the transport. Then the content is what is paid for, and the transport is thrown in for free; it is just a cost of delivering the content. A question is, Will the content providers really stick to their joint ventures once they have competing transport providers clamoring for their business? That remains to be seen.
—Hal Varian, University of California at Berkeley
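Hal Varian's first scenario rests on a piece of textbook arithmetic: when two or more providers sell essentially the same transport service, undercutting drives the price toward marginal cost, leaving no margin to recover sunk infrastructure investment. A minimal numeric sketch of that logic follows; all figures are invented for illustration and are not drawn from the forum discussion.

```python
# Illustration (hypothetical numbers) of the price-war scenario in Box 6.4:
# undifferentiated broadband transport competed down to marginal cost.

FIXED_COST = 500.0     # sunk infrastructure investment per provider
MARGINAL_COST = 2.0    # cost of serving one additional subscriber-month
SUBSCRIBERS = 100      # subscribers served at the competitive price

def bertrand_price(marginal_cost: float) -> float:
    """With identical services, each firm undercuts the other until
    price equals marginal cost (the Bertrand outcome)."""
    return marginal_cost

def profit(price: float, marginal_cost: float,
           quantity: int, fixed_cost: float) -> float:
    """Operating margin on subscribers, less the sunk infrastructure cost."""
    return (price - marginal_cost) * quantity - fixed_cost

p = bertrand_price(MARGINAL_COST)
pi = profit(p, MARGINAL_COST, SUBSCRIBERS, FIXED_COST)
print(f"price = {p:.2f}, profit = {pi:.2f}")
```

At a price equal to marginal cost the operating margin is zero, so each provider's profit is simply the negative of its sunk investment—the "no way to cover their fixed investments" that the scenario describes, regardless of how many subscribers are served.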
finds little reason for concern that closed systems will represent an issue requiring policy intervention beyond traditional tools of competition policy such as antitrust. The position of most developers is that new applications will come into existence over the Internet, an open system, and that an application, if initially created in an open form, has a natural resistance to being bundled. In support of this tendency, the steering committee notes the success of the Internet itself, the evolutionary path by which the on-line service providers have become Internet access providers, and the trends toward open systems and open protocols in general. If the open form is robust in terms of functions and institutions, it will survive. If it is not, that form should not be sustained by policy unless there is a compelling national need. Research, however, can produce results that may make open systems more attractive (e.g., by lowering perceived costs), potentially accelerating their implementation.
Thus, the steering committee concluded that the government in particular need not and should not make any presumption that it must protect the open nature of mature products, but that it should move to foster an open, innovative environment in which new services and applications can occur. This approach includes encouraging the deployment of communications infrastructure that is general and flexible, removing regulatory barriers to innovation (for example, making spectrum for experiments easily and predictably available)13 and competition, and continuing to foster the success of the Internet through R&D and use in the delivery of public services.
Defining Roles for Government
The large volume of descriptive material and statements of opinion elicited at the workshop and the forum and in the white papers provide some insights into current and potential roles of government (at all levels) in policy making. These roles fall primarily into the following categories:
- Regulation and other rule-setting influences on market entry and conduct;
- Direct government use of and experimentation with information infrastructure in the delivery of public services; and
- Exploration of systems issues and fostering of technology development through research and development to create options, standards to choose among the options, and systems-level consensus to guide the choice of standards.
Three other government roles that cut across or influence the above three policy areas include attention to international aspects (trade, regulation, and interconnection and interoperation), data and analysis relating to infrastructure trends, and convening of different parties and sectors.
These areas are discussed in turn in the following sections.
Regulation, Rules, and Norms
Complete examination of issues related to regulation was beyond the scope of this project, but the body of inputs (see Box 6.5) underscored the
BOX 6.5 Regulation and Market Roles
Do the conditions exist in the regulatory environment that make it possible for every player sitting at this table to risk its capital on relatively equivalent terms?
—Gail Garfield Schwartz, Teleport Communications Group
Monopolies [established by regulation] become powerful forces for creating hostages in government and lead to the retardation of technological development. Those monopolies have led to a slowdown in television's introduction and spread in the 1950s, a substantial slowdown in cable television's introduction and spread in the 1960s and 1970s, and a substantial slowdown through regulation of the development of wireless communication systems in the 1970s, 1980s, and even into the 1990s.
—Robert Crandall, Brookings Institution
If a phone company today attempts to upgrade to ADSL [asymmetric digital subscriber line], it cannot do so until somebody on M Street has said that is a good idea. If a UHF station in Los Angeles says, look, we are tired of sending I Love Lucy to the masses, we want to take our 6 megahertz and use it for two-way digital paging, they may not do so at all. If they wish to take their 6 megahertz 24 hours a day and slice it into fragments of 5 minutes and sell it to all comers, that is illegal as well. … Nextel created a cellular network largely by understanding process, buying up radio dispatch licenses, and then getting them dezoned.
—Peter Huber, Manhattan Institute for Policy Research
In some states, cable operators are common carriers. So it is not a model that I would reject out of hand. … Because, in the 500-channel world, it is hard to see how you really are a lot different from a common carrier.
—Wendell Bailey, National Cable Television Association
The federal government should not go in and try to preempt the local governments. It could and should establish some overarching guidelines for acquiring sites so that it will at least facilitate the process. For PCS [personal communication service], we have build-out requirements. We cannot afford to sit around too long and try to get sites. We spent a lot of money to get the licenses. So there needs to be some help. There is certainly a sensitivity, however, at the local level that you cannot just have the federal government saying that the wireless entity can put up a tower or a base station anywhere it wants.
—Mary Madigan, Personal Communications Industry Association
[T]he problems can include … the actual absence of a process for handling the permitting. In some locations there are actual prohibitions where people say, "not within the jurisdiction here," but at the same time they want the service. Some, unfortunately, are viewing this as an opportunity to make a bonanza: "We want seven percent of your gross revenues, plus we want some fee per site." Some have even actually gone so far as to say, "if you want to locate it in certain areas, you can deploy as many as you like, but you have to pay this fee, and no more than two people per antenna," which is absurd.
—Robert Roche, Cellular Telecommunications Industry Association
TABLE 6.1 Regulation and Shareholder Value: An Organizing Framework

Federal Communications Commission
- Price caps/rate of return; Earnings ceiling/sharing; Productivity target; Consumer dividends; Investment targets
- Rate restructuring; "Streamlined" regulation; New service approval
- Rate structures; Depreciation charges; New entry; New service approval

State Public Utility Commissions
- Incentive regulation; Intra-LATA entry; Regulatory lag; Earnings attribution
- Usage-sensitive rates; New service offerings; Service "restructuring"
- Rate "rebalancing"; Universal service; Carrier of last resort; Rate averaging; Intra-LATA entry/pricing

Tax policies; National information infrastructure; Noncompensated service requirements

Department of Justice/Judiciary
- Line-of-business restraints; Mergers and acquisitions
- Noncompensated service requirements

SOURCE: Darby Associates (1995) as published in Communications Business & Finance.
negative influence of regulation on innovation and markets and therefore deployment. See Table 6.1 for an overview of regulation as a source of business risk. State and local regulation stimulated considerable commentary from contributors, reflecting its influence on residential broadband service choices (including cable-telephone competition) and its disproportionate impact on emerging wireless services—for which basic facilities are being built—and on state and local infrastructure initiatives.14
In brief, many deployment decisions and activities are proceeding independently of regulatory decision making or are based on best guesses as to likely developments in telecommunications reform.15 The proliferation of Internet-related activities—Web servers and applications, security services that support commercial transactions over the Web, the formation and growth of Internet access providers, and so on—illustrates an unregulated venue for growth. By contrast, contentious disagreement
among the Federal Communications Commission, telephone and cable companies, and the courts on the regulatory treatment of video delivery services underscores the way complex interactions among technology, business, and regulatory conditions shape investments. The recent moratorium on filings relating to video delivery services under 47 USC Sec. 214, which requires plans to be articulated months or years in advance of deployment, reflects those disagreements. So, too, does litigation relating to the Cable Communications Policy Act of 1984 (P.L. 98-549), which prevented telephone carriers from providing video programming within their region; a U.S. Circuit Court of Appeals ruled this ban unconstitutional, and the Supreme Court will decide the issue in 1996.16 17 The wireless phenomenon drives some18 to call for action ranging from a sharp reduction in spectrum regulation to its outright elimination. Speaking more generally, only one participant, Viacom's Edward Horowitz, suggested that there was an intermediate position between regulating and deregulating, achievable through "sunset" provisions.
Protecting the NII: Ethics and Mechanisms
Contributors to the NII 2000 project articulated concerns about privacy, intellectual property, free speech, ethical conduct, and, in particular, security, but given the scope of the project these issues did not receive a great deal of attention.19 The discussions of plans for and progress toward deployment, especially in the context of the relatively insecure Internet, suggest that these concerns have not been showstoppers—although they may have altered the form or emphasis of decisions about deployment or use. Indeed, Robert Crandall cautioned that some of the rhetoric on these issues may be a red herring. See Box 6.6.
Ethical and behavioral norms provide the context in which various services and mechanisms will evolve. Despite little experience to date, many commented on the uncertain implications of linking up a very large and very diverse set of people with access to a wide and powerful set of information resources and computing tools. The positive aspect of this trend is the possibility of bringing together and meeting the needs of various communities of interest. The negative aspect was acknowledged by some, such as Milo Medin of @Home, who remarked that "the network itself is bringing people far closer than they used to be. And that level of interaction between people is going to cause friction in addition to these communities of interest." The changing environment, possibilities, and scope of interactions led Lois McCoy of the National Institute for Urban Search and Rescue to propose a need for a "Bill of Rights,"20 a logical extension of the articulation of privacy principles and security tenets undertaken by the IITF and codes of ethics generated by a variety
BOX 6.6 Intellectual Property Protection in Perspective
Robert Crandall of the Brookings Institution expressed skepticism during the forum concerning the point "that perhaps it is exposures to civil suits and the failure to resolve all the difficult questions involved in intellectual property protection that may be slowing down the development of the NII. I don't know that we have any evidence of that at this point. We just went through an attempt to revise tort liability laws and federalize it in the U.S. Congress. One of the reasons that it didn't proceed any farther than it did was that we don't have systematic evidence that civil liability, tort liability has had a severe effect upon a variety of goods- and service-producing industries already in the United States." Among other things, Crandall's comments suggest that private action may be sufficient to handle many issues raised by this set of concerns, although others continue to voice a need for policy monitoring if not intervention.
of business and professional organizations. Michael Greenbaum, apparently wary of government intervention, maintained that "informal standards serve the industry well by defining public expectations and setting clear objectives" and argued for "guidelines and self-regulating bodies that comprise the principal constituents."
Security, Reliability, and Architecture
Concerns about security lead to considerations of mechanisms that may also protect the privacy of personal information, safeguard intellectual property, prevent theft of service, and preserve the integrity of information and systems, the availability of service, and other vulnerable elements. Effective design and implementation of those mechanisms, plus associated policies and practices, have implications for the architecture of the information infrastructure. See Box 6.7.
Comments by Ed Hammond of Duke University Medical Center and others underscored the importance of engaging people with large-volume, specialized needs in areas of national importance, such as health care, both because of the wide-ranging consequences of decisions and because the issues may be nonobvious, counterintuitive, or contentious.21 Quincy Rodgers argued that determinations of the level of security and methods of implementation should be seen as business decisions, which in turn has implications for standardization, for service provider decisions about whether specialized access devices (e.g., set-top boxes) may be sold at retail or leased, and for the role of governments.
In many instances a common framework or architecture is preferable,
BOX 6.7 The Challenge of Meeting Different Needs Simultaneously
With respect to a uniform security mechanism in an NII, whose interests are being protected? I suggest that there will have to be a range of mechanisms provided by many different parties to protect network providers from one another and from their computer operators, content providers from theft of intellectual property, and consumers from detriment to their privacy interests. And so we shouldn't expect to have a secure network by simply getting some sort of oligopolistic control as we can say that our broadcast networks are secure because it's very rare that an outsider is able to broadcast an illicit television program.
—Allan Schiffman, Terisa Systems/CommerceNet
The ideal system permits us to have a system in which all who need data can have exactly what they need. It's complicated by the fact that there are 40 different types of people that probably have a legitimate need for access to health care information.
—Ed Hammond, Duke University Medical Center
both to achieve economies in developing and implementing a cross-cutting service and because some degree of standardization may be needed to achieve interoperability. Contributors raised a range of concerns, including issues associated with electronic signatures and cryptographic key and certificate management, which require supporting "infrastructures" (see the white paper by Robert Aiken and John Cavallini); the absence of corresponding cross-community or cross-sectoral processes, some of which require international interaction, on mechanisms for security and other protections;22 and the need for yet other processes to ensure that a complex and multifaceted information infrastructure can, like the relatively simpler telephone network, meet national security and emergency preparedness needs (see the white paper by Lois Clark McCoy et al.). The rallying cry in the national security and emergency preparedness area is for an "emergency lane on the information superhighway," addressing inherent security and reliability issues plus support for the security needs of those managing public crises that require rapid recovery, disaster recovery, and mobilization.
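Mechanisms of this kind can be made concrete with a small example. The Python sketch below uses a keyed hash (HMAC) to authenticate a message, a toy stand-in for one building block behind electronic signatures; the function names, key, and message are invented for illustration, and real signature systems rely on public-key cryptography plus the key and certificate management "infrastructures" discussed above.

```python
import hashlib
import hmac

# Toy illustration of message authentication with a shared secret.
# The names and values here are hypothetical, not from the report.

def sign(key: bytes, message: bytes) -> bytes:
    """Produce an authentication tag for the message."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, tag: bytes) -> bool:
    """Check the tag in constant time to resist timing attacks."""
    return hmac.compare_digest(sign(key, message), tag)

key = b"shared secret provisioned out of band"
msg = b"transfer $100 to account 42"
tag = sign(key, msg)

assert verify(key, msg, tag)             # an authentic message passes
assert not verify(key, msg + b"0", tag)  # any tampering is detected
```

Note what the sketch does not solve: the two parties must already share a secret, which is exactly the key-distribution problem that certificate infrastructures exist to address at scale.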
Although forum discussions and white papers affirmed the value of a security architecture, progress toward implementing it seems hampered by the same factors that hamper progress toward implementing common architecture generally. For example, John McDonald of MBX Inc. listed in a white paper a number of trends that have led to a "concentration of
network assets" that increases vulnerability to a single switch failure, line cut, or software system crash, notwithstanding the application of a variety of techniques that enhance network integrity. He noted that some of the trends are unintended side effects of strategies to ensure compliance with federal regulations and argued that integrity and robustness must be "considered from the ground up" and, presumably, in a coordinated manner.23 Software is a huge expense for telecommunications equipment vendors because the fundamental technical approach they are taking to network control leads to systems that are inherently rigid and subject to failures, requiring heroic efforts to make them robust enough to operate in the real world. Newer and more flexible and robust approaches have been emerging; an issue for the technical and vendor communities is where, when, and how to advance and implement them.
This project does suggest that addressing security, reliability, recoverability, and associated protections may present one of the most constructive vehicles available for the government to influence overall architectural development in the evolving information infrastructure. Doing so will inevitably require resolving issues associated with cryptography policy, a topic well beyond the scope of this project but under consideration in another by CSTB.24
Government as User and Service Provider
Government public services will touch most citizens and many organizations in residences, places of business, and public spaces. Examples include law enforcement, health care information sharing and service delivery, research and education, environmental monitoring, national security, and more. Some will involve multiple levels of government, and some may blend government and commercial elements. For example, the relatively heterogeneous Intelligent Transportation System (ITS) fostered by the Department of Transportation reflects multilevel government interests in safety, efficiency, and capacity plus commercial applications. Jim Keller and Lewis Branscomb of Harvard University suggested that the ITS offers a model combining general-purpose information infrastructure with specific systems implemented in more closed fashion over or connected to the NII. Providing another example, the white paper by John Ziebarth et al. proposes a testbed for network-accessible information storage, access, and system management to better handle huge collections of regulatory information from multiple government institutions.25
National Institute of Standards and Technology Director Arati Prabhakar cast the opportunities for government use as part of the larger context of public-private interaction: "One of the great opportunities here is to abandon the traditional way that government agencies and departments often come at new capabilities. Agencies say, 'Let's go create this unique solution which will then be immune to all technological changes in the future.' Today we really have a chance to try to couple agencies and departments as they are doing their jobs, using these new technologies. We have a chance to link them to where the technology is going by linking them to the private sector activities." Emphasizing the cross-cutting nature of government applications, Aiken and Cavallini explain that the "[government services information infrastructure; GSII] is that portion of the NII used to link government and its services, enables 'virtual agency' concepts, protects privacy, and supports emergency preparedness needs."
Government databases provide a natural focus for government applications of information infrastructure, given the federal government's unique collections of data and information that are of broad interest and might be more broadly used in the future if made more accessible. The Government Information Locator Service is a cross-cutting information access initiative that could be used to explore various approaches and implementation issues; the National Spatial Data Clearinghouse is one of several more specialized information access programs that could also yield broader insights. Aiken and Cavallini note that government applications need not be viewed as "one size fits all," any more than private sector applications are. Diversity of service offerings can still be compatible with interoperability. "[Start] with what the GSII will be used for … and with those people who share common interests in both using and providing the GSII. The applications should determine what standards and technologies are required and will provide interoperability among their own constituency as well as with other groups, if properly coordinated." They cite government-supported scientific research as a domain in which the affinity group concept has been and should continue to be explored in terms of essential processes—requirements, coordination, interoperability, cross-sector exchange. As Mark Abbott of Oregon State University observed, experiences in the area of ocean and earth sciences research suggest that the record may be mixed: advances in information infrastructure prompt changes in the conduct of science, some carrying with them new distractions and costs. But this is the value of experiments: they show what works well, what does not, and where to direct future effort.
The above observations suggest that the federal government, in particular, may come full circle, from instigator of computer technology applications in the 1940s and 1950s to, on average, repository of obsolete technologies in the 1970s and 1980s, to a spearhead in the 1990s for innovative applications that may facilitate private sector activities and associated social and economic benefits. This vision, expressed primarily
through white papers, does not imply monolithic construction projects. But it does imply farsighted procurement of commercial technology (goods and services) and equally farsighted applications-development decisions.26 The history of the Internet shows that applications and experiments in certain user domains—and, increasingly important today, applications that are cross-domain in nature—may provide one of the most constructive vehicles available to the government for stimulating private action. As suggested by Aiken and Cavallini, information infrastructure can be part of a system for facilitating government-industry interactions.
Technology Development Through R&D
The present Internet provides evidence of the constructive role that government expenditures on R&D can often play. However, much research and development work remains to be done to develop the information infrastructure technology base, and as that base evolves it will reveal a stream of new challenges. That conclusion was apparent from many sources of evidence:
- Deployment plans for proven technologies, reflecting lack of readiness of the next generation of technology;
- Delay or moderation of deployment plans subject to various tests and trials;
- Difficulties using today's technologies that announced plans appear to ignore; and
- Expectations for greater if unspecified internetworking, and recognition that something more than conventional communications or passive entertainment would be required to drive significant advances.
Like regulation, R&D was not a focus of this project. Yet ideas for R&D work that emerged from the discussion suggest opportunities for constructive and (in contrast to regulation) relatively noncontroversial government involvement. As Allan Schiffman of Terisa Systems/CommerceNet summed it up,
I'm not one who thinks that the technology issues are completely settled yet. Certainly not in distributed computing, platforms, or common interfaces. Maybe the guys who are rolling the fiber optics out understand what's going on, but I'm not sure that the rest of the technology is in place. There is still room for a good deal more experimentation. The government has played a pretty useful role in that process up until now.
Many of those who suggested that issues are settled focused on the lower parts of the infrastructure. The picture is far less clear when it comes to the evolution of higher-level services. Accordingly, Randy Katz et al. assert in a white paper that "direct research and development is the most effective way to stimulate new service capabilities and associated commonalities."
One context for R&D that received considerable mention and support is the testbed—building on the history presented by the Internet and the more general recognition that understanding complex systems depends on building and testing them—and associated opportunities for collaborative exploration. Testbeds for R&D that supports applications (including interdisciplinary consideration of technical and nontechnical elements underlying successful applications in different domains) would be valuable. Under the testbed rubric, government can continue to promote the deployment of advanced information technologies in universities to stimulate the creation of applications that will make the core technology useful to society at large. Through these and other testbed efforts, government can foster greater transfer of ideas between universities and industry. Real partnerships between universities and industries are difficult to develop and maintain in a fast-moving industry and technical arena; this is a specific instance of the larger problem of facilitating large-systems research and industry-university collaboration (CSTB, 1994a).
Several areas of research were cited, directly or implicitly, in the course of the project. Four principal subject themes were architecture, for integrating a widening range of components and systems in a dynamic context; technology for information management; technology for greater security, privacy, and reliability; and technology for greater ease of use (both through better user interfaces27 and better design generally).28 See Box 6.8. These last two categories suggest a related need for research that addresses social processes relating to the adoption and use of information infrastructure. Katz et al. derive several topics for research by relating NII needs to directions established under the High Performance Computing and Communications Initiative, including the categories of information infrastructure services, system development and support environments, intelligent interfaces, and national challenge (major domain) problems. See Box 6.9 for elaboration on some of these topics.
Architecture and Networking
In the networking area, and associated with architecture, David Messerschmitt pointed to areas where little work is being done—quality of service, communications delay, and allocation of impairments across connected networks. Security complements these topics and also receives less attention than many believe is needed. Overall, as Messerschmitt
noted, network signaling and control systems constitute an area in which considerable work is needed. Leonard Kleinrock of the University of California at Los Angeles and others outlined technology needs associated with extending the performance of wireless, untethered, and nomadic computing and communications systems, which translates into support for "ad hoc access to services." For example, Kleinrock enumerated several systems requirements and components (see Box 6.10), calling for a "reference model" for nomadicity to facilitate analysis and progress. He related support for nomadic access to basic network parameters—bandwidth, latency, reliability, error rate, delay, storage, processing power, interference, interoperability—in the context of very large scale, identified needs for middleware services, and noted that portable systems also require advances in size, weight, and battery power.
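To suggest what such a reference model might look like in practice, the hypothetical Python sketch below names a few of the link parameters Kleinrock enumerated and checks whether an "ad hoc" link can carry a given service; the data structures, thresholds, and numbers are invented for illustration and are not drawn from the report.

```python
from dataclasses import dataclass

# Illustrative only: a nomadicity reference model might begin by naming
# basic network parameters and per-service requirements. All values and
# the LinkProfile/ServiceNeed/meets names are hypothetical.

@dataclass
class LinkProfile:
    bandwidth_kbps: float
    latency_ms: float
    error_rate: float      # fraction of bits in error

@dataclass
class ServiceNeed:
    min_bandwidth_kbps: float
    max_latency_ms: float
    max_error_rate: float

def meets(link: LinkProfile, need: ServiceNeed) -> bool:
    """Can a link, as found ad hoc by a nomadic user, carry the service?"""
    return (link.bandwidth_kbps >= need.min_bandwidth_kbps
            and link.latency_ms <= need.max_latency_ms
            and link.error_rate <= need.max_error_rate)

wireless = LinkProfile(bandwidth_kbps=19.2, latency_ms=300, error_rate=1e-3)
email = ServiceNeed(min_bandwidth_kbps=9.6, max_latency_ms=5000, max_error_rate=1e-2)
video = ServiceNeed(min_bandwidth_kbps=1500, max_latency_ms=150, max_error_rate=1e-4)

assert meets(wireless, email)      # store-and-forward traffic fits
assert not meets(wireless, video)  # interactive video does not
```

The point of such a model is the middleware decision it enables: a nomadic system that can compare link profiles against service needs can adapt (defer, compress, or degrade gracefully) rather than simply fail when it roams onto a weaker link.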
Another set of issues associated with networking and architecture and implicit in the discussions of deployment involves technology to lower the costs of achieving openness and symmetry. The lack of clarity regarding costs and benefits makes existing businesses reluctant to change dramatically, especially existing players that are starting from technology bases that have already been engineered to reduce costs as much as possible. This problem was stated elegantly and directly by Stewart Personick of Bell Communications Research:
History has shown that lack of interoperability and openness can be a self-fulfilling prophecy … because without research on how to make network architectures and constructs that are conducive to interoperability, then what appears is not conducive to interoperability. It is not in the best interests of the largest participants in the industry to open anything up. It is politically correct to say that interoperability is good, but very few view it as the right thing to do from a business perspective. So how do you work out this conundrum? The largest participants are in the best position to fund R&D, but it is not in their interest to open up networks. Little guys can do certain things, but solving the interoperability problem is really tough. That is just one example, but a very important one in an area where somehow it is in the public's interest to promote interoperability and openness but the research has to be funded almost against the wishes of the biggest participants. That is an area where I believe government funding is absolutely essential.
As Realizing the Information Future (CSTB, 1994b) observed, more general and flexible technology looks more expensive than do the more specialized alternatives, and yet there may be little actual knowledge of relative costs or investigation of approaches that may add only modestly to costs.
BOX 6.8 Interface-related Research Needs
Information Management and Ease of Use
End users, especially those professionals in specialized domains of application, are usually concerned with problems of collecting, storing, finding, and manipulating information. They see communications as a means to an end. But until those domain-specific markets appear, communications providers focus on their communications facilities and basic services. This disjuncture appears to be slowing an important source of demand. Commented Lois McCoy, "What the user wants is the answer to his question. It doesn't make any difference what that question is."
The information management problem calls for R&D in many areas. One is how the user can find what is needed. Ross Glatzer, retired from Prodigy Services, remarked on how limited the state of the art is in today's on-line services arena: "Getting around on-line services today is like
BOX 6.9 R&D Needs for Information Infrastructure Services and Systems Development
[trying to find] the guy who explains where he is [by] calling from … a phone booth at the corner of 'Walk and Don't Walk.' The leaders in on-line services will be those who can best integrate communications, information, and transactions." Andrew Lippman of the Massachusetts Institute of Technology expanded on the wish-list:
You have to invent indices. You have to invent mechanisms and techniques by which you can accumulate, process, and reprocess the information. If you look at the Internet these days, you will find a tremendous amount of content-free information out on it that is purely in the form of index. There are music indices that do not play a note, but guide you through the world of music. That is the technical and architectural challenge that remains. Because the PC will predominate, it will foster symmetric industries. Probably entertainment will fall into line and at least merge inside of the computer, as will the signal processing. But the challenge that we have to face is how to build flexible databases.
Advances in decision support systems are part of what is needed, according to Gio Wiederhold of Stanford University, who drew inferences from his study of medical information needs. Wiederhold observed that "[m]aking choices is best supported by systems which provide a limited number of relevant choices (summaries, searching, selecting)." Similarly, Reagan Moore of the San Diego Supercomputer Center wrote of the need for techniques for data assimilation—mining and modeling—
BOX 6.10 Nomadicity Requirements and Components
especially for scientific research, while Richard Sharpe of the Hartford Foundation spoke of the technical and procedural challenges associated with collecting and aggregating data needed for innovative medical applications. Wiederhold also noted that otherwise reluctant people could be motivated to use the information infrastructure if it is seen as providing access to quality data (relevance, completeness, legitimacy). Related issues in standardizing data formats are discussed below.
One approach that attracted multiple comments was development of virtual environments, which were described as an approach to making cyberspace more intuitive and useful to a wider range of people, facilitating real-time, interactive human-computer communications. In the January workshop, Marty Tenenbaum of Enterprise Integration Technologies Corporation/CommerceNet observed that there was progress
towards a more active collaboration where instead of clicking on something and getting a document, you are clicking on something and getting a three dimensional environment, a space on the Web where people can hang out and if other people are there at the same time, you are able to interact with them.
This concept was specifically advanced in the proposal for RegNet (see the white paper by Ziebarth et al.), which would involve three-dimensional volumetric regulatory information management (with cross-referenced databases). Donald Brutzman et al. go so far as to suggest in a white paper that virtual environments could be considered a superset of information infrastructure issues, in view of the concerns with scale, interactive three-dimensional graphics coordinating input devices with a single-screen display, and so on, with illustrations including a live three-dimensional sports stadium with instrumented players, a 100,000-player problem for a military war game, and virtual worlds as laboratories for robots and people in scientific research.
If consensus on system engineering and architecture issues for most of the NII network environment is to be implemented, that implementation will take the form of standards. The concepts of broad internetworking and open architecture presuppose standards. These standards may have the force of law when they are incorporated in regulatory processes. They may be formal, created through an approved consensus process, as has always been the case for the lower layers of telecommunications infrastructure. They may be informal or de facto industry standards created through a variety of institutions, including consortia of self-selected industry participants. Finally, they may be de facto standards made effective by the market power of the products embodying them.
The basic questions about what to standardize, where, when, and how are not new; the Technology Policy Working Group and others have considered them for some time.29 Among participants in the NII 2000 project, there was general unease about a more active or direct government role supporting standards setting, except among those involved in broadcasting. As Quincy Rodgers summed it up,
What do you do? Do you standardize some? Do you standardize all? Do you standardize them through government action? Do you endure the delays that that approach can sometimes occasion? Have you dramatically improved the problem of bureaucratic standard setting by moving to some large private sector body? How does all of that fit at this stage of the development of these products?
Rodgers argued for de facto standards. Those familiar with telephony have noted that while centralized governmental decision making on standards can lead to unfortunate technology choices, the private sector process can perpetuate incompatibilities, as evidenced by divergent implementation of so-called "standard" integrated services digital network technology and technologies for video delivery.
Even in discussing government's internal needs, Aiken and Cavallini argued that "a modular, seamless integration and evolution of the multi-component GSII into the evolving NII will need to be based primarily on voluntary processes and proven interoperability solutions rather than on mandated standards." On the other hand, building on experience at ARPA, Randy Katz et al. suggest in their white paper that the "research [community] and government can take a leading role in establishing new commonalities that foreshadow industry standards." Also, although receiving little discussion, a government role for standards-related validation and dissemination could be valuable.
Thus the Internet standards are at once de facto and consensus standards. One factor making that possible has been the important role that scientists and engineers, especially in universities, have played both as builders and users. This close-knit relationship between the "vendors" and "customers" is rare in standards making and is a powerful asset in obtaining both good standards and user acceptance of them.
The government's role in the development of the Internet standards was quite indirect. Nevertheless, government people, including experts in agencies such as ARPA, NSF, the Department of Energy, and the National Aeronautics and Space Administration, have made important personal intellectual contributions and, more importantly, have directed their research support to the most talented technical people in the national community. This use of R&D investment has constituted indirect support for standards making and is an excellent way to avoid the heavy hand that so many in industry fear, while creating options from which industry can choose.
There was surprisingly little discussion of transnational issues by the industry experts. Although international issues were not a focus of this project, contributors generally assumed that international connectivity is a requirement, and experimentation with new services was reported to be taking place globally. The lack of concern expressed about incompatibilities in law and regulation, uncertainties about venues for legal accountability, and access to foreign markets perhaps reflected both the relative immaturity of information infrastructure developments overseas and the pragmatic, country-by-country approach of the international firms.
As a practical matter many of the most important commercial networks, such as the SWIFT system that supports international banking, have been in place internationally for some years. Among the more experimental developments, the Internet and the World Wide Web (Web) it supports are almost "the only game in town." Increasingly, commercial and recreational use of the GII can, technically, ignore national borders altogether, and that phenomenon warrants at least revisiting existing policies in such areas as export control, taxation, and enforcement of a wide range of rights and responsibilities.
As Maria Farnon of Tufts University notes in her white paper, "Bodies like the ITU [International Telecommunication Union] have grown increasingly irrelevant with the introduction of new services such as the Internet and have seen their turf eroded by new organizations that do not necessarily have official government sanction." These new services and organizations have been embraced or at least tolerated in almost every nation on earth, setting an important precedent for the value to each nation of reasonably unrestricted international access. See Box 6.11 for Farnon's observations on the Internet as an international phenomenon. Inputs to the NII 2000 project from satellite and smaller wireless system providers acknowledged the appeal of such technology for expanding global access in regions where wireline infrastructure is limited and the costs of building it are extremely high. Within individual countries, issues of open architecture, competition in telecommunications, and local equivalents of universal service are being debated; the G7 Ministerial Conference on the Information Society of February 199530 fostered cross-national discussion and experimentation; and the issue of international traffic settlements is one of many. Attorney Jonathan Band commented on the international attention now being devoted to system openness and harmonization of expectations for provisions in areas ranging from intellectual property protection to interoperability.
BOX 6.11 Politicoeconomic Implications of the Internet
Despite the clear intention of the industrialized world to foster the building of national backbones, and the gradual diffusion of connectivity in many developing countries, the traditional TO [state-owned telecommunications operator] structure, and the resulting legal and commercial models this fosters, remain serious obstacles to a truly international Internet. While technical difficulties can be overcome with resources from institutions such as the World Bank, NGOs [non-governmental organizations], and governments themselves, the traditional mind-set of control over the communications infrastructure and services is more difficult to displace. … [G]overnments have long justified their ownership of the telecommunications operator on the basis of national security reasons, and also derive significant political and revenue benefits from this ownership. Although this structure has been seriously undermined in the United States, the European Union, and parts of Asia, it remains strong elsewhere.
Ideally, an international network like the Internet should provide a protocol that is easily adapted to a wide variety of infrastructure development stages and should offer services that can be tailored to reflect the cultural, legal, and regulatory norms of every country. However, the model that the Internet has provided demonstrates that an international network will, by definition, still act to undermine many traditional structures that have evolved around the old TO system. Rather than seeking to impose old standards of behavior and control on the Internet, governments can best encourage the development of national information infrastructures by eliminating the inherent conflicts that exist between the new services and the domestic organization of telecommunications. This means introducing competition into all levels of service and allowing the market to drive pricing and standards.
SOURCE: Extracted from "How Do Traditional Legal, Commercial, Social, and Political Structures, When Confronted with a New Service, React and Interact?," a white paper contributed to the NII 2000 project by Maria Farnon.
Systems Data and Analysis for NII Assessment
Data issues are prosaic and easily overlooked, but addressing them could be a simple and powerful way of improving private and public decision making. How can public or private entities hope to influence something without knowing the extent of an issue or the impact of a contemplated action? Again, the private sector has used market research data more or less effectively, but much of that information comes from proprietary and unpublished studies (yet another argument for public testbeds). Periodic assessments, using a mixture of public and private sector inputs as in the case of this project, may be a useful way for the government to gain information to be shared with all interested parties.
The AT&T divestiture has already detrimentally affected the availability of telephony statistics.31 Cable and wireless statistics are tracked largely by trade associations and market researchers (raising various concerns about the quality and completeness of data), and the decentralization and commercialization of the Internet are among the factors that make measuring its dimensions difficult. The white papers by Reagan Moore and by Hans-Werner Braun and Kimberly Claffy of the San Diego Supercomputer Center argue that "[t]he NII continues to drive funding into hardware, pipes, and multimedia-capable tools, with very little attention to any kind of underlying infrastructural sanity checks." The authors go on to relate deployment of new technology to measurement problems: "With the transition to ATM [asynchronous transfer mode] and high speed switches, it will no longer even be technically feasible to access IP [Internet Protocol] layer data in order to do traffic flow profiling, certainly not at switches within commercial ATM clouds." Meanwhile, the difficulty of gaining fresh insight into application domains suggests that there would be a benefit to more systematic study of how technology is selected and used in different settings.
Government as Convener
Complementing and informing other government roles, convening—beyond that already provided by professional, trade, and private standards-setting organizations—appears to be helpful in encouraging better assimilation of information infrastructure by all users and providers. The good offices of state and federal government can break down some of the barriers to use of the information infrastructure for better health services, more effective education, distribution of government information and benefits, and so on. Private actions may not be sufficient for rapid progress and resolution in some critical arenas (e.g., education and health care) characterized by imperfect markets. In cases such as the interstate delivery of health care services, where legal barriers impede changes in professional practice, policy changes are needed to create efficient domain-specific information services. The issues, needs, opportunities, and handicaps in such areas have been surveyed many times over the past few years by the Information Infrastructure Task Force (IITF), Congress, trade and professional organizations, and independent analysts.32 The experience of this project, including a workshop and forum in which domain representatives expressed both frustration about difficulties in communicating with infrastructure providers and gratitude for the opportunity for learning and exchange, suggests that continued government support for convening different parties, bridging user and supplier communities, would have value.
Whatever the federal government does to foster convening should
take into account the role of the NII itself as an implicit, informal convening mechanism. As various contributors pointed out, directly and indirectly, enhancements to the information infrastructure will allow people to coalesce both openly and privately, thereby changing the processes of government. Mixed expectations were voiced by one participant in connection with the use of information infrastructure to support electronic democracy. As Michael Greenbaum of Bell Atlantic Corporation speculated:
At its best it will give voice and cohesion to the underrepresented. At its worst it will enable terrorists and hate groups to act under the cover of anonymity. This medium is better suited than most to building constituency, rather than just mass communications. It should be approached with caution as a means for formal and informal referendum, through which activist groups might unduly influence representatives and undermine the deliberative aspect of a representative democracy.
Although not everyone will use the technology, the more deeply information infrastructure becomes embedded in U.S. society, the more some degree of access will become part of what it means to be an informed voter.
The NII 2000 project suggests that most of those in the industries that supply information infrastructure would support a variety of roles for government—roles that are complex, often subtle, and sometimes active. But these roles are, like the work of private firms, intertwined with private roles and subject to a broad range of uncertainties about future possibilities and problems. The steering committee heard broad support for the government contributing as an enlightened customer and participant in building the NII. In particular, it heard an expression of the need for continued government support of innovation in the form of support for university-based research and development, both basic and applied. But there was also general (albeit not unanimous) enthusiasm for accelerated deregulation—not only to speed market entry but also to allow licensed carriers the freedom to use their assigned spectrum innovatively—implying a reduction in governmental authority to direct NII evolution unilaterally.
The concerns expressed about the lagging of many domain-specific areas of application, despite the prospect for important economic drivers from the application industries, suggest that this enabling role for governments reaches into many areas of government responsibility and is important at the state as well as the federal level. Thus responsibility for constructive government participation will be decentralized, raising questions about whether an overall systems view can or should be sustained in some continuing activity or organization.
The steering committee concluded that a consensus vision of the NII in the first decades of the 21st century—however indistinct and subject to evolutionary modification—should form its core but should not constrain all NII developments. Articulating and evolving national goals, and driving toward a national consensus, are important elements of the government role. It is a difficult role, one in which leadership—as opposed to unilateral action or definitions—can help to overcome the confusion and misunderstanding that typically accompany technically complex phenomena. There is general agreement that government can play a role in bringing diverse interests together to seek consensus on values and objectives for the national capability and on architectural principles supportive of those values. If it is done right, that consensus will govern many public and private sector actions. Leadership and articulation stand in contrast to regulation or judicial action: not everything can be defined by rules. People in organized societies do things because there are corresponding norms and understandings. But we have already seen in many industries, including telecommunications, that if government defines too many rules, eventually all people do is try to work around them.
Reconciling divergent positions and priorities within the government may be a critical first step, since the government itself is as diverse as other sectors of society. For example, it may be the case that entirely different priorities would emerge from the Department of Defense (DOD) and the Social Security Administration; within the DOD, the different services have had a hard time achieving interoperability. The IITF has been an experiment in cross-agency coordination. While it has helped to articulate and explore many key issues, the question remains as to which part of the decentralized government will articulate the goals and guide the architecture (CSTB, 1994b).
This project itself represents a step by the federal government, acting through the IITF, to explore the extent of the consensus on which federal policy making and leadership can rest. What is the proper mechanism for pursuing this exploration? The project did not discuss specific institutional structures. However, it seems evident that something more long range and more centered in private sector participation is needed than the IITF and the former NII Advisory Council. The essential requirement is that both the information service providers and the information creators and users in the private commercial and not-for-profit sectors must be fully involved, along with relevant government bodies.
Second, the primary economic drivers of the NII will be found, the steering committee believes, within the most important domains of the
national life—health services, education, electronic commerce (including goods distribution, marketing, and retailing), and public safety and the like. (While an important factor, entertainment appears to be less of a driver than anticipated at this project's outset.) Because the ability of these domains to take full advantage of the NII is more limited by factors inherent within those domains than by shortcomings in the information network support, progress in creating these services will lie primarily in the hands of the professional groups and firms concerned. These responsibilities clearly fall primarily to public and private institutions outside the realm of telecommunications and high-performance computing, and they must be taken up by the relevant bodies in their own self-interest. However, the federal government and the states can, within their own operational missions, act to reduce the barriers to the creation of need-based demand for information infrastructure.
The NII 2000 project did not conclude (although it was hypothesized at the project's outset) that all information networks must be open and interoperable. Some mature and specialized applications can justify their unique application-specific architectures by the cost reduction they afford. But the powerful lesson to be learned from the Internet is the value of an open interface on which all kinds of new and mature services can be built, one that allows an experimental new service to look for users among the entire population with access to a digital network. Government should adopt policies intended to retain the power of a service like the Internet to be a testbed for innovations and a link among many resources for both information and its processing and distribution.
Both federal and state governments have the opportunity to increase their own efficiency and improve their services to the public through the development and use of the NII. A current initiative toward that end focuses on "reinventing government." These efforts should be undertaken under policy guidelines supporting the future evolution of the NII, as well as the best use of currently available facilities. In other words, the government should further the development of the core functions of the NII, using a "learn-and-change-by-doing" approach. This will place the government's purchasing power squarely behind the goal of national, consensus-based progress. In this endeavor, the federal and state governments must seek a concerted strategy, since each has the potential for strong influence in the evolution of the NII.
Perhaps the most important role government can play, especially at the federal level, is the continued vigorous support of advanced research into digital networking, with the objective of creating opportunities for the private sector to sustain the maximum amount of generality and flexibility at minimum cost. Much of this work can be carried out in support of government applications, and much can be carried out in collaboration
with private industry; much is appropriate to the university environment. It should be noted that different parts of the information industry are structured quite differently with respect to available R&D resources. Historically, the regulated telephony industry has been quite research-intensive. The cable TV industry and, more importantly, the domain-specific application industries do not have this tradition of research, depending instead on the vendors that support them; they would benefit, in particular, from partnership projects with government agencies.
Finally, as noted above, there is much political debate about the effect that deregulation, plus a blizzard of anticipated technological and market innovations, will have on the structure, profitability, and competitiveness of the many segments of the information industry. There is a continuing role for government in evaluating and then acting (or not acting). At times such actions are taken to remove impediments, and at other times to encourage changes in direction. This is no different from what any corporate manager faces: the need to evaluate continuously and to correct course accordingly.
But the government should not try to manage the evolution of industry structure, barring evidence of serious inequities or economic problems. It should keep its tools, such as the Federal Communications Commission and the National Telecommunications and Information Administration, available to deal with any situation that might arise in the future. Particularly important will be better data collection and analysis to reduce some of the guesswork and improve the bases for decision making in industry, government, and elsewhere on infrastructure development, selection, and use. In any case, the slower pace of deregulation and privatization in most other nations will require active engagement with those policies in the interest of achieving a vigorous and effective GII in which U.S. suppliers are not competitively disadvantaged.
Government-funded networking research and development can explore architectural alternatives that cut across existing implementations, creating options for new configurations of the great range of available technologies.
See CSTB (1995b) for additional discussion.
and conduct of consortia and alliances (such as those for joint personal communication service licenses), or in government investigations of major players such as Microsoft to look into the possibility of monopolization.
Such as project contributors Peter Huber of the Manhattan Institute, David Messerschmitt of the University of California at Berkeley, and James McKinney of the Advanced Television Systems Committee.
The Aspen Institute has recently published a report on such a concept (see Firestone and Schement, 1995).
The Information Infrastructure Task Force, through its Committee on Applications and Technology, has put considerable emphasis on convening people in major application areas, such as health care and education.
This point was argued by Ken Klingenstein of the University of Colorado at Boulder during the forum.
Similarly, David Messerschmitt noted that "transcoders already introduced in cellular telephony preclude privacy by end-to-end encryption," pointing to insufficient attention historically to privacy in the development of cellular networks.
The report of CSTB's project on national cryptography policy, requested
by the Defense Authorization Act of 1994, is scheduled to be published in mid-1996.
See the white paper by Oscar Garcia on behalf of the IEEE.
The concern with design was noted at the forum by Walter Wiebe of the National Science Foundation.
See Kahin and Abbate (1995) for an overview. See also Besen and Farrell (1994) and Farrell and Shapiro (1992).
More information is available on-line at http://www.ispo.cec.be (EC Information Society Project Office Webserver) under G7 Information Society Conference.
Several such efforts are referenced in the bibliography.