Introduction and Summary
Defining The National Information Infrastructure
Are there any dominant trends in the evolution of a global or national information infrastructure (NII) for which there is no specification, no overall plan, and no institutional mechanism for reaching consensus about what it is or what it should be? That question motivated the NII 2000 project, which sought to characterize the technology deployment, market expectations, and proposed activities of communications and information facilities and service providers over the next 5 to 7 years. Perspectives provided directly by these suppliers—drawn from multiple industries—were complemented by inputs from a cross section of users in industry, private nonprofit organizations, and the public sector. The diverse inputs on deployment plans and prospects revealed that there are as many visions of the information future as there are sectors of the economy helping to create them.
Today, the range of players, starting positions, and strategies underscores the fact that the information infrastructure is not static; it is evolving in ways that reflect its initial components as well as ambitions for its growth. It is nearly impossible to overstate the implications of the combination of powerful low-cost microprocessors, high-capacity multistream digital servers, low-cost memory and digital storage media, and broadband connections to and from homes and businesses. Each of these capabilities is advancing quite rapidly, and the combination of them makes
possible an enormous number of applications, some clearly seen and some as yet unimagined. There will be missteps and failures, along with full and partial successes, but the technology and its uses will advance steadily.
History tells us that systems as complex as a nation's information infrastructure evolve incrementally, driven by private investment in the pursuit of unrealized opportunities, consensus-based public needs, and countless entrepreneurs testing the system for natural niches, for unique sources of customer value. Given that future plans are marked by diversity of vision and action, is the NII itself a paradigm that is losing its luster? Perhaps, if it is interpreted to refer to conformance with a grand plan, which appears increasingly unrealistic.
Given a future only partly seen, how shall we characterize the NII? The steering committee for the NII 2000 project sees no choice but to define the NII broadly and inclusively:
The national information infrastructure (NII) is the collection of all public and private information services—both facilities- and content-based—operating as a complex, dynamic system. It exists today but is and always will be in a state of flux.
Since more and more devices, systems, and processes will contain computing elements and be interconnected in some way, it is important that the NII be defined early on as being inclusive. Of course, the global information infrastructure of which it is a part extends the notion of inclusiveness geographically, technologically, and economically.
Rather than a single coherent technical framework, the NII is a concept to focus thinking about a very important set of resources whose value to society depends on their connectivity, accessibility, and functionality for many important purposes.1 This view of the NII admits to different perspectives, as expressed by various readers of an early draft of this report. Box 1.1 gives two examples. In particular, perspectives differ significantly across industries. Approaches to testing new markets are as different as conducting market trials for integrated multiservice packages and making new applications available on the Internet. But the difference between the personal computer (PC) vision and the set-top box vision is not about the detailed technical choices to be made; it concerns instead a choice about the role of infrastructure—should infrastructure deliver a particular service (e.g., telephony or television), or should it enable the creation of entirely new services via general-purpose (e.g., Internet) services?—and differences in the funding model (identify a specific "killer app," or assume that the aggregate of the demand will come from millions of customer uses, none of which is important by itself). Symptomatic of the many differences among industries are semantic disagreements that confound public debate and private sector interactions. Box 1.2 lists terms that are fundamental to information infrastructure but subject to differing definitions and usage among industries.
BOX 1.1 The NII: What Is in a Name? A Range of Reactions
"NII," in my thinking, is a rallying phrase, not to be confused with or perceived as a tangible definable entity. It is not a predetermined universal architecture or even an orchestrated collection of interworking architectures to be furthered by some form of consensus, even at the highest level. NII … is the sum of countless U.S. and global initiatives, driven by continuous intellectual discovery, and by conceptualization, development, and implementation of a wide variety of applications and services. "NII" is founded in scientific curiosity, social need, historic experience, national character and goals, standard-of-living expectations, the needs of individual users and of corporations and organizations, security requirements, profit motives, current and future national economy considerations, global policies, and so on. The list is very lengthy. …
—Thomas Plevyak, Bell Atlantic
[Beware falling] victim to the tendency in Washington to add coats to every available coat hook. The NII cannot possibly be the sum of all of our expectations for a better society based on improved communications and electronic information. Even if someone were capable of articulating a vision of that scope, none of the rest of us would grant him or her the power to execute it. What we can have, what is doable, what we can build reasonable consensus around, is a "system of electronic communications and information resources based on computing technology." Despite efforts by many to promote ideological visions of what the system can do for society, the system is fundamentally amoral. Much of its development has been, and will continue to be, based on technological Darwinism. … I have two suggestions. Don't refer to the NII as an "it." At best, the NII is a complex system of systems spanning a variety of new and older technologies and consequently never embodying the holy grail of developers—a "single system image." Secondly, be cautious in using terms that have been captured by social and political visionaries and already have emotional baggage attached.
—Michael Roberts, EDUCOM
BOX 1.2 Vocabulary Test: Key Terms on Which Many Differ
Does a broadly encompassing definition of the NII preclude any expectation of effective interoperation of the parts of the NII? To what extent will a more or less integrated nationwide system of computer network services be able to attract users and capital investment? Had these questions been posed, say, 5 years ago, the answers would have been more divergent. Today, the answers reflect the extraordinary impact of the Internet, which has become increasingly attractive for commercial activities over the past 2 years.2 The Internet is a critical component of many of the business plans of industrial contributors to this project. Additionally, the steering committee notes an increasing recognition in all sectors of industry that interconnection and interoperation are a powerful stimulus to content creation, ubiquitous communication, and the opening of new markets. Thus, despite multiple visions and definitions, the steering committee concluded that there is a good chance of attaining a "seamless web" with interoperable facilities and services that compose a subset of the larger NII,3 as broadly defined above. The Internet protocols will help to achieve this outcome. Other components will probably remain disconnected, proprietary, vertically integrated business "towers."
This chapter provides an overview of the key issues and findings covered in the NII 2000 project. Like the rest of the report, it draws on inputs gathered via a workshop, forum, white papers, and a variety of consultations and secondary source materials. It relates technological capabilities to evolving plans for business strategy, competition, and structure, characterizing private sector views and stated plans to illuminate opportunities for the public sector. It explains why the Internet and also federally supported research and development came to play a greater role in the project and in the steering committee's assessment than originally anticipated.
Driving Deployment: Business Transitions, Business Models
The opportunities presented by the evolving information infrastructure are as speculative as they are rich: enormous technical, business, and public policy uncertainties about the future face investors, the public, and policy makers. The communications and information marketplaces are changing as technologies converge and the production processes of different industries overlap increasingly.
Today, the most widely used information systems—telephones, television (cable and broadcast), and, to a lesser extent, hard-copy publication (e.g., print, videocassettes, compact disks)—are mature services resting on many billions of dollars of installed equipment. Their viability depends on the continuing provision of services to established consumers and business users at profit-making prices; they attract investment because growth in profit is anticipated. As a result, the business plans of many established communications and information service suppliers appear to center on extending existing high-revenue business offerings such as telephony and video delivery, upgrading the underlying facilities, and hoping to be properly positioned if and when new applications mature. (De)regulation permitting, cable TV will be able to deliver interactive services and voice telephony. Telephone companies will offer video delivery. Cellular services will become digital, as they already are in many other countries around the world, and will evolve to personal communication services. Digital broadcast satellites will eventually acquire practical uplinks and offer interactivity.
It is not surprising, considering the magnitude of the capital investments anticipated for new infrastructure, that telephone and cable TV companies are each viewing the other's business as attractive, since these are mature businesses generating billions of dollars in revenues, representing major sources of capital for expanding the NII (although there is no evidence that the revenue pool will grow in proportion to the number of providers). Nor is it surprising that both sectors are planning on initial growth in applications that constitute modest and incremental extensions of current lines of business (TV on demand, games, home shopping, passive access to information).
New information infrastructure markets are characterized by a large number of small players, who divide the total (initially limited) revenues. Will the information infrastructure market follow the pattern seen in PC software, in which a large pool of providers shrinks to a few dominant players after only 3 to 4 years? The prospects for competitive losses, combined with economies of scale and scope, immature applications, and a skeptical public, suggest that, as is often the case with maturing markets, a shakeout among information infrastructure providers will eventually result.
Many sectors of the information industry (notably print publishing and broadcast television) will find the new environment highly disruptive—even threatening—as well as filled with potential for new activities. Others whose businesses bring together consumers with goods and services produced by others may also be challenged. When customers themselves have direct access (of the type envisioned via the NII), the previous intermediary either is eliminated or becomes a facilitator rather than a dispenser of knowledge.
In agreement with most of the industry experts contributing to this project, the steering committee believes that the future information infrastructure will provide more than just basic telephony and entertainment: a wide range of applications will combine to form the economic base of the NII (see Box 1.3). This broader vision implies that existing facilities will be used in fundamentally transformed ways. In particular, termination of the network in affordable but powerful computing devices will create an inherently more general-purpose communication environment. At the same time, new physical facilities will have to be built to provide expanded interactive bandwidth to more users.
Virtually all industries and public institutions will use the NII to do their work better, often in quite industry-specific ways that will implicitly increase demand for information infrastructure. Examples from contemporary experience include pharmaceutical companies incorporating communications facilities and services into delivery of drug information to physicians and hospitals, and banks incorporating communications into on-line retail banking, bill paying, and investing. The business models associated with such embedded service delivery ultimately may bring far greater resources to the NII than will the direct user fees that support the current infrastructure. But that supposition is difficult to quantify, given the difficulties various industry sectors experience in characterizing their future infrastructure service requirements.
Facilities providers contemplating major investments to create the next generation of networks want to ensure that there will be reasonable ways to recover their costs. Some are creating and testing new applications that they hope will be broad enough to adapt to new markets. Still others are holding back until the crystal ball becomes considerably less cloudy. Market trials have attempted to probe the attractiveness (as well as the performance and cost-effectiveness of delivery technologies) of new forms of network-based entertainment and shopping as well as educational and other "public interest" services. But results reported are not encouraging—the trials are bounded by the lack of genuine novelty in the former and a lack of substance in the latter.4
BOX 1.3 Tenets of the Information Society
Facilities providers are stymied by the prospect of financing an infrastructure that will be characterized increasingly by low marginal costs for use but large fixed costs for facilities deployment, a problem that regulatory reform will not solve.5,6 The difficulties with justifying investment are most serious for providers of access circuits to individual residences and small businesses. Given that almost all the wireline options for access technology cost the same to within 20 or 30 percent, it is possible to form a rough guess as to the monthly fee needed to recover these costs. All current forms of wireline residential access technology, if installed new, cost about $1,000 per home passed, which seems to imply a monthly fee near $30. The only alternative that might give a lower price under some circumstances is wireless, which has the advantage that it involves less major investment up front and thus could support emerging markets with lower initial penetration.
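The arithmetic behind that figure can be checked with a standard amortization calculation. The interest rate and recovery period below are illustrative assumptions, not numbers drawn from the project inputs; they simply show how a $1,000 capital cost per home maps to a monthly charge of the magnitude cited.

```python
# Back-of-the-envelope check (illustrative, not from the report) of how
# a $1,000-per-home-passed capital cost maps to a monthly fee.

def monthly_capital_charge(capital, annual_rate, years):
    """Standard loan-style amortization: level monthly payment that
    recovers the capital, with interest, over the stated period."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of monthly payments
    return capital * r / (1 - (1 + r) ** -n)

charge = monthly_capital_charge(1000, 0.12, 5)
print(round(charge, 2))   # roughly $22/month for capital recovery alone
# Adding a few dollars for operations, maintenance, and billing brings
# the total into the neighborhood of the $30 figure cited above.
```

Under these assumptions, capital recovery alone accounts for most of the $30 monthly fee; the residual covers operating costs.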
It was further noted in the project's workshop, forum, and white papers that an important role for research and development in this context is to explore network and information access technologies that would alter costs and capabilities over the long term. Reducing per-home access costs by a factor of two, a cost reduction that (for constant function) occurs in the computer industry at intervals of less than 2 years, would change completely the prospects for rapid deployment of advanced broadband services to the home. However, the cost structure of the access circuit appears to be driven by factors other than those that help to reduce computer costs, and has indeed remained stubbornly constant for some time.
The consequence of uncertainty about future network and information requirements and the potential for profitability is hesitation in planning for future infrastructure investments. In this respect, there has been a marked change over the last 5 years, at least in public posture. What was, a few years ago, the vision of a rewired America, to be accomplished as a single planned reengineering over a fairly small number of years, has become a much more incremental plan for experimentation and upgrades. This more incremental approach to investment, which is discussed in Chapter 3, is probably more realistic than the earlier visions (and indeed those earlier plans may never have been as concrete as some of the media discussions suggested).7 It has the advantage that instead of a single massive upgrade to some chosen new technology, which then becomes obsolete over time, it involves smaller staged investments that bring in a series of new technologies. This incremental process could establish a pattern of continuous technology upgrade as justified by proven market demand. However, it will almost certainly lead to a slower and more measured pace of investment and deployment. One of the steering committee's conclusions is that timing, as much as direction, is the major uncertainty in the current planning for the NII.
The divergence of opinion concerning the relative need for one-way communication, primarily for entertainment, and two-way communication, for various sorts of interactive uses, was a major theme of the workshop and the forum. A basic uncertainty is the rate at which upstream capacity (from the user into the network) will be required to support emerging two-way applications such as provision of information by individuals and interactive "telework" support. The concern is whether sufficient two-way bandwidth can be made available at affordable rates and whether lack of bandwidth will stifle the emergence of important new applications, especially those involving sharing of images and video. Perceptions of the true market demand will influence deployment decisions and will thus shape the ways in which end users in homes and independent offices can participate in a society that will depend increasingly on interactive electronic communications.
Reflecting the prominence of such concerns, one area where experimentation is notable and where multiple approaches may coexist for the foreseeable future is cost recovery. At least five different but overlapping economic models, representing different ways to allocate cost, are evident in the information infrastructure at large and in the Internet in particular: (1) usage-based fees (e.g., metered telephone service; consumer pays by the minute); (2) access subscription (e.g., cable or flat-rate telephone subscription; consumer pays by the month); (3) broadcast (e.g., TV and radio; advertiser pays by the minute, consumer does not pay for service per se); (4) end-user device-centered (e.g., PC-owning consumer pays for unlimited use of a device, including use that is independent of network-based services); and (5) embedded (cost is hidden in some other, possibly domain-specific, service; consumer leases set-top box in conjunction with purchased TV). The fates of these models may help to determine which businesses and industries endure or advance in the information infrastructure marketplace and, given their different capacities for generating returns on investment, how quickly underlying facilities are enhanced.
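The tension between the first two models can be sketched in a few lines. The per-minute and monthly rates below are invented for illustration; the point is that each model has a usage regime in which it is the cheaper choice for the consumer, which is one reason the models can coexist.

```python
# Toy comparison of two of the five cost-recovery models above:
# usage-based (consumer pays by the minute) versus access
# subscription (flat monthly fee). Rates are illustrative only.

def usage_based_bill(minutes, per_minute=0.05):
    return minutes * per_minute

def subscription_bill(minutes, monthly=20.00):
    return monthly  # independent of how much the service is used

# Crossover: at these rates, metered service is cheaper below
# 400 minutes per month; the flat subscription is cheaper above it.
crossover = 20.00 / 0.05
for m in (100, 400, 1000):
    cheaper = "usage" if usage_based_bill(m) < subscription_bill(m) else "flat"
    print(m, cheaper)
```

Light users favor metered pricing and heavy users favor flat rates, so a provider's choice of model selects, in part, which customers it attracts.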
The Significance Of The Internet
Much of the current excitement about information infrastructure and the convergence of many communications and information technologies to a digital basis has been catalyzed by the Internet. Thus, understanding the Internet may be key to understanding many of the opportunities in the NII.
Is the Internet a model for a commercially dominated interoperable infrastructure, or is it a remarkable but transitory development from the world of research and education? The steering committee concluded that the Internet is indeed a prototype for much of the emerging information infrastructure, despite its roots in experimentation and its current (but not necessarily enduring) limitations. Its future is evolution, not replacement.
What is the Internet? It is a network of many kinds of networks. But to understand its importance it is better to think of it as a capability for internetworking, allowing any user to find, touch, and if desired connect
to a large variety of networks and the sources of information, users, and computational resources that each makes available.
As a Barometer of Potential
The number of Internet access service firms has been growing very rapidly, as has Internet access activity (e.g., acquisition of Internet addresses), although no one knows for sure how much of this growth is driven by faddism as opposed to enduring demand for access and associated goods and services. The most frequently cited illustration of the economic (and deployment) impact of the Internet is its role as the underpinning for the World Wide Web (Web), which has made the Internet easier to use and more broadly meaningful. The Web appears to provide what PC owners have always wanted: the capability to point, click, and get what they want no matter where it is. Whereas earlier manifestations of the information revolution bypassed many people who were uncomfortable with computing technology, it appears that the Web is now attracting a large cross section of people, making the universality of information infrastructure a more realistic prospect. If the Web is a first wave (or a second, if the Internet alone is a first), it is likely that further advances in utility and application will follow. Once people are comfortable finding information on the Internet, they will discover that they want much more: they will want help in locating reliable, useful information; they will want to discuss it with others, build communities around it, generate it, and so on. These activities will create demand for more applications and for different kinds of interactivity, formats, media, and so on. Box 1.3 suggests how technology and changing approaches to work, education, and recreation in the "information society" can continue to co-evolve.
A powerful appeal of the Internet as a prototype for the new information environment is that innovators can take relatively small risks while still accessing a huge potential market. The economics of telecommunications facilities deployment contrasts markedly with that of the Internet, whose current state reflects a unique set of economic conditions: tremendous market access has been achievable with a very small initial investment, an advantage that has led to a wide range of experimentation and innovation. Combining low barriers to entry with unusually high potential for growth in usage volume, the Internet has been an extraordinary platform for innovation, one that is perhaps unique in human history.
However, a central conundrum at the present is that the Internet, with its very low cost of entry, must operate on top of expensive physical communications infrastructure. As discussed above, very large investments are required up front to achieve any substantial upgrade in U.S.
communications facilities, and without this investment the Internet itself, and the other possibilities for exploitation of the infrastructure, cannot flourish.
As a Laboratory for Development of Workable Standards
The Internet also demonstrates the remarkable potential (although perhaps the outer limits) for evolutionary standards development and implementation in concert with rapid technological change. In particular, many of the important Internet standards were not adopted until they worked in a real-life environment and passed the test of performance and scaling. Those standards processes and the associated architectural principles embodied in the Internet derive from the long-term involvement of a relatively small community of computer scientists, funded largely by the federal government, who saw the Internet as a collaborative enterprise in which they served as designers, developers, and users. They understood that the large number of alternative paths for technological evolution meant that standards must be open to technological change, anticipate future innovations, and either accommodate them or minimize the cost of accommodating them later.8
The current Internet standards development process is subject to a number of stresses that are a source of concern. Although there is no clear case for intervention, the process should be monitored. Because computer scientists are no longer representative users of the Internet, the technologists most active in Internet standards setting may be somewhat removed from the needs and preferences of diverse users. Conversely, the users who need standards are no longer mostly computer or other kinds of scientists, and their increasing involvement in the Internet Engineering Task Force standards process has led to a much balkier and more balkanized process. At the same time, increasing commercial interest in the Internet is creating pressure for alternatives to open standards, whether proprietary specifications or ones defined by dominant industry players.
As a Basis for Critical Flexibility
Although to many people the Internet is synonymous with its applications, such as public or open electronic messaging or the World Wide Web, the power and utility of the Internet actually derive from its internal organization, the way the protocols and functions are designed and organized. The basis of its flexibility is that it defines a simple service interface that any application can use to request network service from whatever network technology is in use. Since this interface is independent of underlying technology details, and also independent of specific applications, it allows new applications to be devised and deployed, thus stimulating innovation in network technology. The 1994 Computer Science and Telecommunications Board report Realizing the Information Future (RTIF; CSTB, 1994b) advocated an open interface with these characteristics to support innovation of new network applications. The report called this interface the technology-independent bearer service and called the network that would result from providing this interface the Open Data Network, or ODN. The Internet features an ODN architecture, with a form of the bearer service in its Internet protocol.
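The bearer-service idea can be illustrated with a small programming sketch. This is a toy model, not the Internet protocol itself: the class and method names are invented, and real IP involves addressing, routing, and much else. The point it captures is that an application written against one narrow interface runs unchanged over any underlying technology that implements it.

```python
# Toy illustration of a technology-independent bearer service:
# applications program against one narrow interface while different
# underlying "technologies" plug in beneath it. All names here are
# invented for illustration.

class BearerService:
    """The narrow, technology-independent interface applications see."""
    def send(self, dest, payload):
        raise NotImplementedError

class LanLike(BearerService):
    """Stand-in for one underlying network technology."""
    def __init__(self):
        self.delivered = []
    def send(self, dest, payload):
        self.delivered.append((dest, payload))  # pretend LAN delivery

class DialUpLike(BearerService):
    """Stand-in for a quite different underlying technology."""
    def __init__(self):
        self.delivered = []
    def send(self, dest, payload):
        self.delivered.append((dest, payload))  # pretend modem delivery

def mail_app(net, to, msg):
    # The application neither knows nor cares what is underneath.
    net.send(to, msg.encode())

# The same unmodified application runs over both technologies.
for net in (LanLike(), DialUpLike()):
    mail_app(net, "10.0.0.2", "hello")
    assert net.delivered == [("10.0.0.2", b"hello")]
```

New network technologies can be added below the interface, and new applications above it, without either side coordinating with the other; that decoupling is the source of the flexibility the text describes.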
In the course of the NII 2000 project, the steering committee heard repeatedly that the Internet standards are the basis on which new applications are being crafted. The current volume of deployed devices using the Internet standards, together with the observed level of investment in Internet-related products and services, constitutes a unique foundation, one for which there is no alternative now or in the next decade. The steering committee has concluded, based on its assessment of industry trends, that the call for an open, technology-independent bearer service as a basis for emerging applications, as voiced in RTIF, was correct. Now, moreover, a more concrete conclusion is justified: the Internet standards are in fact being widely used for this purpose and are regarded by a great majority of commercial players as the only viable option for an open, application-independent set of service interfaces at this time. Thus the Internet and the protocols on which it is based are critical components of the evolving NII.
As a Vehicle for New Market Structures
The Internet is but one example of the transformation now occurring in the marketplace, a transition arising from the ability of new service providers as well as existing infrastructure providers to make new services available by layering them on top of existing communications infrastructure. Innovation in services layered over physical (and virtual) facilities and offered by the same or different providers will flourish, reflecting the increasing dependence on software as a service-creating technology, as well as increasing ability to reap value from information.9
It is unclear at this stage how the various players—service and facility providers—will align, integrate, or interoperate.10 Marginal costs and prices for communications services can be expected to fall as the use of digital technology and (assuming deregulation) competition increase. Communications firms may attempt to integrate vertically to take advantage of the higher profit margins possible in providing proprietary content, or they may choose to purvey the content of others but seek to control their market through proprietary access to their customers. Such
business structures may not be conducive to open, interoperable arrangements and meaningful competition.
While business pressures may motivate attempts at vertical integration, there are countervailing forces for open competitive service interfaces. For the first time in the history of the computer industry, virtually all computers speak the same set of protocols, and content providers can write their applications to reach their users through a set of standards that are independent of the underlying media. The content provider (e.g., the publisher and editor or broker or repackager) seeking to reach the maximum number of subscribers will pressure the telecommunications providers to interoperate with their competitors if need be. The telecommunications providers, if they want to play, have to support these protocols—and they are doing so. Hence, interoperability may be a natural competitive outcome of stratification of content and conduit providers.
These pressures will play out as the various components of the NII evolve over the next decade. The Internet is an example of open interfaces providing a structure for a set of business sectors—facilities, Internet service, and applications. The availability of standard interfaces permitting interoperability will define the structure of the industries providing information infrastructure as much as any technology-based process can, but it is also the case that the structure of the businesses will define the interfaces, and therefore the degree of openness for the infrastructure over time.
Whither the Internet?
If the Internet is to continue to meet user requirements over the next decade, it must evolve in a number of ways. Some of these involve aspects of the current structure that are inadequate and need to be fixed (for instance, current Internet protocol addresses are too short); some involve the need to add new features for which possible approaches are well understood (for instance, end-to-end security); and some require major advances (for instance, a capability for intellectual property rights protection). The need for these various sorts of advances suggests the importance both of continued R&D and of a vigorous standards-development process.
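The first of those limitations can be made concrete with a line of arithmetic. The current Internet protocol (version 4) uses 32-bit addresses; the successor protocol being standardized, IPv6, widens the address field to 128 bits.

```python
# IPv4 addresses are 32 bits wide, which bounds the total number of
# distinct addresses; IPv6 widens the address field to 128 bits.
ipv4_space = 2 ** 32
ipv6_space = 2 ** 128

print(ipv4_space)             # 4294967296, under 5 billion worldwide
print(ipv6_space > 10 ** 38)  # True: an enormously larger space
```

With every networked device (not just every computer) needing an address, a global total under 5 billion is clearly too small, which is why address length heads the list of fixes.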
The widespread call for better security on the Internet covers needs ranging from protection against system penetration to trustworthy transfer of information and protection of intellectual property. Although security is a concern for all information infrastructure, the open nature of the Internet in general as well as specific features of its evolving technology underscore the challenges to security in the Internet context. It is not yet
clear how the necessary protections will be provided, especially given the importance of transnational communications, but advances are being made. Despite such concerns, lack of better security has not actually halted the expansion of the Internet; while efforts are being made to increase security, individuals and organizations are tailoring their use to the level and kind of protections available.
The transition of the Internet to a competitively provided commercial service has raised many questions about models, options, and expectations. Some, including long-time Internet players monitoring changes in network performance, have expressed fears about the prospect of unstable service, due to issues such as incoherent routing and other technical or business concerns. As commercial providers begin to keep operating statistics secret, it has become much more difficult to gather overall information on usage, such as the growth rates and usage patterns of different applications, which would facilitate long-term planning by all parties.
In terms of the broader operating environment, it appears to some that the community of shared interests represented a very valuable, but rare, historical opportunity, and we must now figure out how to develop shared, interoperable infrastructures without that type of user-developer community. Private and public actions should recognize how the Internet development process has factored into U.S. competitiveness in key markets. For example, leadership in creating and implementing key NII standards relates to leadership in the hardware and software that the United States exports.11 This value is implicit in federal support of Internet standards setting.
Realizing The NII's Potential—The User Perspective
Enthusiasm about the NII is tempered by concerns about the difficulty of realizing its promised benefits in large application domains such as health care or education, let alone in the more diffuse context of individual citizens seeking access from their homes. After hearing several presentations by infrastructure facilities providers, emergency response expert Lois Clark McCoy observed: "I didn't hear the word 'user,' and I didn't hear, 'What does the user want?' I heard, 'This is what we're going to give you,' as well as a great deal of discussion about the tools, such as fiber optics and cellular, and about how the service was going to be provided." These observations point out a tendency for public debates to focus generally on the supply side, as well as the considerable and differing challenges in adopting new infrastructure within individual application domains. Industry sectors that may become big users of the NII—as represented by manufacturing, health care, education, and emergency
response, for example—see barriers to progress that leave them with a sense of frustration.
New applications may well come from specific user communities or domains, but their development is snared in a Catch-22. Large-scale implementation and associated acceptance in the market await not only strategies for overcoming the constraints of traditional processes linked to legacy systems, but also the development of standards for data (for elements ranging from terminology to presentation), codes of practice, sector-specific institutional and market structures, and the removal of legal and/or institutional barriers within individual domains. Thus information services can improve health care delivery and lower its trillion-dollar cost, but only if they become integral to the processes associated with delivering health care. A similar situation exists for educational computing and communications, which already show the considerable influence of the emerging information infrastructure.
Providers of NII facilities do not appear to see as their concern the removal of domain-specific barriers to adoption of network-based applications; such matters go far beyond supplying networking facilities and services. Yet inputs from professionals from several specific communities eyeing the NII evolution suggested that the lack of apparent interest from infrastructure providers has compounded already slow progress in resolving cultural, content presentation, legal, and other barriers to use.
For example, sector-specific and broad-based standards and approaches must go hand in hand. There are hundreds, perhaps thousands, of niche information markets specialized by subject matter, corresponding to every industry sector and to every professional and skill category. Each has specialized needs for data standards and sector-specific approaches, but with rare exceptions each also requires access to general information and to information from neighboring niches or from the larger sector or domain in which it fits. Consequently, even niche markets need communication access to other parts of the economy, and niche market providers will probably use general-purpose network capabilities and standards for achieving that access. Thus, infrastructure providers may have a greater interest in domain-specific circumstances than may appear obvious.
Although the steering committee heard considerable frustration and concern from key business domains about the current difficulties of utilizing the NII, this perspective should be viewed in the context of the current large business market for telecommunications. For example, business customers account for about 60 percent of carrier revenues (see U.S. Bureau of the Census, 1995). Large companies in particular are already purchasing bit transport services at wholesale, and they frequently invest
in their own facilities. Large businesses thus clearly have access to the current information infrastructure and are starting to take advantage of it.
In contrast, one of the key concerns voiced at the forum and workshop was whether (or when) the reach of high-bandwidth networks and advanced services would extend to the home and small business, which are important locations for many of the business domains that voiced frustration with access and utility, such as health care and education. On this point, the spokespersons for the various business domains, the infrastructure providers, and the visionaries on behalf of a new consumer were all equally frustrated.
Deployment Of Infrastructure Technology
In 10 to 15 years cheaper and more powerful microprocessor-based devices, broadband connections to and from homes and businesses, and other enhancements to the fundamental technology infrastructure of the economy will be widely available for a great variety of uses. If anything has been oversold, it is timing. While emerging infrastructure is relatively simple to grasp conceptually, in implementation it is quite complex and software-intensive. Notwithstanding appropriate public policy concerns about implications of the NII for society, attempts to constrain the fundamental technologies or to push the market for them in arbitrarily conceived directions will not succeed.
Several key technology issues are driven by the gap between the requirements implied by what users increasingly seem to do and want, and the capabilities that providers offer now or will offer soon.
A majority of the concerns and uncertainties uncovered in the NII 2000 project centered on the issue of future connections to homes and small businesses—the access circuits that will permit the end user to connect to the information infrastructure and take advantage of its promise. The discussions of access circuits revealed the great potential for technological advances, but also great uncertainty about the extent to which business realities would constrain rapid exploitation of that potential. Other technology areas discussed, such as backbone capacity and access for large businesses, were much less the focus of concern.
Access to the information infrastructure today is provided by proven and relatively mature technologies—the copper wire pairs of the telephone companies, the coaxial cable of the cable industry, and terrestrial broadcast television and radio. Newer technologies include cellular telephony and direct satellite broadcast of television. The future will bring an
expanding range of technology options, including much higher bandwidths over both copper pairs and coaxial cable, and several new forms of wireless communication. Increased transmission capacity to the home will be provided by the growing use of fiber-optic cables. Even if these fibers initially reach only part of the way to the home, they can provide much increased capacity to the system as a whole. Partial enhancement of current systems with fiber is the basis for upgrading both the copper-pair and coaxial systems.
Different industry sectors appear to be responding to these circumstances in different ways. The cable television industry, with its current base of coaxial cable, seems primarily to be planning a technology upgrade of that infrastructure to a configuration called hybrid fiber coaxial cable (hybrid fiber coax; HFC), which uses fiber optics to deliver signals part of the way to the residence and uses the existing coaxial cable for the final part of the path. In contrast, the telephone industry is exploring a number of different technology paths. Some reuse the copper pairs to provide higher bandwidth (the various digital subscriber line approaches), others involve new systems similar to advanced cable television infrastructure, and still others use wireless for advanced services. These various technologies are explained and evaluated in Chapter 4.
Flexibility and Interoperability
The steering committee focused on the extent and nature of actual technology limits, given that major investment in infrastructure cannot easily be repeated. If the infrastructure has the flexibility to adapt if and when there is proven market demand, then it is less necessary to be able to predict now exactly what the future will be. Different technologies involve different fundamental limits to two-way capacity. The further fiber-optic cable reaches toward the home, the more likely it seems that two-way bandwidth can be configured as needed. In this respect, the HFC technology received considerable discussion because it is being deployed widely by essentially all of the cable companies and at least some of the telephone companies, and compared to some other options, it is somewhat more limited in its ability to support large up-channel capacity. While project participants did not agree totally on this point, the steering committee concluded that HFC technology has sufficient capacity to support exploration of the emerging market and allow new applications to be launched. It may or may not have enough capacity to sustain a fully mature market with high penetration of advanced two-way services, depending on details of the application and the way the HFC system is installed. However, the steering committee believes that if the potential of the market is once proven, investment for upgrades will be
made if necessary. It is the first steps that are the most tenuous, and HFC is adequate for them.
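The up-channel concern can be illustrated with a back-of-the-envelope calculation. All figures below are hypothetical round numbers chosen for illustration, not data from the report; actual HFC deployments of the period varied widely in channel capacity, node size, and subscriber activity:

```python
# Back-of-the-envelope sketch of shared capacity in an HFC node.
# All numbers are hypothetical illustrations, not report data.

downstream_mbps = 27.0   # e.g., one 6 MHz downstream channel
upstream_mbps = 1.5      # the up-channel is far more constrained
homes_per_node = 500     # homes sharing one fiber node

def per_home_share(total_mbps: float, homes: int, active_fraction: float) -> float:
    """Average capacity per active home when a fraction of homes transmit at once."""
    active = max(1, round(homes * active_fraction))
    return total_mbps / active

# Suppose 10 percent of homes are active simultaneously:
down = per_home_share(downstream_mbps, homes_per_node, 0.10)
up = per_home_share(upstream_mbps, homes_per_node, 0.10)
print(f"downstream per active home: {down:.2f} Mbps")
print(f"upstream per active home:   {up:.3f} Mbps")
```

Even with generous assumptions, the shared up-channel leaves each active home only a small fraction of the downstream share, which is why the steering committee's judgment hinged on whether early, asymmetric applications would suffice until upgrades were justified.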
Ideally, hardware investments will incorporate the flexibility and interoperability required to support a highly functional, evolving set of NII uses. The need for flexibility is a consequence of two expectations: regulatory change and the opening up of existing markets to competition, which together imply the need to reuse existing infrastructure for other services (e.g., cable infrastructure for telephony). The steering committee noted that increased flexibility in reuse of infrastructure for different services is an important trend in most communications technology today, fueled by factors such as the increasingly digital, as opposed to analog, transmission of information. Most of the presentations from industry acknowledged this need for flexibility, but also expressed concerns about the costs of major infrastructure upgrades.
Interoperable systems allow content providers to achieve wide dissemination without having to repackage their material for each different distribution system, and they enable end users to communicate using common standards and data representations. However, the need for interoperation does not imply that open and interoperable systems must be mandated as a part of new technology deployment. If the underlying physical infrastructure at the level of circuits and switches is engineered to support adaptability in a flexible manner, and if protocols and software can themselves evolve as markets mature, then the degree of service interoperability achieved at any given time need not be a major concern, because it can be changed later as the market requires.12
Additional Technology Concerns
Beyond the issues of access—including adequacy of bandwidth, especially in the up-channel—and interoperability, the following were other significant technological concerns identified in the project:
- The ability to give users the sense that they are always connected, as opposed to their having to undertake a time-consuming connection process before using a remote application;
- The ability to enable users to connect to a number of remote points on the network at the same time;
- The ability to support mobile users;
- The ability to provide for adequate security in the network; and
- The ability to mix different media types, in particular real-time traffic and traditional computer data.
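The last concern, mixing real-time traffic with traditional computer data, can be sketched with a toy strict-priority scheduler. This is an illustration only; real networks use more sophisticated queueing disciplines, and none of the names below come from the report:

```python
# Toy illustration of why mixing real-time and bulk traffic calls for
# scheduling support: a strict-priority queue lets delay-sensitive
# packets bypass bulk data already waiting in the queue.

import heapq
from dataclasses import dataclass, field
from itertools import count

REALTIME, BULK = 0, 1  # lower number = higher priority

@dataclass(order=True)
class Packet:
    priority: int
    seq: int                          # arrival order breaks priority ties
    label: str = field(compare=False)

queue: list[Packet] = []
seq = count()
for label, prio in [("file-chunk-1", BULK), ("voice-sample", REALTIME),
                    ("file-chunk-2", BULK), ("video-frame", REALTIME)]:
    heapq.heappush(queue, Packet(prio, next(seq), label))

# Real-time packets go out first, in arrival order, then bulk data.
order = [heapq.heappop(queue).label for _ in range(len(queue))]
print(order)  # ['voice-sample', 'video-frame', 'file-chunk-1', 'file-chunk-2']
```

Without some such mechanism, a voice sample stuck behind a large file transfer arrives too late to be useful, while the file transfer is indifferent to a few extra milliseconds of delay.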
User Interaction with Networked Infrastructure
Paralleling the importance of technology choices for network facilities—the access wires and wireless links, the cables and fibers in the ground—are the technology choices affecting how users are connected to the network, the means for humans or for computers to interact and communicate in a useful fashion.
Today, the obvious devices that are connected to the network in volume are the telephone, the television, and the personal computer. The importance of the television is its ubiquity and familiarity. Its drawbacks are its limited functionality, with low screen resolution, no useful input modes, and no computing power. The PC is more powerful and general but costs more ($2,000 is a persistent price point for current full-featured products). However, as has been pointed out numerous times, 30 percent of U.S. households have a PC today, and substantial additional penetration into the consumer space is predicted in a very few years. Essentially all new PCs will have a modem, and so a broad base of consumers will have the ability to go "on line" in some form. The PC is thus an important and growing consumer interface to the information infrastructure. Less change, in technology or use, appears to be anticipated for telephones.
How will the TV and the PC evolve over the next 5 to 7 years? Contributors to the project from the industries that supply these devices indicated that all would like to see their device evolve into a more general form capable of supporting a broader range of applications. The steering committee sees as unrealistic the suggestion that they will converge to a single multipurpose device. The differences in cost, resolution, viewing distance, and so on suggest that each will continue to play a distinct role. But there is no doubt that each will evolve so that it can to some extent play the role of the other.13
To broaden the base of access to more consumers, including those less able to bear the costs, several project participants raised the prospect of breaking the $2,000 price barrier for a home PC and producing a cheaper device (at $500 to $1,000) that can be used for limited PC applications and as an interface to on-line services and the Internet. Some of these attempts involve using the TV as a display, whereas others use a low-cost computer display or eliminate data storage or other features. Although in late 1995 this idea was translated into a variety of new-product announcements, cheaper PCs have been tried before, without much success, and there is skepticism as well as hope about the current undertakings. This area of innovation is an important one to watch during the next few years, because its success or failure could influence the options for consumer access to the network, and also because reducing the cost of access might be accomplished by an alternate approach, that is, pushing functions back
into the network, which would change the performance and cost expectations for that part of the infrastructure.
Another point discussed at the forum was how the various networks entering the home would interconnect to the various end-user devices. Today, there is no single, common point of interconnection: telephone wires connect directly to a telephone or modem, and the cable feed connects directly to a TV, set-top box, or (in the future) a data modem. Numerous proposals for making this multiplicity of connections less complex and less costly involve providing a single point of attachment for wires entering the home, and for the networks distributing information within the home. Interconnection could be accomplished through some form of enhanced set-top box or through a network interface at the side of the house. What is perhaps most important is not the name or location of the physical box, but rather which particular functions are performed in which box, and who provides and controls these various functions. No one network provider should control the consumer's access to other networks, and so any single point of entry and cross-connection should not belong to any one of the network providers. The alternative would be for the consumer to purchase and control the device.14
Public Versus Private Objectives
The competitive drive of private industry will dominate the process by which the NII evolves. Private firms will build it; their business plans must justify the investments; and competition and the desire for new markets, not pursuit of abstract visions or societal goals, will define and shape it. This reality provides the impetus that will make it happen, and at the same time triggers many fears and concerns. In this context, opinions differ considerably on whether there is an appropriate government role in advancing the NII and, if so, what it is.
Most people can agree that an ideal information infrastructure should have such qualities as extended interoperability, broad accessibility, and support for broad participation. It should allow multiple channels for many-to-many communications and information sharing as well as one-to-one (familiar today through telephony), and also one-to-many (familiar through broadcast and cable television). Progress toward that ideal is more likely if the government can set an example with its own services and help enable a consensus on a vision of the future by removing barriers to its realization. The steering committee believes that rapid progress toward a harmonious national environment of interrelated information services and capabilities would be valuable to the nation. It does not believe that a rational set of public and private services is likely to emerge from the action of market forces alone. However, the government's role is
as a partner and participant with the private sector, exercising its regulatory authority with restraint.
If private industry is to build the NII, how can the government ensure that the evolving NII meets critical societal goals, such as equitable access? This problem was not a focus of the project, but it came up on many occasions.15 There was repeated concern about citizen access to those services that emerge as important to full participation in democratic processes and about equitable access to public services.16 From this perspective, it is important to note that although consumers may in general be citizens, citizens are not all consumers, and public policy must consider the circumstances of the larger citizenry. Governmental units will play a role in helping to define what appropriate access really means and in establishing the means to assure that it is achieved.
Project contributors provided much encouragement for government to serve as a model promoter and user of the NII. Success in these endeavors may not come easily; the public sector is not known as a model of efficiency in using information systems in its delivery of services. Nevertheless, many of its activities relating to information infrastructure have been widely acknowledged as constructive (CSTB, 1994b). One-third of economic activity in the United States is in the public sector, and for this and other reasons public services will represent a substantial fraction of the development, deployment, and operating costs of the NII. The federal government, in particular, also has a valuable tool in the form of support for fundamental and applied research and development, activities that may range from university laboratory research to testbeds that explore new technologies in the context of various user domains. Research and development (R&D) can enable more capabilities, greater ease of use, and lower cost for different components of the information infrastructure. The steering committee concluded that support of R&D is one of the most effective mechanisms available to the federal government.
Through the funding of network and applications research, its own visionary use of NII capabilities to better serve the public, encouragement of public-private partnerships in specific NII development and use situations, and the convening of the involved parties to discuss the importance of keeping technological options open, the government can favorably influence the coherence of physical information infrastructure and associated services. In the eyes of many from industry, practical government leadership will involve new ways of doing business as well as pursuing such proven models as those represented by the fostering of the Internet and related technologies.
Although it drew comparatively little discussion at the 1995 forum, the international nature of communications implies that enhancements to
the NII will be deployed and used in the context of an evolving global information infrastructure. Governments have special responsibilities in the arena of international negotiations, which can affect interconnection arrangements, standards setting, markets for content, and other dimensions of the NII marketplace. Public policy decisions in the United States will affect primarily the growth of U.S. information infrastructure, but in a context where mistakes can shift the opportunity to European and Asian companies and markets.
Despite the many challenges confronting the private and public sectors in advancing the information infrastructure, it is important to remember that the problems discussed in this report are problems of success, not of failure.
Organization Of This Report
This report is divided into chapters that present inputs collected from the NII 2000 project's many contributors. It serves partly to report what those inputs were—emphasizing, as requested, the views of contributors from a variety of industries. It also provides interpretation and commentary, including observations on what the steering committee did not hear and conclusions drawn by the steering committee from its deliberations.
The chapters complement each other, but they also present some similar material from different perspectives. Chapter 2 focuses on user problems and needs, addressing both organized users (e.g., users within different industries or application domains) and more independent users operating out of households or small businesses. Chapter 3 considers determinants of infrastructure supply, examining influences on the nature and rate of investment and alternatives for cost recovery or revenue generation. Chapters 4 and 5 examine key technologies being deployed: Chapter 4 describes different kinds of technologies, while Chapter 5 is a compilation of information generated through this project and published sources on actual deployment levels and forecasts. Chapter 5 is the closest the steering committee could come to providing the elusive "road map." Chapter 6 examines the role of government in shaping deployment; it is not a comprehensive discussion of information infrastructure policy making, but rather a consideration of the interaction between public and private actions intended to foster and shape the information infrastructure.
The chapters are complemented by seven appendixes with information relating to the information collection and synthesis processes that shaped the report. A forthcoming second volume will contain the white papers that are the basis of much of this report and that are quoted from and referred to throughout.
The administration's Agenda for Action (IITF, 1993) describes a "seamless web" of universally accessible information services and interactive capabilities as its vision of the NII.
Although business decisions are at issue, there is notable unevenness in the underlying economic conceptualization. In some cases the market model is relatively easy to understand, but we simply do not know which technologies will win. The combination of high levels of infrastructure investment, low marginal costs for use, and competing suppliers is one for which economic theory provides only limited and ambiguous insight, which is one reason that predicting the future is difficult. Besides not being able to know the technological winners in advance, we also do not know how the market will behave, whatever the winning technologies. These issues are examined in Chapter 3.
See CSTB (1995c).
This resembles the de facto standards development that prevails in most of the rest of the computer industry; it can occur rapidly, and it fosters competition.
An interesting indicator is the relatively high incidence of carrier partnerships or initiatives to deliver information.
Today Cisco Systems exports almost all of the routers for the Internet worldwide, and most of the supporting software for TCP/IP comes from the United States.
If, however, custom network interfaces for the different networks that enter the home have to be attached to such a device, then its final form, if it matures, will be in two parts, with the basic unit belonging to the consumer, and featuring attachment sockets or slots into which different network interfaces can be inserted. Such an architecture has not been developed or accepted for a consumer network interface yet, although it has been discussed. However, it bears some resemblance to a PC, which has a bus or external interface into which network cards are traditionally inserted. This raises the possibility that a PC could become a consumer network cross-connection device, a proposal that generates both interest and skepticism, depending on which industry sector is speaking. See Chapter 2.
See CSTB (1994b and 1995b) and their references for fuller consideration. Anderson et al. (1995) provide a thoughtful examination of electronic mail as an initial focus for universal access.
There are areas of application, in health, education, transportation, and library services, for example, where government has functional responsibilities that could be better served by full use of information and communications services.