The Future NII/GII: Views of Interexchange Carriers
We are not starting a new national information infrastructure/global information infrastructure (NII/GII) from a blank page; we are building on an information infrastructure, corporate structures, regulatory practices, billing practices, services, and public expectations that already exist. So the challenge is not so much "what is the ideal world that we would write down on a blank page?" but rather "how do we get there, starting with today's realities?" This paper presents the views of the interexchange carrier community as to the future NII/GII potentials, the stones we must step on to cross that tricky creek from "now" to "then," and the stones we could trip on and end up all wet. Our principal emphasis is on how to achieve fair and effective competition, utilizing the regulatory process as necessary to help us get from "now" to "then," and assuring fair access to networks and services for all Americans: individuals, businesses, governments, educational institutions, and other entities.
This paper begins with some postulates upon which we think we can all agree and then discusses where those postulates lead us, and how we can actually accomplish the things that our assumptions imply.
Postulate 1: There already exists a "national/global information infrastructure." It has existed, in some degree, since telegraphy became widely implemented. But now we're facing enormous enhancements: The focus in the past has been largely on the "communications" aspects of an NII/GII: getting voice or data (information) from one place to another. The future NII/GII will broaden to include vast improvements in creation, storage, searching, and access to information, independent of its location. To achieve these improvements will require sophisticated customer-premises equipment and skilled users. The government may find it appropriate to assist in developing user skills and providing customers' equipment; but the associated costs must not be imposed on the telecommunications industry.
Postulate 2: It is economically feasible, in many instances, for there to be competition in the form of competing local telecommunications facilities, as well as services. But there will surely be some locations in which there is only one local facilities provider. Furthermore, even if there are multiple local facilities, there may be only one active facility connection to a given home or office. In that case, will the multiple competing providers of services of all kinds have open and fairly priced access to that same end-link, so that any service provider can have fair access to customers? Our postulate: Such bidirectional open access can be achieved by facilities providers' unbundling their services and providing them for resale on a fair basis; and effective competition can bring about those results. But it will be a gradual transition process from regulation of monopolies to effective market competition, varying in speed of accomplishment in different locations. The associated gradual relaxation of regulation must be done carefully, in such a way that monopolies are freed from regulation in proportion to the reality of effective competition.

NOTE: Since the drafting of this paper, Mr. Clifford has moved to DynCorp, Advanced Technology Services, as vice president for engineering and technology.
Postulate 3: A fundamental difference between the information superhighway and certain other widely used facilities such as the interstate highway system is in the mechanism for paying for their use. If you pay for something out of the public purse, and give it away for free or for a flat-rate charge, you create the potential for enormous waste of that resource if excessive use of that resource is of some benefit to the users. In the case of asphalt highways, that turns out not to be much of a problem: a person can "waste" the resource only by spending time driving on the roads. But in the case of the NII, one could send megabyte files to thousands of folks who don't want them, just by pressing an <ENTER> key, if the service were free. So Postulate 3 says that although there will be specific instances of flat-rate billing for services, usage-based billing will be widely used, to limit wasteful use of the resources and to correlate costs with benefits.
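The incentive at stake in Postulate 3 can be put in numbers. The sketch below compares the sender's marginal price under flat-rate versus usage-based billing; the file size, recipient count, and per-megabyte rate are all assumed values chosen purely for illustration, not actual tariffs.

```python
# Back-of-the-envelope illustration of Postulate 3. All figures are
# assumptions for illustration, not actual tariffs.
file_mb = 1                          # size of the file attached to one message
recipients = 1_000                   # addressees reached by one press of <ENTER>
mb_carried = file_mb * recipients    # 1,000 MB of traffic from a single keypress

usage_cents_per_mb = 2               # hypothetical usage-based rate
usage_charge = mb_carried * usage_cents_per_mb  # 2,000 cents borne by the sender

# Under flat-rate or free service, the sender's marginal price for those
# 1,000 MB is zero, so nothing discourages the wasteful multiplication.
```

The point is not the particular numbers but the coupling: usage-based billing makes the sender bear a cost proportional to the network resources consumed.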
Postulate 4: As often stated by federal representatives, the NII/GII will be built by private-sector investments, not by governments. That clearly does not exclude governments from building facilities for their own use when specific needs cannot be met by private-sector providers, or from exercising their regulatory power to assure fair competition and to achieve other public interest goals, or from having their purchasing power influence the direction and speed of development and implementation. The purchasing power of the top 10 to 15 federal agencies is comparable to that of the Fortune 500 companies, so that power can surely influence the marketplace. But governments won't build the basic infrastructure.
Postulate 5: There are, however, public-interest goals that governments must participate in achieving. Aside from relaxing current laws and regulations that prevent competition in networks and services, we postulate that a major governmental role will be in assuring implementation of what we prefer to think of as the "successor to universal service." The successor to universal service could take either of two quite different forms, or some combination of the two. It could mean simply equitable access, meaning that the NII should be sufficiently ubiquitous that anyone can have access to the network at equitable rates and at some accessible location. Or it could be an advanced universal service, which would enable qualified persons or entities to use selected services available via the NII. In either case, any subsidy should be targeted to the needs of end users, not to specific providers. All providers must be able to compete for the end user with equal access to the subsidy funds or vouchers.
In whatever combination evolves, the mechanisms for funding and distributing the subsidy pool will be different from today's mechanisms, with multiple network and service providers being involved instead of today's situation with monopoly providers of networks and services. The mechanisms for creating any subsidy pools that may be required must be fair and equitable to all of the contributors to those pools. Contributors could be network and service providers, but could also be governments, using income from some form of taxes. Clearly, some level of regulation will be required during the transition from today's "universal service" to tomorrow's version of that concept.
Postulate 6: The anticipated vast increase in use of the GII for personal and business purposes offers potential for enormous compromises of security (of both personal and business confidential information), personal privacy, and protection of intellectual property, unless preventive measures are implemented. And since people don't want their privacy invaded and businesses simply cannot afford to have their proprietary information exposed to others, people and businesses will adopt encryption and/or other mechanisms to prevent such intrusions and exposures. Government and law enforcement agencies must recognize that reality and must not unduly restrict the use and worldwide trade of encryption technologies.
The Vision: What is the NII/GII?
We begin with a high-level description of what we are expecting will evolve and then discuss what must be done to get there. One of the most concise definitions of an NII/GII is found in the 1994 report Putting the Information Infrastructure to Work, from the National Institute of Standards and Technology, "the facilities and services that enable efficient creation and diffusion of useful information."
The concept of NII/GII obviously encompasses a "network of networks," including telephone networks, cable TV networks, the Internet, satellites, and wireless-access networks, both domestic and foreign. We also suggest that the broad concept of NII/GII includes not just these basic networks, but also the end equipment, the information resources, and the human beings that interact with the equipment, networks, services, and resources.
The GII should and, we believe, will provide at least the following high-level functions:
basic concept of protection of intellectual property, when that property is available electronically, will evolve in an unpredictable fashion over time, and that (3) protection mechanisms will more and more be confined to commercial use of intellectual property, rather than simply forbidding the copying of such information.
The Role of Interexchange Carriers
Today, the basic role of interexchange carriers (IXCs) is carriage of narrowband and wideband transmissions over "large" distances, where the difference between "large" and "small" distance is defined by regulators and lawmakers, and the term "large distances" certainly includes global delivery.
In addition to simply providing those services and dramatically lowering their costs to users as a result of competition, IXCs have been instrumental in the development and implementation of new telecommunication technologies and services. These include implementation of digital transmission techniques, single-mode optical fiber, synchronous digital techniques, and "intelligent network" services such as "800" service, which not only provides for the called party to pay, but also allows for such flexible services as time-of-day routing and routing on the basis of the location of the caller. The Internet backbone is, of course, an example of a major IXC role. This is not simply to brag about past achievements, but to point out that IXCs have competed in the past and will continue to compete in the future with each other and with other telecommunications entities. In that process, they will push development and implementation of new technologies and services as hard and as fast as they can.
In tomorrow's GII, the terms "local exchange carrier" (LEC) and "interexchange carrier" (IXC) will become less used. Yes, there will be companies that concentrate more on building and selling local services, and companies that concentrate more on transcontinental and global networks. And both of those types of companies will provide at least some of their services using network capacities leased from the other types of companies. But the customer/user will hardly care about the details of what network is used; the user will care about the nature, availability, reliability, and price of the services provided.
As the conventional boundaries between local and long distance companies eventually disappear in the eyes of consumers, it will become the strategy of many retail providers to offer end-to-end services that ignore this traditional distinction. The critical element is the introduction of "wholesale" local exchange services that long distance companies can resell as part of their end-to-end packages. Wholesale long distance products are already available for local telephone companies to resell in combination with their local services to provide end-to-end service (to the extent that such resale is not currently restricted by the Modified Final Judgment). With a limited number of local networks expected to develop, it is especially important that a comparable wholesale local service be introduced to foster competitive diversity.
But that's the long view. How do we get there from here? How do today's IXCs play in that game? Here are our basic views:
A discussion of architecture is of course closely linked to the above discussion of the future role of today's IXCs. They will clearly have a key role in developing and employing tomorrow's network architecture; however, it may differ from today's.
The requirements of such services as "find me anywhere, anytime," sophisticated and flexible billing services, call routing dependent on time or day, store and forward, messaging, and others that we have yet to imagine will clearly require more and more sophisticated network intelligence and functionality. IXCs have already made great progress in these areas in the last decade or so; but the demands of video/multimedia services and increasing demands for flexibility and value-added features will require considerable expansion of "intelligent network" functionality. There is no doubt that such functionality will be implemented. And when it is implemented in any bottleneck portion of the network, all providers must have access to that monopoly functionality.
The architecture for tomorrow's long distance transport will not be all that different from today's, from a functional point of view. Clearly it will have even more capacity than today's already high-capacity two-way high-quality digital networks. Network protection and restoration capabilities will be enhanced. Newer multiplexing technologies such as SONET, and packet technologies such as frame relay and ATM, will be employed to provide ever faster and easier ways to insert and pick off individual transmissions from high-bit-rate bulk transmissions and to increase reliability and lower costs for certain services. And mechanisms to protect the privacy and security of users' information will surely be widely incorporated into the networks.
Some have asked whether the interexchange networks will have the transmission capacity that will be needed if data traffic continues to grow at an exponential rate. Data traffic could, it is said, equal the volume of today's voice traffic in a few years, requiring a doubling of total network capacity. It is our view that capacity simply is not a problem in the long-distance networks, although it clearly is a problem in the "last mile." Long-distance networks are already almost entirely optical fiber. The capacity of a given pair of those fibers has been doubling roughly every three years since 1983, based only on changes in the electronic and photonic equipment attached to the fiber, not changes in the buried fiber itself. The bit rates that can be shipped through a given fiber, using a given optical wavelength, have increased from 405 megabits per second in 1983 to 9,600 megabits per second, with an even bigger leap expected in 1996.
Further, wavelength-division multiplexing (WDM) is becoming practical, so that a given fiber can carry two, or eventually even four or more, signals, each transporting 9,600 (or more) megabits per second. So as early as 1996 or 1997, using four-window WDM, it could be feasible for a given fiber pair to carry 38,400 megabits per second, compared to the 405 megabits per second available as recently as 1983. In terms of the conventional 64-kilobit-per-second circuits, the ratios are not quite the same because of overheads and other technical factors, but the result is just as impressive: a fiber pair in 1996 or 1997 could be carrying 616,096 circuits, compared with the 6,048 circuits that that same glass could carry in 1983, an increase by a factor of almost 102! Capacity is simply not a problem in the long-distance network for the foreseeable future.
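The multiplication behind these figures is easy to verify. The following sketch simply recomputes the numbers cited above (405 Mb/s and 6,048 circuits per fiber pair in 1983; 9,600 Mb/s per wavelength and 616,096 circuits with four-window WDM):

```python
# Recompute the fiber-capacity figures cited in the text.
mbps_1983 = 405            # Mb/s per fiber pair, 1983
mbps_per_window = 9_600    # Mb/s per optical wavelength, mid-1990s
windows = 4                # four-window WDM

mbps_wdm = mbps_per_window * windows   # 38,400 Mb/s per fiber pair

circuits_1983 = 6_048      # 64-kb/s voice circuits per fiber pair, 1983
circuits_1996 = 616_096    # projected circuits per fiber pair, 1996-97
ratio = circuits_1996 / circuits_1983  # roughly 102-fold growth in 13-14 years
```

Note that the circuit ratio (about 102) is slightly larger than the raw bit-rate ratio (38,400/405, about 95) because of the overheads and framing differences the text mentions.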
Clearly, today's IXCs are moving toward providing end-to-end services. The reasons for this include the desire of many customers, large and small, to deal with a single service provider, and the need for uniform protocols and services, end to end. The move to end-to-end services is also driven by the interest of both IXCs and their customers in cutting the costs of the last-mile links, which are now such a large portion of the costs of providing long-distance telecommunications and are priced significantly higher than actual costs. IXCs pay access charges to LECs, on a per-minute-of-use basis, for calls delivered or originated by the LECs. Those access charges amount to approximately 45 percent of the gross revenues of IXCs. Various estimates made by LECs indicate that the actual cost of local access is about a penny a minute or less. But the access charge for interexchange carriers is three (or more) cents per minute at each end of the link. Competition will help eliminate these excess non-cost-based access charges now imposed on IXCs and, finally, on end users.
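The access-charge arithmetic can be made concrete with a small sketch. The per-minute figures below are the approximate ones cited above and should be read as illustrative, not as any carrier's actual rates.

```python
# Illustrative per-minute access economics for an interexchange call,
# using the approximate figures cited in the text (cents per minute).
access_charge_per_end = 3.0   # what an IXC pays the LEC at each end of a call
actual_cost_per_end = 1.0     # LECs' own estimated actual cost of access
ends = 2                      # originating end plus terminating end

total_access = access_charge_per_end * ends                    # 6 cents/minute
excess = (access_charge_per_end - actual_cost_per_end) * ends  # 4 cents/minute above cost
```

On these assumed figures, two-thirds of what the IXC pays for local access at each end is markup over the LECs' own cost estimates, which is the gap the text argues competition or regulation must close.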
There are two potential mechanisms for cutting those last-mile costsand they are not mutually exclusive. One is for regulators to require that the access charges be cost based. The second is the introduction of effective competition in the local networks, which cannot develop if the new entrants are forced to pay a non-cost-based access charge to keep the existing monopolist whole. It should also be noted that there is potentially a structural barrier that could dilute the effectiveness of access competition: Obviously, the "access business" of the interexchange carrier must go to whichever network the customer has selected to provide its "last-mile" connection. An IXC's leverage to bargain for lower access prices would be limited by the IXC's ability to influence the subscriber's choice of local network provider. Therefore, until and unless either regulation or effective local competition causes the local network providers to offer open interconnection to all interexchange service providers, on efficient and competitively priced local networks, there could still be limitations to the ability of every service provider to reach every end user, with fair access costs. Because we expect that these local network providers will also be offering their own line of retail local and long-distance services, it is not clear that these network owners will have a great deal of interest in reducing their rivals' costs. Until such time as true
competition removes such barriers, regulation will be needed to assure that access to the end user by unaffiliated retail providers is fully available and priced fairly. One possible regulatory approach is to require that new entrants provide equal access to last-mile links, with rates capped at the rates of the incumbent provider.
We further note that simply the introduction of competition in local networks will not be fully effective in bringing down the last-mile costs if existing LECs maintain a dominant position in service provision and are allowed to continue charging non-cost-based rates for carrying services that originate or terminate on competitors' networks. And even if the new entrants build networks that are significantly more efficient and less costly than existing LECs' networks, those reduced costs cannot be fully passed through to end users if the LEC networks remain inefficient and those inefficiency costs are passed through to those new-entrant networks when the dominant LECs provide the last-mile carriage. Existing LECs will have minimal motivation to become more efficient and lower their actual carriage costs as long as they hold the dominant market share so that they can charge their smaller competitors the LECs' full last-mile costs.
Some argue that the access charge income, over and above actual cost, is used to support "universal service." However, studies indicate that the excess access charges, over and above actual cost, are far greater than what is needed to support universal service. A study by Monson and Rohlfs 1 concludes that IXCs and other ratepayers are charged about $20 billion per year over and above the actual costs of providing the access services needed. And a study by Hatfield Associates 2 estimates that the actual subsidy that LECs need to support universal service is $3.9 billion. The excess income goes to support LEC network inefficiencies, domestic and foreign investments by LECs, and other LEC activities in which neither IXCs nor the public at large have interests.
Our bottom-line suggestion here is that competition in the local communications market (which will include not only today's LECs and IXCs but also today's cable TV companies and perhaps local utility companies that have access to rights-of-way) could go even further toward providing low-cost access than does the current subsidy by means of access fees. But to achieve that goal will require some mechanisms for motivating existing dominant LECs to improve their network efficiencies and lower their costs and therefore their access charges.
Numbering plans will clearly change dramatically in the future, as individuals demand multiple numbers for their own individual use and as the population grows. Already, the use of 10-digit numbers is required for more and more local calls, as both overlays and geographic splits are implemented because of exhaustion of numbers in a given area code. The demand for nongeographic "find-me-anywhere" numbers, and for separate numbers for home phones, wireless phones, office phones, fax machines, and other special services, will surely make our grandkids gawk when we tell them we used to be able to make phone calls with only 7 digits! (Some of us remember using only 4 or 5 digits, but never mind.…)
Enhanced "Phone Calls"
There's debate about whether everyone will want a "personal number for life," so that the same number will ring her or his phone as long as she or he is around. For example, there's some benefit in a West Coaster knowing that 703 is in Virginia, and it might not be polite to ring that number at 10:00 PM Pacific Time. A geo-independent number would not give the caller such a hint. But clearly there's a market for such geographically independent and permanent numbers. The IXCs, responding to their competitive market environment, have put in place intelligent network capabilities nationwide and therefore are likely to be in the forefront of making such a service practical.
"Find-me-anywhere" service may or may not be tied to the "number-for-life" idea. The number for life may or may not be set up to follow the called party wherever s/he tells it to, on a real-time basis. But clearly such a tie could be executed and would be popular. For the same reasons as above, today's IXCs are in a fine position to pioneer such service, as they are now doing.
We already have messaging services, in the form of remotely reachable answering machines as well as services that can be provided by both local and long-distance carriers. As more and more people use the personal number and find-me-anywhere services, there will be more and more need to cut off such access when people go to bed at night, say, six time zones away from where their callers expect them to be. So we expect messaging services to grow rapidly in popularity, for this as well as other obvious reasons of convenience and reachability. There will be services offering messages of limited length (short-message services) as well as the ability to leave longer messages.
Aside from whatever multimedia services today's IXCs themselves provide, it is likely that one of their future roles will be in providing transparent interfaces between other service suppliers and end users, when the users and the service suppliers use different equipment or protocols for storage or transport. Also, IXCs will provide billing and security services associated with multimedia services for which they themselves are not the originating providers.
Do users care whether the connection network they are using is analog or digital, voice or "data"? No, they just want to talk, or have their screen filled with pictures or thought-provoking words or good graphic diagrams. It is the network providers, spectrum managers, equipment builders, and capacity theorists like Claude Shannon who notice and care about the differences. But in practice, there are clearly lots of differences, in terms of efficient use of network and spectrum capacities, flexible services, accuracy of data to be passed from one place to another, and ability to protect privacy. And since the entire IXC network is going digital anyway, for technical and cost and quality reasons, whether the transmission is voice or data, the result has huge benefits in terms of cost, speed, and reliability of the data transmissions that will be so important in tomorrow's global economy and lifestyle.
In this case, one could argue that there is no special role for IXCs compared to local providers, since we are all going digital in the end and will therefore eventually be able to provide the needed speed and reliability. But from the practical point of view there is indeed a major role for today's IXCs, and that is back to the point of providing competition and therefore much faster progress in building the digital high-bit-rate, last-mile networks that will be so important to customers who need the features that can be provided with broadband two-way digital networks.
Financial and Other Transaction Processing
The United States led the world in implementation of credit cards and debit cards, to replace paper cash and checks in many financial transactions. But now, Europe is clearly ahead of the United States in its implementation of smart-card technology and services, to take advantage of the enormous capability and
flexibility of cards with computer chips built in. We are not at all out of the running; we do know about computer chips, and there is growing interest here. But at the moment European entities are in the lead in the implementation of smart-card technologies. IXCs will surely not be the only players in this game; but they do have an enormous opportunity to incorporate smart-card technology into many of their service offerings, and thereby bring the United States back into a leadership role in this new technology and service platform.
New Billing Mechanisms
Billing, as has been hinted at above, will be an interesting issue in the future GII. We are rapidly learningwith the help of smart cards and other technologieshow to reliably establish that someone who wants to use a service is indeed the person she or he claims to be and will pay the bill when it comes. But we do have the fascinating challenge of learning how to bill proportionately to either the use or the value of "bursty" digital bit-streams. If a user is hooked up to the network for an hour and hits the <ENTER> key only once, does she or he get billed for 1 hour? Or for sending the 100 or the 100,000 packets that resulted from the touch of the <ENTER> key? Obviously, the provider does not bill separately for each packet, the way it writes a separate line on today's bill for each phone call. Is there a way to make some kind of average or statistical count of the number of packets or bits that flow to or from a given user and bill on that basis? Of course there is. The question is how it will be done to be understandable and fair to both the user and the provider, and how it can be confirmed by either party. Again, this will not be the role of long-distance carriers alone but will evolve on the basis of trials by all the carriers competing in both local and global services.
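One way the statistical counting suggested above might work is deterministic 1-in-N packet sampling: count every Nth packet, scale the sample back up, and bill per estimated megabyte. The sketch below is purely illustrative; the sampling interval and the per-megabyte rate are assumptions, not any provider's actual practice.

```python
# A minimal sketch of usage-based billing by statistical sampling.
# The sampling interval and rate are assumptions for illustration only.
def sampled_usage_bill(packet_sizes, sample_every=100, cents_per_megabyte=2.0):
    """Estimate a usage bill (in cents) from a 1-in-N sample of packet sizes (bytes)."""
    sampled_bytes = sum(packet_sizes[::sample_every])  # deterministic 1-in-N sample
    estimated_bytes = sampled_bytes * sample_every     # scale the sample back up
    return estimated_bytes / 1_000_000 * cents_per_megabyte
```

For example, a session of 10,000 packets of 1,000 bytes each would be estimated at 10 megabytes and billed at 20 cents. A deterministic sample like this has the property the text asks for: because the counting rule is fixed and public, either party can re-run it and confirm the bill.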
The Successor to "Universal Service": Access to the Network
As suggested in Postulate 5, above, the current concept of "universal service," in the sense of having essentially everyone be reachable by phone, will surely evolve as technologies and new markets and services evolve. There is no public interest in everyone's having access to every single "service" that the NII/GII will provide, any more than there is a public interest in everyone's driving a Rolls Royce. Clearly it is in the public interest for everyone to have at least some access to the GII, just as it is in the public interest for all of us to have access to streets and roads and telephones. But with the dramatic expansion of the types of services that will arise on the GII, we must learn to be more specific about who gets subsidized for use of what services, and how that cross-subsidy can be managed in a competitive structure. This seems clearly to be a case where all the service providers, as well as consumers and the public at large, have a deep interest in designing and executing mechanisms to provide "equitable access" and/or "advanced universal service" in a way that gives all of us the benefits of achieving those societal goals, as they may eventually be defined. Today's IXCs are as anxious as anyone to have the networks and services provided in such a way that students, workers, and all others have access to the GII, although there will clearly be services for which a subsidy to users is totally inappropriate. But we do insist that the contributions to any cross-subsidies that are required must be managed in such a way that the purchasers of any given service or network access are not unfairly required to subsidize some other unrelated service or access. A cross-service subsidy system would not only unfairly tax the users of a particular service; it could even prevent the provision of some desirable service or prevent the survival of a potentially valuable service or network provider.
Fraud was not too great a problem when each subscriber was permanently connected by hard wires to a given port on a given switch, so that it could reliably be established what subscriber made what phone calls. But as the cellular industry has dramatically demonstrated, once the subscriber is no longer so readily identified, fraud skyrockets. Now, however, based principally on smart-card technology but also potentially making use of other technologies such as fingerprints, voiceprints, and other biometrics, we know very reliable ways to uniquely
identify the person or entity utilizing the network or network services. The big question here is not whether fraud can be effectively prevented, but rather what balance we should settle upon, between fraud prevention and the customer's convenience and privacy. Wethe IXCs as well as wireless and other service providerswill have to gain some experience in order to reach this balance. But there is no doubt that a balance appropriate to both customers and the business world can be reached.
In a marketplace situation truly like Adam Smith's, there would be essentially no need for regulation of telecommunications services. But that is not where we now sit. During the transition from monopoly telecommunications markets to effective competition, a new regulatory framework is needed. That framework must simultaneously grant the incumbent LECs the flexibility to respond to competition in an appropriate fashion and at the same time protect consumers and potential competitors from anticompetitive abuse of the LECs' substantial remaining monopoly power. We must move incrementally from the monopoly-based assumptions and regulations with which we have lived for many decades to as close to the free-market situation as we can practically get. The goals to be achieved include the following:
Why can't the marketplace settle these issues? Let us look at them one at a time.
A Fair Chance for All Providers
After living in a local monopoly environment for many decades, and building local infrastructure with many billions of dollars, and establishing rights of way that are difficult and expensive to duplicate, there is no way to simply drop all local regulation and declare that everybody has a fair chance at the market. One of Adam Smith's postulates to describe a "market" was that there should be easy (equitable) entry and exit from the market. Over the long term, that could be accomplished in local telecommunications, especially with the opportunities provided by wireless transport. But it cannot be done as quickly and easily as renting a building and starting a new restaurant. So we do have a challenge here, to provide a fair chance for all providers, at least for the foreseeable future, under a regulatory framework that simulates competitive marketplace conditions.
The existing local bottlenecks not only give the LECs potential leverage in providing local services; they also provide significant unwarranted leverage to regional Bell operating companies (RBOCs) in provision of long-distance services, if RBOCs are permitted to provide long-distance services before those bottlenecks are effectively eliminated. At least in the short run, those local bottlenecks will continue to exist. Indeed, there may be some network functions for which competition is infeasible for the indefinite future.
Equitable access, one of the successors to "universal service," has been discussed above. But clearly it must be in this list also, as it will require some form of industry and public agreement as to what is required and how to achieve it in a manner that is fair and equitable to all players, providers and customers alike. That agreement could come by "regulation," in the usual sense of having it imposed by government, or by mutual agreement among the parties to the process. It remains to be seen what balance of these two approaches will succeed in this case, which is far more complex than the case of plain black phones. A major challenge will be to achieve a reasonable level of access for essentially everyone, without having to provide such a high level of subsidy that valuable services are priced out of the market and multiple providers cannot survive in that market.
An equally important aspect of equitable access is, of course, access by competitors to bottleneck functionalities, whether those bottlenecks are owned by other competitors or by entities that are not direct competitors but that have the ability to favor some competitors over others by means of their bottleneck functionalities.
The management of radio spectrum is one instance in which we believe Adam Smith would agree that there is no practical marketplace. Mr. Smith wisely postulated that in a proper marketplace both the buyer and the seller must be able to understand what they are buying and selling. The radio spectrum is just too complex to be clearly defined in a market situation. I can buy a shirt from you, and we know who owns what. You can buy an acre of land from me, and we can draw clear lines around it. Those are simple two- or three-dimensional objects, which we can clearly define. But how many dimensions does the radio spectrum have? One dimension of frequency; three dimensions of space; one dimension of time, which could be measured in days, weeks, hours, or nanoseconds; one dimension of power; a few (not clear how many) dimensions of modulation technique. We could argue all day about how to count the number of dimensions, or variables, that it takes to describe the radio spectrum. But it is clear that in general the spectrum is too complex an object to be readily handled in a marketplace, especially since it is not easy to establish who might be illegitimately using some piece of it at a given moment.
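The difficulty of drawing "clear lines" around a piece of spectrum can be illustrated with a small sketch. The structure below is purely our own illustrative model, not any official license format; the field names and the choice of nine variables are assumptions made for the example, and even this model omits antenna patterns, out-of-band emissions, and the internal variables of the modulation scheme:

```python
from dataclasses import dataclass, fields

@dataclass
class SpectrumUse:
    """Hypothetical, simplified description of one use of radio spectrum.

    An acre of land needs two or three coordinates; even a toy model of
    a spectrum use needs many more independent variables.
    """
    center_freq_mhz: float   # one dimension of frequency
    bandwidth_mhz: float     # width of the frequency band occupied
    latitude_deg: float      # three dimensions of space
    longitude_deg: float
    altitude_m: float
    start_time_s: float      # one dimension of time
    duration_s: float        # could be days or nanoseconds
    power_dbm: float         # one dimension of power
    modulation: str          # itself hides several more variables

# Counting the explicit variables in even this simplified model:
n_variables = len(fields(SpectrumUse))
print(n_variables)  # 9 variables, before the details of modulation
```

The point of the sketch is not the particular count but that any tradable definition of "a piece of spectrum" must pin down many interacting variables at once, which is why verifying who is legitimately using what, at a given moment, is so much harder than surveying an acre of land.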
Therefore, we must continue to have some governmental management of spectrum use. There are some specific cases, as recently demonstrated by the personal communications services (PCS) license auctions, where a marketplace can be created. But even there, it was not "spectrum" that was being bought and sold in the marketplace. The process was first to create radio licenses, using the conventional engineering and policy process; then the spectrum managers determined that there was no significant public interest in who (among the qualified applicants) held the licenses; and only after that determination was made could the spectrum manager in good faith put those licenses (not the "spectrum") up for bids. Surely, future spectrum managers could broaden the specifications of particular licenses, so that the license holders could have more flexibility in the services provided. But the basic spectrum-management process must be maintained.
As we develop into a truly global economy, with global transactions taking place by the millions every hour, we have more and more need for global interoperability of telecommunications systems: not necessarily globally identical systems, but surely systems that can readily talk to each other. The current problems associated with the new wireless communications systems are a fine example. The Global System for Mobile Communications (GSM) standards are being implemented in most of the nations of the world. If they were implemented everywhere, a person could carry a smart card, called a subscriber identity module (SIM) in this particular case, wherever she or he went; rent a phone at the airport if the wireless frequencies happened to be different from those at home; insert the SIM into the phone; and have access to the same account and many or all of the services available at home. But right now there is no certainty that such interoperability will be achieved in the United States.
The question is, Should there be international regulations to impose standards that would enforce full interoperability? We believe the answer is no, at least in the case of equipment, although international standards are certainly appropriate in certain instances of spectrum management. We suggest that it is up to the providers to decide whether to adopt voluntary standards and be compatible, or to bet on some allegedly better technology and either win or lose in the marketplace. In any case, it seems highly unlikely that the world is ready right now to have an international standards or regulatory body make mandatory equipment standards and enforce them across national borders. There is a clear need for some level of international spectrum management, but no such clear need, or practicality, for international regulation of how internal telecommunications issues are resolved within nations.
The federal government's roles as regulator and spectrum manager have been addressed above. Beyond those issues, we must recognize that the federal government has significant potential influence as a major purchaser of telecommunications services. That purchasing power could be used, intentionally or not, to influence the direction and speed of development of the NII. We urge the federal agencies to be conscious of this potential, but we insist that the specific needs of specific agencies must not be distorted in order to fit some effort to use that purchasing power to influence network development.
Another major step that must be taken by the federal government, to affect not only telecommunications but also many other aspects of U.S. business and its global competitiveness, is to relax or eliminate the current restrictions on export of encryption hardware and software. Encryption technology already exists globally, and so the current restrictions have little or no long-term effect on national security but have a major effect on U.S. manufacturers and NII/GII participants.
We also recognize major concerns with the roles of state and local governments. There are significant potential barriers in that domain simply because of the structure of local regulation. There are, for example, over 30,000 local franchising agencies in the United States that exercise some control over cable TV systems. Obviously, such a structure could give rise to major problems of compatibility and implementation for networks and services that will be far more complex than today's cable television.
The Next 5 to 7 Years?
It is our position that how far we can advance toward the long-term NII/GII goals in the next 5 to 7 years is primarily dependent on how well we do at the incremental process of implementing competition in the local telecommunications marketplace. If we do start implementing that competition quickly, at the state and local levels as well as at the federal level, then we can expect several fundamental changes within that time period:
The ultimate NII will take longer than 7 years. We will not have fiber to every single home and office in 7 years. But we are confident that, with proper leadership and cooperation at all levels of government, and in law and regulation, we can move so far toward those goals that we would never want to return to the situation of 1995.