Where Is the Business Case?
Factors Shaping Investment In Information Infrastructure
Perceptions of business opportunities pace the deployment of information infrastructure—how much money can be made, with what investment and risk, over what period of time? In and around 1993, when the administration released its National Information Infrastructure: Agenda for Action (IITF, 1993), private and public discourse seemed to reflect greater confidence about directions and opportunities. In 1996, however, the inputs to the NII 2000 project suggest that the only aspect of the business case for deployment on which there is broad consensus is its uncertainty, compounded by a focus on the associated risk.1
The current uncertainty is accentuated by a combination of ongoing or anticipated transitions in technology (see Chapters 4 and 5), regulation and public policy, and industrial organization. Key technology transitions include the shift in computing power and functionality from the center to the edge of networks; the separation of services from the physical infrastructure; and the shift from analog to digital storage, transmission, and processing. The key public policy shifts relate to relaxation of regulation and other changes in competition policy, plus associated business incentives (e.g., permissible depreciation schedules). Supply and demand are complicated by regulation, which variously (1) affects who can enter which markets, the cost of interconnection, capital formation, and product and service pricing in some but not all segments of the national information infrastructure (NII); (2) has different implications for incumbent and new-entrant enterprises; and (3) motivates some investment decisions based on anticipation that the framework will change. Shifts in industrial organization draw on both technology and policy trends and are fed by the declining costs of bandwidth, increasing consumer investment in infrastructure (cable, end nodes, information services), and surging commercial interest and activity in the Internet.
Among market participants and analysts, there is agreement that markets for information and communications goods and services will grow continuously and that there is a variety of markets, with many kinds of investments, costs, revenue generation models, and buying patterns. There is also recognition that market growth is a function of cumulating purchases and uses that, in turn, build on technical foundations (e.g., expanding price-performance of microprocessor-based devices) and behavioral ones (e.g., growing comfort and sophistication in the use of personal computers (PCs)).2 Nevertheless, the lack of agreement on how much of what kind of services will likely be purchased (at a given price and time) leads alternately to hedging, experimentation, or stalling.
The clearest trend is toward considerable experimentation by network, service, and content providers now and extending perhaps for another 5 years. The foundation for various experiments is a solid business case in telephony, television delivery, and (more variably) on-line services. At this time, though, corporate commitment to a complete strategy relating to network-based services appears unrealistic, and even companies that had made public strategic commitments have been backing off. For example, among the regional Bell holding companies (RBHCs), enthusiasm has diminished for video dial-tone systems, extending to cancellation, deferral, or alteration of plans for deploying video delivery systems.3 The market reaction to the Netscape initial public offering may be emblematic of the current ambivalence: despite considerable uncertainty, there is also considerable investor interest, some of it reflected in almost incredibly high initial stock prices.4
Technical, business, and regulatory uncertainty is yielding a degree of business paralysis as well as experimentation. The difficulty of expanding the market is seen in the fact that many proposed and attempted information services are alternative methods of delivering known products (e.g., (near) video on demand vs. video cassette rental). Robert Crandall of the Brookings Institution cautioned that where capital investment requirements are high, competing wireless investments may retard progress inasmuch as they drain potential revenues from new wireline facilities by targeting similar applications. By analogy, he noted that "cable evolved not in the novel ways that some of the pie-in-the-sky optimists envisioned in the 1960s and 1970s, but rather by providing us more sports, more movies, more replays of sitcoms, and perhaps more news programs." On the other hand, comments generated by the project suggest that such variations on a theme are more likely to come from the obvious players, the existing businesses, and that newer ideas hatched by entrepreneurs may be less visible, at least at first.5

BOX 3.1 A 20-Year Horizon?

[W]hat is wrong with 20 years? If you look at the evolution of infrastructure, 20 years is close to a microsecond. If you look at how long it took for digital technology to roll out just into the telephone switching plant, that was a 20-year process. If you look at the development of the PC, which we tend to think of as happening overnight, that was literally a 20-year process as well. If you look at the time needed to introduce any of the fundamental technologies over the last three or four decades, it has been a 15-, 20-, or 25-year process. Indeed, if fiber optics in the home were 20 years from now, I would say that would be a revolutionarily short time, given the fact that the marketplace is going to have to pay for all of this. And if you tried to make it faster, you would probably be distorting the marketplace. You would have to do it with some sort of premature subsidies. Consequently, I am listening to the conversation and saying I do not really understand why we are bemoaning the 20-year period. We should really be rejoicing.

—Howard Frank, Advanced Research Projects Agency
Confident of the potential for innovation, James Chiddix of Time Warner Cable observed that "there are thousands of people who will begin to put creative energy into trying to find ways to harness these technologies to find some new wonders and surprises. It is very hard to predict what those things are and what their social implications are. But I think that time is the only issue, whether it is 5 years or 20." Howard Frank of the Advanced Research Projects Agency (ARPA), reflecting on the forum discussions, maintained that 20 years seems reasonable. See Box 3.1.
Information infrastructure businesses can be split roughly into two categories—facilities and services—and the investment required accordingly can be incremental or "greenfield," can support processes that are capital- or labor-intensive, and can be long- or short-lived or sunk.6 Particular investment choices will drive the amount of bandwidth available, the features that can be provided with that bandwidth—especially openness and symmetry, which give the capacity for upstream communication—and the application services enabled for the public as a result. Caution and anxiety about the size and profitability of the market increase with the degree to which a business is dominated by facilities, and therefore by capital investment. Investments in facilities are crucial for expanding the deployment of bandwidth, while investments in services that make use of the facilities and in devices that make use of the services are lower in magnitude and shift at least some of the cost away from the facilities providers. The greatest cost—and the highest level of risk—appears to be associated with deploying access circuits, the facilities serving homes. Three alternatives are commonly seen for the near-term deployment of network-based infrastructure:
- Some people will spend large amounts of money to take the infrastructure (especially access lines) one big step forward, installing technology we will live with for a long time; or
- We will live with access technology that evolves continuously, not in big jumps or after long pauses; or
- We cannot afford to do what companies or commentators are contemplating.7
The remainder of this chapter outlines several issues: the investment challenges associated with evolving the NII, the role of the Internet, and alternative models for generating revenue to recover and drive future investment. A number of themes emerged from the different inputs to the project; this chapter reports, and comments on, some of the tone or flavor imparted to the debate on information infrastructure investments by different types of facilities and service providers (and by industry analysts). Who said what is sometimes as meaningful as what was said.
Investment In Facilities
The Problem of How Much Bandwidth to Invest In
Fiber is being deployed by many parties, setting the stage for competition, since as noted in Chapter 4, sunk investments in fiber facilities can support multiple application services. The investments are being made despite uncertainty that was captured somewhat cynically by Jack Thompson of Gnostech Incorporated when he observed:
… [T]he telephone people are saying, I can't make money in telephony anymore—I have to get into cable. And the cable guys seem to be saying, I can't make any more money in cable; I have to get into telephony. So why is everybody trying to get into a business where there is no money left?
J.C. Redmond et al. of GTE Laboratories address this query specifically in a white paper, noting two possibilities: either current participants can
modify their networks "to handle the total communications needs of customers (voice, data, video) by relatively modest incremental investments"—fostering competition to dominate and survive—or unprecedented growth and demand for new services may occur, fostering entry into adjacent or complementary businesses. Given these possibilities, and moderated by the difference in the costs to upgrade a cable TV system to support telephony or to upgrade a telephone system to offer broadband services (Telecommunications Reports, 1995i), companies are both posturing and making actual investments in the hopes of reaping the advantages that may accrue to successful first movers.
In the context of investment in facilities for access to information, the problem of whether, when, and where to gamble on fiber is emphasized by telephone companies, because deployment of fiber represents a high fixed-cost investment—creating the desire for high-volume use to recover costs.8 See Box 3.2. In telephony, despite compelling arguments for fiber in backbone facilities, the lack of local traffic volume may make deployment of fiber to the household unlikely in the near future outside of high-density urban areas.9 For cable companies, the upgrade path to fiber is clearer. Because they already have coaxial cable networks reaching into homes, cable firms are building fiber out from the center toward homes in a hybrid architecture (see Chapter 4).
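Joel Engel's unit-cost argument (quoted in Box 3.2) can be made concrete with a little arithmetic. The sketch below amortizes a facility's capital cost with a standard loan-payment formula and splits it across the subscribers who share the facility; all dollar figures, subscriber counts, interest rates, and payback periods are invented for illustration and are not drawn from the report.

```python
# Hypothetical illustration of the fiber unit-cost argument: shared
# capacity is cheap per subscriber, dedicated capacity is not.
# All numbers are invented for illustration.

def monthly_cost_per_subscriber(capital_cost, subscribers,
                                months=240, monthly_rate=0.008):
    """Amortize `capital_cost` over `months` at `monthly_rate` (standard
    annuity formula), then split the payment among `subscribers`."""
    payment = capital_cost * monthly_rate / (1 - (1 + monthly_rate) ** -months)
    return payment / subscribers

# A feeder route shared by 500 subscribers vs. a dedicated fiber drop.
shared = monthly_cost_per_subscriber(capital_cost=100_000, subscribers=500)
dedicated = monthly_cost_per_subscriber(capital_cost=1_500, subscribers=1)

print(f"shared feeder:  ${shared:.2f}/month per subscriber")
print(f"dedicated drop: ${dedicated:.2f}/month per subscriber")
```

Even though the dedicated drop costs far less to build in absolute terms, the subscriber who "pay[s] the cost of the whole thing because it is dedicated" bears a much higher monthly burden than one sharing an aggregated facility.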
Noting that net capital stock in the telephony business has been declining by accounting measures, if not in economic terms, Robert Crandall underscored the need to "keep the flow of new funds coming into this sector of the economy if we hope to build these extremely risky—particularly if they are to be wireline—systems with a substantial amount of fixed costs." With a nod to the major presence of telephone companies, he estimated that "if the NII is to be built by established telephone companies with technology now under development, it would probably require a near doubling of these companies' assets."10
The scale of the investment, combined with both competition and constraints on the nature and remuneration provided by business activities to generate revenue, poses the threat of asset stranding to telephone companies. Stewart Personick of Bell Communications Research expressed the problem in the context of sustaining demand, in a competitive environment, for two-way communications capacity:
If I had to spend $30 a month personally to operate a server, and I found out that for $20 a month I could put my material on someone else's server, with a 155-megabit-per-second incoming line, redundancy, and everything else, why would I month after month pay the $30?
Hal Varian of the University of California at Berkeley explained that the problem is one of survival over the long term, given the underlying economics of the business. He spun scenarios (see Chapter 6, Box 6.4) that might arise if "chronic excess capacity" were to result from "two or more broadband wires to the house providing essentially the same service." Other things being equal, there would be price wars, producers could not cover fixed investments in facilities, and there would be financial distress.

BOX 3.2 In Search of Volume

Joel Engel of Ameritech explained that "fiber is an extremely cost-effective transmission medium. It is not cost-effective because it is cheap, but because it has very high capacity. The unit cost is very low, so anywhere you can aggregate traffic, where you can share among many users, the unit cost is quite cost-effective. But, if you run fiber out to the individual subscribers, individual homes or small businesses, they do not generate enough traffic to support that cost, and they have to pay the cost of the whole thing because it is dedicated." In his white paper, BellSouth's Robert Blau describes the regional Bell holding company (RBHC) environment as one in which competitive entry into the relatively lucrative exchange access and intra-LATA (local access and transport area) toll markets has both constrained RBHC traffic growth and led to lower prices, depressing growth in earnings. In addition, increases in RBHCs' fixed costs as a proportion of total costs (as may occur with upgrades) may make earnings more volatile.

Jim Chiddix of Time Warner Cable spoke of his company's Full Service Network experiment in Orlando, Florida, as "an expensive project. But if there is a volume market there, the nature of the technologies that we are discussing will force prices down very quickly." John Redmond, reflecting on GTE's experiment in Cerritos, California, emphasized the need for low prices to induce use: "It was not clear that people were going to pay that much more for anything that they could get free off of the television now." Intel Corporation's Kevin Kahn related the problem to mass production in the businesses of information appliances and services: "At the end of the day, the fewer the choices, the higher the volumes; the higher the volumes, the lower the prices, and the more ubiquitous—therefore, the more attractive as a consumer product—it is."
Risk aversion pervades the comments and concerns expressed by the biggest players in the NII arena, the telephone companies (especially the local exchange carriers, or LECs), which appear to face both high costs for the associated access circuit investments and high uncertainty in expected circuit use.11 See Box 3.3. In his white paper, Robert Blau of BellSouth points out that telephone companies can and do choose to invest in apparently less risky "opportunities outside local networks," and he suggests that "decisions by Bell company managers to accelerate the introduction of advanced network gear did not have a positive effect on shareholder returns."12 His analysis points out that investment is deterred by the prospect of delays between deployment (and associated investment) and the generation of returns on that investment. (Investment advisors Alex. Brown & Sons Inc. (1995), however, have recommended against considering earnings as the key valuation metric for on-line service providers because they are "investing heavily ahead of expected revenues"; revenue (subscriber) growth and projected cash flow are better indicators. Other financial analysts reportedly expect delayed returns and advise postponing profits in the interest of investing for growth and position (Higgins, 1995b).)

BOX 3.3 Risk Aversion and Investment in Circuits

Business says you spend money where you think you can make money. You do not spend money today to make money 25 years from now.

—Edmond Thomas, NYNEX Science and Technology Inc.

Even in a noncompetitive environment, if you invest the capital significantly before the revenues come in, you are in deep trouble, because typically the interest that you have to pay when you borrow the money is about equal to the rate of return when the revenue starts coming in. When there is a 2- or a 3-year gap in between those two events, you never catch up.

—Joel Engel, Ameritech

If shareholders believe that risk-adjusted returns on investment in advanced network technologies will remain competitive with returns on alternative investment opportunities, then those technologies will be deployed and the new service features they make possible will be brought to the market in a timely manner. If, on the other hand, shareholders do not regard prospective returns on network investment to be high enough to compensate for the risk incurred, then lesser amounts of discretionary capital spending will be committed to new network technologies.

—Robert Blau, BellSouth Corporation
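Joel Engel's warning in Box 3.3 about investing "significantly before the revenues come in" can be illustrated with a toy cash-flow model. The sketch below adopts his premise that the borrowing rate and the rate of return are about equal; every number (the rate, the gap, the horizon) is hypothetical and chosen only to make the arithmetic visible.

```python
# Toy model of the capital-before-revenue gap: borrow the capital up
# front, let interest accrue, and start collecting revenue only after a
# multi-year gap.  If annual revenue merely equals the interest on the
# original principal, the interest accrued during the gap keeps
# compounding, so the debt is never paid down.  All numbers hypothetical.

def debt_over_time(capital, rate, revenue_gap_years, years):
    """Return the outstanding debt at the end of each year."""
    debt = capital
    history = []
    for year in range(1, years + 1):
        debt *= 1 + rate                  # interest accrues on the debt
        if year > revenue_gap_years:
            debt -= rate * capital        # revenue == interest on capital
        history.append(debt)
    return history

history = debt_over_time(capital=100.0, rate=0.08, revenue_gap_years=3, years=25)
print(f"debt after 25 years: {history[-1]:.1f} (principal was 100.0)")
```

With a 3-year gap, the debt never returns to the original principal, and in fact keeps growing: the revenue covers the interest on the capital but not the interest on the interest that piled up during the gap, which is the sense in which "you never catch up."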
Investment posture also reflects the nature of the business; the degree of optimism and the aggressiveness of investment prospects voiced by contributors to the NII 2000 project varied with the speaker's affiliation. Leslie Vadasz of Intel Corporation related the differences in attitudes about financing to the differences in culture between the computing and communications sectors: "I go to computer meetings, and we don't talk about how much we are going to invest in this, or how much return we are going to get on that. We don't ask, Is there a market? Is there no market? We run as fast as we can." Vadasz expressed concern that absent comparable attitudes among telecommunications providers, "the capability of the access device is going to go up, and up, and up, and up, and the bandwidth available for a [public switched] network environment is not going to keep up with it." The conservatism voiced by telephone company executives corresponds to their observed pace of investment; the impatience voiced by computer companies, among others, raises the question of whether a different outlook, combined with a prudent investment strategy, might accelerate deployment. Yet the cultural differences do reflect some fundamental differences between the computing and communications markets, such as the greater uncertainty in communications markets; differences in market power and industry structure (e.g., monopolies may be slower to innovate); and regulatory constraints, all of which contribute to a tendency of communications providers not to run as fast as many computer firms.13
Federal Licenses as an Influence on Deployment of New Wireless Systems
In contrast to the uncertainties regarding deployment of wireline facilities, the minimum investment in facilities for wireless cellular and personal communication service (PCS; also paging and other mobile radio) systems is more or less determined by spectrum licensing build-out requirements and by competitive pressures.14,15
Licenses do not obviate the risks associated with investment (and arguably could increase them to the extent that deployment proceeds faster than apparent demand would justify). Industry representatives privately acknowledge great uncertainty regarding the evolution of wireless systems, as noted by one project reviewer: "The industry is growing so fast and in so many directions that few people really have a grasp of the whole." One indicator is the reported scaling back by licensees of interactive video and data services (0.5 MHz) of their expectations for the original target markets. Moreover, while attempting to reconceptualize the services they can offer and reduce financial risks, such licensees may transmit only test signals to meet Federal Communications Commission requirements—an illustration of the fact that investment and deployment do not necessarily add up to commercial service availability (see Arlen, 1995a; and Mills, 1995b).
Although cellular license holders were not allowed to bid for PCS licenses in their own territory, the two markets are expected to co-evolve, with the PCS share of the combined market increasing and interoperability
growing to help sustain the cellular market base.16 A key positioning element was the success of the top broadband PCS license winners in aggregating bids to achieve nationwide geographic coverage and contiguous spectrum coverage. Private mobile services (including those for police and fire protection and the growing industry for unit-to-unit communications for personal and business use) already support tens of millions of units in service worldwide, with leading-edge technologies in use and planned.
While regional small wireless systems involve industrywide and major-city investments on the order of $20 billion,17 space-based satellite systems aimed at provision of global telephony (voice and data) include Teledesic's $9 billion, 840-satellite, low-earth-orbit system; Hughes Communications' $3 billion, 8-satellite system; Motorola's 66-satellite Iridium system; and many more (Samuels, 1995; Cole, 1995). These global systems will be able to provide "cellular-like" service throughout the United States. The white paper by Robert Blau cites a WEFA Group forecast that suggests that excess capacity will result from commercial wireless investments.18 However, since some unused capacity is normal for lumpy infrastructure investments, the key unknown is the size and duration of the gap between capacity and demand.
Investing to Achieve Infrastructure Generality
Compounding the problem of how much bandwidth to invest in is the problem of deciding to what extent that bandwidth should be provided in a general and flexible form. Decisions about providing for generality affect deployment of services, over and above deployment of facilities. See Box 3.4. The computer industry designs and builds computers that are used for a wide range of applications, not just for a single predetermined application such as word processing. Uncertainty about the eventual application forces the development of devices with general and flexible characteristics. As discussed in Chapter 4, the Internet similarly attempts to be a general infrastructure that can support a range of applications. This generality provides insurance against the unpredictability of demand. A specific example that came up in several discussions at the workshop and forum was whether it was justifiable to install significant bandwidth for traffic from the user into the network, usually discussed as upstream, back-channel, or symmetric bandwidth. Symmetry is a quality inherent in telephony (albeit with limited bandwidth). A related issue is the extent to which the capabilities of the network should be provided in an open manner.
Open interfaces may stimulate the creation of new applications but may also open the door to unwelcome competition.

BOX 3.4 Investment for Generality

The real question is, are we going to have enough capacity in this basic hardware infrastructure and some of the supporting application software that makes possible differentiation so that we can have appropriate differentiation where it is needed?

—Steven Wildman, Northwestern University

The most important characteristic of the NII from the perspective of users is probably the flexibility to accommodate new applications in the future, most of which we cannot anticipate today … [allowing] application developers to target a broad set of NII infrastructure and terminals with new applications without having to deal with an exponentially growing set of new cases as new technologies are developed, but rather … develop generic applications that can target the entire infrastructure and all of the terminals out there, no matter whether they are wireless PDA [personal digital assistant]-type devices or desktop supercomputers or standard telephony devices. That is not very easy to achieve. It requires a very careful definition of architecture, and scalability of applications to the capabilities of terminals and to the capabilities of the transmission infrastructure. One of the key characteristics is the capability to deploy new applications without any modifications to the network itself. Because once you require modifications to the network, then that puts a huge obstacle, economic and otherwise, in the way of new applications.

—David Messerschmitt, University of California at Berkeley

I would like to see systems that allow upstream bandwidth from individual homes to be considerably higher …, and then deal with the aggregate upstream bandwidth on the HFC [hybrid fiber coaxial cable] system, for example, as a traffic engineering problem. If we do a fixed static allocation of upstream bandwidth, we have precluded any reasonable way of saying, "I am going to put a media server in my home because I have a really clever idea for serving media on the Internet." On the other hand, if it is strictly a question of doing traffic analysis on a deployed system, and then deploying additional spectrum or changing the bandwidth reservations around in order to increase the upstream bandwidth when it becomes available, then that is a much more hopeful situation. … I have some difficulty believing that in a system with 125 homes you are going to find 125 Web suppliers that are running full media out of their homes. What I have no difficulty believing is that you will find, within the relatively near-term future, some people out there who are doing that. If the allocations are static, such people are precluded from running full media out of their homes. If the allocations are dynamic allocations, then it is reasonable to experiment and to let people have that kind of outbound bandwidth while others are simply sitting there, pointing and clicking with their infrared control, and we will not have precluded part of the space for purposes of experimentation.

—Kevin Kahn, Intel Corporation

At the forum and in his white paper, David Clark of the Massachusetts Institute of Technology related observations from the Internet environment to the evolution of the telephone system (see Chapter 4), describing its process of opening up certain interfaces:
The reason we want to look at architecture is that the decision to provide an interface, both in a technical sense—whether to implement it at all—or in a business perspective—whether to open it—is in fact a critical business decision. … If you open an interface, then that is where your competitors show up.
Providing for such openness may have broader business ramifications as well. For example, the steering committee noted some concern in the business community lest open access for new services be invoked as political grounds to force provision of mature services in an open way. (Some cable providers, for instance, have expressed the concern that providing open access services such as the Internet might establish a precedent suggesting that video services should be similarly open. The extreme form of this concern is fear that the eventual consequence of open access to specific service platforms will be the imposition of a common carriage status on all services.)
Openness and bandwidth symmetry were focal points of discussion in the January workshop; the inability of people to quantify markets for these qualities seems to lead to market trials and deployment plans with limited openness and limited upstream bandwidth. Overall, comments by representatives of various industries suggested that more closed and proprietary service packages seemed to offer more promise for profits, a view anecdotally supported by reports that Wall Street attaches higher value to ventures with some element of proprietary, differentiating technology. Other participants, notably those associated with the computer hardware and software industries, suggested that the open nature of the Internet model may provide more stimulation for innovation.
Uncertainty about the market looms large when it comes to forecasting demand for general infrastructure qualities and associated services. Considerable skepticism is expressed by telecommunications industries and industry analysts about the demand for symmetric communication.19 Observed Graham Mobley of Scientific-Atlanta:
What will the consumer really want to do with the interactive services, and how much is he willing to pay? There have been projections that show that, with interactive services, you could probably increase revenues by a factor of two between broadcast and incremental interactive services. On the other hand, nobody really knows what those interactive services are yet. Therefore, cable systems are not sure how much cost and investment to put into providing full interactivity.
According to Joel Engel of Ameritech, part of the confusion on this topic comes from a failure to distinguish symmetric applications from symmetric networks. Networks will support multiple applications, which will have differing requirements for symmetry. Engel argued that the overall demand is for asymmetry:
[D]uring any prime time period pick up the fiber or the coax and cut it and look at the bits going in both directions. You are going to see that in the aggregate, the total traffic is going to be highly asymmetric because of all of the services that do not require any upstream at all.
The cable industry's plans to incrementally provide greater upstream capacity (see the section "Incremental Increases" below) reflect its uncertainties about consumer demand for symmetry and about the costs and revenues associated with providing more generality.
Inputs to the project attested to telecommunications providers' beliefs in the profitability of infrastructure that is not fully open or symmetric. For example, Tim Clifford, previously with Sprint, questioned the extent of the need for openness and interoperability, and AT&T Corporation's Mahal Mohan argued at the forum that interoperability mattered most at lower architectural levels. See Box 3.5.
The problem telephone companies see with openness is the seeming inexorability of being reduced to a commodity transport business: those who make investments face free-rider competition and the prospect of being unable to recover costs.20 By contrast, closed systems provide more of an incentive to innovate and be a first mover, and the various experiments with more or less closed systems offering different combinations of services suggest a competition to be first with the winning formula. Hal Varian suggested that this was the problem with "plain old" Internet transport service: "There are no barriers to entry. There is no proprietary technology, no special inputs. Since Internet transport is an undifferentiated commodity, consumers are going to buy on the basis of price." However, several Internet service providers do use proprietary technology (e.g., some in routing—Advanced Networks and Services Inc.; some in security—UUNET, ANS; and some in network management—MCI, ANS), which raises questions about the relative costs and success of technology-based attempts to differentiate service offerings. More generally, these examples illustrate a variety of options for Internet service providers to differentiate their services without getting into the content business, such as control of bandwidth, security, private virtual networks, alternative access (e.g., wireless), roaming, multicast, and (good) network management.
Viacom's Edward Horowitz cautioned that "the incentive is to build these networks to the point where there are maximum barriers to competition for a network vis-à-vis another one and not go further." Horowitz noted that there are many points (such as operating system, storage, transport) in his content-creating and content-distributing business at Viacom where he could face higher costs because of the constrained choices of services that arise as a consequence of "only one solution for a network." He emphasized the need for open access to the set-top device, explaining that "it is not a box per se; it is the process by which you extract information from transportation and you get it displayed or [converted] into usable form."

BOX 3.5 Interoperability

Service providers want to differentiate themselves through advanced technology (such as better service) and through information services (or content). That tends to drive one away from interoperability. So, as we move into this environment of competitive services, I think we have to start thinking about what the real need for interoperability is. Does it truly have to be globally seamless or can we allow essentially closed groups to provide interoperability through a limited set of interconnect points?

—Tim Clifford, DynCorp (formerly with Sprint)

The interoperability is at a transport level, being able to get messages across, back, and forth. But the feature functionality is at a different dimension.1

—Mahal Mohan, AT&T Corporation
Horowitz's concerns are examined in the context of alternative industry and government perspectives in a white paper by attorney Jonathan Band. Commenting on how Microsoft's competitive success has inspired arguments and action by different industry groups, Band observes:
Microsoft hopes to dominate the market for the operating system for the "set-top box"—the entry point to the information infrastructure into individual homes or businesses. By controlling the standard for the set-top box operating system, Microsoft will be able to exercise control over access to the entire infrastructure. Microsoft wants to encourage third-party vendors to develop applications that will run on its operating system; the more applications, the more desirable the operating system becomes and the more likely that the market will adopt it as a de facto standard. At the same time, Microsoft wants to prevent the development of a competing set-top box operating system that is compatible with all the Microsoft-compatible applications.21
Band discusses how the word "open" has been used by various industry sectors to describe interfaces that are controlled or protected in some way in support of particular business interests. An important issue is that innovators of new features, functions, and interfaces may view these as valuable intellectual property and may wish to derive revenues from their use. Standards bodies that specify open network standards must balance the rights of creators with the need for users to have an assured right to use the standards on predictable and reasonable terms.
Building on the implied potential for unbundling, James Chiddix argued that "when our capacity is not limited, it is not in our business interest to impede our customers' access to any service that they want to pay for." Also reflecting on the changing cable environment, General Instrument's Quincy Rodgers observed that "the customers are demanding open systems. They are demanding licensing. … People are insisting that their networks will, for instance, support multiple operating systems." Wendell Bailey of the National Cable Television Association pointed out that the facilities by themselves do not guarantee openness, which also derives from business strategies:
Cable modems are about supplying wideband access to data service, [also including] multimedia clips or video in motion. But … they will not give you access to cable channels that are not carried by your cable operator. The fact is that had there been such a model 15 years ago, you would not know what to ask for now, because there would be no programs or services. People like those in the cable television industry paid to create them, to put them up there to run on their platforms. Now you would like a switched network to get at all of them. Someday you will have that: you will have that when the incremental cost of providing that is reasonable—when there is a business to be had for providing that.
As Chiddix acknowledged, "Clearly fiber to the curb with one fiber taking traffic in each direction lends itself well to totally symmetrical traffic, whereas hybrid fiber coaxial cable, as it is being implemented, is significantly asymmetrical in its capacity." Similar observations can be made about broadcasting, which delivers considerable bandwidth that is controlled by the broadcaster.
The issue of who controls access to and use of circuits and who benefits from associated revenues is key. Mused John Redmond, "We are going to have to bundle our services, because we are going to have to go to customers and put together things they want. It may not be all our
services. It may be some of yours, and you may have to work with someone else to put a package together that someone wants." These comments appeared to focus on bundling with content providers, but they could apply equally to services associated with other kinds of applications, as anticipated in Chapter 2.
Stewart Personick, noting that internetworking does not have to imply full integration of services, suggested that part of the problem was one of getting started:
There are two different meanings of interoperability. I can have large vertically integrated suppliers that represent closed systems internally, and they can always gateway to each other so that they can exchange messages. Customers on one system can access information on another. The question is, Will large, integrated, competitive suppliers voluntarily allow niche players to leverage off of their investments? That is, will they open up inside their system so that someone who has a better server, application, or user interface can just use that limited advantage and start a business? I think that it is certainly not likely that we will have that type of openness, where there are open interfaces everywhere, in the near term. However, it may be a very good long-term objective because, as we know, that type of openness does promote innovation and does promote driving prices down.
Consistent with Personick's speculation, the Web and even the telephone system appear to provide counterexamples to arguments favoring closed systems.22 Thus, telecommunications executives also acknowledge the change in incentives provided by expanding bandwidth. As suggested above, greater bandwidth raises questions about how to maximize the use of the deployed capacity. Noted Engel, "Perhaps sharing the cost to the individual customer across many, many services may be the way to break the code on the last mile."
From Facilities To Services And Applications
Although the facilities owners dominate the public debate over telecommunications policies, they are only a part of the process of deploying the NII. Separate innovation and roll-out are also needed in services, especially services other than commodity transport—those relating to content delivery and a variety of communications and information applications.
Compared with deployment of facilities, deployment of services and applications involves a larger number of players, in part because such players need not own their own facilities. Market entry is cheaper, since it is driven more by software and human resources; the capital needed to supply some services (the canonical high school student in the home basement) may be relatively small, involving leased circuits, switches or routers, and various servers.
Balancing Investment—Software "Capital"
Stanford University's Gio Wiederhold suggested in a white paper that "[t]here is an optimal trajectory in balancing investments in the systems infrastructure versus software application support." In at least the health care arena, the focus of his paper, he recommended moving "support to the information processing infrastructure, so that relevant applications can be built easily and the customers can be satisfied." Duane Adams of ARPA noted that the conceptual framework for that balance remains uncertain, in part because of uncertainties surrounding software system interoperability. He remarked on opportunities for providing services that are basically tools to help the developers create products that will become available over the NII. That prospect was echoed by Edward Horowitz of Viacom Incorporated, who characterized five environments provided (or constrained) by information infrastructure for generating and distributing content, relating their value to users to their degree of openness (Box 3.6). Similarly, Charles Ferguson of Vermeer Technologies maintained in a white paper that software and network interoperability both depend on unbundling "viewers, servers, tools, operating systems, specific information services, and/or Internet access provision with each other."
Adams' observations also seemed to anticipate the explosion of interest, in late 1995, in Sun Microsystems' Java programming language for developing distributed applications. Java programs can be transferred from a server to a remote site over the network as needed, which expedites the deployment and use of new applications. Ferguson echoed the general point about software-based tools in his paper; see Box 3.6.
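The mobile-code model that Java popularized can be sketched as follows (in Python rather than Java, for brevity; the "stock_ticker" program and its behavior are invented for illustration): a server holds program source, and a client fetches and executes it on demand, the way a browser downloads and runs an applet.

```python
# Toy mobile-code sketch: a "server" publishes program source, and a
# client downloads and runs it locally when needed. The program name
# and its output are hypothetical; the dict stands in for a network.

SERVER_PROGRAMS = {
    "stock_ticker": "def run(symbol):\n"
                    "    return symbol + ': quote unavailable (demo)'\n",
}

def fetch_and_run(name, *args):
    """Fetch the named program and execute it in a fresh namespace."""
    source = SERVER_PROGRAMS[name]   # stands in for a network transfer
    namespace = {}
    exec(source, namespace)          # "install" the application locally
    return namespace["run"](*args)   # invoke its entry point

print(fetch_and_run("stock_ticker", "ACME"))  # → ACME: quote unavailable (demo)
```

The point of the model is that no application needs to be preinstalled at the client: deployment of a new service reduces to publishing it at the server.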
The Separation of Services from Facilities—Broadening the Potential Content
The applications business is the umbrella under which fall concerns about "content." Inputs to the NII 2000 project underscored that the information infrastructure is an arena for both content as a product and content as a byproduct of other activity. Content includes not only "packaged" content provided as a business activity, but also informal or amateur content in the form of material shared by individual creators in varying formats (voice, numerical data, image, video), and content communicated as an adjunct to other, higher-value activities (e.g., for electronic commerce or distributed and telework; see CSTB, 1995c).
BOX 3.6 Software and Other Tools
State-of-the-art commercial technologies applicable to the Internet include visual tools and WYSIWYG [what-you-see-is-what-you-get] techniques that enable end users to develop applications that previously required programming, client-server architectures, on-line help systems, platform-independent software engineering techniques, and systematic quality assurance and testing methodologies. Adobe, Quark, Powersoft, the Macintosh GUI [graphical user interface], and even Microsoft have used these techniques to make software easier to use. If these techniques were applied to Internet software, the result could be a huge improvement in everyone's ability to use, communicate, publish, and find information.
—Charles H. Ferguson,
"The Internet, the Word Wide Web, and Open Information Services: How to Build the Global Information Infrastructure"
There is (1) an authoring environment, how you get your ideas and visions developed; (2) storage; (3) the transport; (4) deciphering for use or display; and (5) … invoicing, collecting the money. I look to the NII as being able to support multiple operating systems, competitive operating systems, in all five of these environments without permitting one operating system or one set of users to block those of another.
—Edward Horowitz, Viacom Inc.
Content as a product is the emphasis of publishing, entertainment, and other businesses. As Edward Horowitz argued, "You have no bytes and bits without intellectual property; you have no business without something to convey." Some market analysts have gone so far as to suggest that the value of information about a service can exceed that of the actual service.23 Horowitz's comments on the importance of packaging and marketing to content providers showed that information infrastructure can be important for distributing both information-based products and information about those products.
Ongoing developments suggest that the information infrastructure plays a key role in such trends as marketing goods as streams of services. The NII facilitates direct contact between advertisers and consumers, and in the process it is producing changes such as those observed for product "branding."24 A key development—or sign of NII maturity—will be the shift in emphasis away from the tools of access (appliances, networks) to the services and functionality delivered. Market success on the NII may require figuring out how to sell computing and information profitably,
much as manufacturers seek to sell a lifetime of clean clothes instead of packages of soap, ongoing oral care instead of toothpaste, pet care rather than dog food, and so on.
Depending on who is speaking, "free" or no-fee information can be a product (as for scholars), a loss (where intellectual property is appropriated in a way that bypasses a compensation system sought by the owner), or a complement to a product that is intended to motivate the buyer. As Terisa Systems/CommerceNet's Allan Schiffman noted, "In cyberspace there is the possibility of having a good deal more shelf space, but still the necessity of competing for the attention span of prospective purchasers."
Perhaps because of the Internet's legacy as a source of no-fee information generated in the research, education, and library contexts, and perhaps because of uncertainty about how to charge for intellectual property, several people spoke of the Internet's value as a source of complementary information about commercial products. Andrew Lippman of MIT spoke generally about the Internet's potential for navigating various reservoirs of information; Daniel Lynch of Interop Company and Cybercash Inc. spoke about the potential for the Internet to foster global brand identification. Edward Horowitz described actual uses of the Internet as a marketing and promotion vehicle (see Box 3.7). Similar experiences and ambitions have been chronicled in business press reports on a wide variety of efforts to use the Internet and Web for advertising and promotion, with some degree of contention among directly generated and advertising agency efforts.25
Peter Huber of the Manhattan Institute cautioned against assuming that the content tail would wag the infrastructure dog, inasmuch as communications transport businesses are the largest information infrastructure businesses in terms of revenues. Even factoring out known levels of data traffic over telephony networks, their size attests to a huge market composed of humans talking to each other. Other contributors commented on the enduring strength of communications relative to sharing of packaged information. For example, past president of Prodigy Ross Glatzer reminded forum participants that "all people communicate, but only a small percentage of the population really cares about any one topic of information."
Consistent with Glatzer, Michael North of North Communications asserted that rather than assume that the market will favor "network-centric versus user-centric" services, we should expect people to be both "consumers of information sometimes and … producers of information sometimes. Sometimes we are in read-only mode, and we like to see what others have to say, and sometimes we would like to be publishers ourselves." The separation of services from facilities enables both kinds of behavior (assuming some degree of symmetry), and both will fuel market growth.

BOX 3.7 Promotional Information Uses of the Internet

There are certain things that go over the Internet which we love. We are promoting a new movie. We throw it up on the Internet, on our home page. When a Star Trek movie came up, we had half a million visits in six weeks through two million or three million pages of scripts and various other things downloaded during that period of time. The people who design the home page were ecstatic. … You do not get any money for it. The cost of creating that content comes out of the marketing budget, the marketing and promotion budget to manage the expectation. Simon and Schuster, for example, has huge amounts of educational programs. When you sell a program that consists of textbooks and ancillary material, the textbooks are what they want to get paid for and the ancillary material comes along with it. We are probably going to put the ancillary material up on the Internet.

—Edward Horowitz, Viacom Inc.
The Internet And Its Use For Business
Effects on Provision of Goods and Services
How will the Internet or other information services affect the basic economics of content-providing businesses, which have historically been quite risky, with a much higher rate (and expectation) of failure than, say, telecommunications? Steven Wildman of Northwestern University explained that Hollywood economics recognize that "for motion pictures, 10 percent of them make a profit; 90 percent fail. Television programs aren't much better. CD-ROMs are looking similar." He suggested a need "to develop a financial infrastructure to average or aggregate those risks," as is currently done for motion pictures and television programs. Wendell Bailey described how this riskiness is apparent in cable program turnover: "At the 1995 National Cable Television Association convention in Dallas, Texas, there were 27 new [program producers] seeking access to cable networks. I have been in the cable industry for 14 years, and I counted up 181 programmers that I had seen at that show over the last 14 years that no longer exist today because no one wanted them." Yet the (current) cost of deploying a Web site is much, much smaller than the cost of making a movie or a CD. The result is much more experimentation and enthusiasm, since the definition of "success" (e.g., cost recovery) is much
different. This differential risk may lead to changes in the overall mix of content available through the information infrastructure. In the short term, cost recovery is an issue here as well as in other information infrastructure businesses; on-line service providers, for example, have reportedly frustrated some content providers with revenue sharing terms that favor themselves, but these arrangements, like pricing of Internet service provider offerings generally, are volatile (Zelnick, 1994).
A broad view of the information infrastructure and of trends in information supply and demand provides evidence that one-to-many communications is beginning to lose some of its effectiveness and efficiency.26 Trends in product diversification and customization suggest that if the slogan of the Industrial Revolution was the manufacturer's or retailer's statement, "This is what I have, don't you want it?"—the wave of the future is signified instead by the consumer's demand, "This is what I need, can you provide it?" The direction and nature of the flow of information may well change considerably, with a decline in the tendency of businesses to treat consumers as either target or database—rather, the manufacturer and the retailer will become both. Thus, for example, almost all print (and much television) advertising now contains an 800 number, even if the objective of the advertising is to enhance a brand image; this practice is now extending to Web and Internet addresses. The growth of the Internet and the Web themselves has been fueled by consumer-generated needs.
Drawing on his analyses of the television business, Wildman suggested that greater individual control over the selection of information products could have profound impacts on various kinds of content-publishing businesses:
If the editorial function can be put in the viewers' hands, [to do] the picking and choosing, the viewers can find what they want. Then the power and the economic justification for being a network or a channel bundler is dramatically reduced. If we lose the networks, we have lost the major force in television over the last 50 years.
Wildman also suggested that changes could come as a result of ongoing audience fragmentation (arising from greater ease in targeting segments and the increasing difficulty of building a national audience).
At the January workshop, J.M. Tenenbaum of Enterprise Integration Technologies Corp./CommerceNet speculated about the evolution of Internet-based interaction to support more individualized services, including personal webs and group webs:
[E]veryone is … going to be able to grab things over the net that are of interest, annotate them, link them into their own things, post them on their own Web server, and then selectively share some of those things
with their friends—initially by providing access to a server or posting them on a group server which has broader access out into the world.
Charles Ferguson argues that the Web offers "the opportunity to liberate computer users, publishers, and information providers from the grip of the conventional on-line services industry," which he likens to the mainframe computer industry because "it maintains its profitability only by charging extremely high royalties and by holding proprietary control over closed systems." Rhetoric aside, the Web clearly facilitates direct communications between information creators and consumers, which will affect the shape of a variety of intermediate industries, including publishing and advertising.
The trend toward greater individual control across different kinds of networks and media raises questions about prospects for provider-controlled service packages, including those associated with contemporary market trials. Similarly, notwithstanding existing provider comfort with asymmetry, one interpretation of recent shifts in broadcast company ownership is that despite their imperfections, interactive services and other uses of computers are shifting consumers' attention away from broadcast media, constraining expectations for growth in the business value of broadcasting.27
The Internet—Layering, Incrementalism, and Diversification
The development of the Internet epitomizes the separation of services from facilities. The rise of a variety of Internet access providers derives from the layered architecture of the infrastructure, which is described in Chapter 4. One layer's materials are the next layer's framework. Activity is happening at all layers at once, beginning with a teasing apart of low-level services from physical infrastructure in the form of simple Internet access businesses. This situation complicates assessing who owns what, who resells what, capitalization and risk levels, entry requirements, and so on. On a rough basis, however, business opportunities can be seen in the areas of transport service and access supply, "middleware" services, and applications.
First there are the Internet service and access providers. Although there is no clean categorization, some of these are commodity providers and some add higher value (e.g., information service providers that provide both communications and information services); some target individual users (aiming for large numbers of small accounts) and some organizational units (aiming for modest numbers of large accounts). Telephone, cable, and other facilities-based providers have moved over the past couple of years from strong reservations about the viability of the Internet to increasing willingness to provide and package Internet access,
with some kind of added value (as illustrated by earlier comments on the proliferation of suitable cable modems).28 Concentration among on-line service providers has been increasing; the largest three (America Online, CompuServe, and Prodigy Services) served 89.3 percent of the market as of December 1995, and the largest six together served an estimated 96.9 percent of the 11.3 million subscribers (Arlen, 1996). Increasing concentration may underlie a movement for some smaller service providers to become content providers serving customers through larger players (see Arlen, 1995b). In their white paper, Jiong Gong and Padmanabhan Srinagesh comment on the economics behind different categories of Internet service providers, themselves displaying different degrees of layering depending on whether they own their own facilities:
The variety of organizational forms in use raises the following question: Can ISPs [Internet service providers] with varying degrees of integration coexist in an industry equilibrium, or are there definite cost advantages that will lead to only one kind of firm surviving in equilibrium? The answer to this question hinges on the relative cost structures of integrated and unintegrated firms. The costs of integrated firms depend on the costs of producing the underlying transport fabric on which IP transport rides. The cost structures of unintegrated firms are determined in large part by the prices they pay for transport services (such as ATM and DS3 services) obtained from telecommunications carriers. These prices, in turn, are determined by market forces. More generally, the layered structure of data communications services leads to a recursive relationship in which the cost structure of services provided in any layer is determined by prices charged by providers one layer below.
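The recursive relationship Gong and Srinagesh describe can be sketched numerically (all dollar figures and markups below are invented for illustration): each layer's cost is its own production cost plus the price charged by the provider one layer below, so an unintegrated firm's cost structure is set largely by the market below it.

```python
# Toy model of layered service costs: the cost at any layer equals the
# costs incurred in that layer plus the price paid for the service one
# layer down. All numbers are hypothetical.

def layer_cost(own_costs, price_below):
    """Total cost at a layer, per unit of service."""
    return own_costs + price_below

# Bottom-up: fiber transport -> ATM/DS3 carrier -> IP transport ISP.
# An integrated firm internalizes the lower layers at cost; an
# unintegrated firm pays a market price (here, a 20 percent markup).
fiber   = layer_cost(own_costs=10.0, price_below=0.0)          # owns facilities
carrier = layer_cost(own_costs=4.0,  price_below=fiber * 1.2)  # buys transport
isp     = layer_cost(own_costs=3.0,  price_below=carrier * 1.2)

print(round(isp, 2))  # → 22.2
```

Under these assumptions an integrated firm that avoided both markups would face a cost of 17.0 rather than 22.2, which is the cost advantage Gong and Srinagesh's equilibrium question turns on.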
Next come the middleware providers, notably those firms springing up to offer products based on the World Wide Web—Web-based publishing facilitators, browser providers, and so on29—and Internet (or Web) directory services (Higgins, 1995b). CommerceNet, for example, is a consortium that owns a Web server for which it develops and supports applications; it does not own network facilities, per se.30 Although middleware includes mass-merchandised, consumer-oriented, and corporation-oriented products, the segments and the providers are already appearing to consolidate (Goff, 1995).
Also in this category are the security protection service providers, some acting as part of browser offerings (e.g., Terisa Systems); some separate (e.g., RSA Data Security Inc.), albeit possibly functioning through business alliances. Another example of middleware services, again tied to the Web, are the on-line equivalents of credit card verifiers, which mediate financial transactions between purchaser and merchant. Middleware business seems to fit into a new kind of "intermediate product"
category (just as sheet steel is an intermediate good between the materials and automobile industries).
Then come applications, such as business services (e.g., Lexis/Nexis) and on-line games (e.g., Sierra On-line). These range from final products (e.g., games) to intermediate products (e.g., business services).31 The proliferation of application services that are themselves inputs to other activities illustrates the difficulty of categorizing the products and businesses associated with information infrastructure. It also underscores the fact that infrastructure-related businesses are increasingly enmeshed in other, possibly higher-value, activities (e.g., education, commerce, research, work of different kinds), much as are other kinds of infrastructure.
Although activities can be examined from the perspective of different layers, Stephen Wolff of Cisco Systems Inc. cautioned that a holistic perspective is also important. Wolff noted that Internet growth will not remain very high unless facilities providers "increase their investment at commensurate pace, because the Internet is a value-added service on top of the underlying bitway structure. So it's all got to go together or it's not going to go at all." Similarly, @Home's Milo Medin noted that despite its growth, the Web's "potential is not going to be realized unless we can really scale up the level of bandwidth in the network." In his white paper, Robert Blau describes the Web as a driver of network market growth that could challenge local network capacity:
As new resources come on-line, demand for access to the WWW will increase along with its value to users as well as information service providers. Similarly, as the value of the WWW increases (e.g., by the square of the number of new users added to it during any given period), online sessions also should increase in duration (e.g., from current levels of 25 minutes per session versus an average of five minutes for local telephone call) for the simple reason that there will be more users and services to interact with. The combination of more businesses and residents spending more time on-line, accessing increasingly sophisticated multimedia services that require substantially larger amounts of bandwidth to transport, could press the limits of many local telephone networks within a relatively short period of time.
Blau's speculation about the Web as a driver for demand and revenue is notable, given his otherwise conservative assessment of the investment and revenue horizons for telephone companies. On the other hand, he also notes that Internet traffic can increase the efficiency of network facility use, suggesting that such traffic "could help recoup the cost of deploying wider-band technologies that Internet users will need" if "priced properly."
As fast as the Web's growth has been, the entire flow of Internet traffic is vastly less than the capacity of the telephone network—probably
less by two orders of magnitude.32 As a result, even with modest growth in local access capabilities, supply and demand for bandwidth may be in reasonable balance.
Hal Varian took a still broader view, relating the economics of network markets, in which value depends on the presence of multiple users (each finding value in direct proportion to the number of other users), to the adequate distribution of complementary goods and services:
My favorite candidate for the "killer app" is multimedia video conferencing and wide area collaboration. But it is very hard to create critical mass in this kind of industry, as AT&T found out years ago with their Picturephone initiative. People are not going to invest in the hardware, software, or learning costs until the technology is widely enough deployed to warrant their investment. But this leads to the standard chicken-and-egg problem: overcoming that conundrum will be the big problem for the industry.
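Varian's chicken-and-egg conundrum can be rendered as a toy adoption model (the per-peer value and joining cost here are invented numbers): each user's value rises with the number of other users, but a rational user joins only when the network already delivers enough value, which in turn requires that users have already joined.

```python
# Toy critical-mass model for a network market. Parameters are
# hypothetical; the structure, not the numbers, is the point.

def user_value(n_users, value_per_peer=0.5):
    """Value to one user of a network with n_users other participants."""
    return value_per_peer * n_users

def will_adopt(n_users, joining_cost=10.0):
    """A rational user joins only if current value covers the cost of
    hardware, software, and learning."""
    return user_value(n_users) >= joining_cost

# Below critical mass, nobody joins; past it, adoption is self-sustaining.
assert not will_adopt(10)   # value 5.0 < cost 10.0: too few peers
assert will_adopt(25)       # value 12.5 >= cost: critical mass reached
```

In this sketch the critical mass is 20 users; below that point the market stalls unless something (subsidized equipment, given-away software, as the text goes on to describe) lowers the effective joining cost.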
Robert Crandall also emphasized at the forum that services will evolve with the information appliance (and other equipment) base, including access equipment and other elements of applications.33,34 Part of the growth of fax, for example, reflected the transition in equipment from telex as well as the acquisition of fax equipment by more and more people as a function of declining cost.35 Recognizing this problem and taking advantage of the relatively low cost of reproduction for software, on-line service providers (e.g., America Online and Prodigy) have given away access software. Similarly, market-leader Netscape has made Web browsers available for free or at low prices in the interest of selling more expensive software for servers; the Web server market is forecast to grow dramatically yet still remain small for several years.36 Notwithstanding the rate of growth of the Internet and the World Wide Web, user equipment cost and difficulty of use constrain the level of use of emerging features and services, as noted in Chapter 2.37
Both applications and middleware involve some kind of framework for organizing material (content) that may be produced or distributed as part of the application. A white paper by Randy Katz et al. captures the perspective voiced by several in arguing that "a critical element of NII development is the fostering of appropriate commonalities, with the goal of achieving broad adoptability while promoting efficient competition and technological evolution."38 Middleware standards (for example, standards for data format and service invocation39) enable a common application development framework; application standards enable implementation by a variety of providers and, ideally, access by users operating in a variety of environments. For example, a white paper by David Schell et al. describes needs associated with geographic information systems, while the one by Katz et al. explains needs associated with digital libraries. Katz et al. underscore the importance of developing appropriate software, as noted above: they relate cross-cutting applications to the emergence of "a common service environment" for building domain-specific applications, complemented by application development and support tools and "a marketplace of reusable subsystems." See Box 3.8.

BOX 3.8 Middleware Standards Needs

Commercial need for better spatial data integration is already clear in areas such as electric and gas utilities, rail transport, retail, property insurance, real estate, precision farming, and airlines. Given the critical nature of applications positioned to combine "real-time" and geospatial attributes—emergency response, health and public safety, military command and control, fleet management, environmental monitoring—the need to accomplish the full integration of [spatial data] resources into the NII context has become increasingly urgent.

—David Schell, Lance McKee, and Kurt Buehler,

"Geodata Interoperability—A Key NII Requirement"

[Digital libraries] are leading to significant advances in the generation, storage, and use of digital information of diverse kinds. The range of underlying services and technologies includes advanced mass storage, on-line capture of multimedia data, intelligent location and filtering of information, knowledge navigation, effective human interfaces, system integration, and prototype and technology demonstration.

—Randy H. Katz, William L. Scherlis, and Stephen L. Squires,

"The National Information Infrastructure: A High Performance Computing and Communications Perspective"
Alleviating concerns about standards will depend in large part on attainment of consensus within an application domain (see Chapter 2), but that process should build, to the extent possible, on common or cross-domain standards at the middleware level. This use of common middleware standards will maximize the prospects for compatibility between implementations at the domain level and will facilitate the eventual integration of inter- and intra-domain communication.
The Web provides an illustration: it embodies a framework and a set of standards that have made the Internet more useful to more businesses, organizations, and individuals; it works through standards, and it has given rise to businesses that implement and advance those standards. Whereas conventional or paper publishers own their own presses, trucks, and so on (or subcontract to businesses that own such equipment), the
Web and the Internet make use of more broadly shared distribution infrastructure. Another lesson from the success of the Web is that an open standard like the Internet's TCP/IP protocols can be a very fertile field for innovation, allowing experimentation with multiple applications.
The inclusion in Microsoft's new operating system, Windows 95, of software for accessing an information service package, including Internet access, reflects a business strategy based on tying two layers together: the software provides the interface to physical network facilities (typically owned or resold by others); it also provides TCP/IP support for customers already having Internet access. This development seems to go beyond the bundling of TCP/IP software with computer and network operating systems (Higgins, 1995a)40 and the recent distribution by information service providers of access software both with PCs and directly to consumers. The broader Microsoft strategy appeared initially to tie three layers together by integrating various applications with networking (Zelnick, 1994). Microsoft Network is an illustration of the growing emphasis on software to implement information infrastructure access.
More broadly, Hal Varian suggested that the joint evolution of content and transport businesses may well lead to some form of integration, raising questions about which market structures are economically sustainable given the products and businesses involved and the possibilities for competition that technologies are creating. He commented on recent instances of cooperation and alliances among broadcast, cable, telephony, and entertainment companies, which embody integration that may or may not endure, at least in terms of which business dominates:
[V]ertical integration with the content providers seems to be the most popular solution. We have got considerable evidence of this happening already. There is U S West, Time Warner, Disney, Ameritech, BellSouth, MCI, Rupert Murdoch. America Online just announced some joint agreements and mergers yesterday. It is not obvious that it will really work in the long run. These days, the cash is going from the transport providers to the content providers, because it is the transport providers that have all of the money. But, if the market for transport becomes very competitive in the future and it becomes a commodity business, then the money is going to have to flow the other way. You are going to have to have the content cross-subsidizing the transport. What you buy is the content; the transport is thrown in for free. The big question is whether content providers will stick to their joint agreements once they have competing transport providers clamoring for their business. That remains to be seen.
Consistent with Varian's analysis, Milo Medin described how his new venture, @Home, will combine Web servers with cable facilities and cable and other media programming.41 Taking a broader view of integration,
there are already signs of the potential for integration between telecommunications companies and banks, and yet other instances of integration may emerge.42
Reacting to the risks, uncertainties, capabilities, and costs of various technologies, contributors to the NII 2000 project seemed to favor investments that allow for increases in bandwidth, openness, and symmetry on an incremental or expandable basis. David Messerschmitt of the University of California at Berkeley related incrementalism to architecture: "I would try to define an architecture that had the characteristic that I could initially save money by deploying an asymmetric situation, but hedge my bets by having a rather simple upgrade path to a more symmetric situation." Decisions on architecture as well as facilities affect how easily decisions and investments made today allow for or support the next increment—how easily evolution will occur. For example, putting extra fibers in a bundle conveys significant extra capacity wherever they go. As Robert Powers et al. explain in their white paper, the capacity of a fiber pair has grown substantially over the past 15 years "based only on changes in electronic and photonic equipment attached to the fiber, not changes in the buried fiber itself." That situation relates investment in fiber to incremental investments in associated equipment. Decisions on architecture also relate to the split between incremental investment by providers and by customers (see "Economic Models" below). Thus, at the January workshop David Messerschmitt related integrated services digital network (ISDN) and the telephony activities associated with the Advanced Intelligent Network initiative (see white paper by Stewart Personick) to architectures that allow more intelligence in terminal devices and thereby provide more flexibility in provisioning new services than do architectures that contain all service functions in centralized switches.
In view of the uncertainties associated with the economics of investment in fiber, telephone companies have been paying significant attention to technologies that allow them to better leverage existing copper plant and digital switches (through software and appropriate line cards), providing significant bandwidth advantage relative to conventional telephone service, albeit less than that available through fiber.43 Two technologies dominate this approach: asymmetric digital subscriber line (ADSL) and ISDN.
ISDN is a phoenix technology, having been touted and then dismissed by experts but now appearing to be relatively easily supplied.44 In the absence of alternatives, both providers and business customers cite ISDN as a vehicle for telework and other data-intensive home communications.
Commented Edmond Thomas of NYNEX Science and Technology, "There were no terminals or applications for a very long time. … With the Internet, terminals now becoming available, and, more importantly, application software to call on it, [ISDN] is starting to find almost ubiquitous application." Les Vadasz emphasized ISDN's bird-in-the-hand value: "The technologists are enamored with the next generation and the generation beyond; they are not realizing what benefit we could get from [ISDN's] bandwidth, which is available now."
To make the point that incrementalism is relative, Robert Blau argued that making all of Southwestern Bell's access lines ISDN-ready would be a major commitment: at about $3.4 billion, it would equal 42 percent of the company's total capital expenditures between 1988 and 1993. Edmond Thomas argued for diversification even in the short term: "You have got to try them all: the technology right now is at best a first generation. And you could make a very big blunder by choosing incorrectly." Nevertheless, router, modem, PC, and on-line service vendors all seem to be moving to support ISDN access, trends that should sustain some level of growth in ISDN availability.
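Blau's figures also imply the overall scale involved. A back-of-the-envelope check (treating the quoted $3.4 billion and 42 percent as given; the derived total is our inference, not a figure from the text):

```python
# Back-of-the-envelope check of Blau's figures: if $3.4 billion is
# 42 percent of Southwestern Bell's 1988-1993 capital expenditures,
# the implied total capex for that period is roughly $8.1 billion.
isdn_cost = 3.4e9        # quoted cost of making all access lines ISDN-ready
share_of_capex = 0.42    # quoted share of total 1988-1993 capital spending

total_capex = isdn_cost / share_of_capex
print(f"Implied total 1988-1993 capex: ${total_capex / 1e9:.1f} billion")
```

The point stands either way: an "incremental" upgrade that consumes two-fifths of several years of capital spending is incremental only in a relative sense.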
ADSL represents a way to reuse the existing copper plant for delivery of video and some interactive applications to the home. Mahal Mohan explained in a white paper that ADSL makes use of existing copper plant by installing matching equipment at both ends of the loop. Deployment is incremental: "ADSL devices can be disconnected from one user and moved to another user as, for example, when the first ADSL user decides to upgrade to a higher bandwidth medium such as fiber or fiber coax access."
Similarly, other forms of digital subscriber loop technology allow the copper loop to be reused for other forms of new service. The symmetric high-bit-rate digital subscriber line (HDSL) may represent a way for the telephone companies to use their copper loops to sell interactive data services to the home at speeds of up to 2 Mbps. While products for this application are not yet fully mature, it is possible that this service might actually be less expensive to deploy incrementally than the lower-speed ISDN service, because it could be specialized for data only, rather than needing to support both data and voice service through the switch.
Edmond Thomas related incremental investment in ADSL to the different costs associated with providing service to different areas. See Box 3.9. In his white paper, Francis Fisher of the University of Texas at Austin underscores the importance of relating deployment to area. He notes that there are relatively low costs of entry in dense downtown markets, fostering openness, whereas in the residential market it is more difficult to assure universal service, let alone open service, by a single provider. The incentive and profitability problems, of course, are exacerbated in the even sparser rural areas due to limited traffic aggregation.
BOX 3.9 Potential Uses of Asymmetric Digital Subscriber Line Technology
Asymmetric digital subscriber line (ADSL) technology probably will have, in the early days, two major applications. One of them is in congested urban areas, such as in apartment buildings, where you have conduit congestion and there is no way to get to a customer's apartment without knocking down the walls. If you want to provide interactivity, ADSL may in fact be a way to do that. The other application is at the opposite end of the spectrum, in rural areas where you have a low population density. You would like to provide those customers some interactivity and you cannot afford to put in fiber initially, but the copper is already in the ground. Right now, our view is that fiber to the curb looks like it has potential in the very dense urban areas, and hybrid fiber coaxial cable probably in the suburban to the moderately dense urban areas. But my bet is also that our assessment right now is probably wrong. As volumes increase and technologies improve, the areas of applications of these technologies may, in fact, change dramatically. If we make a big investment today in one area, I think we could be very, very wrong.
—Edmond Thomas, NYNEX Science and Technology Inc.
The television industry has perhaps the strongest tradition of incrementalism, in part because of expectations—reinforced by a history of regulatory requirements for back-compatibility with the installed television set base—regarding consumers' limited ability and willingness to upgrade television sets. Thus, Scientific-Atlanta's Allen Ecker pointed out, "Analog is going to pay the bills for a long period of time for the investment to get the infrastructure, certainly in the entertainment part. …" Some kinds of information infrastructure require relatively modest incremental investments for consumers, some require more, and, as discussed in Chapter 2, the increase in intelligence and functionality in end-user devices suggests that more and more of the total investment is migrating to the periphery of the network and into users' environments. Graham Mobley noted that an associated issue is ownership and distribution of access devices: "[I]n the cable environment there may well be situations where interactive terminals would be leased from the cable company on a trial basis. If someone did not like the terminal, he or she could give it back and go lease a lower-cost or lower-featured box. These things will come along with time."
The theme of evolution in cable (see Box 3.10) was extended into the future by Don Dulchinos of Cable Television Laboratories, who described hybrid fiber coaxial cable upgrade plans that will support delivery of about 80 analog video channels and at least 100 digital video channels, and growth in interactive applications (implying upstream capacity). He related capacity supplied by cable providers to home equipment and demand. James Chiddix also commented on the fit between supply and demand, further suggesting that the evolving cable architectures do provide the flexibility sought by David Messerschmitt.
BOX 3.10 Incremental Expansion of Cable Service
[T]he roll-out of home digital terminals or high-speed data modems … is a very evolutionary process. Only those customers who demand service, as they demand service, can be supplied with the terminals. The roll-out of the technology would be closely matched with the revenue made possible by that technology. Then, by the year 2000, you start looking at broadband types of full-service networks, add video servers and media servers in general, either at a head-end location or distributed throughout the network, and provide service that way.
—Don Dulchinos, Cable Television Laboratories Inc.
Since we have got this broadband pipe in place, we can reinforce it with a little fiber, make it work a lot better for our core business, and then get into digital businesses, whether they be telephony, PC modems, or interactive video, on an incremental basis, where most of the subsequent investment is variable. And it goes into the modem, into the routers, into the set-top boxes, or into the telephony interfaces. It is largely a traffic engineering problem; we can keep pushing fiber deeper, and making it more and more granular, and getting fewer and fewer homes per fiber trunk. And then we can reconfigure which part of the spectrum in the coaxial cable we use. Those are not great problems, because they are forced by demand, which means revenue.
—James Chiddix, Time Warner Cable
[In 1975,] basic cable was about $7 per month. They introduced HBO for another $7, a service that took effectively no bandwidth. They were able to double their revenue and put in a simple box on a variable cost basis. We have a very similar situation right now. You can add lots of digital information on a cable system today using an asymmetrical modem. I define "asymmetrical" as meaning broadband for downstream communication and twisted copper pair coming back. You can charge probably $15 or $20 because you are going to a particular kind of user to deliver very-high-speed data uniquely to that computer. It takes very little bandwidth, and your capital expenditures are one box at a time.
—Edward Horowitz, Viacom Inc.
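Dulchinos's channel counts can be sanity-checked with a rough spectrum budget. The parameters below are illustrative assumptions, not figures from the text: a 6 MHz NTSC analog channel and roughly six MPEG-2 digital programs per 6 MHz slot on an upgraded 750 MHz plant.

```python
# Rough spectrum budget for a hybrid fiber coaxial upgrade.
# Assumptions (not from the text): 6 MHz per NTSC analog channel,
# ~6 MPEG-2 digital programs per 6 MHz slot, 750 MHz total plant.
CHANNEL_WIDTH_MHZ = 6
PROGRAMS_PER_DIGITAL_SLOT = 6   # assumed MPEG-2 multiplex density
PLANT_CAPACITY_MHZ = 750

analog_channels = 80
analog_spectrum = analog_channels * CHANNEL_WIDTH_MHZ            # 480 MHz

digital_channels = 100
digital_slots = -(-digital_channels // PROGRAMS_PER_DIGITAL_SLOT)  # ceiling
digital_spectrum = digital_slots * CHANNEL_WIDTH_MHZ             # ~102 MHz

remaining = PLANT_CAPACITY_MHZ - analog_spectrum - digital_spectrum
print(f"Analog: {analog_spectrum} MHz, digital: {digital_spectrum} MHz, "
      f"remaining for upstream and data: {remaining} MHz")
```

Under these assumptions the 80 analog and 100 digital channels fit comfortably, leaving well over 100 MHz for upstream capacity and data services, which is consistent with the incremental, demand-driven upgrades Chiddix describes.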
In response to a question, Chiddix indicated that it would be possible to provide a choice of data-oriented radio-frequency modems supporting different degrees of asymmetry: "The beauty of having this radio spectrum in our coaxial cable is that we can have coexisting transport structures that behave very differently. We will build different businesses around them. In fact, we need to find ways to get more revenue out of high-value services than out of lower-value services in exactly that way." Edward Horowitz cautioned against expecting too much too fast, observing that "the only thing that was displayed at the cable show of any value was the fact that there is a move by the cable community to install broadband modems." Significantly, a number of computer hardware manufacturers (e.g., Hewlett-Packard, Motorola) are seeking to supply these modems.
Wireless nonbroadcast networks have been incremental for consumers. At the outset of cellular and PCS availability, consumers already had a far larger wireline base of equipment with which to receive calls, while providers had to develop sufficient infrastructure to support wireless service. PCS can be considered a kind of incremental improvement over cellular, with digital cellular a closer improvement over analog cellular. There was some speculation among project contributors about the value of PCS and other wireless systems as vehicles for relatively near-term experimentation with different product and service concepts.45 Mused Harvard University's Lewis Branscomb, "Let the wireless services try to capture these new markets, and then see how fast the cable firms and the local exchange carriers get into the competition for the business."
The comments of both cable and telephony providers referred to business calculations and "traffic engineering" intended to manage and minimize investments in anticipation of delayed revenue growth. For example, in discussing with other forum participants how to interpret deployment statistics, Joel Engel explained that providers do not assume that every home passed will be a customer, which has implications for the translation from the aggregate bandwidth supplied to a neighborhood to the bandwidth delivered or accessible to an individual household, especially during the first stages of service introduction, but also when the offering is fully mature.
Engel said that Ameritech has "engineered our system to take into account what we think is a reasonable distribution across these services to allow symmetry for those applications that require it." Andrew Lippman cautioned against excess conservatism: "Start small if that is the economic answer. But do not view the ultimate design as one that still will be potentially too small for what is really going to be there."
Arrangements For Interconnection
Although technology may facilitate unbundling, the business arrangements surrounding interconnection, which reflect underlying network economics and superimposed regulatory regimes, are key to associated investments and offerings. These arrangements, which relate to both interconnecting networks and gateways or other forms of connectivity to nonnetwork content and service providers, affect the incidence of cost and the levels of revenue and profit. Many questions arise, such as the following:
- Who wholesales and who retails what in local and interexchange connectivity and information access?46
- How balanced will traffic flows be among interconnected systems?
- What capacity exists to charge different fees in different directions where flows are balanced or unbalanced?
- How will financial compensation (settlements) be governed in areas where traffic flows are unbalanced?
- How will end-to-end impairments be allocated among the various service providers? and
- How can quality of service be maintained in a distributed network with multiple providers?
The arrangements for interconnection constitute a topic on which economics research and analysis are under way, building from the experience in telephony.47 Not surprisingly, telephony-experienced participants in the project were the most vocal contributors on this topic. For example, Teleport Communications' Gail Schwartz and Paul Cain describe in their white paper the elements of central office interconnection facilities: interconnection electronics, cable, and services, plus local access and transport area (LATA)-based local exchange routing guides and the line information databases (associated with Signaling System 7) and other databases (for directory assistance, 800 numbers, and so on). They relate those facilities to business arrangements, noting that in the Internet "commercial service providers hand off traffic to each other with no settlements, no exchange of money, on a sender-keep-all or bill-and-keep basis."48 At the forum, Schwartz asked,
Are competing local exchange carriers terminating traffic on each other's networks going to pay each other or are they going to do a sender-keep-all arrangement? If they are going to pay each other, given that the traffic will be unbalanced for a long period of time (that is, the incumbents will be terminating a lot more traffic than the new entrants will be terminating), what terms will they adopt for doing that? If it is not bill-and-keep, which might not be feasible because of unbalanced traffic, then the economic basis for the exchange of traffic would normally be, absent a heritage of regulation, a capacity-based charge rather than a minutes-of-use charge. … If a new entrant has to price its own services in a manner that is tied to the discount plans and the other rate schedules of the incumbent, then the new entrant is economically impeded from using its portion of the interconnected network of networks to its full capability, and therefore recovering its own investment. …49
Although both established and new providers argue regularly in regulatory proceedings about the balance of traffic flows and cost incidence,50 there is a tendency over time for flows to and from most areas within the country (and the providers that serve them) to be balanced, at least in voice communications.
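The alternatives Schwartz describes can be illustrated with a small sketch contrasting bill-and-keep (sender-keep-all) with a per-minute settlement when traffic is unbalanced. The rates and volumes below are hypothetical, chosen only to show the mechanics:

```python
# Contrast bill-and-keep with per-minute settlements under unbalanced
# traffic. Rates and volumes are hypothetical, for illustration only.
def settlement(minutes_terminated_by_b, minutes_terminated_by_a,
               rate_per_minute):
    """Net payment from carrier A to carrier B.

    Each carrier pays the other for the minutes the other terminates.
    Under bill-and-keep the rate is zero, so nothing is owed
    regardless of how unbalanced the flows are.
    """
    return (minutes_terminated_by_b - minutes_terminated_by_a) * rate_per_minute

# Per Schwartz, the incumbent (B) terminates far more traffic than the
# new entrant (A). With a $0.01/minute settlement rate, the entrant
# owes the incumbent; under bill-and-keep, no money changes hands.
entrant_owes = settlement(5_000_000, 1_000_000, 0.01)
print(entrant_owes)                            # 40000.0

print(settlement(5_000_000, 1_000_000, 0.0))   # 0.0 under bill-and-keep
```

Schwartz's preferred alternative, a capacity-based charge, would instead price the interconnection trunks themselves, making the payment independent of per-minute flows altogether.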
The terms and conditions for interconnection are becoming a pressing issue as more and more networks connect with the Internet and with each other. Tim Clifford noted that arrangements between local exchange and interexchange carriers involve business mechanisms to track traffic flows and translate that information into financial settlements that are designed for switched voice and private line services; because the experience in telephony tends to exclude interconnection between competitive service providers, the past models may not apply. In their white paper, Powers et al. note that among the factors leading to greater interexchange carrier (IXC) provision of end-to-end services is "the interest of both IXCs and their customers in cutting the costs of the last-mile links, which are now such a large portion of the costs of providing long-distance telecommunications." Greater competition in local exchange or regulatory requirements for cost-based access charges will also affect those costs.
Local access charges also reflect another issue that has become urgent in the telecommunications reform debates: costs associated with unequal provider obligations. Baumol and Sidak (1994a, p. 196) argue that the costs of serving as the carrier of last resort and other regulatory "obligations are appropriately treated as sources of common fixed costs for the firm; the costs must be covered legitimately by the firm's prices."
Resolving the issue of settlements is far beyond the scope of this project, but contributors familiar with the Internet identified Internet-related interconnection arrangements as a sleeper that could have an enormous impact in the future. The commercialization of the Internet explicitly involves so-called network access points (NAPs). The NAPs interconnect Internet backbones (interexchange components) and are federally supported now but will not be indefinitely; there is also a Commercial Internet Exchange, which has connected member providers without settlement charges. Although NAPs provide a convenient location for interconnection, they offer no new mechanisms to facilitate settlements.
Internet protocol functionality makes settlements much more difficult to implement than in historical telecommunications systems, most obviously because the Internet has no concept of a "call" against which individual data packets can be counted and billed. This has been a major reason that many Internet service providers have not been able to craft agreements: even providers that have agreed to the principle of settlements still have not figured out how to act on it. In their "Economic FAQs About the Internet," MacKie-Mason and Varian (1995) assert that a system of settlements is inevitable for the Internet, because "resource usage is not always symmetric, and it appears that the opportunities to free-ride on capacity investments by other network providers are increasing." They caution that as of mid-1995, "the necessary technical, accounting, and economic infrastructure is not in place for NAP-related settlements."
Deployment of high-capacity and more general facilities raises questions about how much revenue can be generated, and how fast, to pay back the investment. An unfortunate consequence of the term "national information infrastructure" is the tendency for most people to assume that it implies something monolithic and uniform as a bundle of services or businesses. But just as it is increasingly clear that the NII embraces combinations of technologies that are optimized for the delivery of multiple services or service groupings, it is also clear that the NII does and will continue to contain multiple business and social models. Complicating the debate, the new constituents entering the national information infrastructure discussion bring with them different models and motives. These models drive experimentation with pricing schemes, which in turn drive the flow of cost recovery from end users and from information and on-line service providers that make use of facilities and even other services owned by others.51 End-user pricing arrangements can and probably will differ from the way carriers are or will be compensated for providing underlying network capacity to information service vendors. The pricing of intermediate components (e.g., access to fundamental communications facilities, interconnection among networks) will be an important determinant of market conduct and performance, and itself calls for more analysis.
Padmanabhan Srinagesh, an economist for Bell Communications Research, related the telephony investment posture to a business model that recoups large up-front investments through streams of small payments (see Box 3.11). Telephone (or cable) companies deploying fiber face a problem of asset specificity: the wireline access circuits, in particular, link the provider to an individual consumer who may or may not generate enough use to pay back the investment in a "timely" manner. Some kind of forward contracting (a long-term commitment to pay) could offset the uncertainty, although that is not a conventional market mechanism outside of satellite service or certain kinds of private networking, possibly because of regulatory constraints. However, in their white paper, Gong and Srinagesh note that "there appears to be an empirical trend toward term/volume commitments that encourage consumers of private line services to establish an exclusive, long-term relationship with a single carrier." A consequence appears to be that non-facilities-owning Internet service providers with such multiyear leases are themselves beginning to offer long-term pricing options to their customers.
BOX 3.11 Cost Recovery in the Telephony Model of Communications
If customers were willing to pay $1,600 for optical fiber and symmetric bandwidth to the home, the way they pay for a computer, a competitive market with many kinds of suppliers might emerge. The fact is that the communications industry works in a different paradigm, where the one-time costs of facilities are recovered through a steady flow of monthly charges. The risk that the asset may be stranded remains with the provider and not with the consumer. This is a major reason for slow change in the communications infrastructure.
—Padmanabhan Srinagesh, Bell Communications Research
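Srinagesh's point about stranded-asset risk can be restated as a simple payback calculation. The $1,600 up-front figure echoes Box 3.11; the $20 monthly margin is a hypothetical assumption:

```python
# Payback period for a one-time facilities investment recovered through
# a stream of monthly charges. The $1,600 up-front figure echoes Box
# 3.11; the $20/month margin is a hypothetical assumption.
import math

def payback_months(upfront_cost, monthly_margin):
    """Months of steady payments needed to recover the up-front cost,
    ignoring the time value of money and the risk of customer churn."""
    return math.ceil(upfront_cost / monthly_margin)

print(payback_months(1600, 20))  # 80 months, nearly 7 years
```

Discounting future payments would lengthen the payback further, and a customer who churns before month 80 strands the asset entirely, which is exactly why the risk allocation Srinagesh describes slows change in the communications infrastructure.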
Both NII 2000 project components (workshop, forum) and popular debates tend to characterize the telephony model as one that involves increasing consumer cost with increasing usage. Robert Crandall pointed out that the incremental charges themselves reflect quirks of the regulatory history:
In the case of the way states regulate telephone, we subsidize people's access to the system, but then charge them prices which are far in excess of costs to use the system. As a result, there is far too little usage of the communications networks we now have. … The assumption [appears to be] that we all subscribe to a telephone service in order to sit and either look at our telephone set and not use it, or to wait for some telemarketer to call us, so it doesn't cost us anything to pick up the telephone. Presumably if you want to use it, you should be allowed to use it at a price certainly no greater than the incremental cost to society of using it, and that isn't what we do today.
In short, the telephony price structure is not directly related to the cost structure, which confounds the business challenge facing telephone companies that are contemplating how to proceed with information infrastructure.
The business models associated with the NII involve different combinations of, and investments in, several key components: network facilities (e.g., circuits, switches, routers, base stations, antennas and receivers, head ends); content (e.g., text documents, still and moving imagery, data files—presented as articles, photographs and videos, databases, and so on) and associated facilities (e.g., storage servers, browsing and search systems); information appliances (e.g., telephones, televisions and set-top boxes, personal computers, personal digital assistants); and skills (implying time and training) of people developing, producing, delivering, and using the information infrastructure. They differ in terms of who bears what costs and risks to deliver and use information infrastructure, including what costs are internal to customers or providers and what costs a provider can pass on to customers.
Cost incidence is particularly relevant to understanding what it takes to achieve some degree of equitable access, since users vary by income, education, and other indicators of willingness and ability to use a given service. Provider costs are themselves uncertain. For example, early entrants in the on-line shopping business have found that "just being there" is no guarantee of success. Significant investments must be made in back-end systems as well as in linking the manufacturer to the consumer via on-line access. The process of figuring out what technology is needed, and where, can take time, producing a lag in matching business opportunity to consumer demand.52
Overall, variation in cost level and incidence as captured in different business models allows different parties to experiment with different offerings (goods, services, and price structures) at the same time, and it allows different kinds of providers to serve different kinds of customers with a range of prices, levels of performance and quality, and commitments of resources. Variation is a logical business response to the absence of a "killer app" and the enormous uncertainty about market demand. How much and what kinds of variation are sustainable over the long term, however, will depend on how the underlying economics of the information infrastructure evolve with the deployment of new technologies as described in Chapters 4 and 5. Another important but uncertain issue is the cost of accounting mechanisms necessary to support different approaches to intermediate and end-user charges. Such costs affect the relative appeal of charging on the basis of time of use, quantity of use, and/or access. Subject to applicable regulatory constraints, a range of business models should be observable over the next several years.
At least five models (and probably more), representing different combinations of cost recovery for content and for infrastructure, are evident today:
- Metered telephone: the consumer pays by the minute (for time using the network and level of traffic);
- Embedded or domain-specific: cost is hidden in some other service;
- Broadcast TV and radio (advertising): the advertiser pays (by the minute), and the consumer does not pay for service;
- PCs: the consumer pays for unlimited use of a device (including stand-alone use, independent of network-based services); and
- Cable or flat-rate telephone subscription: the consumer pays by the month.
The discussion below addresses how these various models are being applied to the evolving information infrastructure service menu. It highlights the emerging uses of the Internet, which is changing conventional charging and other business practices. Although the models overlap somewhat, they are intended to reflect different emphases and approaches to recovering the costs of services deployment.
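The cost-incidence differences among these five models can be made concrete with a small sketch. The dollar figures and per-minute rates below are purely hypothetical assumptions for illustration, not figures from this report; the point is only who bears the cost, and on what basis it accrues.

```python
# Hypothetical sketch of monthly cost incidence under the five
# cost-recovery models described above. All rates are illustrative
# assumptions, not figures from the report.

def metered(minutes, rate_per_min=0.05):
    """Metered telephone: the consumer pays by the minute of use."""
    return {"consumer": minutes * rate_per_min, "advertiser": 0.0}

def embedded():
    """Embedded/domain-specific: the cost is hidden in another product
    or service, so neither consumer nor advertiser pays directly."""
    return {"consumer": 0.0, "advertiser": 0.0}

def broadcast(ad_minutes, rate_per_ad_min=2.0):
    """Broadcast: the advertiser pays (by the minute of advertising);
    the consumer does not pay for the service."""
    return {"consumer": 0.0, "advertiser": ad_minutes * rate_per_ad_min}

def device_purchase(price=2000.0, life_months=48):
    """PC model: the consumer's device purchase, amortized over an
    assumed useful life, is the monthly cost of access."""
    return {"consumer": price / life_months, "advertiser": 0.0}

def flat_rate(subscription=20.0):
    """Cable/flat-rate telephone: the consumer pays a fixed monthly fee,
    independent of how much the service is used."""
    return {"consumer": subscription, "advertiser": 0.0}

if __name__ == "__main__":
    for name, bill in [("metered", metered(300)),
                       ("embedded", embedded()),
                       ("broadcast", broadcast(10)),
                       ("device", device_purchase()),
                       ("flat rate", flat_rate())]:
        print(f"{name:10s} consumer=${bill['consumer']:6.2f} "
              f"advertiser=${bill['advertiser']:6.2f}")
```

The sketch also shows why the models overlap in practice: a cable bill, for instance, mixes the flat-rate and broadcast entries, since advertising revenue offsets part of the subscription.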
Usage-based Fees for Communications and Information Services
The model of usage-based fees for services is most like the metered telephony model, which involves monthly payments on a volume-of-usage (e.g., message service units) basis. In this model, which embraces network-based services that are both special-purpose (Lexis and Nexis are good examples) and general-purpose (CompuServe and America Online are good examples, in the context of use beyond a monthly base level), the information user is the key economic factor. Market growth and competition pit perceived value against cost. Information consumers weigh the factors, and they vary sufficiently that the market has supported both special- and general-purpose, and both no-frills and high-value-added, services.53
Competition has kept prices declining, and services continue to improve in both usability and value. Perhaps 7 million U.S. homes now subscribe to some kind of on-line service, and frequency of use in those homes continues to increase.54 Communication, chat and electronic messaging, and common-interest bulletin boards and consumer information continue to attract large volumes of traffic and growth. Electronic transactions, such as investment activity (including stock transactions) and electronic bill paying, remain modest but continue to grow and proliferate.
Embedded or Domain-specific Services
Under the NII umbrella a number of domain-specific applications or services are being developed. It may increasingly be the case that the communications between businesses and between businesses and homes will be "included" in the pricing for core products and services. Embedded deployment of computing and network capability in the private sector is driven by the strategies of the businesses to which the infrastructure application contributes. Financially, such outreach by businesses may accelerate diffusion (assuming available facilities provide the necessary capabilities) inasmuch as their investments are amortized in a relatively short period of time. Associated costs for distributing information or communicating may become viewed as an element of overhead (somewhat analogous to climate control or electric power). This prospect suggests that much of the information infrastructure will become deeply embedded, like motors or solenoids, so that its costs are submerged.
For example, remote access to medical instruments and specialists targets physicians in their offices, providing rural and other hard-to-serve areas with greater resources via information infrastructure. Such services may provide PC-type hardware devices, software, and wide-area communications for physicians without charge.
Most of the currently envisioned health care uses of the NII in the home will be paid for by providers and payers of health care or suppliers such as drug companies. They will be "free" to consumers. In the context of home banking, an example of a domain-specific service model is Citibank's recent introduction of home banking services "free" to its customers, with Citibank supplying the necessary software and equipment (modem or terminal).55 Citibank evidently believes that this service will strengthen its competitive position and that it can absorb the costs through its core business revenues. Another example is provided by Pacific Gas and Electric Corporation, which is experimenting with an "energy channel" on cable television that allows users to more easily control their home appliances, lighting, and heating and other climate control functions. Although more efficient monitoring and control of energy consumption will ultimately save on energy costs, meter reading, and other expenses, Pacific Gas and Electric currently plans to charge consumers about $10 per month for this optional service. Kansas City Power and Light, on the other hand, plans to offer a similar service "free" to its customers because it believes that doing so will be profitable.
Although time will tell, domain contributors to the NII 2000 project expressed the belief that the Kansas City model would dominate. One reason is that support will also be provided by intermediate service suppliers. For example, Mastercard and Visa plan to use a software standard supporting electronic credit card transactions that will not involve charges
to consumers (Hansell, 1995b). Communications arrangements for that credit card system are left to consumers.56 That condition illustrates that a key to progress for embedded or domain-specific services remains the underlying or complementary relationships between consumers and communications providers.
The Broadcast Model
In the broadcast model, based on the traditional radio and television broadcast environment, the advertiser pays. In several instances both before and during the forum, contributors commented on broadcast model applications of the Internet component of the NII, for example, to bring information about products and services to niche and casual users without a fee, at least for access to and use of the information. The World Wide Web was the locus of much of this activity in 1995 and appears to be the place where this model is developing most rapidly. However, there are a number of other instances associated with specific businesses. For example, pharmaceutical companies are exploring on-line delivery of applications and networked information, which they intend to deliver free of charge, with advertising, to the doctor or clinic. One service—Physicians Online—has more than 50,000 users (Electronic Marketplace Report, 1995). The motivation is simple—facilitating the marketing of more quickly developed and more complex pharmacological compounds—and economically sound; it can be equated with the airlines' motivation in making computerized reservation systems available.57 The ubiquity of the Internet infrastructure adds the reach and efficiency of shared network infrastructure as the enabling ingredient. In the broadcast model, the user—for example, the medical professional—sees only the service; costs come in the time and training needed to gain the familiarity necessary to get to the information, in a real-time way, but there is no direct out-of-pocket cost to the end user.
Robert Crandall, speaking on behalf of an advertising group, commented on the historic role of advertising revenues in the growth of various media, despite regulation, and relative to subscriber-funded telephony. He argued that advertising should continue to be exploited as a source of financing for new information infrastructure, speculating on possibilities for a mix of advertising and direct subscriber payments—between services and for the same service: "[I]nvestors should be permitted to explore all possible sources of revenue from the marketplace if we expect them to commit such large amounts of capital to so risky an enterprise." Crandall's assessment is corroborated by a financial analysis for video services by Veronis, Suhler & Associates. Veronis, Suhler (1995) estimate that most households (which already pay more than $20 per
month for basic cable and $35 if they subscribe to premium channels) will be unwilling to pay a monthly incremental subscription fee of $10 to $15, a fee that would be necessary for recovery of investment in interactive video capacity at a rate of 10 percent per year. Hence they suggest an alternative strategy of offering interactivity at no extra subscription cost and paying for the upgrade with revenues generated by video on demand, a lower access fee (perhaps $5 per month), and advertising.58
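The arithmetic behind the $10 to $15 figure can be checked against the per-household capital costs of roughly $1,200 to $1,800 cited in the Veronis, Suhler fiber-network estimates discussed in the notes to this chapter. A minimal sketch follows; the simple cost × rate ÷ 12 formula is an assumption about how such a fee would be derived, not a reconstruction of the analysts' actual model.

```python
def required_monthly_fee(upgrade_cost_per_home, annual_recovery_rate=0.10):
    """Monthly fee needed to recover a per-household investment at the
    given annual recovery rate (10 percent per year in the text)."""
    return upgrade_cost_per_home * annual_recovery_rate / 12

# Per-household capital costs of $1,200 to $1,800 imply, at 10 percent
# per year, the $10 to $15 monthly fee cited above.
print(f"${required_monthly_fee(1200):.2f}")  # $10.00
print(f"${required_monthly_fee(1800):.2f}")  # $15.00
```

The consistency of these numbers suggests why the analysts turned to the alternative strategy: a $10 to $15 increment on top of a $20 to $35 cable bill is a large ask, whereas video-on-demand revenue, a $5 access fee, and advertising spread the recovery across several smaller streams.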
Reflecting on his experience managing Prodigy Services, Ross Glatzer suggested (while tacitly accepting the broadcast model) that
even the best of the on-line service advertisers—and some are very good—must get a lot better when they go on the Net, because most of their customers will be paying for the time to view their applications. … On the Net I believe the advertiser will end up paying for the number of footsteps that cross his threshold, and the amount of time that those feet stay in the store. … This then puts a premium on closing a very high percentage of sales.
As Glatzer's comments suggest, the nature of advertising will have to evolve to meet new service contexts. With the exception of telephony and possibly mail, all current media were developed in the service of a mass culture. Newspapers, magazines, and radio and television with mass audiences were encouraged and supported by mass advertising. Little commercial information was targeted to specific audiences or individuals. While the public paid part of the cost of media such as newspapers and magazines, this revenue represented only part of their economic value. Advertising did and still does provide the bulk of media revenue. Mass advertising in mass media represented the ultimate in downstream information in a society and economy driven by mass production, mass distribution, and mass consumption.
The case for advertising on the NII must be made in the context of what advertisers and consumers seem increasingly to want: direct contact with each other. These contacts will, through interactive information infrastructure, become dialogues. These dialogues have the potential to become enduring relationships among manufacturers, service providers or retailers, and the ultimate consumer. Unable to identify or separate loyal users from trial purchasers or identify heavy users of other brands who would be well worth converting, manufacturers have lacked the capability to create different advertising programs for each. It is the ability to acquire and build a base of loyal, long-term customers—suggesting a new definition of branding plus the ability to better measure what action or behavior either advertising or new product introductions have stimulated—that provides the profit leverage for manufacturers of products, providers of services, or retailers.
How information is used and the direction of its flow are key. Because real accountability requires real information, abstract criteria such as "share of voice" and "share of mind" will give way to a brand's "share of customers." Information infrastructure will provide manufacturers with the information they need to convert products into long-term services. In the future a brand may identify the quality of the relationship as well as that of the product. It will also help target both advertising and the charges for it: an increasing number of advertisers want to know what action or behavior their advertising has stimulated. Companies will not, in the future, manage sales, but rather customers: the blind spending of billions on advertising and promotion for product trials will be replaced by a more focused and longer-term discipline that will manage and increase the lifetime value of customers. Indeed, subscription and membership approaches to selling are already used to encourage the purchase of books, magazines, newsletters, audio recordings, video cassettes, participation in museum activities, telephone and cable service, on-line services, and even coffee and heating oil.
End-User Devices Paid for by Consumers
In the model based on consumers paying for use of a device, the information user invests the capital required to access the network (by acquiring personal computers and modems, for example). In the Internet or on-line services case, the user also pays for services; in radio and broadcast television the service is received without such fees. In both cases, advertising can be an important source of cash flow, reducing the charges to consumers. Consumer investments absorb some of the risk of obsolescence and of slower-than-forecast growth in demand; they also disaggregate the investment.
This economic model is subordinated to the broadcast and the consumer-pay (usage-based fees or access subscription) models. It assumes that consumers will possess more intelligent information appliances in addition to the nearly ubiquitous basic televisions and telephones.59 It also recognizes that fundamental to the growth of the Internet and private networking has been the growing capital investment by individuals and organizations in computer equipment and associated local networks, modems, and related goods and services. As discussed in Chapter 2, the growth in number, diversity, sophistication, and cost of household information appliances is a major indicator; as discussed in Realizing the Information Future (CSTB, 1994b), the early growth of the Internet in the research, education, and library environments and the growth of private networking among corporations and other large organizations illustrate the varying willingness and ability of organizations of different kinds and
sizes to make investments in intra-organizational network-related infrastructure (devices, local area networks, support, and so on), internalizing relevant costs.
Dependence on consumer-owned PCs raises two concerns, as noted in Chapter 2. The first is affordability. At $2,000, perhaps three to six times the cost of a television set, the average PC system is beyond the reach of many consumers, who may need to depend on access via large institutions (places of work, employers, schools, government agencies, libraries) or public kiosks. A second concern is upgrading home equipment to match changing service capabilities.60 Contributors involved with television made many observations about the long lives of television sets in homes, not least because consumers lack the incentive the tax code gives businesses to depreciate equipment expenses.
One middle ground might come from more versatile software, but there is debate on this point. Queried Stewart Personick, "Is there a model that people will get used to the idea that these things are really disposable and throw them out every 3 to 5 years? Is the model that service providers will subsidize them like wireless cellular telephones, and it will be possible to get a new one for $29? … The theoretical solution of a programmable [and therefore upgradable] box is not very likely." Service-provider ownership of access devices is one response, illustrated by leased set-top boxes provided by cable companies or by the bundling (without separate fee) of equipment with services (see the discussion above of the embedded or domain-specific service model). Yet that path raises other concerns, such as the incentives for multiple service operators to minimize functionality to contain costs or the incentives to minimize generality to preserve profits via control over services.
The Access Subscription Model
The access subscription model is best described by the cable TV and on-line information and communications (including Internet access) providers; it can also be seen in flat-rate telephony (so much per month for unlimited calls). In both cases the cost to the information consumer is an attachment charge. That charge may include lease of reception equipment (e.g., set-top box) by the service provider.61 Unlike the usage-based fee services, in this model one pays a flat-rate fee, usually on a monthly basis, for access to the facilities and information. Of course, illustrating the fact that these models are not pure, cable charges do reflect some offset from advertising revenues to providers, and they may relate to pay-per-view and/or premium services as well as the basic package. In the
access subscription model there is a basic assumption that use time is not a cost factor. For example, in basic cable service, the basic investment and the cable signal delivered over it are there: if 100 or 1,000 subscribers tune in, there is no infrastructure cost element to factor into the price. Pay-per-view cable, on the other hand, equates to the usage-based fees model, in which the owner of the information or intellectual asset wants to recover the cost or value of that asset by charging for its use.
A key question with this model is the nature of the service itself. If the service is essentially a commodity, it is hard for any single firm to remain competitive when each provider offers essentially the same thing. Thus, cable providers' prospects are affected by the introduction of direct broadcast satellite (delivering cable-like programming), and on-line service providers' prospects by the burgeoning entry into Internet access, including entry by providers offering little added value but low prices.
Payment Models and the Internet Phenomenon
As discussed at the forum, many economic projections relating to the NII seem to be building on the experience of the Internet. The Internet today embodies an infrastructure of sizable dimension, with perceived values in many areas of science, education, government, and commerce. It was built to one set of economic principles and is in transition to another, with all the attendant angst. In the recent past, the wide-area backbone of the Internet was provided through a contract from the National Science Foundation (NSF), reflecting the origins of the Internet as a research project within the U.S. government (primarily the Advanced Research Projects Agency of the Department of Defense and the NSF-funded use of the Internet in support of the academic and research community). But with the expansion of the Internet beyond these uses, and the entrance onto the scene of commercial providers, the NSF orchestrated a transition away from its backbone onto a fully commercial, competitive set of backbone providers. This transition has now occurred, and there are a number of Internet service providers that offer wide-area Internet service, as well as regional and local providers. Consequently, payment for services by users is increasing, through subscription services offering Internet access, while federal support (historically targeted to members of the research, education, and library communities but often overestimated by both beneficiaries and observers) via NSF is being restructured to further accelerate the maturation and commercialization process. This second-stage transition, like the larger development of the commercial market for Internet access, is still unfolding; financing for Internet access in research, education, and libraries continues to be a source of uncertainty and concern, as do other technical and business aspects of commercialization.
The Internet brings a unique perspective to this discussion from the unusual economic environment in which it exists. Fundamentally we have a very valuable and effective artifact and network of providers and systems that have been assembled according to a social benefit model. The costs of the infrastructure have been borne by the institutions, both governmental and private, that have desired the connectivity and services that the Internet provides. The information providers on the Internet likewise have contributed their intellectual assets and ideas freely, and the user or customer base in fact disdains (as evidenced by the tradition of "flaming") attempts by commercial enterprises to breach the free-flowing and noncommercial etiquette observed on the network. Against this backdrop, commercial interests have begun to take root, the growth and exploitation of the Web being perhaps the prime example.
The steering committee heard discussions reflecting a wide range of visions about how the Internet could be used in support of business. Some project participants spoke of the Internet as a vehicle among businesses: for electronic purchasing or joint design projects among several companies. Some project participants spoke of business needs that involved broad access to small business sites: offices of retail stockbrokers, insurance agents, or doctors. Some spoke of needing access to the home in support of their business: "telework," retailing, or home health care. And some believed that access to the consumer in the home was the business. As discussed in this report, these different visions have different implications for the business model and modes of payment, and for the needed infrastructure deployment. But the basic Internet services and functions can be provided in all these cases.
Growth conforms to a diffusion model characteristic of network markets, often modeled with an S-shaped curve.
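The S-shaped curve referred to here is commonly represented by a logistic function: slow early uptake, rapid growth in the middle, and saturation at the end. A minimal sketch follows; the saturation level, growth rate, and midpoint parameters are illustrative assumptions, not estimates for any particular market.

```python
import math

def logistic_adoption(t, saturation=1.0, growth=1.0, midpoint=0.0):
    """Cumulative adoption at time t under a logistic diffusion model.

    saturation : the eventual share (or number) of adopters
    growth     : how steep the middle of the S-curve is
    midpoint   : the time at which half the saturation level is reached
    """
    return saturation / (1.0 + math.exp(-growth * (t - midpoint)))

# Adoption starts near zero, passes half the market at the midpoint,
# and approaches saturation thereafter.
for t in (-4, 0, 4):
    print(f"t={t:+d}  adoption={logistic_adoption(t):.3f}")
```

Richer diffusion models for network markets (for example, those in which each new adopter raises the value of the service to others) add terms for these network effects, but the basic S shape persists.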
Concerns raised by cable companies netted this response from U S West: "The key is not whether U S West or any other local exchange carrier can predict video dial-tone revenues and costs with any certainty—they cannot—but whether other telecommunications users will be harmed as a result of allowing LECs to provide video dial-tone service." See Telecommunications Reports (1994).
"Here was a company with negative earnings, $17 million in half-year sales—and a market capitalization of $2.2 billion at the close of August 9." See Hardy (1995), p. 206.
The problems of measuring and tracking new products and industries are well known; typically, relevant activities have to achieve some degree of scale or volume to be captured in reliable statistics.
Note that for existing entities there are investments to maintain or replace existing plant and equipment as well as investments for upgrades and growth, implying some need to net out what is investment for growth or upgrading.
in their white paper as much less costly, at about $120 to $130 per home passed. This would give cable television operators, who already have coaxial cable networks, a substantial lead in deploying HFC. The review by the steering committee of network deployment plans by cable and telephone firms tends to bear this out.
A July 1994 analysis by financial analysts at Veronis, Suhler & Associates estimated capital costs for deploying a new fiber-optic cable television network as being between $1,200 and $1,800 per household. They identify the cost of file servers and set-top boxes for two-way communications as variables that may drive the network cost higher than these estimates. As discussed in the white paper by Bailey and Chiddix, however, these are incremental costs; while the cost of building the basic transport infrastructure must be incurred completely before any services can be carried, incremental investments supporting interactivity and video on demand can be incurred gradually, while market demand for services builds.
Robert Blau of BellSouth and Howard Frank of the Advanced Research Projects Agency noted the constraints on accounting rates for depreciation and amortization and for other sources of return on investment in wireline telephony.
Similar licensing rules will likely affect the digitization of television broadcasting by setting a date for the termination of analog transmission. In the broadcasting area, the problem is not one of local access, given essential saturation of
the market, but rather one of the quality and other features of the service. The essential investment issues relate to movement to digital systems and eventually to advanced television (ATV) service(s). See Information & Interactive Services Report (1995a) and Telecommunications Reports (1995a,b,c,d,f,h,k,l, and m). Note that positive government reactions to PCS auctions have fed proposals to auction digital television spectrum.
See Wireless Messaging Report (1995d,e). Wireless data and local area network offerings are expected to help fuel the wireless market. See Wireless Messaging Report (1994).
See, for example, the white paper by Jiong Gong and Padmanabhan Srinagesh.
See Clark (1995a), Elliott (1995b,c), and Goldman (1995). See Barboza (1995) for a discussion of the Internet as a medium for advertising.
Kessler (1995) noted that "the fundamental values of broadcasting licenses have topped. … As PC screens become more televisual, they will attract still more eyeballs. … TV is not dead, but its economic model is subject to revision."
See Anthes (1995a), Blodgett (1995), Gillin (1995), and Messmer (1995b).
For example, Intuit has contracted with several banks to use its Quicken software for individuals to transfer money between accounts, pay bills, and extract
account data for use on PCs. Actual transactions will be processed through a service unit. See Flynn (1995).
See Liebowitz and Margolis (1994) for a discussion of the literature on and definitions of network effects and related phenomena, as well as work by Economides (1994) and Economides and White (1994).
"Commonalities," they continue, "include standard or conventional interfaces, protocols, reference architectures, and common building blocks from which applications can be constructed to deliver information services to end users."
Netscape is also facilitating Internet service provider access (see Booker, 1995b; Corcoran, 1995b; and Information & Interactive Services Report, 1995b), as are some PC vendors (see Zelnick, 1994).
Cauley (1995b) and Information & Interactive Services Report (1995c). For other examples, see Robichaux (1995a) and Ziegler (1995c).
Bell Atlantic is taking this tack to experiment with "wireless cable"; see Mills and Farhi (1995).
See the white paper by Robert Powers et al.
For an in-depth discussion of interconnection pricing, see Brock (1995).
Schwartz explained that in New York, NYNEX and TCG have a relationship for exchange of local traffic using a capacity-based port charge, where loops and ports are unbundled and the inbound traffic is about a quarter of the outbound traffic.
Issues raised in regulatory proceedings include whether incumbents or competitors have higher costs for terminating calls. See Telco Competition Reports (1994a,b).
Since many subscribers use more than one service, the number of on-line
service accounts (approximately 11.3 million as of late 1995) overstates the number of households (Arlen, 1995b,c; 1996). By contrast, note that the increase between 1993 and 1994 in pager subscriptions was about 7 million (Wireless Messaging Report, 1995c).
See Booker (1995a) and Sullivan-Trainor (1995).
While there is much discussion about interactive television (broadcast and cable), that alternative is not likely to rival services delivered via PCs between now and the year 2000. However, whereas with cable the providers make investments and spread costs among subscribers, with direct broadcast satellite (DBS) consumers bear substantial capital costs up front. Market research data suggest that consumers spent between $0.5 and $1 billion on digital DBS receivers in their first year of availability (Communications Daily, 1995d). This compares to a total U.S. business local-area network investment of approximately $5 billion (see Chapter 5 section, "Data Communications"). See Markoff (1994), Landler (1995), and Shenon (1995).
As discussed at the January workshop, whether set-top boxes should continue to be leased or be made available for purchase is a matter of debate within the cable industry.