2
Trends and Directions

Although it is premature to predict which technologies and applications will succeed, general directions in underlying technologies can be discerned. This chapter outlines prospects in multimedia and interactive technologies and explores applications, particularly in entertainment and education. Not intended to be comprehensive, this chapter aims to depict the fluidity of the technologies and the uncertainty of the trends they are motivating.

THE OUTLOOK FOR MULTIMEDIA GOODS AND SERVICES

Enabling interactive multimedia to flourish are rapid advances in computer hardware, driving changes in telecommunications and other industries. Relevant hardware includes basic devices (e.g., microprocessors, memory chips and other storage devices, drives, access devices), larger computer systems that use those devices (e.g., personal computers, telecommunications switches), and other information appliances (e.g., television systems, set-top boxes).

Advances in hardware performance (and consequent reductions in the cost required to achieve any given level of performance) as measured in quantitative terms will continue to be made at very rapid rates. In the past 30 years, for example, the density of integrated circuits has improved by a factor of 10 million—seven orders of magnitude!1 Such incredible progress in such a short period of time has led to significant changes in the kinds of products that can be built from such powerful components. This rate of improvement is expected to continue at least until the turn of the century, when complex microcircuits may



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.



contain hundreds of millions of transistors, David Nagel, senior vice president and general manager of AppleSoft, said. In a very real sense, convergence has been enabled by advances in digital hardware technology, which have in turn been driven by development of engineering processes that result in the ability to etch more and finer lines on crystals of silicon. Better lithography and more lines per millimeter enable the creation of greater and greater numbers of the active elements of digital signal processing—transistors—on any given area of silicon. One of the early pioneers of the integrated circuit, Gordon Moore (now chairman of Intel Corporation), coined a heuristic that describes the doubling in this ratio that has occurred almost like clockwork for the past 25 years, a heuristic now known as Moore's Law (Karlgaard, 1994). Having more and smaller transistors on a single chip of silicon allows those transistors to be operated at lower voltages and with lower switching currents. This in turn allows the transistors to be operated faster and at lower electrical power levels, with the net result that the microprocessors that have evolved from the earliest 10-transistor integrated circuits now have computing speeds measured in hundreds of millions of instructions per second. At these speeds, real-time multimedia computing becomes possible; advanced compression algorithms that can reduce the required bandwidth of communication systems can be implemented with circuits at price points compatible with mass markets. Although physical laws limit how small lines can be made using the current optical lithographic technologies (a limit that would be reached shortly after the end of this century), new technologies using shorter-wavelength etching beams (e.g., x-ray, plasma) that allow significant progress below the limits imposed by visible-light lithography are now in early prototyping stages. 
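The figures above—a factor-of-10-million improvement over roughly 30 years, driven by the clockwork doubling Moore's Law describes—can be checked with a few lines of arithmetic. This is a rough illustration only; treating growth as steady doubling is the usual statement of the heuristic, and the numbers are the chapter's, not precise engineering data.

```python
import math

# Moore's Law as a back-of-envelope check: a factor-of-10-million
# density improvement over ~30 years, assuming steady doubling.
improvement = 10_000_000  # seven orders of magnitude
years = 30

doublings = math.log2(improvement)            # ~23.3 doublings
months_per_doubling = years * 12 / doublings  # implied doubling period

print(f"{doublings:.1f} doublings, one every {months_per_doubling:.1f} months")
```

The implied period of roughly 15 months falls within the 1-to-2-year range commonly quoted for the heuristic, consistent with the "almost like clockwork" characterization above.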
However, new tools of all sorts are also needed to manage the incredible complexity of circuits that contain tens or hundreds of millions of switch elements. While design tools are being developed that can automate much of the design, a new problem has arisen for which no immediate solution is apparent: How does one adequately test these circuits? But while the problems of designing and testing massively complex chips—the very fastest hardware systems—may begin to slow progress by the end of the decade, it certainly will be possible by then to combine the computing performance of today's fastest chips with everything needed for a complete computer on a single chip. These general-purpose computing systems—complemented by more limited-purpose logic for performing tasks such as high-speed compression, decompression, and even recognition of audio signals—will enable engineers to build digital systems with the overall computing power of today's supercomputers in products priced like commodity consumer electronics. In fact, the term "computing," which implies numerical computation—arithmetic—no longer adequately describes products currently being designed and sold based on high-performance digital technology.

Thus, quantitative improvements in process and component technologies have led to qualitative differences in the products built from them. One significant change has been the creation of products described as multimedia personal computers (PCs)—"multimedia" in this context emphasizing the real-time creation, processing, and presentation of sound and still and moving images or video. As discussed in Chapter 1, multimedia computing has enjoyed a rapid ascendancy in the marketplace, since images and sounds can in many situations be more efficient than text for transmitting information to humans. Today, multimedia PCs outsell by a wide and growing margin personal computers without the ability to manipulate sound and video. For multimedia computing to be practical for products associated with mass markets, microprocessors have had to evolve to the point that they are capable of executing millions of instructions per second. Equivalent advances in memory and mass storage have also been key, because information in the form of high-resolution images may contain tens or even hundreds of megabytes of information in typical electronic publishing scenarios. Illustrating the impact of advances in hardware is the digitization of video generation and delivery, accelerating and enhancing the coding and decoding of video signals, including associated compression of video signals (necessary for all digital delivery formats, including cable, wireless, or packaged media). Compression is important for expanding the capacity to deliver a variety of telecommunications services over a given infrastructure (e.g., a network provided by a telephone company or cable systems operator) and to increase the flexibility for a company to offer different packages of services (possibly obtained from different vendors) to different target audiences (Hodge, 1995). Costs, quality, and tradeoffs between the two are changing with advances in the fundamental technology. 
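The bandwidth argument for compression above can be made concrete with a rough calculation. The frame size, color depth, frame rate, and compression ratio used here are illustrative assumptions rather than figures from the chapter.

```python
# Back-of-envelope: uncompressed digital video versus a compressed
# stream. All parameters are illustrative assumptions.
width, height = 640, 480   # pixels per frame
bits_per_pixel = 24        # full color
frames_per_second = 30

raw_bps = width * height * bits_per_pixel * frames_per_second
compressed_bps = raw_bps / 100  # ~100:1, in the range of MPEG-era codecs

print(f"raw:        {raw_bps / 1e6:.0f} Mbps")        # about 221 Mbps
print(f"compressed: {compressed_bps / 1e6:.1f} Mbps")  # about 2.2 Mbps
```

A two-order-of-magnitude reduction is what makes digital video deliverable over cable, wireless, or packaged media at all; without it, a single uncompressed stream would saturate even a high-capacity distribution channel.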
For example, signal processing on a microprocessor chip may obviate the need for larger amounts of special-purpose hardware supporting multimedia applications; at least one such product has been announced.2 Major computer systems and workstation manufacturers have meanwhile been developing and targeting higher-end systems as engines for video-on-demand services and associated market tests by cable and telephone companies. Experience with CD-ROM technology, popular for storage and retrieval of multimedia products, provides some insight into the current market. CD-ROM refers to a high-capacity compact disk that can store text and video as well as sound. The number of interactive CD players in U.S. homes has been growing dramatically as a result of declining cost and proliferating software. Estimates of hardware sales and the installed base vary, but overall (homes and businesses), some 4 million CD-ROM players were sold in the United States in 1993, and the installed base in 1994 was expected to grow to 10 million to 16 million (Landis, 1993; Johnson, 1993; Flynn, 1994; Samuels, 1994a). Despite this growth and apparent potential, consumers have voiced dissatisfaction with the problems of configuring or affording PC technology to use CD-ROMs
satisfactorily, and also with the quality of the CD-ROM software available (Carlton, 1994b). These problems, while possibly temporary and typical of relatively new technology (and expansion into a relatively unsophisticated consumer market), raise questions about consumer acceptance and about the durability of the format over time unless early problems are solved quickly and affordably. At a minimum, there is a need to improve data rates (which affect the quality of the video image that can be accessed), storage capacity, digital data reading technologies (e.g., through new laser techniques; Rosen, 1993), and response time for interactive uses (Carlton, 1994c). As Esther Dyson of Edventure Holdings Inc. observed in a late-1994 interview, enhanced speed will be an important factor in gaining broader applicability and consumer acceptance.

In the next several years, even greater performance and higher levels of circuit integration will enable the development of products that have all of the capabilities associated with multimedia personal desktop computers today but that are much different in design and function from the PCs typically found in homes, schools, and offices. Based on established trends, we can expect several lines of development. In one, characterizable by miniaturization and mobility, we will see the functions of desktop systems available in devices the size and weight of today's electronic calculators. Products of this sort3 will be made possible because (as noted above), as circuits of a given level of performance are made smaller and smaller, they can be made to consume less and less electrical power—and thus can be made battery powered and portable. High-performance wireless digital communication systems—packet radio, infrared, digital cellular, and so forth—have also been enabled by the general evolution in hardware technology.
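The data-rate concern raised above can be put in numbers. A single-speed CD-ROM drive sustains 150 kilobytes per second (the standard base rate), with faster drives rated as integer multiples of it; the roughly 1.5 Mbps compressed-video target used for comparison is an assumed MPEG-1-class rate, not a figure from the chapter.

```python
# CD-ROM drive throughput versus a compressed-video stream.
# 150 KB/s is the standard single-speed ("1x") rate; the 1.5 Mbps
# video target is an assumed MPEG-1-class rate.
BASE_MBPS = 150 * 8 / 1000   # 150 KB/s = 1.2 megabits/s
video_mbps = 1.5

for speed in (1, 2, 4):
    mbps = speed * BASE_MBPS
    verdict = "sustains it" if mbps >= video_mbps else "falls short"
    print(f"{speed}x drive: {mbps:.1f} Mbps -> {verdict}")
```

Under these assumptions a single-speed drive cannot keep up with even one modest video stream, which is consistent with the consumer complaints about video quality and responsiveness noted above.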
In a second line of development, the object is not so much to reduce the size and increase the portability of devices as to reduce the cost. In this category will be found digital television and other products that have been described as digital information appliances. Key to the commercial success of such products is the achievement of manufacturing costs in the range of those for other kinds of appliances—roughly a factor of 10 less than the costs associated with today's personal computer products. Such improvements are achievable well before the end of the decade because it will be possible to shrink the entire circuitry associated with high-performance multimedia PCs today to a single integrated circuit. One indicator of this progress is the introduction of PC-based devices including functions historically found in radios, televisions, telephones, answering machines, and fax machines (Associated Press, 1994). The addition of a central processing unit—a computer—could enable televisions to select a personalized viewing schedule, automatically record material, and provide memory in a hard disk, random access memory, or other solid-state device (Rosen, 1993). Development of such technologies is under way.4 These innovations not only store programming but also meet an important emerging need, offering consumers assistance (often referred to as "navigation") in sorting
through the expanding range of choices. This type of information processing/systems integration service illustrates the emerging competition for consumers between televisions and more general-purpose appliances, specifically PCs. While PCs may come to play a greater role within the household, computers may proliferate within the household without looking like computers—they may be increasingly embedded into consumer electronics.5 What may continue to differentiate appliances called computers is their general-purpose nature, including their expected use with a wide range of software, while more specialized appliances may run more specialized software. One widely discussed product is the consumer device associated with interactive television systems, a digital version of the so-called set-top box (STB). The first product that could be legitimately termed a digital STB has already appeared in the consumer market in the form of a decoder for digital satellite television broadcasts. While only weakly interactive, since digital satellite today supports high bandwidth only in the outbound direction (the current systems allow "inbound" or customer response by virtue of an analog phone connection in the back of the product), these products clearly point the way to the future.6 The communications companies and the entertainment industry have all become actively interested and involved in the development of both the infrastructure (see "Networks" section below) and the products necessary to enable creation of interactive television. But the key to commercial success for interactive television depends on the ability to design and manufacture inexpensive STBs (or comparable television set components).
Based on historical rates of improvement, we can anticipate that the necessary technology base to support products with the performance of today's multimedia PCs and high-performance graphics workstations but with the price points of consumer electronics systems will be achievable within the next two to three years. Since interactivity will allow the creation of entertainment, communication, and other products unlike anything in the market today, these new capabilities could significantly alter the economics of all three industries: computing, communications, and entertainment.

Software

A major obstacle facing multimedia developers is the scarcity of appropriate—compatible and consumer-desirable—software. Software products are proliferating, ranging from crude translations from a traditional medium (e.g., print) to more innovative (if experimental) designs that challenge or provide a sense of control to the user (such as the computer game "Myst"; see Rothstein, 1994). Also lagging are software design and computer programming capabilities—reflecting shortcomings in design and method as well as engineering effort—with the result that new devices cannot be exploited fully. Software will be the key growth area in the years to come, according to Nagel, but it progresses on a
slower time scale than does hardware. Explaining the centrality of software during a late-1994 interview, Richard Notebaert of Ameritech remarked, "The real question that we are faced with is not the technology. It's how we interact with the technology. How do we use point and click? How do we get voice-activated computers? The real question is in the world of software—the ability for humans to be able to use the software in a way that's as simple as dialing the telephone." Nagel contended that two trends could bring about significant progress. The first trend is the development of object-oriented programming and component software. Today, software is monolithic. To make software distinctive, developers add new features to existing applications, creating "bloatware" that has marginal value, Nagel said. But eventually software may be sold in pieces, and users could assemble their own applications by selecting a framework and a set of components—including images, sounds, and text. This practice could invite a dramatic expansion of the software industry, allowing for more innovative applications, he said. A second emerging trend is platform-independent software, such as, for example, software that can run on both PCs and Macintosh platforms. Developers currently write software to conform to applications programming interfaces, which are unique to each platform. The platform-independent principle should be incorporated into the national information infrastructure, Nagel said, because it would invite broad participation and innovation, which would hasten the evolution of the system. Software development tools for cross-platform systems are immature (Millison and LaGrow, 1993). Generation of multimedia content implies a need for creativity as well as tools and technology. As with theatrical motion pictures and television programming, experimentation—which requires time and some freedom—should be expected. 
For example, the early motion pictures were essentially stage plays that were filmed, using stage makeup and stage acting techniques. As the producers became more familiar with the medium of film they developed new techniques that were specific to motion pictures. On the other hand, digitization in the music industry has until recently (Bermant, 1994) proceeded largely independently of creative authorship (except, perhaps, for sampling). Thus, while creative authorship, per se, may not drive technology change, it will respond to opportunities presented by new technologies and, over time, may generate new requirements. In general, explained Steven Wildman in a late-1994 interview, the nature and mix of creative talents that provide comparative advantage to individuals and organizations will change.7 Skills—including those of graphic designers, film and video producers, scriptwriters, and teachers—will be needed in how to communicate in new media using digital tools. In a follow-up interview, Robert Stein commented on how evolving multimedia products call for new approaches to narrative and storytelling, approaches that will account for and anticipate the growing ability of the
user or audience to control the pacing and sequencing of story elements. Similarly, in producing music products, "over the next couple of decades musicians will come to the fore with a broad set of skills. They won't just be able to sing and play the guitar; they'll be able to do that in both a visual sense like with music videos, and then also in an interactive sense. You're going to see musical presentations that are almost like mini-operas, where a CD isn't just for sound, it's also pictures and motion pictures, etc., and all under the control of the listener." At present, the nascent market means that job opportunities are small in number.8 But colloquium participants emphasized the need for skills and insight, particularly the need for prospective authors who are bright, curious, and motivated. Alexander Singer, a film director, said the requisite mechanical skills can be taught and learned; what is more important is innate desire: I have taught this stuff and it's learnable. … I taught film production techniques to a group of disadvantaged minority youth 25 years ago, and I was successful. Some were barely literate; a number had police records. … What I found is, they were hungry to have a voice, and the hunger was three-quarters of the game. Several of them found their way into the entertainment industry, got paying jobs, and do good work. The methods [for teaching] will be found. I think in terms of backgrounds, I would infinitely prefer somebody who was a thinking person and who had a background in the humanities, but who also had the capacity to be wide ranging, to anybody who had a specific skill, all of which I think is learnable. Stein agreed, adding that such ideal students are difficult to find. "We're missing young people who have minds that are working, that have the ability to take complex subject matter and go out and wrestle it to the ground in a way that is satisfactory, and to bring it back to a broad audience in a way that makes any sense. 
"The schools are producing all kinds of kids with computer skills and graphic skills, et cetera. It's one in a million that seems to be able to understand what to do with those skills. …" On a more practical note, Sherry Turkle, a professor of the sociology of science at Massachusetts Institute of Technology (MIT), said cross-sector partnerships are needed to provide training in software-writing skills. She offered as evidence MIT's $70 million ATHENA experiment with computers in education. The MIT faculty was to write the software in this project, on the theory that their innate intelligence and professional expertise would be equal to the task. However, the experiment demonstrated that writing appropriate software requires specialized cognitive skills and an artistic as well as a scientific approach; new skills or high-quality teamwork are required, Turkle has said.9 "I think you really do need partnerships [involving] people who are in the entertainment business, who are in the computer business, [where] the technologists are learning the anthropology, the cognitive science, the psychology, the sociology, or [they] are learning how to do studies of societal impact, of personal impact, of how people think about thinking, of how people use software, how
people learn. …" Whereas special talent and training may be needed to generate commercial products, however, the Internet phenomenon also points to an alternative path: that of enhancing the infrastructure to the point that individual users and even commercial authors can develop content products without having to "program" as we now understand that art.

Networks

Building on but also paralleling advances in hardware is progress in networking. The full exploitation of advanced information technologies depends on networks that provide connectivity among input, display, processing, and storage systems dispersed within offices, buildings, campuses, cities, and beyond. From the user's perspective, networks can facilitate instant access to resources and integrated tools for collaboration, as well as more options for undertaking different kinds of activity from the home. George Gilder, for example, noted that more and more PCs are being connected to networks: the percentage of PCs that are interlinked grew from only 7 percent in 1989 to 60 percent in 1992. (Nagel suggested that this number will approach 100 percent by the end of the decade.) Improving upon and integrating various networks, advanced information infrastructure can provide a delivery vehicle, user interfaces, and interfaces among the components that make up interactive multimedia systems. Nagel put current trends in computer technology into a larger historical context:

Despite its young age, the computer has already gone through several wrenching and displacing changes. In the 1950s and 1960s, the mainframe computer was really the dominant paradigm or the dominant mode of interaction for computing and calculation. The mainframe was supplanted as the sort of center of gravity in the industry, both economically and intellectually, shifted to the minicomputer in the 1970s.
In the 1980s, a third shift took place from the minicomputer to the microcomputer, again driven in large measure by process improvements and integrated circuit technology. Today, we appear to be poised for yet another major shift from microcomputing to perhaps intimate computing and also, to an era of ubiquitous networking or universal networking. Nagel's emphasis on connectivity of computer-based devices and uses of networking contrasts with the current multimedia product emphasis on platforms that may be used apart from networks, notably CD-ROM players and game systems, although even these devices are being integrated into or substituted for by network-based applications. Network-based applications facilitate interaction among people, whereas stand-alone applications emphasize interaction between people and computers. As Dyson observed, "interactive" can cover both concepts, but the distinction may be relevant to both the technology chosen and the impacts on the users. Thus, although questions have been raised about the prospects for CD-ROM
applications to devolve to on-line information services, observers differ as to whether the market will shift largely to one format or the other or whether it will evolve in two tracks. An assortment of transmission networks is now operating or under development in the United States. Each network has its strengths and weaknesses; to date, none constitutes a complete national information infrastructure, but collectively they are fragments of the foundation. Telephone companies, limited today by the existence of vast networks of twisted copper pair wiring in the local subscriber loops, are seeking ways of upgrading these low-bandwidth analog delivery systems to support digital transmission of information at rates required for display and transmission of high-resolution imagery and sound to—and from—digital STBs. Cable TV companies (multiple system operators) already possess a high-bandwidth distribution system (coaxial cable can support transmission of digital information in the range of hundreds of millions of bits per second). These systems, too, require significant upgrading if they are to support interactive television services since they were designed for transmission of analog television signals in one direction (outbound) only and lack both the necessary topology and switching or routing capabilities for two-way transmission of digital information. Both the plant and potential mix of services are changing to facilitate delivering the much-touted 500 video channels to the home that some system operators may offer, and digital servers and other digital facilities for delivery of movies on demand, interactive games, home shopping, and even telephone services. 
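The much-touted "500 channels" figure above can be sanity-checked with simple arithmetic: divide a cable plant's usable spectrum into 6 MHz slots (the bandwidth of one analog television channel) and pack several compressed digital streams into each slot. The spectrum and streams-per-slot figures below are illustrative assumptions.

```python
# Sanity check on the "500 channels" claim: usable coaxial spectrum
# divided into 6 MHz analog-channel slots, each carrying several
# compressed digital streams. Parameters are illustrative assumptions.
usable_spectrum_mhz = 550  # an assumed upgraded coaxial plant
slot_mhz = 6               # one analog TV channel
streams_per_slot = 6       # compressed digital streams per slot

slots = usable_spectrum_mhz // slot_mhz
print(f"{slots} slots x {streams_per_slot} streams = ~{slots * streams_per_slot} channels")
```

Under these assumptions the plant yields on the order of 550 digital channels, which shows why digital compression, rather than new cable, is what makes the 500-channel scenario plausible.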
In particular, cable distribution is evolving to a combination of fiber and (for short residential links) coaxial cable, a hybrid approach also being adopted in many quarters of the telephone industry.10 Indeed, the cable industry is beginning to examine providing basic telephony services using some small fraction of the great bandwidth provided by their distribution systems. The rapid advance of wireless technologies holds the promise that personal digital assistants and other forms of wireless communications eventually will be competitive with wired communications for a growing variety of applications.11,12 Wireless technology is not new; television already is delivered by radio signals broadcast over the air, wireless cable (microwave radio frequencies used to deliver signals to rooftop antennas), and satellites; satellites have fostered the development of cable television, providing cable networks with an affordable method of linking thousands of cable systems. (Satellite broadcast services are growing rapidly in Europe, and U.S. satellite capacity continues to expand; NTIA, 1993.) The cellular telephone business, launched in 1984, has grown into a more than $10 billion per year business with well over 15 million customers, with continued growth expected. Growing competition, miniaturization of computer power, the imminent digitization of cellular phones (which will use the radio spectrum more efficiently than do their analog counterparts), and the low cost of establishing links to homes in comparison to fiber networks are among the factors behind this trend. The federal auctioning of licenses for personal communications systems
networks should provide a further stimulus to wireless technology, application, and market development.

Expanding Bandwidth: Is There Enough?

A fundamental issue in digital convergence is how to assure sufficient bandwidth to transmit huge volumes of digital bits very rapidly, over a wide area. Video packaging and delivery require a substantial amount of storage and transmission bandwidth in comparison to simple numbers, text, or even still images. Video raises additional technical concerns related to multicast communication, quality of service, protocol support with the increasing deployment of asynchronous transfer mode (ATM), and so on. Gilder was extremely optimistic about the technological prospects, claiming that the transmission speed of fiber-optic technology will multiply to the point that bandwidth will cost virtually nothing.13 "That changes fiber from a replacement for copper to a replacement for air," Gilder said. "It makes possible a huge expanse of bandwidth. It means that all the various technologies that have been based on the assumptions of scarce bandwidth are deeply problematical in this next era. I think a major driving force of the world economy over the next 20 years is going to be the plummeting cost of bandwidth, of communications power. …" Although optimistic, Gilder did not address timing or what it takes to achieve a truly national capability in a timely manner—the kinds of issues being addressed in the context of telecommunication and information infrastructure policy debates. The meaning of available bandwidth is not always clear. For example, the fact that a given network may carry aggregated traffic at 2.5 Gbps on its "backbone" links does not mean it will be able to transmit large numbers of simultaneous multimedia sessions that include video (or that any individual user will experience a 2.5-Gbps rate).
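The backbone caveat above can be put in numbers: an ideal count of how many compressed video sessions a 2.5-Gbps link could carry, ignoring protocol overhead, switching limits, and the much slower access links individual users actually see. The per-stream rates are assumptions in the range of MPEG-era codecs.

```python
# Upper bound on concurrent compressed-video sessions over a
# 2.5-Gbps backbone link. Ignores protocol overhead, switching,
# and access-link limits; per-stream rates are assumptions.
backbone_bps = 2.5e9

for label, stream_bps in [("1.5 Mbps streams", 1.5e6),
                          ("4 Mbps streams", 4.0e6)]:
    sessions = int(backbone_bps // stream_bps)
    print(f"{label}: at most ~{sessions} concurrent sessions")
```

A few hundred to a couple thousand simultaneous sessions is far short of mass-market video on demand, which is why aggregate backbone capacity alone says little about the multimedia service a network can deliver.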
Although some commercial networks already operate at 155 Mbps, a very high speed that remains difficult to exploit, even 10 Mbps or less to an individual user can support some very sophisticated interactive multimedia applications. But key technology goes beyond simple capacity to include the ability to reserve adequate bandwidth for a given period and use, the ability to use certain kinds of signaling in unswitched versus switched environments, and so on.

Interconnection and Interoperability

The number and variety of networks give rise to concerns about means of interconnection and functional exchange of information or interoperability. Eli Noam, professor of finance and economics at the Columbia Graduate School of Business, foresees a future "system of systems" managed by both traditional carriers and the systems integration industry, which would buy transmission
capabilities and offer packages of services to consumers. This scenario would have the effect of "transforming the entire medium of communications into a system of systems in a software sense, superimposed on a physical structure of a network of networks," Noam said. The complex of technologies, services, and systems integration activities anticipated by Noam is already in evidence in the Internet, where service development has been initiated by individuals and research teams but subsequently accelerated and diversified by commercial ventures (CSTB, 1994b). Indeed, the variety of services, service providers, and even customers is richer, and the fit between supply and demand greater, in the Internet than appears evident in the various interactive and other advanced television trials to date. One reason for that variety is the Internet's support for symmetrical (two-way) communications, with which individuals can supply as well as receive information, content, and services. Whether the Internet leads or becomes subsumed in the development of a yet larger and more complex information infrastructure, the commercialization of the Internet provides a laboratory for observing the interplay of technology, business development, and ancillary policy considerations.14

Michael Borrus, co-director of the Berkeley Roundtable on the International Economy, offered a somewhat different prospect. He perceives a "portfolio" of networks emerging at different rates, managed separately and for different purposes. He contends that there is no ideal configuration and that flexible design permits specialized networks to meet the needs of particular user groups. All networks are evolving toward expanded bandwidth and increased intelligence, Borrus said, but each is driven by different market and policy dynamics and is evolving different applications.
"Each of these … networks, in very different ways, requires very different degrees of privacy, of liability and the like," Borrus observed. "It isn't at all clear to me that one all-singing, all-dancing infrastructure can provide that diversity at the moment, or whether it should provide that diversity." Borrus' reservations foreshadowed the debates that have emerged subsequently over the federal initiative to advance the nation's information infrastructure. Those debates revolve around the issue of what it means to have "a" national information infrastructure, especially if such an entity is not monolithic or homogeneous and is not financed or controlled by a single entity. Borrus' argument suggests a greater polarity of alternatives than may be feasible, especially from the perspective that the information infrastructure will grow from existing roots. CSTB has explored these issues in its 1994 report Realizing the Information Future, which argued the need for an overarching framework or architecture that could tolerate and accommodate multiple and competing technologies, facilities, services, and businesses as well as evolution in these elements. Multiple networks are likely to persist, given the heterogeneity of consumers and differentiation of offerings from different network providers. This multiplicity will give rise to both translation approaches (e.g., via STBs) and

Infrastructure Standards Panel launched by the American National Standards Institute in 1994, with the involvement of multiple trade, standards-developing, governmental, and other organizations, represents an effort to forge a broader consensus, although the broader the group, the more difficult the process of reaching consensus. Symptomatic of the problem is the tendency of some to look to industry consolidation as a way around standards-development problems. For example, the collapse of the proposed Bell Atlantic/Tele-Communications Inc. merger led some observers to wonder about the consequences of not having a "unified force that might have set nationwide technical and operating standards for networks delivering video-on-demand, home-shopping and high-speed data services" (Andrews, 1994b). The need for and alternatives to recognizable market leadership remain subjects for debate and analysis, although the demise of the Bell Atlantic-TCI deal is generally accepted as a sign of the slowing of the pace of both industry consolidation and technology deployment.24 The challenge of developing standards is recognized as a pacing factor for the enhancement of the information infrastructure.

ENTERTAINMENT AND THE ENTERTAINMENT INDUSTRY

If entertainment and creative content are what differentiate digital convergence from the narrower merging of computing and communications, the entertainment industry becomes a central player. The approximately $6 billion U.S. market for home-based video games is hard evidence of the business potential of creating programming for interactive technologies. Yet such first-order product revenue estimates do not capture the larger, second-order revenues associated with service, manufacturing, and other activities.
And the game market illustrates the fact that it may be hard to ascribe revenues to any one industry, since electronic/video games are offered as computer software, on-line services, and programs for special-purpose game systems, and they may draw on components derived from other products (e.g., motion pictures or television shows), giving them the character of an aftermarket product. Similar questions arise for home shopping, which—as a television offering—can be considered entertainment by some and retail distribution by others. In this instance, it is important to differentiate the revenue passed through to product vendors from the revenue associated with the delivery medium (e.g., a cable television shopping show), and to determine the extent to which new shopping activity is stimulated, as opposed to shopping being diverted to the new media from more conventional channels (e.g., on-site store shopping, mail order, or conventional catalogue and phone shopping). If, as suggested by Nagel, the way to maximize returns is to dominate a particular layer or link in the multimedia production and distribution chain, the difficult task of understanding the structure of the industry complex and how it is evolving becomes essential.

The entertainment industry is far from monolithic; it is really a set of different industries that produce such products as feature films, television programs, recorded music (records/CDs and tapes), and games. Each of these areas faces different production problems and different distribution and marketing problems. Further, individual entertainment industries have different cultures and concerns, resulting in a situation where those who work in one of these industries do not usually have much to do with the others unless there is some overlap of interests (as when the makers of theatrical motion pictures see that one can design and market games based on hit movies). In evaluating how the new digital technologies and media interact with the entertainment industries, it is important not to generalize too much. For example, many of the technologies most heavily discussed in the context of digital convergence have to do with new methods (e.g., optical fiber cable to the home) of distributing existing products but will not make major changes in the type of product that is distributed. Thus, for example, the recorded music industry embraced digital technology some time ago but has begun to go interactive only recently.25 Other technologies (e.g., CD-ROMs) allow for new types of products that have not previously existed—although they may be used initially to repackage traditional products (e.g., as a substitute for vinyl records) or to complement them (e.g., on-line discussion groups organized by fans of television shows; Barron, 1994). 
The entertainment industry will use digital technology in the following ways: (1) as a new delivery system for existing products (e.g., video on demand), providing essentially a new aftermarket for existing products;26 (2) in electronic games and other uses of digitized material, thereby increasing the variety of delivered products;27 (3) for direct-response sales ("home shopping"); (4) for new entertainment products that are in the process of being invented (e.g., Robert Winter's musical "books" demonstrated at the colloquium); (5) for potentially new distribution systems that would allow distribution of products to audiences that are not reached today; (6) for location-based entertainment, such as "high-tech" theme parks based on visual simulation and other offshoots of both the aerospace and electronics industries; and (7) for new production methods that enhance and/or lower the costs of existing products (e.g., computer animation, using virtual reality to design set lighting, or using nuances of sound as dramatic elements). Digitization allows for new types of special effects and production technologies, often building on technologies and applications developed for scientific research contexts (e.g., visualization), that can be incorporated into existing types of media.

Given these opportunities and possibilities, it is important to consider the differentiation between those parts of the entertainment industry complex that produce the programming (software) and those that distribute the programming that is produced. For example, in theatrical motion pictures, box office receipts compose only a part of the market, and producers look to the aftermarkets (foreign,
videocassette, sale to television (pay, network, and syndication)) to show a profit. The production of television programming is a deficit business: to cover its production costs, a TV program needs to be sufficiently successful to stay on the air for some 65 episodes, the threshold for sale to syndicated television. The issues facing these two areas of production are different from those facing the cable television owner or the over-the-air broadcaster that distributes the programming already produced. Another distinction pertains to those who wish to distribute existing types of programming in a digitized format and those who wish to create new types of programming using digital methods. The former category allows for a broader concept of the entertainment industry, one that would embrace cable television operators and computer network operators and possibly telephone companies, for example, as well as possible changes in the role of movie theater operators.

Consider, for example, group interactive electronic games, which provide a new form of distribution. They may employ large screens, sophisticated computer-graphic imaging, and programming at a level of narrative complexity and character depth well beyond that available in current video and arcade games. Developments in interactive storytelling concepts alongside advances in computer software and hardware are evolutionary corollaries of present systems.28 Although the actual "workers" in interactive media may be at secondary levels of the industry, it is clear that all major entertainment companies at their top levels have made a commitment to exploring ways that digital convergence can interface with their products. Virtually every major Hollywood studio has established a subsidiary to create interactive products, typically computer adventure games based on movies.
So far, most are repackagings of existing products, but that could change with the recent focusing of industry attention on games and the recognition that game revenues (over $6 billion per year) now exceed box office receipts (over $5 billion per year) (Siegel, 1994; Rothstein, 1994). Similarly, record companies, seeing music production innovations being led by specialized multimedia (software) companies, are adding interactive media functions to their staffs (Bermant, 1994).29,30 The new Academy of Interactive Arts and Sciences has been established to confer awards in the field, the Houston International Film Festival has established new prizes for interactive multimedia products, and the American Film Institute's (Apple-supported) computer-based graphics, editing, and multimedia classes are overflowing (Siegel, 1994; Turner, 1993a). The process of exploration is highly fluid, drawing on a wide range of information and technology dissemination, networking, alliances, seminars, schooling, capital investments, and research and development.

In terms of generating and delivering creative products generally, the United States has dominated the global mass entertainment market since the turn of the last century. "There is an accusation from almost every civilized country on Earth that America is a cultural imperialist," Singer observed. "If you talk to a Japanese anthropologist about how many pieces of Japanese electronics there are
in America, he'll tell you, 'Yes, but you've captured our children. They walk around in Batman masks all over Tokyo, and UCLA T-shirts. …'" Similarly, Nagel has observed that daily activities in Bali have been undertaken to the accompaniment of broadcast U.S. rap music.31 Singer attributed this dominance to the many generations and networks of creative talent that have resided in Southern California over the years. Hollywood always has catered to a mass audience, he said, beginning with the nickelodeon, which for a nickel would entertain viewers with a womanly form, a couple kissing, the barrel of a gun, or horses or trains moving. Significantly, Japan adapted the nickelodeon for the intellectual aristocracy, Singer said. Hollywood's inventive tradition can be traced to the unique nature of American culture, according to Singer. The nation's relative youth "gives us permission to play," and, as immigrants, Americans have an innate sense of adventure and dreams of possibilities. Moreover, Californians traditionally have been explorers; the early leaders of the entertainment industry, such as Louis B. Mayer, were itinerant salesmen, Singer noted. The early influx of immigrants from other states, as well as foreigners escaping turmoil overseas, brought new ideas and excitement to Hollywood, a phenomenon that continues. Los Angeles today is characterized by "an anxious and driven spirit of reinvention," he said. See Box 2.1.

Market forces also play a role in the global dominance of the U.S. entertainment industry. Wildman said the industry is more commercial than its foreign counterparts and, furthermore, it commands not only a huge domestic market but also the worldwide market for English-language products.32 About half of major U.S. producers' earnings from movies and television come from foreign markets,33 and so entertainment is a major positive contributor to the U.S. balance of trade,34 Wildman said. The large audience for U.S.
video products—for which the costs of distribution are small compared to the initial cost of producing the creative product—has enabled producers to command huge production budgets, allowing them to invest heavily in entertainment value and audience appeal,35 thereby propelling the export advantage, he said. U.S. film producers spend some four to ten times the budgets of their European counterparts, resulting in pioneering special effects (embodying digital convergence) and higher production values generally.

Box 2.1 A Message from Hollywood

Serious-minded executives may doubt they have anything to learn from Hollywood. To be sure, film director Alexander Singer acknowledged, Hollywood is a peculiar place, a land of fantasies and dreams. "There are people who wear suits and sit in an office when in fact 'visions of sugar plums dance in their heads,'" he said. "They are filled with the fantasy of boundless possibility. The brass ring of a $500 million-grossing film is so intoxicating that it leaves ordinary and sensible people dizzy with desire." Movie makers are gamblers, taking wild chances on expensive bombs such as Ishtar and unexpected blockbusters such as Home Alone. "They do not know what it is that works," Singer said. "They cannot know. They never know." Yet, ironically, they are highly paid to know. This environment of high risk, high stakes, and relentless pressure to appear in control makes the entertainment business profoundly insecure and unstable, Singer said. Moreover, the odds against success are astronomical. For example, about 30,000 screenplays are registered every year with the Writers Guild of America; only a few hundred are purchased, and a fraction of those are actually made into movies, he said. All of this fosters a creative atmosphere, but one with "a sickly quality … a sense of addiction."

This creativity is struggling to find application in multimedia. Hollywood executives are notoriously fearful of venturing beyond the familiar; Singer found senior entertainment executives to be "conservative" and "anxious" when confronted with virtual reality entertainment concepts during the early 1990s. Such reactions cast doubt on the prospect of the entertainment industry serving as a model for systematic innovation. Yet in one way the industry does serve as a prototype for survival in the information age, according to Singer, who believes Hollywood is a kind of precursor for certain forms of work forces and lifestyles of the future. A large part of this culture is one of impermanent but highly skilled production teams. These temporary task forces, brought together for a single movie or television series, work intensely with a high degree of skill, quickly building personal relationships and teamwork comparable to that of a string quartet or a baseball team, Singer said. At the completion of the assignment the team dissolves and the individuals search for new employment. They are motivated as much by professional pride as by monetary gain. These workers must be highly independent: they need to inform themselves about industry activity, network to find jobs, and upgrade their skills regularly. They also must raise families and stay sane and healthy while living this moment-to-moment lifestyle. (Time lost to illness is among the lowest in any industry.) It is a model that could serve as a useful variant for the business community, Singer said, as job impermanence comes to characterize much of American life.

The introduction of new technologies is transforming the structure of the entertainment marketplace. As more outlets or channels are added, the return to each individual producer shrinks, Wildman explained. For example, the rise of cable television was paralleled by a decline in the three major broadcast networks' share of the prime-time audience (U.S. DOC, 1993). Smaller shares mean lower advertising revenues and smaller production budgets for prime-time programs, Wildman said, as evidenced by the proliferation of low-budget "reality" shows (although the impact of changes in share depends on the type of audience affected, which itself affects advertising revenues). On the other hand, the emergence of new media creates new "windows" in the distribution chain—new aftermarkets. The overall effect is to increase the aggregate audience for a given production, Wildman said.

Another transforming factor may relate to financing. For example, film industry financing changed with the introduction of sound in motion pictures.
Previously, most film production had been financed by entrepreneurial individuals working in the industry or their associates. Sound imposed serious new technical demands on film making that required expensive equipment and talent, prompting Hollywood to turn to New York capital markets, which had underwritten earlier stages of the industry. This experience provided the rudiments of the current film industry's financing structure, with profound consequences for the character (both technically and in terms of content) of a major part of the entertainment industry. Thus, beyond its effect on what it is like to see a movie, sound changed the movie industry into something new, and that new thing produced different products than the old thing.36 By analogy, the long-term effects of new digital convergence technologies may be more profound with respect to altering the traditional definition of the industry than on the industry as currently defined. Overall, a major part of that transformation may be from a goods-producing to a service-producing orientation. Wildman also observed that as channels and programming proliferate, much of the "value added" starts to reside in the technology that allows us to sort, sample, and select what we want to use. As noted above, competing industries are offering different methods and approaches for those processes of finding entertainment content to consume. Only technologies that provide meaningful benefits to the creative community will take hold in the entertainment industry complex. The integration of new technologies will be driven by (1) enhancing the creative tools used by the entertainment industry to create its art form; (2) demonstrating cost economies and operational efficiencies; and (3) enabling new forms of entertainment and/or innovative ways to simulate reality. 
For example, with its 360-degree stereoscopic computer-graphic universe, individual spatial volition, and simultaneous interaction among multiple players, fully evolved virtual reality currently is the most potent and problematic technology of the whole interactive spectrum. This medium gains particular relevance in the way it reflects aspects of the "virtual communities" emerging from the computer networks (Rheingold, 1993). Thus, while the popular and business press tout prospective impacts from greater bandwidth in communications capacity, applications of virtual reality from CD-ROMs and other new devices, and other digital convergence components and products, it remains premature to determine what difference individual developments may make. The bandwidth breakthroughs of today are impressive, but in context they may prove no more earth-shattering than earlier transformational jumps in communications capacity, such as the multiplexing of telephone trunk circuits, or transformations in entertainment, such as the addition of sound and color to film. The changes brought about by today's instances of digital convergence may be changes in degree or in kind; some may be changes in degree so significant as to be changes in kind. Ginn contended that the ability to satisfy audiences is more important to success than is the kind of business: "If your view is that markets will segment, and I think they will over time, then those companies
that have the understanding of markets and how to satisfy customer needs, irrespective of where they are in this value chain, will be able to be very profitable. This includes cable companies and telecommunication companies and others in the distribution end." Reinforcing Ginn's view that it is insight as much as origin that matters was the apparent confusion at the January 1994 Information Superhighway Summit, a government-industry gathering that addressed, in a group more heavily dominated by entertainment industry executives, some of the issues raised at the CSTB colloquium. As reported in the Los Angeles Times (Lippman and Harmon, 1994):

Not atypically for Hollywood, the most compelling reason many gave for attending the conference was that everyone else was. For the last year, representatives of the communications and media industries have demonstrated an almost compulsive need to trade information highway metaphors at symposiums and conferences with titles such as "Digital World," "Multimedia Expo" and "Digital Hollywood." Although everyone agreed that the highway is coming—and soon—there was little consensus about how it will evolve, how much it will cost or how quickly people will be able to access it through their computers, telephones or television sets. … Yet members of the Hollywood crowd seemed unsure of their place in the sparring between phone, cable and computer executives over how digital age entertainment and information will be delivered. "I feel like an English major in an organic chemistry course," said Disney's [Michael] Eisner.

Notwithstanding the many ways that digital convergence can affect and be affected by the entertainment industry, there is likely to be a broader interaction with entertainment per se. Entertainment is an essential part of human activity as well as an industrial classification signifier.37 The latter does not encompass the former in any meaningful way.
Conventional measures of activity regarding the entertainment industry reflect no more accurately on the human activity of entertainment than measures of fertilizer production reflect accurately on the state of world nutrition, obesity in the United States, or changing fashions in cuisine. Failure to see the breadth and depth of entertainment as a central element in human activity, extending into many other realms of "production," probably accounts for the consistent misprediction of growth in markets for such products as consumer electronics and such services as long-distance telephone. These markets, at least from an economic perspective, have appeared to be almost completely "supply-driven"—the economist's euphemism for the fact that no one predicted the growth that actually occurred when the market took off. Yet they clearly have tapped deeply into some fundamental demand structure. Wildly optimistic and misplaced expectations for the picture-phone and interactive television illustrate the same confusion about the real nature of entertainment. The simple fact is, we don't understand the issue of entertainment very well. In looking for and predicting the impact of digital convergence, it may be useful to consider the potential of existing information goods and services as
entertainment: newspapers are information services catering to the masses, and commercial television almost defines that concept. Commercial television news is clearly a kind of information service, reporting facts, but it is also largely an entertainment product. A significant fraction of network television news budgets comes from the networks' entertainment budgets, and the proliferation of "newsy" shows with celebrity anchors shows just how profitable such "infotainment" can be. Some would even argue, a la H.L. Mencken, that government itself is a source of entertainment (a view certainly supported by the success of the cable television network C-SPAN), raising the question of whether some fraction of political expenses (polling, campaign consultants, and other election costs) relates to entertainment. As discussed in Chapter 3, related arguments might be made—and are made, seriously—in the context of education. The question is not what will happen with the new technologies and entertainment, but rather, what will these technologies add to the technology/entertainment mix already well in place?

NOTES

1. In 1962, an integrated circuit (IC) contained 10 transistors; today, one IC contains millions of transistors.

2. "Intel Corp. and Spectron Microsystems, Inc. [plan] software that will support native signal processing on a Pentium chip, allowing multimedia software to run on the processor instead of requiring dedicated hardware. … The Pentium processor, though not fast enough to handle Joint Photographic Experts Group- and Motion Pictures Expert Group-type compression, will be able to compress and manipulate audio signals. This means that recorded speech can be slowed down or sped up. … The initial release will support audio only; video and communication will be supported later" (Mohan, 1994).
A variety of specialized hardware components is used to support a variety of functions, from sound to graphics to communications, increasingly displacing specialized boxes and subsystems. Much innovation is taking place in more miniature and more integral communications componentry, for example, providing phone, fax, and voice mail capabilities within personal computers. See Corcoran (1993).

3. New terms—"personal digital assistants," "personal communicators," and the like—have already been coined to describe these products.

4. For instance, AT&T and a small company specializing in digital signal compression are developing an electronic "black box" for television set tops, to provide for reception and decoding of movies and other video services using telephone lines (Klein and Aston, 1993). Hewlett Packard and TV Answer collaborated to build a set-top box that provides interactive services; the system uses a portion of the radio spectrum to uplink and download signals from a satellite (Millison and LaGrow, 1993).

5. The overlap between computers and conventional consumer electronics was noted in connection with the newcomer role of Microsoft and Intel at the 1994 Las Vegas Consumer Electronics Show. "Their inclusion illustrates just how blurred the line has become between computers and consumer electronics. While sales of more traditional consumer equipment languish, computer concerns hope to spur purchases of multimedia
personal computers, hand-held 'digital assistants' and sophisticated computer games" (Carlton, 1994a).

6. See Hodge (1995) for a discussion of technical and design issues for STBs that could support interactive television. Hodge asserts that an STB consistent with open architecture is "a remote control unit for the video server, to which it connects through an ATM network. This is the distinction between this unit and pre-ITV [interactive television]. It provides bilateral, full duplex communications to the video server" (p. 154).

7. Wildman commented on how the introduction of sound in film ended the careers of actors with poor voices and enabled the careers of actors whose voices carried well.

8. "[W]hile virtually every major publishing house and Hollywood studio has announced plans to develop sophisticated, high-quality multimedia software, and though cable, telephone and computer companies are forming alliances to test interactive TV systems, many of these ventures are in their infancy and won't translate into large numbers of jobs for years" (Kruger, 1994).

9. Sherry Turkle, personal communication, November 11, 1993.

10. With research under way in numerous technical areas aimed at leveraging its network architecture, the cable industry is billing its infrastructure as "the missing link to myriad multimedia applications," including personal communications services (Dukes, n.d.).

11. Information can be carried on electromagnetic impulses or waves at various frequencies, the sum of which is known as the spectrum. The full electromagnetic spectrum ranges from radio waves (relatively long wavelengths) to x-rays and cosmic rays (at short wavelengths). Transmission through the air is concentrated in a narrow portion of the radio spectrum; radio frequencies that are very low cannot carry enough information for video, while radio signals at high frequencies cannot penetrate objects.
Competition is growing for the prime bands of radio frequencies (Chiddix, 1991).

12. A comprehensive analysis of wireless technology may be found in The Economist (1993a).

13. This argument hinges on the development of amplifiers that can increase the speed of fiber-optic transmission. Today, fiber-optic technology is constrained by an electronic bottleneck: every 22 miles, the signal must be converted to electronics and then regenerated, Gilder noted. This means that, despite its wide bandwidth, fiber optics is limited in speed to 2.5 gigahertz (2.5 billion cycles per second). But the true capacity of fiber optics is 25,000 gigahertz—1,000 times all the frequencies used in the air today. "It could accommodate all phone calls in America on the peak moment of Mother's Day," Gilder said.

14. Experimentation with real-time audio and video conferencing over the Internet has begun using the so-called Mbone multicast backbone (Eriksson, 1994).

15. This phenomenon is discussed in Katz and Shapiro (1994).

16. The Motion Picture Experts Group (Radcliffe, 1993) and a variety of corporate alliances and consortia (Markoff, 1995a) have pursued standards for video compression and video storage.

17. Sound systems for films with digital sound are expensive and variable. Digital Theater System places the soundtrack on a disk separate from the film and synchronizes the two. Sony Dynamic Digital puts the soundtrack on the outer edge of the film, while Dolby uses the space between sprocket holes (Fantel, 1994).
18. An overview of the standards-setting process for information networking is provided in CSTB (1992), p. 56.
19. For example, the standard for audio CDs is lower than current recording capabilities. The "Red Book" standard calls for music to be stored in a 16-bit format, meaning each sample of the sound wave is encoded with 16 bits of amplitude resolution, with samples taken 44,100 times per second; by contrast, studios now can record digitally using a 20-bit system, capturing 16 times as many amplitude levels and creating richer sound (Herschman, 1993).
20. See Pereira (1994) and Carlton and King (1994).
21. The intersection of standards setting and intellectual property rights management has been tracked in the annual Telecommunications Policy Research Conference series and in a 1994 workshop on standards setting for information infrastructure sponsored by the National Institute of Standards and Technology.
22. For instance, the de facto standard for studio film productions—35-millimeter film at 24 frames per second—has survived for more than 60 years; the "Red Book" standard for CDs was set by Philips NV and Sony Corp. (NTIA, 1993).
23. In addition, the Multimedia and Hypermedia Experts Group is developing coding for multimedia to manage throughput on distribution networks (Dukes, n.d.).
24. According to Farhi and Sugawara (1994), "None of the experts is suggesting that the information superhighway won't get built, just that early predictions of its quick completion were grossly overstated."
25. A number of major artists are pursuing interactive ventures, and consumers can buy products that allow them to create music out of preconstructed parts, or experience combinations of music, art, and games (David, 1993; Goldberg, 1993; Bermant, 1994).
26. Where material can be confined to digital formats, economies can be reaped, for example through the electronic cinema concept, which reduces the expense of converting motion pictures to film (especially likely where material is produced expressly for electronic distribution rather than traditional theaters).
27. Walt Disney Co. licensed Microsoft and Sony Imagesoft to develop multimedia titles featuring Disney characters. "Industry analysts say collaborations of this kind will proliferate as entertainment companies like movie studios seek new media to exploit for licensing revenue and as software publishers seek characters of proven appeal for interactive games and stories" (Fisher, 1994).
28. A rare insider's perspective on the difference between storytelling structure and video games may be found in Parkes (1994).
29. See Landis (1993), Lohr (1994), and Schwarz (1993). Multimedia projects under development by major studios include interactive cartoons; explanatory versions of old movies showing, for example, how certain dance sequences were created; a virtual biopark, which will enable children to experience an animal's lifestyle; interactive television shows, including games, dramas, and mysteries; and interactive amusement park rides.
30. A notable late entrant was the Walt Disney Co., which initially stayed out of the multimedia technology business and instead licensed its standard programming for multimedia use. Chairman Michael D. Eisner said of interactive media, "I don't like it so we won't invest in it" (The Superhighway Summit, Academy of Television Arts and Sciences, Los Angeles, January 11, 1994). Eisner also has expressed concern that more channels will mean more mediocrity: "A real nightmare is finding out that the superhighway has turned into a detour …" (Laderman et al., 1993). In late 1994, however, Disney
    announced plans for greater activity revolving around interactive software products (Turner, 1994b).
31. David Nagel, personal communication, October 1994.
32. The United States is "by far the world's most important supplier of films for international trade," Wildman has written, citing UNESCO statistics (Wildman and Siwek, 1988). These authors suggest there is a strong, positive relationship between the purchasing power of native speakers of any language—the English-speaking population happens to be wealthy and populous—and the importance of trade in films and television programs in that language.
33. This figure comes from Wildman's analysis of a period of 30 years ending in 1988 (Wildman and Siwek, 1988).
34. The U.S. film industry had a positive trade balance of approximately $1 billion in 1985, even though foreign earnings are reduced significantly by a number of trade barriers, as well as by piracy (Wildman and Siwek, 1988).
35. The value of film and television products to consumers is "determined almost entirely by such public-good elements as the appeal of the story portrayed, the quality of the writing and acting, the perspective of the director, and the competence of camera crews and other technical personnel" (Wildman and Siwek, 1988). A public good is nonrival; that is, its value to one consumer is not diminished by another's use.
36. Of course, new, foreign sources of investment may also provoke major changes in segments of the entertainment industry, but that issue was beyond the scope of this project.
37. See Huizinga (1950) for a discussion of the role of play in society.
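The "16 times more information" figure in note 19 follows from simple bit-depth arithmetic: each additional bit doubles the number of amplitude levels a sample can represent, so 20-bit samples encode 2^4 = 16 times as many levels as 16-bit samples. A minimal sketch of that calculation (illustrative only; the function name is ours, not from the source):

```python
def amplitude_levels(bits: int) -> int:
    """Number of discrete amplitude values a sample of `bits` bits can encode."""
    return 2 ** bits

cd_levels = amplitude_levels(16)      # "Red Book" CD audio: 16-bit samples
studio_levels = amplitude_levels(20)  # 20-bit studio recording

print(cd_levels)                    # 65536
print(studio_levels)                # 1048576
print(studio_levels // cd_levels)   # 16 -> the "16 times more" in note 19
```

Note that the 44,100 figure is the sampling rate (samples per second), which is the same in both cases; only the per-sample resolution differs.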