DINNER SPEECH



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.
Terms of Use and Privacy Statement




Eighth Annual Symposium on Frontiers of Engineering
The Science, Technology, and Business of Digital Communication

ANDREW J. VITERBI
Viterbi Group, LLC
San Diego, California

ABSTRACT

Wire-line telegraphy systems in the nineteenth century and wireless systems in the early twentieth century used rudimentary digital communication. Modern digital communication technology originated in the middle of the twentieth century and blossomed late in the century. The impetus was twofold: solid-state integration, which grew exponentially according to Moore's law, and the development of system theories of information and communication. Together they made possible the sophisticated algorithms that enable efficient digital communications, both via satellite and terrestrially. Advanced communication systems, which were first used for military and government satellites, became economically viable and universally available only in the 1990s. Digital satellite broadcasting and wireless cellular voice and data transmissions are the beneficiaries of this half-century of remarkable progress.

INTRODUCTION

A crude form of digital communication began at the turn of the twentieth century with Guglielmo Marconi's experiments. These early radio components generated pulses of energy of varying lengths, but not continuous waveforms. Analog communication really began with Lee De Forest's triode amplifier. But modern digital communication encompasses more than the transmission of waveforms representing 1's and 0's. It includes elaborate processing of information to maximize the efficiency and accuracy of the message, whether it is audio, visual, or textual. Processing goes well beyond the capabilities of simple analog

modulation. This phase of the development of digital communication dates from the late 1940s, when two groundbreaking events took place within months of each other in the same complex of Bell Telephone Laboratories buildings in Murray Hill, New Jersey. The first was the development of the transistor and the birth of solid-state electronics. The second was the founding of the field of information theory, with its remarkable implications for communication.

SCIENCE

The scientific basis for modern digital communication derives in equal measure from physical principles and mathematical concepts. Electromagnetic theory provides the foundation for all forms of electrical communications, wired or wireless. This scientific field originated in the eighteenth century and flourished in the nineteenth, culminating in James Clerk Maxwell's equations and Heinrich Hertz's propagation experiments. But modern digital communication was enabled by another field of physical research, solid-state electronics, which was developed in the middle of the twentieth century. The development of the transistor in 1947 led, two decades later, to the beginning of solid-state circuit integration, with the ensuing exponential growth in digital processing and memory capabilities. The mathematical origins of digital communication theory are as remote as the papers of Gauss, Euler, Fourier, and Laplace in the eighteenth and early nineteenth centuries and as recent as Shannon's theory of information in the mid-twentieth century.

Following the development of the transistor in 1947 at Bell Laboratories by Bardeen, Brattain, and Shockley, for which they earned a Nobel Prize, numerous government and commercial efforts were undertaken to replace bulky, power-hungry, and vulnerable vacuum tubes.
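The scale of that exponential growth is easy to check with a back-of-the-envelope calculation. The sketch below is mine, not the speaker's; it uses the popular 18-month doubling period commonly attributed to Moore's law.

```python
# Back-of-the-envelope Moore's-law growth from 1965 to 2002, assuming one
# doubling of device density every 18 months (the popular statement of
# Moore's law; illustrative only).
years = 2002 - 1965
doublings = years * 12 / 18          # one doubling per 18 months
growth_factor = 2 ** doublings
print(f"{doublings:.1f} doublings, growth factor about 2^{round(doublings)}")
```

Thirty-seven years at that pace gives roughly 2^25, a growth factor in the tens of millions.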
William Shockley left Bell Laboratories to found Shockley Semiconductor Laboratory, which lost its key researchers to Fairchild Semiconductor, from which in turn emerged the founders of Intel Corporation, the prime mover in the creation of the personal computer through its development of the microprocessor. In 1965, Gordon Moore, later a cofounder of Intel, foresaw the exponential growth in solid-state integration and predicted that device density (the number of devices per silicon integrated circuit) would double every 18 months. Although Moore's law is based on qualitative socioeconomic arguments rather than quantitative physical theories, it has proven amazingly accurate: from 1965 to 2002, the growth rate ran slightly ahead of the factor of 2^25 that Moore's prediction implies. The decrease in cost and power consumption has been proportional to the increase in device density. Indeed, the concept of "system-on-a-chip" has become commonplace in large-volume electronic manufacturing in the last few years. The increase in device speeds has also been exponential, rising from kilo-operations per second (K ops) in the 1960s to M ops in the 1980s to G ops today.

This physical capability would be in search of an application were it not for

concurrent advances in system concepts rooted in applied mathematics. Although this field is the product of great European mathematicians of previous centuries, the impetus for the great strides in communication theory in the twentieth century came largely from midcentury American mathematicians, mostly through war-related research. The three prophets of modern communication theory were Norbert Wiener, Stephen Rice, and Claude Shannon.

Wiener, a professor at MIT, is often cited as the father of cybernetics. In 1949, he published a monograph entitled Extrapolation, Interpolation and Smoothing of Stationary Time Series, known to students as the "Yellow Peril" for the color of its binding. This work, grounded in both harmonic analysis and mathematical statistics, influenced the design of early radar and analog communication systems. Rice's work at Bell Laboratories, published in 1944 in a Bell System Technical Journal paper entitled "Mathematical Analysis of Random Noise," applied random-process theory to the communication problem. In 1948, his colleague Claude Shannon, who had spent the war years theorizing on cryptographic concepts, published papers in two issues of the same journal under the title "A Mathematical Theory of Communication," introducing startling new concepts that went well beyond any previously established theory. (In fact, it was initially underestimated or misunderstood by both physicists and mathematicians.)

Lest we appear nationalistic in heralding this purely American school, we should note that a nearly parallel Russian school, led by Khinchine, Kotelnikov, and Kolmogorov, produced approximately parallel results. The practical applications of their work, however, never achieved wide acceptance or had the same impact.
In the ensuing decades, schools of communication theory emerged in Hungary, Israel, Canada, Sweden, Italy, Germany, France, and Switzerland. The Hungarians followed the Russian model; all the rest followed the American model.

Shannon's theories, although difficult to master, are easy to describe in terms of the less-than-obvious answers to two basic questions about the limits of digital communication. The first question, "compression" or "source coding," asks how few bits per second are required for the faithful reproduction of a source (voice, video, or text). The second, "transmission" or "channel coding," asks how many bits per second can be accurately transmitted over a noisy and otherwise impaired medium. The two papers published by Shannon in 1948 fully answered both questions! It took nearly half a century, however, to develop communication systems that come close to the performance Shannon predicted.

TECHNOLOGY

Solid-state integrated circuitry, the technology that physically enabled advanced digital communication, would not have been possible without the evolution of computing capabilities. Integration dramatically lowered the price of computation to the level of consumer appliances, and this enormous expansion of the market produced economies of scale that further reduced prices. But the requirements of digital communication processors go beyond the processing speed and memory requirements of a computation processor. Best described as real-time digital-signal processing, they include the basic operations of Fourier transforms and statistical operations, such as the likelihood-function computations involved in demodulation and decoding.

The initial impetus for the development of these technologies came from the U.S. government. Starting in the 1950s, military research and development (R&D) by government agencies explored digital communications for added security, covertness, and suppression of intentional interference (jamming). Similar requirements and techniques were simultaneously evolving for radar, which reinforced the communication research. In the 1960s and 1970s, NASA funded efforts to improve the efficiency of communication to and from space vehicles and satellites. The twin drivers were minimizing the weight placed in satellite orbit and maximizing the communication range of space vehicles for a given earth-antenna size (or, conversely, minimizing the antenna size for a given range). Government-sponsored R&D was performed by a variety of organizations, ranging from Defense Department and NASA laboratories to federally contracted research centers to universities and private contractors large and small. Later, through the 1990s, commercially motivated R&D led to great advances in wire-line modems; in data-transmitting and broadcasting satellite systems operating with very-small-aperture antennas (VSATs); in the diffusion of Internet connectivity; and in the explosive growth of wireless telephony.
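Shannon's "channel coding" question, posed earlier, has a famous closed-form answer for the band-limited additive-white-Gaussian-noise channel: C = W log2(1 + S/N). The small illustration below applies that textbook formula to assumed numbers; it is not a calculation from the speech.

```python
import math

# Shannon capacity of a band-limited AWGN channel: C = W * log2(1 + S/N),
# where W is bandwidth in Hz and snr_linear is the linear (not dB)
# signal-to-noise ratio.
def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Maximum achievable error-free rate, in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Assumed example: a 3-kHz telephone channel at 30 dB SNR (snr = 1000),
# roughly the regime in which wire-line modems operated.
c_bps = shannon_capacity(3000.0, 1000.0)
print(round(c_bps))  # just under 30 kbit/s
```

This is the kind of hard limit that the "nearly half a century" of system development gradually approached.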
BUSINESS

Large-scale commercialization of digital communication in the 1980s and 1990s spawned at least five interrelated industries, in approximately the following order.

The first personal wire-line modems, introduced in the 1970s, transmitted at a few K bits per second. Since then, the combination of improved lines and sophisticated signal-processing algorithms has increased transmission rates to M bits per second for digital subscriber loop (DSL) service. At the same time, coaxial cable systems installed to provide analog television now carry data at rates of several M bits per second, or several channels of digital television, in the same bandwidth originally occupied by one analog channel.

Starting in the 1960s, communication satellites began transmitting analog programming to cable TV head-ends. Each program occupied one satellite transponder and required antennas several meters in diameter. With digital compression and signal processing, as well as high-power satellites, the number of

programs per channel increased more than four-fold, and antenna diameters were reduced to fractions of a meter, opening up a large consumer market.

The virtually unlimited bandwidth of fiber-optic networks, developed in the last quarter of the twentieth century, created an ideal foundation for networks carrying all forms of communication. Packet switching, developed through military-sponsored research in the early 1960s, provides tremendous flexibility and efficiency for digital transmission. The greatest beneficiary of all has been the Internet. Conceived as the ARPANET in the late 1960s for sharing programs, processing capabilities, and databanks among a few dozen laboratories and universities working on defense research, the Internet now connects countless nodes and serves hundreds of millions of users through the World Wide Web.

The most recent and most widespread digital communication application, cellular wireless telephony, now serves more than one billion subscribers, one-sixth of the world population. Launched in the 1980s as an analog technology (basically two-way FM radio), it gained a hundred-fold larger market with the second-generation digital technology of the 1990s. Some of the most sophisticated compression and transmission technologies have reduced the size, lowered the cost, and cut the power consumption (and hence increased the battery life) of cellular phones. In addition to cellular networks, also known as wireless wide-area networks (WANs), wireless local-area networks (LANs) are now used in homes, business enterprises, and urban "hot spots." Finally, wireless personal-area networks (PANs), which transmit over distances of a few meters, eliminate cables and wires within a home or workplace.

The two major wireless technologies provide a study in contrasts.
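The satellite side of that contrast is easy to quantify with the standard free-space path-loss formula. In the illustrative calculation below, the 40,000-km geosynchronous range is the figure used in this speech; the 12-GHz (Ku-band) carrier frequency is my assumption for the example.

```python
import math

# Free-space path loss in dB: FSPL = 20 * log10(4 * pi * d / wavelength).
# Illustrative numbers only: 40,000 km geosynchronous range, and an
# assumed 12 GHz downlink carrier (not specified in the speech).
c = 3.0e8                  # speed of light, m/s
d = 4.0e7                  # range, m (40,000 km)
f = 12.0e9                 # carrier frequency, Hz (assumed)
wavelength = c / f
fspl_db = 20 * math.log10(4 * math.pi * d / wavelength)
print(round(fspl_db, 1))   # roughly 206 dB of spreading loss
```

A spreading loss on the order of 200 dB is why the satellite receiver operates at a very low signal-to-noise ratio, as the next paragraph describes.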
Communication satellites in geosynchronous orbit must transmit a wide-bandwidth signal over a range of 40,000 kilometers, resulting in a very low signal-to-noise ratio. By contrast, terrestrial wireless transmission has a range of a few kilometers at most, but each user's signal must contend with interference from a multitude of other users' signals arriving simultaneously at the same base station or access point. A particularly successful wireless cellular technology, known as spread-spectrum or code-division multiple access (CDMA), causes each interfering signal to appear as wideband noise, similar to the satellite receiver's noise, which is of thermal origin. Thus, the receiver processing technologies of these two widely disparate digital communication technologies are surprisingly similar.

The Wireless Cellular Industry

I will conclude with a brief history of the biggest proliferation of digital communication devices. In less than a decade, the worldwide wireless cellular industry has put into service more than a billion digital cellular phones and data terminals. The evolution of this industry has been defined in terms of product generation by standards bodies and the media (Table 1).

TABLE 1 Wireless Generations Defined

Generation   Service                                             Era     Technologies
1G           Analog voice                                        1980s
2G           Digital voice and 10-Kbps circuit-switched data     1990s   TDMA, GSM, CDMA
3G           Digital voice and packet data: 384 Kbps to 2 Mbps   2000+   CDMA 2000, WCDMA

The first generation of cellular handsets and infrastructure involved purely analog transmission and processing. Service began in the early 1980s and was well established in North America by 1990, when digital handsets and the infrastructure of the second generation (2G) were introduced.

In Europe, where analog wireless technology had not been widely adopted, the new digital system, designated Global System for Mobiles (GSM), was highly successful, largely because it was standardized by the European Telecommunications Standards Institute (controlled by major wireless carriers and manufacturers) so it could be adopted throughout the European Union. European subscribers could maintain service while roaming over a large geographical area. The time-division multiple-access (TDMA) transmission technology of GSM was emulated in North America (IS-54) and Japan (PDC), although with much narrower bandwidths. All three systems failed to achieve the industry's goal of increasing bandwidth efficiency over analog by a factor of ten. PDC was adopted only in Japan; the North American TDMA system attracted limited support, mainly in Latin America, but did not displace analog service, even in the United States.

Code-division multiple access (CDMA), the only digital technology that fulfilled the goal of bandwidth efficiency, was proposed in the late 1980s. After initial industry resistance, CDMA was accepted in 1993 as an alternate North American standard (IS-95, or CDMA One) and entered service in the mid-1990s.
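The interference-averaging property that makes CDMA work can be seen in a toy direct-sequence simulation. This is an illustrative sketch of the spread-spectrum principle, not a model of any standardized air interface.

```python
import random

# Toy direct-sequence CDMA: each user spreads a +/-1 data bit with a
# pseudo-random +/-1 chip sequence. The receiver correlates the summed
# channel with the desired user's code, so the other user's signal
# averages down like wideband noise.
random.seed(0)
N = 1024  # chips per bit (the processing gain)

def chip_code():
    return [random.choice((-1, 1)) for _ in range(N)]

code1, code2 = chip_code(), chip_code()
bit1, bit2 = +1, -1

# Both users transmit simultaneously; the channel adds their waveforms.
received = [bit1 * c1 + bit2 * c2 for c1, c2 in zip(code1, code2)]

# Correlate with user 1's code and normalize by the spreading factor.
decision = sum(r * c for r, c in zip(received, code1)) / N
recovered_bit1 = +1 if decision > 0 else -1
print(recovered_bit1, round(decision, 3))  # decision lands near +1, not -1
```

The residual cross-correlation term shrinks roughly as 1/sqrt(N), which is why a large spreading factor lets many users share the same band.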
South Korea adopted CDMA as its sole digital standard, began service in 1996, and was remarkably successful, not only in reaching more than half of the population but also in converting the nation from an importer of technology to a major exporter of technology. Ultimately, CDMA service was offered by one or more 2G wireless service providers in most nations following the American and Korean example, usually in competition with providers of GSM or other TDMA services. The notable exception was Western Europe, where competition was excluded by the EU's regulatory adherence to GSM.

When the International Telecommunication Union (ITU) set about selecting an access technology for the so-called third generation (3G), to provide higher-speed data and further efficiencies in voice telephony, it approved two enhanced forms of CDMA. The fundamental reason for this choice is shown in Table 2, which compares various 2G technologies (based on research by a noted wireless analyst). The basis of comparison is the number of digital voice users that can be supported in the same bandwidth required to support one analog voice conversation.

TABLE 2 2G Voice Efficiencies

Technology        Relative number of users in given bandwidth
Analog            1
GSM               3 to 4
TDMA              4 to 5
CDMA              10 to 12
Beyond 2G:
  CDMA 2000-1x    21
  WCDMA           ?

Source: Seybold, 2001.

The principal difference between the two 3G versions, CDMA 2000 and WCDMA, is that the former is a direct evolution of the 2G version CDMA One and therefore requires synchronization of the (base station) infrastructure; the latter, WCDMA, puts the burden on subscriber handsets through increased processing and, consequently, higher power consumption. The CDMA 2000 subscriber count is already more than twenty million, and the system has demonstrated a near doubling in capacity over that of the 2G CDMA system. WCDMA is off to a slow start, partly because of technical difficulties but largely because of the poor financial health of European carriers, caused by the huge debt they incurred through excessive payments for 3G spectrum. This cost was mostly avoided by North American and Asian service providers, who operate 3G CDMA 2000 in the same spectrum as the 2G CDMA One from which it evolved.

The ultimate success of 3G, however, will depend on the benefits it provides to subscribers, in the form of new and enhanced applications, and to service providers, in increased revenue and better returns on infrastructure investments. Four benefits are already evident:

- The use of spectrum (which has become a costly resource) for voice and data is much more efficient.
- Through greatly increased data transmission speeds and, consequently, reduced latency, the sharing of video clips and still photos has become more appealing (as has already been demonstrated in some Asian countries). Even real-time game playing over the wireless Internet is being proposed.

- For the same reason, the downloading of Web pages to PCs or PDAs away from one's office or home is also more attractive.
- Enterprises with wireless LANs can extend them seamlessly to remote locations, even with mobility.

With these advantages, the upgraded digital wireless industry is certain to rebound from its current recession, although more slowly than proponents have envisioned.

REFERENCE

Seybold, A. 2001. Silicon Insights: Spectral Efficiency. Available online at: <http://abcnews.go.com/sections/business/DailyNews/silicon_insights_seybold_010716.html>.