including software, scalable information infrastructure, scalable public key infrastructure, software engineering, the human workforce, and human-computer interactions. He added that accessibility is an important focus these days in high-performance computing, his own field. He then introduced the first of the two speakers for the session.

COMMUNICATIONS

Alfred Aho

Lucent Technologies

Dr. Aho said he would talk about the impact of basic research and particularly communications research on the information-based economy. He said that basic research in computer and information science and engineering has been the font of the information technology economy that we are now moving into.

Toward Network-Hosted Services

He warned against predictions in this fast-moving field: Just 10 years ago the World Wide Web did not exist, and no one could have predicted the degree of connectivity that exists today. Already we see a phenomenal penetration of computers and information, but what these figures will look like 10 years hence is anyone’s guess. He did make one guess: that more people will use applications and services hosted on the network and more will use network services in the conduct of business. Small and medium-size businesses will adopt network-hosted applications to compete with large businesses. Large businesses, in turn, will focus on core competencies and refocus their information technology staffs on strategic issues.

He spoke of several examples of the rapid growth of communications services:

  • 5 million e-mail messages will be sent in the next hour;

  • 35 million voice-mail messages will be delivered in the next hour;

  • 37 million people will log onto the Internet today and choose from among a billion Web pages;

  • Internet traffic will double in the next 100 days.

Other advantages of the network-hosted economy are that it reduces capital and operations expenses, allows businesses to become global, permits virtual enterprises and institutions, improves trust in network security, allows for storage and applications anywhere in the network, and improves network performance.



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.
Terms of Use and Privacy Statement





The Need for Talented People

One factor that may limit how fast network-hosted technology will evolve is the availability of talented people. Almost every information industry has a huge shortage of the talented people needed to create next-generation systems. In many classical telephone companies, those who run the systems have an average education of high school plus one year of community college. Companies are introducing into this global information infrastructure some of the most sophisticated technology ever created and asking people with little background to take care of the systems.

The switching speed of a human neuron is unlikely to change in the next 20 years, “and this has consequences in almost all layers of the protocol stack with regard to how the information age is going to evolve.” First is the limit on how much new technology people can absorb. The human challenges of dealing with technology will be significant in terms of its penetration and use.

The challenges in software engineering will be equally daunting. Dr. Aho said that he began a course he taught by asking the students to estimate the total number of lines of software required to support the world’s telecommunications infrastructure. The students’ answers began with a few thousand and ranged upward to perhaps a million. In fact, Lucent’s most advanced switch has somewhere between 10 million and 15 million lines of code; Windows NT has about 35 million. Few people, he said, pay attention to how hard it is to evolve the huge software systems that support the new kinds of applications.

Web-Based Provisioning

The Emergence of Application Service Providers

Moving up to the application layer of the protocol stack, Dr. Aho said, we see the emergence of new classes of Application Service Providers (ASPs) that provide services unimaginable just five years ago.
Some 500 new entrants have emerged in just a few years. Some of them are joining to form economic “ecosystems,” in which they partner with other ASPs to provide suites of services tailored to certain segments of the industry. He cited an estimated growth rate for the ASP market of over 90 percent, to $2 billion by 2003. This joining, done largely out of economic necessity, produces another layer of complexity and potential difficulty in how business will be conducted.

Bandwidth Trading and Customer Care

In addition to new kinds of service providers and services, Web-based provisioning of transport services will enable bandwidth trading that will function like a NASDAQ market. One can imagine any kind of service available on the network and a market or exchange in which economic forces decide what is bought and sold, and at what price.

Another new development, particularly for businesses, is a mode of customer care heretofore unimagined. Enterprises can use Web call centers to take care of their customers with sophisticated software agents. These centers will:

  • integrate Internet and telephone access;

  • provide functionality to multiple enterprises;

  • provide fully functional remote agencies (through disaggregated “soft switches”) with dynamic load balancing across agents; and

  • maintain “blended agent pools” for voice, Web, and outbound calls and back-office workflow tasks.

Caching and intelligent redirection will reduce traffic congestion and increase speed for Web access.

New Systems and Drivers

He added that the way we educate ourselves, entertain ourselves, and deliver education and health services in the future will not simply be automated versions of existing systems. These will be replaced by radically new styles of offering services that we can only imagine today. Companies will put new kinds of services into the network to do this, effectively turning the world into a global distributed system that will include data, computation, and communication: essentially all the things imagined for semiconductors and the construction of personal computers. This new network-hosted economy is being enabled, in turn, by new technology drivers as the systems of the 1990s evolve into the systems of 2000+. For example, we have moved from the semiconductor to the microprocessor; now will come systems on a chip. We have moved from analog to digital; now will come cells and packets on optical fiber. We have moved from narrow bandwidth to the information highway; now will come flexible multiservice broadband to endpoints (Table 3).
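The bandwidth market Dr. Aho describes, a NASDAQ-like exchange in which economic forces set the price of transport capacity, can be illustrated with a minimal order-matching loop. This is a hypothetical sketch, not a design from the workshop: orders rest in price-time priority, and a trade executes at the resting ask price whenever the best bid meets or beats it.

```python
# Toy continuous double auction for units of transport capacity.
# All names and numbers are illustrative.
import heapq

class BandwidthExchange:
    def __init__(self):
        self.bids = []    # max-heap via negated price: [-price, seq, qty]
        self.asks = []    # min-heap: [price, seq, qty]
        self.trades = []  # executed (price, qty) fills
        self._seq = 0     # arrival order, for time priority at equal prices

    def buy(self, price, qty):
        self._seq += 1
        heapq.heappush(self.bids, [-price, self._seq, qty])
        self._match()

    def sell(self, price, qty):
        self._seq += 1
        heapq.heappush(self.asks, [price, self._seq, qty])
        self._match()

    def _match(self):
        # Cross the book while the best bid price covers the best ask price.
        while self.bids and self.asks and -self.bids[0][0] >= self.asks[0][0]:
            bid, ask = self.bids[0], self.asks[0]
            qty = min(bid[2], ask[2])
            self.trades.append((ask[0], qty))  # fill at the resting ask price
            bid[2] -= qty
            ask[2] -= qty
            if bid[2] == 0:
                heapq.heappop(self.bids)
            if ask[2] == 0:
                heapq.heappop(self.asks)

market = BandwidthExchange()
market.sell(price=10, qty=5)   # one carrier offers 5 units of capacity at 10
market.sell(price=12, qty=5)   # another offers 5 units at 12
market.buy(price=11, qty=7)    # a buyer takes the cheapest capacity first
print(market.trades)           # [(10, 5)]; 2 units of the bid rest at 11
```

Filling at the resting order's price is one common exchange convention; a real bandwidth market would add order cancellation, clearing, and settlement on top of this core loop.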
TABLE 3 Technology Shifts Are the Enablers of the Network-Hosted Economy

1990s → 2000+

  • Semiconductor to microprocessor → Systems on a chip

  • Analog to digital → Cells and packets on optical fiber

  • Narrow bandwidth to information highway → “Flexible” multi-service broadband to endpoints

  • Expensive bandwidth with complex architectures → Almost free bandwidth and simple architectures

  • Dumb access devices → Network-enriched, heterogeneous appliances with natural language interfaces

  • Preprogrammed service logic → Downloadable applets, “servlets”

  • Proprietary to open systems → Open, but reliable, secure systems

  • Dumb to intelligent networks → From connecting places to connecting people and data anytime, anywhere

  • Internet: research tool to e-commerce → Cooperating network of networks

Rapid Evolution of Optical Systems

Referring to previous discussions of Moore’s Law, he said that some other areas of technology are moving even more rapidly. Optical transmission capacity, for example, is doubling every year, revolutionizing network architectures and economics. Some optical systems are reaching the terabit and even petabit range. Engineers at Lucent have been able to put hundreds of gigabits on a single wavelength; in the most recent experiments they have put over 1,000 wavelengths on a single fiber. Given hundreds of gigabits per wavelength and hundreds to thousands of wavelengths, by the middle of this decade systems will deliver petabits per second. With all this information, the challenge will be to use it, given our limited neural-processing capacity, and to find the information we want.

To make such systems possible, Lucent engineers have made glass more transparent by removing water impurities from fiber-optic cables, which allows them to amplify broad ranges of wavelengths. They have also reduced the cost of the devices that put information on and off these fibers and applied new chirped-pulse lasers that allow the selection of separate wavelengths, each of which can be separately modulated as a single system.

Another interesting phenomenon is the use of microelectromechanical systems to switch wavelengths. Lucent has just introduced a product called the lambda router, which works by creating on a wafer an array of hundreds of tiny pop-up mirrors, each mounted on a two-gimbal system. A beam of light comes in from an optical fiber, and the mirror is adjusted to reflect it off another mirror and send it to the exit fiber. This type of switching is much cheaper and more power-efficient than traditional electronic switching, so the manufacturer can visualize giving corporations, and ultimately consumers, their own wavelength on which to communicate with others.

Challenges for Routers and Software

An interesting question, said Dr. Aho, is what type of protocol stack will be put on this wavelength.
In other words, what lies between the information technology and the lambda router in this world of the future? He reiterated that electronic routers may not be able to cope with the speed of optical transmission in the future.

A similarly complex question is what kind of software infrastructure will enable these new applications on the network. He said that one goal is to make available to the community at large application programming interfaces on the network, on top of which one can create new network-hosted solutions. In the past, communications systems have tended to keep a closed environment in which only the service provider could control the network services. The Internet, by contrast, will bring a growth of new services and new applications at the endpoints because of its open protocols.

For the immediate future, Dr. Aho foresees the creation of devices like soft switches that give third-party service providers the ability to construct services and control certain resources on the network. This must be done in a way that prevents a third-party service provider from destroying the network for other providers and users, which leads to an important question: Because these new services will be created by software, how will reliable software be produced for this new world?

Dr. Aho cited two interesting papers written in the 1940s: one by John von Neumann, who said we can get more reliable hardware out of unreliable components by using redundancy, and one by Claude Shannon, who showed how to create more reliable communication over a noisy channel by using error-detecting and error-correcting codes. Today redundancy and error-detecting and error-correcting codes are used routinely in practice, but the world awaits a paper on how to get more reliable software out of unreliable programmers. Dr. Aho suggested that such a paper would be worth a Turing Award.20

A Focus on Software Productivity and Reliability

He reported that considerable effort is now focused on software productivity and reliability. In software feature verification, he said, there is a Moore-like curve: in the last two decades, algorithmic improvements have allowed programmers to verify automatically that software has certain properties. A tool extracts a finite-state model from a C program, and the desired properties of the code are expressed in a form of linear temporal logic. A property might be that when a user picks up a phone he should get a dial tone; this can be expressed as a temporal logic formula. The system then takes the negation of this property: that one picks up the phone and never gets a dial tone.
Model checking can then determine a violation of that property: Is it possible for the system to be in a state where picking up the phone never brings a dial tone?

20. Von Neumann, Shannon, and Turing were pioneers in artificial intelligence. John von Neumann designed the basic computer architecture still used today, in which the memory stores instructions as well as data, and instructions are executed serially. He described this in a 1945 paper. Claude Shannon showed that calculations could be performed much faster using electromagnetic relays than they could using mechanical calculators. He applied Boolean algebra. Electromechanical relays were used in the world’s first operational computer, Robinson, in 1940. Robinson was used by the English to decode messages from Enigma, the German enciphering machine. Alan Turing conceived of a universal Turing machine that could mimic the operation of any other computing machine. However, as did Gödel, he also recognized that there exist certain kinds of calculations that no machine could perform. Even recognizing this limit on computers, Turing still did not doubt that computers could be made to think. The Association for Computing Machinery presents the Turing Award, in honor of Alan Turing, annually to individuals who have made significant technical contributions. See <http://www.acm.org>.
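The verification loop described above can be sketched in miniature: state the dial-tone requirement in linear temporal logic as G(offhook → F dialtone), negate it, and search the finite-state model for a counterexample, that is, a reachable off-hook state that can run forever without a dial tone. The phone model below is a hand-written hypothetical stand-in for one extracted from C code; only the technique mirrors the text.

```python
# Explicit-state counterexample search for G(offhook -> F dialtone).

def has_dialtone_free_lasso(transitions, labels, start):
    """True if some reachable off-hook state begins an infinite path
    on which no state is labeled 'dialtone' (i.e., picking up the
    phone may never bring a dial tone)."""
    # 1. Collect all states reachable from the initial state.
    reachable, stack = set(), [start]
    while stack:
        s = stack.pop()
        if s not in reachable:
            reachable.add(s)
            stack.extend(transitions.get(s, ()))
    # 2. Restrict to "quiet" states, where no dial tone sounds.
    quiet = {s for s in reachable if "dialtone" not in labels.get(s, ())}
    edges = {s: [t for t in transitions.get(s, ()) if t in quiet] for s in quiet}
    # 3. Repeatedly delete dead ends; a state survives exactly when an
    #    infinite quiet path starts there (it can reach a quiet cycle).
    changed = True
    while changed:
        changed = False
        for s in list(edges):
            edges[s] = [t for t in edges[s] if t in edges]
            if not edges[s]:
                del edges[s]
                changed = True
    # 4. The property fails iff some surviving state is an off-hook state.
    return any("offhook" in labels.get(s, ()) for s in edges)

# A faulty phone model: after going off-hook, the switch can wander
# into a "dead" state and loop there, never delivering a dial tone.
faulty = {"idle": ["off"], "off": ["tone", "dead"],
          "tone": ["idle"], "dead": ["dead"]}
marks = {"off": ("offhook",), "tone": ("dialtone",)}
print(has_dialtone_free_lasso(faulty, marks, "idle"))  # True: counterexample found
```

Production model checkers work on the same principle but negate arbitrary temporal-logic formulas automatically and report the concrete offending execution, which is what makes the dial-tone violation debuggable.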