contrast, will bring a growth of new services and new applications at the endpoints because of its open protocols.

For the immediate future, Dr. Aho foresees the creation of devices like soft switches that give third-party service providers the ability to construct services and control certain resources on the network. This must be done in a way that prevents a third-party service provider from disrupting the network for other providers and users, which leads to an important question: Because these new services will be created by software, how will reliable software be produced for this new world? Dr. Aho cited two interesting papers written in the 1940s: one by John von Neumann, who said we can get more reliable hardware out of unreliable components by using redundancy, and one by Claude Shannon, who showed how to create more reliable communication over a noisy channel by using error-detecting and error-correcting codes. Today redundancy and error-detecting and error-correcting codes are used routinely in practice, but the world awaits a paper on how to get more reliable software out of unreliable programmers. Dr. Aho suggested that such a paper would be worth a Turing Award.20
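To make the two ideas concrete, the following is a minimal sketch in C (the code and names are illustrative, not drawn from Dr. Aho's talk): a majority vote over three unreliable replicas, in the spirit of von Neumann's redundancy argument, and a single parity bit, the simplest error-detecting code in the spirit of Shannon's.

    #include <stdio.h>

    /* Illustrative sketch, not from Dr. Aho's talk: von Neumann's
     * redundancy idea as triple modular redundancy. Three unreliable
     * copies of a component each produce a bit; a majority vote masks
     * any single failure. */
    static int majority(int a, int b, int c)
    {
        return (a & b) | (a & c) | (b & c);  /* 1 iff at least two inputs are 1 */
    }

    /* The simplest error-detecting code: a parity bit appended to a
     * message lets the receiver detect any single flipped bit. */
    static int parity(const int bits[], int n)
    {
        int p = 0;
        for (int i = 0; i < n; i++)
            p ^= bits[i];
        return p;
    }

    int main(void)
    {
        /* One replica fails (reports 0 instead of 1); the vote still yields 1. */
        printf("vote(1,0,1) = %d\n", majority(1, 0, 1));

        /* The receiver recomputes parity over the received bits;
         * a mismatch with the transmitted parity bit reveals corruption. */
        int msg[] = {1, 0, 1, 1};
        printf("parity = %d\n", parity(msg, 4));
        return 0;
    }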

A Focus on Software Productivity and Reliability

He reported that considerable effort is now focused on software productivity and reliability. In software feature verification, he said, there is a Moore-like curve: over the last two decades, algorithmic improvements have made it possible to verify automatically that software has certain properties. A tool extracts a finite-state model from a C program, and the desired properties of the code are expressed in a form of linear temporal logic. A property might be that when a user picks up a phone he should get a dial tone; this can be stated as a temporal logic formula. The system takes the negation of this property, namely that one picks up the phone and never gets a dial tone, and model checking can then determine whether the property is violated: Is it possible for the system to reach a state in which picking up the phone never brings a dial tone?
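As an illustrative sketch (the particular formula is not quoted in the talk; the notation is standard linear temporal logic, where G means "always" and F means "eventually"), the dial-tone property might be written

    G(offhook → F dialtone)

that is, whenever the phone goes off-hook, a dial tone eventually follows. Its negation, the condition the model checker searches for, is

    F(offhook ∧ G ¬dialtone)

that is, at some point the phone goes off-hook and a dial tone never arrives. If the model checker finds a reachable execution of the finite-state model satisfying the negated formula, that execution is a counterexample demonstrating the violation.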

20 Von Neumann, Shannon, and Turing were pioneers in artificial intelligence. John von Neumann designed the basic computer architecture still used today, in which the memory stores instructions as well as data and instructions are executed serially; he described it in a 1945 paper. Claude Shannon, applying Boolean algebra, showed that calculations could be performed much faster using electromechanical relays than using mechanical calculators. Electromechanical relays were used in the world's first operational computer, Robinson, in 1940. Robinson was used by the English to decode messages from Enigma, the German enciphering machine. Alan Turing conceived of a universal Turing machine that could mimic the operation of any other computing machine. However, like Gödel, he recognized that there exist certain kinds of calculations that no machine can perform. Even while recognizing this limit on computers, Turing did not doubt that computers could be made to think. The Association for Computing Machinery presents the Turing Award, in honor of Alan Turing, annually to individuals who have made significant technical contributions. See <http://www.acm.org>.


