BOX 7.2 THE RIGHT HAND OF THE MODERN SCIENTIST

Since the late 1940s, Smarr noted, "computers have emerged as universal devices for aiding scientific inquiry. They are used to control laboratory experiments, write scientific papers, solve equations, and store data." But "the supercomputer has been achieving something much more profound; it has been transforming the basic methods of scientific inquiry themselves" (Smarr, 1992, p. 155).

"Supercomputer," Smarr explained, is a generic term, referring always to "the fastest, largest-memory machines made at a given time. Thus, the definition is a relative one. In 1990, to rate as a supercomputer, a machine needed to have about 1 billion bytes of directly addressable memory, and had to be capable of sustaining computations in excess of a billion floating point operations per second" (p. 155). These fairly common measures for rating computer power refer to an information unit (a byte contains eight of the basic binary units, 1 or 0, called bits) and the speed with which additions, subtractions, multiplications, or divisions of decimal numbers can be accomplished. These machines had between 100 and 1000 times the power of the personal computers commonly in use, and might cost in the neighborhood of $30 million. As he surveyed the growth of computational science, however, Smarr predicted that the "rapidly increasing speed and connectivity of networks will contribute to altering how humans work with supercomputers" (p. 162). He envisions that scientists and others working from personal and desktop machines will enjoy almost instantaneous access to a number of supercomputers around the country at once, with various parts of their programs running on each. He foresees that "the development of new software will make it seem as if they were occurring on one computer. In this sense the national network of computers will appear to be one giant 'metacomputer' that is as easy to use as today's most user-friendly personal machines'' (p. 163).

Smarr credited von Neumann with the recognition, in the 1940s, that the technology of electronic digital computers "could be used to compute realistically complex solutions to the laws of physics and then to interrogate these solutions by changing some of the variables, as if one were performing actual experiments on physical reality" (p. 158). The American government's experience in World War II uncovered a number of practical applications of this principle and other uses for the computer's speed, "to aid in nuclear weapons design, code breaking, and other tasks of national security" (p. 157), reported Smarr. This effort developed in the succeeding decades into government centers with major laboratories—as reliant on their computers as on the electricity that powered them—at Los Alamos, New Mexico, and Livermore, California. "From the efforts of the large teams of scientists, engineers, and programmers who worked on the


