Getting Up to Speed: The Future of Supercomputing
The meaning of the terms "supercomputing" and "supercomputer" is relative to the overall state of computing at a given time. For example, in 1994, when describing computers subject to export control, the Department of Commerce's Bureau of Export Administration amended its definition of "supercomputer," raising the threshold from a composite theoretical performance (CTP) equal to or exceeding 195 million theoretical operations per second (MTOPS) to a CTP equal to or exceeding 1,500 MTOPS.2 Current examples of supercomputers are contained in the TOP500 list of the 500 most powerful computer systems as measured by best performance on the Linpack benchmark.3
Supercomputers provide significantly greater sustained performance than is available from the vast majority of installed contemporary mainstream computer systems. In applications such as the analysis of intelligence data, weather prediction, and climate modeling, supercomputers enable the generation of information that would not otherwise be available or that could not be generated in time to be actionable. Supercomputing can accelerate scientific research in important areas such as physics, material sciences, biology, and medicine. Supercomputer simulations can augment or replace experimentation in cases where experiments are hazardous, expensive, or even impossible to perform or to instrument. They can collapse time and enable us to observe the evolution of climate over centuries or the evolution of galaxies over billions of years; they can expand space and allow us to observe atomic phenomena or shrink space and allow us to observe the core of a supernova. They can save lives and money by producing better predictions on the landfall of a hurricane or the impact of an earthquake.
In most cases, the problem solved on a supercomputer is derived from a mathematical model of the physical world. Approximations are made when the world is represented using continuous models (partial differential equations) and again when these continuous models are discretized. Validated approximate solutions provide sufficient information to stimulate human scientific imagination or to aid human engineering judgment. As computational power increases, fewer compromises are made, and more accurate results can be obtained. Therefore, in many application domains, there is essentially no limit to the amount of compute power that can be put to productive use.
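To make the idea of discretization concrete, here is a minimal sketch (not drawn from the report; the function name and test problem are illustrative) that discretizes a one-dimensional Poisson equation, a simple continuous model, into a finite system of linear equations solvable by a computer. The continuous unknown u(x) is replaced by its values at n grid points, and the second derivative by a second-order central difference, yielding a tridiagonal system:

```python
import math

def solve_poisson_1d(f, n):
    """Discretize -u''(x) = f(x) on [0, 1] with u(0) = u(1) = 0 using
    second-order central differences, then solve the resulting
    tridiagonal system with the Thomas algorithm."""
    h = 1.0 / (n + 1)
    x = [(i + 1) * h for i in range(n)]   # interior grid points
    # Discrete equations: (2u_i - u_{i-1} - u_{i+1}) / h^2 = f(x_i)
    a = [-1.0] * n                        # sub-diagonal
    b = [2.0] * n                         # diagonal
    c = [-1.0] * n                        # super-diagonal
    d = [h * h * f(xi) for xi in x]       # right-hand side
    # Forward elimination
    for i in range(1, n):
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    # Back substitution
    u = [0.0] * n
    u[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return x, u

# With f(x) = pi^2 sin(pi x), the exact solution is u(x) = sin(pi x).
x, u = solve_poisson_1d(lambda t: math.pi ** 2 * math.sin(math.pi * t), 127)
err = max(abs(ui - math.sin(math.pi * xi)) for xi, ui in zip(x, u))
```

The discretization error here shrinks as h^2; halving the grid spacing quarters the error. This is the trade-off the passage describes: more compute power permits finer grids (or higher-dimensional models), and hence fewer approximations.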
The TOP500 list is available at <http://www.top500.org>. The Linpack benchmark solves a dense system of linear equations; in the version used for TOP500, one picks a system size for which the computer exhibits the highest computation rate.
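The following is a minimal pure-Python sketch of what a Linpack-style measurement does, not the actual HPL benchmark code (the names `gauss_solve` and `mflops` are illustrative): solve a random dense system by Gaussian elimination with partial pivoting, and divide the standard Linpack flop count, 2/3 n^3 + 2 n^2, by the elapsed time. Sweeping the problem size n and keeping the best rate mirrors how TOP500 entries pick the size at which the machine performs best:

```python
import random
import time

def gauss_solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting.
    A and b are modified in place; returns the solution vector x."""
    n = len(A)
    for k in range(n):
        # Partial pivoting: bring the largest remaining pivot into row k.
        p = max(range(k, n), key=lambda i: abs(A[i][k]))
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            m = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= m * A[k][j]
            b[i] -= m * b[k]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x

def mflops(n):
    """Time the solution of a random dense n-by-n system and report the
    rate using the standard Linpack flop count of 2/3 n^3 + 2 n^2."""
    A = [[random.random() for _ in range(n)] for _ in range(n)]
    b = [random.random() for _ in range(n)]
    t0 = time.perf_counter()
    gauss_solve(A, b)
    elapsed = time.perf_counter() - t0
    return ((2.0 / 3.0) * n ** 3 + 2.0 * n ** 2) / elapsed / 1e6

# A benchmark run sweeps n and reports the peak rate observed.
best_n, best_rate = max(((n, mflops(n)) for n in (50, 100, 200)),
                        key=lambda t: t[1])
```

Real Linpack/HPL implementations use blocked, cache- and network-aware elimination rather than this textbook triple loop, which is why tuning the problem size (and the implementation) to the machine matters so much to the reported number.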