detect accelerating movements that could eventually lead to a catastrophic failure. Near-real-time processing of the data could be linked with preset alarm levels to provide early warning of impending disaster in sufficient time for evacuation or other emergency action, and perhaps even allow for remedial action to prevent it. Imagine post-event damage surveys performed by space-borne platforms that could be used to direct response and recovery efforts in the immediate aftermath of a catastrophic event such as an earthquake or terrorist attack. These types of systems are within the realm of existing technologies, and could become reality with appropriate investment in research and development.
The staggering increases in computing power and communications capabilities over the past 50 years have led to the development of information systems unimaginable just a few decades ago. The evolution in computer simulation capability since 1993 is shown in Figure 3.8. The growth in computer power has been driven by energy, scientific, and engineering applications, and especially by defense applications such as the Accelerated Strategic Computing Initiative (ASCI), a project started in 1996 to replace traditional nuclear testing by highly tuned, massive computer simulations (Messina, 1999).
As shown in Figure 3.8, supercomputer performance evolves rapidly. Today’s fastest computers perform up to 71 × 10^12 floating-point operations per second; a list of the fastest machines is maintained on the Web at http://www.top500.org. In November 2004, the fastest computer was the Department of Energy/IBM BlueGene/L beta system, which holds the record benchmark performance of 70.72 Tflop/s (“teraflops,” or trillions of floating-point operations per second). It is followed closely, at 51.87 Tflop/s, by the Columbia system built by Silicon Graphics and installed at the National Aeronautics and Space Administration Ames Research Center.
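To make the benchmark figures above concrete, the following minimal Python sketch converts a Tflop/s rating into raw floating-point operations per second and estimates how long a hypothetical workload would take at that sustained rate. The workload size (10^18 operations) is an illustrative assumption, not a figure from the text.

```python
TFLOP = 1e12  # one teraflop = 10**12 floating-point operations


def flops(tflops_rating):
    """Convert a Tflop/s rating into floating-point operations per second."""
    return tflops_rating * TFLOP


def runtime_seconds(total_operations, tflops_rating):
    """Time (s) to execute total_operations at the given sustained rate."""
    return total_operations / flops(tflops_rating)


# Benchmark figures quoted in the text (November 2004 TOP500 list):
bluegene_l = 70.72  # DOE/IBM BlueGene/L beta system, Tflop/s
columbia = 51.87    # SGI Columbia system at NASA Ames, Tflop/s

# A hypothetical simulation requiring 10**18 floating-point operations:
print(runtime_seconds(1e18, bluegene_l))  # ~14,140 s, roughly 3.9 hours
print(runtime_seconds(1e18, columbia))    # ~19,279 s, roughly 5.4 hours
```

Such back-of-the-envelope estimates ignore memory bandwidth, communication overhead, and parallel efficiency, so real run times would be longer; they simply show the scale that trillions of operations per second implies.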