Appendix E: Accomplishments of National Science Foundation Supercomputer Centers
Pages 108-117

From page 108...
... In 1992 the four NSF supercomputer centers (Cornell Theory Center, National Center for Supercomputing Applications, Pittsburgh Supercomputing Center, and San Diego Supercomputer Center) formed a collaboration based on the concept of a national MetaCenter for computational science and engineering: a collection of intellectual and physical resources unlimited by geographical or institutional constraints.
From page 109...
... All usage has been converted to equivalent processor-years for a Cray Research Y-MP, the type of supercomputer that the NSF centers first installed in 1985-1986.

TABLE E.1 Supercomputer Usage at National Science Foundation Supercomputer Centers, 1986 to 1994

Fiscal Year    Active Users    Usage in Normalized CPU Hours(a)
1986              1,358             29,485
1987              3,326             95,752
1988              5,069            121,615
1989              5,975            165,950
1990              7,364            250,628
1991              7,887            361,037
1992              8,578            398,932
1993              7,730            910,088
1994              7,431          2,249,562

(a) Data prior to May 1990 include the John von Neumann Center.
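For readers who want to reproduce the conversion mentioned above, the arithmetic is straightforward. The sketch below (in Python; not from the report) assumes a processor-year means one processor busy for all 8,760 hours of a year; the machine-specific factors the centers used to normalize raw hours into Y-MP-equivalent hours are not given in this appendix.

    # Minimal sketch: convert normalized Y-MP CPU hours (Table E.1) into
    # equivalent processor-years. Assumes one processor-year = 8,760 hours;
    # the report's machine-specific normalization factors are not shown here.
    HOURS_PER_YEAR = 24 * 365  # 8,760 hours

    def processor_years(normalized_cpu_hours: float) -> float:
        """Equivalent Y-MP processor-years for a given number of CPU hours."""
        return normalized_cpu_hours / HOURS_PER_YEAR

    # Example using the FY 1994 entry from Table E.1:
    print(f"{processor_years(2_249_562):.1f}")  # about 256.8 processor-years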
From page 110...
... allowing the science and engineering communities maximum choice in selecting a machine that matches their computational needs. A short list of architectures offered through the NSF centers program includes single and clustered high-performance workstations or workstation multiprocessors, minicomputers, graphics supercomputers, mainframes with or without attached processors or vector units, vector supercomputers, and single instruction multiple data (SIMD)
From page 111...
... Cray T3D, 512 processors (2)
DEC Workstation Cluster
IBM Workstation Cluster
HP Workstation Cluster
MasPar (2), 16,000 processors
Convex Exemplar, 8 processors (new)
SGI Challenge, 32 processors (new)
From page 112...
... The NSF supercomputer centers were among the first to experiment with clusters of workstations as an alternative for scalable computing. The first work was done in the 1980s with loosely coupled clusters of Silicon Graphics Inc.
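The appendix does not say what software coordinated these early clusters, so the following is only a loose illustration of the loosely coupled model: independent machines each compute a slice of a larger problem, and the partial results are gathered at the end. In this Python sketch, a standard process pool stands in for a set of workstation nodes.

    # Illustrative sketch only: a process pool stands in for loosely
    # coupled workstations. Each "node" works on an independent slice,
    # and the results are combined afterward (the gather step).
    from multiprocessing import Pool

    def simulate_slice(bounds):
        lo, hi = bounds
        # Stand-in for an independent piece of a larger computation.
        return sum(x * x for x in range(lo, hi))

    if __name__ == "__main__":
        slices = [(i * 250_000, (i + 1) * 250_000) for i in range(4)]
        with Pool(processes=4) as pool:   # one worker per "node"
            partials = pool.map(simulate_slice, slices)
        print(sum(partials))              # combined result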
From page 113...
... ARPANET and the TCP/IP protocol, the NSFNET rapidly grew to provide remote access to the NSF supercomputer centers through the creation of regional and campus connections to the backbone. Although its growth was initially driven by pull from the high end, the NSFNET soon began to provide ubiquitous connectivity to the academic research community for electronic mail, file transport, and remote log-in, as well as supercomputer access.
From page 114...
... K-12 Educators and Students

Training of high school teachers and curriculum development are among the many MetaCenter educational efforts. Several programs have been initiated, such as ChemViz, which helps students understand abstract chemistry concepts; a visualization workshop at Supercomputing '93; and SuperQuest, a program involving MetaCenter sites that brings teams of teachers and students from selected high schools to summer institutes to develop computational and visualization projects that they then work on throughout the following year.
From page 115...
... Quantum Physics and Materials Science

The great disparity between nuclear, atomic, or molecular scales and macroscopic material scales implies that vast computing resources are needed to attempt to predict the characteristics of bulk matter from fundamental laws of physics. Since the beginning of the NSF centers program, researchers in this area have been among the most frequent users of supercomputers.
From page 116...
... · Crystallography
· Artificial intelligence and protein folding
· Protein kinase solution
· Molecular neuroscience-serotonin
· Molecular neuroscience-acetylcholinesterase
· Kinking DNA
· Antibody-antigen docking
· Tuning biomolecules to fight asthma
· Virtual spiders and artificial silk

Engineering

Man-made devices have become so complex that researchers in both academia and industry have turned to supercomputers to analyze and modify accurate models in ways that complement traditional experimental methods. High-performance computers enable academic engineers to study the brittleness of new types of steel, improve bone transplants, or reduce the drag of flows over surfaces using riblets.
From page 117...
... Finally, to improve global weather or climate forecasts, supercomputers allow researchers to study the physics of such critical processes as mixing at the air-ocean interface. Among the related problems being addressed are the following:

· Detoxification of ground water
· Storm modeling and forecasting
· Los Angeles smog
· Upper-ocean mixing
· Simulating climate using distributed supercomputers

Planetary Sciences, Astronomy, and Cosmology

As was evident in the recent impact of Comet Shoemaker-Levy 9 with Jupiter, observatories on Earth and in space have become intimately linked.

