D
Current High Performance Computing and Communications Initiative Grand Challenge Activities

Since its beginning, the High Performance Computing and Communications Initiative has included Grand Challenges, difficult scientific problems whose solution will yield new scientific understanding while simultaneously advancing high-performance computing and communications. The following list of current Grand Challenge activities is based on the FY 1994 and FY 1995 "Blue Books" and communications with National Coordination Office staff.

NATIONAL SCIENCE FOUNDATION

Aerospace
· Coupled field problems

Computer Science
· Machine learning
· Parallel input/output (I/O) methods for I/O-intensive Grand Challenges

Environmental Modeling and Prediction
· Large-scale environmental modeling
· Adaptive coordination of results of predictive models with experimental observations
· Earthquake ground motion modeling in large basins
· High-performance computing for land cover dynamics
· Massively parallel simulation of large-scale, high-resolution ecosystem models

Molecular Biology and Biomedical Imaging
· Biomolecular design
· Imaging in biological research
· Advanced computational approaches to biomolecular modeling and structure determination
· Understanding human joint mechanics through advanced computational models

Product Design and Process Optimization
· High-capacity atomic-level simulations for the design of materials

Space Science
· Black hole binaries: coalescence and gravitational radiation
· Formation of galaxies and large-scale structure
· Radio synthesis imaging
DEPARTMENT OF ENERGY

Energy
· Mathematical combustion modeling
· Quantum chromodynamics calculations
· Oil reservoir modeling
· Numerical Tokamak project

Environmental Monitoring and Prediction
· Computational chemistry (see Box D.1 for discussion)
· Global climate modeling
· Groundwater transport and remediation

Molecular Biology and Biomedical Imaging
· Computational structural biology

Product Design and Process Optimization
· First-principles simulation of materials properties

NATIONAL AERONAUTICS AND SPACE ADMINISTRATION
· Large-scale structure and galaxy formation
· Cosmology and accretion astrophysics
· Convective turbulence and mixing in astrophysics
· Solar activity and heliospheric dynamics

NATIONAL INSTITUTES OF HEALTH
· Molecular biology
· Biomedical imaging

NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY
· Product design and process optimization

ENVIRONMENTAL PROTECTION AGENCY
· Linked air and water-quality modeling

NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION
· Climate change prediction and weather forecasting
BOX D.1 Computational Chemistry: Applying High-Performance Computing to Scientific Problems

Chemists were among the earliest researchers to pursue the use of computers to extend understanding of their field. At the time of a 1956 Texas conference on quantum chemistry, electronic computers had developed to the stage that it was just feasible to program large scientific computations. However, these computers provided results too crude to be of interest to quantum chemists, even by the late 1960s. Nonetheless, based on this "progress," the conference passed a recommendation urging that more machines be put at the disposal of university departments. Several groups at the University of Chicago, the Massachusetts Institute of Technology, and elsewhere pursued the goal of exploiting these new facilities, engaging in huge (for the time) research programs to compute the electronic wave functions for molecules constituted of two atoms from the first full row of the periodic chart. By the early 1970s, significant progress had been made in the computation of molecular energies, wave functions, and the dynamics of reacting systems and liquid structure on the high-speed computers of the day, namely, the IBM 360/65, CDC 6600, and Univac 1108. Important work was also being done using semiempirical methods for systems with as many as 10 to 12 atoms. Reliable methods had been applied to the calculation of potential energy surfaces for the H + H2 and F + H2 systems, methods that have been essential in advancing the understanding of molecular collisions. There had been semi-quantitative calculations of hydrogen bond strengths and protein conformation, but the facilities to carry out such work were available to only a small group of chemists, mostly on the staffs of national laboratories.
The need to extend access to such facilities, coupled with the new goal of bringing together people to work on software development and to attack important problems in chemistry, led to the creation of the National Resource for Computation in Chemistry (NRCC), funded jointly by the National Science Foundation (NSF) and the Department of Energy. With the creation of the NSF supercomputer centers in the 1980s, chemists were able to pursue computational studies with requirements well beyond the capability of systems otherwise available even to leading research groups of the period. In addition to high-speed computation, the centers made accessible large amounts of disk storage and fostered large-scale use of high-speed data communication.

A major breakthrough of the early 1980s was the recognition by industry of the value of computational chemistry to the marketplace. Companies set up research groups that used computational chemistry software for electronic structure studies (e.g., Gaussian and CADPAC) and molecular mechanics simulations (e.g., AMBER and CHARMM), coupled with graphics platforms. By the mid-1980s an industry had developed in modeling software focused on the chemical, pharmaceutical, and biotechnology industries. Large companies, such as Eli Lilly and DuPont, bought supercomputers to provide the capability to model complex molecules and processes important for their businesses.

One of the major directions for future work is the application of accurate quantum mechanical approaches to biological systems. This effort would complement the molecular mechanics method with selected calculations of higher accuracy to enable explication of important fine points. Areas where these efforts might be introduced are enzymatic reactions involving transition-metal centers and an array of catalytic processes. The additional power provided by massively parallel computer systems is stimulating a push for higher accuracy and improved algorithms.
Methods that have had impact on serial processors and that were readily modified for vector systems often must undergo major modification or replacement for massively parallel processors. A major requirement for advancement is seamless scalability across systems of different sizes. With the need for higher accuracy on massively parallel systems will likely come increased attention to Monte Carlo methods for quantum many-body systems. Quantum simulations are naturally parallel and are expected to be used increasingly on massively parallel computer systems.
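The natural parallelism mentioned above can be seen in even the simplest quantum Monte Carlo calculation: each random walker is independent, so many walkers can run simultaneously with no communication until their results are averaged. The sketch below, which is illustrative only and not drawn from the chapter, applies variational Monte Carlo to the one-dimensional harmonic oscillator; the trial wave function, step size, and seeds are all assumptions chosen for clarity.

```python
import math
import random

# Variational Monte Carlo sketch for H = -1/2 d^2/dx^2 + 1/2 x^2
# (1-D harmonic oscillator, atomic units) with trial wave function
# psi(x) = exp(-alpha * x^2). All names/parameters here are
# illustrative assumptions, not taken from the chapter.

def local_energy(x, alpha):
    # E_L(x) = (H psi)(x) / psi(x) = alpha + x^2 (1/2 - 2 alpha^2)
    return alpha + x * x * (0.5 - 2.0 * alpha * alpha)

def vmc_energy(alpha, n_steps=20000, step=1.0, seed=0):
    """Metropolis sampling of |psi|^2; returns the mean local energy."""
    rng = random.Random(seed)
    x, total = 0.0, 0.0
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        # Acceptance probability is |psi(x_new) / psi(x)|^2
        if rng.random() < math.exp(-2.0 * alpha * (x_new * x_new - x * x)):
            x = x_new
        total += local_energy(x, alpha)
    return total / n_steps

# Independent walkers (different seeds) need no communication; on a
# parallel machine each would run on its own processor, and the final
# estimate is a simple average.
estimate = sum(vmc_energy(0.45, seed=s) for s in range(4)) / 4
```

At alpha = 0.5 the trial function is the exact ground state and the estimate reproduces the exact energy 1/2 with zero variance; away from that value the estimate fluctuates but, by the variational principle, averages to something above 1/2. The point of the sketch is the structure, not the chemistry: the walkers share nothing, which is exactly the property that makes such simulations map naturally onto massively parallel systems.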