
BOX D.1 Computational Chemistry: Applying High-Performance Computing to Scientific Problems

Chemists were among the earliest researchers to use computers to extend the understanding of their field. By the time of a 1956 Texas conference on quantum chemistry, electronic computers had developed to the point where programming large scientific computations was just feasible, although the results they delivered were too crude to be of real interest to quantum chemists and remained so even into the late 1960s. Nonetheless, on the strength of this "progress," the conference passed a recommendation urging that more machines be put at the disposal of university departments. Several groups at the University of Chicago, the Massachusetts Institute of Technology, and elsewhere set out to exploit the new facilities, mounting huge (for the time) research programs to compute the electronic wave functions of diatomic molecules built from atoms in the first full row of the periodic table.

By the early 1970s, significant progress had been made in computing molecular energies, wave functions, the dynamics of reacting systems, and liquid structure on the high-speed computers of the day, namely the IBM 360/65, the CDC 6600, and the Univac 1108. Important work was also being done with semiempirical methods on systems of as many as 10 to 12 atoms. Reliable methods had been applied to calculating potential energy surfaces for the H + H2 and F + H2 systems, methods that have been essential to advancing the understanding of molecular collisions. There had been semiquantitative calculations of hydrogen bond strengths and of protein conformation, but the facilities for such work were available to only a small group of chemists, mostly on the staffs of national laboratories. The need to extend access to such facilities, coupled with the new goal of bringing people together to develop software and attack important problems in chemistry, led to the creation of the National Resource for Computation in Chemistry (NRCC), funded jointly by the National Science Foundation (NSF) and the Department of Energy.

With the creation of the NSF supercomputer centers in the 1980s, chemists were able to pursue computational studies whose requirements lay well beyond the capabilities of systems otherwise available, even to the leading research groups of the period. In addition to high-speed computation, the centers provided access to large amounts of disk storage and fostered large-scale use of high-speed data communication.

A major breakthrough of the early 1980s was the recognition by industry of the value of computational chemistry to the marketplace. Companies set up research groups that used computational chemistry software for electronic structure studies (e.g., Gaussian and CADPAC) and molecular mechanics simulations (e.g., AMBER and CHARMM), coupled with graphics platforms.

By the mid-1980s an industry had developed around modeling software aimed at the chemical, pharmaceutical, and biotechnology industries. Large companies, such as Eli Lilly and DuPont, bought supercomputers to provide the capability to model complex molecules and processes important to their businesses.

One of the major directions for future work is the application of accurate quantum mechanical approaches to biological systems. Such an effort would complement molecular mechanics methods with selected calculations of higher accuracy, making it possible to resolve important fine points. Areas where these efforts might be introduced include enzymatic reactions involving transition-metal centers and an array of catalytic processes.
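
A minimal sketch of one way such a complementary scheme can be organized, assuming a subtractive (ONIOM-style) coupling; the function hybrid_energy and the callables qm_energy and mm_energy are hypothetical placeholders, not anything specified in this box:

    # Subtractive hybrid energy: E = E_MM(full) + E_QM(active) - E_MM(active).
    # qm_energy and mm_energy are hypothetical callables; a real code would
    # dispatch them to an electronic structure package and to a force field
    # such as AMBER or CHARMM, respectively.

    def hybrid_energy(full_system, active_region, qm_energy, mm_energy):
        """Combine a cheap method over the whole system with an expensive
        quantum correction over the chemically critical region (e.g., a
        transition-metal center in an enzyme active site)."""
        return (mm_energy(full_system)
                + qm_energy(active_region)
                - mm_energy(active_region))

The subtractive form confines the costly quantum calculation to the small region where high accuracy matters, which is what makes selective correction of the molecular mechanics result affordable.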

The additional power provided by massively parallel computer systems is stimulating a push for higher accuracy and improved algorithms. Methods that had impact on serial processors and were readily adapted to vector systems often must be substantially modified or replaced for massively parallel processors; the sketch below illustrates the kind of restructuring involved. A major requirement for advancement is seamless scalability across systems of different sizes.
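
As one illustration of that restructuring, the following sketch parcels a serial pairwise-energy loop out to a pool of worker processes; the Lennard-Jones form, the coordinates, and the process count are illustrative assumptions rather than anything drawn from this box:

    # Splitting a pairwise-energy sum across worker processes.
    import math
    from itertools import combinations
    from multiprocessing import Pool

    def pair_energy(pair):
        # Lennard-Jones interaction in reduced units (illustrative).
        a, b = pair
        r = math.dist(a, b)
        return 4.0 * (r**-12 - r**-6)

    def total_energy(coords, nprocs=4):
        pairs = list(combinations(coords, 2))
        with Pool(nprocs) as pool:
            # Each worker evaluates a slice of the pair list; the
            # partial results are reduced into a single sum.
            return sum(pool.map(pair_energy, pairs))

    if __name__ == "__main__":
        atoms = [(0.0, 0.0, 0.0), (1.1, 0.0, 0.0), (0.0, 1.1, 0.0)]
        print(total_energy(atoms))

A replicated pair list like this one stops scaling at modest processor counts; seamless scalability generally demands domain decomposition, in which each processor owns a spatial region of the system and communicates only with its neighbors.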

With the need for higher accuracy on massively parallel systems will likely come increased attention to Monte Carlo methods for quantum many-body systems. Quantum simulations are naturally parallel and are expected to be used increasingly on massively parallel computer systems.
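
That natural parallelism is easy to see in a variational Monte Carlo calculation, where independent Metropolis walkers can be farmed out to separate processors. The sketch below treats the hydrogen-atom ground state in atomic units; the trial function exp(-alpha*r), the step size, and the walker counts are illustrative choices, not taken from this box:

    # Variational Monte Carlo for the hydrogen atom (atomic units).
    import math, random
    from multiprocessing import Pool

    ALPHA = 0.9  # variational parameter; exact ground state at 1.0

    def local_energy(r):
        # E_L = -alpha**2/2 + (alpha - 1)/r for psi = exp(-alpha * r)
        return -0.5 * ALPHA**2 + (ALPHA - 1.0) / r

    def walker(args):
        # Each walker is an independent Metropolis chain sampling |psi|**2,
        # so walkers map directly onto separate processors.
        seed, nsteps = args
        rng = random.Random(seed)
        pos = [1.0, 0.0, 0.0]
        total = 0.0
        for _ in range(nsteps):
            trial = [x + rng.uniform(-0.5, 0.5) for x in pos]
            r_old = math.sqrt(sum(x * x for x in pos))
            r_new = math.sqrt(sum(x * x for x in trial))
            if rng.random() < math.exp(-2.0 * ALPHA * (r_new - r_old)):
                pos = trial
            total += local_energy(math.sqrt(sum(x * x for x in pos)))
        return total / nsteps

    if __name__ == "__main__":
        with Pool(4) as pool:
            estimates = pool.map(walker, [(s, 20000) for s in range(8)])
        # Analytic expectation is alpha**2/2 - alpha = -0.495 hartree here.
        print(sum(estimates) / len(estimates))

Because the walkers never communicate, adding processors reduces the statistical error of the estimate with essentially no parallel overhead.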


