3
Technological Barriers: Computational, Experimental, and Integration Needs for ICME

Chapter 2 provided case studies that show the significant economic and competitive benefits that U.S. original equipment manufacturers (OEMs) and other manufacturers have achieved through the use of ICME. Those studies illustrate the integration of materials knowledge into component manufacturing, optimization, and prognosis. However, there remain significant technical barriers to the widespread adoption of ICME capabilities. In this chapter the committee discusses many of those challenges, focusing not only on modeling tools but also on the materials databases and experimental tools needed to make ICME a reality for a broad spectrum of materials applications. Finally, ways to integrate the various tools and data into a seamless ICME package are addressed.

CURRENT COMPUTATIONAL MATERIALS SCIENCE TOOLS

Today’s materials scientists have increasingly powerful computational tools at their disposal. A recent DOE study demonstrates the compelling nature of the opportunities in computational materials science (CMS).1 A recent NSF report focuses on the cyberinfrastructure needed for materials science.2 The power of twenty-first-century computing is making it possible to predict a range of structural features and properties from fundamental principles.

1 Department of Energy (DOE), Opportunities for Discovery: Theory and Computation in Basic Energy Sciences (2005). Available at http://www.sc.doe.gov/bes/reports/files/OD_rpt.pdf. Accessed February 2008.

2 National Science Foundation (NSF), Materials Research Cyberscience Enabled by Cyberinfrastructure (2004). Available at http://www.nsf.gov/mps/dmr/csci.pdf. Accessed February 2008.





These tools are diverse, ranging from the atomic level to the continuum level and from thermodynamic models to science-based property models. Current computational materials methods range from the specialized materials modeling methods used in fundamental research to the full-scale materials processing tools at manufacturing facilities. Researchers in materials science, mechanics, physics, and chemistry explore materials processing–structure–property relationships as a natural part of the research process. The results from these explorations are often incorporated into sophisticated modeling methods focused on a narrow part of overall materials behavior. While these isolated CMS methods do not necessarily contribute to the ICME infrastructure, they represent a vast supermarket of method development that can be drawn on by yet-to-be-developed integration efforts and infrastructures.

The wide range of CMS methods available today is both a blessing and a curse to materials and engineering design teams. It is difficult for scientists and engineers to judge the efficacy of new or even well-established computational methods because the tools are typically developed in somewhat isolated research environments. While this approach encourages creativity and innovation, it also means that use of these tools requires well-trained specialists who can maintain and run what are basically research codes. In other fields, computational methods—for example, finite element analysis (FEA) and finite difference methods—are firmly embodied in standard packages that have become an integral part of the academic training of the modern scientist or engineer, being based on the mathematical foundation of the discipline. In materials science and engineering, however, the scope is extremely broad: The properties of materials are controlled by a multitude of separate and often competing mechanisms that operate over a wide range of length and time scales, each of which needs to be modeled with specialized methods. The committee concludes that because there is no single overarching approach to modeling all materials phenomena, the widespread application of materials modeling has been limited, which has impeded the transformative power of ICME.

Most computational materials methods can be traced back to academic groups that developed them as part of the educational, scientific, and engineering process. A typical, but by no means universal, path to maturity includes several generations of research codes from one or more groups, which are then transitioned to applications in a government or industrial laboratory and subsequently commercialized with or without government support. In the United States, federal support—through, for example, Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) grants—has played a key role in commercializing processing and thermodynamic methods such as ProCast, Deform, and Pandat. In the committee's judgment, federal support will continue to play an important role in incubating and transitioning new ICME methods.

Methods

The fundamental technical challenge of ICME is that materials response and behavior involve a multitude of physical phenomena whose accurate capture in models requires spanning many orders of magnitude in length and time. The length scales in materials response range from the nanometers of atoms to the centimeters and meters of manufactured products. Similarly, time scales range from the picoseconds of atomic vibrations to the decades over which a component will be in service. Fundamentally, properties arise from the electronic distributions and bonding at the atomic scale of nanometers, but defects that exist on multiple length scales, from nanometers to centimeters, may in fact dominate properties. It should not be surprising that no single modeling approach can describe this multitude of phenomena or the breadth of scales involved. While many computational materials methods have been developed, each is focused on a specific set of issues and appropriate for a given range of lengths and times.

Consider length scales from 1 angstrom to 100 microns. At the smallest scales, scientists use electronic structure methods to predict bonding, magnetic moments, and transport properties of atoms in different configurations. As the simulation cells get larger and the time scales longer, empirical interatomic potentials are used to approximate these interactions. Optimization and temporal evolution of electronic structure and atomistic methods are achieved using conjugate gradients, molecular dynamics, and Monte Carlo techniques. At still larger scales, the information content of the simulation unit decreases until it becomes more efficient to describe the material in terms of the defect that dominates at that length scale. These units might be defects in the lattice (for example, dislocations), internal interfaces (for example, grain boundaries), or some other internal structure, and the simulations use these defects as the fundamental simulation unit in the calculation.

While true concurrent multiscale materials modeling is the goal of one segment of the materials community, for the foreseeable future most multiscale modeling will be accomplished by coordinating the input and output of stand-alone codes. This information-passing approach has drawbacks associated with extracting information at each scale in an effective way. Moreover, all of these approaches necessarily incorporate simplifying assumptions that lead to errors and uncertainties in derived quantities, which are propagated throughout the multiscale integration. Experimental data play a key role here, defining parameters and information not available from simulations at any scale and calibrating and validating modeling techniques.
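To make the information-passing idea concrete, the following minimal Python sketch chains three stand-in "codes," each representing an independent tool that communicates only through its declared inputs and outputs. The stage models (a solute-dependent mobility, a parabolic grain-growth law, and a Hall-Petch-type strength correlation) and all parameter values are illustrative assumptions, not methods taken from this report.

```python
"""Sketch of "information passing" multiscale linkage: each stage stands in
for an independent code, coupled only through its inputs and outputs."""

def atomistic_stage(composition):
    # Stand-in for an atomistic code: returns a grain-boundary mobility estimate.
    return {"gb_mobility": 1.0e-13 * (1.0 + composition["solute_frac"])}

def microstructure_stage(gb_mobility, anneal_time):
    # Stand-in for a grain-growth model: parabolic law d^2 = d0^2 + k*t.
    k = 2.0 * gb_mobility
    return {"grain_size": (1e-12 + k * anneal_time) ** 0.5}   # meters

def property_stage(grain_size):
    # Stand-in for a Hall-Petch strength correlation: sigma = s0 + k_y/sqrt(d).
    return {"yield_strength": 50e6 + 1.5e5 / grain_size ** 0.5}  # Pa

inputs = {"solute_frac": 0.02}
out1 = atomistic_stage(inputs)
out2 = microstructure_stage(out1["gb_mobility"], anneal_time=3600.0)
out3 = property_stage(out2["grain_size"])
print(out3)
```

In practice each function would wrap a stand-alone executable, and the hand-offs would go through files or a database; that is precisely where the data-format and uncertainty-propagation issues discussed in this chapter arise.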

TABLE 3-1 Class of Computational Materials Model or Method, Required Inputs, Expected Outputs, and Typical Software Used in Materials Science and Engineering

Electronic structure methods (density functional theory, quantum chemistry)
- Inputs: Atomic number, mass, valence electrons, crystal structure and lattice spacing, Wyckoff positions, atomic arrangement
- Outputs: Electronic properties, elastic constants, free energy vs. structure and other parameters, activation energies, reaction pathways, defect energies and interactions
- Software examples: VASP, Wien2K, CASTEP, GAMESS, Gaussian, a=chem., SIESTA, DACAPO

Atomistic simulations (molecular dynamics, Monte Carlo)
- Inputs: Interaction scheme, potentials, methodologies, benchmarks
- Outputs: Thermodynamics, reaction pathways, structures, point defect and dislocation mobility, grain boundary energy and mobility, precipitate dimensions
- Software examples: CERIUS2, LAMMPS, PARADYN, DL-POLY

Dislocation dynamics
- Inputs: Crystal structure and lattice spacing, elastic constants, boundary conditions, mobility laws
- Outputs: Stress-strain behavior, hardening behavior, effect of size scale
- Software examples: PARANOID, ParaDis, Dis-dynamics, Micro-Megas

Thermodynamic methods (CALPHAD)
- Inputs: Free-energy data from electronic structure, calorimetry data, free-energy functions fit to materials databases
- Outputs: Phase predominance diagrams, phase fractions, multicomponent phase diagrams, free energies
- Software examples: Pandat, ThermoCalc, FactSage

Microstructural evolution methods (phase-field, front-tracking methods, Potts models)
- Inputs: Free-energy and kinetic databases (atom mobilities), interface and grain boundary energies, (anisotropic) interface mobilities, elastic constants
- Outputs: Solidification and dendritic structure, microstructure during processing, deployment, and evolution in service
- Software examples: OpenPF, MICRESS, DICTRA, 3DGG, Rex3D

Micromechanical and mesoscale property models (solid mechanics and FEA)
- Inputs: Microstructural characteristics, properties of phases and constituents
- Outputs: Properties of materials—for example, modulus, strength, toughness, strain tolerance, thermal/electrical conductivity, permeability; possibly creep and fatigue behavior
- Software examples: OOF, Voronoi Cell, JMatPro, FRANC-3D, ZenCrack, DARWIN

Microstructural imaging software
- Inputs: Images from optical microscopy, electron microscopes, X-rays, etc.
- Outputs: Image quantification and digital representations
- Software examples: Mimics, IDL, 3D Doctor, Amira

Mesoscale structure models (processing models)
- Inputs: Processing thermal and strain history
- Outputs: Microstructural characteristics (for example, grain size, texture, precipitate dimensions)
- Software examples: PrecipiCalc, JMatPro

Part-level FEA, finite difference, and other continuum models
- Inputs: Part geometry, manufacturing processing parameters, component loads, materials properties
- Outputs: Distribution of temperatures, stresses and deformation, electrical currents, magnetic and optical behavior, etc.
- Software examples: ProCast, MagmaSoft, CAPCAST, DEFORM, LS-Dyna, Abaqus

Code and systems integration
- Inputs: Format of input and output of modules and the logical structure of integration, initial input
- Outputs: Parameters for optimized design, sensitivity to variations in inputs or individual modules
- Software examples: iSIGHT/FIPER, QMD, Phoenix

Statistical tools (neural nets, principal component analysis)
- Inputs: Composition, process conditions, properties
- Outputs: Correlations between inputs and outputs; mechanistic insights
- Software examples: SPLUS, MiniTab, SYSTAT, FIPER, PatternMaster, MATLAB, SAS/STAT

Table 3-1 shows a variety of computational materials methods, some of them standard in ICME and others strictly research tools. The committee notes that the table is not intended to be complete but rather to exemplify the methods available for modeling materials characteristics. The table indicates typical inputs and outputs of the software and gives examples of widely used or recognized codes.

Electronic structure methods employ different approximate solutions to the quantum mechanics of atoms and electrons to explore the effects of bonding, chemistry, local structure, and dynamics on the mechanisms that affect material properties. Typically, tens to hundreds of atoms are included in such a calculation, and the time scales are on the order of nanoseconds. In atomistic simulations, the arrangements and trajectories of atoms and molecules are calculated. Generally based on models describing the interactions among atoms, such simulations are now routinely carried out with millions of atoms. Length scales and time scales are in the nanometer and nanosecond regime; longer lengths and times are possible when a molecular system is coarse-grained from an "all-atom" to a "united-atom" model (that is, interacting clusters of atoms). Dislocation dynamics methods are used to study the evolution of dislocations (curvilinear defects in the lattice) during plastic deformation. The total number of dislocations is typically less than a million, and strain rates are large compared to those measured in standard laboratory tests. Thermodynamic methods range from first-principles predictions of phase diagrams to complex database integration methods that use existing tabulated data to produce phase diagrams and kinetics data. These methods are being developed by the CALculation of PHAse Diagram (CALPHAD) community (see Box 3-1). Microstructural evolution methods predict microstructure stability and evolution based on free-energy functions, elastic parameters, and kinetic databases. Recently, several groups have established protocols to automatically extract thermodynamic and kinetic information from CALPHAD methods as input to such methods.
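As a concrete illustration of the atomistic simulations described above, the short molecular dynamics sketch below integrates Lennard-Jones atoms with the velocity-Verlet scheme in reduced units. It is a toy stand-in for production codes such as LAMMPS; the system size, time step, and lattice setup are generic textbook choices assumed for the example.

```python
"""Minimal molecular dynamics sketch: Lennard-Jones atoms, reduced units."""
import numpy as np

box, dt, steps = 6.0, 0.005, 200
sites = np.array([(i, j, k) for i in range(4) for j in range(4)
                  for k in range(2)], float)
pos = (sites + 0.5) * np.array([box / 4, box / 4, box / 2])  # 32 atoms on a lattice
vel = np.random.default_rng(0).normal(0.0, 0.5, pos.shape)
n = len(pos)

def forces(pos):
    """Pairwise Lennard-Jones forces with minimum-image periodic boundaries."""
    f = np.zeros_like(pos)
    for i in range(n - 1):
        d = pos[i] - pos[i + 1:]
        d -= box * np.round(d / box)                  # minimum-image convention
        r2 = (d ** 2).sum(axis=1)
        inv6 = 1.0 / r2 ** 3
        fmag = (48.0 * inv6 ** 2 - 24.0 * inv6) / r2  # from U = 4(r^-12 - r^-6)
        f[i] += (fmag[:, None] * d).sum(axis=0)
        f[i + 1:] -= fmag[:, None] * d
    return f

f = forces(pos)
for _ in range(steps):                                # velocity-Verlet integration
    vel += 0.5 * dt * f
    pos = (pos + dt * vel) % box
    f = forces(pos)
    vel += 0.5 * dt * f

print("kinetic energy per atom:", round(0.5 * (vel ** 2).sum() / n, 3))
```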

BOX 3-1 CALculation of PHAse Diagrams (CALPHAD) Methodology

The calculation of phase diagrams is a well-developed and widely accepted computational approach for capturing and using materials thermodynamic information. Personal-computer-based commercial software, coupled with commercial and open databases of thermodynamic information (data and models), provides the results of sophisticated and accurate calculations. Now readily available to those with even modest backgrounds in thermodynamics and phase equilibria calculations, these thermodynamic simulations based on critically evaluated data are basic tools in materials and process design.1 However, the development of the sophisticated tools and databases in use today took more than 50 years and was the result of the efforts of countless contributors. Because CALPHAD software is arguably the most important (and perhaps the only) generic tool available for ICME practitioners, a brief examination of its history could reveal how ICME is likely to develop.2

Although the method is ultimately rooted in work begun in the early 1900s, the modern CALPHAD movement began in the late 1950s, when the global scientific community began to envision a phase diagram calculation capability based on extensive databases of thermodynamic properties and empirical data. Over the course of 50 years this vision was a constant goal. While the time required to bring the effort to fruition may seem long, Saunders and Miodownik have suggested that such a lengthy incubation period between vision and fruition reflects the time required for individuals to meet each other and agree to work together and the time for science and technology to dedicate adequate funds. They also suggested that a contributing factor was the difficulty some scientists had in accepting that realizing this vision required a melding of empirical databases and fundamental thermodynamics.

Since the late 1950s, many factors have enabled CALPHAD to develop:

• Visionary leaders who understood the potential of CALPHAD and who worked continuously for decades to make it a reality.
• CALPHAD research groups at universities and government laboratories such as the National Bureau of Standards (now NIST), often led by the aforementioned individuals, that provided continuity and sustained effort and focus.
• A strong community of experts.
• Technical conferences dedicated to CALPHAD that enabled researchers to interact and collaborate.
• A focus on practical problems of interest to industry—for example, steels and nickel-based superalloys.
• Textbooks dedicated to CALPHAD (the first was published in the 1970s).
• Establishment of a journal (in 1977) dedicated to publication of CALPHAD data.
• International agreements and international consortia dedicated to the CALPHAD vision—one such is the Scientific Group Thermodata Europe (SGTE)—that have been in existence since the 1970s.
• Substantial public funding of database development, especially in Europe via the program Cooperation in the Field of Scientific and Technical Research (COST).

1 P.J. Spencer, ed., "Computer simulations from thermodynamic data: Materials production and development," MRS Bulletin 24(4) (1999).
2 For a more comprehensive review of the history of CALPHAD, see N. Saunders and A.P. Miodownik, CALPHAD—Calculation of Phase Diagrams, A Comprehensive Guide, Oxford, England: Elsevier (1998).

• Expert practitioners who develop phase diagram assessments and make them available to others, either freely or via commercial databases linked to commercial CALPHAD software.
• The use of common thermodynamic reference states along with a shared and agreed-on taxonomy.
• The open publication and sharing of common data (at least for unaries and often for binaries and ternaries) that form the building blocks for many of the CALPHAD databases.
• PC-based commercial software and databases that can be operated without extensive expertise.
• Commercial software with programming interfaces that enable users to write their own applications and call up key functions on demand.

Current CALPHAD development efforts include establishing linkages with physics-based tools such as density functional theory for calculating the energetics required to assess phase stability, and developing diffusion databases and models that are in turn linked to microstructural evolution prediction tools. Finally, some developers of CALPHAD tools have begun to venture into property prediction, by either correlations or science-based models, setting the stage for the use of CALPHAD as a basic ICME tool. The enabling factors that led to the CALPHAD capability of today will also be critical enablers for the development of a widespread ICME capability in the future.
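At the computational core of the CALPHAD approach described in Box 3-1 are parameterized free-energy functions from which phase boundaries are computed. The sketch below does this for the simplest textbook case, a symmetric binary regular solution, rather than for any assessed database; the interaction parameter Omega is an arbitrary illustrative value.

```python
"""Sketch of the free-energy construction underlying CALPHAD-type methods:
G_mix(x) = Omega*x*(1-x) + R*T*[x ln x + (1-x) ln(1-x)] for an A-B solution.
For this symmetric model the common tangent is horizontal, so the two-phase
(binodal) compositions satisfy dG/dx = 0."""
import numpy as np
from scipy.optimize import brentq

R = 8.314          # J/(mol K)
Omega = 18_000.0   # J/mol, assumed regular-solution interaction parameter

def dG_dx(x, T):
    # Derivative of the mixing free energy with respect to composition x.
    return Omega * (1 - 2 * x) + R * T * np.log(x / (1 - x))

def binodal(T):
    # Equilibrium composition on the A-rich side of the miscibility gap.
    return brentq(dG_dx, 1e-6, 0.4999, args=(T,))

for T in (600, 800, 1000, 1200):
    if T < Omega / (2 * R):             # gap exists below T_c = Omega / 2R
        xb = binodal(T)
        print(f"T = {T} K: two-phase field from x = {xb:.3f} to {1 - xb:.3f}")
    else:
        print(f"T = {T} K: single phase (above the critical temperature)")
```

Commercial tools such as ThermoCalc or Pandat perform the analogous construction with multicomponent, multisublattice free-energy models fitted to critically assessed data.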

Micromechanical and mesoscale property models include solid mechanics and FEA methods that use experimentally derived models of materials behavior to explore microstructural influences on properties. The models may incorporate details of the microstructure (resolving scales at the relevant level), and results may be at full system scale. Mesoscale structure models include models for solidification and solid-state deformation that use combinations of the previous methods to predict favorable processing conditions for specific microstructural characteristics. Methods for code and systems integration offer ways to connect many types of models and simulations and to apply systems engineering strategies. Statistical tools are often used to gain new understanding through correlations in large data sets. Other important ICME tools include databases, quantifiable knowledge rules, error propagation models, and cost and performance models. To be effective in an ICME environment, all of these computational methods must be integrated with other tools. Developing such compatibilities should be a priority for model developers and funding agencies. The development of standards and common nomenclatures for data exchange and model compatibility is an important task and is discussed in more detail in the sections "Requirements for ICME Databases" and "Commercial Integration Tools."

It would be beyond the scope of this report to give details of the advances that are needed for all the methods employed to model materials behavior. Table 3-1 lists the methods along with their inputs and outputs. While each method is by itself a critical component of an ICME process, linking the various methods remains a great challenge, not only from a scientific perspective but also because the codes for these models may exist on different computer platforms and be written in different languages. While this is still an unsolved problem, projects like those sponsored by Eclipse are focused on the creation of open development platforms to make such computational linkages easier.3

Each class of methods in Table 3-1 has its own needs and challenges, among them the following:

• Extension of atomistic simulations to longer times through the use of, for example, accelerated dynamics methods, and to broader classes of materials systems through the development and validation of force fields for heterogeneous/mixed materials, especially at the interfaces between material types (for example, metal-ceramic);
• Development of spatially hierarchical microstructural evolution methods for concurrently modeling microstructural features across length scales;
• Advances in crystal plasticity finite element methods to include the effects of local heterogeneities in the microstructure;
• Methods for modeling the spatial and temporal scales between dislocation dynamics and the continuum level (for instance, finite element methods);
• Science-based models for predicting the influence of microstructure on a wide variety of properties;
• Development of improved microstructural evolution models for polymers, polymeric composites, and elastomers;
• Advances in electronic structure calculations for modeling larger systems (for example, development of spatially hierarchical methods employing a flexible basis, such as wavelets) and for more accurately accounting for electron correlation, which will be critically important for materials at the nanoscale; and
• Development of diffusion data and kinetics theory to explain a wide variety of materials phenomena in metals, polymers, and ceramics.

This list, while far from complete, indicates the diversity of challenges in computational materials. For ICME, the key is to influence the directions these developments take, the goals being greater integration between models for different materials phenomena and across scales and better integration of data within the models and simulations.

3 For more information, see http://www.eclipse.org. Accessed February 2008.

Advances in Computing Capabilities

ICME is possible today in part because of the exponential growth in computer storage and processing capability achieved over the last 40 years. Current desktop processors yield performance reserved for the supercomputers of a decade ago, multigigabyte memories have become standard, and disks can store terabytes of information, all at an affordable cost. Thus the computational capabilities required to model materials behavior for ICME are becoming increasingly available to the practicing materials engineer. The prognosis is for continued improvements in hardware capabilities.

Recent advances in multiprocessor computing have had a dramatic effect on the utility of a variety of methods, with a natural evolution of computational methods from serial to scalable parallel processing. The stages to full parallel processing include using parallel processing compilers, "parallelizing" computation-intensive portions of the code, "parallelizing" the original serial implementation, and redesigning the serial implementation to take full advantage of the available parallel architectures. Many commercial applications (for example, finite element methods) are available for parallel computing and are in common use in industrial settings. With the focus on multicore processors from the computer vendors, computing methods that take advantage of parallelism, and that will require new and different programming paradigms, will become increasingly common.

Tools for ICME will need a number of features to take full advantage of the power of modern and evolving computing platforms. While the technical details are beyond the scope of this report, successful methods will generally include the following (a minimal illustration of the first item follows the list):

• Scalable parallelism. As the cost of processors continues to fall, the ability to scale to hundreds or thousands of processors will be paramount.
• I/O and file systems. Many classes of simulation tools are constrained by communication bandwidth between processors and to storage servers. For example, just reading the results of the Los Alamos National Laboratory simulations shown in Figure 3-1 required 100 servers. Higher computing performance will require new algorithms that scale without such heavy I/O burdens. New distributed file systems can help significantly with the storage and retrieval of large amounts of data, and materials simulations and visualizations will need to work well with these file systems.
• Advances in graphics hardware. New graphics processing units (GPUs) can offer very large sustained processing speeds (up to 50 Gflops as this report is written), considerably faster than general-purpose central processing units. The general materials application development community has done little to take advantage of this technology; however, general-purpose GPU (GPGPU) computing has been widely used in some massively parallel astrophysics applications. The next generation of GPGPU hardware appears to hold great promise for high-performance computing. New materials and biology applications might benefit from this technology.

FIGURE 3-1 Left: Shock turbulence model with 589 million elements, rendered at an effective rate of 3.2 billion polygons per second on a 128-pipe Army Research Laboratory visualization system. SOURCE: Lawrence Livermore National Laboratory. Right: Asteroid impact study—240 million cells, 9.7 TB, 50 servers, multiple angles. Image courtesy of CEI.

• Intelligent data reduction. It is relatively easy to use 10,000 processors for a large, highly scalable computational fluid dynamics (CFD) application or to start thousands of design variations. It is more difficult to ensure the timely delivery of input data or the creation of large output files for large simulations that are run in parallel. Using such large amounts of data will necessitate the intelligent reduction of information required for the next-higher level of integration of materials or systems models.
• Fault detection and recovery. In a cluster with thousands of processors, the mean time to failure of a single processor is less than 1 day. Simulation tools will thus need robust fault detection and recovery capabilities. Low-level fault detection is just entering compilers and parallel computing middleware such as the message-passing interface (MPI), but no broadly available materials simulation tools currently take advantage of these capabilities or provide their own fault detection to improve their reliability.
• Out-of-order execution. Modern processors with large numbers of cores and threads per core can run much faster when programmed such that most instructions can be scheduled either simultaneously or out of order. No broadly available materials simulation tools are written to take advantage of such capabilities.
• Petaflop computing. Next-generation computers, capable of petaflop performance, will probably employ hundreds of thousands of processors. New programming paradigms will be needed to achieve scalability on these massive machines.
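As a toy illustration of the scalable-parallelism item above, the Python sketch below distributes independent per-cell calculations across local processes. It only stands in for the real thing (production ICME codes distribute coupled domains with MPI across many nodes); the "relax_cell" workload is a made-up compute kernel assumed for the example.

```python
"""Sketch of the coarse-grained parallelism common in ICME workloads: many
independent simulation cells evaluated concurrently with local processes."""
import multiprocessing as mp
import numpy as np

def relax_cell(seed):
    # Stand-in for one expensive per-subdomain calculation (for example,
    # one microstructure cell): iterate a damped update to a fixed point.
    rng = np.random.default_rng(seed)
    x = rng.random(1000)
    for _ in range(500):
        x = 0.5 * (x + np.sqrt(np.abs(x)))   # arbitrary compute-heavy kernel
    return x.mean()

if __name__ == "__main__":
    with mp.Pool(processes=4) as pool:       # scale processes with cores
        results = pool.map(relax_cell, range(16))
    print("per-cell results:", np.round(results, 4))
```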

Computational capabilities will continue to increase, enabling higher fidelity and complexity in ICME applications. By taking advantage of new architectures and software enhancements, application developers can enhance the ability of their computational tools to meet the challenges listed in the preceding section. The committee notes, however, that few scientists and engineers in the materials community have the training to fully engage in the development of modern computational methods, so collaboration with computer scientists will be essential. Accordingly, institutions that engage in materials education, development, and manufacturing will need to undergo significant cultural change; this is discussed in Chapter 4.

Uncertainty Quantification

ICME requires the development of predictive models and simulations, the quality (accuracy and precision) of whose final results will have to be known by the materials engineer.4 The ability of the ICME community to predict the quality of a coupled set of calculations is limited, because there can be considerable uncertainty at almost all levels of the ICME process. All materials models have uncertainties associated with the natural variability of materials properties, which arises from the stochastic nature of materials structures. The problem is exacerbated by the critical dependency of many materials properties on the distribution of defects (that is, on microstructural heterogeneities), which are, in turn, influenced by processing variables. Thus it is very important to carefully calibrate and validate modeling tools by comparing their results to the results of well-designed experiments on pedigreed materials. Beyond the uncertainties in the materials models, all simulation methods have their own levels of uncertainty, from the stochastic uncertainty of a molecular dynamics simulation to the numerical uncertainty of a large-scale finite element calculation. A key need for all ICME applications is the quantification of uncertainties at each stage of a suite of calculations.

4 Since ICME is best practiced with complementary experimental and theoretical approaches, the validation of computational methods to fill gaps in theoretical understanding is critical to building a robust ICME approach. Validation is discussed in the section "Role of Experimentation in Computational Materials Science and ICME."
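A common first step toward such stage-by-stage quantification is forward Monte Carlo propagation: sample the uncertain inputs, push each sample through the model chain, and report the induced spread in the output. The sketch below does this for a toy process-property chain like the one used earlier in this chapter; the distributions, the Arrhenius activation term, and all model constants are illustrative assumptions only.

```python
"""Sketch of forward uncertainty propagation through a chained calculation."""
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Uncertain inputs: an interface mobility from atomistics (lognormal spread)
# and a measured anneal temperature (normal measurement error).
mobility = rng.lognormal(mean=np.log(1e-13), sigma=0.3, size=n)
temp = rng.normal(900.0, 5.0, size=n)                 # K

# Chained "models": Arrhenius-corrected growth rate, then a strength correlation.
rate = mobility * np.exp(-5_000.0 / temp)             # assumed activation term
grain = np.sqrt(1e-12 + rate * 3600.0)                # parabolic growth, 1 h
strength = 50e6 + 1.5e5 / np.sqrt(grain)              # Hall-Petch-type relation

lo, hi = np.percentile(strength, [5, 95])
print(f"yield strength: mean {strength.mean() / 1e6:.1f} MPa, "
      f"90% interval [{lo / 1e6:.1f}, {hi / 1e6:.1f}] MPa")
```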

[…]

…researcher to find the minimum number of components that best describe a data set, enabling much easier classification and feature extraction. Other data-mining tools include partial least squares regression, cluster identification, association analysis, and anomaly detection. These approaches are common in some fields but have yet to be widely applied to materials data.

As an example, suppose the goal is to develop a new alloy system for a specific application. The expense associated with a complete exploration of a multicomponent design space is immense and generally not affordable. Informatics provides a way to identify trends in the data that might normally be overlooked. Using data-mining methods, common characteristics can be isolated and employed to identify promising classes of materials. The power of informatics is that the data can come in many forms—for example, from experiment or from modeling—and can have a wide range of uncertainty.

To be more specific, consider the hybrid data-mining and simulation technique of Fischer et al. for determining lowest-energy intermetallic structures and constructing their phase diagrams.24 A database holds the structures and free energies of a large number of binary and ternary intermetallic systems. When the user requests the phase diagram for a binary system not in the database, the software first guesses which structures the alloy could form by applying statistical methods to the database, then tests and refines those guesses by a series of ab initio calculations. Fischer et al. estimate that an unknown binary phase diagram, including all intermetallic crystal structures and lattice spacings, could be generated in this way using just 20 ab initio calculations. Moving up in scale, it is not hard to imagine mesoscale structure formation and evolution models (such as the phase-field method) employing a similar approach to automatically access thermodynamic data and the results of ab initio calculations.25,26 Indeed, this general approach may be widely applicable in linking models across scales.

24 C. Fischer, K. Tibbets, D. Morgan, and G. Ceder, "Predicting crystal structure by merging data mining with quantum mechanics," Nature Materials 5 (2006): 641-646.
25 V. Vaithyanathan, C. Wolverton, and L.Q. Chen, "Multiscale modeling of θ′ precipitation in Al-Cu binary alloys," Acta Materialia 52 (2004): 2973-2987.
26 V. Vaithyanathan, C. Wolverton, and L.Q. Chen, "Multiscale modeling of precipitate microstructure evolution," Physical Review Letters 88(12) (2002).
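The sketch below illustrates that kind of data mining with principal component analysis, the standard tool for finding a minimal set of components that describe a data set (it is listed among the statistical tools in Table 3-1). The alloy-like records are synthetic, generated from two hidden factors, so PCA should report that two components carry nearly all the variance; no real materials database is used.

```python
"""Sketch of principal component analysis on a table of alloy records."""
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 200
# Synthetic records: two latent factors generate five correlated descriptors.
hardening = rng.normal(size=n)           # latent "precipitation" factor
refinement = rng.normal(size=n)          # latent "grain refinement" factor
X = np.column_stack([
    2.0 + 0.5 * hardening,                        # solute content, wt%
    800 + 40 * hardening + 10 * refinement,       # aging temperature, K
    50 + 20 * refinement,                         # rolling reduction, %
    10 - 3 * refinement + rng.normal(0, .3, n),   # grain size, um
    300 + 60 * hardening + 25 * refinement,       # yield strength, MPa
])

Z = StandardScaler().fit_transform(X)    # PCA needs comparable scales
pca = PCA().fit(Z)
print("variance explained:", np.round(pca.explained_variance_ratio_, 3))
# Two components should capture nearly all variance, flagging two mechanisms.
```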

For the foreseeable future, the development of ICME computational models will require specialized capabilities and a labor-intensive approach requiring an "expert." A good example of this is CALPHAD, which needs experts, or those experienced in the "art," to develop data assessments and assemble databases. It is an iterative process, and many of the more commonly used databases for alloys (such as Ni superalloys and steels) have been in development for up to 20 years. Within the ICME framework, work on better, more efficient ways to manage databases and construct them in a more semiautomated way would be an important step forward.

Materials informatics is in its earliest stages of development. Much work remains before it is developed sufficiently to be widely applicable in materials engineering; this will require creating a new and robust set of tools easily available to the materials engineer. It holds great promise, however, and could be a critical part of an ICME process.

INTEGRATION TOOLS: THE TECHNOLOGICAL "I" IN ICME

Technical tools for integrating materials knowledge are of obvious importance for ICME. Integration tools are the glue that binds software applications and databases into an integrated, cohesive, systemwide design tool that can be used by many contributors to the design effort. For ICME, these contributors might include materials researchers, materials design engineers, product designers, engineering design analysts, manufacturing analysts, purchasing agents, suppliers, and, possibly, quality control and customer support personnel. Integration tools are required for three tasks:

• Linking information from different sources and different knowledge domains. This information could be in the form of computational models or empirical relationships derived from experimental data.
• Networking and collaborative development. This would be a helpful technical tool for solving some of the cultural and organizational problems facing ICME, which are described in Chapter 4.
• Optimization. This might be optimization of a product, a manufacturing process, or a material. It would allow materials engineers to fully engage in the computational engineering IPD process described in Box 2-1.

Integration is viewed differently in each of the communities expected to contribute to the growth of ICME. Graphical representations of the viewpoints of three of those communities are shown in Figures 3-6 to 3-8. Figure 3-6 shows a typical multiscale figure, with the timescales and length scales important for the description of various systems. Much of the work that could be deemed computational materials science entails performing calculations in each of these regimes and then, by passing information from one regime-specific tool to another, linking the phenomena across the scales. While this concept is often useful for defining a modeling strategy, its importance is sometimes overemphasized. Developing and linking models across length scales is not required for a workable ICME tool set. Rather, ICME practitioners develop models as an engineering activity that requires an initial expert assessment to properly match the problem being attacked with the length scales that must be considered.

FIGURE 3-6 Multiscale modeling, a construct used to illustrate the interdependence and connections between mechanisms acting at different length scales and timescales. SOURCE: Michael Doyle, Accelrys, "Integration of computational materials science and engineering methods," presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.

Figure 3-7 (and to a large degree Figure 3-5) shows the integration problem from the viewpoint of a metallurgist. Analogous viewpoints exist for ceramics, polymers, and other materials systems. Knowledge from disparate sources and domains (for example, thermodynamic models, models for simulating manufacturing processes, microstructural evolution models, and property models) is required to fully assess the influence of the manufacturing process on the properties of the materials that make up a manufactured product. Simulations of manufacturing processes must be integrated with computational models for phase equilibria, microstructural evolution, and property prediction. An important notion here is that the properties of an engineering product "compete" and thus must be balanced in its design. The complexity of this optimization problem dictates that a computational approach is required. Missing from the traditional metallurgist's perspective are the direct outputs to product development performance analysis and optimization.

To be effective, ICME must address issues encountered in both of these integration domains and many more. In doing so, it will integrate these disparate fields into a holistic system allowing optimization and collaboration. Integration tools are thus the backbone of ICME. Depending on specific motivations, incentives, and requirements, they may be used in a proprietary setting (such as that described in the Ford virtual aluminum castings example in Chapter 2), in a collaborative but limited partnership setting (such as that described for the P&W AIM program, also in Chapter 2), or in an open, collaborative setting.

FIGURE 3-7 A metallurgist's view of the integration problem represented by ICME for a nickel-based superalloy. SOURCE: Adapted from Leo Christodolou, DARPA, "Accelerated insertion of materials," presentation to the committee on November 20, 2006. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.

Commercial Integration Tools

Commercial integration software tools are available that are designed to link a variety of disparate software applications into an integrated package, which can then be used to optimize some underlying process. As a result of these efforts, de facto standards are emerging for "wrapping" models, running parallel parametric simulations, applying sensitivity analysis, and reducing the complexity (order) of systems. The companies involved market and apply systems integration tools that solve specific engineering problems, tools for interoperability across organizations, and, in some cases, tools for education.
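To show what "wrapping" a model means in practice, the sketch below hides two toy stand-in models behind one uniform interface so that a generic integration layer can chain them by matching declared inputs and outputs. This illustrates the idea only; it is not the interface of any commercial tool or published wrapping standard, and the model equations are placeholders.

```python
"""Sketch of model "wrapping": heterogeneous tools behind one uniform contract."""
from abc import ABC, abstractmethod

class WrappedModel(ABC):
    """Uniform contract: declared inputs/outputs plus a run() method."""
    inputs: tuple
    outputs: tuple

    @abstractmethod
    def run(self, **kwargs) -> dict: ...

class HeatTreatModel(WrappedModel):
    inputs, outputs = ("temp_K", "time_s"), ("grain_size_um",)
    def run(self, temp_K, time_s):
        # Stand-in for an external process-simulation executable.
        return {"grain_size_um": 5.0 + 1e-4 * time_s * (temp_K / 900.0)}

class StrengthModel(WrappedModel):
    inputs, outputs = ("grain_size_um",), ("yield_MPa",)
    def run(self, grain_size_um):
        # Stand-in for a property-prediction code.
        return {"yield_MPa": 50.0 + 150.0 / grain_size_um ** 0.5}

def execute_chain(models, state):
    # The integration layer: feed each wrapped model what it declares it needs.
    for m in models:
        state.update(m.run(**{k: state[k] for k in m.inputs}))
    return state

print(execute_chain([HeatTreatModel(), StrengthModel()],
                    {"temp_K": 900.0, "time_s": 3600.0}))
```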

FIGURE 3-8 Models and experiments flow from the AIM integration architecture. SOURCE: DARPA and AFRL Accelerated Insertion of Materials program.

Simulation data managers (SDMs) such as iSIGHT/FIPER and CenterLink are Web-based tools that do the following:27,28

• Provide standards-based integration environments to link applications;
• Send data securely across network connections;
• Run application codes on computing resources that might be local or remote and that consist of heterogeneous hardware platforms;
• Use system resource or job execution queue managers such as the load sharing facility (LSF);

• Save design parameters and results in a database (illustrated in the sketch below);
• Provide database mining capabilities;
• Enable three-dimensional surface design visualization;
• Provide response surface approximation for experimental data; and
• Measure and track uncertainty and contributions for given design parameters.

These and other integration tools are widely used for IPD, but they have almost no presence in the materials engineering community. That said, they have been successfully used in pilot ICME demonstration projects.29,30,31,32 In the Defense Advanced Research Projects Agency's (DARPA's) Accelerated Insertion of Materials (AIM) program, a commercial SDM called iSIGHT, from the company Engineous Software, was used to link computer-aided design (CAD), forging process modeling, models for heat treatment, microstructural evolution models, property predictions, and structural analysis applications into a seamless work flow called a designer knowledge base. This designer knowledge base, depicted in Figure 3-8, effectively integrated quantitative information from a wide variety of sources and models. Design data and experimental results were stored in a common database. From this demonstration, the committee concludes that state-of-the-art commercial integration tools are available for ICME and ready for widespread application, identifying and solving the unique problems that will arise as the discipline matures.

For integration tools, common interface standards are highly desirable so that application engineers do not have to rewrap applications many times for different uses. The development of standards and nomenclatures, or taxonomies, should be done in conjunction with model and software developers and vendors and not in isolation. NIST has developed a wrapping standard that is available in the commercial SDM application FIPER.33

27 Brett Malone, Phoenix, "Phoenix integration," presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
28 Alex Van der Velden, Engineous, "Use of process integration and design optimization tools for product design incorporating materials as a design variable," presentation to the committee on March 14, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
29 Leo Christodolou, DARPA, "Accelerated insertion of materials," presentation to the committee on November 20, 2006. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
30 Daniel G. Backman, Daniel Y. Wei, Deborah D. Whitis, Matthew B. Buczek, Peter M. Finnigan, and Dongming Gao, "ICME at GE: Accelerating the insertion of new materials and processes," JOM, November 2006: 36-41.
31 Dennis Dimiduk, United States Air Force, "Towards full-life systems engineering of structural metal," presentation to the committee on May 30, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
32 Alex Van der Velden, Engineous, "Use of process integration and design optimization tools for product design incorporating materials as a design variable," presentation to the committee on March 14, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
33 Ibid.
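The database bookkeeping in the SDM feature list above can be sketched in a few lines: run a set of design variations, store the parameters and results, and query them back. The toy model and schema below are illustrative assumptions, not the data model of iSIGHT, FIPER, or any other SDM.

```python
"""Sketch of the record-keeping role an SDM plays for parametric studies."""
import itertools
import sqlite3

def simulate(temp_K, time_s):
    # Stand-in for a wrapped process-to-property model chain.
    grain = 5.0 + 1e-4 * time_s * (temp_K / 900.0)
    return 50.0 + 150.0 / grain ** 0.5

con = sqlite3.connect(":memory:")        # a shared file or server in real use
con.execute("CREATE TABLE runs (temp_K REAL, time_s REAL, yield_MPa REAL)")

for temp, time in itertools.product([850, 900, 950], [1800, 3600, 7200]):
    con.execute("INSERT INTO runs VALUES (?, ?, ?)",
                (temp, time, simulate(temp, time)))

# Simple "mining": pull the best design point back out of the database.
# (SQLite pairs the bare columns with the row that achieves the MAX.)
best = con.execute(
    "SELECT temp_K, time_s, MAX(yield_MPa) FROM runs").fetchone()
print("best run:", best)
```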

Other interface formats are emerging from UGS, MSC, and Dassault.34 Additionally, the International Organization for Standardization (ISO) has promoted the Standard for the Exchange of Product Model Data (STEP), ISO 10303, as a comprehensive way to represent and exchange digital product information. However, application developers are often reluctant to create open interfaces to their applications: open access is often viewed from a software developer's point of view as a risk to its intellectual property. To the extent that code vendors and authors are willing to cooperate in the creation of open standard interfaces to their applications and data, the general community would benefit; encouraging them to do so would require incentives from major government agencies and industrial consortia.

SDM environments provide the ability to securely transport data to local or remote computing resources and to execute a wide variety of applications on those resources. Once that has been done, the SDM takes computed results and stores them in a simulation database. SDM environments also enable collaboration among multiple research groups. Execution can be monitored by these groups, with all parties having limited or full access to the data or control of application execution. The groups can be inside or outside firewalls.

Optimization is an important objective for ICME. Once applications are linked into a common framework, the next logical step is to perform multidisciplinary, systemwide design and optimization. Design trade-offs can be made, and the resulting behavior can be propagated throughout the entire design work flow to obtain globally optimal solutions. Although materials computations are not currently integrated into multidisciplinary optimization (MDO), the ICME-enabled desired future state would allow material and manufacturing process optimization trade-offs that could also be propagated throughout the entire design work flow. A single analysis of all the linked application modules could be executed, or design studies could be conducted to assess trade-offs. An ICME-enabled MDO system could also be used to bring the systemwide design to an optimal global design point, or it could be run simply to assess reliability.

34 Nuno Rebelo, Simulia, "CAE: Past, present, and future," presentation to the committee on May 30, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
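The following sketch shows the shape of such an ICME-enabled MDO loop: a materials variable (aging time) is optimized alongside a structural variable (panel thickness) against a system objective, with strength supplied by a toy process-property chain. Every model form, bound, and constant here is an assumed placeholder, not a validated design case.

```python
"""Sketch of a materials variable inside a system-level optimization loop."""
import numpy as np
from scipy.optimize import minimize

def yield_MPa(time_s):
    # Toy process -> structure -> property chain: longer aging coarsens grains.
    grain = 5.0 + 1e-4 * time_s
    return 50.0 + 150.0 / np.sqrt(grain)

def objective(x):
    thickness_mm, _time_s = x
    return 2.7 * 0.1 * thickness_mm          # toy panel mass model, kg

def strength_margin(x):
    thickness_mm, time_s = x
    load_MPa = 400.0 / thickness_mm          # toy stress from a fixed load
    return yield_MPa(time_s) - load_MPa      # feasible when >= 0

res = minimize(objective, x0=[10.0, 3600.0],
               bounds=[(2.0, 20.0), (600.0, 36_000.0)],
               constraints=[{"type": "ineq", "fun": strength_margin}])
print("optimal thickness (mm), aging time (s):", np.round(res.x, 2))
print("strength margin (MPa):", round(strength_margin(res.x), 2))
```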

ICME Cyberinfrastructure

For many communities the World Wide Web serves as a platform for sharing information in the form of models and data. The term "cyberinfrastructure" refers to a relatively new infrastructure that, according to an NSF report,35 is "based upon distributed computer, information and communication technology." Such an infrastructure is as essential to the knowledge industry as the physical infrastructure of roads, bridges, and the like is to the industrial economy. In 2003, the NSF Blue-Ribbon Advisory Panel on Cyberinfrastructure envisioned "the creation of thousands of overlapping field and project collaboratories or grid communities, customized at the application layer but extensively sharing a common Cyberinfrastructure."36 Important elements of the cyberinfrastructure described in that report included grids of computational facilities; comprehensive libraries of digital objects, including programs and literature; multidisciplinary, well-curated, federated collections of scientific data, online instruments, and sensor arrays; convenient software toolkits for resource discovery, modeling, and visualization; and the ability to collaborate with physically distributed teams of people using these capabilities. The report identified this as an important opportunity for NSF and stressed the importance of acting quickly and the risks of failing to do so. The risks include a lack of coordination, which could lead to the adoption of irreconcilable formats for information; failure to archive and curate data that have been collected at great expense and may be easily lost; barriers that can inadvertently arise between disciplines if isolated and incompatible tools and structures are used; waste of time and talent in developing tools that may have shortened life spans owing to the above-mentioned lack of coordination and failure to incorporate a consistent computer science perspective; and, finally, insufficient attention to resolving cultural barriers to adopting new tools, which may also result in failure.

The committee proposes the following definition of the term "ICME cyberinfrastructure": the Internet-based collaborative materials science and engineering research and development environments that support advanced data acquisition, data and model storage, data and model management, data and model mining, data and model visualization, and the other computing and information processing services required to develop an integrated computational materials engineering capability.

A key element of the ICME cyberinfrastructure will be individual collaborative ICME Web sites and information repositories that are established for specific purposes by a variety of organizations but linked in some fashion to a broader network that represents the ICME cyberinfrastructure. The DARPA AIM Designer Knowledge Base, using iSIGHT, the Internet, and a geographically dispersed team, represents the only known example of an ICME-Web collaboration. Although collaborative Web sites in materials science and engineering are relatively rare, there are some. One example is nanoHub, the Web-based resource for research, education, and collaboration in nanotechnology developed at Purdue University and funded by the NSF Network for Computational Nanotechnology.37

35 For more information, see the Report of the National Science Foundation Blue-Ribbon Advisory Panel on Cyberinfrastructure. Available at http://www.nsf.gov/od/oci/reports/atkins.pdf. Accessed February 2008.
36 Ibid., p. 7.

It is reportedly used by thousands of researchers from over 180 countries. Important elements of collaborative sites are security, networking capability, and, in some cases, grid computing. Since the technology surrounding collaborative Web sites and informatics is rapidly evolving and quite new in materials science and engineering, it should be recognized that there may be some redundancies and that some ICME Web sites and informatics efforts will probably fail. If ICME develops soon and with substantial coordination, these redundancies and failed efforts will be minimized.

The goal of a balanced, well-designed ICME cyberinfrastructure is to give scientists and engineers the means to do a number of things:

• Link application codes—for example, UniGraphics, PrecipiCalc, and ANSYS;
• Develop models that accurately predict multiscale material behaviors;
• Store and retrieve analytical and experimental data in common databases;
• Provide a repository for material models;
• Execute computational code wherever computational resources are available;
• Visualize large-scale data;
• Enable local or geographically dispersed collaborative research; and
• Measure the uncertainty in a given design and the contributions of individual design parameters or sources.

The desired future state is one in which a government-sponsored cyberinfrastructure composed of a variety of special-purpose Web sites is widely used for collaboration between researchers here and abroad. It will be routinely used to develop materials models for computer-aided engineering (CAE) analysis of new products, linked with manufacturing simulations. The development and maintenance of advanced materials models that are sensitive to manufacturing history will, for the foreseeable future, be accomplished by specialists from industry, small business, or academia. The materials models will be used to optimize product designs and manufacturing processes and to develop new materials. Because these are collaborative tools, access to and control of vital data are crucial to all members of a research consortium and should not be hindered by security considerations. Minimizing redundant activities is a key side benefit of coordinated development programs such as those devoted to solving engineering challenge problems like the ones described in Chapter 2, a topic that will be expanded on in Chapter 4.

37 For more information, see http://www.nanohub.org. Accessed March 2008.

Whether the ICME cyberinfrastructure is being put to use in the context of an effort to solve a foundational engineering problem or is part of a self-assembled activity conducted by a professional society, the extent to which it can be made openly accessible to researchers both at home and abroad will greatly influence the cost and rate of development of an ICME capability.

ICME integration tools must be made compatible with the underlying hardware and the software environment used in current design processes. Elements of computational compatibility include the following:

• Portability across heterogeneous hardware and operating systems. Members of the IPD team (IPDT) may be located at different companies that have different computing environments.
• Interoperability with other tools and integrating software.
• Standard data and I/O formats to permit data propagation among codes.
• Efficient operation, so that tools may be used within a design optimization loop.
• Good code design practices, so that codes may be updated as models improve or computing environments change.

Standard formats for each of these data inputs and outputs are required to avoid the proliferation of formats seen in property data. As with properties, each format needs to specify uncertainty, trust, and generation method, and formats will need to be able to evolve to meet changing needs and modeling capabilities (a minimal illustration appears below).

Security is a vital part of the ICME collaborative integration process. Rarely can a materials designer or a developer of a material's constitutive relationship use a single code to take a design from microstructure predictions to material properties and the analysis of the final product design. This systemwide analysis requires running multiple applications that might reside on different systems run by different, possibly geographically remote, groups. The ability to remotely execute tools and move sensitive data in a secure manner is critical to the success of the design community. Secure access to model and data repositories is also essential. Without proper security, corporate and government security policies will impede the development of systemwide design environments.
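As a minimal illustration of a self-describing exchange record that carries uncertainty, trust, and generation method alongside the value, consider the JSON sketch below. The field names form a hypothetical schema invented for this example, not an existing standard.

```python
"""Sketch of a self-describing materials-data exchange record."""
import json

record = {
    "material": "Ni-base superalloy, heat A1",       # illustrative pedigree
    "property": "yield_strength",
    "value": 1050.0,
    "units": "MPa",
    "uncertainty": {"type": "std_dev", "value": 35.0},
    "trust": "validated",                             # e.g., raw | reviewed | validated
    "generation_method": {
        "kind": "experiment",                         # or "simulation"
        "reference": "tensile test, ASTM E8, 293 K",
    },
    "schema_version": "0.1",                          # formats must evolve
}

text = json.dumps(record, indent=2)                   # serialize for exchange
assert json.loads(text)["uncertainty"]["value"] == 35.0
print(text)
```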

SUMMARY

Although existing computational materials science capabilities are impressive, they have not had a significant impact on materials engineering. Moreover, computational materials science lacks the integration framework that would make it widely usable in materials engineering; establishment of such a framework would transform the materials field. In selected instances, the existing tools have been integrated and applied in industrial settings, enabling the first demonstrations of the capabilities of ICME. Physically based models and simulation tools have progressed to the point where ICME is now feasible for certain applications, though much development and validation remain to be done if ICME is to be more broadly adopted. The widespread adoption of ICME approaches will require significant development of models, integration tools, and new experimental methods, as well as major efforts to calibrate and validate models for specific materials systems.

The continued evolution and maturation of computational materials science tools will increase the ease and efficiency with which ICME tools can be implemented. To be effective in an ICME environment, all of these tools must be developed in a manner that allows their integration with other tools; this should be a priority for model developers and funding agencies. Modeling approaches that embed uncertainty are also important for advancing ICME. The advantages of improvements in computational capabilities such as parallel processing should be exploited by future CMS and ICME developers.

Although ICME tools will be used in a computational engineering environment, experimental studies and data are also critical, both for the development of empirical models that can be used where there are gaps in theoretical understanding and for calibrating and validating ICME models. Several new experimental methods are under development whose maturation will do much to accelerate the widespread development of ICME. These include rapid characterization methods, miniature sampling techniques, and three-dimensional materials characterization techniques. Validation experiments should be a key element of any approach to solving the engineering challenge problems that will be discussed in Chapter 4.

The creation and maintenance of dynamic, open-access repositories for data, databases, and materials taxonomies are essential. These databases can also play a role in linking models at different spatial and temporal scales. Open-access databases will reduce redundant research, improve efficiency, and lower the costs of developing ICME tools.

The integration tools that are now available provide working solutions for ICME, but significant infrastructural development will be required to realize the benefits of integration. One forerunner of an ICME capability will be the establishment of curated ICME Web sites that can serve as repositories for data, databases, and models and as venues for collaboration, model development, and integration. Significant government investments, similar to those awarded by NIH to the genomics research community, will be required to create and curate the cyberinfrastructure necessary to support ICME. The extent to which this ICME cyberinfrastructure can be made open and accessible will greatly speed the development of an ICME capability and lower its cost.