3

Technological Barriers: Computational, Experimental, and Integration Needs for ICME

Chapter 2 provided case studies that show the significant economic and competitive benefits that U.S. original equipment manufacturers (OEMs) and other manufacturers have achieved through the use of ICME. Those studies illustrate the integration of materials knowledge into component manufacturing, optimization, and prognosis. However, there remain significant technical barriers to the widespread adoption of ICME capabilities. In this chapter the committee discusses many of those challenges, focusing not only on modeling tools but also on the materials databases and experimental tools needed to make ICME a reality for a broad spectrum of materials applications. Finally, ways to integrate the various tools and data into a seamless ICME package are addressed.

Current Computational Materials Science Tools

Today's materials scientists have increasingly powerful computational tools at their disposal. A recent DOE study demonstrates the compelling nature of the opportunities in computational materials science (CMS).1 A recent NSF report focuses on the cyberinfrastructure needed for materials science.2

1 Department of Energy (DOE), Opportunities for Discovery: Theory and Computation in Basic Energy Sciences (2005). Available at http://www.sc.doe.gov/bes/reports/files/OD_rpt.pdf. Accessed February 2008.
2 National Science Foundation (NSF), Materials Research Cyberscience Enabled by Cyberinfrastructure (2004). Available at http://www.nsf.gov/mps/dmr/csci.pdf. Accessed February 2008.

The power of
twenty-first century computing is making it possible to predict a range of structural features and properties from fundamental principles. These tools are diverse and range from the atomic level to the continuum level and from thermodynamic models to science-based property models. Current computational materials methods range from the specialized materials modeling methods that are used in fundamental research to the full-scale materials processing tools at manufacturing facilities. Researchers in materials science, mechanics, physics, and chemistry explore materials processing-structure-property relationships as a natural part of the research process. The results from these explorations are often incorporated into sophisticated modeling methods focused on a narrow part of overall materials behavior. While these isolated CMS methods do not necessarily contribute to the ICME infrastructure, they represent a vast supermarket of method development that can be drawn on by yet-to-be-developed integration efforts and infrastructures.

The wide range of CMS methods available today is both a blessing and a curse to materials and engineering design teams. It is difficult for scientists and engineers to judge the efficacy of new or even well-established computational methods because the tools used are typically developed in somewhat isolated research environments. While this approach encourages creativity and innovation, it also means that use of these tools requires well-trained specialists who can maintain and run what are basically research codes. In other fields the computational methods (for example, finite element analysis (FEA) and finite difference methods) are firmly embodied in standard packages that have become an integral part of the academic training of the modern scientist or engineer, being based on the mathematical foundation of the discipline.
In materials science and engineering, however, the scope is extremely broad and rests on a wide range of mechanisms that typically operate at different length and temporal scales, each of which needs to be modeled with specialized methods. The properties of materials are controlled by a multitude of separate and often competing mechanisms that operate over a wide range of length and time scales. The committee concludes that, since there is no single overarching approach to modeling all materials phenomena, the widespread application of materials modeling has been limited, and this has impeded the transformative power of ICME.

Most computational materials methods can be traced back to academic groups that developed these methods as part of the educational, scientific, and engineering process. A typical, but by no means universal, path to maturity would include several generations of research codes from one or more groups, which are then transitioned to applications in a government or industrial laboratory and then commercialized with or without government support. In the United States, federal support (through, for example, Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) grants) has played a key role in commercializing processing and thermodynamic methods such as ProCast,
Deform, and Pandat. In the committee's judgment, federal support will continue to play an important role in incubating and transitioning new ICME methods.

Methods

The fundamental technical challenge of ICME is that materials response and behavior involve a multitude of physical phenomena whose accurate capture in models requires spanning many orders of magnitude in length and time. The length scales in materials response range from the nanometers of atoms to the centimeters and meters of manufactured products. Similarly, time scales range from the picoseconds of atomic vibrations to the decades over which a component will be in service. Fundamentally, properties arise from the electronic distributions and bonding at the atomic scale of nanometers, but defects that exist on multiple length scales, from nanometers to centimeters, may in fact dominate properties. It should not be surprising that no single modeling approach can describe this multitude of phenomena or the breadth of scales involved. While many computational materials methods have been developed, each is focused on a specific set of issues and appropriate for a given range of lengths and times.

Consider length scales from 1 angstrom to 100 microns. At the smallest scales scientists use electronic structure methods to predict bonding, magnetic moments, and transport properties of atoms in different configurations. As the simulation cells get larger and the time scales longer, empirical interatomic potentials are used to approximate these interactions. Optimization and temporal evolution of electronic structure and atomistic methods are achieved using conjugate gradients, molecular dynamics, and Monte Carlo techniques.
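For readers unfamiliar with these sampling techniques, the Metropolis Monte Carlo idea can be sketched in a few lines. The following Python fragment is an illustrative toy, not a research code: it relaxes a single Lennard-Jones pair separation toward the potential minimum, and the parameters (reduced units, step size, temperature) are arbitrary choices made for this sketch.

```python
import math
import random

def lj_energy(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones pair energy in reduced units; minimum at r = 2**(1/6)*sigma."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

def metropolis_minimize(r0=1.5, kT=0.05, step=0.05, n_steps=20000, seed=42):
    """Metropolis Monte Carlo walk on the pair separation r.

    Downhill trial moves are always accepted; uphill moves are accepted
    with probability exp(-dE/kT), so at low kT the walk settles near the
    potential minimum (about 1.122 in these units)."""
    rng = random.Random(seed)
    r, e = r0, lj_energy(r0)
    best_r, best_e = r, e
    for _ in range(n_steps):
        r_try = r + rng.uniform(-step, step)
        if r_try <= 0.8:          # reject unphysical overlaps
            continue
        e_try = lj_energy(r_try)
        if e_try <= e or rng.random() < math.exp(-(e_try - e) / kT):
            r, e = r_try, e_try
            if e < best_e:
                best_r, best_e = r, e
    return best_r, best_e

if __name__ == "__main__":
    r, e = metropolis_minimize()
    print(f"best separation ~ {r:.3f}, energy ~ {e:.3f}")
```

The same accept/reject logic, applied to thousands of atomic coordinates instead of one distance, is the core of the Monte Carlo methods mentioned above.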
At still larger scales, the information content of the simulation unit decreases until it becomes more efficient to describe the material in terms of the defect that dominates at that length scale. These units might be defects in the lattice (for example, dislocations), internal interfaces (for example, grain boundaries), or some other internal structure, and the simulations use these defects as the fundamental simulation unit in the calculation.

While true concurrent multiscale materials modeling is the goal of one segment of the materials community, for the foreseeable future most multiscale modeling will be accomplished by coordinating the input and output of stand-alone codes. This information-passing approach has drawbacks associated with extracting information at each scale in an effective way. Also, all these approaches necessarily incorporate simplifying assumptions that lead to errors and uncertainties in derived quantities that are propagated throughout the multiscale integration. Experimental data play a key role here in defining parameters and information not available from simulations at all scales and in calibrating and validating modeling techniques.

Table 3-1 shows a variety of computational materials methods, some of them standard in ICME and others strictly research tools. The committee notes that the
TABLE 3-1 Model or Method, Required Input, Expected Output, and Typical Software Used in Materials Science and Engineering

Electronic structure methods (density functional theory, quantum chemistry)
  Inputs: Atomic number, mass, valence electrons, crystal structure and lattice spacing, Wyckoff positions, atomic arrangement
  Outputs: Electronic properties, elastic constants, free energy vs. structure and other parameters, activation energies, reaction pathways, defect energies and interactions
  Software examples: VASP, Wien2K, CASTEP, GAMESS, Gaussian, Q-Chem, SIESTA, DACAPO

Atomistic simulations (molecular dynamics, Monte Carlo)
  Inputs: Interaction scheme, potentials, methodologies, benchmarks
  Outputs: Thermodynamics, reaction pathways, structures, point defect and dislocation mobility, grain boundary energy and mobility, precipitate dimensions
  Software examples: CERIUS2, LAMMPS, PARADYN, DL-POLY

Dislocation dynamics
  Inputs: Crystal structure and lattice spacing, elastic constants, boundary conditions, mobility laws
  Outputs: Stress-strain behavior, hardening behavior, effect of size scale
  Software examples: PARANOID, ParaDis, Dis-dynamics, Micro-Megas

Thermodynamic methods (CALPHAD)
  Inputs: Free-energy data from electronic structure, calorimetry data, free-energy functions fit to materials databases
  Outputs: Phase predominance diagrams, phase fractions, multicomponent phase diagrams, free energies
  Software examples: Pandat, ThermoCalc, FactSage

Microstructural evolution methods (phase-field, front-tracking methods, Potts models)
  Inputs: Free-energy and kinetic databases (atom mobilities), interface and grain boundary energies, (anisotropic) interface mobilities, elastic constants
  Outputs: Solidification and dendritic structure, microstructure during processing, deployment, and evolution in service
  Software examples: OpenPF, MICRESS, DICTRA, 3DGG, Rex3D

Micromechanical and mesoscale property models (solid mechanics and FEA)
  Inputs: Microstructural characteristics, properties of phases and constituents
  Outputs: Properties of materials, for example, modulus, strength, toughness, strain tolerance, thermal/electrical conductivity, permeability; possibly creep and fatigue behavior
  Software examples: OOF, Voronoi Cell, JMatPro, FRANC-3D, ZenCrack, DARWIN

Microstructural imaging software
  Inputs: Images from optical microscopy, electron microscopes, X-rays, etc.
  Outputs: Image quantification and digital representations
  Software examples: Mimics, IDL, 3D Doctor, Amira

Mesoscale structure models (processing models)
  Inputs: Processing thermal and strain history
  Outputs: Microstructural characteristics (for example, grain size, texture, precipitate dimensions)
  Software examples: PrecipiCalc, JMatPro

Part-level FEA, finite difference, and other continuum models
  Inputs: Part geometry, manufacturing processing parameters, component loads, materials properties
  Outputs: Distribution of temperatures, stresses and deformation, electrical currents, magnetic and optical behavior, etc.
  Software examples: ProCast, MagmaSoft, CAPCAST, DEFORM, LS-Dyna, Abaqus

Code and systems integration
  Inputs: Format of input and output of modules and the logical structure of integration, initial input
  Outputs: Parameters for optimized design, sensitivity to variations in inputs or individual modules
  Software examples: iSIGHT/FIPER, QMD, Phoenix

Statistical tools (neural nets, principal component analysis)
  Inputs: Composition, process conditions, properties
  Outputs: Correlations between inputs and outputs; mechanistic insights
  Software examples: SPLUS, MiniTab, SYSTAT, FIPER, PatternMaster, MATLAB, SAS/STAT

table is not intended to be complete but rather to exemplify the methods available for modeling materials characteristics. This table indicates typical inputs and outputs of the software and examples of widely used or recognized codes. Electronic structure methods employ different approximate solutions to the quantum mechanics of atoms and electrons to explore the effects of bonding, chemistry, local structure, and dynamics on the mechanisms that affect material properties. Typically, tens to hundreds of atoms are included in such a calculation and the timescales are on the order of nanoseconds. In atomistic simulations, arrangements and trajectories of atoms and molecules are calculated. Generally based on models to describe the interactions among atoms, simulations are now routinely carried out with millions of atoms.
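The trajectory calculation at the heart of molecular dynamics can be illustrated with the standard velocity-Verlet integrator. The sketch below (illustrative only; a single particle on a harmonic "bond" in reduced units stands in for millions of interacting atoms) shows the update scheme that production MD codes apply to every atom at every time step.

```python
import math

def velocity_verlet(x, v, force, dt, n_steps, mass=1.0):
    """Velocity-Verlet time integration, the standard MD propagator:
    positions advance with the current acceleration, velocities with
    the average of old and new accelerations."""
    a = force(x) / mass
    traj = [x]
    for _ in range(n_steps):
        x = x + v * dt + 0.5 * a * dt * dt
        a_new = force(x) / mass
        v = v + 0.5 * (a + a_new) * dt
        a = a_new
        traj.append(x)
    return x, v, traj

if __name__ == "__main__":
    k = 1.0                           # harmonic "bond" stiffness (reduced units)
    n = int(2 * math.pi / 0.01)       # one oscillation period for k = m = 1
    x, v, traj = velocity_verlet(x=1.0, v=0.0, force=lambda x: -k * x,
                                 dt=0.01, n_steps=n)
    # after one full period the particle returns near x = 1, v = 0,
    # and total energy (0.5*v**2 + 0.5*k*x**2) stays near 0.5
    print(x, v)
```

The symplectic character of this update (energy is conserved over long runs) is why it, rather than a generic ODE solver, underpins codes such as LAMMPS.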
Length scales and timescales are in the nanometer and nanosecond regime, and longer length scales and timescales are possible in the case of molecular systems by coarse graining from "all-atom" to "united-atom" models (that is, interacting clusters of atoms). Dislocation dynamics methods are used to study the evolution of dislocations (curvilinear defects in the lattice) during plastic deformation. The total number of dislocations is typically less than a million, and strain rates are large compared to those measured in standard laboratory tests. Thermodynamic methods range from first-principles predictions of phase diagrams to complex database integration methods using existing tabulated data to produce phase diagrams and kinetics data. These methods are being developed by the CALculation of PHAse Diagram (CALPHAD) community (see Box 3-1). Microstructural evolution methods predict microstructure stability and evolution based on free-energy functions, elastic parameters, and kinetic databases. Recently, several groups established protocols to automatically extract thermodynamic and kinetic information from CALPHAD methods as input to such methods.

BOX 3-1
CALculation of PHAse Diagrams (CALPHAD) Methodology

The calculation of phase diagrams is a well-developed and widely accepted computational approach for capturing and using materials thermodynamic information. Personal-computer-based commercial software, coupled with commercial and open databases of thermodynamic information (data and models), provides the results of sophisticated and accurate calculations. Now readily available to those with even modest backgrounds in thermodynamics and phase equilibria calculations, these thermodynamic simulations based on critically evaluated data are basic tools in materials and process design.1 However, the development of the sophisticated tools and databases in use today took more than 50 years and was the result of the efforts of countless contributors. Because CALPHAD software is arguably the most important (and perhaps the only) generic tool available for ICME practitioners, a brief examination of its history could reveal how ICME is likely to develop.2

Although this method is ultimately rooted in work begun in the early 1900s, the modern CALPHAD movement began in the late 1950s, when the global scientific community began to envision a phase diagram calculation capability based on extensive databases of thermodynamic properties and empirical data. Over the course of 50 years this vision was a constant goal. While the time required to bring the effort to fruition may seem long, Saunders and Miodownik have suggested that such a lengthy incubation period between vision and fruition reflects the time required for individuals to meet each other and agree to work together and the time for science and technology to dedicate adequate funds. They also suggested that a contributing factor was the difficulty some scientists had in accepting that realizing this vision required a melding of empirical databases and fundamental thermodynamics.

Since the late 1950s, many factors enabled CALPHAD to develop:

• Visionary leaders who understood the potential of CALPHAD and who worked continuously for decades to make it a reality.
• CALPHAD research groups at universities and government laboratories such as the National Bureau of Standards (now NIST), often led by the aforementioned individuals, who provided continuity and sustained effort and focus.
• A strong community of experts.
• Technical conferences dedicated to CALPHAD that enabled researchers to interact and collaborate.
• A focus on practical problems of interest to industry, for example, steels and nickel-based superalloys.
• Textbooks dedicated to CALPHAD (the first was published in the 1970s).
• Establishment of a journal (in 1977) dedicated to publication of CALPHAD data.
• International agreements and international consortia dedicated to the CALPHAD vision (one such is the Scientific Group Thermodata Europe, SGTE) that have been in existence since the 1970s.
• Substantial public funding of database development, especially in Europe via the program Cooperation in the Field of Scientific and Technical Research (COST).
• Expert practitioners who develop phase diagram assessments and make them available to others either freely or via commercial databases linked to commercial CALPHAD software.
• The use of common thermodynamic reference states along with a shared and agreed-on taxonomy.
• The open publication and sharing of common data (at least for unaries and often for binaries and ternaries) that form the building blocks for many of the CALPHAD databases.
• PC-based commercial software and databases that can be operated without extensive expertise.
• Commercial software with programming interfaces that enable users to write their own applications and call up key functions on demand.

Current CALPHAD development efforts include establishment of linkages with physics-based tools such as density functional theory for calculating the energetics required to assess phase stability and linkage with and development of diffusion databases and models that are in turn linked to microstructural evolution prediction tools. Finally, some developers of CALPHAD tools have begun to venture into property prediction, by either correlations or science-based models, setting the stage for the use of CALPHAD as a basic ICME tool. The enabling factors that led to the CALPHAD capability of today will also be critical enablers for the development of a widespread ICME capability in the future.

1 P.J. Spencer, ed., "Computer simulations from thermodynamic data: Materials production and development," MRS Bulletin 24(4) (1999).
2 For a more comprehensive review of the history of CALPHAD, see N. Saunders and A.P. Miodownik, CALPHAD (Calculation of Phase Diagrams): A Comprehensive Guide, Oxford, England: Elsevier (1998).

Micromechanical and mesoscale property models include solid mechanics and FEA methods that use experimentally derived models of materials behavior to explore microstructural influences on properties. The models may incorporate details of the microstructure (resolving scales at the relevant level). Results may be at full system scale.
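The free-energy functions on which the CALPHAD methods of Box 3-1 are built can be illustrated at textbook level. The sketch below uses a symmetric regular-solution model with an arbitrary interaction parameter, not an assessed database; it locates the edge of a miscibility gap, the kind of phase-boundary information a CALPHAD code extracts from far more elaborate assessed free-energy functions.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def dG_mix(x, T, omega=20_000.0):
    """Derivative of the regular-solution mixing free energy
    G(x) = omega*x*(1-x) + R*T*(x*ln(x) + (1-x)*ln(1-x))
    with respect to composition x (omega is an illustrative value)."""
    return omega * (1.0 - 2.0 * x) + R * T * math.log(x / (1.0 - x))

def binodal_composition(T, omega=20_000.0, lo=1e-4, hi=0.4999):
    """For this symmetric model the two-phase boundary satisfies
    dG/dx = 0 away from x = 0.5; solve by bisection on (0, 0.5)."""
    f_lo = dG_mix(lo, T, omega)
    if f_lo * dG_mix(hi, T, omega) > 0:
        return None  # single phase: T is above the critical point omega/(2R)
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if dG_mix(mid, T, omega) * f_lo <= 0:
            hi = mid
        else:
            lo = mid
            f_lo = dG_mix(lo, T, omega)
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    # critical temperature is omega/(2R), about 1203 K for omega = 20 kJ/mol
    print(binodal_composition(1000.0))   # gap edge below the critical point
    print(binodal_composition(1300.0))   # None: fully miscible above it
```

Repeating the root-finding over a range of temperatures traces out the phase boundary, which is, in miniature, what a CALPHAD phase-diagram calculation does across many phases and components.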
Mesoscale structure models include models for solidification and solid-state deformation using combinations of the previous methods to predict favorable processing conditions for specific microstructural characteristics. Methods for code and systems integration offer ways to connect many types of models and simulations and to apply systems engineering strategies. Statistical tools are often used to gain new understanding through correlations in large data sets. Other important ICME tools include databases, quantifiable knowledge rules, error propagation models, and cost and performance models. To be effective in an ICME environment, all of these computational methods must be integrated with other tools. Developing such compatibilities should be a priority for model developers and funding agencies. The development of standards and common nomenclatures for data exchange and model compatibility is an important task and is discussed in more detail in the sections "Requirements for ICME Databases" and "Commercial Integration Tools."

It would be beyond the scope of this report to give details of the advances that are needed for all the methods employed to model materials behavior. Table 3-1
lists methods along with their inputs and outputs. While each method is by itself a critical component of an ICME process, linking the various methods remains a great challenge, not only from a scientific perspective but also because the codes for these models may exist on different computer platforms and be written in different languages. While still an unsolved problem, projects like those sponsored by Eclipse are focused on the creation of open development platforms to make such computational linkages easier.3

Each class of methods in Table 3-1 has its own needs and challenges, among them the following:

• Extensions of atomistic simulations to longer times through the use, for example, of accelerated dynamics methods, and to broader classes of materials systems through the development and validation of force fields for application to heterogeneous/mixed materials, especially at the interfaces between material types (for example, metal-ceramic);
• Development of spatially hierarchical microstructural evolution methods for concurrently modeling microstructural features across length scales;
• Advances in crystal plasticity finite element methods to include the effects of local heterogeneities in the microstructure;
• Methods for modeling the spatial and temporal scales between dislocation dynamics and continuum-level methods (for instance, finite element methods);
• Science-based models for predicting the influence of microstructure on a wide variety of properties;
• Development of improved microstructural evolution models for polymers, polymeric composites, and elastomers;
• Advances in electronic structure calculations for modeling larger systems (for example, development of spatially hierarchical methods employing a flexible basis, such as wavelets) and for more accurately accounting for electron correlation, which will be critically important for materials at the nanoscale; and
• Development of diffusion data and kinetics theory to explain a wide variety of materials phenomena in metals, polymers, and ceramics.

This list, while far from complete, indicates the diversity of challenges in computational materials. For ICME, the key is to influence the directions these developments take, with the goal being greater integration between models for different materials phenomena and across scales and better integration of data within the models and simulations.

3 For more information, see http://www.eclipse.org. Accessed February 2008.
Advances in Computing Capabilities

ICME is possible today in part because of the exponential growth in computer storage and processing capability achieved over the last 40 years. Current desktop processors yield performance once reserved for the supercomputers of a decade ago, multigigabytes of memory have become standard, and disks can store terabytes of information, all at an affordable cost. Thus the computational capabilities required to model materials behavior for ICME are becoming increasingly available to the practicing materials engineer. The prognosis is for continued improvements in hardware capabilities.

The recent advances in multiprocessor computing have had a dramatic effect on the utility of a variety of methods, with a natural evolution of computational methods from serial to scalable parallel processing. The stages to full parallel processing include using parallel processing compilers, "parallelizing" computation-intensive portions of the code, "parallelizing" the original serial implementation, and redesigning the serial implementation to take full advantage of the available parallel architectures. Many commercial applications (for example, finite element methods) are available for parallel computing and are in common use in industrial settings. With the focus on multicore processors from the computer vendors, computing methods that take advantage of parallelism, and that will require new and different programming paradigms, will become increasingly common.

Tools for ICME will need a number of features to take full advantage of the power of modern and evolving computing platforms. While the technical details are beyond the scope of this report, successful methods will generally include the following:

• Scalable parallelism.
As the cost of processors continues to fall, the ability to scale to hundreds or thousands of processors will be paramount.
• I/O and file systems. Many classes of simulation tools are constrained by communication bandwidth between processors and to storage servers. For example, just reading the results of the Los Alamos National Laboratory simulations shown in Figure 3-1 required 100 servers. Higher computing performance will require new algorithms that scale without such heavy I/O burdens. New distributed file systems can help significantly with the storage and retrieval of large amounts of data, and materials simulations and visualizations will need to work well with these file systems.
• Advances in graphics hardware. New graphics processing units (GPUs) can offer very large sustained processing speeds (up to 50 Gflops as this report is written), considerably faster than general-purpose central processing units. The general materials application development community has done little to take advantage of this technology; however, graphics-hardware-based computing (GPGPU) has been widely used in some massively parallel astrophysics applications. The next generation of GPGPU hardware appears to hold great promise for high-performance computing. New materials and applications in biology might benefit from this technology.

FIGURE 3-1 Left: Shock turbulence model with 589 million elements rendered at an effective rate of 3.2 billion polygons per second on a 128-pipe Army Research Laboratory visualization system. SOURCE: Lawrence Livermore National Laboratory. Right: Asteroid impact study: 240 million cells, 9.7 TB, 50 servers, multiple angles. Image courtesy of CEI.

• Intelligent data reduction. It is relatively easy to use 10,000 processors for a large, highly scalable computational fluid dynamics (CFD) application or to start thousands of design variations. It is more difficult to ensure the timely delivery of input data or the creation of large output files for large simulations that are run in parallel. Using such large amounts of data will necessitate the intelligent reduction of information required for the next-higher level of integration of materials or systems models.
• Fault detection and recovery. In a cluster with thousands of processors, the mean time to failure of a single processor is less than 1 day. Simulation tools will thus need robust fault detection and recovery capabilities. Low-level fault detection is just entering compilers and parallel computing middleware such as the message-passing interface (MPI), but no broadly available materials simulation tools currently take advantage of these capabilities or provide their own fault detection to improve their reliability.
• Out-of-order execution. Modern processors with large numbers of cores and threads per core can run much faster when programmed such that most instructions can be scheduled either simultaneously or out of order. No
broadly available materials simulation tools are written to take advantage of such capabilities.
• Petaflop computing. Next-generation computers, capable of petaflop performance, will probably employ hundreds of thousands of processors. New programming paradigms will be needed to achieve scalability on these massive machines.

Computational capabilities will continue to increase, enabling higher fidelity and complexity in ICME applications. By taking advantage of new architectures and software enhancements, application developers can enhance the ability of their computational tools to meet the challenges listed in the preceding section. The committee notes, however, that few scientists and engineers in the materials community have the training to fully engage in the development of modern computational methods, so collaboration with computer scientists will be essential. Accordingly, institutions that engage in materials education, development, and manufacturing will need to undergo significant cultural change; this is discussed in Chapter 4.

Uncertainty Quantification

ICME requires the development of predictive models and simulations, the quality (accuracy and precision) of whose final results will have to be known by the materials engineer. The ability of the ICME community to predict the quality of a coupled set of calculations is limited, because there can be considerable uncertainty at almost all levels of the ICME process. All materials models have uncertainties associated with the natural variability of materials properties that arises from the stochastic nature of materials structures. The problem is exacerbated by the critical dependency of many materials properties on the distribution of defects (that is, on microstructural heterogeneities), which are, in turn, influenced by processing variables.
Thus it is very important to carefully calibrate and validate modeling tools by comparing their results to the results of well-designed experiments on pedigreed materials.4 Beyond the uncertainties in the materials models, all simulation methods have their own levels of uncertainty, from the stochastic uncertainty of a molecular dynamics simulation to the numerical uncertainty of a large-scale finite element calculation. A key need for all ICME applications is quantification of uncertainties in each stage of a suite of calculations.

4 Since ICME is best practiced with complementary experimental and theoretical approaches, the validation of computational methods to fill gaps in theoretical understanding is critical to building a robust ICME approach. Validation is discussed in the section "Role of Experimentation in Computational Materials Science and ICME."
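In its simplest form, quantifying how input uncertainty flows into a predicted property can be done by Monte Carlo sampling. The Python sketch below is purely illustrative: the Hall-Petch-style strength model and every distribution (the friction stress, the Hall-Petch slope, and the grain-size spread) are assumed values chosen for the example, not data for any real alloy.

```python
import math
import random
import statistics

def yield_strength(sigma0, k_y, d_um):
    """Illustrative Hall-Petch relation: sigma_y = sigma0 + k_y/sqrt(d),
    with sigma0 in MPa, k_y in MPa*m**0.5, and grain size d in micrometers."""
    d_m = d_um * 1e-6
    return sigma0 + k_y / math.sqrt(d_m)

def propagate(n=20_000, seed=0):
    """Monte Carlo uncertainty propagation: sample the uncertain inputs
    from assumed distributions, run the model, and summarize the spread
    of the output."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        sigma0 = rng.gauss(70.0, 5.0)        # assumed friction stress, MPa
        k_y = rng.gauss(0.74, 0.05)          # assumed Hall-Petch slope
        d_um = rng.lognormvariate(math.log(20.0), 0.2)  # assumed grain size
        samples.append(yield_strength(sigma0, k_y, d_um))
    return statistics.mean(samples), statistics.stdev(samples)

if __name__ == "__main__":
    mean, sd = propagate()
    print(f"predicted yield strength: {mean:.0f} +/- {sd:.0f} MPa")
```

Chaining several such stages, with each stage's output distribution feeding the next stage's input, is the essence of propagating uncertainty through a suite of ICME calculations; replacing the sampled distributions with one-at-a-time input perturbations turns the same loop into a sensitivity analysis.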
The quantification of uncertainty is a burgeoning field within computational science and engineering. Many methods have been developed to track how uncertainties in input parameters and the underlying models propagate through a simulation, that is, how input affects output. There are two general approaches, sensitivity analysis and uncertainty quantification. In sensitivity analysis, one examines how changes in inputs affect outputs. Uncertainty quantification is a more global process, which involves a statistical analysis based on probability distributions. In both approaches, input parameters are treated as stochastic variables with an associated probability distribution function. Various statistical methods are then employed to estimate the distribution of outputs associated with those inputs. Full integration of uncertainty quantification into ICME processes will be essential for developing the reliable applications needed for widespread acceptance of ICME in the integrated product development (IPD) process and for more efficient development of future ICME tools.

Visualization

The ability to visualize the output of complex analysis is crucial to understanding the design challenges of today and tomorrow. Advances in commodity graphics hardware have led to the replacement of virtually all of the graphics hardware of the once-dominant proprietary graphics vendors. Yet even the most powerful graphics cards are not able to process the large-scale models needed to accurately perform complex analysis. Moderately sized models can be loaded into a single card but cannot be rendered fast enough to provide meaningful interaction with the data. In the CFD codes that simulate manufacturing processes such as the casting of metals or the injection molding of plastics, the number of cells or elements exceeds 10 million to 15 million, and additional graphics capability is required.
Software development efforts have now created parallel distributed graphics application codes that can take advantage of graphics hardware installed on individual nodes of a compute cluster. Examples of these codes are EnSightDR, Paraview, and Chromium. These distributed nodes, combined with a fast interconnect and high-performance I/O, give users the ability to interact with large models. Rather than being limited to images generated from, say, the x, y, and z axes only, the interactivity of these distributed systems allows users to zoom, pan, rotate, and display time-dependent animations on models ranging from 50 million to 1 billion elements or cells. With advanced visualization tools, the time-dependent results of large models showing changing states and structures can be more easily understood. Figure 3-1 illustrates a large shock turbulence model, which was rendered on a large Army Research Laboratory visualization cluster. The model is composed of 589 million polygons. Using an off-the-shelf production version of EnSightDR, the model can
be updated at six frames per second. This gives researchers the ability to interactively examine localized phenomena anywhere on the geometry.

One of the difficulties with any collaborative effort is obtaining access to compute and visualization resources. Boeing recently conducted a series of benchmark tests to validate the usefulness of remote visualization of moderate-size models. To test the performance of remote graphics, a 60 million cell model was rendered in Bellevue and displayed in Renton. The resulting images were updated at 10 frames per second, which is more than adequate for engineering design review and postprocessing. The conclusion was that remote visualization of moderately large models is now feasible, which avoids large data transfers and duplicate disk space on both the computing platform and the graphics platform.

One last challenge in visualization is to represent these very large data sets in ways that are comprehensible to the human mind. A computer may be able to display the results of a simulation with a billion degrees of freedom, but our ability to understand and use the data will depend on the visualization designer's ability to highlight the important features in the result. This is true of high-performance physical simulations in general: visualization challenges specific to materials include electron orbital interactions in large density functional calculations; large ensembles of dislocations; crazing, crystallinity, and other complex structures in polymers; and distributions of structure features such as precipitates, dendrites, and grains throughout an engineering component.
Role of Experimentation in Computational Materials Science and ICME

Experimental Calibration and Validation

One of the important lessons of earlier ICME efforts is the profound importance of experimental results for calibrating and validating computational methods and filling gaps in theoretical understanding. In fact, ICME is best practiced with complementary experimental and theoretical approaches. This implies well-integrated research teams working toward a common goal, such as would be found in industry and government laboratories. Based on the lessons learned from the DOE Advanced Simulation and Computing Initiative (ASCI), verification and validation of modeling methods has become a key part of the current National Nuclear Security Administration (NNSA) Predictive Science Academic Alliance Program. Using the correct mathematical description and numerical representation (verification) and ensuring that the results are consistent with well-designed experiments (validation) should also become standard practice in the ICME community.
There are three classes of experimental effort:

• Classical experimental validation, such as thermodynamic measurements and experiments designed to advance mechanistic understanding. These disciplines are often poorly funded, because such mature methods are incorrectly perceived as unimportant research areas.
• Novel experimental techniques, such as three-dimensional materials imaging microscopy and miniature sample technology. These are cutting-edge research areas where the methods have not yet matured or permeated the materials science and engineering community.
• High-throughput techniques, such as combinatorial materials science. These are new and unproven but have the potential to rapidly populate databases and enable large-scale ICME.

For an ICME strategy to be successful, a strong link between experimental data and modeling is essential. Paradoxically, significant advances in low-cost processing-structure and structure-property experimental methods may be required to advance the modeling and simulation infrastructure.

Three-Dimensional Microstructural Characterization

Structure-property models for materials are at the core of any ICME implementation. Virtually all engineering materials contain structural features at the micro- or nanoscale that strongly influence properties. These structural features are in turn strongly influenced by the manufacturing processes used for a particular product and may include grain boundaries, second phases, pores, or defects such as dislocations and vacancies. They are often irregularly shaped and distributed nonuniformly in three dimensions. Knowledge of these features is particularly important for predicting flaw-sensitive properties such as fatigue. Most traditional characterization techniques collect information from two-dimensional sections and do not accurately capture the structural complexity.
Thus three-dimensional images are often needed to extract quantitative structural information for property models. An example of a precipitate with a highly complex dendritic shape in three dimensions is shown in Figure 3-2. Three-dimensional imaging is of importance in a number of other technical fields, with medical imaging being a notable example. Imaging modalities such as X-ray computed tomography, magnetic resonance imaging, ultrasound, and positron emission tomography are now routinely used as diagnostic tools. These tomographic techniques acquire sequential two-dimensional "slices" of the object of interest and digitally reconstruct these slices into a three-dimensional data set
that can be analyzed in detail. There are hardware, software, and data challenges in three-dimensional imaging. Imaging sources and detectors, automated stages, and focusing or slicing techniques must be matched to the material being examined (soft tissue, bone, metallic, polymeric, ceramic). Data sets can be quite large, challenging the processing power of computers used for meshing, reconstruction, and analysis. Image-processing routines unique to the tomography technique or class of material are often needed.

FIGURE 3-2 Three-dimensional rendering of a dendritic precipitate obtained by serial sectioning and reconstruction. Image courtesy of M. De Graef, Carnegie Mellon University.

Unlike such imaging in the medical field, the three-dimensional imaging of engineering materials is not yet widely available or well developed. By their nature, engineering materials are not amenable to many of the imaging modes utilized in the medical field. Except for special cases like synchrotron radiation, where large volumes of material are transparent to the imaging radiation, engineering materials must often be physically sectioned to acquire two-dimensional imaging "slices." Recent interesting serial sectioning approaches include automated robotic
serial sectioning,5 focused ion beam sectioning,6,7,8,9,10 and the three-dimensional atom probe.11 Improvements in the efficiency of these techniques as well as completely new techniques are needed to bring the materials community to the point where three-dimensional information can be routinely acquired. While continued advances in computing capabilities will mitigate some of the difficulties of processing large data sets, there is no agreement on three-dimensional data formatting standards, nor are any community sites available for storage of these large data sets. Utilization of three-dimensional data in property models requires new quantitative analytical approaches beyond simple two-point correlations. Under development are automated protocols for representation and stereological analysis, meshing of the complex geometrical details of the microstructure, and direct linkage to finite element analysis for bridging to macroscopic properties.12 Given the magnitude of the three-dimensional imaging and analysis task for the materials community, significant coordination will be needed.

Rapid, Targeted Experimentation

As described above, the availability of experimental data to fill gaps in theoretical understanding, calibrate models, and validate ICME results is a key prerequisite for widespread ICME utilization. While experimental evaluation of new materials and evaluation of the influence of new processing approaches have been the cornerstone of materials development, such evaluation is slow and often very expensive, so that a more rapid approach to experimental exploration is needed. While the R&D literature contains a great deal of useful information that could be harvested, newly

5 J.E. Spowart, H.H. Mullens, and B.T. Puchala, "Collecting and analyzing microstructures in three dimensions: A fully automated approach," Journal of The Minerals, Metals & Materials Society (JOM) 36 (October 2003).
6 B.J. Inkson, M. Mulvihill, and G. Mobus, "3D determination of grain shape in a FeAl-based nanocomposite by 3D FIB tomography," Scripta Materialia 45(7): 753-758 (2001).
7 M. Groeber, B. Haley, M. Uchic, and S. Ghosh, Materials Processing and Design, NUMIFORM 2004, American Institute of Physics Conference Proceedings 712.
8 A.J. Kubis, G.J. Shiflet, D.N. Dunn, and R. Hull, "Focused ion-beam tomography," Metallurgical and Materials Transactions A 35(7): 1935-1943.
9 Y. Bhandari, S. Sarkar, M. Groeber, M.D. Uchic, D.M. Dimiduk, and S. Ghosh, "3D polycrystalline microstructure reconstruction from FIB generated serial sections for FE analysis," Computational Materials Science 41(2): 222-235.
10 A. Shan and A.M. Gokhale, "Digital image analysis and microstructure modeling tools for microstructure sensitive design of materials," International Journal of Plasticity 20(7): 1347-1370.
11 S.S.A. Gerstl, D.N. Seidman, A.A. Gribb, and T.F. Kelly, "LEAP microscopes look at TiAl alloys," Advanced Materials and Processes 10 (October): 31-33.
12 G. Spanos, ed., "3-D characterization and analysis of materials," Scripta Materialia Viewpoint Set 55(1) (2006).
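The serial-sectioning workflow described above, stacking sequential two-dimensional sections into a three-dimensional data set and then extracting a quantitative descriptor such as the two-point correlation, can be sketched as follows. A synthetic spherical "precipitate" stands in for real section images, and NumPy is assumed to be available:

```python
import numpy as np

def reconstruct_volume(slices):
    """Stack sequential 2-D sections into a 3-D voxel array
    (axis 0 is the sectioning direction)."""
    return np.stack(slices, axis=0)

def two_point_correlation(volume):
    """Two-point (auto)correlation of a binary microstructure via FFT:
    the probability that two points separated by a vector r both lie
    in the phase of interest (periodic boundaries assumed)."""
    f = np.fft.fftn(volume.astype(float))
    corr = np.fft.ifftn(f * np.conj(f)).real / volume.size
    return np.fft.fftshift(corr)

# Synthetic example: a spherical 'precipitate' in a 32^3 volume.
n = 32
z, y, x = np.ogrid[:n, :n, :n]
sphere = ((x - n / 2) ** 2 + (y - n / 2) ** 2 + (z - n / 2) ** 2) < 8 ** 2
slices = [sphere[k] for k in range(n)]   # pretend these came from sectioning
vol = reconstruct_volume(slices)
s2 = two_point_correlation(vol)
# For a binary structure, the zero-separation value equals the phase
# volume fraction, a quick consistency check on the reconstruction.
print(vol.mean(), s2.max())
```

Real implementations must also register and segment the sections before stacking, which is where much of the effort described in the text actually goes.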
developed materials or new processing routes inevitably call for experimental data for the calibration and validation of models. There are emerging suites of new characterization tools that permit materials properties to be rapidly screened and evaluated without the need for large volumes of material. An example of this is the diffusion multiple approach. With a single sample containing pieces of various pure elements bonded together (Figure 3-3), aspects of binary and ternary phase diagrams can be explored, reducing by a factor of 20 to 100 the number of samples that must be processed to obtain this fundamental information. Among the new techniques for probing properties in small volumes are local laser-based probes for thermal and electrical conductivity, nanoindenters, and new electron backscattered scanning electron microscopy (SEM) techniques. The efficiency of such approaches can greatly accelerate the materials development process while reducing its often extraordinarily high cost. Figure 3-4 illustrates the use of focused ion beam milling with microcompression methods to sample local micron-scale properties.13,14 Such techniques can be used to sample the effects of microstructural heterogeneities across a component or to screen the properties of small volumes of new materials without having to develop or utilize high-volume, time-consuming, and expensive processing operations. Currently these methods are limited by the milling time required to produce the samples, but alternative methods for producing massive arrays of microsamples are under development.
Other thin- and thick-film combinatorial processing approaches, rapid micromachining property evaluation, and microscale mechanical tests are among the suite of emerging tools that promise to dramatically accelerate the materials and product development cycles. These techniques are still in their infancy, and protocols for acquiring, storing, and sharing the vast amounts of data they might generate have yet to be developed.

Databases and ICME Development

Over the course of the study, databases emerged as important enablers in the ICME infrastructure. Databases provide a mechanism for storing experimental and computational results and for efficiently linking to models operating at different length scales or timescales. As shown in Figure 3-5, the diversity of data types required for complete knowledge of a material necessitates a variety of database structures to meet the input needs of the various models. Currently the schemas for describing the breadth of database types are at varying degrees of maturity. This

13 M.D. Uchic, D.M. Dimiduk, J.N. Florando, and W.D. Nix, "Sample dimensions influence strength and crystal plasticity," Science 305: 986-989 (2004).
14 Q. Feng, Y.N. Picard, H. Liu, S.M. Yalisove, G. Mourou, and T.M. Pollock, "Femtosecond laser micromachining of a single-crystal superalloy," Scripta Materialia 53(5): 511-516 (2005).
FIGURE 3-3 Example of a combinatorial approach: application of a diffusion multiple to generate data on single-phase compositions of a ternary system for a survey of Ni-Mo-Fe. A diffusion multiple made up of Ni, Fe, Mo, and superalloy Inconel 706 (IN706) for rapid mapping of the Ni-Fe-Mo phase diagram and for studying the alloying effect of Mo addition to IN706. The phase diagram is plotted using atomic percent axes with numbering of the scales removed for simplicity. SOURCE: J.C. Zhao, "The diffusion-multiple approach to designing alloys," Annual Review of Materials Research 35: 51-73 (2005).
FIGURE 3-4 Locally machined micropillars (a) fabricated in a focused ion beam system. They are compressed with a nanoindenter tip to obtain stress-strain data (b). SOURCE: M.D. Uchic, D.M. Dimiduk, J.N. Florando, and W.D. Nix, "Sample dimensions influence strength and crystal plasticity," Science 305(5686): 986-989 (2004). Reprinted with permission from AAAS.

This section reviews the current state of databases useful to ICME, explores those data requirements, and anticipates the ways in which databases will need to expand in order to accommodate the needs of ICME models.

Current Status and Database Issues

When databases are properly constructed and maintained, they empower the efficient use of materials data in systems design by IPD processes. Progress in establishing such databases in materials science and engineering is hindered by several problems. Often it is difficult to delimit the scope of the data to be stored or to determine their nature, because the mechanisms controlling a given property of a material may not be known. The data, which can take many forms (numbers, images, graphs), must also be stored in a compact but low-loss form so that they can be resampled in the future. The variety of communities accessing and contributing to these databases means the databases must be both transparent and secure. Finally, while the manufacturing industry has the most to gain from ICME, companies have competitive reasons for not moving significant parts of their materials knowledge base into the public domain. Many aspects of databases (for
FIGURE 3-5 Integrating databases and computational materials science tools. DFT, density functional theory; MD, molecular dynamics; TTT, time-temperature transformation curve; and CCT, continuous cooling transformation curves.
example, funding, formatting, administration, populating, and business support) will need to evolve in order to better serve the emerging ICME infrastructure.

In several areas of ICME, small companies license powerful but low-cost drivers and graphical user interfaces (GUIs) that allow access to large proprietary databases. Such is the case for the materials thermodynamics and phase diagrams (CALPHAD: CALculation of PHAse Diagrams) community, which constructs databases from literature data and unpublished academic and industrial studies and then sells them as part of a thermodynamic modeling package. Other companies, such as MatWeb and Granta, offer large databases of material properties in the form of a commercial product or as inexpensive Web-based products supported by selling advertising space on the Web site, much like the commercial search engines Yahoo and Google. Because properties are so dependent on processing and structure, property databases that contain little or no information on manufacturing history or microstructure are, in the committee's judgment, of limited utility for the development of ICME. Since ICME seeks to build a tool set that links material composition and structure with material properties, such databases must contain lower-level inputs (for example, crystal structures, thermodynamic data, kinetic data, and physical properties).
This is analogous to the bioinformatics problem, where a record on a nucleotide DNA subunit within a database would, for instance, contain information on the organism from which it was isolated, the input sequence, the type of molecule, and a literature citation.15

Some materials professional societies play an important role in database development by bringing together industry, government, and academia to address data classification issues and by hosting materials properties databases. For example, ASM International has partnered with Granta, a privately held company based in the United Kingdom, to launch a Web-based materials information center. The site offers databases and software products for a variety of industries ranging from medical devices to aerospace and defense. Materials data can be managed through the group's proprietary software (GRANTA-MI), and another GUI enables materials and process selection using the Cambridge Materials Selector. This software incorporates a variety of standard databases relevant to different communities, for example, MMPDS data (previously MIL-HDBK-5), which is maintained by the Federal Aviation Administration for aerospace applications. Unfortunately these precompetitive databases (data in the public domain) are quite limited compared with what an IPD team in an industrial manufacturing setting actually needs.

Investment in materials databases is at a very low level compared to investment in biological and genomics databases. Significant government support of genomic databases coincided with the realization that such data were sufficiently important to growth in this area that any collected data should immediately become part of

15 For more information, see http://www.ncbi.nlm.nih.gov. Accessed February 2007.
the community's common knowledge base through archival, curated Internet databases. This approach has been enforced by requiring the original authors to post their data before publishing their findings. Such a requirement forces researchers to meet a certain level of fidelity in their data gathering and enforces a common data structure across the community. Currently GenBank, housed at the National Center for Biotechnology Information (NCBI),16 has over 130 gigabases of sequences, and Entrez17 offers an integrated database of databases allowing text-based searches. Smaller, model-organism databases that house genome sequences and data on structure and function (such as FlyBase,18 WormBase,19 and Mouse Genome20) cost between $400,000 and $1 million per year to develop and curate and are generally funded by the government through, for instance, the National Institutes of Health (NIH). Significant government investments, similar to those made by the NIH in the genomics community, will be required to create and curate the precompetitive databases required to support ICME.

Requirements for ICME Databases

Those who have built large scientific databases in the biology and physics communities emphasize two design principles.21,22 First, useful databases must start with a taxonomy of the field that is comprehensive and able to accommodate change. The fundamental problem with materials databases is that their structure, or schema, is generally focused narrowly on the immediate problem of the user base and is not easily expandable. Second, one must anticipate as much as possible what questions people will want to ask, because today's data representation may not be flexible enough to suit future lines of inquiry.
The development of materials and their transitioning to applications require efficient, informed, and flexible descriptions of the salient measurables driving property variability across a component. For example, one key measurable that presents particular challenges in storage and retrieval is the nature and variation in microstructure. There are ongoing materials science efforts to build microstructural databases, particularly for structural metals. The Office of Naval Research program on dynamic digital three-dimensional structure is exploring methods for collecting, storing, and retrieving microstructural data in the three-dimensional Materials Atlas. Microstructural data, images, and metadata are compressed using low-loss techniques and stored using a relational database (SQL) and a data format (HDF5) that were chosen for robustness and flexibility. The long-term success of the Materials Atlas will be determined by the ease with which users can select, retrieve, decompress, and use the information in as-yet-to-be-determined applications and by the availability of resources for curation.

The breadth of the databases required for ICME and the need for efficient data acquisition suggest the adoption of standardized formats. Such data and their repositories should facilitate the prediction of macroscopic materials properties (including cost) to meet the needs of design, manufacture, and prediction of lifetime and reliability. They must allow the efficient passing of data between models at different length scales, from different domains, and at times between different institutions (see Figure 3-5 and other examples in this section). Database designers are likely to use different approaches to pass and store information, and there are many such formats in use today, most of them geared to a particular modeling tool. Linking databases to models requires data communication via standard formats. In order to consolidate the collection of formats for exchanging data on the properties of materials, the National Institute of Standards and Technology (NIST) created the XML interchange format called MatML.

16 For more information, see http://www.ncbi.nlm.nih.gov/. Accessed December 2007.
17 For more information, see http://www.ncbi.nlm.nih.gov/sites/gquery. Accessed December 2007.
18 For more information, see http://flybase.bio.indiana.edu/. Accessed December 2007.
19 For more information, see http://www.wormbase.org/. Accessed December 2007.
20 For more information, see http://www.ncbi.nlm.nih.gov/genome/guide/mouse/. Accessed December 2007.
21 Cate L. Brinson, Northwestern University, "Materials informatics, what, how and why: Analogy to bioinformatics," Presentation to the committee on May 30, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
22 Krishna Rajan, Iowa State University, "Materials informatics," Presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
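The storage pattern used by the Materials Atlas, searchable metadata in a relational database with the bulk voxel data compressed losslessly, can be sketched with Python's standard library. The schema, table, and column names below are hypothetical, and SQLite with zlib stands in for the production SQL/HDF5 combination:

```python
import sqlite3
import zlib
import json

# Hypothetical schema: metadata in a relational table, voxel labels
# compressed losslessly as a BLOB alongside it.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE microstructure (
    id INTEGER PRIMARY KEY,
    alloy TEXT, process TEXT, voxel_size_um REAL,
    shape TEXT, voxels BLOB)""")

def store(alloy, process, voxel_size_um, shape, voxels: bytes):
    """Insert one data set: metadata as columns, data zlib-compressed."""
    con.execute(
        "INSERT INTO microstructure (alloy, process, voxel_size_um, shape, voxels) "
        "VALUES (?, ?, ?, ?, ?)",
        (alloy, process, voxel_size_um, json.dumps(shape), zlib.compress(voxels)))
    con.commit()

def load(alloy):
    """Retrieve and decompress a data set by its metadata key."""
    row = con.execute(
        "SELECT shape, voxels FROM microstructure WHERE alloy = ?",
        (alloy,)).fetchone()
    return json.loads(row[0]), zlib.decompress(row[1])

raw = bytes([0, 1] * 500)   # stand-in for 10 x 10 x 10 voxel phase labels
store("IN706", "serial-section", 0.5, [10, 10, 10], raw)
shape, data = load("IN706")
print(shape, data == raw)   # lossless round trip
```

The design choice illustrated here is the one the text emphasizes: metadata stays queryable while the bulk data survive compression without loss, so they can be resampled by as-yet-unknown future applications.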
The format was designed to be very flexible in specifying properties, limitations, uncertainties, sources, and generation methods. Unfortunately, the failure to specify nomenclature makes MatML too flexible for machine reading, and there is no clear way to write a universal MatML importer for use in a mechanics code. For example, there are several possible ways to express the name of the elastic modulus, and the term "modulus" has different meanings in mechanics. To manage these problems, Japan's National Institute for Materials Science created a taxonomy and then defined a subset of property names to be used within MatML. Unfortunately, the complexity of implementing this standard remains a barrier to its widespread adoption.

Advances in ICME will be strongly dependent on the balance struck between transparency and security in precompetitive databases. To produce relevant and reliable simulation and experimental tools, the academic community and the government laboratories require some level of transparency. However, to maintain their competitive advantage, OEMs may decide to maintain proprietary databases for their core technologies. Ideally, ICME software should be able to seamlessly integrate multiple proprietary and public data sources as inputs to materials and system integration tools in order to optimize over all the available data. Significant
growth in the precompetitive databases and a well-thought-out security strategy will be required to create and maintain this part of the ICME infrastructure. Models, untied from their proprietary databases, should be released into the public domain.

One of the lessons learned from previous ICME efforts is the profound importance of having experimental results to fill gaps in theoretical understanding, so a strong link between experimental data and modeling is essential for an ICME strategy to be successful. For those data to be accessible, the community needs a set of common databases, in much the same way that the genetics community requires a database of gene sequences. It is not enough, however, to have access to data. Materials development requires an understanding of how different features in the data may be correlated with others. For making those connections, the committee turns to a new field, materials informatics, based on ideas from the biological community.

Materials Informatics

A major challenge facing the efficient development and widespread adoption of ICME is to develop a better linkage between experimental data and modeling. Materials informatics, a promising new development in materials research, offers a way to meet that challenge. Materials informatics employs databases and advanced data-mining and analysis methods to seek patterns of behavior in large, complex data sets, with the goal of identifying new physical relationships between chemistry, structure, and properties.
Because the data sets used in informatics are not restricted to experimental data, materials informatics has the potential to connect the results of calculations and models with experiment, decreasing the time required to develop a new material model or a new material.23 The advantage of informatics is that it makes it possible to extract information from large and complex data sets. A number of methods have been developed to mine such information. One such method, principal component analysis, allows a

23 Because materials knowledge is embodied in experimental data, physically based models, empirical rules, and heuristics, the fusion of information from all available sources will increase the level of confidence in ICME analysis results. Although methods to fuse data and modeling predictions are immature and require further research, some progress has been achieved. For example, under the DARPA-sponsored AIM program, Bayesian analysis was conducted to predict yield strength variation. It used models within a Monte Carlo scheme to construct an a priori distribution that was then refined using only limited data. However, beyond such integrated analysis methods, models also can be applied to interpolate within sparse data sets; models, empirical rules, and heuristics can each be applied to identify suspect data and quantify/tag their uncertainty. Databases should be integrated with repositories of modeling results, rules, heuristics, and lessons learned to form a comprehensive knowledge base that will facilitate such information fusion.
researcher to find the minimum number of components that best describe a data set, enabling much easier classification and feature extraction. Other data-mining tools include partial least squares regression, cluster identification, association analysis, and anomaly detection. These approaches are common in some fields but have yet to be widely applied to materials data.

As an example, suppose the goal is to develop a new alloy system for a specific application. The expense associated with a complete exploration of a multicomponent design space is immense and generally not affordable. Informatics provides a way to identify trends in the data that might normally be overlooked. Using data-mining methods, common characteristics can be isolated and employed to identify promising classes of materials. The power of informatics is that the data can come in many forms, for example, from experiment or from modeling, and can have a wide range of uncertainty.

To be more specific, consider the hybrid data-mining and simulation technique of Fischer et al. for determining lowest-energy intermetallic structures and constructing their phase diagrams.24 A database holds the structures and free energies of a large number of binary and ternary intermetallic systems. When the user requests the phase diagram for a binary system not in the database, the software first guesses which structures the alloy could form by applying statistical methods to the database, then tests and refines those guesses by a series of ab initio calculations. Fischer et al. estimate that an unknown binary phase diagram, including all intermetallic crystal structures and lattice spacings, could be generated in this way using just 20 ab initio calculations.
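The principal component analysis mentioned above can be sketched in a few lines. The synthetic "alloy descriptor" data here are fabricated solely to show how a dominant component emerges from correlated measurements, and NumPy is assumed to be available:

```python
import numpy as np

def principal_components(data, n_components=2):
    """PCA by singular value decomposition: center the data, then the
    right singular vectors give the directions of greatest variance."""
    centered = data - data.mean(axis=0)
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    explained = s**2 / np.sum(s**2)          # fraction of variance per component
    scores = centered @ vt[:n_components].T  # samples projected onto the PCs
    return scores, explained[:n_components]

# Synthetic data set: 100 'samples' of 5 correlated descriptors
# (composition, hardness, and so on; purely illustrative).
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 1))           # one hidden driving factor
data = latent @ rng.normal(size=(1, 5)) + 0.05 * rng.normal(size=(100, 5))
scores, explained = principal_components(data)
print(explained)   # the first component should carry most of the variance
```

On real composition-processing-property tables, the same projection underlies the classification and feature extraction the text describes: correlated measurables collapse onto a few components that can then be screened or clustered.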
Moving up in scale, it is not hard to imagine mesoscale structure formation and evolution models (such as the phase field method) employing a similar approach to automatically access thermodynamic data and the results of ab initio calculations.25,26 Indeed, this general approach may be widely applicable in linking models across scales.

For the foreseeable future, the development of ICME computational models will require a specialized capability and a labor-intensive approach requiring an "expert." A good example of this is CALPHAD, which needs experts or those experienced in the "art" to develop data assessments and assemble databases. It is an iterative process, and many of the more commonly used databases for alloys (such as Ni superalloys and steels) have been in development for up to 20 years. Within the ICME framework, work on better, more efficient ways to manage databases

24. C. Fischer, K. Tibbets, D. Morgan, and G. Ceder, "Predicting crystal structure by merging data mining with quantum mechanics," Nature Materials 5 (2006): 641-646.
25. V. Vaithyanathan, C. Wolverton, and L.Q. Chen, "Multiscale modeling of θ′ precipitation in Al-Cu binary alloys," Acta Materialia 52 (2004): 2973-2987.
26. V. Vaithyanathan, C. Wolverton, and L.Q. Chen, "Multiscale modeling of precipitate microstructure evolution," Physical Review Letters 88(12) (2002).
and construct them in a more semiautomated way would be an important step forward.

Materials informatics is in its earliest stages of development. Much work remains before it will be developed sufficiently to be widely applicable in materials engineering; it requires creating a new and robust set of tools easily available to the materials engineer. It holds great promise, however, and could be a critical part of an ICME process.

Integration Tools: The Technological "I" in ICME

Technical tools for integrating materials knowledge are of obvious importance for ICME. Integration tools are the glue that binds software applications and databases into an integrated, cohesive, systemwide design tool that can be used by many contributors to the design effort. For ICME, these contributors might include materials researchers, materials design engineers, product designers, engineering design analysts, manufacturing analysts, purchasing agents, suppliers, and, possibly, quality control and customer support personnel. Integration tools are required for three tasks:

• Linking information from different sources and different knowledge domains. This information could be in the form of computational models or empirical relationships derived from experimental data.
• Networking and collaborative development. This would be a helpful technical tool for solving some of the cultural and organizational problems facing ICME, which will be described in Chapter 4.
• Optimization. This might be optimization of a product, a manufacturing process, or a material. It would allow materials engineers to fully engage in the computational engineering IPD process described in Box 2.1.

Integration is viewed differently in each of the communities expected to contribute to the growth of ICME. Graphical representations of the viewpoints of three of those communities are shown in Figures 3-6 to 3-8.
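A minimal sketch of the first task, linking models from different sources behind a common interface, is shown below. The wrapper class, the two stand-in "applications," and all numerical relations are invented for illustration; commercial integration frameworks provide far richer wrapping, scheduling, and data-management services.

```python
# Each tool (a simulation code, an empirical fit, a database lookup) is
# wrapped behind one call signature so a workflow engine can chain them.
class ToolWrapper:
    def __init__(self, name, func, inputs, outputs):
        self.name, self.func = name, func
        self.inputs, self.outputs = inputs, outputs

    def run(self, state):
        # Pull declared inputs from the shared state, push declared outputs.
        args = {k: state[k] for k in self.inputs}
        state.update(dict(zip(self.outputs, self.func(**args))))
        return state

# Two stand-in "applications" linked into one work flow (invented physics).
process_sim = ToolWrapper(
    "process", lambda melt_T: ((melt_T - 900.0) / 10.0,),
    inputs=["melt_T"], outputs=["cool_rate"])
structure_model = ToolWrapper(
    "structure", lambda cool_rate: (50.0 / (1.0 + cool_rate),),
    inputs=["cool_rate"], outputs=["grain_size"])

state = {"melt_T": 1000.0}
for tool in (process_sim, structure_model):
    state = tool.run(state)
print(state["grain_size"])
```

The value of the pattern is that the workflow loop never needs to know whether a wrapped tool is a local script, a commercial code, or a remote service, only which named quantities it consumes and produces.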
Figure 3-6 shows a typical multiscale figure, with the timescale and the length scale important for the description of various systems. Much of the work that could be deemed computational materials science entails performing calculations in each of these regimes and then, by passing information from one regime-specific tool to another, linking the phenomena across the scales. While this concept is often useful for defining a modeling strategy, its importance is sometimes overemphasized. Developing and linking models across length scales is not required for a workable ICME tool set. Rather, ICME practitioners develop models as an engineering activity that
FIGURE 3-6 Multiscale modeling, a construct used to illustrate the interdependence and connections between mechanisms acting at different length scales and timescales. SOURCE: Michael Doyle, Accelrys, "Integration of computational materials science and engineering methods," Presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.

requires an initial expert assessment to get proper matching between the problem being attacked and the length scales that must be considered.

Figure 3-7 (and to a large degree Figure 3-5) shows the integration problem from the viewpoint of a metallurgist. Viewpoints exist as well for ceramics, polymers, and other materials systems. Knowledge from disparate sources and domains (for example, thermodynamic models, models for simulating manufacturing processes, microstructural evolution models, and property models) is required to fully assess the influence of the manufacturing process on the properties of the materials that make up a manufactured product. Simulations of manufacturing processes must be integrated with computational models for phase equilibria, microstructural evolution, and property prediction. An important notion here is that the properties of an engineering product "compete" and thus must be balanced in its design. The complexity of this optimization problem dictates that a computational approach is required. Missing from the traditional metallurgist's perspective are the direct outputs to product development performance analysis and optimization. To be effective, ICME must address issues that are encountered in both of these integration domains and many more. In doing so, it will integrate these disparate fields into a holistic system allowing optimization and collaboration.
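The notion of competing properties can be made concrete with a small enumeration. The two property relations below are invented stand-ins (a Hall-Petch-like strength rise and an assumed toughness fall as grain size is refined); the point is only that when objectives move in opposite directions no single design dominates, so an explicit trade-off decision is required.

```python
# Illustrative competing properties of one microstructural variable
# (grain size d, in micrometers). Constants are invented.
def strength(d):   # MPa; rises as grains are refined
    return 150.0 + 400.0 / d ** 0.5

def toughness(d):  # arbitrary units; falls as grains are refined
    return 20.0 + 2.0 * d ** 0.5

# Enumerate candidate grain sizes and keep the non-dominated designs.
designs = [(d, strength(d), toughness(d)) for d in range(1, 101)]
pareto = [p for p in designs
          if not any(q[1] > p[1] and q[2] > p[2] for q in designs)]
print(len(pareto), len(designs))
```

Because the two objectives here change monotonically in opposite directions, every candidate survives the dominance filter; choosing a single design then requires a weighting, a constraint, or a systems-level optimization of the kind ICME aims to enable.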
Integration tools are thus the backbone of ICME. Depending on specific motivations, incentives, and requirements, they may be used in a proprietary setting (such as described
FIGURE 3-7 A metallurgist's view of the integration problem represented by ICME for a nickel-based superalloy. SOURCE: Adapted from Leo Christodolou, DARPA, "Accelerated insertion of materials," Presentation to the committee on November 20, 2006. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.

in the Ford virtual aluminum castings example in Chapter 2), in a collaborative but limited partnership setting (such as that described for the P&W AIM program, also in Chapter 2), or in an open, collaborative setting.

Commercial Integration Tools

Commercial integration software tools are available that are designed to link a variety of disparate software applications into an integrated package, which can then be used to optimize some underlying process. As a result of these efforts, de facto standards are emerging for "wrapping" models, running parallel parametric simulations, applying sensitivity analysis, and reducing the complexity (order) of systems. The vendors of these tools market and apply systems integration tools that solve specific engineering problems, tools for interoperability across organizations, and, in some cases, tools for education.
FIGURE 3-8 Models and experiments flow from the AIM integration architecture. SOURCE: DARPA and AFRL Accelerated Insertion of Materials program.

Simulation data managers (SDMs) such as iSIGHT/FIPER and CenterLink are Web-based tools that do the following:27,28

• Provide standards-based integration environments to link applications;
• Send data securely across network connections;
• Run application codes on computer resources that might be local or remote and that consist of heterogeneous hardware platforms;
• Use system resource or job execution queue managers such as load sharing facility (LSF);

27. Brett Malone, Phoenix, "Phoenix integration," Presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
28. Alex Van der Velden, Engineous, "Use of process integration and design optimization tools for product design incorporating materials as a design variable," Presentation to the committee on March 14, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
• Save design parameters and results in a database;
• Provide database mining capabilities;
• Enable three-dimensional surface design visualization;
• Provide response surface approximation for experimental data; and
• Measure and track uncertainty and contributions for given design parameters.

These and other integration tools are widely used for IPD, but they have almost no presence in the materials engineering community. That said, they have been successfully used in pilot ICME demonstration projects.29,30,31,32 In the Defense Advanced Research Projects Agency's (DARPA's) Accelerated Insertion of Materials (AIM) program, a commercial SDM called iSIGHT, from the company Engineous Software, was used to link computer-aided design (CAD), forging process modeling, models for heat treatment, microstructural evolution models, property predictions, and structural analysis applications into a seamless work flow called a designer knowledge base. This designer knowledge base, depicted in Figure 3-8, effectively integrated quantitative information from a wide variety of sources and models. Design data and experimental results were stored in a common database. From this demonstration, the committee concludes that state-of-the-art commercial integration tools are available for ICME and ready for widespread application, identifying and solving the unique problems that will arise as the discipline matures.

For integration tools, common interface standards are highly desirable so that application engineers do not have to rewrap applications many times for different uses. The development of standards and nomenclatures, or taxonomies, should be done in conjunction with model and software developers and vendors and not in isolation.
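One of the SDM capabilities listed above, response surface approximation, can be sketched with an ordinary polynomial fit. The aging-temperature and strength values are invented; a real SDM would fit multidimensional surrogates with cross-validation and uncertainty estimates.

```python
import numpy as np

# Sparse "experimental" points (invented): strength vs. aging temperature.
temps = np.array([950.0, 1000.0, 1050.0, 1100.0, 1150.0])   # deg C
strength = np.array([310.0, 340.0, 352.0, 345.0, 320.0])    # MPa

# Fit a quadratic response surface to the data.
coeffs = np.polyfit(temps, strength, deg=2)
surrogate = np.poly1d(coeffs)

# The surrogate is cheap to evaluate anywhere in the fitted range,
# e.g., at the vertex of the fitted parabola (the predicted optimum).
t_opt = -coeffs[1] / (2 * coeffs[0])
print(round(t_opt, 1))
```

Once built, such a surrogate stands in for expensive experiments or simulations inside design studies and optimization loops.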
NIST has developed a wrapping standard that is available in the commercial SDM application FIPER.33 Other interface formats are emerging from

29. Leo Christodolou, DARPA, "Accelerated insertion of materials," Presentation to the committee on November 20, 2006. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
30. Daniel G. Backman, Daniel Y. Wei, Deborah D. Whitis, Matthew B. Buczek, Peter M. Finnigan, and Dongming Gao, "ICME at GE: Accelerating the insertion of new materials and processes," JOM, November 2006: 36-41.
31. Dennis Dimiduk, United States Air Force, "Towards full-life systems engineering of structural metal," Presentation to the committee on May 30, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
32. Alex Van der Velden, Engineous, "Use of process integration and design optimization tools for product design incorporating materials as a design variable," Presentation to the committee on March 14, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
33. Ibid.
UGS, MSC, and Dassault.34 Additionally, the International Organization for Standardization (ISO) has promoted the Standard for the Exchange of Product Model Data (STEP), ISO 10303, as a comprehensive way to represent and exchange digital product information. However, application developers are often reluctant to create open interfaces to their applications. Open access is often viewed from a software developer's point of view as a risk to its intellectual property. To the extent that code vendors and authors are willing to cooperate in the creation of open standard interfaces to their applications and data, the general community would benefit; to encourage them to do so would require incentives from major government agencies and industrial consortia.

SDM environments provide the ability to securely transport data to local or remote computing resources and to execute a wide variety of applications on those resources. Once that has been done, the SDM takes computed results and stores them in a simulation database. SDM environments also enable collaboration among multiple research groups. Execution can be monitored by these groups, with all parties having limited or full access to the data or control of application execution. The groups can be inside or outside firewalls.

Optimization is an important objective for ICME. Once applications are linked into a common framework, the next logical step is to perform multidisciplinary, systemwide design and optimization. Design trade-offs can be made, and the resulting behavior can be propagated throughout the entire design work flow to obtain globally optimal solutions.
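The systemwide optimization step just described can be sketched as a scan over one shared design variable that propagates through two linked disciplinary models. The deflection and cost relations, the weight, and the candidate range are all invented; a production MDO tool would use gradient-based or surrogate-assisted optimizers rather than brute-force enumeration.

```python
# One shared design variable (a thickness, mm) feeds two disciplines.
def deflection(t_mm):          # structural response falls with thickness
    return 80.0 / t_mm ** 3

def cost(t_mm):                # manufacturing cost grows with thickness
    return 5.0 + 2.0 * t_mm

def objective(t_mm, w=1.0):    # weighted scalarization of the trade-off
    return deflection(t_mm) + w * cost(t_mm)

# Propagate every candidate through both models; keep the global best.
candidates = [1.0 + 0.01 * i for i in range(400)]   # 1.00 .. 4.99 mm
t_best = min(candidates, key=objective)
print(round(t_best, 2))
```

Changing the weight w shifts the optimum along the trade-off curve, which is how design intent (stiffness versus cost, here) is expressed to the optimizer.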
Although materials computations are not currently integrated into multidisciplinary optimization (MDO), the ICME-enabled desired future state would allow material and manufacturing process optimization trade-offs that could also be propagated throughout the entire design work flow. A single analysis of all of the linked application modules could be executed, or design studies could be conducted to assess trade-offs. An ICME-enabled MDO system could also be used to bring the systemwide design to an optimal global design point, or it could be run to simply assess reliability.

ICME Cyberinfrastructure

For many communities the World Wide Web serves as a platform for sharing information in the form of models and data. The term "cyberinfrastructure" refers to a relatively new infrastructure that, according to an NSF report,35 is "based upon

34. Nuno Rebelo, Simulia, "CAE: Past, present, and future," Presentation to the committee on May 30, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
35. For more information, see the Report of the National Science Foundation Blue-Ribbon Advisory Panel on Cyberinfrastructure. Available at http://www.nsf.gov/od/oci/reports/atkins.pdf. Accessed February 2008.
distributed computer, information and communication technology." Such an infrastructure is as essential to the knowledge industry as the physical infrastructure of roads, bridges, and the like is to the industrial economy. In 2003, the NSF Blue-Ribbon Advisory Panel on Cyberinfrastructure envisioned "the creation of thousands of overlapping field and project collaboratories or grid communities, customized at the application layer but extensively sharing a common Cyberinfrastructure."36 Important elements of the cyberinfrastructure described in this report included grids of computational facilities; comprehensive libraries of digital objects, including programs and literature; multidisciplinary, well-curated, federated collections of scientific data, online instruments, and sensor arrays; convenient software toolkits for resource discovery, modeling, and visualization; and the ability to collaborate with physically distributed teams of people using these capabilities. The report identified this as an important opportunity for NSF and stressed the importance of acting quickly and the risks of failing to do so. The risks include lack of coordination, which could lead to adoption of irreconcilable formats for information; failure to archive and curate data that have been collected at great expense and may be easily lost; barriers that can inadvertently arise between disciplines if isolated and incompatible tools and structures are used; waste of time and talent in developing tools that may have shortened life spans due to the above-mentioned lack of coordination and failure to incorporate a consistent computer science perspective; and, finally, insufficient attention to resolving cultural barriers to adopting new tools, which may also result in failure.
The committee proposes the following definition for the term "ICME cyberinfrastructure":

The Internet-based collaborative materials science and engineering research and development environments that support advanced data acquisition, data and model storage, data and model management, data and model mining, data and model visualization, and other computing and information processing services required to develop an integrated computational materials engineering capability.

A key element of the ICME cyberinfrastructure will be individual collaborative ICME Web sites and information repositories that are established for specific purposes by a variety of organizations but linked in some fashion to a broader network that represents the ICME cyberinfrastructure. The DARPA AIM Designer Knowledge Base, using iSIGHT, the Internet, and a geographically dispersed team, represents the only known example of an ICME-Web collaboration. Although collaborative Web sites in materials science and engineering are relatively rare, there are some. One example is nanoHub, the Web-based resource for research,

36. Ibid., p. 7.
education, and collaboration in nanotechnology, developed at Purdue University and funded by the NSF Network for Computational Nanotechnology.37 It is reportedly used by thousands of researchers from over 180 countries. Important elements of collaborative sites are security, networking capability, and, in some cases, grid computing. Since the technology surrounding collaborative Web sites and informatics is rapidly evolving and quite new in the case of materials science and engineering, it should be realized that there may be some redundancies and that some ICME Web sites and informatics efforts probably will fail eventually. If ICME develops soon and with substantial coordination, these redundancies and failed efforts will be minimized. The goal of a balanced, well-designed ICME cyberinfrastructure is to give scientists and engineers the means to do a number of things:

• Link application codes (for example, UniGraphics, PrecipiCalc, and ANSYS);
• Develop models that accurately predict multiscale material behaviors;
• Store and retrieve analytical and experimental data in common databases;
• Provide a repository for material models;
• Execute computational code wherever computational resources are available;
• Visualize large-scale data;
• Enable local or geographically dispersed collaborative research; and
• Measure the uncertainty in a given design and the contributions of individual design parameters or sources.

The desired future state is one in which a government-sponsored cyberinfrastructure composed of a variety of special-purpose Web sites is widely used for collaboration between researchers here and abroad. It will be routinely used to develop materials models for computer-aided engineering (CAE) analysis of new products, linked with manufacturing simulations.
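The repository and retrieval items in the list above can be illustrated with a toy in-memory registry whose entries also carry the provenance fields (uncertainty, trust, generation method) that shared data and model formats will need. Every name, field, and formula here is invented; a real ICME repository would be a curated, access-controlled Web service, not a dictionary.

```python
# Toy model repository: models registered with discoverable metadata.
repository = {}

def register(name, func, *, outputs, trust, generation_method):
    repository[name] = {"func": func, "outputs": outputs,
                        "trust": trust,
                        "generation_method": generation_method}

def find(output):
    """Names of registered models that predict a given output."""
    return [n for n, m in repository.items() if output in m["outputs"]]

# Two illustrative entries (invented forms and constants).
register("hall_petch", lambda d_um: 150.0 + 400.0 / d_um ** 0.5,
         outputs=["yield_strength"], trust="provisional",
         generation_method="empirical fit to literature data")
register("grain_growth", lambda T_K: 2.0e-4 * (T_K - 900.0),
         outputs=["grain_growth_rate"], trust="validated",
         generation_method="phase-field simulation surrogate")

print(find("yield_strength"), repository["hall_petch"]["trust"])
```

Tagging every entry with its trust level and generation method is what lets a downstream user, or an automated work flow, decide how much weight a retrieved model deserves.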
The development and maintenance of advanced materials models that are sensitive to manufacturing history will, for the foreseeable future, be accomplished by specialists from industry, small business, or academia. The materials models will be used to optimize product design and the manufacturing process and to develop new materials. Because these are collaborative tools, access to and control of vital data are crucial to all members of a research consortium and should not be hindered by security considerations. Minimizing redundant activities is a key side benefit of coordinated development programs such as those devoted to solving engineering challenge problems like the

37. For more information, see http://www.nanohub.org. Accessed March 2008.
ones described in Chapter 2 and will be expanded on in Chapter 4. Whether the ICME cyberinfrastructure is being put to use in the context of an effort to solve a foundational engineering problem or is part of a self-assembled activity conducted by a professional society, the extent to which it can be made openly accessible to researchers both at home and abroad will greatly influence the cost and rate of development of an ICME capability.

ICME integration tools must be made compatible with the underlying hardware and the software environment used in current design processes. Elements of computational compatibility include the following:

• Portability across heterogeneous hardware and operating systems. Members of the IPD team (IPDT) may be located at different companies that have different computing environments.
• Interoperability with other tools and integrating software.
• Standard data and I/O formats to permit data propagation among codes.
• Efficient operation, so that tools may be used within a design optimization loop.
• Good code design practices, so that codes may be updated as models improve or computing environments change.

Standard formats for each of these data inputs and outputs are required to avoid the proliferation of formats seen in property data. As with properties, each format needs to specify uncertainty, trust, and generation method. And formats will need to be able to evolve to meet changing needs and modeling capabilities.

Security is a vital part of the ICME collaborative integration process. Rarely can a materials designer or a developer of a material's constitutive relationship use a single code to take a design from microstructure predictions to material properties and the analysis of final product design.
This systemwide analysis requires running multiple applications that might reside on different systems run by different, possibly geographically remote, groups. The ability to remotely execute tools and move sensitive data in a secure manner is critical to the success of the design community. Secure access to model and data repositories is also essential. Without proper security, corporate and government security policies will impede the development of systemwide design environments.

Summary

Although existing computational materials science capabilities are impressive, they have not had a significant impact on materials engineering. Moreover, computational materials science lacks the integration framework that would make it widely usable in materials engineering. Establishment of such a framework would
transform the materials field. In selected instances, the existing tools have been integrated and applied in industrial settings, enabling the first demonstrations of the capabilities of ICME. Physically based models and simulation tools have progressed to the point where ICME is now feasible for certain applications, though much development and validation remain to be done if ICME is to be more broadly adopted. The widespread adoption of ICME approaches will require significant development of models, integration tools, and new experimental methods, and major efforts in the calibration of models for specific materials systems and in validation.

The continued evolution and maturation of computational materials science tools will increase the ease and efficiency with which ICME tools can be implemented. To be effective in an ICME environment, all of these tools must be developed in a manner that allows their integration with other tools; this should be a priority for model developers and funding agencies. Modeling approaches that embed uncertainty are also important for advancing ICME. The advantage of improvements in computational capabilities such as parallel processing should be exploited by future CMS and ICME developers.

Although ICME tools will be used in a computational engineering environment, experimental studies and data are also critical for the development of empirical models that can be used where there are gaps in theoretical understanding and that can be used to calibrate and validate ICME models. There are several new experimental methods under development whose maturation will do much to accelerate the widespread development of ICME. These include rapid characterization methods, miniature sampling techniques, and three-dimensional materials characterization techniques.
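Model calibration against limited experimental data, of the kind performed under the AIM program, can be sketched as a Bayesian update: a Monte Carlo sweep of a model builds a prior distribution for a property, and a few measurements then sharpen it. The model form, input range, measurement variance, and data values below are all invented for illustration.

```python
import random
import statistics

# Toy model: yield strength (MPa) from an uncertain microstructural
# input (Hall-Petch-like form; constants invented).
def yield_strength(grain_size_um):
    return 200.0 + 500.0 / grain_size_um ** 0.5

# 1. Build the a priori distribution by Monte Carlo over the model input.
random.seed(0)
prior = [yield_strength(random.uniform(8.0, 12.0)) for _ in range(10000)]
mu0 = statistics.fmean(prior)
var0 = statistics.pvariance(prior)

# 2. Refine with a handful of measurements (known measurement variance),
#    using the conjugate normal-normal update.
data = [362.0, 358.0, 365.0]
meas_var = 25.0
n = len(data)
post_var = 1.0 / (1.0 / var0 + n / meas_var)
post_mu = post_var * (mu0 / var0 + sum(data) / meas_var)

print(round(mu0, 1), round(post_mu, 1))
```

The posterior variance is smaller than the prior variance, which is the quantitative payoff: three measurements, fused with the model-based prior, yield a tighter strength estimate than either source alone.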
Validation experiments should be a key element of any approach to solving the engineering challenge problems that will be discussed in Chapter 4.

The creation and maintenance of dynamic, open-access repositories for data, databases, and materials taxonomies are essential. These databases can also play a role in linking models at different spatial and temporal scales. Open-access databases will reduce redundant research, improve efficiency, and lower the costs of developing ICME tools.

The integration tools that are now available provide working solutions for ICME, but significant infrastructural development will be required to realize the benefits of integration. One forerunner of an ICME capability will be the establishment of curated ICME Web sites that can serve as repositories for data, databases, and models and as venues for collaboration, model development, and integration. Significant government investments, similar to those awarded by the NIH to the genomics research community, will be required to create and curate the cyberinfrastructure necessary to support ICME. The extent to which this ICME cyberinfrastructure can be made open and accessible will greatly speed up the development of an ICME capability and lower its cost.