The second session of the workshop focused on developing models to represent microstructure evolution, alloy design, and part suitability. Lyle Levine (National Institute of Standards and Technology [NIST]) and Kyle Johnson (Sandia National Laboratories) gave opening presentations and were joined by Annett Seide (MTU Aero Engines), Eric Jägle (Max Planck Institute), Deniece Korzekwa (Los Alamos National Laboratory), Christian Leinenbach (Empa), and John Turner (Oak Ridge National Laboratory) for a panel discussion relating to the following questions:
- How does the additive manufacturing (AM) community develop and validate computer models that use measured material property data and build parameters to predict the location-dependent state of as-built and post-processed components?
- How does the AM community develop and validate computer models that connect the location-dependent state of a part to its performance?
Lyle Levine, National Institute of Standards and Technology
Levine began his presentation with a discussion of how the processing, structure, property, and performance stages in AM interact. He
explained that feedstock material and other environmental considerations can combine with the complex build process to create a complex composition and thermal stress history. This information can inform models of residual stresses and microstructure, which can then provide estimates of mechanical properties and life-cycle behavior.
Levine provided measurements for laser powder-bed fusion, each categorized by model inputs, model guidance, and model validation (see Table 3.1). He noted that a previous AM workshop (see NASEM, 2016) stressed the importance of benchmark measurements for comparison testing. In response, NIST began the AM Benchmark Test Series (AM-Bench) and now has a scientific committee that includes 60 organizations and 83 members.
Several issues arose as this committee first attempted benchmark measurements, Levine explained. First, there was a tremendous range of additive processes and materials, as well as unexplained build variability between machines and processes. Time-intensive, metrology-grade measurements were needed, and the systems were still being built. To streamline the process, two general sets of benchmarks were created for metals. The first set involved 21 scientists from 6 organizations and focused on part deflection, residual elastic strains, microstructure, phase fractions, and phase evolution. The second set involved 14 scientists from 2 organizations and focused on low-level phenomena, including melt-pool geometry, cooling rate, topography, grain structure, dendritic microstructure, and three-dimensional structure. A blind benchmark challenge drew 46 submissions, almost all involving metals. Levine noted that the groups that incorporated more physics into their submissions ended up closer to the actual measurements.
During a later discussion period, a participant asked Levine why there were so few valid submissions. Levine responded that the relationship between residual stress measurements and part distortion models posed challenges: two submissions tied for first place in predicting residual stress measurements accurately, but there was no winner for predicting part distortion. The groups also struggled to predict surface topography, such as the chevron patterns that form on material surfaces, and to anticipate liquid flow during solidification; he speculated that this could be due to surface tension effects. Microstructure evolution was also challenging, particularly understanding which phases and precipitate sizes and shapes develop as a function of time. Few groups submitted results for microstructure evolution.
TABLE 3.1 Measurement List for Powder-Bed Fusion
| Measurement category | Model inputs | Model guidance | Model validation |
|---|---|---|---|
| Melt pool (in-situ builds and tracks) | | | |
| Laser tracks and build layers | | | |
| Part level (as-built and processed) | | | |
SOURCE: Lyle Levine, National Institute of Standards and Technology, presentation to the workshop, October 24, 2018.
Levine stated that good progress has been made on quantitative in-situ monitoring, but further development is needed: some of the required technology is not widespread, and there is often poor traceability to primary reference standards. International benchmark measurements are under way, but they are limited in scope compared to the technological need. He said that the technology for state characterization is largely developed, although some capabilities are widespread while others require specialized facilities. Lastly, he noted a severe lack of AM-compatible alloys and of relevant thermophysical and related materials data.
During the question and answer portion of this presentation, Levine was asked about the current state of three-dimensional microstructure measurement. He responded that the only place he knows of that performs three-dimensional microstructure measurements successfully is the U.S. Naval Research Laboratory; NIST has struggled with it in the past. He explained that when examining grain structures with an X-ray computed tomography scan, the measurement relies on diffraction rather than transmission, and high dislocation densities and resolution limits become problems. The only way to measure three-dimensional microstructure well is with a cross-sectional scanning electron microscope using localized geometry; electron backscatter diffraction or other approaches can then be used to examine the microstructure.
During the later panel discussion, an audience member asked about the next AM-Bench and whether Levine had thought about comparing conduction mode with keyhole mode. Levine stated that this was an excellent question, but the specifics would need to be considered. For example, would the build be conducted on one line with increasing power along that line, or as a bare-plate test in which the power transitions from conduction to keyholing? A material system would also need to be chosen before making the transition.
Kyle Johnson, Sandia National Laboratories
Johnson stated that AM is a multiscale, multilevel problem. Sandia National Laboratories has a vision for linking processing, structure, property, and performance via the following six programs (listed in order from short term to long term).
- Thermal process modeling coupled with microstructure prediction. Sandia is working on microstructure prediction through its Stochastic Parallel PARticle Kinetic Simulator (SPPARKS), which is used for AM single continuous build and powder-bed methods.
- Thermal process modeling coupled with residual stress prediction. Sandia has a Laser Engineered Net Shaping3 process to fabricate three-dimensional metallic components directly from computer-aided design solid models and to simulate AM builds. Sandia is moving toward reduced-order models to compute the full stress states more efficiently; these models can simulate a 6-hour build in 8 minutes. Neutron diffraction measurements are also being incorporated into performance models.
- Fast performance prediction accounting for as-built state, properties, and defects for qualification. With 21 participant teams, the Third Sandia Fracture Challenge centered on predicting tensile failure of an AM part.
3 For more information about Sandia National Laboratories’ Laser Engineered Net Shaping process, see https://www.sandia.gov/mst/technologies/net-shaping.html, accessed October 26, 2018.
- Efficient concurrent multiscale modeling and uncertainty quantification using techniques such as multigrid and error estimation when material statistical homogeneity does not apply. An example would be generating microstructures using kinetic Monte Carlo, running a homogeneous simulation with an isotropic material model, recovering localized stresses using a posteriori error methods, and then comparing the results to direct numerical simulations of full kinetic Monte Carlo microstructure.
- Advanced high-throughput testing capability coupled with machine learning algorithms. Full-field high-throughput testing can now be combined with machine learning. Sandia has been looking at additional volume correlation techniques to get more volumetric results instead of surface-level results. The volumetric results could then be turned into a neural network that can help determine the correlation with failure or critical defect structure.
- Process parameter-dependent microstructure prediction leading to local texture control and optimization. Johnson provided an example that illustrated how process settings affect microstructure. Coupling the process-dependent microstructure with a design optimization code, such as Plato, might lead to the creation of a site-specific optimized microstructure (Popovich et al., 2017). Johnson said that this could be a “game changer” but is likely still years away.
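Two of the programs above lean on kinetic Monte Carlo microstructure generation (e.g., SPPARKS). As a rough illustration only (this is not Sandia's code), a minimal two-dimensional Potts-model grain-growth sweep can be sketched in Python; the grid size, number of grain IDs, temperature, and sweep count below are arbitrary toy choices.

```python
import math
import random

def potts_grain_growth(n=32, q=10, sweeps=40, kT=0.3, seed=0):
    """Toy 2D Potts-model kinetic Monte Carlo for grain growth.

    Each lattice site holds a grain ID; the energy of a site is the
    number of unlike nearest neighbors.  A Metropolis move proposes
    adopting a random neighbor's grain ID, which gradually coarsens
    an initially random structure into grains.
    """
    rng = random.Random(seed)
    grid = [[rng.randrange(q) for _ in range(n)] for _ in range(n)]

    def neighbors(i, j):
        # periodic boundary conditions
        return [grid[(i - 1) % n][j], grid[(i + 1) % n][j],
                grid[i][(j - 1) % n], grid[i][(j + 1) % n]]

    def energy(i, j, spin):
        return sum(1 for s in neighbors(i, j) if s != spin)

    for _ in range(sweeps * n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        proposed = rng.choice(neighbors(i, j))
        d_e = energy(i, j, proposed) - energy(i, j, grid[i][j])
        if d_e <= 0 or rng.random() < math.exp(-d_e / kT):
            grid[i][j] = proposed

    return grid

micro = potts_grain_growth()
# count grain-boundary segments along rows as a crude coarsening metric
boundary = sum(1 for i in range(32) for j in range(32)
               if micro[i][j] != micro[i][(j + 1) % 32])
print("grain-boundary segments along rows:", boundary)
```

In the workflow Johnson described, a synthetic microstructure of this kind would feed a homogenized simulation, with localized stresses recovered afterward by a posteriori error methods and checked against direct numerical simulation.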
Johnson noted that challenges remain with each of these six steps. For thermal process modeling coupled with microstructure prediction, better three-dimensional microstructure imaging capabilities are needed, and representation of local microstructure on full-size parts is both a computing power and data storage issue. For thermal process modeling coupled with residual stress prediction, residual stress is still difficult to measure, type-II residual stress is difficult to predict, and an optimization for residual stress is needed. Fast performance prediction accounting for as-built states, properties, and defects for qualification still has to include uncertainty quantification for these materials. Concurrent multiscale modeling and uncertainty quantification using techniques such as multigrid and error estimation can be expensive and difficult. Crystal plasticity models need to account for as-built dislocation structures and other microstructural characteristics that are unique to AM. Lastly, Johnson noted that advanced high-throughput testing capabilities coupled with machine learning algorithms still face questions such as what to use for speckle patterns and which defects or defect networks matter, as well as how to find them.
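The high-throughput testing idea above, turning volumetric correlation results into a learned predictor of failure, can be mimicked on synthetic data. The sketch below is purely illustrative and is not Sandia's pipeline: the two defect features, the labeling rule, and all constants are invented, and a simple logistic regression trained by gradient descent stands in for the neural network.

```python
import math
import random

# Invented volumetric "defect features" per specimen, scaled to [0, 1]:
# x1 ~ pore volume fraction, x2 ~ largest pore size.  The failure label
# follows a made-up linear rule plus noise, so a linear classifier fits.
rng = random.Random(1)
data = []
for _ in range(200):
    x1, x2 = rng.random(), rng.random()
    failed = 1 if x1 + x2 + rng.gauss(0.0, 0.05) > 1.0 else 0
    data.append(((x1, x2), failed))

# Logistic regression trained by batch gradient descent.
w1 = w2 = b = 0.0
lr = 1.0
for _ in range(3000):
    g1 = g2 = gb = 0.0
    for (x1, x2), y in data:
        p = 1.0 / (1.0 + math.exp(-(w1 * x1 + w2 * x2 + b)))
        g1 += (p - y) * x1
        g2 += (p - y) * x2
        gb += p - y
    w1 -= lr * g1 / len(data)
    w2 -= lr * g2 / len(data)
    b -= lr * gb / len(data)

# fraction of specimens whose failure label the fitted model reproduces
accuracy = sum(
    1 for (x1, x2), y in data
    if (w1 * x1 + w2 * x2 + b > 0.0) == bool(y)
) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

A real version would face exactly the open questions Johnson listed: which measured defect features matter, and whether the labels (here fabricated) reflect the defect networks that actually drive failure.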
Following the presentations, Jägle, Leinenbach, Korzekwa, Seide, Turner, and Johnson participated in a panel discussion on microstructure evolution, alloy design, and part suitability. In addition to the two session questions outlined at the beginning of this chapter, Levine, the moderator, posed the following questions:
- What thermophysical parameters are most needed and how can they be measured?
- AM-Bench can only provide a limited amount of data. What future benchmark measurements should have the highest priority?
- It has been suggested that transition states/instabilities are important to investigate—for example, the onset of keyholing and dimensional instabilities for thin walls. What other transition states merit investigation?
- How can commercial in-situ process monitoring systems be validated?
- What is the best role for high-performance computing in AM simulation?
- What are short-, intermediate-, and long-term needs and directions in AM?
An audience member asked the panelists to share their thoughts on microstructure evolution modeling; in particular, since the microstructure cannot yet be truly predicted, could a blind prediction be used as the next step? Levine responded that AM-Bench 2018 did ask what phases develop in Inconel 625 during a residual stress heat treatment. One AM-Bench group correctly predicted the phases, but the growth rate and the shape of the precipitates were incorrect. Seide stated that although blind predictions may someday be useful, they are not possible yet. Johnson added that for certain materials, such as austenitic stainless steel, predictions are fairly reasonable; however, more research is needed to understand the impact of defects. Jägle noted that blind predictions are currently out of reach for areas such as precipitate nucleation or growth rate, where the outcome is controlled by defects; moreover, very few people go beyond classical nucleation approaches.
Another question was how to better integrate sensing data into models to improve predictions. Korzekwa stated that sensing data could be used both as a model input and to validate model output. Understanding the boundary conditions of the situation is also very important; however, this all depends on the model and on what the data actually are. An audience member asked how data could be used with models to predict or estimate the full state of a system that cannot be measured directly. Levine responded that projects should generally be joint measurement and modeling efforts because one cannot model and measure everything; measurements can be used to constrain model parameters and help identify the underlying physics.
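Levine's remark that measurements constrain model parameters can be shown with a minimal calibration sketch. Everything here is invented for illustration: a one-parameter exponential cooling model is fit to synthetic "measured" temperatures by brute-force least squares.

```python
import math

# Hypothetical cooling model: T(t) = T_amb + (T0 - T_amb) * exp(-k * t).
# The "measurements" are generated from a known rate constant K_TRUE so
# the calibration can be checked; all numbers are invented.
T0, T_AMB, K_TRUE = 1600.0, 300.0, 0.8
times = [0.5 * i for i in range(1, 9)]
measured = [T_AMB + (T0 - T_AMB) * math.exp(-K_TRUE * t) for t in times]

def sum_sq_error(k):
    """Sum of squared residuals between model and measurements."""
    return sum(
        (T_AMB + (T0 - T_AMB) * math.exp(-k * t) - m) ** 2
        for t, m in zip(times, measured)
    )

# Brute-force grid search over the single rate constant.
candidates = [0.01 * i for i in range(1, 301)]
k_fit = min(candidates, key=sum_sq_error)
print(f"calibrated rate constant: {k_fit:.2f}")  # recovers 0.80
```

A real calibration would use noisy data and a proper optimizer, but the structure is the same: measurements pin down parameters the model cannot determine on its own.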
A participant asked whether a heat treatment could be devised to encourage a particular long-term microstructure evolution regardless of the as-built microstructure. Johnson responded that he is not sure if it is possible to do so. Jägle replied that there are limited options for changing the heat treatment, which is why it is important to understand the solidification process. Levine stated that developing a heat treatment for a specific AM alloy is complex. He described two cases in which NIST tried to develop heat treatments with unexpected complications. In one case, a residual stress heat treatment was needed before cutting the parts off of the build plate. This process resulted in significant amounts of unpredicted niobium, which had to be eliminated.
A participant asked about the roles of creep, fatigue, and tensile properties in microstructure evolution. Korzekwa responded that, overall, predicting segregation and texture is challenging, and temperature-dependent mechanical properties are not understood well enough to predict some of the previously mentioned heat treatment effects. She noted that more work is needed to improve modeling capabilities and estimates of relevant material properties; Jägle added that these advances could help researchers achieve desired microstructures and better understand performance. Once there is a microstructure, the models used to translate it into thermomechanical properties are similar, with additional considerations such as defects that are not present in other materials. Levine gave an example in which his team at NIST tested about six different annealing treatments before the precipitation process. The team performed mechanical testing on the treated parts; although these parts were composed of the same material and were subjected to heat treatments in similar ranges, tensile test results varied by a factor of three. This difference was due to microstructure variations, including which precipitates formed (and their size and predictability).
The same audience member asked how current knowledge of microstructure modeling could be applied to multicomponent alloy design. Jägle responded that there is no single approach to alloy design; it depends on the type of alloy. If he were asked to design a better aluminum alloy, for example, he would need to design better precipitates or compositions that would work in AM.
In response to a question about model uncertainties and validation tests, Turner stated that confidence in a model is needed before exploring factors such as surface tension at various temperatures. Teter noted that sensitivity analysis of certain parameters, such as recoil pressure versus Marangoni convection in melt-pool behavior, remains a big open question in AM.
Levine asked the panelists about short-, intermediate-, and long-term goals for AM of metals, which are described below.
Short-term goals:
- Improving microstructure modeling, particularly for the prediction of grain size, phases, and defects (Johnson);
- Using machine learning on in-situ monitoring data (Johnson);
- Developing guidelines for qualification design (Johnson);
- Modifying existing alloys to work in AM (Jägle);
- Improving the understanding of the physics behind some materials’ behaviors (Korzekwa);
- Refining standards (Seide); and
- Obtaining temperature-dependent thermophysical properties needed for simulations; some software systems treat temperature as a function of part geometry and other properties, which may be a direction worthy of further exploration (Leinenbach).
Intermediate-term goals:
- Simulating all laser passes with computationally efficient approaches (Johnson);
- Improving topology optimization and location-specific process optimization (Johnson);
- Strengthening the understanding of modeling capabilities, such as process-continuous models and microstructure models (Korzekwa);
- Expanding training in computational materials engineering (Seide); and
- Developing a multiphysics approach for coupling capabilities (Seide).
Long-term goals:
- Combining digital volume correlation with machine learning to minimize failure (Johnson);
- Creating AM-specific alloys with specialized cooling rates (Jägle);
- Developing more user-friendly models (Korzekwa);
- Improving the understanding of microstructures in different parts and positions for localized needs (Seide);
- Strengthening model reliability to predict distortion, microstructure, and mechanical properties (Leinenbach);
- Establishing a set of community models and interfaces between the different components, where lower- and higher-fidelity models can be interchanged—this could be similar to an open source version built in a collaborative environment (Turner); and
- Developing community standards on the models and interfaces (Turner).
NASEM (National Academies of Sciences, Engineering, and Medicine). 2016. Predictive Theoretical and Computational Approaches for Additive Manufacturing: Proceedings of a Workshop. Washington, DC: The National Academies Press.
Popovich, V., E. Borisov, A.A. Popovich, V. Sufiiarov, D.V. Masaylo, and L. Alzina. 2017. Impact of heat treatment on mechanical behaviour of Inconel 718 with tailored microstructure processed by selective laser melting. Materials and Design 131:12–22.