2 Materials Design
Pages 3-41

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text within every page in the chapter.


From page 3...
... Finding Thermally Robust Superhard Materials with Machine Learning
Superhard materials have a large variety of uses, from drill bits to artificial joints, with annual sales estimated to reach around $14 billion by 2024. As Jakoah Brgoch, University of Houston, told the workshop, the coming transition to clean energy will make it even more important to find superhard materials for use in extracting the minerals needed for clean energy technologies. Brgoch's group has therefore been looking for new superhard materials that can be produced efficiently and used under demanding conditions, such as high temperatures.
From page 4...
... By contrast, diamond, cubic boron nitride, and BC2N have a Vickers hardness well above 40 gigapascals no matter what load is put on them. A second issue with the use of the second group of materials for superhard applications, he explained, is that many of the transition metals are very scarce, so it would not be feasible to make commercial amounts of a superhard material that relied on one of these very rare transition metal elements.
From page 5...
... This would be convenient, as the bulk modulus and shear modulus of materials can be calculated relatively easily with density-functional theory, a computational method based on quantum mechanics. However, when his group tried this approach, they found that while they could get figures that were in the ballpark of Vickers hardness, they were not accurate enough for the group's purposes.
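Brgoch's group found direct moduli-to-hardness estimates too inaccurate for their purposes, but it may help to see what such an estimate looks like. One widely cited empirical relation, Chen's model, maps the DFT-computable bulk modulus B and shear modulus G to a Vickers hardness via Hv = 2(k²G)^0.585 − 3 with k = G/B. The function name and the example moduli below are illustrative, not taken from the talk:

```python
def vickers_hardness_chen(bulk_gpa, shear_gpa):
    """Estimate Vickers hardness (GPa) from elastic moduli using Chen's
    empirical model: Hv = 2 * (k**2 * G)**0.585 - 3, where k = G/B."""
    k = shear_gpa / bulk_gpa  # Pugh's modulus ratio
    return 2.0 * (k * k * shear_gpa) ** 0.585 - 3.0

# Roughly diamond-like moduli (B ~ 440 GPa, G ~ 530 GPa) land the
# estimate well inside the superhard regime (> 40 GPa).
print(vickers_hardness_chen(440.0, 530.0))
```

Such formulas give ballpark figures, which is consistent with the limitation Brgoch described.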
From page 6...
... The next step, Brgoch continued, was to settle on an algorithm that would find relationships between the set of variables and the Vickers hardness, which, if successful, would make it possible to accurately predict the Vickers hardness of materials for which the hardness had not been measured, based only on the other properties of the material. The team tried a variety of machine learning algorithms, such as random forest, gradient boosting, and XGBoost; the XGBoost model had the best performance, with an R² correlation of 0.97 between the predicted and measured values of Vickers hardness, indicating a very strong performance (Figure 2-1)
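The R² figure quoted here is the standard coefficient of determination. A minimal sketch of computing it, using made-up measured and predicted hardness values (the real model was trained on a much larger experimental data set):

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Made-up measured vs. predicted Vickers hardness values (GPa).
measured = [10.0, 25.0, 40.0, 60.0, 95.0]
predicted = [12.0, 24.0, 38.0, 63.0, 93.0]
print(r_squared(measured, predicted))
```

An R² near 1 means the predictions track the measurements closely; a value of 0.97 over held-out data indicates the model explains nearly all of the variance in measured hardness.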
From page 7...
... "We had 66,440 compounds -- 10,000 binaries, 36,000 ternaries, 20,000 quaternary compositions -- that on a desktop we could predict in about 30 seconds. Once it's trained, this is a very quick process." Once the predictions had been made, the team sorted them, looking not only for superhard materials -- that is, those with a predicted Vickers hardness of at least 40 gigapascals -- but also for materials whose hardness was predicted to change little as an increasing load was applied.
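The screen described here can be sketched as a simple filter over predicted hardness values at increasing load. The compositions and numbers below are hypothetical, not the group's actual predictions:

```python
# Hypothetical model output: composition -> predicted Vickers hardness
# (GPa) at a low and a high indentation load. Names and numbers are
# illustrative only.
predictions = {
    "WB4": (45.1, 41.8),
    "ReB2": (40.5, 33.0),
    "TiO2": (12.0, 10.5),
    "c-BN": (65.0, 62.0),
}

def superhard_candidates(preds, threshold=40.0, max_drop=5.0):
    """Keep compositions predicted to stay superhard (>= threshold GPa)
    with little softening as the load increases."""
    hits = []
    for name, (low_load, high_load) in preds.items():
        if high_load >= threshold and (low_load - high_load) <= max_drop:
            hits.append(name)
    return sorted(hits)

print(superhard_candidates(predictions))
```

Once a model is trained, this kind of screen over tens of thousands of candidate compositions is nearly free, which is the "30 seconds on a desktop" point made in the talk.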
From page 8...
... Next, the team applied their machine learning algorithm to create a model that predicts temperature-dependent Vickers hardness. They found that the model's results were not as clean as the earlier results -- which made sense because it was based on far fewer data -- but it was still able to capture the basic trend of how Vickers hardness varied with temperature, with the hardness dropping as the temperature increased.
From page 9...
... More generally, she continued, pressure can produce new phases and forms in materials, it can induce and tune desirable properties, and it can also be used to provide important insights into materials' behavior, which can in turn be useful in aiding the design and discovery of materials with exceptional properties that might be suitable for extreme environments. The first step that Mao's team takes in using high pressures to create new materials is to "study systems of interest over a wide range of pressures and temperatures using a suite of in situ and ex situ probes." A typical way to do this, she explained, is to use diamond anvil cells, in which a sample is held between the flat faces of two diamonds, with a gasket preventing the sample from escaping as pressure is applied.
From page 10...
... As the pressure increases up to 40 gigapascals, the sphere is progressively squashed, becoming increasingly dense, and that increased density remains even when the pressure is removed. Measurements indicated that after exposure to 40 gigapascals, the density of the glassy carbon was permanently increased by 30 percent.
From page 11...
... These can be thought of as strips of graphene, and because they have a tunable band gap, they have some very useful properties. However, it is difficult to make these graphene nanoribbons, especially long, thin ones with smooth edges, so a faculty member in the Stanford chemistry department suggested that it might be easier to create nanoribbons by first synthesizing carbon nanotubes and then flattening the tubes in a diamond anvil cell.
From page 12...
... They loaded a diamond anvil with glassy carbon and argon, which diffused into the nanopores of the glassy carbon. When this was all exposed to pressures of greater than 20 gigapascals, the glassy carbon was transformed into nanodiamond, which held high-pressure argon in its inclusions (Figure 2-2)
From page 13...
... "In Science," she said, "we had some model nanocrystal systems where they did interparticle sintering and found that's able to further maximize the kinetic barrier, and they found out those phases didn't go back, and now also they add some ligands."
Machine Learning Tools for Extending Quantum Simulations of Reacting Materials
Nir Goldman, Lawrence Livermore National Laboratory, spoke about using machine learning to fill in a gap between what quantum simulations can provide and what experimentalists would like to know in terms of synthesizing materials. In particular, he spoke about a specific model, the Chebyshev Interaction Model for Efficient Simulation (ChIMES)
From page 14...
... Furthermore, the development of additive manufacturing has added another layer of complexity because one needs to consider not only the chemistry of the materials and how they interact but also their grain size and the geometry of the printed material. "All of these things, in particular under dynamic compression, can lead you to different chemical paths," he said. "So if you have a desired outcome and you want to do materials design and additive manufacturing, you have this gigantic problem to deal with." The traditional approach to solving such a problem would be trial and error: start with a molecule, synthesize it, create some sort of device with it, then measure it, and repeat over and over again until a suitable path is found to the desired material.
From page 15...
... In the remainder of his talk, he described two types of machine learning approaches that his group has worked on. One is a classical molecular dynamics approach that involves no quantum mechanical calculations, while the other involves a mixture of quantum mechanical approaches and empirical functions.
From page 16...
... Switching then to the second machine learning approach that his team has developed, Goldman described how the group has used semi-empirical quantum mechanics to get closer to something that is plug and play -- that is, much more ready for download and use. In particular, they use data from existing high-accuracy quantum chemical databases to improve semi-empirical quantum mechanical models via machine learning.
From page 17...
... It is derived directly from density functional theory and can be thought of as approximate density functional theory, but the quantum mechanical parts -- which represent the band structure and Coulomb forces -- are "pre-tabulated" via approximate quantum mechanics, while the short-range repulsive energy is represented by an empirically fit function. In this form, the solution is much easier for machine learning to determine.
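ChIMES, mentioned earlier, takes its name from expanding interaction energies in Chebyshev polynomials, and the empirically fit short-range repulsive term described here is the kind of function such an expansion can represent. A minimal sketch of evaluating one (the coefficients, cutoff radii, and function names are invented for illustration, not fitted ChIMES parameters):

```python
def chebyshev_T(n, x):
    """Chebyshev polynomial of the first kind, via the recurrence
    T0(x) = 1, T1(x) = x, T_{n+1}(x) = 2x*T_n(x) - T_{n-1}(x)."""
    t_prev, t = 1.0, x
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t = t, 2.0 * x * t - t_prev
    return t

def repulsive_energy(r, coeffs, r_min=1.0, r_max=4.0):
    """Short-range pair energy as a Chebyshev expansion in a distance
    coordinate mapped onto [-1, 1]. Coefficients and cutoffs are
    placeholders."""
    s = 2.0 * (r - r_min) / (r_max - r_min) - 1.0
    return sum(c * chebyshev_T(n, s) for n, c in enumerate(coeffs))

# Evaluate a toy two-term expansion at the inner cutoff.
print(repulsive_energy(1.0, [1.0, 0.5]))
```

Because the expansion is linear in its coefficients, fitting it to reference quantum mechanical energies reduces to a linear least-squares problem, which is part of what makes this form tractable for machine learning.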
From page 18...
... In the first part of his presentation, Homer spoke about how the Olmsted data set affected work on grain boundary energy. Then he switched to the effects of the data set on work related to grain boundary migration and wrapped up his talk with a look to the future.
From page 19...
... "We're able to predict grain boundary energy, grain boundary mobility, and whether grain boundaries are shear coupled or not using this machine learning," he said. "And this is coming from when we only have 388 datapoints in this data set."
From page 20...
... Reed, and M. Kumar, 2014, "Grain Boundary Energy Function for FCC Metals," Acta Materialia 65:161–175; Copyright 2022, with permission from Elsevier.
From page 21...
... This exemplifies another benefit of the Olmsted database, Homer said: "It has reinvigorated the field in terms of studying grain boundary migration." More generally, he said, "It has moved our field forward in important ways. It has led to the development of new methods.
From page 22...
... So, my answer is aligning incentives, but I don't know what that looks like."
FIRST-DAY AFTERNOON SESSION
Horacio Espinosa, Northwestern University, chaired the first day's afternoon session, which had three speakers: Christopher Weinberger, Colorado State University; Penghui Cao, University of California, Irvine; and Shyue Ping Ong, University of California, San Diego. A brief discussion period followed each presentation.
From page 23...
... To illustrate, he showed a couple of phase diagrams of carbon-tantalum compounds; as the phase diagrams illustrated, the compounds would take on a dozen or more different crystalline structures depending on the percentages of carbon and tantalum and on the temperature. In short, there is significant complexity in the different phases that can form -- and that is with just two elements.
From page 24...
... Switching to the question of how to predict hardness in materials, Weinberger referred to Jakoah Brgoch's presentation earlier in the day on using machine learning to find thermally robust superhard materials and then offered some additional thoughts. A common approach to predicting material hardness, he said, is to use elastic constants, but there are cases where hardness does not behave as the elastic constants would suggest.
From page 25...
... The issue of creep is an even bigger problem, and he concluded by stating "We do not have good mesoscale models of creep. It's a very tough problem, and we haven't as a community invested in that enough.… But as a modeler, that's a great place to put effort so that we can start to get back the experiments on creep."
Fundamental Mechanisms Under Extreme Environments and the Role of Machine Learning
Penghui Cao, University of California, Irvine, spoke about the role that machine learning can play in studying fundamental mechanisms in extreme environments
From page 26...
... . Explaining why random solid solutions would be expected to form in these multi-principal-element materials, Cao said that when an alloy is formed at temperatures close to the melting points of the component elements, the Gibbs free energy -- which combines the mixing enthalpy with a temperature-weighted configurational entropy term -- would be dominated by the entropy, resulting in a random solid solution in which the component elements are randomly distributed, rather than an ordered intermetallic compound.
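The entropy argument can be made concrete with the ideal configurational entropy of mixing, S = −R Σ xᵢ ln xᵢ, which for an equimolar five-element alloy equals R ln 5. A minimal sketch (the 1500 K temperature below is an arbitrary illustration, not a value from the talk):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def ideal_config_entropy(fractions):
    """Ideal configurational entropy of mixing: S = -R * sum(x_i * ln x_i)."""
    return -R * sum(x * math.log(x) for x in fractions if x > 0.0)

# An equimolar five-element alloy gives S = R * ln(5).
s5 = ideal_config_entropy([0.2] * 5)

# Near the melting point, the -T*S term in G = H_mix - T*S can dominate
# the mixing enthalpy, favoring a random solid solution.
print(s5, 1500.0 * s5)
```

At high temperature the −T·S contribution (tens of kJ/mol for a five-component alloy) can outweigh typical mixing enthalpies, which is why disorder wins out.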
From page 27...
... The calculated values of f for that alloy at 3,000 K, 800 K, and 400 K were 0.57, 0.034, ...
FIGURE 2-4  Diffusion and atomic jump processes. The figure contrasts a smooth diffusion energy landscape with a rugged one; f denotes the diffusion correlation factor and ΔG the diffusion energy barrier for vacancy-atom exchange.
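The quantities in Figure 2-4 relate through a thermally activated jump picture: each vacancy hop succeeds with a Boltzmann probability exp(−ΔG/kT), and the correlation factor f discounts hops that are immediately reversed on a rugged landscape. A minimal sketch, where the 1.0 eV barrier is invented for illustration while the f values are those quoted in the talk:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def jump_probability(barrier_ev, temperature_k):
    """Boltzmann factor exp(-dG / kT) for a single vacancy hop."""
    return math.exp(-barrier_ev / (K_B * temperature_k))

# An effective diffusivity scales as f * (bare jump rate): f near 1 means
# nearly random jumps, f << 1 means most hops are quickly reversed.
print(0.57 * jump_probability(1.0, 3000.0))   # 3,000 K
print(0.034 * jump_probability(1.0, 800.0))   # 800 K
```

The sharp drop in f at lower temperatures compounds the exponential slowdown of the bare jump rate, so diffusion in these alloys falls off faster with cooling than a simple Arrhenius picture would suggest.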
From page 28...
... Cao described results from a recent publication on how short-range order within a multi-principal-element material affects the energy landscape of the lattice and, thus, how easily dislocations can move at different points in the lattice (Wang et al.
From page 29...
... Designing Extreme Materials at Scale with Machine Learning
Shyue Ping Ong, University of California, San Diego, spoke of the promise and challenges of using machine learning to design extreme materials. In particular, he focused on multi-principal-element alloys and ceramics -- that is, those alloys and ceramics that have many principal elements in their composition.
From page 30...
... , Ong said. There are different machine learning approaches to calculating these potentials, including neural network potentials, Gaussian approximation potentials, and spectral neighbor analysis potentials.
From page 31...
... Opila, University of Virginia.
Benchmarking Machine Learning Modeling Approaches
Aaron Stebner, Georgia Tech, discussed benchmarks for machine learning modeling approaches to materials and manufacturing research and development.
From page 32...
... He also teaches the rubric in a machine learning course for engineers at Georgia Tech; that is how they teach people to walk through machine learning problems.
From page 33...
... Ten years ago, Stebner said, machine learning was fun and exciting to apply to materials science problems, and many papers were published showing how well it worked in various situations. But, he added, "I think we're near the point where we need to say, okay, what am I getting from using machine learning that I couldn't get other ways?
From page 34...
... Advanced Materials and Manufacturing Techniques for Various Applications
In the second panel presentation, Douglas E. Wolfe, Pennsylvania State University, offered a survey of challenges related to using advanced materials in a variety of applications, particularly those in extreme environments.
From page 35...
... "So, you may only get 90 percent of the best intrinsic material, but you can manufacture it." The entire community will be most successful at finding materials to meet needs for extreme environments by working together and having dialogues on what can be done and what cannot. With that introduction, Wolfe launched into an overview of the manufacturing technologies in use today.
From page 36...
... The best-performing coatings now are ultrahard diamond and cubic or wurtzite boron nitride, which can resist pressures of more than 80 gigapascals, and superhard transition metal carbides, nitrides, and borides, which can resist more than 40 gigapascals. Speaking of the challenges related to ultrahard materials, he spoke first of being able to synthesize new ultrahard materials and dealing with the impurities that result in grain-boundary melting and affect the material's mechanical properties.
From page 37...
... Opila, University of Virginia, the final panelist, spoke about the thermochemical stability of materials in extreme environments, with a focus on degradation mechanisms. The context for the talk was the testing of materials for use in extreme environments, such as protective materials on a space shuttle, and what can be learned from how those materials fail.
From page 38...
... Testing materials for use in hypersonic vehicles requires a different approach because the temperatures get much higher and cannot be achieved with the usual laboratory furnace. Instead, she said, the team uses resistive heating of the sample in a vacuum chamber where the oxidizing gases can be controlled.
From page 39...
... The lack of kinetic data is also a challenge, especially in complex materials. Opila ended with a set of research questions: • Do we have sufficient understanding of extreme environments?
From page 40...
... that are not well suited to high-throughput experiments. • Penghui Cao: It is important in training machine learning models to identify the material parameters, such as short-range order, that can affect material strength; these parameters can be identified using modeling and experiments.
From page 41...
... That is something that is common to pretty much every scientific field, he said, and in many cases the first part of doing machine learning work is conditioning the data. Access to well-curated data sets is the key to using machine learning in materials science, and the key feature of a data set may not be whether it is large or small -- both types have their applications -- but rather the quality of the data set and how well curated it is.


This material may be derived from roughly machine-read images, and so is provided only to facilitate research.