2
Case Studies and Lessons Learned

In the 1990s, mechanical engineers began to build and apply integrated computational systems to analyze and design complex engineered systems such as aircraft structures, turbine engines, and automobiles. By integrating structural, fluid, and thermal analysis codes for components, subsystems, and full assemblies, these engineers performed sensitivity analyses, design trade-off studies, and, ultimately, multidisciplinary optimization. These developments have provided substantial cost savings and have encouraged the continued development of more advanced and powerful integrated computational systems and tools. Using these systems and tools, aircraft engine design engineers have reduced the engine development cycle from approximately 6 years to less than 2 years while reducing the number of costly engine and subcomponent tests.1 Integrated product development (IPD) and its primary computational framework, multidisciplinary optimization (MDO), form the core of this systems engineering process (Boxes 2-1 and 2-2).

IPD and MDO have revolutionized U.S. industry, but materials have not been part of this computerized optimization process. While the constraints of diverse materials systems strongly influence product design, they are considered only outside the multidisciplinary design loop; an example of this is shown for hypersonic vehicles in Box 2-2.

1 Michael Winter, P&W, "Infrastructure, processes, implementation and utilization of computational tools in the design process," Presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2007.








BOX 2-1
IPD and MDO

A key cultural change introduced into many U.S. industrial sectors toward the end of the twentieth century was the integrated product development (IPD) process, which dramatically improved the execution and efficiency of the product development cycle. IPD is also known as "simultaneous engineering," "concurrent engineering," or "collaborative product development." Its central component is the integrated product development team (IPDT), a group of stakeholders who are given ownership of and responsibility for the product being developed. In an effective IPDT, all members share a definition of success and contribute to that success in different ways. For example, systems engineers are responsible for the big picture. They initially identify development parameters such as specifications, schedule, and resources. During the design process, they ensure integration between tools, between system components, and between design groups. They are also responsible for propagating data throughout the team. Design engineers have responsibilities specific to their capabilities and disciplines. Typically, design engineers determine the scope and approach of analysis, testing, and modeling. They define the computational tools and experiments required to support the development of a design and its validation. Manufacturing engineers ensure that the components can be made with the selected manufacturing process, often defining the computational simulation tools that are required. Materials engineers provide insights into the capabilities and limitations of the selected materials and support development of the manufacturing process.

IPDTs can range in scale from small and focused to multilevel and complex. Regardless of size, however, the defining characteristic of an IPDT is interdependence. The key to a successful IPDT is that the team members, and the tools they use, do not work in isolation but are integrated throughout the design process. This approach may entail communicating outside the original company, country, or discipline. Owing to the demonstrated success of the IPD process, many engineering organizations, particularly at large companies, have invested considerable human and capital resources to establish a work-flow plan for their engineering practices and product development cycles as executed by the IPDT.

The capability and dynamics of the IPD process are illustrated by the execution of a computationally based multidisciplinary design optimization (MDO). Modern engineering is a process of managing complexity, and MDO is an important computational tool that helps systems analysts do that. For example, a modern gas turbine engine has 80,000 separate parts and 5,000 separate part numbers,1 including 200 major components requiring three-dimensional computer-aided engineering (CAE) analysis with structural finite element and computational fluid dynamics codes. This CAE analysis can easily require over 400 person-years of analytical design and computer-aided design (CAD) support. The only rational way to accomplish this engineering feat organizationally is by means of an IPDT. Owing to the development and validation of computational engineering analysis tools such as finite-element analysis, computer-based MDO has become routine for many systems or subsystems as a way to improve efficiency and arrive at an optimized design or process. Computer-based MDO automates work flow, model building and execution, and design exploration. A block diagram of the relevant analytical tools utilized by MDO is shown in Figure 2-1-1.

1 Michael Winter, P&W, "Infrastructure, processes, implementation and utilization of computational tools in the design process," Presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2007.
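Neither the box nor the briefing materials include code, so the following Python sketch is only a schematic illustration of the pattern the box describes: each discipline analysis is wrapped as a callable, a driver chains them automatically, and the design space is swept while materials enter only as a fixed allowable. All function names, models, and numbers here are invented for illustration.

```python
# Minimal, illustrative sketch of an automated MDO-style work flow: wrap each
# discipline analysis as a function, chain them automatically, and sweep the
# design space. The models are toy stand-ins, not real analysis codes.
import itertools

def structural_analysis(skin_thickness_mm):
    """Toy structures model: weight rises and stress falls with thickness."""
    weight_kg = 1200.0 + 85.0 * skin_thickness_mm
    stress_mpa = 900.0 / skin_thickness_mm
    return weight_kg, stress_mpa

def aero_analysis(span_m, weight_kg):
    """Toy aerodynamics model: induced plus parasite drag."""
    return (weight_kg / 5000.0) ** 2 / span_m + 0.02 * span_m

def evaluate(design):
    """Automated work flow: run the discipline codes in sequence."""
    thickness, span = design
    weight, stress = structural_analysis(thickness)
    drag = aero_analysis(span, weight)
    return {"weight": weight, "stress": stress, "drag": drag}

# Automated design exploration over a small grid of design variables.
# Materials appear only as a fixed allowable stress, i.e., the "static
# constraint" this chapter says keeps materials outside the optimization loop.
ALLOWABLE_STRESS_MPA = 300.0
candidates = itertools.product([2.0, 3.0, 4.0, 6.0],      # thickness, mm
                               [20.0, 30.0, 40.0, 50.0])  # span, m

feasible = [(d, evaluate(d)) for d in candidates
            if evaluate(d)["stress"] <= ALLOWABLE_STRESS_MPA]
best_design, best_result = min(
    feasible, key=lambda item: item[1]["drag"] + 1e-4 * item[1]["weight"])
print("best (thickness, span):", best_design, "->", best_result)
```

In a production MDO environment the grid sweep would be replaced by a formal optimizer and the toy functions by the validated analysis codes shown in Figure 2-1-1.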

Figure 2-1-2 shows the "electronic enterprise" required to support automated MDO. This includes libraries of validated and certified analysis tools; an integration framework; and electronic process or work-flow maps and IPD tools for work-flow management, collaborative engineering, and secure business-to-business information sharing. MDO has allowed the IPDT to focus on product design decisions based on the results of MDO rather than on generating data. This has greatly improved the robustness of final product designs and dramatically reduced the time to reach design solutions.

FIGURE 2-1-1 The computational tools required for successful MDO. SOURCE: Michael Winter, P&W, "Infrastructure, processes, implementation and utilization of computational tools in the design process," Presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2007.

In a large, multidisciplinary engineering environment, an integrated product development team (IPDT)—that is, the group of stakeholders who are given ownership of and responsibility for the product under development—operates with materials as a static, limiting constraint on the overall IPD rather than as an optimizable parameter. Typically the list of materials from which a choice is to be made is either taken as a fixed constraint or consists of a small subset of materials that are evaluated statically outside the optimization loop. This approach narrows the design space, resulting in suboptimal vehicle performance in an application with low margins for error.

FIGURE 2-1-2 A typical electronic enterprise required to support computationally based IPD. NOTE: Individual codes and tools are depicted as labeled boxes (for example, A is shown as Unigraphics, C is Ansys, F is Fluent, and so on). B2B, business to business. The flow through this figure shows the individual steps involved in establishing the computational process. Adapted from Michael Winter, P&W, "Infrastructure, processes, implementation and utilization of computational tools in the design process," Presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2007.

Conversely, the development of the right class of advanced materials, which is an inherently expensive process, could be better justified and motivated by integration of materials into the MDO computational process. During the late 1990s, materials engineers who worked on IPDTs witnessed these achievements and began to consider the need for similar computational methods for the analysis and development of materials.

BOX 2-2
An Example of MDO

The MDO approach to design has become a powerful vehicle for broad exploration of design space at relatively low cost. An example is the development of hypersonic vehicles for space access and defense applications, where there is a complex interdependence of vehicle structure and aerodynamic and thermal loads; static and dynamic structural deflections; propulsion system performance and operability; and vehicle control. Since many conditions of hypersonic flight cannot reasonably be replicated in any current or foreseeable wind tunnel, designs are not fully validated until actual flights are conducted. Thus the integration of validated analytical design tools with automated data transfer between disciplines in an MDO platform is essential for arriving at realistic but innovative and high-performing solutions. A tool integration scheme used by Boeing Phantom Works for the design of hypersonic vehicles is shown in Figure 2-2-1. This MDO scheme could, for example, be used to design an air-breathing, reusable, hypersonic flight vehicle, where strong interactions between aerodynamics, propulsion, aerothermal loads, structures, and control have a substantial impact on the optimal shape of the entire vehicle. With an MDO platform, thousands of potential vehicle shapes can be explored within a time frame of days (see Figure 2-2-2). Materials tools are notably absent from the MDO tool set (Figure 2-2-1).

FIGURE 2-2-1 Boeing's integration of analytical tools for design of hypersonic vehicles. SOURCE: K.G. Bowcutt, "A perspective on the future of aerospace vehicle design," American Institute of Aeronautics and Astronautics Paper 2003-6957, December 2003.

FIGURE 2-2-2 Side and top views of vehicle shape explored in an MDO analysis of hypersonic vehicle design space. This evaluation allows vehicle performance at hypersonic speeds to be explored. A materials analysis is not part of this process. SOURCE: Kevin Bowcutt, Boeing.

The need to shrink the growing discrepancy between the system design cycle time and the typical time to develop a new material (8-20 years) provided further motivation. Materials engineers recognized that by building such tools and methods for materials, a process now termed integrated computational materials engineering (ICME), materials could be incorporated into the overall engineering system, thereby allowing full systems optimization.

Within the last decade, industrial organizations, using their own and government funds, have begun to explore the application of ICME to solve industrial problems, to estimate the payoff for ICME implementation, and to identify development needs. For example, the Defense Advanced Research Projects Agency (DARPA) launched the Accelerated Insertion of Materials (AIM) initiative in 2001 to apply ICME to jet engine turbine disks and composite airframe structures. The goal of AIM was to establish an ICME tool set, integrate materials analysis with design engineering, and demonstrate the benefit of ICME in terms of reduced materials development time and cost. In that same time frame, similar efforts were launched in other industrial sectors to improve product performance and provide information that would be impossible or extremely costly to obtain using experimental methods.2

In general, the case studies reviewed by the committee demonstrate that while ICME is still an emerging discipline, it can nevertheless provide significant benefits, such as lower costs and shorter cycle times for component design and process development, lower manufacturing costs, and improved prognosis of subsystem component life. In some cases, these benefits would have been impossible to achieve with any other approach, regardless of cost.

In this chapter, several case studies involving ICME are presented along with a discussion of how other scientific disciplines have successfully undertaken large-scale integrated efforts. Case studies where the return on investment (ROI) could be documented are reviewed in more detail. The chapter ends with some lessons learned from these programs.3

CASE STUDIES—CURRENT STATUS AND BENEFITS OF ICME

In this section, the current status of ICME is assessed by relating some case studies that illustrate ICME processes. The vast majority of design engineers see material properties as fixed inputs to their design, not as levers that may be adjusted to help them meet their design criteria.

2 John Allison, Mei Li, C. Wolverton, and XuMing Su, "Virtual aluminum castings: An industrial application of ICME," JOM (Journal of The Minerals, Metals & Materials Society) 58(11): 28-35.
3 The committee notes that none of the information provided to it was sufficient for an independent assessment of the ROI. This is not surprising, because providing such information would involve releasing proprietary or private/classified information. The committee therefore reports the ROI information provided to it without any independent assessment.

Until materials, component design, and the manufacturing process become fully integrated, designers will not be able to optimize product properties through materials processing. Using an ICME approach, however, this optimization can be accomplished in a virtual environment long before the components are fabricated. The case studies discussed here attempt to show why that is so.

For clarity, the case studies are divided into examples that use ICME to (1) integrate materials, component design, and manufacturing processes; (2) predict component life; and (3) develop manufacturing processes. While each ICME case study achieved different levels of integration and implementation, all of them resulted in substantial benefits. The case studies chosen for inclusion here generally had some level of detailed information on the ROI and other benefits of the ICME activity; this was a self-reported assessment, however, and validating the ROIs was beyond the scope of the committee. The majority of these case studies involved metallic systems. The committee identified similar examples in nonmetallic systems such as polymers and semiconductors, but they either lacked any detailed information on ROI or did not approach full ICME integration. For example, in the area of integrated circuits, models for dielectric constants, electromigration, dislocation generation, and barrier layer adhesion have not yet been integrated with CAD circuit design models or "equipment" models for lithography or processing.4,5 The committee notes, therefore, that while the examples shown in this report all involve metallic systems, it believes that ICME will be applicable and will demonstrate significant benefits for integrating materials, component design, and manufacturing processes across all materials regimes, including nanomaterials, biological materials, polymeric materials, ceramics, functional materials, and composites.

However, the challenges can be formidable. For instance, the modeling of polymeric materials is extraordinarily complex at all length scales, and owing to the long chains of the polymer backbones, the gap between feasible atomistic simulations and solid polymer properties is still substantial. Another challenge for industrial polymers is the plethora of suppliers, who offer many proprietary variants of these materials. However, it is this kind of complexity that should motivate an ICME approach.

4 Sadasivan Shankar, Intel, "Computational materials for nanoelectronics," Presentation to the committee on May 29, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2007.
5 Dureseti Chidambarrao, IBM, "Computational materials engineering in the semiconductor industry," Presentation to the committee on May 29, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2007.

Integrating Materials, Component Design, and Manufacturing Processes in the Automotive Sector

The virtual aluminum castings (VAC) methodology developed by the Ford Motor Company offers one detailed case study for integrating materials, component design, and manufacturing.6 The methodology was based on a holistic approach to aluminum casting component design; it modified the traditional design process to allow the variation in material properties attributable to the manufacturing process to flow into the mechanical design assessment. Fully funded by Ford to address specific power-train components, the VAC methodology was implemented by the company for cast aluminum power-train component design, manufacturing, and CAE. As discussed below, VAC has resulted in millions of dollars in direct savings and cost avoidance. For VAC to be successful, it required:

• Models of the structure evolution and resulting physical and mechanical properties of aluminum alloy systems, using the classic processing–structure–property flowchart depicted in Figure 2-1,
• The ability to link the various models together while maintaining computational efficiency and simplicity,
• Modifications to the traditional design process to allow for spatial variation of material properties across a component to be considered, and
• Extensive validation of the predictive models.

Quantitative processing–structure–property relationships were defined using a combination of science-based models and empirical relationships and linked to quantitative phase diagram calculations. As shown schematically in Figure 2-1, the influence of all the manufacturing processes (casting, solution treatment, and aging) on a wide variety of critical microstructural features was captured in computational models. These microstructural models were then used to predict the key mechanical and physical properties (fatigue, strength, and thermal growth) required for cast aluminum power-train components.7 Microstructural modeling was required at many different length scales to capture the critical features required to accurately predict properties. The outputs of this processing–structure–property information are predictions of manufacturing-history-sensitive properties.

6 John Allison, Mei Li, C. Wolverton, and XuMing Su, "Virtual aluminum castings: An industrial application of ICME," JOM (Journal of The Minerals, Metals & Materials Society) 58(11): 28-35.
7 Thermal growth, a common term used for aluminum alloys, represents the dimensional changes brought about by precipitation of phases with different volumes. It is used as a critical input in advanced component durability procedures.

FIGURE 2-1 The processing–structure–property flowchart for cast aluminum alloys for engine components. The chart shows the wide variety of individual models required for ICME. The properties listed are required as the critical inputs to durability (performance) analysis of cast aluminum components.

These predicted properties and their spatial distributions were mapped into the component and subsystem finite-element analysis (FEA) of operating engines by which cast aluminum component durability (performance) is predicted.

Both commercial software and Ford-developed codes are used in the VAC system. Commercially available casting-simulation software (ProCAST, MagmaSoft), thermodynamic and kinetic modeling software (Pandat, Thermo-Calc, Dictra), a first-principles code (VASP), and FEA software for stress analysis (ABAQUS) are used, integrating the methodology with the standard codes used in the manufacturing, materials science, and design environments. However, many additional codes were developed during the course of the program for specific tasks such as interfacial heat-transfer coefficient (IHTC) optimization (OptCast), microstructure evolution (MicroMod, NanoPPT, MicroPore), physical or mechanical property evolution (LocalYS, LocalTG, LocalFS), and the resultant stresses and component durability (QuenchStress, HotStress, Hotlife). These codes, developed internally by Ford Research and Advanced Engineering, involved coordinating the fundamental research efforts of five universities across the United States and the United Kingdom. Substantial effort was applied to developing efficient links between the output of the casting modeling and the structure and property prediction tools to feed seamlessly into the FEA codes and to facilitate reorganization of the design process.

Figure 2-2 shows the process flow for predicting the influence of the casting and heat treatment process on the spatial distribution of yield strength in a typical component.

Virtual Aluminum Castings process flow (as depicted in Figure 2-2): Initial Geometry (CAD geometry and mesh); Filling (accurate filling profile: ProCAST, OptCast); Thermal Analysis (boundary conditions: OptCast; fraction-solid curves: Thermo-Calc); Microstructure, Al2Cu (micromodel: MicroMod, Pandat; solution treatment: Dictra; aging model: NanoPPT, Pandat); Yield Strength (LocalYS).

FIGURE 2-2 The ICME process flow for Ford's VAC tool for local property prediction of yield strength (LocalYS). The flow starts with a geometric (CAD) representation of the component. This CAD is then used as input to the simulations of filling and solidification (thermal). The outputs of these simulations are used to predict microstructural quantities. Finally, these microstructural quantities are used to predict the manufacturing-history-sensitive spatial distributions in yield strength. Specific commercial codes and Ford-developed programs and subroutines that have been integrated into the process are identified at each step. SOURCE: John Allison, Ford Motor Company.

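The report names the commercial and Ford-developed codes that are linked in this chain but does not show the linking mechanism itself, so the following Python sketch is a hypothetical illustration of the hand-off pattern only: each stage reads the running state produced by the previous stage and adds its own outputs. The stage functions, data fields, and numbers are invented placeholders, not the actual VAC interfaces.

```python
# Hypothetical sketch of a linked process-structure-property chain in the
# spirit of Figure 2-2. Each stage consumes the running "state" dictionary
# and adds its own outputs; none of these functions is a real VAC interface.

def load_geometry(state):
    state["mesh"] = {"nodes": 10_000, "source": "CAD file (placeholder)"}
    return state

def simulate_casting(state):
    # Stand-in for filling/solidification results: a local cooling rate
    # for each region of the component.
    state["cooling_rate_K_per_s"] = {"bulkhead": 1.2, "thin_wall": 15.0}
    return state

def predict_microstructure(state):
    # Stand-in for a micromodel: faster cooling gives a finer structure
    # (smaller secondary dendrite arm spacing, SDAS).
    state["sdas_um"] = {region: 60.0 / (rate ** 0.33)
                        for region, rate in state["cooling_rate_K_per_s"].items()}
    return state

def predict_yield_strength(state):
    # Stand-in for a LocalYS-style local property model (illustrative fit).
    state["yield_MPa"] = {region: 180.0 + 900.0 / sdas
                          for region, sdas in state["sdas_um"].items()}
    return state

def export_to_fea(state):
    # The spatially varying properties would be mapped onto the FEA mesh here.
    return {region: round(ys, 1) for region, ys in state["yield_MPa"].items()}

PIPELINE = [load_geometry, simulate_casting,
            predict_microstructure, predict_yield_strength]

state = {}
for stage in PIPELINE:
    state = stage(state)
print("local yield strength map:", export_to_fea(state))
```

The design point this pattern illustrates is the one the VAC case study emphasizes: each model only needs to agree on the shared state it reads and writes, so individual codes can be swapped without rebuilding the whole chain.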
The main components of the controlled thermo-mechanical processing (CTMP) program at the Timken Company were the development of analytical tools, measurement techniques, process simulation, and product response. The two most important analytical tools of the program were the tube optimization model (TOM) and the virtual pilot plant (VPP). TOM offers a PC-based framework for developing a tube-making process that generates the desired microstructure. VPP allows process and equipment scenarios to be evaluated in computer simulations rather than in the production facilities. TOM was initially developed for use with three different grades of steel; however, many grades were added to make the models more broadly applicable. Timken made extensive use of a "design of experiments" approach to develop data-driven relationships between controllable factors in the manufacturing process and the material microstructure, which were then hard-wired into TOM.17 The majority of codes in TOM were developed within the program, such as those that describe controlled slow cooling, inverse heat conduction, austenite decomposition, thermal expansion coefficients, recrystallization, grain growth, flow stress, and some specialized finite element mill models (ELROLL). Commercial codes used were QuesTek MCASIS codes for continuous cooling transition curves, various finite element (FE) or finite difference (FD) codes (ABAQUS, DEFORM, SHAPE), and optimization codes such as EPOGY in the VPP. Direct, on-line measurements of austenitic grain size using a laser-ultrasonic gauge and an eccentricity gauge were used to validate in-process grain size, building on an earlier DOE program in this area. TOM was used to assess the impact of various manufacturing process parameters on the machinability of the finished material, a critical customer requirement. For the case of machining tubes into automotive gears, this capability allows the steel microstructures to be optimized to extend the lifetime of broaching tools.

17 Design of experiments is a systematic approach to the investigation of a system or process. A series of structured tests is designed in which planned changes are made to the input variables of a process or system. The effects of these changes on a predefined output are then assessed.
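Footnote 17 describes the design-of-experiments idea in words; as an illustration only, the short Python sketch below (assuming NumPy is available) builds a small two-factor factorial and fits a linear process-microstructure relationship of the kind that could be "hard-wired" into a tool such as TOM. The factor names, levels, and response values are invented and are not Timken data.

```python
# Illustrative two-factor design of experiments: fit a linear model relating
# controllable process factors to a microstructural response. All numbers
# are invented for illustration.
import itertools
import numpy as np

# Factorial plan: finishing temperature (C) and cooling rate (C/s).
temps = [850.0, 950.0]
cool_rates = [0.5, 2.0]
runs = list(itertools.product(temps, cool_rates))

# Hypothetical measured response: prior-austenite grain size (um) per run.
grain_size_um = np.array([20.0, 14.0, 32.0, 24.0])

# Least-squares fit of grain_size ~ b0 + b1*T + b2*rate.
X = np.array([[1.0, t, r] for t, r in runs])
coeffs, *_ = np.linalg.lstsq(X, grain_size_um, rcond=None)
b0, b1, b2 = coeffs
print(f"grain size ~ {b0:.1f} + {b1:.3f}*T + {b2:.2f}*rate")

# Once embedded in a tool, the fitted relationship can screen candidate
# process settings cheaply, without running the mill.
def predicted_grain_size(temp_C, rate_C_per_s):
    return b0 + b1 * temp_C + b2 * rate_C_per_s

print("predicted at 900 C, 1.0 C/s:",
      round(predicted_grain_size(900.0, 1.0), 1), "um")
```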

The benefits of the CTMP program can be demonstrated by reviewing some of its elements. The thermal-enhanced spheroidization annealing (T-ESA) "process recipe" is an annealing cycle that reduces the time and energy requirements involved in the spheroidization heat treatments applied to 52100 and other homogeneous high-carbon steels. The recipe has been perfected and implemented, reducing the process cycle time by 33 to 50 percent and energy costs by $500,000 per year. The accumulated annual savings from other direct benefits are estimated at nearly $1 million, not to mention the opportunity cost avoided because the tools reduce the technical staff time that would otherwise be required to execute such studies without TOM. Moreover, TOM mill simulation and a mill trial demonstrated, with limited experiments, a strong potential for application of a minimum-capital manufacturing process. If new capital equipment were installed, the benefits from a single application of TOM/VPP would be in the millions of dollars. Finally, a preferred microstructure was discovered for maximizing gear broach tool life. Since cutting tools cost roughly half as much as gears, the industry-wide savings could amount to more than $10 million annually.

Obviously not all ICME case studies have realized their full potential. For example, while the direct financial benefits at Timken were reportedly limited because there was no capital investment in an optimized plant, the experience there shows it is reasonable to expect that an ROI of 3 to 10 could be realized over the entire industry if capital investments were made. Commercialization of some of the codes developed in the CTMP program is planned, however, and would increase the use of ICME in this industry. The CTMP program has demonstrated that science developed into applicable technology can differentiate domestic products and further the cause of increasing U.S. competitiveness in the global market.18

There are a number of other examples in which manufacturing vendors have begun to embrace ICME, as shown in Table 2-3. While this area of integrated materials and manufacturing promises to provide the most immediate benefits to industry, it must be recognized that additional development and validation remain before these new techniques can be more widely utilized.

TABLE 2-3 Examples of ICME in Materials Manufacturing Processes

Company | Case Study | Benefits
The Timken Company | CTMP | Potential to reduce manufacturing cost by 20 percent with new heat treat/mill on-line
Ladish Co./Rolls-Royce/General Electric Company/P&W/Boeing/Timet | Titanium modeling | The business case for this project summarizes to a 4.83:1 ROI ($24.442 million) within a standard 10-year analysis period
Ladish Co./General Electric Company | Processing science methodology for metallic structures | Cost avoidance for production of forged aerospace components
Special Metals Corporation (a PCC company) | Four case studies in alloy development using Thermo-Calc | Savings of more than $500,000 in development materials; estimated $1.5 million in new revenue
Böhler Schmiedetechnik | Optimizing the forging of critical aircraft parts by the use of finite element coupled microstructure modeling | Cost avoidance for production of forged aerospace components
Carmel Forge/Scientific Forming Technologies Corporation | Grain size modeling for Waspaloy | Cost avoidance for production of forged aerospace components
RMI Titanium Company | Rolling modeling for development of fine-grain titanium sheet at RMI | Faster, more cost-effective optimization of rolling parameters and pass schedules
Alcoa Howmet | Grain size modeling | Improved prior beta grain size control in investment casting of Ti-64

LESSONS LEARNED FROM OTHER DISCIPLINES

In addition to investigating the current status of ICME, the committee also explored some major integration efforts in other scientific and engineering fields. Integration experiences in other disciplines teach important lessons about accomplishing major technical and cultural shifts. It is important to note that fields that have achieved successful integration share some advantages over materials engineering. First, they enjoy a cohesive data structure, a common mathematical framework, and well-defined objects for investigation. For example, genomics describes a single kind of data, gene sequences, while astronomy focuses on a common set of celestial objects. In contrast, materials engineers working on a large array of engineering components must know about the physical and mechanical properties of the materials used as well as their spectroscopic and two-dimensional and three-dimensional microscopic characteristics. Given the diversity of the information and how this information is applied, the materials challenges more closely resemble the challenges of bioinformatics. In general, communities that have benefited from integration of information have set explicit goals for information acquisition and successfully executed "team science" projects that have established national centers and open-access databases of fundamental information. In the following sections, specific lessons learned from the genomics, bioinformatics, and astronomy communities are highlighted.

18 For more information on the CTMP program, see Timken Company, Final Report: Controlled Thermo-Mechanical Processing of Tubes and Pipes for Enhanced Manufacturing and Performance, November 11, 2005. Available at http://www.osti.gov/bridge/purl.cover.jsp?purl=/861638-qr9nuA/. Accessed May 2008.

Genomics and Bioinformatics

One of the most significant integration efforts in modern science was the human genome project (HGP).19

19 For more information, see http://www.ornl.gov/sci/techresources/Human_Genome/home.shtml. Accessed October 2007.

This well-coordinated, $3 billion, 13-year project was funded in the United States by the DOE and the National Institutes of Health (NIH) and was conducted in collaboration with the United Kingdom, Japan, France, Germany, China, and other countries. The project determined the complete sequence of the three billion DNA subunits (bases), identified all human genes, and made them accessible for further biological study. At any given time, HGP involved over 200 researchers, and its successful completion required large-scale funding, the coordination of important technologies (for example, rapid, high-throughput sequencing capabilities and databases), and evolving principles surrounding intellectual property and publication.20 All human genomic sequence information generated by the centers that had been funded for large-scale human sequencing was made freely available in the public domain to encourage research and development and to maximize the benefit to society. Further, the sequences were to be released as soon as possible and finished sequences submitted immediately to public databases. To promote coordination of activities, it was agreed that large-scale sequencing centers would inform the Human Genome Organization (HUGO) of their intention to sequence particular regions of the genome. The information was presented on the HUGO Internet page and directed users to the Web pages of individual centers for more detailed information on the status of sequencing in specific regions. Although HGP was completed in 2003, NIH continues to fund major coordinated sequencing projects. As an example of the magnitude and type of efforts funded by NIH, the sequence of the Rhesus monkey was recently completed, the result of a $20 million effort involving 100 researchers that was approved in 2005.21

While the resources expected to be made available to ICME are likely to be fewer than were applied to mapping the human genome, there are significant lessons to be learned by the ICME community from HGP by considering how that community initially organized and set goals and by considering the potential impact of large-scale, coordinated projects based on the gathering and organization of data. At the outset of the HGP, quantitative goals for information acquisition (gene sequencing), databases, computational advances, and the human infrastructure were set. In 1991, 5-year goals were established, with funding of $135 million from NIH and DOE. The 1991 goals included these:22

20 Rex Chisholm, Northwestern University, "Community computational resources in genomics research: Lessons for research," Presentation to the committee on May 29, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2007.
21 Elizabeth Pennisi, "Boom time for monkey research," Science 316 (April): 216.
22 For more information on the HGP's 5-year plan, see http://www.genome.gov/10001477. Accessed February 2008.

• Improve current methods and/or develop new methods for DNA sequencing that will allow large-scale sequencing of DNA at a cost of $0.50 per base pair.
• Develop effective software and database designs to support large-scale mapping and sequencing projects.
• Create database tools that provide easy access to up-to-date physical mapping, genetic mapping, chromosome mapping, and sequencing information and allow ready comparison of the data in these several data sets.
• Develop algorithms and analytical tools that can be used in the interpretation of genomic information.
• Support research training of pre- and postdoctoral fellows starting in FY 1990. Increase the numbers of trainees supported until steady-state "production" of about 600 per year is reached by the fifth year.

The technology for rapid, low-cost sequencing advanced quickly, while the needs for databases and computational tools continued to evolve.23 Interestingly, the training goals were not met, "because the capacity to train so many individuals in interdisciplinary sciences did not exist." The establishment of interdisciplinary research centers, with significant participation from nonbiological scientists, was identified as an ongoing need throughout the program.24 ICME shares many of the same challenges: rapid, low-cost experimentation, development of databases and computational tools, and the need for interdisciplinary training.

The HGP also drove developments in bioinformatics, a field that lies at the intersection of biology, computer science, and information science and is defined by the NIH as "research, development, or application of computational tools and approaches for expanding the use of biological, medical, behavioral or health data, including those to acquire, store, organize, archive, analyze, or visualize such data." Bioinformatics generally makes use of publicly available databases that can be mined for associating complex disorders (analogous to material properties) with different versions of the same gene (analogous to microstructures). In the case of NIH-sponsored genetic research, genetic data must be published in publicly accessible databases. Similarly, genetics-oriented research journals also require that this kind of information be made publicly available before the relevant paper may be published. The length scales associated with bioinformatics data and their application pose a challenge as complicated as that posed by materials. Examples of informatics databases include GenBank (genetic sequences), EMBL (nucleotide sequences), SwissProt (protein sequences), EC-ENZYME (enzyme database), RCSB PDB (three-dimensional biological macromolecular structure data from X-ray crystallography), GDB (human genome), OMIM (Mendelian inheritance in man data bank), and PIR (protein information resource).

23 F. Collins and D.J. Galas, "A new five-year plan for the U.S. Human Genome Project," Science 262 (1993): 43-46.
24 Francis S. Collins, Ari Patrinos, Elke Jordan, Aravinda Chakravarti, Raymond Gesteland, and LeRoy Walters, "New goals for the U.S. Human Genome Project: 1998-2003," Science 282 (1998): 682-689.

The National Center for Biotechnology Information (NCBI),25 which provides access to these and other databases, was created in 1988 as a national resource for molecular biology information. The mission of the NCBI includes the generation of public databases, research and development of computational biology tools, and the dissemination of biomedical information.

An important lesson learned by the genetics community is that a key first step in establishing standards is agreement on a taxonomy (that is, an agreed-on classification scheme) and a vocabulary that ensures interoperability of data and models.26 Database curators play a key role in ensuring the quality of the data and determining what data are needed by the community. Bioinformatics database development, maintenance, and curation are funded by NIH. A small model organism database might cost $400,000 annually, including the services of three curators, two database programmers, and the principal investigator.27

One of the cultural lessons from bioinformatics is that transitioning from bench science to big science requires researchers to recognize that although their data are connected to them in the database, once those data are used by others they are no longer connected to them. This transition also requires a realization that analysis of a researcher's data by others does not diminish the researcher but increases the impact of the work. One presentation suggested that this required a transformation from a "hunter-gatherer" research model to a "collective farming" model, in which coordination and collaboration are the central elements.28

There are still no similar, publicly funded databases, informatics efforts, or comprehensive training programs for ICME. In fact, there are substantial barriers to developing them: lack of funding, lack of standards, diverse data classes, proprietary data ownership, and cultural barriers to building the needed collaborations among materials science, engineering, computational science, and information technology. However, there is a clear opportunity to capitalize on the large body of public, high-quality data by establishing open-access, mineable databases analogous to those in the bioinformatics world. An effort on the part of the emerging ICME community to set quantitative and specific goals for materials characterization, databases, computational models, and training will be needed for this discipline to mature.

25 For more information see http://www.ncbi.nlm.nih.gov/. Accessed February 2008.
26 Rex Chisholm, Northwestern University, "Community computational resources in genomics research: Lessons for research," Presentation to the committee on May 29, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
27 Cate L. Brinson, Northwestern University, "Materials informatics—what, how and why: Analogy to bioinformatics," Presentation to the committee on May 30, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
28 Ibid.

Open Science Grid and Sloan Digital Sky Survey

Fields from climate change to gaming have also benefited from integration of scientific models and collaborative frameworks involving large-scale databases or computing needs. Some efforts, such as climate change modeling and genetics, involve large-scale, central coordination; others, such as gaming, are more organic in nature and based on a free, open source software (FOSS) paradigm.29 These activities have been enabled largely by the Internet and tend to be Web based, with networks of geographically distributed scientists and code developers working together. This process, known in the United States as cyberinfrastructure and in Europe as e-Science, has given rise to the new field of grid computing, in which computers are shared.30 The committee was briefed on two such efforts, the Open Science Grid (OSG)31 and the Sloan Digital Sky Survey.32,33 These efforts were initiated because of the need for large-scale (petascale) computing and storage within the astronomy and physics community.

The Sloan Digital Sky Survey provides over 40 terabytes (TB) of raw data and 5 TB of processed catalogs to the public. The data challenge in this field is the integration of disparate types of data about astronomical objects (stars, galaxies, quasars), including images, spectroscopy data (acquired by an array of experimental techniques at various wavelengths), and astrometric data, along with the large volumes of data (2 to 4 TB per year). Tools developed for the automated data reduction efforts that make the survey possible have involved more than 150 person-years of effort. A lesson learned in this activity was that information is growing exponentially and planning for this data explosion is important.

The OSG comprises a grid, or distributed network, of over 70 sites on four continents accessing more than 24,000 central processing units.

29 W. Scacchi, "Free and open source development practices in the game community," IEEE Software (January 2004): 59-66.
30 Daniel Clery, "Infrastructure: Can grid computing help us work together?" Science 313 (July 2006): 433-434.
31 OSG is a consortium of software, service, and resource providers and researchers from universities, national laboratories, and computing centers across the United States. It brings together computing and storage resources from campuses and research communities into a common, shared grid infrastructure over research networks via a common set of middleware. The OSG Web site says the grid offers participating research communities low-threshold access to more resources than they could afford individually. For more information on the OSG, see http://www.opensciencegrid.org/. Accessed February 2008.
32 Paul Avery, University of Florida, "Open Science Grid: Linking universities and laboratories in national cyberinfrastructure," Presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
33 Alex Szalay, Johns Hopkins University, "Science in an exponential world," Presentation to the committee on May 30, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.

Establishment of this grid was funded (1999-2007) by $38 million from DOE and NSF. The OSG serves a diverse set of disciplines including astronomy, astrophysics, genetics, gravity, relativity, particle physics, mathematics, nuclear physics, and computer science. According to information provided to the committee, a key lesson learned from the OSG is that over half the challenges associated with its establishment were cultural. Thus significant effort was, and still is, required in project and computational coordination and management, education, and communication. The OSG has a well-developed communication Web site, monthly newsletters, and annual summer schools for participants. Key technical challenges include commercial tools that fall short of the needs for grid computation, requiring OSG collaborators to invent the software.

SUMMARY AND LESSONS LEARNED

Several consistent lessons emerged from the case studies reported to the committee. These lessons set out a context for a path forward for ICME, which is discussed in Chapters 3 and 4.

Lesson Learned 1. ICME is an emerging discipline, still in its infancy.

Although some ICME successes have been realized and articulated in the case studies, from an industrial perspective ICME is not mature and is contributing only peripherally. While some companies may have ICME efforts, product design often goes on without materials modeling. Significant government funds have been expended on developing tools for computational materials science (CMS), and many of these tools are sufficiently advanced that they could be incorporated into ICME models. However, government and industry efforts to integrate CMS tools and to apply them to engineering problem solving are still relatively rare.

Lesson Learned 2. There is clearly a positive return on investment in ICME.

Performance, cost, and schedule benefits drive the increasing use of simulation. ICME shows promise for decreasing component design and process development costs and cycle time, lowering manufacturing costs, improving material life-cycle prognosis, and ultimately allowing for agile response to changing market demands. Typical reductions in product development time attributed to the use of ICME are estimated to be 15 to 25 percent, with best-case ROIs between 7:1 and 10:1. Less quantifiable, but potentially more important, ICME often offers solutions, whether for design decisions or lifetime prognoses, that could not be obtained in any other way.

Lesson Learned 3. Achieving the full potential of ICME requires sustained investment.

Because most ICME efforts are not yet mature, they are just beginning to realize benefits from the investment in them. Several case studies, including the CTMP and AIM programs, highlighted the need for additional, sustained commercial investments, whether in human resources, code development, or capital equipment, to achieve the full ICME payoff. Government programs, in particular, often fund the initial investigation of a concept but leave follow-up to others. Developing ICME into a mature discipline will take a considerable investment from government and industry.

Lesson Learned 4. ICME requires a cultural shift.

Several case studies articulated that the cultural changes required to fully benefit from ICME should not be underestimated. For ICME to gain widespread acceptance, shifts are required in the cultures of industry, academia, and government. The design philosophy must shift away from treating product design analysis and manufacturing process optimization as separate activities. The engineering culture must shift toward increased confidence in and reliance on computational materials engineering models, with less dependence on databases and physical prototypes. Materials researchers and other data generators must shift toward an open-access model that uses data in a standard format. Each of these cultural changes is challenging and must be supported with education and resources.

Lesson Learned 5. Successful model integration involves distilling information at each scale.

Models that explicitly link different length scales and timescales are widely viewed as a laudable goal; however, the very few cases where this goal has been met required computational resources that are not widely available. In most successful ICME case studies, length scales and timescales were integrated by reducing the information at each scale to simple, computationally efficient models that could be embedded into models at other scales. By focusing on ICME as an engineering undertaking, this approach to incorporating information on the length scale, the timescale, or the location was found to be effective. However, it requires experts with sufficient understanding of a particular materials system to be able to judge which material response issues are essential.
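Lesson 5 describes reducing the information at each scale to simple, computationally efficient models that can be embedded at other scales. The Python sketch below (assuming NumPy is available) illustrates that pattern generically: an "expensive" fine-scale calculation is sampled offline, replaced by a polynomial surrogate, and the surrogate is then the only thing called inside a component-scale analysis. The fine-scale model and all numbers are stand-ins, not taken from any case study.

```python
# Illustrative scale-bridging by model reduction: sample an expensive
# fine-scale model offline, fit a cheap surrogate, and embed the surrogate
# in a component-scale analysis. The "fine-scale model" is a stand-in only.
import numpy as np

def fine_scale_strength(cooling_rate):
    """Pretend this is an expensive microstructure simulation (MPa)."""
    return 200.0 + 45.0 * np.log1p(cooling_rate)

# Offline: sample the fine-scale model at a handful of conditions.
samples = np.linspace(0.1, 20.0, 8)
responses = np.array([fine_scale_strength(r) for r in samples])

# Reduce to a simple surrogate (quadratic fit in log cooling rate).
coeffs = np.polyfit(np.log1p(samples), responses, deg=2)
surrogate = np.poly1d(coeffs)

# Online: the component-scale model calls only the cheap surrogate.
def component_margin(local_cooling_rates, applied_stress_MPa):
    strengths = surrogate(np.log1p(np.asarray(local_cooling_rates)))
    return strengths - applied_stress_MPa   # positive margin is acceptable

print(component_margin([0.5, 5.0, 15.0], applied_stress_MPa=250.0))
```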

Lesson Learned 6. Experiments are key to the success of ICME.

Estimates from several case studies indicated that 50 to 80 percent of the expense of developing ICME tools related to experimental investigations. As models for materials properties developed, it was often the case that earlier materials characterization was insufficient, since it had been conducted primarily for the purpose of quality control. Experiments were also required to fill the gaps where theories were not sufficiently predictive or quantitative. Finally, experimental validation is critical to gaining the acceptance of the product engineering community and to ensuring that the tools are sufficiently accurate for the intended use. An important function of computational models is to capture this experimental knowledge for later reuse.

Lesson Learned 7. Databases are the key to capturing, curating, and archiving the critical information required for development of ICME.

A number of case studies, both within and outside the materials engineering discipline, highlighted the need for large-scale capture and dissemination of critical data. One showed the negative impact of failing to archive data in a recoverable form.34 To create and utilize accurate and quantitative ICME tools, engineers must have easy access to relevant, high-quality data. Both open-access and proprietary databases permit the archiving and mining of the large, qualified, and standardized data sets that enable ICME.

34 Jonathan Zimmerman, Sandia National Laboratories, "Helium bubble growth during the aging of Pd-tritides," Presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.

Lesson Learned 8. ICME activities are enabled by open-access data and integration-friendly software.

Integration of computational models requires the transfer of data between models and customized links—that is, input and output interfaces—within models. To enable data transfer, information must be stored in accessible, standardized formats that can interface with various models. To facilitate model input and output, software must be designed to allow easy integration of user-developed subroutines, preferably through the use of open architectures that enable plug-in applications. While many of the main commercial codes used in design analysis allow user-definable subroutines, manufacturing simulation codes vary greatly in that capability and in the sophistication of their interfaces. To make the fullest use of ICME, both databases and model software must be designed with open integration in mind.
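Lesson 8 calls for standardized, accessible data formats and open model interfaces but does not prescribe a particular format, so the following Python sketch is a hypothetical illustration of the idea only: a material-property record is stored in a neutral, self-describing form (JSON here), and two independent "models" consume the same record through a shared, unit-checked reader. The schema, values, and toy models are invented and follow no established standard.

```python
# Hypothetical illustration of integration-friendly data exchange: a neutral,
# self-describing record that independent tools can read. The schema and
# values are invented for illustration; they follow no established standard.
import json

record_text = json.dumps({
    "material": "A356-T6 (example)",
    "property": "yield_strength",
    "units": "MPa",
    "temperature_C": 150,
    "value": 210.0,
    "source": "placeholder test report",
})

def load_property(text, expected_units):
    """Shared reader: every tool gets the same parsed, unit-checked record."""
    rec = json.loads(text)
    if rec["units"] != expected_units:
        raise ValueError(f"expected {expected_units}, got {rec['units']}")
    return rec

# Two different "models" consume the same record through the same interface.
def design_margin(text, applied_MPa):
    rec = load_property(text, "MPa")
    return rec["value"] - applied_MPa

def simple_life_estimate(text, stress_amplitude_MPa):
    rec = load_property(text, "MPa")
    ratio = stress_amplitude_MPa / rec["value"]
    return max(1.0, 1.0e7 * (0.5 / ratio) ** 3)   # toy life model, cycles

print("margin (MPa):", design_margin(record_text, applied_MPa=180.0))
print("life (cycles):", round(simple_life_estimate(record_text, 90.0)))
```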

Lesson Learned 9. In applying ICME, a less-than-perfect solution may be good enough.

Several case studies emphasized that ICME can provide significant value even if it is less than 100 percent accurate. While scientists may focus on perfection, many existing theories, models, and tools are sufficiently well developed that they can be effectively integrated into an ICME engineering methodology. Sensitivity studies, an understanding of real-world uncertainty, and experimental validation were key to gaining acceptance of, and value from, ICME tools with less than 100 percent accuracy. Judging what is a reasonable balance between efficiency and robustness requires a team with expertise.

Lesson Learned 10. Development of ICME requires cross-functional teams focused on common goals or "foundational engineering problems."

All the successful ICME efforts discussed in this report were carried out by cross-functional teams made up of experts in materials, design, and manufacturing who were well versed in technology integration and had a common goal. Since many of the required tools are still under development, both engineering and research perspectives must be represented on ICME teams. Successful ICME required the integration of many kinds of expertise, including materials engineering, materials science, mechanics, mechanical engineering, physics, software development, experimentation, and numerical methods. An important part of this lesson is the selection of a common goal (such as a foundational engineering problem) that includes (1) a manufacturing process or set of processes, (2) a materials system, and (3) an application or set of applications that define the critical properties and geometries.