

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




Session 3: Big Data Issues in Manufacturing

Session 3 of the workshop focused on the application of big data concepts in manufacturing. Adele Ratcliff, Office of the Secretary of Defense, had originally been scheduled to present during Session 3 but was unable to participate. Presentations were made in Session 3 by Jesse Margiotta, DARPA, and Wayne Ziegler, Army Research Laboratory.

DATA NEEDS TO SUPPORT ICME DEVELOPMENT IN DARPA OPEN MANUFACTURING

Jesse Margiotta, Technical Advisor, DARPA

Dr. Margiotta began by saying that today's qualification and certification paradigm for parts and processes is fraught with difficulties. The methods are empirical, sequential, and iterative, leading to potentially prohibitive increases in cost and time. He said that the greatest challenge in qualification and certification, however, is its uncertainty. Dr. Margiotta pointed out that uncertainty in the qualification and certification process adds risk to a project, preventing new technologies from being incorporated into larger systems. As a result, the current qualification and certification paradigm creates a barrier to technology innovation and adoption. To counteract this, DARPA has begun the Open Manufacturing initiative; its main goal is to build and demonstrate a rapid qualification framework that aims to comprehensively capture, analyze, and control manufacturing variability.

Dr. Margiotta explained the guiding principles for DARPA's Open Manufacturing program:

• Identify critical parameters, variation, and limits early in the process.
• Reduce testing and development iterations.
• Predict location-specific probabilistic performance.
• Build confidence in new technologies or qualification processes.
• Accelerate process maturity and systematic process reassessment.

Dr. Margiotta then described a project DARPA is developing with Honeywell Aerospace and several other team members.
The project aims to develop rapid qualification of powder bed fusion additive manufacturing processes—in particular, direct metal laser sintering (DMLS). The general approach consists of the following elements: parameterize the manufacturing process; implement new sensors into the manufacturing process; incorporate an ICME construct that links process to materials to properties; and apply rigorous model verification and validation to understand the confidence limits. In this way, process parameters are linked to quantified, location-specific properties of the as-manufactured part.

Dr. Margiotta showed a schematic of the critical elements for rapid qualification (Figure 4). The constituents that enable rapid qualification are shown in blue at the top of the figure. The supporting elements are shown below them; many of these—such as sensing, linking sensing capability to quality assurance, and microstructure property models—still need to be developed. Dr. Margiotta pointed out that the business cases and implementation plan are particularly important, as they will affect usage and acceptance by the broader community. He stated that the architecture consists of increasing layers of complexity, including difficulties with the interfaces between different elements of the system.

PREPUBLICATION DRAFT—SUBJECT TO FURTHER EDITORIAL CORRECTION 30

FIGURE 4 Critical elements of a rapid qualification system. SOURCE: J. Margiotta, Defense Advanced Research Projects Agency, presentation to the committee on February 6, 2014, Slide 4.

Dr. Margiotta then explained the informatics associated with the additive manufacturing process. First, experiments are conducted to define the processing window, which is then refined through additional experiments to determine the optimal site within that window. This leads to a semioptimized process and overall improved material properties. The energy input density can also be measured and correlated with the quality of the consolidated material. In addition, the build chamber is instrumented to provide real-time monitoring of process parameters. The sensors have been able to capture a large quantity of high-fidelity data; at this point, about 1 TB of sensor data are collected for each DMLS build.

Dr. Margiotta then moved to the ICME construct, which uses process–microstructure–performance models to simulate the manufacturing process. Dr. Margiotta explained that the current simulation takes several days to a week to complete, which is much too long. These tools need to be further developed and simplified. The ICME construct consists of the following elements:

• Computationally intensive, physics-based models to simulate the manufacturing process. These models simulate the laser interaction with the powder bed, including thermal profiles and heating rates.
• Microstructural models to predict stresses, grain size, strain hardening, and other variables.
• A yield strength prediction tool.
• Uncertainty quantification to understand the relationship between processing and properties and the sensitivity of this relationship.
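The correlation mentioned above between energy input density and the quality of the consolidated material can be sketched as follows. The volumetric energy density formula E = P/(v·h·t) is a form commonly used for laser powder bed fusion processes such as DMLS; every numeric value below is a hypothetical illustration, not data from the program.

```python
# Sketch of correlating energy input density with consolidated-material
# quality. The volumetric energy density E = P / (v * h * t) is a form
# commonly used for laser powder bed fusion; all values are hypothetical.
import numpy as np

def energy_density(power_w, scan_speed_mm_s, hatch_mm, layer_mm):
    """Volumetric energy density in J/mm^3."""
    return power_w / (scan_speed_mm_s * hatch_mm * layer_mm)

# Hypothetical build records: laser power (W), scan speed (mm/s), and the
# measured relative density of the consolidated coupon.
powers = np.array([150.0, 170.0, 190.0, 195.0, 200.0])
speeds = np.array([1200.0, 1100.0, 1000.0, 900.0, 800.0])
density = np.array([0.971, 0.980, 0.988, 0.992, 0.995])

e = energy_density(powers, speeds, hatch_mm=0.1, layer_mm=0.03)
r = np.corrcoef(e, density)[0, 1]  # Pearson correlation coefficient
print(f"energy densities (J/mm^3): {np.round(e, 1)}")
print(f"correlation with relative density: {r:.2f}")
```

In practice such a correlation would be built from instrumented-build sensor data rather than five hand-entered coupons, but the analysis step has the same shape.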

Dr. Margiotta said that the framework is the most critical element of the system. Once the general framework is in place, tools can be swapped in as they are developed. He noted that the tools are still under development and that much work remains. Dr. Margiotta explained that the Open Manufacturing project was one of the first to extend verification and validation and uncertainty quantification to ICME processes. The Open Manufacturing project intends to draw on the work and standard practices developed in other fields in which simulation is well developed and rigorously validated. These same methods can be transitioned into the materials and manufacturing arena.

A participant asked what is meant by "rapid" in this context, because the term can mean different things to different people. Dr. Margiotta responded that a typical qualification effort takes at least several years, with a long development effort before the qualification is begun. He was hesitant to associate a specific number with "rapid" but pointed out that there are significant time savings to be had in qualification efforts.

A participant also asked about the meaning of probabilistic design. Dr. Margiotta explained that in the traditional approach one identifies the worst defects possible, places them in the most critical locations, and then, using the minimum material properties, designs the part so that it will withstand that worst-case scenario. Alternatively, one can use probability to understand the likelihood of a defect, its location, and its effect, and then optimize the design accordingly.

Dr. McGrath asked about the data that have resulted from the Honeywell additive manufacturing project. Dr. Margiotta responded that the intent is for these programs to provide the data in a data archival tool (as described in Mr. Ziegler's talk, below). Access will be provided to other government agencies, with the details of broader distribution still to be determined.
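The contrast between worst-case and probabilistic design can be sketched with a small Monte Carlo experiment. The defect-size distribution, the linear strength knockdown, and the load below are hypothetical stand-ins chosen for illustration, not values from the program.

```python
# Monte Carlo sketch: worst-case design vs. probabilistic design.
# A part fails when its strength, degraded by the largest defect present,
# drops below the applied load. All distributions are hypothetical.
import random

random.seed(0)

BASE_STRENGTH = 100.0   # nominal strength, arbitrary units
LOAD = 60.0             # applied load
KNOCKDOWN = 50.0        # strength lost per mm of defect size

def strength(defect_mm):
    return BASE_STRENGTH - KNOCKDOWN * defect_mm

# Worst-case design: assume the largest credible defect (say 0.6 mm)
# sits at the most critical location and demand a positive margin.
worst_case_margin = strength(0.6) - LOAD

# Probabilistic design: sample the defect-size distribution instead and
# estimate the probability of failure directly.
n = 100_000
failures = sum(strength(random.gauss(0.3, 0.15)) < LOAD for _ in range(n))
p_fail = failures / n

print(f"worst-case margin: {worst_case_margin:.1f}")
print(f"estimated probability of failure: {p_fail:.4f}")
```

The worst-case approach guarantees a margin against an assumed extreme, while the probabilistic approach quantifies how unlikely that extreme actually is, which is what allows the design to be optimized rather than uniformly over-built.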
The process and materials data will not become proprietary. Dr. Margiotta was asked if this constituted a "big data" problem. He responded that the project is generating considerable amounts of data, but it is not considered big data based on today's definition. However, the materials manufacturing community does not currently have the ability to manage and analyze even this relatively modest amount of data. Dr. Margiotta also pointed out that DARPA, along with ARL and other program partners, is developing methods to standardize data fields and metadata fields for materials and materials processing.

THE MATERIALS INFORMATION SYSTEM

Wayne Ziegler, Materials Engineer, Army Research Laboratory

Mr. Ziegler began his presentation by explaining the value of a materials information system. He noted that, while it is not the case for this audience, he often needs to convince his listeners of the value of the materials information system approach; people often are more concerned with intellectual property or do not understand the problems that currently exist. He pointed out that a materials information system does the following:

• Enables researchers to work more quickly and intelligently.
• Reduces duplication of effort in test and evaluation, which correspondingly reduces costs.
• Stops data loss and ensures data are available for the next generation.
• Improves data consistency and quality.
• Improves work processes and throughput.
• Accelerates implementation.

FIGURE 5 Materials information system. SOURCE: Wayne Ziegler, Army Research Laboratory, presentation to the committee on February 6, 2014, Slide 4.

Mr. Ziegler said that a successful data management plan uses a systems engineering approach and includes four main components: capture, analyze, deploy, and maintain. He suggested that DOD has historically had difficulty maintaining programs that manage material and process data because of the challenges of a mobile workforce and shifting budget considerations. NASA, by contrast, has a long history of strict materials data management, as several major catastrophes at NASA were related to materials issues. As a result, ARL is working with NASA to identify lessons learned and leverage NASA's experience and IT infrastructure resources.

Current challenges associated with the development of materials data management plans, both in industry and DOD, include these:

• Lack of direction.
• Lack of adequate resources.
• Lack of a return-on-investment business case.
• Lack of agreement: Not all companies or agencies believe that all data should be shared, and the cultural mindset needs to change.

Mr. Ziegler said that the goal is to build a DOD resource for materials and process information. The DOD resource, the Materials Selection and Analysis Tool (MSAT), is currently hosted by NASA as an independent component of the NASA Materials and Processes Technical Information System (MAPTIS). He went on to say that the MSAT program has a strong partnership with DARPA and its Open Manufacturing program (see Dr. Margiotta's presentation, above).

Figure 5 shows a vision of a materials information system. It begins with experimental methods; Mr. Ziegler pointed out that we tend to lose metadata in this area, and experimental methods and results should be part of an integrated database structure. Metadata include information related to testing conditions and program information necessary for data sets to be completely understood and, if necessary, validated through additional testing. Data mining techniques are then applied to the data, and the mined data are used to inform models. This process is iterative and requires the tracking of data pedigree (in other words, data about the data). Mr. Ziegler noted that as much as half of the data can be pedigree information. Relevant metadata are needed to compare data across data sets. Mr. Ziegler indicated that the Open Manufacturing project follows this general form.

Mr. Ziegler pointed out that MSAT is called a selection and analysis tool because its initial focus is on making programmatic or research decisions, based on the available data sets, in a robust and timely way. MSAT takes a broad approach spanning application, modeling, resource management, process approval, and improvement.

Mr. Ziegler then argued that there needs to be a cultural shift in work flow management. The traditional work flow paradigm is to execute a task, collect and extract data, return a bigger data set, and pair it with separately recorded information about the process. However, when the collection of data is separated from the collection of process information, fidelity drops. Mr. Ziegler argued in favor of a new work flow paradigm, whereby when a process is executed, the data and the metadata are collected simultaneously. When data are extracted from this set, there would already be a link coupling them in any future data processing.

He went on to describe the steps in the materials information system:

1. Define data sets. Mr. Ziegler pointed out that any data that can be collected in a reasonable fashion should be collected, as they might prove useful later. ARL is still addressing this first step, and Mr. Ziegler explained that data collection decisions are iterative; once researchers have started collecting a particular data set, they will likely determine that they need other data as well.
2. Define the data management schema. This process looks at how to organize and arrange the data. It is also iterative.
3. Develop import templates. Mr. Ziegler noted that ARL currently uses Excel; the templates are in comma-separated values format so that they are software-agnostic. Several companies provide commercial data management packages or modeling software, and the objective is to build templates that can interface with a variety of commercial software.
4. Use the templates to import data.
5. Manage the data. This step includes defining access control and conducting verification and validation.
6. Define the use cases. The resulting information is used to define output templates.

Mr. Ziegler suggested that Steps 1, 5, and 6 are the most critical for users. Once the use case is known, the data can be exported in a useful way.

Mr. Ziegler then discussed several technical considerations, including the following:

• Defining the main function of the system: Is it capturing a manufacturing process, or does it focus on identifying material properties?
• Integrating the system with existing systems and workplace practices with minimum impact.
• Understanding how system users will use the data.
• Data flow through the system. Mr. Ziegler noted that in the materials science and engineering technology area this is not "big data" (yet), though it is on a large scale.
• The type of information to be handled.
• System setup, deployment, and maintenance.
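The coupled capture of data and metadata that Mr. Ziegler advocates, together with the software-agnostic CSV import templates he describes, might be sketched as follows. The field names and pedigree scheme here are hypothetical illustrations, not ARL's actual template.

```python
# Sketch of a CSV import template that records measurement data and its
# metadata (pedigree) in the same row, so they stay linked through any
# future processing. Field names are hypothetical, not ARL's schema.
import csv
import io

FIELDS = [
    # data fields
    "specimen_id", "yield_strength_mpa",
    # metadata / pedigree fields, captured at the same time as the data
    "test_method", "test_temp_c", "operator", "program", "record_date",
]

rows = [
    {"specimen_id": "S-001", "yield_strength_mpa": "812",
     "test_method": "ASTM E8", "test_temp_c": "23", "operator": "jdoe",
     "program": "demo", "record_date": "2014-02-06"},
]

# Write the template: one header row, then coupled data+metadata records.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)
template_text = buf.getvalue()

# Import side: reading a record recovers the data with its pedigree intact.
record = next(csv.DictReader(io.StringIO(template_text)))
print(record["specimen_id"], record["yield_strength_mpa"], record["test_method"])
```

Because the pedigree travels in the same row as the measurement, an analyst extracting yield strengths downstream never has to re-join them against separately recorded process notes, which is exactly the fidelity loss the new work flow paradigm is meant to prevent.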

• Responsibility for, and ownership of, the various system components. Mr. Ziegler said that this can be a contentious issue, as data have value. Defining data access may not always be technically challenging, but it can be a policy challenge.

Mr. Ziegler concluded by noting some practical considerations, including these:

• Not every user is an expert, so the user interface becomes critical.
• Materials and process data are usually incomplete.
• Data have value; access control is critical and potentially contentious.
• It takes time to rationalize and consolidate data. The better the system is at collection, the better it will be at consolidation.
• Data end users need data in diverse places and formats.
• Materials information systems need end users. Mr. Ziegler argued that there is no value in the system if it does not have end users, and it can be challenging to identify and engage them.
• Designing a system from scratch is impractical.

A participant indicated that data ownership can be an obstacle to data sharing. He said that DOD contracts have many data requirements in them, and that aspect needs to be managed on the contractual side to ensure that the requirements are not cost prohibitive. He pointed out that responsibility for sharing data is shared among material suppliers, original equipment manufacturers, and the government. In some cases, a supplier provides a material but no corresponding metadata. Agreements with suppliers can take 1-2 years to develop, which slows innovation. Mr. Ziegler agreed that acquisition is an important element, though it is outside the scope of ARL's mission and activities.

Dr. McGrath asked for clarification on MSAT. Is it a tool for materials selection, with a correspondingly limited user community? Or is it an element of a larger system within a larger community, with a framework surrounding it? What is the plan for scaling up beyond the Open Manufacturing project? Mr. Ziegler said that MSAT is both a materials selection tool and part of a larger system. MSAT's current focus is on where to store materials and process data and how to develop a clear interface with the modeling community.

The discussion then turned to standards. A participant stressed that the process for developing standard terminology is very difficult and slow. There is an ASTM committee for standards in this area. Companies do not like to fund their employees to do this type of activity, however, and the ASTM committee terminated its efforts because of insufficient community funding. Also, companies are not interested in attaching themselves to a certain format, as they are concerned they will be forced to share data. They prefer to keep information proprietary in their own formats. A few participants noted that the culture among researchers and companies is such that materials data are considered a competitive edge, and companies want to protect their intellectual property.

DISCUSSION

Valerie Browning, from ValTech Solutions LLC, opened the discussion by noting that the workshop speakers thus far had discussed materials challenges in variety and veracity, but not volume or velocity. In materials, therefore, it may be more important to think about information rather than big data. The materials area has an extra layer of extraction or analytics that is unique to this community. The questions then become, Who is responsible for developing the analytics? How can we manage work flow on different time scales? And who owns the analytics?

Another participant agreed that there is a data problem in materials science, but not a big data problem. He said that the data problem seems to center on data collection and the lack of sharing. He suggested that a mandate is necessary stating that any government-funded data must be put into a standard

format, a step that is being considered by NSF and DOE. A DOD participant said that DOD has explored the idea of such a mandate but found it time consuming and expensive and, in the end, concluded that the cost may outweigh the benefit. He indicated that the issue is more than one of data format; it includes questions about who owns and maintains data and where the information should reside. Someone else remarked that the NSF repository is not user friendly. Another participant pointed out that it is fairly common for universities to have permanent storage facilities available and gave the Deep Blue program at the University of Michigan as an example.18 However, other participants argued that these programs are expensive and do not always include metadata.

One participant believed that there is a data collection problem in manufacturing. Manufacturers need to organize data definitions, contextual information, and the meaning of operations, and to connect with ICME. Dr. Davis brought up the analogy of health care, finding many parallels in the health care shift to digital patient records. Another participant noted that advanced manufacturing can be done on a small scale that is not necessarily part of a large corporation. This could add a layer of complexity, as a small player may not be interested in negotiating business plans.

18 See http://deepblue.lib.umich.edu/ for more information, accessed February 26, 2014.