4 The State of Smart Manufacturing Technology and Strategies to Address the Challenges
Pages 56-84

From page 56...
... Just like the U.S. Interstate Highway System established after World War II, one could consider establishing a national transformative data infrastructure -- a concept the committee refers to as the "Cyber Interstate" -- to scale data and expertise sharing as a key capability for maximum industry-wide smart manufacturing impact.
From page 57...
... for maintaining data frameworks by sector could be identified. The Cyber Interstate also gives reason to establish the following:
• Static harmonized data agreements, which are analogous to lane widths, road directions, and traffic signs -- for smart manufacturing, one can start with basic, sharable data formats and schemas that lower the barriers to entry.
From page 58...
... examples of individual plants and their broader supply chain networks. This visibility can include their playbooks with data, modeling, and orchestration,3
3 NIST, 2022, Towards Resilient Manufacturing Ecosystems Through Artificial Intelligence -- Symposium Report, https://nvlpubs.nist.gov/nistpubs/ams/NIST.AMS.100-47.pdf.
From page 59...
... In the absence of advanced, categorized, secure, contextualized data hubs, manufacturers resort to traditional download practices for in-field data acquisition, which are tedious, error-prone, and often unsuitable for real-time data analytics. There is significant value in creating extensible, open-access data infrastructure that provides a common middleware substrate for the development and deployment of domain-specific analytics on streaming data as well as reliable hosting of applications with seamless access to in-field Internet of Things (IoT)
From page 60...
... Conclusion: A secure digital smart manufacturing data interstate infrastructure that serves as a conduit to connect the wider smart manufacturing community, including existing digital connections, is critical to ensuring that data and experience are shared across the smart manufacturing industry at sufficient scale to ensure U.S. global competitiveness in a sustainable way.
From page 61...
... The committee considers the short-, medium-, and longer-term timeframes to be in the ballpark of 2, 4, and 6 years, respectively.
SYSTEM AND DATA INTEGRATION FOR SMART MANUFACTURING
There is great value in enhancing and enabling digital integration within manufacturers and across supply chains.
From page 62...
... Benefits of Data Sharing in Smart Manufacturing
Share Data to Build and Analyze Algorithms and Models for Individual Factory or Company Use
Many operations are common enough that sharing data can avoid the substantial resources and cost of reinventing algorithms for commonly used applications that have already been worked out. Sharing data to cut this cost goes hand in hand with sharing enough know-how about the data and the operation together to be useful.
From page 63...
... Allowing Manufacturers in Supply Chains the Visibility of Each Other's Capacities to Better Manage Variations and Disruptions (e.g., a pandemic)
This includes sharing data for product-oriented supply chains, such as consumer products that benefit from Amazon- or Walmart-like next-day, at-your-doorstep services.
From page 64...
... To use RAG-QA with large generative models, four steps need to be followed: prepare the data, train the retriever, train the generator, and fine-tune the RAG-QA model.4 Given concerns about protection of proprietary information and competitiveness, a corporation is more likely to share its capacity data with its customers than across or even down the supply chain.
Technical Challenges
The technical challenges relate to what characterizes "big data," which arises from the rapidly increasing volume, velocity, and variety of data as well as the low veracity (i.e., high uncertainty)
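The four RAG-QA steps noted in the excerpt above can be illustrated with a minimal retrieve-then-generate sketch. This is not code from the report: embed() and generate() are hypothetical placeholders standing in for a trained dense retriever encoder and a fine-tuned generative model, and the example documents are invented.

    # Minimal RAG-QA sketch (illustrative only; embed() and generate() are
    # hypothetical stand-ins for a trained retriever encoder and a generative model).
    import numpy as np

    def embed(text: str) -> np.ndarray:
        # Placeholder retriever encoder: maps text to a fixed-size vector.
        rng = np.random.default_rng(abs(hash(text)) % (2**32))
        return rng.normal(size=64)

    def generate(prompt: str) -> str:
        # Placeholder generator: in practice a fine-tuned large generative model.
        return "[answer conditioned on] " + prompt[:120]

    # Step 1: prepare the data -- curated snippets (e.g., capacity or test records).
    documents = [
        "Line 3 has a rated capacity of 1,200 units per shift.",
        "Supplier A lead time for machined housings is six weeks.",
        "Powder bed fusion builds are limited to a 250 mm x 250 mm plate.",
    ]
    doc_vectors = np.stack([embed(d) for d in documents])

    def retrieve(question: str, k: int = 2) -> list:
        # Step 2: the trained retriever scores documents against the question
        # (cosine similarity here) and returns the top-k matches.
        q = embed(question)
        scores = doc_vectors @ q / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q))
        return [documents[i] for i in np.argsort(scores)[::-1][:k]]

    def answer(question: str) -> str:
        # Steps 3-4: the generator (fine-tuned jointly with the retriever in RAG-QA)
        # conditions on the retrieved context plus the question.
        context = "\n".join(retrieve(question))
        return generate("Context:\n" + context + "\nQuestion: " + question)

    print(answer("What is the rated capacity of Line 3?"))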
From page 65...
... ; however, the output format can range from a PDF file to an XLS file formatted to the liking of the testing house, making the data time-consuming to access and ingest for analysis because of manual reformatting.
• Supply chain procurement, order, and production capacity data: These relate to data flows between specific OEMs and their suppliers.
From page 66...
... The process enables data retrieval and access, ensures data quality, and adds value.5 Current general practices in data curation involve data organization and retrieval, such as naming files, structuring folder directories, and creating metadata to describe the data in a standardized way. A key understanding is that most factories (especially SMMs)
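As a concrete illustration of the standardized metadata that such curation practices call for, the sketch below writes a JSON "sidecar" record alongside a raw data file. The field names and schema are assumptions made for this example, not a standard cited in the report.

    # Illustrative data-curation sketch: write a standardized metadata "sidecar"
    # next to a raw data file. The schema fields are assumptions for illustration,
    # not a published standard.
    import json
    from datetime import datetime, timezone
    from pathlib import Path

    def write_sidecar(data_file: Path, machine_id: str, sensor_type: str,
                      units: str, sample_rate_hz: float) -> Path:
        # Describe the raw file in a machine-readable, consistently named record.
        metadata = {
            "file": data_file.name,
            "machine_id": machine_id,
            "sensor_type": sensor_type,
            "units": units,
            "sample_rate_hz": sample_rate_hz,
            "collected_utc": datetime.now(timezone.utc).isoformat(),
            "schema_version": "0.1",
        }
        sidecar = data_file.with_name(data_file.name + ".meta.json")
        sidecar.write_text(json.dumps(metadata, indent=2))
        return sidecar

    # Example usage (hypothetical file and machine names):
    # write_sidecar(Path("spindle_07_vibration.csv"), "CNC-07", "accelerometer", "m/s^2", 10000.0)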
From page 67...
... Cybersecurity solutions should include formal verification, continuous integration/continuous delivery, digital twin, secure gateways, trust-but-verify components, cyber-physical passport, and the global supply chain ledger. Next-generation secure smart manufacturing architectures and protocols should incorporate AI-enabled threat detection and response mechanisms to detect and respond to cyberattacks in real time.
From page 68...
... with the next generation of secure manufacturing architectures. To realize the above-mentioned benefits and to address the business and technical challenges of data sharing, the committee envisions data banks with data contributed or donated by sources in the manufacturing community that are rigorously validated to realize multifaceted functions as described below.
From page 69...
... Cyber threats can include attacks on supply chain partners, such as vendors or subcontractors, which can lead to data breaches or sabotage of critical systems. A modular and extensible methodology is desirable that can be applied to existing or new manufacturing facilities and supply chains to estimate energy savings and emissions reductions while maintaining a desired level of cyber resiliency; such a methodology would help create consistent baselines for cybersecurity.
From page 70...
... With the assumption that individual technologies have been developed through existing and regular funding channels, the committee focuses on six interdisciplinary technologies that are in high demand across the smart manufacturing community -- that is, human–AI co-piloting, sensing, AI/ML, operational technology/information technology (OT/IT) integration through platforms, digital twins, and uncertainty quantification.
From page 71...
... The need for humans to make decisions associated with the deployment of smart manufacturing technologies will remain: the design, programming, and maintenance of rapidly evolving advanced technologies used in smart manufacturing cannot be accomplished without significant human involvement.
From page 72...
... Designing and producing unique and personalized products for consumers requires human involvement. With the help of advanced automation, manufacturers can integrate more customization and adapt to small changes in their
From page 73...
... Thanks to improvements in industrial IoT and high-granularity data acquisition technologies, baseline models can be constructed in a reliable and efficient way using wide-ranging data types spanning large frequency ranges from manufacturing systems.20,21 For instance, the implementation of smart sensors on a shop floor can generate data relevant to safety, productivity, quality, energy consumption, and emission baselines.22
Accessibility of Sensors to Source of Information
Sensor housing design directly affects sensor accessibility to the source of signal generation and plays a critical role in high-quality data collection and information acquisition, which forms the foundation for smart manufacturing and its ability to detect and trigger automated and manual workflows. General practice in sensor design focuses on economical mass production of sensors.
From page 74...
... Gonzalez, R.M. Hernandez, et al., 2016, "Fabrication of Smart Parts Using Powder Bed Fusion Additive Manufacturing Technology," Additive Manufacturing 10:58–66.
From page 75...
... This is a new area with much innovation, and smart manufacturing should build on the innovation, privacy, and security and be flexible as AI systems evolve and improve over time. Edge-cloud infrastructure and platform technologies are critical to enabling and scaling data sharing and data interoperability for robust local and collaborative model development, data exchange for operation interoperability, and logistics visibility for more resilient supply chains.
From page 76...
... Developing the infrastructure, platforms, and tools for integrating AI/ML with public data is therefore critical. This includes large AI models for general knowledge and physical domain knowledge, as well as private proprietary data, without intermingling private data.
From page 77...
... Wolff, and A. Haghighi, 2023, "Architecture-Driven Physics-Informed Deep Learning for Temperature Prediction in Laser Powder Bed Fusion Additive Manufacturing with Limited Data," Journal of Manufacturing Science and Engineering 145(8)
From page 78...
... By allowing data to be processed close to the equipment that is the source of data generation, edge computing adds another layer of benefits in terms of reduced latency and real-time decision-making, thereby increasing overall system reliability. By leveraging these IT techniques, OT is scaled in multiple ways to realize improved operational efficiency, enterprise competitiveness, and adaptivity to rapidly changing market demands.34
Digital Twin
Development of a digital twin involves the creation of a digital representation of a physical object or system of sufficient fidelity for it to be useful for life-cycle analysis.
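At its simplest, a digital twin of the kind described above can be read as a state object that is synchronized from sensor readings and paired with a physics-based model for prediction. The sketch below is a deliberately small illustration under that reading; the first-order thermal model, the SpindleTwin name, and its coefficients are invented for the example and are not the report's architecture.

    # Minimal digital-twin sketch: mirror sensed state and predict with a simple
    # physics model. The thermal model and coefficients are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class SpindleTwin:
        temperature_c: float = 25.0         # mirrored state of the physical asset
        ambient_c: float = 22.0
        heat_gain_c_per_s: float = 0.03     # assumed heating rate under load
        cooling_coeff_per_s: float = 0.005  # assumed Newtonian cooling coefficient

        def ingest(self, measured_temperature_c: float) -> None:
            # Synchronize the twin with the latest in-field sensor reading.
            self.temperature_c = measured_temperature_c

        def predict(self, seconds: int, under_load: bool) -> float:
            # Step the thermal model forward to estimate future temperature.
            t = self.temperature_c
            for _ in range(seconds):
                if under_load:
                    t += self.heat_gain_c_per_s
                t -= self.cooling_coeff_per_s * (t - self.ambient_c)
            return t

    twin = SpindleTwin()
    twin.ingest(41.7)                                    # latest sensor reading
    print(round(twin.predict(600, under_load=True), 1))  # 10-minute-ahead estimate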
From page 79...
... Uncertainty Quantification and Handling of Manufacturing Data and Models
Currently, there is a lack of tools that address the process of data curation and the need to systematically and effectively quantify and manage the uncertainty that results from the large volume and variety of data collected from various sensors and other sources during manufacturing processes. General-purpose tools are being developed with the Clean Energy Smart Manufacturing Innovation Institute as platform and repeatable information-model capabilities for processing data upon collection, ingestion, and contextualization, and for further processing and categorization if aggregated for collaborative use.
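One concrete way to fold uncertainty quantification into data curation, in the spirit of the paragraph above, is to store a distributional summary with each measurement and propagate it through downstream models. The Monte Carlo sketch below illustrates the idea; the noise levels and the per-part energy model are assumptions made for this example only.

    # Illustrative uncertainty-quantification sketch: propagate assumed sensor noise
    # through a simple per-part energy model by Monte Carlo sampling.
    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed measurements: power with 2% relative noise, cycle time with 0.5 s noise.
    power_kw = rng.normal(loc=12.0, scale=0.02 * 12.0, size=10_000)
    cycle_time_s = rng.normal(loc=45.0, scale=0.5, size=10_000)

    energy_kj = power_kw * cycle_time_s  # toy model: energy consumed per part (kJ)

    mean = energy_kj.mean()
    lo, hi = np.percentile(energy_kj, [2.5, 97.5])
    print(f"energy per part: {mean:.0f} kJ (95% interval {lo:.0f}-{hi:.0f} kJ)")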
From page 80...
... Zhang, O. Fink, et al., 2023, "A Comprehensive Review of Digital Twin -- Part 2: Roles of Uncertainty Quantification and Optimization, a Battery Digital Twin, and Perspectives," Structural and Multidisciplinary Optimization 66(1)
From page 81...
... The Department of Energy and other federal agencies should fund programs and consortia that develop technologies at the intersections of critical technologies (e.g., human–artificial intelligence [AI] co-piloting, sensing, AI/machine learning, platform technologies, digital twins, and uncertainty quantification)
From page 82...
... Sensing
Promote research that synergistically integrates physical domain knowledge with data-driven methods to ensure consistency between the digital replica and physical counterpart. Explore additive manufacturing for customized sensor housing design to improve adaptivity and accessibility of sensors to the source of signal generation for high-fidelity data acquisition.
From page 83...
... Uncertainty Quantification
Fund research that develops methods (1) to incorporate uncertainty quantification into the data curation process as part of the data description to ensure data transparency and validity as the basis for informed, robust decision-making; and (2)
From page 84...
... • Develop and nurture consortia that address industry business models about intellectual property, trade secrets, data, know-how, and security that are preventing the industry from taking advantage of the full capability of the network as a resource.
• Develop and nurture consortia that provide toolkits, references, and sources of expertise for developing digital twins of manufacturing processes, including KPIs to gauge increased throughput, flexibility, and worker effectiveness.

