Page 60

3—
Integrated Product and Process Design

Introduction

Information age manufacturing begins with information age design. Designing both a product and the processes by which it is produced involves understanding what the product is to do and how the product will do those things, converting the requirements for the product's behavior into engineering specifications, and producing plans that marshal the materials, equipment, and people needed to make and deliver the products. Even apart from the pressures for shorter time to market for new products that stress current design paradigms, new design challenges are generated by new product trends (e.g., shrinking feature size, decreasing tolerances, or a growing number of parts) and by new manufacturing processes that designers must learn to exploit.

As a result, the amount of knowledge and data relevant to product and process design is rapidly becoming more than a single individual can comprehend. A further complicating factor is that an integrated product and process design (IPPD) effort must usually be coordinated among a number of engineering teams with different specialties and from different companies, since it is rare for a single company to have all the skills, technologies, and financial resources to design in-house all of the components needed. Managing this coordination task represents a major opportunity for information technology (IT) to have a positive impact.

As described in Chapter 1, a comprehensive IPPD system would include integration of performance specifications, conceptual design, detailed design, manufacture, and assembly, together with the ability to simulate actual use, field



Copyright © National Academy of Sciences. All rights reserved.


repair, upgrade, and disposal. Such a comprehensive system will not be possible for many years. However, an important first step toward this vision can be achieved by joining detailed design with manufacturing and assembly. To accomplish this requires a new level of information structuring and integration.

Feature-based design is the best way currently known to capture and integrate the necessary information that links the geometry of parts with their functions, fabrication, and assembly. Present computer-aided design (CAD)1 systems support creation of geometry only. Apart from stress and thermal analyses of parts and certain types of kinematic analyses, most design analyses must be done manually because there is no way to obtain the necessary information from the circles and lines stored in the CAD system.

An IPPD system that realizes this first step toward a more comprehensive system will consist of three elements: a database, a set of algorithms, and user interfaces. The database will be structured to capture the information about the design in the form of geometry plus features (places of interest on each part, together with information on what role they play in the product's function and how to make and assemble the features in relation to each other). The algorithms will take the information they need from this database to simulate function, determine optimal assembly sequences, estimate fabrication or assembly cost, or perform design-for-assembly analyses, for example. The user interfaces will make it easier for the designer to create a design using features and apply the algorithms to study and perfect the design.

Prototype software that does some of these things exists now. This software can be a basis on which to build a new kind of computer-assisted design that integrates technical, business, and economic issues relevant to design. It can support analyses of cost and function, as well as the study of families of products that share parts or subassemblies. Such software has been demonstrated for the design of certain complex electro-mechanical items.

Another IT-enabled connection between design and manufacturing is the use of stereolithography as a visualization aid for designers and as the basis for rapid generation of prototype molds and dies for the production of mechanical parts. In an experiment conducted by a major automobile manufacturer, vendor bids based on a drawing and a stereolithographed model were lower than bids based on a drawing alone; this result was explained by the fabricator's greater ability to visualize the complexities of the item in question and thus to more accurately determine the costs of its fabrication.

1 Over the years, the acronym CAD has evolved in meaning. CAD initially stood for "computer-aided drafting"; however, perhaps because information technology achieved greater penetration into the world of design engineers, it has come to mean "computer-aided design," of which one part is drafting.
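The "geometry plus features" database described above can be illustrated with a minimal sketch. All class and field names below are hypothetical, not drawn from any existing CAD system; the sketch shows only how geometry, function, fabrication, and assembly information might be linked through features so that algorithms can query the design rather than raw circles and lines.

```python
from dataclasses import dataclass, field

@dataclass
class Feature:
    """A place of interest on a part: geometry plus the roles it plays."""
    name: str
    geometry: dict          # e.g., {"type": "hole", "diameter_mm": 6.0}
    function: str           # role in product function
    fabrication: str        # how to make it
    mates_with: list = field(default_factory=list)  # features on other parts

@dataclass
class Part:
    name: str
    features: dict = field(default_factory=dict)

    def add_feature(self, f: Feature):
        self.features[f.name] = f

def assembly_pairs(parts):
    """List (part, feature, mating feature) triples for assembly planning."""
    pairs = []
    for p in parts:
        for f in p.features.values():
            for m in f.mates_with:
                pairs.append((p.name, f.name, m))
    return pairs

housing = Part("housing")
housing.add_feature(Feature("pin_hole", {"type": "hole", "diameter_mm": 6.0},
                            "locates cover pin", "drill, then ream",
                            mates_with=["cover.pin"]))
print(assembly_pairs([housing]))  # [('housing', 'pin_hole', 'cover.pin')]
```

An assembly-sequence or cost-estimation algorithm of the kind the text envisions would traverse structures like these instead of interpreting bare geometry.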

Once a core design system is in trial use, it can be extended up and down the design process to include concept design and field use considerations. The information hooks in the data structures will be there to integrate with functional simulations, repair environments, and other aspects of product design. This evolution of the design system will be fueled by experience with its core implementation; feedback from users will determine what new capabilities it needs.

Design Paradigms

Electronic Design

Of all the different types of design for discrete manufacturing, the design of very large scale integrated (VLSI) circuits (described in greater detail in Box 3.1) is today the most sophisticated. Thus, it is reasonable to suggest that the VLSI design paradigm might suggest directions for improving the IPPD process in other manufacturing domains.

BOX 3.1 An Example of Information Technology-driven Integrated Product and Process Design

In the semiconductor industry, electronic computer-aided design (CAD), also called electronic design automation, is a good illustration of how information technology can support the integrated product and process design process. Today, a designer of a chip using commercially available CAD tools would undertake the following steps:

• Conceptual design. Customer requirements are determined and converted into design goals (e.g., instruction set, power consumption, and chip size and speed). Designers choose an underlying technology based on these goals. In addition, an associated "framework" orchestrates the operation of various other design tools, thus enforcing some of the methodology (e.g., one cannot proceed to step x before step y is complete).

• Detailed design. In detailed design, the conceptual design is reduced to logic elements that take in, convert, and put out logical values. A designer determines the basic architecture of the chip (how it will be partitioned into functional units) and then writes a description of each functional unit using a hardware design language such as VHSIC Hardware Design Language (VHDL). Detailed design involves making trade-offs among power, speed, and size; tools for detailed design address automated logic design, test vector generation, formal verification to ensure that the gate-level view of the chip matches the VHDL description, early timing analysis, and the spatial arrangement of various components. Tools are also available to perform a variety of detailed design tasks, such as static timing analysis, partitioning, layout, routing, wiring, and circuit-level timing; results from these tasks may impel the designer to alter the VHDL description. Unless a chip is to be fully customized (perhaps for reasons of maximizing speed of operation or minimizing power consumption or size), libraries are used to provide logic components to be integrated into the new chip. Detailed design results in a set of chip masks that drives the production process.

• Simulation and verification. The detailed chip design must be tested to determine correctness and the extent to which the design meets functional requirements. Functional simulation ensures that the chip, as represented through the VHDL description, actually meets customer requirements for logic. Physical simulation, orders of magnitude slower than functional simulation, takes place at the gate or transistor level to ensure that timing and other layout-dependent issues are handled appropriately. Physical tests are performed on the chip that may reveal flaws in the simulator used to design the chip, flaws in the fabrication process, or design flaws that were initially unnoticed by the designer. The design stages and simulation tests are repeated until results suggest that the overall chip design is adequate.

Process design generally need not be undertaken to create a fabrication process for a particular chip, since chip designers design only products that a given factory can fabricate; process design is needed to plan the factory and the fabrication technologies the factory will use to produce electronic chips of near-arbitrary design within one product or process technology family. The design of a semiconductor technology requires similar steps. What results is a set of design rules for the technology and high-level chip masks (used for etching the chip's features onto the semiconductor substrate) and usually a set of tools and library functions that the designer can use in implementing the logical and physical design of a chip built with this technology. These library functions consist largely of Boolean functions, circuit components such as latches, registers, register files, and memories, and design rules that define how they can be connected together in a logic network. Each library cell has a geometric view, a functional view (e.g., a three-input NAND gate) for use in logic synthesis, a behavioral view for use in high-level simulation, a fault view for use in testing, and a timing view for use in timing and delay analyses. Cell libraries may contain families of same-function circuits with different timing, power consumption, test characteristics, and geometry.

An examination of VLSI design suggests that its successes are based on certain key attributes, including:

• Appropriate design abstractions. The amount of detail needed in each step of electronic design is appropriate for the work to be performed at that step, thus allowing, for example, the three-dimensional characteristics of a chip to be virtually ignored until the final step of the detailed design process. Central to these design abstractions are languages that describe electronic products at many levels of abstraction (e.g., Verilog HDL) and the ability to use these descriptions in simulation, in design-space exploration, and in automation of a large part of a product's detailed design. Further, design tools can operate at different levels of abstraction and still obtain feedback from lower levels in the abstraction hierarchy; such tools enable an iterative refinement of the chip design that starts with
high-level estimates and continues to the manufactured chips. Today, high-level chip specifications allow design alternatives to be explored almost at once, leading to early detection of conceptual-level errors and pruning of unprofitable paths. The early design can be refined and made more specific; data collected from this level can be back-annotated and used by higher-level tools.2

• Appropriately parameterized "parts" that can be used in the design process. Chips with new designs often make considerable use of logic functions that have been implemented before, for example, Boolean functions, latches, drivers, receivers, and terminators. Libraries of these basic functions (called library cells) and assemblies of these cells (called macros) that implement more sophisticated functions enable a designer to use prevalidated designs (perhaps with the specification of a few parameters) when appropriate, with the result that these functions are more likely to be integrated into a new chip correctly and in less time than if they are designed from scratch. (On the other hand, since connecting predesigned building blocks almost never results in optimal chip performance, chip designers may choose to forgo the advantages of libraries to improve various dimensions of chip performance.)

• The ability of the designer to ignore most off-nominal behavior and side effects in the design process. In practice, the behavior of transistors is influenced very strongly by their context. For example, a logic gate generally has different speed and power characteristics depending on where it is used. However, the design rules associated with library cells and macros, and the tools that enforce adherence to these rules, enable a designer to treat library components as parts that can be connected without concern for issues such as back-loading.

• A close relationship between the fabrication processes and the extent to which the actual product matches the original design. If a designer adheres to the set of chip design rules3 appropriate for a given semiconductor technology, a properly controlled fabrication process will yield the desired product and desired product behavior (even if individual chips are different at some level). "Control" usually includes scrupulous attention to impurities in materials, surrounding gases, and process fluids and is pursued to ensure that the process itself remains stable and thus able to honor any design obeying the design rules.

While the chip design process in actual practice generally requires more manual intervention than the idealized description above would suggest, most experience in electronic design suggests that automated design tools have dramatically increased the complexity of chips that can be fabricated and have reduced the time needed to deliver designs of constant complexity.4

2 A specific example of this iterative design process is the use of logic synthesis and early (prephysical design) timing. Statistical methods are used to time the logic before it is placed and wired, and automated logic design systems (logic synthesis) can use these data to tune the logic design to meet timing requirements. After physical design, the timing is refined and made more accurate because more precise information about wire lengths and routes is present. The logic can be annotated with the exact times and be brought back to logic synthesis for readjustment of the logic design based on the new information.

3 The fabrication of digital chips uses what are essentially fixed manufacturing processes that are configured by patterns (plates and masks) derived directly by algorithms from the chip specifications. For this reason, chip fabrication is sometimes called a pattern-insensitive manufacturing process. The role of design rules is to restrict the patterns to those that will be faithfully honored by the fixed production process.

4 This is not to say that electronic design does not have limitations. Moore's Law states that the number of transistors on a state-of-the-art chip will double about every 12 months. However, the design capability of engineers is not increasing with time nearly as fast as the number of transistors. Using today's design tools, an expert electronic designer engaged in fully customized design can reasonably hope to complete the necessary design work on a few hundred transistors in a single day of work; such work includes the detailed design, debugging, and documentation needed for a commercial product. For example, the Pentium processor (order of magnitude 5 million transistors) took about 100 engineers 2 years to develop, a rate that corresponds to approximately 250 transistors per day per engineer. The resulting gap between the complexity of a chip and the design capability of individual engineers must be filled by the use of additional engineers. Over the long run, two approaches for filling the design gap seem plausible: better tools to increase the design capability of individual engineers, and the reuse of existing and validated designs to reduce the amount of work that must be undertaken from scratch. Some designers today claim the ability to lay out hundreds of thousands of transistors (tens of thousands of gates) per designer-year by using high-level design paradigms with tools and programmable chips.

Application to Mechanical Design

Mechanical design poses myriad different problems, and the extent to which the electronic design paradigm can be applied to mechanical design is a matter of some debate. Digital logic can be considered as a special class of product whose design and fabrication problems have proven amenable (with the expenditure of considerable R&D resources) to the application of information technology. By contrast, mechanical items represent a wholly different class of products for which there is currently no formal representation of function and no direct algorithm-dominated way of reducing a functional description to physical design; whether this is fundamentally true or merely a limitation of current knowledge is as yet unknown. Most importantly, the science and engineering underlying models of mechanical products and the processes to manufacture them are not nearly as well understood as those for electronic products. Mechanical designs are characterized by complex and large multimedia energy interactions between a limited number of elements and by changes in element behavior over time. The information needed to describe multiple behaviors of mechanical systems is difficult to express in a single format or language, and there are few tools for representing or
testing mechanical designs that are comparable in power to those available for the analogous electronic design task. A similar point applies to the modeling of multiple and simultaneous high-level energy interactions. Compared to the problems encountered by the VLSI circuit designer, the problems that challenge the mechanical designer, described in Box 3.2, suggest important research questions for better IT support of the design effort. These challenges are the subject of the remainder of this chapter.

BOX 3.2 Aspects of the Mechanical Design Process

Three basic elements of mechanical design are the following:

1. Conceptual design. The design of mechanical systems often calls for the management of significant energy flows, the consideration of complex three-dimensional geometry, and the understanding of relationships between geometry and energy. The "underlying technology" in which a mechanical system will be implemented is often not naturally specified by the product requirements, and the space of design goals is much more multidimensional than in the case of very large scale integrated (VLSI) systems.

2. Detailed design. For mechanical items, detailed design is generally independent of conceptual design, since no formal methods exist for linking concepts or functions to detailed geometry except perhaps in a few special cases. In most instances, detailed design of mechanical systems that are even marginally efficient in their use of space, weight, or energy generally requires the custom creation of integrated combinations of three-dimensional geometry rather than piecing together predesigned and tested library building blocks. Using standard parts or designs for parts used for other products is not easy because geometries differ greatly even for the "same" item, and there is no way to catalog them systematically. The few standard library or catalog items that are available are generally not main function carriers but instead are fasteners, bearings, motors, valves, and other similar items. Difficulty in mechanical design often centers on creative geometric reasoning, management of multiple behaviors, mitigation of unavoidable side effects, and anticipation of a variety of failure modes. (In some cases, a mechanical design is an evolutionary outgrowth of a past design. In these cases (such as automotive suspensions), progress is being made in establishing design templates that capture in parametric or rule form the traditional parts and relationships among parts that every good example of the genre must contain. Design then consists of packaging the given elements, using existing simulations to confirm basic behaviors.)

3. Verification. Computer simulation tools can predict nominal behavior at the system level if only one or two main modes of behavior are simulated at once and only in the case of sufficiently simple mechanical systems. Since side effects and off-nominal behaviors often cannot be adequately modeled and usually cannot be designed out, either extensive, time-consuming, and costly prototyping and field testing are required, or the designer must design the system very conservatively to mitigate the consequences of these side effects to an adequate degree over the expected life of the system.

Process design is integral to the design of individual mechanical systems. One reason is that for many mechanical parts, there is no way to automatically convert product geometry and other specifications to a fabrication process such as a sequence of machining steps. Designing fixtures, for example, is a major problem. Moreover, most mechanical fabrication processes are neither pattern-insensitive nor precisely controllable; even when the commands to a given process are repeated identically, the output of the process (as implemented on an actual production line) is different each time, and the differences may well matter. A command to drill a hole 0.500 inches in diameter may be provided, but the resulting hole may be 0.499 inches in one case and 0.502 inches in a second case. In general, elimination of these differences is either too difficult or too expensive. Thus, process design involves the choice of an economical process whose output comprises mostly acceptable (though different) parts. In most cases, there are no simulations that can predict the range of variations in nominally identical outcomes that a given mechanical process might generate. This range, usually called "process capability" when appropriately normalized, is often estimated by experienced people. (Such "tolerance-like" problems also affect chip fabrication: for example, a chip that emerges from the fabrication process with line widths that are too large or too small may run more slowly or not at all. Such issues affect yield rates dramatically. The difference is that for electronic fabrication, the process that determines yield is independent of the pattern on the chip to a considerable degree.) The above three steps may have to be repeated until a functional and manufacturable design is achieved.

Needs and Research for Mechanical Design

As noted above, it is not clear that the paradigm of electronic design can be applied to mechanical design, the area that the committee believes poses the greatest need today. Nevertheless, the successes of today's electronic design paradigm suggest research areas to improve mechanical design. Specifically, an important goal is the development of appropriate abstractions at every stage of the design process and tools that manipulate these abstractions. Abstraction is not the same at each phase of a design, and resolution of implementation details is deferred in the design process until such details are really needed. The use of such abstractions reduces work because existing representations can be reused, and different representations can be related through their common pieces. Key to developing these abstractions is the exploitation of hierarchical composability (i.e., the construction of complex standards or information models from simpler ones). Researchers should seek appropriate abstractions for mechanical products and develop tools to support design in terms of those abstractions.

In electronic design, many sophisticated tools have been developed to support physical aspects of design, while fewer tools exist for aspects of conceptual design such as requirement gathering or architectural decision making. This
disparity suggests that it is far easier to collect, characterize, and represent data about "things" than about ideas and decisions. Nevertheless, the payoff for automated support of activities such as requirement gathering, corporate decision making, and overall product architecture is so high that research directed at these targets is also worth undertaking.

The committee believes that concentrating on the areas described in the rest of this chapter will yield the most significant benefits to the IPPD process in the short to medium term. Other areas in the IPPD process not described in this chapter (mostly in the conceptual design area) will be advanced in the short to medium term by better communication capabilities and by better and additional access to data.

The committee notes also that success in improving design is likely to yield particular benefits to small manufacturers. To a considerable degree, tools that support product and process design can be categorized as "mostly software"; that is, they require low capital investment to obtain. Moreover, the increasing power of computational hardware (and its dropping cost) will make these tools more accessible to manufacturing firms with small capital budgets. To capitalize on this enabling trend, it is necessary to design these tools and their interfaces so that small businesses with limited technical depth can use them readily. A growing information infrastructure (Chapter 6) will link these small firms with each other and with larger firms. The emergence of standard data formats and interoperable tools will facilitate the spread of advanced capabilities.

Table 3.1 lists several areas for research in product and process design. In general, the research proposed by the committee focuses on design-space exploration, creating (parameterized) geometry, characterizing tolerances, predicting failure modes, increasing robustness, and facilitating reuse of components and designs. In addition, major issues of scale-up and complexity remain to be addressed explicitly as an integral part of any research done in these areas. Explaining these research foci is the subject of the remainder of this chapter.

TABLE 3.1 Research to Advance Product and Process Design

Multiview design descriptions:
• Design by function
• Relation of geometry to function
• Functional simulation
• Parametric design
• Capture of nominal and variant behavior of products and processes in one model

A mathematics of variation for performance modeling:
• Descriptions of product function and variants directly related to descriptions of geometric or material variations
Multipurpose data and model representations:
• Techniques for relating models at different levels of abstraction
• Logic-based representations
• Protocols, formats, and representations for data interchange among models

Design methods and tools for groups of parts and systems:
• Decomposition methods to break product concepts into subsystems
• Subassembly performance models and interface descriptions for joining subassemblies to each other
• Assembly planning
• Trade-off analysis (e.g., cost and design)

Process description languages and models:
• Set of process primitives (building blocks) from which process models can be built
• Languages with syntax checking for correctness and completeness of process descriptions
• Resource description models
• Accommodation of spatial and temporal dimensions of processes

Novel design considerations:
• Easy, error-free configuration control at the selling or servicing stage
• Manufacturing of a robust final product from parts obtained from different sources

Product-process data model:
• Data descriptions for many physical processes and entities in a unified form
• Descriptions of design interactions, analyses, and process steps integrated with product geometry and function descriptions

Decision aids:
• Data visualization
• Database searching using geometric features, performance criteria, or process descriptions
• Intelligent advisors

Geometric reasoning:
• Visualization tools

Knowledge and information management:
• Systems that capture corporate memory and knowledge
• Systems that support corporate learning
• Techniques for handling data legacy issues
• Systems that record design history and rationale
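The "mathematics of variation" theme connects directly to the drilled-hole example discussed earlier in the chapter: a process commanded to drill 0.500-inch holes produces holes of 0.499 or 0.502 inches, and the normalized range of this variation is the process capability. The sketch below computes the standard capability indices Cp and Cpk; the sample diameters are invented for illustration.

```python
import statistics

def process_capability(samples, lower, upper):
    """Cp and Cpk: specification width relative to the process's natural spread.

    Cp  = (USL - LSL) / (6 * sigma)                 potential capability
    Cpk = min(USL - mean, mean - LSL) / (3 * sigma) accounts for off-center mean
    """
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (upper - lower) / (6 * sigma)
    cpk = min(upper - mu, mu - lower) / (3 * sigma)
    return cp, cpk

# Hypothetical hole diameters (inches) from a drill commanded to 0.500,
# against an assumed specification of 0.500 +/- 0.005:
holes = [0.499, 0.502, 0.500, 0.501, 0.499, 0.500, 0.502, 0.498]
cp, cpk = process_capability(holes, lower=0.495, upper=0.505)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```

In practice, as the text notes, such indices are often estimated by experienced people rather than predicted by simulation; the formulas only normalize observed variation after the fact.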

OCR for page 60
Page 70 Research for Product Description Communication among engineering professionals has always relied on models—sketches, drawings, analytic models of behavior, or many other symbolic representations of knowledge related to products. However, what distinguishes data or information models from arbitrary documentation, such as simple reports or drawings, is the addition of formalization.5 This formalization is motivated by the need for unambiguous communication between collaborators and/or our desire to use and interoperate among a growing set of computer-based application tools. Specifically, a data model is expressed in some data description language. A language specification defines the form and meaning (syntax and semantics) of entities in the language. The initial graphics exchange specification (IGES; IGES, 1993), for instance, specifies the language of IGES data files by specifying their syntactic form and relating the entities in the data files to known geometric entities. Traditionally, the specifications of data description languages such as IGES have been written in a natural language such as English, and a human being must use that description to build a software parser/recognizer for the data description language. The semantics of the elements of the language remain expressed solely in English. More recent efforts such as PDES/STEP (product data exchange using the standard for the exchange of product model data; ISO, 1994) provide a more formal language, EXPRESS, for use as a data description language in which data models are described. EXPRESS allows the modeler to capture some of the semantics of the data by explicitly recognizing relationships between data elements along with the cardinality of such relationships and by capturing constraints between data elements. Box 3.3 describes two approaches to knowledge representation. 
Appropriate formalisms also support the generation of agreement on queries about data; application tool development; language translation; and services such as change notification, management of information dependencies, and matching of information producers and consumers. A highly general expressive capability is needed to support the exchange of

5 The term "formalization" is used in this chapter in the sense of "a greater degree of rigor, clarity, and explicitness that would be necessary for computer-based representation, manipulation, and analysis." However, the extent to which it should include matters such as formal proofs of completeness or correctness is subject to some debate within the community, with some advocating much higher degrees of mathematical formalization than others. Those who believe in high degrees of mathematical formalization tend to be logic theorists. The advantage of logic-based approaches is that they are neat and clean and amenable to the power of formal logic and mathematics. On the other hand, it is not necessary to embrace this degree of formalization in an attempt to get away from models specified in English and a few drawings. Indeed, the need to cope with the uncertainty of the manufacturing environment and with spatial and temporal relationships (for example) requires that a formal logic be modified in a way that reduces the "neatness" of the formalism and its ease of use as a base language for representing interesting phenomena. The result may be a more "scruffy" approach to the problem.
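The kind of machine-checkable semantics that EXPRESS adds to a data model can be sketched in ordinary code. The fragment below is a hypothetical illustration, not part of any standard: the Plate/Hole entities, the 1-to-8 cardinality rule, and the tolerance constraint are all invented for the example. What it shows is the principle that relationships, cardinality, and inter-element constraints become executable checks rather than English prose.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical product-model fragment. In EXPRESS these rules would be
# declared in the schema; here they are enforced by a validate() method.

@dataclass
class Hole:
    diameter_mm: float
    tolerance_mm: float

@dataclass
class Plate:
    thickness_mm: float
    holes: List[Hole] = field(default_factory=list)

    def validate(self) -> List[str]:
        """Return a list of constraint violations (empty if the model is valid)."""
        errors = []
        if not (1 <= len(self.holes) <= 8):            # cardinality: 1..8 holes
            errors.append("a plate must carry between 1 and 8 holes")
        for i, h in enumerate(self.holes):
            if h.diameter_mm <= 0:                     # domain constraint
                errors.append(f"hole {i}: diameter must be positive")
            if h.tolerance_mm >= h.diameter_mm:        # constraint between elements
                errors.append(f"hole {i}: tolerance must be smaller than diameter")
        return errors

plate = Plate(thickness_mm=5.0, holes=[Hole(diameter_mm=6.0, tolerance_mm=0.05)])
print(plate.validate())   # → []
```

A translator between two tools that both enforce such a schema can reject malformed data at the boundary instead of propagating it silently.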

• Design descriptions that allow multiple views of a product (geometry, engineering parameters, functional requirements and behaviors, and relationships among these domains). Data models should be capable of representing the product or component from many different views at different stages of design, manufacture, and use. The product designer should be able to view its geometry, tolerances, assembly problems, and repair scenarios; the techniques of feature-based design and parametric design hold particular promise (Box 3.5). The current PDES/STEP effort does address some of these issues; for example, an addition to the STEP Application Protocol called Configuration Controlled Design will contain both the geometry of a product and information on the revision, release, and effectiveness associated with producing this product. Nevertheless, no adequate data model exists, nor are there adequate methods for supporting these activities.

• Model formulations or simulations that are capable of describing the main off-nominal behaviors, or variants, as well as the nominal behavior of a product.7 For many products, more effort goes into anticipating and mitigating a wide range of off-nominal behavior (e.g., testing aircraft for fatigue and crack resistance, determining if layers in a microprocessor will peel apart under temperature extremes) than into determining how to meet the main functional requirements. (See Box 3.6.)

• Better understanding of relationships between different representations and models. Because of the interdisciplinary nature of collaborative product design, relationships between different representations are important. Questions such as how to relate a model in someone else's view of the world to a model in one's own view are ubiquitous.
Determining relationships between models (e.g., between models of different levels or types of abstraction) will remain a difficult problem for some time, but a farsighted approach to knowledge representation can greatly enhance computational support through the exchange of information among different applications. Logic-based representations may help in this area in the future, although there is debate on the point; further exploration is needed.

• Formats and representations that enable data exchange among different design tools. While different computer-aided tools for design are based on different data models, much of the content of these models is overlapping. As more computational support for design comes on-line, engineers will rely

7 A 1991 NRC report recommends research on tolerance analysis, tolerance representations, tolerance-performance relationships, and tolerance standards and measurement methods. See MSB (1991), pp. 56-57.

BOX 3.5 Feature-based Design

A feature-based data model is a way of describing a design that contains more information than just geometry. In a computer model, a circle may represent a hole, but it is not a hole. Feature-based data, however, could include text saying that it is a hole and giving the diameter and tolerances, plus numerical control instructions for how to drill it. The "hole" is then called a feature, which in fact is a data object that contains instructions for how to draw itself on the computer screen and how to drive a machine tool to drill it. Features can describe machined areas, or they can describe areas where a measurement will be taken ("measurement features") or where one part will be joined to another ("assembly features"). Features are thus able to hold a great deal more information than a purely geometric model can. In principle, features can contain design intent as well as details about how to make and use the feature. For example, a pocket to hold a precision ball bearing would require tighter tolerances and a finer surface finish than a hole through which oil is squirted. Feature-based models may also reduce storage requirements, because compact functional descriptions could be stored and voluminous geometric descriptions could be generated from the functional descriptions when needed.

Feature-based design also supports higher-level product data models and descriptions of configurations. A simple model of configurations may include relationships among subparts and attributes. Such a model can be used to create a more specialized theory of configuration design that includes connections between parts, special part subclasses, predefined lists of available parts, and so on. The idea is to create multiple layers of representation that mimic the multiple levels of abstraction at which we view things.
This approach has two benefits: work is reduced through reuse of existing representations, and different representations can be related through their common pieces. Several STEP projects provide for specification of simple form features, for example, through and blind holes, although the taxonomy is far from complete. In addition, new work has just been initiated in providing a parametric representation within STEP, an ability that is necessary to support more extensive use of form features. Ford Motor Company is actively working on a research project called Rapid Response Manufacturing that makes extensive use of form features; output from this project is expected to drive development of the STEP standard in this area. The program is being conducted jointly with General Motors, Texas Instruments, United Technologies Corporation, and Allied Signal.

increasingly on information models to help bridge the gap between multidisciplinary users of diverse tools; exchangeable geometric models are thus a particularly pressing need. Success in developing exchangeable representations of performance, geometry, and process requirements is a prerequisite for their use in design tools and by practitioners in the allied domains of process equipment design and shop floor planning and operations, as well as by designers in other companies or in other technical domains.

• Tools that facilitate understanding the relationship between cost and design choices.
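The "hole as a feature" idea in Box 3.5 can be sketched as a data object that carries geometry, tolerance, and process instructions together. Everything here is hypothetical: the class name, fields, and G-code-like strings are invented for illustration and do not follow any CAD system's or STEP's actual feature schema.

```python
# Hypothetical feature object: unlike a bare circle in a drawing, it knows
# its own tolerance and can emit both a screen representation and
# (illustrative, not real) machine-tool instructions.

class HoleFeature:
    def __init__(self, x, y, diameter, tolerance):
        self.x, self.y = x, y
        self.diameter = diameter
        self.tolerance = tolerance

    def draw(self):
        # How the feature renders itself (a textual stand-in for graphics).
        return f"circle at ({self.x}, {self.y}), d={self.diameter}"

    def drill_instructions(self):
        # How the feature would drive a machine tool; the command syntax
        # here is invented for the example.
        return [f"G0 X{self.x} Y{self.y}",
                f"DRILL D{self.diameter} TOL{self.tolerance}"]

h = HoleFeature(x=10.0, y=25.0, diameter=6.0, tolerance=0.05)
print(h.draw())
print(h.drill_instructions())
```

The same object could carry design intent as a further field (e.g., "bearing seat"), which is what lets a downstream tool distinguish a precision pocket from an oil passage of the same shape.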

BOX 3.6 Limitations of Tolerance Analysis

The inherent variability in the shape of manufactured parts imposes major limitations on the performance of products assembled from those parts. To reduce the impact of such limitations, designers traditionally impose tolerances on these parts. However, too little is known at present about the relationships between unwanted variations in shape and product performance to permit rational assignment of tolerances except in a few special cases such as optical systems and journal bearings. In most cases, tolerances are assigned on the basis of experience or the capability of available manufacturing systems. Two fundamental tolerance problems are tolerance analysis (i.e., identifying which dimensional variations contribute most to a final error) and tolerance allocation (i.e., choosing the best way to distribute inevitable tolerances among several potential contributing sources during design so that the desired level of final error is achieved at minimum cost). Today, most tolerance analyses are simple one-dimensional fit studies. More complex multidimensional studies are done less often and employ Monte Carlo methods. These methods are difficult to use because there is no automated way to apply them to a computer model of the geometry. They also suffer from combinatorial and scale-up barriers. Research is needed to extend engineering models and then link them to geometry and geometric variations.

Design tools for electronic devices typically do not provide information about fabrication cost, because the cost of fabricating a given chip is approximately independent of what is put on it.8 But this is not true for mechanical design, in which design choices may have a significant impact on the cost of making the final product. Designers need to be able to keep such high-level trade-offs constantly in mind through the use of better design tools.
Many papers have been written on this subject, and some such tools exist (e.g., Hewlett-Packard's Sheet Metal Design Advisor). More generally, product data models will have to include information that goes beyond simple geometry. These models should also relate high-level function, assembly processes, possible interactions between product and user, the product's structure and geometry, a schema for describing component parts, and the product's behavior, manufacture, maintenance, and ultimate disposal or recycling: in principle, any information that could describe a product. Moreover, the information contained in a product data model may represent a single instance of the product, a set of possible instances, or even a set of possible descriptions.

8 One qualification is necessary. While chip fabrication is generally pattern-insensitive, the yield (i.e., the fraction of chips that are usable after the fabrication process is complete) is a direct function of the feature sizes and material gaps chosen: wider lines and wider gaps between lines generally result in a higher yield, which in turn reduces the unit costs of chips that may be sold.
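The one-dimensional Monte Carlo fit study mentioned in Box 3.6 can be sketched in a few lines. The part dimensions, sigmas, and limit below are invented for illustration; the point is only the shape of the method: sample each part's dimension from its distribution, sum the stack, and count how often the assembly falls outside its limit.

```python
import random

# Hypothetical one-dimensional tolerance stack-up: three parts whose
# lengths vary normally about their nominals; estimate the fraction of
# assemblies whose total length exceeds the allowed limit.

def stack_exceeds_limit(nominals, sigmas, limit, trials=100_000, seed=1):
    rng = random.Random(seed)          # fixed seed for a repeatable estimate
    failures = 0
    for _ in range(trials):
        total = sum(rng.gauss(n, s) for n, s in zip(nominals, sigmas))
        if total > limit:
            failures += 1
    return failures / trials

# Invented example: nominal stack 10 + 20 + 30 = 60 mm, limit 60.2 mm.
rate = stack_exceeds_limit([10.0, 20.0, 30.0], [0.05, 0.05, 0.1], limit=60.2)
print(f"estimated fraction of assemblies out of limit: {rate:.4f}")
```

The difficulty the box points to is visible even here: the analyst, not the CAD model, had to supply the stack equation (a plain sum) by hand, and in three dimensions that mapping from geometry to equation is the hard, unautomated part.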

BOX 3.7 Using Design as a Strategic Tool

Nippondenso Co. Ltd. (NDCL) manufactures automotive components. Its order mix is highly variable and unpredictable, and thus a primary challenge for NDCL has been to develop a factory environment in which it can produce a near-arbitrary mix of product variants. This type of flexible manufacturing has often been attacked as a problem involving factory floor operations, but NDCL has defined and solved it primarily as a problem of design.

In particular, NDCL's design approach is based on what can be called a combinatoric method (Figure 3.1). Product variety is created by selecting among several versions of each part in a product. Each product is designed so that the physical and functional interfaces between parts are the same for all versions of each part. The result is that any combination of versions of parts can be assembled into a working unit. The interfaces between the parts and the assembly equipment are similarly standardized so that differences between versions of parts are transparent to the equipment.

NDCL achieves its flexibility goals by using fairly ordinary parts and then employing unusual logistical methods to assemble them into different variants. The schedule for fabrication of parts is not based on the details of what kinds of items are ordered, but instead follows broad statistical patterns of orders. It is the assembly process that addresses the detailed stream of orders. Because assembly is so much faster than fabrication and because NDCL's assembly machines can be switched from model to model so quickly, model mix can be addressed much more easily and economically during assembly than during fabrication.

Close coordination of top management objectives, product design, and production technology is required to carry out this approach. As a result, it can be said that NDCL has taken concurrent engineering well beyond the goal of improving fabrication or assembly.
Instead, NDCL has learned how to use design to achieve the essentially strategic goal of meeting the demands of its customers.

Thus, in addition to serving the traditional role of providing neutral data formats for computer-aided application tools, models must support a host of activities for managing and exchanging information.

Research for Process Description

Tools for process description have application to the design and operation of factories; in addition, they are the basis for experiments with and evaluation of control and organizational changes before actual systems are installed. Process descriptions will also be used to enhance product design, so that by simulation the best process can be matched to the product design (and vice versa) for maximum economic advantage (or to satisfy whatever criteria, such as quality or time to delivery, are important for the particular case). Box 3.7 describes a rich and productive interaction among process design, product design, and what happens on the shop floor.

The primary need in process description is formalization, which is necessary

FIGURE 3.1 A panel meter (left) and the combinatoric strategy (right). Each zigzag line on the right represents a valid type of meter, which is assembled by following the path from top to bottom. "SD" stands for standardized design, an effort that reduced the number of variants of each part as shown. The production rate is 32,000 per shift. A "catalog" of only 16 parts is sufficient to support production of 288 different kinds of meter. If all the 288 possible paths at the right were drawn, one would see that each part is a member of many possible types of meters. Thus to first order most parts will be used regardless of the pattern of the order stream, so that there is little inventory risk in making the 16 kinds of parts. If each different meter type were created by a few parts that were special to that type, a shift in the order pattern would require a large and awkward shift in the schedules for fabricating parts, which could not be accomplished as quickly as the switching of an assembly machine. Also, feeding the hundreds of different kinds of parts needed to support so many varieties of meter would be very awkward. These are some of the reasons that variety can be achieved more easily during assembly than during fabrication. Courtesy of Nippondenso.

for representing processes in sufficient detail and with enough specificity to make the process description adequately complete and unambiguous.9 Such formalisms allow designers to describe, enforce, and simulate processes, including factory

9 Of course, at the root of a good process description is a good scientific and engineering understanding of the specific process involved: the best tools to formalize process description are not helpful if the knowledge base they are used to formalize is shaky and uncertain. Indeed, many fabrication processes in use today are incompletely understood and only partially characterized.
Still, as important and crucial as such understanding may be, a research agenda for process characterization is outside the scope of this report.
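The arithmetic behind the combinatoric strategy of Box 3.7 and Figure 3.1 is worth making explicit: variety grows as the product of the version counts, while the parts catalog grows only as their sum. The split of the 16 parts across part positions below is hypothetical (the text reports only the totals), but it is one split consistent with the reported 16 parts and 288 meter types.

```python
from math import prod

# Hypothetical version counts for 5 interchangeable part positions of the
# panel meter; only the totals (16 parts, 288 variants) come from the text.
versions_per_part = [2, 3, 4, 4, 3]

catalog_size = sum(versions_per_part)     # distinct parts to fabricate and stock
variant_count = prod(versions_per_part)   # distinct end products assemblable

print(catalog_size, variant_count)   # → 16 288
```

This is why the inventory risk is low: each fabricated part appears in many of the 288 paths, so fabrication can follow broad statistical demand while assembly absorbs the detailed order mix.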

processes (involving both machines and people), design activities, and decision processes. Research needs in the area of process description include:

• A language for expressing process descriptions that facilitates checking for correctness and completeness and that can express not only nominal process behavior but also variant behavior. Such a language must also be translatable across technical domains.

• Process model representation schemes (both aggregate and detailed) and on-line data collection. Data included in such schemes should describe logistic, fabrication, material-handling, inspection, assembly, and test processes and should give information on the characteristics, capabilities, and costs of various production and assembly methods. Data should be captured as a product is being produced so that the process can be improved.

• Specific process models that reflect all relevant spatial and temporal transformations. Such models are critical for the local control and planning of manufacturing operations and will draw on knowledge about the kinematic capabilities of individual pieces of equipment and other process limitations, processing capabilities of the equipment, and tool and fixturing capabilities associated with the equipment. Ultimately, these models should contain the detail necessary for dynamic control of the individual operations as well as the information required to simulate the operation of the manufacturing system, indicate the effects of perturbing the operational parameters as well as the effects of complex interactions among processes, and be generalizable across a wide range of production environments by appropriate parameterization.

• Algorithms and tools to solve process problems.
For example, one such problem is the determination of efficient assembly sequences; an inappropriate assembly sequence may result in the need for a tool to reorient the item being assembled many more times than necessary, thus increasing the time needed for assembly and the likelihood of breakage. Such factors are to a considerable degree irrelevant to the design of the product itself but have a significant influence on the cost of making the product. Other important process problems include the determination of efficient equipment layouts on a factory floor, equipment selection (matching equipment capability to process needs), make-buy decisions, and determination of how best to cut and shape materials to minimize waste.

• Dynamic models for describing resources available over time to the manufacturing

system.10 Such models will be used in both the design and the operational control of manufacturing systems. Common representations and descriptions of resources are necessary to enable development of transferable (from planning to analysis to control) models and analysis. Despite the importance of resource management, little research and effort have been devoted to creating generic representations of resources.11 As a result, specific resource characteristics must be recreated each time a modeling activity is undertaken. Resources include those related to fabrication (e.g., tooling, machines, available controller features) and their interconnection, as well as system resources such as corporate information or knowledge and information such as company or external standards.

Research for Tools to Support Integrated Product and Process Design

Computer tools that directly aid the management of the IPPD process itself would be helpful to managers. Today, such tools are limited primarily to communication aids or "groupware" for helping people post notices and share information. Means of describing and managing the design process need to be developed. Few tools exist for creating, monitoring, and guiding the design process itself, except for familiar project management tools like PERT, which is largely a schedule and resource management tool. Scheduling aids beyond PERT are required to help in determining effective task sequences, setting up information flows, establishing schedules and milestones, identifying people, assigning work to them, routing information to them, and linking them to colleagues elsewhere. Existing tools do not help to identify information flows or facilitate them.

A single design decision may have several simultaneous impacts, some of which may be beneficial and others adverse.
The design environment should support techniques to express comparisons and trade-offs vividly so that a designer can assess the impact of a wide range of design decisions on a product's cost, time to produce, or quality. Tools are needed that focus on the identification

10 See MSB (1988), pp. 11-13. It recommends research on a number of specific resource management modeling methods, for example, modeling methods based on knowledge-based systems, object-oriented systems, and Petri nets; methods that are sufficiently fast and efficient that resource problems are tractable while plants are being designed and built, as well as being operated; methods for correcting models, based on comparisons of predicted and measured performance; and many other methods.

11 A notable exception is IDEFx, an evolving language developed for the Air Force in the Integrated Information for Concurrent Engineering program that does identify information flows within the process and is used to support the reengineering of business processes through process modeling and simulation. The first version, IDEF0, was developed under the Integrated Computer-Aided Manufacturing program and was used primarily for modeling of individual activities within processes; IDEF0 is today a Federal Information Processing Standard. See Moore (1994), p. 49.
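The process-description research needs listed earlier, in particular a language built from a small set of primitives with checking for correctness and completeness, can be sketched as follows. The primitive set, the step format, and the two checks are all invented for illustration; a real process language would be far richer.

```python
# Hypothetical process-description checker: each step names a primitive,
# its input and output items, and the resource it uses. The check verifies
# (a) every primitive and resource is known, (b) every input exists when
# the step runs, and (c) every output is consumed or is a declared end product.

PRIMITIVES = {"fabricate", "handle", "inspect", "assemble", "test"}

def check_process(steps, raw_materials, resources, end_products):
    """steps: list of (primitive, inputs, outputs, resource). Returns error list."""
    errors = []
    available = set(raw_materials)
    produced, consumed = set(), set()
    for i, (prim, inputs, outputs, resource) in enumerate(steps):
        if prim not in PRIMITIVES:
            errors.append(f"step {i}: unknown primitive '{prim}'")
        if resource not in resources:
            errors.append(f"step {i}: resource '{resource}' not described")
        for item in inputs:
            if item not in available:
                errors.append(f"step {i}: input '{item}' not yet available")
            consumed.add(item)
        available |= set(outputs)
        produced |= set(outputs)
    for item in sorted(produced - consumed - set(end_products)):
        errors.append(f"output '{item}' is never used and is not an end product")
    return errors

steps = [
    ("fabricate", ["bar stock"], ["shaft"], "lathe"),
    ("inspect",   ["shaft"], ["shaft ok"], "cmm"),
    ("assemble",  ["shaft ok", "housing"], ["pump"], "cell 1"),
]
print(check_process(steps, ["bar stock", "housing"],
                    {"lathe", "cmm", "cell 1"}, ["pump"]))   # → []
```

The same structure supports simulation and enforcement: once a process is machine-readable and passes such checks, it can be executed against a factory model rather than merely documented.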

of design trade-offs between cost, performance, and reliability; alternate space allocations; functional decomposition; subassembly definition; three-dimensional geometric reasoning; and make-or-buy decisions. To produce such tools, research is needed on decision tools that draw on the product-process data model and performance simulations of the product being designed, as well as process models and various data on costs and tolerances of different processes. These decision tools will sort data and models on the basis of criteria supplied by the designer to aid in making comparisons between alternate designs and processes.

In addition, improvements to simulation and rapid prototyping tools should be brought to bear on the problems of viewing physical parts, "visualizing" complex relationships (certainly between physical parts, but also more abstract relationships such as design requirements and their costs), and presenting design alternatives to customers and to designers.

This category also includes methods or tools to handle groups of parts such as assemblies, subsystems, product families, made-to-order configurations, and selected combinations of parts that create different product models by virtue of which parts are selected. (Box 3.7 describes such an application.) Research is needed on design methods and computer tools adapted for the design and creation of groups of parts or systems, in addition to individual parts. Such methods and tools will enable the designer to divide a product into subassemblies, design optimal in-process test strategies during assembly, and identify assembly sequences that minimize cost, tolerance errors, material handling, and part damage during assembly. The methods and tools will also include standard design modules and methods that facilitate optimization of part or all of a product for cost and quality.
Such optimization requires a deep understanding of specific components and features of the product.12 Finally, it would be highly desirable to have tools that would allow product design and process design to proceed more in parallel. These tools would enable product designers to work with some degree of incomplete information about the process designer's work, and vice versa. Successful development of such tools would contribute greatly to the reduction of needed design time.

Research Areas Not Specific to Manufacturing

Geometric Reasoning

A generic intellectual activity required by mechanical design is geometric reasoning. A major difference between VLSI design and mechanical design is

12 Because such understanding is often proprietary to component suppliers, research in this and similar areas of collaborative design must consider nontechnical issues such as intellectual property rights and intercompany data exchange.

the degree to which three-dimensional geometric reasoning is fundamental to even the simplest mechanical designs; VLSI design is based on two-dimensional (2-D) analysis (or at worst 2 1/2-dimensional analysis, the use of stacked 2-D layers). Three factors make three-dimensional (3-D) reasoning more complex: 3-D design is more likely to involve moving parts or flows; 3-D design may involve interconnections and tolerance relationships between 2-D domains; and 3-D designs can be visualized only in cross sections, perspective, and exploded views.

Research is needed on building robust geometric modelers. Many of the "boundary representation" modelers in use today are not robust: interactions between the algorithms they use and the finite-precision arithmetic offered by the computer mean that certain modeling operations yield incorrect results. Truly robust algorithms to correct such problems remain a challenging research problem. Ultimately, the design environment should support improved visualization tools or other design aids that will help make geometric reasoning faster and efficiently achievable by a broader range of people. In addition, tools that undertake geometric reasoning automatically (i.e., without relying on a skilled human) may be able to replace human designers for certain purposes.

Knowledge and Information Management

Basic to design are many issues of data management and of data themselves. The future design environment will include a number of data management methods and tools. For example, the design environment will have to handle data legacy issues, such as converting data from one CAD system to another and preserving old data for decades or more so that they can still be read, edited, and processed.
Today, such data are either lost, kept on paper, or accessed in a limited way by old hardware kept on hand for the purpose.13

A second data management issue is that the design environment will have to include ways to capture corporate memory and knowledge so that the successors of current designers can tell what knowledge was used, what competitive methods were used, what errors were made, and on what factors success was based. The ability to record design history and rationale is of particular importance. Every design is a historical web of decisions that grow out of each other and depend on each other. Revision, whether for correcting an error, absorbing a new outside circumstance, or improving manufacturability, requires unraveling the web to a certain degree. The design selected, the web of decisions leading to it, and the "roads not taken" indicate the corporate state of belief at the time the decisions were made and thus form a historic context.

13 Siewiorek (1992) posed the following research questions that must be resolved before concurrent design and rapid prototyping become integrated into industrial practice: How can design and manufacturing information be reused in future products? And how can the compatibility of new incremental information with all the previously acquired information be ensured?
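The "web of decisions" view of design history lends itself to a simple data structure: record each decision with its rationale and the earlier decisions it depends on, and a revision can then report which downstream choices must be re-examined. The sketch below is purely illustrative; the class, the example decisions, and their rationales are invented, and a real system would also capture the roads not taken.

```python
# Hypothetical design-history recorder: decisions form a dependency web,
# and affected_by() unravels the web transitively when one decision changes.

class DesignHistory:
    def __init__(self):
        self.rationale = {}    # decision -> recorded reasoning
        self.depends_on = {}   # decision -> set of earlier decisions it rests on

    def record(self, decision, rationale, depends_on=()):
        self.rationale[decision] = rationale
        self.depends_on[decision] = set(depends_on)

    def affected_by(self, decision):
        """All recorded decisions that transitively depend on the given one."""
        affected = set()
        frontier = {decision}
        while frontier:
            nxt = {d for d, deps in self.depends_on.items() if deps & frontier}
            frontier = nxt - affected
            affected |= nxt
        return affected

h = DesignHistory()
h.record("aluminum housing", "weight target")
h.record("wall thickness 3mm", "stiffness achievable with aluminum",
         depends_on=["aluminum housing"])
h.record("die casting", "3mm walls at production volume",
         depends_on=["wall thickness 3mm"])
print(sorted(h.affected_by("aluminum housing")))
# → ['die casting', 'wall thickness 3mm']
```

Revising the housing material thus surfaces both the wall-thickness and process choices for review, which is exactly the unraveling the text describes.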

Specific Research Questions

A variety of specific research questions are motivated by discussions earlier in this chapter. In accordance with the charge of the study, the focus of these research questions is largely information technology and associated disciplines such as mathematics. If the underlying science and engineering aspects of manufacturing are well understood, IT can help immeasurably in exploiting such knowledge and information, but by itself IT is not a substitute for that knowledge.

• How should the data contained in a product data model be organized to accommodate their huge size and complexity and the many disciplines that need access to them?

• How much of that information is physical and how much is "relational"? How much can be captured in traditional geometry, and how much is nongeometric, or not focused on one item but shared or spread among many items, or even not attached to specific items?

• To what extent can a product description for mechanical items be converted automatically into a production plan, that is, a sequence of fabrication steps that transform raw materials into a final product?

• Can a general process description language be developed that would be both man- and machine-intelligible, permitting processes to be described more precisely than is possible now?

• How can CAD tools be used at different stages of the design process? Can high-order abstractions be used as a starting point for encapsulating product and process facts and knowledge, geometry, requirements, tolerances, and other product characteristics, providing a link between product function, geometry, and processes? Can "meta-features" be defined that will encapsulate groups of features, creating a feature hierarchy? If not, how can the scale of real designs be encompassed using features or any other means of capturing and combining detailed design data, geometry, and intent?
• Features are often very process-dependent, and the fabrication of parts may require the application of multiple processes, each of which uses a different set of features to address the same part. Attaching features to descriptions of parts may prove impractical due to the differing feature sets for the same part. Feature-based product description may prove unproductive. In this event, to what extent is it feasible to design process models that interact with separate product models to generate appropriate product models?

• How can the human interfaces of CAD systems be improved so that more complex shapes, assemblies, and other multidimensional problems can be handled more easily?

• What new process analysis tools can be developed, especially to handle complex problems like assembly, model mix manufacture, and tolerances?

• How can the reliability of new product or process designs be better predicted, including trade-offs between cost and reliability?

• What new languages or data structures can be developed to better describe product requirements, such as performance, reliability, shapes, interconnections, interfaces, and tolerances?

• Similarly, what new languages or data structures can be developed to better describe process requirements, such as performance, cost, reliability, ease of diagnosis and repair, material handling, ease of use, and ease of modification?

• What new data logging and correlation methods can be devised to help the process of continuous improvement, such as finding multiple occurrences of the same type of machine failure or deducing the best sequence for testing a broken system to diagnose its problems?
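The last question above, finding repeated occurrences of the same failure type in logged data, has a simple core that even a minimal sketch can show. The log records below are invented for illustration; a real system would correlate across machines, shifts, and process parameters rather than just count.

```python
from collections import Counter

# Hypothetical failure log: each record names a machine and a failure type.
log = [
    {"machine": "press 2", "failure": "spindle overheat"},
    {"machine": "press 5", "failure": "jam at feeder"},
    {"machine": "press 2", "failure": "spindle overheat"},
    {"machine": "press 3", "failure": "spindle overheat"},
]

# Surface failure types that occur more than once, most frequent first.
counts = Counter(rec["failure"] for rec in log)
repeated = [(f, n) for f, n in counts.most_common() if n > 1]
print(repeated)   # → [('spindle overheat', 3)]
```

Grouping by (machine, failure) instead of failure alone would distinguish a sick machine from a sick process, which is the kind of correlation question the research agenda is pointing at.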