

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




COMPUTER-ASSISTED PROCESS AND CONTROL ENGINEERING
IMAGINE A ROOM 50 feet long and 30 feet wide, filled with cabinets containing 18,000 vacuum tubes interconnected by miles of copper wire. This was the ENIAC (Figure 8.1), a 30-ton behemoth that was the world's first electronic computer. Now imagine that ENIAC had been given the task of solving a set of simultaneous linear equations that embodied 30,000 independent variables. With a team of people working around the clock to record intermediate solutions and feed them back into the computer (ENIAC had a limited amount of memory), ENIAC would still be chugging away more than 40 years later to solve the problem! A supercomputer using modern algorithms could solve the problem in an hour or so.

The speed and capability of the modern computer have tremendous implications for the practice of chemical engineering. In the future, computer programs incorporating artificial intelligence or expert systems will help engineers design improved chemical processes more efficiently. Complex computations based on fundamental engineering knowledge will allow engineers to design reactors that can virtually eliminate undesired by-products, making processes less complex and less polluting. New sensors, many of which will be miniature analytical laboratories tied to miniature electronics, will allow rapid and accurate measurements for control that are currently impossible. New chemical products that today are discovered predominantly through laboratory work (for instance, reinforced plastics that are as strong as steel and weigh less than aluminum, or drugs with miraculous properties) may be discovered in the future by computer calculations based on models that predict the detailed behavior of molecules.

Chemical engineers will lead this revolution. They will need to be trained to use advanced computer techniques for process design, process control, and management of process information.
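The arithmetic behind the ENIAC anecdote can be checked with a back-of-envelope calculation. The operation count for dense Gaussian elimination is standard, but the machine speeds below are assumptions of this sketch, not figures from the text:

```python
# Back-of-envelope check of the ENIAC anecdote. Dense Gaussian
# elimination on an n-by-n system costs roughly (2/3) * n**3
# floating-point operations.
n = 30_000
flops = (2 / 3) * n ** 3             # ~1.8e13 operations

eniac_ops_per_s = 500                # assumed effective ENIAC rate
cray_flops_per_s = 1e9               # ~1 gigaflop, a 1980s supercomputer

eniac_years = flops / eniac_ops_per_s / (3600 * 24 * 365)
cray_hours = flops / cray_flops_per_s / 3600

print(f"ENIAC: ~{eniac_years:,.0f} years; supercomputer: ~{cray_hours:.0f} hours")
```

With these assumed rates the gap is even wider than the text's "more than 40 years" versus "an hour or so," but the orders of magnitude tell the same story.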
Advanced engineering development will be based more than ever on mathematical modeling and scientific computation. Reliable modeling at the microscale, the individual process unit scale, and the plant process scale will improve our ability to scale up processes in a few large steps, possibly bypassing the need for a pilot plant and saving the 2 or 3 years required to build and operate it. Process models capable of predicting dynamic behavior, operability, flexibility, and potential safety problems will permit these aspects of a process to be considered more fully earlier in the design stage. Because improved computers can perform the extensive computations required by such models, it will be possible to test alternative designs more quickly.

A chemical process must be designed to operate under a chosen set of conditions, each of which must be controlled within specified limits if the process is to operate reliably and yield a product of specified quality. Accurate, complex, computer-solvable models of chemical processes will incorporate features of the controls that are needed to maintain the desired process conditions. Such models will be able to predict the effects of process excursions and the control measures needed to correct them. Computer management of the process operation will rapidly initiate control correction of process excursions. The development of new types of process sensors will be essential to this degree of process control and will eliminate the time-consuming withdrawal of process stream samples for analysis.

FIGURE 8.1 The Electronic Numerical Integrator and Calculator (ENIAC) and its inventor, J. Presper Eckert, circa 1946. ENIAC was the world's first electronic computer. Courtesy, UNISYS Corporation.

The design of a commercial process can generate an almost uncountable number of possible solutions for seemingly simple problems. Even a decade's 10,000-fold increase in computational capability in terms of faster computer speeds and better algorithms does not permit a person to search among these alternatives, nor would such a search make strategic sense. Give engineers a design problem for which the best solution is obvious from today's technology, and they will quickly write down the correct solution without searching. If the solution is not obvious, they will often home in on the information needed and perform the computations that will expose the right solution quickly and with minimal effort. By using intuition and experience, they eliminate the need for testing every possible alternative. We need to understand how to use computer technology in much the same way, to solve complex problems where many of the decisions are based on qualitative information and insights that develop as the problem is attacked.

Encoding this activity in the computer involves a type of modeling in which the capabilities of the designer and his tools, the alternative procedures by which complex problem solving can be performed, and effective methods of information management are all incorporated.
Advances in artificial intelligence, expert systems, and information management will revolutionize the automation of this activity, giving us computers that can display encyclopedic recall of relevant information and nearly human reasoning capabilities. A HAL 9000 of 2001: A Space Odyssey fame may indeed exist in the future. Computer-generated visual information, for example, three-dimensional portrayal of proposed new designs, will be commonplace in the future. Communication will be in natural language, using both pictures and voice. This setting, which will address the need for new chemical processes and products by harnessing almost unimaginable computing power, will provide significant new research opportunities in chemical engineering.

USING THE COMPUTER'S POTENTIAL

Each decade over the last 35 years has seen the processing speed of newly designed computers increase by a factor of about 100 owing to advances in the design of electronic microcircuits and other computer hardware. On top of this has been another 100-fold increase per decade in computer speed, thanks to more efficient methods of carrying out computations (algorithms). It is not widely appreciated that new algorithms have been as valuable as hardware design in improving computer performance. With the combination of improved computer hardware and better algorithms, effective computer speeds have more than doubled on average each year.

The availability of computing resources is also increasing rapidly. The actual and projected availability of high-speed supercomputers is shown in Table 8.1. The projection for 1990 (at least 700 computers of Cray I class) represents 35 times more available computing power than that available in 1980. While continued substantial investment in supercomputers is needed, support for better ways of using them should not be neglected.
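The growth rates quoted above compound in a simple way, which a few lines of arithmetic make explicit (the two 100-fold factors are the text's; the per-year conversion is just the tenth root):

```python
# Sanity check of the claimed growth rates: ~100x per decade from
# hardware plus another ~100x per decade from algorithms.
hardware_per_decade = 100
algorithms_per_decade = 100

combined_per_decade = hardware_per_decade * algorithms_per_decade  # 10,000x
per_year_factor = combined_per_decade ** (1 / 10)                  # ~2.5x

print(f"Combined: {combined_per_decade}x per decade, "
      f"~{per_year_factor:.1f}x per year")
```

A factor of 10,000 per decade is about 2.5x per year, consistent with the statement that effective speeds "more than doubled on average each year."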
It is conceivable that a new algorithm could effectively increase the power of supercomputing for a specific problem by a factor of 35 overnight. During the last two decades, many developments in numerical analysis have had a profound impact on scientific computation.

TABLE 8.1 Actual and Estimated Supercomputing Resources Available to Researchers in the United States, 1980-1990

Year | Number of Cray I-Class Supercomputers
1980 | 21
1985 | 142
1990 | 700-1,000

Clearly computer technology has improved rapidly; there is little reason to doubt that it will continue to do so. The problem is now and will continue to be the lack of people trained to apply computer technology to scientific and engineering tasks. The improvements suggested in Chapter 7 in terms of our ability to design and control better chemical products and processes will be made by chemical engineers who understand computers, not by computer scientists or by software engineers. The countries that understand this distinction will lead the world in chemical technology.

MATHEMATICAL MODELS OF FUNDAMENTAL PHENOMENA

Chemical engineers have traditionally used mathematical models to characterize the physical and chemical interactions occurring in chemical processes. Many of these models either have been entirely empirical or have relied on crude approximations of the basic physics or chemistry of the process. This is because a typical chemical process comprises an assemblage of interacting flows, transports, and chemical reactions. Accurate analysis and prediction of the behavior of such a complex system require detailed portrayal of the physics of transport and the chemistry of reactions, which calls for complex equations that do not yield to traditional mathematics. Nonlinear partial differential and integral equations in two and sometimes three spatial variables must be solved for regions with complicated shapes that often have at least some free boundary. The more accurate the model, the more mathematically complex it becomes, but it cannot be more complex than allowed for by the available methods for solving its equations.
Before the advent of modern computer-aided mathematics, most mathematical models of real chemical processes were so idealized that they had severely limited utility, being reduced to one dimension and a few variables, or linearized, or limited to simplified variability of parameters. The increased availability of supercomputers along with progress in computational mathematics and numerical functional analysis is revolutionizing the way in which chemical engineers approach the theory and engineering of chemical processes. The means are at hand to model process physics and chemistry from the

molecular scale to the plant scale; to construct models that incorporate all relevant phenomena of a process; and to design, control, and optimize more on the basis of computed theoretical predictions and less on empiricism. Chemical engineers, using advanced computational methods and supercomputers, can now readily identify the important phenomena in a complex chemical process over the entire range of applicable conditions by exhaustive solution of detailed models. The benefits of investing in less empirical, more fundamental mathematical models are becoming clear:

• The capability to construct mathematical models that more fully incorporate the basic chemistry and physics of a system provides a mechanism for assessing understanding of fundamental phenomena in a system by comparing predictions made by the model with experimental data.

• Better models can replace laboratory or field tests that are difficult or costly to perform or identify crucial experiments that should be carried out. In either case, they will significantly enhance the scope and productivity of chemical engineering researchers in academia and industry.

• In process design, it is frequently discovered that many of the basic data needed to understand a process are lacking. Because most current mathematical models are not sufficiently accurate to permit direct scale-up of the process from laboratory data to full plant size, a pilot plant must be constructed. As models are improved, it may become possible to evaluate design decisions with more confidence, and bypass the pilot plant stage.
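To make the idea of a fundamental model concrete, here is a deliberately tiny sketch of my own (not an example from the chapter): a one-dimensional reaction-diffusion balance, dc/dt = D d2c/dx2 - kc, solved by explicit finite differences. All parameter values are assumed for illustration.

```python
# Toy 1-D reaction-diffusion model: species diffuses in from a fixed
# boundary while being consumed by a first-order reaction.
D, k = 1e-3, 0.5            # assumed diffusivity and rate constant
nx, dx, dt = 51, 0.02, 0.05 # grid and time step (stable: D*dt/dx**2 < 0.5)

c = [0.0] * nx
c[0] = 1.0                   # fixed feed concentration at the inlet

for _ in range(2000):        # march to an effectively steady profile
    new = c[:]
    for i in range(1, nx - 1):
        lap = (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2
        new[i] = c[i] + dt * (D * lap - k * c[i])
    new[-1] = new[-2]        # zero-flux outlet boundary
    c = new

print(f"c at x=0.1: {c[5]:.4f}, c at mid-domain: {c[25]:.2e}")
```

Comparing such a computed profile against measured concentrations is exactly the model-versus-experiment test described in the first bullet above, here in miniature.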
Process technologies for which the use of more comprehensive mathematical models can result in major improvements include those for biochemical reaction processing; high-performance polymers, plastics, composites, and ceramics; chemical reaction processing (e.g., reaction injection molding, reaction coating, chemical vapor deposition); microelectronic circuits; optical fibers and disks; magnetic memory systems; high-speed coating; photovoltaic and semiconductor materials; coal gasification; enhanced petroleum recovery; solution mining; and hazardous waste disposal. To date the most extensive use of supercomputer modeling has been in space age weapons technologies, where objectives, economics, and time frames differ from those in

the chemical process industries. It is clearly in the national interest to stimulate the more extensive use of advanced computational methods and supercomputers in other industries critical to our worldwide competitive position. A program to encourage the greater dissemination of advanced computational techniques and hardware will offer challenges and opportunities to computational mathematicians and numerical analysts, to engineering scientists, to applications and software experts in firms that develop and manufacture supercomputers, and, above all, to perceptive leaders in high-technology process industries.

The following sections describe in more detail a number of areas in chemical engineering in which the ability to develop and apply detailed mathematical models should yield substantial rewards.

Hydrodynamic Systems

Much of the current computational modeling research in chemical engineering is concerned with the behavior of flowing fluids. The general system of equations that describe fluid mechanics, called the Navier-Stokes equations, has been known for more than 100 years, but for complex phenomena the equations are exceedingly difficult to solve. Only recently have methods been devised to treat such phenomena as shock waves and turbulence. Further difficulties arise when disparate temporal and spatial scales are present and when chemical reactions occur in the fluid. Solutions of the Navier-Stokes equations can be smooth and steady, or they can exhibit regular oscillations or even chaos. In some cases the fluid flow is enclosed by a rigid boundary with a complex shape, as in the extrusion of polymers; in others the flow is effectively unbounded and the solution must extend to infinity, as in atmospheric systems; and in still others, such as the flow of blood in vessels, the boundary is deformable. Solution of the Navier-Stokes equations for systems of technological interest remains an exceedingly challenging task; supercomputers are needed to treat those systems that can be solved.

Polymer Processing

The development of polymers and polymer composites will benefit greatly from the availability of better computers and better algorithms. The inherent properties of a polymer are governed by the chemical structure of its molecules, but the properties of a finished polymer product are affected by the interactions

among these molecules, which are strongly influenced by the way in which the material has been processed. While it is now possible to predict certain properties of polymers from their molecular structure, the ability to predict the effect of polymer processing steps on polymer properties is just being developed. Ideally it would be desirable to model all steps from the formation of the polymer through its processing and then predict the final properties of the material from structure-property relationships. Although such modeling is a formidable problem, it is becoming feasible with the advent of supercomputers and improved algorithms.

Petroleum Production

Computation is widely used in petroleum exploration and production by exploration geophysicists, petroleum engineers, and chemical engineers. As more sophisticated techniques are developed for locating and recovering petroleum, mathematical modeling is playing an ever-expanding role.

Once regions that may contain petroleum are located, local geological features must be sought that might have trapped the hydrocarbons. The discovery process is based on a kind of seismic prospecting in which geologic maps are constructed from reflected seismic signals generated by explosions or vibrations at the surface of the earth. These signals are reflected or refracted in varying degrees by different rock strata and are recorded by a set of receivers. Thus, the problem of interpreting signals can be likened to that of analyzing light beams reflected by an array of variously curved plates of glass of different reflectivities separated by liquids of different refractive indexes. The inverse mathematical problem of determining the earth structure and the properties of the strata from the recorded signals is extremely difficult.

After a hydrocarbon reservoir has been located, the flow of oil, water, gas, and possibly injected chemicals in the reservoir must be modeled.
This challenge is particularly appropriate for chemical engineers working with petroleum engineers because of the important role played by molecular-level interactions between oil, subsurface water, and rock. Models for fluid flow in porous media comprise coupled systems of nonlinear partial differential equations for conservation of mass and energy, equations of state, and other constraining relationships. These models are usually defined on irregular domains with complex boundary conditions. Their numerical solution, with attendant difficulties such as choice of discretization methods and grid orientation, is a challenging intellectual problem.
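Models like these are only as good as their parameters, which must often be inferred from field measurements. The following toy illustration is invented here (a real match involves the coupled nonlinear PDEs above): recovering a single decline-rate parameter of a one-parameter production model by least squares over a grid of candidates.

```python
import math

# Toy parameter-estimation problem: fit the decline rate "a" in
# q(t) = q0 * exp(-a * t) to synthetic "observed" production data.
q0, a_true = 100.0, 0.3                     # assumed model parameters
t = [0.5 * i for i in range(21)]            # observation times, 0..10

# Synthetic observations with a small deterministic perturbation.
observed = [q0 * math.exp(-a_true * ti) + 0.5 * math.sin(3.0 * ti) for ti in t]

# Brute-force least squares over candidate decline rates.
best_a, best_err = None, float("inf")
for i in range(1, 1001):
    a = i / 1000.0
    err = sum((obs - q0 * math.exp(-a * ti)) ** 2
              for obs, ti in zip(observed, t))
    if err < best_err:
        best_a, best_err = a, err

print(f"true decline rate {a_true}, fitted {best_a}")
```

With many coupled parameters and sparse, noisy data, this same fitting task becomes the ill-posed inverse problem discussed below, which is far harder than a one-dimensional search.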

Once wells have been drilled into the formation, the local properties of the reservoir rocks and fluids can be determined. To construct a realistic model of the reservoir, its properties over its total extent, not just at the well sites, must be known. One way of estimating these properties is to match production histories at the wells with those predicted by the reservoir model. This is a classic ill-posed inverse problem that is very difficult to solve.

When or if the reservoir is successfully simulated, the engineer can turn to optimizing petroleum recovery, and theoretical ideas can be applied to models for various enhanced recovery methods to select optimal procedures and schedules (see Chapter 7).

Combustion Systems

Combustion is one of the oldest and most basic chemical processes (Plate 6). Its accurate mathematical modeling can help avoid explosions and catastrophic fires, promote more efficient fuel use, minimize pollutant formation, and design systems for the incineration of toxic materials (see Chapter 8). For example, modeling the initiation and propagation of fires, explosions, and detonations requires the ability to model combustion phenomena. Models of the internal combustion engine can shed light on the influence of combustion chamber shape or valve and spark plug placement on engine performance. Models at the molecular level can provide a fundamental understanding of how fuels are burned and how gaseous and particulate pollutants are formed. This can lead to ways to improve the design of combustion systems.

Mathematical models of combustion must incorporate intricate fluid mechanics coupled with the kinetics of many chemical reactions among a multitude of compounds and free radicals. They must also consider that those reactions are taking place in turbulent flows inside chambers with complex shapes.
Because complete models of real combustors, incorporating accurate treatment of both fluid mechanics and chemistry, are still too large for present computers, the challenge is to construct simpler, yet still valid, models by using critical insight into the important chemical and physical phenomena found in combustors. Chemical engineers have the mix of expertise necessary to accomplish this.

Environmental Systems

The environment can be likened to a giant chemical reactor. Gases and particles are emitted into the atmosphere by industrial and other man-made processes, as well as by a variety of natural processes such as photosynthesis, vulcanism, wildfires, and decay processes. These gases and particles can undergo chemical reactions, and they or their reaction products can be transported by the wind, mixed by atmospheric turbulence, and absorbed by water droplets. Ultimately, they either remain in the atmosphere indefinitely or reach the earth's surface. For example, the hazes of polluted atmospheres consist of submicron aerosols of inorganic and organic compounds, which are formed by chemical reaction, homogeneous nucleation, or condensation of gases (Plate 7).

Models of atmospheric phenomena are similar to those of combustion and involve the coupling of exceedingly complex chemistry and physics with three-dimensional hydrodynamics. The distribution and transport of chemicals introduced into groundwater also involve a coupling of chemical reactions and transports through porous solid media. The development of groundwater models is critical to understanding the effects of land disposal of toxic waste (see Chapter 7).
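Both combustion and atmospheric models couple reaction kinetics to transport. As a minimal illustration of the kinetics piece alone (a toy of my own, far simpler than the multi-species mechanisms described above), here is the series reaction A -> B -> C integrated with a forward-Euler step:

```python
# Series reactions A -> B -> C with first-order kinetics:
#   dA/dt = -k1*A,  dB/dt = k1*A - k2*B,  dC/dt = k2*B
k1, k2 = 1.0, 0.5          # assumed rate constants
a, b, c = 1.0, 0.0, 0.0    # initial mole fractions
dt = 0.001

for _ in range(10_000):    # integrate to t = 10
    ra, rb = k1 * a, k2 * b
    a, b, c = a - ra * dt, b + (ra - rb) * dt, c + rb * dt

print(f"A={a:.4f}  B={b:.4f}  C={c:.4f}  (sum={a + b + c:.4f})")
```

Note that the scheme conserves total moles by construction; a real mechanism would involve dozens of species, stiff rate constants spanning many orders of magnitude, and coupling to the flow field.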
PROCESS DESIGN

The primary goal of process design is to identify the optimal equipment units, the optimal connections between them, and the optimal conditions for operating them to deliver desired product yields at the lowest cost, using safe process paths, and with minimal adverse impact to the environment. Design is a complex problem that involves not only the quantitative computing depicted in the previous section, but also the effective handling of massive amounts of information and qualitative reasoning.

Computer-Assisted Design of New Processes

Designs for new processes proceed through at least three stages:

• Conceptual design: the generation of ideas for new processes (process synthesis) and their translation into an initial design. This stage includes preliminary cost estimates to assess the potential profitability of the process, as well as analyses of process safety and environmental considerations.

• Final design: a rigorous set of design calculations to specify all the significant details of a process.

• Detailed design: preparation of engineering drawings and equipment lists needed for construction.

The key step in the conceptual design of a new chemical manufacturing process is generating the process flowsheet (Figure 8.2). All other elements of computer-aided design (e.g., process simulation, design of control systems, and plantwide integration of processes) come into play after the flowsheet has been established. In current practice, the pressure to enter the market quickly often allows for the exploration of only a few of the process alternatives that should be considered. To be fair to today's designers, it is possible to generate a very large number of alternative process paths at the conceptual stage of design, and yet experience indicates that less than 1 percent of the ideas for new designs become commercial. Thus, the challenge in computer-aided process synthesis is to develop systematic procedures for the generation and quick screening of many process alternatives. The goal is to simplify the synthesis/analysis activity in conceptual design and give the designer confidence that the initial universe of potential process paths contained all the pathways with reasonable chances for commercial success. The advances in computer-aided process synthesis that are possible over the next decade are dramatic.
They include both an increasing level of sophistication (e.g., the synthesis of heat exchanger networks, sequences of separation processes, networks of reactors, and process control systems) and computational procedures that should make possible the identification of the most viable process option in a relatively short amount of time.

As the designer moves from conceptual design toward final design, he or she must analyze a number of alternatives for the final design. The development of large, computer-aided design programs (so-called process simulators) such as FLOWTRAN, PROCESS, DESIGN 2000, and ASPEN (or other equivalent programs used in various companies) has significantly automated the detailed computations needed to analyze these various process designs. The availability of process simulators has probably been the most important development in the design of petrochemical plants in the past 20 years, cutting design times drastically and resulting in better designed plants.

Although the available simulators have done much to achieve superior design of petrochemical processes, there is considerable room for improvement. For example, better models are needed for complex reactors and for solids processing operations such as crystallization, filtration, and drying. Thermodynamic models are needed for polar compounds. Moreover, the current process simulators are limited to steady-state operations and are capable of analyzing only isolated parts of a chemical plant at any given time. This compartmentalization is due to the limitations on computer memory that prevailed when these programs were first developed. This memory limitation resulted in a computational strategy that divided the plant into "boxes" and simulated static conditions within each box, iteratively merging the results to simulate the entire plant. With today's supercomputers, it is possible to simulate the dynamics of the entire chemical plant.
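The iterative "boxes" strategy just described can be sketched in a few lines. In this invented miniature flowsheet, a reactor and separator are each simulated as a black box, and the recycle stream torn between them is converged by successive substitution; all numbers are assumptions of the sketch.

```python
# Sequential-modular simulation of a tiny flowsheet:
#   fresh feed + recycle -> reactor -> separator -> product
# with the separator returning a fraction of unreacted material.
feed = 100.0          # fresh feed, mol/h (assumed)
conversion = 0.6      # assumed single-pass reactor conversion
recovery = 0.9        # assumed fraction of unreacted feed recycled

recycle = 0.0         # initial guess for the torn recycle stream
for iteration in range(200):
    reactor_in = feed + recycle
    unreacted = reactor_in * (1 - conversion)
    new_recycle = unreacted * recovery
    if abs(new_recycle - recycle) < 1e-8:
        break
    recycle = new_recycle

print(f"Converged recycle = {recycle:.4f} mol/h after {iteration} iterations")
```

Each pass through the loop is one "merge" of box results; a dynamic plantwide simulation replaces this steady-state fixed-point iteration with time integration of every unit simultaneously.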
This opens the way for dramatic advances in modeling and analysis of alternative process designs, because the chemical reactions that occur in manufacturing processes are usually nonlinear and interdependent, and random disturbances in the process can propagate quickly and threaten the operation of the entire plant. To nullify the effects of such disturbances, the designer must know the dynamics of the entire plant, so that control failure in any one unit does not radiate quickly to other units. It is now within our reach to integrate this sophisticated level of

FIGURE 8.2 A process flowsheet for a chemical manufacturing process. (The original figure and its caption are not recoverable from the OCR text.)

design and analysis on a plantwide scale (including design and performance modeling of plantwide control systems) into the computational tools used to analyze and optimize the performance of individual processes in the plant.

In the detailed design stage for a chemical manufacturing process, a chosen process design must be converted into a list of equipment items to be purchased and a set of blueprints to guide their assembly. The design is presented as a detailed process flow diagram (PFD), from which is constructed a list of all needed items of equipment, and piping and instrumentation diagrams (PIDs) that show the equipment and its interconnections. The next task is to establish the physical layout for the entire plant. Advanced computer-based drafting tools aid in all these activities.

Computer-Assisted Process Retrofitting

The preceding section focused largely on the design of new plants. However, these procedures can also be adapted to the retrofitting of existing plants. Retrofitting is generally undertaken to increase the capacity of a plant; to make use of a new technology such as an improved catalyst, a new material of construction, or a new unit operation; or to respond to a significant change in the cost of energy or raw materials. A fair amount of retrofitting in the chemical process industries in recent years has been undertaken to improve energy efficiency through plantwide energy integration; retrofitting of about 50 processes to incorporate modern heat-exchanger network synthesis concepts has reduced energy requirements in the chemical industry by 30 to 50 percent.
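The heat-exchanger network synthesis concepts mentioned above rest on energy-targeting calculations. One widely used targeting method (not named in the chapter) is the "problem table" cascade of pinch analysis; the sketch below uses stream data invented for illustration.

```python
# Problem-table targeting: shift temperatures by dTmin/2 so that
# feasible heat exchange reduces to a simple heat cascade.
dt_min = 10.0
# (supply T, target T, heat-capacity flowrate CP in kW/K) - invented data
hot = [(180.0, 60.0, 2.0), (150.0, 30.0, 1.0)]
cold = [(20.0, 160.0, 1.5), (80.0, 140.0, 2.5)]

shifted_hot = [(ts - dt_min / 2, tt - dt_min / 2, cp) for ts, tt, cp in hot]
shifted_cold = [(ts + dt_min / 2, tt + dt_min / 2, cp) for ts, tt, cp in cold]

bounds = sorted({t for s in shifted_hot + shifted_cold for t in s[:2]},
                reverse=True)

cascade, q = [0.0], 0.0
for hi, lo in zip(bounds, bounds[1:]):
    net_cp = (sum(cp for ts, tt, cp in shifted_hot if tt <= lo and ts >= hi)
              - sum(cp for ts, tt, cp in shifted_cold if ts <= lo and tt >= hi))
    q += net_cp * (hi - lo)
    cascade.append(q)

q_hot_min = -min(cascade)             # minimum hot-utility requirement
q_cold_min = cascade[-1] + q_hot_min  # minimum cold-utility requirement
print(f"Minimum hot utility: {q_hot_min} kW, cold utility: {q_cold_min} kW")
```

Targets like these, computed before any exchangers are designed, are what make it possible to judge how far an existing plant's energy use is from the best achievable.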
Retrofitting will continue to play a major role in the design of chemical plants as new procedures for computer-aided synthesis of separation systems are applied and research on process synthesis begins to yield large savings by helping existing petrochemical plants produce the same mix of products through more economical chemical reaction pathways. We need to develop a systematic approach to analyzing the impact of making changes in the connections between process units or in the size of units that are undertaken to improve operating costs, plant flexibility, or safety.

Research Opportunities in Process Design

The overall goal of process design research is to develop a systematic procedure, probably in the form of an interactive computer program, that contains design heuristics and interchangeable approximate and rigorous models that can lead an engineer from an initial concept to a final design as quickly as possible. The final design must include considerations of economics, controllability, safety, and environmental protection. We need to extend the conceptual and final design procedures that have been developed for petrochemical processes to processes for producing polymers, biochemicals, and electronic devices. We also must develop systematic synthesis/analysis procedures for studying batch processes analogous to the procedures that have been developed for studying continuous petrochemical processes.

There are some aspects of process design in which decisions are based primarily on past experience rather than on quantitative performance models. Problems of this type include the selection of construction materials, the selection of appropriate models for evaluating the physical properties of homogeneous and heterogeneous mixtures of components, and the selection of safety systems. Advances in expert systems technology and information management will have a profound impact on expressing the solutions to these problems.
In summary, systematic procedures must be developed for the following:

• generation of process alternatives;

• quick screening of process alternatives using both rule-of-thumb and short-cut calculations;

• inclusion of controllability, safety, and environmental factors in the initial design;

• more detailed screening of a small number of promising design alternatives;

• design of the process and management of its construction;

• use and extension of expert systems concepts to handle aspects of design that deal with

a mixture of qualitative and quantitative information; and
- retrofitting of existing plants.

PROCESS OPERATIONS AND CONTROL

Process operations and control have a tremendous impact on the profitability of a manufacturing operation. In some cases, they can determine the economic viability of a manufacturing facility. For example, Du Pont's Process Control Technology Panel has estimated that if Du Pont were to extend the degree of computer process control that has been achieved at a few of its plants across the entire corporation, it would save as much as half a billion dollars a year in manufacturing costs. If Du Pont's numbers are representative, the entire chemical industry could save billions of dollars each year through more widespread application of the best available process control. This could be the single most effective step that the U.S. chemical process industry could take to improve its global competitive position in manufacturing.

Why are such savings suddenly possible? Because of the explosive developments in computer technology, research on operations and control is no longer constrained by lack of computing power. In particular, the traditional boundaries between design, control, optimization, simulation, and operation are disappearing. Control is becoming a part of process design; simulation and optimization are becoming components of control design.

Research opportunities in process operations and control lie in three areas:

- collection of information through process measurements;
- interpretation and communication of information by use of process models; and
- utilization of information through control algorithms and control strategies for both normal and abnormal operation.

Measurements

The essence of process control is to take appropriate and quick corrective action based on measured information about the behavior of the process.
The concept of the process is contained in a process model, and measurements are used to evaluate the degree to which the process conditions deviate from those of the model. When a mismatch occurs between actual process conditions and those postulated by the model, a control and operating strategy is invoked to correct the process conditions. A critical interrelationship exists between measurements and operating/control strategy, one that is too often neglected. The control strategy depends on what information is or can be made available, even while it dictates which measurements are needed. This is perhaps best illustrated in a number of manufacturing processes in cutting-edge technologies, where control and operating strategy is circumscribed by the lack of appropriate sensors for many critical process variables. Conventional estimation techniques are used that infer the values of unmeasured variables from measured variables, but these provide imperfect guidance for process control.

Even something as empirical as process measurement cannot be divorced from the need for good process modeling. In the absence of a good model, it may not be known which variables affect process operation or product quality and should therefore be measured or estimated.

Process measurements are subject to errors. Random (stochastic) disturbances are ubiquitous, and gross or systematic errors can be caused by malfunctioning sensors or instruments. The detection and elimination of these errors are essential if the data are to be used for process operations and control. The success of this screening depends on the measurements themselves, the failure data available, and the process control strategy. At the present time, diagnostic programs are not applied to most sensor failure data. The detection and remediation of significant errors in measurements for process control pose interesting research opportunities.
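The distinction between random and gross errors drawn above can be illustrated with a minimal residual test over redundant measurements; the data, threshold, and test form here are illustrative assumptions, not a production data-reconciliation method.

```python
# Sketch: flagging a gross (systematic) sensor error among redundant
# measurements of the same quantity with a simple residual test.
from statistics import mean, stdev

def flag_gross_errors(readings, k=3.0):
    """Return indices of readings lying more than k sample standard
    deviations from the mean of the remaining readings."""
    flagged = []
    for i, x in enumerate(readings):
        rest = readings[:i] + readings[i + 1:]
        m, s = mean(rest), stdev(rest)
        if s > 0 and abs(x - m) > k * s:
            flagged.append(i)
    return flagged

# A miscalibrated or stuck sensor stands out against its neighbors:
print(flag_gross_errors([10.1, 9.9, 10.0, 14.2, 10.2]))  # flags index 3
```

Real schemes exploit process models and mass/energy balances rather than pure redundancy, but the underlying idea, comparing each measurement against what the rest imply, is the same.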
Interpretation of Process Information

The quality of an operation and control strategy depends on the quality of the model on which it is based (Figure 8.3). We are only beginning to understand this relationship quantitatively, even in the relatively simple context of the feedback control of linear systems. Even if it is assumed that the structure of the process model is correct, we do not yet know how to translate uncertainties in model parameters into uncertainty in the performance of the control system. A more difficult problem is to assess the effect of an incorrect model structure, such as a wrong set of basic equations, on the performance of the control strategy on which it is based. Understanding the effect of model-process mismatch on control system performance provides a critical research opportunity.

[FIGURE 8.3 The importance of accurate process models. Cartoon caption: "...and in 1/10,000 of a second it can compound the process model's error 87,500 times!" Copyright 1988 by Sidney L. Harris.]

In the context of operations and control, simulations can be used to test new process strategies as well as to train operating personnel to control the process and to respond to emergencies. The increasing use of simulation in process control requires that the cost of dynamic simulation be brought down. This could be done by taking advantage of new developments in computers, such as new user interfaces, computer architectures, and languages, and by developing faster numerical integration algorithms for ordinary differential equations.

Alarm management also requires research. Modern chemical plants usually have audible and visual annunciators to warn operators when key variables deviate from acceptable or safe values. A process upset in a plant that has several interconnected units with many feedback controls can set off multiple alarms, and the consequences of misinterpreting the alarms can range from inefficient process operation to outright disaster. When an alarm sounds, the operator must decide quickly what action to take.
A hybridization of expert systems and process control systems can assist the operator in interpreting process status after an abnormal event. The need for better handling of abnormal events makes research in artificial intelligence of great importance to the chemical industry.

Integration of Process Design with Control

Most continuous plants are now designed for steady-state operation with little regard for the ease (or difficulty) with which the steady state can be maintained through control. Such a plant can be difficult to control once it deviates from the steady state. Design and control have traditionally been treated separately for the following reasons:

- The problems in each area alone are exceedingly complex.
- The interactions between design, control, and optimization are poorly understood.
- The computational requirements of an integrated approach to design and control have been beyond the capability of available hardware and software.

For example, a chemical plant might be designed to achieve high efficiency by integrating the operation of many individual process units across the plant (e.g., by using waste heat from one unit as an energy source in another unit). However, the tight coupling of process units generally makes the entire plant more difficult to control. Therefore, this is a factor that must be considered at the design stage. No methodology currently exists for including this consideration in plant design; its development constitutes a significant research opportunity.

The supercomputer power that is becoming available will provide the opportunity to combine process and control system design, including optimization, into one large problem that can be solved in a way that accounts for their interactions. The success of such a consolidation will depend on the development of approximate compatible models and of techniques to relate model quality to performance.

Robust and Adaptive Control

Control systems are designed from mathematical models that are generally imperfect descriptions of the real process. It is essential that control systems operate satisfactorily over a wide range of process conditions. Thus, the control algorithm must provide for control of the process even when the dynamic behavior of the process differs significantly from that predicted by the model. A control system with this characteristic is sometimes called robust. In fact, a traditional disregard for the model error problem is one of the main reasons for the frequently cited gap between theory and practice in process control. Industry needs algorithms that are robust rather than ones that "get that last half percent of performance." Control strategies that work all the time within reasonable limits are better than those that work optimally some of the time but that frequently require reversion to manual control.

Because a process often changes over time, its model parameters must be continually updated; in extreme cases, the basic model must be reformulated. An adaptive system is a control system that automatically adjusts its controller settings, or even its structure, to accommodate changes in the process or its environment. The problem of model-plant mismatch is of crucial importance in the design of adaptive controllers, since it is that very mismatch that drives changes in the controller parameters. The engineering theory and methodology for designing reliable adaptive controllers for chemical processes are in the earliest stages of development.
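The adaptive idea of continually updating model parameters can be sketched with a recursive least-squares estimator tracking a single process gain; the forgetting factor, initial values, and data below are illustrative assumptions, not a recommended tuning.

```python
# Sketch: recursive least squares (RLS) with a forgetting factor tracks
# a process gain theta in the model y = theta * u from input/output data,
# so a controller built on the model can be retuned as the plant drifts.

def rls_gain_estimate(us, ys, lam=0.95, theta0=1.0, p0=100.0):
    """Estimate theta in y = theta * u recursively; lam < 1 discounts
    old data so the estimate can follow a changing plant."""
    theta, p = theta0, p0
    for u, y in zip(us, ys):
        denom = lam + p * u * u
        k = p * u / denom              # estimator gain
        theta = theta + k * (y - theta * u)   # correct using prediction error
        p = (p - k * u * p) / lam      # covariance update with forgetting
    return theta

# True gain 2.0; the estimate converges toward it from theta0 = 1.0:
us = [1.0, 0.8, 1.2, 1.0, 0.9, 1.1]
ys = [2.0 * u for u in us]
est = rls_gain_estimate(us, ys)
print(round(est, 3))
```

The model-plant mismatch discussed above enters precisely through the prediction error term: it is what drives the parameter update.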
Finally, there are always process operations in which neither classical nor modern control is effective. Such operations may require qualitative decision making or the use of past knowledge. Artificial intelligence techniques offer promise for control system design in these cases.

Batch Process Engineering

The production of fine and specialty chemicals, which are usually made by batch processes, is becoming increasingly important and competitive. The efficient operation of multiproduct and multipurpose batch plants offers a variety of challenging research problems for chemical engineers. Most industrial batch chemical operations are now scheduled by intuitive, ad hoc methods that consist of modest variations on historical operating patterns and that make little or no use of computer technology. It is now widely recognized that the scheduling problems associated with batch processes are immensely complex and, in fact, are among the most difficult combinatorial problems known. Limited progress has been made in using mathematical models in the simplest types of batch process scheduling. Current algorithms are too computationally demanding and complex for industrial use. An important intellectual challenge is to generate a unified theory of batch process engineering and to put it into a practical context through case studies.

Linear control theory will be of limited use for operational transitions from one batch regime to the next and for the control of batch plants. Too many of the processes are unstable and exhibit nonlinear behavior, such as multiple steady states or limit cycles. Such problems often arise in the batch production of polymers. The feasibility of precisely controlling many batch processes will depend on the development of an appropriate nonlinear control theory with a high level of robustness.

While startup and shutdown occur relatively infrequently in large continuous plants, they are inherent in batch plant operation. Most startup and shutdown procedures, whether devised empirically or theoretically, are designed to follow a recipe of actions with no feedback.
Thus, if upsets occur, there is often no way to change the startup or shutdown sequence in time to avoid unwanted process excursions. Procedures are needed that incorporate feedback and adaptive techniques into plant startup and shutdown.

PROCESS SENSORS

If we had a completely accurate model of a process and accurate measurements of process disturbances at their inception, then corrective action could be taken directly, without the need to measure the output streams from the process after the disturbance has propagated through it. But because we generally do not have adequate models, the output streams of processes must be measured for the purpose of feedback control.

The sensor is the "fingertip" of the process control system. The principal challenge in process sensing is the development of analytical sensors, particularly for determining process stream composition. Such sensors eliminate the need to withdraw samples to determine process and product parameters, a practice that should be minimized because of its inherent problems (e.g., samples of reactive intermediates may be toxic or otherwise dangerous, or the intervention represented by withdrawing a sample may affect process operation). Since it is important that process control not disturb the normal operation of the process, sensors are needed that can operate in the environment of the process stream. The key to meeting this challenge is a fundamental understanding of the physical and chemical interactions at the sensor-environment interface and, in particular, of the transport and kinetic processes that occur there.

Future Sensor Developments

The techniques used in the chemical processing of electronic microcircuits (see Chapter 4) are being adapted to the microfabrication of two- and three-dimensional structures for solid-state sensors.
These techniques will permit the integration of transducers, optoelectronics, signal-conditioning and data-processing devices, and micromechanical devices into extremely small packages. Reduced size offers advantages in thermal uniformity and response speed; shock and acceleration resistance; and reduced weight, volume, power, and cost.

Solid-state sensors may be developed that will be responsive to a broad range of acoustic

inputs, electromagnetic radiation, ionizing radiation, and electrochemical stimuli. Response elements may be tailored for high selectivity among ions, free radicals, or specific compounds. Alternatively, elements with low selectivity are also useful, because information from an array of such sensors, each with a different but known broad response, can be processed to provide quantitative analysis of a complex mixture. Complex mixtures also lend themselves to chromatographic analysis. It has been shown that gas chromatographic data can be analyzed on a silicon chip, although with some loss in recognition reliability. Combinations of gas or liquid chromatography or capillary electrophoresis of microsamples with mass spectrometry may be developed to provide superior performance.

The development of biological sensors is taking place at a rapid pace. Biological sensors analyze chemical mixtures using biological reagents of exquisite specificity (for example, enzymes, immunoproteins, monoclonal antibodies, and recombinant nucleic acids). Such sensors may permit the analysis of fast reactions of species in very dilute media. Multicomponent biological sensors may be able to perform complex analyses that involve multiple reactions, with automatic regeneration of the biochemical reagents or removal of interfering species.

Unfortunately, current biological sensors are extremely delicate. Even when the biological reagents are immobilized on a solid carrier, such sensors require careful construction and frequent recalibration, are not always amenable to automation or unattended operation, and sometimes have inconsistent dynamic response and limited life. Although biological reagents are ideally suited for some applications, particularly those in relatively mild environments, they may not survive the harsher conditions often found in process industries.
Here again, miniaturization of the biological sensor and its direct integration into an optoelectronic transducer, potentiometric electrode, or membrane are promising approaches. With the current worldwide interest in biotechnology, major innovations in biological sensors can be anticipated.

Further advances in optoelectronics will allow the development of instrumentation with no electrical components in the sensor (Figure 8.4). These devices will operate by transmitting probe light from a remote source to the process sensor with an optical fiber light guide. In the sensor, the light signal will be altered by the sensed environment (e.g., by absorption of certain wavelengths, fluorescence, or scattering) and will thus be "encoded" with information. The encoded signal is transmitted through the optical fiber to a transducer that produces an electronic signal. The advantages of such systems include inherent safety, low signal attenuation, and the ability to multiplex signals in the optical fiber. Such instrumentation can incorporate additional chemical, biological, and electronic components and is likely to play a major role in many future sensor systems.

[FIGURE 8.4 Configurations for several different kinds of optical fiber sensing systems. The common factor in all these systems is the use of an optical fiber as an integral element, either to carry light to and from discrete sensors (often referred to as optrodes) or as the sensitive element itself (intrinsic fiber sensors). Courtesy, AT&T Bell Laboratories.]
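The low-selectivity array idea mentioned earlier, several sensors with different but known broad responses, reduces to solving a linear response model for the analyte concentrations. The sketch below works the two-sensor, two-analyte case; the sensitivity coefficients are made-up calibration values for illustration.

```python
# Sketch: recovering two analyte concentrations from two broadly
# responsive sensors by inverting the 2x2 linear response model
#   r1 = a11*c1 + a12*c2
#   r2 = a21*c1 + a22*c2
# The sensitivity matrix entries are assumed calibration data.

def unmix_two(r1, r2):
    a11, a12 = 1.0, 0.4   # sensor 1 sensitivities to analytes 1 and 2
    a21, a22 = 0.3, 1.2   # sensor 2 sensitivities
    det = a11 * a22 - a12 * a21
    c1 = (r1 * a22 - a12 * r2) / det   # Cramer's rule
    c2 = (a11 * r2 - r1 * a21) / det
    return c1, c2

# Readings generated from c1 = 2.0, c2 = 5.0 are recovered (up to round-off):
r1 = 1.0 * 2.0 + 0.4 * 5.0   # = 4.0
r2 = 0.3 * 2.0 + 1.2 * 5.0   # = 6.6
print(unmix_two(r1, r2))
```

With more sensors than analytes, the same idea becomes a least-squares fit, and the redundancy can also serve the fault-detection role noted later in this chapter.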
Future sensors and their associated data processing elements will need capabilities beyond those required for the simple measurement of process variables, such as periodic self-calibration against known standards, automatic compensation for environmental or other interferences, signal conditioning (including linearization or other variable calculation), and fault recognition and diagnosis. Some of these capabilities are now available to a limited extent. Others will become available with the continued development of integrated sensors and data processing instrumentation. Arrays of sensors have already been mentioned in connection with complex mixture analysis, but they may also be used to provide redundancy, fault detection, and data reconciliation.

Research Opportunities

The availability of high-quality, real-time information on the conditions and composition of the process stream will permit engineers to develop a completely new generation of process control strategies. The physical, chemical, and biological phenomena at the sensor/process interface must be understood and translated into sensor technology. Chemical engineers are well positioned to contribute to the development of improved process sensors in a variety of ways, including

- work in interdisciplinary collaborations with electronic engineers, biologists, analytical chemists, and others to elucidate the biological, chemical, and physical interactions to be measured;
- application of fundamental principles of reaction engineering and transport phenomena to the design of sensor surfaces;
- development of new process control systems and operation strategies in response to improved capabilities for measurement; and
- determination of the implications for process design of wholly new types of process sensors.

PROCESS ENGINEERING INFORMATION MANAGEMENT

In the next decade, competition among industrialized countries will be influenced by the way in which information and knowledge are managed in industry. The challenge is to be the first to find relevant information, to recognize the key elements of that information, and to apply those elements in the manufacture of desired products.

Computer technology will continue to provide new generations of hardware and software for fast information processing and low-cost storage and retrieval.
The use of computers for information management and decision making will be essential, and advanced capabilities in user interfaces and networking will bring new dimensions to this application. For example, current on-line literature search systems allow data sharing among many users, significantly increasing individual productivity. However, this technology is generally used only to manage well-organized data; basic research is needed to apply it to engineering data, which are not as well organized.

A process engineer will need to be able to store and access relevant data rapidly in order to carry out process development and design in less time and to solve problems arising from new and complex design requirements (e.g., designing for the multiple objectives of profitability, safety, reliability, and controllability). To provide for rapid data storage, access, and transfer, new generations of computer hardware (bulk storage, networks, and workstations) and software (databases) will be needed. Some specific research challenges for chemical engineers, working together with computer scientists and information specialists, follow.

Most chemical manufacturing processes in the future will be monitored and controlled by computer. Process data will be collected continuously and stored either locally or in a central database. Research is needed to develop mutually compatible, efficient algorithms for storing and retrieving process data. In addition, computerized procedures will be needed for sifting through voluminous process data for information to use in process improvement and in the generation of new processes.

Methods must be developed for storing the judgments, assumptions, and logical information used in the design and development of processes and models. Procedures and methods will be needed for retrieving and operating on other types of imprecise data.
Efficient transfer of information among engineers and designers will depend on how easily these data can be accessed. The special needs of chemical engineers in this area (the particular ways in which they generate, manage, and use information) merit study.
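The storage-and-retrieval challenge described above can be made concrete with a toy "process historian": timestamped measurements stored by tag and retrieved by time window. The class name, tags, and data are illustrative; real historians add compression, persistence, and data-quality codes.

```python
# Sketch: a minimal in-memory process historian. Measurements are stored
# per tag as time-ordered (timestamp, value) pairs; queries return the
# values within a time window using binary search on the timestamps.
from bisect import bisect_left, bisect_right
from collections import defaultdict

class Historian:
    def __init__(self):
        self._data = defaultdict(list)   # tag -> sorted [(t, value)]

    def record(self, tag, t, value):
        self._data[tag].append((t, value))
        self._data[tag].sort()           # keep time-ordered (fine at this scale)

    def query(self, tag, t0, t1):
        """Return values for tag with t0 <= t <= t1."""
        series = self._data[tag]
        times = [t for t, _ in series]
        lo = bisect_left(times, t0)
        hi = bisect_right(times, t1)
        return [v for _, v in series[lo:hi]]

h = Historian()
for t, v in [(0, 310.2), (60, 311.0), (120, 309.8), (180, 312.4)]:
    h.record("reactor_T", t, v)     # hypothetical reactor temperature tag
print(h.query("reactor_T", 50, 150))
```

The research questions raised in this section begin where such a sketch ends: how to index judgments and assumptions, not just numbers, and how to do so compatibly across tools.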

IMPLICATIONS OF RESEARCH FRONTIERS

The speed and capability of the modern computer, as well as the developing sophistication of chemical engineering design and process control tools, have tremendous implications for the practice of chemical engineering. Chemical engineers of the future will conceptualize and solve problems in entirely new ways.

There are two bottlenecks to the application of these powerful resources, though. First, there are not enough active research groups at the frontiers of computer-assisted process design and control. A larger effort is needed for the field to keep pace with the expanding power of available computers. Second, many chemical engineering departments lack the computational resources needed to fully integrate advances in design and control into their curricula. For the full potential of the computer to be realized in the improved design of chemical products and the improved design and operation of the processes that produce them, chemical engineers must be broadly versed in advanced computer technology. This can happen only if they have access to state-of-the-art computational tools throughout their educational careers, not in an isolated course or two.

Making this broad access to and utilization of the computer in education possible will require substantial government, academic, and industrial funding to provide both hardware and software. In some cases, groups may need remote access to networks of supercomputers. In other cases, dedicated array processors and other computational hardware may be required in the chemical engineering department. If this country is to maintain a leadership role in chemical technology, critical needs for both research support and facilities acquisition must be addressed. The status of funding in this field and a specific initiative to achieve the goals outlined above are discussed in Chapter 10 and Appendix A.