systems. Informally, when we introduce new components to reduce the effects of uncertainty in the environment, we inevitably create increased vulnerability, either to these new components or to other uncertainties in the environment. Since we control the design, careful engineering can use this tradeoff to our advantage, shifting vulnerability from things that are more uncertain to things that are less; explicit models of uncertainty are critical to achieving this. Unfortunately, with increasing complexity, evaluating these tradeoffs can be conceptually and computationally overwhelming.
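One way to make this tradeoff precise, offered here purely as an illustration and not part of the report's own argument, is Bode's sensitivity integral from robust control theory. For a feedback loop with open-loop transfer function L(s) and sensitivity function S(s) = 1/(1 + L(s)), and under mild technical assumptions (L of relative degree at least two), the integral

\int_0^\infty \ln \lvert S(j\omega) \rvert \, d\omega \;=\; \pi \sum_k \operatorname{Re}(p_k) \;\ge\; 0

holds, where the p_k are the unstable poles of L. Reducing sensitivity at some frequencies (where \ln\lvert S \rvert < 0) forces increased sensitivity at others, the so-called waterbed effect: feedback can only redistribute vulnerability across frequencies, never eliminate it outright.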

The earlier section on software engineering discussed how large software development projects require a highly structured approach throughout, since interconnection management dominates component design. While this is, and always will be, a challenging domain, it is still a relatively homogeneous one with limited uncertainty. Complex systems engineering has all of the challenges of software engineering plus heterogeneity (hardware and software plus chemical, electrical, mechanical, fluid, communications, and so on) and greater uncertainty (in the environment and in system components). Complex systems remain even more poorly understood than large software systems.

Complex systems are poorly understood in part simply because nonlinear, heterogeneous, interconnected dynamical systems are intrinsically difficult to model and analyze. More importantly, the role of uncertainty is critical yet very poorly understood, and scaling of problem size can make the interaction of these issues overwhelming. As we will see, control theory addresses uncertainty management explicitly, but from a very narrow perspective. A deeper understanding of complex systems is emerging, but in separate and fragmented technical disciplines.

Finally, there is the “referee effect.” The referee effect comes from the observation that we notice referees only when they do a bad job. Similarly, we notice the details of our watches, televisions, phone systems, cars, planes, networks, and nuclear reactors only when they fail to provide reliable operation and to shield us from the world's uncertainties. In short, the product of a superior design process makes itself virtually invisible. Even when the design is flawed, it may appear to the user that the failure was due to some component rather than to the underlying design process; this is true in all the examples of failures above. The success or failure of components, including computer hardware and software, is relatively easy to understand. The role of the system design process itself, which decides which components to use and how to interconnect them, remains a mystery outside a narrow technical community. Thus complexity in engineering systems is very much in the eye of the beholder. A design engineer may deliberately introduce great complexity specifically to provide the end user with an apparently simple and reliable system. The apparent complexity depends on the viewpoint, and traditionally the only global viewpoint is that of the control engineer.


