Box 4-2

Primer on Concepts and Variables in Systems Thinking

Social systems contain intricate networks of feedback processes, both self-reinforcing (positive) and self-correcting (negative) loops. Failure to focus on feedback in policy design has critical consequences. Suppose the hospital you run faces a deficit, caught between rising costs and increasing numbers of uninsured patients. In response, you might initiate quality improvement programs to boost productivity, announce a round of layoffs, and accelerate plans to offer new high-margin elective surgical services. Your advisors and spreadsheets suggest that these decisions will cut costs and boost income. Problem solved—or so it seems.


Contrary to the open-loop model behind these decisions, the world reacts to our interventions (Figure). There is feedback: our actions alter the environment and, therefore, the decisions we take tomorrow. Our actions may trigger so-called side effects that we did not anticipate. Other agents, seeking to achieve their goals, act to restore the balance that we have upset; their actions also generate intended and unintended consequences. Goals are also endogenous, evolving in response to changing circumstances. For example, we strive to earn more in a quest for greater happiness, but habituation and social comparison rapidly erode any increase in subjective well-being (Kahneman et al., 1999).
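As a rough illustration of this closed-loop idea (not drawn from the original box), the short Python sketch below simulates a decision maker who acts on the gap between goal and state while the goal itself adapts to experience. Every variable name and parameter value here is an assumption chosen only to show the loop structure, not a model from the source. The state rises, but the goal rises with it, so the gap never closes.

def simulate(steps=60, state=60.0, goal=100.0,
             action_gain=0.3, goal_adaptation=0.2, aspiration_margin=40.0):
    """Closed-loop sketch: act on the goal-state gap, let the environment
    respond, and let the goal adapt toward 'a bit more than we have now'."""
    history = []
    for t in range(steps):
        gap = goal - state                      # perceived shortfall
        state += action_gain * gap              # the environment responds to our action
        # Endogenous goal: aspirations chase attainment (habituation / social comparison)
        goal += goal_adaptation * ((state + aspiration_margin) - goal)
        history.append((t, state, goal))
    return history

if __name__ == "__main__":
    for t, state, goal in simulate()[::10]:
        print(f"t={t:2d}  state={state:7.1f}  goal={goal:7.1f}")

With these illustrative parameters the gap between goal and state settles at a roughly constant value while both keep climbing: the pursuit continues indefinitely because the goal is endogenous.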


Policy resistance arises because we do not understand the full range of feedbacks surrounding—and created by—our decisions. The improvement initiatives you mandated never get off the ground because layoffs destroyed morale and increased the workload for the remaining employees. New services were rushed to market before all the kinks were worked out; unfavorable word of mouth causes the number of lucrative elective procedures to fall as patients flock to competitors. More chronically ill patients show up in your ER with complications after staff cuts slashed resources for patient education and follow-up; the additional workload forces still greater cuts in prevention. Stressed by long hours and continual crisis, your most experienced nurses and doctors leave for jobs with competitors, further raising the workload and undercutting quality of care. Hospital-acquired infections and preventable errors increase. Malpractice claims multiply. Yesterday’s solutions become today’s problems.

NOTE: Arrows indicate causation, e.g., our actions alter the environment. Thin arrows show the basic feedback loop through which we seek to bring the state of the system in line with our goals. Policy resistance (thick arrows) arises when we fail to account for the so-called "side effects" of our actions, the responses of other agents in the system (and the unanticipated consequences of these), the ways in which experience shapes our goals, and the time delays often present in these feedbacks.


Ignoring the feedbacks in which we are embedded leads to policy resistance as we persistently react to the symptoms of difficulty, intervening at low leverage points and triggering delayed and distant effects. The problem intensifies, and we react by pulling those same policy levers still harder in an unrecognized vicious cycle. Policy resistance breeds cynicism about our ability to change the world for the better. Systems thinking requires us to see how our actions feed back to shape our environment. The greater challenge is to do so in a way that empowers us rather than reinforces the belief that we are helpless victims of forces that we neither influence nor comprehend.
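To make the vicious-cycle mechanism concrete, the sketch below (again an illustrative assumption, not part of the original box) models a symptomatic fix that relieves today's backlog of problems while quietly eroding the underlying capability that keeps the backlog in check; all names and parameter values are hypothetical. The symptom improves at first, then returns even as the same lever is pulled harder.

def simulate(steps=60, backlog=20.0, capability=1.0,
             inflow=10.0, fix_gain=0.5, erosion=0.01):
    """'Fixes that fail' sketch: the quick fix works today but undermines
    the fundamental capacity to resolve problems tomorrow."""
    rows = []
    for t in range(steps):
        fundamental_fix = 10.0 * capability       # problems resolved by underlying capability
        symptomatic_fix = fix_gain * backlog      # quick fix sized to the visible symptom
        backlog += inflow - fundamental_fix - symptomatic_fix
        backlog = max(backlog, 0.0)
        # The quick fix quietly erodes capability (morale, staffing, prevention)
        capability = max(capability - erosion * symptomatic_fix, 0.0)
        rows.append((t, backlog, capability))
    return rows

if __name__ == "__main__":
    for t, backlog, capability in simulate()[::10]:
        print(f"t={t:2d}  backlog={backlog:6.1f}  capability={capability:5.3f}")

In this toy run the backlog first falls (the fix appears to work), then climbs back as capability erodes and ever more effort goes into firefighting: better before worse.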


Time delays


Time delays in feedback processes are common and particularly troublesome. Most obviously, delays slow the accumulation of evidence. More problematic, the short- and long-run impacts of our policies are often different (smoking gives immediate pleasure, while lung cancer develops over decades). Delays also create instability and fluctuations that confound our ability to learn. Driving a car, drinking alcohol, and building a new semiconductor plant all involve time delays between the initiation of a control action (accelerating/braking, deciding to "have another," the decision to build) and its effects on the state of the system. As a result, decision makers often continue to intervene to correct apparent discrepancies between the desired and actual state of the system even after sufficient corrective action is already in the pipeline, leading to overshoot and oscillation.
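A minimal sketch of this overshoot behavior, with assumed names and parameter values rather than anything taken from the source: each corrective action takes several steps to affect the system, and because the decision maker keeps reacting to the still-visible gap while earlier corrections are in transit, the state overshoots the goal and oscillates before settling.

from collections import deque

def simulate(steps=40, goal=100.0, state=60.0, gain=0.3, delay=4):
    """Goal-seeking loop with a fixed pipeline delay on corrective actions."""
    pipeline = deque([0.0] * delay)       # corrections already taken but not yet felt
    trajectory = []
    for t in range(steps):
        gap = goal - state                # the decision maker sees only the current gap...
        correction = gain * gap           # ...and ignores corrections still in the pipeline
        pipeline.append(correction)
        state += pipeline.popleft()       # the oldest correction finally takes effect
        trajectory.append((t, state))
    return trajectory

if __name__ == "__main__":
    for t, s in simulate():
        print(f"t={t:2d}  state={s:6.1f}")

With these illustrative values the state climbs past the goal of 100, peaks well above it, and then oscillates with diminishing amplitude; raising the gain or lengthening the delay makes the fluctuations larger and, eventually, unstable.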


