Thinking in systems

Ronald Kaiser
Mar 1, 2020
Design by Ronald Kaiser

Last week I finished reading “Thinking in Systems: A Primer” by Donella H. Meadows. It’s definitely worth reading if you are looking for books that change the way you look at the world. This is the kind of book you would probably want to reread after living by its concepts for a while.

Below is a summary of the most important and interesting points.

Part One — System structure and behavior

I. Basics

  • A system consists of: elements, interconnections and a function (purpose).
  • Systems can be embedded in systems.
  • Many of the interconnections in systems operate through the flow of information.
  • Keeping sub-purposes and overall system purposes in harmony is an essential function of successful systems.
  • Purpose is deduced from behavior.
  • Self-preservation is an important function of almost every system.
  • Changing elements < Changing interconnections < Changing Purpose.
  • Stocks: elements of the system that you can see, feel, count, or measure at any given time. A stock is the memory of the history of changing flows within the system.
  • Dynamic equilibrium: inflow = outflow.
  • The human mind seems to focus more easily on stocks than on flows. On top of that, when we do focus on flows, we tend to focus on inflows more easily than on outflows.
  • A stock can be increased by decreasing its outflow rate as well as by increasing its inflow rate.
  • Flows take time to flow.
  • Stocks act as delays or buffers or shock absorbers in systems. Stocks allow inflows and outflows to be decoupled and temporarily out of balance with each other.
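The stock-and-flow bookkeeping above can be sketched as a single update rule (the bathtub numbers here are illustrative, not from the book):

```python
def step(stock, inflow, outflow, dt=1.0):
    """A stock changes only through its flows; it is the memory of past flows."""
    return stock + (inflow - outflow) * dt

# Dynamic equilibrium: when inflow equals outflow, the stock holds steady.
level = 100.0
for _ in range(10):
    level = step(level, inflow=5.0, outflow=5.0)
# The same stock can be raised by slowing the drain as well as by opening the tap.
```

Note that `step(100, 5, 3)` and `step(100, 7, 5)` raise the stock by the same amount, which is the point about outflows being just as powerful as inflows.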

Everything we do as individuals, as an industry, or as a society is done in the context of an information-feedback system.

  • Feedback loop: when changes in a stock affect the flows into or out of that same stock.
  • Balancing feedback loop (negative): source of stability and resistance to change.
  • The presence of a feedback mechanism doesn’t necessarily mean that the mechanism works well.
  • Reinforcing feedback loop (positive): self-enhancing, leading to exponential growth or to runaway collapses.
  • Systems with similar feedback structures produce similar dynamic behaviors.
  • If A causes B, is it possible that B also causes A?
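The two loop types can be sketched with toy update rules (the gains are invented for illustration): a reinforcing loop feeds on its own stock, a balancing loop feeds on the gap from a goal.

```python
def reinforcing(stock, gain, steps):
    """Reinforcing loop: the flow is proportional to the stock itself."""
    for _ in range(steps):
        stock += gain * stock  # more stock -> more inflow -> more stock
    return stock

def balancing(stock, goal, strength, steps):
    """Balancing loop: the flow is proportional to the gap from a goal."""
    for _ in range(steps):
        stock += strength * (goal - stock)  # the gap shrinks every step
    return stock
```

The reinforcing stock grows exponentially without limit; the balancing stock closes a fixed fraction of its remaining gap each step, approaching the goal from either side, which is why balancing loops are sources of stability.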

II. A brief visit to the Systems Zoo

Thermostat — a stock with two competing balancing loops

  • Heat leaks out of the warm room to the cool outdoors (insulation is imperfect). It’s like trying to keep a bucket full when there’s a hole in the bottom.
  • The information delivered by a feedback loop can only affect future behavior.
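A minimal sketch of the two competing loops (the coefficients are invented, not from the book):

```python
def thermostat(room, setting, outside, heat_gain=0.3, leak=0.1, steps=50):
    """Furnace loop pushes the room toward the setting; the leak loop
    pulls it toward the outdoor temperature. Both are balancing loops."""
    for _ in range(steps):
        heating = heat_gain * (setting - room) if room < setting else 0.0
        leakage = leak * (room - outside)
        room += heating - leakage
    return room

final = thermostat(room=10.0, setting=18.0, outside=5.0)
# The room settles below the setting: the furnace only reacts to the gap,
# while the leak never stops draining heat (the bucket-with-a-hole effect).
```

With these numbers the room converges to about 14.75 degrees, the point where the two loops exactly cancel, not to the 18-degree setting.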

Population — a stock with one reinforcing loop (fertility) and one balancing loop (mortality)

  • Complex behaviors of systems often arise as the relative strengths of feedback loops shift, causing first one loop and then another to dominate behavior.
  • Shifting dominance: when loops change their order of impact.
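A sketch of shifting dominance with invented rates: the birth loop dominates while the population is small, then crowding strengthens the death loop until it balances the births.

```python
def population(pop, fertility=0.04, base_mortality=0.01, crowding=1e-6, years=400):
    """Births form a reinforcing loop; deaths form a balancing loop whose
    strength grows with crowding, so dominance shifts as the stock rises."""
    for _ in range(years):
        births = fertility * pop
        deaths = (base_mortality + crowding * pop) * pop
        pop += births - deaths
    return pop
```

With these numbers the population climbs roughly exponentially at first and then levels off just under 30,000, the density at which the per-capita death rate has risen to meet the birth rate.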

Business inventory — system with delays

  • Types of delay: perception delay (takes time to realize the problem), response delay (takes time to act on it), delivery delay (takes time to have the results delivered).
  • A delay in a balancing feedback loop makes a system likely to oscillate.
  • Reacting too fast can be bad (can cause oscillations).
  • Look for delays you don’t control.
  • Changing the length of a delay may make a large change in the behavior of a system. Increasing the delay can avoid oscillations.
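A sketch with invented numbers: a delivery delay plus an aggressive ordering rule (ordering the full shortfall every step, ignoring what is already in the pipeline) makes the inventory overshoot and oscillate.

```python
from collections import deque

def inventory(steps=60, delay=5, target=100.0, sales=10.0):
    """Orders placed now arrive `delay` steps later; ordering the whole
    perceived gap every step double-counts the pipeline and oscillates."""
    stock = 100.0
    pipeline = deque([0.0] * delay)  # goods in transit
    history = []
    for _ in range(steps):
        stock += pipeline.popleft() - sales
        order = max(0.0, sales + (target - stock))  # replace sales + close gap
        pipeline.append(order)
        history.append(stock)
    return history
```

Accounting for the supply line (subtracting `sum(pipeline)` from the perceived gap) or reacting more gradually damps the swings, which is the chapter’s point about reacting too fast to delayed information.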

Oil economy — renewable stock constrained by a nonrenewable stock

  • Any physical, growing system is going to run into some kind of constraint, sooner or later. It will take the form of a balancing loop that in some way shifts the dominance of the reinforcing loop driving the growth behavior.
  • Growth in a constrained environment produces the “limits-to-growth” archetype.
  • The higher and faster you grow, the farther and faster you fall, when you’re building up a capital stock dependent on a nonrenewable resource.
  • The real choice in the management of a nonrenewable resource is whether to get rich very fast or to get less rich but stay that way longer.
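A sketch with invented coefficients: capital reinvestment is the reinforcing loop, and the falling yield of a depleting resource is the balancing loop that eventually takes over.

```python
def oil_economy(resource=1000.0, capital=5.0, years=100):
    """Extraction grows with capital but thins as the resource depletes,
    producing the boom-and-bust profile of a nonrenewable stock."""
    extraction_history = []
    for _ in range(years):
        yield_per_unit = resource / 1000.0             # depletion lowers yield
        extraction = capital * yield_per_unit
        resource -= extraction
        capital += 0.10 * extraction - 0.05 * capital  # reinvest minus depreciation
        extraction_history.append(extraction)
    return extraction_history
```

The faster capital builds, the sooner the peak and the steeper the fall afterward; reinvesting less stretches the same resource over more years, which is the “rich fast vs. rich longer” choice.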

Fishing economy — renewable stock constrained by a renewable stock

  • The regeneration rate of fish is not linear. If the population is very dense, its reproduction rate is near zero, limited by available food and habitat. If the fish population falls a bit, it can regenerate at a faster rate, because it can take advantage of unused nutrients or space in the ecosystem.
  • Nonrenewable resources are stock-limited.
  • Renewable resources are flow-limited.
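A sketch using a logistic regeneration curve (parameters invented): regeneration is near zero at carrying capacity and fastest at intermediate density, so a moderate harvest settles at a new equilibrium instead of emptying the stock.

```python
def fishery(fish=500.0, capacity=1000.0, regen=0.3, harvest=0.2, years=200):
    """Renewable stock: the sustainable catch is limited by the
    regeneration *flow*, not by the size of the stock itself."""
    for _ in range(years):
        regeneration = regen * fish * (1.0 - fish / capacity)  # nonlinear
        fish += regeneration - harvest * fish
    return fish
```

The stock settles where regeneration exactly replaces the catch (about a third of capacity with these numbers); push `harvest` above `regen` and the only equilibrium left is zero, i.e. collapse.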

Part Two — Systems and us

III. Why systems work so well

Properties of highly functional systems: resilience, self-organization, hierarchy.

  • Resilience: ability to recover.
  • Resilience arises from a rich structure of many feedback loops that can work in different ways to restore a system even after a large perturbation.
  • There are always limits to resilience.
  • Resilience ≠ static stability.
  • Self-organization: ability to learn, diversify, complexify, evolve.
  • Self-organization produces heterogeneity and unpredictability. It is likely to come up with whole new ways of doing things. It requires freedom, experimentation, and a certain amount of disorder.
  • Hierarchy: arrangement of systems and subsystems.
  • Hierarchies reduce the amount of information that any part of the system has to keep track of.
  • People in the same university department talk to each other more than they talk to people in other departments. If you have a liver disease, a doctor can usually treat it without paying much attention to your DNA or your personality. Take a step back to consider the whole hierarchy, though; there are exceptions to this.
  • Suboptimization: when a subsystem’s goals dominate at the expense of the total system’s goals.
  • Hierarchical systems evolve from the bottom up. The purpose of the upper layers is to serve the purposes of the lower layers.

IV. Why systems surprise us

  • Everything we think we know about the world is a model.
  • Events are the most visible aspect of a larger complex — but not always the most important.
  • Much analysis in the world goes no deeper than events.
  • Most economic analysis goes one level deeper, to behavior over time.
  • The problem with behavior-based models: if something changes in the system’s structure, the behavior-level analysis won’t help, because it depended on the previous structure.
  • Behavior-based econometric models are pretty good at predicting the near-term performance of the economy, quite bad at predicting the longer-term performance, and terrible at telling one how to improve the performance of the economy.
  • Linear vs. nonlinear: many system surprises come from nonlinear relationships, where the effect is not proportional to the cause.
  • Everything is connected to everything else. There is no clearly determinable boundary between the sea and the land, between sociology and anthropology, between an automobile’s exhaust and your nose. There are only boundaries of word, thought, perception, and social agreement — artificial, mental-model boundaries.
  • Mixed-up borders are sources of diversity and creativity.
  • There is no single, legitimate boundary to draw around a system. We have to invent boundaries for clarity and sanity; and boundaries can produce problems when we forget that we’ve artificially created them.
  • It is a challenge to stay creative enough to drop the boundaries that worked for the last problem and to find the most appropriate set of boundaries for the next question. It’s a necessity, if problems are to be solved well.
  • At any given time, the input that is most important to a system is the one that is most limiting.
  • There always will be limits to growth. They can be self-imposed. If they aren’t, they will be system-imposed.
  • Ubiquitous delays: Estimate and multiply by 3.
  • Bounded rationality: people make quite reasonable decisions based on the information they have, but they don’t have perfect information. Also, we tend to pay less attention to information that doesn’t fit our mental models. Look for better, more complete and timelier information and enlarge your view of the system.
  • The bounded rationality of each actor in a system may not lead to decisions that further the welfare of the system as a whole.

V. System Traps…and opportunities

Policy resistance — fixes that fail

  • Problem: Policy resistance comes from the bounded rationalities of the actors in a system, each with his or her own goals. Any new policy, especially if it’s effective, just pulls the stock farther from the goals of other actors and produces additional resistance, with a result that no one likes but that everyone expends considerable effort in maintaining.
  • Solution: Let go. Bring in all the actors and use the energy formerly expended on resistance to seek out mutually satisfactory ways for all goals to be realized — or redefinitions of larger and more important goals that everyone can pull toward together.

The tragedy of the commons

  • Problem: when there is a commonly shared resource, every user benefits directly from its use, but shares the costs of its abuse with everyone else. Therefore, there is a very weak feedback from the condition of the resource to the decisions of the resource users. The consequence is overuse of the resource, eroding it until it becomes unavailable to anyone.
  • Solution: Educate and exhort the users, so they understand the consequences of abusing the resource. Also, restore or strengthen the missing feedback link, either by privatizing the resource so each user feels the direct consequences of its abuse or by regulating the access of all users to the resource.
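A sketch with invented rates: because each user’s take ignores the state of the commons, the combined draw outruns regeneration and the resource collapses.

```python
def commons(resource=1000.0, users=10, take=0.02, regen=0.05, years=100):
    """Ten users each take 2% per year (20% combined) from a resource
    that regrows only 5% per year; no feedback slows the taking."""
    history = []
    for _ in range(years):
        resource += regen * resource - users * take * resource
        history.append(resource)
    return history
```

Restoring the missing feedback, e.g. scaling `take` down as `resource` falls (a quota, a price, ownership), is exactly the “strengthen the feedback link” fix described above.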

Drift to low performance

  • Problem: allowing performance standards to be influenced by past performance, especially if there is a negative bias in perceiving the past performance, sets up a reinforcing feedback loop of eroding goals that sets a system drifting toward low performance.
  • Solution: Keep performance standards absolute. Even better, let standards be enhanced by the best actual performances instead of being discouraged by the worst. Use the same structure to set up a drift toward high performance!
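The eroding-goals loop can be sketched in a few lines (the weights are invented): the standard chases a pessimistic reading of performance, and performance then sinks toward the lowered standard.

```python
def drift_to_low_performance(goal=100.0, performance=100.0,
                             pessimism=0.9, erosion=0.2, steps=50):
    """Two coupled drifts: the goal erodes toward perceived performance,
    and performance settles toward the (now lower) goal."""
    for _ in range(steps):
        perceived = pessimism * performance        # bad news weighted heavier
        goal += erosion * (perceived - goal)       # the standard slips
        performance += 0.5 * (goal - performance)  # effort only chases the standard
    return goal, performance
```

Set the perception bias above 1.0, anchoring the standard to the best performance instead of a discounted reading, and the same structure drifts upward, which is the fix the book suggests.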

Escalation

  • Problem: When the state of one stock is determined by trying to surpass the state of another stock, then there is a reinforcing feedback loop carrying the system into an arms race, a wealth race, a smear campaign, escalating loudness, escalating violence. The escalation is exponential and can lead to extremes surprisingly quickly. If nothing is done, the spiral will be stopped by someone’s collapse — because exponential growth cannot go on forever.
  • Solution: Avoid getting in. If caught in an escalating system, one can refuse to compete (unilaterally disarm), thereby interrupting the reinforcing feedback loop. Or one can negotiate a new system with balancing loops to control the escalation.
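Escalation can be sketched as two stocks, each setting itself a bit above the other (the 10% edge is illustrative):

```python
def escalation(a=10.0, b=10.0, edge=1.1, rounds=20):
    """Each side's 'goal' is the other side's stock plus a margin,
    which couples two balancing loops into one reinforcing spiral."""
    for _ in range(rounds):
        a = edge * b  # A responds to B's last move
        b = edge * a  # B responds to A's new level
    return a, b
```

Each full round multiplies both stocks by `edge` squared, so the race is exponential; setting `edge` below 1.0 turns the same structure into a de-escalating spiral, which is the negotiated-balancing-loop solution.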

Success to the successful

  • Problem: If the winners of a competition are systematically rewarded with the means to win again, a reinforcing feedback loop is created by which, if it is allowed to proceed uninhibited, the winners eventually take all, while the losers are eliminated.
  • Solution: Diversification and policies that level the playing field.

Shifting the burden to the intervenor

  • Problem: shifting the burden, dependence, and addiction arise when a solution to a systemic problem reduces the symptoms, but does nothing to solve the underlying problem. The system will become more and more dependent on the intervention and less and less able to maintain its own desired state.
  • Solution: Avoid getting in. Beware of symptom-relieving or signal-denying policies or practices that don’t really address the problem. Take the focus off short-term relief and put it on long-term restructuring.

Rule beating

  • Problem: Perverse behavior that gives the appearance of obeying the rules or achieving the goals, but that actually distorts the system.
  • Solution: Design better rules to release creativity not in the direction of beating the rules, but in the direction of achieving the purpose of the rules.

Seeking the wrong goal

  • Problem: If the goals are defined inaccurately or incompletely, the system may obediently work to produce a result that is not really intended or wanted.
  • Solution: Specify indicators and goals that reflect the real welfare of the system. Be especially careful not to confuse effort with result or you will end up with a system that is producing effort, not result.

Part Three — Creating change

VI. Leverage points — Places to intervene in a system

  • 12. Numbers — don’t overthink them. Probably 99 percent of our attention goes to parameters, but there’s not a lot of leverage in them.
  • 11. Buffers — big stocks (relative to flows) are more stable than small ones. Usually physical and not easy to change.
  • 10. Stock-and-flow structures — the only way to fix a system that is laid out poorly is to rebuild it, if you can.
  • 9. Delays — a system just can’t respond to short-term changes when it has long-term delays. If there is a delay in your system that can be changed, changing it can have big effects.
  • 8. Balancing feedback loops — any loop of this type needs a goal, a monitoring and signaling device, and a response mechanism. One of the big mistakes we make is to strip away these “emergency” response mechanisms because they aren’t often used and they appear to be costly. The strength of a balancing feedback loop matters relative to the impact it is designed to correct.
  • 7. Reinforcing feedback loops — loops of this type are sources of growth, explosion, erosion, and collapse in systems. A system with an unchecked reinforcing loop ultimately will destroy itself.
  • 6. Information flows — missing information flows are one of the most common causes of system malfunction. There is a systematic tendency on the part of human beings to avoid accountability for their own decisions.
  • 5. Rules — if you want to understand the deepest malfunctions of systems, pay attention to the rules and to who has power over them.
  • 4. Self-organization — encouraging variability and experimentation and diversity means “losing control”.
  • 3. Goals — growth is the goal of a cancer too. Be careful when choosing yours.
  • 2. Paradigms — sources of systems. The ancient Egyptians built pyramids because they believed in an afterlife. No physical limitation is imposed on changing paradigms, although it can be hard to do it. Tip: build a model of the system, so you can see the whole, outside of it.
  • 1. Transcending paradigms — no paradigm is true. There is no certainty in any worldview.

VII. Living in a world of systems

  • We can’t control systems, but we can dance with them.
  • Get the beat of the system — watch how it behaves before you disturb it. Watch the facts, not theories. Define the problem by the system’s behavior, not through the lack of our favorite solution (“The problem is, we need to find more oil”).
  • Expose your mental models to the light of day — the more you work on defining your model and playing with it, the clearer your thinking will become, the faster you’ll admit your uncertainties and correct your mistakes, and the more flexible you’ll learn to be. Invite others to challenge your assumptions.
  • Honor, respect, and distribute information — most of what goes wrong in systems is caused by biased, late, or missing information. Information is power.
  • Use language with care and enrich it with systems concepts — keep information as clear as possible and expand your language to talk about complexity (the way Eskimos have many words for snow).
  • Pay attention to what is important, not just what is quantifiable — pretending something doesn’t exist because it’s hard to quantify leads to faulty models.
  • Make feedback policies for feedback systems — the best policies design learning into the management process.
  • Go for the good of the whole — hierarchies exist to serve the bottom layer, not the top. Don’t maximize parts of systems or subsystems while ignoring the whole.
  • Listen to the wisdom of the system — aid and encourage the forces and structures that help the system run itself. Before you charge in to make things better, pay attention to the value of what’s already there.
  • Locate responsibility in the system — look for the ways the system creates its own behavior. Design a system for intrinsic responsibility.
  • Stay humble, stay a learner — make mistakes and admit them.
  • Celebrate complexity — the universe is messy, turbulent, nonlinear, and dynamic.
  • Expand time horizons — you’d be a fool to keep your head down and look just at the next step in front of you. You’d be equally a fool to peer far ahead and never notice what’s immediately under your feet. You need to be watching both the short and the long term — the whole system.
  • Defy the disciplines — admit ignorance and be willing to be taught by experts in other disciplines and by the system.
  • Expand the boundary of caring — the real system is interconnected.
  • Don’t erode the goal of goodness — don’t weight the bad news more heavily than the good. Keep standards absolute.
