Level 2 · Module 8: Systems Thinking · Lesson 1

Why Fixing One Thing Breaks Another

story · human-nature · groups-power

In any system where parts are connected, changing one part doesn't just affect that part — it ripples through the whole system. Most failures of planning come not from bad intentions but from failing to ask: 'And then what happens?'

Building On

Perverse incentives and unintended consequences

The Hanoi rat bounty showed how a well-intentioned incentive produced the opposite of its intended result. This lesson expands that idea: it's not just incentives that backfire. Any change to a connected system can produce unintended consequences, because the parts of a system don't sit still when you move one of them.

Institutional self-preservation and delay

The water test story showed how institutions respond to problems by protecting their image. This lesson looks at what happens when institutions try to fix problems without understanding the system — they often create new problems, which they then have to manage or hide.

You've probably been told some version of this: see a problem, fix the problem. If the cafeteria line is too long, add another line. If students are failing a test, make the test easier. If traffic is bad on one road, build a new road. Simple, right?

Except it almost never works that simply. Add another cafeteria line and now the seating area is overcrowded and kids eat in the hallways. Make the test easier and grade inflation means next year's teacher can't tell who actually understands the material. Build a new road and more people start driving because the route looks faster, until both roads are jammed.

This isn't bad luck. It's what happens in systems — networks of connected parts where everything affects everything else. Learning to think about connections before you act is one of the most valuable skills you can develop, because the world is made of systems, and people who don't understand systems are constantly surprised by the consequences of their own actions.

The Meadowbrook Speed Problem

Meadowbrook was a quiet neighborhood where families liked to walk, bike, and let their kids play outside. But there was a problem: cars were speeding down Elm Street, the main road through the neighborhood. Parents were worried. A seven-year-old named Lily had nearly been hit crossing the street to her friend's house.

The neighborhood council held a meeting. Residents were angry and scared. They demanded that the city do something. The city's traffic department responded quickly — within a month, they installed four speed bumps on Elm Street. The council celebrated. Problem solved.

For about three weeks, it was solved. Cars on Elm Street slowed down significantly. But then something started happening on Oak Street — the quiet residential road one block over.

Drivers who used Elm Street as a shortcut to the highway didn't like the speed bumps. They found an alternate route: turn onto Oak Street, cut through the parking lot behind the grocery store, and rejoin the main road past all the bumps. Within two months, Oak Street — which had almost no traffic before — was carrying twice as many cars, many of them speeding. Oak Street had no speed bumps, no crossing guards, and no sidewalks on the south side.

A boy named Jerome lived on Oak Street. He'd always ridden his bike in the road because there was barely any traffic. Now cars were flying past his house all day. His mom started driving him the three blocks to school because she no longer felt it was safe for him to bike.

Back on Elm Street, the speed bumps had another unintended effect. Emergency vehicles were being slowed down. A fire truck responding to a kitchen fire on Elm Street lost almost a minute navigating the bumps. An ambulance crew reported that the bumps jolted a patient on a stretcher. The fire department filed a formal complaint with the city.

Meanwhile, the grocery store whose parking lot had become a cut-through was furious. Their lot was full of non-customers speeding through, and two shopping carts had been hit. They threatened to install barriers, which would have pushed the cut-through traffic onto yet another street.

At the next council meeting, a retired engineer named Mr. Okafor stood up and said something that silenced the room: "We didn't solve a problem. We moved it. Elm Street is quieter, but Oak Street is more dangerous than Elm Street ever was. Emergency vehicles are slower. The grocery store is dealing with a mess it didn't create. And we're about to push the problem onto a third street. Every fix we've tried has made the system worse because we never looked at the system — we only looked at Elm Street."

A girl named Priya, who was eleven and had come to the meeting with her dad, raised her hand. "So what should we do? Just let the cars speed?" Mr. Okafor said, "No. But before we do anything else, we need to understand why the cars are on Elm Street in the first place. What's the actual system here?"

It turned out that the real issue was a poorly timed traffic light on the main highway. Drivers were cutting through Meadowbrook to avoid a three-minute wait at that light. The speed bumps didn't address the reason drivers were there — they just made Elm Street unpleasant enough to push drivers to Oak Street instead. Once the city retimed the highway traffic light, cut-through traffic in the whole neighborhood dropped by sixty percent. It wasn't a perfect fix, but it addressed the cause, not just the symptom.

System
A set of connected parts that affect each other. A neighborhood's roads are a system. A school is a system. A family is a system. When you change one part, other parts respond.

Unintended consequence
A result of an action that wasn't planned or expected. Not all unintended consequences are bad, but the damaging ones usually come from failing to trace how a change will ripple through connected parts.

Problem displacement
When a 'fix' doesn't eliminate a problem but moves it somewhere else — often somewhere less visible, which makes it look like the fix worked.

Root cause
The underlying reason a problem exists, as opposed to the symptom you can see. The speeding cars on Elm Street were a symptom. The poorly timed traffic light was the root cause.

Ask: 'Did the speed bumps work?' This is a trick question, and the answer your child gives will tell you a lot about how they currently think. If they say yes — the bumps did slow traffic on Elm Street. If they say no — the bumps made the overall situation worse. Both answers are partially correct, and that's the point. The speed bumps worked for Elm Street. They failed for the system. And this distinction — between solving a local problem and solving a system problem — is the core of this lesson.

Trace the chain with your child. This is a critical skill to practice explicitly. Start with the action: speed bumps installed on Elm Street. Then ask, 'And then what happened?' Drivers avoided Elm Street. 'And then what happened?' They found Oak Street. 'And then what happened?' Oak Street became dangerous. 'And then what happened?' Jerome couldn't bike to school. Each step is logical and predictable if you think about it in advance. But the people who installed the speed bumps didn't trace the chain. They stopped at step one: 'Elm Street will be slower.' That's where most people stop.

Ask: 'Why didn't the city see this coming?' Not because they were stupid. The traffic engineers were smart, experienced professionals. They didn't see it because they were focused on one part of the system — Elm Street — instead of the whole system. This is the default way humans think: we zoom in on the problem we can see and ignore the connections to everything around it. Systems thinking is the deliberate effort to zoom out and ask, 'What is this connected to, and how will those connections respond?'

Connect this to the incentive framework from Module 1. The drivers had an incentive: get to the highway faster. The speed bumps didn't change that incentive — they just blocked one path. So the drivers optimized around the obstacle, exactly as the rat farmers optimized around the bounty rules. When you block a behavior without addressing the incentive behind it, people will find another way to satisfy the incentive. This is a universal pattern.

Ask: 'What did Mr. Okafor mean when he said they moved the problem instead of solving it?' Problem displacement is one of the most common failures in systems. A school that bans phones in classrooms but not in hallways displaces the distraction. A parent who forbids a behavior in one context but doesn't address the underlying desire displaces the behavior into a context they can't see. Whenever you see a problem disappear from one place, ask where it went.

Now ask the key question: 'What's the difference between fixing a symptom and fixing a root cause?' The speeding cars were a symptom. The root cause was the traffic light timing that made cutting through Meadowbrook faster than staying on the highway. The speed bumps treated the symptom. Retiming the light treated the cause. This distinction matters in every area of life. If you're tired all the time, the symptom is tiredness — but the root cause might be staying up too late, or anxiety, or poor nutrition. Drinking more coffee treats the symptom. Fixing your sleep schedule treats the cause.

The takeaway principle: before you fix anything, ask three questions. First: what is this connected to? Second: if I change this, what else will change? Third: am I treating a symptom or a root cause? These three questions won't guarantee a perfect solution, but they'll prevent the most common and predictable mistakes.

Watch for problem displacement in the world around you. When a school, a family, or a city announces that they've 'solved' a problem, ask where the pressure went. Did the problem actually disappear, or did it move somewhere less visible? You'll start to notice that many so-called solutions are actually displacements — the speeding cars just moved to the next street, the banned behavior just moved to a different context, the cost just shifted to someone who wasn't at the table when the decision was made.

Before proposing or supporting a solution to any problem, trace the connections. Ask 'and then what happens?' at least three times. Identify who else is affected, what incentives are at play, and whether you're addressing the symptom or the root cause. You won't always get it right — systems are complicated and no one has perfect foresight. But the habit of asking these questions will save you from the most predictable failures. The goal isn't to avoid acting. It's to act with your eyes open.

Prudence

Prudence demands that we look beyond the immediate fix and ask what else might change. The imprudent person solves the problem in front of them and is surprised when new problems appear. The prudent person traces the connections first — not to avoid acting, but to act wisely.

This lesson could lead a child to believe that nothing should ever be done because every action has unintended consequences. That's paralysis, not wisdom. The point isn't that action is futile — it's that action should be informed by an understanding of connections. The speed bumps weren't wrong because the city tried to fix the problem. They were wrong because the city fixed only one part of the system without considering the rest. Mr. Okafor doesn't say 'do nothing.' He says 'understand the system before you act.' There's a world of difference between those two positions.

  1. Why did the speed bumps make Elm Street safer but the neighborhood less safe overall?
  2. What does Mr. Okafor mean when he says they 'moved the problem' instead of solving it?
  3. Why didn't the city engineers think about what would happen on Oak Street?
  4. Can you think of a time when fixing one thing in your life created a new problem somewhere else?
  5. What's the difference between a symptom and a root cause? Why does it matter which one you fix?

The Chain Reaction Trace

  1. Pick one of these scenarios (or create your own): (a) A school bans candy from the cafeteria to improve student health. (b) A family gets a puppy to make their lonely child happier. (c) A city closes a polluting factory to clean up the river.
  2. For your chosen scenario, trace the chain of consequences. Start with the action and ask 'and then what happens?' at least five times. Write each step.
  3. Identify at least one unintended consequence that the original decision-makers probably didn't think about.
  4. Ask: is this fix addressing a root cause or a symptom? If it's a symptom, what might the root cause be?
  5. Now propose a better approach — one that addresses the root cause and accounts for at least some of the connections you identified.
  6. Discuss your chain with your parent. Did they see consequences you missed? Did you see some they missed? That's normal — no one person sees the whole system, which is why the best decisions come from people who are willing to ask others what they might be missing.

  1. What is a system, and why does changing one part affect other parts?
  2. In the story, what happened when speed bumps were added to Elm Street?
  3. What is problem displacement?
  4. What was the root cause of the speeding problem in Meadowbrook?
  5. What three questions should you ask before trying to fix a problem?

This lesson introduces systems thinking — arguably the most practically valuable intellectual framework in the entire curriculum. The story is deliberately set in a familiar context (a neighborhood traffic problem) because systems thinking can feel abstract and this grounds it in something a child can visualize. The key pedagogical move is the 'and then what happens?' chain. Practice this explicitly and repeatedly. It's a thinking habit that transfers to every domain — schoolwork, friendships, family decisions, eventually career and civic life.

The most common resistance you'll encounter is the feeling that thinking this way is 'overthinking.' Help your child see that it's not overthinking — it's just thinking further than the first step. Most people stop at step one, which is why most solutions create new problems.

One note: be careful not to turn this into analysis paralysis. The lesson emphasizes that the goal is informed action, not inaction. Mr. Okafor doesn't say 'do nothing.' He says 'understand the system first.' That distinction matters, and it's worth reinforcing.
