Level 2 · Module 1: Incentives Run the World · Lesson 4
Why Good People Do Bad Things in Bad Systems
When a system is badly designed, it doesn't just fail — it turns good people into participants in bad outcomes. A teacher who cares about students might start teaching to the test instead of teaching to learn, because the system rewards test scores. A doctor who cares about patients might rush through appointments because the system rewards volume over quality. A factory worker who cares about safety might stay quiet about problems because the system punishes anyone who slows down production. The problem is not that these people are bad. The problem is that the system has made doing the right thing expensive and doing the wrong thing easy.
Building On
The previous lesson showed how badly designed incentives can lead to absurd outcomes. This lesson takes it further: when entire systems are built around bad incentives, even good people start doing things they'd never do on their own. The problem isn't that people are bad — it's that the system makes bad behavior the rational choice.
The three-layer model explains why bad systems are so powerful: they usually align all three layers against doing the right thing. The material incentive rewards going along. The social incentive punishes anyone who speaks up. And over time, even the internal incentive — conscience — gets worn down because 'everyone is doing it.'
Why It Matters
It would be nice if the world were simple enough that good people always did good things and bad people always did bad things. But the real world is more complicated. Systems — the rules, rewards, and structures that shape how organizations work — have enormous power to shape behavior. A well-designed system makes it easy for ordinary people to do good work. A badly designed system makes it hard for even the best people to do the right thing.
Understanding this matters because you will spend your whole life inside systems — schools, teams, workplaces, communities. Some of those systems will be well-designed, and you'll be able to do good work without much friction. Others will be badly designed, and you'll feel the pull to cut corners, go along with problems, or look the other way. Recognizing when a system is pulling you toward bad behavior is the first step toward resisting it.
This lesson also matters because it teaches you not to be too quick to judge individuals without looking at the system they're operating in. When you see someone doing something wrong, the first question shouldn't always be 'What's wrong with that person?' Sometimes the better question is 'What's wrong with the system that makes that behavior rational?'
A Story
The Hospital That Punished Honesty
Riverside Community Hospital had a problem. Every time a nurse or doctor made a mistake — gave the wrong dose of medicine, mixed up a patient's chart, forgot to wash their hands — the hospital's policy was to punish whoever made the error. They'd get written up, face a disciplinary hearing, and sometimes get suspended or fired. The idea seemed reasonable: punish mistakes, and people will make fewer of them.
Dr. Anita Patel was a new resident at Riverside. She was smart, careful, and genuinely cared about her patients. In her first month, she noticed something strange. Nobody ever reported mistakes. The error log was almost empty. At first she thought Riverside must be an unusually good hospital. Then she started paying attention.
One afternoon, a senior nurse named Marcus quietly fixed a medication mix-up without telling anyone. Anita saw it happen and asked him about it afterward. Marcus looked around to make sure no one was listening. 'You never report an error here,' he said. 'If you do, you get punished. So people fix things quietly, and nobody writes anything down.'
Anita was troubled. She knew that in a good hospital, errors get reported — not to punish people, but to figure out what went wrong and prevent it from happening again. If a nurse gives the wrong dose because the labels on two medicines look almost identical, the fix isn't to punish the nurse. The fix is to change the labels. But you can only change the labels if someone reports the problem.
At Riverside, the punishment system had created a wall of silence. Good nurses and doctors — people who genuinely cared about patients — were hiding their mistakes because the system made honesty dangerous. And because mistakes were hidden, the same errors kept happening. The confusing medicine labels never got changed. The too-similar patient chart formats never got redesigned. The system was actually making the hospital less safe, not more.
Anita tried to raise the issue at a staff meeting. She suggested that the hospital adopt a 'no-blame reporting' system, where people could report errors without being punished, so the hospital could learn from them and fix the root causes. The hospital administrator, Mr. Keller, shut her down immediately. 'If we don't punish errors,' he said, 'people will get sloppy.' Several senior staff nodded in agreement.
Anita looked around the room. She could see it on their faces: Marcus, and at least a dozen other nurses and doctors, knew the current system was broken. But nobody spoke up to support her. The social incentive — don't make waves, don't challenge the boss — was too strong.
Six months later, a serious error happened. A patient received the wrong blood type during a transfusion because two patient charts had been accidentally swapped. The patient survived, but barely. An outside investigation found that the same chart-swapping problem had happened at least four times before — but it had never been reported because the nurses who caught it were afraid of being punished.
The investigation led to exactly the changes Anita had suggested: a no-blame reporting system, redesigned chart formats, and new procedures for double-checking blood types. But it had taken a near-tragedy to force the change, because the system had been designed to punish the very behavior — honest error reporting — that would have prevented it.
Vocabulary
- Perverse incentive: An incentive that produces the opposite of its intended result. The hospital's punishment policy was a perverse incentive: it was meant to reduce errors but actually increased them by making people afraid to report problems.
- System design: The way rules, rewards, and structures are set up in an organization. Good system design makes it easy for people to do the right thing. Bad system design makes the right thing costly and the wrong thing easy.
- Blame culture: An environment where the response to problems is to find someone to blame rather than to find the root cause and fix it. Blame cultures discourage honesty and ensure that the same problems keep happening.
- Root cause: The underlying reason a problem keeps happening — not the person who made the mistake, but the system condition that made the mistake likely. Fixing the root cause prevents future errors. Punishing the person does not.
Guided Teaching
Begin with the core question. Ask: 'If a nurse gives a patient the wrong medicine because two bottles look almost identical, whose fault is it — the nurse or the person who designed the bottles?' Most students will initially say the nurse, because the nurse made the mistake. Guide them to see that while the nurse is responsible for being careful, the bottle designer created a system where mistakes are predictable. Both matter — but only fixing the system prevents the next mistake.
Walk through the Riverside Hospital story. Ask: 'Why did good nurses like Marcus hide their mistakes?' The answer is that the system punished honesty. Ask: 'Does that make Marcus a bad nurse?' No — Marcus was a good nurse trapped in a bad system. The system's incentive structure made hiding errors the rational choice, even though everyone knew it was wrong.
Introduce the concept of perverse incentives. Ask: 'Can you think of other examples where a rule designed to help actually makes things worse?' Common examples for this age: a school rule against asking for help (which makes students afraid to admit confusion), or a team rule against making errors (which makes players afraid to try new things). The pattern is always the same: punish the symptom, and people hide the symptom instead of fixing the problem.
Connect to blame culture vs. learning culture. Ask: 'What's the difference between asking "whose fault is this?" and asking "why did this happen?"' The first question looks for someone to punish. The second question looks for something to fix. Ask: 'Which question makes things actually get better?' Systems that ask the second question improve. Systems that ask the first question just get better at hiding problems.
Close with personal application. Ask: 'Are there any systems in your life — at school, on a team, at home — where the rules make it hard to do the right thing?' This isn't about blaming the system for everything. It's about seeing clearly when a system is part of the problem. Ask: 'If you could redesign one rule to make it easier for people to do the right thing, what would you change?'
Pattern to Notice
When you see people consistently doing something that seems wrong or foolish, before judging them, look at the system they're in. Ask: what is the system rewarding? What is it punishing? If the system rewards the wrong behavior and punishes the right behavior, you shouldn't be surprised that people behave badly — you should be surprised that anyone behaves well. The pattern to notice is: when lots of people are making the same 'mistake,' it's probably not a people problem. It's a system problem.
A Good Response
A student who understands this lesson can explain why bad systems produce bad behavior even from good people. They can identify the difference between blaming individuals and examining system design. They understand what a perverse incentive is and can give examples. Most importantly, they've learned to ask 'What's wrong with the system?' before asking 'What's wrong with the person?'
Moral Thread
Courage
It takes courage to do the right thing when the system around you is designed to reward the wrong thing. Most people don't set out to do bad things — they get pulled into them by systems that make bad behavior easy and good behavior costly. The courageous person is the one who sees the system clearly and refuses to let it turn them into someone they don't want to be.
Misuse Warning
This lesson could be misused to excuse all bad behavior by blaming 'the system.' That's not the point. Individual responsibility still matters. Marcus could have spoken up and didn't — the system made it hard, but he still made a choice. The lesson is not that systems are responsible and individuals aren't. It's that both matter, and if you want to actually fix problems rather than just punish people, you have to look at the system as well as the individuals inside it.
For Discussion
1. Why did the hospital's punishment policy make things worse instead of better? What was it trying to do, and what did it actually do?
2. Is Marcus responsible for hiding his mistakes? Or is the system responsible for making honesty dangerous? Can both be true at the same time?
3. Can you think of a rule at your school that might be a perverse incentive — a rule that actually encourages the behavior it's trying to prevent?
4. What's the difference between a 'blame culture' and a 'learning culture'? Which one actually makes things better?
5. If you were put in charge of Riverside Hospital, what would you change first? Why?
Practice
System Detective
1. Think of a situation where you've seen people consistently doing something that seems wrong or foolish — at school, on a team, in your community.
2. Instead of blaming the people, investigate the system. Answer these questions:
   1. What behavior is the system rewarding? (What do people get for going along?)
   2. What behavior is the system punishing? (What happens to people who try to do the right thing?)
   3. Is there a perverse incentive — a rule or structure that produces the opposite of what it's supposed to?
   4. What is the root cause of the problem — not who is making mistakes, but what system condition makes those mistakes likely?
   5. If you could change one thing about the system to make the right behavior easier, what would you change?
3. Write your analysis in one page and share it with a parent or teacher.
Memory Questions
1. What is a perverse incentive? Give an example from the hospital story.
2. Why did good nurses at Riverside hide their mistakes? Was it because they were bad nurses?
3. What is the difference between a blame culture and a learning culture?
4. What is a 'root cause,' and why is it more important to fix the root cause than to punish the person who made a mistake?
5. When you see lots of people making the same mistake, what should you look at before blaming the individuals?
A Note for Parents
This lesson teaches one of the most important concepts in organizational thinking: that system design matters more than individual willpower. The hospital story is a real pattern — healthcare has spent decades learning that punitive error-reporting systems make hospitals less safe, not more. The application for your child is immediate: when they see classmates, teammates, or even family members behaving badly, this lesson gives them the tools to ask whether the system is part of the problem. It also applies to your own parenting: are there rules or structures in your household that accidentally incentivize the wrong behavior? The parent who can honestly examine their own 'system design' models exactly the kind of thinking this lesson teaches.