Level 3 · Module 5: Strategy and Consequences · Lesson 4

Why Smart Plans Fail

observation · groups-power · human-nature · character-leadership

Plans fail not because planners are stupid but because the world is complex. The gap between a plan on paper and its execution in reality is where friction, human nature, and uncontrollable variables destroy even the most brilliant strategies.

Building On

The fog of decision-making and incomplete information

After examining the fog that surrounds every decision, this lesson asks: what happens after the decision is made? Even when a plan is well-conceived despite the fog, execution introduces new layers of failure — friction, complexity, human error, and the stubborn refusal of reality to cooperate with even the most brilliant strategy.

How institutions drift from their original mission

Just as institutions drift from their founding purpose over time, plans drift from their original design during execution. The same forces are at work: complexity, self-interest, communication failures, and the gap between what leaders intend and what organizations actually do.

History’s graveyard is full of excellent plans. Plans that were logical, carefully reasoned, and designed by the smartest people available — and that fell apart on contact with reality. This isn’t an accident. It’s a pattern so consistent that it amounts to a law: no plan survives first contact with the enemy. That famous paraphrase of Helmuth von Moltke, the Prussian field marshal, captures something deeper than military wisdom. It describes a fundamental property of complex systems.

Plans fail because they are simplifications. Every plan is a model of reality, and every model leaves things out. The things left out — the variables the planner didn’t consider, the reactions they didn’t predict, the friction they didn’t account for — are precisely the things that destroy the plan. The more complex the situation, the more the plan must simplify, and the more it simplifies, the more it will be wrong.

Understanding why plans fail doesn’t mean you shouldn’t plan. It means you should plan differently: build flexibility into your strategy, expect the unexpected, and develop the capacity to adapt when — not if — things go wrong.

The Perfect Invasion That Wasn’t

In April 1961, approximately 1,400 CIA-trained Cuban exiles landed at the Bay of Pigs on Cuba’s southern coast. Their mission: to establish a beachhead, spark a popular uprising against Fidel Castro’s government, and overthrow the regime. The plan had been developed by the CIA’s best minds, approved at the highest levels of the American government, and rehearsed in detail.

On paper, the logic was compelling. Cuban exiles, motivated by genuine desire to liberate their homeland, would provide the manpower. American air support would neutralize Castro’s small air force. Once the exiles established a foothold, dissatisfied Cubans across the island would rise up and join them. Castro’s government, already unpopular (the CIA believed), would crumble.

Almost nothing went as planned. The air strikes, intended to destroy Castro’s air force on the ground, missed key targets. Castro’s planes survived and attacked the invasion fleet, sinking supply ships carrying ammunition, communications equipment, and medical supplies. The exiles reached the beach but found themselves pinned down, cut off from their supplies, and under air attack.

The popular uprising never materialized. The CIA’s assumption that ordinary Cubans were ready to revolt proved catastrophically wrong. Castro’s government, whatever its flaws, had genuine support among large segments of the population. Instead of joining the invaders, Cubans rallied to defend their country against what looked like a foreign-backed attack.

President Kennedy, newly inaugurated and inheriting the plan from the Eisenhower administration, had been assured by CIA and military planners that the operation had a strong chance of success. But the assurances were based on a chain of assumptions, each plausible individually, that were collectively fragile. The air strikes would succeed. The supply ships wouldn’t be sunk. The population would revolt. Castro’s army would fragment. Remove any single link, and the chain broke. Several links broke simultaneously.

Within three days, the invasion had collapsed completely. Most of the exiles were captured. The United States was humiliated internationally. Castro’s position was strengthened rather than weakened. And the failure set the stage for the Cuban Missile Crisis eighteen months later — the very crisis Kennedy then had to navigate through the fog we discussed in the last lesson.

Kennedy, to his credit, took personal responsibility. “Victory has a hundred fathers,” he said, “but defeat is an orphan.” He later told advisor Theodore Sorensen: “How could I have been so stupid?” But stupidity wasn’t the problem. The problem was that a complex plan with many interdependent assumptions met a complex reality that didn’t cooperate.

Friction
Clausewitz’s term for the countless small difficulties that, in combination, make the execution of any plan far harder than it looks on paper. Miscommunication, equipment failure, fatigue, weather, bad luck — individually trivial, collectively devastating.
Assumption chain
A plan that depends on multiple assumptions being true simultaneously. The more links in the chain, the more fragile the plan — because the failure of any single assumption can cause the entire plan to collapse.
Complexity
The property of systems with many interacting parts whose behavior cannot be fully predicted from knowledge of the individual parts. Complex situations generate surprises because the interactions between elements produce outcomes that no one anticipated.
Groupthink
The tendency of cohesive groups to converge on a shared view without critically examining alternatives, driven by the desire for harmony and the suppression of dissenting opinions. A major source of plan failure.

Ask: “What was wrong with the Bay of Pigs plan?” The answer is not that it was poorly reasoned. Each individual assumption was plausible. The problem was that the plan required all of them to be true at the same time. This is the assumption chain problem: a plan that depends on five things going right has the combined probability of all five — which is always much lower than the probability of any one.

Let’s make this concrete with numbers. If each of five assumptions has a 70% chance of being correct — which sounds high — the probability of all five being correct simultaneously is 0.7 × 0.7 × 0.7 × 0.7 × 0.7 = about 17%. A plan built on five “probably true” assumptions has less than a one-in-five chance of working as designed. This is why complex plans fail at rates that surprise their creators.
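
For readers who want to see the arithmetic generalized, here is a minimal sketch in Python. The 70% figure is the same hypothetical number used above, not a real estimate of the Bay of Pigs assumptions; the point is only how quickly an assumption chain weakens as links are added.

    # A minimal sketch of the assumption-chain arithmetic above.
    # The 70% figure is illustrative, not data from the Bay of Pigs case.

    def chain_success_probability(assumption_probabilities):
        """Probability that every assumption in the chain holds at the same time."""
        result = 1.0
        for p in assumption_probabilities:
            result *= p
        return result

    # Five "probably true" assumptions, each with a 70% chance of holding.
    print(f"5 links at 70%: {chain_success_probability([0.7] * 5):.0%}")  # about 17%

    # The chain weakens fast as links are added.
    for n in range(1, 9):
        print(f"{n} links at 70%: {chain_success_probability([0.7] * n):.0%}")

Run it and the pattern is stark: by eight links at 70% each, the plan works as designed only about 6% of the time.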

Ask: “Why didn’t the CIA planners see these problems?” Several reasons, all of which show up whenever smart groups make bad plans:

1. Groupthink. The planning team shared a common worldview and reinforced each other’s assumptions. Dissenting voices were marginalized. When everyone in the room agrees, it feels like evidence of a good plan — but it’s often evidence that the room lacks diversity of perspective.

2. The sunk cost trap. By the time Kennedy reviewed the plan, enormous resources had already been invested. Training camps were built, exiles were recruited, equipment was purchased. Canceling the operation felt like waste. But continuing a bad plan because you’ve already invested in it is one of the most common and destructive decision-making errors.

3. Overconfidence in expertise. The CIA was an elite organization staffed with brilliant people. Their very competence made them less likely to question their own assumptions. Expertise can be a source of overconfidence: the more you know, the more certain you feel — and the less you notice what you don’t know.

4. The map-territory confusion. The plan was a map of how the invasion would unfold. But Cuba was the territory — a real place with real people whose behavior couldn’t be scripted. The planners fell in love with their map and forgot that the territory always has the final word.

Ask: “What should the planners have done differently?” Three practices that reduce plan failure:

(1) Stress-test assumptions. For each critical assumption, ask: “What if this is wrong?” and “What’s my evidence that it’s right?” The Bay of Pigs planners assumed a popular uprising without strong evidence that one would occur. If they’d asked “What if nobody revolts?” the entire plan would have looked different.

(2) Build in flexibility. The best plans include contingencies: “If X doesn’t happen, we do Y instead.” The Bay of Pigs plan had almost no fallback. When the air strikes failed, there was no Plan B. When the uprising didn’t materialize, there was no alternative path to success.

(3) Invite the skeptic. Deliberately include someone whose job is to argue against the plan — to find its weaknesses, challenge its assumptions, and point out what could go wrong. This is uncomfortable, which is why most groups don’t do it. But the discomfort of internal criticism is far less costly than the disaster of unexamined groupthink.

Ask: “Does this mean planning is pointless?” No. Eisenhower, who knew more about military planning than almost anyone in history, said: “Plans are worthless, but planning is everything.” The process of planning forces you to think through the problem, identify risks, and prepare for contingencies. The plan itself will need to change. But the thinking you did while planning prepares you to change it intelligently.

Watch for assumption chains in your own plans and in the plans people present to you. Whenever someone describes a plan that requires multiple things to go right, count the assumptions. The more links in the chain, the more fragile the plan. Also notice groupthink: when everyone agrees enthusiastically and no one is raising concerns, that’s not necessarily a sign of a good plan. It may be a sign that dissent has been suppressed. The best plans are the ones that have survived serious criticism, not the ones that have never faced it.

Before committing to any significant plan, stress-test it. List every assumption the plan depends on and ask: what if this one is wrong? Identify the weakest links. Build contingencies for the most likely points of failure. Invite criticism from someone who thinks differently than you do. And once you begin executing, stay alert for signs that reality is diverging from the plan — because it will. The measure of a good strategist is not whether their plan survives intact but how quickly and intelligently they adapt when it doesn’t.

Adaptability

The virtue this lesson develops is not cleverness in planning but adaptability in execution — the willingness to abandon a beautiful plan when reality proves it wrong, and to respond to what is actually happening rather than what was supposed to happen. Adaptability is not the absence of planning; it is the refusal to let a plan become a prison.

This lesson could produce a teenager who criticizes every plan anyone makes by pointing out all the things that could go wrong. That’s not strategic wisdom — it’s learned helplessness dressed up as sophistication. Every plan can fail. Every plan has assumptions. The existence of risk is not an argument against action. It’s an argument for better planning and greater adaptability. It could also encourage a false sense that if you just think hard enough, you can anticipate everything. You can’t. The lesson is about reducing the probability of catastrophic failure, not eliminating all risk.

  1. What is an assumption chain, and why does it make plans fragile?
  2. What role did groupthink play in the Bay of Pigs failure?
  3. Why did Eisenhower say “plans are worthless, but planning is everything”? What’s the difference between the plan and the planning?
  4. What three practices reduce the likelihood of plan failure?
  5. Think of a plan you’ve made that didn’t work out. Can you identify the assumption that broke?

The Assumption Audit

  1. Choose a plan you are currently working on or considering. It could be an academic project, a social event you’re organizing, a personal goal, or a plan for summer.
  2. Conduct an assumption audit by following these steps:
     1. Write the plan out in a few sentences. What are you trying to accomplish, and how?
     2. List every assumption the plan depends on. Be ruthless — include assumptions you haven’t thought about explicitly. (Examples: I’m assuming I’ll have enough time. I’m assuming my friend will cooperate. I’m assuming this will cost what I think it will.)
     3. For each assumption, rate your confidence: High (I have strong evidence), Medium (seems likely but I’m not sure), or Low (I’m hoping this is true).
     4. For every Medium or Low assumption, write a contingency: “If this assumption is wrong, I will...”
     5. Find one person who thinks differently than you do and ask them to critique the plan. Write down their objections. Do any of them reveal assumptions you missed?
  3. After the audit, revise your plan. The goal is not a perfect plan — it’s a more resilient one.

  1. Why did the Bay of Pigs invasion fail despite careful planning by experienced professionals?
  2. What is an assumption chain, and how does it make plans fragile?
  3. What is groupthink, and how does it contribute to plan failure?
  4. What did Eisenhower mean by “plans are worthless, but planning is everything”?
  5. What three practices help reduce the likelihood of plan failure?

This lesson uses the Bay of Pigs invasion — one of the most studied failures in American foreign policy — to illustrate why intelligent plans executed by competent people still fail. The concept of the assumption chain is particularly useful for teenagers, who often create plans that depend on everything going right and are then surprised when something goes wrong. The Assumption Audit exercise directly applies the lesson’s framework to your teenager’s actual plans, making the abstract concrete. Groupthink is introduced here and is especially relevant for adolescents, who are highly sensitive to social consensus and may struggle to voice dissent in peer groups. The Eisenhower quotation (“plans are worthless, but planning is everything”) captures the lesson’s core paradox: you must plan thoroughly while holding your plan loosely. This is a sophisticated idea, but your teenager is ready for it — and the ability to plan adaptively is one of the most practically valuable skills this curriculum offers.
