Psychology & Behavior
Section 1
The Core Idea
On April 17, 1961, approximately 1,400 CIA-trained Cuban exiles landed at the Bay of Pigs on Cuba's southern coast, expecting to trigger a popular uprising that would topple Fidel Castro. The operation was a catastrophe. Castro's forces, fully alerted and numerically overwhelming, killed or captured nearly every member of the invasion force within seventy-two hours. President Kennedy, who had approved the plan just three months into his presidency, was surrounded by what journalist David Halberstam would later call "the best and the brightest" — McGeorge Bundy, Robert McNamara, Dean Rusk, Allen Dulles — an inner circle of extraordinary intellect and experience. Yet the plan they endorsed was, by any sober military analysis, absurd: a daytime amphibious landing by a small force against a prepared army, with no credible air cover, no realistic escape route, and an assumption of mass civilian defection that had no evidentiary support. How did some of the smartest people in American government approve a plan that a competent undergraduate could have dismantled? The answer was not stupidity. It was structure.
Irving Janis, a research psychologist at Yale, spent a decade studying that question. His 1972 book Victims of Groupthink introduced the concept that would become one of the most influential frameworks in social psychology, organisational behaviour, and decision science. Janis defined groupthink as a mode of thinking that occurs when the desire for harmony and conformity within a cohesive group overrides realistic appraisal of alternatives. The group does not fail because its members lack information or intelligence. It fails because the social dynamics of the group systematically suppress the processes — dissent, critical evaluation, consideration of alternatives — that would convert that intelligence into sound judgment. Groupthink is not a failure of individual cognition. It is a failure of collective architecture.
Janis identified eight symptoms that characterise groupthink, and they operate in concert like a self-reinforcing system:

1. An illusion of invulnerability — a shared belief that the group is too talented, too experienced, or too powerful to fail, which generates excessive optimism and encourages extreme risk-taking.
2. Collective rationalisation — the group's tendency to discount warnings or disconfirming evidence by constructing shared justifications for why the evidence doesn't apply.
3. An unquestioned belief in the group's inherent morality — the assumption that the group's decisions are ethically sound, eliminating the need to examine moral consequences.
4. Stereotyping of out-groups — dismissing opponents, critics, or competitors as too weak, too stupid, or too evil to require serious engagement.
5. Direct pressure on dissenters — the social punishment of anyone who challenges the emerging consensus.
6. Self-censorship — the pre-emptive silence of members who anticipate that punishment.
7. An illusion of unanimity — the false perception that silence equals agreement, transforming a room full of private doubts into an apparent consensus.
8. The emergence of self-appointed mindguards — members who take it upon themselves to shield the group from information that might challenge the consensus, filtering dissent before it reaches the discussion.
The Challenger disaster of January 28, 1986, provided groupthink's most devastating confirmation. Engineers at Morton Thiokol — the company that manufactured the space shuttle's solid rocket boosters — warned NASA managers that the O-ring seals in the booster joints had never been tested at the low temperatures forecast for launch day. The data was unambiguous: below 53°F, the rubber O-rings lost elasticity and could fail to seal properly. Launch-day temperature was 36°F. Thiokol's engineers recommended postponing the launch. NASA managers pushed back. Thiokol management, under pressure to maintain the launch schedule and the contractor relationship, reversed its own engineers' recommendation. The decision to launch was presented as unanimous. The O-ring seal in the right solid rocket booster failed at ignition, and seventy-three seconds after liftoff the escaping exhaust gases destroyed the vehicle. The shuttle broke apart. Seven crew members died. The Rogers Commission that investigated the disaster identified the decision-making process as a textbook case of groupthink: pressure on dissenters to conform, collective rationalisation of risk, an illusion of invulnerability rooted in twenty-four previous successful shuttle missions, and the systematic suppression of the one group — the engineers with direct data — that had the information to prevent the catastrophe.
The corporate world generates its own Challenger moments with sobering regularity. Nokia's leadership team watched the iPhone launch in 2007 and collectively concluded that a device without a physical keyboard could not threaten their market dominance — a consensus maintained by a culture where challenging the prevailing strategic view was career suicide. Kodak's executives understood digital photography's trajectory a full decade before it destroyed their business, but the cohesion of the film-revenue-dependent leadership team ensured that every strategic review ended with the same conclusion: digital was a complement, not a replacement. Enron's board approved increasingly opaque financial structures because the social dynamics of the boardroom made questioning the company's celebrated leadership tantamount to admitting you didn't understand the strategy. In each case, the failure was not a lack of information. It was a surplus of cohesion — a group whose internal bonds were so strong that the bonds themselves became the obstacle to honest evaluation.
Groupthink's most dangerous feature is that it produces decisions that feel rigorous. The Bay of Pigs plan was debated in multiple meetings. The Challenger launch decision involved hours of teleconference discussion. Nokia's strategy was reviewed quarterly by experienced executives with decades of industry knowledge. The process looks like deliberation. But the outcome of the deliberation is predetermined by the social structure: the group's cohesion, the leader's stated preference, the implicit cost of dissent, and the illusion that silence constitutes agreement. Groupthink does not eliminate discussion. It hollows it out — preserving the form of critical evaluation while destroying its substance. The result is a group that is simultaneously confident in its process and catastrophically wrong in its conclusions.