Psychology & Behavior
Section 1
The Core Idea
Humans don't evaluate risk by computing probabilities. They evaluate risk by imagining outcomes. The more vivid the outcome — the more easily they can picture the plane crash, the terrorist attack, the catastrophic failure — the more dangerous it feels, regardless of how likely it actually is. This is neglect of probability: the systematic tendency to judge the risk of an event by the emotional intensity of its outcome rather than the mathematical likelihood of its occurrence.
Cass Sunstein coined the term; Kahneman and Tversky's decades of research on judgment heuristics identified the underlying mechanism: when the brain evaluates a potential threat, it does not perform a Bayesian calculation. It performs a simulation. Can I picture this happening? How clearly? How viscerally? The ease and emotional intensity of the simulation substitutes for the probability assessment. A 1-in-11-million chance of dying in a plane crash generates a vivid, terrifying mental image — the fireball, the plummeting fuselage, the helplessness. A 1-in-5,000 chance of dying in a car accident generates almost nothing — car accidents are so common they've become psychologically invisible. The plane crash is 2,200 times less likely. It feels infinitely more dangerous.
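The arithmetic behind that 2,200x figure is worth making explicit. A minimal sketch, using the illustrative odds from the paragraph above (the two probabilities are the text's figures, not fresh estimates):

```python
# Odds cited in the text, expressed as probabilities of death.
P_PLANE_DEATH = 1 / 11_000_000  # 1-in-11-million plane-crash figure
P_CAR_DEATH = 1 / 5_000         # 1-in-5,000 car-accident figure

# How many times more likely is the mundane risk than the vivid one?
ratio = P_CAR_DEATH / P_PLANE_DEATH
print(f"Dying in a car accident is {ratio:,.0f}x more likely than in a plane crash")
# → 2,200x — yet the simulation-driven brain ranks them the other way around.
```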
The post-9/11 driving shift is the most lethal documented case of probability neglect. After September 11, 2001, Americans abandoned air travel in massive numbers and drove instead. Gerd Gigerenzer's analysis showed that in the twelve months following 9/11, approximately 1,600 additional Americans died in car accidents above the statistical baseline — a direct consequence of substituting driving for flying. The total exceeded the number of passengers killed in the four hijacked planes. The fear of a vivid, emotionally overwhelming event — another terrorist attack at 30,000 feet — drove people toward a statistically far more dangerous alternative. They didn't calculate the risk. They imagined the outcome. And the outcome they could imagine most vividly was the one they feared most, regardless of its probability.
The pattern repeats across domains.
Terrorism kills fewer Americans annually than lightning, but terrorism receives orders of magnitude more policy attention and public fear. Nuclear power kills fewer people per unit of energy than any other major energy source, including solar and wind (when accounting for manufacturing and installation fatalities), but nuclear accidents are vivid — Chernobyl, Fukushima — and the fear of nuclear catastrophe has constrained the technology for decades. Shark attacks kill roughly five people per year worldwide. Falling coconuts, by one widely repeated (if loosely sourced) estimate, kill roughly 150. The Discovery Channel does not run Coconut Week.
Applied to startups: probability neglect explains why founders systematically overestimate their chances of success. The vivid success story — Zuckerberg in his dorm room, Bezos in his garage — is emotionally available. The base rate — roughly 90% of startups fail, 75% of venture-backed startups return less than invested capital — is abstract and psychologically inert. The founder doesn't calculate the expected value of starting a company. They simulate the outcome. And the simulation is always the success story, because that's the narrative the ecosystem amplifies. The failed startups are invisible. The survivors are on magazine covers. Probability neglect, powered by survivorship bias, produces a founder class that systematically overestimates its odds.
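The expected-value calculation the founder skips can be sketched directly. The ~90% failure rate is the base rate cited above; the outcome probabilities and dollar payoffs are hypothetical, chosen only to show how the weighted sum is dominated by branches the founder never simulates:

```python
# Base rate from the text: roughly 90% of startups fail.
p_fail = 0.90
p_modest = 0.09  # assumed: modest exit
p_big = 0.01     # assumed: the breakout story the founder simulates

# Hypothetical founder payoffs, in dollars.
payoff_fail = 0.0
payoff_modest = 2_000_000.0
payoff_big = 100_000_000.0

# Expected value weights every branch; the simulation weights only one.
ev = p_fail * payoff_fail + p_modest * payoff_modest + p_big * payoff_big
print(f"Expected value: ${ev:,.0f}")
# → $1,180,000 — a far cry from the $100M outcome that dominates the mental movie.
```

The point is not the specific numbers but the shape of the error: the vivid branch carries 1% of the probability mass and nearly all of the imagination.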
Applied to investing: lottery-ticket stocks — low probability of massive payoff, high probability of total loss — are consistently overpriced relative to their expected value. Investors can vividly imagine the 50x return. They cannot feel the 90% probability of losing everything, because a gradual decline to zero is not a story the brain can simulate with emotional intensity. The result: speculative stocks trade at premiums that rational probability-weighting would never justify. The crypto market from 2017 to 2022 was probability neglect at industrial scale — millions of investors pricing assets based on the vividness of the upside story while functionally ignoring the probability distribution underneath it.
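The lottery-ticket payoff structure can be made concrete. The 90% chance of total loss is the figure from the paragraph above; the 1% moonshot probability and the residual break-even branch are assumptions for illustration:

```python
price = 1.00        # assumed current price per share
p_moonshot = 0.01   # assumed probability of the vivid 50x outcome
p_wipeout = 0.90    # probability of total loss, from the text
p_flat = 1 - p_moonshot - p_wipeout  # residual: roughly break even

# Probability-weighted value of one share.
ev = p_moonshot * 50 * price + p_wipeout * 0.0 + p_flat * price
print(f"Expected value per $1 invested: ${ev:.2f}")
# → $0.59: under these assumptions the stock trades at a ~70% premium
# to its expected value, priced on the vividness of the 50x branch.
```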