In 1999, Timur Kuran and Cass Sunstein published a paper that explains half of what goes wrong in public discourse: "Availability Cascades and Risk Regulation." The concept is a self-reinforcing cycle. A belief gains credibility simply because it gains repetition. The more people talk about something, the more cognitively available it becomes — it sits at the front of memory, ready to be retrieved. The more available it is in memory, the more people estimate it as common, important, or true. The higher the estimated importance, the more people talk about it. The cycle feeds itself. The belief doesn't need to be accurate. It needs to be repeated.
The mechanism combines three forces. First, the availability heuristic — Daniel Kahneman and Amos Tversky's finding that people judge the probability of events by how easily examples come to mind. If you can quickly recall instances of something, you estimate it as more common than it actually is. Second, social proof — if many people believe something, others infer it must be true, because why would so many people be wrong? Third, media incentives — stories that generate attention get more coverage, which makes them more available in memory, which generates more attention. Each force amplifies the others. Availability makes the belief seem common. Social proof makes it seem credible. Media coverage makes it seem important. The cascade accelerates.
Shark attacks are the textbook example. In any given year, roughly 5-10 people die from shark attacks worldwide. Approximately 450,000 people die from falls. Shark attacks receive thousands of times more media coverage per death than falls. The coverage makes shark attacks highly available in memory. The availability makes people dramatically overestimate the probability of being attacked. The overestimation generates public concern. The concern generates more coverage. After a single high-profile shark attack, beach tourism in the affected region can decline 20-30% — a multi-million dollar economic response to a risk that is statistically less dangerous than the drive to the beach.
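The base rates above can be put side by side directly. A quick sketch, using the approximate figures quoted in this section:

```python
# Approximate annual figures quoted above (worldwide).
shark_deaths_per_year = 7.5      # midpoint of the 5-10 range
fall_deaths_per_year = 450_000

risk_ratio = fall_deaths_per_year / shark_deaths_per_year
print(f"Falls kill roughly {risk_ratio:,.0f} times as many people as sharks do.")
```

A ratio of tens of thousands to one — and the coverage ratio runs in the opposite direction, which is the whole cascade in two numbers.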
Crypto hype cycles run on the same mechanism. In 2021, stories of ordinary people becoming millionaires through cryptocurrency investments saturated social media and mainstream news. Each success story made crypto wealth more available in memory. The availability made people overestimate the probability of similar returns. The overestimation attracted new investors. The new investment pushed prices higher. The higher prices generated more success stories. Bitcoin went from $10,000 to $69,000 in 18 months. The cascade wasn't driven by fundamentals. It was driven by the self-reinforcing loop between stories, availability, and behaviour. When the stories reversed — losses, bankruptcies, fraud — the cascade reversed too. The same mechanism that inflated the bubble deflated it.
The "Great Resignation" narrative followed the pattern. In 2021, media outlets reported a wave of workers quitting their jobs. The framing — millions abandoning traditional employment — generated enormous coverage. The coverage made quitting seem common. The perceived commonality gave people social permission to quit. Some quit. Their quitting generated more data points for the narrative. The narrative generated more coverage. Economists later showed that quit rates, while elevated, were consistent with a tight labour market and pandemic recovery — not a structural transformation of the employment relationship. The gap between the narrative ("everyone is quitting") and the reality ("quit rates increased modestly in a hot labour market") was the availability cascade in action.
Kuran and Sunstein's insight was that availability cascades are not just cognitive errors. They have political and regulatory consequences. Public fear of a risk, inflated by an availability cascade, generates demand for regulation. Politicians respond to the fear because ignoring it is politically costly. The regulation addresses the perceived risk rather than the actual risk. Resources flow to low-probability, high-availability threats and away from high-probability, low-availability threats. The result: society over-invests in defending against shark attacks and under-invests in preventing falls.
The cascade has a structural companion: preference falsification. People who privately doubt the prevailing narrative stay silent because expressing doubt carries social costs. The crypto skeptic at a 2021 dinner party learned to nod rather than argue. The manager who doubted the "Great Resignation" framing kept quiet because the narrative had acquired moral weight — questioning it felt like dismissing worker suffering. The silence of skeptics removes the corrective force that might slow the cascade. Without pushback, the narrative accelerates; and the faster it accelerates, the more socially costly pushback becomes. The cascade sustains itself by suppressing the very mechanism — public dissent — that could deflate it.
Section 2
How to See It
Availability cascades appear wherever a narrative's prominence exceeds its empirical foundation, where the volume of discussion about a risk or opportunity is disproportionate to its actual magnitude, and where repetition is doing the work that evidence should be doing.
You're seeing an availability cascade when the frequency of a claim's repetition is growing faster than the evidence supporting it, when anecdotes are displacing statistics, or when the emotional intensity of a discussion is inversely proportional to the data behind it.
Technology & Startups
You're seeing an availability cascade when every conference keynote, podcast, and LinkedIn post declares that AI will eliminate 80% of jobs within five years — and the claim is supported by the same two or three studies, recited without caveats, while contradictory research receives no coverage. The repetition makes the claim feel settled. The feeling of settledness discourages investigation. The lack of investigation allows the claim to circulate unchallenged. Each repetition strengthens the cascade. The actual research is contested, the timelines are uncertain, and the historical precedent (every previous automation wave created more jobs than it destroyed) receives almost no attention because nuance doesn't cascade.
Financial Markets
You're seeing an availability cascade when a market narrative detaches from fundamentals and sustains itself through self-referential repetition. "This time is different" during the dot-com bubble. "Housing prices never go down nationally" before 2008. "Crypto is the future of money" in 2021. Each narrative started with a kernel of truth, gained repetition, attracted believers whose participation seemed to validate the narrative, and accelerated until the gap between narrative and reality became unsustainable. The availability cascade in financial markets has a specific name: a bubble. The mechanism is identical — belief → repetition → availability → more belief.
Media & Public Discourse
You're seeing an availability cascade when a rare event receives coverage disproportionate to its frequency, and the coverage itself changes public behaviour. Stranger kidnapping of children: extremely rare (roughly 100-150 cases per year in the US), yet consistently ranked by parents as their top fear. School shootings: statistically unlikely for any individual child, yet driving billions in school security spending. The coverage makes the event feel imminent. The feeling of imminence drives behavioural and policy responses calibrated to the perceived probability rather than the actual probability.
Organisational Decision-Making
You're seeing an availability cascade when an internal narrative at a company gains momentum through repetition rather than evidence. "We need to pivot to AI" because every competitor press release mentions AI, not because customer research supports the shift. "Remote work is destroying our culture" because three executives said it in different meetings, not because engagement data shows a decline. The narrative circulates in Slack channels, gets repeated in all-hands meetings, and eventually becomes organisational orthodoxy — not because it was validated, but because it was available.
Section 3
How to Use It
Availability cascades are a diagnostic tool for separating signal from noise. The discipline is recognising when the volume of a discussion reflects the importance of a topic and when it merely reflects the self-reinforcing mechanics of repetition, social proof, and media incentives.
Decision filter
"Before acting on a widely-held belief, apply the cascade test: is the evidence for this belief growing proportionally to the conviction? Or is the conviction growing through repetition while the evidence remains static? If conviction is outpacing evidence, you are likely inside an availability cascade."
As a founder
Availability cascades create both traps and opportunities. The trap: building a product because the market narrative says the category is hot, not because customer research validates the problem. In 2021, hundreds of startups launched crypto products because the cascade made crypto seem inevitable. Most failed when the cascade reversed. The opportunity: when an availability cascade inflates a perceived risk, you can build a business that addresses the risk — not because the risk is real, but because people's willingness to pay is driven by perception, not probability. Home security systems sell based on the perceived risk of burglary, which availability cascades inflate well above the statistical baseline. The founder's discipline: distinguish between building for real demand and building for cascade-inflated demand. Real demand persists when the narrative fades. Cascade-inflated demand collapses with the narrative.
As an investor
Availability cascades are the primary mechanism behind investment bubbles. The discipline is asking: what percentage of my conviction about this investment comes from evidence I've personally evaluated, and what percentage comes from the fact that everyone is talking about it? In 2021, SPACs, NFTs, and metaverse plays attracted billions in capital. The investment theses were thin. The availability cascades were thick. Every conference, every deal sheet, every LP meeting referenced the same narratives. Investors who couldn't distinguish between a cascade and a trend deployed capital into the cascade. When the cascade reversed, the capital evaporated. The Buffett principle — "be fearful when others are greedy and greedy when others are fearful" — is, at its core, an instruction to trade against availability cascades.
As a decision-maker
Inoculate your organisation against internal availability cascades by requiring evidence standards that increase with the conviction level. If a belief is widespread in the organisation ("our biggest competitor is about to launch X," "our customers are leaving for Y"), the conviction level is high — which means the evidence standard should be proportionally high. Ask: what is the primary source for this belief? How many independent data points support it? Is the evidence growing, or is the same evidence being cited more frequently? In most internal cascades, the answer is revealing: the belief traces back to a single data point — one customer call, one industry report, one executive's opinion — that was repeated until it felt like consensus.
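The "trace it back" question lends itself to a trivial sketch: count how many distinct primary sources actually sit behind a belief, as opposed to how many times it was repeated. The citation trail below is hypothetical.

```python
from collections import Counter

# Hypothetical citation trail for "our customers are leaving for Y".
citations = [
    "March customer call",
    "March customer call",    # same call, repeated in a different meeting
    "analyst report",
    "March customer call",
    "VP's all-hands remark",  # itself an opinion, not new data
]
sources = Counter(citations)
print(f"{len(citations)} repetitions, {len(sources)} distinct sources")
```

Five repetitions collapsing to three sources — and fewer once you ask which of those sources are independent of each other — is the internal-cascade signature described above.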
Common misapplication: Dismissing all widely held beliefs as availability cascades. Some beliefs are widely held because they are well-evidenced. Climate change is discussed frequently and is supported by overwhelming scientific evidence. The test is not popularity — it is the ratio of evidence to conviction. A belief where evidence and conviction grow together is not a cascade. A belief where conviction outpaces evidence is.
Second misapplication: Assuming you're immune. Kuran and Sunstein's most unsettling finding is that awareness of availability cascades does not protect against them. Knowing the mechanism helps you detect it in others' reasoning. It does not reliably prevent it in your own. The heuristic is too fast, too automatic, and too deeply embedded in how human memory works. The defence is structural — evidence requirements, base-rate checks, deliberate contrarian inquiry — not willpower.
Section 4
The Mechanism
Section 5
Founders & Leaders in Action
The leaders below built decision-making cultures that resist the pull of availability cascades — replacing the question "what is everyone talking about?" with "what does the evidence actually show?" Their organisations treat narrative skepticism as a core competency.
Bezos built Amazon's strategic culture around a principle that directly counteracts availability cascades: "It's always Day 1." The Day 1 philosophy is a refusal to let the prevailing narrative determine strategy. When the narrative said e-commerce was dead (after the dot-com crash), Bezos invested in infrastructure. When the narrative said Amazon should focus on retail (during the mid-2000s), Bezos launched AWS. When the narrative said hardware companies should outsource manufacturing (during the 2010s), Bezos invested in logistics and delivery. Each decision contradicted the availability cascade of its era. Bezos's six-page memo requirement serves a specific anti-cascade function: it forces decision-makers to articulate the evidence for a belief in writing, making it impossible to coast on "everyone knows" or "the market is moving toward." A belief that cannot survive six pages of written scrutiny was probably sustained by repetition rather than evidence. Bezos's most cascade-resistant decision was launching AWS in 2006, when the prevailing narrative held that Amazon was a retailer and had no business in cloud computing. The narrative was loud. The evidence — that Amazon had built world-class infrastructure to serve its own needs and could rent it to others — was quiet. Bezos followed the evidence. AWS now generates more operating profit than Amazon's retail business.
Hastings made two of the most consequential anti-cascade decisions in technology history. In 2007, the availability cascade said physical media was the future of home entertainment — DVD sales were growing, Blu-ray was launching, and Blockbuster had 9,000 stores. Hastings invested in streaming, a technology that barely worked at the time, because his analysis of bandwidth trends and content licensing economics suggested the cascade was wrong. The narrative said he was crazy. He was reading a different dataset. In 2011, the availability cascade said Netflix should remain a hybrid DVD-and-streaming company. Customers, media, and investors were furious when Hastings tried to separate the businesses (the Qwikster debacle). Hastings retreated tactically but maintained the strategic direction: streaming would replace DVDs. By 2023, Netflix had 260 million streaming subscribers. DVDs were extinct. Hastings's discipline was not predicting the future. It was refusing to let the present narrative — however loud, however widely repeated, however emotionally charged — override his independent analysis of the underlying data. The availability cascade said one thing. The data said another. Hastings followed the data both times.
Section 6
Visual Explanation
The top row shows the cascade's ignition: an initial claim or event triggers media coverage, which drives social repetition, which activates the availability heuristic (easy to recall = seems true). The feedback loop connects the heuristic back to media coverage — the perceived importance generates demand for more coverage, which generates more repetition. The cycle has no natural stopping point. Each rotation increases the perceived importance of the topic, which accelerates the next rotation.
The middle row shows the cascade's effects: inflated probability estimates, social pressure against dissent, and misallocated resources directed at perceived rather than actual risks. These effects are not sequential — they operate simultaneously, each reinforcing the others. Inflated probability estimates create demand for action. Social pressure silences the skeptics who might correct the estimates. Misallocated resources validate the narrative by demonstrating institutional seriousness. The cascade produces its own evidence of importance.
The bottom row shows the defence: base-rate thinking that separates the frequency of discussion from the frequency of occurrence, and a conviction-evidence ratio check that detects when belief is growing faster than its empirical foundation. The defence is simple to describe and difficult to practice — it requires the discipline to ask "what is the actual data?" at exactly the moment when the social environment is insisting that the question is unnecessary because "everyone already knows."
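The loop in the top row can be rendered as a toy simulation. All coefficients here are invented purely for illustration; the only point is that perceived importance compounds with each rotation while the underlying facts stay flat.

```python
coverage = 1.0
actual_importance = 1.0  # held constant: the underlying facts never change

for cycle in range(5):
    availability = 0.8 * coverage               # coverage makes the topic easy to recall
    perceived_importance = 1.5 * availability   # ease of recall reads as importance
    coverage = perceived_importance             # importance drives the next round of coverage
    print(f"cycle {cycle}: perceived importance = {perceived_importance:.2f}, "
          f"actual importance = {actual_importance:.2f}")
```

With these made-up coefficients the loop gain is 1.2 per rotation, so perceived importance grows geometrically — the "no natural stopping point" property — until something outside the loop (reality, exhaustion, a reversal) breaks it.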
Section 7
Connected Models
Availability cascades draw power from multiple cognitive biases and social dynamics simultaneously. The connected models below explain the component mechanisms, the contexts where cascades are most dangerous, and the frameworks for resisting them.
Reinforces
Availability Heuristic
The availability heuristic is the cognitive engine inside every availability cascade. Kahneman and Tversky's finding — that people estimate probability based on how easily examples come to mind — explains why repetition creates perceived truth. Each repetition makes the claim more available in memory. Each increase in availability makes the claim feel more probable. The cascade is the availability heuristic running in a social feedback loop: individual cognitive bias, amplified by collective repetition, producing shared distortion at scale.
Reinforces
Narrative Fallacy
Nassim Taleb's narrative fallacy — the human tendency to construct coherent stories from random events — accelerates availability cascades by giving them explanatory structure. A cascade built on isolated data points ("crypto prices went up," "some people quit their jobs") gains power when wrapped in a narrative ("we are witnessing a financial revolution," "the Great Resignation reflects a fundamental shift in values"). The narrative makes the cascade feel like an explanation rather than a pattern of repetition. Once people have a story, they resist data that contradicts it because abandoning the narrative feels like losing understanding.
Reinforces
Social Proof
Social proof — the tendency to follow the behaviour and beliefs of others — is the social amplifier that converts individual availability bias into collective cascades. One person overestimating shark attack risk is a cognitive error. Ten million people overestimating it is a cascade, and the cascade is sustained by each person looking at the others and concluding: "everyone else believes this, so it must be true." Social proof transforms private cognitive bias into public consensus, which then reinforces the private bias. The loop is closed.
Section 8
One Key Quote
"People tend to assess the relative importance of issues by the ease with which they are retrieved from memory — and this is largely determined by the extent of coverage in media."
— Daniel Kahneman, Thinking, Fast and Slow (2011)
Kahneman identified the mechanism that makes availability cascades possible: importance is not assessed objectively. It is assessed heuristically, through the proxy of cognitive availability. And cognitive availability is not determined by actual frequency. It is determined by media coverage, social repetition, and emotional intensity. The implication is that our collective sense of "what matters" is systematically distorted by the information environment. The topics that receive the most coverage feel the most important — regardless of whether they are. The topics that receive the least coverage feel irrelevant — regardless of whether they are.
This is not a minor calibration error. Kahneman's research suggests that availability-driven importance estimates can be orders of magnitude wrong. People estimate the probability of dying in a plane crash at roughly 1 in 10,000. The actual probability is roughly 1 in 11 million. The thousand-fold overestimate is not stupidity. It is the availability heuristic processing decades of vivid, emotional, extensively covered plane crash stories and converting the coverage intensity into a probability estimate. The cascade doesn't make people irrational. It makes them rational about the wrong inputs — they are correctly processing the frequency of stories rather than the frequency of events.
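The plane-crash figures above give the size of the distortion directly; a one-liner using the approximate probabilities quoted:

```python
perceived_risk = 1 / 10_000       # typical survey estimate quoted above
actual_risk = 1 / 11_000_000      # approximate actuarial figure quoted above

overestimate = perceived_risk / actual_risk
print(f"Perceived risk is roughly {overestimate:,.0f}x the actual risk.")
```

An error factor in the low thousands, produced not by bad arithmetic but by substituting story frequency for event frequency.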
The deepest implication of Kahneman's work is that availability cascades are not bugs in human cognition. They are features — the availability heuristic is fast, efficient, and usually correct. It fails specifically in environments where the frequency of stories is decoupled from the frequency of events. Modern media environments are precisely such environments. The heuristic evolved for a world where stories and events were tightly correlated — if your neighbour told you about a predator, a predator was probably nearby. It breaks in a world where a single predator event can generate millions of stories across global media channels. The cascade is the cost of running stone-age cognitive software on information-age inputs.
Section 9
Analyst's Take
Faster Than Normal — Editorial View
Availability cascades are the operating system of modern misinformation — and most people don't realise they're running it. The mechanism is subtle because it doesn't require anyone to lie. No one needs to fabricate a shark attack, invent a crypto millionaire story, or manufacture a resignation statistic. The cascade works with true information, selectively amplified. Every individual story can be accurate. The collective impression can be wildly wrong. That gap — between the accuracy of the individual data points and the inaccuracy of the emergent narrative — is where availability cascades do their damage.
The 2021 crypto cascade is the cleanest modern example. Bitcoin's rise from $10,000 to $69,000 was accompanied by an availability cascade that operated across every information channel simultaneously: mainstream media (CNN running Bitcoin price tickers), social media (crypto influencers with millions of followers), workplace conversations ("my colleague made $200K on Dogecoin"), and political discourse (Congressional hearings on crypto regulation). Each channel reinforced the others. The resulting perception — that crypto wealth was common, that missing the opportunity was costly, that adoption was inevitable — drove millions of retail investors to buy at inflated prices. When the cascade reversed, $2 trillion in market value evaporated. The information that drove the cascade wasn't false. It was selectively available. Stories of crypto millionaires cascaded. Stories of crypto bankruptcies didn't — until the bankruptcies became too numerous and dramatic to ignore, at which point a negative cascade replaced the positive one.
The "Great Resignation" is the case study in narrative-reality divergence. The Bureau of Labor Statistics data showed quit rates of 2.9-3.0% in 2021-2022, elevated from the 2.3% pre-pandemic baseline but not historically unprecedented — quit rates reached 2.8% in 2001. The media narrative, however, described a civilisational transformation: workers rejecting the fundamental premises of employment, pursuing passion over paycheques, and permanently reshaping the labour market. The gap between the modest statistical elevation and the revolutionary narrative was filled entirely by the availability cascade. Each "I quit my job" story on social media made quitting feel more common than it was. Each piece of coverage reinforced the perception. By late 2022, quit rates had returned to baseline. The "Great Resignation" was a real phenomenon, modestly overstated by roughly an order of magnitude by the cascade that named it.
AI discourse in 2025-2026 is running a textbook availability cascade. "AI will replace X% of jobs" claims circulate with extraordinary frequency and extraordinary confidence, supported by a thin empirical base of McKinsey reports and academic projections that are themselves based on capability assessments rather than actual labour market data. The cascade mechanics are visible: each dramatic claim generates coverage, each piece of coverage makes the claim more available, each increase in availability makes the claim feel more certain. Nuanced positions — "AI will transform many jobs without eliminating them," "previous automation waves created net new employment," "the timeline for job displacement is highly uncertain" — receive almost no cascade energy because nuance doesn't generate engagement. The result: public perception of AI's labour market impact is calibrated to the loudest claims rather than the best evidence.
Section 10
Test Yourself
The scenarios below test whether you can identify availability cascades in action, distinguish cascade-driven conviction from evidence-driven conviction, and apply base-rate thinking to narratives that feel intuitively true.
Each scenario presents a belief that seems well-established — your job is to determine whether the conviction is supported by the evidence or merely by the repetition.
Cascade or signal?
Scenario 1
In Q3 2025, every major business publication runs stories about 'the death of remote work.' CEOs from Goldman Sachs, JPMorgan, and Amazon issue return-to-office mandates. LinkedIn is filled with posts debating the topic. Your company's leadership team cites these stories in a meeting and proposes mandating five days in-office. When asked for internal data, they reference 'the industry trend' and two anecdotes about teams that improved after returning to the office.
Scenario 2
A venture capital firm notices that 'AI agents' have become the dominant theme at every conference, in every pitch deck, and across all industry newsletters in early 2026. The firm's partners are divided: half want to increase allocation to AI agent startups because 'this is clearly the next platform shift,' and half are cautious because 'we've seen this pattern before with crypto, metaverse, and Web3.' The firm has not yet invested in any AI agent companies.
Section 11
Top Resources
Availability cascades sit at the intersection of cognitive psychology, media theory, law, and behavioural economics. The strongest resources provide the theoretical framework, the empirical foundation, and the practical tools for detecting and resisting cascades in personal and organisational decision-making.
The origin paper. Kuran and Sunstein's analysis in the Stanford Law Review provides the formal framework for understanding how the availability heuristic, social proof, and preference falsification combine to produce self-reinforcing belief cycles. The paper's most important contribution is showing that cascades are not merely cognitive curiosities — they drive regulation, policy, and resource allocation, often misallocating society's response to risks based on perception rather than probability.
Kahneman's comprehensive treatment of heuristics and biases provides the cognitive foundation for understanding why availability cascades work. The chapters on the availability heuristic, WYSIATI ("What You See Is All There Is"), and the substitution of easier questions for harder ones explain the individual-level cognitive mechanics that cascades exploit. The book's most relevant insight: people do not assess probability. They assess the ease of imagining an event — and availability cascades systematically manipulate that ease.
Kuran's earlier work on preference falsification provides the social mechanism that sustains availability cascades. His argument: people publicly express beliefs they privately doubt because dissent carries social costs. In the context of cascades, preference falsification means that people who suspect a narrative is overblown stay silent because questioning the narrative feels socially risky. The silence of skeptics removes the corrective force that might slow the cascade, allowing it to accelerate unchecked until reality intervenes.
Sunstein's collaboration with Thaler extends the availability cascade framework into institutional design. The book's most relevant insight for cascade resistance: the structure of choices determines outcomes more reliably than the information available to the chooser. Applied to cascades: rather than expecting individuals to resist cascade-driven beliefs through willpower, design institutional processes (pre-mortems, evidence requirements, red teams) that structurally counteract cascade dynamics.
Shiller's analysis of speculative bubbles is the definitive treatment of availability cascades in financial markets. His account of the dot-com bubble, the housing bubble, and the bond market bubble demonstrates how narrative-driven availability cascades create and sustain asset price inflation far beyond fundamental value. The book's most useful framework: "naturally occurring Ponzi processes" where early investors' returns attract later investors, whose participation sustains the returns — a financial availability cascade where the availability of success stories drives the behaviour that creates more success stories, until it doesn't.
Availability Cascade Loop — a self-reinforcing cycle where repetition creates perceived importance, perceived importance generates more discussion, and more discussion drives further repetition. The belief doesn't need to be true. It needs to be talked about.
Reinforces
Information Cascades
Information cascades — where people abandon their private information in favour of following the observed behaviour of others — are the decision-making consequence of availability cascades. An availability cascade inflates the perceived probability of an event. An information cascade converts that inflated perception into action: people invest, flee, regulate, or build based not on their own analysis but on the observed behaviour of others who are themselves acting on cascade-inflated perceptions. The 2021 crypto bubble was both simultaneously: an availability cascade (repetition inflating perceived importance) driving an information cascade (people investing because other people were investing).
Tension
FOMO Components
FOMO — fear of missing out — is the emotional fuel that converts availability cascades into action. A cascade tells you "everyone is doing X." FOMO tells you "if you don't do X, you'll be left behind." The combination is potent: the cascade provides the perceived reality (X is happening everywhere), and FOMO provides the emotional urgency (act now or lose your chance). The tension: FOMO can be a valid signal when the underlying opportunity is real. It is a cascade amplifier when the underlying opportunity is inflated by repetition. Distinguishing between the two requires base-rate analysis that FOMO's emotional urgency actively discourages.
Leads-to
Herding
Herding — the tendency of individuals to mimic the actions of a larger group — is the behavioural output of availability cascades. The cascade creates the belief ("this sector is hot," "this asset will appreciate," "this risk is serious"). Herding converts the belief into coordinated action (everyone invests in the same sector, buys the same asset, demands the same regulation). The distinction: the availability cascade operates in the domain of belief. Herding operates in the domain of action. The cascade convinces people that something is true. Herding causes them to act on it simultaneously. The combination produces bubbles, panics, and regulatory overreactions — coordinated collective action based on collectively distorted perception.
The defence against availability cascades is unglamorous and difficult: base-rate thinking. When a narrative feels urgent and universal, stop and ask: what is the actual frequency of the thing being discussed? How does the frequency of discussion compare to the frequency of occurrence? What data would change my mind, and am I seeking it or avoiding it? These questions are simple. They are also psychologically costly — they require resisting the social pressure to agree with the prevailing narrative, the cognitive ease of accepting available information, and the emotional pull of a compelling story. Most people, most of the time, pay the psychological cost of thinking independently only when the cascade has already reversed and the damage is done.
One structural defence works in organisations: the pre-mortem. Before committing to a decision influenced by a popular narrative, ask the team: "Assume we acted on this belief and it turned out to be wrong. What would the post-mortem say?" The pre-mortem forces the team to construct a counter-narrative — to imagine a world where the cascade was wrong — and identify the evidence that would support the alternative. If the pre-mortem produces a plausible failure story with identifiable warning signs, the decision deserves more scrutiny. If the team cannot construct a plausible alternative, the decision may be sound. The pre-mortem doesn't prevent cascades. It creates a structured moment where cascade-resistant thinking is socially permitted.
The asymmetry is the real danger. Availability cascades inflate perceived risks and opportunities that are dramatic, emotional, and story-friendly. They deflate perceived risks and opportunities that are gradual, statistical, and boring. Terrorism cascades. Heart disease doesn't. Crypto cascades. Index fund returns don't. The "Great Resignation" cascaded. The steady, decades-long decline in worker bargaining power didn't. The world's actual risk landscape looks nothing like the world's perceived risk landscape — and the gap between the two is almost entirely explained by which risks generate good stories and which don't. Building a decision-making framework on perceived risk is building on a map that systematically distorts every feature. Base-rate thinking is the correction. It is also the discipline that almost no one practices until after the cascade has already cost them money, time, or credibility.
Scenario 3
A consumer health startup notices that 'gut health' has become a dominant topic on social media, with millions of posts, influencer endorsements, and mainstream media features. The startup's growth team proposes repositioning the company's existing supplement product around gut health messaging, arguing that 'the market is clearly moving in this direction.' The product team notes that their supplement has no specific gut health benefits and that the scientific evidence for most gut health claims is preliminary.