When two explanations account for the same evidence, prefer the one with fewer assumptions.
That's it. The entire principle fits in a single sentence. Which is, of course, exactly the point.
The principle has survived seven centuries not because it sounds wise but because it works — in science, in business, in medicine, in engineering, in every domain where people build explanations from evidence and need those explanations to hold up under scrutiny.
Its longevity is itself a kind of evidence: the razor has outlasted every intellectual fashion because the underlying logic is structural, not stylistic.
William of Ockham, a 14th-century English Franciscan friar and logician, articulated the idea around 1323 in his Summa Totius Logicae: "It is futile to do with more things that which can be done with fewer." He was writing against the prevailing scholastic tradition — a system of medieval philosophy that had accumulated layers of metaphysical entities to explain natural phenomena. Every observation seemed to require a new category of being, a new hidden cause, a new theoretical apparatus. Ockham's razor cut through the accumulation. If a simpler account explains the same facts, the additional entities aren't wrong — they're unnecessary. And unnecessary complexity carries a cost that most people undercount.
The principle he's most commonly credited with — entia non sunt multiplicanda praeter necessitatem ("entities must not be multiplied beyond necessity") — he never actually wrote. It was attributed to him posthumously, a compressed version of arguments scattered across his logical works. The irony is fitting: the principle of parsimony was itself simplified.
What makes the razor powerful isn't a philosophical preference for elegance. It's a statistical reality about how hypotheses fail. Every assumption in an explanation is a joint that can break. A two-assumption hypothesis has two potential points of failure. A seven-assumption hypothesis has seven. If each assumption carries even a modest probability of being wrong — say 90% likely correct — the two-assumption chain holds with 81% probability. The seven-assumption chain holds with 48%. Same evidence, same explanatory power, radically different fragility. The razor isn't aesthetic. It's structural risk management applied to reasoning.
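The arithmetic behind that fragility claim is one line. A minimal sketch, using the paragraph's illustrative 90% figure and treating the assumptions as independent:

```python
def chain_reliability(n_assumptions: int, p_correct: float = 0.9) -> float:
    """Probability an explanation survives if each of its independent
    assumptions is correct with probability p_correct."""
    return p_correct ** n_assumptions

# Two assumptions vs seven, each 90% likely to hold:
print(round(chain_reliability(2), 2))  # 0.81
print(round(chain_reliability(7), 2))  # 0.48
```

Independence is generous to the long chain, but the qualitative point survives any refinement: every added joint multiplies the survival probability by a factor below one.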
Isaac Newton codified the principle for science in 1687, listing it as the first of his Regulae Philosophandi in the Principia: "We are to admit no more causes of natural things than such as are both true and sufficient to explain their appearances." Newton wasn't speculating about simplicity. He was establishing a methodology — a rule for how to build reliable knowledge. Admit only what's necessary. Everything else is overhead that weakens the structure.
The canonical demonstration came two centuries after Ockham's death. Ptolemy's geocentric model of the heavens worked — it predicted planetary positions with reasonable accuracy. But it required epicycles: small circles riding on larger circles, dozens of them, each a patch to accommodate observations that the core model couldn't explain. Copernicus's heliocentric model accounted for the same observations with dramatically fewer moving parts, and Kepler's elliptical orbits later eliminated the epicycles entirely. The math simplified. The predictions improved. The simpler model wasn't just more elegant — it was closer to the physical truth, because the complexity Ptolemy had added was compensating for a wrong foundational assumption rather than reflecting genuine structure in the data.
This is the razor's deepest insight: unnecessary complexity usually isn't harmless decoration. It's a signal that something foundational is off. When a business plan requires twelve strategic pivots to reach profitability, the complexity isn't sophistication — it's a warning that the core thesis doesn't hold. When a medical diagnosis requires invoking three simultaneous rare conditions, the parsimonious question is whether a single common condition explains all the symptoms. When a founder's explanation for declining revenue involves market timing, competitor tactics, seasonal effects, and a sales team reorganisation, the razor asks: is there one underlying cause that makes the other four unnecessary?
The pattern recurs with striking regularity across centuries of intellectual history. In medicine, the principle surfaces as "diagnostic parsimony" — when a patient presents with multiple symptoms, look first for a single disease that explains all of them before hypothesising multiple concurrent conditions. In law, the standard of proof implicitly favours the prosecution's simplest coherent narrative over the defence's elaborate alternative theories, because jurors — like all humans — correctly intuit that explanations requiring fewer coincidences are more probable. In software engineering, debugging folklore borrows medicine's maxim — "when you hear hoofbeats, think horses, not zebras" — and checks the configuration file before hypothesising a kernel bug. Each domain independently rediscovered the same structural truth: the number of possible complex explanations is infinite, but the number of simple ones that actually fit the evidence is small. Start with the small set.
The critical caveat, and the one most people skip: the razor does not say simpler is always correct. It says don't add complexity unless the evidence demands it. Sometimes the universe is genuinely complex. Quantum mechanics is not simple. The human immune system is not simple. General relativity replaced Newtonian gravity with a more complex model because the simpler one couldn't explain Mercury's orbital precession — and that additional complexity was earned, demanded by data that the parsimonious version couldn't accommodate. Occam's Razor doesn't deny complexity where it exists — it insists that the burden of proof falls on the person adding the assumption, not on the person questioning it.
Section 2
How to See It
The razor operates quietly. It rarely announces itself. But once you recognise the signature — someone stripping away accumulated complexity to find the load-bearing explanation underneath — you see it everywhere. The key tell: someone in the room asks a question that makes a complicated discussion suddenly feel embarrassingly simple. That question is almost always the razor.
Technology
You're seeing Occam's Razor when a debugging engineer ignores the team's elaborate theories about race conditions, concurrency edge cases, and distributed system failures, opens the logs, and finds a typo in a configuration file. The simplest explanation — human error in a manual process — explained 100% of the symptoms. The complex hypotheses weren't technically impossible. They were just unnecessary given the evidence. Most production outages trace back to the simplest possible cause. The teams that recover fastest are the ones that check the simple explanations first.
Business
You're seeing Occam's Razor when a board member asks a founder presenting a 47-slide deck with five revenue streams, three platform plays, and a marketplace flywheel: "What is the one thing that has to be true for this business to work?" The question is the razor in action — cutting through strategic complexity to find the single load-bearing assumption. If that assumption holds, the complexity may be warranted. If it doesn't, the elaborate strategy is decoration on a broken foundation.
Investing
You're seeing Occam's Razor when Warren Buffett passes on an investment because the thesis requires too many things to go right. His famous criterion — "I want to be able to explain why I own something in two sentences" — is the razor applied to portfolio construction. If the bull case requires a paragraph of conditional logic, the position carries more assumption risk than the potential return justifies. The investments that have compounded most reliably for Berkshire — Coca-Cola, American Express, Apple — each have a thesis expressible in a single clause.
Science
You're seeing Occam's Razor when a medical team stops chasing three simultaneous rare diagnoses and tests for the single common condition that explains all the patient's symptoms. In clinical medicine, the maxim is Hickam's Dictum versus Occam's Razor — and the razor wins the majority of the time. A 55-year-old with fatigue, weight gain, and cold intolerance probably has hypothyroidism, not a combination of chronic fatigue syndrome, early-stage lymphoma, and Raynaud's phenomenon. The parsimonious diagnosis isn't lazy. It's statistically correct.
Section 3
How to Use It
Decision filter
"How many assumptions does this explanation require? Could a simpler account — with fewer moving parts — explain the same evidence? If so, start there. Add complexity only when the simple version breaks."
As a founder
When your strategy deck exceeds ten slides, apply the razor to your own thinking. What is the single mechanism by which this business creates value? If the answer requires five interconnected flywheels, you may not have a strategy — you may have a hope. The most durable companies tend to have embarrassingly simple core theses. Google: organise the world's information and sell attention against search intent. Amazon: widest selection, lowest prices, fastest delivery. Stripe: make accepting payments as simple as adding a line of code. The strategic complexity comes later, in execution. But the thesis fits on an index card.
When diagnosing problems, resist the narrative that makes you look sophisticated. If revenue is declining, the simplest explanations — the product isn't good enough, the price is wrong, or the market is smaller than you assumed — deserve the first investigation. Complex explanations involving competitor conspiracies, market timing, and macroeconomic headwinds may feel more flattering. They're usually wrong.
As an investor
Before committing capital, count the assumptions. Write them out explicitly. If your investment thesis requires the market to grow at 40% annually, the management team to execute flawlessly, regulators to remain friendly, and no well-capitalised competitor to enter the space — that's four independent assumptions, each carrying its own failure probability. The razor says: find an investment where fewer things have to go right.
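That audit can be made mechanical. A sketch with hypothetical confidences attached to the four assumptions above — the probabilities are illustrative, not estimates of any real thesis — again treating the assumptions as independent:

```python
from math import prod

# Hypothetical per-assumption confidences (illustrative numbers only).
thesis = {
    "market grows ~40% annually":     0.6,
    "management executes flawlessly": 0.7,
    "regulators remain friendly":     0.8,
    "no funded competitor enters":    0.7,
}

survival = prod(thesis.values())  # joint probability that all four hold
print(f"thesis survives with p = {survival:.2f}")
```

Even with each assumption individually more likely than not, the joint thesis holds with probability below one in four. The razor's instruction is to find positions where this product stays close to one.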
This is why Buffett and Munger keep returning to the concept of the "fat pitch." They're not looking for the most complex arbitrage. They're looking for the situation where the thesis is so simple and the margin of safety so wide that the number of required assumptions approaches one: this business will continue doing roughly what it's already doing.
As a decision-maker
In any meeting where the explanation for a problem keeps getting more elaborate, pause and ask: what is the simplest account that fits all the facts we actually have? Not the facts we're speculating about — the ones we've verified.
Organisations are complexity-generating machines. Every new initiative, process, and reporting layer adds assumptions. The razor is the counterweight: before adding anything, demonstrate that the existing, simpler approach is genuinely insufficient. Andy Grove's principle at Intel — "let chaos reign, then rein in chaos" — was the razor applied to management. Don't add process until the absence of process is demonstrably causing harm. The burden of proof belongs to complexity, not to simplicity.
The operational discipline, across all three contexts, is the same: count your assumptions explicitly. Write them down. Then ask, for each one, whether removing it would change the conclusion. If removing an assumption doesn't change the outcome, it wasn't load-bearing — it was decoration. The razor is the habit of performing this audit before committing resources, capital, or conviction.
Common misapplication: The razor becomes dangerous when it's used to dismiss genuinely complex phenomena. "The simplest explanation" for a market crash, a geopolitical crisis, or a product failure is sometimes too simple — a monocausal narrative that ignores real structural complexity. The razor says prefer fewer assumptions, not zero assumptions. A founder who explains every setback with "we just need more sales" is misapplying the principle — using it to avoid analysis rather than to sharpen it. The tell: if your "simple explanation" keeps failing to predict what happens next, the evidence is demanding more complexity, and the razor requires you to provide it.
Section 4
The Mechanism
Section 5
Founders & Leaders in Action
The razor doesn't generate headlines. It operates in the negative space — in what gets cut, what gets refused, what never ships. The founders and thinkers who wield it most effectively are recognisable not by the complexity of their strategies but by the discipline of their constraints.
What's consistent across these cases — spanning consumer electronics, e-commerce, investing, physics, and the founding of modern science — is that the razor wasn't a tool of convenience. It was a tool of conviction. Each person below faced pressure to add complexity: from boards, from peers, from the institutional momentum of their fields. The razor gave them a principle to push against that pressure — not because simplicity was easy, but because unnecessary complexity was expensive in ways that others hadn't bothered to calculate.
Steve Jobs
CEO, Apple, 1997–2011
When Jobs returned to Apple in September 1997, the company was manufacturing over 350 products — dozens of Macintosh variants, printers, scanners, the Newton PDA, servers aimed at markets Apple had no business competing in. The company was 90 days from insolvency. Every product had a justification, a champion, and a strategic rationale that sounded reasonable in isolation.
Jobs drew a two-by-two grid on a whiteboard. The columns were Consumer and Professional. The rows were Desktop and Portable. Four quadrants. Four products. Everything else was eliminated. The iMac, the Power Mac G3, the iBook, the PowerBook — that was it. Over 340 products were killed in a matter of weeks.
The strategic logic was Occam's Razor applied to product portfolio management: what is the minimum set of products that covers the actual market? The answer required exactly four. Every additional product was an assumption — that Apple could compete in printers, that the Newton had a market, that enterprise servers were a viable play — and none of those assumptions had evidentiary support. Jobs didn't just simplify for aesthetic reasons. He cut because every unnecessary product was consuming engineering talent, manufacturing capacity, and management attention that the four surviving products desperately needed. Apple posted a $309 million profit in fiscal 1998, its first profitable year since 1995.
Years later, Jobs applied the same logic to the iPhone. The original iPhone launched in 2007 without an app store, without copy-and-paste, without multitasking. Competitors mocked the missing features. But Jobs understood that every feature was an assumption about what users needed, and he refused to include assumptions that hadn't been validated by actual usage data. Features were added iteratively as evidence accumulated — the App Store in 2008, multitasking in 2010. Each addition was earned by data, not included by default. The razor didn't just clarify the strategy. It freed the resources to execute it.
Jeff Bezos
Founder & CEO, Amazon, 1994–2021
Bezos's 1997 letter to shareholders — Amazon's first after its IPO — is one of the most parsimonious strategic documents in corporate history. The thesis: the internet was growing rapidly, customer obsession would compound over time, and Amazon would sacrifice short-term profitability to build long-term market position. Three ideas. One page of reasoning. Everything Amazon has done in the 28 years since — AWS, Prime, Marketplace, Alexa, logistics — traces back to those three axioms.
While competitors in the late 1990s built elaborate "portal" strategies involving content partnerships, advertising networks, and media acquisitions — each adding assumptions about what the internet was "really" for — Bezos applied the razor to a simpler question: what do customers want that won't change in ten years? Lower prices. Wider selection. Faster delivery. He said publicly in 2012 that he couldn't imagine a future in which customers wanted higher prices or slower delivery. The variables that wouldn't change were the simplest to identify and the safest to invest against. Amazon's entire strategic architecture sits on that parsimonious foundation. The complexity is in the execution — the logistics network, the fulfilment centres, the recommendation algorithms. But the thesis requires exactly one assumption: customers prefer better over worse. Everything else is engineering.
The razor shows up again in Bezos's communication style. Amazon's famous six-page memo format — replacing PowerPoint presentations with structured narratives for every significant decision — is itself an application of parsimony to corporate reasoning. A slide deck can hide weak logic behind bullet points and visual flourishes. A six-page narrative forces the author to make every assumption explicit, every causal connection visible. The format is a complexity detector: if the argument doesn't hold as continuous prose, the assumptions are probably weaker than the author realises.
Charlie Munger
Vice Chairman, Berkshire Hathaway, 1978–2023
Munger's investment filter was the razor made operational. When evaluating a business, he looked for the simplest possible explanation for its competitive advantage. If the explanation required invoking five layers of strategic genius, network effects, regulatory capture, and switching costs all operating simultaneously — he put it in the "too hard" pile. Not because the analysis was wrong, but because the number of assumptions made the thesis fragile.
His investment in Costco — which Munger held personally and championed for decades — illustrates the method. The thesis: Costco buys in bulk, marks up a fixed 14% maximum, and passes the savings to members who pay an annual fee for access. The business model fits in two sentences. The competitive advantage (extreme cost discipline plus membership lock-in) is structurally simple and observable in the financial statements. No hidden assumptions about future market dynamics. No dependence on management genius. The model works if Costco keeps doing what it's already doing.
Contrast this with the elaborate theses that Wall Street analysts construct for companies whose success requires seven things to go right simultaneously — a new product launch, international expansion, margin improvement, a favourable regulatory ruling, and a cooperative macroeconomic environment. Each assumption sounds reasonable in isolation. In combination, they're a house of cards.
Munger's filter isn't anti-intellectual — he's one of the most voracious readers in the history of business. The filter is probabilistic. Each additional assumption multiplies the ways the thesis can break. Munger wants investments where the thesis breaks only if something fundamental changes about consumer behaviour or economic gravity. Those are the assumptions he's willing to carry. Everything else gets shaved.
Richard Feynman
Physicist, Caltech
Feynman treated explanatory complexity as a diagnostic. If a physicist couldn't explain a result simply, Feynman took it as evidence that the physicist didn't truly understand it. "What I cannot create, I do not understand," he wrote on his Caltech blackboard — but the corollary was equally important: what I cannot simplify, I have not yet grasped.
His development of Feynman diagrams in the late 1940s was the razor applied to quantum electrodynamics. The existing mathematical formalism for particle interactions was technically correct but computationally intractable — pages of integral equations for a single scattering event. Feynman replaced the algebra with visual diagrams: lines for particles, vertices for interactions, simple rules for translating pictures into probabilities. The physics didn't change. The explanatory apparatus became radically simpler. The diagrams let physicists calculate in hours what had previously taken weeks, and they remain the standard notation seventy-five years later.
During the Challenger investigation in 1986, Feynman applied the same instinct. Weeks of testimony had produced institutional complexity — overlapping accounts, bureaucratic self-protection, and no clear causal explanation. Feynman's ice water demonstration with the O-ring rubber took thirty seconds and explained everything the committee needed to know. The shuttle failed because cold rubber doesn't seal. No additional assumptions required. The elaborate institutional narrative — about management pressure, communication breakdowns, and risk acceptance culture — was true, but the physical cause was one material property at one temperature. The razor cut through organisational complexity to reach the mechanical fact.
Isaac Newton
Physicist & Mathematician, Cambridge, 1687
Newton didn't just use the razor. He codified it as the first rule of scientific method. The Regulae Philosophandi in Book III of the Principia (1687) opens with a statement that would have made Ockham proud: "We are to admit no more causes of natural things than such as are both true and sufficient to explain their appearances." Newton then adds the justification that elevates it from preference to principle: "for Nature is pleased with simplicity, and affects not the pomp of superfluous causes."
The application was immediate and consequential. Newton's theory of universal gravitation replaced a thicket of separate explanations — one for falling apples, another for planetary orbits, another for ocean tides, another for the Moon's motion — with a single equation. One force, one inverse-square law, four phenomena explained. Kepler had identified the mathematical patterns in planetary motion but required separate empirical laws for orbital shape, orbital speed, and orbital period. Newton showed that all three were consequences of a single gravitational force operating between masses. Three laws collapsed into one.
The contrast with his rivals sharpened the point. Descartes's vortex theory of planetary motion required the universe to be filled with invisible swirling matter — an elaborate mechanical apparatus with no independent evidence for its existence. Newton's gravitational model required one assumption: masses attract each other in proportion to their product and inversely with the square of their distance. The vortex theory explained the same observations with vastly more machinery. The razor made the choice clear, and three centuries of subsequent physics confirmed it. Newton's gravitational framework predicted the existence of Neptune in 1846 — a planet no one had seen, located mathematically by Urbain Le Verrier based solely on gravitational perturbations in the orbit of Uranus. A theory with one assumption predicted a planet. That's parsimony earning its keep.
The deeper lesson from Newton: the razor isn't just a tool for choosing between existing explanations. It's a design constraint for building new ones. Newton could have added epicycles to his gravitational theory the way Ptolemy had. He refused — not because he couldn't, but because he understood that each added parameter would weaken the theory's predictive power and make it harder to test. The discipline of parsimony in construction, not just in selection, is what separated the Principia from the theories it replaced.
Section 6
Visual Explanation
Section 7
Connected Models
No model works alone. The razor is at its most useful — and its most dangerous — in combination with other frameworks. It disciplines them, trimming excess from strategies, diagnoses, and theses the way an editor trims excess from prose. But it also creates tensions that sharpen your thinking in ways the razor alone cannot.
Here's how it connects to the broader lattice:
Reinforces
First Principles Thinking
First principles decomposition strips a problem to its fundamental truths. Occam's Razor strips an explanation to its essential assumptions. The two operate in sequence: first principles identifies the irreducible components, and the razor ensures you haven't added anything beyond them during reconstruction. When Musk decomposed rocket costs to raw materials, the razor was implicit — the simpler cost structure (materials plus manufacturing) explained reality better than the complex one (legacy vendor margins plus contractual overhead plus institutional inertia). First principles finds the foundation. The razor keeps you from building unnecessary floors on top of it.
Reinforces
Inversion
Inversion asks "what would guarantee failure?" and eliminates those causes. Occam's Razor asks "what's unnecessary?" and eliminates those assumptions. Both models operate by subtraction — removing rather than adding. The reinforcement is operational: when Munger inverts to identify what kills an investment thesis, he's simultaneously applying the razor, stripping the thesis to the minimum number of assumptions required to justify the position. Anything that survives both filters — the inversion test and the parsimony test — tends to be genuinely load-bearing. Anything that doesn't survive was, by definition, either dangerous or unnecessary.
Tension
Second-Order Thinking
Occam's Razor says strip to the simplest sufficient explanation. Second-order thinking says trace the consequences past the obvious first step, which necessarily adds complexity. The tension is real: a parsimonious explanation of a policy change might capture the first-order effect perfectly while missing the second- and third-order consequences entirely. Bastiat's broken window fallacy is a case where the simple explanation (the glazier gains work) was incomplete precisely because it stopped at the first order. The resolution: apply the razor to your hypothesis, then apply second-order thinking to your conclusions. Simplify the model, then complicate the forecast.
Section 8
One Key Quote
"We are to admit no more causes of natural things than such as are both true and sufficient to explain their appearances."
— Isaac Newton, Principia Mathematica, 1687
Section 9
Analyst's Take
Faster Than Normal — Editorial View
The most common failure mode I see with Occam's Razor is people using it as a permission slip to stop thinking. "The simplest explanation is usually correct" gets shortened to "don't overthink it," which gets shortened to "my first instinct is probably right." That's not parsimony. That's laziness wearing parsimony's clothes.
The razor has a prerequisite that its popularisers rarely mention: you have to genuinely consider the competing explanations before you can responsibly dismiss them. Ockham didn't say "don't bother examining complex hypotheses." He said don't retain them once a simpler one accounts for the same evidence. The examination comes first. The cutting comes after. Skip the examination and you're not applying the razor — you're just incurious.
The founders who use this model most effectively share a specific trait: they're comfortable with the discomfort of not knowing. When Bezos reduced Amazon's strategy to three variables, he wasn't being simplistic — he'd spent years studying retail, logistics, and internet economics before concluding that those three variables were the load-bearing ones. When Jobs cut Apple's product line to four, he'd personally reviewed every product, met with every engineering team, and understood exactly what was being sacrificed. The simplicity of the output was the product of exhaustive analysis, not the absence of it.
The razor is most powerful in environments saturated with motivated complexity. Wall Street generates complexity because complexity justifies fees. Consulting firms generate complexity because complexity justifies engagement length. Corporate bureaucracies generate complexity because complexity justifies headcount. In each case, the added assumptions serve the institution rather than the analysis. The razor asks: does this additional layer of the model explain something that the simpler version doesn't? If the answer is no, the complexity exists for someone's benefit — but not for the truth's benefit.
The version of the principle I return to most often isn't Ockham's or Newton's — it's the one attributed to Einstein, possibly apocryphal: "Everything should be made as simple as possible, but not simpler." The "but not simpler" is where most misapplications originate. Oversimplification is just as expensive as overcomplexity, and the razor cuts in both directions. If your parsimonious explanation keeps getting surprised by reality, the evidence is telling you that you've cut too deep. The discipline is knowing when to stop cutting — and that requires the kind of domain knowledge that no heuristic can substitute for.
There's also an organisational dimension that deserves more attention than it gets. Investment banks, management consultancies, enterprise software vendors, and government agencies all have structural incentives to add assumptions, layers, and dependencies — because each one generates revenue, justifies headcount, or extends a contract. The razor is the natural enemy of these institutions, which is why the people who apply it most ruthlessly tend to be outsiders, founders, or leaders with enough authority to override the institutional immune response. Jobs could kill 340 products because he was CEO. A middle manager proposing the same cut would have been fired for insubordination. The razor's difficulty is often political, not intellectual.
Section 10
Test Yourself
The razor sounds straightforward until you have to apply it under pressure — when the simple explanation feels too simple, the complex one feels more "rigorous," and your ego wants to demonstrate analytical depth. The hardest part isn't understanding the principle. It's resisting the social and psychological incentives to add one more assumption, one more variable, one more caveat.
These scenarios test whether you can distinguish genuine parsimony from oversimplification — and, just as importantly, whether you can recognise when the razor is being weaponised to shut down legitimate analysis.
Is this mental model at work here?
Scenario 1
A startup's monthly revenue has declined for three consecutive months. The founder presents a board deck attributing the decline to a combination of seasonal effects, a competitor's pricing move, a sales team reorganisation, and macroeconomic headwinds. A board member asks: 'Have you checked whether the product's core activation metric has changed?'
Scenario 2
A data science team builds a machine learning model with 47 features to predict customer churn. A senior engineer suggests starting with a logistic regression using the three features with highest individual correlation to churn, then adding complexity only if the simple model's performance is insufficient.
Scenario 3
After a plane crash, an investigator dismisses the possibility of mechanical failure because 'pilot error is always the simplest explanation' and closes the investigation without examining the flight recorder or maintenance logs.
Scenario 4
A doctor sees a patient presenting with fatigue, joint pain, and a distinctive butterfly-shaped facial rash. Rather than investigating three separate conditions, she orders a single test for systemic lupus erythematosus, which can cause all three symptoms.
Section 11
Top Resources
The best material on Occam's Razor spans medieval logic, philosophy of science, and applied decision-making. The principle is old enough that the primary sources matter — and recent enough in its practical applications that the business literature is essential reading alongside the philosophy. Start with Sober for the rigorous treatment, then move to the practitioners who demonstrate the razor in action.
Elliott Sober, Ockham's Razors: A User's Guide (2015)
The definitive philosophical treatment. Sober, a philosopher of science at the University of Wisconsin, distinguishes between different versions of the razor — the version that says simpler theories are more likely to be true, the version that says simpler theories are more testable, and the version that says simpler theories are more useful — and examines when each applies. Dense but rewarding. This is where you go when you want to understand why parsimony works, not just that it does.
Based on Feynman's Messenger Lectures at Cornell, this slim volume demonstrates the razor in practice across physics — from gravitation to quantum mechanics. Feynman doesn't cite Ockham, but every chapter embodies the principle: strip the mathematics to its essential structure, find the simplest formulation that preserves the physics, and treat residual complexity as a signal that understanding is incomplete. Chapter 7, "Seeking New Laws," is the best short essay on scientific parsimony ever written.
Buffett's annual letters are the razor applied to investment communication. Every letter distils complex capital allocation decisions into language a high school student could follow — not because the decisions are simple, but because Buffett tests his own understanding by demanding simplicity of expression. The 1996 letter on "cigar butt" investing versus quality compounders is a masterclass in reducing investment philosophy to its minimum viable assumptions.
Munger's collected speeches demonstrate the razor as an investment filter. His "elementary worldly wisdom" framework is explicitly parsimonious — a handful of big ideas from a handful of disciplines, applied repeatedly. The 1994 USC speech on "The Psychology of Human Misjudgment" shows how the razor interacts with cognitive biases: the simplest explanation for most investment errors is that a known psychological bias distorted the analysis. Start there before constructing elaborate theories.
Principia
The Regulae Philosophandi in Book III establish parsimony as the first rule of scientific method. Newton's four rules — admit no more causes than necessary, assign the same causes to the same effects, treat universal properties as universal, and hold inductive conclusions until contradicted — remain the most concise methodological framework in the history of science. The rules occupy fewer than two pages. They restructured human knowledge. Read alongside the gravitational derivations that follow, and you'll see the razor not as an abstract principle but as a working tool — applied in real time to the most consequential scientific argument ever published.
Two hypotheses explain the same evidence. Each added assumption is a joint that can fail. The razor prefers the chain with fewer links.
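The arithmetic behind that caption can be made concrete. Treating each assumption as an independent link that holds with probability p (a toy model, chosen here for illustration), a hypothesis built on n assumptions survives with probability p**n:

```python
def chain_reliability(p_per_assumption: float, n_assumptions: int) -> float:
    """Probability that every assumption in the chain holds,
    treating assumptions as independent (itself a simplifying assumption)."""
    return p_per_assumption ** n_assumptions

# With each assumption 90% likely to be correct:
print(round(chain_reliability(0.9, 2), 2))  # → 0.81
print(round(chain_reliability(0.9, 7), 2))  # → 0.48
```

Same evidence explained either way, but the two-link chain holds four times out of five while the seven-link chain is closer to a coin flip.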
Tension
[Narrative Fallacy](/mental-models/narrative)
Nassim Taleb's Narrative Fallacy warns that humans compulsively simplify complex events into neat, coherent stories — stripping out randomness, contingency, and structural complexity to produce a satisfying narrative. Occam's Razor says prefer simplicity. The tension is uncomfortable: the razor can license exactly the oversimplification that the Narrative Fallacy warns against. "The company failed because the CEO was bad" is parsimonious and often wrong — a narrative that feels explanatory while ignoring market dynamics, timing, and systemic factors. The distinction: the razor demands the simplest explanation that accounts for all the evidence. The Narrative Fallacy produces the simplest story that feels satisfying. Evidence-fitting and story-telling are different operations, and confusing them is how the razor gets weaponised into intellectual laziness.
Leads-to
[Simplify](/mental-models/simplify)
Occam's Razor is the diagnostic; Simplify is the execution. The razor tells you that your product, your strategy, or your organisation has more moving parts than the evidence justifies. Simplify is the operational discipline of actually removing them. Jobs's four-quadrant product grid was the razor's conclusion; the eighteen months of cancelling product lines, reorganising teams, and shutting down factories was the work of simplification. One without the other is incomplete: the razor without execution is academic, and simplification without the razor's diagnostic is arbitrary — you'll cut things that matter along with things that don't.
Leads-to
Circle of Competence
The razor naturally leads to competence boundaries. If your investment thesis requires assumptions about biotech regulation, Chinese monetary policy, and semiconductor supply chains simultaneously — and you're an expert in none of those domains — the razor suggests the thesis is too complex for your knowledge base. Reducing the number of assumptions often means reducing the scope of the analysis to domains where your understanding is deep enough to evaluate each assumption independently. Munger's "too hard" pile is the razor intersecting with the circle: when the minimum number of assumptions required exceeds the number you can personally evaluate, the correct move is to pass.
The razor is hardest to apply inside institutions that profit from complexity.
One final observation. The razor's greatest contribution isn't to the quality of any single explanation — it's to the speed of learning. Simple hypotheses are testable. Complex hypotheses resist testing because they have too many degrees of freedom; when reality contradicts them, you can always adjust one of the seven assumptions rather than abandoning the framework. Simple hypotheses break cleanly, which means you learn faster, iterate faster, and converge on truth faster. The founder who runs ten simple experiments in a quarter learns more than the founder who runs one complex experiment in a year. The razor isn't just about being right. It's about being wrong in a way that teaches you something.