Economics & Markets
Section 1
The Core Idea
In approximately 1754 BC, the Babylonian king Hammurabi inscribed a principle into basalt that four millennia of institutional design have failed to improve upon. Law 229 of his code stated: if a builder builds a house and the house collapses and kills the owner, the builder shall be put to death. The law was not subtle. It was not proportional by modern standards. But it solved a problem that remains unsolved in most contemporary institutions — the problem of asymmetric consequences. The builder who must inhabit the structure he constructs builds differently from the builder who walks away after collecting payment. Not because of character. Because of structure.
Skin in the game is the principle that the people who make decisions should bear the consequences of those decisions — symmetrically, personally, and inescapably. The concept was given its modern articulation by Nassim Nicholas Taleb, who spent the better part of two decades arguing that the most dangerous feature of modern institutions is the systematic separation of decision-making from consequence-bearing.
A fund manager who collects 2% of assets and 20% of profits regardless of long-term performance has an asymmetric payoff. A CEO with a $60 million severance package has a floor under their downside that the shareholders funding the severance do not. A politician who authorises a war has no personal exposure to the battlefield. In each case, the decision-maker captures a disproportionate share of the upside while transferring a disproportionate share of the downside to someone else. The asymmetry changes the decision. Every time.
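The fee arithmetic can be made concrete with a small sketch. The figures below are invented for illustration (a hypothetical $1bn fund, one +30% year followed by one -30% year, assets held constant for simplicity); only the 2-and-20 structure itself comes from the text above.

```python
# Illustrative sketch of the "2 and 20" payoff asymmetry.
# All figures are hypothetical; no real fund is modelled.

FEE_ASSETS = 0.02   # 2% of assets under management, charged every year
FEE_PROFIT = 0.20   # 20% of gains, charged only in up years

def split_year(aum, fund_return):
    """Return (manager_take, investor_net) for a single year."""
    gross_gain = aum * fund_return
    mgmt_fee = aum * FEE_ASSETS
    perf_fee = max(gross_gain, 0) * FEE_PROFIT  # no clawback in down years
    manager = mgmt_fee + perf_fee
    investor = gross_gain - manager
    return manager, investor

aum = 1_000_000_000  # hypothetical $1bn under management
good = split_year(aum, 0.30)    # a +30% year
bad = split_year(aum, -0.30)    # a -30% year

# Across the two years the fund is roughly flat, but the split is not:
# the manager is paid in both years, while only the investors absorb the loss.
print(f"good year:      manager ${good[0]:>13,.0f}   investors ${good[1]:>14,.0f}")
print(f"bad year:       manager ${bad[0]:>13,.0f}   investors ${bad[1]:>14,.0f}")
print(f"two-year total: manager ${good[0] + bad[0]:>13,.0f}   "
      f"investors ${good[1] + bad[1]:>14,.0f}")
```

Under these assumed numbers the manager nets about $100m over a flat two-year stretch while the investors lose about $100m — the downside never touches the decision-maker, which is the asymmetry the paragraph describes.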
The mechanism is not complex. When you bear the full consequences of being wrong, your definition of "prudent" recalibrates. You check the numbers a second time. You stress-test the assumption. You build the margin of safety wider. When someone else bears the consequences, the recalibration reverses. You round up. You assume the best case. You move faster than the evidence justifies — because speed has upside for you and the downside lands elsewhere. The shift is not conscious. It operates in the same cognitive layer that adjusts your driving speed based on whether you are in your own car or a rental.
The principle predates Taleb by millennia and extends far beyond finance. Roman engineers were required to stand beneath the arch when the scaffolding was removed. Ship captains in the age of sail were the last to leave a sinking vessel — not by tradition alone, but because maritime law held them personally liable for the loss. The Medici bankers of Renaissance Florence invested their personal fortunes alongside their clients'. The pattern across centuries and cultures converges on the same structural insight: systems where decision-makers bear consequences outperform systems where they don't. Not because the people are better. Because the architecture is honest.
The modern economy has systematically dismantled this architecture. Limited liability corporations, government bailouts, golden parachutes, regulatory capture, and the professionalisation of management have all widened the gap between the people who make decisions and the people who absorb their consequences. The evolution was gradual and, at each step, individually rational. Limited liability encouraged entrepreneurial risk-taking. Professional management improved operational efficiency. Government backstops prevented bank runs.
But the cumulative effect was a system in which the people making the largest and most consequential decisions were insulated from the outcomes of those decisions to a degree that would have been inconceivable to Hammurabi, to the Roman engineers, or to the Medici bankers.
The 2008 financial crisis was the most expensive demonstration. The executives who constructed the subprime mortgage apparatus collected hundreds of millions in compensation while the losses — $22 trillion in household wealth — were borne by pensioners, homeowners, and taxpayers who had no seat at the table where the risk was manufactured. Not one senior executive at a major Wall Street firm went to prison. The asymmetry was total. Angelo Mozilo, the CEO of Countrywide Financial who presided over the origination of millions of subprime loans, paid a $67.5 million SEC settlement — less than a third of the compensation he had received during the years the loans were being written. The fine was a rounding error on the profit. The system had been designed so that skin in the game was structurally absent at the exact nodes where it mattered most.
The pattern has a corollary that is equally important and less frequently discussed: skin in the game is not only a risk management mechanism. It is an information system. The Roman engineer standing beneath his own arch has access to information about the arch's structural integrity that no external inspector can replicate — because the engineer's life depends on the accuracy of his assessment. The fund manager whose net worth is in the portfolio evaluates counterparty risk with a precision that the manager investing other people's money cannot match — not because of training or intelligence, but because the personal consequence creates a quality of attention that detached analysis cannot produce. Skin in the game doesn't just change behaviour. It changes perception.
This is why Taleb argues that practitioners who bear consequences generate more reliable knowledge than theorists who don't. The pilot who flies the plane calibrates risk differently from the aviation theorist who models it. The entrepreneur who invested their savings evaluates market demand differently from the consultant who projected it. The difference is not expertise — it is exposure. Consequence is a form of information that cannot be acquired through study, and its absence produces a form of confidence that has no empirical foundation.
Taleb's contribution was not discovering the principle — it is ancient. His contribution was demonstrating that the absence of skin in the game is not merely unfair but structurally fragile. Systems that allow decision-makers to transfer consequences accumulate hidden risk, because the people creating the risk have no personal incentive to measure it accurately. The risk doesn't disappear. It migrates — from those who can see it to those who can't, from those who created it to those who never consented to it — until it concentrates somewhere the system cannot absorb it. Then it detonates.