Economics & Markets
Section 1
The Core Idea
In 1865, the English economist William Stanley Jevons published "The Coal Question," a book that disturbed the prevailing assumption of Victorian policymakers. Britain's coal reserves were finite. Steam engines were becoming more efficient. The intuition seemed straightforward: better engines would stretch coal supplies further, buying the empire time. Jevons demonstrated the opposite. Every improvement in steam engine efficiency — from Newcomen's 1% thermal efficiency to Watt's 4% to the compound engines approaching 12% — had been followed not by a reduction in coal consumption but by a dramatic expansion. Britain burned 16 million tons of coal in 1829. By 1865, it burned 83 million tons. The engines consumed less coal per unit of work. The economy consumed vastly more coal in total.
The mechanism is deceptively simple. When a resource becomes more efficient to use, the effective price of the service it provides drops. Cheaper energy per unit of work means more work gets done. Cheaper lighting per lumen means more lumens get deployed. Cheaper compute per operation means more operations get run. The demand increase overwhelms the per-unit savings. Total consumption rises, often by multiples of the original level. The paradox is not that efficiency fails — it succeeds spectacularly at its stated goal. The paradox is that success at the unit level produces the opposite of the expected outcome at the aggregate level.
This is not a theoretical curiosity. It is one of the most consistently observed dynamics in economic history, and it operates across every category of resource and technology.
The Bessemer process made steel 80% cheaper per ton between 1856 and 1890. Global steel production didn't fall by 80%. It rose from 500,000 tons to 28 million tons. Cheap steel enabled skyscrapers, transcontinental railroads, naval fleets, and mass-produced machinery — applications that were structurally impossible when steel cost ten times more. The Green Revolution increased crop yields per acre by 200–300% starting in the 1960s. Global agricultural land use didn't shrink proportionally — food became cheaper, populations grew, diets shifted toward resource-intensive meat, and total agricultural resource consumption expanded. LED lighting uses roughly 75% less electricity per lumen than incandescent bulbs. McKinsey estimated in 2012 that LED adoption would reduce global lighting electricity by 40%. By 2023, the actual reduction was closer to 15%, because the world deployed dramatically more lumens — illuminating buildings, stadiums, digital signage, and decorative installations that were economically infeasible under incandescent pricing.
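The LED figures imply how much lighting demand must have grown, and the arithmetic is worth making explicit. A back-of-envelope check (the function is mine; the inputs are the figures above):

```python
def implied_demand_growth(per_unit_saving, total_saving):
    """Given a per-unit efficiency saving and the observed total
    resource saving, return the implied multiplier on units consumed.

    per_unit_saving: fraction of resource saved per unit of service
                     (LEDs: ~75% less electricity per lumen -> 0.75)
    total_saving:    fraction of total resource actually saved
                     (observed: ~15% of lighting electricity -> 0.15)
    """
    per_unit_use = 1 - per_unit_saving  # energy per lumen, relative to before
    total_use = 1 - total_saving        # total energy, relative to before
    return total_use / per_unit_use     # lumens now / lumens before

# With the article's numbers: 0.85 / 0.25 = 3.4, i.e. the world deploys
# roughly 3.4x the lumens it did under incandescent pricing.
print(round(implied_demand_growth(0.75, 0.15), 1))  # 3.4
```

The gap between a 40% projected saving and a 15% realized one is not measurement error; it is the demand response itself.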
The bandwidth market tells the same story. The cost of transmitting one gigabyte of data fell from approximately $150 in 1998 to under $0.01 by 2023 — a reduction of over 99.99%. Global internet traffic didn't stay flat. It grew from roughly 100 petabytes per month in 2002 to over 400 exabytes per month by 2023 — a four-million-fold increase. Cheaper bandwidth created video streaming, cloud computing, social media, remote work infrastructure, and the entire mobile application economy. None of these existed at 1998 bandwidth prices. All of them are now among the largest sectors in the global economy.
The paradox operates most powerfully when three conditions converge. First, the resource must serve a function with elastic demand — meaning that a price reduction significantly increases the quantity demanded. Coal-powered mechanical work, automobile transportation, and cloud computing all meet this criterion. Second, the efficiency improvement must be large enough to materially lower the effective cost to the end user. A 5% improvement rarely triggers a Jevons effect. A 50% improvement often does. Third, the economy must be sufficiently flexible to absorb the increased demand — through new applications, new users, new industries, or geographic expansion. When all three conditions hold, the rebound effect exceeds 100% of the efficiency gain, and total resource consumption increases. Economists call this "backfire." Jevons simply called it obvious.
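The three conditions can be captured in a toy constant-elasticity demand model. This is my sketch, not anything from Jevons: an efficiency gain cuts the effective price of the service, demand responds with elasticity ε, and total consumption rises — backfire — whenever ε exceeds 1.

```python
def resource_use_after_gain(baseline, gain, elasticity):
    """Constant-elasticity sketch of the rebound effect.

    baseline:   resource consumed before the improvement
    gain:       efficiency multiplier (2.0 = twice the service per unit)
    elasticity: price elasticity of demand for the service
    """
    effective_price_ratio = 1 / gain                    # service gets cheaper
    demand_mult = effective_price_ratio ** (-elasticity)
    return baseline * demand_mult / gain                # resource = service / gain

def rebound(baseline, gain, elasticity):
    """Fraction of the naive 'engineering' savings eaten by extra demand.
    A value above 1.0 is backfire: total consumption rises."""
    expected = baseline * (1 - 1 / gain)                # projected savings, demand held flat
    actual = baseline - resource_use_after_gain(baseline, gain, elasticity)
    return (expected - actual) / expected

# Inelastic demand (eps = 0.5): partial rebound; savings shrink but survive.
print(round(rebound(100, 2.0, 0.5), 2))  # 0.41
# Elastic demand (eps = 1.5): rebound above 100% -- Jevons' backfire.
print(round(rebound(100, 2.0, 1.5), 2))  # 1.83
```

The model also shows why a 5% improvement rarely backfires while a 50% one often does: the demand multiplier compounds on the size of the price drop, so small gains stay inside the rebound regime even at high elasticities.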
The error that Jevons identified has been repeated by policymakers and strategists in every century since. In the 1970s, the US government projected that improved automobile fuel efficiency would reduce national gasoline consumption. It didn't — Americans drove more. In the 2000s, analysts projected that server virtualization would reduce total data center energy consumption. It didn't — virtualization made compute so cheap that workloads proliferated. In the 2020s, AI companies project that more efficient inference will reduce the cost of deploying language models. It will reduce the per-query cost. It will not reduce total spending on inference. The pattern repeats because the underlying economics are invariant: when efficiency reduces the effective price of a service, and demand for that service is elastic, consumption expands.
The concept matters for founders because the technology industry is the most fertile environment for Jevons effects in modern economic history. Software reduces the marginal cost of distribution to near zero.
Moore's Law reduces the cost of compute by roughly 50% every eighteen months. Cloud infrastructure makes server capacity a variable expense instead of a capital investment. Each of these efficiency gains has consistently produced not conservation but explosion — in usage, in spending, in the creation of entirely new markets that didn't exist before the efficiency improvement made them economically viable. Understanding Jevons Paradox doesn't just explain history. It predicts where demand will grow next.
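That compounding rate can be made concrete. A quick calculation, assuming the 50%-per-eighteen-months framing above holds steadily:

```python
def compute_cost_ratio(years, halving_months=18):
    """Relative cost of a fixed unit of compute after `years`,
    if cost halves every `halving_months` months."""
    return 0.5 ** (years * 12 / halving_months)

# After a decade the same workload costs about 1% of what it did --
# exactly the scale of price collapse that triggers backfire.
print(f"{compute_cost_ratio(10):.4f}")  # 0.0098
```

A hundredfold cost decline per decade is why every generation of cheaper compute has summoned workloads, not savings.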