Mathematics & Probability
Section 1
The Core Idea
How many piano tuners work in Chicago? The question sounds absurd — the kind of riddle designed to stump job candidates rather than illuminate anything useful. But the question has an answer, and reaching it requires no data, no research, and no expertise in either pianos or Chicago. It requires only the willingness to decompose an unknowable whole into estimable parts.
Enrico Fermi, the Italian-born physicist who built the first nuclear reactor under the bleachers of the University of Chicago's Stagg Field in December 1942, posed this question to his students not as a test of knowledge but as a demonstration of method. The method: break any apparently unanswerable question into a series of smaller questions, each of which can be estimated within a reasonable range from general knowledge, then multiply the estimates together to arrive at an order-of-magnitude answer.
Chicago's population in the 1950s was roughly three million. Assume an average household size of three — that gives one million households. Perhaps one in five households owns a piano: 200,000 pianos. A piano needs tuning once or twice per year — call it 1.5 times — producing 300,000 tuning appointments annually. A piano tuner can service perhaps four pianos per day and works roughly 250 days per year, yielding 1,000 tunings per tuner per year. Divide 300,000 appointments by 1,000 tunings per tuner: approximately 300 piano tuners. The Yellow Pages for metropolitan Chicago in the 1950s listed between 200 and 300 tuners. The estimate, built from nothing but structured guesswork, landed within the right order of magnitude.
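The chain above is nothing more than a product of rough factors, and a few lines of Python make the arithmetic explicit. The numbers are the guesses from the decomposition, not data:

```python
# Fermi estimate: piano tuners in 1950s Chicago.
# Every number is a rough guess from the decomposition, not data.
population = 3_000_000       # Chicago, 1950s
household_size = 3           # people per household
piano_share = 1 / 5          # fraction of households with a piano
tunings_per_piano = 1.5      # tunings per piano per year
pianos_per_day = 4           # pianos one tuner services per day
work_days = 250              # working days per year

households = population / household_size      # 1,000,000
pianos = households * piano_share             # 200,000
appointments = pianos * tunings_per_piano     # 300,000 per year
capacity = pianos_per_day * work_days         # 1,000 tunings per tuner per year

tuners = appointments / capacity
print(round(tuners))  # prints 300
```

Writing the chain out this way has a side benefit: each named variable is an assumption that can be challenged or revised independently.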
The power of the method is not precision — it is calibration. A Fermi estimate does not tell you the answer is 300. It tells you the answer is hundreds, not tens and not thousands. That distinction, crude as it sounds, eliminates the vast majority of decision-relevant error. The executive who needs to know whether a market opportunity is a $10 million business or a $10 billion business does not need two decimal places. She needs the correct power of ten. Fermi estimation delivers exactly that.
Fermi demonstrated the method's most dramatic application on July 16, 1945, at the Trinity nuclear test in Alamogordo, New Mexico. As the shockwave reached his position, ten miles from the detonation, Fermi dropped small pieces of paper and watched them scatter. From the displacement — roughly two and a half metres — he estimated the blast yield at approximately ten kilotons of TNT. The instrument-measured yield, calculated over subsequent weeks from seismographic data, radiochemical analysis, and blast-damage surveys, was twenty-one kilotons. Fermi's estimate, produced in under thirty seconds with torn paper, was accurate to within a factor of two. The instrumented measurement required months and millions of dollars of equipment. For every decision that depended on knowing the order of magnitude — evacuation radii, fallout patterns, weapon design parameters — Fermi's scraps of paper delivered the relevant answer faster than any instrument could.
The mathematical foundation is the central limit theorem applied to logarithmic estimates. When you decompose a problem into independent sub-estimates and multiply them together, errors in the individual estimates tend to partially cancel: an overestimate on one factor offsets an underestimate on another. In log space, multiplication becomes addition, so the product's total log-error is the sum of the per-factor log-errors. If those errors are independent and unbiased, the sum grows only as the square root of the number of factors, far more slowly than the worst case in which every error compounds in the same direction. This is not intuition. It is statistics: the same logic explains why the geometric mean of a set of independent, unbiased estimates converges on the true value faster than any individual estimate does. Fermi estimation works not despite its roughness but because of it: the decomposition structure exploits error cancellation to extract signal from individually noisy inputs.
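A small simulation makes the cancellation visible. Assume, as a modeling choice, that each factor's log-error is an independent zero-mean Gaussian with each factor typically off by about a factor of two, and that the decomposition has nine factors (both numbers are illustrative assumptions, not from the text):

```python
import math
import random

random.seed(42)
SIGMA = math.log(2)  # assumed: each factor typically off by about 2x
N = 9                # assumed: nine factors in the decomposition
TRIALS = 20_000

# The product's log-error is the sum of independent per-factor log-errors,
# so overshoots and undershoots partially offset each other.
totals = [sum(random.gauss(0, SIGMA) for _ in range(N)) for _ in range(TRIALS)]
rms = math.sqrt(sum(e * e for e in totals) / TRIALS)

print(f"RMS log-error of the product: {rms:.2f}")
print(f"sqrt(n) scaling predicts:     {SIGMA * math.sqrt(N):.2f}")
print(f"worst case, no cancellation:  {SIGMA * N:.2f}")
```

The simulated error tracks the square-root prediction (about 2.1 log units) rather than the no-cancellation worst case (about 6.2), which is the cancellation the paragraph describes.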
The method's deepest contribution is epistemological. It forces the estimator to make explicit what they know, what they don't know, and where the uncertainty is concentrated. A Fermi decomposition is a map of ignorance: each factor in the chain represents a claim about reality, and the factors with the widest uncertainty ranges identify where additional information would most reduce the estimate's error. The physicist who decomposes "how many golf balls fit in a school bus" into volume of bus, volume of golf ball, and packing efficiency has not just estimated a number — she has identified that the binding constraint is the packing efficiency estimate, and that refining it would improve the answer more than refining the bus volume. The decomposition is a prioritisation tool disguised as an arithmetic exercise.
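The prioritisation step can itself be made mechanical. Under some hypothetical low/high ranges for each factor in the golf-ball estimate (the ranges below are illustrative guesses, not measurements), a factor's width in log space measures its share of the answer's uncertainty, and the widest factor is the one worth refining first:

```python
import math

# Hypothetical low/high ranges for each factor in the golf-ball estimate;
# illustrative guesses, not measurements.
factors = {
    "bus volume (m^3)":   (60, 70),     # a bus's dimensions are easy to pace off
    "ball volume (cm^3)": (38, 42),     # a golf ball is about 40 cm^3
    "packing efficiency": (0.50, 0.74), # random loose packing up to ideal spheres
}

# A factor's width in log space measures its contribution to the
# answer's uncertainty; the widest factor is the binding constraint.
widths = {name: math.log(hi / lo) for name, (lo, hi) in factors.items()}
for name, width in sorted(widths.items(), key=lambda kv: -kv[1]):
    print(f"{name}: log-width {width:.2f}")
```

With these ranges, packing efficiency comes out widest, matching the paragraph's point: the map of ignorance tells you where better information buys the most accuracy.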
Every significant resource allocation decision — entering a market, sizing an investment, staffing a project, setting a price — involves estimating quantities that cannot be measured directly. The choice is never between estimation and certainty. It is between structured estimation that makes assumptions explicit and unstructured guessing that hides them. Fermi's method is the discipline of choosing the former.