Computer Science & Algorithms
Section 1
The Core Idea
In 1965, Gordon Moore was asked by Electronics magazine to predict what would happen in the semiconductor industry over the next decade. Moore, then director of R&D at Fairchild Semiconductor, examined the data. The number of transistors that could be placed on a single integrated circuit had been doubling roughly every year since the invention of the planar process in 1959. He extrapolated: this doubling would continue for at least ten more years. The article — "Cramming More Components onto Integrated Circuits," published April 19, 1965 — contained a single graph with a trend line that would become the most consequential prediction in the history of technology. Moore projected that by 1975, a single chip would contain 65,000 components. The industry hit that target almost exactly.
In 1975, Moore revised the doubling period from every year to every two years, reflecting the increasing complexity and capital requirements of each new process node. Caltech professor Carver Mead coined the term "Moore's Law" shortly afterward, and the label stuck — despite Moore himself pointing out repeatedly that it wasn't a law of physics but an empirical observation about industrial capability. The distinction matters. Gravity operates whether or not anyone believes in it. Moore's Law operates because an entire industry organizes its R&D roadmaps, capital expenditure cycles, and competitive strategies around the expectation that it will continue. It is a prediction that became a coordination mechanism that became a self-fulfilling prophecy.
The numbers are staggering in their consistency. Intel's 4004 processor in 1971 contained 2,300 transistors on a 10-micron process. The 8086 in 1978 had 29,000. The 486 in 1989 had 1.2 million. The Pentium 4 in 2000 had 42 million. By 2024, Apple's M4 chip contained approximately 28 billion transistors manufactured on TSMC's 3-nanometer process — a 12-million-fold increase over the 4004 in roughly 53 years. Plot those data points on a logarithmic scale and the line is remarkably straight. No other industrial trend in human history has maintained exponential growth over five decades.
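The cadence implied by those data points is easy to check. The sketch below (Python, chosen here purely for illustration; all figures are the ones quoted in the paragraph above) computes how many doublings separate each later chip from the 4004 and the average years per doubling that implies.

```python
import math

# Transistor counts and years as quoted in the text above.
chips = [
    ("Intel 4004", 1971, 2_300),
    ("Intel 8086", 1978, 29_000),
    ("Intel 486",  1989, 1_200_000),
    ("Pentium 4",  2000, 42_000_000),
    ("Apple M4",   2024, 28_000_000_000),
]

base_name, base_year, base_count = chips[0]
for name, year, count in chips[1:]:
    doublings = math.log2(count / base_count)
    years = year - base_year
    print(f"{base_name} -> {name}: {doublings:.1f} doublings over "
          f"{years} years = {years / doublings:.2f} years per doubling")
# The last line works out to roughly 23.5 doublings over 53 years,
# about 2.25 years per doubling, close to the revised two-year cadence.
```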
The economic consequences of this trajectory have reshaped every industry on earth. When transistor density doubles every two years, the cost per transistor halves. In 1970, a single transistor cost roughly one dollar. By 2024, that cost had fallen below one ten-billionth of a dollar. Computation that required a room-sized mainframe costing millions in 1970 now fits in a device that costs $200 and slips into your pocket. This exponential deflation in the cost of logic gates is the substrate on which the entire digital economy was built — from personal computers to the internet to smartphones to artificial intelligence. Every software company, every cloud platform, every social network, every autonomous vehicle project is downstream of the trend Gordon Moore plotted in 1965.
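The cost claim admits the same back-of-the-envelope check. Assuming the rough figures above, about one dollar per transistor in 1970 and under one ten-billionth of a dollar by 2024, the short sketch below computes the number of halvings involved and the average halving period that implies.

```python
import math

# Rough cost figures from the text: ~$1 per transistor around 1970,
# under $1e-10 per transistor by 2024.
cost_1970 = 1.0
cost_2024 = 1e-10
years = 2024 - 1970

halvings = math.log2(cost_1970 / cost_2024)
print(f"{halvings:.1f} halvings over {years} years "
      f"= one halving roughly every {years / halvings:.2f} years")
# About 33 halvings over 54 years, i.e. a halving every ~1.6 years.
```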
What makes Moore's Law intellectually distinctive is that it operates at the intersection of physics, economics, and collective will. The physics sets upper bounds — you cannot make a transistor smaller than an atom. The economics create incentives — whoever reaches the next process node first captures disproportionate margins. And the collective will — the shared expectation across chip designers, equipment manufacturers, materials scientists, and corporate planners that the trend will continue — coordinates the billions of dollars in investment required to make each doubling happen. Intel, TSMC, Samsung, and ASML don't achieve the next node independently. They achieve it because the entire ecosystem plans on the same cadence. Remove the shared expectation and the coordination collapses. This is why Moore's Law is better understood as an industry metronome than as a physical constant.
The law's influence extends far beyond semiconductors. Entire strategy frameworks — disruption theory, technology adoption curves, venture capital return models — assume exponential improvement in price-performance as a background condition. When Marc Andreessen declared in 2011 that "software is eating the world," the implicit assumption was that the hardware underneath the software would continue to get cheaper and more powerful on the Moore's Law trendline. When Jeff Bezos built Amazon Web Services on the bet that computing would become a utility, he was making a Moore's Law wager: that server costs would fall fast enough to make on-demand cloud computing economically rational for every business. The law doesn't just describe semiconductor progress. It underwrites the entire strategic logic of the technology industry.
The counterintuitive consequence of sustained exponential improvement is that human beings consistently underestimate its long-run effects and overestimate its short-run effects. In 1977, Ken Olsen, founder of Digital Equipment Corporation, reportedly said "there is no reason anyone would want a computer in their home." In 1995, Clifford Stoll wrote a Newsweek column arguing that the internet would never replace newspapers, teachers, or physical stores. Both were reasoning linearly about an exponential process. The transistor budgets required for personal computing and internet infrastructure were, at the time of those statements, genuinely insufficient. Two or three doublings later, they were more than sufficient. Moore's Law punishes linear forecasters and rewards those willing to project the curve forward — uncomfortably, counterintuitively — and position themselves for where the exponential is heading rather than where it is today.