The Lucky Guess That Governed the World
In early 1965, a thirty-six-year-old chemist — not an engineer, not a physicist, not someone who had ever set out to make a prediction of any historical consequence — sat down at his desk at Fairchild Semiconductor in Palo Alto, California, and plotted five data points on a sheet of semi-logarithmic graph paper. The vertical axis counted components per integrated circuit: one, then a few, then dozens. The horizontal axis marked the years since 1959. The points, when connected, made a line that climbed at a steep, regular angle. Gordon Moore drew a ruler along them and extended the line ten years into the future. By 1975, he reckoned, a single chip might contain 65,000 components. He didn't believe the extrapolation would be especially accurate. He was trying to make a point about cost, not prophecy — that cramming more components onto silicon was going to make electronics unimaginably cheap. "I had no idea it would be so precise as a prediction," he said half a century later, still sounding faintly baffled.
The line he drew that afternoon became the metronome of modern life. It governed the pace at which Intel and its competitors invested billions of dollars in fabrication plants. It set the cadence by which personal computers became possible, then portable, then disposable. It made the smartphone, the internet, the self-driving car, and the AI revolution not just imaginable but, in a certain mathematical sense, inevitable. And yet the man who drew it spent twenty years unable to utter the phrase "Moore's Law" without cringing. "A lucky guess," he called it, "that got a lot more publicity than it deserved."
The paradox of Gordon Moore is that the most consequential technologist of the twentieth century was, by his own accounting and everyone else's, the quietest person in every room he entered. He did not invent the integrated circuit — that was his partner Robert Noyce, along with Jack Kilby at Texas Instruments. He did not build the management system that made Intel a fearsome competitor — that was Andy Grove, the Hungarian refugee who became Intel's operational engine. Moore did something more elusive and, arguably, more important: he saw the trajectory. He saw that silicon transistors and the chips made of them would make electronics profoundly cheap and immensely powerful, and that the doubling would continue, year after year, in clockwork fashion, not because physics demanded it but because economics rewarded it. Then he spent three decades quietly ensuring that the future he'd glimpsed actually arrived on schedule.
By the Numbers
Gordon Moore's Legacy
94: Years lived (1929–2023)
3: Institutions cofounded (Fairchild, Intel, Moore Foundation)
50+: Years Moore's Law held true
$5.1B+: Donated through the Gordon and Betty Moore Foundation
$600M: Single gift to Caltech — then the largest ever to a university
10^18: Transistors shipped annually by the industry at its peak under his watch
$300B+: Global semiconductor industry annual revenue by Moore's later years
Pescadero and the Chemistry of Detonation
The Moore family had been in Pescadero since the 1840s — among the first Anglo settlers of the tiny coastal farming village north of Santa Cruz on the San Francisco Peninsula. Gordon Earle Moore was born on January 3, 1929, in a San Francisco hospital only because it was the nearest one to Pescadero, a community whose sole distinction, Moore later noted with dry amusement, was that it was "the only town I know of in California that's smaller now than it was sixty years ago." His father, Walter Harold Moore, was a deputy sheriff for San Mateo County; his mother, Florence Almira Williamson, was a Pescadero native. They were practical, stoic, thrifty people — the kind who measured worth in what you produced, not what you said.
Moore spent his first decade in this pastoral backwater — fishing in Pescadero Creek, an activity that became a lifelong obsession, and absorbing the temperament of a place where nothing moved fast and nobody talked more than necessary. When he was ten, his father received a promotion to Chief Deputy Sheriff and the family relocated to Redwood City, less than five miles from where Moore would eventually build his estate. It was in Redwood City that the quiet boy discovered his first true love. Not a person. A chemistry set.
A neighbor had received one for Christmas. The two boys began experimenting immediately, and Moore's experiments trended, with notable consistency, toward things that exploded. He converted a backyard shed into what his biographers describe as "a sophisticated if risky explosives laboratory," progressing from simple firecrackers to homemade rockets to nitroglycerin. He was, by all accounts, on the road to making dynamite. "Most people who knew me then would have described me as quiet," he recalled decades later, "except for the bombs." The line is vintage Moore — a deadpan understatement that doubles as a precise self-assessment. The boy who nearly had to repeat first grade because of his extreme introversion, who was later assigned to a remedial speech class, had found in chemistry what he could not find in human conversation: a medium through which to express volition. The flashes, bangs, and stinks were not rebellion. They were communication.
He became the first member of his family to attend college, spending two years at San Jose State — where he studied chemistry and, with characteristic efficiency, met Betty Irene Whitaker, a journalism major from a fruits-and-nuts ranch in what was then called the Valley of Heart's Delight and would later be called Silicon Valley. They married in 1950, immediately after his graduation from the University of California, Berkeley, with a bachelor's degree in chemistry. Moore even went to work on his wedding day. A day and a half later, he started graduate school at Caltech.
The Cost Per Word
At Caltech, Moore earned his PhD in chemistry and physics in 1954, specializing in infrared spectroscopy — precise measurement, careful mathematical analysis, intricate equipment. His professors included Linus Pauling ("very intimidating") and Richard Feynman ("a lot of fun"). The education was first-rate. The job market was not. In the early 1950s, California offered few positions for a freshly minted PhD in physical chemistry, so Moore moved his young family east to Silver Spring, Maryland, where he joined the Applied Physics Laboratory at Johns Hopkins University.
There, he did basic research on the physical chemistry of solid rocket propellants, examined the shapes of infrared spectral lines, and studied flames. The work was intellectually interesting and practically meaningless. Moore began calculating the cost per published word of the research papers emerging from his lab and arrived at a figure — roughly five dollars a word — that troubled him. "I wasn't really sure that society was benefiting sufficiently from what I was doing," he said. He didn't know whether anybody was even reading the articles. The pragmatist in him, the Pescadero in him, the son of a deputy sheriff who measured a man by what he accomplished, not what he theorized — this essential Moore recoiled from the prospect of a career spent publishing expensive sentences that no one would read.
When the research group began to break up, Moore started looking for something closer to a real product. He interviewed at several places, including Lawrence Livermore Laboratory, where the work — thermonuclear devices — struck him as "a practical application" of a peculiar sort. He turned the offer down. But the interview had consequences he could not have foreseen: William Shockley, the brilliant, mercurial, Nobel Prize-winning co-inventor of the transistor, had connections at Livermore and obtained access to the names of candidates who had been offered positions and declined them. In 1956, Shockley called Gordon Moore.
The Traitorous Eight and the Architecture of Betrayal
William Shockley — born in London in 1910, raised in Palo Alto, educated at Caltech and MIT, co-recipient of the 1956 Nobel Prize in Physics for his work on the transistor at Bell Laboratories — was, by the testimony of virtually everyone who worked for him, a catastrophe as a manager. He was secretive, distrustful, erratic, and cruel. He once forced his entire staff to take lie detector tests over a minor mishap he was convinced was sabotage. He possessed the kind of genius that made people want to work for him and the kind of personality that ensured they couldn't bear to do so for long. In 1956 he left Bell Labs, returned to the San Francisco Peninsula, and established Shockley Semiconductor Laboratory in Mountain View with backing from Arnold O. Beckman, the instruments magnate. He recruited a team of extraordinary young scientists — a dream roster of talent that included, among others, a twenty-seven-year-old chemist from Johns Hopkins named Gordon Moore and a twenty-eight-year-old physicist from Iowa named Robert Noyce.
Robert Norton Noyce — born in 1927 in Burlington, Iowa, the son of a Congregational minister, state diving champion, Phi Beta Kappa physics graduate of Grinnell College, MIT doctorate in solid-state physics — was everything Moore was not, or at least everything Moore didn't show. Charismatic, voluble, athletic, a natural leader who exuded the confident optimism of a man who expected doors to open and usually found that they did. Where Moore was the quiet observer who saw patterns in data, Noyce was the electric presence who saw patterns in people and organizations. Tom Wolfe, in his celebrated 1983 Esquire profile, painted Noyce as the embodiment of a new American archetype: the Silicon Valley entrepreneur who rejected the feudal hierarchies of eastern industry in favor of meritocratic informality. Together, Moore and Noyce would form one of the great complementary partnerships in the history of American enterprise. But first they had to escape.
By early 1957, the researchers at Shockley Semiconductor had had enough. In May, eight of them — Moore, Noyce, Julius Blank, Victor Grinich, Jean Hoerni, Eugene Kleiner, Jay Last, and Sheldon Roberts — approached Arnold Beckman and explained that they simply couldn't work under Shockley's management anymore. Beckman initially agreed to hire a professional manager and limit Shockley's authority. Then, realizing how destructive the demotion would be to Shockley's career and ego, he reversed course: Shockley would remain director with full powers intact. Take it or leave it.
The eight men left it. On September 18, 1957, they resigned from Shockley Semiconductor. The next day, they signed a contract with Fairchild Camera and Instrument Corporation — a New York firm involved in missiles and satellite systems — backed by $1.3 million from Fairchild and a $500 investment from each of the founding members. Shockley called them "the traitorous eight." It was meant as an epithet. It became a badge of honor. The defection planted the seeds for Silicon Valley's renegade culture — the principle that engineers who disagreed with their bosses didn't negotiate; they competed.
One of the things that made it work was that all eight of us were in, and each of us had a different specialty. We were a prepackaged solution.
— Gordon Moore, recalling the founding of Fairchild Semiconductor
Fairchild's Factory and the Art of Silicon
Fairchild Semiconductor Corporation, launched in Palo Alto, quickly emerged as a major transistor manufacturer. But Moore saw something that others missed — something that would define both Fairchild's success and his own future. No matter how much science went into conceiving silicon wafers, he recognized, there would always be an art-like skill associated with their production. Theory was necessary but insufficient. You had to get your hands dirty. You had to stand in the fabrication room and watch what actually happened when you tried to deposit impurities into silicon, when you tried to etch lines of conductive metal onto a wafer's surface. The gap between what the equations predicted and what the factory produced was where the real breakthroughs lived.
Moore became director of research and development at Fairchild in 1959, after Noyce — who had just co-invented the integrated circuit, one of the most consequential innovations of the twentieth century — was elevated to general manager. As R&D director, Moore supervised the development of the metal-oxide semiconductor (MOS) process, a technique that would prove fundamental to all subsequent chip manufacturing. He led the team that produced early high-frequency silicon transistors. He oversaw the practical realization of Noyce's integrated circuit concept — the wild idea that instead of cutting a silicon wafer apart into individual transistors and reconnecting them with wires, you could manufacture an entire circuit on a single chip.
Jean Hoerni — one of the traitorous eight, a Swiss-born physicist educated at the University of Geneva and Cambridge, whose personality was as prickly as his intellect was brilliant — engineered the planar process: a layer of silicon oxide placed on top of transistors, sealing out contaminants. For Noyce, Hoerni's oxide layer made the fundamental innovation possible: a monolithic integrated circuit in which conductive metal lines — the "wires" — were evaporated directly onto the silicon wafer's surface. This was the combination that made integrated circuits commercially viable. It was also the combination that made both Noyce and Fairchild wealthy when, after extensive litigation with Texas Instruments, Fairchild was granted the patent.
The critical early customers were military. Integrated circuits were expensive, and the military was, as Moore put it, "a cost-insensitive application." Small size and light weight mattered to missiles and spacecraft far more than price. Fairchild supplied the chips that went into the computers aboard Apollo spacecraft. But Moore kept his eye on a different prize — the point at which increasing complexity would drive costs down far enough to make integrated circuits viable for commercial electronics. He saw the inflection coming before almost anyone else.
Sixty Components and a Wild Extrapolation
In February 1965, Lewis Young, editor of the weekly trade journal Electronics, wrote to Gordon Moore. The magazine was planning a thirty-fifth anniversary issue featuring a series titled "The Experts Look at the Future." As director of R&D at Fairchild, Moore was the sole microchip expert invited to contribute. His words would reach 65,000 subscribers. Moore made a giant asterisk mark with his pencil at the top of Young's invitation and underlined an exhortation to himself: "GO-GO."
What followed was "Cramming More Components onto Integrated Circuits," published on April 19, 1965 — a four-page article that, as Intel later discovered when it tried to locate an original copy and couldn't find one, nobody at the time thought to preserve. The article's central insight was economic, not physical. Moore observed that the number of components on a state-of-the-art chip had been roughly doubling every year — from one transistor in 1959 to about sixty components by 1965. He plotted the data on semi-logarithmic paper, drew his straight line, and extrapolated ten years forward: by 1975, a chip might contain 65,000 components.
The actual transistor count for a new memory chip released in 1975 was 65,536. Moore had been accurate to within a single percentage point over a decade — the clockwork precision that led one historian to call the law "the metronome of modern life."
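The arithmetic behind the extrapolation is worth making explicit: doubling annually from a single transistor in 1959 gives 2^16 = 65,536 components sixteen years later, and on semi-logarithmic paper that exponential plots as a straight line. Here is a minimal sketch of the calculation; the starting point and doubling period come from the article, while the code itself is purely illustrative:

```python
import math

# Moore's 1965 extrapolation: components per chip roughly double
# every year, starting from one transistor in 1959.
START_YEAR, START_COUNT = 1959, 1
TARGET_YEAR = 1975

doublings = TARGET_YEAR - START_YEAR        # 16 annual doublings
projected = START_COUNT * 2 ** doublings    # 2**16 = 65,536
print(projected)                            # vs. Moore's round estimate of ~65,000

# On semi-log paper the exponential is a straight line:
# log2(count) rises by exactly 1 for each elapsed year.
for year in (1959, 1962, 1965, 1975):
    count = START_COUNT * 2 ** (year - START_YEAR)
    print(year, count, math.log2(count))
```

Extending the straight line with a ruler, as Moore did, is exactly the same operation as extending this exponential with the formula.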
The article also contained something less frequently cited but arguably more prescient: a set of predictions about what cheap, powerful chips would make possible. "Integrated circuits will lead to such wonders as home computers, automatic controls for automobiles, and personal portable communications equipment," Moore wrote. He anticipated, in 1965, the PC, the smart car, and the smartphone. "The electronic wristwatch needs only a display to be feasible today." He even included a cartoon, showing people standing in line to buy what looked remarkably like a modern laptop.
In 1975, Moore updated his projection, revising the doubling time from one year to two. Carver Mead — a Caltech professor who had first met Moore in 1959 when the Fairchild man showed up unannounced at Mead's lab carrying a briefcase bulging with state-of-the-art transistors — coined the term "Moore's Law" in the early 1970s. Moore couldn't bring himself to say it for two decades. "I couldn't even utter the term 'Moore's Law' for a long time," he told IEEE Spectrum. "It just didn't seem appropriate." Eventually, with characteristic dryness, he made his peace: "A while back I Googled 'Moore's Law' and 'Murphy's Law' and discovered that Moore's Law had more references than Murphy's. It's about as profound a law too."
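Quantitatively, the revision changes the exponent: component counts grow as N0 * 2^((t - t0)/2) rather than N0 * 2^(t - t0). A quick comparison of the two paces, anchored for illustration to the 1975 figure cited above:

```python
# Compare the 1965 pace (doubling every year) with the 1975
# revision (doubling every two years). Illustrative only.
N0, T0 = 65_536, 1975   # the 1975 data point cited above

for t in (1980, 1985, 1995):
    yearly = N0 * 2 ** (t - T0)            # original 1965 pace
    biennial = N0 * 2 ** ((t - T0) / 2)    # revised 1975 pace
    print(t, f"{yearly:.3g}", f"{biennial:.3g}")
```

Over twenty years the gap between the two rates is a factor of a thousand, which is why the 1975 correction mattered so much to anyone planning a fabrication plant.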
It's not a law in any real respect. It was an observation and a projection.
— Gordon Moore, IEEE Spectrum, 2015
One Spring Afternoon
One spring afternoon in 1968, Gordon Moore dropped by Robert Noyce's house in Los Altos Hills. Noyce was mowing the lawn. In the course of their conversation — two men of about forty, standing in the California sun, one pushing a mower and the other presumably watching — Moore suggested that semiconductor memory, an emerging technology, might form the basis of a new company. Noyce agreed.
On July 18, 1968, the two men incorporated a venture they would call Intel — derived from "integrated electronics." They each put up $250,000 of their own money and secured backing from Arthur Rock, the venture capitalist who had helped arrange the original Fairchild deal in 1957. Rock — a Rochester, New York native who had struggled with polio as a child, worked the soda fountain at his father's candy store, and talked his way into a career on Wall Street before essentially inventing venture capital as a West Coast practice — raised additional funding with almost absurd ease. When Bob Noyce and Gordon Moore said they wanted to start a company, investors didn't ask what the company would make. They wrote checks.
Almost immediately, Andrew Grove joined them. Grove — born András Gróf in Budapest in 1936, Jewish, a survivor of Nazi occupation and the 1956 Hungarian revolution, who had arrived in the United States barely able to speak English, earned a PhD in chemical engineering at Berkeley, and was currently Fairchild's assistant director of development — became Intel's first employee. Moore had originally hired Grove at Fairchild, right after Grove finished his doctorate. "Gordon was the only boss I ever had," Grove would say decades later. The two men worked together for more than thirty years. They were an improbable pair: Moore, the introverted Californian fisherman who described himself as "a loner"; Grove, the intense Hungarian refugee with a reputation for aggressive intelligence and a wry, combative wit. "I'm an older, creakier version of the same aggressive young engineer that Gordon hired," Grove told NPR in 2012.
The arrangement was unspoken but clear. Noyce was the visionary and external ambassador — the "Mayor of Silicon Valley," as people called him, the man who could charm a roomful of investors or a congressional committee. Moore was the technologist, the one who understood what silicon could actually do and at what pace. Grove was the operator, the disciplinarian, the manager who turned conviction into execution. Three men, three functions, one company. It would become perhaps the most important company in the history of technology.
Merging Theory and Practice
At Fairchild, Moore had learned a lesson that would define Intel's culture. The gap between the research lab and the production line was where companies died. Scientists conceived brilliant designs; manufacturing engineers struggled to replicate them at scale; the two groups rarely spoke the same language. When Moore and Noyce established Intel, they made a decision that was, by the standards of the era, radical: they would force research scientists and engineers to work directly on the production of chips. There would be no separate research division sealed off from the factory floor. Theory and practice would be merged, literally, in the same building and the same organizational structure.
This was not an abstract management philosophy. It was an insight born from years of watching silicon wafers behave in ways that no equation predicted. Moore understood that semiconductor manufacturing was part science, part art, and part brute empiricism. The critical knowledge lived in the hands of the people who actually made the chips — in the subtle adjustments they made to temperature, timing, and chemical composition that meant the difference between a working device and a useless piece of silicon. Separating those people from the researchers who designed the devices was, in Moore's view, organizational malpractice.
Intel's first big commercial success was the 1103 dynamic random-access memory chip, released in 1970 — the first DRAM to achieve commercial viability, establishing semiconductor memory as the new industry standard and displacing magnetic cores. Then, in 1971, came the product that would transform the company and the world: the Intel 4004, the first commercially available microprocessor. It had been conceived as a contract project for a Japanese calculator company called Busicom, which wanted a set of custom chips. Intel engineer Ted Hoff — working within the culture Moore had established, where practical constraints and theoretical ambitions collided daily — proposed a radical alternative: a general-purpose processor that could be programmed to perform different functions. Intel bought back the rights to the 4004 from Busicom for a fraction of the device's eventual value.
Moore served as executive vice president from 1968 to 1975, then as president and CEO from 1975 to 1987, and as chairman of the board from 1979 to 1997. He was chairman emeritus from 1997 until stepping down in 2006. During his tenure as CEO, the company went through the wrenching decision — driven primarily by Grove — to abandon the memory business on which Intel had been founded and bet the company entirely on microprocessors. Intel lost a third of its workforce in the mid-1980s downturn. Moore backed the pivot. The exponential curve on his semi-log paper demanded it.
We just tried to move the technology at as fast a rate as made sense, and it turns out that it pretty much stays on the same curve.
— Gordon Moore, Fortune, 2002
The Law That Wasn't a Law
Moore's Law is not a law. Moore himself said so, repeatedly, with the patient exasperation of a man who had spent decades correcting a misunderstanding that had become too useful to correct. "It's not a law in any real respect," he told IEEE Spectrum. "It was an observation and a projection." It is not derived from physics. It makes no claim about fundamental limits. It is, at bottom, an economic observation dressed up in the language of technological inevitability — the recognition that the semiconductor industry had strong financial incentives to keep doubling transistor counts, and that it would continue to find ways to do so as long as the economics held.
What transformed Moore's observation from a magazine article into the governing principle of an industry was precisely this economic dimension. Intel CEO Brian Krzanich put it plainly at the fiftieth-anniversary celebration in 2015: "Moore's Law is an economics law, not built so much on physics and chemistry." Doubling the number of transistors that fit in a given area meant the same circuit could be built on half the silicon — and thus at roughly half the cost. The incentive structure was self-reinforcing. Companies that kept pace with the doubling could offer better products at lower prices. Companies that fell behind were destroyed. Moore's Law became, in effect, a competitive standard — a clock that told the entire industry how fast it needed to run.
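The incentive is easy to see in a toy model. Suppose, purely for illustration, that the cost of processing a square millimeter of silicon stays roughly flat while transistor density doubles each generation; the cost per transistor then halves each generation. A minimal sketch, with all numbers hypothetical:

```python
# Toy model of Moore's Law economics. All numbers are hypothetical;
# the point is the shape of the curve, not the specific values.
WAFER_COST_PER_MM2 = 1.00   # assume processing cost per mm^2 stays flat
density = 1_000             # hypothetical transistors per mm^2, generation 0

for generation in range(6):
    cost_per_transistor = WAFER_COST_PER_MM2 / density
    print(f"gen {generation}: {density:>7,} T/mm^2 -> "
          f"${cost_per_transistor:.7f} per transistor")
    density *= 2            # the doubling Moore observed
```

Halving unit cost every generation is what turned the observation into a competitive clock: a company one generation behind was selling the same function at roughly twice the cost.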
It also became, as Moore acknowledged with a mixture of pride and bemusement, a self-fulfilling prophecy. "In one respect it has become a self-fulfilling prophecy," he told Scientific American in 1997. "People know they have to stay on that curve to remain competitive, so they put the effort in to make it happen." The billions of dollars that Intel and its competitors plowed into R&D were not motivated by scientific curiosity; they were motivated by the knowledge that the law demanded progress and that the market would punish anyone who didn't deliver it. The prophecy created the conditions for its own fulfillment.
Over the decades, commentators predicted the death of Moore's Law with metronomic regularity. The limits of physics, the atomic scale, the rising cost of fabrication plants — each was cited as the inevitable endpoint. Moore himself was cautious. "I've never been able to see more than two or three generations ahead without seeing something that looked like a fairly impenetrable barrier," he said. But good engineering kept finding ways around the barriers. From the planar transistor to metal-oxide semiconductors to three-dimensional transistor architectures, each generation of innovation pushed the curve forward.
By the time of the fiftieth-anniversary celebration in May 2015 — held at the Exploratorium on Pier 15 in San Francisco, with Moore, then eighty-six, interviewed onstage by Thomas Friedman before a crowd of gray-haired semiconductor veterans — Intel's Core i5 processor contained 3,500 times the processing power of the original 4004. It offered 90,000 times the energy efficiency at one-60,000th of the cost. If a 1971 Volkswagen Beetle had undergone the same transformation, Krzanich noted, it would travel at 300,000 miles per hour, achieve 2 million miles per gallon, and cost four cents.
The Accidental Entrepreneur and the Nature of Quiet
"I'm a loner," Moore told NPR in 2012, and Andy Grove, sitting beside him, nodded. The self-description was not false modesty or affectation. It was diagnosis. Moore was constitutionally reserved in a way that distinguished him not only from the gregarious Noyce but from virtually every other figure in Silicon Valley's mythology. He did not give rousing speeches. He did not cultivate a public persona. He did not want Moore's Law to be his legacy. "Anything but Moore's Law," he told IEEE Spectrum's Tekla Perry in 2008, when she asked what he'd like to be remembered for. He didn't have the heart, apparently, to accept that the wish was futile.
He called himself "the ultimate accidental entrepreneur," and the characterization was, in one sense, accurate. He had not set out to build companies. He had set out to do chemistry. Each career turn — from Johns Hopkins to Shockley to Fairchild to Intel — was propelled less by ambition than by a pragmatic dissatisfaction with the status quo: the cost per published word was too high; the management was intolerable; the parent company was too bureaucratic; the memory market was collapsing. Moore did not leap toward opportunities. He stepped away from situations that no longer made sense.
And yet the "accidental" framing conceals something. Moore was fiercely competitive. The biographers of Moore's Law: The Life of Gordon Moore, Silicon Valley's Quiet Revolutionary — Arnold Thackray, David C. Brock, and Rachel Jones, who had unparalleled access to Moore, his family, and his contemporaries — make clear that beneath the self-effacing exterior was a man who understood power, strategy, and the ruthless economics of technological competition as well as anyone in the Valley. His quietness was not passivity. It was a different mode of control — the control of the man who sets the tempo rather than the man who gives the orders.
The Moore-Grove partnership illustrated this perfectly. Grove, the survivor of two totalitarian regimes, operated on a principle he distilled into a book title: Only the Paranoid Survive. He pushed, prodded, confronted, fired. Moore set the direction. "Gordon Moore defined the technology industry through his insight and vision," Pat Gelsinger, who joined Intel as an eighteen-year-old technician and rose to become CEO, said after Moore's death. "My career and much of my life took shape within the possibilities fueled by Gordon's leadership at the helm of Intel." The word "leadership" is doing more work in that sentence than it appears to. Moore's leadership was the kind that creates a framework — a law, a culture, a set of expectations — within which more overtly forceful personalities can operate. He led by defining the game, not by playing it most aggressively.
The Fortune and the Foundation
Moore's wealth, derived primarily from his Intel holdings, eventually made him one of the richest men in the world — the 288th richest, according to one ranking that IEEE Spectrum cited in 2008 with the kind of precision Moore himself would have appreciated. He and Betty established the Gordon and Betty Moore Foundation in 2000, endowing it with approximately $5 billion worth of Intel stock — roughly half of their estimated wealth. Since its founding, the foundation has donated more than $5.1 billion to charitable causes, focusing on environmental conservation, scientific research, patient care, and quality of life in the San Francisco Bay Area.
The philanthropic philosophy reflected the same principles that had governed Moore's career. He believed in unrestricted funding — giving money to institutions and trusting them to allocate it wisely rather than dictating specific uses from outside. "Those within the Institute have a much better view of what the highest priorities are than we could have," he said of his donations to Caltech. "We'd rather turn the job of deciding where to use resources over to Caltech than try to dictate it from outside." In 2001, he and Betty personally committed $300 million to Caltech and contributed another $300 million through the foundation — a combined $600 million that represented, at the time, the largest philanthropic donation to an institution of higher learning in history.
He joined Bill Gates and Warren Buffett's Giving Pledge, committing to give away at least half of his wealth. But he did it quietly, without the public performances that characterized some other signatories. The foundation's emphasis on environmental conservation — protecting marine ecosystems, preserving wild salmon habitats, combating overfishing — reflected Moore's lifelong passion for the natural world, rooted in those childhood years fishing in Pescadero Creek. The boy who had grown up in a farming village by the Pacific, surrounded by land his family had settled in the 1840s, spent his final decades trying to protect the landscapes that had formed him.
Moore cofounded three institutions that reshaped their domains. The milestones:
1957: Cofounds Fairchild Semiconductor — the "first trillion-dollar startup" whose alumni spawned Apple, Google, Oracle, AMD, and dozens more.
1965: Publishes "Cramming More Components onto Integrated Circuits" in Electronics magazine, establishing what would become Moore's Law.
1968: Cofounds Intel Corporation with Robert Noyce; Andy Grove joins as first employee.
1971: Intel releases the 4004 — the world's first commercially available microprocessor.
1975: Becomes Intel president and CEO; revises Moore's Law to doubling every two years.
1990: Receives National Medal of Technology from President George H.W. Bush.
2000: Establishes the Gordon and Betty Moore Foundation with approximately $5 billion in Intel stock.
What He Missed, and What He Saw
Moore was disarmingly honest about what he got wrong. "I calibrate my ability to predict the future by saying that in 1980 if you'd asked me about the most important applications of the microprocessor, I probably would have missed the PC," he told MIT Technology Review in 2001. "In 1990 I would have missed the Internet. So here we are just past 2000, I'm probably missing something very important." When he reread his 1965 article decades later, he was surprised to find that he had, in fact, predicted home computers. "I was surprised to find that myself," he told a reporter. "I didn't realize that I had predicted them in 1965."
The modesty was genuine but also, in its way, misleading. Moore did not predict specific applications because he understood something more fundamental: that exponential progress in the underlying technology would create applications that no one could foresee. The important prediction was not the PC or the internet. The important prediction was the exponential itself — the steady, relentless doubling that would make all future applications possible, including the ones that would surprise even the man who had drawn the line on the graph paper.
Asked at the fiftieth-anniversary celebration what the biggest lesson of Moore's Law was, Moore responded: "Once I made a successful prediction, I avoided making another." The crowd laughed. It was a joke, but it was also a statement of epistemological principle. The future is not knowable in its specifics. What is knowable — what Moore grasped before almost anyone else — is the rate of change. If you know the rate, you can build an industry. You don't need to know the destination.
"I wish I had seen the applications earlier," he said that evening, in one of his final major public appearances. "To me the development of the internet was a surprise. I didn't realize it would open up a new world of opportunities. We have just seen the beginning of what computers will do for us."
The Backyard in Hawaii
Gordon Moore died on March 24, 2023, at his home in Waimea, on Hawaii's Big Island, at the age of ninety-four. Betty, his wife of seventy-two years, survived him, along with sons Kenneth and Steven and four grandchildren. He had spent his retirement years on the island, fishing, tending to the foundation, and occasionally granting interviews in which he expressed the same mild astonishment at the persistence of his law that he had expressed since the 1970s.
In one of his last extended interviews, conducted by Rachel Courtland of IEEE Spectrum at his Hawaiian home, Moore was asked if he had come to terms with having a law named after him. "Oh, 20 years or so," he said. "It really took a long time." He was asked about the future. He talked about speech recognition, about machines that could understand context, about a world in which "people with little education are going to be able to participate" and "the digital divide is going to disappear." He was still, at eighty-six, looking at the line on the graph paper and projecting forward.
The backyard of his Hawaiian home overlooked the Pacific — the same ocean that lapped against the coast near Pescadero, where a deputy sheriff's son had learned to fish and to be quiet and to notice the pattern in things that others overlooked. Intel renamed its main Oregon research campus "Gordon Moore Park at Ronler Acres" in April 2022. It is the home of the chipmaker's most advanced research, where each new generation of technology is crafted — where, in other words, engineers continue to bend the arc of Moore's line, pushing against the atomic limits of matter in pursuit of the doubling that a chemist had sketched on a piece of paper nearly sixty years earlier.
"If everything you try works," Moore once said, "you aren't trying hard enough."
A single data point on semi-log paper. Then another. Then a line, drawn with a ruler, extending into the unknown. The rest was engineering.