The Telephone and the Telegraph
In February 1985, a twenty-nine-year-old worth roughly a quarter of a billion dollars — though he'd lost the same amount in a single year and laughed about it — sat down with Playboy magazine and compared the personal computer to Alexander Graham Bell's telephone. Not the internet, which barely existed outside of university labs and military installations. Not the smartphone, which wouldn't be invented for another two decades. The telephone. "Remember that the first public telegraph was inaugurated in 1844," Steve Jobs told the interviewer, David Sheff. "It was an amazing breakthrough in communications. You could actually send messages from New York to San Francisco in an afternoon. People talked about putting a telegraph on every desk in America to improve productivity. But it wouldn't have worked. It required that people learn this whole sequence of strange incantations, Morse code, dots and dashes, to use the telegraph. It took about 40 hours to learn. The majority of people would never learn how to use it." Then the pivot, which is where the entire story of Apple Computer lives: "So, fortunately, in the 1870s, Bell filed the patents for the telephone. It performed basically the same function as the telegraph, but people already knew how to use it. Also, the neatest thing about it was that besides allowing you to communicate with just words, it allowed you to sing."
It allowed you to sing. Here was the whole thesis, already fully formed in the mind of a man who hadn't yet been fired from the company he'd founded, who hadn't yet spent a decade in the wilderness building expensive machines almost nobody wanted, who hadn't yet acquired a struggling computer graphics division from George Lucas for $5 million and nursed it into a studio that would produce the first entirely computer-animated feature film, who hadn't yet returned to a company ninety days from bankruptcy and transformed it into the most valuable enterprise on Earth. The computer was not a productivity tool. It was not a business machine. It was an instrument for human expression — and the people who built it needed to understand that the difference between the telegraph and the telephone was not a matter of engineering but of empathy. The question was whether technology would demand that humans learn its language, or whether it would learn theirs.
Everything Jobs built, broke, abandoned, and rebuilt over thirty-five years was an attempt to answer that question. And the answer — always, relentlessly, sometimes to the point of cruelty — was the same.
By the Numbers
The Apple Empire
$350B — Apple's approximate market cap at the time of Jobs's death (Oct 2011)
7 — Industries transformed: PCs, animated films, music, phones, tablets, retail, digital publishing
$7.4B — Pixar acquisition price paid by Disney (2006)
346 — Patents listing Jobs as inventor or co-inventor
$1 — Jobs's annual salary as Apple CEO after his return
90 days — Apple's estimated distance from bankruptcy when Jobs returned in 1997
A Machinist's Son in the Valley of Heart's Delight
The Santa Clara Valley, before anyone called it Silicon Valley, was orchards — apricots, cherries, plums — and the aerospace plants and electronics firms that had begun to cluster around Stanford University like iron filings around a magnet. Into this landscape, in 1960, Paul and Clara Jobs moved their family from San Francisco to a three-bedroom ranch house in Mountain View.
Paul Jobs was the kind of man whose biography compresses into verbs. Born in Wisconsin, he never finished high school. He joined the Coast Guard in World War II, ferried troops to Italy for General Patton, got busted down to private with reliable frequency, and came out of the service a machinist. He married Clara Hagopian, an Armenian-American accountant, and when their attempts to have children failed, they adopted a boy — Steven Paul — born on February 24, 1955, to Abdulfattah Jandali, a Syrian political science graduate student at the University of Wisconsin, and Joanne Schieble, an American speech therapist whose Catholic father threatened to disown her if she married a Muslim. The biological parents would later marry and have another child, the novelist Mona Simpson, but Steve wouldn't learn any of this until he was twenty-seven.
What he learned instead, in the garage of the Mountain View house, was how to take things apart. Paul Jobs showed his adopted son the guts of machines — circuits, wiring, the logic of mechanical systems — with the patient authority of a man who'd spent his life understanding how physical things worked. "He had a workbench out in his garage," Jobs recalled in a 1995 Smithsonian oral history, and "he spent a lot of time with me, teaching me how to build things, how to take things apart, put things back together." The elder Jobs's other obsession was craftsmanship: even the back of a fence, the part nobody sees, should be built properly. It was an ethic his son would internalize so completely that, decades later, he would demand the redesign of an internal circuit board — invisible to any user — because he found it aesthetically offensive.
Clara taught Steve to read before kindergarten. He was bored almost immediately and almost permanently. He was expelled from one school for misbehavior. His fourth-grade teacher, Imogene Hill, bribed him into studying — and the bribes worked so well that he tested at a level that prompted administrators to suggest he skip two grades. His parents allowed one. At Homestead High School in Cupertino, he joined the Explorers Club at Hewlett-Packard, where he saw a computer for the first time. When he needed parts for a frequency counter he was building, he did what would become characteristic: he looked up Bill Hewlett's home phone number in the directory and called him directly. He got the parts. He got a summer job. He got, without knowing it yet, an education in the power of simply asking.
Two Steves and a Garage
Stephen Wozniak — universally known as Woz — was five years older, which at that age meant everything and nothing. A mutual friend introduced them in 1969. Where Jobs was mercurial, verbal, charismatic, and interested in everything, Wozniak was gentle, obsessive, and interested in one thing: making circuits sing. He was, by all accounts, among the most gifted digital engineers of his generation — a man who could design an entire computer using fewer chips than anyone thought possible, not for commercial reasons but for the sheer beauty of the reduction. He worked at Hewlett-Packard. He had no interest in starting a company.
Jobs had no interest in not starting one.
Their first collaboration, begun in 1971, was illegal. They built and sold "blue boxes" — devices that generated the precise tones needed to hack the phone system and make long-distance calls for free. The boxes worked. They sold about a hundred of them, at $150 apiece. The technical brilliance was Woz's; the audacity and the salesmanship were Jobs's. "If it hadn't been for the blue boxes," Jobs would later say, "there wouldn't have been an Apple." Not because of the revenue — which was negligible — but because of the revelation: two guys in a bedroom could build a device that gave them control over a multibillion-dollar infrastructure. The world was not fixed. It could be hacked.
Between the blue boxes and Apple, Jobs's biography takes a detour through the countercultural currents of the early 1970s that would have been unremarkable for anyone else of his generation but became, in retrospect, foundational. He enrolled at Reed College in Portland, Oregon, in the fall of 1972 — a small, expensive, fiercely intellectual liberal arts school — and dropped out after one semester. But he didn't leave. He slept on the floor of friends' dorm rooms, returned Coke bottles for food money, and audited the classes that interested him, including, famously, a calligraphy course taught by Robert Palladino. "I learned about serif and sans-serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great," Jobs said in his 2005 Stanford commencement address. "It was beautiful, historical, artistically subtle in a way that science can't capture." None of it had any practical application. Ten years later, when he was designing the Macintosh, it had all the practical application in the world.
He took a job at Atari — the video game company, then at its zenith — in early 1974, saved money, and used it to fund a pilgrimage to India with his Reed College friend Dan Kottke. He went looking for enlightenment. What he found, he said later, was that the people in the Indian countryside weren't using intellect to solve problems but intuition, and that intuition was at least as powerful. He came back to Silicon Valley in the autumn of 1974 with a shaved head, Indian cotton clothing, and a deepened commitment to Zen Buddhism — a practice he would maintain, in various forms, for the rest of his life. The emphasis on emptiness, on the elegance of reduction, on seeing through complexity to essential form: these were not metaphors for his later design philosophy. They were his design philosophy.
The Garage, the Byte Shop, and the Fortune 500
Wozniak had been tinkering with computer designs and showing them off at the Homebrew Computer Club, a gathering of hobbyists and engineers in Menlo Park. In March 1976, he and Jobs showed the early Apple I board at one of these meetings. It was a single-board computer — no case, no keyboard, no power supply. A circuit board. Wozniak saw an elegant machine. Jobs saw a business.
On April 1, 1976, Apple Computer Inc. was incorporated by Steve Jobs, Steve Wozniak, and Ron Wayne — a former Atari colleague who drew up the partnership agreement, contributed the first Apple logo (a baroque pen-and-ink drawing of Newton under the apple tree), and then sold his 10% stake back to Jobs and Wozniak for $800 twelve days later. Wayne's share, had he held it, would eventually have been worth tens of billions of dollars. He has said he has no regrets. This may be the most heroic or the most delusional statement in the history of American capitalism.
They assembled Apple I computers in the Jobs family garage and sold them to hobbyists — including fifty units to the Byte Shop, a local computer retailer — at $666.66 apiece. Jobs funded the operation by selling his Volkswagen minibus; Wozniak sold his HP-65 programmable calculator. The total startup capital was approximately $1,300.
What happened next was the application of a single, radical insight: the personal computer would appeal to a broad audience if — and only if — it did not look like it belonged in a junior high school science fair. Jobs understood this. Almost nobody else did. With his encouragement, Wozniak designed the Apple II — a vastly improved machine, complete with a keyboard, color graphics, and a sleek molded plastic case that Jobs insisted upon. In January 1977, Mike Markkula — a former Intel executive turned business angel, then thirty-three, who had made his fortune on Intel stock options and retired to a house in the hills above Cupertino — invested $250,000 and joined Apple as a partner. Markkula was quiet, analytical, and understood distribution and marketing in ways the two Steves did not. He also wrote Apple's first business plan and helped hire Mike Scott as the company's first CEO.
The Apple II debuted at the West Coast Computer Faire in April 1977 and created an immediate sensation. When the spreadsheet program VisiCalc was released for the platform in 1979, sales skyrocketed. The Apple II became the defining mass-market personal computer of its era. The company's growth was vertiginous: by 1980, Apple was generating hundreds of millions in revenue. On December 12, 1980, Apple went public at $22 per share. By the end of the first day of trading, Steve Jobs's net worth exceeded $200 million. He was twenty-five years old.
And he was already, in the estimation of several colleagues, nearly impossible to work with.
Xerox PARC and the Window into the Future
In December 1979, Jobs negotiated a visit to the Xerox Corporation's Palo Alto Research Center — PARC — one of the most storied research labs in the history of computing, a place where brilliant engineers had invented, among other things, the graphical user interface, the mouse, Ethernet, and the laser printer. Xerox, a company that made its fortune on copiers, had no idea what to do with any of it. Jobs made an arrangement: Xerox would be allowed to invest in Apple's upcoming IPO (they bought 100,000 shares at $10 each) and in exchange, Apple engineers would get a tour of PARC.
What Jobs saw that day changed everything. Instead of the green-on-black command lines that defined computing in 1979, PARC's engineers had built an interface with windows, icons, folders, and a mouse to navigate them. "It was like a veil being lifted from my eyes," Jobs told Walter Isaacson. "I could see what the future of computing was destined to be."
He didn't invent the graphical user interface. He didn't invent the mouse. What he did — and this is the core of the Steve Jobs story, the thread that runs through every product he ever shipped — was see what others had built, understand its implications more deeply than its creators did, and then execute a version of it so refined, so integrated, so empathetic to the human being on the other end of the experience that it became, effectively, a different thing entirely. The Xerox mouse cost $1,000 to build and had three buttons. Jobs had Apple's engineers design one that cost $20 and had a single button. "We found that people would push the wrong button or be scared that they were going to push the wrong button, so they always looked at the mouse instead of the screen," he told Terry Gross in 1996. "So we got it down to one button so that you could never push the wrong button." This is not engineering. This is psychology. This is watching a human being's eyes and understanding where fear lives.
The Xerox mouse cost about $1,000 apiece to build. We had to engineer one that cost 20 bucks to build. But the basic concept of the mouse came originally from a company called SRI, through Xerox and then to Apple.
— Steve Jobs, 1996 interview with Terry Gross
The technologies he saw at PARC were channeled into two Apple projects: Lisa, a business computer, and the Macintosh, a lower-cost machine. Jobs was initially involved with Lisa but was removed from the project by Mike Scott — the company's CEO, who found Jobs's management style disruptive. Jobs then took over the Macintosh team, which had been started by Jef Raskin, an Apple engineer who envisioned a low-cost, easy-to-use computer. Raskin was forced out in early 1981 as Jobs seized control of the project that would define his career.
Insanely Great
The Macintosh team was sequestered in a separate building — Bandley 3 — with a pirate flag flying from the roof. Jobs cultivated an us-against-the-world mentality, positioning the Mac team as artists and rebels fighting the corporate bureaucracy of the rest of Apple. He called his engineers artists. He also screamed at them, belittled their work in front of colleagues, and demanded the redesign of components no user would ever see. The intensity was indistinguishable from the cruelty.
Andy Hertzfeld, the lead designer of the original Mac operating system, described Jobs's effect on his team as a "messianic zeal." Bud Tribble, another early Mac engineer, gave it a different name: the "reality distortion field" — an ability to convince people that the impossible was not merely possible but inevitable. "In his presence, reality is malleable," Tribble said. "He can convince anyone of practically anything." Debi Coleman, who served as the Mac's controller, put it more bluntly: "You did the impossible, because you didn't realize it was impossible."
On January 24, 1984, in a brilliantly choreographed demonstration at the annual shareholders' meeting in Cupertino, Jobs introduced the Macintosh to the world. He pulled it from a bag, inserted a floppy disk, and let the machine speak for itself — literally, using a text-to-speech synthesizer. "Hello," the Macintosh said. "I'm Macintosh. It sure is great to get out of that bag." The audience exploded. The event would later be cited as the archetype of what the marketing world calls "event marketing," though that phrase drains it of its genuine strangeness: a twenty-eight-year-old in a bow tie conjuring a machine that spoke.
The Macintosh was, in many ways, brilliant. It was also underpowered, expensive, and had almost no software. Sales were disappointing. Jobs, who had convinced PepsiCo president John Sculley to become Apple's CEO in 1983 with the legendary challenge — "Do you want to sell sugar water for the rest of your life?" — now found himself outmaneuvered by the very executive he'd recruited. Sculley, who had risen through the consumer-goods world on the strength of the Pepsi Challenge and careful market segmentation, was a different species of corporate animal entirely. He was polished, cautious, data-driven. Jobs was none of these things.
The clash was inevitable. By 1985, with Mac sales stalling and the board losing patience, Sculley convinced Apple's directors to strip Jobs of his operational responsibilities. On September 17, 1985, Steve Jobs resigned from Apple Computer.
He was thirty years old.
The Wilderness and Its Gifts
What followed was, by the standards of Silicon Valley mythology, a period of exile — though the exile was conducted from a mansion in Woodside, California, and involved spending $12 million of his own money to start a new company. Jobs founded NeXT Inc. in September 1985, announced to the world that he would build the most advanced personal computer ever made, and attracted $20 million from Texas billionaire Ross Perot and later an investment from Canon, the Japanese electronics giant.
The NeXT computer, released in 1988, was a work of art — a perfect black magnesium cube, one foot on each side, running an operating system called NEXTSTEP that was years ahead of anything else on the market. It was also $6,500 — far too expensive for the education market Jobs had targeted. It sold poorly. Very poorly. The hardware division was eventually shut down, and NeXT pivoted entirely to its software, which would prove, in a twist of fate that Jobs could not have predicted, to be the foundation of everything Apple would build in the twenty-first century.
But before that redemption, there was Pixar.
In 1986, Jobs acquired a controlling interest in the computer graphics division of Lucasfilm — George Lucas's production company — for $5 million, plus a $5 million capital investment. The division had been founded to develop computer graphics technology for filmmaking but had never produced a commercially successful product. It was, essentially, a money pit staffed by extraordinarily talented people — Ed Catmull, a computer scientist who had pioneered techniques for rendering 3D images, and John Lasseter, a former Disney animator who had been fired for pushing too aggressively for computer animation. Catmull was quiet, methodical, and deeply interested in the organizational conditions that produce creative work; Lasseter was exuberant, emotionally open, and possessed of a storytelling instinct that was immediately apparent to anyone who watched his early short films.
Jobs funded Pixar for nearly a decade, pouring in an estimated $50 million of his own money during years when the company's hardware and software products generated almost nothing. He came close to selling the company multiple times. He did not. On November 22, 1995, Toy Story — the first full-length, entirely computer-animated feature film — was released by Disney. It earned $373 million worldwide. A week later, Pixar went public. Jobs's stake was worth approximately $1.2 billion.
It was at Pixar that he learned to let other creative people flourish and take the lead.
— Walter Isaacson, Steve Jobs biography
The conventional narrative holds that the wilderness years were purely instructive — that Jobs was "humbled" by his failures at NeXT and "matured" at Pixar. The truth is messier. He was humbled and matured, yes, but he was also stubborn, imperious, and intermittently brilliant in exactly the ways he had always been. What changed was subtler: he learned, painfully and slowly, that he could not do everything himself, that creative excellence sometimes required him to create the conditions for other people's genius rather than imposing his own. At Pixar, he largely stayed out of the storytelling. At NeXT, he built an operating system that was technically magnificent but commercially irrelevant — until it wasn't.
Ninety Days from Bankruptcy
By 1996, Apple Computer was dying. The company had cycled through two CEOs after Sculley — Michael Spindler and then Gil Amelio, a semiconductor executive who understood manufacturing but not consumer desire — and had lost nearly $2 billion in the previous two fiscal years. Its market share was collapsing. Its product line was a sprawl of confusingly named models — the Performa, the Quadra, the Power Macintosh, the Centris — with no coherent strategy behind any of them. Microsoft's Windows 95 had captured the world. Apple's board was contemplating a sale. The company was, by most informed estimates, approximately ninety days from insolvency.
What Apple needed, above all, was a modern operating system. Amelio decided to acquire one. He considered Be Inc., founded by former Apple executive Jean-Louis Gassée, but the negotiations stalled over price. Then, in what can only be described as a plot twist engineered by the universe, Apple agreed to acquire NeXT in December 1996 for $429 million. The deal brought NeXT's operating system — the software that would become macOS — into Apple. It also brought Steve Jobs.
He returned initially as an "advisor." Within months, the board had ousted Amelio and Jobs had assumed the role of interim CEO — a title he held, with the word "interim" never quite dropping away, for three years before formally accepting the permanent position in January 2000. His salary was $1 per year. He did not need the money. He needed the company.
What happened in the first weeks of his return has been recounted so many times that it has acquired the quality of scripture, but the details still astonish. Jobs called a product review meeting. Apple had dozens of products. Jobs drew a simple two-by-two grid on a whiteboard: consumer and professional across the top; desktop and portable down the side. Four quadrants, four products. Everything else was to be killed. "Deciding what not to do is as important as deciding what to do," he said. "That's true for companies, and it's true for products."
He cut 70% of the product line. He laid off thousands of employees. He killed the Newton, Apple's personal digital assistant. He shut down the clone licensing program that had allowed other manufacturers to run Apple's operating system, a short-term revenue strategy that was cannibalizing Apple's hardware sales. He partnered with Microsoft — the enemy — securing a $150 million investment from Bill Gates's company and a commitment to continue developing Microsoft Office for the Mac. When Gates's face appeared on a massive screen at the 1997 Macworld Expo to announce the deal, the audience booed. Jobs told them to stop.
In 1998, he introduced the iMac — a translucent, Bondi Blue, gumdrop-shaped all-in-one computer designed by Jonathan Ive, a thirty-one-year-old British industrial designer who had been languishing at Apple under the previous regime. The iMac was friendly, surprising, beautiful, and sold 800,000 units in its first five months. It had no floppy drive — an omission that the tech press treated as heresy and that Jobs treated as inevitability. He was right. He was usually right about what to remove.
The Integrated Man
The period from 1998 to 2011 — the final thirteen years of Jobs's life — constitutes one of the most extraordinary runs in the history of business, and perhaps in the history of industrial design. The products arrived with the regularity of seasons: the iPod in October 2001; the iTunes Store in April 2003; the iPhone on June 29, 2007; the iPad on April 3, 2010. Each one transformed an industry. Several created industries that hadn't existed before.
The pattern was always the same: Jobs identified an existing technology or product category that was needlessly complex, poorly designed, or fundamentally broken. He then assembled a team — almost always including Jony Ive on design and a rotating cast of world-class engineers — to build something that felt, to the user, like it had always been obvious. The iPod was not the first MP3 player. The iPhone was not the first smartphone. The iPad was not the first tablet. But in each case, Jobs's version was the one that made the category real for the mass market, because it was the one that made the technology disappear behind the experience.
The key to understanding this is to understand Jobs's obsession with what he called "end-to-end" control. Apple designed the hardware, wrote the software, built the applications, controlled the retail experience, and — after the opening of the first Apple Store in Tysons Corner, Virginia, on May 19, 2001 — even controlled the physical spaces in which customers encountered the products. No other technology company operated this way. Microsoft licensed its software to dozens of hardware manufacturers. Google gave away its mobile operating system. Dell and HP competed on price. Jobs competed on integration.
"People who are serious about software should make their own hardware," he said, quoting Alan Kay. The corollary, which he never stated but always practiced, was that people who are serious about the customer experience should control every single element of it, from the silicon chip to the unboxing.
You can't connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future.
— Steve Jobs
This was more than a business strategy. It was a philosophical commitment — an extension of the conviction, traceable all the way back to that calligraphy class at Reed, that technology and the humanities were not separate domains but a single discipline. The fonts on the Macintosh. The click-wheel on the iPod. The multitouch glass on the iPhone. The aluminum unibody of the MacBook. Each decision was simultaneously an engineering choice and an aesthetic one, and Jobs refused to accept that these were different conversations. "It's in Apple's DNA that technology alone is not enough," he said at the iPad 2 launch in 2011. "It's technology married with liberal arts, married with the humanities, that yields us the result that makes our heart sing."
The phrase sounds sentimental. The execution was ruthless.
The Missionary and the Market
Jobs's disdain for market research was legendary, and frequently misunderstood. He did not ignore customers. He watched them constantly — watched how they held devices, where their eyes moved on a screen, what made them hesitate, what delighted them. What he refused to do was ask them what they wanted. "It's not the consumers' job to know what they want," he said. When the original Macintosh team suggested conducting focus groups, Jobs shut the discussion down: "No, because customers don't know what they want until we've shown them." He invoked Henry Ford: "If I'd asked customers what they wanted, they would have told me, 'A faster horse!'"
This was not arrogance, or not only arrogance. It was a specific theory of innovation: that truly transformative products cannot be derived from surveys, because they occupy spaces that consumers cannot yet imagine. The telephone could not have been focus-grouped into existence. The automobile could not have been derived from asking horse owners what they wanted. The iPhone could not have been designed by a committee analyzing existing smartphone usage patterns. What was required was taste — a word Jobs used with the precision of a sommelier — combined with the technical knowledge to understand what was possible and the will to make it real.
Edwin Land, the founder of Polaroid, had operated on the same principle half a century earlier, and Jobs revered him. Both men insisted that their inventions would change the fundamental character of human interaction with images and information. Both spent lavishly on research and development, to the dismay of financial analysts. Both believed that demonstration was more persuasive than argument. "If Steve Jobs studied Edwin Land," David Senra has observed on the Founders podcast, "I think every other founder should as well."
The connection was not merely intellectual. Jobs met Land three times during Apple's early rise. Land had dropped out of Harvard, turned his apartment into a laboratory, and made his first scientific breakthrough — a polarizing filter — at nineteen. He built Polaroid into one of the great technology monopolies of its era by insisting that instant photography was not merely a product improvement but a fundamental reordering of the relationship between humans and the images they made of their lives. Jobs understood this instinctively. He operated the same way. The products were not incremental improvements. They were, or aspired to be, instruments of transformation.
The Cruelty and the Craft
Any honest accounting of Steve Jobs must reckon with the cruelty. It was not incidental. It was not a minor flaw in an otherwise admirable character. It was structural, persistent, and — this is the part that makes the story genuinely difficult — productive.
Jobs berated employees publicly. He told engineers their work was "shit" — the word was apparently a favorite — and he did so in front of their colleagues. He cried, raged, and manipulated. He parked in handicapped spaces. He denied the paternity of his daughter Lisa for years, subjecting her mother, Chrisann Brennan, to public humiliation and financial hardship, even as the child bore the name of the computer he was building. Robert Sutton, the Stanford organizational psychologist, cited Jobs as a leading example of what he clinically termed workplace bullying, noting that "the degree to which people in Silicon Valley are afraid of Jobs is unbelievable."
And yet. The people who survived the gauntlet — who withstood the berating, the contempt, the impossible demands — almost universally describe the experience as the most productive of their careers. Tim Cook, who succeeded Jobs as CEO, framed it this way: "What I learned about Steve was that people mistook some of his comments as ranting or negativism, but it was really just the way he showed passion. So that's how I processed it, and I never took issue personally." Cordell Ratzlaff, who worked closely with Jobs on OS X for eighteen months, said simply: "I learned a tremendous amount from him." Debi Coleman's formulation remains the most haunting: "You did the impossible, because you didn't realize it was impossible."
There is a temptation, in writing about Jobs, to resolve this contradiction — to declare him either a visionary whose intensity was the price of greatness or a tyrant whose cruelty was inexcusable regardless of the products it produced. The honest answer is that both are true simultaneously and that the simultaneity is the point. His personality and his products were interrelated, as Isaacson put it, "just as Apple's hardware and software tended to be, as if part of an integrated system." The man who demanded that the inside of the computer be beautiful — the part no one would see — was the same man who demanded that the people around him operate at a level of excellence they didn't believe they were capable of. The cost was borne by human beings. The result was borne by human beings, too.
His insistence on hiring only what he called "A players" was absolute. "I've learned over the years that when you have really good people, you don't have to baby them," he said. "By expecting them to do great things, you can get them to do great things." The corollary, unstated but enforced: those who were not A players were removed with a speed and coldness that verged on the surgical.
Richard Branson observed that Jobs "had a meticulous eye for detail, and surrounded himself with like-minded people to follow his lead." But "follow his lead" is diplomatic. The reality was closer to: perform at the level I demand, or cease to exist in my field of vision.
The Thing That Could Not Be There
Laurene Powell Jobs, in her introduction to Make Something Wonderful, the posthumous collection of Jobs's speeches, emails, and interviews published by the Steve Jobs Archive, offered the most precise description of her husband's cognitive gift: "His mind was never a captive of reality. Quite the contrary: he imagined what reality lacked and set out to remedy it. His ideas were not arguments, but intuitions, born of a true inner freedom and an epic sense of possibility."
This is the quality that gets called the "reality distortion field," but that phrase — coined half-affectionately by Bud Tribble — trivializes it. What Jobs possessed was not the ability to distort reality but the ability to perceive a reality that did not yet exist and then to will it into being. "It is hard enough to see what is already there, to gain a clear view," Laurene wrote. "Steve's gift was greater still: he saw clearly what was not there, what could be there, what had to be there."
The 2007 iPhone launch is the purest example. When Jobs walked onto the stage at the Moscone Center in San Francisco on January 9, 2007, and announced that Apple was introducing "three revolutionary products" — a widescreen iPod with touch controls, a revolutionary mobile phone, and a breakthrough internet communicator — and then revealed that all three were a single device, he was not presenting a gadget. He was revealing a future that would, within five years, reorganize the daily life of billions of people. The App Store, which launched on July 10, 2008, created an entirely new economy — the modern mobile app economy — that would generate hundreds of billions of dollars in annual revenue and transform industries from transportation (Uber, launched 2009) to food delivery to social media to dating to banking.
None of this was in the market research. All of it was in the intuition.
Five Hundred and Fifty-Six Weeks
In October 2003, during a routine CT scan for kidney stones, doctors discovered a tumor on Jobs's pancreas. It was an islet cell neuroendocrine tumor — a rare form of pancreatic cancer that, unlike the more common adenocarcinoma, is often treatable if caught early. Jobs's doctors recommended immediate surgery. Jobs refused. For nine months, he pursued alternative treatments — dietary regimens, herbal remedies, acupuncture, a vegan diet — before finally agreeing to surgery in July 2004.
Whether this delay shortened his life is a matter of medical debate that will never be definitively resolved. What is not debatable is the irony: the man whose entire career was built on the conviction that he saw reality more clearly than others chose, in the matter of his own mortality, to substitute intuition for evidence. The same quality that allowed him to see the iPhone before it existed may have prevented him from seeing the cancer as it was.
In 2009, he received a liver transplant. He continued to work, though his absences grew more frequent and his frame, always slender, became gaunt. On August 24, 2011, he resigned as CEO of Apple. His letter to the board was characteristically brief: "I have always said if there ever came a day when I could no longer meet my duties and expectations as Apple's CEO, I would be the first to let you know. Unfortunately, that day has come."
He died on October 5, 2011, at his home in Palo Alto. He was fifty-six years old. His last words, according to his sister Mona Simpson, who delivered the eulogy, were spoken while looking past the shoulders of his family: "Oh wow. Oh wow. Oh wow."
The Back of the Fence
Apple's market capitalization on the day Jobs died was approximately $350 billion. Within a decade, it would exceed $2 trillion. The company he had built — not once but twice, the second time from near-extinction — had become the most valuable enterprise in the history of capitalism. It had also become, under the stewardship of Tim Cook, a company far better at operations, supply chain management, and shareholder returns than it had been under Jobs. Whether it remained, in some essential way, a company animated by Jobs's vision or merely a company executing the playbook he left behind was a question that grew louder with each passing year, each incremental product update, each record-breaking quarterly earnings report.
But the question misses the deeper legacy, which was never really about Apple. It was about the argument Jobs spent his entire adult life making — through products rather than words — that technology is not neutral, that the choices embedded in a machine's design are moral choices, that the difference between a tool that diminishes the human and a tool that elevates the human is not processing speed or storage capacity but care.
Attention. Love, even, if you're willing to use the word.
"There's lots of ways to be, as a person," Jobs said in 2007. "And some people express their deep appreciation in different ways. But one of the ways that I believe people express their appreciation to the rest of humanity is to make something wonderful and put it out there. And you never meet the people. You never shake their hands. You never hear their story or tell yours. But somehow, in the act of making something with a great deal of care and love, something's transmitted there."
Paul Jobs, the machinist who never graduated from high school, building the back of a fence in a garage in Mountain View — the part nobody would ever see — with the same care he brought to the front. His son, forty years later, demanding that the circuit board inside the Macintosh be beautiful.
The fence. The circuit board. The invisible thing, made right.
Steve Jobs's career offers no single transferable framework — he was too idiosyncratic, too contradictory, too bound to his particular moment in technological history for that. But embedded in the decisions he made, the products he shipped, and the failures he survived are principles of unusual clarity. What follows is an attempt to extract them — not as inspirational platitudes but as operational insights, grounded in the specific evidence of what Jobs actually did.
Table of Contents
- 1. See the telephone, not the telegraph.
- 2. Subtract until it hurts.
- 3. Own the whole stack.
- 4. Refuse to ask — learn to watch.
- 5. Use exile as a laboratory.
- 6. Draw the two-by-two.
- 7. Hire missionaries, fire mercenaries.
- 8. Demonstrate — never argue.
- 9. Make the invisible beautiful.
- 10. Marry the humanities to the sciences.
- 11. Bend reality by setting impossible deadlines.
- 12. Know when to partner with the enemy.
Principle 1
See the telephone, not the telegraph
Jobs's career-long analogy between the personal computer and the telephone was not a marketing slogan. It was a design philosophy. The telegraph required specialized training — Morse code, forty hours of study — and so it remained a tool for professionals. The telephone performed the same function but removed every barrier to human use. "People already knew how to use it," Jobs said. The entire history of Apple's product design is an attempt to build telephones in industries that were selling telegraphs.
The Apple II succeeded because it had a keyboard and a plastic case when other personal computers were bare circuit boards. The Macintosh succeeded because it replaced command-line prompts with icons, folders, and a mouse. The iPod succeeded because the click-wheel made navigating a thousand songs intuitive. The iPhone succeeded because multitouch glass replaced the stylus and the physical keyboard. In every case, the breakthrough was not in the underlying technology — which Apple frequently borrowed or acquired from others — but in the interface between the technology and the human being.
The principle is not about simplification for its own sake. It is about ruthlessly eliminating everything that forces the user to think about the tool rather than the task. The telegraph requires you to learn its language. The telephone speaks yours.
Tactic: When evaluating any product or process, ask: are we building a telegraph or a telephone? If the user needs training to understand it, you haven't finished designing it.
Principle 2
Subtract until it hurts
Jobs's Zen Buddhism training — years of meditation practice, a commitment to the aesthetic of emptiness — was not a biographical curiosity. It was the engine of his design philosophy. "Simplicity is the ultimate sophistication," Apple declared in its early marketing, and Jobs practiced this with a literalism that frequently horrified his colleagues. The iMac had no floppy drive. The MacBook Air had no optical drive. The iPhone had one button. Each time, the tech press declared the omission fatal. Each time, Jobs was right.
The principle extended far beyond physical products. When Jobs returned to Apple in 1997, the company had dozens of product lines. He cut them to four. When confronted with feature requests for the iPod, he stripped the interface to the bare minimum. When designing the Apple Store, he eliminated the cluttered, fluorescent-lit retail experience that had defined electronics shopping and replaced it with clean lines, natural light, and wood tables.
Key decisions defined by what was eliminated, not added.
| Product | What was removed | Industry reaction | Outcome |
|---|---|---|---|
| iMac (1998) | Floppy drive | Outrage | Vindicated within 2 years |
| iPod (2001) | Complex navigation buttons | Skepticism | Defined the MP3 player category |
| iPhone (2007) | Physical keyboard | BlackBerry mocked it | Destroyed BlackBerry |
| MacBook Air (2008) | Optical drive, ports | Called impractical | Set the template for all laptops |
Subtraction is harder than addition. It requires a level of confidence — and taste — that most product teams lack. The instinct in any organization is to add features, to hedge against every possible use case, to say yes. Jobs's genius was in saying no.
Tactic: Before launching any product or initiative, identify the three features or elements you would most like to add — then ask whether removing them would make the core experience stronger. If the answer is yes, remove them.
Principle 3
Own the whole stack
Apple's defining strategic choice — the decision that separates it from virtually every other technology company — is vertical integration. Apple designs its own chips, writes its own operating systems, builds its own applications, manufactures its own hardware (through tightly controlled contract manufacturing), runs its own retail stores, and curates its own content ecosystems. This is expensive. It is complex. It is the source of Apple's margins, Apple's user experience, and Apple's enduring competitive advantage.
Jobs's commitment to end-to-end control was absolute. He would not allow unapproved applications to "pollute" the Apple ecosystem. He insisted that the hardware and software be designed together, as a unified system, because only then could the experience be fully optimized. "People who are serious about software should make their own hardware," he said, channeling Alan Kay.
The logic is straightforward: every interface between your product and someone else's product is a potential point of failure, a seam where the user experience degrades. If you control the whole stack, you control every seam. You can sand them smooth. You can make the transitions invisible. You can build something that feels, to the user, like a single, seamless object rather than an assemblage of components from different suppliers.
The trade-off is that you must be excellent at everything. You cannot hide a weakness behind a partner's strength. This is why Jobs insisted on hiring only A players and firing everyone else — because in an integrated system, every weak link is exposed.
Tactic: Identify the single most critical interface in your customer's experience — the point where your product touches something you don't control — and bring it in-house. Own the seam.
Principle 4
Refuse to ask — learn to watch
Jobs's rejection of focus groups is the most frequently cited and most frequently misunderstood element of his playbook. He did not ignore customer needs. He was obsessed with customer needs. What he refused to do was derive product strategy from customer articulation of those needs, because he understood — correctly — that customers can only articulate desires within the framework of what already exists.
The distinction is between stated preferences and revealed behavior. A focus group in 2006 would have told you that smartphone users wanted a better physical keyboard. Observation of those same users would have told you that they spent most of their time looking at screens, not pressing buttons, and that the keyboard was an obstacle to the screen real estate they actually wanted. Jobs watched. He did not ask.
This is not an argument against customer research. It is an argument against a specific kind of customer research — the kind that asks people to predict their own future behavior. Jobs replaced that with two things: deep observation of how people actually used existing products, and taste — his own, cultivated over decades of studying calligraphy, design, consumer electronics, and the liberal arts — about what a better version of the experience could feel like.
Tactic: Spend a week watching five customers use your product in their natural environment. Do not interview them. Do not ask questions. Record what they do, not what they say. The gap between stated preference and revealed behavior is where your next product lives.
Principle 5
Use exile as a laboratory
Jobs's eleven years away from Apple — 1985 to 1996 — are conventionally narrated as a period of failure redeemed by eventual return. This misses the point. The exile was the education.
At NeXT, Jobs learned to build an operating system from the ground up — NEXTSTEP, the software that would become macOS, iOS, and every other Apple operating system. The hardware failed. The software was the future. At Pixar, he learned to manage creative people without dominating them, to create the organizational conditions under which others could do their best work. He also learned patience: he funded Pixar for nearly a decade before Toy Story generated a return.
What each venture taught Jobs before his return to Apple.
- 1985: Founds NeXT. Builds the NEXTSTEP operating system, the technical foundation for Apple's future.
- 1986: Acquires Pixar from Lucasfilm ($5 million to Lucas, plus $5 million in capital). Begins a decade-long education in patience, creative management, and storytelling.
- 1988: NeXT Cube ships. Critically admired, commercially irrelevant. Jobs learns that elegance alone is not enough.
- 1993: NeXT exits the hardware business. Jobs learns to kill his own products when the market speaks clearly.
- 1995: Toy Story is released. Pixar's IPO makes Jobs a billionaire and validates long-term creative investment.
- 1996: Apple acquires NeXT for $429 million. The exile ends, and everything Jobs learned walks in the door with him.
The most important thing Jobs brought back to Apple was not a technology or a management technique. It was the knowledge — earned through failure, not study — that the best products emerge from the intersection of technical excellence and aesthetic vision, and that this intersection can only be found by someone who has operated on both sides.
Tactic: If you've been removed from a position of influence — fired, sidelined, passed over — treat the interval not as a setback but as a laboratory. Build the thing you couldn't build in your previous role. The constraints are different, and so the lessons will be different, and when you return (or start again), you'll be operating with a toolkit your former self didn't have.
Principle 6
Draw the two-by-two
When Jobs returned to Apple in 1997, the company's product line was an incoherent mess — dozens of models with overlapping specifications and confusing nomenclature. His first major act was radical simplification. He drew a two-by-two grid on a whiteboard: consumer/professional across the top, desktop/portable down the side. Four quadrants. Four products. Everything else was eliminated.
This was not merely a product strategy. It was a cognitive strategy — a way of imposing clarity on an organization that had lost the ability to make decisions. When you have forty products, every decision is a negotiation between competing priorities. When you have four, every decision is clear.
The grid also forced trade-offs. If you're building only four products, each one must be excellent, because there is no secondary product to catch the customers the primary one misses. The two-by-two made mediocrity structurally impossible. It also made focus structurally inevitable — every engineer, every designer, every marketer knew exactly which product they were working on and why.
Tactic: Map your current product line, service offerings, or strategic priorities onto a simple matrix. If you cannot fit them into four quadrants without overlap, you have too many. Cut until the grid is clean.
Principle 7
Hire missionaries, fire mercenaries
Jobs's language around talent was revealing. He called his engineers "artists." He described the Macintosh team as pirates. He said, repeatedly, that A players want to work with other A players, and that the fastest way to destroy a team is to tolerate B players, because "B players hire C players, and C players hire D players, and pretty soon you've got Z players."
The distinction he drew — though he wouldn't have used these terms — was between missionaries and mercenaries. Missionaries believe in the mission. They are drawn to the work itself, to the possibility that the thing they're building might change something. Mercenaries are drawn to the compensation, the status, the brand on the résumé. Both can be talented. Only one produces insanely great work.
Jobs recruited missionaries by creating an environment that only missionaries could survive. The hours were brutal. The criticism was withering. The pay, at least in Apple's early years, was not exceptional by Silicon Valley standards. But the work — the chance to build something that would put a dent in the universe — was irresistible to a certain kind of person. And Jobs knew how to identify that person, often in the first few minutes of an interview, by the quality of their passion.
Tactic: In your next hiring round, ask candidates not what they've accomplished but what they believe — about your industry, your product, the problem you're solving. Missionaries will light up. Mercenaries will pivot to their résumé.
Principle 8
Demonstrate — never argue
Edwin Land, the Polaroid founder Jobs revered, operated on a principle that Jobs adopted wholesale: "No argument in the world can compare with one dramatic demonstration." Jobs's product launches — the Macintosh pulling itself from a bag in 1984, the first iPhone call on a San Francisco stage in 2007, the MacBook Air slid from a manila envelope in 2008 — were not marketing events. They were demonstrations. They showed the product doing the thing it was designed to do, in real time, in front of an audience.
The strategic logic is that a product, if it is truly great, is its own best argument. A PowerPoint slide describing a product's features is a telegraph — it requires the audience to decode abstract information and imagine the experience. A live demonstration is a telephone — the audience experiences the product directly, and the experience does the persuading.
This principle extends beyond product launches. Jobs ran Apple on the principle that showing was always superior to telling. He would walk into a meeting, see a prototype, and react viscerally — "This is shit" or "This is great" — based on the physical experience of the object. He did not read reports about user testing. He tested the product himself.
Tactic: Abolish one presentation from your weekly calendar and replace it with a live demonstration of the thing you're actually building. If the demo isn't compelling, the product isn't ready.
Principle 9
Make the invisible beautiful
Paul Jobs taught his son that even the back of a fence — the side no one would see — should be built with the same care as the front. It was a lesson in integrity in the original sense of the word: wholeness, the refusal to divide the world into the parts that matter and the parts that don't. Jobs applied this to every product he built. He demanded that the internal circuit boards of the Macintosh be aesthetically pleasing. He obsessed over the typography on packaging. He specified that the screws on the inside of the first iPod be invisible.
The practical logic is that quality is not a set of visible decisions. It is a culture — a set of standards that permeate every level of the organization. If you accept mediocrity in the parts of the product no one sees, you have established a cultural norm that mediocrity is acceptable under certain conditions. And cultural norms, once established, spread.
The deeper logic is psychological. Teams that know their work will be judged at every level — including the levels no customer will ever see — operate differently than teams that know only the surface matters. The first kind of team internalizes quality as an identity. The second kind performs it as a task.
Tactic: Pick one invisible element of your product or service — something no customer will ever directly experience — and raise its quality to the level of your best visible work. Then watch what happens to the team's standards everywhere else.
Principle 10
Marry the humanities to the sciences
At the iPad launch in January 2010, Jobs displayed a slide showing a road sign at the intersection of Technology and Liberal Arts. "It's in Apple's DNA that technology alone is not enough," he said. This was not corporate boilerplate. It was the single most important idea in his career.
Jobs dropped out of Reed — a liberal arts college — but the calligraphy course he audited there directly influenced the Macintosh's typography, which in turn influenced the typography of every personal computer built afterward. His pilgrimage to India and his years of Zen practice influenced his commitment to simplicity and reduction. His love of Bob Dylan and the Beatles influenced his understanding of how creative work could move millions of people simultaneously.
The practical implication is that the best product teams are not composed exclusively of engineers. Edwin Land recruited art history graduates for Polaroid. Jobs staffed the original Macintosh team with people who were, in his words, "poets and musicians on the side." The logic is that innovation lives at the intersection of disciplines — that the person who understands both the physics of a touchscreen and the psychology of a human hand will build a better product than either a pure engineer or a pure designer working alone.
Tactic: The next time you hire for a technical role, weight one non-obvious, humanities-oriented skill — writing, music, visual art, history — alongside the technical requirements. Then ensure that skill is valued in the team's daily work, not treated as a curiosity.
Principle 11
Bend reality by setting impossible deadlines
The "reality distortion field" was not magic. It was a management technique — a brutally effective one — that operated on a specific psychological mechanism: if you tell talented people that something must be done in a timeframe they consider impossible, and if you refuse to accept their objections, a significant percentage of them will find a way to do it. Not because the deadline was realistic, but because the belief that it was impossible was the only thing preventing them from finding the solution.
This is dangerous. It burns people out. It produces resentment. It is, when practiced without discernment, a form of abuse. Jobs practiced it without discernment frequently, and the human cost was real. But the products that resulted — shipped on timelines that conventional management would have considered insane — were also real. The original Macintosh team worked eighty-hour weeks for months. The iPhone team worked in secrecy under extreme pressure for over two years. In both cases, the product that emerged was something the team had not believed they could build when they started.
The principle is not "set impossible deadlines." The principle is: understand the difference between a deadline that is impossible because it violates physics and a deadline that is impossible because it violates convention. Jobs was relentless about the second kind and accepted no excuses. He was also, occasionally, wrong about the first kind — and the cost of being wrong fell on the people around him.
Tactic: Set one deadline in your current project plan that your team considers aggressive but not insane, and then pull it in by 30 percent. When they push back, ask them to identify specifically what is physically impossible about the new timeline, not merely uncomfortable. Discard every objection that is merely uncomfortable.
Principle 12
Know when to partner with the enemy
In August 1997, less than a year after his return to Apple, Jobs stood on a stage at Macworld and announced a $150 million investment from Microsoft — the company that Apple's faithful regarded as the enemy, the company that had, in their view, stolen the graphical user interface and used it to build Windows, the operating system that had crushed the Macintosh's market share. When Bill Gates's face appeared on the screen behind Jobs, the audience booed.
Jobs told them to stop. "We have to let go of this notion that for Apple to win, Microsoft has to lose," he said. Apple needed Microsoft Office to remain viable on the Mac platform. Apple needed the $150 million investment as a signal of stability. Apple needed, in short, to survive — and survival required a deal with the only entity that could provide what Apple lacked.
This was not ideological compromise. It was strategic pragmatism — the recognition that a company ninety days from bankruptcy does not have the luxury of purity. Jobs's ability to make this deal, to endure the booing of his own base, to publicly embrace a partner he privately disdained, reveals a dimension of his leadership that is often obscured by the mythology of the visionary loner. He was a visionary. He was also, when necessary, a realist.
Tactic: Identify the competitor or rival whose partnership would most offend your organization's identity — and then ask, honestly, whether that partnership would solve a problem that no one else can solve. If the answer is yes, make the call and endure the booing.
In their words
It's often the same with any new, revolutionary thing. People get stuck as they get older. Our minds are sort of electrochemical computers. Your thoughts construct patterns like scaffolding in your mind. You are really etching chemical patterns. In most cases, people get stuck in those patterns, just like grooves in a record, and they never get out of them.
— Steve Jobs, Playboy interview, February 1985
There's lots of ways to be, as a person. And some people express their deep appreciation in different ways. But one of the ways that I believe people express their appreciation to the rest of humanity is to make something wonderful and put it out there.
— Steve Jobs, 2007, from Make Something Wonderful
You can't connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future. You have to trust in something: your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life.
— Steve Jobs, Stanford Commencement Address, June 2005
It is hard enough to see what is already there, to gain a clear view. Steve's gift was greater still: he saw clearly what was not there, what could be there, what had to be there. His mind was never a captive of reality.
— Laurene Powell Jobs, introduction to Make Something Wonderful
You did the impossible, because you didn't realize it was impossible.
— Debi Coleman, Macintosh team, quoted in Walter Isaacson's Steve Jobs
Maxims
- The telephone always beats the telegraph. Build technology that speaks the user's language, not its own. If the human has to learn to operate the tool, you haven't finished designing it.
- Taste is not a luxury — it is a strategy. The ability to distinguish between good and great, between what works and what sings, is a competitive advantage that compounds over decades and cannot be replicated by engineering alone.
- Focus means saying no. Not to bad ideas — that's easy — but to good ideas that dilute the best ones. Four products instead of forty. One button instead of three. The discipline of refusal is the prerequisite of excellence.
- Your personality and your products are an integrated system. The things you build reflect the standards you enforce. If you tolerate mediocrity in one domain, you will produce it in every domain.
- The wilderness is a curriculum. Exile, failure, and rejection are not interruptions to the career. They are the career. What you build when nobody is watching may become the foundation for everything you build afterward.
- Demonstration is the highest form of persuasion. A great product, shown in real time, is more convincing than any slide deck, pitch, or argument. If the demo doesn't sell it, the product isn't ready.
- Control the seams. Every interface between your product and someone else's is a place where the experience degrades. Own as many seams as you can. Sand them smooth.
- Missionaries outperform mercenaries. The person who believes in the work will always, given enough time, produce better results than the person who believes in the compensation. Build an environment that only missionaries survive.
- Make the invisible beautiful. Quality is not a set of visible decisions — it is a culture. The standard you set for the things no one sees determines the standard for everything.
- Stay hungry, stay foolish. The line from the last issue of the Whole Earth Catalog that Jobs quoted at Stanford in 2005. It is not about poverty or recklessness. It is about refusing the comfort that comes from believing you've already arrived — because that comfort is where curiosity goes to die.