The Telephone and the Telegraph
In February 1985, a twenty-nine-year-old worth roughly a quarter of a billion dollars — though he'd lost the same amount in a single year and laughed about it — sat down with Playboy magazine and compared the personal computer to Alexander Graham Bell's telephone. Not the internet, which barely existed outside of university labs and military installations. Not the smartphone, which wouldn't be invented for another two decades. The telephone. "Remember that the public telegraph was inaugurated in 1844," Steve Jobs told the interviewer, David Sheff. "It was an amazing breakthrough in communications. You could actually send messages from New York to San Francisco in an afternoon. People talked about putting a telegraph on every desk in America to improve productivity. But it wouldn't have worked. It required that people learn this whole sequence of strange incantations, Morse code, dots and dashes, to use the telegraph. It took about 40 hours to learn. The majority of people would never learn how to use it." Then the pivot, which is where the entire story of Apple Computer lives: "So, fortunately, in the 1870s, Bell filed the patents for the telephone. It performed basically the same function as the telegraph, but people already knew how to use it. Also, the neatest thing about it was that besides allowing you to communicate with just words, it allowed you to sing."
It allowed you to sing. Here was the whole thesis, already fully formed in the mind of a man who hadn't yet been fired from the company he'd founded, who hadn't yet spent a decade in the wilderness building expensive machines almost nobody wanted, who hadn't yet acquired a struggling computer graphics division from George Lucas for $5 million and nursed it into a studio that would produce the first entirely computer-animated feature film, who hadn't yet returned to a company ninety days from bankruptcy and transformed it into the most valuable enterprise on Earth. The computer was not a productivity tool. It was not a business machine. It was an instrument for human expression — and the people who built it needed to understand that the difference between the telegraph and the telephone was not a matter of engineering but of empathy. The question was whether technology would demand that humans learn its language, or whether it would learn theirs.
Everything Jobs built, broke, abandoned, and rebuilt over thirty-five years was an attempt to answer that question. And the answer — always, relentlessly, sometimes to the point of cruelty — was the same.
By the Numbers
The Apple Empire
$350B: Apple's market capitalization at the time of Jobs's death (October 2011)
7: industries transformed (PCs, animated films, music, phones, tablets, retail, digital publishing)
$7.4B: price Disney paid to acquire Pixar (2006)
346: patents listing Jobs as inventor or co-inventor
$1: Jobs's annual salary as Apple CEO after his return
90 days: Apple's estimated distance from bankruptcy when Jobs returned in 1997
A Machinist's Son in the Valley of Heart's Delight
The Santa Clara Valley, before anyone called it Silicon Valley, was orchards — apricots, cherries, plums — and the aerospace plants and electronics firms that had begun to cluster around Stanford University like iron filings around a magnet. Into this landscape, in 1960, Paul and Clara Jobs moved their family from San Francisco to a three-bedroom ranch house in Mountain View.
Paul Jobs was the kind of man whose biography compresses into verbs. Born in Wisconsin, he never finished high school. He joined the Coast Guard in World War II, ferried troops to Italy for General Patton, got busted down in rank with reliable frequency, and came out of the service a machinist. He married Clara Hagopian, an Armenian-American accountant, and when their attempts to have children failed, they adopted a boy — Steven Paul — born on February 24, 1955, to Abdulfattah Jandali, a Syrian political science graduate student at the University of Wisconsin, and Joanne Schieble, an American speech therapist whose Catholic father threatened to disown her if she married a Muslim. The biological parents would later marry and have another child, the novelist Mona Simpson, but Steve wouldn't learn any of this until he was twenty-seven.
What he learned instead, in the garage of the Mountain View house, was how to take things apart. Paul Jobs showed his adopted son the guts of machines — circuits, wiring, the logic of mechanical systems — with the patient authority of a man who'd spent his life understanding how physical things worked. "He had a workbench out in his garage," Jobs recalled in a 1995 Smithsonian oral history, and "he spent a lot of time with me, teaching me how to build things, how to take things apart, put things back together." The elder Jobs's other obsession was craftsmanship: even the back of a fence, the part nobody sees, should be built properly. It was an ethic his son would internalize so completely that, decades later, he would demand the redesign of an internal circuit board — invisible to any user — because he found it aesthetically offensive.
Clara taught Steve to read before kindergarten. He was bored almost immediately and almost permanently. He was expelled from one school for misbehavior. His fourth-grade teacher, Imogene Hill, bribed him into studying — and the bribes worked so well that he tested at a level that prompted administrators to suggest he skip two grades. His parents allowed one. At Homestead High School in Cupertino, he joined the Explorer's Club at Hewlett-Packard, where he saw a computer for the first time. When he needed parts for a frequency counter he was building, he did what would become characteristic: he looked up Bill Hewlett's home phone number in the directory and called him directly. He got the parts. He got a summer job. He got, without knowing it yet, an education in the power of simply asking.
Two Steves and a Garage
Stephen Wozniak — universally known as Woz — was five years older, which at that age meant everything and nothing. A mutual friend introduced them in 1969. Where Jobs was mercurial, verbal, charismatic, and interested in everything, Wozniak was gentle, obsessive, and interested in one thing: making circuits sing. He was, by all accounts, among the most gifted digital engineers of his generation — a man who could design an entire computer using fewer chips than anyone thought possible, not for commercial reasons but for the sheer beauty of the reduction. He worked at Hewlett-Packard. He had no interest in starting a company.
Jobs had no interest in not starting one.
Their first collaboration, in 1972, was illegal. They built and sold "blue boxes" — devices that generated the precise tones needed to hack the phone system and make long-distance calls for free. The boxes worked. They sold about a hundred of them, at $150 apiece. The technical brilliance was Woz's; the audacity and the salesmanship were Jobs's. "If it hadn't been for the blue boxes," Jobs would later say, "there wouldn't have been an Apple." Not because of the revenue — which was negligible — but because of the revelation: two guys in a bedroom could build a device that gave them control over a multibillion-dollar infrastructure. The world was not fixed. It could be hacked.
Between the blue boxes and Apple, Jobs's biography takes a detour through the countercultural currents of the early 1970s that would have been unremarkable for anyone else of his generation but became, in retrospect, foundational. He enrolled at Reed College in Portland, Oregon, in the fall of 1972 — a small, expensive, fiercely intellectual liberal arts school — and dropped out after one semester. But he didn't leave. He slept on the floor of friends' dorm rooms, returned Coke bottles for food money, and audited the classes that interested him, including, famously, a calligraphy course taught by Robert Palladino. "I learned about serif and sans-serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great," Jobs said in his 2005 Stanford commencement address. "It was beautiful, historical, artistically subtle in a way that science can't capture." None of it had any practical application. Ten years later, when he was designing the Macintosh, it had all the practical application in the world.
He took a job at Atari — the video game company, then at its zenith — in early 1974, saved money, and used it to fund a pilgrimage to India with his Reed College friend Dan Kottke. He went looking for enlightenment. What he found, he said later, was that the people in the Indian countryside weren't using intellect to solve problems but intuition, and that intuition was at least as powerful. He came back to Silicon Valley in the autumn of 1974 with a shaved head, Indian cotton clothing, and a deepened commitment to Zen Buddhism — a practice he would maintain, in various forms, for the rest of his life. The emphasis on emptiness, on the elegance of reduction, on seeing through complexity to essential form: these were not metaphors for his later design philosophy. They were his design philosophy.
The Garage, the Byte Shop, and the Fortune 500
Wozniak had been tinkering with computer designs and showing them off at the Homebrew Computer Club, a gathering of hobbyists and engineers in Menlo Park. In March 1976, he and Jobs showed the early Apple I board at one of these meetings. It was a single-board computer — no case, no keyboard, no power supply. A circuit board. Wozniak saw an elegant machine. Jobs saw a business.
On April 1, 1976, Apple Computer was founded as a partnership by Steve Jobs, Steve Wozniak, and Ron Wayne — a former Atari colleague who drew up the partnership agreement, contributed the first Apple logo (a baroque pen-and-ink drawing of Newton under the apple tree), and then sold his 10% stake back to Jobs and Wozniak for $800 twelve days later. Wayne's share, had he held it, would eventually have been worth tens of billions of dollars. He has said he has no regrets. This may be the most heroic or the most delusional statement in the history of American capitalism.
They assembled Apple I computers in the Jobs family garage and sold them to hobbyists — including fifty units to the Byte Shop, a local computer retailer — at $666.66 apiece. Jobs funded the operation by selling his Volkswagen minibus; Wozniak sold his HP-65 programmable calculator. The total startup capital was approximately $1,300.
What happened next was the application of a single, radical insight: the personal computer would appeal to a broad audience if — and only if — it did not look like it belonged in a junior high school science fair. Jobs understood this. Almost nobody else did. With his encouragement, Wozniak designed the Apple II — a vastly improved machine, complete with a keyboard, color graphics, and a sleek molded plastic case that Jobs insisted upon. In January 1977, Mike Markkula — a former Intel executive turned business angel, then thirty-three, who had made his fortune on Intel stock options and retired to a house in the hills above Cupertino — invested $250,000 and joined Apple as a partner. Markkula was quiet, analytical, and understood distribution and marketing in ways the two Steves did not. He also wrote Apple's first business plan and helped hire Mike Scott as the company's first CEO.
The Apple II debuted at the West Coast Computer Faire in April 1977 and created an immediate sensation. When the spreadsheet program VisiCalc was released for the platform in 1979, sales skyrocketed. The Apple II became the first mass-market personal computer. The company's growth was vertiginous: by 1980, Apple was generating hundreds of millions in revenue. On December 12, 1980, Apple went public at $22 per share. By the end of the first day of trading, Steve Jobs's net worth exceeded $200 million. He was twenty-five years old.
And he was already, in the estimation of several colleagues, nearly impossible to work with.
Xerox PARC and the Window into the Future
In December 1979, Jobs negotiated a visit to the Xerox Corporation's Palo Alto Research Center — PARC — one of the most storied research labs in the history of computing, a place where brilliant engineers had invented, among other things, the graphical user interface, the mouse, Ethernet, and the laser printer. Xerox, a company that made its fortune on copiers, had no idea what to do with any of it. Jobs made an arrangement: Xerox would be allowed to invest in Apple's upcoming IPO (they bought 100,000 shares at $10 each) and in exchange, Apple engineers would get a tour of PARC.
What Jobs saw that day changed everything. Instead of the green-on-black command lines that defined computing in 1979, PARC's engineers had built an interface with windows, icons, folders, and a mouse to navigate them. "It was like a veil being lifted from my eyes," Jobs told Walter Isaacson. "I could see what the future of computing was destined to be."
He didn't invent the graphical user interface. He didn't invent the mouse. What he did — and this is the core of the Steve Jobs story, the thread that runs through every product he ever shipped — was see what others had built, understand its implications more deeply than its creators did, and then execute a version of it so refined, so integrated, so empathetic to the human being on the other end of the experience that it became, effectively, a different thing entirely. The Xerox mouse cost $1,000 to build and had three buttons. Jobs had Apple's engineers design one that cost $20 and had a single button. "We found that people would push the wrong button or be scared that they were going to push the wrong button, so they always looked at the mouse instead of the screen," he told Terry Gross in 1996. "So we got it down to one button so that you could never push the wrong button." This is not engineering. This is psychology. This is watching a human being's eyes and understanding where fear lives.
The Xerox mouse cost about $1,000 a piece to build. We had to engineer one that cost 20 bucks to build. But the basic concept of the mouse came originally from a company called SRI, through Xerox and then to Apple.
— Steve Jobs, 1996 interview with Terry Gross
The technologies he saw at PARC were channeled into two Apple projects: Lisa, a business computer, and the Macintosh, a lower-cost machine. Jobs was initially involved with Lisa but was removed from the project by Mike Scott — the company's CEO, who found Jobs's management style disruptive. Jobs then took over the Macintosh team, which had been started by Jef Raskin, an Apple engineer who envisioned a low-cost, easy-to-use computer. Raskin was forced out in early 1981 as Jobs seized control of the project that would define his career.
Insanely Great
The Macintosh team was sequestered in a separate building — Bandley 3 — with a pirate flag flying from the roof. Jobs cultivated an us-against-the-world mentality, positioning the Mac team as artists and rebels fighting the corporate bureaucracy of the rest of Apple. He called his engineers artists. He also screamed at them, belittled their work in front of colleagues, and demanded the redesign of components no user would ever see. The intensity was indistinguishable from the cruelty.
Andy Hertzfeld, the lead designer of the original Mac operating system, described Jobs's effect on his team as a "messianic zeal." Bud Tribble, another early Mac engineer, gave it a different name: the "reality distortion field" — an ability to convince people that the impossible was not merely possible but inevitable. "In his presence, reality is malleable," Tribble said. "He can convince anyone of practically anything." Debi Coleman, who served as the Mac's controller, put it more bluntly: "You did the impossible, because you didn't realize it was impossible."
On January 24, 1984, in a brilliantly choreographed demonstration at the annual shareholders' meeting in Cupertino, Jobs introduced the Macintosh to the world. He pulled it from a bag, inserted a floppy disk, and let the machine speak for itself — literally, using a text-to-speech synthesizer. "Hello," the Macintosh said. "I'm Macintosh. It sure is great to get out of that bag." The audience exploded. The event would later be cited as the archetype of what the marketing world calls "event marketing," though that phrase drains it of its genuine strangeness: a twenty-eight-year-old in a bow tie conjuring a machine that spoke.
The Macintosh was, in many ways, brilliant. It was also underpowered, expensive, and had almost no software. Sales were disappointing. Jobs, who had convinced PepsiCo president John Sculley to become Apple's CEO in 1983 with the legendary challenge — "Do you want to sell sugar water for the rest of your life?" — now found himself outmaneuvered by the very executive he'd recruited. Sculley, who had risen through the consumer-goods world on the strength of the Pepsi Challenge and careful market segmentation, was a different species of corporate animal entirely. He was polished, cautious, data-driven. Jobs was none of these things.
The clash was inevitable. By 1985, with Mac sales stalling and the board losing patience, Sculley convinced Apple's directors to strip Jobs of his operational responsibilities. On September 17, 1985, Steve Jobs resigned from Apple Computer.
He was thirty years old.
The Wilderness and Its Gifts
What followed was, by the standards of Silicon Valley mythology, a period of exile — though the exile was conducted from a mansion in Woodside, California, and involved spending $12 million of his own money to start a new company. Jobs founded NeXT Inc. in September 1985, announced to the world that he would build the most advanced personal computer ever made, and attracted $20 million from Texas billionaire Ross Perot and later an investment from Canon, the Japanese electronics giant.
The NeXT computer, released in 1988, was a work of art — a perfect black magnesium cube, one foot on each side, running an operating system called NEXTSTEP that was years ahead of anything else on the market. It was also $6,500 — far too expensive for the education market Jobs had targeted. It sold poorly. Very poorly. The hardware division was eventually shut down, and NeXT pivoted entirely to its software, which would prove, in a twist of fate that Jobs could not have predicted, to be the foundation of everything Apple would build in the twenty-first century.
But before that redemption, there was Pixar.
In 1986, Jobs acquired a controlling interest in the computer graphics division of Lucasfilm — George Lucas's production company — for $5 million, plus a $5 million capital investment. The division had been founded to develop computer graphics technology for filmmaking but had never produced a commercially successful product. It was, essentially, a money pit staffed by extraordinarily talented people — Ed Catmull, a computer scientist who had pioneered techniques for rendering 3D images, and John Lasseter, a former Disney animator who had been fired for pushing too aggressively for computer animation. Catmull was quiet, methodical, and deeply interested in the organizational conditions that produce creative work; Lasseter was exuberant, emotionally open, and possessed of a storytelling instinct that was immediately apparent to anyone who watched his early short films.
Jobs funded Pixar for nearly a decade, pouring in an estimated $50 million of his own money during years when the company's hardware and software products generated almost nothing. He came close to selling the company multiple times. He did not. On November 22, 1995, Toy Story — the first full-length, entirely computer-animated feature film — was released by Disney. It earned $373 million worldwide. A week later, Pixar went public. Jobs's stake was worth approximately $1.2 billion.
It was at Pixar that he learned to let other creative people flourish and take the lead.
— Walter Isaacson, Steve Jobs biography
The conventional narrative holds that the wilderness years were purely instructive — that Jobs was "humbled" by his failures at NeXT and "matured" at Pixar. The truth is messier. He was humbled and matured, yes, but he was also stubborn, imperious, and intermittently brilliant in exactly the ways he had always been. What changed was subtler: he learned, painfully and slowly, that he could not do everything himself, that creative excellence sometimes required him to create the conditions for other people's genius rather than imposing his own. At Pixar, he largely stayed out of the storytelling. At NeXT, he built an operating system that was technically magnificent but commercially irrelevant — until it wasn't.
Ninety Days from Bankruptcy
By 1996, Apple Computer was dying. The company had cycled through two CEOs after Sculley — Michael Spindler and then Gil Amelio, a semiconductor executive who understood manufacturing but not consumer desire — and was on its way to losing nearly $2 billion across two fiscal years. Its market share was collapsing. Its product line was a sprawl of confusingly named models — the Performa, the Quadra, the Power Macintosh, the Centris — with no coherent strategy behind any of them. Microsoft's Windows 95 had captured the world. Apple's board was contemplating a sale. The company was, by most informed estimates, approximately ninety days from insolvency.
What Apple needed, above all, was a modern operating system. Amelio decided to acquire one. He considered Be Inc., founded by former Apple executive Jean-Louis Gassée, but the negotiations stalled over price. Then, in what can only be described as a plot twist engineered by the universe, Apple agreed to acquire NeXT in December 1996 for $429 million. The deal brought NeXT's operating system — the software that would become macOS — into Apple. It also brought Steve Jobs.
He returned initially as an "advisor." Within months, the board had ousted Amelio and Jobs had assumed the role of interim CEO — a title he held, with the word "interim" never quite dropping away, until he formally accepted the permanent position in January 2000. His salary was $1 per year. He did not need the money. He needed the company.
What happened in the first weeks of his return has been recounted so many times that it has acquired the quality of scripture, but the details still astonish. Jobs called a product review meeting. Apple had dozens of products. Jobs drew a simple two-by-two grid on a whiteboard: consumer and professional across the top; desktop and portable down the side. Four quadrants, four products. Everything else was to be killed. "Deciding what not to do is as important as deciding what to do," he said. "That's true for companies, and it's true for products."
He cut 70% of the product line. He laid off thousands of employees. He killed the Newton, Apple's personal digital assistant. He shut down the clone licensing program that had allowed other manufacturers to run Apple's operating system, a short-term revenue strategy that was cannibalizing Apple's hardware sales. He partnered with Microsoft — the enemy — securing a $150 million investment from Bill Gates and a commitment to continue developing Microsoft Office for the Mac. When Gates's face appeared on a massive screen at the 1997 Macworld Expo to announce the deal, the audience booed. Jobs told them to stop.
In 1998, he introduced the iMac — a translucent, Bondi Blue, gumdrop-shaped all-in-one computer designed by Jonathan Ive, a thirty-one-year-old British industrial designer who had been languishing at Apple under the previous regime. The iMac was friendly, surprising, beautiful, and sold 800,000 units in its first five months. It had no floppy drive — an omission that the tech press treated as heresy and that Jobs treated as inevitability. He was right. He was usually right about what to remove.
The Integrated Man
The period from 1998 to 2011 — the final thirteen years of Jobs's life — constitutes one of the most extraordinary runs in the history of business, and perhaps in the history of industrial design. The products arrived with the regularity of seasons: the iPod in October 2001; the iTunes Store in April 2003; the iPhone on June 29, 2007; the iPad on April 3, 2010. Each one transformed an industry. Several created industries that hadn't existed before.
The pattern was always the same: Jobs identified an existing technology or product category that was needlessly complex, poorly designed, or fundamentally broken. He then assembled a team — almost always including Jony Ive on design and a rotating cast of world-class engineers — to build something that felt, to the user, like it had always been obvious. The iPod was not the first MP3 player. The iPhone was not the first smartphone. The iPad was not the first tablet. But in each case, Jobs's version was the one that made the category real for the mass market, because it was the one that made the technology disappear behind the experience.
The key to understanding this is to understand Jobs's obsession with what he called "end-to-end" control. Apple designed the hardware, wrote the software, built the applications, controlled the retail experience, and — after the opening of the first Apple Store in Tysons Corner, Virginia, on May 19, 2001 — even controlled the physical spaces in which customers encountered the products. No other technology company operated this way. Microsoft licensed its software to dozens of hardware manufacturers. Google gave away its mobile operating system. Dell and HP competed on price. Jobs competed on integration.
"People who are serious about software should make their own hardware," he said, quoting Alan Kay. The corollary, which he never stated but always practiced, was that people who are serious about the customer experience should control every single element of it, from the silicon chip to the unboxing.
You can't connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future.
— Steve Jobs
This was more than a business strategy. It was a philosophical commitment — an extension of the conviction, traceable all the way back to that calligraphy class at Reed, that technology and the humanities were not separate domains but a single discipline. The fonts on the Macintosh. The click-wheel on the iPod. The multitouch glass on the iPhone. The aluminum unibody of the MacBook. Each decision was simultaneously an engineering choice and an aesthetic one, and Jobs refused to accept that these were different conversations. "It's in Apple's DNA that technology alone is not enough," he said when introducing the iPad 2 in 2011. "It's technology married with liberal arts, married with the humanities, that yields us the result that makes our heart sing."
The phrase sounds sentimental. The execution was ruthless.
The Missionary and the Market
Jobs's disdain for market research was legendary, and frequently misunderstood. He did not ignore customers. He watched them constantly — watched how they held devices, where their eyes moved on a screen, what made them hesitate, what delighted them. What he refused to do was ask them what they wanted. "It's not the consumers' job to know what they want," he said. When the original Macintosh team suggested conducting focus groups, Jobs shut the discussion down: "No, because customers don't know what they want until we've shown them." He invoked Henry Ford: "If I'd asked customers what they wanted, they would have told me, 'A faster horse!'"
This was not arrogance, or not only arrogance. It was a specific theory of innovation: that truly transformative products cannot be derived from surveys, because they occupy spaces that consumers cannot yet imagine. The telephone could not have been focus-grouped into existence. The automobile could not have been derived from asking horse owners what they wanted. The iPhone could not have been designed by a committee analyzing existing smartphone usage patterns. What was required was taste — a word Jobs used with the precision of a sommelier — combined with the technical knowledge to understand what was possible and the will to make it real.
Edwin Land, the founder of Polaroid, had operated on the same principle half a century earlier, and Jobs revered him. Both men insisted that their inventions would change the fundamental character of human interaction with images and information. Both spent lavishly on research and development, to the dismay of financial analysts. Both believed that demonstration was more persuasive than argument. "If Steve Jobs studied Edwin Land," David Senra has observed on the Founders podcast, "I think every other founder should as well."
The connection was not merely intellectual. Jobs met Land three times during Apple's early rise. Land had dropped out of Harvard, turned his apartment into a laboratory, and made his first scientific breakthrough — a polarizing filter — at nineteen. He built Polaroid into one of the great technology monopolies of its era by insisting that instant photography was not merely a product improvement but a fundamental reordering of the relationship between humans and the images they made of their lives. Jobs understood this instinctively. He operated the same way. The products were not incremental improvements. They were, or aspired to be, instruments of transformation.
The Cruelty and the Craft
Any honest accounting of Steve Jobs must reckon with the cruelty. It was not incidental. It was not a minor flaw in an otherwise admirable character. It was structural, persistent, and — this is the part that makes the story genuinely difficult — productive.
Jobs berated employees publicly. He told engineers their work was "shit" — the word was apparently a favorite — and he did so in front of their colleagues. He cried, raged, and manipulated. He parked in handicapped spaces. He denied the paternity of his daughter Lisa for years, subjecting her mother, Chrisann Brennan, to public humiliation and financial hardship, even as the child bore the name of the computer he was building. Robert Sutton, the Stanford organizational psychologist, cited Jobs as a leading example of what he clinically termed workplace bullying, noting that "the degree to which people in Silicon Valley are afraid of Jobs is unbelievable."
And yet. The people who survived the gauntlet — who withstood the berating, the contempt, the impossible demands — almost universally describe the experience as the most productive of their careers. Tim Cook, who succeeded Jobs as CEO, framed it this way: "What I learned about Steve was that people mistook some of his comments as ranting or negativism, but it was really just the way he showed passion. So that's how I processed it, and I never took issue personally." Cordell Ratzlaff, who worked closely with Jobs on OS X for eighteen months, said simply: "I learned a tremendous amount from him." Debi Coleman's formulation remains the most haunting: "You did the impossible, because you didn't realize it was impossible."
There is a temptation, in writing about Jobs, to resolve this contradiction — to declare him either a visionary whose intensity was the price of greatness or a tyrant whose cruelty was inexcusable regardless of the products it produced. The honest answer is that both are true simultaneously and that the simultaneity is the point. His personality and his products were interrelated, as Isaacson put it, "just as Apple's hardware and software tended to be, as if part of an integrated system." The man who demanded that the inside of the computer be beautiful — the part no one would see — was the same man who demanded that the people around him operate at a level of excellence they didn't believe they were capable of. The cost was borne by human beings. The result was borne by human beings, too.
His insistence on hiring only what he called "A players" was absolute. "I've learned over the years that when you have really good people, you don't have to baby them," he said. "By expecting them to do great things, you can get them to do great things." The corollary, unstated but enforced: those who were not A players were removed with a speed and coldness that verged on the surgical. Richard Branson observed that Jobs "had a meticulous eye for detail, and surrounded himself with like-minded people to follow his lead." But "follow his lead" is diplomatic. The reality was closer to: perform at the level I demand, or cease to exist in my field of vision.
The Thing That Could Not Be There
Laurene Powell Jobs, in her introduction to Make Something Wonderful, the posthumous collection of Jobs's speeches, emails, and interviews published by the Steve Jobs Archive, offered the most precise description of her husband's cognitive gift: "His mind was never a captive of reality. Quite the contrary: he imagined what reality lacked and set out to remedy it. His ideas were not arguments, but intuitions, born of a true inner freedom and an epic sense of possibility."
This is the quality that gets called the "reality distortion field," but that phrase — coined half-affectionately by Bud Tribble — trivializes it. What Jobs possessed was not the ability to distort reality but the ability to perceive a reality that did not yet exist and then to will it into being. "It is hard enough to see what is already there, to gain a clear view," Laurene wrote. "Steve's gift was greater still: he saw clearly what was not there, what could be there, what had to be there."
The 2007 iPhone launch is the purest example. When Jobs walked onto the stage at the Moscone Center in San Francisco on January 9, 2007, and announced that Apple was introducing "three revolutionary products" — a widescreen iPod with touch controls, a revolutionary mobile phone, and a breakthrough internet communicator — and then revealed that all three were a single device, he was not presenting a gadget. He was revealing a future that would, within five years, reorganize the daily life of billions of people. The App Store, which launched on July 10, 2008, created an entirely new economy — the modern mobile app economy — that would generate hundreds of billions of dollars in annual revenue and transform industries from transportation (Uber, founded 2009) to food delivery to social media to dating to banking.
None of this was in the market research. All of it was in the intuition.
Five Hundred and Fifty-Six Weeks
In October 2003, during a routine CT scan for kidney stones, doctors discovered a tumor on Jobs's pancreas. It was an islet cell neuroendocrine tumor — a rare form of pancreatic cancer that, unlike the more common adenocarcinoma, is often treatable if caught early. Jobs's doctors recommended immediate surgery. Jobs refused. For nine months, he pursued alternative treatments — dietary regimens, herbal remedies, acupuncture, a vegan diet — before finally agreeing to surgery in July 2004.
Whether this delay shortened his life is a matter of medical debate that will never be definitively resolved. What is not debatable is the irony: the man whose entire career was built on the conviction that he saw reality more clearly than others chose, in the matter of his own mortality, to substitute intuition for evidence. The same quality that allowed him to see the iPhone before it existed may have prevented him from seeing the cancer as it was.
In 2009, he received a liver transplant. He continued to work, though his absences grew more frequent and his frame, always slender, became gaunt. On August 24, 2011, he resigned as CEO of Apple. His letter to the board was characteristically brief: "I have always said if there ever came a day when I could no longer meet my duties and expectations as Apple's CEO, I would be the first to let you know. Unfortunately, that day has come."
He died on October 5, 2011, at his home in Palo Alto. He was fifty-six years old. His last words, according to his sister Mona Simpson, who delivered the eulogy, were spoken while looking past the shoulders of his family: "Oh wow. Oh wow. Oh wow."
The Back of the Fence
Apple's market capitalization on the day Jobs died was approximately $350 billion. Within a decade, it would exceed $2 trillion. The company he had built — not once but twice, the second time from near-extinction — had become the most valuable enterprise in the history of capitalism. It had also become, under the stewardship of Tim Cook, a company far better at operations, supply chain management, and shareholder returns than it had been under Jobs. Whether it remained, in some essential way, a company animated by Jobs's vision or merely executing the last several decades of his playbook was a question that grew louder with each passing year, each incremental product update, each record-breaking quarterly earnings report.
But the question misses the deeper legacy, which was never really about Apple. It was about the argument Jobs spent his entire adult life making — through products rather than words — that technology is not neutral, that the choices embedded in a machine's design are moral choices, that the difference between a tool that diminishes the human and a tool that elevates the human is not processing speed or storage capacity but care. Attention. Love, even, if you're willing to use the word.
"There's lots of ways to be, as a person," Jobs said in 2007. "And some people express their deep appreciation in different ways. But one of the ways that I believe people express their appreciation to the rest of humanity is to make something wonderful and put it out there. And you never meet the people. You never shake their hands. You never hear their story or tell yours. But somehow, in the act of making something with a great deal of care and love, something's transmitted there."
Paul Jobs, the machinist who never graduated from high school, building the back of a fence in a garage in Mountain View — the part nobody would ever see — with the same care he brought to the front. His son, twenty years later, demanding that the circuit board inside the Macintosh be beautiful.
The fence. The circuit board. The invisible thing, made right.