In August 1684, the astronomer Edmond Halley—a man who would later lend his name to a comet, who had already sailed the Atlantic to catalogue the southern stars, who possessed in equal measure the gifts of mathematical fluency and social grace that Newton so conspicuously lacked—rode from London to Cambridge to ask a question. It was a question that had been gnawing at the finest minds of the Royal Society: if the sun exerted an attractive force on the planets that diminished as the square of their distance, what shape would a planetary orbit take? Newton answered immediately. An ellipse. Halley was stunned—not merely that Newton knew the answer, but that he spoke of it as settled fact, as though the deepest problem in celestial mechanics were no more perplexing than the price of bread. When pressed for the proof, Newton said he had mislaid the paper. Whether that paper ever existed remains a matter of scholarly dispute. What is not disputed is what happened next: within three months, Newton sent Halley a nine-page manuscript titled De Motu Corporum in Gyrum—"On the Motion of Bodies in Orbit"—which contained ten propositions that would, over the next two and a half years of furious, almost hallucinatory labor, expand into the 500-page Philosophiae Naturalis Principia Mathematica, the single most consequential work of science ever written. The 44-year-old man who emerged from that labor was no longer a reclusive Cambridge professor nursing old grudges and private obsessions. He was, overnight and irrevocably, the central intellectual figure of the age. But the person who entered that labor—the person who already had the answer at the ready when Halley came calling—had been carrying the essentials of that answer in his head, unpublished and unshared, for nearly two decades.
That gap—between knowing and telling, between discovery and disclosure—is the central fact of Newton's life. It explains everything that followed and complicates everything we think we understand about genius, ambition, and the price of both.
The Fatherless Farmer's Son
By the Numbers
Newton's Life and Legacy
- 84 years lived (1642–1727)
- ~500 pages in the Principia
- 60+ years of intense intellectual activity
- ~1,000,000 words written on alchemy alone
- ~300 copies in the first Principia press run
- ~30 years at the Royal Mint (Warden from 1696, Master from 1699)
- £2,000 annual income as Master of the Mint
Isaac Newton was born on Christmas Day 1642—old calendar; January 4, 1643, by ours—in the hamlet of Woolsthorpe, Lincolnshire, a few days short of one year after Galileo died at Arcetri near Florence, as though the cosmos, having permitted the Italian to expire, required an immediate replacement. His father, also Isaac, was a prosperous but illiterate farmer who could not sign his own name and who died two months before his son drew breath. The baby was premature, tiny, and not expected to survive the day. He survived eighty-four years, which was long enough to invent the calculus, reformulate the nature of light, discover the law of universal gravitation, serve as Master of the Royal Mint, preside over the Royal Society, pursue counterfeiters to the gallows, write more than a million words on alchemy, and wage intellectual war against nearly every contemporary who dared to claim credit for anything.
When Isaac was three, his mother Hannah married Barnabas Smith, a 63-year-old minister of considerable wealth, and moved to her new husband's village, leaving the boy behind with his maternal grandparents. The abandonment was formative in the most clinical sense. When, at age nineteen, Newton compiled a catalogue of his sins, one entry read: "Threatning my father and mother Smith to burne them and the house over them." The acute sense of insecurity that this abandonment produced—the "pronounced psychotic tendencies," as Richard Westfall put it in Never at Rest—never left him. It rendered Newton obsessively anxious when his work was published and irrationally violent when he defended it. It made him suspicious of intimacy in all its forms. It is one of the great ironies of intellectual history that the man who unified heaven and earth through a single mathematical law could not achieve the remotest unity between his inner world and the human beings around him.
Hannah returned to Woolsthorpe in 1653, after Smith died, bringing three half-siblings Newton seems to have regarded with indifference. Two years later Isaac was sent to boarding school in Grantham, lodging with Mr. Clark the apothecary, where school reports described him as "idle" and "inattentive." His room was full of tools and homemade contraptions—model windmills, sundials, a mouse-operated flour mill. The school saw none of this inventiveness. When Hannah called him home in 1659 to manage the family farm, he was spectacularly incompetent: set to watch the cattle, he would curl up under a tree with a book. An uncle, William Ayscough—who had received an M.A. from Cambridge, and who recognized in his nephew something the cattle could not appreciate—persuaded Hannah to let the boy prepare for university. In 1661, somewhat older than his classmates, Newton entered Trinity College, Cambridge.
The Plague and the Pebbles
The Cambridge Newton entered was still a stronghold of Aristotelianism. The curriculum dealt in rhetoric, logic, and Aristotelian physics—a geocentric, qualitative account of nature that had nothing to say about the world Copernicus, Kepler, and Galileo had been revealing for over a century. But the new philosophy was, as Westfall noted, "in the air." Newton, entirely on his own, began reading Descartes's Opera philosophica—the Meditations, the Discourse on Method, the Dioptrics, the Principles of Philosophy—and then the modern mathematicians: Oughtred, Viète, Wallis, Descartes again via van Schooten's Latin translation of the Géométrie. By 1664, working without formal guidance, he had mastered the literature and begun moving into new territory. Under the heading "Quaestiones Quaedam Philosophicae"—"Certain Philosophical Questions"—he entered the slogan that would govern his entire career: Amicus Plato amicus Aristoteles magis amica veritas. Plato is my friend, Aristotle is my friend, but my best friend is truth.
When Newton received his bachelor's degree in April 1665, the most remarkable undergraduate career in the history of university education passed entirely unrecognized.
Then the plague closed Cambridge, and for most of the next two years Newton was forced back to Woolsthorpe. What followed has been called his annus mirabilis—his miracle year, though it stretched across eighteen months—and it remains, by any honest accounting, the most concentrated burst of original discovery in the history of human thought. In those months, working alone in a farmhouse in Lincolnshire, Newton:
- Discovered the binomial theorem and developed the foundations of the calculus—"the method of fluxions," as he privately named it—becoming, before his twenty-fifth birthday, de facto the leading mathematician in the world.
- Conducted his first experiments on light, passing a narrow beam of sunlight through a glass prism and projecting the resulting spectrum onto the wall of a darkened chamber. He concluded that white light was not simple and homogeneous, as everyone since Aristotle had assumed, but a heterogeneous mixture of rays with distinct and immutable colors. This was not a minor contribution to optics. It was the destruction and replacement of an entire framework.
- Developed the mathematical theory of uniform circular motion, derived the inverse-square relation between centripetal force and distance, and noted its connection to Kepler's third law—the rule relating the square of planetary periods to the cube of their mean distance from the sun.
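The derivation in the last item above is short enough to reproduce. For uniform circular motion, the centripetal force law combines with Kepler's third law to yield the inverse square (a modern restatement in proportionalities, not Newton's own geometric presentation):

```latex
% Centripetal force on a body in uniform circular motion at radius r, period T:
F \propto \frac{v^2}{r}, \qquad v = \frac{2\pi r}{T}
\quad\Longrightarrow\quad F \propto \frac{4\pi^2 r}{T^2}.
% Kepler's third law gives T^2 \propto r^3; substituting,
F \propto \frac{4\pi^2 r}{r^3} \propto \frac{1}{r^2}.
```

Several of Newton's contemporaries, Hooke and Halley among them, could follow this chain far enough to suspect an inverse-square law; what none of them could do was run the logic in reverse and prove that an inverse-square force entails elliptical orbits.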
The world heard nothing of any of this. Newton returned to Trinity as a Fellow in 1667, continued his research in optics, built the first reflecting telescope in 1669, and wrote a tract on the calculus titled De Analysi per Æquationes Numero Terminorum Infinitas. On the basis of this tract, Isaac Barrow—the Lucasian Professor of Mathematics, a man whose own considerable talents were of a different order than those of his student—recommended Newton as his replacement. Newton assumed the Lucasian Chair in October 1669, four and a half years after receiving his bachelor's degree.
He was twenty-six. He had already done enough to secure permanent fame. And almost no one knew.
I don't know what I may seem to the world, but, as to myself, I seem to have been only like a boy playing on the sea shore, and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay undiscovered before me.
— Isaac Newton, near the end of his life
The Terror of Being Seen
Newton's first public exposure to the intellectual world came in early 1672, when he submitted a paper to the Royal Society—"A Letter of Mr. Isaac Newton … containing his New Theory about Light and Colors"—describing his prism experiments and their implications. The paper was, by any measure, a triumph: it presented a series of carefully designed experiments, drew from them a set of thirteen propositions about the nature of light, and offered reasoning of extraordinary rigor. The Royal Society was impressed. The reflecting telescope had already brought Newton to their attention.
But the paper also contained a philosophical argument—that since rays of light possess colors as intrinsic qualities, those rays must be substances rather than properties of some medium, and therefore light cannot be waves. This was a step beyond the experimental evidence, and it caught the eye of Robert Hooke.
Hooke was a pivotal figure in English science—curator of experiments for the Royal Society, brilliant and contentious, the author of Micrographia, a man of extraordinary range and chronic insecurity about receiving credit for his ideas. He considered himself the master of optics. He wrote a condescending critique of the unknown parvenu, arguing that his own hypothesis—that light is a wave propagated through a homogeneous medium—could save the same phenomena just as well. Christiaan Huygens, on the Continent, raised similar objections.
Newton's response to honest, if occasionally careless, criticism was volcanic. The flaming rage that Hooke's critique provoked—the desire, as Westfall observed, to publicly humiliate him—"bespoke the abnormal." Newton was unable rationally to confront criticism. Less than a year after submitting the paper, he began cutting ties and withdrawing into virtual isolation. The exchange with Hooke initiated a pattern Newton would repeat for the rest of his life: a brilliant disclosure, followed by criticism, followed by fury, followed by retreat. When a group of English Jesuits in Liège challenged his experimental results, Newton responded with escalating rage until, in 1678, "a final shriek" was followed by silence. The death of his mother the following year completed his withdrawal. For six years he refused almost all intellectual commerce.
This was not the behavior of a man who disliked attention. It was the behavior of a man who craved it and could not tolerate the terms on which it was offered.
The Secret Furnace
The isolation of the late 1670s and early 1680s was not idle. Newton retreated not from work but from the world, and the work he pursued was stranger, more obsessive, and more revealing than anything he had yet attempted. He had purchased chemical apparatus and alchemical treatises as early as 1669, but now he plunged into alchemy with a focus that would persist for more than thirty years. He copied treatises by hand, collated them, annotated them, compiled concordances comparing the sayings of different authors, and kept experimental laboratory notebooks recording decades of alchemical research. We now know that at least a million words in Newton's hand survive addressing alchemical themes—more than he devoted to mathematics and physics combined.
The economist John Maynard Keynes, who purchased a trove of Newton's alchemical manuscripts at the famous Sotheby's auction of 1936, delivered a verdict that has haunted Newton's reputation ever since: "Newton was not the first of the age of reason. He was the last of the magicians, the last of the Babylonians and Sumerians, the last great mind that looked out on the visible and intellectual world with the same eyes as those who began to build our intellectual inheritance rather less than 10,000 years ago."
The judgment is memorable and wrong—or, rather, it is precisely the kind of anachronistic projection that Newton's own method would have rejected. In the seventeenth century, "alchemy" and "chemistry" were not yet distinct disciplines. The term chymistry encompassed distilling, pigment-making, salt-refining, drug manufacture, and the perennial attempt to transmute base metals into gold. As William Newman, the historian of science who has spent decades editing Newton's alchemical papers, points out: seventeenth-century natural philosophers had not yet grasped the distinction between nuclear reactions and chemical reactions. Alchemy presented itself as a discipline that could make the most fundamental changes to matter. "For a man of Newton's intellect and desire to get to the bottom of nature," Newman argues, "it really made perfect sense for him to be involved in alchemy."
What Newton found in the alchemical tradition was a conception of nature that his mechanical philosophy could not provide. The standard Cartesian view held that all natural change occurred through contact between particles of matter. But certain phenomena puzzled Newton—chemical affinities, the generation of heat in reactions, surface tension, capillary action, the cohesion of bodies. He spoke of a "secret principle" by which substances are "sociable" or "unsociable" with others. Around 1679, he abandoned the mechanical ether and began ascribing these puzzling phenomena to attractions and repulsions between particles of matter—forces that acted at a distance, without physical contact.
This was heresy to the mechanists. But it was also, as Newton conceived it, a bridge between the two great traditions of seventeenth-century thought: the mechanical tradition, which dealt in verbal imagery of particles and impacts, and the Pythagorean tradition, which insisted on the mathematical nature of reality. Newton's attractions were not the vague "sympathies" of the old occultists. They were quantitatively defined. They could be measured. And from that measurement, the entire edifice of the Principia would rise.
The Book That Changed Everything
The Making of the Principia
From nine pages to the foundation of modern science
- 1679: Hooke's letter poses the inverse-square orbit problem; Newton solves it privately but tells no one.
- 1684: Halley visits Cambridge in August; Newton sends the nine-page De Motu in November.
- 1685–86: Newton expands De Motu from 10 propositions to 192, working almost without interruption.
- 1686: Manuscript of Book I sent to London; Hooke claims plagiarism.
- 1687: Roughly 300 copies of the Principia come off the press, thrusting the 44-year-old Newton into international prominence.
The story of the Principia's composition is one of the most extraordinary episodes in the history of ideas—not because it is surprising that Newton could do such work, but because of the circumstances under which he did it. Save for a few weeks away from Cambridge, from late 1684 until early 1687, Newton concentrated on nothing else. He often forgot to eat. His assistant Humphrey Newton—no relation, despite the shared surname—later recalled that he saw the great man laugh only once, when someone asked him why anyone would want to study Euclid.
The initial plan called for two books; Newton subsequently shifted to three, and replaced the original version of the final book—a more accessible, less mathematical treatment later published posthumously as The System of the World—with one far more demanding. The result was a work of almost inhuman rigor. Its three laws of motion—a body at rest remains at rest unless acted upon by an impressed force; the change of motion is proportional to the force impressed; to every action there is an equal and opposite reaction—were not in themselves entirely original. What was original was the systematic framework they provided. Newton's second law, in the formulation F = ma, gave the concept of force a precise quantitative meaning. (The formulation F = ma appears nowhere in the Principia itself; it was Euler, decades later, who cast the law in that algebraic form. Newton's own presentation was geometric.) This framework allowed Newton, through a chain of propositions of increasing audacity, to demonstrate that the centripetal force holding the planets in their orbits must decrease with the square of their distance from the Sun; that the same force governs the Moon's orbit around the Earth; that the distance by which the Moon deviates from a straight-line path in one second, when compared with the distance a body falls from rest on the Earth's surface in one second, yields the precise ratio of 3,600—sixty squared—confirming that one and the same force, governed by one quantitative law, operates in both cases. He called this force gravitas. Weight. Heaviness. The most ordinary word in any language, now carrying the most extraordinary meaning.
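The moon test described above is easy to rerun with modern round values (a sketch using today's constants, not Newton's own figures): the Moon orbits at roughly 60 Earth radii, so the inverse-square law predicts its centripetal acceleration should be about 1/3,600 of surface gravity.

```python
import math

# Modern round values -- assumptions for this sketch, not Newton's data
g = 9.81                           # surface gravity, m/s^2
earth_radius = 6.371e6             # m
moon_distance = 60 * earth_radius  # the Moon orbits ~60 Earth radii out
sidereal_month = 27.32 * 86400     # the Moon's orbital period, in seconds

# Centripetal acceleration of the Moon: a = 4*pi^2*r / T^2
a_moon = 4 * math.pi**2 * moon_distance / sidereal_month**2

# Inverse-square prediction: surface gravity diluted by 60^2 = 3,600
a_predicted = g / 3600

print(f"Moon's centripetal acceleration: {a_moon:.5f} m/s^2")
print(f"Inverse-square prediction:       {a_predicted:.5f} m/s^2")
print(f"agreement ratio: {a_moon / a_predicted:.3f}")
```

With these inputs the two accelerations agree to within about one percent, which is the "precise ratio" Newton's comparison of the falling body and the deviating Moon was designed to exhibit.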
The law of universal gravitation states that every particle of matter in the universe attracts every other particle with a force proportional to the product of their masses and inversely proportional to the square of the distance between their centers. Newton confirmed it from the tides, from the orbits of comets, from the shape of the Earth itself. It was the first successful unification of terrestrial and celestial physics—the first demonstration that the same laws govern the fall of an apple and the orbit of a planet—and it would remain the foundation of physical science for more than two centuries, until Einstein showed it to be a limiting case of something deeper.
When the Royal Society received the manuscript of Book I in 1686, Hooke raised the cry of plagiarism, claiming he had given Newton the idea of an inverse-square law in their 1679 correspondence. The charge was largely baseless—Hooke had posed the problem but lacked the mathematics to solve it; his "knowledge of the inverse square relation," as Westfall demonstrated, "rested only on intuitive grounds"—but Newton's response revealed everything about his character. Hooke would have been satisfied with a generous acknowledgment. It would have been a graceful gesture to a sick man already well into his decline, and it would have cost Newton nothing. Instead, Newton went through his manuscript and eliminated nearly every reference to Hooke. Such was his fury that he refused to publish his Opticks or to accept the presidency of the Royal Society until Hooke was dead.
The one proceeds upon the Evidence arising from Experiments and Phenomena, and stops where such Evidence is wanting; the other is taken up with Hypotheses, and propounds them, not to be examined by Experiments, but to be believed without Examination.
— Newton, anonymous review of the Royal Society's report on the calculus priority dispute, 1715
Hypotheses Non Fingo
Newton's method—the thing that made the Principia not just a great work of physics but a revolution in how to do physics—is encapsulated in a Latin phrase he added to the second edition in 1713: Hypotheses non fingo. "I feign no hypotheses." The phrase infuriated his critics as much as it inspired his followers, and it has been misunderstood ever since.
Newton was not opposed to hypotheses as such. In the Scholium to Proposition 96 of Book I, he discusses hypotheses concerning light rays. In Query 21 of the Opticks, he proposes a speculative aether whose differential density might account for gravitational force. What he opposed was the Cartesian and Leibnizian practice of positing invisible mechanisms—vortices, subtle fluids, contact actions—and then treating them as established truths rather than provisional suggestions. For Newton, every element of theory had to be decided by specific phenomena. Hypotheses could be entertained, but they could not be "feigned"—that is, they could not be presented as demonstrated conclusions when they had not been tested against empirical evidence.
This distinction—between what the phenomena compel you to accept and what you merely suppose—was Newton's deepest philosophical contribution, more consequential even than any particular discovery. His fourth Rule of Reasoning, added in the third edition of the Principia, stated it with lapidary precision:
In experimental philosophy, propositions gathered from phenomena by induction should be taken to be either exactly or very nearly true notwithstanding any contrary hypotheses, until yet other phenomena make such propositions either more exact or liable to exceptions.
The rule sounds modest. Its implications were revolutionary. It meant that the gravitational force binding the solar system together did not need to be explained by a mechanism in order to be accepted as real. It meant that the question "Why does gravity act at a distance?" could be left open—indeed, had to be left open—until phenomena emerged to answer it. It meant that science could advance without metaphysical certainty, that an honest confession of ignorance was not a defect of a theory but a feature of its integrity.
Leibniz and his followers could not accept this. For them, any theory that permitted action at a distance was invoking an "occult quality"—a hidden power embedded in matter, the kind of vague Scholastic nonsense that the mechanical philosophy had been designed to banish. The dispute was not merely technical. It was a dispute about what counted as an explanation, about how much a theory was required to say. Newton believed it was enough to describe how gravity behaves; Leibniz insisted you had to explain why it behaves that way. This argument—between empirical description and causal explanation, between operational adequacy and metaphysical completeness—has never been resolved. It is, in many ways, the founding argument of modern science.
The God of the Gaps and the Gaps in God
Newton was not merely a physicist who happened to have theological views. He was a theologian who happened to be a physicist—or, more precisely, he was a natural philosopher in the original sense of the term, someone for whom the study of nature and the study of God were aspects of a single enterprise. He spent as many hours on biblical exegesis as on mathematics, and his theological conclusions were radical enough to have destroyed his career had they become public.
Through careful study of the doctrine of the Trinity, Newton concluded—probably in the early 1670s, around the time the Lucasian Professorship forced the question of ordination vows—that Trinitarianism was a corruption of early Christianity, introduced in the fourth century and formalized at the Council of Nicaea.
Jesus of Nazareth was, in Newton's view, not a divine figure on the same level as God the creator. This was Arianism, or something close to it, and it was heresy by any Anglican standard. Newton kept these views secret among a tiny circle of friends, trusting John Locke enough to send him a long letter—"Two Notable Corruptions of Scripture"—presenting his evidence that key Trinitarian passages in the Bible were later interpolations. When Locke moved to publish it, Newton withdrew in fear.
The theological papers that Newton left at his death—roughly half of the estimated ten million words in his surviving manuscripts—reveal a mind pursuing the same patterns of evidence-based reasoning in scripture that it applied to nature. He compiled minutely detailed chronologies of ancient kingdoms, attempting to date biblical events from astronomical phenomena. He wrote extensive commentaries on the prophecies of Daniel and the Apocalypse of St. John. He reconstructed Solomon's Temple from the biblical account. To the modern eye, these pursuits look eccentric—the "magician" side of Keynes's famous judgment. But they were entirely consistent with Newton's method. He was looking for lawful regularities in every domain of knowledge, treating the Book of Scripture with the same empirical seriousness he brought to the Book of Nature.
The connection between his theology and his physics was not incidental. In the General Scholium added to the second edition of the Principia in 1713, Newton wrote of God with a specificity that alarmed both the Cartesians and the Leibnizians:
He endures always and is present everywhere, and by existing always and everywhere he constitutes duration and space. Since each and every particle of space is always, and each and every indivisible moment of duration is everywhere, certainly the maker and lord of all things will not be never or nowhere.
God was not, for Newton, a watchmaker who built the mechanism and walked away. God was present—substantially present—throughout all of space throughout all of time. The regularity of nature was evidence of divine governance. The instability of the solar system—which Newton recognized, even as he demonstrated its mathematical structure—was evidence that divine intervention was periodically required to keep the planets in their orbits. This was not a weakness in the theory. It was, for Newton, part of the point.
London, Money, and the Terror of Counterfeit
Newton moved to London in 1696, accepting the position of Warden of the Royal Mint at the Tower of London, on the recommendation of his friend Charles Montague, Chancellor of the Exchequer. He was fifty-three. His creative work was essentially complete. He had long since wearied of the academic cloister, and the move was, in practical terms, the end of his scientific career and the beginning of something else entirely.
The Mint was in crisis. Roughly ten percent of coins in circulation were counterfeit. The value of silver on the Continent exceeded the face value of English silver coins, incentivizing merchants to melt genuine coins and sell the bullion abroad—a textbook case of Gresham's Law, in which bad money drives out good. Business was grinding to a halt. The government had ordered a Great Recoinage: all old silver money was to be collected, melted down, and reissued at correct weight and value.
The Wardenship was traditionally a sinecure. Newton treated it as a military command. He studied the Mint's history and operations, analyzed the skills and working practices of the men producing coins, and drastically improved productivity. Before his arrival, the Mint could produce 15,000 coins per week. Newton had them turning out 50,000. He organized 500 men at the Tower of London, working six days a week from 4 a.m. to midnight. He oversaw the establishment of branch mints in Bristol, Chester, Exeter, Norwich, and York. Between 1696 and 1699, the value of silver struck exceeded £5.1 million, compared to £3.3 million coined in the preceding thirty-five years.
He also became a detective. The Warden's principal duty was to investigate and prosecute counterfeiting—a capital crime. Newton threw himself into the work with the same obsessive focus he had brought to optics and alchemy. He conducted interviews with criminals and informers in Newgate gaol and the coffeehouses and alehouses of London's underworld. He built networks of surveillance. He extracted confessions through threats and bribes, occasionally paying for clothing so that newly compliant witnesses could appear credible before judges and juries. He sent a goodly number of counterfeiters to the gallows. His most famous quarry was William Chaloner, a skilled counterfeiter whose forgeries were of exceptionally high quality. Newton pursued him for years before finally securing a conviction and execution in 1699.
In that same year, the post of Master of the Mint fell vacant. Though technically less senior than the Warden, the Master's position was far more lucrative, and Newton took it. He held the post until his death in 1727—thirty years of public service, a second career as long and consequential in its own domain as the first had been in science. His income as Master—as much as £2,000 per annum—made him a wealthy man. His concern for accuracy in the coinage was legendary. When the Trial of the Pyx in 1710 judged his gold coins to be below standard, Newton proved that the fault lay not in his coins but in the new gold trial plate introduced in 1707, which contained too much gold. The earlier trial plate of 1688 was returned to use.
The War Against Leibniz
If Newton's pursuit of counterfeiters was professionally motivated, his war against Gottfried Wilhelm Leibniz was personal, relentless, and conducted with the full resources of institutional power.
Leibniz—born in Leipzig in 1646, a philosopher and mathematician of extraordinary range, the inventor of a notation for the calculus that proved far more practical than Newton's, and a man whose irenic temperament was the opposite of Newton's—had arrived at the calculus independently in the mid-1670s. This is now universally accepted by historians. Newton had discovered the method of fluxions first, probably by the mid-1660s, but had never published it. Leibniz published his version in 1684, and it was Leibniz's notation—the dx and dy and ∫ that are now standard—that spread across the Continent and became the lingua franca of mathematical analysis.
Newton could not bear it. The priority dispute smoldered for years before erupting in 1710, when John Keill accused Leibniz in the Philosophical Transactions of having plagiarized the calculus. Leibniz, a Fellow of the Royal Society since 1673, demanded redress. What he received was anything but. Newton appointed an "impartial" committee to investigate the issue, secretly wrote the committee's report himself (the Society published it in 1712), and then anonymously reviewed that report in the Philosophical Transactions in 1715. He allowed younger men to publish attacks under their own names, attacks Newton had drafted himself. Even Leibniz's death in 1716 could not quench the fury. Newton continued to pursue the enemy beyond the grave.
The calculus dispute expanded from a personal quarrel into a continental schism. On one side stood the English, loyal to Newton and the Royal Society; on the other, the mathematicians who had been working with Leibniz since the 1690s, most notably Johann Bernoulli. The schism in turn transformed into a methodological divide between English and Continental approaches to mathematics and physics that persisted for decades—and that, paradoxically, did more damage to English mathematics than to anyone else, since the Leibnizian notation and methods proved far more fertile for the eighteenth-century developments that would extend Newton's own physics.
The irony is savage. Newton's refusal to publish the calculus in the 1660s and 1670s, his inability to tolerate criticism, his pathological need to control the narrative of his own accomplishments—these traits, born of that abandoned boy in Woolsthorpe, directly ensured that the mathematical tools bearing his name would be developed and refined by others, in a form he opposed, using methods he disdained.
The Solitary Patron
Newton's everyday life in London was transformed by the arrival of his niece Catherine Barton—the daughter of his half-sister Hannah, born of the marriage to Barnabas Smith that had cost Newton his mother—who moved in with him shortly after his arrival in the city. She was a teenager, extraordinarily vivacious, socially prominent among the powerful, and celebrated among the literati. Jonathan Swift and Voltaire both remarked on her wit. She kept house for Newton until her marriage to John Conduitt in 1717, and through her and her husband, Newton's papers came down to posterity.
It is through Catherine Barton that we can see, faintly, the outline of a private Newton who existed beneath the public monument. Newton, whose only close contacts with women had been his unfulfilled relationship with his mother and a rumored youthful attachment to one Catherine Storer in Grantham, found in his niece something he had never permitted himself before: sustained domestic affection. He never married. He seems to have had no romantic attachments of any kind, unless one counts his intense friendship in the early 1690s with Nicolas Fatio de Duillier, a Swiss mathematician of considerable talent and volatile temperament—a relationship that was, in the words of his biographer Richard Westfall, "the most profound experience of his adult life." When Fatio fell ill in 1693 and financial pressures threatened to call him home to Switzerland, Newton's distress "knew no limits." He offered to support Fatio in Cambridge. Nothing came of the proposal. The correspondence broke off without explanation, and four months later Newton suffered what he himself described, in retrospect, as a nervous breakdown—his second. He sent wild, accusatory letters to Samuel Pepys and John Locke. Pepys was informed that Newton would see him no more; Locke was charged with trying to entangle him with women.
He recovered. He always recovered, in the sense that he resumed functioning. But the cost of his emotional architecture—the isolation, the suspicion, the rage—was visible to everyone around him.
I never knew him to take any recreation or pastime either in riding out to take the air, walking, bowling, or any other exercise whatever, thinking all hours lost that was not spent in his studies, to which he kept so close that he seldom left his chamber.
— Humphrey Newton, Newton's assistant, on his employer's temperament
The Patriarch's Final Decades
Newton was elected President of the Royal Society in 1703—the year Hooke died—and he ruled it, as Westfall observed, "magisterially." In 1704 he finally published his Opticks, appended to which were two mathematical treatises—his first work on the calculus to appear in print, nearly four decades after its invention. In 1705 Queen Anne knighted him, the first occasion on which a scientist was so honored. A second edition of the Principia, painstakingly revised under the editorship of Roger Cotes, appeared in 1713; a third edition, which Newton began preparing at the age of eighty, was published in 1726.
He remained intellectually active into his eighties—the third edition of the Principia contains substantive additions and modifications—but the creative fire that had blazed so terrifyingly in the plague years was long spent. The man who had solved the deepest problems in celestial mechanics now occupied himself with bureaucratic administration, doctrinal feuds, and the relentless prosecution of the war against Leibniz. "Almost any paper on any subject from those years," Westfall noted, "is apt to be interrupted by a furious paragraph against the German philosopher, as he honed the instruments of his fury ever more keenly."
Yet Newton's influence expanded even as his personal energies contracted. "Newtonianism," in one form or another, had taken firm root in Britain within the first decade of the eighteenth century. On the Continent, the resistance was longer—Christiaan Huygens and Leibniz both rejected action at a distance as invoking an occult power—but as the promise of the gravitational theory became increasingly substantiated during the 1740s and 1750s, Newton became equally dominant there. The key advances were not Newton's. They were made by Euler, who reformulated Newton's three laws in the algebraic language of F = ma and extended them to rigid bodies and fluids; by Clairaut, who resolved a factor-of-two discrepancy in the lunar apsidal motion that Newton himself had glossed over; by d'Alembert, who solved the precession of the equinoxes; by Laplace, who developed a proper theory of the tides and identified a 900-year fluctuation in the motions of Jupiter and Saturn. What physics textbooks now call "Newtonian mechanics" was assembled primarily on the Continent between 1740 and 1800, and it owed as much to Euler as to Newton.
Newton died on March 20, 1727, at the age of eighty-four. He was buried in Westminster Abbey, with the pomp afforded to a head of state. Alexander Pope composed the epitaph that best captured the Enlightenment's view of him: Nature and Nature's laws lay hid in night; / God said, "Let Newton be" and all was light. The posthumous publications continued for decades—The Chronology of Ancient Kingdoms Amended in 1728, Observations upon the Prophecies of Daniel in 1733, A Treatise of the Method of Fluxions in 1737. But the works that had been published represented, as the Stanford Encyclopedia notes, "only a limited fraction of the total body of papers."
The Papers in the Box
Newton left no will. His papers—the manuscripts, the notebooks, the alchemical recipes, the heretical theological writings, the millions of words on subjects he had never permitted the world to see—passed to John Conduitt and Catherine Barton. The family retained them for more than a century, eventually passing them to the Earls of Portsmouth through the marriage of the Conduitts' daughter Catherine. In 1872, the Portsmouth family allowed the papers to be reviewed by scholars at Cambridge. A catalogue was issued in 1888. The university retained the papers of a "scientific character." The rest—the alchemy, the theology, the biblical chronology—were returned to the family, deemed not worth keeping.
In 1936, the Portsmouth heirs, facing ruinous death duties, sent the non-scientific papers to Sotheby's. They were auctioned over two days in July, scattered among dozens of buyers. Keynes purchased the alchemical manuscripts and delivered his famous verdict about the last of the magicians. The orientalist Abraham Yahuda acquired many of the theological manuscripts, which eventually went to the Jewish National and University Library in Jerusalem. Other lots went to dealers, collectors, and institutions around the world. Serious scholarly work on this material did not begin until the 1970s. Much remains to be done.
The irony is exquisite. Newton, who controlled the narrative of his own life with an obsessiveness bordering on mania—eliminating references to Hooke, ghostwriting attacks on Leibniz, burning certain papers before his death—ultimately lost control of the story. The "private Newton" that has emerged since the mid-twentieth century is richer, stranger, and more contradictory than the Enlightenment saint that the "public Newton" had been for two hundred years. He is a man whose investigation of light was entangled with alchemical theories of material transformation; whose gravitational physics was inseparable from his theology of divine omnipresence; whose mathematical rigor coexisted with decades of effort to decode the prophecies of Daniel.
What remains, in the end, is the gap—the same gap Halley encountered in 1684. Between what Newton knew and what he chose to reveal. Between the discovery and the disclosure. Between the pebble in his hand and the ocean he could see but never cross.
In the reading room of Cambridge University Library, in a metal box not much different from the one Newton packed when he left for London in 1696, there are manuscripts covered in his handwriting—the neat, cramped script of a man who revised obsessively, who crossed out and rewrote until the phrasing met his private standard of exactitude. The pages are brown with age. Some are singed, as though he had started to burn them and then thought better of it. They smell, faintly, of old leather and chemical residue. If you hold one to the light, you can see where the ink has eaten through the paper.