THE PLAGUE OF 1665 raged on through the fall. In December, a bitter cold settled across the south of England. Samuel Pepys wrote that the hard frost "gives us hope for a perfect cure of the plague." But the disease persisted—up to thirteen hundred Londoners a week were still dying—and prudent folk shunned crowds if they could.

Isaac Newton was cautious to a fault. He celebrated his twenty-third birthday that Christmas Day at home, safely distant from the infectious towns. He stayed there into the new year, working, he said, with an intensity he never again equaled: "In those days," he remembered fifty years on, "I was in the prime of my age for invention & minded Mathematicks & Philosophy more then at any time since."

Mathematics first, continuing what he had started before his enforced retreat from Cambridge. The critical ideas emerged from the strange concept of the infinite, in both its infinitely large and infinitesimally small forms. Newton would later name the central discovery of that first plague year the "method of fluxions." In its developed form, we now call it the calculus, and it remains the essential tool used to analyze change over time.

He did not complete this work in total isolation. In the midst of his thinking about infinitesimals, the epidemic seemed to ease in the east of England. By March, Cambridge town had been free of plague deaths for six consecutive weeks. The university reopened, and Newton returned to Trinity College. In June, though, the disease reappeared, and on the news of more deaths, Newton again fled home to Woolsthorpe. Back on the farm, his attention shifted from mathematics to the question of gravity.

The word already had multiple meanings. It could imply a ponderousness of spirit or matter—the affairs of nations had gravity, and to be said to possess gravitas was a badge of honor for the leaders of nations. It had a physical meaning too, but what it was—whether a property of heavy objects or some disembodied agent that could act on objects—no one knew. In the *Quæstiones,* Newton had titled one essay "Of Gravity and Levity," and he wrestled there with concepts that he found to be vague and indistinct. He wrote of "the matter causing gravity" and suggested that it must pass both into and out of "the bowels of the earth." He considered the question of a falling body and wrote of "the force which it receives every moment from its gravity"—that is, force somehow inherent in the object plummeting toward the ground. He wondered whether "the rays of gravity may be stopped by reflecting or refracting them." For the time being, all that Newton knew about the connection between matter and motion was that one existed.

Now, in his enforced seclusion, Newton tried again. According to legend, the key idea came to him in one blinding flash of insight. Sometime during the summer of 1666, he found himself in the garden at Woolsthorpe, sitting "in a contemplative mood," as he remembered—or perhaps invented, recalling the moment decades later, in the grip of nostalgia and old age. In his mind's eye the apple tree of his childhood was heavy with fruit. An apple fell. It caught his attention. Why should that apple always descend perpendicularly to the ground, he asked himself. Why should it go not sideways or upward but constantly to the earth's center?

Why not indeed. The myth that has endured from that time to this holds that this was all it took: on the spot, Newton made the leap of reason that would lead to the ultimate prize, his theory of gravity. Matter attracts matter, in proportion to the mass contained in each body; the attraction is to the center of a given mass; and the power "like that we here call gravity ... extends its self thro' the universe."

Thus the story of what one author has called the most significant apple since Eve's. It has the virtue of possessing some residue of fact. The tree itself existed. After his death, the original at Woolsthorpe was still known in the neighborhood as Sir Isaac's tree, and every effort was made to preserve it, propping up its sagging limbs until it finally collapsed in a windstorm in 1819. A sliver of the tree ended up at the Royal Astronomical Society, and branches had already been grafted onto younger hosts, which in time bore fruit of their own. In 1943, at a dinner party at the Royal Society Club, a member pulled from his pocket two large apples of a variety called Flower of Kent, a cooking apple popular in the 1600s. These were, the owner explained, the fruit of one of the grafts of the original at Woolsthorpe. Newton's apple itself is no fairy tale; it budded, it ripened; almost three centuries later it could still be tasted in all the knowledge that flowed from its rumored fall.

But whatever epiphany Newton may have had in that plague summer, it did not include a finished theory of gravity. At most, the descent of that apple stimulated the first step in a much longer, more difficult, and ultimately much more impressive odyssey of mental struggle, one that took Newton from concepts not yet formed all the way to a finished, dynamic cosmology, a theory that reaches across the entire universe.

That first step, of necessity, turned on the existing state of knowledge, both Newton's and that of European natural philosophers. Earlier in the plague season, Newton had studied how an object moving in a circular path pushes outward, trying to recede from the center of that circle—a phenomenon familiar to any child twirling a stone in a sling. After a false start, he worked out the formula that measures that centrifugal force, as Newton's older contemporary, Christiaan Huygens, would name it. This was a case of independent invention. Huygens anticipated Newton but did not publish his result until 1673. That is: Newton, just twenty-two, was working on the bleeding edge of contemporary knowledge. Now to push further.
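The formula Newton and Huygens each reached can be stated in a few lines. This is a minimal modern sketch: a body moving at speed v around a circle of radius r accelerates toward the center at v²/r, and the sling must supply that acceleration. The stone-and-sling numbers below are invented purely for illustration.

```python
# Centripetal (to Newton, "centrifugal") acceleration of circular motion:
# a = v^2 / r. Illustrative values only, in modern SI units.
v = 10.0   # speed of the whirled stone, m/s
r = 0.5    # length of the sling, meters

a = v**2 / r        # acceleration the sling cord must supply, m/s^2
print(a)            # 200.0 -- roughly twenty times the pull of gravity
```

The same two-line relation, applied first to a spinning earth and then to the orbiting moon, carries the whole argument that follows.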

He did so by testing his new mathematical treatment of circular motion on the revolutionary claim that the earth did not stand still at the center of a revolving cosmos. One of the most potent objections to Copernicus's sun-centered system argued that if the earth really moved around the sun, turning on its axis every day as it went, that rotation would generate so much centrifugal force that humankind and everything else on the surface of such an absurdly spinning planet would fly off into the void. With his new insight, however, Newton realized that his formula allowed him to determine just how strong this force would be at the surface of the turning earth.

To begin, he used a rough estimate for the earth's size—a number refined over the previous two centuries of European exploration by sea. With that, he could figure the outward acceleration experienced at the surface of a revolving earth. Next, he set out to calculate the downward pull at the earth's surface of what he called gravity, in something like the modern sense of the term. Galileo had already observed the acceleration of falling bodies, but Newton trusted no measurement so well as one he made himself, so he performed his own investigation of falling objects by studying the motion of a pendulum. With these two essential numbers, he found that the effect of gravity holding each of us down is approximately three hundred times stronger than the centrifugal push urging us to take flight.
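Newton's comparison is easy to reproduce with modern figures (his own inputs were cruder). A sketch, using today's values for the earth's radius, rotation period, and surface gravity rather than anything Newton had to hand:

```python
# Back-of-the-envelope check of the gravity-versus-rotation comparison.
# Modern SI values, not Newton's seventeenth-century figures.
import math

R = 6.371e6   # mean radius of the earth, meters
T = 86164.0   # sidereal day (one full rotation), seconds
g = 9.81      # acceleration of gravity at the surface, m/s^2

omega = 2 * math.pi / T          # angular speed of the turning earth
a_centrifugal = omega**2 * R     # outward acceleration at the equator

print(round(g / a_centrifugal))  # roughly 290: gravity wins by ~300x
```

The ratio answers the old objection directly: the outward push of the earth's spin is hundreds of times too feeble to fling anything off the planet.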

It was a bravura demonstration, an analysis that would have placed Newton in the vanguard of European natural philosophy, had he told anyone about it. Even better, he found he could apply this reasoning to a larger problem, the behavior of the solar system itself. What was required, for example, to keep the moon securely on its regular path around the earth? Newton knew one fact: any such force would strain against the moon's centrifugal tendency to recede, to fly off, abandoning its terrestrial master. At the appropriate distance, he realized, those impulses must balance, leaving the moon to fall forever as it followed its (nearly) circular path around the center of the earth, the source of that still mysterious impulse that would come to be called gravity.

Mysterious, but calculable. To perform that calculation, he needed to take one last, great step and create a mathematical expression to describe how the intensity of whatever it was connecting the earth and the moon varied with the distance between the two bodies. He found inspiration in Kepler's third law of planetary motion, which relates the time it takes for a planet to complete its orbit with its distance from the sun. By analyzing that law, Newton concluded (as he later put it) that "the forces which keep the planets in their orbs must [be] reciprocally as the squares of their distances from the centers around which they revolve." That is, the force of gravity falls off in proportion to the square of the distance between any two objects.
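The inference from Kepler's third law can be checked numerically. If T² is proportional to r³, then the centripetal acceleration of a circular orbit, a = (2πr/T)²/r = 4π²r/T², must fall off as 1/r². A sketch in modern notation, with an arbitrary Kepler constant:

```python
# Numerical check: Kepler's third law (T^2 proportional to r^3) forces the
# centripetal acceleration of a circular orbit to vary as 1/r^2.
import math

k = 1.0  # arbitrary constant of proportionality: T^2 = k * r^3

def centripetal_acceleration(r):
    T = math.sqrt(k * r**3)    # orbital period from Kepler's third law
    v = 2 * math.pi * r / T    # orbital speed on a circular path
    return v**2 / r            # centripetal acceleration

# Doubling the orbital radius should cut the acceleration to one quarter.
ratio = centripetal_acceleration(1.0) / centripetal_acceleration(2.0)
print(round(ratio, 6))  # 4.0
```

Algebraically the same result drops out at once: a = 4π²r/T² = 4π²/(k r²).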

With that, it was just a matter of plugging in the numbers to calculate the moon's orbit. Here he ran into trouble. From his pendulum experiments, he had a fairly precise measurement of one crucial term, the strength of gravity at the earth's surface. But he still needed to know the distance between the moon and the earth, a calculation that turned on knowledge of the earth's size. This was a number Newton could not determine for himself, so he used the common mariner's guess that one degree of the earth's circumference was equal to "sixty measured Miles only." That was wrong, well off the accurate figure of slightly more than sixty-nine miles. The error propagated throughout his calculation, and nothing Newton could do would make the moon's path work out. He had some guesses as to what might be happening, but these were loose thoughts, and as yet he knew no way to reduce them to the discipline of mathematics.
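The size of the damage the mariner's guess did is straightforward to estimate. The moon's distance was known well in earth radii, so an undersized earth radius undersizes the moon's distance, and with it the computed lunar acceleration, by the same factor. A sketch using the two degree lengths quoted above:

```python
# Effect of the bad "sixty miles to a degree" figure on the moon test.
# 60 and 69.1 are the degree lengths quoted in the text; the rest is geometry.
import math

def earth_radius_miles(miles_per_degree):
    # Circumference = 360 degrees * miles per degree; radius = C / (2*pi).
    return 360 * miles_per_degree / (2 * math.pi)

good = earth_radius_miles(69.1)  # from the accurate degree length
bad = earth_radius_miles(60.0)   # from the mariner's guess Newton used

# The computed lunar acceleration scales with the assumed earth radius,
# so the prediction falls short of the pendulum value by this percentage.
print(round(100 * (1 - bad / good), 1))  # about 13 percent short
```

A shortfall on that order was enough to leave the earth-moon calculation stubbornly out of agreement, whatever Newton tried.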

The setback was enough to provoke Newton to move on. New ideas were crowding in. Optics came next, a series of inquiries into the nature of light that would bring him a first, ambivalent brush with fame in the early 1670s. Thus engaged, Newton let the matter of the moon rest.

But if his miracle years, as they have come to be known, did not produce the finished Newtonian system, still by the end of his enforced seclusion Newton understood that any new physical system could succeed only by "subjecting motion to number." His attempt to analyze the gravitational interaction of the earth and the moon provided the model: any claim of a relationship, any proposed connection between phenomena, had to be tested against the rigor of a mathematical description.

Many of the central ideas that would form the essential content of his physics were there too, though an enormous amount of labor remained to get from those first drafts to the finished construction of the system. Newton would have to redefine what he and his contemporaries thought they knew about the most basic concepts of matter and motion just to arrive at a set of definitions that he could turn to account. For example, he was still groping for a way to express the crucial conception of force that would allow him to bring the full power of mathematics to bear. By 1666, he had got this far: "Tis known by y^{e} light of nature ... y^{t} equall forces shall effect an equall change in equall bodys ... for in loosing or ... getting y^{e} same quantity of motion a body suffers the y^{e} same quantity of mutation in its state."

The core of the idea is there: that a change in the motion of a body is proportional to the amount of force impressed on it. But to turn that conception into the detailed, rich form it would take as Newton's second law of motion would require long, long hours of deep thought. The same would prove to be true for all his efforts over the next twenty years as they evolved into the finished edifice of his great work, *Philosophiæ naturalis principia mathematica*—The Mathematical Principles of Natural Philosophy—better known as the *Principia.* For all his raw intelligence, Newton's ultimate achievement turned on his genius for perseverance. His one close college friend, John Wickens, marveled at his ability to forget all else in the rapt observation of the comet of 1664. Two decades later, Humphrey Newton, Isaac Newton's assistant and copyist (and no relation), saw the same. "When he has sometimes taken a turn or two [outdoors] has made a sudden stand, turn'd himself about and run up y^{e} stairs, like another Archimedes, with an Eureka, fall to write on his Desk standing, without giving himself the Leasure to draw a Chair to sit down in." If something mattered to him, the man pursued it relentlessly.

Equally crucial to his ultimate success, Newton was never a purely abstract thinker. He gained his central insight into the concept of force from evidence "known by y^{e} light of nature." He tested his ideas about gravity and the motion of the moon with data drawn from his own painstaking experiments and the imperfect observations of others. When it came time to analyze the physics of the tides, the landlocked Newton sought out data from travelers the world over; barely straying from his desk in the room next to Trinity College's Great Gate, he gathered evidence from Plymouth and Chepstow, from the Strait of Magellan, from the South China Sea. He stabbed his own eye, built his own furnaces, constructed his own optical instruments (most famously the first reflecting telescope); he weighed, measured, tested, smelled, worked—hard—with his own hands, to discover the answer to whatever had sparked his curiosity.

Newton labored through the summer. That September, the Great Fire of London came. It lasted five days, finally exhausting itself on September 7. Almost all of the city within the walls was destroyed, and some beyond, 436 acres in all. More than thirteen thousand houses burned, eighty-seven churches, and old St. Paul's Cathedral. The sixty tons of lead in the cathedral roof melted; a river of molten metal flowed into the Thames. Just six people are known to have died, though it seems almost certain that the true number was much greater.

But once the fire destroyed the dense and deadly slums that cosseted infection, the plague finally burned itself out. That winter, reports of cases dropped, then vanished, until by spring it became clear that the epidemic was truly done.

In April 1667, Newton returned to his rooms at Trinity College. He had left two years earlier with the ink barely dry on his bachelor of arts degree. In the interval, he had become the greatest mathematician in the world, and the equal of any natural philosopher then living. No one knew. He had published nothing, communicated his results to no one. So the situation would remain, in essence, for two decades.