Tag: Newton

Here is my burnt offering upon the unholy altar of quit-lit.

This spring I sat on a career advice panel in Montreal at the Geometry of Subgroups conference, offering my thoughts from the perspective of someone who recently jumped from academia to industry. As fine as the panel was, I am frustrated at how far short I fell of what I felt should be said. So here is my attempt at better expressing myself. If you have more questions, feel free to pass them on to me, and maybe I’ll do another round. The more impertinent the questions, the better.

Q: If I quit mathematics, can I really consider myself to be a mathematician? Like, really?

There is a rich historical tradition of mathematicians leaving academia and going out into the world to do other things.

  • Isaac Newton in his later years tired of his cushy Cambridge position and wanted an exciting job in the capital. He became Warden of the Royal Mint and oversaw the Great Recoinage of 1696 (there is a book about the whole story).
  • John von Neumann had an impact on the post-war world that I cannot adequately describe in a few sentences. But his work consulting with the US defense department led one AMS review of a recent biography to spend time addressing the question of whether or not he was evil.
  • Jim Simons ran off and started one of the world's most successful hedge funds. He was successful in large part by employing quants — PhDs from scientific rather than financial backgrounds. He made so much money doing this that he created the Simons Foundation. If you've ever read a Quanta magazine article, he paid for that (it certainly isn't a revenue-generating enterprise).
  • Robert Zimmer ascended to administrative heights, becoming president of the University of Chicago. He was so celebrated that after his passing the NYT columnist Bret Stephens gave an invited Class Day speech complaining about cancel culture in tribute.
  • Alexander Grothendieck founded a commune, and later became a reclusive shepherd.
  • John Maynard Keynes got drawn into economics, trying to prevent World War 2, and founding the welfare state. (The real takeaway of all this is that you should go and read The Price of Peace by Zachary D. Carter).
  • Richard Garfield did his PhD in combinatorics, but found his dreams really came true when a dinky card game he designed became the international sensation known as Magic: The Gathering. It is only a matter of time before MTG cards become accepted as international reserve currency.

Being a mathematician at an academic institution is a very narrow avenue of human experience, and plenty of successful mathematicians have had a healthy appreciation of this.

Q: Why did you quit mathematics?

Principally to resolve a two-body problem; I was separated from my partner by the pandemic travel ban on entering the US from the UK. I saw my chances of getting a suitable job as so negligible that as soon as my visa was approved, I quit my job, quit the UK, and flew over the Atlantic to get married and pursue the American Dream. I had another year left on my postdoc, but considered waiting another year to be abject foolishness. Being separated from my partner by the pandemic did dramatic things to my tolerance of that situation. I certainly wasn't willing to protract it any further on the vague possibility that something might turn up for me on the job market. At least part of this intensity of feeling was pandemic-induced, but it was pretty consistent with my general thinking. I'm a lot more mellow about the way it all worked out now that I'm on the other side of it.

Q: Why couldn't you get a job?

There are a number of reasons. My job search became progressively more restricted as efforts to resolve my two-body problem restricted me geographically. I was applying into the US, a highly competitive market, with a real lack of suitable teaching experience. I also didn't have any fancy grants, and although I had a very fancy Oxford postdoc, I don't believe it moved the needle in the same way as an NSF grant would have. (A grant which is very important in the US, and which I was never even eligible to apply for.) My visibility in the US was pretty low due to giving almost no talks while doing my PhD in Canada, and then being out in Israel and the UK. I certainly invited myself to seminars when I was stateside, but I was never invited to give conference talks or present at AMS meetings, which would have done a lot more. I was interviewed once, for a postdoc position when I finished my PhD, and that is the most interest the US job market has ever shown. My strongest research and fanciest papers were produced towards the end of my time in math — during the pandemic, in fact. My fortunes would have been significantly different if I had finished my PhD with such results. My best chance of getting a job was probably when my partner got a bunch of job offers, but none of them were willing/able to consider a spousal hire. That kind of thing seems to work out for other people.

Q: Do you think it represents an institutional failure that someone as talented as yourself cannot get a job?

I obviously like to hear that people appreciated the work that I did, and I am very proud of it. But it is worth considering what this question might mean. The math job market is zero sum in every meaningful sense.
If you want there to be more nice research jobs available, sure, that would be nice, but that is a very general complaint, and applies as much to everyone else as to me. (Pure mathematicians are the only people who feel this way. Most people want more doctors, nurses, and teachers). If you think I am better qualified than some of the people getting the jobs, then maybe or maybe not, but that is very hard to speak to. Generally, it is worth being humble about how limited your judgment is about all the mathematics being done, and how strong the other candidates out there are.

Q: Why do other people quit mathematics?

They can't get the job they want. They don't believe they can get the job they want. They get bored with mathematics (or at least the way it ultimately gets practiced). They get a job and realize they don't like some aspect of it, and it's very difficult to get another job. They become interested in doing something else. They become interested in the idea of making lots of money. They need to make more money. They become sickened by the scientific community's connections to the military-industrial complex.

Q: Why do so many mathematicians leave academia and then go into tech and finance?

The short answer is because that is where the money and opportunity are, but there is a far better, longer answer that makes such a decision look far more sympathetic. Getting an academic, tenure-track job is an example of what I will call a prestige vocation. There are other examples, mostly creative pursuits, but the easiest identifier is whether you can imagine a MasterClass (TM) video series being taught by someone doing this job. These jobs tend to have limited opportunities, fought over by highly qualified, highly educated, frequently highly privileged individuals, who have to stick it out in subordinate positions for many years. At the same time the industry itself either isn't particularly lucrative, or isn't even profit-driven at all, from a market perspective. In some of the worst examples, aspirants are often required to pay increasing amounts for the degrees and accreditation just to access the bottom rungs.

Often this was different many generations ago, but not anymore. Frequently people in these fields are overworked and underappreciated, becoming miserable and dispirited. So when a mathematician jumps ship they would be wise to avoid making the same mistake again. And if you are a mathematician, tech and finance are two industries that are ready and happy to put you to work.

As prestige vocations go, mathematics is probably one of the best. You are paid to go to graduate school, and there are lots of perks, like getting to travel. Best of all, you get to do mathematics.
You actually get to do the thing. You aren’t an assistant or an extra on a film set waiting for your moment.
You aren’t an editorial assistant with no time to work on your own novel. You get to do your thing.
Frequently, people do get the kind of job they are after — and maybe you will too! And, when the opportunities run out, you have a kind of training and qualification that you can convert into a new career.

It is also worth saying something about the tech industry. Computers are fascinating engines of mathematics worthy of a little respect. I often think about Frank Nelson Cole. In 1903 he gave an AMS presentation entirely devoted to multiplying 193,707,721 and 761,838,257,287 to demonstrate that he had factorized the Mersenne number 147,573,952,589,676,412,927. According to his own report, discovering the factors had taken three years of Sundays. Today, I can spend 5 minutes writing a program in Python that will discover the prime factors in a couple of seconds.
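If you want to check that boast, here is a minimal sketch of such a program. It uses Pollard's rho, a textbook factoring method rather than anything clever of my own, and the only inputs are Cole's numbers (his Mersenne number is 2^67 − 1):

```python
import math
import random

def pollard_rho(n: int) -> int:
    """Return a nontrivial factor of the composite number n (Pollard's rho)."""
    if n % 2 == 0:
        return 2
    while True:
        x = random.randrange(2, n)
        y, c, d = x, random.randrange(1, n), 1
        while d == 1:
            x = (x * x + c) % n           # tortoise: one step
            y = (y * y + c) % n           # hare: two steps
            y = (y * y + c) % n
            d = math.gcd(abs(x - y), n)
        if d != n:                        # d == n means this attempt failed; retry
            return d

n = 2**67 - 1  # Cole's Mersenne number, 147,573,952,589,676,412,927
p = pollard_rho(n)
print(sorted([p, n // p]))  # [193707721, 761838257287], in a fraction of a second
```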

Q: Do you spend all your time wishing you had a nice tenure track job at a research university?

Not really. Getting a tenure-track job, even a nice one, would have been a considerable change of pace. Assuming that the set-up resolved my personal situation as conveniently as my current job, a lot would have changed: more teaching, more meetings, and more responsibilities. Do the right postdocs and you can avoid all that until you hit the tenure track, but you risk ruling yourself out of the running for many such jobs.

I feel like I had a lot to bring to the table as a teacher that I never really had the opportunity to offer. It would have been a lot of fun to have clever graduate students to farm out all the problems I couldn't solve. I think I would have been a good graduate student supervisor.

In terms of the research, I now have a solid appreciation of the amount of work involved in making substantial contributions happen. I don't rue what I wasn't able to do. In my more reflective moments, I rue the fact that I can't go back and become a graduate student again with all that I now know. If I were really dead set on returning to academia, my best bet would be to change my name, choose a distant, unrelated, but similarly exciting field, lie about my age, and start all over. I'd fucking kill it the second time around. Leave everyone else for dead. Be feted within that clique, crank out some collaborations with the top peeps, and then when I collect my Fields Medal all the geometric group theorists would jump out of their seats and exclaim "That's Daniel. He's an old man! Take that prize away from him, the fraud!"

Q: Should I become a data scientist?

I can’t tell you what you should do. You have to look into your heart. Data scientist is a popular option and you can probably discuss that career with people who have actually gone away and done it. There are many such careers you might think about. Software engineering, AI, actuarial science, consulting, options trader, teacher, project manager, life coach, technical writer, substacker, astronaut. I can’t tell you how realistic or suitable any of them are for you. I can’t tell you what positions to apply for. I can’t tell you what company to try and work for. Academia offered a very limited range of opportunities and people generally would apply for almost anything and take what they could get. When you make the jump you have to think more carefully and act more deliberately.

Where would you like to work?
Would you like to work remotely?
Do you want to work for a nonprofit?
Do you want to work for a large or small company?
A startup?
Do you want to travel?
Would you like to work with customers?
Would you like to go into management?
Do you want to work for a publicly traded or a private company?
What sectors do you want to work in?
The financial sector?
Advertising?
The arms industry?
You should have opinions on these and other questions and let them guide you.

I know someone who quit academia and went off cycling around South America until Covid put an end to that. You have to respect the ability to make such decisions.

Q: Can I get a job in the tech sector without programming experience?

Yes, but you should spend time learning to program. Enough that at least you can do the interview-style problems you can find on LeetCode. It is then important to find a job opening where they are sympathetic to your case and will appreciate what you can offer. There are many companies that understand that the ability to program can be learned on the job, but that many of the qualities you have — mathematical facility, for one — are not so easy to find. There are many jobs you will not be qualified for (at least not at first), so you should keep an open mind about what you can do.

Q: What advice would you give for getting a job?

  1. I don’t know how long the transition will take, but I psychologically prepared for six months. The good news is you can start investigating and preparing in small ways and big ways, even before actually quitting.
  2. Get in contact with people you know or knew who left for industry. They will generally be very willing to talk and help you out. There is a kind of camaraderie among former math PhDs. And at many big companies there is a financial reward when someone you refer gets hired.
  3. Get in contact with people you don’t know in industry. There are all kinds of ways to do this. I haven’t investigated them all. But learn to write short, polite emails that get to the point and aren’t weird.
  4. Learn Python. This is worth doing even if you are remaining in academia. Even if you aren’t going into the tech sector. It’s widely used, allows you to script quickly, and there are lots of resources for learning. Having some appreciation of what can be done in code is a very useful skill to have. What constitutes “learning Python” is obviously unclear. But best of all, if you hate it, you have learned that much. All that said, like mathematics, if you don’t have the right kind of motivation, it might be worth not bothering at all.
  5. Write your CV. Get other people to read it.
  6. Don't be discouraged when your applications are ignored. Probably you didn't apply for the right position. Half the battle is finding the right position to apply for. You probably don't even understand what the right position to apply for is. That is why being able to talk to someone like a recruiter is so useful. They might actually know where they can place you. (I had the good fortune to talk to an internal recruiter. I hear that external recruiters can be playing a numbers game, and may be less useful.)
  7. Take/audit courses that might be relevant. If you are still in an academic institution, taking a course on the side during a semester is a great way to hedge against the possibility of leaving at a later date. MIT has a bunch of good lectures online.
  8. Trust your mathematical instincts.
  9. Communication skills are valuable, to the point that saying so is something of a cliché. When I did my technical interview I went into TA mode, explaining what I was doing, and being self-deprecating about what I was confused about. If I had to interview again, I'd be 300% better. At least. But I was good enough.
  10. Be interested in what companies do, what people’s jobs involve. Sometimes you just have to ask to learn something useful.

Q: Weren’t you afraid that once you abandoned academia all your accumulated mathematical powers would dissipate, leaving you an empty vessel?

I feel like I spent enough time in the game that the thinking has been inscribed into me in some permanent way. I've also concluded that mathematical knowledge and understanding is not some precarious tower built up over successive years that will collapse without careful maintenance.
I believed that when I was an undergraduate, and the cumulative effect of undergraduate education certainly gave that impression. While I think good undergraduate courses should have that satisfying effect, I also suspect it can be harmful when transitioning into research.

Q: Weren’t you afraid of losing all those friends you made in mathematics?

Friends have confessed to me that this has been a non-trivial consideration. What is definitely true is that after a while as a mathematician you know a lot of people attending conferences, and you do enjoy having these regular reunions with them. The important point is that you don't have to organize the get-together (unless you are actually the conference organizer). This is nice, but really you should proactively maintain relationships with friends. Email, WhatsApp, or meet up. Postcards are cool.

Q: I feel like my sense of self worth is tied to the mathematics I do, and the mathematical success I aspire to have. Is this a good reason to stay in mathematics?

You should not tie your self worth to success in mathematics. Pick anything else:
Being a good cook.
Having a sick set of shiny OG Pokemon cards.
Reading the poems published in the New Yorker.
Having a super discerning taste in music.
Always doing the washing up immediately after you’ve finished cooking.
Having an article appear on an online humour site.
Raising your FIDE chess ranking.
Raising your chess.com ranking.
Maintaining your wordle averages.
Finishing Gravity’s Rainbow.
Having very tidy cursive handwriting.

Just not being successful in mathematics.

Q: It sounds like by leaving academia and going into industry you have become a little too comfortable with late capitalism.

If you are afraid that I am not a good communist, then you are probably right. But there is a serious point here, one that isn't especially political, and might be helpful. You should distinguish between having principles about what kind of work you want to do and being generally squeamish about capitalism. The danger is that when you do start to feel the desperation of finding some kind of job, you will embrace a kind of nihilism and decide that since you can't achieve the purity of tenured academia, you might as well make money by whichever means is most lucrative, working for whoever is ready to pay you.

[A quick note: this article lives here, on my blog. If you liked it, please do share it. Otherwise, I can assure you, very few people will find and read it.]

Retrograde Motion

Before Newton there was Copernicus, and before Copernicus there was Ptolemy. Living in the second century AD, Ptolemy produced what would become the definitive work in astronomy for the next millennium. It was a geocentric system: the Earth, quite sensibly, set at the center of the solar system. While geocentrism was ultimately to suffer the ignominy of being synonymous with backward thinking, Ptolemy certainly didn't lack in mathematical sophistication.

Keeping the Earth at the center of the solar system required a great deal of creative invention. It was taken as axiomatic that the planets should travel at constant speeds, and adhere to the perfect forms of geometry (that is to say, circles and spheres). But the planets that appeared in the night sky did not conform to these expectations. Unlike the sun and moon, which flattered us earthlings with their regular appearance and disappearance, the planets would sometimes slow down and reverse direction — what was called "retrograde motion". The solution that Ptolemy and his predecessors developed was a whole Spirograph set of celestial structures called deferents and epicycles. This essentially involved imagining that the other planets were not set upon a wheel revolving about the Earth, but set on a wheel upon a wheel in motion about the Earth. And, if necessary, perhaps some greater sequence of nested wheels.

Copernicus, the Catholic canon and Polish polymath of the fifteenth and sixteenth centuries, had, like every other astronomer of his day, read Ptolemy. Yet after carefully studying the night sky and much thought, he developed a heliocentric map of the solar system. That is to say, with the Sun at the center. While he managed to free himself from geocentric difficulties, and dramatically simplify the situation in many respects, he still adhered to a belief in constant speeds and circular orbits. It would take Kepler and ultimately Newton to settle the matter with elliptical orbits determined by the force of gravity.


The heliocentric theory was controversial for two reasons. The first, and quite reactionary, objection was based on readings of a handful of Bible verses. For example, when Joshua led the Israelites in battle against an alliance of five Amorite kings he ordered the sun to halt its motion across the sky, thus prolonging the day, and with it the slaughter of the opposing army. The point is that Joshua ordered the sun to stop, and not the Earth. This might seem like pedantry, but that was precisely the point. The Catholic church hoped to hold a monopoly on biblical interpretation, and someone lower down the ecclesiastical hierarchy conducting their own paradigm shift equipped with nothing more than astronomical data and mathematics could set a dangerous precedent, at a time when many such precedents were already being set.

The second, and quite serious, objection was that it created a whole new set of scientific questions. Why don't we feel like we are moving through space? Not only as the Earth moves about the sun, but as the Earth rotates daily about its own axis? The numbers required to calculate the implied velocity were known. And on top of that, if we were moving at such great speed, then why did we not observe a parallax effect between the stars? As the apparent distance between two buildings appears to change as we move past them, why couldn't we observe a similar shift in the stars as we moved? Copernicus' answer was that the stars were much farther away from Earth than had ever been imagined before. It was a correct deduction that didn't do much to convince anyone.


Both Copernicus and Newton were reluctant to publish their ideas. In Newton's case he was satisfied to have developed his calculus and did not care to suffer the scrutiny that others would subject his theory of gravity to. His experience justifying his thinking to other scientists had soured his relationship with the wider scientific community of his day. It was only when it became clear that Leibniz had independently developed the tools of calculus that he finally set about writing up, formalizing, and getting his hands on data in order to present his Principia.

Copernicus had gathered his data and written his book, yet for many years did not publish it. De revolutionibus orbium coelestium would only arrive in print as he lay on his deathbed. While Copernicus had friends who supported his astronomical pursuits, it seems to have been the arrival of a young Lutheran mathematician, Georg Joachim Rheticus, that was the key instigation in bringing the manuscript to print.

No one had invited him or even suspected his arrival. Had he sent advance notice of his visit he doubtless would have been advised to stay far away from Varmia. Bishop Dantiscus' most recent anti-heresy pronouncement, issued in March, reiterated the exclusion of all Lutherans from the province — and twenty-five-year-old Georg Joachim Rheticus was not only Lutheran but a professor at Luther's own university in Wittenberg. He had lectured there about a new direction for the ancient art of astrology, which he hoped to establish as a respected science. Ruing mundane abuses of astrology, such as selecting a good time for business transactions, Rheticus believed the stars spoke only of the gravest matters: A horoscope signaled an individual's place in the world and his ultimate fate, not the minutiae of his daily life. If properly understood, heavenly signs would predict the emergence of religious prophets and the rise or fall of secular empires.

A More Perfect Heaven — Dava Sobel

I suspect that we may underestimate the weight that belief in astrology carried in some (but not all) quarters. Many looked back to the Great Conjunction of 1524 as heralding the rise and spread of Lutheranism — an ideological shift with profound and widespread implications that might only be matched by Communism. We live in an age of scientific prediction, taking for granted the reliable weather forecast on our phone in the morning. We (at least most of us) accept the deep implications of the climate data for our future, while also paying heed to the sociology and political science that can help us understand our lack of collective action. If we accept astrology as being a kind of forebear to our own understanding, you can perhaps appreciate why Rheticus might have been willing to take such risks to pursue a better understanding of the stars.

We can only imagine what Rheticus must have said to Copernicus that led him finally to prepare his manuscript for publication. And that is what Dava Sobel has done, writing a biography of Copernicus, A More Perfect Heaven, which contains within it a two-act play dramatizing how she imagines the conversation might have gone. It presents a Rheticus shocked to discover that Copernicus literally believes that the Earth orbits the Sun, and a Copernicus perplexed that the young man takes astronomy seriously, but who is won over by the prospect of taking on such a capable young mathematician as his student.

Rheticus' principal legacy is in the précis of Copernicus' theory that he wrote and had distributed as a means of preparing the way for the ultimate text. His contributions would ultimately be overshadowed by his later accusation, conviction, and banishment for raping the son of a merchant. While Sobel presents Rheticus in her play as pursuing/grooming a fourteen-year-old boy, it does not feel like she knows exactly where to take this dramatically. By way of contrast, John Banville in his novel Doctor Copernicus gleefully transitions to a Nabokovian narrative upon Rheticus' arrival.


There is an interesting dramatic irony in the way Copernicus' ideas were initially received. There was a ruse, by certain parties, to present Copernicus' heliocentric theory as simply a means of computation. It could be tolerated if it was understood that one was not supposed to actually believe that the Sun was at the center of the solar system. Which struck some as a reasonable compromise. The Catholic church was drawing up what would become the Gregorian calendar, and Copernicus made important contributions to calculating a more accurate average for the length of a year.

Yet now the situation has been reversed. While Copernicus' techniques were rendered obsolete with the arrival of calculus, his conceptual picture carries on in popular understanding. Meanwhile, as Terence Tao and Tanya Klowden have noted, Ptolemy's deferents and epicycles live on in the mathematics of Fourier analysis — a means of approximating arbitrary periodic functions using trigonometry.
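To see the connection concretely, here is a small sketch of my own (not anything from Tao or Klowden's discussions): read a periodic path in the plane as a complex-valued function, and its Fourier coefficients are exactly the wheels. Each term traces a circle, mounted on the sum of the wheels before it.

```python
import numpy as np

# A toy "orbit": one big wheel (the deferent) plus one small
# counter-rotating wheel (an epicycle).
N = 256
t = np.arange(N) / N
z = (1 + 0.3j) * np.exp(2j * np.pi * t) + 0.2 * np.exp(-2j * np.pi * 3 * t)

c = np.fft.fft(z) / N            # one Fourier coefficient (wheel) per frequency
k = np.fft.fftfreq(N, d=1 / N)   # the integer frequency of each wheel

# Rebuild the path from its two largest wheels. More wheels give a better
# fit -- exactly the game Ptolemy's nested epicycles were playing.
top = np.argsort(-np.abs(c))[:2]
z_approx = sum(c[j] * np.exp(2j * np.pi * k[j] * t) for j in top)
print(np.max(np.abs(z - z_approx)))  # ~1e-15: two wheels recover this path
```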

Even within fields as definitive as mathematics and science, it is interesting how defunct and obsolete thinking can both be revealing and persist in strange second lives. Why someone believed something can become more important than the truth of the thing. Eratosthenes deduced an impressive approximation for the Earth's circumference after hearing a story about a well that would reflect the light of the sun at noon. We possess a more accurate figure now, but the technique never grows old.
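For what it's worth, the computation itself fits on one line, using the classical figures usually quoted: the sun stood directly overhead at Syene at noon on the solstice, while at Alexandria it cast shadows at an angle of about 7.2°, or 1/50 of a full circle, and the two cities were taken to be 5,000 stadia apart. So the circumference comes out as 50 × 5,000 = 250,000 stadia.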

The wrangler's insecurity

[This is my third post on Newton. Previous posts: one and two.]

If you were to take a look around you during a math department seminar or colloquium, you would witness the audience's attention begin to drift as the talk sank further into detail and became increasingly difficult to follow. Losing interest in a talk is more or less expected, and the professional mathematicians in the audience come prepared. Maybe they bring a paper to read, or possibly exam scripts to grade. Sometimes they will turn to a fresh page of their notebook and begin doing some actual mathematics of their own.

As a graduate student at McGill, I remember watching a postdoc fill up a page with long exact sequences and all kinds of diagrams, the notation veering into doodles as he got stuck at what must have been a familiar dead-end. It was a rare, voyeuristic glimpse into someone else’s solitary mathematical practice. I later asked this postdoc — whose notebook I presumed was full of such pages — if he ever went back and reread what he had written. No, he admitted, with a guilty smile.

Which was a relief. Not only because my own notebooks were full of repetitious dead-ends, but also because I too almost never went back to review anything I’d written.

Much has been made of Imposter Syndrome among academics — doubting whether we have truly earned whatever position we have reached, given how paltry our contributions can sometimes feel. There is a related sense of insecurity to be found in wondering if you are doing mathematics correctly. To be clear, I don't mean whether a proof we have written up is sound, but whether or not our process of formulating and devising proofs is the "proper" way. As if there might be a correct way (or even a professional way) of doing mathematics.

These are not new concerns to have.

Isaac Newton was incredibly secretive in his work and did not have anything approaching students as we might describe them. But after his death, the calculus he developed would form the foundation of modern mathematical and scientific education at Cambridge.

Those who scribbled hastily on those exam papers were students, above all, of Newton's mathematical physics. Though Newton had not cultivated a following during his own tenure at Cambridge, by the end of the eighteenth century the principles laid down in the Principia — and in particular the mathematical contents of that book — formed the basis for an intensely competitive system of testing at the university by which students were ranked in descending order based on their results on terminal examinations, known as the "Mathematical Tripos." (The origin of the term Tripos is uncertain, but it may refer to the three-legged stool on which students originally sat to take the oral examinations.)

The Newton Papers – Sarah Dry, pg 85

The material, and especially the notation, would be modernized as European influences arrived, but Newton did not lose his centrality. The manner in which he actually arrived at his great insights became a matter of interest. There is a great distinction between how discoveries are made, and how they finally appear on the page. Everyone knew how they personally went about doing mathematics, and even how their tutors told them to do mathematics, but was that the same as how Newton went about making his original discoveries? For all they knew, it might have all been provided to him by divine revelation.

On the Quadrature of Curves. See the Cambridge website to view more scans of Newton’s papers.

In 1872 a means of settling the question presented itself. Newton's papers — or at least a large portion of them, covering far more than mathematical physics — had resided for nearly 150 years in the library of one of England's aristocratic houses: Hurstbourne Park. But now the Earl of Portsmouth was donating the scientific portion of the papers back to the University of Cambridge.

Newton had been famously coy about his own methods, suggesting that he had kept his true means of discovering the Principia private and had only cast them publicly in the language of geometry. The question was therefore whether he adhered to the rigorous, manly, and above all morally upright techniques of thinking that Cambridge undergraduates were coached to acquire. To answer this Stokes and Adams were forced to consider whether Newton himself should — or could — be held accountable to the techniques that were mastered in his name. The Newton papers had the potential to probe more deeply the shadowy divide between patient work and divine inspiration, offering the promise of settling not simply what Newton had done but how he had done it. […] the question had a special urgency at Cambridge where the moral value of study was paramount. In that respect the Newton papers mattered for every undergraduate preparing for the Tripos and for what the Tripos itself stood for. Would the man who served as a model for what should be learned also reveal himself through his private papers, as a model for how to learn?

The Newton Papers – Sarah Dry, pg 88

Sarah Dry, author of The Newton Papers, a chronicle of the journey Newton's writings took after his death, presents an interesting comparison of the two mathematicians, John Couch Adams and George Gabriel Stokes, who were tasked with making sense of his old notebooks and papers.

John Couch Adams (1819-1882)

On the one hand was John Couch Adams, whose ability to compute mathematically in his head was the stuff of Cambridge legend. This savant-like ability came with a tenacious reluctance to write anything down. This reluctance cost English astronomers the first opportunity to observe Neptune. Having deduced, from Uranus’ orbital irregularities, where a mystery planet should be found in the night sky, he failed to explain himself clearly to the astronomical bigwigs, who had little patience for the recent graduate. Roughly a year later, in 1846, the Frenchman Urbain Le Verrier managed to solve the problem and pointed his country’s own telescopes in the right direction. Adams was left with nothing but his incomplete written accounts and undated papers declaring his discovery, making establishing precedence impossible. Not that he seemed much bothered by losing out on the glory. He was personally very satisfied simply to have managed the computation.

Sir George Gabriel Stokes, 1st Baronet (1819-1903)

George Gabriel Stokes (of Navier-Stokes and Stokes' Theorem), on the other hand, wrote compulsively, both mathematically and in personal correspondence (often to ease his own insecurities). Later in life he became editor of the Philosophical Transactions of the Royal Society, then the foremost journal in science, and this involved dealing with a huge amount of correspondence. Unfortunately he was a hoarder of papers of all and every kind, filling the rooms at his disposal with tables on which to pile up his papers. This was all compounded by his inclination towards procrastination.

They might have made a formidable team, had their temperaments combined to negate each other's weaknesses. Instead the project to deliver a verdict on the value of Newton's papers and reveal his way of thinking was subject to great delay. Of the two, however, it seems that Adams was the one most readily able to probe the documents deeply. Sarah Dry quotes Glaisher (Adams's obituarist) as saying:

[…it was a] difficult and laborious task, extending over years, but one which intensely interested him, and upon which he spared no pains. In several instances he succeeded in tracing the methods that Newton must have used in order to obtain the numerical results which occurred in the papers. The solution of the enigmas presented by these numbers written on stray papers, without any clue to the source from which they were derived, was the kind of work in which all Adams's skill, patience, and industry found full scope, and his enthusiasm for Newton was so great that he had no thought of time when so employed. His mind bore naturally a great resemblance to Newton's in many marked respects, and he was so penetrated with Newton's style of thought that he was peculiarly fitted to be his interpreter. Only a few intimate friends were aware of the immense amount of time he devoted to these manuscripts or of the pleasure he derived from them.

John Glaisher — Memoir of the life of John Couch Adams

What Adams was doing, in his own manner, was nerding out. As with the discovery of Neptune, it seems that his motivations were overwhelmingly personal, and less in service to the scientific community. Imagine a referee today reading the paper under review very carefully, but forgetting to take notes and neglecting to get back to the editor. Nevertheless, conclusions were eventually drawn out of their little committee and a report of their findings was presented.

Here was confirmation that Newton had indeed worked by a process of refinement that inevitably included false starts and error. In this sense, Newton revealed himself to be less an otherworldly genius and more a figure with whom the Cambridge wranglers could identify, a tireless worker in the mathematical trenches, where progress was made by increments rather than leaps. Adams knew the feeling well. In 1853 he had published an important paper pointing out errors made by Laplace in determining lunar motion and promising to provide the correct calculations soon; it had taken him six long years to get the final numbers. Here, in the papers, was evidence that Newton had worked just as hard to come up with his results.

The Newton Papers – Sarah Dry, pg 105

By the time I had finished my PhD I had produced a sizable pile of used dollar-store notebooks. Browsing through them I could recognize the contours of what I'd spent the past four years trying (and occasionally succeeding) to do. I might even have reconstructed, from the pictures and computations I had written out, what I had actually been thinking at the time. And aside from myself, there are a few people in the world who could possibly make sense of their contents. It all went in the recycling. If somehow one of my notebooks did manage to survive, and made its way into the hands of future scholars, I would be alarmed to consider them giving the contents more than cursory attention.

The hundreds (?) of pdfs that I have produced, now sitting out there in the cloud, stand a far better chance of outliving me. And not just my published work and arXiv pre-prints (which number in the tens), but everything I ever committed to a LaTeX document in my own personal space up there in the cyber heavens. Among all the discarded drafts one might find evidence of something interesting. Not only what I managed to do, but also what I failed to do. What I thought I had succeeded in doing, but had in fact betrayed my own good sense. When I have found mistakes, I am occasionally mindful enough to leave a short note in all-caps to make it clear where the point of failure lies. There is a great deal we can learn from knowing what other mathematicians have tried and failed to do.

Unlike physical notebooks, our cloud storage is password protected. Digital inheritance is already "a thing", but it seems unclear to me how it will work out in practice. Kafka left his manuscripts in the possession of Max Brod with the instruction that they be destroyed in the event of his death. Brod told Kafka himself that he certainly wouldn't, and indeed when Kafka died at the age of 40 as a consequence of tuberculosis, Brod set about getting Kafka's work published. I haven't taken a survey, but I would imagine that most young writers have made no attempt to ensure their passwords and unpublished estates are in suitable hands. In principle it is possible to submit a request to Google for access to the accounts of the deceased, but I can't imagine there are any guarantees. I certainly have no idea what the terms and conditions that I have accepted have to say about such eventualities.

Newton died a man of wealth and importance. With neither wife nor children he had no direct descendants, but he did have a slew of half-nephews, half-nieces, and children of his half-sisters. The assets of obvious value were split between them. Those assets of less obvious value — the leftover pile of notebooks and "reams of loose and foul papers" — fell into the possession of Catherine Conduitt. She was one of Newton's half-nieces, and wife to John Conduitt, who had actively assisted Newton in his duties as master of the Royal Mint. This was the consequence of some rather wild tying-up of loose ends:

Newton had died while holding the post of master of the Mint, which in those days required that its holder assume personal responsibility for the probity of each new coinage of money. That meant that at Newton's death he had nominal debts amounting to the entire sum of Great Britain's national coinage. John Conduitt agreed to take on this debt until the coinage had been certified, accepting liability for any imperfections in the coins. In exchange for assuming this risk, he asked for, and was granted, Newton's manuscripts.

The Newton Papers – Sarah Dry, pg 15

The Conduitts took ownership of these papers with the view of producing a biography, and beginning the work of securing Newton's posthumous reputation. They became the first in a long line of people who had access to the papers, but lacked the tools really required to properly make sense of them. Their daughter, Kitty Conduitt, married John Wallop, who would become the Earl of Portsmouth, and the papers would enter the library of Hurstbourne Park, seat of the Portsmouth family. (That is to say, they fell into the possession of the aristocracy.) And it was there that they would remain, save occasional minor forays, and the recovery of the substantial portion of scientific papers by Stokes and Adams. What finally shifted the remaining papers out into the open was the fall of the English aristocracy. In 1936, under the financial pressure of death duties and a recent divorce, Gerald Wallop, the ninth Earl of Portsmouth, had the papers put up for auction at Sotheby's.

John Maynard Keynes, 1st Baron Keynes (1883-1946)

If there is a hero in Dry’s account of the Newton Papers, it must be John Maynard Keynes. His heroic virtue being exceptional taste and judgement. Having begun collecting books as a child (possibly his first foray into speculation) he developed a rather prescient sense for what should be considered valuable. Unlike the majority of collectors he shared the marketplace with, Keynes was actually interested in reading the books themselves. He was less interested in the superficial qualities: illuminations, illustrations, binding, or an illustrious list of prior owners left him unmoved.

Keynes's new style of collection was self-consciously intellectual, as opposed to aesthetic or literary. It asserted that a particular history of ideas or chain of thought linked certain men through the ages. And it projected the implicit assumption that its creator was an inheritor of both the material and the intellectual masterpieces of a previous age. Keynes was a thoroughgoing Bloomsburyite in this respect. The paintings on the wall, the rugs on the floor, the furnishings in the room, and the books on the shelves were never just things: they were the physical embodiment of ideas and values whose display was a source of both aesthetic pleasure and moral reinforcement. A book in the hand, like the good life in Bloomsbury or the Sussex countryside, linked the life of the mind with that of the physical world.

The Newton Papers – Sarah Dry, pg 147

You might already get the sense that Dry sees Keynes as simply bringing a new set of beliefs to the table, complete with their own set of limitations. Indeed, Keynes's considerable contribution to our modern impression and understanding of Newton, as half magician and half scientist, was really a very hot take based on an initial reading. He was the one who announced that the papers reveal Newton devoted great time and energy to the disreputable pursuits of alchemy and heretical theology. Yet the fact that so many of Newton's papers have remained together and in the possession of the University of Cambridge can be attributed to his prescient sense of the papers' importance.

Abraham Shalom Yahuda (on the right) (1877–1951)

Keynes was only one of two major buyers at Sotheby's. Abraham Yahuda, a scholar of ancient languages, bought most of Newton's theological writings. Yahuda had found himself alienated from his own field of scholarship, due to recent developments in Higher Criticism applied to biblical scholarship. The Documentary Hypothesis was a shocking new line of textual analysis which argued that the origins of the Torah lay in the combination and synthesis of earlier texts. As a consequence, these texts cease to resemble one coherent whole revealed to man, and begin to look more like artifacts of history and culture.

For Yahuda this was a vision of criticism taken to extremes, the text reduced to nothing but error, the possibility of meaning dissolving amid a multiplicity of authors, leaving only commentary, a Talmud with no Torah left in it. He thought in particular that too many sources were being attributed to the Pentateuch and that too many “experts” were exerting themselves “in the art of text alterations and source-hunting.” Thus “the original text was distorted and disfigured and in its place was offered a quite new text of pure invention.” In Newton, who himself sought to return a blemished Christianity to its purer origins, Yahuda found a kindred soul. Interpreting ancient texts didn’t require robbing them of fixed meaning. Both Newton and Yahuda sought instead to find a singular truth amid the variations.

The Newton Papers – Sarah Dry, pg 170

As a consequence of Yahuda’s desire to find an ally in Newton, those theological papers now reside in The National Library of Israel.

The final portion of Dry’s book concerns the subsequent attempts at synthesis of the material. The fact of the matter is that the task was simply impossible. There is too much material, covering too many subjects for any grand unifying conclusions to be drawn. It was hard to even put together a definitive edition of the Principia that covered all the different editions as well as Newton’s own marginalia. When finally published it was controversial due to the inevitable editorial decisions to include or not include certain material.

It is worth making one final point clear. I have never read Newton's Principia. I don't believe you could find a research mathematician alive who has — unless their research happens to be the history of mathematics. It is a book whose significance is measured in its influence. Many decisions in its composition — in particular the modelling of Newton's Laws on the axioms in Euclid's Elements — were very important. But you should not read it. When we discuss "great" books, there is usually the tacit understanding that we are missing out if we have not actually read the book. I am quite certain that we have not missed out.

On English Magic

Jonathan Strange and Mr Norrell, Susanna Clarke’s alternate history fairy-tale, opens with one of its characters asking a question that carries through the rest of the novel’s thousand pages.

“Mr Segundus wished to know,” he said, “why modern magicians were unable to work the magic they wrote about. In short, he wished to know why there was no more magic done in England.”

This is an England that had once been a very magical place, yet is no longer so. Over the decade of "history" the novel covers (1806-1817) we see two new magicians arrive to provide their spells in service to their country in the Napoleonic Wars. The magic of previous generations had been lost, or forgotten, or become dysfunctional in some way. This was apparently despite the many books about/of magic that had been written by the very real magicians of the past, making their secrets and practices clear. Indeed, as the novel opens, England has many learned societies of magicians, but their members are exclusively of a "theoretical" type — quite unable to cast a single simple spell.

The title characters are our heroes, of a sort. Mr Norrell is an uncharitable and unsociable Yorkshire gentleman who devoted his youth to carefully studying the remaining books of magic, while also hoarding them away from others. Jonathan Strange, the more sympathetic of the two, is of a more obsessive and intuitive character: sociable and likable, ready to befriend Norrell, and to complement his own innate ability by becoming Norrell's apprentice.

Reviewers have noted the imbalance between the two protagonists, with Strange being the more compelling of the two, yet only actually arriving in the narrative proper a third of the way in. I personally found a great deal of interest in Norrell, however, when I recognized parallels between him and the Isaac Newton I had recently been reading about. Indeed, as I previously described, many of Newton's pursuits could be described as attempts to recover magical techniques or knowledge from the past that had become lost or forgotten. Norrell's inclinations towards hoarding his recovered knowledge and even monopolizing magic are reflected in many of Newton's own inclinations. The ultimate difference being that science is not magic, and Newton himself was indebted to many of his contemporaries (most controversially to John Flamsteed for astronomical data). But if you were trying to imagine the mentality of a man like Isaac Newton, I think you could do much worse than consider the character of Norrell.

It can be considered a kind of rule in storytelling that you make a promise at the start of a story and must deliver on it by the end. The question of why there is no magic done in England certainly makes a clear promise that some kind of light will be shed on the matter. While neither the reader, nor the characters, get direct or complete answers to that question, we do however learn a great deal that is interesting on the subject. Plenty can be deduced by a careful reader — enough to leave the book satisfied.

Jonathan Strange and Mr Norrell was a publishing sensation. Bloomsbury invested heavily in marketing it, imagining that it would allow them to expand beyond their foundation of Harry Potter sales. I think it is fair to say that, almost twenty years on, it is regarded as a classic. Insofar as such a thing can be said at this point. I certainly found the length no obstacle, and by the final third I was enraptured by the characters' unfolding trajectories and their ultimate ends.

It is easy to say that a novel is just words on the page, but the word "just" is doing a great deal of heavy lifting. I find myself increasingly paying attention to what goes on in an individual paragraph the way that film buffs concern themselves with actors, cinematography, and special effects. Take the following paragraph, which demonstrates quite well the Austen-esque prose styling, along with Clarke's ability to capture the regional richness of England, which I had either never really encountered or never appreciated before in English fiction.

At no. 9 Harley-street Lady Pole's country servants were continually ill at ease, afraid of going wrong and never sure of what was right. Even their speech was found fault with and mocked. Their Northamptonshire accent was not always intelligible to the London servants (who, it must be said, made no very great efforts to understand them) and they used words like goosegogs, sparrow-grass, betty-cat and battle-twigs, when they should have said gooseberries, asparagus, she-cat and earwigs.

pg 173

The list of alternates given in that paragraph passes by quickly in the way that good set dressing, special effects, and cinematography do, but this is harder to pull off than you might imagine. The following paragraph similarly stood out to me, again for its command of its setting, but also for the kind of fantastical whimsy that the likes of Lord of the Rings, Harry Potter, and Neil Gaiman manage to tap into. If it could be meaningfully produced industrially you'd find it sold in the Harry Potter stores scattered all across England.

Whereupon Mr Strange told them how, to his certain knowledge, there had been four copies of The Language of Birds in England not more than five years ago: one in a Gloucester bookseller's; one in the private library of a gentleman-magician in Kendal; one the private property of a blacksmith near Penzance who had taken it in part payment for mending an iron-gate; and one stopping a gap in a window of the boys' school in the close of Durham Cathedral.

pg 281

For those of you reading this who are put off by a thousand-page doorstop, there is also the BBC miniseries adaptation that I hear was pretty good. And if you would prefer a shorter novel, her second, Piranesi, was released last year; it is much shorter and also very good. The New Yorker wrote a good profile of Clarke discussing it.

There has been a murder in Gathertown

If you orient yourself temporally you may remember that back in August there was an online fracas involving mathematics. A teenage girl, doing her makeup before work, decided to take the opportunity to lay out for her TikTok followers her skepticism about the idea of math generally:

Who came up with this concept? “Pythagoras!” But how? How did he come up with this? He was living in the … well I don’t know when he was living, but it was not now, where you can have technology and stuff, you know?

Grace Cunningham, TikTok user.

As was keenly observed by the many keen observers out there, the initial response was a pile-on that combined general misogyny with gen-Z hatred; it was the latest installment in the long-running complaint about kids these days. This reactionary abuse was soon countered by a more positive wave of responses that acknowledged that her questions were not only legitimate, but exactly the kind of questions our curriculum does little to answer.

I don't think many mathematicians are particularly satisfied with the way our subject is generally taught. At the university level I find it hard to love force-marching students through rote material, stripping centuries' worth of mathematics of all its scientific and historical context along the way. So obviously I am happy to see any student kicking back at what we inflict on them. But for those who have made mathematical communication their vocation it was a solid gold opportunity to evangelize. Eugenia Cheng wrote a pdf answering Grace Cunningham's formalized list of questions, and Francis Su wrote a Twitter thread.

Grace Cunningham was calling the bluff on the pretenses of her education. In particular, the pretense that you should obviously be learning whatever we are telling you. "Why are we even doing this?" is a legitimate question in a mathematics course, and "why on earth did anyone prove these theorems in the first place?" is an even better one. "How did people know that they were right?" presents the awkward truth that people are quite often confidently incorrect about many things. What makes these questions awkward is that the people teaching you mathematics will frequently know little to nothing about the history and context within which the theory was developed. Mathematicians are terrible, as a rule, at scholarship, and the history of ideas within mathematics is an essentially distinct field. Most of the context that I have for the mathematics I do is essentially gossip, urban myth, and pablum. Fortunately, while we might be terrible historians we remain excellent gossips, so at least we have plenty of stories to tell.

(I should also concede that it is impossible to generalize in any way about most of my peers. Many of them are tremendously knowledgeable about all kinds of things and wonderful educators. I am, at least to some extent, either projecting or talking about our very worst failings.)

I was dissatisfied with the responses I found to Cunningham's questions. Not least of all because I don't think they really answer the questions. No actual historical context was given. The answers more resemble the kind of general motivation and propaganda we give students to encourage them to listen in class and do their homework. I think a good answer would address the fact that the people who developed much of classroom mathematics had some pretty wild ideas about what they were doing. Their motivations would be pretty alien to us, and are a far cry from homework, exams, or getting a well-paid job.

Just to make this explicit: How many of us who have ever taught or taken a calculus course have even done any astronomy? Just from doing a little reading, an obvious observation seems to be that when people sat down to first learn calculus from Newton's Principia, the big incentive for them was the promise of a serious set of answers about the Sun, the Earth, the Moon, the stars, and even comets. A modern mathematician explaining their motivation for calculus today is a little like a 21st century Western evangelical Christian explaining what the "Old Testament" is all about to an orthodox rabbi.

My modest reading has focused on the life of Isaac Newton. I read James Gleick's biography of Newton (highly recommended) and I have a few more on the shelf. I already had some understanding that aside from developing calculus Newton was a heretic and an alchemist, and later in life Warden of the Royal Mint. I knew he lived through times of plague, apocalypse, dictatorship, and conspiracies, and that his work was a major part of the scientific revolution. Particularly pertinent to Cunningham's question is the fact that for centuries after Newton's death there was a suppression of the full range of Newton's intellectual activities. It was only when John Maynard Keynes acquired a substantial portion of Newton's surviving papers at auction that the truth came out. For a long time Newton's preoccupations would be considered intellectually inconvenient for all those trying to boost his posthumous reputation, and that of British science with it.

The idea of knowledge as cumulative — as a ladder, or a tower of stones, rising higher and higher — existed only as one possibility among many. For several hundred years, scholars of scholarship had considered that they might be like dwarfs seeing further on the shoulders of giants, but they tended to believe more in rediscovery than progress. Even now, when for the first time Western mathematics surpassed what had been known in Greece, many philosophers presumed they were merely uncovering ancient secrets, found in sunnier times and then lost or hidden.

Isaac Newton – James Gleick (pg 34-35)

Here is a not entirely fanciful reading of Newton’s life: starting his university career dissatisfied with the existing knowledge, and curious about the latest developments in astronomy, Newton develops his theory of calculus. But he is not yet really a scientist. He is still very much a wizard. A young man who has uncovered some profound secrets and is keen to discover more. He invests huge amounts of time and energy in alchemy and theology. The alchemy involved tracking down obscure texts that he hoped would contain the secret knowledge of transforming base metals into precious metals, and his notebooks from this period often amount to his copying out these texts. It also involved working with mercury, a poisonous metal known to drive the alchemists who used it to madness.

His theological interests were no less hazardous, since they would have been viewed as clearly heretical by both the Protestant and Catholic religious authorities of the time. By studying the earliest Greek manuscripts he discovered that the concept of the Trinity — that the Godhead is three and one: Father, Son, and Holy Spirit — emerged late in the early church, and certainly couldn't be considered part of the original Christian tradition. Newton concluded Jesus was not at the same level as God and had never claimed to be. At a time in England when having Catholic sympathies could land you in trouble, this was a dangerous view to have.

I would argue that Newton transformed from a wizard into a scientist the moment the German mathematician Leibniz independently derived his own theory of calculus. Newton had no longer uncovered some forgotten knowledge; he had derived a theory that someone else could also derive. He had now entered a race to establish precedence for his own results, and this meant writing up.

For decades his tools of calculus had languished in notebooks and in his mind. Now he had to write them down, and he chose to present them in the style of Euclid's Elements, with axioms, definitions, lemmas, theorems. And most intriguingly, in order to prove the correctness of his theory, he drew upon experimental data: astronomical observations from the newly established Greenwich observatory, and tidal charts. He was able to explain and predict natural phenomena that perplexed his contemporaries, such as the sudden appearance of comets and their unusual paths across the night sky. We can recognize this now as a prototype of the modern scientific method, but back then it was controversial, becoming part of Newton's dispute with Leibniz.

Newton wrote many private drafts about Leibniz, often the same ruthless polemic again and again, varying only by a few words. The priority dispute spilled over into the philosophical disputes, the Europeans sharpening their accusations that his theories resorted to miracles and occult qualities. What reasoning, what causes, should be permitted? In defending his claim to first invention of the calculus, Newton stated his rules for belief, proposing a framework by which his science — any science — ought to be judged. Leibniz observed different rules. In arguing against the miraculous, the German argued theologically. By pure reason, for example, he argued from the perfection of God and the excellence of his workmanship to the impossibility of the vacuum and of atoms. He accused Newton — and this stung — of implying an imperfect God.

Newton had tied knowledge to experiments. Where experiments could not reach, he had left mysteries explicitly unsolved. This was only proper, yet the German threw it back in his face: ‘as if it were a Crime to content himself with Certainties and let Uncertainties alone.’

Isaac Newton – James Gleick (pg 176-177)

Data is now the recognized currency of modern science, and theology is, well, theology. The mathematical analysis that makes calculus rigorous didn’t come until much later. Newton had started using infinite series in his calculus, but it was understood that you had to be careful because sometimes you could get some bad results.
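A classic illustration of the kind of bad result (the standard textbook example, not one of Newton's own): the expansion 1/(1+x) = 1 − x + x² − x³ + ⋯ is perfectly good for |x| < 1, but blindly substituting x = 1 "proves" that 1 − 1 + 1 − 1 + ⋯ = 1/2, a series whose partial sums just bounce between 1 and 0 and never settle on anything.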

When Cunningham asks her TikTok followers how early mathematicians knew they were right, it seems that, in Newton's case at least, there are three answers. Newton first convinced himself with arguments we would not consider mathematically rigorous, along with his own empirical observations. Decades later he convinced his peers by publishing a full written account of his theory (in Latin) that provided supporting data. Then a century or so later the full theory of mathematical analysis was developed.

These questions have complicated answers for Newton, but they are really no less complicated for us today, even if the answers are quite different. We live in the age of the arXiv, computer-assisted proofs, machine learning, and bodies of work that amount to many hundreds of pages. I'm not going to lie; I love the drama of it all. Some would like to present mathematical proof and progress as an enterprise unsullied by the humanity of its practitioners. For my part I am of the belief that the reasons people commit themselves to mathematics are more complicated than just the aesthetic appreciation of equations.