I have never read a Brandon Sanderson novel. Plenty of people haven’t, so that doesn’t make me special, even among avid readers. But a great many people do read Sanderson. So many, in fact, that even among high-profile writers, Sanderson certainly is special. And this week Sanderson’s readership went from buying Sanderson’s books to buying into Sanderson and his books; they put down an accumulated and unholy twenty million dollars (and growing) on his Kickstarter to publish four “surprise” novels in 2023.
But perhaps I should feel a little bit special, because while I haven’t read any Sanderson, he has read me. Or at least he has, on one occasion, in an interview, indicated that he had read the web-comic that I drew as a teenager.
Thog Infinitron, written by Riel Langlois, a Canadian I met on a web-comic forum, and drawn by me, Daniel Woodhouse, is the story of a cyborg caveman and his various adventures. After his body is crushed during a rhino stampede, the titular Thog is rebuilt with all manner of enhancements by a pair of alien visitors. I uploaded a page a week, and the story ran for a grand total of 129 pages before I unceremoniously lost interest and ditched the project somewhere in the middle of my second year of undergraduate mathematics. I had other things going on. Thog’s story was left at a haunting cliffhanger, with story-lines left open and characters stuck forever mid-arc.
I do not even have to look back over my work to recognize that I was a callow and unsophisticated artist. My potential was frustratingly underdeveloped. In retrospect, I cringe at my own haste to produce a popular webcomic that would bring in wealth and recognition, and how that haste led me to neglect my craft. I lacked influence and serious guidance. Or maybe I was simply too stubborn in my ambition. I do wonder how I would have fared if I had been that same teenager today, able to discover the wealth of material and advice that is now available online. You can literally watch over the shoulder of accomplished artists as they draw.
Nevertheless, when I revisited Thog, I was impressed by the comic as a body of work. Langlois’ writing was truly fantastic — in a completely different league from my art. And as rough as the art is to my eye, I have to appreciate the sheer cumulative achievement.
(Please do not go looking for my webcomics. Aside from Thog Infinitron I sincerely hope that my teenage juvenilia has disappeared from the internet, and for the most part this wish seems to have come true through a combination of defunct image hosting and link-rot. Thog is still out there and readable thanks to a surviving free webcomic hosting site, although I’m not sure your browser will forgive you for navigating into those waters.)
At the time, Thog, with its regular update schedule, was a major feature in my life. Now it feels like a distant and minor chapter. Years later I would occasionally do a web search to see if people still mentioned it, if Thog was still being discovered, or if the comic had any kind of legacy at all. It was during one of those web searches that I discovered a passing reference to Thog by Sanderson in an interview on Goodreads.
It is a strange and unusual writer who does not want to be read. And indeed it is strange and gratifying to discover that you have been read. It is an experience that Sanderson enjoys to a singular degree, but that I too have enjoyed to a thoroughly modest degree. At some point during Thog’s run we even gave permission to some particularly keen readers to translate it into Romanian. I have never received a dime for my web-comics, and at the time I didn’t take much note, but in retrospect I’m in awe that I should have received such an honor. Sanderson’s works have been translated into 35 different languages.
The money that is being amassed on Kickstarter for Sanderson’s project is no small thing. The way the arts and literature are funded has profound effects on the culture. The proceeds of bestsellers have traditionally been reinvested by publishing houses in new writers (or so it has been claimed), and I imagine that more than a few people will look at Sanderson’s foray into self-publishing (or working outside a major publishing house) and wonder how different the future might be. But it is at least, for now, worth appreciating the sheer spectacle of a truly devoted readership.
In Don DeLillo’s novel White Noise the protagonist and narrator, Jack Gladney, finds himself struggling to get a straight answer from his precocious teenage son, Heinrich.
“It’s going to rain tonight.” “It’s raining now,” I said. “The radio said tonight.”
[…] “Look at the windshield,” I said. “Is that rain or isn’t it?” “I’m only telling you what they said.” “Just because it’s on the radio doesn’t mean we have to suspend belief in the evidence of our senses.” “Our senses? Our senses are wrong a lot more often than they’re right. This has been proved in the laboratory. Don’t you know about all those theorems that say nothing is what it seems? There’s no past, present or future outside our own mind. The so-called laws of motion are a big hoax. Even sound can trick the mind. Just because you don’t hear a sound doesn’t mean it’s not out there. Dogs can hear it. Other animals. And I’m sure there are sounds even dogs can’t hear. But they exist in the air, in waves. Maybe they never stop. High, high, high-pitched. Coming from somewhere.” “Is it raining,” I said, “or isn’t it?” “I wouldn’t want to have to say.” “What if someone held a gun to your head?”
The exchange is as infuriating as it is amusing, and you can’t help but wonder where your sympathies should lie. On the one hand, Heinrich is deploying tendentious po-mo deconstruction. Yet his father is a professor at the town’s liberal arts college, where he founded the academic field of Hitler Studies, a discipline created in service of his own academic advancement and a stage for his own po-mo preoccupations.
I couldn’t help but think of White Noise as I recently read Joy Williams’ Harrow. If you put a gun to my head and told me to describe the book I’d say it reads like White Noise meets Cormac McCarthy’s The Road. Describing the actual plot of Harrow makes describing the plot of White Noise seem easy. If there is a central conceit to the novel it is that there has been some kind of global environmental catastrophe — the titular “Harrow” — the details of which are only ever alluded to and described indirectly. The situation is stated most clearly towards the end of the novel.
Bouncing back from such historical earth-caused losses, humankind had become more frightened and ruthless than ever. Nature had been deemed sociopathic and if you found this position debatable you were deemed sociopathic as well and there were novel and increasingly effective ways of dealing with you.
None of this really reflects the nature of what awaits a reader in the book. So I will try again. We follow a teenager, Khristen, who is sent off to a mysterious school for gifted children, until “the Harrow” causes the school to be swiftly shuttered. Khristen goes in search of her mother and arrives instead at The Institute: a kind of eco-terrorist training camp for geriatrics who have decided to dedicate what remains of their lives to coordinated acts of revenge against the people who inflicted so much cruelty and damage on the natural world. Khristen eventually leaves the Institute and in the final portions of the novel arrives in the bizarre courtroom of a twelve-year-old judge. I’ve skipped a great deal, but hopefully you get a sense of how resistant the book is to any kind of conventional narrative arc.
I might as well divulge another central conceit of the novel: Khristen’s mother holds the firm conviction that Khristen had briefly died and returned to life when she was a baby. None of the witnesses to the incident or the doctors who examined the child believe this happened. The baby just appeared to have momentarily stopped breathing. Yet this non-incident is returned to again and again, and treated as though it should hold a great deal of resonance. Later on there is much discussion of Kafka’s short story The Hunter Gracchus, which is obviously great fun if, like me, you’ve never read that particular story. But I am led to believe that Gracchus’ own un-dead predicament should resonate with Khristen’s.
I should say that Joy Williams is very highly regarded as a writer and you can find plenty of evidence on the page of her skill as a prose stylist. Even if I spent most of the book waiting for it all to accumulate in some or any way, the scenes are nevertheless wildly inventive and individual lines can haunt you:
The fish was not rose-mole stippled and lovely but gray and gaunt as though it had lived its brief life in a drainpipe.
The poetic beauty of the initial description contrasts powerfully with the bleak point at which the sentence ends. It is a knight’s move of a sentence, shifting trajectory somewhere along the way. A quick Google search reveals that this “rose-mole stippled” business is lifted from “Pied Beauty”, a nature poem by Gerard Manley Hopkins. Which is all to say that there is a lot going on if you look carefully.
But is Harrow actually a good novel? I cannot help myself but channel the spirit of Heinrich Gladney: “Do you think Harrow was good?” “In what sense good? Good to all readers in all times and in all circumstances? Good on a first reading or on a rereading? Perhaps you want me to give an Amazon star rating, because to that I must outright object on aesthetic grounds.” “How about to you, today, when you read it?” “I feel like any serious art inevitably provokes complicated sets of emotions in me that resist easy reduction.” “So you did not enjoy it?” “‘Enjoy’ is too narrow a term to capture whatever virtues the artist was aiming for. I feel like giving a straight answer would serve to do nothing more than to open me up to being accused of exhibiting a lack of literary sophistication.” “Sounds like you are afraid that the book was good but that you were not able to appreciate it fully. Which would be awkward because lots of other people said it was great. Kirkus named it 2021’s best.” “I certainly managed to appreciate some of it.”
As an undergraduate, I was prone to wandering the university library, looking for some kind of literary distraction from whatever math assignment I was suffering through. One day I pulled a worn paperback copy of Sexual Politics by Kate Millett off the shelf. It was not a book I had ever been assigned or recommended. I had no understanding or conception of what it might contain, aside from the vague possibility of feminism. But the title seemed provocative enough, so I read a few pages to get a sense. It was polemical and read like a response to an ongoing conversation that I was not party to. There was an account of the notorious and horrifying evening when Norman Mailer stabbed his then wife with a penknife. But as horrifying as that was to read, I had no idea who Norman Mailer was. (I later discovered that I had read maybe fifty pages of his biography of Lee Harvey Oswald while in high school before discarding it; I had been hoping for conspiratorial speculation.)
What did capture my imagination was the marginalia on the opening page of the text. As I recall, someone had taken issue with the author’s use of the default male pronoun, as if that were enough to discredit and undermine the integrity of all that followed. I did not get the impression that this was done from a viewpoint sympathetic to feminism, and it was a very bold assertion to make so early on in the reading. This is what scholars term “hostile” marginalia.
From this initial salvo followed a whole chain of pencilled and biro-ed interjections and objections from subsequent readers, although who knows how much further than the opening paragraphs any of them actually made it into the text. In content alone, this was little better than the below-the-line comments that have become ubiquitous online. And, to be fair, much worse than most of what in the tech space is called user-generated content. But it felt fun and exciting to read for a few obvious reasons. The first being the irreverence of writing in the book itself. The second being the simple aesthetic appeal of seeing the back and forth written out in people’s own hand.
Much has been written about marginalia and its virtues. The rise of the e-reader was occasion for much thinking about what might be lost in a transition to digital. Sam Anderson, who has written much on the subject, writes here from the point of view of a practitioner (as opposed to my own as the voyeur):
One day in college I was trawling the library for a good book to read when I found a book called “How to Read a Book.” I tried to read it, but must have been doing something wrong, because it struck me as old-fashioned and dull, and I could get through only a tiny chunk of it. That chunk, however, contained a statement that changed my reading life forever. The author argued that you didn’t truly own a book (spiritually, intellectually) until you had marked it up.
This hit home for me — it spoke to the little scribal monk who lives deep in the scriptorium of my soul — and I quickly adopted the habit of marginalia: underlining memorable lines, writing keywords in blank spaces, jotting important page numbers inside of back covers.
[…] Soon my little habit progressed into a full-on dependency. My markings grew more elaborate — I made stars, circles, checks, brackets, parentheses, boxes, dots and lines (straight, curved and jagged). I noted intra- and extratextual references; I measured cadences with stress marks. Texts that really grabbed me got full-blown essays (sideways, upside-down, diagonal) in the margins. I basically destroyed my favorite books with the pure logorrheic force of my excitement, spraying them so densely with scribbled insight that the markings almost ceased to have meaning. Today I rarely read anything — book, magazine, newspaper — without a writing instrument in hand. Books have become my journals, my critical notebooks, my creative outlets. Writing in them is the closest I come to regular meditation; marginalia is — no exaggeration — possibly the most pleasurable thing I do on a daily basis.
This belongs to a genre of advocacy for marginalia: looking to transform passive readers into pencil-wielding intellectuals deploying their critical judgements, droll asides, and tasteful underlinings. A closely related genre is more in my own voyeuristic camp, inspecting the great marginalia of the past. To my mind, the most significant marginalia lies in scientific discourse. An example: in his history of cosmology, the journalist Arthur Koestler once dismissively referred to Copernicus’ On the Revolutions as “the book that nobody read”. In response Owen Gingerich wrote a book with the offending quote as its title, in which he described his efforts to examine all surviving sixteenth-century copies of On the Revolutions. In the margins of these copies he found copious evidence that Copernicus’ opus was read very carefully indeed. And if you were to break into the office of likely any practicing mathematician, you would find preprints, with very wide margins, covered in scribbles testifying to the continuing engagement with tricky mathematical texts.
Back in the literary sphere the current king of marginalia has to be David Foster Wallace. While his posthumous reputation as a secular saint has been shattered, Infinite Jest continues to attract devoted readers. His personal library has entered the collection of the Ransom Center, and we’ve all been able to peer at the scans of the paperbacks he extensively and idiosyncratically annotated. More than anything else, these particular images illustrate the purely aesthetic and textural appeal of marginalia.
But all of this commentary, to my mind, hasn’t acknowledged a more obvious truth about reader engagement with the printed word. Engagement is not always welcome, wanted, or virtuous. We are not tickled by the puerile and often hateful graffiti to be found in any high school textbook in the same way we are when we discover DFW added glasses, fangs, and a mustache to DeLillo’s author photo. Nor do we treat the posts of an unmasked sock puppet account as being worthy of the same respect as the abuse that the “great artist” leaves in the margins of their private library.
The digital age has transformed marginalia into a public performance, and if you abandon the fixation on the codex itself (for my money, still a robust and preferable reading technology) you can find much engagement and innovation, but frequently not among the artists we admire, valorize, and maybe aspire to.
“Fisking” is the practice of taking a text that you find particularly egregious and writing a rebuttal by quoting the text, likely in its entirety, interspersed in-line with your debunking, criticism, and abuse. It takes its name from the British journalist Robert Fisk, whose reports from the Middle East were an early and frequent target of the conservative bloggers who pioneered this innovation. Whatever its roots, it is the logical digital evolution of hostile marginalia. What shifted the paradigm was the ability to instantly copy, paste, and then start writing the commentary. It’s the kind of thing that happens when there are no editors around and your audience seems to appreciate open insults.
The term still gets used, but mostly in conservative circles. Fisking used to have its own Wikipedia page, but it was subsequently removed; Fisk’s own page used to mention the practice, but no longer does. There is a dictionary definition, at least. Fisking does not have the same champions as marginalia. Nor the same caliber of practitioner. It also has to be said that the people who coined the term may not have appreciated all the possible associations it might conjure.
Pile-ons, ratios, and dunkings are nothing more than hostile marginalia done in public, en masse. It is possible to have too much of a good thing, and certain things are better done in the privacy of one’s own home. Editors offer a valuable service, and writing is often the process of working out what you think, rather than just responding in the moment. What can be an exciting insight into someone’s inner life — their unfiltered response as they read a novel — maybe shouldn’t become a model for public discourse. What made those glimpses so interesting was how fleeting and how precious they could be. And let’s not over-think the thrill of seeing what a writer’s actual handwriting was like.
A “serious writer” might not like to think of themselves as being in the business of children’s entertainment. Yet our most profound reading experiences frequently occur when we are young. In my own experience, there was a powerful frisson, not easily replicated, in reading a book or comic that strayed into some new and adult territory that had, up until that point, remained uncharted. These readings were engaging in ways that they could never be again, especially when returned to as an adult.
For a long time — to my mortal frustration — I browsed my local library’s limited offerings of comic books and repeatedly passed over Neil Gaiman’s Sandman. Even when I learned that these were critically acclaimed, the artwork remained, to my eyes, far removed from what I was looking for in a comic. (Conversely, I browsed Frank Miller’s The Dark Knight Returns and felt the immediate need to read it.) When I finally read Sandman (or at least the collected editions in the school library), I found the stories powerful in an unexpected way. Even the once off-putting art offered something new. But much of this effect was lost as an adult when I started reading the entire series from start to finish.
It is perplexing to me that more writers and artists do not aspire to have their works read by such audiences. Yet at the very same time, it is clear that many are trying to recreate that same quality of wonder for an adult audience, with confounding results. I don’t know how else to explain the DC Snyderverse, Disney’s Star Wars reboot, or modern Doctor Who, other than as adults trying to recreate what they experienced as teenagers in comics, cinemas, or TV.
Comics are particularly interesting in terms of their actual readership, because the situation is very different from what you might expect. To judge by the strength of sales figures and the implied readership, the modern (American/English-language) comic book industry is not dominated by Marvel and DC superhero comics. While tent-pole superhero movies are dominant in the cinemas, their star has fallen in their native medium. The most widely read comic books are graphic novels written for, marketed to, sold to, and almost exclusively read by a young adult audience.
Sales figures for books are notoriously difficult to measure; there are many different kinds of sales, including the direct market and digital. There is also something almost political about the way best-sellers are measured, with The New York Times even having its own secret methodology for producing its best-seller lists. But this article from 2021 on graphic novel sales, for all its caveats, makes a good case for the overall picture. Scholastic and Viz are the biggest graphic novel publishers. Viz, contrary to all received wisdom about English-speaking audiences’ lack of interest in fiction in translation, has no problem selling manga to young people. Scholastic sells young adult books to school kids, often directly, within schools themselves via their book fairs.
Depending on your exact definitions of intended audiences, it appears that each and every one of the top 20 is intended for children or middle readers. It won’t be until #22 that you can find a comic intended for a different audience ( “Strange Planet”, a collection of webcomics), and if you are looking for a “Marvel / DC-style” comic, you are not even in the top fifty. “Watchmen” finally shows up at #57 – after that the next aimed-at-adults superhero comic is “Harleen” way the heck down at #144. The earliest Manga in the charts are “My Hero Academia” at #18, the first one aimed at adults would appear to be… well, depends who you ask? I tend to think that “Demon Slayer Kimetsu No Yaiba” (#33) is probably rated “T”, so next after that would be Ito’s “Uzumaki” at #34.
The true titan of Scholastic’s graphic novel output is Raina Telgemeier, whose autobiographical opus Smile covered her school years, during which she underwent dental reconstruction after an accident. I first heard about her work via a good profile in The New York Times, in which you can even see the process by which she writes, pencils, and inks her comics:
“Raina single-handedly created the market for middle-grade graphic memoir,” said Saylor, who is now the publisher of Graphix. “There was a common trope at the time that girls didn’t read comics and that was a boy thing, so the market wasn’t catering to girls and women.”
For my part, I have recently made my own amateur foray into (very young) children’s books. In honour of my nephew’s birthday I wrote my own knock-off Mr Men book. It’s the first comic I have drawn in many years, and the first that I have done entirely digitally (using a very basic Wacom tablet). Like they always say, it is not as easy as it looks. My big takeaway is a reminder of how much of the creative process is work. I suspect I broke a fair number of the rules of writing for children. Some things make complete sense to children and little sense to adults. It is hard to know what will capture an imagination.
Subscribing to the New Yorker can feel like being caught in the classic bind: having your wish granted, only to discover your wish is a curse. The curse in this case being that my New Yorkers pile up and I realize that I will never have the time to read them all. I don’t even read them cover to cover. I skip the Goings On About Town; I only recently began to occasionally read a poem; I’m not sold on the Talk of the Town; and I rarely bother with the music, film, and TV critics because I don’t consume nearly enough of those three to make it worth my time. (Although the TV critic Naomi Fry is worth following on Twitter.)
I started subscribing almost two years ago; an early pandemic decision justified to myself in part as a show of solidarity with the journalistic class. I intended to write out here a list of articles that I particularly enjoyed over that time, but the commentary grew, so I’m only presenting a handful.
Most of my physical New Yorkers are back in the UK, so I was only able to peruse the contents of the more modest stack that has grown since I finally made the move over the Atlantic. Even so, as I bookmarked articles that I had enjoyed, I noticed just as many that I had missed and fancied going back and reading.
A great deal has been said about clear and concise writing, with Orwell’s essay “Politics and the English Language” often brought up. It is pretty easy to accuse a writer of producing tortured sentences or old-fashioned overwriting. But to my mind the real skill in writing is in discovering an interesting way of saying whatever it is that you hope to say. And “hope” really is the appropriate verb; even when you have a clear sense of what you are trying to say, actually making it appear on the page is frequently, in my experience at least, the business of multiple revisions.
For a hundred and fifty years, when the Falkland Islands were a distant outpost of the British Empire, many men came from the Scottish Highlands to work as shepherds, and the islands are oddly similar to the Shetlands or the Isle of Skye—the bleak, rocky landscape; the blustery rain; the nearness of the sea—as though a piece of Scotland had broken off into the Atlantic and drifted eight thousand miles south, past Ireland, then Portugal, past Morocco and Mauritania and Senegal, down past the coasts of Brazil and Uruguay, and come to rest just a few hundred miles north of Antarctica. But here, on days when the air is very sharp and clear, people know that a floating iceberg must be close. And here there are penguins at the water’s edge: three-foot king penguins with egg-yolk bibs; squat rockhopper penguins with spiky black head feathers like gelled hair; whimsy-hatted gentoos. In March, as the plague was circling, the penguins had nothing to do. They were molting, so they couldn’t swim or eat. Molting, people said, was tiring and uncomfortable. The penguins stood about in crowds near the surf, backs to the wind, waiting for their feathers to fall out.
The whole article is extremely quotable, and it is easy to look at any given paragraph and feel like it captures the power of the whole piece. Of course, you keep on reading and finding new paragraphs, so it is easy to miss the cumulative effect. With non-fiction you can look for the seams: the facts, the quotes, all the raw material that the journalist turned up and then had to synthesize to arrive at the sentences that were eventually printed. When the writing is really good you begin to wonder if it was all just laid out like that by a particularly erudite source. Or maybe just the Holy Spirit. How else would you explain it?
For the first twenty years that Tim Blake was at Hill Cove, from the late fifties to the late seventies, the farm, like the other farms in the Falklands, was run on a system that had progressively been outlawed in Britain by legislation, the Truck Acts, which stretched back to the fifteenth century. The farmworkers rarely handled cash: they were paid in scrip, and had a credit account at the farm store in the settlement. At the end of the year, the farm manager would tell them how much money they had left after subtracting their purchases; he would pay their taxes for them and deposit what remained into a government savings account, or help them invest it. The manager might be the only local authority—he conducted marriages and assigned punishments; it was said that not long before Tim Blake came to Hill Cove a man there was fired for whistling. Because drinking could be a problem, especially in winter, when there wasn’t much to do, the farm store rationed sales of alcohol. When a man grew too old for farmwork, he had to retire, which meant that he had to leave his house on the farm and move to Stanley. But there was little for retired men to do in Stanley except go to the pub, and they often died soon afterward.
I found this article to be an impressive work of social history, taking what might seem an unremarkable subject and making it into the most remarkable reading experience. If there hasn’t already been one, there needs to be a serious study of American coverage of British affairs. There is no clearer indication of the domestic deficiencies of a national media than a keen outside eye.
I grew up being taken to English country houses — which I was largely bored by and indifferent to, at least as a child. They would typically boast a fancy garden, a cafe, with a bit of history on the side. The instigating incident of Knight’s article is the arrival of a relatively unusual tour group, consisting mostly of “older Caribbean women”, at Dyrham Park, the quintessential English country house. This group was relatively unusual in that visitors to England’s country houses have been overwhelmingly white, and the tour, organised by a pair of filmmakers and researchers, Shawn Sobers and Rob Mitchell, was part of a larger effort to change that. But they weren’t quite prepared to walk in on a pair of statues of African slaves.
The National Trust, which was founded in 1895, relies on thousands of volunteers, mostly white retirees, to show visitors its properties. Dyrham Park has a roster of around a hundred and twenty. When Sobers and his group entered the Balcony Room, they came face to face with the slave stands and stood there, listening politely. “I couldn’t believe it. I really couldn’t believe it was happening,” Sobers told me. “And the tour guide talked about every single thing in that room, you know, talked about everything for a good ten, fifteen minutes and not once mentioned it.” A rope cordons off most of the Balcony Room, so visitors stand on a narrow walkway, facing the stands. There is nowhere else to look. “There wasn’t even a kind of a, you know, ‘Yeah, we don’t know what those are. . . .’ There wasn’t even an explaining it away,” Sobers said. “They just acted as if they just weren’t there at all.”
The wealth that produced many of England’s country houses has its roots in Empire and slavery. The rest of the article details a national delusion and denial of its own history. As one person is quoted, visitors only want a nice day out.
Researchers of Britain’s colonial history also welcomed the charity’s decision to consider the legacies of slavery and empire alongside each other. For more than two centuries, the transatlantic slave trade coexisted with a busy period of expansion in other parts of the world, notably in Asia. Nonetheless, the subjects usually occupy distinct places in the public imagination—a splitting that has helped to preserve a thick vein of imperial nostalgia in Britain. A poll last year found that thirty-two per cent of British adults are proud of the Empire; among the other European countries surveyed, only the Dutch recorded a higher percentage. “There’s an interesting understanding of what slavery was and what the colonization of Asia was,” Olivette Otele, a history professor at the University of Bristol, told me. (Indenture, a form of bonded labor under which more than a million Indian workers were transported around the Empire, lasted well into the twentieth century.) Of Britain’s Asian conquests, Otele said, “You think about the fabric, you think about the grandeur, you think about the beauty, the jewelry. Most people think that it was prettier, in a way. Whereas slavery is Black bodies, transported and trafficked and all that. So they don’t want to link those histories, because it forces them to see the ugliness behind the Asian colonization as well.”
These country houses are important cultural institutions, due to all the history and culture that accumulated in and around them. I previously wrote here about Sarah Dry’s book The Newton Papers, which described how Newton’s unpublished writings resided forgotten for many years in one such house until the English aristocracy collapsed and the manuscripts became part of the subsequent fire-sale. It is not an exaggeration to say that the National Trust in England was a kind of cultural bailout of these houses, and of the country’s estimation of itself. As this article makes clear, the bailout is also part of an ongoing cover-up.
Some issues of the New Yorker are complete home runs, and the issue The Dead Ship appeared in (the online and print titles are different) stands out as such in my mind. Alongside this alarming and utterly gripping story of a disaster waiting to happen were articles on fusion energy, the true-crime Fatty Arbuckle scandal from the golden age of Hollywood, and Gary Shteyngart describing the aftermath of his botched circumcision. But the ongoing situation of the F.S.O. Safer, moored in the Red Sea, just off Yemen, sticks with me the most.
The Safer’s problems are manifold and intertwined. It is forty-five years old—ancient for an oil tanker. Its age would not matter so much were it being maintained properly, but it is not. In 2014, members of one of Yemen’s powerful clans, the Houthis, launched a successful coup, presaging a brutal conflict that continues to this day. Before the war, the Yemeni state-run firm that owns the ship—the Safer Exploration & Production Operations Company, or sepoc—spent some twenty million dollars a year taking care of the vessel. Now the company can afford to make only the most rudimentary emergency repairs. More than fifty people worked on the Safer before the war; seven remain. This skeleton crew, which operates with scant provisions and no air-conditioning or ventilation below deck—interior temperatures on the ship frequently surpass a hundred and twenty degrees—is monitored by soldiers from the Houthi militia, which now occupies the territory where the Safer is situated. The Houthi leadership has obstructed efforts by foreign entities to inspect the ship or to siphon its oil. The risk of a disaster increases every day.
The article combines outright horror at the implications of the Safer sinking or exploding with bewilderment at what possible options exist and gripping insight into the world of international shipping and geopolitics. If the worst happens, we will all hear about it, if not immediately, then very soon afterwards when the consequences make themselves utterly evident. This article will no doubt be shared over social media all over again.
The beginning of the New Yorker’s history as an outlet for serious journalism is usually dated to the 1946 issue that was entirely devoted to John Hersey’s report on the effects of the atomic bomb on Hiroshima. At that point the US public had been subjected to a PR campaign by the government, downplaying the possibility that there would be any lasting effects of radiation poisoning. The book that was produced from Hersey’s reporting would never go out of print. Reading Caesar’s article put me in mind of how those first readers must have felt, with the exception that I am reading the article before the terrible event has happened.
I recently finished The Glass Hotel, Emily St John Mandel’s wonderful novel about a Ponzi scheme, international shipping, prestige hotels, guilt, and ghosts. One idea that she and her characters wrestle with is that of the “counter-life” — the life a person could have been living had they made a different set of decisions. To some of the characters their counter-lives begin to feel as real as their own, taunting them from afar. Those characters involved in a Madoff-style Ponzi scheme consider their counter-lives had they taken a different job, gone to the authorities, or fled the country when the jig was up.
I can’t be alone in occasionally dwelling on a particular set of my own counter-lives. I can identify many pivotal decisions I have made (none of them criminal), but being a mathematician I cannot help but wonder what kind of life or person I would be had I chosen a different academic discipline. If my mathematical education has dramatically shaped the person I am — which is not obvious at all — then the implications are even more profound than the professional life I would have pursued or my material circumstances. I would see the world differently.
I might as well entertain the most radical possibilities. Of course, life would have been different if I had continued to study physics, or even more so if I’d gone off to art school. But it’s the Shimer Great Books School that really makes me think.
I was first introduced to Shimer College by Jon Ronson, writing back in 2014 after Shimer had been ranked bottom in a survey of America’s colleges. Ronson discovered that the college wasn’t bad, but suffered because its singular vision of education, lack of the usual accoutrements of American college life, and its very small size left it badly served by the survey metrics. The singular vision was a complete dedication to studying the canon of literature, otherwise known as the “great books”.
Textbooks about the great books are forbidden. That would be too easy. It is primary sources only here. Students can concentrate on humanities, or natural sciences, they can take electives in feminist theories, or Auden, or Zen masters, but it’s all great books and nothing else. There are no lectures. Each class takes the form of Socratic dialogue between the students, guided by a professor if necessary.
This is very much not how most higher education works. If the Shimer curriculum is to be believed and understood at face value, they actually read Newton’s Principia, presumably with pre-Leibniz notation (but not in Latin, surely). They read Darwin’s On the Origin of Species rather than a modern account of the science. And they read da Vinci’s Notebooks, which I’m not entirely certain have any kind of easy comparison. Engineering textbooks?
The real draw of a great books course is exposure to foundational texts of the humanities. Books that most of us know of principally through their reputations; books that are believed to be a bedrock of a well cultured intellect. Not having read these books niggles away at me, like a known dietary deficiency that I cannot get around to addressing.
I’m not the only one to have felt this way. Last year Naomi Kanakia wrote an incredible essay analyzing and deconstructing her own relationship to the canon. Having dedicated nearly a decade of her life to doing the reading in her own time, she discovered that very few people, especially those with an ostensibly fancy education, had actually read any of the great books. Kanakia explains that it is a myth that the elite — political, pedagogical, cultural — are well read in the Western canon:
Moreover, when intellectuals, particularly academics, bewail the cheapening of elite education, there’s an almost comical element to their complaint. For most of their histories, neither the Ivy Leagues nor the Oxbridge colleges were particularly known for the difficulty of their education. It’s impossible to overstate how easy it was to get into Harvard in the 19th century. If you were of the right background and had gone to the right secondary school, you would get in. The Greek and Latin requirements were merely class markers. No intimate understanding of the texts or dedication to scholarship was needed to enter.
This leads Kanakia to consider the classes of people who actually produced the “great books”, and who actually read them. Or didn’t read them.
A class can be literate even if it doesn’t produce notable writers, but the English and American elites also became renowned for their disdain for learning. Although a stint at Cambridge and Oxford continued to be seen as de rigueur for the English gentry, just as acceptance at Harvard, Yale, or Princeton was for their American counterparts, neither set was famed for their commitment to learning. Even among the well-off, fashionable set, it would be quite rare to find someone who remembered their schoolboy Latin or who could discourse with any sense of authority on the work of the ancients. Edith Wharton claimed that, although her childhood home was full of books, nobody ever read them — that in fact, to her knowledge, nobody in her extended family had ever read her own books. In In Search of Lost Time, Proust describes a high society that grudgingly allows entrance to literary figures, so long as they are witty and entertaining, but pays no attention to their works. Indeed, Marcel is shocked by how distant many writers are from the heights of the fashionable society they write about, and by how quickly a writer is dropped by high society if he starts to talk of intellectual matters.
Kanakia’s misapprehension was likely cultivated by certain partisans within the education system. In a recent New Yorker hatchet job, Louis Menand attacks the underlying premise of two newly released polemics decrying the state of liberal education and the general neglect of great books. As Menand notes, there is a long history of such discourse. Shimer College was itself founded on the principles of one such polemic: Robert Maynard Hutchins’s The Higher Learning in America.
That conflict is essentially a dispute over the purpose of college. How did the great books get caught up in it? In the old college system, the entire curriculum was prescribed, and there were lists of books that every student was supposed to study—a canon. The canon was the curriculum. In the modern university, students elect their courses and choose their majors. That is the system the great books were designed for use in. The great books are outside the regular curriculum.
(The emphasis is mine.) Indeed, there is an underlying hostility to precisely the kind of education I have received and benefited from.
The idea made its way into universities after 1900 as part of a backlash against the research model, led by proponents of what was called “liberal culture.” These were professors, mainly in the humanities, who deplored the university’s new emphasis on science, specialization, and expertise. For the key to the concept of the great books is that you do not need any special training to read them.
(I could argue that mathematical education predates anything like a “great books” education, but let’s ignore that rather appealing idea.)
So what benefit does an actual, honest-to-goodness education in the great books actually offer? I discovered a recent PhD thesis from a former Shimer student, Jonathan Goldman, that seeks to address some version of that question. Conducting interviews with sixteen former students of the college from the sixties and seventies, he investigated the effect Shimer had on them. Certainly, among those he interviewed, the impact was very positive. They went on to graduate school, industry, and all kinds of other adventures, feeling well prepared and full of confidence. Whatever challenges they faced, they had no problem sitting down and doing the reading. The years in college were reportedly hugely rewarding — the holistic view of intellectual history, in particular. Their memories of college conform more strongly to what many of us hoped for from higher education, but maybe never quite attained.
Participants described their campus colleagues as being very smart, interesting, and always engaging. A few people felt that for the first time in their lives they were with students who were as smart as they were, if not smarter. Irv enjoyed being “surrounded by people who were smarter than I was … talking nonstop, they were just so excited.” Irene thought “the fact that it was small and yet there was a high percentage of very bright and interesting people there … was crucial.” Ian said, “a lot of the people who showed up at Shimer were very talented, they were creative, interested, and they have very active minds and they were—they had high IQs and were just smart.”
Particularly galling is the fact that those of us who haven’t enjoyed the benefits of the great books might not quite cut the mustard for them socially:
Participants discussed how their relationships at Shimer changed their perspectives about relationships after leaving school. Kathy said that she “never again found relationships as meaningful as at Shimer.” She added that Shimer “spoiled me for friendships.” Others felt that being around the people at Shimer set higher standards for future relationships. Olivia said that “I really can’t stand to be around people who are really ignorant” and that she likes “to have discussions with people about their ideas, and listen to what their ideas are, and challenge them, and have them challenge me, and talk about stuff.” Olivia said that people with whom she talks “can’t just arbitrarily say something and not be able to defend it.” Others discussed wanting only friends who are able to carry on a meaningful and rational conversation.
Unfortunately, I likely wouldn’t even understand their jokes.
The participants described Shimer people as having a different way of looking at things and Riley cited an adage popular among some Shimer alumni that “Shimer people don’t have to explain jokes to other Shimer people.” Carol said that being at Shimer “helped learning to work with lots of different people who think differently.” Zoe said that after leaving school, she would talk to people: “I might then reference whatever the topic, social topic that was going on. Whether it was about cities and war or different things, poverty or something and then I might mention an author or subject I had studied, and people would respond to that as if it was unusual. I thought it was what we did. It was everyday conversation at Shimer or with people from Shimer, it wasn’t a big deal. I started realizing that is something of interest to other people and … things that were ordinary at Shimer were extraordinary elsewhere.” She also noted that one person told her that she “talk[ed] in metaphors.”
These shared jokes and this mutual understanding seem to be the upshot of being unusually well read. So while our elites do not actually possess the benefits of the liberal education we believed them to have, the graduates of a weird little college (and of a handful of others offering their own variations) have the actual goods.
Not that I am falling over myself to work my own way through the reading list. There is a great deal to be said for engaging with the contemporary. Or to put it another way, if I was busy reading through the canon, I likely would never have got around to The Glass Hotel.
Back in 2018 I listened with great interest to the New York Times podcast series Caliphate. This ten-part, multi-award-winning series, narrated by the journalist Rukmini Callimachi, reported on ISIS, focusing on the testimony of Abu Huzaifa, whose real name is Shehroze Chaudhry. I heard Huzaifa describe, in interview, his online radicalization, his journey into Syria, joining ISIS as a foreign fighter, and even performing executions.
The podcast made headlines, because this former ISIS member had returned to Canada, where he was a citizen living freely. In discussion he remained sympathetic to, while disillusioned with, the caliphate. Questions were raised in the Canadian parliament and inquiries made. At the time there was a great deal of concern in the media about fighters returning to the West after the collapse of ISIS. The media had reported with grim fascination on the alienated young people in the West who had been radicalized via social media and then traveled to Syria to join their new cause. ISIS itself had played up to its own sensational image by posting gruesome execution videos online.
But Huzaifa was lying. The entire podcast was based on a lie. He was a fabulist who no doubt harbored genuine sympathies for ISIS, but had likely never even entered Syria and certainly hadn’t joined the Caliphate. When the Royal Canadian Mounted Police concluded their investigation, they prosecuted Huzaifa not as a terrorist, but for committing a terrorism hoax. (The case was later dropped in exchange for an admission in court to lying about joining ISIS and agreeing to a $10,000 peace bond.) In late 2020 the New York Times issued a retraction.
Journalists live and die by access. Access to evidence, sources, records, and anything at all on which a story can be built. This leaves journalists in constant danger of getting burned by their sources. Leakers are notoriously difficult to work with. Often they are simply disgruntled former employees with a particular axe to grind, and the very same motivations that lead them to talk to a journalist make them suspect.
Sometimes the betrayals can seem utterly inexplicable. In 2020 the New Yorker informed its readers that a celebrated and award-winning story on the “rent-a-family” industry in Japan was compromised because no fewer than three of the sources used in the story had been lying outright to the writer, Elif Batuman. As Ryu Spaeth outlines in the New Republic:
The trouble began a year after the article was published, when a Japanese magazine reported that an employee of Family Romance had pretended to be a client of the company in a documentary produced by the giant Japanese broadcaster NHK. NHK confirmed that Ishii had told his staffers to carry out the ruse. The New Yorker then began its own investigation, culminating in the stunning admissions that were published this week: that “Kazushige Nishida,” the lonely widower, was in fact married and did not provide his real name; that “Reiko Shimada,” the lonely single mother, was in fact married and did not provide her real name; and that, craziest of all, Reiko and Yuichi Ishii are married to each other. Despite these elaborate deceptions, they all insisted that their stories were otherwise true.
In the aftermath of such retractions, the postmortem can present the mistakes made in the light of a morality play. In the case of Caliphate, the Times admitted that the series lacked the “regular participation of an editor experienced in the subject matter.” And on a practical level, they should have done reverse image searches on the pictures Huzaifa provided as evidence of his travels, and more thoroughly examined his passport and travel records.
But reading Ben Smith’s media column (in the very same newspaper) you are presented with a more expansive set of sins. The kind of narratives that Callimachi actively sought to present not only predisposed her to placing too much trust in a dubious source, but those narratives were themselves problematic.
Terrorism coverage can also play easily into popular American hostility toward Muslims. Ms. Callimachi at times depicted terrorist supersoldiers, rather than the alienated and dangerous young men common in many cultures. That hype shows up in details like The Times’s description of the Charlie Hebdo shooters acting with “military precision.” By contrast, The Washington Post’s story suggested that the killers were, in fact, untrained, and noted a video showing them “cross each other’s paths as they advance up the street — a type of movement that professional military personnel are trained to avoid.” On Twitter, where she has nearly 400,000 followers, Ms. Callimachi speculated on possible ISIS involvement in high-profile attacks, including the 2017 Las Vegas shooting, which has not been attributed to the group. At one moment in the Caliphate podcast, Ms. Callimachi hears the doorbell ring at home and panics that ISIS has come for her, an effective dramatic flourish but not something American suburbanites had any reason to fear.
This particular critique is interesting because although Huzaifa’s story was false, it more or less follows the arc of the very real foreign fighters who joined ISIS. “Jihadi John” really did grow up in London, get a degree, and then later join ISIS and perform a series of beheadings that were recorded and uploaded online. But of course, by the time Caliphate was being made, Jihadi John was dead, leaving him in no position to offer any reporter at the Times his exclusive story.
Back with the rent-a-family story, Ryu Spaeth, writing at the New Republic, considered the possibility that the New Yorker fell into the “weird Japan” trap of reporting on the country to satisfy a preconceived notion of Japan’s otherness, oddness, and inexplicability.
Some will say Batuman, a gifted writer, got the story wrong because she had little professional or personal familiarity with Japan. But I think that only makes the Japanese seem even more mysterious, as if these strange creatures can only be understood through lengthy anthropological immersion. Anyway, Japanese journalists fell for the story, too. (No one is more fascinated by Japan’s weirdness than the Japanese themselves.) And everyone is susceptible to cultural blind spots. As I wrote earlier this year, I long viewed the Japanese fondness for sanitary masks as evidence of some deep-seated cultural defect. Now that I wear a mask myself every day, it’s amazing to me that I could not see the obvious, banal reason people use masks: to protect their health.
For all this I should say that I am a massive fan of journalists and the work they do. I consume considerable amounts of American journalism, and subscribe to both the Times and the New Yorker. There is a convenient argument that the retractions and subsequent analysis of their failings provide evidence for the very editorial standards these institutions failed to meet, but that argument is a little too convenient. Smith’s column strongly suggests that the flaws were fundamentally institutional, and the same thing could likely happen again. That said, I do think the retractions mean something.
Ultimately I want to sit down and read reporting from someone who has been there, talked to the people who were there, and gone through the evidence. I want to read articles which have been through a rigorous editorial process. I want to read analysis by people who have spent a lot of time thinking about an issue. Obviously I don’t want to be imbibing the talking points of corporate lobbyists, paying heed to astroturfing organizations, treating copaganda credulously, letting greenwashing get a free pass, or uncritically accepting press releases from the military-industrial complex. But assuming good faith, I’m ready to accept the inevitable mistakes, bias, and omissions.
Don’t believe everything you read in the papers. But that doesn’t mean you should stop reading the papers. Don’t expect the “truth” to be handed to you on a platter. Remember that some things really are too good to be true. But most of all, don’t shoot the messenger.
New Hampshire’s state motto — Live Free or Die — struck me from the moment I first read it off a rear license plate as amusingly over-determined. Sure, the sentiment calls back to New England’s revolutionary tradition, but it is now laden with so much other import that you have to laugh when you read it. As with many of the other state mottoes, I feel the need to ask, “OK, but what are we really concerned about here?”
In the case of New Hampshire the answer to that question seems to be “covered bridges”. Which are quite literally what the name suggests. That is to say, bridges built with their own roofs, preventing snow and ice accumulating on the road surface beneath. I am given to understand that such bridges last longer, are safer to use, and attract an unusual degree of local pride. Based on the selection of postcards available on the rack I perused, locals are far prouder of their covered bridges than of most of the White Mountain peaks. Driving past a particularly congested covered bridge, we could see visitors slowing down to gawk and take photographs of themselves.
It’s all much less “Live Free or Die”, and much more “We Like Our Safer Bridges.”
The day after New Year I went out for a jog around the small-town liberal-arts college campus. With their leaves wet and glistening in sodden piles on the ground around them, the trees had the air of men who had thrown off their soiled garments and stood there completely exposed to the elements. They continued to hold up their branches — as dignified as they had been in their summer pomp.
One tree had been decorated in the fashion of a Christmas tree. I suppose we were still close enough to Christmas that this remained acceptable. Except that instead of the usual hung ornaments, baubles, tinsel and lights, the branches were laden with surgical masks, N95s, mini whiskey bottles, larger gin bottles, and the lids off tin cans, hanging by the ring pull. I imagine a student art project, or perhaps a particular kind of creative writing teacher trying to demonstrate the value of ritual and tradition, and the value of subverting it.
I jogged by two kids throwing a blue American football back and forth while an adult supervised. They were wearing face masks outdoors. I later passed a woman walking her dog and began to sense a growing trepidation. Omicron was looming on the horizon.
I bought an LCD computer monitor for $25 on Craigslist and I’m feeling the thrill like it is 2002. The online listing site still lives, looking every bit like it is still 2002.
We drove across town to pick up the monitor, and I paid the man in cash. “I’m sorry, I just don’t like letting people into my house,” the seller told me as he led the way into his garage where he would demonstrate the monitor actually working before I took off with it. I told him that was entirely understandable, especially these days, but I realized his concern was not Covid related. After all, I was the only one of the two of us wearing a mask.
He was an older man, lean and wearing faded jeans and sneakers. His garage was tidy and well organized. The mind runs to dark and sensational places when a man explains that he doesn’t like letting strangers into his house. But those are usually the least interesting explanations for a man’s abiding attachment to his own privacy.
I just finished reading Lauren Hough’s collection of autobiographical essays Leaving Isn’t the Hardest Thing. One of her more famous essays, which appeared a few years back, recounted her years working as a “cable guy” in a DC suburb. She often had to explain to customers that if they wanted their internet back, they were going to have to let her go down into their basements. “Unless you have kids in cages, I don’t care,” she would assure them. As she discovered, people have all kinds of reasons to be cautious about letting strangers into their home, many of them not actually sinister.
I feel a real thrill at the prospect of making a New Year’s resolution. While usually inclined toward maintaining my set routine, I remain vulnerable to the countervailing compulsion to rearrange my life, as some are drawn to rearranging the furniture every couple of months. Even the common varieties of vague and Protestant resolution stir something in me. Cook more, eat vegetarian, cut the processed sugar, cut the booze, hit the gym, walk a mile a day; I’m not a fan of the misery and the defeat and, worst of all, the society that makes people suffer for their bodies, yet the idea of experimenting with the way I live persists. Also, I am always eager to hear from someone who has made a resolution because, to my mind, they are setting off on a kind of adventure.
I have a history of resolutions — New Year’s and otherwise. As a graduate student in Montreal I spent a year exclusively reading books written by women. Another year I read the news exclusively in French. Lent 2017, now living in Israel, I ignored the news entirely. In 2020 I started listening to a slow French news podcast every day, and I succeeded until the national lock-down was declared. That resolution was less failed, more redundant. While half the world was planning on Duolingo-ing their way through their confinement, I realized that my resolution had become superfluous. Like it or not, the entire world’s collective furniture was being rearranged. I no longer had to worry about breaking up my own routines or trying out a new way to live. Not when going to the supermarket became a surreal and uncertain experience. Not when the pandemic began to change the way we dreamed.
The desire to find a different way to live is tightly coupled with a prurient interest in the way other people live. Praise be then to those brave voyagers reporting back to us with their experiences and collected wisdom via YouTube. They have tried doing 100 press-ups a day, 100 crunches a day, and 100 squats a day. They have tried reading 100 pages a day, writing 1000 words a day, and writing a novel in a month. You can find them quitting smoking, vaping, drinking, coffee, sugar, gluten, video games, World of Warcraft specifically, social media, and the internet. You can find them trying meat for the first time, playing Magic: The Gathering for the first time, watching a BTS music video for the first time, a kitten seeing snow for the first time, Amish girls seeing an airport for the first time, and Koreans trying bourbon for the first time. People will tell you how they found eating like Taylor Swift, Donald Trump, Cristiano Ronaldo, Adele, Michael Phelps, and The Rock. There are people who try, and debate the merits of, military rations from across the world.
To my mind, we (and especially these Youtubers) are all the descendants of Henry David Thoreau, the New England writer, proto-naturalist, proto-activist, and abolitionist. From July 1845 to September 1847 Thoreau lived apart from his town and community, working with his hands, in a cottage on the edge of Walden pond. His written account of the experiment was published almost seven years later so we can, as the expression goes, read all about it — from growing beans to visiting town every once in a while. But he also explains his motivations — principally, a deep cynicism about the lives of the men around him. He saw them “digging their graves before they are born” and living under the tyranny of their own opinions of themselves.
But men labor under a mistake. The better part of the man is soon plowed into the soil for compost.
Walden – Thoreau
It is not easy to transpose a thinker of the past into current circumstances. If Thoreau were alive today I can’t be sure whether we’d find him vlogging about polyamory, a multi-level marketing scheme, or radical politics. Quite possibly all of the above. I doubt that he would be the kind of person to play the incredibly sincere-looking video game adaptation of his book.
In lieu of Thoreau, we have the likes of the Try Guys, who arrived on the video landscape with a four minute report on wearing women’s underwear for the first time.
You cannot watch such a video of those young men and not acknowledge that they are bucking against the tyranny of their own opinions of themselves. And having glanced through all these videos of people trying, quitting, warning and advising I suspect there is also the underlying cynicism about the way we live our lives that reflects something of Thoreau as well.
So what is my own New Year’s resolution? As a teenager I read a book whose title and precise nature have faded from memory, but what I do remember is that the author, by way of illustrating some other point, briefly described spending a year reading the complete works of Shakespeare. At first, he recounted, it was difficult going, trying to soldier on through all that pentameter. But with perseverance the language opened up and he developed an authentic appreciation of the plays. It was a far neater experience than I would today grant credence to, but it struck me at the time as a deeply worthy thing to do. I know for a fact, however, that I would not be able to read that much Shakespeare in a year. Aside from anything else, I am generally committed to reading wherever my fancy takes me. Instead, and by way of trying to get some Shakespeare into my system, I will commit myself to devoting each year to a different play. And I will begin, in 2022, with Macbeth.
The biographer of Isaac Newton is in an unenviable position. Usually the writer thrives on the access they can get to their subject — their writings, correspondence, contemporaneous accounts. But in Newton’s case, the biographer is cursed with too much material. Newton’s unpublished writings form an extensive body of work spanning an impressive and embarrassing array of interests; from science and mathematics, to alchemy and heretical theology, Newton was a compulsive note taker. Sarah Dry’s The Newton Papers gives a careful account of how these papers managed to escape languishing forgotten in one of England’s aristocratic estates and found their way into the hands of scholars who could read and make sense of them. Or so they hoped. The writings were so extensive that they were impossible for any individual to meaningfully absorb. The hope that a definitive or comprehensive view of Newton might emerge proved futile. Dry even concludes that the endeavor is fundamentally misguided.
The first two Newton books I wrote about here took two distinct strategies to avoid this trap. James Gleick’s Isaac Newton took the light touch, providing a readable biography that was blessed with being selective in what it presented to the reader. Thomas Levenson’s Newton and the Counterfeiter focused on a lesser-known chapter of Newton’s life — his role as Warden of the Royal Mint. Levenson’s book was blessed with seeing Newton out in London, interacting with the world, and thus managed the feat of stepping far enough back from the man that we could begin to see him more fully in his time and place. What emerged was a far more interesting portrait than that of one man being the turning point of history.
Levenson’s most recent book, Money for Nothing, takes this a step further, to the point that we no longer have a “Newton book”. The real subject is the South Sea Bubble, the arrival of modern finance, and the connection to the “Scientific Revolution”. Newton’s significance, beyond having himself bought shares in the South Sea Company, is that he had developed the keenest understanding yet of the relationship between equations and the world that they could represent.
Ultimately, this mathematical insight is at the heart of modern physics, the science that Newton, more than any other single thinker, would create. In its simplest form, the idea is this: the full picture, the complete geometrical representation of all the available solutions to a system of equations, can be understood as all the possible outcomes for a given phenomenon described by that mathematics. Each specific calculation, fed with observations of the current state of whatever you’re interested in, the flight of a cannonball, the motion of a planet, how a curveball swerves, how rapidly an outbreak of the plague might spread, makes a prediction for what will happen next. In his twenties, working on his own, with almost no systematic experience of the study of the real world, Newton did not yet grasp the full power of the ideas implied by the way he had begun to think about the math. That would come in time. But what made his annus mirabilis so miraculous was the speed and depth with which Newton forged the foundations of his ultimately revolutionary way of comprehending the world.
Money For Nothing, Thomas Levenson
Levenson explains that this was not the Ponzi scheme of capitalism that many claim it is. The value of the South Sea shares was a measure of trust that the Treasury could reliably pay out in future. Even in these early days of state finance, there was an understanding that a state that was constantly borrowing could also be a worthy and trusted debtor. The theory was that the size of the national debt as it stood was only important when considered against the future productivity of the nation. In principle, and indeed in practice, a nation that invested in itself would grow and develop economically, allowing it to make good on future repayments.
The second advantage that the British government possessed was the inexorable passage of time. The funds it borrowed at any moment became bets on the nation’s economic life year over year. The wager was that the ongoing work of every new enterprise, each voyage, everything that Britons did to get and spend in the future, would create enough wealth to support the debts being incurred. The chancellor of the Exchequer didn’t have to treat every expense as a pay-as-you-go imperative. Whole nations, as London’s monetary thinkers had discovered, need not perform the virtues embodied in the very good advice to pay off a credit card balance in full every month. Rather, the task was to balance the needs of the moment with an analytical picture that could be drawn of Britain as a whole, all its getting and spending and accumulation, integrated over years to come.
Money For Nothing
Making this case can be divisive. Indeed many, such as Daniel Defoe, seem to have been divided within themselves about this development; on one hand despising the traders and stockjobbers who ran the secondary markets, while on the other supporting the state borrowing that they enabled. Many readers, if they correctly read the argument I believe the book is making, would probably object to it. The argument is that national debt and secondary markets for financial products are important, necessary, and work (except when they don’t). Making this argument can be as tricky as convincing someone of the merits of modern art or free verse. In the case of government borrowing and stock markets, the most obvious problem is what ultimately befell the South Sea Company.
After almost a decade of providing reliable and unremarkable returns via direct Treasury payments, there was in 1720 an attempt to convert a huge amount of illiquid government debt into the liquid and more manageable form of South Sea shares. At this point, the win-win-win equation that held between company, shareholder, and government was badly abused. Just about every kind of financial crime was practiced (insider trading, artificially pumping up prices, and outright bribery), and over the course of the year the price of the shares increased ten-fold, from 100 GBP to 1000 GBP.
Among the reasons Levenson presents for the South Sea Stock crashing at the moment it did, was a collective realization that the stock could not offer a rate of return any better than the most ordinary of private loans. In truth the company couldn’t even offer that. In a desperate attempt to prop up the share price, a completely unsustainable dividend of 50 GBP was offered to shareholders. While that would be a magnificent return on the “par” price of 100 GBP, on the recent sale price of 1000 GBP this was a very ordinary 5% return.
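The collapse in yield comes down to a quick back-of-the-envelope calculation, which can be sketched as follows (the dividend and prices are the figures quoted above; the variable names are mine):

```python
# Dividend yield = annual dividend / purchase price.
dividend = 50         # GBP per share, the unsustainable dividend offered
par_price = 100       # GBP, the original "par" price
bubble_price = 1000   # GBP, the price near the bubble's peak

yield_at_par = dividend / par_price       # a magnificent 50% return
yield_at_peak = dividend / bubble_price   # a very ordinary 5% return

print(f"At par: {yield_at_par:.0%}, at the peak: {yield_at_peak:.0%}")
# → At par: 50%, at the peak: 5%
```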
I cannot help but draw analogies with the current excitement around cryptocurrencies. In place of Hogarth’s satirical paintings, Defoe’s commentary, and Pope’s poetry, which accompany Levenson’s account, we have Twitter memes about buying the dip and right-clicking NFT art. We can also imagine that once cryptocurrencies begin to look a lot more “boring”, there might be a major correction. From this perspective the volatility of cryptocurrencies is less a liability and more of a feature.
If we are to embrace the analogy, there is a disquieting reality that the South Sea Bubble offers. Although the share price crashed, political careers were ended, and assets were seized from many of the incriminated, the financial tools and derivatives that made it all possible would go on to form the backbone of modern finance (with some occasional regulation, if you can believe it). Similarly, even if bitcoin and ethereum suffer some almighty crash, it doesn’t mean that they won’t find a place in the long-term landscape of finance.
To be clear, I do not welcome a bitcoin future. Plenty of people, in particular those who understand what a blockchain actually is, are writing in strong terms about how little this technology offers. But at the heart of why I don’t like cryptocurrency is my suspicion of the world it would produce. At the moment, the main contributions of cryptocurrency to society are enabling cyber criminals looking to profit from ransomware, and diverting huge amounts of computational hardware, time, and energy towards “mining” these tokens. It promises a libertarian future where governments can’t meddle with money on our behalf.
Where is Newton left in all of this? As best we can tell, he wisely sold his initial investment in the South Sea Company, at a profit, midway through the bubble. He then unwisely reinvested later, as the price continued its precipitous rise, and lost out when the bubble crashed. So Newton, for all his unprecedented insight, was just as vulnerable to making a fool of himself as the rest.
Instead Levenson presents us with Archibald Hutcheson MP, who despite his lack of scientific training best embodied the scientific analysis of the market when he sat down and began computations to derive how much the shares would have to return to justify their price.
This was recognizably a scientific revolutionary’s way of thinking. In the Principia Newton had constructed mathematical models that could explore the behavior over time of the moons of Jupiter or could predict the motion of a comet with a track that remained mostly unknown. He published his results both as an exercise in scientific reasoning and with persuasive intent: he sought to persuade his readers that what he had discovered “cannot fail to be true.” In his earlier writings, Hutcheson attempted much the same double act. His work focused on the dynamics of budgets instead of celestial bodies, but it spoke in the same unassailable language of numbers in flux — and thus asserted a claim to like power: just as Newton had declared his system of the world, Hutcheson’s arguments could not fail to be on the money.
Money For Nothing
Although this paragraph is immediately followed by a caveat.
There was, of course, a key difference between Hutcheson’s calculations and the utterly authoritative demonstrations in the Principia. When Newton bragged about his work’s unassailable accuracy, he could let nature be the judge, pointing to the agreement between his mathematical account of a comet’s flight and the track it actually traversed. Hutcheson could not command such certainty. Instead, he used the cultural power Newton and his friends had given to mathematical reasoning to strengthen his political argument. Whatever truth his algebra might contain was contingent on the uncertain behavior of the human actors involved in any financial choice.
Money For Nothing
I have often been perplexed to read of kings, rulers, and governments being compelled to certain courses of action by economic necessity. It is hard to buy into a motivation you have little intuition for, one that belongs to a game whose rules you don’t know. Currency crises and borrowing crises and monetary crises and even national productivity crises are often referenced with little explanation. This is all to admit a glaring hole in my own education, but I certainly can’t imagine that I’m the only one.
It is a testament to the success of Levenson’s book that I found it as enlightening as I did. Having read no previous account of the South Sea Bubble, I was effectively going in cold. Levenson takes the reader through all the mechanics of the swaps and trades, providing the important back-of-the-envelope calculations that make sense of what happened. There is no unnecessary hand-holding, and I did reread certain passages, but it was all there. On top of this, Levenson populates his account with an impressive dramatis personae, providing a vivid portrait of British society reacting to these events. The final chapters outlined the future success of British state borrowing, and left me with a good sense of what that actually meant. I will be able to make far more sense of at least some of the history I was reading than I did before.
There are some conspicuous omissions in Levenson’s narrative. While the South Sea Company’s involvement in the slave trade is covered (practiced, but not profitably), there is no consideration of how the rise of credit-based finance might have driven the growth of the trade itself. There is far more discussion of how financing Britain’s wars made a secondary market for government-issued debt necessary, and it is argued that the success of the treasury policy that Robert Walpole, Britain’s first Prime Minister, developed in the aftermath of the bubble both incentivized avoiding war and enabled Britain to “punch above its weight” when it did go to war. I found the passages that did address this particularly interesting, and would have happily read more. But there was no reflection on the implications of a system that enabled Empire, and while Levenson mentions the industrial revolution in Britain as a triumph for capital, I was left wondering about the huge social cost to the working classes of Britain.
To be fair, this would be the subject of a different book (David Graeber’s Debt springs to mind). There is a very specific moral that Levenson wants to lead the reader to: that the crash of 2008 was fundamentally no different from the crash of 1720. Financial markets are ingenious human inventions, but they need careful supervision and regulation.
A fine message — and I agree. But given what was being invested in back in 18th-century England, you might imagine that some people would have been quite happy to see the system crash, investors ruined, and a political system collapse. There are very different kinds of consequences, beyond a crash, that investors or a nation should consider: dangers we should also be vigilant for and legislate against.