I shall not reply

In the summer of 1859, the New-York Tribune accused Robert E. Lee of having three of his late father-in-law’s slaves, who had run away about a month before, caught and whipped, with Lee personally whipping a woman when the man administering the beating refused to. Horace Greeley’s Tribune was an anti-slavery paper, and the accusation was made in an anonymous letter by a writer clearly unfamiliar with the provisions of Lee’s father-in-law’s will—of which Lee was the executor—and ended with a pointed political message. It was propaganda calculated to invite outrage—and provoke a response.

Lee’s only statement on the matter came in a letter to one of his sons: “The N.Y. Tribune has attacked me for my treatment of your grandfather’s slaves, but I shall not reply.”

One of the most annoying and unseemly aspects of online and social media culture is the endless calling-out of haters. Public figures of whatever level of fame, influence, and authority inevitably end up spotlighting and condemning their critics, which prompts fans to voice their support and dog-pile the enemy.

I’ve unfollowed a number of writers and thinkers I otherwise like precisely because of this. One popular evangelical literary scholar eventually made her presence on Instagram entirely about screenshotting hate mail and sharing it with a dismissive, above-it-all caption. An up-and-coming novelist on Substack has recently lashed out, in a long essay, at a few people poking fun at her pretensions, describing them as anti-intellectuals and misogynists. I could multiply examples. The comments on these posts are always full of praise and affirmation, which is surely part of the point. It betrays a neediness and fragility I find not merely off-putting but embarrassing.

The technology doesn’t help, of course. The perverse incentives of social media demand response, immediately, and the knowledge that the fans will have your back against the haters only intensifies the pull toward the reply button. A mob can make anything feel righteous. Then follows the well-known dopamine rush of the zinger. And once the habit is formed, there’s no going back. You’ve fed the trolls. You’re ensnared, no better than the haters, slinging mud in the notes or reels or comments and basking in the praise of your yes-men. It’s this scene from “Community” all day, every day.

What I would like to see much, much more of is detachment. I shall not reply. Rather than acting like you’re above it all, rather than saying the criticism doesn’t matter, why not be above it all by ignoring it, not even mentioning it? Answer not a fool. That might mean letting the opinions of idiots stand but it wouldn’t degrade your own character. But as was clear even 2,000 years ago, most people would rather seem than be.

Lee understood this even in the newspaper era. Some criticism is not worth responding to; to reply would only validate and encourage your critics by lowering yourself to their level. What must it have taken a man like him, of his background and character, facing such an accusation in such a difficult personal situation, not to reply? Discipline, for one thing, which the technology actively works to erode. He had avoided entanglement in journalistic controversies before, and that habit didn’t fail him now. I doubt many of us could have made the same choice in 1859. I know even fewer could now.

The writing rule everyone misses

A recently popular genre of Substack note—judging by what the algorithm sends my way, anyway—is writers complaining about “rules” for writing. These frequently take the form of fulminations against old advice to avoid adverbs. To paraphrase one note, which if I remember correctly was originally much ruder, “Every adverb I write is a little screw you to Stephen King.” More broadly, some will argue that there are no rules for good writing and that even to formulate rules is a kind of tyranny or imposition or—for a special subset of writers who self-consciously posture as independent outsiders—the mark of the dreaded “MFA writing.”

I can’t speak for every writer who has ever laid out a list of rules for their own writing, but these Substack warriors could save themselves a lot of time and lower their blood pressure by noticing one all-important caveat or disclaimer in every good list of rules I’ve ever seen: break the rules if breaking them will produce better writing.

In the early days of the blog I collected three sets of writing rules from three favorite writers: CS Lewis, George Orwell, and Elmore Leonard. They have areas of broad overlap, especially a concern with precision and clarity, but here’s perhaps the most important:

  • Orwell: “Break any of these rules sooner than say anything outright barbarous.”

  • Leonard: “If you have a facility for language and imagery and the sound of your voice pleases you, invisibility is not what you are after, and you can skip the rules.”

Orwell’s rule comes at the end of his list; Leonard’s comes before he even lays his out. They’re emphasizing that these are their rules for how they write, a set of strictures they have found effective, but that space must remain for artistic judgment. This rubbishes another species of Substack complaint about rules, one often leveled at Orwell specifically: that of hypocrisy.

Most of the complaints I’ve seen about rules for writing stem from a misapplication of, or unwarranted rigidity in using, a particular set of rules. This is a legitimate problem. The rule against adverbs exists not because adverbs are inherently bad, but because they become a crutch for weak writers. Most of the items prohibited by lists of rules have this temptation about them: overuse of adverbs can cause verbs, where the action happens, to atrophy; overspecific dialogue tags can, in addition to reading clumsily, bear more of a burden of information than the dialogue itself; passive voice can become an unthinking habit until the involuted relation of subject and verb in repeated passive sentences kills the pace of writing. Noticing and controlling these effects is necessary for strong prose; never, ever using an adverb or passive voice is something a high school English teacher might enforce (indeed, I know specific examples), but it is an overreaction.

Again—the authors who lay out these rules usually say exactly that. These rules aren’t hard and fast. These rules aren’t universal. Sometimes you must break them. After all, they’re my rules.

One would think that would settle it, but some of these complaints also seem born of willful misunderstanding or mere resentment. This isn’t limited to Substack writers: I was surprised several years ago to see Ursula Le Guin taking an obvious potshot at Leonard in her book on writing. She took much the same tack, talking about his rules as if they were obviously phony and accusing him of hypocrisy. She was wrong, and the complainers are wrong.

Most crucially, rejecting or refusing even to consider rules and constraints will warp a writer’s artistic judgment. Any attempt to be bold or daring must begin at a baseline, because without that one cannot make judgments about what does and doesn’t work, and a writer who never works within constraints will never grow. Writing without rules is no more feasible than living without them.

* * * * *

Addendum: Even Strunk and White, who are the object of a Two Minutes Hate that comes in almost predictable cycles, were trying to train the sensibilities of beginners, not lay down eternal laws of good writing. One can write well while ignoring their advice, but not until it’s become a conscious decision, not a habit one slips into.

No aristocracy worth its salt

This week Before They Were Live dropped a new episode on Moana 2, which I haven’t seen, but Michial and Josh’s discussion of the film’s manifold weaknesses got me thinking about one of the biggest flaws in Frozen.

A few years ago I ranted about the dam in Frozen II—a badly imagined piece of infrastructure that has no use beyond serving as a cack-handed metaphor for the film’s political message. But that dam is not the first useless thing affecting the plot of a Frozen movie. I want to look at the first film’s villain, Prince Hans, and more specifically Arendelle’s useless aristocracy.

Here’s the rub: Prince Hans arrives early in the film and he and Anna, Queen Elsa’s younger sister, fall instantly in love. He swans around in a secondary role for a while until the climactic twist: Hans does not love Anna and, as the youngest son of another kingdom’s dynasty, has deliberately insinuated himself into Arendelle’s royal family to await an opportunity to take over. With Elsa feared and effectively outlawed and Anna mortally wounded by Elsa’s ice powers, Hans refuses Anna the kiss that will save her life, tells the handful of nobles hanging around the court that she’s dead, seizes control of Arendelle, and leads the attempt to eliminate Elsa. Boo, hiss.

I’m heartened to learn that I’m not the first person to criticize Hans as a villain. Others have pointed out the thin to nonexistent foreshadowing of his ulterior motives and the fact that his actions earlier in the film are counterproductive to his plot. (He’s also, in keeping with the political valence of the dam in Frozen II, more of a feminist device than a character, but more on that later.) These are legitimate complaints but not my chief problem with him.

The biggest problem with Hans, his plot, and Frozen’s climax is Arendelle’s useless aristocracy. I actually use this as a negative example when lecturing on the medieval nobility in Western Civ. Imagine: the youngest son of a foreign royal family shows up in a kingdom just emerging from a regency and ingratiates himself with the princess who is second in line to the throne. And consider the climax, when Hans, the only person allowed to talk to the severely ill princess, appears and tells the leading men that Anna is dead. Somewhere else. Trust me, bros. And they do.

A real aristocracy would have sniffed out Hans’s intentions in about ten seconds. No aristocracy worth its salt would have missed this, or failed to act against it. They would have sworn oaths to Elsa and her family and had roles to play under her rule and with respect to each other, roles they would fiercely protect. They would have duties and prerogatives. If they had somehow let things get to the point of Hans announcing Anna’s death, they would have demanded evidence. Immediately. He would have been an object of suspicion from beginning to end. A Bismarck, a John of Gaunt, a William Marshal, an Earl Godwin, or your pick of the Percys, Hohenzollerns, or Carolingians would have eaten Hans alive.

But Arendelle does not have an aristocracy worth its salt. There are only four other men in the room when Hans makes his bid for control and one of them is a foreign diplomat. The rest are nameless drones in uniforms and sashes. This curiously empty kingdom must be either an absolute monarchy, with Elsa at the top and no mediating ranks between her and the people, or have an unseen, unmentioned parliament that has reduced the monarch to a figurehead—which I strongly doubt, if Elsa’s throne is as desirable as Hans thinks it is.

You could try to excuse this as the necessary simplicity of a children’s film, but children’s films don’t have to be simple. It’s more a cliche born of a typical American incuriosity regarding nobility, Americans being incapable of imagining aristocrats as having functions and not just being privileged people who are excusable as targets of scorn and envy. Frozen’s feminist underpinnings are also a factor, feminist ideology—whatever the movement’s other merits—being a universal machine for making complex reality stupidly oversimplified. Google Prince Hans and see how often the cliche “toxic” comes up. He’s a powerful man and other powerful men are just going to trust him and follow him.

Again, study history, even a little bit.

Hans and the Arendelle nobility aren’t just unrealistic—though it’s fun to nitpick and, when I point this out in class, to see students recognize it as a flaw based on what we’ve learned about the past. The real problem is that the combined lack of imagination and ideological cliche evidenced in Hans weaken the story. Like the dam in Frozen II, he’s there to make a point and reinforce a message, not to live and breathe.

A real aristocracy—the kind that patronized the courtly love poets and commissioned altarpieces and cathedrals—wouldn’t have made this mistake.

Richard Cory and ambiguity

One of my favorite poets is Edwin Arlington Robinson. Though both popular and respected in his day, winning the Pulitzer for poetry three times, he seems largely forgotten now. I suspect this is largely a matter of timing: he mastered traditional form and meter, especially the sonnet and villanelle, just as Pound and Eliot and company were coming along to blow it all up.

Robinson’s skill also makes his tightly constructed verse seem effortless, even conversational. It’s clear and understandable—something else the modern poetry establishment, which came more and more to resemble a clique or cult, won’t abide—and mines powerful emotions from everyday scenes and images. Perhaps his best-known poems in this regard are a series of character sketches describing people from a fictitious New England village: “Reuben Bright,” “Aaron Stark,” “Luke Havergal,” “Cliff Klingenhagen,” and my personal favorite—read it and you’ll get why—“Miniver Cheevy.”

Another favorite, and one of Robinson’s most memorable, challenging, and dark, is “Richard Cory.” Take a minute and read it—I’m going to spoil it.

In sixteen lines, Robinson introduces us to a handsome, elegant, popular, courteous, and, yes, wealthy local gentleman, a man with everything going for him. Envy is perhaps too strong a word for the community’s attitude—Richard Cory is too well respected, if not beloved, to warrant envy—but the anonymous speaker of the poem makes it clear that Richard Cory lives in a world everyone else only aspires to. And then Richard Cory kills himself.

I still feel the shock of the final line all these years later, and the bitter irony with which it reframes the entire preceding poem. There is some ambiguity there—was Richard Cory discontented? ungrateful? depressed?—but the import is fairly clear: money can’t buy happiness, and you never know what troubles afflict someone of seemingly greater privilege than you.

The Simon and Garfunkel version, released on Sounds of Silence in 1966, traffics in a different kind of ambiguity. It’s less than three minutes long—listen to it here.

Paul Simon, in adapting Robinson’s poem, makes some noteworthy thematic changes. Where Robinson began with the impression Richard Cory gave his neighbors on the street and mentions his wealth last, Simon leads off with his wealth and even explains where it came from—an inheritance from his banker father, though we’re told later he owns a factory—highlighting the extent of his property and influence. “He had everything a man could want,” in this version, “Power, grace, and style,” which is the reverse of the human view Robinson gives us. (Simon also updates the outward signs of Richard Cory’s wealth for the swingin’ sixties with “the orgies on his yacht.”)

But the biggest change is the inclusion of a chorus, in which the anonymous speaker of Robinson’s poem, one of Richard Cory’s neighbors, comments on his own situation:

But I, I work in his factory
and I curse the life I’m living
and I curse my poverty
and I wish that I could be (3x)
Richard Cory.

The chorus comes around three times and, on its final repetition, which comes immediately after the announcement of Richard Cory’s suicide, it takes on a powerful irony. Just as Richard Cory’s fate in the last line of Robinson’s original changes the feeling and meaning of the rest of the poem, so in Simon’s version it changes the tone and meaning of the chorus.

This is where the ambiguity arises. Just what kind of envy—certainly the appropriate word here—is the speaker revealing?

If Simon has directly addressed his adaptation anywhere, I haven’t seen it. But an interpretation I’ve run across again and again online takes the final repetition of the chorus to be an admission by the speaker that he wants, like Richard Cory, to kill himself. (This is the interpretation presented in the Wikipedia summary, which cites no sources.)

I don’t think this is correct. For one, it makes the speaker far too individual, where in both Robinson and the rest of Simon’s version the “we” and the “I” stand in for the whole community. It’s also nihilistic in a way I don’t feel jibes with the rest of the song or Simon’s general oeuvre. But, most importantly, I think it has a simpler, more straightforward meaning related to that of the original poem: people don’t learn. The desire for wealth and material comfort leads us to overlook, ignore, or wish away the problems that come with them. We all know money doesn’t buy happiness—it’s a cliche for a reason—but who actually lives as if they know that? Literature and mythology, not to mention real life, are full of people who choose wealth and success knowing it will destroy them.

The yearning-for-suicide reading, which is rooted in an apparent ambiguity, bothers me. I think it’s a misreading of the song, yes, but I also think ambiguity, which can be a valuable tool in the hands of a purposeful artist, is overvalued today. The ambiguous ending is a mainstay of twee arthouse cinema. But ambiguity ceases to be cute when applied to suicide.

While feeling down and exhausted over the last month I’ve been doing a slow reread of Chesterton’s Orthodoxy. Chesterton’s light and frothy reputation is belied by his serious treatment of a subject like suicide. Here he is in Chapter V, “The Flag of the World,” writing forcefully about the deadly sin at the heart of it:

Not only is suicide a sin, it is the sin. It is the ultimate and absolute evil, the refusal to take an interest in existence; the refusal to take the oath of loyalty to life. The man who kills a man, kills a man. The man who kills himself, kills all men; as far as he is concerned he wipes out the world. His act is worse (symbolically considered) than any rape or dynamite outrage. For it destroys all buildings: it insults all women. . . . [H]e is a mere destroyer; spiritually, he destroys the universe.

The power of Robinson’s poem and Simon’s song derives from the assumed heinousness of Richard Cory’s act. That’s why it’s shocking in both. His wealth, personal elegance, and position in life only make it ironic, not less terrible. If Richard Cory’s suicide is just one more option, one a person with far more reasons to be bitter might justifiably desire to take, the entire story loses its meaning and weight.

Maybe that’s what Simon intended. I don’t know—but it would ruin the song. As good a song as it is, Robinson’s poem, in its structure and its properly used ambiguity, is better, and better for us.

Does it matter if the movie is faithful to the book?

Over the weekend Substack, in its mysterious way, showed me a month-old note by a literary critic I follow and respect. Since this is a month old and there was already some debate along these lines in the comments, I’ll share and gloss it anonymously:

It doesn’t matter if the film is faithful to the book.
It’s a film! Judge it as a film.
And anyway, you cannot faithfully turn prose into film.
It’s an affront to literary genius to think otherwise.

I’m not actually sure what the last line is supposed to mean. How does holding a filmmaker to a high standard when adapting a writer’s work degrade the writer? But I strenuously object to the rest of it.

To work backwards, the critic here is asserting that the difficulty of adaptation from one medium into another actually makes it impossible—“you cannot faithfully” adapt from book to film, he says. An appalling oversimplification. What does he mean by “prose,” here? When we talk about how a book is adapted into a film and the film isn’t faithful, we might mean it fails with regard to one or more of the following:

  • The literal events of the book

  • The overall story arc of the book

  • Particular details of the settings and/or characters

  • The narrative structure of the book

  • The meaning or thematic import of the book

  • The tone of the book

I’ve tried to arrange that list from simplest to most complex. The events narrated in a story are the easiest to get on screen. The meaning, what the author is apparently both getting out of the story and trying to share through it, and the tone of his storytelling are much harder. We’ve probably all seen movies that more or less adapted a book’s events without capturing the immaterial elements that give the book personality. A Handful of Dust, a quite literal adaptation of the great Waugh novel, comes to mind, as does the John Wayne True Grit. But other films might deviate here and there from the original while nailing its tone and moral register. The Coens’ No Country for Old Men and True Grit, both of which capture most of the events of their respective novels while, much more importantly, faithfully adapting their tones, are masterpieces in this regard.

All of this, according to our critic, is just “prose,” which “cannot faithfully” be made into a film. Cannot. This is not only oversimplified but wrong. Adaptation is difficult, but that we want to judge faithfulness at all indicates that it can be done, and can be done well.

Our critic is on firmer ground in asserting that films and books should be judged by different artistic standards, but this is common sense. Novels and movies tell stories in different ways and may or may not do so well, of course. But—still moving backwards—to assert a novel and its film adaptation are so separate that “it doesn’t matter” whether the adaptation is true to the book is foolishness.

Of course it matters. It matters because if a film adaptation of a book exists it exists because of the book. If a movie presumes to share a title with an author’s book, if it is meant to please readers of the book at all and not to be purely parasitic on the writer’s work and readership—we’re all familiar with the term cash-grab by now—the filmmakers owe it to the book to be faithful in at least some of the areas listed above. And having established that faithfulness is not, in fact, impossible, they owe it to the original to try.

I think it also matters because this kind of talk about the difficulty or impossibility of faithful adaptation has far too often served as an excuse for vandalism. Some vandalism originates with filmmakers contemptuous of their literary source material and wanting to drag it down to their level. Some comes from filmmakers who hubristically think they can improve on great literature. But perhaps the most common problem is the filmmaker with neither contempt nor reverence for the original, who sees it only as raw material to be reworked according to his preferences. It’s all content, after all.

This was my problem with two of the worst film adaptations I’ve seen in the last few years, The Green Knight and All Quiet on the Western Front, both of which—if you look at my reviews—I tried to judge on their merits as films while also noting their utter failure as adaptations. They don’t adapt the events, characters, meaning, or tone of the originals even a little bit faithfully. Are we to give them a pass because they have nice cinematography? Because they try to flatter our present assumptions?

There are other reasons to demand faithfulness of a film adaptation—the movie may be the one and only time many viewers, especially students, encounter any version of an author’s story—but these, I think, are the strongest. There is room for debate, of course. Arguments about whether and how Peter Jackson succeeded in adapting The Lord of the Rings, for example, have been fruitful for an appreciation of both the film trilogy and the novel. But handwaving even the possibility of faithfully adapting a book is bad for both.

A film might be just a film, but a film based on a book exists in relation to that book. If an author cared enough to write it and readers cared enough to read it, filmmakers owe them something more than apathy, hubris, or contempt. So do critics.

Goodreads Inferno

In a longish state-of-the-publishing-world essay on Substack, independent publisher Sam Jordison gives special consideration to the disappearance of the negative book review—the hatchet job—as a symptom of decline. He notes that author and critic DJ Taylor, whose excellent guide to Orwell I wrote about here last year, described the disappearance of “tough-minded” reviews, criticism that “often bordered on outright cruelty,” ten years ago. According to Jordison, the tepid positivity of book review pages has only worsened since then.

What caught my attention was Jordison’s second mention of Taylor’s phrase “outright cruelty,” which Jordison notes we shouldn’t want or need to have back: “We have Goodreads for that.” This observation is glossed with the following footnote:

Goodreads has risen just as professional book pages have declined. The nastiness and ignorance on display there is a reflection of internet culture, and the way everything Jeff Bezos touches is infected with his mean spirit. But I do also wonder if some people think they are restoring some kind of balance?

The nastiness on Goodreads is well known. Goodreads users mob and harass authors over single lines, engage in character assassination, try to preemptively get books canceled before they’re even published, and even the authors who use Goodreads join in the bad behavior. Imagine the vitriol of Twitter, the politics of Tumblr, and the righteous self-assurance of a school librarian in a Subaru, and you have the predominant tone of Goodreads today.

Thanks to the nastiness, the profound ignorance on Goodreads is perhaps less visible. But as it happens, it was fresh on my mind because this morning, as I searched for a brand new one-volume edition of The Divine Comedy that I’m about to start reading, I made the mistake of looking at its top review.

According to the user responsible, Dante has written this “OG” “self-insert bible [sic] fanfiction” because he “thanks he is very special” (stated twice), “has a bit of a crush . . . on both Beatrice,” “his dead girlfriend,” and “his poetry man crush” Virgil, and wants “to brag about Italy and dunk on the current pope.” All of this is wrong, for what it’s worth, but here’s the closing paragraph:

TLDR: Do I think everyone should read this? No, it’s veryyyyy dense. But I think everyone should watch a recap video or something to understand a lot of famous literary tropes that become established here.

Read The Divine Comedy for the tropes. Or better yet, “watch a recap video.”

This is a five-star review, by the way.

I wish this were the exception on Goodreads, but it’s not. Here’s a person with the capacity and the patience—perhaps? the review is short on details of anything beyond Inferno—to read the Comedy but who is utterly unprepared to receive and understand it, presumably having lost the good of intellect. This review reads like those parody book review videos that were popular a decade ago, except Thug Notes actually offered legitimate insight as well as laughs.

I have a love-hate relationship with Goodreads. I signed up fourteen years ago and still use it every day. But I can only do so and maintain my sanity by sticking to my tiny corner of online acquaintances and people I actually know and avoiding the hellscape of popular fiction, where the fights that can break out in review comment sections resemble nothing so much as Dante’s damned striving against each other even in death. Finding a legitimate, thoughtful, accurate review is harder than ever. One must dig, sometimes through hundreds of reviews like the one above, to find something helpful. And it’s even harder if you’re interested in older books, for which the temptation toward glibness or snark—omg so outdated! so racist! so sexist!—is for many irresistible.

And, for authors whose books are on Goodreads, it’s hard not to let a latent anxiety build up. Sometimes it feels like, inevitably, it’ll be your turn in the crosshairs.

Jordison blames Jeff Bezos, who he correctly points out—as I just did in my Tech & Culture class last week—started selling books not because he loves them but because they’re easy to catalog and ship. I’m sure that’s a factor, but it’s not sufficient to explain the whole problem. His other culprit, “internet culture,” that broad and protean devil, plays a crucial role as well. Regardless, Jordison ends his essay on a note of hope:

But I don’t counsel despair. Because the truth is that there is still good work being done. There are a few decent book sections left. Writers are producing fine books. Publishers are bringing them into the world. People are reading them.

At least some of those books will endure.

Truly encouraging to remember. But that this must happen despite rather than because of the technologies we’ve created from an ostensible love of books is a judgment on our culture.

Me and the Southern accent

Last month on his microblog, Alan Jacobs linked to this short Atlantic piece—now paywalled—about the slow extinction of the Southern accent. Quoting the author of the essay on the decline of distinctive Southern accents among the young and the eventual reality that the accents will only survive among the old in out-of-the-way places, Jacobs noted, “I’m part of the trend too: I certainly have a Southern accent, but it’s not as pronounced as it was when I was younger, and I profoundly regret that.”

Likewise and likewise. The regret is painful.

My speech, like Jacobs’s, is identifiably Southern to outsiders, but largely through syntax (e.g. double modals), vocabulary (e.g. y’all, fixing to), and peculiarities of emphasis (e.g. saying UM-brella instead of um-BRELLA). My accent, in terms of pronunciation, is limited to ineradicable features like the long I noted in that Atlantic essay, yod-dropping, hanging on to the H in wh- words, and the occasional dropped G. I have neither a drawl nor a twang.

This is a regret to me because I feel it severs me from previous generations and the place I come from in one of the most fundamental ways. We learn speech at our mother’s breast and from those closest to us, not only in terms of family but in physical proximity. Gradually losing that means losing a part of me that participates in them and in home.

And I cherish those accents—of which The Atlantic rightly notes there are many. I learned two kinds of Georgia accent growing up. My dad’s parents, natives of Clarke County and the Athens area, spoke a lot like Flannery O’Connor—a Savannah native with her own peculiarities of pronunciation—does in this recording, a soft, non-rhotic accent that outsiders read as genteel. My maternal grandparents, Rabun County natives, spoke in a strongly rhotic accent with heavy Appalachian features. Both of these are from “north Georgia,” broadly speaking, but couldn’t be more different. Southern accents have immense county-by-county variety.

Generation adds more variation. My parents’ accents, both still marked by their parents’ roots, nevertheless grew toward each other, and my own is a yet finer blend—dominated by my maternal side’s Appalachian terseness. It comes out when I try to say iron (arn) or Florida oranges (Flarda arnjes).

In old home movies I have a shrill, squeaky, very country little voice. I’m not sure when the most obvious marks of family and home began to fall away, but it must have been around middle or high school. Unlike the writer in The Atlantic, it was never intentional. I never wanted to blend in, was never ashamed of being Southern—far from it, I grew a sizable chip on my shoulder during an undergrad career surrounded by Yankees and Midwesterners who thought nothing of moving South and mocking the locals for saying UM-brella—and, if anything, I wanted more of an accent than what I ended up with.

Faking it is not, I decided long ago, an option. Better to let it emerge occasionally, a nice surprise. (I’ve noticed myself, in the classroom, pronouncing opportunity without the R lately, a real surprise.) I try to comfort myself with examples of other provincials who unintentionally lost their accents—namely CS Lewis, a Belfast native who, quite unconsciously, slowly conformed to the speech of whoever surrounded him and ended up sounding like this.

But when I remember my grandparents’ voices, and talk to my parents and aunts and uncles and siblings, and think about those home movies, and then recall my own kids’ sweet speech—in which very little Southern remains—all I can do is regret. Time isn’t the only thing that gets away from us. And this, the Ubi sunt? sense of loss, is perhaps the only thing more Southern than the accent I used to have.

Elegy for the mass market paperback

Some of my oldest and most cherished mass market paperbacks

It’s been a busy week both recovering from last weekend’s ice-storm and two lost days of school and preparing for this weekend’s snow, but not so busy that I didn’t catch a tempest in the Substack teapot: the apparent extinction of the mass market paperback.

In actual fact, Publishers Weekly reported last month that the country’s largest book distributor had decided not to bother shipping mass market paperbacks anymore, citing a steep decline in sales over the last few decades and profit margins that were already thin. This will naturally have an effect on how many of them are available and where, but the news was being misunderstood on Substack as either 1) mass market paperbacks will no longer be produced by publishers at all or, more egregiously, 2) paperbacks in general are being discontinued.

In the middle of this hubbub a not insignificant number of voices were raised crying “Good riddance!” Mass market paperbacks, they said, are cheap, badly designed, have small print and margins so narrow your thumbs cover the words, and their spines fall apart almost immediately. A lot of the same people paired their condemnation of the mass market paperback with praise for the hardback.

The mass market paperback may not, in fact, be extinct quite yet, but I can’t tolerate hatred for it.

Let me start with the crassly material. Cheapness is a feature, not a bug. The hardback aficionados seem to forget the kids who want to read but can’t stomach spending $30 they worked hard for, or saved until the day they could visit a good bookstore, on a single book. The money is just a facilitator, not the point; How much reading will my $20 get me? is the question I asked over and over as a teenager. I still do paperback math in my head—for most modern hardbacks I could have gotten at least six mass market paperbacks back in the day.

As for flimsiness, that’s just the nature of paper and glue. Even the $20—and more and more often $22 or $25—trade paperbacks dominating bookstores today will eventually fall apart from rough use or shoddy binding. Even hardbacks are not bound like they used to be. In my experience, if one takes even a little care of one’s books—not getting them wet, not just throwing them around, not intentionally breaking the spine like a barbarian—they’ll last a long time, and a mass market paperback from a decent publisher will likely be as sturdy as any other size.

And regarding design, a small book will necessarily have smaller print. Adapt. And just how big are your thumbs?

So much for that. Why do I feel so strongly about this?

My affection for the mass market paperback runs deep. I was a country boy without a lot of spare cash for books, so from quite early on, when I got a book, I got a mass market paperback. Many of the books my parents ordered for me at a discount from the God’s World Book Club in elementary school—Rifles for Watie and Across Five Aprils come to mind, as well as things like World’s Strangest Baseball Stories—were mass market paperbacks. I still have many of these.

The Hallmark store on St Simons Island where my mom shopped for gifts and ornaments every summer had an entire wall of these books. It was here that I first found a copy of The Killer Angels, which I knew as the source of Gettysburg, my favorite movie. My parents bought it for me and I just about wore it out reading it in the condo, by the pool, even at supper while a Japanese hibachi chef lit onion volcanoes on fire. Look at the photo at the top of this post—that’s the same copy. Cheap, yes—$5.99 is printed on the spine—but still serviceable.

When I started reading seriously in high school, the mass market paperback made entire literatures available to me for five or six dollars apiece. Signet Classics, which was repackaging its line in nicely designed matte-finish covers as I finished high school and started college, became my go-to. I’d pick up as many as I could with my birthday money during summer trips to St Simons—by this time blessed with a Books-a-Million, which always had a huge inventory of them—or at the Greenville Barnes & Noble when I didn’t have to be in class and had a little money. Burton Raffel’s Beowulf and Sir Gawain, Ovid, Virgil, Boccaccio, Euripides, Malory, Shakespeare, O Henry, David Copperfield, The Song of Roland—just this partial list is an introduction to a whole civilization for about $50.

The most important of all of these was a Signet Classics mass market paperback of Inferno, translated by John Ciardi, which I got sometime in 2001, a quarter century ago. And I can precisely date another important mass market paperback acquisition thanks to Amazon, where my very first order was a copy of All Quiet on the Western Front—in a metallic bronze cover I can still picture—on February 8, 2000.

Again, these are two books that transformed my life and I got both for about $10. (Amazon records the price of my copy of All Quiet as $4.79.) But much more important than the cost effectiveness is what I got out of these books, and the memories I have of them.

I’ve already mentioned a few of these. I also remember reading Inferno on the bus as a high school junior, canto by canto, poring over every one of Ciardi’s notes both to uncover more of this amazing book and to block out the chaos around me. I first read Sir Gawain in a single Sunday afternoon as a college freshman, and plowed through The Bonfire of the Vanities over several weeks of lunch in the campus snack shop the same year. I carried my copy of Raffel’s Beowulf in my jacket pocket as I graduated from Clemson sixteen years ago. It was August and it got sweaty but it’s still here on my shelf. And of course there’s the drive to Atlanta in 2000 that I’ve mentioned before, when I read a chapter called “Riddles in the Dark” and realized I loved The Hobbit (purchased at Walmart) and would forever.

The mass market paperback met an important need for me at a specific time. Maturing as a reader, wanting to read a lot, but not having much money or space, and being limited to what was widely available in big bookstores, I found in mass market paperbacks an intermediate step between the $1 and $2 books in the Dover Thrift catalog I pored over in high school and the Penguin Classics I began collecting in college. Good books, readily available, in workmanlike binding, inexpensive—anything more strikes me as luxury.

I don’t begrudge anyone their hardback library—far from it—but I hate to see the mass market paperback impugned. It’s done humble and honorable service making entertainment and learning available to millions. I’m one of them.

I hope the mass market paperback’s death has been greatly exaggerated and that it has many years left. But even if not, I’m grateful, and I’ll still enjoy mine.

Kubrick, conspiracism, and what happens when we assume

Stanley Kubrick and Jack Nicholson on set. The miniature hedge maze in the foreground is an accidental metaphor for the subject of this post.

YouTuber Man Carrying Thing posted a funny and thought-provoking video yesterday concerning a strange emergent pop culture conspiracy theory. Apparently some disappointed fans of “Stranger Things” decided that secret new episodes are on their way, a fact signaled through elaborate visual codes in the final season. (I have no dog in this fight. I saw the first season when it first appeared and have not bothered with any of it since.) These fans have compiled huge numbers of minor details as “evidence” but the date of the supposed release of these surprise episodes has already come and gone. Undeterred, they continue with the predictions.

Jake (Man Carrying Thing) has some thoughtful things to say about this weird story, the most important of which, I think, is the role of bad storytelling in creating false assumptions and the way those assumptions fuel the mad conclusions these fans have come to. In the process he makes a brief comparison to Stanley Kubrick and The Shining, which is what I really want to write about here.

The Shining is the subject of several bizarre but elaborately worked out theories, the two most prominent of which are that the film functions as a hidden-in-plain-sight confession by Kubrick that he faked the moon landings for NASA and that the film has something to say about the fate of American Indians.

The latter is more easily disposed of along the lines Jake uses for some of the “evidence” of the “Stranger Things” theory. Why does the Overlook Hotel have so many Native American decorations? Because that’s what a Western hotel in that era would decorate with. Next question.

The NASA stuff goes deeper, though, and this is where Jake’s comments on the assumptions behind such theories are pertinent. The conspiracy theory interpretations of The Shining lean heavily on several assumptions, the most important of which is that Stanley Kubrick meticulously planned everything about his films down to the last item in every frame. Every detail, the argument goes, is intentional and meaningful, and so the film can and has, as Jake notes, been analyzed frame by frame for “evidence” of these theories. But is this assumption correct?

No. Kubrick was meticulous, yes, but not that meticulous. Or not that kind of meticulous. He was, in fact, too good an artist for that.

I encourage everyone to watch “The Making of The Shining,” a documentary shot by Kubrick’s 17-year-old daughter Vivian and included as a special feature on the DVD and Blu-ray. (You can also watch it online here.) While the myth of Kubrick is of the chilly visionary with a perfect movie in his head that he brutally forces into reality, Vivian Kubrick captures her father changing and adapting on the fly, picking the ballroom music at the last minute, discussing the different versions—plural—of the script, and even coming up with the iconic floor-level angle of Jack Nicholson in the storage locker as they’re shooting the scene. She presents us the collaborative mess of filmmaking.

Kubrick knew what he wanted, but he had to work his way there, improvising and improving. This both rubbishes the conspiracist assumption about Kubrick, that The Shining presents some utterly controlled pre-planned message, and also functions as broadly applicable insight into creative work and human nature.

Any good artist in whatever medium will have a clear goal and an idea of how to accomplish it but will also adapt as they go, even the meticulous ones. That’s because every plan is subject to the combined friction of creative work and reality, which tests the artist. The later illusion of coherence and completeness is part of the art. A great artist like Kubrick can disguise it well, because thanks to his gifts the final product is better than what he set out to make. But the Duffer brothers? Jake—and audience reaction to the conclusion of “Stranger Things”—suggests otherwise.

As for human nature, conspiracy theories, whose protagonists are often hypercompetent if not omnipotent, fail to take account of the messy, improvisatory quality of reality, especially when they presume to encompass a larger slice of it than that available on a film set. They are, as German scholar Michael Butter puts it in The Nature of Conspiracy Theories, “based on the assumption that human beings can direct the course of history according to their own intentions . . . that history is plannable.” Or in Jake’s words, “So many conspiracy theories would lose their convincing quality if those who believed them acknowledged human fallibility.”

Recognizing this can make us less susceptible to falsehood—because we all know what happens when we assume—and better creators. A strange but heartening intersection.

In Dilbert memoriam

A childhood favorite. Some of my interests have never changed.

I’m late to the game in memorializing Scott Adams, who died a week ago today, and can offer only a personal appreciation. I hadn’t kept up with him consistently for about twenty years and heard of him just often enough to be amused at what he was getting up to. When I heard of his terminal illness last year and his plans to seek assisted suicide, I was grieved.

But to begin in the proper place. I was a comics-loving kid and while I was aware of Dilbert, which came packaged with all my favorites in my grandparents’ Atlanta Journal-Constitution and Anderson Independent, I don’t know how often I actually read it. My fundamental sense of what comic strips were came from Peanuts, Calvin and Hobbes, and—for one-panel high strangeness—The Far Side. These are still the three highest peaks in my estimation of the form. Dilbert was of a different world and valence than these, and its subjects and artwork probably didn’t immediately appeal.

But sometime in the mid-90s I got a new classmate at my small Christian school. I already owe one lifelong debt to Clint because he told me about this short story he had read at his previous school, “The Tell-Tale Heart,” and accidentally introduced me to Poe, but he was also a huge fan of Dilbert. I remember him bringing a copy of Fugitive from the Cubicle Police—the politically correct title for what in the strip itself is referred to as the Cubicle Gestapo—to read between classes. His enthusiasm and the specific strips he shared with me from this book led me to look closer at Dilbert. It was soon a favorite.

It’s a testament to Adams’s genius that a couple of twelve-year-olds could have found Dilbert’s workplace humor so funny. For us Dilbert was essentially fantasy literature, full of strange races and the vocabulary of forbidden tongues. I had no idea what HR was (those were the days) or what a consultant or software engineer did or what any of the office-specific jargon and tech lingo of the mid-90s actually meant, but we floated along on the vibes and characterization, inferring the meaning and import of jokes. Adams was very good at this. His skill with story, characterization, and the crucial timing of written humor meant our lack of experience of this world posed no obstacle to understanding—and laughing. We got the point even when we didn’t get the reference.

The chapter on office pranks was not especially helpful job preparation for a middle-schooler

Soon I had a respectable stack of Dilbert books, including one that worked as a key to Dilbert’s world and appealed to my Aristotelian love of taxonomy: Seven Years of Highly Defective People, a best-of sorted by character with notes by Adams in the margins. These were informative and funny and his personality came through clearly.

I got to know that better by signing up—again, this is still the mid- or late 90s—for his e-mail newsletter, which automatically made me part of the DNRC: Dogbert’s New Ruling Class, the intellectual elite of his forthcoming new world order. Here Adams offered updates and commentary and responded to reader e-mails with a brimful serving of his wry snark. It was here, I think, or perhaps in The Joy of Work, one of his non-cartoon books on business culture, that I learned the word cynical.

I was in middle school by then (I remember reading The Dilbert Future on my first trip to Europe in 1998, not quite fourteen) and that’s a heady moment to be introduced to cynicism. Not that it wouldn’t naturally have occurred about that time, but I’m not sure learning that one could adopt a self-aware, sardonic, Olympian aloofness about one’s environment was helpful to me. I was already bent, in Malacandran terms, in these directions anyway, and Dilbert encouraged me to adopt a more self-conscious and ironic posture strictly because it was funny. This cynicism was, ironically, quite naive.

Perhaps this would have been fine in a Sisyphean office environment, but at fourteen my environments were family, church, and school, fields where earnestness is actually warranted—most of the time. Because I learned cynicism as a mode of humor at about the same time I learned that, as a true believer, I would often be let down, I came to use wry humor as a shield. I don’t think Dilbert did me any long-term damage but I’ve had to mature past these attitudes and habits.

Back to Adams himself and the DNRC. The Dilbert newsletter was probably my first experience of a writer opening up his mind to his readers. In addition to cartooning, the business world, and the vast intellectual superiority of his subscribers, Adams unironically flogged his vegetarian taco brand and his thought experiments—another phrase I learned from him. He shared a lot of the ideas he’d eventually package as God’s Debris. I may have been naive but I wasn’t suggestible and wouldn’t follow the funny man into woo-woo agnosticism. I had accidentally learned how to observe proper boundaries with people I liked but couldn’t agree with on the important stuff, a lesson I can take no credit for. It also won’t be the last appearance of grace in this story.

I kept up with Dilbert online through college—it was one of several strips I checked daily—but Adams himself, whom I admired as the off-kilter mind behind the cartoon, fell out of my awareness and I was content simply to read the strip. Somewhere between my undergrad and grad school years I lost the habit even of this, so it was a shock to run across it occasionally and see updates. Dilbert in polo and lanyard? That would have been unthinkable in 1998. (But guess what I wear to work every day.)

I have no opinion on Adams and politics. When he popped up on my radar over the last ten years saying contrarian things to the great consternation of a lot of people, I was unsurprised. Hadn’t y’all met him? He was a contrarian. If he hadn’t been, Dilbert would never have had the edge and absurdity that made it great. It would have been Cathy in a software company.

But to return to where I started, when Adams announced his imminent death from pancreatic cancer and his plans to end his own life, I was grieved. I remembered my mixed feelings about his dorm room-style philosophizing, his know-it-all pandeism, his air of superiority—in a word, his arrogance, a trait that attracts middle schoolers like a whirlpool attracts flotsam—and worried that his gifts would end in a final act of nihilism as dark as anything in Catbert’s HR department. What I did not do was hope or pray for him.

I am in no position to weigh the merit of Adams’s announcement of his conversion to Christianity just before he died last week. The various algorithms have tried to feed me a lot of videos—all with thumbnails of frantic, outraged people mugging in front of microphones—arguing yea or nay on his reasons. What I do know is that Adams was facing death, the ultimate argument-ender, and these podcasters are not, and that God is not willing that any should perish. In a history replete with sinners converting in the most miserable of conditions, how is God diminished by saving one more? What I felt when I learned of his decision, a Pascal’s Wager deathbed conversion, was relief and gratitude.

Again, these are my observations as an old fan who, after childhood, held Adams at arm’s length but always appreciated him. Dilbert’s peculiar sense of humor is a key middle layer in the development of my own sensibilities, and Adams’s genius was the same genius that made Peanuts, Calvin and Hobbes, and The Far Side great—the ability to heighten the ordinary while keeping it familiar, to people his imaginary landscape with characters we recognize as our friends, family, coworkers, and ourselves, to make this hilarious, and to do it seemingly effortlessly. Also like Schulz, Watterson, and Larson, he was, for better or worse, uncompromising. That his complicated story and difficult personality ended with not just a turn toward grace but a casting of himself on God makes it all the more poignant.

Adams’s story seems to me one of eucatastrophe, of grace snatching victory from the jaws of defeat. It is not a story Adams would have written. Is there any better end for the cynic than redemption?