An effect of sense

When I reviewed Rob Brotherton’s Suspicious Minds: Why We Believe Conspiracy Theories earlier this week I mentioned the pattern-finding processes built into our minds, the necessary, natural, and helpful instincts that can also lead us into error unless we carefully discipline our thinking. As it happens, I’ve run across two good examples of this kind of aberrant pattern-finding in the last few days (Coincidence??? Yes!), which I’ve decided to supplement with one more that I’ve personally encountered several times.

An ambiguous provocation

One of the pitfalls of writing fiction is the possibility of mistakes creeping in during revision, the stage when you’re supposed to be fixing mistakes. (I generate more typos in my own work during revision than at any other point of the process.)

This week I finally started reading The Name of the Rose, the great historical novel by Umberto Eco in which William of Baskerville, a Franciscan friar, investigates a series of murders in an Italian monastery. Assisting him is Adso of Melk, a young German Benedictine, and opposing him is Bernard Gui, a real-life Dominican inquisitor. In his lengthy postscript to the novel, Eco relates the following anecdote:

As I read the reviews of the novel, I felt a thrill of satisfaction when I found a critic . . . who quoted a remark of William's made at the end of the trial . . . “What terrifies you most in purity?” Adso asks. And William answers: “Haste.” I loved, and still love, these two lines very much. But then a reader pointed out to me that on the following page, Bernard Gui, threatening the cellarer with torture, says: “Justice is not inspired by haste, as the Pseudo Apostles believe, and the justice of God has centuries at its disposal.” And the reader rightly asked me what connection I had meant to establish between the haste feared by William and the absence of haste extolled by Bernard. At that point I realized that a disturbing thing had happened. The exchange between Adso and William does not exist in the manuscript. I added this brief dialogue in the galleys, for reasons of concinnity: I needed to insert another scansion before giving Bernard the floor again. And naturally, as I was making William loathe haste . . . I completely forgot that, a little later, Bernard speaks of haste. If you reread Bernard’s speech without William’s, it becomes simply a stereotyped expression, the sort of thing we would expect from a judge, a commonplace on the order of “All are equal before the law.” Alas, when juxtaposed with the haste mentioned by William, the haste mentioned by Bernard literally creates an effect of sense; and the reader is justified in wondering if the two men are saying the same thing, or if the loathing of haste expressed by William is not imperceptibly different from the loathing of haste expressed by Bernard. The text is there, and produces its own effects. Whether I wanted it this way or not, we are now faced with a question, an ambiguous provocation; and I myself feel embarrassment in interpreting this conflict, though I realize a meaning lurks there (perhaps many meanings do).

Here, the accidental repetition of a distinctive word creates “an effect of sense” in the reader, the feeling that there is some significant linkage between the two characters. And because its meaning is not immediately clear, it provokes the reader, who feels intuitively that there is something here that must be investigated and uncovered. Its very ambiguity suggests significance, so much so that a reader went to the trouble of asking Eco for an explanation.

It turns out there is no such linkage at all, but the feeling remains. Not a bad parallel to the kind of suspicions, arising seemingly out of nowhere, that commonly lead to conspiracy theories.

Cui bono?

I decided to follow up Suspicious Minds by reading the new revised edition of Conspiracy Theories: A Primer, by Joseph Uscinski and Adam Enders, a short academic study of conspiracy theories and other “anomalous beliefs.” In its chapter on the psychology and sociology of conspiracism, the authors introduce intentionality bias, which Brotherton covers well in Suspicious Minds, as well as a concept they call cheater detectors: “the willingness to suspect others of cheating,” especially when those others are perceived to benefit from an event. This can lead to “a tendency . . . to make an inferential leap from incentive to conspiracy.”

They continue:

For a real-world example, we could look to the death of Supreme Court Justice Antonin Scalia in 2016. Scalia's passing gave then-president Barack Obama the opportunity to shift the balance of the Court in his favor. Since he and his party had something to gain, some (including former president Trump) jumped to the conclusion that Obama had Scalia murdered. A more sober interpretation might be that an overweight, seventy-nine-year-old smoker with diabetes and heart problems isn’t exactly unlikely to die from natural (i.e., non-homicidal) causes. If we assumed that every time a grandmother passed away the grandchildren expecting to receive an inheritance murdered her, then every grandchild who inherits money must be a murderer! Such a view is obviously untenable.

That last example is a good takedown of one of the most annoying hermeneutical principles in modern popular discourse: the cui bono? (Who benefits?) principle. The question of cui bono? is a staple of conspiracist thinking, which is a problem because of its simplifying, reductivist effect. Just because someone benefits in some relative way from an event does not mean that they intended or even wanted it to happen.

Kingfish

I’ve taught both halves of US History for eleven years now, and still use, with occasional updates and modifications, the PowerPoint slideshows I designed for my lectures during my first year. When I teach the Great Depression and introduce left-wing critics of the New Deal, one of the major figures I describe is Huey Long, the governor of Louisiana and eventually one of its two US senators. Long, a populist autocrat and vocal proponent of public spending and wealth redistribution, viewed FDR and the New Deal as insufficiently left-wing and vocally criticized both the policy program and the president himself.

I include some photos and usually take a detour to YouTube to show clips of Long giving speeches, but here are the points on my one slide about Long in a subsection I call “New Deal Backlash”:

  • Huey Long of Louisiana

  • Radical democratic populist

  • “Share the Wealth” plan to make “every man a king”

  • Popularity a challenge to FDR

  • Possibility of presidential campaign, but assassinated

I’ve been meaning to modify these last two points for years, because do you know what a consistent minority of students immediately suspect when this information is presented in this way? Again—the effect is instantaneous. Such patterns seem to suggest themselves.

As it happens, Long’s assassination also offers a good example of how to discipline this kind of thinking: by simply delving into the details. In the last few years I’ve shown my classes a 1992 “Unsolved Mysteries” segment on the Long assassination. It’s as fun and sensationalistic as you’d expect (I vividly remember watching Long’s bodyguards blow the assassin away as an eight-year-old), but it does a good enough job of conveying the complexity in the lives of Long and his aggrieved assassin, Dr Carl Weiss, to put a hypothetical FDR hitman firmly out of mind.

To me, one of the most fascinating aspects of two fruitful fields for conspiracy theories—the JFK assassination and Hitler’s suicide—is the way the very possibility of conspiracy dissolves the more specifically you look at the details. Each event involved not only the major names but hundreds of other people, all of whom can be studied and charted individually and all of whose stories interact with each other’s and hundreds more. And there are tons of documentation. It’s often possible to know, minute by minute, who is in which room of the Führerbunker at any given time in the days surrounding Hitler’s death, and the same is true of the people inside the Texas Schoolbook Depository on the day Oswald shot Kennedy. (Here’s an excellent recent video on precisely this topic.)

All of which shows that conspiracy theories are easier to formulate and to believe—these dots are easier to connect—when you forget that the figures involved in them are people with lives and attachments living in complex communities, not game pieces.

Conclusion

In all three of these cases you have patterns naturally detected and suggested by the mind. Merely noticing them is not enough. A pattern is not evidence of the truth of any conclusions you may draw from it—the pattern may not even exist. Our thinking has to be subject to standards of truth outside its own natural processes.

More if you’re interested

Definitely check out that Lemmino documentary on the people inside and near the Texas Schoolbook Depository on November 22, 1963. It’s excellently done, and if it weren’t so long I would certainly show it to my students. Here’s one I always show them, about one of the individuals whose behavior on that day never could have been predicted. I especially like the interviewee’s macro vs micro view of history. For what really happened in Hitler’s busy, crowded bunker in April and May of 1945, I always recommend the sixth edition of The Last Days of Hitler, by Hugh Trevor-Roper, and the more up-to-date Hitler’s Death, by Luke Daly-Groves, which I reviewed here long ago. If you’d like to hear from one of the many people present, Heinz Linge’s memoir is a worthwhile read. And Umberto Eco was no stranger to conspiracy theories. His satirical novel Foucault’s Pendulum concerns academics who invent a wild conspiracy theory for fun, only to have the theory start coming true.

Finally, I can’t pass over the actor playing Huey Long in the “Unsolved Mysteries” reenactment. This is Coen brothers veteran John McConnell (“And stay out of the Woolsworth!”), who also originated the role of Ignatius J Reilly in a stage version of A Confederacy of Dunces.

Suspicious Minds

Rob Brotherton’s book Suspicious Minds: Why We Believe Conspiracy Theories had been sitting on my shelf, waiting to be read, for just over four years when I ran across an Instagram reel in which a smirking mom wrote about how proud she was of her homeschooled child questioning the reality of the moon landing “and other dubious historical events.” When people in the comments asked, as I had wondered the moment I saw this video, whether this was really the kind of result homeschoolers would want to advertise, she and a posse of supporters aggressively doubled down, lobbing buzzwords like grenades. I think the very first reply included the loathsome term “critical thinking.”

Silly, but unsurprising for the internet—especially the world of women mugging silently into phone cameras while text appears onscreen—right? But I had not seen this video at random. Several trusted friends, people whose intellects and character I respect, had shared it on multiple social media platforms. I started reading Suspicious Minds that afternoon.

Brotherton is a psychologist, and in Suspicious Minds he sets out not to debunk or disprove any particular conspiracy theory—though he uses many as examples—but to explain how and why people come to believe and even take pride in believing such theories in the first place. He undertakes this with an explicit desire not to stigmatize or demean conspiracy theorists and criticizes authors whose books on conspiracism have used titles like Voodoo Histories and How Mumbo Jumbo Conquered the World. He also, crucially, dispels many common assumptions surrounding conspiracist thinking.

First among the misconceptions is the idea that conspiracy theories are a symptom of “paranoid” thinking. The term paranoid, which became strongly associated with conspiracism thanks to Richard Hofstadter’s 1964 essay “The Paranoid Style in American Politics,” is inappropriate as a descriptor because of its hint of mental imbalance and indiscriminate fear. Most conspiracy theorists, Brotherton points out, believe in one or a small number of mundane theories that are untrue but not especially consequential, much less worthy of anxiety. A second, related misconception—and by far the more important one—is that conspiracy theories are a phenomenon of the “fringe” of society: of basement dwellers, militia types, and street preachers in sandwich boards. In a word, obsessives. As Eric Ambler puts it in A Coffin for Dimitrios, “‘Obsession’ was an ugly word. It conjured up visions of bright stupid eyes and proofs that the world was flat.”

The idea of conspiracy theories as fringe is not only false, Brotherton argues, it is the exact opposite of the truth. In terms of pure numbers, repeated polls have found that an overwhelming majority of Americans believe in at least one major conspiracy theory—the most common by far being the belief that JFK was killed by someone other than or in addition to Lee Harvey Oswald—and often more than one. Conspiracist thinking is mainstream. It is the norm. This cannot be emphasized enough.

But why is this? Is it, as I must confess I used to think, that those numbers just provide evidence for how stupid the majority of people are? Brotherton argues that this conclusion is incorrect, too. There is no meaningful difference in how often or how much educated and uneducated people (which is not the same thing as smart and dumb people) adhere to conspiracy theories. Conspiracism is rooted deeper, not in a kernel of paranoia and fear but in the natural and normal way we see and think about the world.

Conspiracy theories, Brotherton argues, originate in the human mind’s own truth-detecting processes. They are a feature, not a bug. The bulk of Suspicious Minds examines, in detail, how both the conscious and unconscious workings of the mind not only make conspiracist beliefs possible, but strengthen them. In addition to obvious problems like confirmation bias, which distorts thinking by overemphasizing information we already believe and agree with, and the Dunning–Kruger Effect, which causes us to overestimate our expertise and understanding of how things work, there are subtler ways our own thinking trips us up.

Proportionality bias, for example, causes disbelief that something significant could happen for insignificant reasons. As an example, Brotherton describes the freakish luck of Gavrilo Princip, a Bosnian Serb assassin who thought his chance at his target, Archduke Franz Ferdinand of Austria-Hungary, had passed, until the Archduke’s car pulled up a few feet in front of him and stalled out as the driver changed gears. This farcical murder of an unpopular royal by an inept assassin caused a war that killed over twenty million people. That people after the war—on both the winning and losing sides—sought an explanation more commensurate with the effect of the war is only natural. And the classic example is JFK himself, as many of the conspiracy theories surrounding him inevitably circle back to disbelief that a loser like Oswald could have killed the leader of the free world.

Similarly, intentionality bias suggests to us that everything that happens was intended by someone—they did it on purpose—especially bad things, so that famines, epidemics, stock market crashes, and wars become not tragedies native to our fallen condition but the fruit of sinister plots. Further, our many pattern-finding and simplifying instincts, heuristics that help us quickly grasp complex information, will also incline us to find cause and effect relationships in random events. We’re wired to disbelieve in accident or happenstance, so much so that we stubbornly connect dots when there is no design to be revealed.

That’s because we’re storytelling creatures. In perhaps the most important chapter in the book, “(Official) Stories,” Brotherton examines the way our built-in need for narrative affects our perceptions and understanding. Coincidence, accident, and simply not knowing are narratively unsatisfying, as any internet neckbeard complaining about “plot holes” will make sure you understand. So when outrageous Fortune, with her slings and arrows, throws catastrophe at us, it is natural to seek an explanation that makes sense of the story—an explanation with clear cause and effect, an identifiable antagonist, and understandable, often personal, motives.

Why does any of this matter? As I heard it put once, in an excellent video essay about the technical reasons the moon landing couldn’t have been faked, what is at stake is “the ultimate fate of knowing.” The same mental tools that help us understand and make quick decisions in a chaotic world can just as easily mislead and prejudice us.

This is why Brotherton’s insistence that conspiracy theories are, strictly speaking, rational is so important. As Chesterton put it in a line I’ve quoted many times, “The madman is not the man who has lost his reason. The madman is the man who has lost everything except his reason.” Merely thinking is not enough to lead us to the truth. Brotherton’s book is a much-needed reminder that finding the truth requires discipline, hard work, and no small measure of humility.

Scruton on style

Last week I revisited the late Sir Roger Scruton’s Beauty: A Very Short Introduction via audiobook on my commute. It’s an excellent precis of much that is fundamental to his thinking and, true to the subtitle, a wide-ranging introduction to many topics that bear further thought. Here’s one.

From Chapter 4, “Everyday Beauty,” in a discussion of the role proportion plays in the creation of vernacular architectures by launching the builder on “a path of discovery” of what “fits” and is “suitable” for each detail in relation to the others:

One result of this process of matching is a visual vocabulary: by using identical mouldings in door and window, for example, the visual match becomes easier to recognize and to accept. Another result is what is loosely described as style—the repeated use of shapes, contours, materials and so on, their adaptation to special uses, and the search for a repertoire of visual gestures.

I like the idea of a style as mastery of a discipline’s “repertoire,” the selective, purposeful use of a shared vocabulary. Scruton’s example is architectural, but he also refers throughout the book to painting, sculpture, cinema, and most especially music. My mind naturally suggested literary style, with its literal shared vocabulary and the many effects and fine shades of meaning that a firm control of English can yield.

Scruton himself raises the idea of control as a component of style in the next chapter, “Artistic Beauty”:

True artists control their subject-matter, in order that our response to it should be their doing, not ours.

True artists control their subject-matter, in order that our response to it should be their doing, not ours. One way of exerting this control is through style . . . Style is not exhibited only by art: indeed, as I argued in the last chapter, it is natural to us, part of the aesthetics of everyday life, through which we arrange our environment and place it in significant relation to ourselves. Flair in dressing, for example, which is not the same as an insistent originality, consists rather in the ability to turn a shared repertoire in a personal direction, so that a single character is revealed in each of them. That is what we mean by style, and by the ‘stylishness’ that comes about when style over-reaches itself and becomes the dominant factor in a person’s dress.

The tension between originality and a common vocabulary and the need for balance is an important topic and one Scruton returns to later in the book, but he continues by introducing another consideration:

Styles can resemble each other, and contain large overlapping idioms—like the styles of Haydn and Mozart or Coleridge and Wordsworth. Or they might be unique, like the style of Van Gogh, so that anyone who shares the repertoire is seen as a mere copier or pasticheur, and not as an artist with a style of his own. Our tendency to think in this way has something to do with our sense of human integrity: the unique style is one that has identified a unique human being, whose personality is entirely objectified in his work.

This passage in particular offers a lot for the writer to think about. Every writer has heroes and idols and role models, other writers whose control over their work has influenced our own technique, consciously or not. This starts young. It’s been more than twenty years since I read Stephen King’s On Writing, but I still remember and think often about this passage:

You may find yourself adopting a style you find particularly exciting, and there’s nothing wrong with that. When I read Ray Bradbury as a kid, I wrote like Ray Bradbury—everything green and wondrous and seen through a lens smeared with the grease of nostalgia. When I read James M Cain, everything I wrote came out clipped and stripped and hard-boiled. When I read Lovecraft, my prose became luxurious and Byzantine.

All of which is, for King, a crucial developmental stage in the writer’s life, one that should be refined through constant reading and writing, so that eventually one is no longer writing in imitation but in “one’s own style.”

But if you’re aware of what you’re doing and working hard at it, particularly in order to achieve a certain specific effect—so that, per Scruton, the reader’s response will be your doing, not theirs—it’s hard not to become anxious that you’re working merely in pastiche or even accidental parody. Have I sacrificed my integrity to sound like someone else? Inconsistency doesn’t help. I’ve worried more about this on some projects than others. Why am I confident that I can use tricks learned from Charles Portis but not those from Cormac McCarthy? Food for thought.

I think, naturally, of John Gardner and his description of “mannered” prose, a term he’d certainly have applied to McCarthy. “Mannered” suggests artificiality or phoniness, the lack of integrity Scruton describes above, which is how every good writer hopes not to come across. But I also think of Elmore Leonard, another author whom I’ve quoted here many times, and who worked hard to make his style the absence of style. Scruton contends that this is impossible:

Style must be perceivable: there is no such thing as hidden style. It shows itself, even if it does so in artful ways that conceal the effort and sophistication . . . At the same time, it becomes perceivable by virtue of our comparative perceptions: it involves a standing out from norms that must also be subliminally present in our perception if the stylistic idioms and departures are to be noticed. Style enables artists to allude to things that they do not state, to summon comparisons that they do not explicitly make, to place their work and its subject-matter in a context which makes every gesture significant, and so achieve the kind of concentration of meaning that we witness in Britten’s Cello Symphony or Eliot's Four Quartets.

This is exactly right, and Leonard would agree. Leonard’s style, which was precisely designed to “conceal the effort and sophistication” of his writing and make it seem effortless, was immediately recognizable because it departed from the “norms” described above in particular ways—something Leonard himself noted. Those “norms,” that context, are the broader shared vocabulary we began with, which gives shape to one’s work through contrast.

And that final sentence on what a firm, controlled, purposeful, precise style can do, using the power of allusion, implicit comparison, the subtle significance of every detail to “achieve . . . concentration of meaning”—is there a writer who wouldn’t die happy having that said of his work?

Political prestige and pathetic dignity in a dying civilization

Yesterday was South Carolina’s Republican primary. Coincidentally, I also started a classic espionage novel I’ve been meaning to read for a while: A Coffin for Dimitrios, by Eric Ambler. Last night as the unwanted updates on the unwanted results of the unwanted primary slowed to a trickle I settled in to read a few more chapters before bed. And in the middle of Chapter 5 I read this:

In a dying civilization, political prestige is the reward not of the shrewdest diagnostician, but of the man with the best bedside manner. It is the decoration conferred on mediocrity by ignorance.

Apropos of nothing, right? After all, more than just about any other political process, a primary election is a popularity contest that is all about flattering, cajoling, and slinging enough mud to win. And winning is not the mark of distinction the candidates think it will be. Verily, they have their reward.

Ambler continues:

Yet there remains one sort of political prestige that may still be worn with a certain pathetic dignity; it is that given to the liberal-minded leader of a party of conflicting doctrinaire extremists. His dignity is that of all doomed men: for, whether the two extremes proceed to mutual destruction or whether one of them prevails, doomed he is, either to suffer the hatred of the people or to die a martyr.

Ambler was wryly describing the situation in many former Austro-Hungarian and especially Ottoman territories as part of the background plot of his novel, but the situation is instantly recognizable, not only in many other historical eras—I think immediately of Cicero—but in the present. Both major American political parties have plenty of doctrinaire extremists and doomed men to go around. But what we have too little of is that “pathetic dignity,” the attitude of the defeated who are truer to principle than to victory.

Maybe it’s my contrarianism, my commitment to a conservatism with little modern application, or my Reepicheep-like love of lost causes and last stands, but I hope to see more of that “pathetic dignity,” more people willing to lose than to flatter a terminal patient.

Seneca on internet rage

Seneca was a Stoic philosopher and teacher most famous, in the former role, for his Letters on Stoic philosophy and, in the latter, as the personal tutor to Nero. Talk about wayward pupils. The following comes from Book III of his treatise De Ira (On Anger) in James Romm’s translation for Princeton UP, published as How to Keep Your Cool:

[Y]our anger is a kind of madness: because you set a huge price on worthless things.
— Seneca

Look now! Let’s examine other slights: food, and drink, and the elegance people work at for the sake of these; insulting words; gestures that don’t convey enough honor; stubborn beasts of burden and tardy slaves; suspicions and dark interpretations of someone else’s words, which make the gift of human speech into one of nature’s many injuries—believe me, these things are not serious, though we get seriously heated over them. They’re the sort of things that send young boys into fights and brawls. We pursue them so gravely, yet they hold nothing weighty or great. That’s why I tell you that your anger is a kind of madness: because you set a huge price on worthless things.

Years ago, in the early days of this blog, I shared a passage from another great ancient thinker, St Augustine of Hippo, that seemed to describe internet trolls 1600 years before the fact. Let us add this bit of Stoic insight to that file. As an acquaintance wrote to me after I rediscovered and shared this line yesterday, it’s remarkable how much of people’s behavior and reasoning on the internet can be explained by Stoic teaching on how unchecked passions over piddling things warp one’s reason.

James Romm, by the way, is also the author of Dying Every Day, an excellent book on Seneca, his relationship with his most famous student, and the way that relationship and Seneca’s seeming failure to decisively shape Nero have dogged his posthumous reputation.

Werner Herzog on psychoanalysis (and the 20th century)

Coincidental to my reading and review of Bill Watterson’s The Mysteries last weekend, today I ran across this passage on psychoanalysis from filmmaker Werner Herzog’s recent memoir Every Man for Himself and God Against All*:

I’d rather die than go to an analyst, because it’s my view that something fundamentally wrong happens there. If you harshly light every last corner of a house, the house will be uninhabitable. It’s like that with your soul; if you light it up, shadows and darkness and all, people will become ‘uninhabitable.’ I am convinced that it’s psychoanalysis—along with quite a few other mistakes—that has made the twentieth century so terrible. As far as I’m concerned, the twentieth century, in its entirety, was a mistake.

As in Watterson’s book, Herzog suggests here that the drive to illuminate and resolve—and, inevitably, to control—can only end in catastrophe. Food for thought.

Last year I read Herzog’s short novel The Twilight World and greatly enjoyed it. I haven’t delved deep into his filmography, which I keep meaning to correct, but his movie Invincible has proven uniquely haunting to me ever since I first watched it twenty years ago. I recommend it.

*German title: Jeder für sich und Gott gegen alle. The German-language audiobook is the only version currently available through my library. Might be a good opportunity to scrub some of the rust off my German.

Agatha Christie on historical perspective

Coincident to my recent posts about the “right side” of history and how our understanding of what happened in the past changes and, ideally, grows more thorough and accurate as time passes, here’s Agatha Christie in the short story “The Coming of Mr Quin,” which I’m reading in the collection Midwinter Murder: Fireside Tales from the Queen of Mystery.

Briefly, a New Year’s Eve party at a comfortable home is interrupted just after midnight by the arrival of a Mr Harley Quin, whose car has broken down. Quin says that he knew the house’s former owner, one Derek Capel, who unexpectedly killed himself a decade prior. Notice how Quin invites the partygoers to revisit what they know about the incident:

‘A very inexplicable business,’ said Mr Quin, slowly and deliberately, and he paused with the air of an actor who has just spoken an important cue.

‘You may well say inexplicable,’ burst in Conway. ‘The thing's a black mystery—always will be.’

‘I wonder,’ said Mr Quin, non-committally. ‘Yes, Sir Richard, you were saying?’

‘Astounding—that's what it was. Here's a man in the prime of life, gay, light-hearted, without a care in the world. Five or six old pals staying with him. Top of his spirits at dinner, full of plans for the future. And from the dinner table he goes straight upstairs to his room, takes a revolver from a drawer and shoots himself. Why? Nobody ever knew. Nobody ever will know.’

‘Isn’t that rather a sweeping statement, Sir Richard?’ asked Mr Quin, smiling.

Conway stared at him.

‘What d’you mean? I don't understand.’

‘A problem is not necessarily unsolvable because it has remained unsolved.’

‘Oh! Come, man, if nothing came out at the time, it's not likely to come out now—ten years afterwards?’

Mr Quin shook his head gently.

The contemporary historian never writes such a true history as the historian of a later generation. It is a question of getting the true perspective, of seeing things in proportion.
— Mr Quin

‘I disagree with you. The evidence of history is against you. The contemporary historian never writes such a true history as the historian of a later generation. It is a question of getting the true perspective, of seeing things in proportion. If you like to call it so, it is, like everything else, a question of relativity.’

Alex Portal leant forward, his face twitching painfully.

‘You are right, Mr Quin,’ he cried, ‘you are right. Time does not dispose of a question—it only presents it anew in a different guise.’

Evesham was smiling tolerantly.

‘Then you mean to say, Mr Quin, that if we were to hold, let us say, a Court of Inquiry tonight, into the circumstances of Derek Capel’s death, we are as likely to arrive at the truth as we should have been at the time?’

‘More likely, Mr Evesham. The personal equation has largely dropped out, and you will remember facts as facts without seeking to put your own interpretation upon them.’

Evesham frowned doubtfully.

‘One must have a starting point, of course,’ said Mr Quin in his quiet level voice. ‘A starting point is usually a theory. One of you must have a theory, I am sure. How about you, Sir Richard?’

Simple and tailored to the mystery genre, but not a bad explanation of how the greater perspective afforded by historical distance can lead to a more accurate understanding of important events. There are, certainly, parts of my own life I understand much better now than when I was an eyewitness living through them.

I’ve been trying to read more of Agatha Christie the last year or so after having made it to my late thirties with Murder on the Orient Express as my sole experience of her storytelling. My wife, on the other hand, has read a lot of Christie, and has done so over many years. But even she was unfamiliar with Christie’s Mr Quin, who is the subject of several short stories collected as The Mysterious Mr Quin. I’m enjoying him in this story so far—especially with this kind of sharp historical aside—and plan to check that out.

History has no sides

History, a mosaic by Frederick Dielman in the Library of Congress

I started this post some weeks ago, but sickness—mine and others—intervened. Fortuitously so, since it seems appropriate to finish and post this as a New Year’s Eve reflection, a reminder as 2023 gives way, irretrievably, to 2024.

Writing in Law & Liberty a few weeks ago, Theodore Dalrymple takes the recent conflict between Venezuela and Guyana, a large area of which Venezuela is now claiming as its own territory, as an opportunity to consider an idea invoked by Guyana’s rightly aggrieved foreign minister: “the right side of history.”

This is now a common term for an idea that was already fairly widespread, a sort of popularized Whig or Progressive view of history’s supposed outworkings that, as Dalrymple notes, “implies a teleology in history, a pre-established end to which history is necessarily moving.” History has a goal, an ultimate good toward which societies and governments are moving, a goal that offers an easy moral calculus: if a thing helps the world toward that goal, it is good, and if it hinders or frustrates movement toward that goal, it is bad. This is how history comes to have “sides.”

As worldviews go, this is relatively simple, easily adaptable—whiggishness, as I’ve noted, tends to be its conservative form, and Progressivism or doctrinaire Marxism to be its liberal form—and offers a clarity to thorny questions that may have no easy answer. This is why people who believe in “the right side of history” are so sure both of themselves and of the perversity and evil of anyone who disagrees with them.

But “the right side of history” has one problem: it doesn’t exist. Dalrymple:

[H]istory has no sides and evaluates nothing. We often hear of the ‘verdict of history,’ but it is humans, not history, that bring in verdicts.
— Theodore Dalrymple

But history has no sides and evaluates nothing. We often hear of the “verdict of history,” but it is humans, not history, that bring in verdicts, and the verdicts that they bring in often change with time. The plus becomes a minus and then a plus again. As Chou En-Lai famously said in 1972 when asked about the effect of the French Revolution, “It is too early to tell.” It is not merely that moral evaluations change; so do evaluations of what actually happened and the causes of what actually happened. We do not expect a final agreement over the cause or causes of the First World War. That does not mean that no rational discussion of the subject is possible—but finality on it is impossible.

“It is true,” he continues, “that there are trends in history, but they do not reach inexorable logical conclusions.” This is the false promise of Hegel or, further back, the Enlightenment. Outcomes are not moral judgements, and victories of one side over another are not proof of rightness. Dalrymple:

History is not some deus ex machina, or what the philosopher, Gilbert Ryle, called the ghost in the machine; it is not a supra-human force, a kind of supervisory demi-urge acting upon humans as international law is supposed to act upon nations. . . . Are we now to say that authoritarianism is on the right side of history, as recently liberal democracy was only thirty years ago, because so much of the world is ruled by it?

To equate victory with goodness or to view success as superiority—the inescapable but usually unstated Darwinian element in “the right side of history”—is, as CS Lewis put it, to mistake “the goddess History” for “the strumpet Fortune.”

Dalrymple concludes with an important question, one he is unusually reticent in answering:

History might excuse our worst actions, justifying grossly unethical behaviour.
— Theodore Dalrymple

Does it matter if we ascribe right and wrong sides to history? I think it could—I cannot be more categorical than that. On the one hand, it might make us complacent, liable to sit back and wait for History to do our work for us. Perhaps more importantly, History might excuse our worst actions, justifying grossly unethical behaviour as if we were acting as only automaton midwives of a foreordained denouement. But if history is a seamless robe, no denouement is final.

I’m going to be more categorical and say that it certainly matters whether we believe history has sides, and for the latter of the two reasons Dalrymple lays out. History—with a right and wrong side and a capital H—offers a rationalization, a handy excuse. Armed with an ideology and a theory of history’s endpoint and the post-Enlightenment cocksureness that society is malleable enough to submit to scientific control in pursuit of perfection, group after group of idealists has tried to shove, whip, or drag the world forward into the light. And when the world proves intractable, resistant to “the right side of history,” it is easy to treat opponents as enemies, blame them for failure, and eradicate them.

This is true even, and perhaps especially, of groups that start off making pacifist noises and decrying the violence and oppression of the status quo. The Jacobins and the Bolsheviks are only the most obvious examples, though our world in this, the year of our Lord 2023, is full of groups that have granted themselves permission to disrupt and destroy because they are on “the right side of history.” What do your puny laws, customs, and scruples matter in the face of History?

That’s the extreme danger, but a real one as the last few centuries have shown. Yet the first danger Dalrymple describes is even more insidious because it is so common as to become invisible—the smug complacency of the elect.

What kind of grim New Year’s Eve message is this? It’s a denunciation of a false idea, sure, but also a plea to view the change from 2023 to 2024 as no more than that—the change of a date. Year follows year. Time gets away from us. Everything changes without progress, things neither constantly improving nor constantly worsening and with no movement toward a perfect endpoint of anyone’s choosing.

Unless, of course, something from outside history intervenes. History, like war, like gravity, like death, is a bare amoral fact in a fallen world. If it is to have meaning and moral import at all it must come from somewhere other than itself. For those of us who believe in God, this is his providence. He has an endpoint and a goal and a path to get there but, tellingly, though he has revealed his ends he has kept his means, the way there, hidden. Based on what I’ve considered above, this is for our own good. The temptation not only to divine his hand in our preferred outcomes but to seize control of history and improve the world is powerful. We haven’t reached the end of it yet.

Until then, if history has sides at all, they are only the two sides of Janus’s face—looking behind and ahead, observing but never reaching either past or future. The more clearly we see this, the more deliberately we can dispel the luminous intellectual fog of thinking about the movement of History with a capital H, the more we can focus on the things nearest and most present with us. Celebrate the New Year, pray for your children, and get to work on the little patch that belongs to you, uprooting evil in the fields you know. That’s my goal, at least.

Thanks as always for reading. Happy New Year, and best wishes to you for 2024!

More if you’re interested

Dalrymple’s entire essay is worth your while. Read it at Law & Liberty here. The sadistic violence of the ostensibly pacifist French Revolutionaries is fresh on my mind because of David A Bell’s excellent book The First Total War, which I plan to write more about in my reading year-in-review. For CS Lewis on the false idea of “the judgement of history,” see here. And for one of my favorite GK Chesterton lines on progress, see here. For a view of history and progress and the pursuit of human perfectibility that closely aligns with my own, see Edgar Allan Poe here. Let me also end the year with another recommendation of Herbert Butterfield’s classic study The Whig Interpretation of History, the fundamental text in rebuking ideas of progress.

Ciceronian political moderation

I’ve been slowly, slowly reading through John Buchan’s posthumously published memoir Memory Hold-the-Door over the last couple of months. I’m sick for the third or fourth time since October, and while resting yesterday I dived back into Buchan’s book and reached the point in his career when he entered politics, standing as a Conservative candidate for the Commons in 1911. Buchan:

My political experience at the time was nil, and my views were shallow and ill-informed—inclinations rather than principles. I believed profoundly in the possibilities of the Empire as a guardian of world peace, and as a factor in the solution of all our domestic problems, but I no longer accepted imperial federation, and I had little confidence in Mr. Chamberlain’s tariff policy. For socialism I had the distrust that I felt for all absolute creeds, and Marxism, to which I had given some attention, seemed to me to have an insecure speculative basis and to be purblind as a reading of history. On the other hand I wanted the community to use its communal strength when the facts justified it, and I believed in the progressive socialisation of the State, provided the freedom of the personality were assured. I had more sympathy with socialism than with orthodox liberalism, which I thought a barren strife about dogmas that at that time had only an antiquarian interest. But I was a Tory in the sense that I disliked change unless the need for it was amply proved, and that I desired to preserve continuity with the past and keep whatever of the old foundations were sound. As I used to put it in a fisherman's simile, if your back cast is poor your forward cast will be a mess.

There’s much to both agree and quibble with here—not least whether it’s even possible to have “freedom of the personality” under an ever more socialist state, though one has to forgive Buchan for having no idea just how bloated and all-smothering a bureaucracy could become—but the thing about Buchan is I know we could have a good-faith conversation about it. And I agree with most of the rest of it, especially the barrenness of liberalism and the need for continuity.

Buchan seems to have been ill at ease in the world of politics, not only because of his “inclinations” and his lack of striving ambition but because of his broad sympathies, fair-mindedness, and honesty.

I had always felt that it was a citizen’s duty to find some form of public service, but I had no strong parliamentary ambitions. Nor was there any special cause at the moment which I felt impelled to plead. While I believed in party government and in party loyalty, I never attained to the happy partisan zeal of many of my friends, being painfully aware of my own and my party’s defects, and uneasily conscious of the merits of my opponent.

Ditto. This is actual political moderation, not the phony and elusive “centrism” promoted as the cure to our ills.

Buchan then quotes a passage from Macaulay’s History of England that describes the political stance of the 1st Marquess of Halifax, a political attitude that Buchan owned he “was apt to fall into”:

His place was on the debatable ground between the hostile divisions of the community, and he never wandered far beyond the frontier of either. The party to which he at any moment belonged was the party which, at that moment, he liked least, because it was the party of which at that moment he had the nearest view. He was therefore always severe upon his violent associates, and was always in friendly relations with his moderate opponents. Every faction in the day of its insolent and vindictive triumph incurred his censure; and every faction, when vanquished and persecuted, found in him a protector.

This description of his inclinations and positions, and most especially the passage from Macaulay, brought to mind Finley Hooper’s summary of Cicero’s politics, one I’ve often felt describes my own “inclinations” and that I now try consciously to hold myself to. Hooper, in his Roman Realities:

Cicero was a man of the middle class all his life. He opposed the selfish interests of a senatorial oligarchy and the selfish interests of the Populares, who had their way in the Tribal Assembly. When one side appeared to have the upper hand, he leaned toward the other. He was very conscious of a decadent ruling class which insisted on its right to rule regardless of whether it ruled well or not. The demagogues of Clodius’s stripe were even more frightening to him, and most of the time their activities kept him estranged from the people.

Hear, hear. But while both Cicero and Buchan were sensitive to the cultural rot and decadence that manifested itself among the political elite and the wider culture, both would also aver that politics is not the solution. In Cicero’s own words: “Electioneering and the struggle for offices is an altogether wretched practice.”

I’ve been savoring Memory Hold-the-Door, a warmly written and often poignant book, and I look forward to finishing it. And the above is not the only distinctly Ciceronian passage. Buchan, no mean classicist, describes his friend and publisher Tommie Nelson, who was killed in the First World War, this way:

His death made a bigger hole in the life of Scotland than that of any other man of his years. . . . In the case of others we might regret the premature loss to the world of some peculiar talent; with Tommie we mourned especially the loss of a talent for living worthily and helping others to do likewise. It is the kind of loss least easy to forget, and yet one which soon comes to be contemplated without pain, for he had succeeded most fully in life.

This could come straight from Cicero’s De Amicitia (On Friendship), another favorite essay of mine from late in his life. Interesting how a long life and nearness to an unexpected death sharpened the insights of both men.

For more of Cicero on politics, see this election day post from three years ago. For Buchan’s nightmare vision of individual moral rot leading to civilizational decline, see here.

Grace and the Grinch

I’m home with a sick four-year-old today, which means I’m also home with the Paw Patrol. This morning began with “The Pups Save Christmas,” an episode in which Santa crashes in Adventure Bay on Christmas Eve, losing his reindeer and scattering presents over a wide area. It’s up to Ryder and the pups to help Santa or “Christmas will be canceled.” Naturally they pull through.

There’s more to the episode than that, but I was struck for the first time by how many Christmas shows and movies center on a team of good characters helping Santa “save” Christmas. They have to work to make Christmas happen, otherwise there’s a real possibility that it won’t. “There won’t be a Christmas this year” is an oft-repeated foreboding in these stories.

By contrast, think of “How the Grinch Stole Christmas,” a story the daring of which has been lost on us through sheer familiarity. The Grinch, not just a villain but a Satanic figure, does all he can to stop Christmas. He removes all of the Whos’ material means of joy, all the trappings of Christmas that characters in other stories work to save, and Christmas still happens. “It came without ribbons,” he says in outrage that turns to wonder. “It came without tags. It came without packages, boxes, or bags.”

“Paw Patrol” and other “save Christmas” stories show us the logic of magic or paganism—or, for that matter, computer programming, which is more like magic than devotees of either care to admit. Certain conditions have to be met to get the desired result. If presents, then happiness. Mistakes or missing parts will crash the whole system. All of these stories have a lot to say about “Christmas spirit” and “believing,” but this rhetoric is belied by the stories themselves, which always feature a desperate race to help Santa on his way.
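Since the analogy is explicitly to programming, a toy sketch may make the contrast concrete. This is purely my own illustration (nothing from the episode or from Seuss): the conditional, all-or-nothing logic just described, set against the unconditional logic of grace that the Grinch discovers below.

    # A toy illustration (mine, not from the episode) of the two "logics."

    def christmas_by_magic(presents: bool, tree: bool, santa_arrives: bool) -> str:
        # The "save Christmas" plot: every condition must be met, or everything fails.
        if not (presents and tree and santa_arrives):
            raise RuntimeError("Christmas is canceled")  # one missing piece crashes the system
        return "Christmas happens"

    def christmas_by_grace(**circumstances: bool) -> str:
        # The logic of grace: the gift arrives regardless of the inputs.
        return "Christmas happens anyway"

    # No ribbons, no tags, no packages, boxes, or bags, and yet:
    print(christmas_by_grace(presents=False, tree=False, santa_arrives=False))

The first function fails the moment any condition is unmet; the second returns its result no matter what you pass it.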

What “The Grinch” shows us, on the other hand, is the logic of grace. It shows better than any other Christmas entertainment the pure gratuitous gift of Christmas, a gift that comes into the world through the goodness of someone else and that we have no control over. We can reject it, as the Grinch does at first, but we can neither make it happen nor stop it.

The nearest that that episode of “Paw Patrol” can get to grace is to assert that “Christmas is about helping others,” which is still making Christmas happen through your own best efforts. Again, compare the Grinch. Having put a lot of work into stopping Christmas and failed, he is transformed by it. You might even say converted. The grace given to the Whos extends even to him, and he returns the literal gifts that have proven, through grace, immaterial to them. Now that the presents and ornaments and roast beast don’t matter to him either, he has the grace to share them. Material blessing comes from joy and grace rather than the other way around, which is the Grinch’s starting assumption—and that of a lot of other Christmas stories in which mere mortals have to create the conditions for Christmas themselves.

This is the wonderful paradox of Christmas. The promise that Christmas will happen no matter what we do is a purer hope than any moralistic message about spending time together or helping others. Joy comes from grace, and that joy will produce everything else that makes Christmas meaningful—including helping others. We just have to let it transform us.

CS Lewis, 60 and 125 years later

Last week I was too busy critiquing Napoleon to note the 60th anniversary of the death of CS Lewis here—one more thing to hold against Napoleon—though I did manage to slip in an Instagram memorial. Fortunately, today is Lewis’s 125th birthday, so in the spirit of commemoration and appreciation here are a few good things I read from others to mark sixty years since his passing.

CS Lewis (1898-1963)

At her Substack Further Up, Bethel McGrew has an excellent reflection on her own lifelong connection to Lewis and the way the endless quoting of his work risks simplifying him into a generator of therapeutic fortune cookie messages:

Lewis is much-quoted, for good reason: He is prolifically quotable. (There are also a few famously misattributed quotes, like “You are a soul, you have a body,” which no doubt would have annoyed him greatly.) And yet, there’s a paradoxical sense in which his quotability almost risks watering down his true value as a thinker. There’s a temptation to see Lewis as a one-stop “Christian answer man,” the super-Christian who always had the perfect eloquent solution to every Christian’s hard problems. To be sure, he came closer than most Christian writers to providing a sense-making framework for hard problems. But even he wouldn’t claim to have “solved” them. Indeed, his very strength as a writer was that his work swung free of top-down systematic theologies which claim to provide comprehensively satisfying theological answers.

She continues with a particularly poignant example from A Grief Observed. I recommend the whole post.

At Miller’s Book Review, another outstanding Substack, Joel Miller considers Lewis’s humor in the years just before his death, when failing health should have robbed him of his joy:

Sayer says that Lewis “never lost his sense of humor.” Indeed, he was famously good natured, even amid dire circumstances. On July 15, 1963, he suffered a heart attack and slipped into a coma. Friends feared the worst; some came and prayed; a priest gave the sacrament of extreme unction. Amazingly, an hour after the sacrament, Lewis awoke, revived, and asked for a cup of tea.

True to form, he found a joke in it. “I was unexpectedly revived from a long coma,” he wrote Sister Penelope, an Anglican nun with whom he frequently corresponded. “Ought one honor Lazarus rather than Stephen as the protomartyr? To be brought back and have all one’s dying to do again was rather hard.”

Miller also reflects on his own experience of reading and rereading Lewis. Like Miller, I came to Narnia late, well after many of Lewis’s other books, and I have also read and reread Lewis’s work many times. As Miller notes, though Lewis did not expect his work to be remembered, it’s a safe bet that readers like him and myself will continue to find and appreciate Lewis’s work.

At World magazine, Samuel D James has a good short essay on Lewis as a prophet:

Precisely because Lewis knew that the claims of Christianity were all-encompassing, he recognized that no civilization that abandoned it could function. This was not because Lewis desired some kind of baptized Anglo-Saxon ethnonationalist state (born in Belfast, Lewis never forgot the high cost of religious intolerance), but because modern man’s alternatives were quite literally inhumane. Lewis saw from afar, with striking prescience, that humans had no choice but to retreat from personhood if they wanted to escape the implications of Christian revelation.

At The Critic, Rhys Laverty elaborates more deeply on the same theme:

At the close of the Second World War, Lewis was one of a number of Christian intellectuals (alongside Jacques Maritain, Simone Weil, W.H. Auden, and T.S. Eliot) who had begun to consider what world the Allied powers would now make for themselves. Lewis saw a future in which the rejection of transcendent values would allow a technologised elite to re-make nature as they saw fit, ultimately overthrowing human nature itself — a process made possible through the ideological capture of education.

Laverty invokes not only The Abolition of Man, as James does, but Lewis’s dramatization of those ideas in the final novel of The Space Trilogy, That Hideous Strength, in which the elite of the National Institute of Coordinated Experiments (NICE) pursue genuinely diabolical technological progress and control:

With N.I.C.E, Lewis anticipated our contemporary technocracy. “Progress” is our unquestionable sacred cow, and its faithful handmaiden is technology. Whether we are tearing up areas of ancient natural beauty in order to build infrastructure supposedly intended to help protect the environment, prescribing new cross-sex hormones and surgery to enable greater self-realisation, or developing artificial wombs which we unconvincingly insist will only ever be used for the care of premature infants, there is now no technological innovation that we will deny ourselves today if it supposedly contributes to the nebulous “future good of humanity”.

It is only Green Book education which makes N.I.C.E possible. If truth, goodness, beauty, and so on are merely relative then there is nothing to rein in man’s “conquest of nature”. His scruples are mere hang-ups to be educated out. He will be driven by pure reason or pure appetite, with no sentiment to regulate their respective metrics of efficiency or pleasure. 

I commend all four of these essays to y’all. They’re good celebrations of a worthy life and a worthy mind, and have gotten me wanting to reread pretty much all of my Lewis shelf. Which might take a while.

Let me conclude with a brief personal reflection of my own. Growing up in the environment I did, I don’t remember ever not knowing about Lewis. He was a byword for intelligent Christian thought, something that stood out to me among the generally anti-intellectual atmosphere of fundamentalism. My earliest accidental exposure was probably the BBC Narnia films. I recall catching a long stretch of The Silver Chair on PBS at my grandparents’ house one morning. As dated as those adaptations are now, it scared me. But it also riveted me, and stayed with me. Indeed, The Silver Chair may still be my favorite of the Narnia books.

But it was a long time before I actually read anything by CS Lewis. My parents got me a set of his non-fiction books at our church bookstore when I was in high school. I started The Great Divorce one night and something about the Grey Town and the bus ride into the unknown disturbed me so much that I put it away. That nightmare quality again. But when I tried the book one sleepy Sunday afternoon in college—my way prepared by Dante, whom I discovered my senior year of high school—I read the entire thing in one sitting. It’s still among my favorite Lewis books.

From there it was on to The Screwtape Letters and Mere Christianity and The Four Loves. I read The Lion, the Witch, and the Wardrobe and liked it but returned to the non-fiction, devouring Lewis’s essays on any topic. After college I read The Space Trilogy—all three in one week, if I remember correctly—and delved into his scholarly work: An Experiment in Criticism and, crucially, The Discarded Image. I also read as much about Lewis as I read by him, and dug into the works that Lewis loved only to discover new loves of my own, most notably GK Chesterton.

Only with the birth of my children did I seriously return to Narnia, and now I genuinely love those books. My kids do, too. They’ll be yet another generation entertained and blessed by Lewis’s work.

He is one of the few authors who has grown with me for so long—guiding me, enlightening me, introducing me to great literature, telling me entertaining and meaningful stories of his own, and deepening both my understanding and my faith. Just as the fictional Lewis of The Great Divorce meets George MacDonald as his heavenly guide, the Virgil to his Dante, so Lewis could well play that role for me.

On this, his 125th birthday, just over a week from the 60th anniversary of his death, I am more grateful than ever for CS Lewis. RIP.

The fog of war is no excuse

Speaking of John Keegan, here’s a passage from the chapter on Waterloo from The Face of Battle that I’d like to enlarge upon. Regarding the way the Battle of Waterloo is traditionally described as unfolding—in five “phases” of engagement—Keegan writes:

It is probably otiose to point out that the ‘five phases’ of the battle were not perceived at the time by any of the combatants, not even, despite their points of vantage and powers of direct intervention in events, by Wellington and Napoleon. The ‘five phases’ are, of course, a narrative convenience.

A narrative convenience, he might have added, laboriously gathered and constructed after the fact and over many years. He goes on to describe “how very partial indeed was the view of most of” the participants, beginning with distraction and proceeding to visibility:

There were other causes, besides the preoccupation of duty, which deprived men of a coherent or extended view of what was going on around them. Many regiments spent much of their time lying down, usually on the reverse slope of the position, which itself obscured sight of the action elsewhere. . . . A few feet of elevation, therefore, made the difference between a bird’s-eye and a worm’s-eye view . . . But even on the crest of a position, physical obstacles could limit the soldier’s horizon very sharply. In many places, at least at the beginning of the battle, the crops of wheat and rye stood tall enough for the enemy to approach to within close musket shot undetected. . . . [T]he men in the rear or interior of dense columnar formations, of the type adopted by the Guard in their advance, would have glimpsed little of the battle but hats, necks and backs, and those at a distance of a few inches, even when their comrades at the front were exchanging fire with the enemy. And almost everyone, however well-positioned otherwise for a view, would for shorter or longer periods have been lapped or enveloped by dense clouds of gunpowder smoke.

And those are just problems affecting vision. The other senses have equally severe limitations and are just as susceptible to illusion. Look up acoustic shadow sometime. Keegan: “To have asked a survivor . . . what he remembered of the battle, therefore, would probably not have been to learn very much.”

Now compound these limitations and frequent misperceptions and misunderstandings by passing them through reporters. But at least reporters are impartial, right?

Visit the New York Times’s complete online digital archive—or the archive of any old newspaper—and look up the earliest possible reporting on a conflict you know a lot about. You’ll be amazed at how much is simply wrong. And that’s not even allowing for spin, for bias, for lies, for manifold other motivated errors.

What we know about battles and wars and other conflicts we know because of that laborious process I mentioned above, of gathering, compiling, organizing, and collating sources and information, and then study and study and more study, not to mention walking the ground. There are things happening now that we will never—none of us in our own lifetimes—have the perspective, much less the information, to understand completely. Even then, there will still be unanswered questions, or questions answered after years, even centuries of uncertainty.

So my rule of thumb: Assume that everything you hear or read about a current conflict is wrong, incomplete, made up, or the precise opposite of the truth. And wait. And don’t get emotionally invested in what’s happening, especially if your sense of moral worth depends upon viewing yourself as on The Right Side and raging against a barbarous enemy.

War is tragic, and people will suffer. That’s guaranteed. But there is no reason to compound those facts with ignorant and impotent rage.

If you slow down, you won’t beclown yourself the way certain institutions have over the past week. Many of these have now, suddenly, discovered the concept of “fog of war,” which has been dusted off to provide a sage reminder to readers instead of a mea culpa. Look here and here for samples, and here for well-earned mockery.

Per Alan Jacobs, who wrote excellently and succinctly on this topic over the weekend:

The more unstable a situation is, the more rapidly it changes, the less valuable minute-by-minute reporting is. I don’t know what happened to the hospital in Gaza, but if I wait until the next issue of the Economist shows up I will be better informed about it than people who have been rage-refreshing their browser windows for the past several days, and I will have suffered considerably less emotional stress. . . .

“We have a responsibility to be informed!” people shout. Well, maybe . . . But let me waive the point, and say: If you’re reading the news several times a day, you’re not being informed, you’re being stimulated.

To the New York Times’s credit, it has offered an editorial apology, but, as Jeff Winger once put it, “Be sorry about this stuff before you do it, and then don’t do it!”

I’ll end with a reflection from CS Lewis, in a passage from his World War II radio talks eventually incorporated into Mere Christianity, a passage that was going the rounds late last week:

Suppose one reads a story of filthy atrocities in the paper. Then suppose that something turns up suggesting that the story might not be quite true, or not quite so bad as it was made out. Is one's first feeling, ‘Thank God, even they aren't quite so bad as that,’ or is it a feeling of disappointment, and even a determination to cling to the first story for the sheer pleasure of thinking your enemies are as bad as possible? If it is the second then it is, I am afraid, the first step in a process which, if followed to the end, will make us into devils. You see, one is beginning to wish that black was a little blacker. If we give that wish its head, later on we shall wish to see grey as black, and then to see white itself as black. Finally we shall insist on seeing everything . . . as bad, and not be able to stop doing it: we shall be fixed for ever in a universe of pure hatred.

Let the reader understand.

We already have something approaching Screwtape’s universe of pure noise. Can we still turn back from a universe of pure hatred?