Scruton on what children can teach us about art

From the late Sir Roger Scruton’s documentary “Why Beauty Matters”:

 
Art needs creativity, and creativity is about sharing. It is a call to others to see the world as the artist sees it. That is why we find beauty in the naïve art of children. Children are not giving us ideas in the place of creative images, nor are they wallowing in ugliness. They are trying to affirm the world as they see it and to share what they feel. Something of the child’s pure delight in creation survives in every true work of art.
— Sir Roger Scruton
 

Scruton makes this aside as a point of contrast with modern art—which is intentionally insular, confrontational, transgressive, and over-intellectual if not ideological—but in doing so he makes a broader point about what art is and what it’s for. This description of children’s art is also honestly and accurately observed.

I’ve thought of this passage many times over the last few weeks, ever since my eldest son eagerly presented me with a picture he had drawn. It was a pencil and highlighter drawing that showed me holding my youngest son at the dinner table—a picture of his dad and one of his little brothers. It was drawn from life without my noticing, and the joy he took both in drawing it and in giving it to me, the care he lavished on the details, including the stubble of my beard, and the simple, straightforward, honest love in the picture itself have stuck with me. My kids have drawn many things for me, but this one in particular struck me as a clear example of Scruton’s “pure delight” in “sharing.”

Last week I tacked it to the wall of my office at school. May any art I create be motivated as purely as my son’s.

“Why Beauty Matters” is worth your while, as I wrote here almost four years ago following Scruton’s death. You can watch the whole thing on Vimeo here.

Wildcat trailer reaction

Maya Hawke as Flannery O’Connor in Wildcat

As I noted in my 2023 movie year-in-review, Wildcat is one of the films I’ve been looking forward to this year. Though it was completed and premiered at a film festival last year, I hadn’t heard any news about its distribution or release until yesterday, when a great trailer appeared on YouTube.

Wildcat takes place over a short stretch of the early 1950s, when young writer Flannery O’Connor (Maya Hawke) moves back home to Milledgeville, Georgia, and is diagnosed with lupus, the same disease that had killed her father when she was sixteen. While struggling with her illness and its severe effects she tries to sell her first novel, a searing Southern gothic religious fable called Wise Blood. Like her short stories, it’s deeply Catholic and Southern and poignant in the sense of sharp, cutting. It’s a hard sell.

It’s unclear from the trailer precisely how much of O’Connor’s life Wildcat covers, but there are scenes suggesting her time among the literary elite in the northeast in the late 1940s, after she had graduated from the Iowa Writers’ Workshop and when she was laboring over Wise Blood. The trailer suggests a strong contrast between the world O’Connor leaves behind and the clay-banked roads and nosy church ladies back home in Georgia—a contrast O’Connor was certainly aware of and wrote about.

Perhaps the most intriguing things in the trailer are the scenes from several of her short stories—“Parker’s Back,” “The Life You Save May Be Your Own,” “Revelation,” and especially “Good Country People”—in which O’Connor and her mother Regina (Laura Linney) play major characters like the cynical Hulga or the self-righteous Mrs Turpin. Catching even short glimpses of scenes I’ve imagined many times—a crowded doctor’s office waiting room, a Bible salesman running across a field carrying a prosthetic leg—got me excited in a way I haven’t felt for a movie in a while. Apparently these are intricately intertwined with the events of O’Connor’s real life. I’m curious to see how this works, especially since it’s so easy for a film about a writer to slip into the biographical fallacy (or what CS Lewis called The Personal Heresy): the idea that everything a writer writes is based on his or her actual experiences.

But I’m most pleased to see that Wildcat takes O’Connor’s Christianity seriously. Apparently Ethan Hawke, who directed and co-wrote the film, was inspired to make it when he read the Prayer Journal that O’Connor kept as a writing student in Iowa. O’Connor, in addition to being a brilliant writer, was prickly and hard-edged, carried a chip on her shoulder as an outsider in the postwar literary world, and was fervently orthodox and devout. Her faith suffuses her work not incidentally but by design. Wildcat’s trailer manages to evoke all of this. Here’s hoping that the full film delivers.

A few other notes:

  • The Southern accents sound pretty good. O’Connor was originally from Savannah and, though recordings of her remind me a lot of my paternal grandmother, an Athens native, her speech has some peculiarities that must be down to her roots, Savannah having some distinctive dialect features even by Southern standards. Listen to her read “A Good Man Is Hard to Find” sometime.

  • I’m interested to see how the film explores what some people perceive as O’Connor’s cruelty (“Sometimes I feel like you’re trying to stick pins in your readers,” her editor says in the trailer). The question of just how unpleasant a writer can or should make the reader feel in order to make a point has concerned me for a long time.

  • Maya Hawke looks a lot more like O’Connor than I would have guessed was possible based on what I’ve seen of her in “Stranger Things” and Once Upon a Time in Hollywood. Kudos to her and the film’s hair and makeup folks.

  • I like the cinematography a lot. It’s clearly digital but has some creative composition choices and lens work—e.g. the way the focus and bokeh fall off at the edges of the frame, which reminds me of The Batman. A distinctive look will help support O’Connor’s story and give it the otherworldly feel it will probably need.

  • Wildcat was apparently shot mostly in Kentucky rather than Georgia. From what I can see in the trailer it looks like a good stand-in, though it’s funny to me that, with so many movies shooting in Georgia as a substitute for more expensive locales, such a Georgia-centric story wound up being shot elsewhere.

It’s striking, having watched the trailer several times now, how present O’Connor’s crutches are. The final “coming soon” shot of O’Connor at the family mailbox, which has been one of the only images available for a while, has them plainly visible, but I had never noticed them. And there they are behind her as she types up a manuscript or struggles even to walk around the house. Some early film festival reviews I’ve read suggest that Wildcat is not just a story about a writer publishing a novel but a meditation on suffering, the threat of death, and God’s grace. I’m here for it.

Wildcat is currently scheduled for a big-city release on May 3, with wider availability to follow, though I haven’t been able to find any details about that yet. Hopefully, sometime in late spring or early summer, we can catch Flannery O’Connor in theatres.

Scruton on style

Last week I revisited the late Sir Roger Scruton’s Beauty: A Very Short Introduction via audiobook on my commute. It’s an excellent précis of much that is fundamental to his thinking and, true to the subtitle, a wide-ranging introduction to many topics that bear further thought. Here’s one.

From a discussion, in Chapter 4, “Everyday Beauty,” of the role proportion plays in the creation of vernacular architectures by launching the builder on “a path of discovery” to what “fits” and is “suitable” for each detail in relation to the others:

One result of this process of matching is a visual vocabulary: by using identical mouldings in door and window, for example, the visual match becomes easier to recognize and to accept. Another result is what is loosely described as style—the repeated use of shapes, contours, materials and so on, their adaptation to special uses, and the search for a repertoire of visual gestures.

I like the idea of a style as mastery of a discipline’s “repertoire,” the selective, purposeful use of a shared vocabulary. Scruton’s example is architectural, but he also refers throughout the book to painting, sculpture, cinema, and most especially music. My mind naturally turned to literary style, with its literal shared vocabulary and the many effects and fine shades of meaning that a firm control of English can yield.

Scruton himself raises the idea of control as a component of style in the next chapter, “Artistic Beauty”:

True artists control their subject-matter, in order that our response to it should be their doing, not ours. One way of exerting this control is through style . . . Style is not exhibited only by art: indeed, as I argued in the last chapter, it is natural to us, part of the aesthetics of everyday life, through which we arrange our environment and place it in significant relation to ourselves. Flair in dressing, for example, which is not the same as an insistent originality, consists rather in the ability to turn a shared repertoire in a personal direction, so that a single character is revealed in each of them. That is what we mean by style, and by the ‘stylishness’ that comes about when style over-reaches itself and becomes the dominant factor in a person’s dress.

The tension between originality and a common vocabulary, and the need to balance them, is an important topic, one Scruton returns to later in the book. But he continues by introducing another consideration:

Styles can resemble each other, and contain large overlapping idioms—like the styles of Haydn and Mozart or Coleridge and Wordsworth. Or they might be unique, like the style of Van Gogh, so that anyone who shares the repertoire is seen as a mere copier or pasticheur, and not as an artist with a style of his own. Our tendency to think in this way has something to do with our sense of human integrity: the unique style is one that has identified a unique human being, whose personality is entirely objectified in his work.

This passage in particular offers a lot for the writer to think about. Every writer has heroes and idols and role models, other writers whose control over their work has influenced his or her own technique, consciously or not. This starts young. It’s been more than twenty years since I read Stephen King’s On Writing, but I still remember and think often about this passage:

You may find yourself adopting a style you find particularly exciting, and there’s nothing wrong with that. When I read Ray Bradbury as a kid, I wrote like Ray Bradbury—everything green and wondrous and seen through a lens smeared with the grease of nostalgia. When I read James M Cain, everything I wrote came out clipped and stripped and hard-boiled. When I read Lovecraft, my prose became luxurious and Byzantine.

All of which is, for King, a crucial developmental stage in the writer’s life, one that should be refined through constant reading and writing, so that eventually one is no longer writing in imitation but in “one’s own style.”

But if you’re aware of what you’re doing and working hard at it, particularly in order to achieve a certain specific effect—so that, per Scruton, the reader’s response will be your doing, not theirs—it’s hard not to become anxious that you’re working merely in pastiche or even accidental parody. Have I sacrificed my integrity to sound like someone else? Inconsistency doesn’t help. I’ve worried more about this on some projects than others. Why am I confident that I can use tricks learned from Charles Portis but not those from Cormac McCarthy? Food for thought.

I think, naturally, of John Gardner and his description of “mannered” prose, a term he’d certainly have applied to McCarthy. “Mannered” suggests artificiality or phoniness, the lack of integrity Scruton suggests above, which is how every good writer hopes not to come across. But I also think of Elmore Leonard, another author whom I’ve quoted here many times, and who worked hard to make his style the absence of style. Scruton contends that that is impossible:

Style must be perceivable: there is no such thing as hidden style. It shows itself, even if it does so in artful ways that conceal the effort and sophistication . . . At the same time, it becomes perceivable by virtue of our comparative perceptions: it involves a standing out from norms that must also be subliminally present in our perception if the stylistic idioms and departures are to be noticed. Style enables artists to allude to things that they do not state, to summon comparisons that they do not explicitly make, to place their work and its subject-matter in a context which makes every gesture significant, and so achieve the kind of concentration of meaning that we witness in Britten’s Cello Symphony or Eliot's Four Quartets.

This is exactly right, and Leonard would agree. Leonard’s style, which was precisely designed to “conceal the effort and sophistication” of his writing and make it seem effortless, was immediately recognizable because it was distinct from the “norms” described above in particular ways—something Leonard himself noted. Those “norms,” that context, are the broader shared vocabulary we began with, which gives shape to one’s work through contrast.

And that final sentence on what a firm, controlled, purposeful, precise style can do, using the power of allusion, implicit comparison, and the subtle significance of every detail to “achieve . . . concentration of meaning”—is there a writer who wouldn’t die happy having that said of his work?

Melancholy in the outfield

A few weeks ago I revisited a childhood favorite with my own kids. Angels in the Outfield came out when I was ten years old and an enthusiastic baseball fan. I must have watched it fifty or sixty times over the next few years, before I aged out of it and the real-life drama of the mid-90s Braves gently edged it out of my imagination.

What I remembered most about Angels in the Outfield was the comedy, the slapstick baseball action, the standard sports movie joys of becoming a team and winning the big game, and the music. (I noticed, though very young, that composer Randy Edelman’s score had a lot of cues suspiciously similar to his work on the previous year’s Gettysburg, one of my favorite soundtracks.) What I was not prepared for upon rewatching it as an adult was just how firmly the plot’s foundation was built upon pain, sorrow, and longing.

Roger, the main character, lives in foster care because his mom has died and his dad is a negligent, uncommunicative deadbeat. When the film starts his father has already signed over his rights to his son and has shown up just long enough to tell Roger so, a job he performs badly. Is that guilt we see in his eyes, or just awkwardness in performing the unwanted duty of talking to his child? When an oblivious Roger asks when they can “be a family again,” his dad replies with a “when pigs fly” scenario that Roger takes literally. And Roger’s younger friend JP seems bright and happy all the time but collapses into grief when another boy is moved out of the foster home, an emotional response the movie suggests is always ready just below the surface. This is clearly a child struggling with abandonment.

But the vein of sadness runs through the adults, too. California Angels manager George Knox seethes with grievance, not only having had his career cut short when a dirty player slid into him cleats-first, but also becoming a manager only to be saddled with the worst team in the league. The man who injured him, Ranch Wilder, is now the Angels’ radio announcer and loathes the team as well as Knox. His entire demeanor suggests he resents being kept down when he is meant for greater things. And Mel Clark, a former star pitcher who developed a pain pill addiction under Knox’s managership at Cincinnati and who has the film’s clearest redemption arc, is revealed at the end to be only six months away from death. He has lung cancer and doesn’t even know it yet. And so even the longed-for victory in the playoffs is tinged with loss.

I’m not going to pretend that Angels in the Outfield is a great movie or serious drama; it’s simply well and honestly crafted and it treats all of these scenarios seriously. None of it feels forced, none of it is used merely to jerk tears, and none of it is tidily and painlessly resolved. In fact, most of the characters don’t actually get the specific thing they want at the beginning of the film.

This brought to mind two things I had reflected on long ago. The first is an essay from Film School Rejects called “The Melancholy of Don Bluth,” an excellent read on animated films like The Land Before Time, All Dogs Go to Heaven, or An American Tail—all three of which were in constant rotation in the Poss household when I was growing up. Bluth’s movies have a reputation for going to dark places Disney typically balks at, to the point that they’re sometimes the subject of internet memes about “trauma.” Please.

The artistic upshot of Bluth’s willingness to include death and—perhaps more importantly—mourning in his films is a truth and richness often missing from comparable animated films:

Thematically, there is an ever-present air of death about Bluth’s work that is profoundly sad. Bones litter certain set-pieces; illness and age are veritable threats (shout out to Nicodemus’ gnarly skeleton hands); and characters can and do bleed. Critically, Bluth films don’t gloss over grief, they sit with it. From Littlefoot’s straight up depression following the on-screen death of his mom, to Mrs. Brisby’s soft sorrow at finding out the details of her husband’s death. There is a space for mourning in Bluth’s stories that feels extra-narrative, and unpretentious. Critically, this is distinct from, say, wallowing. Bluth’s films have a ridiculously productive attitude towards mourning, most lucidly articulated through Land Before Time’s moral mouthpiece Rooter: “you’ll always miss her, but she’ll always be with you as long as you remember the things she taught you.” Disney meanwhile, tends to treat death as a narrative flourish, or worse, a footnote. And in comparison, even notable exceptions like Bambi and The Lion King seem immaturely timid to let palpable grief linger for longer than a scene, let alone throughout a film’s runtime.

The other thing that came to mind was a podcast conversation on The Sectarian Review concerning Hallmark Christmas movies. At some point during the conversation I drew a comparison between Hallmark romantic comedies and older romcoms by pointing out that films like You’ve Got Mail, as fun and bubbly and appealing as they are, also have a vein of genuine pain running through them. Kathleen Kelly takes her mom’s little bookshop up against the big chain store and loses, an event the film doesn’t gloss over and doesn’t paint as some kind of moral victory. Who doesn’t feel the pang of her loss as she closes up shop for the final time and walks away into the night, her mom’s shop doorbell jingling in her hand?

Only Pixar, in older movies like Up and Toy Story 2 and Inside Out, has recently attempted to include such real pain in its stories. By comparison, most of the recent crowd-pleasing PG-13 action fare or animated kids’ movies in theatres or the mass-produced dramas of the Hallmark Channel are pure saccharine—thin, fake, and probably carcinogenic.

I have no firm conclusions to draw on this topic except to note that, for whatever reason, even in our simplest and cheapest stories we’ve lost something important. And if you feel that loss and hope for catharsis, one of the oldest reasons there is for watching a drama, you’ll have to go to older films for it.

Sturgeon Wars

Last week some of the staff writers at National Review, of all places, had an amusing exchange of views on the current state of Star Wars. It began when one wrote of being “Star Wars-ed out.” Another seconded that feeling and drew an analogy with the Marvel movies: both are series that have decreased in quality as the suits behind them have produced more and more “content.” Yet another followed up by specifically critiquing the trilogy produced by Disney while rightly reserving some small praise for Rogue One.

But the best and most incisive perspective came from Jeffrey Blehar, who, with aggressive indifference toward everything since Return of the Jedi forty years ago, mildly suggested that not much of Star Wars is any good. Dissect and fuss over the prequel trilogy, the sequel trilogy, the Disney+ shows, and the cartoon shows and novels and comics and video games however you want: none of it is as good as the original trilogy, and most of it is terrible. In fact, the best thing to come of Star Wars since 1983 is Mr Plinkett.

I mostly agree (and wholeheartedly agree about Mr Plinkett), and that’s because I’m a big believer in Sturgeon’s Law. In its simplest formulation, Sturgeon’s Law states:

 
90% of everything is crap.

For several years now I’ve been saying that Sturgeon’s Law applies just as much to Star Wars as to anything else; it’s just that Star Wars got its 10% of quality out of the way first. What they’ve been producing ever since is, well…

I have ideas about why this is, including but by no means limited to Disney’s desperately overvalued purchase of the rights to the series and—probably more importantly—its merchandising, executive mismanagement, ideological capture of the filmmakers, oversaturation (speaking of Marvel), and of course simple artistic failure. But there are three more fundamental problems that I’ve seen with Star Wars over the last couple decades.

One is that everyone forgot that Star Wars was lightning in a bottle. The original film didn’t emerge fully formed from George Lucas’s head like a nerd Athena; it was the product of a difficult production, a demanding shoot, and a host of other limitations. The many points of friction in the production required genuine creativity to solve, not least from a brilliant editor and one or two real creative geniuses like Ben Burtt and John Williams. But the very success of Star Wars meant that the circumstances that shaped the originals have not recurred. Everything since has been greased by money, money, money, and the synthetic smoothness of the prequel and sequel trilogies allowed bad or incomplete or incoherent story ideas to slide straight through into the finished films.

Second and relatedly, with one or two exceptions the fans and producers of Star Wars drifted into a category error regarding what kind of stories these are. Star Wars since Return of the Jedi has been treated like fantasy set in space. Mr Plinkett, among many others, has noted the ridiculous and gratuitous multiplication of planets, species, vehicles, and everything else since The Phantom Menace. But Star Wars wasn’t originally fantasy—it was a Boomer pastiche of westerns, Kurosawa samurai films, World War II movies, Flash Gordon serials, and a film school dweeb’s skimming of Joseph Campbell. As Star Wars quickly became the cultural remit of younger generations and more and more Star Wars “content” was churned out, those referents were lost to all except the buffs and nerds. The galaxy far, far away came to be treated as an infinitely expandable object of “world-building” when it is and always was an assemblage of spare parts.

I don’t mean that dismissively. Being made of spare parts is not necessarily a bad thing. The originals are greater than the sum of their parts, and it’s worth pointing out that the handful of new Star Wars projects that tried to tap directly into some of what inspired Lucas—war movies about ill-fated missions in Rogue One, westerns in the first season of “The Mandalorian”—were good. Eventually ruined by committee-think, but good.

The final problem, which brings us back around to Sturgeon’s Law, is that the fans allowed it, even demanded it. Having had that 10%, they gobbled up that 90% we’ve been getting since and kept wanting more. I know plenty of people have complained about the storytelling, the filmmaking, the behind-the-scenes drama, the ideological drift of the Disney films, and everything else, but for every Mr Plinkett or Critical Drinker on YouTube there are a thousand people who are satisfied with anything as long as it has the Star Wars logo on it. From archetypal storytelling to lifestyle brand—that’s the real Skywalker saga.

This is by no means unique to Star Wars fans, as some trends among purported Tolkien fans have made clear in the last couple years. But if people want to enjoy their favorite things again they need to regain their suspicion of corporations as well as remember the difference between quantity and quality.

Poetry of reinforcement

From Tom Shippey’s preface to his new translation of Beowulf, in which he notes some of the strange poetic artifacts of the poem’s alliterative form and explores their deeper implications—both for the poem’s original audience and for us:

King David as Anglo-Saxon bard in the Vespasian Psalter

One may sum up by saying that, rather oddly, the words in the poem which receive the greatest sonic emphasis are sometimes the ones which carry the least information. They are there to help the poet with the first of his major aims: which is, one might say, to maintain the beat and the meter of his poetic lines.

This seems a rather humble aim to us, for our idea of poetry is that its wording should be exact, unexpected, provocative—to paraphrase the Savage in Aldous Huxley’s Brave New World, who has just been introduced to Shakespeare—words which make you feel like you'd sat on a pin. But we emphasize novelty, originality, surprise: and accordingly we fail to feel the power of reinforcement, familiarity, recognition. And it is this which satisfied the poet's second major aim: to express the ethos of a social group.

The modern vision of the poet as an outsider speaking truth to power and challenging norms is not only historically recent but a sadly narrow and limiting vision of what poetry does. It requires a posture of continuous antagonism toward everything, a posture that grows both tedious and phony. The stereotype of the tiresome and hypocritical modernist poet and his or her predictably transgressive free verse exists for a reason.

But worse, this vision of poetry and the poet warps the interpretation of the great poetry of the past. People go galloping off in search of the hidden subversion in Homer or Beowulf and, having searched long enough and screwed their jeweler’s loupe of critical theory tightly enough into their eye, find it. Turns out these poets were just like the longhairs at the campus poetry slam. But, satisfied with presentist political interpretations, they miss what’s actually going on—and the chance to encounter people radically unlike themselves.

Good poetry can challenge, certainly. But I’d argue that the most effective and lasting prophetic verse challenges from within a culture—thus the entire power of the Old Testament prophets—rather than from some self-congratulatory political margin. But just as often, if not more so, good poetry reminds its audience of who they are. Remember, it says, This is us. This is what we love. This is what we must protect. And, with striking frequency, This is what we have lost. Consider the worlds in which the Iliad and Beowulf were composed and the poetry of reinforcement and shared love and loss makes much more sense.

Recovering the ability to “feel the power of reinforcement, familiarity, recognition” may prove a crucial part of the modern man’s great spiritual task.

Shippey has an online Beowulf “masterclass” coming up at the beginning of December. I’ve already signed up. It should be well worth your while if you’re interested in this period and its poetry. You can find information about the class here.

That’s not how any of this works

Director Ridley Scott talks with Dan Snow about Scott’s forthcoming film Napoleon

Yesterday History Hit released a 16-minute talk with Ridley Scott covering some aspects of his epic drama Napoleon, which comes out in three weeks. The interview is mostly interesting even if host Dan Snow doesn’t dig very deep, but Scott got strangely testy when Snow—over a clip of cannonballs smashing up the ice of a frozen pond beneath the feet of retreating Russian infantry at Austerlitz—raised the question of historical accuracy:

Snow: What about historical accuracy? When a historian says, “Uh, sorry, Sir Ridley, it didn’t quite happen like that,” you say, “Listen, I’ve done enough with you.” You have to have artistic license, right?

Scott: You know, I would say, “How would you know? Were you there?”

Snow: [laughs]

Scott: They go, “Oh, no, right.” I say, “Exactly.” So I said, You know, Napoleon [?] had four-hundred books written about him. So it means, maybe the first was the most accurate. The next one is already doing a version of the writer. By the time you get to 399, guess what—a lot of speculation.

Oof. That’s not how this works. That’s not how any of this works.

Historians don’t know things because they were there, they know things because they study. It’s work. They’ve read and researched and compared notes and argued and walked the ground. Scott’s rejoinder is surprisingly childish for such a sharp and accomplished man.

Further, his breezy explanation of how history works as a discipline and a profession is simply bizarre. The implication of what he says about how books cover a subject over time is that historical facts are established at the beginning, and the rest is just eggheads batting ever more intricate theoretical interpretations back and forth.

The truth is that, as I’ve had cause to reflect here recently, the first accounts of an event are fragmentary or partial even if they’re accurate. It takes diligent study, the perspective of time, the synthesis of all available sources, and a good bit of luck to piece together a big-picture account of what actually happened. And with big, heavily-documented subjects—like, say, a French emperor—new material is being discovered all the time. There is no substitute for a primary source or eyewitness account, but if you want accuracy qua accuracy, you will absolutely want a secondary source, a book written later.

I’m all for allowing responsible artistic license—I’m always interested to hear filmmakers explain how and why they choose to change what they change—but Scott doesn’t stop at artistic license. His arrogant dismissiveness toward truth in historical storytelling is breathtaking. Maybe he picked up more from Napoleon than he realizes.

To be fair, Scott was speaking off the cuff, and is 85 years old. I’m not even absolutely certain he said “Napoleon” when he cited the figure of 400 books because he was mumbling. (The real figure, if he was talking about Napoleon, is tens of thousands, more than 300,000 by one old estimate.) But given his track record with using history for his own purposes—I stand by my thoughts on Kingdom of Heaven from the early days of this blog—and the forcefulness with which he said this, I have to assume he means it. I can’t say I’m surprised.

At any rate, I’m cautiously optimistic about Napoleon, but I’m not hoping for much more than interesting performances and exciting spectacle.

Literary cameos

Yesterday Alan Jacobs posted a longish recommendation of Francis Spufford’s latest novel, an alternate history detective noir titled Cahokia Jazz. I’m intrigued. But I especially enjoyed this minor note from the end of Jacobs’s post:

At one point, late in the story, our hero is at Cahokia’s railway station and happens to see a family, “pale, shabby-grand, and relocating with their life’s possessions”—including, curiously enough, butterfly nets: “white Russians on their way to Kodiak, by the look of it.” One of them, “a lanky twenty-something in flannels and tennis shoes,” is called by his family Vovka, and he briefly assists our hero. Then off they go, leaving our story as abruptly as they had arrived in it. Assuming that they made their way to Kodiak—or, more formally, as our map tells us, NOVAYA SIBIRSKAYA TERRITORII—it is unlikely that their world ever knew Lolita or Pale Fire.

This is “one of several delightful cameos” in the novel, and Jacobs’s recommendation and praise got me thinking about such cameos in fiction.

I haven’t read Cahokia Jazz yet, though I intend to, but I’m willing to take Jacobs at his word that Spufford does this well. The example he cites certainly sounds subtle enough to work. But done poorly, such cameos awkwardly shoehorn a well-known figure into the story and call unnecessary attention to themselves. Think Forrest Gump in novel form. They can also, if used to denigrate the characters in the story, turn into the kind of wink-wink presentist authorial irony that I deplore.

I think the best version of the literary cameo functions much like a good film cameo—if you spot the cameo and know who it is, it’s a nice bonus, but if you don’t it doesn’t intrude enough to distract. And, ideally, it will work with and add to the story and characterization of the main characters.

A good and especially subtle example comes from Declare, which I’m almost finished reading. Early in the novel we read of protagonist Andrew Hale’s background, specifically where he was in the early stages of World War II before embarking on his first espionage assignments in occupied France:

In November he successfully sat for an exhibition scholarship to Magdalen College, Oxford, and in the spring of 1941 he went up to that college to read English literature.

His allowance from Drummond’s Bank in Admiralty Arch was not big enough for him to do any of the high living for which Oxford was legendary, but wartime rationing appeared to have cut down on that kind of thing in any case—even cigarettes and beer were too costly for most of the students in Hale’s college, and it was fortunate that the one-way lanes of Oxford were too narrow for comfortable driving and parking, since bicycles were the only vehicles most students could afford to maintain. His time was spent mostly in the Bodleian Library researching Spenser and Malory, and defending his resultant essays in weekly sessions with his merciless tutor.

A Magdalen College tutor ruthlessly grilling a student over Spenser and Malory? That can only be CS Lewis.

They’re not precisely cameos, but I have worked a few real-life figures into my novels in greater or lesser supporting roles: David Howarth in Dark Full of Enemies, Gustavus W Smith and Pleasant Philips in Griswoldville. I’ve aimed a little lower in the name of realism, I suppose. But the precise dividing line between a cameo of the kind described here and a real person playing a serious role in a story is something I’ll have to figure out.

At any rate, a well-executed literary cameo is a joy. Curious to see who else might surprise us in the pages of Cahokia Jazz.

Tim Powers on the danger of chasing trends

Over the last few weeks I’ve been (very, very slowly) reading Declare, a supernatural Cold War espionage thriller by Tim Powers. I reached the halfway point the other night and it’s so continuously involving and intriguing, so brilliantly imagined and deeply realized, and so different even from the science fiction that I occasionally read that I looked for some interviews with Powers. I found several recent ones on YouTube and they haven’t disappointed.

Here’s an excellent exchange from a 57-minute interview with a channel called Media Death Cult. After discussing Powers’s love of Robert Heinlein and the contemporary obsession with how “problematic” he is, Powers and the interviewer consider whether it is possible to write old-fashioned fiction in a world that adheres so dogmatically to the prevailing political pieties:

Media Death Cult: I think it’s easy to pick on [Heinlein] because his heart was in the right place. You know what I mean?

Powers: Yeah, and he suffers from, uh, being dead, uh, in that the current standards of acceptability move on. Harlan Ellison was certainly progressive, liberal, cutting edge in his time, but now being dead, the standards, acceptable norms, have moved on.

MDC: So you think someone like Heinlein or Ellison, if they were to pop up now and want to write like that, it’s just not going to stick because of this weird situation that the Western world seems to be in with the microscope that you’re put under? I think you’d be a fringe writer, wouldn’t you? Which I think is a shame. I think if we can’t get that kind of thing going again it’s a bit of a shame.

Powers: Yeah. I mean, there’ve always been trends, which I think any writer is wise to ignore. Cyberpunk, nanotech, steampunk, uh, have all been flurries that briefly inspired lots of imitations and, you know, follow-alongs, and I always think it’s a big mistake for a writer to do that—to clock what is acceptable right now, what’s popular right now? I will do that. Because at best you’re going to be one of a crowd following along, and, more likely, by the time you finally get on the bandwagon the wheels will have fallen off and it’s overturned in a field somewhere.

MDC: Yeah.

Powers: And I think these days—and I speak from the advantage of complete ignorance—

MDC: Me too.

Powers: Ha! I think there are a number of bases to touch, boxes to check, especially in current science fiction and fantasy, which I think would be detrimental for a writer to pay much attention to. I think we’re going through a sort of tunnel. I think it may not be related but I think it’s alarming that Roald Dahl, RL [Stine] who did those sort of spooky stories for kids, and Ian Fleming and Agatha Christie are having their works retroactively revised to be acceptable to 2003 [sic] standards.

It’s definitely related. Powers’s choice of the tunnel as a metaphor for our cultural moment is fitting, tunnels being narrow, dangerous, and impossible to escape except by getting through them.

Just don’t live unto the tunnel, or take on its shape. Do your own thing. Be your own man, write your own stories, and don’t chase the latest trend—especially if that trend is writing to appease the legion of scolds who want to dictate how you must write your story and what you must include. Bowing to this kind of political orthodoxy is the worst way to fit in and be trendy. After all, recent events have demonstrated that you can toe the line—touch every base and check every box, in Powers’s terms—and still fall afoul of the mob. A lesson anyone familiar with Bolshevism should know.

Powers and his interviewer do, however, offer a hopeful vision of the future:

MDC: You got this kind of weird landscape where everybody is—you’re right, you’re writing through this tunnel. I think we’ll come out of the other side of it, though. There is pushback.

Powers: Oh, yeah.

MDC: And it’s not just the old school who are old enough to have enjoyed these things before the world started turning woke or whatever it is. I think there is a movement, there is pushback on it. We don’t want art to be that way. We don’t want it.

Powers: Yeah, yeah. And certainly, you think, ‘Well, it’s old now, the original text of Fleming and Agatha Christie.’ But then, when I was reading Heinlein, Sturgeon, Leiber, Murray Leinster, Henry Kuttner, those were all before my time. Those weren’t new writers. I was, you know, digging around used book stores and, yeah—I don’t think the readership is going to confine itself to the new editions. I think readers are hungry enough to dig widely.

The used book store may well prove to be the ashes from which literature will resurrect itself. But first we have to pass through the tunnel.

Good stuff, and there’s more I could have included. There’s a lengthy section in which Powers talks about his friendship with Philip K Dick that was especially good. Check out the entire interview at the link above or in the embedded YouTube player.

Special thanks to those of y’all who’ve recommended Declare to me at some point, especially David and Chet. I’m enjoying it so much that I’ve already picked up Powers’s supernatural pirate epic On Stranger Tides, which looks amazing, and I’d be glad to hear from other Powers fans which of his other books would be good to look into after that. I’ve already heard good things about Last Call. In the meantime, I’m trying to make time between work and the commute and feeding babies all night to finish Declare. Looking forward to all the big revelations along the way.

Woke Bond is boring Bond

Earlier this week I read a short piece by Niall Gooch at the Spectator called “The terribleness of a progressive Bond.” It’s a review of a new Bond novella by Charlie Higson, On His Majesty’s Secret Service, which was written to coincide with the coronation of Charles III. The story, insofar as it has one, involves Bond traveling to Hungary to infiltrate a nationalist plot to overthrow Charles and install a pretender claiming direct descent from Alfred the Great.

Gooch was not impressed. In addition to poor plotting and writing (“It makes Dan Brown look like a master of nuance, understatement and subtle characterisation”), Higson’s novella is overtly political, with a menagerie of baddies gathered from the most fevered imaginations of left-leaning Twitter types. The villains are cartoonishly anti-immigration, anti-EU, and vaccine-skeptical, and needless to say they’re all inarticulate white men who like guns and beer. Gooch:

None of them is a genuine character. Instead they are mere empty vessels, onto which he projects his bizarre fantasies about the motivations and beliefs of conservatives. People who are sceptical about mass immigration or transgenderism or the erosion of free speech are simply itching to engage in mass terror attacks in the heart of London, apparently.

But long before this becomes explicit, you’ll feel it. They’re interested in Anglo-Saxon history? They like Hungary? If you are left wondering why a London businessman calling himself Athelstan of Wessex would organize his plot in Hungary, you are not part of Higson’s political bubble, and On His Majesty’s Secret Service is not written for you. It is, Gooch writes, “clearly a work of propaganda.”

As it happens, I read On His Majesty’s Secret Service this summer, and there’s a reason you didn’t hear anything about it here. Gooch’s review is wholly accurate.

I thought it perhaps better written than Gooch did, but that’s damning with faint praise. My one thought through the entire first half of the story was “Okay, I see what you’re doing,” which was personally irritating and, artistically, meant that the second half held no surprises. And I agree entirely that the staid “Centrist Dad” Bond of this novella—a man who is in a carefully worked out and consensual open relationship; whose self-satisfied inner thoughts range across a litany of studiedly correct leftwing opinions on everything from English nationalism and Viktor Orban to sweatshops and gut health; and who is comfortable dropping terms like “toxic” and “far right”—is a diminished Bond. For Gooch, this is “cringeworthy.” My word was “annoying.”

It’s also boring.

Why? The key word comes in Gooch’s final paragraph:

It is perhaps some consolation that there must eventually be a reaction against the smug, complacent tone of the contemporary cultural scene. Until then, it seems like we may be in for some very bad films, books and TV shows, praised not for any artistic merit but for their ideological conformity.

Complacent. You could never call the original Bond complacent. He was not a happy man. Despite his smarts, skills, strength, love of the high life, and success with women, Bond was always a bit out of step with the modern world, ever more so as time went on. When Judi Dench’s M calls Bond a “dinosaur” in GoldenEye it is meant as an insult but accurately captures a fundamental aspect of the original character. This is because Fleming’s Bond—and, to a lesser but still palpable extent, the Bond of the films—was a relic of the Empire. His fate all the way through Fleming’s series is to risk all and suffer much on behalf of something that was crumbling anyway, often preventably and therefore pointlessly.

And so Fleming’s Bond grows more bitter and the novels more poignant and reflective as the series goes on. By the time of You Only Live Twice, the penultimate original novel, Bond is so alienated, so disillusioned with his work and what Britain has become, that the only person left who can understand him is a former enemy, a Japanese kamikaze pilot. Both know not only what it means to lose, permanently, but to survive to no apparent purpose.

By contrast, a Bond who shares the views dominant in media and academia is comfortable, static, and smug in a way Fleming’s Bond never could be. The original Bond is fighting what Tolkien called a “long defeat,” a doomed but heroic defense of something that will perish but is worthwhile anyway. Higson’s Bond critiques everything he sees from the lofty height of his own detached correctness. He would be more likely to process his trauma with a therapist than find a friend in a past enemy. He has nothing to learn, nothing to lose, and nothing to die for. He is right where he—and, indeed, everyone else—should be.

Blame the author. Fleming put a lot of himself into Bond; hence not only the womanizing and love of scrambled eggs but the bitterness, weariness, and disillusionment. Fleming was a dinosaur, too, and he knew it. Higson and his Bond, on the other hand, belong. Gooch:

I admit to being somewhat surprised by quite how leaden and didactic this book was. Are there no editors left, I asked myself as I waded through the underpowered, hectoring prose. Perhaps, however, that is a function of how hegemonic Higson’s views are among the creative classes.

After all, goldfish do not know they are wet, and people who conform instinctively and wholeheartedly to contemporary pieties—about borders and gender and free speech and identity—find it very difficult to understand the extent of their epistemic bubbles. We seem to be entering an age when didactic pro-establishment propaganda with little merit is not only everywhere, but goes unremarked and uncriticised because the people with cultural power generally agree with each other about almost every issue of importance.

If a literary or even cinematic Bond is to retain any shred of his antiheroic character—or even to remain merely interesting—he’s going to have to become ever more an outsider in his behavior and opinions. He can do that simply by remaining himself. Whether the people at the levers of publishing and filmmaking will allow that is another question entirely.

Gooch’s entire review is worth reading, not only for its critique of Higson’s book but for its insight into the present cultural hegemony. I’ve written about Bond along similar lines several times before: here on the blog about the vein of melancholy running through Fleming’s stories as Bond watches the disintegration of the world he is defending, and at the University Bookman about Bond’s arc and Fleming’s craftsmanship. I’ve also speculated about what is to become of the film series and its Bond here.

JRR Tolkien, 50 years later

Yesterday was the 50th anniversary of the death of JRR Tolkien, an occasion for reflection and appreciation. A few impromptu thoughts on Tolkien’s work and what it—and Tolkien himself—have meant to me over the years:

Part of Tolkien’s legacy for me is purely philosophical. Writing at The Critic yesterday, Sebastian Milbank considers the unlikely success of Tolkien’s storytelling in a literary environment under the Sauron-like dominion of irony, cynicism, amorality, and the tortured solipsism of post-Freudian psychology. Milbank:

More than anything else Lord of the Rings communicates a sensibility utterly at odds with the spirit of the age in which it was written. It is one of profound, tragic loss, of the vulnerability of irretrievable, ancient beauty, that must desperately be conserved and defended. It is of the inherent heroism of standing against destructive change, of hope beyond all reason, amidst the logic of history, which Tolkien named “the long defeat”.

Further, there is no “authorial wink,” no signaling or messaging for fellow bien pensants of the kind typical of elitist, politically motivated modernist art. In its morality (not moralism), earnestness, and total commitment to the act of storytelling and the sub-creation of imaginary worlds, Tolkien’s legendarium has become perhaps the anti-modernist myth par excellence, and not by taking any conscious stance but simply by being utterly and sincerely itself.

And yet this philosophical and religious understanding of the drama of Tolkien’s stories came to me later, after long thought and even longer thoughtless basking in his world and words. And I do mean words quite literally.

During college I moved from The Hobbit and The Lord of the Rings to some of Tolkien’s essays. This proved a fortuitous time to do so. I read “Beowulf: The Monsters and the Critics” at about the same time I was taking British History I and British Literature I, a one-two punch of subjects that have become lifelong passions. Reading and rereading Beowulf, and learning about the Anglo-Saxons, and rereading The Lord of the Rings, and then giving intense nuts-and-bolts attention to language, style, tone, and technique in my Creative Writing minor classes, aided by the insights and example of John Gardner, revealed much that was at work beneath the surface of Tolkien’s stories. For the first time I perceived the purposeful deliberation behind his choice of words and the structure of his sentences and poems, and the careful use of allusion to expand the world in which a small story takes place.

This was heady stuff and, emboldened by the enthusiasm of discovery and youthful experiment, I plunged into my Novel Writing class in my final semester of college armed with new and exciting tools. The eventual result was No Snakes in Iceland. It may be that I belong to that class of “mediocre imitators,” as Milbank calls them in his piece, but my first published novel would not have come to be without Tolkien’s example.

And Tolkien’s example extends beyond the literary. Milbank does not delve far into Tolkien the man in his essay, but Tolkien’s actual life story stands as just as strong a rebuke to modernism as his novels. A child of hardship, orphaned at an early age, raised in a teeming industrial city, a soldier in some of the worst combat in history, here is a man who lived through all the blights that should have embittered him and driven him to misery and indulgence, and who nevertheless lived quietly and faithfully with his wife and children, working just as happily in his professional field as on his private hobbies. He is not the tortured, arrogant literary scribbler of modern myth and his protagonists are neither whiners nor degenerates. Geniuses don’t have to be jerks. A man of faith, duty, family, close friendship, rigorous and honest scholarship, and devotion to the small and parochial and workaday, Tolkien was in every way a candidate for a saintly “hidden life.” And yet everyone knows who he was.

This is perhaps the starkest irony of Tolkien’s life: that a man so contented with his lot and so unambitious (in the way of the worlds of commerce, politics, or celebrity) should become the author of the twentieth century. The more I have studied his life the more I admire him and wish for the grace to emulate him.

All of these things matter to me—the philosophy, the aesthetic, the man himself—but at the root of Tolkien’s meaning for me lie the stories. As it should be.

I can actually date my love for Tolkien. I read The Hobbit in high school at a friend’s urgent insistence (thanks, Josh!) and—again, fortuitously—my county’s brand-new Walmart had the book in stock. On July 10, 2000, I was reading it while my family sat in Atlanta traffic on our way to Turner Field for the home run derby. I had reached a chapter called “Riddles in the Dark.” We drove through a tunnel, one of those places where the interstate runs under a major street like Jimmy Carter Boulevard, and I was hooked. I had enjoyed The Hobbit up to this point but now I loved it, and knew I would read the rest and move straight on to The Lord of the Rings as soon as I could. I’ve never looked back and that love and excitement have never flagged or diminished.

That’s the power of a good story told by a great artist. In Chesterton’s words, Tolkien became for me the “center” of a “flaming imagination.” That imagination has remained aflame for twenty-three years because Tolkien not only told a good story, which plenty of people can do, but because his work is rich and deep and loving and, most of all, true enough to return to again and again for more. No burglar can diminish this hoard by even a cup.

I’ve used the word “fortuitous” twice in this reflection despite knowing that Tolkien would not himself think of it that way. Rightly so. Tolkien believed in Providence and it is largely through his example that I can grasp and trust in that idea. So when I do think of those coincidences, the circumstances and strong confluences and old friendships that kindled and kept my love for Tolkien burning, I hear Gandalf’s closing admonition to Bilbo: “You don’t really suppose, do you, that all your adventures and escapes were managed by mere luck, just for your sole benefit?”

No, I don’t. Thank goodness.

JRR Tolkien, artist, scholar, elf-friend, and faithful servant of God, from whom all creativity descends, RIP.

More if you’re interested

Milbank’s essay at The Critic is excellent—one of the best recent pieces on Tolkien that I’ve read. Check out the whole thing here. For more on the religious and philosophical underpinnings of Tolkien’s world, all worked out organically through his storytelling rather than imposed as a moral, read Peter Kreeft’s The Philosophy of Tolkien. I’ve written about Tolkien here many times before—just click the Tolkien tag below for more—but in the early days of this blog I reviewed a delightful and beautifully illustrated children’s book called John Ronald’s Dragons that I want to recommend here again.

Further notes on Indy and Oppie

July was a big movie month here on the blog, with three reviews of movies ranging from “adequate compared to Kingdom of the Crystal Skull” to “great.” Two of them I’ve reflected on continually since seeing them and reviewing them here, especially as I’ve read, watched, and listened to more about them.

Here are a few extra thoughts on my summer’s movie highlights cobbled together over the last couple of weeks:

Indiana Jones and the Curse of Woke

When I reviewed Indiana Jones and the Dial of Destiny a month and a half ago, I didn’t dwell on the malign influence of woke ideology in its storytelling, only mentioning that I had justifiable suspicions of any Indiana Jones film produced by Disney. I wanted to acknowledge those doubts without going into detail, because after actually watching and, mostly, enjoying the movie, I found that the problems I had with Dial of Destiny weren’t political at all, but artistic. It isn’t woke, it’s just mediocre.

That didn’t stop a certain kind of critic from finding the spectral evidence of wokeness in the film and trumpeting their contempt for it. I’m thinking particularly of a caustic YouTube reviewer I usually enjoy, as well as this review for Law & Liberty, which comes out guns blazing and attacks Dial of Destiny explicitly and at length along political lines.

The problem with these reviews is that in their hypersensitivity and their mission to expose ideological propaganda they do violence to the object of their criticism, not just misinterpreting things but getting some things completely wrong. Here’s a representative paragraph from that Law & Liberty review:

Next, we cut to 1969, the Moon Landing. Indy is an old tired man, sad, alone, miserable. The camera insists on his ugly, flabby naked body. His young neighbors wake him up with their rock music and despise him. His students don’t care about his anthropological course. His colleagues give him a retirement party and soon enough they’re murdered, by Nazis working secretly in the government, with the complicity of the CIA or some other deep state agency. We see the wife is divorcing him; we later learn, it’s because his son died in war, presumably Vietnam—Indy told the boy not to sign up.

What was remarkable about this paragraph to me is how much it simply gets wrong. Indy’s hippie neighbors wake him up by blasting the Beatles, yes, but they also treat him perfectly amiably. (In fact, it’s Indy who knocks on their door armed with a baseball bat.) It is never clear that Voller’s men have help from the CIA or any other “deep state agency”; I kept waiting for that connection but it never came. And Indy did not try to stop his son from joining the army, a point made so clear in the film—Indy’s one stated wish, were time travel possible, would be to tell him not to join—that it’s staggering to think a critic went to print with this.*

From later in the same review: “But turning from obvious metaphors to ideology, Indy is replaced by a young woman, Helen [sic—her name is Helena], daughter of his old archaeological friend Basil, but the film suggests you should think of her as a goddess to worship.” One of my chief complaints about Dial of Destiny was its failure to deal with Helena’s criminality, giving her a half-baked or even accidental redemptive arc that spares her a face-melting, as befitted all similar characters in Indy’s inscrutable but always moral universe. That bad writing again. But how one could watch her character in action and conclude that the audience is meant to “worship” her is beyond me. This is anti-woke Bulverism.

What these hostile reviewers describe is often the opposite of what is actually happening in the film. I’ve seen multiple critics assert that Helena has “replaced” Indy and “controls” and “belittles” him. The Law & Liberty reviewer describes Indy as just “along for the ride.” Helena certainly intends to use him—she’s a scam artist and he’s a mark. This is all made explicit in the film. But it is also made explicit that Indy does, in fact, keep taking charge and leading them from clue to clue, and that he is a much tougher mark than Helena was counting on.

Dial of Destiny’s actual problems are all classic artistic failures—poor pacing, overlong action sequences, plodding exposition, weak or cliched characters,** slipshod writing, and a misapprehension of what matters in an Indiana Jones movie that becomes clearest in the ending, when Indy is reunited (for the third time) with Marion. Here the filmmakers make the same mistake as the team behind No Time to Die by giving Indy, like Bond, romantic continuity and attempting to trade on sentimentality when that is not what the character is about.

Again—these are artistic problems. Helena Shaw isn’t a girlboss or avenging avatar of wokeness; she’s a poorly written villain who doesn’t get her comeuppance. But I saw little such criticism among the fountains of indignation from the reviewers who pursued the “woke Disney” line of criticism.

Perhaps this is the greatest curse of wokeness: that it distorts even its critics’ minds. Once they’ve determined that a movie is woke, they’ll see what they want to see.

Call it woke derangement syndrome and add it to all the other derangement syndromes out there. Woke ideology is real, even if the ordinary person can’t define it with the precision demanded by a Studies professor or Twitter expert, and it is pernicious, and it produces—even demands—bad art. It is a kind of self-imposed blindness, as are all ideologies. But zeroing in on wokeness as the explanation for bad art can blind us to real artistic flaws, and if any good and beautiful art is to survive our age we need a keen, clear, unclouded vision of what makes art work. We need not just a sensitivity to the bad, but an understanding of the good.

Douthat on Oppenheimer

On to better criticism of a better movie. Ross Douthat, a New York Times op-ed columnist who writes film criticism for National Review, has been one of my favorite critics for the last decade. Douthat begins his review of Oppenheimer with an abashed confession that he feels guilty saying “anything especially negative about” it, but that as brilliantly executed as it is, he is “not so sure” that it is “actually a great film.”

Fair enough. What gives Douthat pause, then? For him, the problem is Oppenheimer’s final third, which he sees not as a satisfying denouement but simply a long decline from the height of the Trinity test, a decline complicated by thematic missteps:

There are two problems with this act in the movie. The first is that for much of its running time, Oppenheimer does a good job with the ambiguities of its protagonist’s relationship to the commonplace communism of his intellectual milieu—showing that he was absolutely the right man for the Manhattan Project job but also that he was deeply naïve about the implications of his various friendships and relationships and dismissive about what turned out to be entirely real Soviet infiltration of his project.

On this point I agree. As I wrote in my own review, I thought this was one of the film’s strengths. Douthat continues:

But the ending trades away some of this ambiguity for a more conventional anti-McCarthyite narrative, in which Oppenheimer was simply martyred by know-nothings rather than bringing his political troubles on himself. You can rescue a more ambiguous reading from the scenes of Oppenheimer’s security-clearance hearings alone, but the portions showing Strauss’s Senate-hearing comeuppance have the feeling of a dutiful liberal movie about the 1950s—all obvious heroes and right-wing villains, no political complexity allowed.

The second problem, as Douthat sees it, is that the drama surrounding Oppenheimer’s political destruction and Strauss’s comeuppance is unworthy of the high stakes and technical drama of the middle half of the movie concerning the Manhattan Project: “I care about the bomb and the atomic age; I don’t really care about Lewis Strauss’s confirmation, and ending a movie about the former with a dramatic reenactment of the latter seems like a pointless detour from what made Oppenheimer worth making in the first place.”

There is merit here, but I think Douthat is wrong.

I, too, got the “dutiful liberal” vibe from the final scenes, but strictly from the Alden Ehrenreich character. Ehrenreich is a fine actor unjustly burdened with the guilt of Solo, but his congressional aide’s smug hostility to Strauss as Strauss is defeated in his confirmation hearing feels too pat, too easy. What saves the film from standard anti-McCarthy grandstanding is Robert Downey Jr’s sympathetic and complicated portrayal of Strauss, along with the film’s demonstration that, however Strauss acted upon them, his concerns about espionage and Oppenheimer’s naivete were justified.***

Regarding the seemingly diminished stakes of the final act, I too wondered as I first watched Oppenheimer whether Nolan might have done better to begin in medias res, to limit himself strictly to the story of the bomb. But that story has already been told several times and Oppenheimer is very much a character study; this specific man’s rise and fall are the two necessary parts of a story that invokes Prometheus before it even begins.

The key, I think, is in the post-war scene with Oppenheimer and Einstein talking by the pond at Princeton. Nolan brings us back to this moment repeatedly—it’s therefore worth paying attention to. The final scene reveals Oppenheimer and Einstein’s conversation to us:

Oppenheimer: When I came to you with those calculations, we thought we might start a chain reaction that would destroy the entire world.

Einstein: I remember it well. What of it?

Oppenheimer: I believe we did.

Cue a vision of the earth engulfed in flames.

A technology that can destroy the entire world is not just the literal danger of Oppenheimer’s project, but a metaphorical one. The Trinity test proves fear of the literal destruction of the world unfounded, but the final act of the film—in which former colleagues tear each other apart over espionage and personal slights and former allies spy and steal and array their weapons against each other and the United States goes questing for yet more powerful bombs, a “chain reaction” all beginning with Oppenheimer’s “gadget”—shows us an unforeseen metaphorical destruction as it’s happening. The bomb doesn’t have to be dropped on anyone to annihilate.

This is a powerful and disturbing dimension of the film that you don’t get without that final act.

Finally, for a wholly positive appraisal of Oppenheimer as visual storytelling—that is, as a film—read this piece by SA Dance at First Things. Dance notes, in passing, the same importance of the film’s final act that I did: “The two threads are necessary to account for the political paradox of not just the a-bomb but of all technology.” A worthwhile read.

Addenda: About half an hour after I posted this, Sebastian Milbank’s review for The Critic went online. It’s insightful and well-stated, especially with regard to Oppenheimer’s “refusal to be bound” by anyone or anything, a theme with intense religious significance.

And a couple hours after that, I ran across this excellent Substack review by Bethel McGrew, which includes this line, a better, more incisive critique of the framing narrative than Douthat’s: “This is a weakness of the film, which provides all the reasons why Oppenheimer should never have had security clearance, then demands we root against all the men who want to take it away.”

Tom Cruise does the impossible

The most purely enjoyable filmgoing experience I had this summer was Mission: Impossible—Dead Reckoning, Part I. To be sure, Oppenheimer was great art, the best film qua film of the summer, but this was great entertainment. I enjoyed it so much that, after reviewing it, I haven’t found anything else to say about it except that I liked it and can’t wait for Part II.

Leaving me with one short, clearly expressed opinion—a truly impossible mission, accomplished.

Endnotes

* In fairness, the review has one really interesting observation: in reference to the film’s titular Dial being Greek in origin, unlike the Ark of the Covenant or the Holy Grail, “Jews are replaced by Greeks in the Indiana Jones mythology, since our elites are no longer Christian.” The insight here is only partially diminished by the fact that the elites who created Indiana Jones were not Christian, either. Steven Spielberg, Philip Kaufman, and Lawrence Kasdan—key parts of Raiders—are all Jewish.

** Here is where Dial of Destiny drifts closest to woke characterization. The agents working for Voller in the first half include a white guy in shirt and tie with a crew cut and a thick Southern accent and a black female with an afro and the flyest late 1960s fashion. Which do you think turns out to be a devious bad guy and which a principled good guy? But even here, I don’t think this is woke messaging so much as the laziness of cliché. Secondary characters with Southern accents have been doltish rubes or sweaty brutes for decades.

*** A useful point of comparison, also involving a black-and-white Robert Downey Jr, is George Clooney’s engaging but self-important Good Night, and Good Luck. Watch both films and tell me which is “all obvious heroes and right-wing villains.”