The Books of 2015

I was quite blessed last year to have read a number of wonderful books, both fiction and nonfiction, some of which were new to me, and some of which were old acquaintances. I’ve read Mansfield Park and Macbeth, for instance, several times already, but they unfold unwonted revelations upon each reading. The following is a list of books that I read for the first time in the last year; it is all fiction, mostly for the sake of categorical clarity (I don’t really think I can justify how I would rank The Principle of Hope, Vol 1, for instance, against Muriel Rukeyser’s Book of the Dead; it just don’t sit right). Let me quickly give mention, though, to two impossible-to-categorize memoirs: Maxine Hong Kingston’s The Woman Warrior and Theresa Hak Kyung Cha’s Dictee. They are required reading for reasons that, in all honesty, elude my critical capacities. They’re simply that good. In any event, the following are all books I read in 2015 that have, in some way, deeply enriched my comprehension of life, the world, and the soul.

The Divine Comedy by Dante Alighieri. Technically, I had already read The Inferno back in high school, but I figure a gap of nearly 20 years ought to qualify reading the Commedia in its entirety as “new.” I’m not sure whether it’s because of arrogance, ignorance, or some bizarre combination thereof, but I went into the Comedy with relatively low expectations. What I learned from Dante is a truth universally acknowledged but not often enough reiterated: the classics have new things to teach us. In particular, Dante pretty nearly revolutionized the way I think about divine love and its relation to sin. Truly, if you haven’t ever read The Divine Comedy, please do so. The edition I read was the translation by Robert and Jean Hollander. To be honest, I didn’t care so much for their translation. It seemed as though I got the content of the poem, and the translation was elegant, but the side-by-side comparison showed me that, even with my tourist’s-level Italian, their English version contained almost none of Dante’s poetry. That said, the endnotes and introductions were enormously helpful and endlessly fascinating. I’m going to make a point of re-reading the Comedy throughout my life, and I will likely try a different translation each time until I finally find the one that’s right for me. I hope you all find the one that’s right for you.

Infinite Jest by David Foster Wallace. Yes, much like Gravity’s Rainbow, it’s that book that everyone claims to have read (but really hasn’t), because she has borrowed it from the library 20 times with every intention of reading it, only to get a couple hundred pages in (if that) and stop, or she’s had it sitting on her shelf for twenty years, relying upon friends and visitors picking it up and rifling through the pages as a substitute for actually having to dust it off, or has sworn never to read it on general principle, because it’s: a) the book all those obnoxious hipsters/English majors claim to have read and loved, or b) a thousand effing pages long, and screw those endnotes (because, honestly, who does that?). So. I read it. The whole thing. Not only worth finishing on its literary merits (which are considerable), but a prophetic diagnosis of a culture that has resorted to self-fulfillment as the ultimate authority, and a remarkable feat of authorial empathy.

Tracks by Louise Erdrich. Opening lines: “We started dying before the snow, and like the snow, we continued to fall. It was surprising there were so many of us left to die. For those who survived the spotted sickness from the south, our long fight west to Nadouissoux land where we signed the treaty, and then a wind from the east, bringing exile in a storm of government papers, what descended from the north in 1912 seemed impossible.” If reading those lines doesn’t make you want to read the book, I don’t know what would.

The Grapes of Wrath by John Steinbeck. Another feat of authorial empathy, this classic of twentieth century American letters is justly considered to be an epic. It’s Dickensian social realism in the best possible ways: a feel for the vernacular patois of the characters, a masterful control over the rhythm of the sentences, and a surefooted sedimentation of the chapters. This is an edifice erected as a monument to a hard time in our history, to all who survived it, and to all who didn’t.

Bring the Jubilee by Ward Moore. One of the great things about science fiction is that it can give thought experiments the moral weight of narrative. To a certain extent, all stories are fables. We are invited to exercise judgment on the actions and meaning of the characters we read about, and the exercise of judgment is a healthy thing to do if we want to keep our consciences in trim, fighting shape. In Bring the Jubilee, Moore mounts one of the great thought experiments in sf about the nature of free will and historical determinism. There are ambiguities, as there must be in most great stories. In the end, however, he implies, in grand existential fashion, that free will or not, we still bear moral responsibility for choosing whether or not to act.

The Crossing by Cormac McCarthy. Can you believe I never read anything by McCarthy until 2015? While I also read Blood Meridian and The Road for the first time, The Crossing is the one that blew me away. It’s really stunning, prophetic. The artistic invention of grace out of the whole cloth of human cruelty and cosmic indifference.

Invisible Man by Ralph Ellison. Another one that rather eludes my critical capacities, but it’s another prophetic work that manages to be utterly alienated and utterly tuned in to the need for authentic connection. Somehow caustic, bitter, and unsparing without giving up hope.

The Dispossessed by Ursula K. Le Guin. I know, another one I can’t believe it took me this long to get around to. I don’t have the excuse, as with Infinite Jest, of it being particularly long or aesthetically forbidding. Le Guin is a challenging and precise writer, but not in that way. I can see why this is considered to be her masterpiece, and while I did appreciate the structure, the overwhelming impression left on me was that it manages to dramatize the complicated nature of social injustice. Le Guin is about as merciless as possible with her socialist-anarchist Anarres, emphasizing that problems remain in even the best of possible worlds, yet she manages to inspire palpable relief when Shevek finally returns home—home to his planet, to his family, to the way of life he knows best. Whatever its flaws and shortcomings, the striving for a better world only has meaning when it is embedded in a particular context, and Le Guin imbues that context with the kind of utopian possibility that can only be illuminated by disappointment, but a disappointment put in its proper perspective—the kind bred by intimate familiarity.

Hyperion by Dan Simmons. Though it doesn’t really stand alone very well (and, as part of a pre-planned series, it’s not meant to), Hyperion rather lives up to its reputation as a masterpiece of sf worldbuilding. Its rep as “cerebral” sf may be a touch overblown, but only in comparison to, say, The Book of the New Sun. Simmons is a very smart writer, and he manages to weave together a story that is sort of about everything that sf is about: human nature, free will v. determinism, ontological reliability, etc. Each section of the book is a wonderful variation in tone and subgenre, ranging from character study to action adventure to bildungsroman. No trope is left unturned, and Simmons always ties character development into the weirdnesses of his recognizably alien universe.

Lightning Rods by Helen DeWitt. This is a book about how neoliberalism screws us from behind. Metaphorically or literally? Sort of both. Satire’s brutal honesty depends on being a bit outré, and as brutal riffs on contemporary society go, this is damned prophetic.

The House of Mirth by Edith Wharton. Lily Bart’s decline and fall is full of pathos, but like any good social satire, it’s also shockingly lively and witty. Wharton’s genius is in finding a way to make her ruthlessness a form of empathy. As with Lightning Rods, she’s brutally honest, but her tonic doesn’t taste bitter, just a sad combination of bemused and furious.

Sort of Dishonorable Mention:

Atlas Shrugged by Ayn Rand. This is the worst novel I’ve ever read that I think ought to be required reading for all Americans. Unlike most folks who hate Atlas Shrugged, I found the prose to be bracing, and for all the tedious sermonizing, Rand knows how to craft rhetorically compelling speeches. This is polemical pulp fiction at its stylistic best. It is also an argument in favor of straight-up anarcho-capitalist plutocratic oligarchy. The heroes are people who actively sabotage civilization so that they don’t have to put up with any peons or government bureaucrats siphoning any of their precious bodily fluids (read: wealth). On the one hand, you might be tempted to feel sympathy for these ingenious captains of industry who have to deal with the incompetent jerks bleeding them dry at every turn. On the other hand, they cause an extinction-level event in order to make more money for themselves. To the extent that anyone can be sympathetic to the plight of genocidal one-percenters, this book makes the best possible case on their behalf. For the rest of us who have something resembling a social conscience, it’s a window into the ideology driving both the Tea Party and Trumpism.☕

Stranger Things ☕ Season One, 2016

I’d have to say that my favorite single character scene in the series involves the boys’ science teacher, Mr. Clarke. As the boys are bombarding him with questions concerning parallel universes (it all makes sense in the end), they throw out an obscure Dungeons & Dragons reference and he knows exactly what they’re talking about. It’s little moments like this one where Stranger Things feels authentic — where the nerdy references and pop culture homages become more than the sum of their parts because of the delightful, sympathetic characters making them.

Jason Morehead nails the appeal of Stranger Things pretty generally in his review, but that paragraph clinches it. If you haven’t already binged on it, Netflix’s latest nerd-friendly show is possibly the best Stephen King adaptation never made. Without spoiling anything, it’s about a kid vanishing under mysterious circumstances and the encounters those searching for him have with weird things involved in his disappearance. Beyond the main title font and a few explicit nods to King’s work (like the guard reading Cujo in the morgue), Matt and Ross Duffer’s story takes advantage of the Netflix format to indulge a sprawling story peopled with a fairly large cast of small-town characters. Like Derry, ME, Hawkins, IN, is a fully-realized community. Unless you’re talking about Tolkien or perhaps Austin T. Wright, fiction is rarely able to give you a firm sense of topography; texture comes primarily through characterization or other tools of world-building: the accumulation of details often overlooked in real life, but which make all the difference in grounding audiences in other worlds. Detail is especially key to historical fiction, and critics have already spilled plenty of ink (physical and digital) over Stranger Things’s recreation of the early 1980s. But it’s really the characters that make it feel real, because they arise from real historical possibility, as Georg Lukács might have put it.

While I’m not Stephen King’s biggest fan (to put it mildly), the standout trait of all his work that I’ve read is the amount of time he lavishes on his characters. His worlds feel real because of the often complicated (or overwrought; perhaps overdetermined or unnecessary) networks of characters that make up his stories. King’s work often doesn’t focus on a single central protagonist. His heroes are often groups: motley assemblages of stereotypes tweaked by his eye for psychological detail into three-dimensionality. Obviously, this is not always the case, but even in stories focusing on a scant few individuals, they are always rooted in relationships with others, perhaps even people you never meet within the pages of the story proper. This is King’s greatest asset as a storyteller, even as his predilection for overstuffing his stories with subplots—sometimes stemming from an overabundance of characters—is also one of his greatest weaknesses.

At a relatively trim eight episodes, Stranger Things doesn’t tend to fall prey to King’s excesses in this regard. Joyce’s relationship with her no-good ex, Lonnie, for instance, might seem to go nowhere. He’s not a fully-realized character by any stretch, but the framework is there for him to become one. More importantly, Joyce’s hysterical personality comes into focus a bit when you finally meet him. A lot of folks haven’t dug Winona Ryder’s performance; I did. You get the distinct impression, seeing how Lonnie interacts with Joyce and Jonathan, that this guy is a master of playing his loved ones’ insecurities off each other. It’s easy to see how Joyce might have been “high strung” in her youth and how Lonnie pushed her relentlessly into something short of a basket case. Then, of course, there’s Barb, bookish and loyal to a fault. Her relationship to Nancy makes total sense, as does Nancy’s increasingly thoughtless behavior toward her friend. To put it bluntly, supporting characters are props. They need to be plausible; they need to have some dimension. But they are, in some ways, terrain, and the protagonists are the ones who traverse it. The role of the terrain is to give better shape, definition, and psychological dimension to the heroes; in turn, if the protagonists are well-crafted, the terrain itself becomes more real, better-defined. A world apart. We love supporting characters like we love gravity and breathable air. They’re necessary for life.

Which brings me back to Mr. Clarke. He’s pretty much the greatest teacher ever. He’s also very much of his time and place. Lots of middle school teachers, I’m sure, go above and beyond to help their students. But in 2016, teachers have to be wary of boundaries. Mr. Clarke is both teacher and buddy (sort of) to the boys in Stranger Things. He’s a mentor in an era when institutional structures didn’t make the kind of relationship he has with the boys totally weird. One of the other great scenes in the series is when the boys phone him at home while he’s on a date with questions on how to build a sensory-deprivation tank. Randall P. Havens is pretty great throughout, but if the scene where he explains string theory with D&D references is the most charming, this one offers the greatest insight into how far away 1983 really is. Not only do the boys interrupt his date by calling him at home on the weekend, but Havens plays Mr. Clarke as savvy enough to know that the boys are Up To Something, yet, because they’re so invested in scientific geekery, he can’t help but give them the information they need to really get in trouble. In 2016, when your adolescent students call you at home to ask you how to build a DIY sensory-deprivation tank, you hang up and send an email to someone in the administration. In Stranger Things, Mr. Clarke’s bond with and trust in his students is what helps them save their friend.

He’s a minor character, of course. Someone that my friend, Scott, calls “Mr. Plot,” a supporting cast member whose main function is to deliver exposition. Yet he feels real because his function in the story makes the main characters more grounded. He adds to the world. Another great minor character is Chris Sullivan’s short-order cook. Because Stranger Things sets up a world in which a sprawling cast of characters can be supported, his scenes early in the series with Millie Bobby Brown are both tense and heartfelt, suggesting layers in his own personality and the potential of their developing relationship. He functions mainly to give you a sense of the kind of town Hawkins is and the stakes of Eleven’s plight, but Stranger Things can spend time on his scenes with Eleven that a feature film would condense quite a bit more.

Netflix originals have been criticized in the past for not really understanding how to make the most of their medium. Jessica Jones, for instance, was critiqued for its pacing, as were other Netflix series. Making shows for a binging audience is a new thing. It’ll take time to crack that code on a consistent basis. I think the Duffers have taken us a good deal further toward that goal. Whereas time spent on supporting characters in a superhero show might feel like “filler” (though I’m not sure I totally agree with that assessment), for a show like Stranger Things, the little scenes spent with tertiary characters are utterly necessary to the show’s raison d’être. This is world-building, not padding. Even if those scenes don’t have a payoff in terms of plot mechanics, I can’t think of a scene from Stranger Things, off the top of my head, that isn’t in some way necessary to capturing the messy, sprawling reality of interpersonal relationships in a small town. Sometimes resolution is itself a bit of a cheat. Unlike a lot of the adaptations of Stephen King’s actual work, few of the “dead end” subplots in Stranger Things subtract from the overall experience. If there is to be a second season set in Hawkins, these things are utterly necessary for establishing a solid foundation for future chapters. Even if the first season of Stranger Things were to stand alone (and I think it certainly does), there’s almost nothing about it that feels totally wasted—if you consider replicating King’s dense texturing of community to be a paramount aesthetic goal.

All of this becomes even more significant when you consider how central redemption is to the thematic arcs of so many characters. It’s often easy to think of personal redemption in terms of individual achievement. Even when presented as something sought within a particular context, or something achieved with the help of others, stories of redemption often have a very individualist ethos to them. King’s stories often emphasize that doing the right thing is made more challenging by those in your own corner; those you love and rely on don’t make your life easier. They’re not supposed to, even when you’re doing all in your power to save yourself and them. Adversity creates fault lines as often as it cleaves people more strongly together, and even as a series like Stranger Things builds toward the main characters finally (finally!) pooling their knowledge and resources, it has to set the stage for fallout. No good deed goes unpunished, as they say, and no action provokes anything less than an equal and opposite reaction. You don’t just get scars from fighting monsters; you get them from friends and family, too. The best of us impose our flaws on the undeserving. That’s human nature. Without a capacious cast of characters and the little moments of grace and light that come with them—like the boys’ interactions with Mr. Clarke—the dark lattice of shadows all people cast would not stand out so starkly in relief. Like all good tales of terror, Stranger Things knows that we are all made of light and shadow. Meaningful sacrifices aren’t made for one person, but for a world. Without a world of people—all fallen, all too human—you’ve come to know and care about, what difference would even one sacrifice make?☕

Ghostbusters ☕ d. Paul Feig, 2016

A modest prediction: like the original, 2016’s Ghostbusters will age well. Everyone knows that there are many New York Cities. There’s the real, actual NYC. There’s the NYC that each New Yorker inhabits in his or her own little world. Tourists, of course, have their own NYC. Then there’s the New York we see in movies: the violent dystopia, the romantic urbs bucolica, yesteryear’s city of tomorrow, etc. To paraphrase Whitman, it contains multitudes. The best movies set in New York City can only be set in New York City. Woody Allen doesn’t film, for the most part, in Boston, and despite what the Academy says, I don’t think it was such a hot idea for Martin Scorsese, either. By the same token, it’s impossible to think of the Aykroyd/Ramis/Reitman version of Ghostbusters taking place in Chicago, L. A., or New Orleans. “Let’s show this prehistoric bitch how we do things downtown.” Right? It’s gotta be set in New York, or it doesn’t quite work.

Or maybe it’s just that NYC as a milieu works so well as a catalyst for galvanizing types of humor honed elsewhere. After all, the original Ghostbusters cast was a mix of Canadians and Midwesterners, all connected with Second City and/or Saturday Night Live. The discipline of comedy tours and weekly television are rather like classical training for American comedians, who must adapt their routines and sketches to the demands of one of the most diverse audiences in the world. Live comedy demands an often tricky mix of topicality and timelessness—great jokes have to be plugged into the here and now, but you can’t assume that everybody in the audience is as plugged in as they should be. Film comedy is a different kind of tricky. Again, sharp humor always feels contemporary—but sharp humor always feels contemporary. The characters of Manhattan are as pathetic and funny in 2016 as they were in 1979; Peter Venkman’s narcissistic assholery and Ray Stantz’s blue collar geekery translated across state lines in 1984 as well as they translate across the three decades since they first appeared.

There’s little topical humor specific to 2016 in the new Ghostbusters, few allusions outside the franchise. Characters reference classic films like The Exorcist, but only to elements already deeply soaked into the pop culture consciousness. For instance, Andy Garcia plays the mayor of New York (because of course he does), and he deeply resents Kristen Wiig’s desperate scientist begging him not to be like the mayor from Jaws. Melissa McCarthy spends the whole film trying to get a decent bucket of wonton soup from her favorite Chinese restaurant—a running gag that works even better because only in (movie) New York City would someone stubbornly keep ordering the same disappointing soup from the same take-out joint and berate the delivery driver for it. Instead of “We’re ready to believe you!” or “Who you gonna call?,” the first slogan these Ghostbusters come up with is, “If you see something, say something,” only realizing after the flyers are already printed that someone is already using that one. In fact, that might be the most specifically New York joke of the film, and its topicality is restricted only in the sense that you have to know that the film takes place post-9/11.

In fact, that reference is probably the single strongest signal of the film’s temporal setting. There’s one instance of a smartphone video uploaded to YouTube costing a character a job, but apart from that, there’s little reference to the latest communication technologies, which probably comprise the single most conspicuous trait of our historical period. The (fictional, s’far’s I can tell) Mercado Hotel replaces 55 Central Park West as the site of the climactic battle, and its art deco lobby is vintage (movie) New York City: it’s exactly the kind of perfectly preserved building you would expect to sit atop ancient ley lines, in addition to being an architectural expression of yesteryear’s cutting edge. It’s nebulously nostalgic, and while art deco might look simply dated elsewhere, it feels strangely a part of contemporary life in (movie) New York.

The Mercado Hotel climax is symbolic of what’s great about the film as well as what’s not so great. While it evokes that wonderful movie-NYC contemporary-nostalgia, it also evokes some of the most memorable scenes from the original Ghostbusters. Unfortunately, 2016’s Ghostbusters does entirely too much of that, and not cleverly enough. One callback that works well is the way this film brings in the classic logo, here spray-painted onto a subway wall as a bit of mockery by a graffiti artist. Another classy nod is the bronze bust of Harold Ramis glimpsed early in the film, gracing the hallowed halls of Columbia University. Cameos by other original cast members range from nice to outright distracting. Annie Potts essentially plays Janine, except here she’s the desk clerk in the Mercado. It works in part because her shtick is still funny, and because it’s a brief beat in the narrative flow. The single worst cameo is, unsurprisingly, Bill Murray’s. It’s not so much Murray’s performance as a paranormal debunker that clunks, but the fact that the film builds an entire sequence around him. While I think Paul Feig and Katie Dippold wanted him to be this version’s Walter Peck, it doesn’t really work out that way. For one, his cameo is too brief and poorly structured into the narrative to serve the catastrophic purpose of Peck in the original. For another, even if Murray’s performance is fine, he’s just too much Murray. Maybe other fans of the original will really dig him here. For me, the entire sequence screamed, “OMG you guys we got Murray for a day we gotta DO STUFF WITH HIM!!”

There’s really no way Feig et al. could win. Remaking a beloved film like Ghostbusters entails its own challenges that have little to do with the mechanics of storytelling and everything to do with fan service. Apart from the clunkiness of Murray’s extended cameo, he shows up at almost exactly the wrong time, roughly halfway through the film. Until his appearance, the film had done deft work in metatextual commentary, sprinkling allusions to the earlier films into its original material in ways that were pleasing without interrupting the flow. In fact, the first 45 minutes or so of 2016’s Ghostbusters is borderline magnificent. It sets up a distinct cast, a different kind of villain, and it does all this with the workmanlike professionalism that makes for durable Hollywood cinema. The thematic arc is even distinct from the original. Ivan Reitman’s Ghostbusters were underdogs who got to prove their worth to a city famed for its facility with dream-crushing, and Peter Venkman learns to be a little less of a selfish asshole. Feig’s Ghostbusters are still underdogs who get to prove themselves, but this movie is really about what a difference friendship makes to said underdogs. The difference between the good guys and the bad guy here is that human connection. In a culture that frankly still often celebrates bullies and narcissists, the outcasts who save the city in the new film are honored for their personal strengths in ways that are subtext (if that) in the original Ghostbusters.

The cast makes that work. And as someone who is a bit unplugged from pop culture, this was my first time really seeing Wiig, Kate McKinnon, and Leslie Jones as performers. SNL fans know them, but I don’t think I’ve watched SNL since about 2001 or 2002. They are simply terrific, as is McCarthy, whom I know going back to Gilmore Girls. The dialogue in this movie is good, and the special effects are okay; this is a movie you kind of have to see for the actors, though. Besides the great chemistry shared by the principal leads, they also spark with pretty much everyone else who shows up. I recognized Charles Dance, Ed Begley, Jr., Matt Walsh, Michael K. Williams, and Michael McDonald, of course; Cecily Strong, Neil Casey, and Steve Higgins are (apparently) SNL alumni as well. This isn’t quite an Ocean’s Eleven-level Who’s Who, but there are no wasted scenes with any of these performers. It’s all good stuff. Oh, and, yeah—Chris Hemsworth: delightful.

I’m interested to see how this movie plays over the long haul. Unlike a lot of my contemporaries, I didn’t see 1984’s Ghostbusters (or its sequel) until I was in my teens. So the nostalgia factor is a bit blunted, but I have watched the first film at least a dozen times. It’s impossible for me to watch 2016’s Ghostbusters and not be at least a little distracted by all the callbacks and cameos. Will younger audiences, those less attached to the original movies, feel the same way? What about viewers my age or older, who simply enjoy the cameos for what they are? I don’t typically see the point in doing a remake/reboot unless the filmmakers can find a reason to justify doing something new and different. Most of the new film hits the sweet spot between honoring the structure and vibe of the old one while still infusing it with the unique sensibility of its (re)makers. The very presence of the old cast (awesome though they are as individual performers) and some of the callbacks simply feel like an unwelcome intrusion, sort of like the VIPs that you’re obliged to put on the guest list even though the party will be super-unhip if they actually show up.

On the whole, though, it’s an enjoyable and—dare I say—necessary extension of the Ghostbusters franchise into the 21st century. The weird mix of welcome and unwelcome nostalgia is likely an unavoidable cost of that labor. All the same, what I kind of dig conceptually about the new film is that it formalizes the Ghostbusters not just as a viable franchise, but as a cultural institution, one that’s multigenerational in a meaningful, active sense. What would America be without its institutions—and what would (movie) New York be without its Ghostbusters? ☕

Reflections on revolution in American conservatism, part 2

Previously, I said that I have disavowed conservatism because a majority of American conservatives are aligned with bigotry. It’s a very presentist case to make, and the emergence of Donald Trump as the Republican (read: conservative, or at least “less liberal” than Hillary Clinton) candidate has made it not only easy, but convenient. To be perfectly honest, there’s a little bit of self-defense involved in my sudden deconversion: I don’t want to be associated with the racists and religious bigots on the Right who have made Trump their candidate. Since this blog is public, I don’t want anyone to make the mistake of thinking that I’m on board with the Lars von Trier melodrama unfolding within conservative circles. I’m not, in any way, interested in performing the ethical and rhetorical contortions to justify why I’m still conservative that other self-identified conservatives have been performing in order to explain away why they’re still voting for Donald Trump. I’m also not interested in performing the ethical and rhetorical contortions that other self-identified conservatives (the ones with a moral center) have been performing in order to place the responsibility for Trump’s ascendance on the Left. There’s blame enough to go around, I suppose, but I agree with Damon Linker that the main blame lies with the Right.

Therefore, a question that’s been vexing me for the last year is whether I’ve contributed to the surge of bigotry in any way simply by offering up conservative apologetics in the past.

This isn’t just navel-gazing; it’s a question of ethical responsibility. It’s a question I think every self-identified conservative ought to wrestle with. What is it in American conservatism (leaving aside other Western right-wing traditions) that has enabled Trump, of all people, to be The Guy?

In assaying this question, I hope to make it clear that my disassociation from American conservatism as a political movement is not purely a matter of convenience, but something principled that has been a long time coming. The truth is that it has been difficult for me for a long time to find much overlap between my own politics and the politics adopted by a majority of self-identified conservatives in this country. The difference now is that I no longer see myself as occupying a neglected corner of a big tent, but a place somewhere outside of it. In some ways, I feel that American conservatism pulled up stakes and left me behind some time ago, but it’s also likely that I simply wandered outside the tent at some point without realizing it until, just recently, I took a big gulp of fresh air and noticed that the circus, with its angry clowns and great heaps of elephant dung, was far, far away.

Most of what I write below with reference to historical context is boilerplate summary and in no way my own original argument. As this is a personal reflection and not an academic essay, I’m not going to track down every document that has, for the last several years, nudged my thinking in this direction. Perhaps I’ll cover that in future reflections. At any rate, the historical context is in my own words, but the ideas are not original to me. Please read with that in mind.


Russell Kirk is probably one of the most famous twentieth-century theorists of conservatism as a distinct political philosophy. Among other things, he’s famous for enshrining Edmund Burke as a canonical forerunner of what we, in America, now think of as conservative ideology. His essay “Ten Conservative Principles” is one that I’ve returned to at different points in my life as I’ve tried to balance current political circumstances against my own evolving framework. It’s necessary to remember that Kirk’s main body of intellectual work was published in the context of the Cold War. In fact, what is now mainstream U.S. conservatism as a whole was developed in that context. American society underwent a number of profound social changes in the decades stretching from the end of World War II to the fall of the Berlin Wall.

For the moment, it’s important to keep in mind that conservative intellectuals contended with three political antagonists that they saw as mutually overlapping (or, rather, in alliance against them): 1.) the radical Left intelligentsia, based mainly in universities and cosmopolitan, mostly coastal, urban centers; 2.) international communism, exemplified by the Soviet Union, the People’s Republic of China, and, after the 60s, various countries in Latin America; 3.) the progressive/liberal political movement, dating back at least to TR and Wilson, with presidents like FDR, JFK, and LBJ carrying the torch forward. It’s true that these three traditions included people who felt kinship with all of them, and it’s also true that there were people in each who utterly despised and disavowed association with the others. Broadly speaking, the only thing these traditions had in common is that they were generally “Leftist.”

Apart from that, there were often sharp disagreements in terms of ideology and praxis between them. One of the more important distinctions is that the radical Left was often deeply critical of the entire European Enlightenment tradition, going all the way back to Locke, Kant, and Smith, while liberals often championed their causes on the principles that the radicals despised: individual liberty and reason. The radical argument (and I’m being quite reductive here) is that the Enlightenment tradition paved the way for the worst excesses of capitalism, which undergirded the material wealth and accomplishments of the West, including the wealth gained through the various colonial projects of the European nations. For radicals, individualism and the celebration of reason were just tools for keeping democratic populations docile and too motivated by self-interest to actually follow through on overcoming inequality. Liberals, by contrast, often achieved their most significant victories—making voting more democratic, civil rights legislation, social safety net programs—by emphasizing the dignity and choice of the individual and her exercise of reason. Few of these progressive achievements actually undermined or attacked the foundations of capitalism itself, which is why the radicals saw those victories as nice but ultimately hollow. Communists, for their part, tended to combine the worst elements of Left agitation for equality and liberal designs for rationalized social equality. Liberals deplored the abuses of communist regimes; radicals were often split, with some simply turning a blind eye to communist horrors, some ascribing communist failures to capitalism’s unending malignancy, and some moving even further beyond the framework of the nation-state, often championing variations on anarchist subversion of convention, whether liberal-democratic, fascist, or communist.

Into this context, the modern conservative movement as such was nursed into being.

The “ten principles” outlined by Kirk were originally delivered in a speech to the Heritage Foundation in 1986, late in his life, and more than three decades after the publication of The Conservative Mind. By this point, “movement conservatism” had become an ideology of its own. William F. Buckley, Jr. had the ear of President Ronald Reagan, and National Review was the flagship publication of conservative intellectual thought. Conservative pundits had also established a significant presence in mass media. Readers could follow writers like George Will and Charles Krauthammer in syndicated newspaper columns, radio listeners could tune into commentators like Phyllis Schlafly, and Rush Limbaugh’s national show was only a couple of years away. Jerry Falwell, in the meantime, had consolidated the so-called Religious Right into what he termed a “Moral Majority,” citizens who tuned into Pat Robertson’s The 700 Club or subscribed to newsletters from organizations like Focus on the Family. The Heritage Foundation itself had become a leading conservative think tank by the mid-80s. Organizations like this one were vital to the project of welding a more hardline ideological conservatism, developed among the activist base that had propelled the Barry Goldwater insurgency a generation earlier, to the political platform of the Republican Party.

In reading Kirk’s declaration of principles, we cannot afford to ignore this context. The Cold War caused a lot of misery globally, and, to consider only the American context, it resulted in at least two wars fought to stalemate or loss (Korea and Vietnam), the support of often horrific dictatorships in South America, and the support of proxy insurgents in the Middle East (to fight Soviet aggression into South and Southwest Asia) who would evolve into radical Islamist terror groups or theocratic dictatorships. On a more abstract level, it also managed to calcify ideological divisions in the United States—as far as conservatives were concerned—into two camps. On the one side were the three Left traditions outlined above, conflated by conservatives into a single monolithic force. Movement conservatives are as adept at invoking Stalin and Mao when talking about the Left as liberals are at invoking Hitler and Mussolini when talking about the Right. The Right’s calcification of the Left into a single, homogenous entity during the Cold War is significant for my purposes only in that movement conservatives still have not moved on from that (wildly distorted) paradigm. What they have yet to come to terms with is that the particular fusion of political interests developed during the Cold War era—Christian social/religious hegemony, economic libertarianism (free market triumphalism combined with slashing federal programs), hawkish military spending and belligerence, increasing militarization of police (coupled with extreme positions on individual gun rights), and expanding independent power of the executive branch for national security purposes—was also calcified. Outside the context of the Cold War, the Reagan fusion makes little to no ideological sense. As a coalition of interests threatened specifically by Soviet-style communism, it does. Much as movement conservatism falsely conflated various strains of Leftist thought, it at least responded to a specific real-world situation. That’s not where the world is now, though.

The ten principles outlined by Kirk are not presented specifically as anti-communist resistance. They give the impression that they transcend their moment in time, as principles are meant to do. Kirk’s summation at the very end feels apt as a statement of what conservatism ought to be: a recognition of “an enduring moral order in the universe, a constant human nature, and high duties toward the order spiritual and the order temporal.” That’s almost too transcendent, though. It’s not much different from the affirmation that we ought to bend toward our platonic ideals, rather than bend a mutable universe to our own will. As the foundation for a political philosophy, we could do worse, but there have been moments in progressive thought, especially of the Hegelian varieties, that see progress as teleological, which is often simply another form of pursuing transcendent ideals. We do better to consider some of the specific principles Kirk outlines to get a sense of what he means.

I won’t go through all of them, but I’ll highlight a couple that are most pertinent to my reflections.


Let’s start with this one, which is probably the most recognizably conservative in the American political context.

Seventh, conservatives are persuaded that freedom and property are closely linked. Separate property from private possession, and Leviathan becomes master of all. Upon the foundation of private property, great civilizations are built. The more widespread is the possession of private property, the more stable and productive is a commonwealth. Economic leveling, conservatives maintain, is not economic progress. Getting and spending are not the chief aims of human existence; but a sound economic basis for the person, the family, and the commonwealth is much to be desired.

You can see the fingerprints of the libertarian strain of Austrian economists all over this principle. More specifically, it is thinkers like Ludwig von Mises, F. A. Hayek, and Murray Rothbard who developed the foundations for latter-day libertarian politics. Leviathan is a synonym for statist totalitarianism in this snippet, and libertarians contend that economic liberty and individual liberty are isomorphic concepts, and that if a single entity (such as the modern nation-state) controls the majority of private property, then the freedom to exercise individual liberty is necessarily curtailed. It follows, of course, that those who possess more wealth have more liberty than others, but this is, in the anarcho-capitalist realm of political theory, acceptable, because it both incentivizes individual striving for more wealth and allows individuals who exercise their freedom irresponsibly to lose it.

I’ve flirted with libertarianism for years, and I think it to be one of the most ideologically consistent political philosophies; the consistency itself is appealing to me. Unfortunately, my interest in libertarian thought led me to read a good deal of it. As with many things, direct exposure to something is the best inoculation against it.

Libertarian social economists have yet to provide an explanation of how statist nonintervention is to prevent a form of corporate oligarchy from replacing a functioning representative democracy, apart from the hazy belief that if a corporation runs its fief inefficiently, it will crumble and be replaced by another. For anyone concerned at all with maintaining social order, valorizing the volatility of free market competition and its fallout strikes me as naïve at best, and malicious at worst. Furthermore, the alleged efficiency of the market in weeding the weak competitors from the strong conventionally draws parallels between the processes of capitalism and natural selection. Natural selection, as a natural process, is amoral. Why, then, would human society wish to acclaim an inherently amoral process as a “good,” especially in contrast to public institutions erected to promote specific social goods? Finally, drawing a theoretical link between the amount of wealth one has and the amount of liberty one may exercise is something that both Marxists and libertarians have in common. The main difference is that Marxists wish to promote relatively more freedom for all, whereas libertarians wish to promote more freedom only for the wealthy. In a world where one’s labors directly correlate with the amount of private property one is able to amass, the libertarians might have the theoretical edge. In the real world, where things are not now and never will be fair, yoking possession of private property to possession of freedom is as much the road to serfdom as state socialism.

Of course, Kirk does not advocate libertarianism per se as a principle of conservatism, nor does he say that it is the highest good. Instead, his emphasis on protecting private property ownership, taken in tandem with the other principles, is meant to highlight the benefits of a society in which individuals are encouraged to reap the rewards of investing their time, effort, and wealth into their own property.

Since this is but one of ten principles, we might also recall that it takes no precedence over the others. Yet I do not think there is any principle in contemporary conservative thought held more dear than the idea of “limited government,” especially with respect to state intervention in the economy. Opposition to or critique of unions (public or private sector), state spending on education, welfare programs, environmental protection, and business regulation of any sort invariably falls back on the notion that “the market,” rather than the government, is better suited to mediate virtually every human endeavor.

Edmund Burke’s Reflections on the Revolution in France was particularly animated by the revolutionaries’ confiscation of private property, which he saw as a precondition for the lawless totalitarianism soon to follow. But even for Burke, a key issue was the balance of power between the landed nobility, the church, and the crown, whose mutually-dependent property relations underwrote the economy of the ancien regime the revolutionaries sought to upend—in short, Burke wasn’t defending private property rights as defined against the state as such, but against the illicit confiscation of property by a particular (in his view, illegitimate) government, judged against the particular historical circumstances of its constitution. That’s not quite the same thing as championing private sector solutions to public sector problems. Kirk’s own position, as stated in the principles, clearly is pitched against total state or communal ownership of property, but in no way does he militate against government intervention per se. What is primarily at stake here, as I see it, is the degree to which private property is the medium of individual liberty. In the context of the Cold War, it makes sense that a conservative would uphold the necessity of private property in opposition to the communist governments of Russia and China, which often used privation tactically to neutralize dissent, or, at the very least, had millions of citizens who had equality but nothing to do with it.

Confiscation is a powerful political weapon, and communist regimes have rarely hesitated to deploy it. (Again, I recognize my reductiveness.) Though Burke had indeed harshly condemned the French radicals’ confiscation and redistribution of property—as well as charged them with economic illiteracy—his eighteenth-century view of property rights was much more moderate than late twentieth-century conservatives’ view. Kirk’s argument is favorable to the libertarian-ish Reagan era, but to say that “freedom and property are closely linked” is not to say that freedom depends upon private property. At the heart of Burke’s critique was the state’s exercise of power through lawless disregard for private subjects’ established claims to their property. Kirk observed a similar operation in the communist regimes of his century, but those were examples of totalitarian state tyranny, not examples of what happens whenever the state is an economic agent. That’s a distinction lost on whoever conflates all Left traditions into a monolithic whole. Despite conservative claims to the contrary, Keynesians are not de facto socialists or communists or totalitarians. Seeking the state’s aid in addressing the consequences of gross economic inequality is not tantamount to seeking complete economic leveling across the board. A close link is not a necessary causal relationship.

There aren’t many communist nation-states left, and arguably none that don’t qualify as failures, dictatorships, or (as in China’s case) quasi-state-capitalist. In 2016, much of the rhetoric surrounding economic justice is directed toward the obvious fact that a relative handful of people control the economic futures of everyone else on the planet. It may be the case that private property and personal freedom are indissolubly linked, but for most people in the world, that only means they have less freedom than others, and the structure of capitalist accumulation always works to the benefit of those who have already accumulated more capital. Any speech about regulatory infringements on freedom, whether business or environmental, made on behalf of people who already have money is therefore a speech about preserving the inequalities that already exist.

I’ll say that again. When conservatives in 2016 talk about defending the free market, they are talking about defending economic inequalities that are part of the current economic structure.

Take that as you will. I’m just trying to put it in historical context.

The conservative principle of yoking personal wealth and personal freedom against economic leveling is not about protecting us from state communism. It is about enabling those who already possess property to get more. Again, in theory, maybe this isn’t so bad. It’s the American Dream to take your shot at prosperity, right? Well, yes, but in a market based on competition, there will always be losers in that competition. And in a market-based competition, the loser loses his property to the winner. The loser loses freedom.

Part of what makes freedom scary is that we have the freedom to fail. In theory. It also means that we have the freedom to succeed, often in amazing, unpredictable ways. In theory. That also means that we can fail in traumatic, unpredictable ways. A free market offers no provision for the losers. For those with a modest amount of private property to gamble, the stakes involved might indeed inspire prudence and innovation. For those with a great amount to gamble, the stakes involved might lead to recklessness. After all, the wealthy can afford golden parachutes. They have that freedom, whereas Joe and Jane Smith, who opened a restaurant with their life savings, will simply lose everything if the economy turns south.

Mind you, I’m not saying that the market is inherently bad. It is not inherently good, either. And having respect for the social benefits of private property ownership is not the same thing as libertarian free market triumphalism. So when I read conservatives prating about the efficiency of the market or how capitalism is pretty much the greatest thing ever or that money equals free speech, or that we can’t have freedom without unfettered capitalism, it grinds me to the core. I happen to accept the analogy that the market functions similarly to natural selection, but, because I have a moral sensibility, I am always inclined to weigh the human costs involved in that proposition. As a result, I don’t think the market deserves protection from state intervention; I think that human beings deserve protection from the fallout of a free market system.

Kirk, of course, was willing to associate himself with the Reagan coalition, which had quite radical ideas. I lack both the knowledge and time to discuss Kirk’s entire corpus as it relates to movement conservatism’s apotheosis in the 1980s. What is fascinating to me in the passage I quoted, though, is how tempered Kirk is, how much nuance he leaves in his contention that property rights underlie political freedom. Not only is this principle just one of ten (clearly not taking precedence over the others), but his presentation of property as foundational to the commonwealth really is fundamental—in the sense that it provides a stable basis for the family, for the community. This is not free market triumphalism. When conservatives valorize the market, with its volatility, its amorality, its blind indifference to reason and human dignity, they embrace the violent chaos of a Darwinist cosmos. It is to this vision of the cosmos that Kirk’s seventh principle stands opposed. Civilization is not built upon the chaos of the market, but upon the investment of human striving into material artifacts. The market is, as Joseph Schumpeter put it, an engine of creative destruction. This kind of instability, of rupture, of impermanence in the natural order is not, cannot be, representative of the transcendent moral order that Kirk declares to be first and foremost in the conservative mind.


Second, the conservative adheres to custom, convention, and continuity. It is old custom that enables people to live together peaceably; the destroyers of custom demolish more than they know or desire. It is through convention—a word much abused in our time—that we contrive to avoid perpetual disputes about rights and duties: law at base is a body of conventions. Continuity is the means of linking generation to generation; it matters as much for society as it does for the individual; without it, life is meaningless. When successful revolutionaries have effaced old customs, derided old conventions, and broken the continuity of social institutions—why, presently they discover the necessity of establishing fresh customs, conventions, and continuity; but that process is painful and slow; and the new social order that eventually emerges may be much inferior to the old order that radicals overthrew in their zeal for the Earthly Paradise.

Conservatives are champions of custom, convention, and continuity because they prefer the devil they know to the devil they don’t know. Order and justice and freedom, they believe, are the artificial products of a long social experience, the result of centuries of trial and reflection and sacrifice. Thus the body social is a kind of spiritual corporation, comparable to the church; it may even be called a community of souls. Human society is no machine, to be treated mechanically. The continuity, the life-blood, of a society must not be interrupted. Burke’s reminder of the necessity for prudent change is in the mind of the conservative. But necessary change, conservatives argue, ought to be gradual and discriminatory, never unfixing old interests at once.

Once more, the context of the Cold War is essential to understanding this principle, and Kirk uses notably Burkean language when he describes “the old order that radicals overthrew in their zeal for the Earthly Paradise.” In the eighteenth century, the “radicals” would have been Jacobins; in the twentieth, they were the Bolsheviks. What’s at stake here is both preserving “custom, convention, and continuity” and rejecting the rupture that revolutionaries often champion when attempting to subvert the old order.

In the American context, this principle is more than a little weird when it’s advanced by conservatives who insist on using the Founding Fathers as pole stars in their rhetoric. Just in case you forgot, the United States formed in the wake of a revolution fought by British colonists against Great Britain. This revolution was founded on Enlightenment notions of natural rights, and it pitted the received parliamentary tradition directly against the monarchy, rather than in partnership with it. While I happen to be of the mind that the American revolution was indeed quite radical, and that my nation’s forerunners had just grievances that went unjustly unaddressed, I am also fully convinced that there was nothing conservative, in temperament or politics, about the revolution. Historians would be quick to point out that when the Constitution was drafted, it drew upon a long tradition of English common law, and that the colonies already had a functioning civil service apparatus to fall back on once the shooting stopped. In short, the radical break from their colonial overlords was founded upon inherited principles and transitioned (somewhat) smoothly thanks to laws, traditions, and infrastructure already in place. In that sense, the American revolution preserved much in custom, convention, and continuity.

But it was still a revolution.

Men, women, and children were killed in battle and as collateral damage. British loyalists were sometimes tortured or killed for their beliefs. Early American history is not my forte, but when I say that the revolution exacted a horrific human cost on both sides, I do not exaggerate.

The Declaration of Independence is, for good reason, one of the canonical documents of American society. In its language, statesmen, philosophers, and everyday people have found a wellspring of wisdom and authority—most often the latter, when invoking the Declaration to frame their own agenda. I return to it each year on or around our Independence Day, and it always strikes me anew as incredible. Mind you, I use that term in its literal sense (scarcely to be credited) rather than as praise or condemnation. What people often forget is that the Declaration was not a mere statement of beliefs; it was an argument. Allow me to quote liberally from it by way of illustration:

When in the Course of human events, it becomes necessary for one people to dissolve the political bands which have connected them with another, and to assume among the powers of the earth, the separate and equal station to which the Laws of Nature and of Nature’s God entitle them, a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation.

We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.–That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed, –That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness. Prudence, indeed, will dictate that Governments long established should not be changed for light and transient causes; and accordingly all experience hath shewn, that mankind are more disposed to suffer, while evils are sufferable, than to right themselves by abolishing the forms to which they are accustomed. But when a long train of abuses and usurpations, pursuing invariably the same Object evinces a design to reduce them under absolute Despotism, it is their right, it is their duty, to throw off such Government, and to provide new Guards for their future security.

All of this is praemunitio for the series of specific charges laid against the British government, the charges that justified the colonists’ secession from the government they had lived under for more than 150 years. While the lines about “Life, Liberty and the pursuit of Happiness” are likely the most quoted, it is really that last sentence in the above quote that forms the core of the document’s purpose: the colonists had to prove that what England had become to them was really “absolute Despotism,” rather than a constitutional monarchy. A despot, you recall, is a ruler who arrogates all civil institutional power to himself alone. To charge England with reducing them to absolute Despotism was to charge the king with ignoring the rule of law and acting autocratically, outside his prescribed legal authority, to oppress the Americans. In short, to justify their own flouting of all legal authority, the colonists had to prove that the king flouted all legal authority first.

The Americans’ allegations had teeth, of course. What remains remarkable in the introductory remarks of the Declaration is that the authority to make their political rupture was arrogated to themselves as a “people,” somehow no longer quite English, but owing their decision as much to their own “duty, to throw off such Government” as to “the opinions of mankind.” If this feels vague and jazz-handsy, it is. It’s grounded in a lot of well-established political philosophy (Locke looms particularly large), but it nevertheless hinges on the premise that a rupture between two distinct peoples has already occurred, and that this justifies the further rupture between them. It is a document that acknowledges the continuity that previously existed, then alleges that the fault for its dissolution lies entirely on the other side.

One of the many great ironies of American history is the contemporary resurgence of interest in the Founders among the radical right. Most notably, the grassroots opposition to President Obama that coalesced in or around 2009 dubbed itself the Tea Party. Instigated first by the president’s stimulus programs (intended to spur an end to the recession that began in 2008 with the collapse of the housing bubble and the derivatives market), then truly galvanized by the passage of the Affordable Care Act, the latter-day Tea Party’s talking points in the early days revolved around the role the federal government ought to play in its citizens’ lives and how much it ought to spend toward that end. Basically, Obamacare was to them simultaneously an egregious intrusion upon citizens’ freedom to manage their own health care and a condensed symbol of Washington’s spendthrift waste of taxpayer resources. To hear the rhetoric, you would think that Barack Obama was quartering redcoats in our homes and torching our shopping malls with napalm. Tea Partiers see themselves as fighting to protect long-cherished liberties that are inherent to the American character and necessary to preserving our way of life.

While I think Jill Lepore has written the definitive essay on the Tea Party movement’s appropriation of American history, I do think that the movement, quite by accident, stumbled into the perfect metaphor for itself. According to its latter-day exponents, the Boston Tea Party of 1773 is a symbol of the people’s refusal to let its government exercise tyranny through unjust economic policy. So it was.

Never mind, though, that England’s coffers were sapped, in part, by the British Empire’s efforts to head off French imperial ambitions aimed at, along with a few other continents, North America. You might recall getting confused back in school about who was on what side in the French and Indian War—American colonists called it that because the French persuaded indigenous people on this continent to fight alongside them against the British. At that point, Americans were still Britons, as was a young soldier named George Washington, who cut his military teeth in the conflict fighting alongside British forces. The French and Indian War, however, was just one part of a conflict that spanned the globe. Fighting wars has always been expensive, and when this particular war of empires ended in the 1760s, that tyrant, King George III, whose armies had killed and bled to protect his American colonial subjects, had to recoup costs. British legislative efforts to balance the ledger that had become stained with red on behalf of Americans (among other British peoples) were the first to stoke the fires of revolution in the thirteen colonies. If I may put it a bit ungenerously, the Americans didn’t want to foot the bill for a war that England had fought to preserve their life, liberty, and pursuit of happiness.

So the Americans were one people with the British when the French were storming their borders, but fifteen years later, they were merely “one people to dissolve the political bands which have connected them with another.”

In the space of one generation, a rupture in cultural and institutional continuity had been effected by men who were primarily incensed over taxation without representation, a rupture of such magnitude that they went right back to mass slaughter, this time over the principle of liberty.

Once again, I acknowledge my historical reductiveness, as well as the fact that I’m playing fast and loose in my selection of historical facts, which seem always to conspire in their complicatedness to thwart every easy narrative that we try to make of them. Mea culpa.

What this digression is meant to illustrate is that contemporary conservatism is largely shaped by a spirit of radicalism. It is, furthermore, a radicalism that seeks to sever custom, convention, and continuity on multiple levels. While conservatives are busy embossing their newsletters with quotes from the Founders, they neglect to acknowledge the legally dubious nature of any revolutionary project. They further neglect to acknowledge that the rationale justifying the entire revolutionary project devolved upon a parcel of citizens the authority to decide whether any other “people” who disagreed with them about what form of government is most conducive to the ends of providing Safety and Happiness would have a say in the matter. The logic of the American Revolution, in other words, dictates that you can justify any political action, so long as it is in response to Despotism—and, as luck would have it, those with the authority to identify Despotism usually happen to be the ones proposing the radical political rupture with others who were, until just recently, their fellow citizens.

It is my fundamental contention that American conservatism, as currently expounded by a majority of its self-identified adherents, effaces old customs, derides old conventions, and breaks the continuity of social institutions. And if American conservatives think it unfair of me to say so, they might consider that it is they who nursed for decades the antigovernment, anti-establishment rhetoric that eventually birthed the Tea Party, a political movement that named itself after a bunch of middle-aged white dudes dressing up in racist costumes and committing an act of large-scale vandalism because their parliamentary representatives approved a tax hike.

I’ve no idea what Kirk would have thought of the contemporary Tea Party or its radical politics. At the time that Kirk wrote his ten principles, his counter-revolutionary ideas were adequate to address the manifest failures of the Soviet Union and its offspring to follow through on their own revolutionary ruptures. Perhaps there’s contemporary relevance to the efforts of ISIS to destroy the cultural history of the cultures whose people have been afflicted by the misfortune of residing in territory occupied (however briefly) by the so-called caliphate. (Perhaps.) What strikes me is that we have had Obamacare now for about as long as the American Revolutionaries had King George’s tariffs before they decided to dump tea into Boston harbor. I doubt that many conservatives are making serious plans to secede from the Union. I do see, however, a parallel logic between the Declaration’s division of two peoples and the commonplace call to “Take our country back.” From whom? Our fellow Americans?


If you think that perhaps I’m being unfair to overlay conservatism so isomorphically with the Tea Party, allow me to direct your attention to the all-but-concluded Republican presidential primary. No, not to Donald Trump (although I’ll come back to him in a bit). Indeed, I think it is far more significant to reflect on the fact that the two men left standing at the end of the long march to Cleveland were Donald Trump and Ted Cruz.

Cruz is about as emblematic of the Tea Party as you can get, and he was widely acknowledged as the most conservative candidate running in this election cycle. While Trump is the more overtly noxious of the two, I think Cruz to be much more symbolic of this cultural moment in American movement conservatism, even in his failure to win the Republican nomination.

How many times did Cruz vow in his campaign to “repeal every word of Obamacare”? He offered a list of four cabinet positions and 25 agencies he would fight to eliminate. He said he’d carpet bomb ISIS strongholds, regardless of civilian casualties. When Donald Trump called for a ban on all Muslim immigration, Cruz countered by suggesting that we turn Muslim neighborhoods in the U.S. into state-run panopticons. He vowed that, on his first day in office, he would tear up the historic diplomatic deal with Iran. He led the fight in the Senate to push the government into a shutdown rather than fund the Affordable Care Act.

Conservatives call Cruz “principled.” Indeed he is. (Except when he spent eight months apparently not telling the truth about his main opponent, but whatever.) The question is how he goes about applying those principles. It is my view that Cruz is not conservative. He is radical. It is radical to eliminate entire cabinet positions and agencies, just as it is radical to “repeal every word” of a law that is now an integral part of the nation’s health care system. It is radical to violate the constitutional rights of Muslims on account of their being Muslim. What Cruz consistently espouses is the violation of procedural norms for the sake of ideological purity. That’s radicalism. How is the word-for-word repeal of Obamacare by executive fiat any less radical than its creation and passage by duly elected representatives of the American people? (It is, in fact, more so.) How is Cruz’s open declaration to turn American Muslim communities into de facto police states any less radical than, say, anything the president has said about limiting access to guns designed for efficiently killing mass numbers of people? (Again, I think it’s more so.) If anything, I think the president has largely worked within the institutional norms established by his predecessors, for better and worse. That is de facto conservatism.

Except, to the vast majority of American conservatives, it’s not.

The #NeverTrumpers endlessly repeat that Donald Trump is not now, nor has he ever been, a true conservative. Indeed, Trump hasn’t really claimed to be. His supporters are the ones who claim to be conservative. Those agitating for the GOP to modify its rules so delegates won’t be bound to coronate that orange pile of smarm (as my wife calls him) are quick to point out that he didn’t win a majority of votes—just a mere plurality, as if winning more primary delegates than any other single person were insufficient reason to nominate him the torch-bearer of the party whose candidate in 2000, far from being swept into office by overwhelming popular vote, barely eked out a victory with a contentious Supreme Court decision that upheld the validity of the electoral college. These same #NeverTrumpers who pooh-pooh Trump’s mere plurality spent four years defending the legitimacy of George W. Bush’s first term on the same principle that Trump’s supporters cite to defend his right to the Republican nomination.

That’s why it’s important to consider Cruz and Trump together. Quite apart from the Republican media machine, the fact is that Cruz and Trump, considered together, do present a snapshot of conservative America. A majority of American conservatives voted for one of these two men. The question is what these two men have in common. Certainly not policy ideas. While they both took hardline stances on immigration, and while they are both bigoted against Muslims (though Cruz talks his way around it more smoothly than Trump does), the only thing they really have in common is that their notions about how to go about actually governing the country are fundamentally extreme. Radical.

I do agree that a big part of Trump’s rise is simply attributable to old-fashioned racism. Old bigotry dies hard. I’d like to suggest, though, that there’s a deeper link between the fears of racial contamination and the trend toward polarization fueled by increasing ideological puritanism.

More than authoritarianism, what I think radicalism often does is lead people to place greater and greater faith in the power of ideological purity. Conservatism, as expounded in America for the last twenty years, takes extremism as a litmus test of seriousness.

When an entire political movement vets its candidates based on their commitment to radicalism, it is not difficult to see how a bully like Trump would appeal to its base. His entire shtick is premised on being extreme. It shows that he means business. The real story of the 2016 election is not how Trump got to be the Republican nominee. The real story is how people like John Kasich, Chris Christie, and Jeb Bush—politicians who are already very conservative—were deemed too moderate by the conservative base that turned out for the primaries.

Cruz and Trump were the last men standing because they were the most extreme candidates of the bunch. Despite all their manifest differences, what united them in their electoral successes was that every other conservative in the primary field was insufficiently radical. And when it came to the particular constellation of policy issues on which the conservative movement has staked its claim for the entirety of my lifetime, the conservative base didn’t go with the guy who became the most hated man in Washington for his utter commitment to ideological purity. No. They simply went for the guy who took the most pride in being offensive and amoral.


To me, conservatism is quintessentially a productive, perhaps even progressive, resistance to radical change.

Is there any other conservative in America that you can think of who sees conservatism that way? Oh, aye, I’m sure there are a few. As I said in my previous post, however, a “few” do not a political philosophy make. Not unless they gather adherents; not unless those few are deemed in time to be originators of the discourse, as Foucault might put it.

In the last few years, I’ve found myself gravitating more toward thinkers who openly declare at least some allegiance to the tradition of liberalism, like Alan Jacobs or Damon Linker. My go-to source for news analysis is Vox, and for political scuttlebutt I hit Talking Points Memo. My favorite general interest magazine is The Atlantic. I relish reading Walter Benjamin. I’ve transmogrified into an over-educated, low-income Mugwump. Still, I’ve kept thinking of myself as conservative, even though, apart from a few amateur bloggers, the only conservatives I read regularly publish in The American Conservative, a magazine notable for the diversity of nuanced, erudite perspectives found among its contributors, at least the ones who aren’t named Patrick J. Buchanan, who is a racist ass. The rest tend to be pretty good, which is to say that they did not spend the first eight years of this century circling the wagons around George W. Bush and the past eight years frothing at the mouth every time Barack Obama laced his shoes left side first.

The fact is that, in terms of my opinions on specific issues, I’ve not been conservative for some time now. It feels as though I woke up one day this last year, looked around at the political landscape, and thought, “Have I ever actually been one of these people?” Well, yes. I was, at one point. That much is true. And though I will admit that I feel ashamed of it, I feel compelled to declare that I ought not to be. First and foremost, I believe that every person ought to be granted the right to be persuaded to change their opinions over time. Mine certainly evolve, but very slowly. My current political outlook could only have been formed by the path I’ve traveled to get here. Being able to recognize the moral insanity of American conservatism is a blessing (I think?) that flows from what I’ve attempted for years to call my own conservative political convictions. The fact that my political convictions are not, in aggregate, conservative is the sort of epiphany only granted, I suspect, to those who have the perverse sensibility to luxuriate in disillusionment. What’re those sayings? “A conservative is a liberal who has been mugged by reality,” and, “Reality has a liberal bias.” Here’s one, just for kicks: a liberal is a conservative who has been mugged by radicalism.

At bottom, I don’t really feel that I have a particularly large share of blame to shoulder for American political conservatism’s collective embrace of bigotry and radicalism, because it’s my temperamental conservatism that causes me to regard that spectacle with moral horror. It simply means that, politically, I must formally recognize that I am the loyal opposition to their ranks, not an outlier within them.

As tempting as it is to switch my label, as Linker did years ago, to liberal, or to triangulate myself, as Jacobs has, somewhere within a constellation of conservative-liberal-socialist positions, that doesn’t feel right to me, either. Not at this time, anyway. Given my temperamental conservatism, it’s probably obvious to you by now that such a shift would be a bit too swift, a bit too premature. What I hope that my conservative, liberal, socialist, libertarian, anarchist, distributist, and communitarian friends and acquaintances understand is that whatever our differences, there are some principles to which I will continue to adhere for the time being.

First, I will presume good faith on the part of individual political advocates—just because we disagree on something (or most things) does not mean I must ascribe to you all that is unholy and pernicious, and I hope you will extend the same courtesy to me. Second, I will presume that no individual is beholden to all that is most rotten in his own political tradition—I will continue to refrain from holding each Leftist accountable for Stalin and Mao as I will from holding each Right-winger accountable for Hitler and Trump. Third, I will endeavor to blunt the appeal of radical measures as political solutions, regardless of the nobility and justice of the goal. Fourth, I believe that I have something valuable to learn from all political traditions, and I will presume that interactions with someone from any tradition will teach me something I can reflect on as my own political philosophy continues to evolve.

I believe that each tradition is capacious enough to include someone with those principles. It seems to me that these principles create enough of a foundation for mutual understanding that productive dialogue can take place between us.

That said, I do think that labels have power, and I wish I had one that could accurately capture my political philosophy. As J. L. Austin argued years ago, words do things. To call myself liberal or socialist is to perform some sort of meaningful social act, and consequences follow. Unless and until I have better understanding of what those consequences are, I don’t think it prudent to throw in with you, whatever your political orientation may be. This isn’t because I don’t wish to associate with you; on the contrary, I wish our association to be productive. I wish to test you, and to make myself a better thinker and to clarify my own political ethics as part of the exchange. I just don’t wish to identify with you unless you and I are satisfied that, by doing so, we’re being accurate, and that there is something to be gained for all by having me join your tribe. By all means, proselytize me if you can.

It is my belief that politics ought to be productive. I don’t place my utter and complete faith in politics, but politics ought to serve a purpose beyond establishing and maintaining the hegemony of a radical minority. Conservatism in the United States, at present, is opposed to this conviction, which is why I therefore stand in opposition to American conservatism. ☕

Reflections on revolution in American conservatism, part 1

This blog is not primarily meant for engagement with contemporary electoral politics, but I do agree with Alan Jacobs that blogs are meant, among other things, to hold their writers publicly accountable for thinking out loud. Current events therefore demand a reckoning of sorts.

People who know me well and longtime readers of this blog know that I have been identifying myself with conservatism. I’ve struggled mightily to retain for that label, insofar as it applies to myself, something resembling moral integrity.

To start off, then, I’d like to associate myself with a couple of posts made by Jacobs, the first from his blog at The American Conservative:

We all know what Trump is: so complete a narcissist that the concepts of truth and falsehood, right and wrong, are alien to him. He knows only the lust for power and the rage of being thwarted in his lust. In a sane society the highest position to which he could aspire is apprentice dogcatcher, and then only if no other candidates presented themselves.

If you put a gun to my head and told me that I had to vote for either Donald Trump or Hillary Clinton, I would but whisper, “Goodbye cruel world.” But if my family somehow managed to convince me to stick around, in preference to Trump I would vote for Hillary. Or John Kerry, or Nancy Pelosi. In preference to Trump I would vote for the reanimated corpse of Adlai Stevenson, or for that matter that of Julius Caesar, who perhaps has learned a thing or two in his two thousand years of afterlife. The only living person that I would readily choose Trump in preference to is Charles Manson.

And this one, from his personal blog:

As a conservative-liberal-socialist, I don’t fit onto any political maps that I know of, and I am accustomed to feeling slightly out of place — more, out of focus — in any given policy debate. But despite the sizable liberal element in my own personal political constitution, in times of serious conflict — today’s Brexit contretemps, for instance — I am always temperamentally alienated from liberalism. For what distinguishes many (most?) liberals from both conservatives and socialists, as today’s social media torpedoes reveal, is genuine incomprehension that any sane and decent person could disagree with them. […]

And this is why, despite the significant proportion of my political views that is genuinely liberal, I am less at home among liberals than among any other political group. Once their howls of outrage get wound up — and there is no outrage like that of a thwarted cultural elite — I just want to back quietly out of the room, close the door behind me, and get as far away as I can.

What I’ve confirmed over the course of the past year of following national politics is something I’ve come to realize over the last several years—or, rather, in the last decade and a half.

A central tenet of what I call “conservatism” is that the opposite of conservatism is not liberalism but radicalism. Aphoristically: conservatism is a principle of political temperament, not a policy agenda.

Edmund Burke wrote, “A state without the means of some change is without the means of its conservation.” The same is true, I think, of an individual’s political philosophy. There’s no need to retain ideological dogmata if they retain little value over time, but we ought not discard received wisdom lightly.

Within that framework, though, I consider President Obama a conservative. That might get me a “no duh” in countries much more liberal than the U.S. (or from Americans who had persuaded themselves that Bernie Sanders, bless him, was not chasing a herd of flying pigs on his unicorn), but here, that label just doesn’t work. I doubt the president himself would embrace it in the current political climate. While there are self-identified conservatives who highly prize being anti-radical in temperament, there are few to none who use that as the primary criterion for what constitutes conservatism.

So: either everyone else in America misunderstands what conservatism is, or I do.

Around this time a year ago, I may have been tempted to say that political conservatism of some extant variety was still recuperable. I would have continued to do my part to make it so.

Circumstances dictate, however, that I categorically reject any association with the category 5 flustercuck that has been brewing in the GOP-conservative coalition for the last few decades. I’ve never been a Republican, but my temperamental conservatism has, like Jacobs’s, led me frequently to identify more with those aligned with the conservative (or classical liberal, if you prefer) tradition than with the Left. Much as I tried to distance myself from particular noxious ideas within that tradition, I never thought it necessary to renounce a shared political identity tout court. That’s over now.

You can read my opinion on the Republican Party’s wholesale embrace of bigotry in my commonplace blog. Since I’ve been old enough to vote, I’ve never identified as a Republican, but I have valued my identity as an unaffiliated independent. Until this year, I would have at least thoughtfully considered a Republican before casting my vote. No longer. I will never vote for a Republican for any elected office. Ever. I don’t care how much I like an individual candidate. Whatever happens at the convention this month, the Republican Party has amply demonstrated its commitment to the values of racism, sexism, xenophobia, religious bigotry, and tyranny. Consequently, my vote will never be used in support of that peculiar institution.

Conservatives may point out that “conservatives” and “Republicans” are not isomorphic groups. True enough. There are still several conservative thinkers I genuinely respect and admire (Jacobs among them). They comprise a vanishingly small group. Most of them do not identify strongly as Republican. Even if they are decent, intelligent, and erudite people, I’m afraid that they do not typify, in my view, American conservatism. They are the rare exceptions, and I can’t identify myself as part of a tradition if I selectively edit its roster to include only the handful of good folks who aren’t braying sociopaths or historically illiterate bletherskates.

This is a matter of lex parsimoniae. 1.) A majority of Republicans self-identify as conservative. 2.) A plurality of Republicans has endorsed Donald Trump for president. 3.) Most “movement” conservatives who command the lion’s share of public attention support Trump in the name of conservatism—or, at the very least, in the name of defeating liberalism. Quack, quack, quack. That’s a flappin’ duck, folks. And this foul game* is bigger than one election cycle.

Something is rotten in the state of American conservatism, and I, for one, refuse to follow that shambling ghost to the parapet.

My political temperament is still best described as conservative. That will certainly have influence on my political views, but it in no way reflects my identification with whatever the public discourse calls political conservatism. Let me stress this point. American conservatism has placed Donald Trump, a person in possession of mostly vile and/or dangerous political opinions, in serious contention for the presidency. Conservatism in the United States has led itself to this moment, so I think the time has come for anyone who still wants to call him- or herself “conservative” to reflect critically on what, exactly, they believe and whether the devil has given them a good price for their souls.

Time to rub the scales out of my eyes. Whatever I am, “conservative” apparently no longer applies, at least in any politically meaningful way in the present cultural context.

To be continued.


* Couldn’t resist. I’m so very sorry.

Updated with link to Part 2, 16 July 2016.

How does Marvel’s culture industry manage to keep hope alive?

Like most folks who saw it, I enjoyed Captain America: Civil War. It was inferior to the previous two Captain America films, in my estimation, but it was better than Age of Ultron. Much can be, and has been, written about the drawbacks of the Marvel/Disney entertainment monolith, and I’ve been ruminating on the film since I saw it. Chuck Bowen recently used Civil War as the occasion to reflect on the State of Summer Cinema. Allow me to use Bowen as the occasion to reflect on everything that Marvel has done right so far. This will get a little dialectical. Bear with me.

Contrary to a cliche that dogs film critics, I don’t enjoy disliking nearly every movie that earns a significant amount of money. My words are carefully chosen. “Disliking” rather than “hating”, because to inspire such a passionate response as hate would require more than a preordained blockbuster usually offers. Works of art are like people: to hate either, one must be accorded a glimpse of their personality first, and a failure to exhibit personality provokes a muffled, low-risk indifference. But try telling people this sort of thing about a Marvel production and you’re a snob.

Of all people, I can empathize with Bowen’s gripe about his own audience’s bad-faith reception. The fact that one may simply not like a film (as opposed to hating, disliking, or any other strongly-agential gerund you please) does not compute for most people. When you’ve seen enough movies, the most common reaction, sadly, is non-reaction. Movies that most folks “love” or “hate” or think of as “just okay” or (God help me) “interesting” are, to the jaded cinephile, just sort of there. It’s almost a mercy when I actively hate a film, because I’m relieved to have my emotions excited by the experience. So what I’m saying is that I totally get what Bowen’s saying here. He’s got some other pithy observations, too, such as when he compares blockbusters to the cautious personalities of job interviewees, or when he concedes that, given the choice simply to skip the requisite blockbusters, “Unending exclusion is dull and estranging,” so it’s probably better to be in the loop than out.

People have short cultural memories, but blockbusters used to occasionally be enjoyable. Even weird. Their plots might have been recycled and disposable, but they had plots, and some of them had ineffably powerful images. Raiders of the Lost Ark, one of the most influential of all blockbusters, is almost quaint now in its fealty to the idea of one hero, one villain, a heroine, a few colorful supporting characters, a MacGuffin, and a story that tied all these elements together with pleasurable simplicity. And while its protagonist, Indiana Jones, was an indestructible superman, he also has discernible human characteristics. For one, he clearly liked sex.

Whuh? I’m not quite sure how Indiana Jones is an “indestructible superman” (fridge-nuking notwithstanding), because the first and third films end with literal dei ex machina that emphasize the hero’s relative powerlessness. Also, what the hell does it mean that “he clearly liked sex”? Is that the most readily identifiable human characteristic—liking sex? Unlike Tony Stark, for instance?

I’m also unsure how the Thor movies or the first Captain America weren’t pleasurable in their simplicity: one hero, one villain, a heroine, a few colorful supporting characters, a MacGuffin, and a story that tied these elements together. Guardians of the Galaxy had multiple heroes, but it stands as this decade’s superlative example of exactly the kind of film Bowen is complaining that the blockbuster machine doesn’t produce. I haven’t seen some of the other films he name-drops, but if some ivory-tower-bound neckbeard like myself, who sees maybe ten new movies a year anymore, can think of several counterexamples offhand—taken from the very franchise he’s arguing about—one might get the sense that Bowen’s punching a bit above his weight class. Or simply being obtuse. To wit:

“On a scene-by-scene basis, this new Marvel uber-movie makes almost no sense, hopscotching across dozens of cities and a couple of different timelines, plugging new superheroes such as Black Panther (Chadwick Boseman) and yet another Spider-Man (Tom Holland), while dropping cute little in-jokes designed to pressure audiences into catching up with past Marvel installments that they may have missed. This is the most irritating component of the new corporate blockbuster: it’s always heckling you to buy more, without ever giving you what you already paid for. It may be called Captain America, and more or less be a sequel to the vastly superior The Winter Soldier, but it offers a buffet of superheroes designed to abound in so much as to offer each audience member enough of what they individually like, so that they can each retrospectively assemble a different, more focused movie in their minds.”

This paragraph foreshadows how Bowen misunderstands Marvel’s project in some very fundamental ways. First, I’m not sure how the film makes no sense on a scene-by-scene basis. It doesn’t really hopscotch different timelines so much as parallel storylines. Most of these scenes are united by the MacGuffin (that’s right, folks—there is one!) of the Sokovia Accords, a multinational agreement among the world’s nations that superheroes require some sort of civilian oversight. Coming on the heels of Ultron’s robot uprising and Hydra’s hijacking of the U. S. military-industrial complex (not to mention the emergence of the Inhumans, if one still considers Marvel’s TV franchises to be part of the same universe), one can understand that non-superheroes might want a say in how the Avengers conduct their affairs.

Civil War is a sequel to Winter Soldier, but it’s more properly a sequel to every Marvel movie to date. Unlike virtually every major franchise produced by Hollywood, Marvel has made a point of making sequels that actually push their characters forward. It’s TV-style serialization—which, in turn, was influenced by early film serialization, so I guess the blockbuster has essentially come full circle. Unlike Raiders of the Lost Ark, which borrowed liberally from the early serials’ tropes, the MCU has borrowed from their episodic structure. More accurately, it uses the serial structure that has been the backbone of superhero comics for going on a century. While it’s fair to say that there are not really any long-term consequences in serialized comics (the medium that gave us the term “retcon”), there are often short- to medium-term consequences. Superman today is (sorta-kinda-pretty-much) the Superman of the 1940s, plus or minus a few powers. But there’s a gravitational power exerted by the shared universes of the two major comics publishers that basically requires their worlds to maintain a certain status quo. Mostly for business reasons: after all, it makes it easier for new readers to jump into a series when the basic premise and stakes are never-changing. There’s also a storytelling exigency, though: long-running series change creative teams from time to time, and it’s easier to do new(ish) things within an established paradigm if you’re not hamstrung by a never-ending series of paradigm shifts introduced by each previous team. It’s more about finding interesting new facets of a superhero (and that hero’s mythos) to explore, rather than totally reinventing the superhero from scratch. The Marvel movies do this better, in my opinion, than the Marvel comics.

The Marvel Cinematic Universe simply has exigencies that the comics do not, and that makes the movies more appealing to a certain kind of audience. The primary reason I never got deeply into superhero comics, even when I was collecting them, is that I didn’t have the money or patience to read everything I needed to know in order to fully appreciate the context of a current story arc. Storytelling in mainstream superhero comics is intransigently incestuous. After all, the whole point of having a shared universe is to do crossovers. That’s great when there are only, like, a dozen titles in a given universe. It’s enough to induce a panic attack when dozens of titles and hundreds of characters are in play over the course of decades.

What’s worse is that so much of the pathos from these titles comes from the assumption that readers are at least somewhat familiar with the histories of these characters. Imagine watching Star Trek: Generations without ever having seen any TOS episodes or movies and maybe only a handful of TNG. It’s not a great film anyway, but watching Captain Kirk die for real and for good is kind of a kick in the gut if you’re a Trekker in any sense of the word. If you only have the vaguest idea who he is, due mainly to pop culture osmosis, maybe the moment works, but mostly not, I’d wager. The entire film hinges upon two legendary Enterprise captains meeting for the first time for the last time and the torch being officially passed from one generation of the franchise to The Next. If you haven’t been watching Star Trek, there’s no legend. No impact.

For me, reading virtually any major superhero book was like jumping straight into the final season of Lost. There were a few exceptions, as there always are. (I loved Ed Brubaker’s Daredevil, for instance, but then, I’d already read most of Miller’s run and most of Bendis’s thanks to my local library’s shockingly capacious graphic novel collection.) On the whole, though, it simply never mattered to me. Any of it. Or most of it, rather.

However much the Chuck Bowens of the world complain about each installment in the MCU being a product placement for other installments, the product line is comparatively sparse if you put it alongside the comics. You want to get caught up before Civil War? ‘Kay. Rent Iron Man (just the first one), the two Avengers movies, and the first two Captain Americas. You don’t need any Hulk, Thor, or Guardians. You don’t need the TV shows. You don’t need the films produced by 20th Century Fox. Will you miss a few inside jokes? Sure. Are you on the hook for 10+ hours of entertainment? Yep.

Know what you’re not on the hook for? Approximately 82,946 comic books, including back issues and current releases, because you happened to pick up the latest X-Men and you don’t know who the hell any of the characters are or why the one guy you do recognize is now a gay psychopath who’s also apparently the clone half-brother of some other character who’s Professor X’s great-granddaughter from an alternate future who is responsible for the sixth (or seventh?) time Wolverine got amnesia and had to go work as a short-order cook in Laos, where he eventually teamed up with the Punisher to take out the assassin/warlord who will (tune in next month! Excelsior!) be responsible for murdering Matt Murdock’s latest doomed girlfriend, which will somehow precipitate the third superhero Civil War.

Part of me appreciates the operatic plot lunacy and behind-the-scenes organization it takes to pull off stuff like that even halfway successfully. It’s the same part of me that thinks soap operas and pro wrestling are kind of cool, in theory. The other part of me looks at my wallet and my time commitments and goes, “Yeah, I can check out the latest Marvel flick every eight months. That’s way more doable.”

It’s doable for movie audiences because the kind of money and organization required for large-scale blockbuster film production can only make movies like this happen every eight months or so. The most marketable part of these movies, apart from the franchise branding, is the truly impressive roster of performers Marvel has assembled. You only get RDJ or Chris Evans for so many movies, so you better spread ‘em out and make ‘em really count. Similarly, Marvel’s scored big with some of its behind-the-scenes hires. Joss Whedon, obviously. James Gunn, though, was a stroke of genius. Guardians of the Galaxy is the single best film in the MCU, and apart from the first Iron Man, it is the least dependent on the films’ shared mythos.

What will be even more interesting to see is whether Marvel has the ambition and vision to continue the current MCU well past the tenure of its founding players. Chris Evans will be done with Steve Rogers after the Avengers two-parter. Downey, Jr. is likely to bow out sooner than later. Marvel can always recast or reboot, but the cool thing about movies like Ant-Man or Guardians is that they do well without necessarily being based on known quantities. I get that making a movie about Marvel’s first black superhero is a big deal (and more than a little overdue), but seriously: did anybody outside of the comics nerd-o-sphere know who Black Panther was until Marvel stuck him in Civil War?[1] Would Ta-Nehisi Coates have gotten a shot at writing the comic right now without Marvel deciding to branch Panther off into his own film? Would they have chosen to do so if they couldn’t have spliced him into a strong, stable franchise like Captain America first? Things like this are part of the upside of the entertainment-industrial complex. Money + hype = willing viable franchises into existence. (Not always, but you have to admit that momentum is on Marvel’s side at present.) Unlike the comics publishing arm, it’s quite possible that the MCU can survive the retirement of its initial flagship characters if there are folks like Chadwick Boseman and Tom Holland waiting in the wings. (Or, for heaven’s sake, Scarlett freaking Johansson, who has already appeared in more Marvel movies than everyone but Downey and Evans.) Put another way, it’s possible for the MCU to evolve in ways that the Marvel comics universe simply can’t, because we’re talking about two different markets and two different media, one of which depends on real, flesh-and-blood people to play the characters.

Not likely, I admit. Just possible.

I like the idea of the MCU growing and evolving, as it were, in real time. One of the distinct pleasures of watching the Marvel movies since Iron Man has been watching talented stars and writers collaborating to find interesting things to do within the constraints of the franchise. They age. They mature. Downey is a better Tony Stark now than he was in 2008. Chris Evans is a better Steve Rogers. Most movie actors get one film in which to get to know their characters. Downey and Evans have gotten six and five, respectively. To me, their performances reflect that process. And that’s tied to the thing that Bowen gets so, so wrong in his review of Civil War. It is, in fact, the thing that he misses the most completely.

Bowen writes:

Would it kill the film-makers to offer just one memorable bit of dialogue? Every spoken line in Civil War serves an expository purpose. Or how about just one image that strives for poetry? Would it kill one of these movies to feature characters who are capable of actually dying? Or crying? Or changing allegiances? Or having money problems? Or loving, in a visceral, personal way, rather than in the usual platitudinous fashion that testifies to the needs of teaming up yet again to mount yet another adventure?

Watching Captain America: Civil War, in which positively nothing is at stake, I checked my watch 25 minutes into the film, sighing at the realization that there were nearly two hours remaining. How can audiences stand this? By submitting to the anesthesia of the loudness, I suspect, by comforting themselves with the knowledge that they are, at this moment, doing what culture expects of them. Seeing the “big” thing, the Super Bowl of yearly adventure epics.

The whole point of Bowen’s piece is (I think) to chastise audiences for letting Hollywood get away with selling them all of these films that are structurally the same, but he displays no grasp whatsoever of what has changed from film to film. Lacking that grasp, he cannot understand why audiences keep showing up.

We’ll set aside haggling over what counts as a memorable line or a poetic image, or even what counts as “loving, in a visceral, personal way,” because I would argue that Civil War utterly hinges on what Eve Kosofsky Sedgwick calls homosociality, and even tweaks it a bit with the role that Black Widow plays in the character dynamics.[2] Let’s focus on three questions. “Would it kill one of these movies to feature characters who are capable of actually dying? Or crying? Or changing allegiances?” Let’s take these one at a time. The correct answers (phrased in the form of questions) are, in order:

Would it kill one of these movies to feature characters who are capable of actually dying? What, you mean like Peggy Carter? Title character of the Marvel Television Universe’s best series? The love of Steve’s life who kicked ass with him in the first film and died and was buried in Civil War?

Or crying? What, you mean like Wanda, after she blames herself for killing those civilians in the first scene? Or Tony, expressing a mix of sorrow and rage after Rhodey gets shot down in the climactic fight at the airport? I’m sorry, does a character have to bawl uncontrollably—cry on cue, as it were—in order to count as “crying”?

Or changing allegiances? What, you mean like THE ENTIRE PLOT OF THIS MOVIE? Like how the first Avenger, Captain America (remember the first film’s title?), breaks his allegiance with the Avengers as a matter of conscience, and spends the whole film fighting with his former teammates as a result? Or how Black Widow totally confounds the entire idea of allegiance by trying to remain loyal to both of her friends and teammates, and also has to leave the Avengers as a result? Or how Black Panther goes from trying to murder Bucky to apprehending the real killer when he realizes he’s been duped? I’m actually thoroughly confused by this question. Just so I don’t cause any confusion, this question is not rhetorical: Chuck, did you actually watch Captain America: Civil War?

The purpose of Bowen’s series of facetious queries, of course, is to buttress the claim that “positively nothing is at stake” in Civil War. Again, this would be a totally baffling claim, even if we took Civil War as a case by itself. In the context of the MCU, it is about as objectively wrong as you can get. That is, if context and character development matter to any criticism based on a method of close textual analysis. (Hint: they really do!)

As others have already noted, both Steve Rogers and Tony Stark have actually had quite distinctive character arcs across the films in which they’ve appeared. Stark starts out as the reprobate Ayn Randian hero who worships at his own altar and must continually learn and re-learn the principle of self-sacrifice for the greater good. As the films progress, his sense of responsibility becomes less personal (due in large part to the lessons learned from the consequences of his own arrogance) and more based on principle, more directed toward the global community. In the Iron Man films, Tony is repeatedly forced to pay for his mistakes or the mistakes of his family or his corporate empire. Avengers is the first instance we get of Stark sacrificing himself genuinely selflessly. He invents Ultron partly as a psychological defense against his own perceived weakness, but also because he wants to protect Earth proactively from global threats. It is his greatest failure, the culmination of the arrogance displayed in each of his standalone films, and it is what leads this individualist bad boy to push for the institutional restraint of the Sokovia Accords. He knows that he cannot be trusted to hold himself accountable, so he welcomes the prospect of oversight. As the final confrontation with Cap at the end of the film shows, he knows himself quite well—he is unable to stop himself from trying to exact revenge for his parents’ murder. But the Tony Stark of Civil War is one making a conscious effort to restrain his arrogance; only someone with such a fatal flaw could recognize it manifesting in someone else: Steve Rogers.

A paragon of the Greatest Generation, Steve becomes a superhero principally out of a willingness to put himself at the service of the government to fight evil. Not for his own sake, but because he believes the world to be at stake. When Nick Fury taps him for the Avengers, it’s not much of an issue for him. It is, in fact, Tony’s innate resistance to institutional trust that persuades Cap to question Fury and discover the weapons program that provoked the alien invasion in the first place. He tries to carry on in Winter Soldier, but finds that the institution to which he has devoted his life is utterly infested with the evil he sacrificed himself (in the first film) to wipe out for good. Skepticism toward technocratic solutions to world peace underwrites his hostility to Ultron in Avengers 2, after which even Tony is forced to agree. By the end of that film, Steve takes over the Avengers because there is no institution left on earth that he can trust. That need for moral independence is what informs his rebellion against the Sokovia Accords in Civil War. His blind loyalty to Bucky is not merely personal affection and perhaps guilt over what happened to his oldest friend; Steve’s experience in each film has led him to value loyalty among comrades in arms above all else. External constraints, such as SHIELD or Ultron, have only compounded evil upon evil. In his arrogance, Steve believes that he can trust only his own moral compass, so he defies international law, deceives Tony about his parents’ death, and ends the film by founding his own rogue group of vigilantes. The consummate team player has become the ultimate loose cannon.

In short, Tony Stark’s and Steve Rogers’s character arcs have followed inverse trajectories, developed carefully and (shockingly) subtly over the course of the last decade, and what is at stake in Civil War is both thematic and personal. Thematically, the film presents two paradigms of the ethical use of force. Iron Man is a good guy, but he requires institutional constraints to use his power ethically, because he fears being a loose cannon. Captain America is a good guy, but he’s a loose cannon, because he fears that institutions will use his power unethically. Personally, Iron Man once again finds that his family’s tragic history traps him in an apparently unending cycle of retribution. Captain America is offered a final chance to save his oldest friend. Iron Man spends most of the film seeking justice, only to have it turn into vengeance. Captain America is trying to redeem one friend by—to put it bluntly—screwing over another.

There be stakes all over the place. And that’s just for the two lead characters.[3]

More significantly, the stakes really only come into focus if you have, as Bowen says, done what culture expects of you: always checking out the Next Big Thing. Marvel counted on viewers having already invested their time and emotional energy into Tony Stark and Steve Rogers. Without that investment, there’s no payoff in Civil War. Just a bunch of latter-day demigods punching each other into buildings and making wisecracks. With that investment, the payoff is witnessing the tragedy of a broken friendship, of an already-broken man being denied justice for his parents, of a once-upright man turning lawless because the lawful institutions have, one by one, betrayed him for half a century. Amid all this tragedy remains hope, of course. That hope is stipulated by the money machine at the heart of the MCU. Steve and Tony will reunite in Avengers: Infinity War because they have to. That doesn’t erase the manifold tragedy in Civil War, but it does structurally affirm that, despite the heartbreak and tragedy, heroes will ultimately do what they must simply because they’re heroes.


[1] If Civil War succeeded in nothing else, it made me terrifically excited for the Black Panther and Spider-Man movies. Boseman will be a great leading man, and there are a ton of exciting possibilities for T’Challa in the MCU. Tom Holland’s Spider-Man was both the best and most extraneous part of Civil War. In an already overstuffed film, his was the only new character who didn’t really serve a plot function. The scene where Tony recruits him, however superfluous, at least felt fleshed-out on a character level. In a story that leans so heavily on Tony’s troubled relationship to his dead dad, we get to see Tony get paternal with a kid who has so much in common with him. Peter and Tony both lost their father-figures (Uncle Ben and Howard Stark, respectively), both are nerds, both have taken it on themselves to be heroes outside the law. It’s a rather sweet scene. Also chilling. Just like Howard, Tony places unreasonably high expectations on Peter to manipulate him. The line between Tony turning Peter into his weapon and Tony relating to Peter paternally is blurry here, but that makes it all the more real and resonant, given how the film ends. Still sort of unnecessary, all things considered, but if the writers were going to shoehorn Peter Parker into the film, at least they did their best to make it make sense that Tony would recruit him. Holland and Tomei are sort of perfect as Peter and May, and the airport scene, in retrospect, feels mostly like a proof of concept for the kind of awesomeness (stunning high-flying acrobatics and nerd-witty banter: check!) we can expect from the next Spidey solo film. Sign me up.

[2] And by the way, I get that Bowen is trying to be cheeky when he asks rhetorically, “Wouldn’t Captain America: Civil War be a more interesting movie if Captain America (Chris Evans) and Iron Man (Robert Downey Jr) fought over, say, the affections of Black Widow (Scarlett Johansson), whose approval they are both clearly jockeying for anyway?” Yes, what a Paulette you are, Chuck, ever-so-subtly and perhaps-(perhaps-not!)-unironically insinuating that the Natasha character would be far more effectively deployed as the object of affection in a male rivalry love triangle. Or, wait. No. Actually, I think that makes you a sexist jerk. My mistake.

[3] Civil War’s villain is also tragic. He is a direct product of the last Avengers film. Zemo has no superpowers, no great resources. Just a keen intellect and the drive to exact revenge. Like Steve, he’s a former soldier whose institutions failed him and those he cared about. Like Tony, he is a genius operating without restraint. Zemo is who Tony Stark might be if left to his own devices, but he justifies his villainy according to Steve Rogers’s ethos. If he’s the dark mirror to each of this film’s heroes, it reflects rather badly on their inability to resolve their differences.

Is it Thursday yet?

Last month, my wife and I finally stopped being outlaws. We had been watching Critical Role on YouTube for several months. Not on Geek and Sundry’s official channel, mind you. Nope. Some user had thoughtfully put together his own playlist, updating it each Monday with the latest episode. I fully realize that this is the 21st century, and that a vast majority of people don’t care if they’re illegally pirating stuff. Screw those people. My wife and I spend precious little of our money on entertainment, but we figured that if Critical Role had given us nearly 150 hours of joy over the course of the last year, the least we could do is support it in the only way that matters in a marketplace. So we bought a Geek and Sundry Twitch subscription.

Geek and Sundry, of course, is the web-based entertainment company founded by Felicia Day. Capitalizing on the cachet Day earned with The Guild, G&S is home to nerdy shows like Wil Wheaton’s TableTop and Co-Optitude, which Day hosts with her brother, Ryon. (Wife and I are fans of those, too.) G&S is a multiplatform presence, streaming videos from its official website as well as YouTube. Twitch bills itself as “social video for gamers,” which is apt enough. The platform includes live video streaming and chat functions, so you can watch your buddies play Halo or Hearthstone and comment on the game with other users besides the gamer in real time. Most of the popular channels are devoted to video gaming. G&S offers a variety of shows that are primarily oriented toward tabletop gaming.

What makes G&S’s Twitch experiment so intriguing is that it’s live. It seems, in other words, that broadcast media has come full circle. People from my generation and those even younger probably only know about old-time radio from movies like Woody Allen’s Radio Days (from what you might call his “peak Farrow” period), or perhaps they listen to shows like WPR’s “Old Time Radio Drama” (or whatever else is locally available outside of Wisconsin). While Twitch does allow its users to archive livestreams on their channel pages, the real draw is watching shows that are devised with the affordances and limitations of a live broadcast in mind.

Subscribers from around the world participate in the chat, peppering the hosts with questions, unsolicited advice, and solicited recommendations. While there are some shows designed around the chat function (like the recent trial of The Scavenger), most simply feature a confab of young, charismatic nerds playing games like Rock Band or HeroClix. The genius of Day and Wheaton is that they figured out that there was a fairly sizable niche audience of folks who would enjoy watching young, charismatic nerds play tabletop games. TableTop itself is almost paradigmatic in this regard. Each episode features Wheaton and four celebrity guests playing a different tabletop game, cracking wise about the diegetic absurdities of the games and sublimating their own cutthroat competitiveness into self-reflexive jibes. (Not to mention erecting a mythology around Wheaton’s own incredibly bad luck throughout most of the first two seasons. For instance, you now say, “I just Wheatoned,” when you roll really badly with your dice.) Unlike TableTop, the games on the Twitch channel unfold in real time, so many (though not all) hosts come from an improv background, flexing those theater muscles to carry two- to three-hour games with breezy insouciance.

That’s part of what makes Critical Role so special. As the host and Dungeon Master Matt Mercer opens every episode: “Hello! And welcome to Critical Role, the game where a bunch of us nerdy-ass voice actors sit around and play Dungeons and Dragons!” That’s pretty much it, but it explains very little about the show’s core appeal. What the description misses is just how gifted these actors are and how expertly they deploy their improv skills to flesh out and inhabit their characters. Some, like Sam Riegel and Marisha Ray, use something very close to their own accent and timbre as they play (respectively) Scanlan, the gnome bard, and Keyleth, the half-elf druid. Others, like Travis Willingham and Orion Acaba, demonstrate their professional range to give an Anglicized working-class growl to (again, respectively) Grog, the goliath barbarian, and an upper-class twit brogue to Tiberius, the dragonborn sorcerer. The use of accents and different timbre is a helpful marker in the cast’s code-switching, as they flip merrily between their in-game characters and real-life personalities.

That, too, is part of the charm. Like any great improv troupe, the cast revels in surprising each other with totally in-character moments of ribaldry or pathos. One of Willingham’s greatest moments in the show, for instance, is when Grog locks himself in an outhouse to have a conversation with his cursed, sentient sword, Craven Edge. Though utterly hilarious, it carries some emotional weight, as one of the other party members, Percy (played with devilish calculation by Taliesin Jaffe), had just recently been delivered from bondage to his own cursed weapon. While Grog doesn’t want to pose a danger to his own group, he relishes the power given to him by the sword, and he’s no more inclined to sacrifice that power than Percy was, even with his growing suspicions. Similarly, Liam O’Brien and Laura Bailey play twins, Vax and Vex (respectively), whose comic bickering rings solidly true, but whose co-dependence delivers some of the biggest emotional impact in the series, especially when one or the other flutters over death’s threshold, instilling the other with uncontrollable panic. All of the characters often make very bad decisions for reasons that make total sense, and it then becomes the job of Vox Machina, their party, to pull their reckless butts out of the fire.

The commitment to character consistency has intersected with the challenges of live broadcast in some interesting ways. Perhaps the most controversial moment in the show’s run so far has been the departure of Orion Acaba after episode 27. Independent of the real-life drama surrounding the event, the sudden departure was not entirely out of character for the flighty sorcerer, and his official farewell (performed by Mercer) in episode 37 was a somber highlight in the epilogue to the party’s first full arc without Tiberius. Another long-running challenge for Critical Role has been the incorporation of its gnomish cleric, Pike. Because Pike’s player, Ashley Johnson, pursues a live-action career that calls her away from Los Angeles, where the rest of the cast is based, she’s been missing for huge swaths of the show, not least its initial few episodes. While she worked on Blindspot in New York City, Johnson telecommuted via Skype for several episodes. The distance and technical difficulties for Johnson meant that Pike was forced into a much more reactive role within the party, but her sporadic appearances also had the effect of reminding the cast and their characters how vital she is to the dynamic of Vox Machina. Indeed, one of the finest moments in the show was Johnson’s surprise appearance on-set for episode 22, during a shooting break for Blindspot. The delight of the cast members to be reunited with Johnson was perfectly intertwined with the delight of their characters, who had not been together for four weeks. The necessity of having the players actually be present together physically in one place is something that can be dealt with in a live format, but it’s not something that can be “shot around.”

When technical difficulties occur in real time for us, the audience, it’s also about a thousand times more frustrating than a jam-up on YouTube. After all, when we were watching Critical Role on YouTube, we might have to abandon the video if YouTube was being stupid and come back to it later. That sucked. Then again, we rarely watched an entire episode all at once anyway. Critical Role episodes average three hours, and some have stretched past four. Given our schedules, my wife and I don’t usually get home until after 8:30 pm, and we’re usually asleep by 11. So while we were watching on YouTube, it became our custom to watch CR in one-hour blocks or so, breaking each episode into three nights’ entertainment. Besides prolonging the pleasure of each episode, finishing one also meant that we only had to wait four or five days until the next one.

Now that we try to watch Critical Role on Thursdays, when it airs (7 pm Pacific Time for its cast/crew, 9 pm here in the Midwest), that rhythm is severely disrupted. While it’s unusual for us to manage to stay awake until midnight on Thursdays, we usually watch at least two- to two-and-a-half hours as it streams live. That is, unless Twitch poops out on us. Or we poop out from fatigue. Neither of which is the worst thing in the world. And full episodes are uploaded by the next day, so we can pick up where we left off pretty quickly. But Twitch is, in our experience, still rather buggy. And since Critical Role is literally the first regularly-scheduled program that we have made a point to watch at its regularly-scheduled time since we got married,[1] not being able to watch it at that time is so much worse.

Worse, because we usually finish watching each episode on Friday nights. That’s awesome, in the sense that we get to finish the latest episode almost immediately afterward, and on our own schedule. But it also means that we have to wait until the following Thursday to see the new episode, and a less-than-perfect experience makes us all the hungrier for a better experience the next time. Which is usually no less than six days away, as opposed to the four or five it normally was when we watched episodes on YouTube.

There’s a bigger reason why it’s worse, though. After being spoiled for years by services like Netflix, Hulu, and Crunchyroll, which are at their best when you get to marathon episodes in large gulps, waiting for Critical Role each week is practically an exercise in discipline. There’s a reason why the fan-sourced tagline for Critical Role, “Is it Thursday yet?” is how Mercer closes each episode. The hunger for each episode is not felt by each fan alone; we feel it together. That time slot on Thursday is special because that particular time slot really means something. It’s the only time when all of us—the fans, the film crew, and the cast—get together for the Critical Role experience live. In real time. It happens first and for real only on Thursday. Everything afterward, while still thoroughly enjoyable, is not unique. It’s reproduced. That doesn’t lessen the enjoyment of the episode, but it also cannot replicate the sense of live connections being forged in the moment.

Fans of Critical Role are called “Critters,” and both the fans and players commonly refer to the “Critter community.” My wife and I don’t participate in the chat (which goes way too fast), the Reddit threads, or on Twitter, where the cast interacts with Critters on a regular basis. Yet I believe we do feel at least tangentially connected to the Critter community. In old message board parlance, we’re lurkers. But that sense of participation is something that we’re enabled to feel each Thursday night by virtue of the fact that we watch the show live, as it is streamed. The story itself is improvised with each breath and dice roll; the players are putting on a show for us, but they are also putting it on for each other. We, the audience, are simply invited. That invitation to the event itself, though, is always and only for Thursday at 1900 Pacific Time. It is the only time when none of us, collectively, knows what will happen next, and it is the only time when all of us, collectively, get to see what happens next. It is the only time when fear that something could go critically wrong is held perfectly in tension with the sincere hope that everything turns out all right. We, the viewers and players, are bound together in time to each moment.

There is something utopian,[2] I think, in the voluntary discipline of this ritual. Ritual discipline is something I don’t think I have appreciated enough in my life. It is, to be sure, qualitatively different from weekly worship services. It is also qualitatively different from live broadcasts of sports competitions, like football games. While I appreciate worship services far more deeply than sports competitions, I do acknowledge that, much like live artistic performances, there is something necessary to the human experience in events that technically occur only once—here, now, for those of us present—but which are ritually repeated at set times. These things give meaningful shape to our experience of time and space, and the most meaningful of these rituals take narrative form.

One of the great lies told about worship services is that they’re the same old crap every Sunday. In one sense, that’s true. Liturgies are cyclical, and they draw upon the same source material week after week, year after year, century after century. Yet. With each week, year, century, millennium, this circumscribed time with its own circumscribed set of conventions is made new by the fact that those present—here, now—are never the same. We are always older. Always slightly different. Always experiencing this same time in a new way, filtered by our passage through time. We die. Others take our place. They are not us, but we are them. We are made new by our participation in the ritual, by experiencing collectively a totally unique event that nevertheless replicates a set structure at periodic intervals throughout our lives. The narrative structure of these rituals is what gives narrative structure to our own lives.

Like any set of conventions, though, those that govern our life-narratives are not totally beholden to dogmatic minutiae. There is room for improvisation and surprise. These, too, are necessary. There is a certain delight, or perhaps catharsis, that can only be had by bonding together with others in the surprises that unfold within the conventions of ritual. That’s why it’s healthy when someone farts loudly in church. That’s why it’s shocking when a pro ballplayer suffers a career-ending injury on the field. That’s why we know when a stand-up comic tells us the truth. Are these things always delightful? Cathartic? Perhaps there are better words: joy and awe. Rituals are not meant to be dry, empty obligations, but celebrations of being alive, and they are meant to inspire gratitude that we are alive to recognize meaning in this moment: here, now, together.

Rituals build communities, and communities thrive on ritual. That is true for individuals, families, villages, nations. It’s true that my wife and I simply don’t have the wherewithal at present to be active in the online Critter community. For now, though, we have made a commitment of time and treasure to experience Critical Role as it streams each week. It is something we could not pilfer or reproduce and still retain quite the same meaning. In finally subscribing to one of our favorite shows, we have begun to participate, however marginally, in a ritual that makes the lives of thousands, una communitas sine finibus (one community without borders), just that much more vibrant.☕


[1] I don’t count Doctor Who, which we typically get from Amazon the day after each episode airs. That’s pretty close, but not really the same thing as watching it as it’s broadcast.

[2] I’ve written very critically about utopia in the past. I’ve since reversed my previous position on utopianism by about 165 degrees. Someday, perhaps, I may elaborate. Suffice it to say that I think utopian hope and utopian process are necessary components of any thriving community. I agree with Ernst Bloch that anti-utopianism tends to stifle positive social change; I disagree with any utopian theorist who views the shoring up of inherited traditions as inherently regressive, as weak utopianism, or as anti-utopian.
