Ghostbusters ☕ d. Paul Feig, 2016

A modest prediction: like the original, 2016’s Ghostbusters will age well. Everyone knows that there are many New York Cities. There’s the real, actual NYC. There’s the NYC each New Yorker inhabits, his or her own little world. Tourists, of course, have their own NYC. Then there’s the New York we see in movies: the violent dystopia, the romantic urbs bucolica, yesteryear’s city of tomorrow, etc. To paraphrase Whitman, it contains multitudes. The best movies set in New York City can only be set in New York City. Woody Allen doesn’t film, for the most part, in Boston, and despite what the Academy says, I don’t think it was such a hot idea for Martin Scorsese, either. By the same token, it’s impossible to think of the Aykroyd/Ramis/Reitman version of Ghostbusters taking place in Chicago, L.A., or New Orleans. “Let’s show this prehistoric bitch how we do things downtown.” Right? It’s gotta be set in New York, or it doesn’t quite work.

Or maybe it’s just that NYC as a milieu works so well as a catalyst for galvanizing types of humor honed elsewhere. After all, the original Ghostbusters cast was a mix of Canadians and Midwesterners, all connected with Second City and/or Saturday Night Live. The discipline of comedy tours and weekly television is rather like classical training for American comedians, who must adapt their routines and sketches to the demands of one of the most diverse audiences in the world. Live comedy demands an often tricky mix of topicality and timelessness—great jokes have to be plugged into the here and now, but you can’t assume that everybody in the audience is as plugged in as they should be. Film comedy is a different kind of tricky. Again, sharp humor always feels contemporary—but sharp humor always feels contemporary. The characters of Manhattan are as pathetic and funny in 2016 as they were in 1979; Peter Venkman’s narcissistic assholery and Ray Stantz’s blue-collar geekery translated across state lines in 1984 as well as they translate across the three decades since they first appeared.

There’s little topical humor specific to 2016 in the new Ghostbusters, few allusions outside the franchise. Characters reference classic films like The Exorcist, but only to elements already deeply soaked into the pop culture consciousness. For instance, Andy Garcia plays the mayor of New York (because of course he does), and he deeply resents Kristen Wiig’s desperate scientist begging him not to be like the mayor from Jaws. Melissa McCarthy spends the whole film trying to get a decent bucket of wonton soup from her favorite Chinese restaurant—a running gag that works even better because only in (movie) New York City would someone stubbornly keep ordering the same disappointing soup from the same take-out joint and berate the delivery driver for it. Instead of “We’re ready to believe you!” or “Who you gonna call?,” the first slogan these Ghostbusters come up with is, “If you see something, say something,” only realizing after the flyers are already printed that someone is already using that one. In fact, that might be the most specifically New York joke of the film, and its topicality is restricted only in the sense that you have to know that the film takes place post-9/11.

In fact, that reference is probably the single strongest signal of the film’s temporal setting. There’s one instance of a smartphone video uploaded to YouTube costing a character a job, but apart from that, there’s little reference to the latest communication technologies, which probably comprise the single most conspicuous trait of our historical period. The (fictional, s’far’s I can tell) Mercado Hotel replaces 55 Central Park West as the site of the climactic battle, and its art deco lobby is vintage (movie) New York City: it’s exactly the kind of perfectly preserved building you would expect to sit atop ancient ley lines, in addition to being an architectural expression of yesteryear’s cutting edge. It’s nebulously nostalgic, and while art deco might look simply dated elsewhere, it feels strangely a part of contemporary life in (movie) New York.

The Mercado Hotel climax is symbolic of what’s great about the film as well as what’s not so great. While it evokes that wonderful movie-NYC contemporary-nostalgia, it also evokes some of the most memorable scenes from the original Ghostbusters. Unfortunately, 2016’s Ghostbusters does entirely too much of that, and not cleverly enough. One callback that works well is the way this film brings in the classic logo, here spray-painted onto a subway wall as a bit of mockery by a graffiti artist. Another classy nod is the bronze bust of Harold Ramis glimpsed early in the film, gracing the hallowed halls of Columbia University. Cameos by other original cast members range from nice to outright distracting. Annie Potts essentially plays Janine, except here she’s the desk clerk in the Mercado. It works in part because her shtick is still funny, and because it’s a brief beat in the narrative flow. The single worst cameo is, unsurprisingly, Bill Murray’s. It’s not so much Murray’s performance as a paranormal debunker that clunks, but the fact that the film builds an entire sequence around him. While I think Paul Feig and Katie Dippold wanted him to be this version’s Walter Peck, it doesn’t really work out that way. For one, his cameo is too brief and poorly structured into the narrative to serve the catastrophic purpose of Peck in the original. For another, even if Murray’s performance is fine, he’s just too much Murray. Maybe other fans of the original will really dig him here. For me, the entire sequence screamed, “OMG you guys we got Murray for a day we gotta DO STUFF WITH HIM!!”

There’s really no way Feig et al. could win. Remaking a beloved film like Ghostbusters entails its own challenges that have little to do with the mechanics of storytelling and everything to do with fan service. Beyond its clunkiness, Murray’s extended cameo arrives at almost exactly the wrong time, roughly halfway through the film. Until his appearance, the film had done deft work in metatextual commentary, sprinkling allusions to the earlier films into its original material in ways that were pleasing without interrupting the flow. In fact, the first 45 minutes or so of 2016’s Ghostbusters is borderline magnificent. It sets up a distinct cast, a different kind of villain, and it does all this with the workmanlike professionalism that makes for durable Hollywood cinema. The thematic arc is even distinct from the original. Ivan Reitman’s Ghostbusters were underdogs who got to prove their worth to a city famed for its facility with dream-crushing, and Peter Venkman learns to be a little less of a selfish asshole. Feig’s Ghostbusters are still underdogs who get to prove themselves, but this movie is really about what a difference friendship makes to said underdogs. The difference between the good guys and the bad guy here is that human connection. In a culture that frankly still often celebrates bullies and narcissists, the outcasts who save the city in the new film are honored for their personal strengths in ways that are subtext (if that) in the original Ghostbusters.

The cast makes that work. And as someone who is a bit unplugged from pop culture, this was my first time really seeing Wiig, Kate McKinnon, and Leslie Jones as performers. SNL fans know them, but I don’t think I’ve watched SNL since about 2001 or 2002. They are simply terrific, as is McCarthy, whom I know going back to Gilmore Girls. The dialogue in this movie is good, and the special effects are okay; this is a movie you kind of have to see for the actors, though. Besides the great chemistry shared by the principal leads, they also spark with pretty much everyone else who shows up. I recognized Charles Dance, Ed Begley, Jr., Matt Walsh, Michael K. Williams, and Michael McDonald, of course; Cecily Strong, Neil Casey, and Steve Higgins are (apparently) SNL alumni as well. This isn’t quite an Ocean’s Eleven-level Who’s Who, but there are no wasted scenes with any of these performers. It’s all good stuff. Oh, and, yeah—Chris Hemsworth: delightful.

I’m interested to see how this movie plays over the long haul. Unlike a lot of my contemporaries, I didn’t see 1984’s Ghostbusters (or its sequel) until I was in my teens. So the nostalgia factor is a bit blunted, but I have watched the first film at least a dozen times. It’s impossible for me to watch 2016’s Ghostbusters and not be at least a little distracted by all the callbacks and cameos. Will younger audiences, those less attached to the original movies, feel the same way? What about viewers my age or older, who simply enjoy the cameos for what they are? I don’t typically see the point in doing a remake/reboot unless the filmmakers can find a reason to justify doing something new and different. Most of the new film hits the sweet spot between honoring the structure and vibe of the old one while still infusing it with the unique sensibility of its (re)makers. The very presence of the old cast (awesome though they are as individual performers) and some of the callbacks simply feel like an unwelcome intrusion, sort of like the VIPs that you’re obliged to put on the guest list even though the party will be super-unhip if they actually show up.

On the whole, though, it’s an enjoyable and—dare I say—necessary extension of the Ghostbusters franchise into the 21st century. The weird mix of welcome and unwelcome nostalgia is likely an unavoidable cost of that labor. All the same, what I kind of dig conceptually about the new film is that it formalizes the Ghostbusters not just as a viable franchise, but as a cultural institution, one that’s multigenerational in a meaningful, active sense. What would America be without its institutions—and what would (movie) New York be without its Ghostbusters? ☕


Reflections on revolution in American conservatism, part 2

Previously, I said that I have disavowed conservatism because a majority of American conservatives are aligned with bigotry. It’s a very presentist case to make, and the emergence of Donald Trump as the Republican (read: conservative, or at least “less liberal” than Hillary Clinton) candidate has made it not only easy, but convenient. To be perfectly honest, there’s a little bit of self-defense involved in my sudden deconversion: I don’t want to be associated with the racists and religious bigots on the Right who have made Trump their candidate. Since this blog is public, I don’t want anyone to make the mistake of thinking that I’m on board with the Lars von Trier melodrama unfolding within conservative circles. I’m not, in any way, interested in performing the ethical and rhetorical contortions required to justify why I’m still conservative, the same contortions other self-identified conservatives have been performing in order to explain away why they’re still voting for Donald Trump. I’m also not interested in performing the ethical and rhetorical contortions that other self-identified conservatives (the ones with a moral center) have been performing in order to place the responsibility for Trump’s ascendance on the Left. There’s blame enough to go around, I suppose, but I agree with Damon Linker that the main blame lies with the Right.

Therefore, a question that’s been vexing me for the last year is whether I’ve contributed to the surge of bigotry in any way simply by offering up conservative apologetics in the past.

This isn’t just navel-gazing; it’s a question of ethical responsibility. It’s a question I think every self-identified conservative ought to wrestle with. What is it in American conservatism (leaving aside other Western right-wing traditions) that has enabled Trump, of all people, to be The Guy?

In assaying this question, I hope to make it clear that my disassociation from American conservatism as a political movement is not purely a matter of convenience, but a principled break that has been coming for some time. The truth is that it has been difficult for me for a long time to find much overlap between my own politics and the politics adopted by a majority of self-identified conservatives in this country. The difference now is that I no longer see myself as occupying a neglected corner of a big tent, but a place somewhere outside of it. In some ways, I feel that American conservatism pulled up stakes and left me behind some time ago, but it’s also likely that I simply wandered outside the tent at some point without realizing it until, just recently, I took a big gulp of fresh air and noticed that the circus, with its angry clowns and great heaps of elephant dung, was far, far away.

Most of what I’ve written below with reference to historical context is boilerplate summary and in no way my own original argument. As this is a personal reflection and not an academic essay, I’m not going to track down every document that has, for the last several years, nudged my thinking in this direction. Perhaps I’ll cover that in future reflections. At any rate, the historical context is in my own words, but not my own ideas. Please read with that in mind.

__________

Russell Kirk is probably one of the most famous twentieth-century theorists of conservatism as a distinct political philosophy. Among other things, he’s famous for enshrining Edmund Burke as a canonical forerunner of what we, in America, now think of as conservative ideology. His essay “Ten Conservative Principles” is one that I’ve returned to at different points in my life as I’ve tried to balance current political circumstances against my own evolving framework. It’s necessary to remember that Kirk’s main body of intellectual work was published in the context of the Cold War. In fact, it’s necessary to remember that what is now mainstream U.S. conservatism was developed in that context. American society underwent a number of social changes in the decades stretching from the end of World War II to the fall of the Berlin Wall.

For the moment, it’s important to keep in mind that conservative intellectuals contended with three political antagonists that they saw as mutually overlapping (or, rather, in alliance against them): 1.) the radical Left intelligentsia, based mainly in universities and cosmopolitan, mostly coastal, urban centers; 2.) international communism, exemplified by the Soviet Union, the People’s Republic of China, and, after the 60s, various countries in Latin America; 3.) the progressive/liberal political movement, dating back at least to TR and Wilson, with presidents like FDR, JFK, and LBJ carrying the torch forward. It’s true that these three traditions have included people who have felt kinship with all of them, and it’s also true that there were people in all of them who utterly despised and disavowed association with the other traditions. Broadly speaking, the only thing these traditions have in common is that they were generally “Leftist.”

Apart from that, there were often sharp disagreements in terms of ideology and praxis between them. One of the more important distinctions is that the radical Left was often deeply critical of the entire European Enlightenment tradition, going all the way back to Locke, Kant, and Smith, while liberals often championed their causes on the principles that the radicals despised: individual liberty and reason. The radical argument (and I’m being quite reductive here) is that the Enlightenment tradition paved the way for the worst excesses of capitalism, which undergirded the material wealth and accomplishments of the West, including the wealth gained through the various colonial projects of the European nations. For radicals, individualism and the celebration of reason were just tools for keeping democratic populations docile and too motivated by self-interest to actually follow through on overcoming inequality. Liberals, by contrast, often achieved their most significant victories—making voting more democratic, civil rights legislation, social safety net programs—by emphasizing the dignity and choice of the individual and her exercise of reason. Few of these progressive achievements actually undermined or attacked the foundations of capitalism itself, which is why the radicals saw those victories as nice but ultimately hollow. Communists, for their part, tended to combine the worst elements of Left agitation for equality and liberal designs for rationalized social equality. Liberals deplored the abuses of communist regimes; radicals were often split, with some simply turning a blind eye to communist horrors, some ascribing communist failures to capitalism’s unending malignancy, and some moving even further beyond the framework of the nation-state, often championing variations on anarchist subversion of convention, whether liberal-democratic, fascist, or communist.

Into this context, the modern conservative movement as such was nursed into being.

The “ten principles” outlined by Kirk were originally delivered in a speech to the Heritage Foundation in 1986, late in his life, and more than three decades after the publication of The Conservative Mind. By this point, “movement conservatism” had become an ideology of its own. William F. Buckley, Jr. had the ear of President Ronald Reagan, and National Review was the flagship publication of conservative intellectual thought. Conservative pundits had also established a significant presence in mass media. Readers could follow writers like George Will and Charles Krauthammer in syndicated newspaper columns, while radio listeners could tune into Rush Limbaugh or Phyllis Schlafly. Pat Robertson and Jerry Falwell, in the meantime, had consolidated the so-called Religious Right into what they termed a “Moral Majority,” citizens who tuned into The 700 Club or subscribed to newsletters from organizations like Focus on the Family. The Heritage Foundation itself had become a leading conservative think tank by the mid-80s. Organizations like this one were vital to the project of welding a more hardline ideological conservatism, developed among the activist base that had propelled the Barry Goldwater insurgency a generation earlier, to the political platform of the Republican Party.

In reading Kirk’s declaration of principles, we cannot afford to ignore this context. The Cold War caused a lot of misery globally, and to consider only the American context, it resulted in at least two wars fought to loss or stalemate (Vietnam and Korea), the support of often horrific dictatorships in South America, and the support of proxy insurgents in the Middle East (to fight Soviet expansion into South and Southwest Asia) who would evolve into radical Islamist terror groups or theocratic dictatorships. On a more abstract level, it also managed to calcify ideological divisions in the United States—as far as conservatives were concerned—into two camps. On the one side were the three Left traditions outlined above, conflated by conservatives into a single monolithic force. Movement conservatives are as adept at invoking Stalin and Mao when talking about the Left as liberals are at invoking Hitler and Mussolini when talking about the Right. The Right’s calcification of the Left into a single, homogenous entity with respect to the Cold War is significant for my purposes only in that movement conservatives still have not moved on from that (wildly distorted) paradigm. What they have yet to come to terms with is that the particular fusion of political interests developed during the Cold War era—Christian social/religious hegemony, economic libertarianism (free market triumphalism combined with slashing federal programs), hawkish military spending and belligerence, increasing militarization of police (coupled with extreme positions on individual gun rights), and expanding independent power of the executive branch for national security purposes—was also calcified. Outside the context of the Cold War, the Reagan fusion makes little to no ideological sense. As a coalition of interests threatened specifically by Soviet-style communism, it does. Much as movement conservatism falsely conflated various strains of Leftist thought, it at least responded to a specific real-world situation. That’s not where the world is now, though.

The ten principles outlined by Kirk are not presented specifically as anti-communist resistance. They give the impression that they transcend their moment in time, as principles are meant to do. Kirk’s summation at the very end feels apt as a description of what conservatism ought to be: a recognition of “an enduring moral order in the universe, a constant human nature, and high duties toward the order spiritual and the order temporal.” That’s almost too transcendent, though. It’s not much different from the affirmation that we ought to bend toward our platonic ideals, rather than bend a mutable universe to our own will. As the foundation for political philosophy, we could do worse, but there have been moments in progressive thought, especially of the Hegelian varieties, that see progress as teleological, which is often simply another form of pursuing transcendent ideals. We do better to consider some of the specific principles Kirk outlines to get a sense of what he means.

I won’t go through all of them, but I’ll highlight a couple that are most pertinent to my reflections.

__________

Let’s start with this one, which is probably the most recognizably conservative in the American political context.

Seventh, conservatives are persuaded that freedom and property are closely linked. Separate property from private possession, and Leviathan becomes master of all. Upon the foundation of private property, great civilizations are built. The more widespread is the possession of private property, the more stable and productive is a commonwealth. Economic leveling, conservatives maintain, is not economic progress. Getting and spending are not the chief aims of human existence; but a sound economic basis for the person, the family, and the commonwealth is much to be desired.

You can see the fingerprints of the libertarian strain of Austrian economists all over this principle. More specifically, thinkers like Ludwig von Mises, F. A. Hayek, and Murray Rothbard developed the foundations for latter-day libertarian politics. Leviathan is a synonym for statist totalitarianism in this snippet, and libertarians contend that economic liberty and individual liberty are isomorphic concepts, and that if a single entity (such as the modern nation-state) controls the majority of private property, then the freedom to exercise individual liberty is necessarily curtailed. It follows, of course, that those who possess more wealth have more liberty than others, but this is, in the anarcho-capitalist realm of political theory, acceptable, because it both incentivizes individual striving for more wealth and allows individuals who exercise their freedom irresponsibly to lose it.

I’ve flirted with libertarianism for years, and I think it to be one of the most ideologically consistent political philosophies; the consistency itself is appealing to me. Unfortunately, my interest in libertarian thought led me to read a good deal of it. As with many things, direct exposure to something is the best inoculation against it.

Libertarian social economists have yet to provide an explanation of how state nonintervention is to prevent a form of corporate oligarchy from replacing a functioning representative democracy, apart from the hazy belief that if a corporation runs its fief inefficiently, it will crumble and be replaced by another. For anyone concerned at all with maintaining social order, valorizing the volatility of free market competition and its fallout strikes me as naïve at best, and malicious at worst. Furthermore, the alleged efficiency of the market in weeding the strong from the weak competitors conventionally draws parallels between the processes of capitalism and natural selection. Natural selection, as a natural process, is amoral. Why, then, would human society wish to acclaim an inherently amoral process as a “good,” especially in contrast to public institutions erected to promote specific social goods? Finally, drawing a theoretical link between the amount of wealth one has and the amount of liberty one may exercise is something that both Marxists and libertarians have in common. The main difference is that Marxists wish to promote relatively more freedom for all, whereas libertarians wish to promote more freedom only for the wealthy. In a world where one’s labors directly correlate with the amount of private property one is able to amass, the libertarians might have the theoretical edge. In the real world, where things are not now and never will be fair, yoking possession of private property to possession of freedom is as much the road to serfdom as state socialism.

Of course, Kirk does not specifically advocate libertarianism per se as a principle of conservatism, nor does he say that it is the highest good. Instead, his emphasis on the benefits of protecting private property ownership, taken in tandem with the other principles, is meant to highlight the benefits of having a society in which individuals are encouraged to reap the benefits of investing their time, effort, and wealth into their own property.

Since this is but one of ten principles, we might also recall that it takes no precedence over the others. Yet I do not think that there is any other principle in contemporary conservative thought that is held more dear than the idea of “limited government,” especially with respect to state intervention in the economy. Opposition to or critique of unions (public or private sector), state spending on education, welfare programs, environmental protection, and business regulation of any sort invariably falls back onto the notion that “the market” is better suited to mediate virtually every human endeavor, rather than the government.

Edmund Burke’s Reflections on the Revolution in France was particularly animated by the revolutionaries’ confiscation of private property, which he saw as a precondition for the lawless totalitarianism soon to follow. But even for Burke, a key issue was the balance of power between the landed nobility, the church, and the crown, whose mutually dependent property relations underwrote the economy of the ancien régime the revolutionaries sought to upend—in short, Burke wasn’t defending private property rights as defined against the state as such, but against the illicit confiscation of property by a particular (in his view, illegitimate) government, judged against the particular historical circumstances of its constitution. That’s not quite the same thing as championing private sector solutions to public sector problems. Kirk’s own position, as stated in the principles, clearly is pitched against total state or communal ownership of property, but in no way does he militate against government intervention per se. What is primarily at stake here, as I see it, is the degree to which private property is the medium of individual liberty. In the context of the Cold War, it makes sense that a conservative would uphold the necessity of private property in opposition to the communist governments of Russia and China, which often used privation tactically to neutralize dissent, or, at the very least, had millions of citizens who had equality but nothing to do with it.

Confiscation is a powerful political weapon, and communist regimes have rarely hesitated to deploy it. (Again, I recognize my reductiveness.) Though Burke had indeed harshly condemned the French radicals’ confiscation and redistribution of property—as well as charged them with economic illiteracy—his eighteenth-century view of property rights was much more moderate than late twentieth-century conservatives’ view. Kirk’s argument is favorable to the libertarian-ish Reagan era, but to say that “freedom and property are closely linked” is not to say that freedom depends upon private property. At the heart of Burke’s critique was the state’s exercise of power through the lawless disregard for private subjects’ established claims to their property. Kirk observed a similar operation in the communist regimes of his century, but those were examples of totalitarian state tyranny, not examples of what happens whenever the state is an economic agent. That’s a distinction lost on whoever conflates all Left traditions into a monolithic whole. Despite conservative claims to the contrary, Keynesians are not de facto socialists or communists or totalitarians. Seeking the state’s aid in addressing the consequences of gross economic inequality is not tantamount to seeking complete economic leveling across the board. A close link is not a necessary causal relationship.

There aren’t many communist nation-states left, and arguably none that don’t qualify as failures, dictatorships, or (as in China’s case) quasi-state-capitalist. In 2016, much of the rhetoric surrounding economic justice is directed toward the obvious fact that a relative handful of people control the economic futures of everyone else on the planet. It may be the case that private property and personal freedom are indissolubly linked, but for most people in the world, that only means that most people have less freedom than others, and the structure of capitalist accumulation always works to the benefit of those who have already accumulated more capital. Any speech about infringements on freedom via business or environmental regulation, made on behalf of people who already have money, is therefore a speech about preserving the inequalities that already exist.

I’ll say that again. When conservatives in 2016 talk about defending the free market, they are talking about defending economic inequalities that are part of the current economic structure.

Take that as you will. I’m just trying to put it in historical context.

The conservative principle of yoking personal wealth and personal freedom against economic leveling is not about protecting us from state communism. It is about enabling those who already possess property to get more. Again, in theory, maybe this isn’t so bad. It’s the American Dream to take your shot at prosperity, right? Well, yes, but in a market based on competition, there will always be losers. And in that competition, the loser loses his property to the winner. The loser loses freedom.

Part of what makes freedom scary is that we have the freedom to fail. In theory. It also means that we have the freedom to succeed, often in amazing, unpredictable ways. In theory. That also means that we can fail in traumatic, unpredictable ways. A free market offers no provision for the losers. For those with a modest amount of private property to gamble, the stakes involved might indeed inspire prudence and innovation. For those with a great amount to gamble, the stakes involved might lead to recklessness. After all, the wealthy can afford golden parachutes. They have that freedom, whereas Joe and Jane Smith, who opened a restaurant with their life savings, will simply lose everything if the economy turns south.

Mind you, I’m not saying that the market is inherently bad. It is not inherently good, either. And having respect for the social benefits of private property ownership is not the same thing as libertarian free market triumphalism. So when I read conservatives prating about the efficiency of the market or how capitalism is pretty much the greatest thing ever or that money equals free speech, or that we can’t have freedom without unfettered capitalism, it grinds me to the core. I happen to accept the analogy that the market functions similarly to natural selection, but, because I have a moral sensibility, I am always inclined to weigh the human costs involved in that proposition. As a result, I don’t think the market deserves protection from state intervention; I think that human beings deserve protection from the fallout of a free market system.

Kirk, of course, was willing to associate himself with the Reagan coalition, which had quite radical ideas. I lack both the knowledge and time to discuss Kirk’s entire corpus as it relates to movement conservatism’s apotheosis in the 1980s. What is fascinating to me in the passage I quoted, though, is how tempered Kirk is, how much nuance he leaves in his contention that property rights underlie political freedom. Not only is this principle just one of ten (clearly not taking precedence over the others), but his presentation of property as foundational to the commonwealth really is fundamental—in the sense that it provides a stable basis for the family, for the community. This is not free market triumphalism. When conservatives valorize the market, with its volatility, its amorality, its blind indifference to reason and human dignity, they embrace the violent chaos of a Darwinist cosmos. It is to this vision of the cosmos that Kirk’s seventh principle stands opposed. Civilization is not built upon the chaos of the market, but upon the investment of human striving into material artifacts. The market is, as Joseph Schumpeter put it, an engine of creative destruction. This kind of instability, of rupture, of impermanence in the natural order is not, cannot be, representative of the transcendent moral order that Kirk declares to be first and foremost in the conservative mind.

__________

Second, the conservative adheres to custom, convention, and continuity. It is old custom that enables people to live together peaceably; the destroyers of custom demolish more than they know or desire. It is through convention—a word much abused in our time—that we contrive to avoid perpetual disputes about rights and duties: law at base is a body of conventions. Continuity is the means of linking generation to generation; it matters as much for society as it does for the individual; without it, life is meaningless. When successful revolutionaries have effaced old customs, derided old conventions, and broken the continuity of social institutions—why, presently they discover the necessity of establishing fresh customs, conventions, and continuity; but that process is painful and slow; and the new social order that eventually emerges may be much inferior to the old order that radicals overthrew in their zeal for the Earthly Paradise.

Conservatives are champions of custom, convention, and continuity because they prefer the devil they know to the devil they don’t know. Order and justice and freedom, they believe, are the artificial products of a long social experience, the result of centuries of trial and reflection and sacrifice. Thus the body social is a kind of spiritual corporation, comparable to the church; it may even be called a community of souls. Human society is no machine, to be treated mechanically. The continuity, the life-blood, of a society must not be interrupted. Burke’s reminder of the necessity for prudent change is in the mind of the conservative. But necessary change, conservatives argue, ought to be gradual and discriminatory, never unfixing old interests at once.

Once more, the context of the Cold War is essential to understanding this principle, and Kirk uses notably Burkean language when he describes “the old order that radicals overthrew in their zeal for the Earthly Paradise.” In the eighteenth century, the “radicals” would have been Jacobins; in the twentieth, they were the Bolsheviks. What’s at stake here is both preserving “custom, convention, and continuity” and rejecting the rupture that revolutionaries often champion when attempting to subvert the old order.

In the American context, this principle is more than a little weird when it’s advanced by conservatives who insist on using the Founding Fathers as pole stars in their rhetoric. Just in case you forgot, the United States formed in the wake of a revolution fought by British colonists against Great Britain. This revolution was founded in Enlightenment notions of natural rights and it pitted the received parliamentary tradition directly against the monarchy, rather than in partnership with it. While I happen to be of the mind that the American revolution was indeed quite radical, and that my nation’s forerunners had just grievances that went unaddressed, I am also fully convinced that there was nothing conservative, in temperament or in politics, about the revolution. Historians would be quick to point out that when the Constitution was drafted, it drew upon a long tradition of English common law, and that the colonies already had a functioning civil service apparatus to fall back on once the shooting stopped. In short, the radical break from their colonial overlords was founded upon inherited principles and transitioned (somewhat) smoothly thanks to laws, traditions, and infrastructure that were already in place. In that sense, the American revolution preserved much in custom, convention, and continuity.

But it was still a revolution.

Men, women, and children were killed in battle and as collateral damage. British loyalists were sometimes tortured or killed for their beliefs. Early American history is not my forte, but when I say that the revolution exacted a horrific human cost on both sides, I do not exaggerate.

The Declaration of Independence is, for good reason, one of the canonical documents of American society. In its language, statesmen, philosophers, and everyday people have found a wellspring of wisdom and authority—most often the latter, when invoking the Declaration to frame their own agenda. I return to it each year on or around our Independence Day, and it always strikes me anew as incredible. Mind you, I use that term in a somewhat nominal sense, rather than as praise or condemnation. What people often forget is that the Declaration was not a mere statement of beliefs; it was an argument. Allow me to quote liberally from it by way of illustration:

When in the Course of human events, it becomes necessary for one people to dissolve the political bands which have connected them with another, and to assume among the powers of the earth, the separate and equal station to which the Laws of Nature and of Nature’s God entitle them, a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation.

We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.–That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed, –That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness. Prudence, indeed, will dictate that Governments long established should not be changed for light and transient causes; and accordingly all experience hath shewn, that mankind are more disposed to suffer, while evils are sufferable, than to right themselves by abolishing the forms to which they are accustomed. But when a long train of abuses and usurpations, pursuing invariably the same Object evinces a design to reduce them under absolute Despotism, it is their right, it is their duty, to throw off such Government, and to provide new Guards for their future security.

All of this is praemunitio for the series of specific charges laid against the British government, the charges that justified the colonists’ secession from their government of more than 150 years.  While the lines about “Life, Liberty and the pursuit of Happiness” are likely the most-quoted, it is really that last sentence in the above quote that forms the core of the document’s purpose: the colonists had to prove that what England had become to them was really “absolute Despotism,” rather than a constitutional monarchy. A despot, you recall, is a dictator who arrogates all civil institutional power to himself alone. To charge England with reducing them to absolute Despotism was to charge the king with ignoring the rule of law and acting autocratically, outside his prescribed legal authority, to oppress the Americans. In short, to justify their own flouting of all legal authority, the colonists had to prove that the king flouted all legal authority first.

The Americans’ allegations had teeth, of course. What remains outstanding in the introductory remarks of the Declaration is that the authority to make their political rupture was arrogated to themselves as a “people,” somehow no longer quite English, but owing their decision as much to their own “duty, to throw off such Government” as to “the opinions of mankind.” If this feels vague and jazz-handsy, it is. It’s grounded in a lot of well-established political philosophy (Locke looms particularly large), but it nevertheless hinges on the premise that a rupture between two distinct peoples has already occurred, and this justifies the further rupture between them. It is a document that acknowledges the continuity that previously existed, then alleges that the fault for its dissolution really lies entirely on the other side.

One of the many great ironies of American history is the contemporary resurgence of interest in the Founders among the radical right. Most notably, the grassroots opposition to President Obama that coalesced on or about 2009 dubbed itself the Tea Party. Instigated first by the president’s stimulus programs (intended to spur an end to the recession that started in 2008 with the collapse of the housing bubble and the derivatives market), then really galvanized by the passage of the Affordable Care Act, the latter-day Tea Party’s talking points in the early days revolved around the role the federal government ought to play in its citizens’ lives and how much it ought to spend toward that end. Basically, Obamacare was to them simultaneously an egregious intrusion upon citizens’ freedom to manage their own health care and a condensed symbol of Washington’s spendthrift waste of taxpayer resources. To hear the rhetoric, you would think that Barack Obama was quartering redcoats in our homes and torching our shopping malls with napalm. Tea Partiers see themselves as fighting to protect long-cherished liberties that are inherent to the American character and necessary to preserving our way of life.

While I think Jill Lepore has written the definitive essay on the Tea Party movement’s appropriation of American history, I do think that the movement, quite by accident, stumbled into the perfect metaphor for itself. According to its latter-day exponents, the Boston Tea Party of 1773 is a symbol of the people’s refusal to let its government exercise tyranny through unjust economic policy. So it was.

Never mind, though, that England’s coffers were sapped, in part, by the British Empire’s efforts to head off French imperial ambitions aimed at, along with a few other continents, North America. You might recall getting confused back in school about who was on what side in the French and Indian War—American colonists called it that because the French persuaded indigenous people on this continent to fight alongside them against the British. At that point, Americans were still Britons, as was a young soldier named George Washington, who cut his military teeth in the conflict fighting alongside British forces. The French and Indian War, however, was just one part of a conflict that spanned the globe. Fighting wars has always been expensive, and when this particular war of empires ended in the 1760s, that tyrant, King George III, whose armies had killed and bled to protect his American colonial subjects, had to recoup costs. British legislative efforts to balance the ledger that had become stained with red on behalf of Americans (among other British peoples) were the first to stoke the fires of revolution in the thirteen colonies. If I may put it a bit ungenerously, the Americans didn’t want to foot the bill for a war that England had fought to preserve their life, liberty, and pursuit of happiness.

So the Americans were one people with the British when the French were storming their borders, but fifteen years later, they were merely “one people to dissolve the political bands which have connected them with another.”

In the space of one generation, a rupture in cultural and institutional continuity had been effected by men who were primarily incensed over taxation without representation, a rupture of such magnitude that they went right back to mass slaughter, this time over the principle of liberty.

Once again, I acknowledge my historical reductiveness, as well as the fact that I’m playing fast and loose in my selection of historical facts, which seem always to conspire in their complicatedness to thwart every easy narrative that we try to make of them. Mea culpa.

What this digression is meant to illustrate is that contemporary conservatism is largely shaped by a spirit of radicalism. It is, furthermore, a radicalism that seeks to sever custom, convention, and continuity on multiple levels. While conservatives are busy embossing their newsletters with quotes from the Founders, they neglect to acknowledge the legally dubious nature of any revolutionary project. They further neglect to acknowledge that the rationale justifying the entire revolutionary project devolved upon a parcel of citizens the authority to decide whether any other “people” who disagreed with them about what form of government is most conducive to the ends of Safety and Happiness had any say in the matter. The logic of the American Revolution, in other words, dictates that you can justify any political action, so long as it is in response to Despotism—and, as luck would have it, those with the authority to identify Despotism usually happen to be the ones proposing the radical political rupture with others who were, until just recently, their fellow citizens.

It is my fundamental contention that American conservatism, as currently expounded by a majority of its self-identified adherents, effaces old customs, derides old conventions, and breaks the continuity of social institutions. And if American conservatives think it unfair of me to say so, they might consider the fact that it is they who nursed for decades the antigovernment, anti-establishment rhetoric that eventually birthed the Tea Party, a political movement that named itself after a bunch of middle-aged white dudes dressing up in racist costumes and committing an act of large-scale vandalism because their parliamentary representatives approved a tax hike.

I’ve no idea what Kirk would have thought of the contemporary Tea Party or its radical politics. At the time that Kirk wrote his ten principles, his counter-revolutionary ideas were adequate to address the manifest failures of the Soviet Union and its offspring to follow through on their own revolutionary ruptures. Perhaps there’s contemporary relevance in the efforts of ISIS to destroy the cultural heritage of the peoples afflicted by the misfortune of residing in territory occupied (however briefly) by the so-called caliphate. (Perhaps.) What strikes me is that we have had Obamacare now for about as long as the American Revolutionaries had King George’s tariffs before they decided to dump tea into Boston harbor. I doubt that many conservatives are making serious plans to secede from the Union. I do see, however, a parallel logic between the Declaration’s division of two peoples and the commonplace call to “Take our country back.” From whom? Our fellow Americans?

__________

If you think that perhaps I’m being unfair to overlay conservatism so isomorphically with the Tea Party, allow me to direct your attention to the all-but-concluded Republican presidential primary. No, not to Donald Trump (although I’ll come back to him in a bit). Indeed, I think it is far more significant to reflect on the fact that the two men left standing at the end of the long march to Cleveland were Donald Trump and Ted Cruz.

Cruz is about as emblematic of the Tea Party as you can get, and he was widely acknowledged as the most conservative candidate running in this election cycle. While Trump is the more overtly noxious of the two, I take Cruz to be much more symbolic of this cultural moment in American movement conservatism, even in his failure to win the Republican nomination.

How many times did Cruz vow in his campaign to “repeal every word of Obamacare”? He offered a list of four cabinet positions and 25 agencies he would fight to eliminate. He said he’d carpet bomb ISIS strongholds, regardless of civilian casualties. When Donald Trump called for a ban on all Muslim immigration, Cruz countered by suggesting that we turn Muslim neighborhoods in the U.S. into state-run panopticons. He vowed that, on his first day in office, he would tear up the historic diplomatic deal with Iran. He led the fight in the Senate to push the government into a shutdown rather than fund the Affordable Care Act.

Conservatives call Cruz “principled.” Indeed he is. (Except for the eight months during which he apparently didn’t tell the truth about his main opponent, but whatever.) The question is how he goes about applying those principles. It is my view that Cruz is not conservative. He is radical. It is radical to eliminate entire cabinet positions and agencies, just as it is radical to “repeal every word” of a law that is now an integral part of the nation’s health care system. It is radical to violate the constitutional rights of Muslims on account of their being Muslim. What Cruz consistently espouses is the violation of procedural norms for the sake of ideological purity. That’s radicalism. How is the word-for-word repeal of Obamacare by executive fiat any less radical than its creation and passage by duly elected representatives of the American people? (It is, in fact, more so.) How is Cruz’s open declaration to turn American Muslim communities into de facto police states any less radical than, say, anything the president has said about limiting access to guns designed for efficiently killing mass numbers of people? (Again, I think it’s more so.) If anything, I think the president has largely worked within the institutional norms established by his predecessors, for better and worse. That is de facto conservatism.

Except, to the vast majority of American conservatives, it’s not.

The #NeverTrumpers endlessly repeat that Donald Trump is not now, nor has he ever been, a true conservative. Indeed, Trump hasn’t really claimed to be. His supporters are the ones who claim to be conservative. Those agitating for the GOP to modify its rules so delegates won’t be bound to coronate that orange pile of smarm (as my wife calls him) are quick to point out that he didn’t win a majority of votes—just a mere plurality, as if winning more primary delegates than any other single person were insufficient reason to nominate him the torch-bearer of a party whose candidate in 2000, far from being swept into office by an overwhelming popular vote, barely eked out a victory with a contentious Supreme Court decision that upheld the validity of the electoral college. These same #NeverTrumpers who pooh-pooh Trump’s mere plurality spent four years defending the legitimacy of George W. Bush’s first term on the same principle that Trump’s supporters cite to defend his right to the Republican nomination.

That’s why it’s important to consider Cruz and Trump together. Quite apart from the Republican media machine, the fact is that Cruz and Trump, considered together, do present a snapshot of conservative America. A majority of American conservatives voted for one of these two men. The question is what these two men have in common. Certainly not policy ideas. While they both took hardline stances on immigration, and while they are both bigoted against Muslims (though Cruz talks his way around it more smoothly than Trump does), the only thing they really have in common is that their notions about how to go about actually governing the country are fundamentally extreme. Radical.

I do agree that a big part of Trump’s rise is simply attributable to old-fashioned racism. Old bigotry dies hard. I’d like to suggest, though, that there’s a deeper link between the fears of racial contamination and the trend toward polarization fueled by increasing ideological puritanism.

More than authoritarianism, what I think radicalism often does is lead people to place greater and greater faith in the power of ideological purity. Conservatism, as expounded in America for the last twenty years, takes extremism as a litmus test of seriousness.

When an entire political movement vets its candidates based on their commitment to radicalism, it is not difficult to see how a bully like Trump would appeal to its base. His entire shtick is premised on being extreme. It shows that he means business. The real story of the 2016 election is not how Trump got to be the Republican nominee. The real story is how people like John Kasich, Chris Christie, and Jeb Bush—politicians who are already very conservative—were deemed too moderate by the conservative base that turned out for the primaries.

Cruz and Trump were the last men standing because they were the most extreme candidates of the bunch. Despite all their manifest differences, what united them in their electoral successes was that every other conservative in the primary field was insufficiently radical. And when it came to the particular constellation of policy issues on which the conservative movement has staked its claim for the entirety of my lifetime, the conservative base didn’t go with the guy who became the most hated man in Washington for his utter commitment to ideological purity. No. They simply went for the guy who took the most pride in being offensive and amoral.

__________

To me, conservatism is quintessentially a productive, perhaps even progressive, resistance to radical change.

Is there any other conservative in America that you can think of who sees conservatism that way? Oh, aye, I’m sure there are a few. As I said in my previous post, however, a “few” do not a political philosophy make. Not unless they gather adherents; not unless those few are deemed in time to be originators of the discourse, as Foucault might put it.

In the last few years, I’ve found myself gravitating more toward thinkers who openly declare at least some allegiance to the tradition of liberalism, like Alan Jacobs or Damon Linker. My go-to source for news analysis is Vox, and for political scuttlebutt I hit Talking Points Memo. My favorite general interest magazine is The Atlantic. I relish reading Walter Benjamin. I’ve transmogrified into an over-educated, low-income Mugwump. I’ve kept thinking of myself as conservative, even though, apart from a few amateur bloggers, the only conservatives I read regularly publish in The American Conservative, which is notable for the diversity of nuanced, erudite perspectives found among many of the contributors who aren’t named Patrick J. Buchanan, who is a racist ass. The rest tend to be pretty good, which is to say that they did not spend the first eight years of this century circling the wagons around George W. Bush and the past eight years frothing at the mouth every time Barack Obama laced his shoes left side first.

The fact is that, in terms of my opinions on specific issues, I’ve not been conservative for some time now. It feels as though I woke up one day this last year, looked around at the political landscape, and thought, “Have I ever actually been one of these people?” Well, yes. I was, at one point. That much is true. And though I will admit that I feel ashamed of it, I feel compelled to declare that I ought not to be. First and foremost, I believe that every person ought to be granted the right to be persuaded to change their opinions over time. Mine certainly evolve, but very slowly. My current political outlook could only have been formed by the path I’ve traveled to get here. Being able to recognize the moral insanity of American conservatism is a blessing (I think?) that flows from what I’ve attempted for years to call my own conservative political convictions. The fact that my political convictions are not, in aggregate, conservative is the sort of epiphany only granted, I suspect, to those who have the perverse sensibility to luxuriate in disillusionment. What’re those sayings? “A conservative is a liberal who has been mugged by reality,” and, “Reality has a liberal bias.” Here’s one, just for kicks: a liberal is a conservative who has been mugged by radicalism.

At bottom, I don’t really feel that I have a particularly large share of blame to shoulder for American political conservatism’s collective embrace of bigotry and radicalism, because it’s my temperamental conservatism that causes me to regard that spectacle with moral horror. It simply means that, politically, I must formally recognize that I am the loyal opposition to their ranks, not an outlier within them.

As tempting as it is to switch my label, as Linker did years ago, to liberal, or to triangulate myself, as Jacobs has, somewhere within a constellation of conservative-liberal-socialist positions, that doesn’t feel right to me, either. Not at this time, anyway. Given my temperamental conservatism, it’s probably obvious to you by now that such a shift would be a bit too swift, a bit too premature. What I hope that my conservative, liberal, socialist, libertarian, anarchist, distributist, and communitarian friends and acquaintances understand is that whatever our differences, there are some principles to which I will continue to adhere for the time being.

First, I will presume good faith on the part of individual political advocates—just because we disagree on something (or most things) does not mean I must ascribe to you all that is unholy and pernicious, and I hope you will extend the same courtesy to me. Second, I will presume that no individual is beholden to all that is most rotten in his own political tradition—I will continue to refrain from holding each Leftist accountable for Stalin and Mao as I will from holding each Right-winger accountable for Hitler and Trump. Third, I will endeavor to blunt the appeal of radical measures as political solutions, regardless of the nobility and justice of the goal. Fourth, I believe that I have something valuable to learn from all political traditions, and I will presume that interactions with someone from any tradition will teach me something I can reflect on as my own political philosophy continues to evolve.

I believe that each tradition is capacious enough to include someone with those principles. It seems to me that these principles create enough of a foundation for mutual understanding that productive dialogue can take place between us.

That said, I do think that labels have power, and I wish I had one that could accurately capture my political philosophy. As J. L. Austin argued years ago, words do things. To call myself liberal or socialist is to perform some sort of meaningful social act, and consequences follow. Unless and until I have a better understanding of what those consequences are, I don’t think it prudent to throw in with you, whatever your political orientation may be. This isn’t because I don’t wish to associate with you; on the contrary, I wish our association to be productive. I wish to test you, and to make myself a better thinker and to clarify my own political ethics as part of the exchange. I just don’t wish to identify with you unless you and I are satisfied that, by doing so, we’re being accurate, and that there is something to be gained for all by having me join your tribe. By all means, proselytize me if you can.

It is my belief that politics ought to be productive. I don’t place my utter and complete faith in politics, but politics ought to serve a purpose beyond establishing and maintaining the hegemony of a radical minority. Conservatism in the United States, at present, is opposed to this conviction, which is why I stand in opposition to American conservatism. ☕


Reflections on revolution in American conservatism, part 1

This blog is not primarily meant for engagement with contemporary electoral politics, but I do agree with Alan Jacobs that blogs are meant, among other things, to hold their writers publicly accountable for thinking out loud. Current events therefore demand a reckoning of sorts.

People who know me well and longtime readers of this blog know that I have been identifying myself with conservatism. I’ve struggled mightily to retain for that label, insofar as it applies to myself, something resembling moral integrity.

To start off, then, I’d like to associate myself with a couple of posts made by Jacobs, the first from his blog at The American Conservative:

We all know what Trump is: so complete a narcissist that the concepts of truth and falsehood, right and wrong, are alien to him. He knows only the lust for power and the rage of being thwarted in his lust. In a sane society the highest position to which he could aspire is apprentice dogcatcher, and then only if no other candidates presented themselves.

If you put a gun to my head and told me that I had to vote for either Donald Trump or Hillary Clinton, I would but whisper, “Goodbye cruel world.” But if my family somehow managed to convince me to stick around, in preference to Trump I would vote for Hillary. Or John Kerry, or Nancy Pelosi. In preference to Trump I would vote for the reanimated corpse of Adlai Stevenson, or for that matter that of Julius Caesar, who perhaps has learned a thing or two in his two thousand years of afterlife. The only living person that I would readily choose Trump in preference to is Charles Manson.

And this one, from his personal blog:

As a conservative-liberal-socialist, I don’t fit onto any political maps that I know of, and I am accustomed to feeling slightly out of place — more, out of focus — in any given policy debate. But despite the sizable liberal element in my own personal political constitution, in times of serious conflict — today’s Brexit contretemps, for instance — I am always temperamentally alienated from liberalism. For what distinguishes many (most?) liberals from both conservatives and socialists, as today’s social media torpedoes reveal, is genuine incomprehension that any sane and decent person could disagree with them. […]

And this is why, despite the significant proportion of my political views that is genuinely liberal, I am less at home among liberals than among any other political group. Once their howls of outrage get wound up — and there is no outrage like that of a thwarted cultural elite — I just want to back quietly out of the room, close the door behind me, and get as far away as I can.

What I’ve confirmed over the course of the past year of following national politics is something I’ve come to realize over the last several years—or, rather, in the last decade and a half.

A central tenet of what I call “conservatism” is that the opposite of conservatism is not liberalism but radicalism. Aphoristically: conservatism is a principle of political temperament, not a policy agenda.

Edmund Burke wrote, “A state without the means of some change is without the means of its conservation.” The same is true, I think, of an individual’s political philosophy. There’s no need to retain ideological dogmata if they retain little value over time, but we ought not discard received wisdom lightly.

Within that framework, though, I consider President Obama a conservative. That might get me a “no duh” in countries much more liberal than the U.S. (or from Americans who had persuaded themselves that Bernie Sanders, bless him, was not chasing a herd of flying pigs on his unicorn), but here, that just doesn’t work. I doubt the president himself would embrace that label in the current political climate. While there are self-identified conservatives who highly prize being anti-radical in temperament, there are few to none who use that as the primary criterion for what constitutes conservatism.

So: either everyone else in America misunderstands what conservatism is, or I do.

Around this time a year ago, I may have been tempted to say that political conservatism of some extant variety was still recuperable. I would have continued to do my part to make it so.

Circumstances dictate, however, that I categorically reject any association with the category 5 flustercuck that has been brewing in the GOP-conservative coalition for the last few decades. I’ve never been a Republican, but my temperamental conservatism has, like Jacobs’s, led me frequently to identify more with those aligned with the conservative (or classical liberal, if you prefer) tradition than with the Left. Much as I tried to distance myself from particular noxious ideas within that tradition, I never thought it necessary to renounce a shared political identity tout court. That’s over now.

You can read my opinion on the Republican Party’s wholesale embrace of bigotry in my commonplace blog. Since I’ve been old enough to vote, I’ve never identified as a Republican, but I have valued my identity as an unaffiliated independent. Until this year, I would have at least thoughtfully considered a Republican before casting my vote. No longer. I will never vote for a Republican for any elected office. Ever. I don’t care how much I like an individual candidate. Whatever happens at the convention this month, the Republican Party has amply demonstrated its commitment to the values of racism, sexism, xenophobia, religious bigotry, and tyranny. Consequently, my vote will never be used in support of that peculiar institution.

Conservatives may point out that “conservatives” and “Republicans” are not isomorphic groups. True enough. There are still several conservative thinkers I genuinely respect and admire (Jacobs among them). They comprise a vanishingly small group. Most of them do not identify strongly as Republican. Even if they are decent, intelligent, and erudite people, I’m afraid that they do not typify, in my view, American conservatism. They are the rare exceptions, and I can’t identify myself as part of a tradition if I selectively edit its roster to include only the handful of good folks who aren’t braying sociopaths or historically illiterate blatherskites.

This is a matter of lex parsimoniae. 1.) A majority of Republicans self-identify as conservative. 2.) A plurality of Republicans has endorsed Donald Trump for president. 3.) Most “movement” conservatives who command the lion’s share of public attention support Trump in the name of conservatism—or, at the very least, in the name of defeating liberalism. Quack, quack, quack. That’s a flappin’ duck, folks. And this fowl game* is bigger than one election cycle.

Something is rotten in the state of American conservatism, and I, for one, refuse to follow that shambling ghost to the parapet.

My political temperament is still best described as conservative. That will certainly influence my political views, but it in no way implies identification with whatever the public discourse calls political conservatism. Let me stress this point. American conservatism has placed Donald Trump, a person in possession of mostly vile and/or dangerous political opinions, in serious contention for the presidency. Conservatism in the United States has led itself to this moment, so I think the time has come for anyone who still wants to call him- or herself “conservative” to reflect critically on what, exactly, they believe and whether the devil has given them a good price for their souls.

Time to rub the scales out of my eyes. Whatever I am, “conservative” apparently no longer applies, at least in any politically meaningful way in the present cultural context.

To be continued.

__________

* Couldn’t resist. I’m so very sorry.

Updated with link to Part 2, 16 July 2016.


How does Marvel’s culture industry manage to keep hope alive?

Like most folks who saw it, I enjoyed Captain America: Civil War. It was inferior to the previous two Captain America films, in my estimation, but it was better than Age of Ultron. Much can and has been written about the drawbacks of the Marvel/Disney entertainment monolith, and I’ve been ruminating on the film since I saw it. Chuck Bowen recently used Civil War as the occasion to reflect on the State of Summer Cinema. Allow me to use Bowen as the occasion to reflect on everything that Marvel has done right so far. This will get a little dialectical. Bear with me.

Contrary to a cliche that dogs film critics, I don’t enjoy disliking nearly every movie that earns a significant amount of money. My words are carefully chosen. “Disliking” rather than “hating”, because to inspire such a passionate response as hate would require more than a preordained blockbuster usually offers. Works of art are like people: to hate either, one must be accorded a glimpse of their personality first, and a failure to exhibit personality provokes a muffled, low-risk indifference. But try telling people this sort of thing about a Marvel production and you’re a snob.

Of all people, I can empathize with Bowen’s gripe about his own audience’s bad-faith reception. The fact that one may simply not like a film (as opposed to hating, disliking, or any other strongly agential gerund you please) does not compute for most people. When you’ve seen enough movies, the most common reaction, sadly, is non-reaction. Movies that most folks “love” or “hate” or think of as “just okay” or (God help me) “interesting” are, to the jaded cinephile, just sort of there. It’s almost a mercy when I actively hate a film, because I’m relieved to have my emotions excited by the experience. So what I’m saying is that I totally get what Bowen’s saying here. He’s got some other pithy observations, too, such as his comparison of blockbusters to the cautious personalities of job interviewees, or his point that, for those given the choice simply to skip the requisite blockbusters, “Unending exclusion is dull and estranging,” so it’s probably better to be in the loop than out.

“People have short cultural memories, but blockbusters used to occasionally be enjoyable. Even weird. Their plots might have been recycled and disposable, but they had plots, and some of them had ineffably powerful images. Raiders of the Lost Ark, one of the most influential of all blockbusters, is almost quaint now in its fealty to the idea of one hero, one villain, a heroine, a few colorful supporting characters, a MacGuffin, and a story that tied all these elements together with pleasurable simplicity. And while its protagonist, Indiana Jones, was an indestructible superman, he also has discernible human characteristics. For one, he clearly liked sex.”

Whuh? Not quite sure how Indiana Jones is an “indestructible superman” (fridge-nuking notwithstanding), because the first and third films end with literal dei ex machina that emphasize the hero’s relative powerlessness. Also, what the hell does it mean that “he clearly liked sex”? Is that the most readily identifiable human characteristic—liking sex? Unlike Tony Stark, for instance?

I’m also unsure how the Thor movies or the first Captain America weren’t pleasurable in their simplicity: one hero, one villain, a heroine, a few colorful supporting characters, a MacGuffin, and a story that tied these elements together. Guardians of the Galaxy had multiple heroes, but it stands as this decade’s superlative example of exactly the kind of film Bowen is complaining that the blockbuster machine doesn’t produce. I haven’t seen some of the other films he name-drops, but if some ivory-tower-bound neckbeard like myself, who sees maybe ten new movies a year anymore, can think of several counterexamples offhand—taken from the very franchise he’s arguing about—one might get the sense that Bowen’s punching a bit above his weight class. Or simply being obtuse. To wit:

“On a scene-by-scene basis, this new Marvel uber-movie makes almost no sense, hopscotching across dozens of cities and a couple of different timelines, plugging new superheroes such as Black Panther (Chadwick Boseman) and yet another Spider-Man (Tom Holland), while dropping cute little in-jokes designed to pressure audiences into catching up with past Marvel installments that they may have missed. This is the most irritating component of the new corporate blockbuster: it’s always heckling you to buy more, without ever giving you what you already paid for. It may be called Captain America, and more or less be a sequel to the vastly superior The Winter Soldier, but it offers a buffet of superheroes designed to abound in so much as to offer each audience member enough of what they individually like, so that they can each retrospectively assemble a different, more focused movie in their minds.”

This paragraph foreshadows how Bowen misunderstands Marvel’s project in some very fundamental ways. First, I’m not sure how the film makes no sense on a scene-by-scene basis. It doesn’t really hopscotch different timelines so much as parallel storylines. Most of these scenes are united by the MacGuffin (that’s right, folks—there is one!) of the Sokovia Accords, an agreement among the world’s nations that superheroes require some sort of civilian oversight. Coming on the heels of Ultron’s robot uprising and Hydra’s hijacking of the U.S. military-industrial complex (not to mention the emergence of the Inhumans, if one still considers Marvel’s TV franchises to be part of the same universe), one can understand that non-superheroes might want a say in how the Avengers conduct their affairs.

Civil War is a sequel to Winter Soldier, but it’s more properly a sequel to every Marvel movie to date. Unlike virtually every major franchise produced by Hollywood, Marvel has made a point of making sequels that actually push their characters forward. It’s TV-style serialization—which, in turn, was influenced by early film serialization, so I guess the blockbuster has essentially come full circle. Unlike Raiders of the Lost Ark, which borrowed liberally from the early serials’ tropes, the MCU has borrowed from their episodic structure. More accurately, it uses the serial structure that has been the backbone of superhero comics for going on a century. While it’s fair to say that there are not really any long-term consequences in serialized comics (the medium that gave us the term “retcon”), there are often short- to medium-term consequences. Superman today is (sorta-kinda-pretty-much) the Superman of the 1940s, plus or minus a few powers. But there’s a gravitational power exerted by the shared universes of the two major comics publishers that basically requires their worlds to maintain a certain status quo. Mostly for business reasons: after all, it makes it easier for new readers to jump into a series when the basic premise and stakes are never-changing. There’s also a storytelling exigency, though: long-running series change creative teams from time to time, and it’s easier to do new(ish) things within an established paradigm if you’re not hamstrung by a never-ending series of paradigm shifts introduced by each previous team. It’s more about finding interesting new facets of a superhero (and that hero’s mythos) to explore, rather than totally reinventing the superhero from scratch. The Marvel movies do this better, in my opinion, than the Marvel comics.

The Marvel Cinematic Universe simply has exigencies that the comics do not, and those make the movies more appealing to a certain kind of audience. The primary reason I never got deeply into superhero comics back when I was collecting them is that I didn’t have the money or patience to go read everything I needed to know in order to fully appreciate the context of a current story arc. Storytelling in mainstream superhero comics is intransigently incestuous. After all, the whole point of having a shared universe is to do crossovers. That’s great when there’s only, like, a dozen titles in a given universe. It’s enough to induce a panic attack when dozens of titles and hundreds of characters are in play over the course of decades.

What’s worse is that so much of the pathos from these titles comes from the assumption that readers are at least somewhat familiar with the histories of these characters. Imagine watching Star Trek: Generations without ever having seen any TOS episodes or movies and maybe only a handful of TNG. It’s not a great film anyway, but watching Captain Kirk die for real and for good is kind of a kick in the gut if you’re a Trekker in any sense of the word. If you only have the vaguest idea who he is, due mainly to pop culture osmosis, maybe the moment works, but mostly not, I’d wager. The entire film hinges upon two legendary Enterprise captains meeting for the first time for the last time and the torch being officially passed from one generation of the franchise to The Next. If you haven’t been watching Star Trek, there’s no legend. No impact.

For me, reading virtually any major superhero book was like jumping straight into the final season of Lost. There were a few exceptions, as there always are. (I loved Ed Brubaker’s Daredevil, for instance, but then, I’d already read most of Miller’s run and most of Bendis’s thanks to my local library’s shockingly capacious graphic novel collection.) On the whole, though, it simply never mattered to me. Any of it. Or most of it, rather.

However much the Chuck Bowens of the world complain about each installment in the MCU being a product placement for other installments, the product line is comparatively sparse if you put it alongside the comics. You want to get caught up before Civil War? ‘Kay. Rent Iron Man (just the first one), the two Avengers movies, and the first two Captain Americas. You don’t need any Hulk, Thor, or Guardians. You don’t need the TV shows. You don’t need the films produced by 20th Century Fox. Will you miss a few inside jokes? Sure. Are you on the hook for 10+ hours of entertainment? Yep.

Know what you’re not on the hook for? Approximately 82,946 comic books, including back issues and current releases, because you happened to pick up the latest X-Men and you don’t know who the hell any of the characters are or why the one guy you do recognize is now a gay psychopath who’s also apparently the clone half-brother of some other character who’s Professor X’s great-granddaughter from an alternate future who is responsible for the sixth (or seventh?) time Wolverine got amnesia and had to go work as a short-order cook in Laos, where he eventually teamed up with the Punisher to take out the assassin/warlord who will (tune in next month! Excelsior!) be responsible for murdering Matt Murdock’s latest doomed girlfriend, which will somehow precipitate the third superhero Civil War.

Part of me appreciates the operatic plot lunacy and behind-the-scenes organization it takes to pull off stuff like that even halfway successfully. It’s the same part of me that, in the abstract, thinks that soap operas and pro wrestling are kind of cool. The other part of me looks at my wallet and my time commitments and goes, “Yeah, I can check out the latest Marvel flick every eight months. That’s way more doable.”

It’s doable for movie audiences because the kind of money and organization required for large-scale blockbuster film production can only make movies like this happen every eight months or so. The most marketable part of these movies, apart from the franchise branding, is the truly impressive roster of performers Marvel has assembled. You only get RDJ or Chris Evans for so many movies, so you better spread ‘em out and make ‘em really count. Similarly, Marvel’s scored big with some of its behind-the-scenes hires. Joss Whedon, obviously. James Gunn, though, was a stroke of genius. Guardians of the Galaxy is the single best film in the MCU, and apart from the first Iron Man, it is the least dependent on the films’ shared mythos.

What will be even more interesting to see is whether Marvel has the ambition and vision to continue the current MCU well past the tenure of its founding players. Chris Evans will be done with Steve Rogers after the Avengers two-parter. Downey, Jr. is likely to bow out sooner rather than later. Marvel can always recast or reboot, but the cool thing about movies like Ant-Man or Guardians is that they do well without necessarily being based on known quantities. I get that making a movie about Marvel’s first black superhero is a big deal (and more than a little overdue), but seriously: did anybody outside of the comics nerd-o-sphere know who Black Panther was until Marvel stuck him in Civil War?[1] Would Ta-Nehisi Coates have gotten a shot at writing the comic right now without Marvel deciding to branch Panther off into his own film? Would they have chosen to do so if they couldn’t have spliced him into a strong, stable franchise like Captain America first? Things like this are part of the upside of the entertainment-industrial complex. Money + hype = willing viable franchises into existence. (Not always, but you have to admit that momentum is on Marvel’s side at present.) It’s quite possible that the MCU, unlike the comics publishing arm, can survive the retirement of its initial flagship characters if there are folks like Chadwick Boseman and Tom Holland waiting in the wings. (Or, for heaven’s sake, Scarlett freaking Johansson, who has already appeared in more Marvel movies than everyone but Downey and Evans.) Put another way, it’s possible for the MCU to evolve in ways that the Marvel comics universe simply can’t, because we’re talking about two different markets and two different media, one of which depends on real, flesh-and-blood people to play the characters.

Not likely, I admit. Just possible.

I like the idea of the MCU growing and evolving, as it were, in real time. One of the distinct pleasures of watching the Marvel movies since Iron Man has been watching talented stars and writers collaborating to find interesting things to do within the constraints of the franchise. They age. They mature. Downey is a better Tony Stark now than he was in 2008. Chris Evans is a better Steve Rogers. Most movie actors get one film in which to get to know their characters. Downey and Evans have gotten six and five, respectively. To me, their performances reflect that process. And that’s tied to the thing that Bowen gets so, so wrong in his reflection on Civil War. It is, in fact, the thing that he misses the most completely.

“Would it kill the film-makers to offer just one memorable bit of dialogue? Every spoken line in Civil War serves an expository purpose. Or how about just one image that strives for poetry? Would it kill one of these movies to feature characters who are capable of actually dying? Or crying? Or changing allegiances? Or having money problems? Or loving, in a visceral, personal way, rather than in the usual platitudinous fashion that testifies to the needs of teaming up yet again to mount yet another adventure?

“Watching Captain America: Civil War, in which positively nothing is at stake, I checked my watch 25 minutes into the film, sighing at the realization that there were nearly two hours remaining. How can audiences stand this? By submitting to the anesthesia of the loudness, I suspect, by comforting themselves with the knowledge that they are, at this moment, doing what culture expects of them. Seeing the ‘big’ thing, the Super Bowl of yearly adventure epics.”

The whole point of Bowen’s piece is (I think) to chastise audiences for letting Hollywood get away with selling them all of these films that are structurally the same, but he displays no grasp whatsoever of what has changed from film to film. Lacking that grasp, he cannot understand why audiences keep showing up.

We’ll set aside haggling over what counts as a memorable line or a poetic image, or even what counts as “loving, in a visceral, personal way,” because I would argue that Civil War utterly hinges on what Eve Kosofsky Sedgwick calls homosociality, and even tweaks it a bit with the role that Black Widow plays in the character dynamics.[2] Let’s focus on three questions. “Would it kill one of these movies to feature characters who are capable of actually dying? Or crying? Or changing allegiances?” Let’s take these one at a time. The correct answers (phrased in the form of questions) are, in order:

Would it kill one of these movies to feature characters who are capable of actually dying? What, you mean like Peggy Carter? Title character of the Marvel Television Universe’s best series? The love of Steve’s life who kicked ass with him in the first film and died and was buried in Civil War?

Or crying? What, you mean like Wanda, after she blames herself for killing those civilians in the first scene? Or Tony, expressing a mix of sorrow and rage after Rhodey gets shot down in the climactic fight at the airport? I’m sorry, does a character have to bawl uncontrollably—cry on cue, as it were—in order to count as “crying”?

Or changing allegiances? What, you mean like THE ENTIRE PLOT OF THIS MOVIE? Like how the first Avenger, Captain America (remember the first film’s title?), breaks his allegiance with the Avengers as a matter of conscience, and spends the whole film fighting with his former teammates as a result? Or how Black Widow totally confounds the entire idea of allegiance by trying to remain loyal to both of her friends and teammates, and also has to leave the Avengers as a result? Or how Black Panther goes from trying to murder Bucky to apprehending the real killer when he realizes he’s been duped? I’m actually thoroughly confused by this question. Just so I don’t cause any confusion, this question is not rhetorical: Chuck, did you actually watch Captain America: Civil War?

The purpose of Bowen’s series of facetious queries, of course, is to buttress the claim that “positively nothing is at stake” in Civil War. Again, this would be a totally baffling claim, even if we took Civil War as a case by itself. In the context of the MCU, it is about as objectively wrong as you can get. That is, if context and character development matter to any criticism based on a method of close textual analysis. (Hint: they really do!)

As others have already noted, both Steve Rogers and Tony Stark have actually had quite distinctive character arcs across the films in which they’ve appeared. Stark starts out as the reprobate Ayn Randian hero who worships at his own altar and must continually learn and re-learn the principle of self-sacrifice for the greater good. As the films progress, his sense of responsibility becomes less personal (due in large part to the lessons learned from the consequences of his own arrogance) and more based on principle, more directed toward the global community. In the Iron Man films, Tony is repeatedly forced to pay for his mistakes or the mistakes of his family or his corporate empire. Avengers is the first instance we get of Stark sacrificing himself genuinely selflessly. He invents Ultron partly as a psychological defense against his own perceived weakness, but also because he wants to protect Earth proactively from global threats. It is his greatest failure, the culmination of the arrogance displayed in each of his standalone films, and it is what leads this individualist bad boy to push for the institutional restraint of the Sokovia Accords. He knows that he cannot be trusted to hold himself accountable, so he welcomes the prospect of oversight. As the final confrontation with Cap at the end of the film shows, he knows himself quite well—he is unable to stop himself from trying to exact revenge for his parents’ murder. But the Tony Stark of Civil War is one making a conscious effort to restrain his arrogance; only someone with such a fatal flaw could recognize it manifesting in someone else: Steve Rogers.

Steve Rogers, a paragon of the Greatest Generation, becomes a superhero principally out of a willingness to put himself at the service of the government to fight evil. Not just for his own sake, but because he believes the world to be at stake. When Nick Fury taps him for the Avengers, it’s not much of an issue for him. It is, in fact, Tony’s innate resistance to institutional trust that persuades Cap to question Fury and discover the weapons program that provoked the alien invasion in the first place. He tries to carry on in Winter Soldier, but finds that the institution to which he has devoted his life is utterly infested with the evil he sacrificed himself (in the first film) to wipe out for good. Skepticism toward technocratic guarantees of world peace underwrites his hostility to Ultron in Avengers 2, after which it is Tony who is forced to agree. By the end of that film, Steve takes over the Avengers because there is no institution left on earth that he can trust. That need for moral independence is what informs his rebellion against the Sokovia Accords in Civil War. His blind loyalty to Bucky is not merely personal affection and perhaps guilt over what happened to his oldest friend; Steve’s experience in each film has led him to value loyalty among comrades in arms above all else. External constraints, such as SHIELD or Ultron, have only compiled evil upon evil. In his arrogance, Steve believes that he can trust only his own moral compass, so he defies international law, deceives Tony about his parents’ death, and ends the film by founding his own rogue group of vigilantes. The consummate team player has become the ultimate loose cannon.

In short, Tony Stark and Steve Rogers’s character arcs have had inverse trajectories, developed carefully and (shockingly) subtly over the course of the last decade, and what is at stake in Civil War is both thematic and personal. Thematically, the film presents two paradigms of the ethical use of force. Iron Man is a good guy, but he requires institutional constraints to use his power ethically, because he fears being a loose cannon. Captain America is a good guy, but he’s a loose cannon, because he fears that institutions will use his power unethically. Personally, Iron Man once again finds that his family’s tragic history traps him in an apparently unending cycle of retribution. Captain America is offered a final chance to save his oldest friend. Iron Man spends most of the film seeking justice, only to have it turn into vengeance. Captain America is trying to redeem one friend by—to put it bluntly—screwing over another.

There be stakes all over the place. And that’s just for the two lead characters.[3]

More significantly, the stakes really only come into focus if you have, as Bowen says, done what culture expects of you: always checking out the Next Big Thing. Marvel counted on viewers having already invested their time and emotional energy into Tony Stark and Steve Rogers. Without that investment, there’s no payoff in Civil War. Just a bunch of latter-day demigods punching each other into buildings and making wisecracks. With that investment, the payoff is witnessing the tragedy of a broken friendship, of an already-broken man being denied justice for his parents, of a once-upright man turning lawless because the lawful institutions have, one by one, betrayed him for half a century. Amid all this tragedy remains hope, of course. That hope is stipulated by the money machine at the heart of MCU. Steve and Tony will reunite in Avengers: Infinity War because they have to. That doesn’t erase the manifold tragedy in Civil War, but it does structurally affirm that, despite the heartbreak and tragedy, heroes will ultimately do what they must simply because they’re heroes.

__________

[1] If Civil War succeeded in nothing else, it made me terrifically excited for the Black Panther and Spider-Man movies. Boseman will be a great leading man, and there are a ton of exciting possibilities for T’Challa in the MCU. Tom Holland’s Spider-Man was both the best and most extraneous part of Civil War. In an already overstuffed film, his was the only new character who didn’t really serve a plot function. The scene where Tony recruits him, however superfluous, at least felt fleshed-out on a character level. In a story that leans so heavily on Tony’s troubled relationship to his dead dad, we get to see Tony get paternal with a kid who has so much in common with him. Peter and Tony both lost their father-figures (Uncle Ben and Howard Stark, respectively), both are nerds, both have taken it on themselves to be heroes outside the law. It’s a rather sweet scene. Also chilling. Just like Howard, Tony places unreasonably high expectations on Peter to manipulate him. The line between Tony turning Peter into his weapon and Tony relating to Peter paternally is blurry here, but that makes it all the more real and resonant, given how the film ends. Still sort of unnecessary, all things considered, but if the writers were going to shoehorn Peter Parker into the film, at least they did their best to make it make sense that Tony would recruit him. Holland and Tomei are sort of perfect as Peter and May, and the airport scene, in retrospect, feels mostly like a proof of concept for the kind of awesomeness (stunning high-flying acrobatics and nerd-witty banter: check!) we can expect from the next Spidey solo film. Sign me up.

[2] And by the way, I get that Bowen is trying to be cheeky when he asks rhetorically, “Wouldn’t Captain America: Civil War be a more interesting movie if Captain America (Chris Evans) and Iron Man (Robert Downey Jr) fought over, say, the affections of Black Widow (Scarlett Johansson), whose approval they are both clearly jockeying for anyway?” Yes, what a Paulette you are, Chuck, ever-so-subtly and perhaps-(perhaps-not!)-unironically insinuating that the Natasha character would be far more effectively deployed as the object of affection in a male rivalry love triangle. Or, wait. No. Actually, I think that makes you a sexist jerk. My mistake.

[3] Civil War’s villain is also tragic. He is a direct product of the last Avengers film. Zemo has no superpowers, no great resources. Just a keen intellect and the drive to exact revenge. Like Steve, he’s a former soldier whose institutions failed him and those he cared about. Like Tony, he is a genius operating without restraint. Zemo is who Tony Stark might be if left to his own devices, but he justifies his villainy according to Steve Rogers’s ethos. If he’s the dark mirror to each of this film’s heroes, it reflects rather badly on their inability to resolve their differences.


Is it Thursday yet?

Last month, my wife and I finally stopped being outlaws. We had been watching Critical Role on YouTube for several months. Not on Geek and Sundry’s official channel, mind you. Nope. Some user had thoughtfully put together his own playlist, updating it each Monday with the latest episode. I fully realize that this is the 21st century, and that a vast majority of people don’t care if they’re pirating stuff. Screw those people. My wife and I spend precious little of our money on entertainment, but we figured that if Critical Role had given us nearly 150 hours of joy over the course of the last year, the least we could do was support it in the only way that matters in a marketplace. So we bought a Geek and Sundry Twitch subscription.

Geek and Sundry, of course, is the web-based entertainment company founded by Felicia Day. Capitalizing on the cachet Day earned with The Guild, G&S is home to nerdy shows like Wil Wheaton’s TableTop and Co-Optitude, which Day hosts with her brother, Ryon. (Wife and I are fans of those, too.) G&S is a multiplatform presence, streaming videos from its official website as well as YouTube. Twitch bills itself as “social video for gamers,” which is apt enough. The platform includes live video streaming and chat functions, so you can watch your buddies play Halo or Hearthstone and comment on the game with other users besides the gamer in real time. Most of the popular channels are devoted to video gaming. G&S offers a variety of shows that are primarily oriented toward tabletop gaming.

What makes G&S’s Twitch experiment so intriguing is that it’s live. It seems, in other words, that broadcast media has come full circle. People from my generation and those even younger probably only know about old-time radio from movies like Woody Allen’s Radio Days (from what you might call his “peak Farrow” period), or perhaps they listen to shows like WPR’s “Old Time Radio Drama” (or whatever else is locally available outside of Wisconsin). While Twitch does allow its users to archive livestreams on their channel pages, the real draw is watching shows that are devised with the affordances and limitations of a live broadcast in mind.

Subscribers from around the world participate in the chat, peppering the hosts with questions, unsolicited advice, and solicited recommendations. While there are some shows designed around the chat function (like the recent trial of The Scavenger), most simply feature a confab of young, charismatic nerds playing games like Rock Band or HeroClix. The genius of Day and Wheaton is that they figured out that there was a fairly sizable niche audience of folks who would enjoy watching young, charismatic nerds play tabletop games. TableTop itself is almost paradigmatic in this regard. Each episode features Wheaton and four celebrity guests playing a different tabletop game, cracking wise about the diegetic absurdities of the games and sublimating their own cutthroat competitiveness into self-reflexive jibes. (Not to mention erecting a mythology around Wheaton’s own incredibly bad luck throughout most of the first two seasons. For instance, you now say, “I just Wheatoned,” when you roll really badly with your dice.) Unlike TableTop, the games on the Twitch channel unfold in real time, so many (though not all) hosts come from an improv background, flexing those theater muscles to carry two- to three-hour games with breezy insouciance.

That’s part of what makes Critical Role so special. As the host and Dungeon Master Matt Mercer opens every episode: “Hello! And welcome to Critical Role, the game where a bunch of us nerdy-ass voice actors sit around and play Dungeons and Dragons!” That’s pretty much it, but it explains very little about the show’s core appeal. What the description misses is just how gifted these actors are and how expertly they deploy their improv skills to flesh out and inhabit their characters. Some, like Sam Riegel and Marisha Ray, use something very close to their own accent and timbre as they play (respectively) Scanlon, the gnome bard, and Keyleth, the half-elf druid. Others, like Travis Willingham and Orion Acaba, demonstrate their professional range to give an Anglicized working-class growl to (again, respectively) Grog, the goliath barbarian, and an upper-class twit brogue to Tiberius, the dragonborn sorcerer. The use of different accents and timbres is a helpful marker in the cast’s code-switching, as they flip merrily between their in-game characters and real-life personalities.

That, too, is part of the charm. Like any great improv troupe, the cast revels in surprising each other with totally in-character moments of ribaldry or pathos. One of Willingham’s greatest moments in the show, for instance, is when Grog locks himself in an outhouse to have a conversation with his cursed, sentient sword, Cravenedge. Though utterly hilarious, it carries some emotional weight, as one of the other party members, Percy (played with devilish calculation by Taliesin Jaffe), had just recently been delivered from bondage to his own cursed weapon. While Grog doesn’t want to pose a danger to his own group, he relishes the power given to him by the sword, and he’s no more inclined to sacrifice that power than Percy was, even with his growing suspicions. Similarly, Liam O’Brien and Laura Bailey play twins, Vax and Vex (respectively), whose comic bickering rings solidly true, but whose co-dependence delivers some of the biggest emotional impact in the series, especially when one or the other flutters over death’s threshold, instilling the other with uncontrollable panic. All of the characters often make very bad decisions for reasons that make total sense, and it then becomes the job of Vox Machina, their party, to pull their reckless butts out of the fire.

The commitment to character consistency has intersected with the challenges of live broadcast in some interesting ways. Perhaps the most controversial moment in the show’s run so far has been the departure of Orion Acaba after episode 27. Independent of the real life drama surrounding the event, the sudden departure was not entirely out of character for the flighty sorcerer, and his official farewell (performed by Mercer) in episode 37 was a somber highlight in the epilogue to the party’s first full arc without Tiberius. Another long-running challenge for Critical Role has been the incorporation of its gnomish cleric, Pike. Because Pike’s player, Ashley Johnson, pursues a live-action career that calls her away from Los Angeles, where the rest of the cast is based, she’s been missing for huge swaths of the show, not least its initial few episodes. While she worked on Blindspot in New York City, Johnson telecommuted via Skype for several episodes. The distance and technical difficulties for Johnson meant that Pike was forced into a much more reactive role within the party, but her sporadic appearances also had the effect of reminding the cast and their characters how vital she is to the dynamic of Vox Machina. Indeed, one of the finest moments in the show was Johnson’s surprise appearance on-set for episode 22, during a shooting break for Blindspot. The delight of the cast members to be reunited with Johnson was perfectly intertwined with the delight of their characters, who had not been together for four weeks. In a live format, the necessity of having the players physically present together in one place is something that must be dealt with head-on; it can’t be “shot around.”

When technical difficulties occur in real time for us, the audience, it’s also about a thousand times more frustrating than a jam-up on YouTube. After all, when we were watching Critical Role on YouTube, we might have to abandon the video if YouTube was being stupid and come back to it later. That sucked. Then again, we rarely watched an entire episode all at once anyway. Critical Role episodes average three hours, and some have stretched past four. Given our schedules, my wife and I don’t usually get home until after 8:30 pm, and we’re usually asleep by 11. So while we were watching on YouTube, it became our custom to watch CR in one-hour blocks or so, breaking each episode into three nights’ entertainment. Besides prolonging the pleasure of each episode, finishing one also meant that we only had to wait four or five days until the next one.

Now that we try to watch Critical Role on Thursdays, when it airs (7 pm Pacific Time for its cast/crew, 9 pm here in the Midwest), that rhythm is severely disrupted. While it’s unusual for us to manage to stay awake until midnight on Thursdays, we usually watch at least two- to two-and-a-half hours as it streams live. That is, unless Twitch poops out on us. Or we poop out from fatigue. Neither of which is the worst thing in the world. And full episodes are uploaded by the next day, so we can pick up where we left off pretty quickly. But Twitch is, in our experience, still rather buggy. And since Critical Role is literally the first regularly-scheduled program that we have made a point to watch at its regularly-scheduled time since we got married,[1] not being able to watch it at that time is so much worse.

Worse, because we usually finish watching each episode on Friday nights. That’s awesome, in the sense that we get to finish the latest episode almost immediately afterward, and on our own schedule. But it also means that we have to wait until the following Thursday to see the new episode, and a less-than-perfect experience makes us all the hungrier for a better experience the next time. Which is usually no less than six days away, as opposed to the four or five it normally was when we watched episodes on YouTube.

There’s a bigger reason why it’s worse, though. After being spoiled for years by services like Netflix, Hulu, and Crunchyroll, which are at their best when you get to marathon episodes in large gulps, waiting for Critical Role each week is practically an exercise in discipline. There’s a reason why the fan-sourced tagline for Critical Role, “Is it Thursday yet?”, is how Mercer closes each episode. The hunger for each episode is not felt by each fan alone; we feel it together. That time slot on Thursday is special because it really means something. It’s the only time when all of us—the fans, the film crew, and the cast—get together for the Critical Role experience live. In real time. It happens first and for real only on Thursday. Everything afterward, while still thoroughly enjoyable, is not unique. It’s reproduced. That doesn’t lessen the enjoyment of the episode, but it also cannot replicate the sense of live connections being forged in the moment.

Fans of Critical Role are called “Critters,” and both the fans and players commonly refer to the “Critter community.” My wife and I don’t participate in the chat (which goes way too fast), the Reddit threads, or on Twitter, where the cast interacts with Critters on a regular basis. Yet I believe we do feel at least tangentially connected to the Critter community. In old message board parlance, we’re lurkers. But that sense of participation is something that we’re enabled to feel each Thursday night by virtue of the fact that we watch the show live, as it is streamed. The story itself is improvised with each breath and dice roll; the players are putting on a show for us, but they are also putting it on for each other. We, the audience, are simply invited. That invitation to the event itself, though, is always and only for Thursday at 1900 Pacific Time. It is the only time when none of us, collectively, knows what will happen next, and it is the only time when all of us, collectively, get to see what happens next. It is the only time when fear that something could go critically wrong is held perfectly in tension with the sincere hope that everything turns out all right. We, the viewers and players, are bound together in time to each moment.

There is something utopian,[2] I think, in the voluntary discipline of this ritual. Ritual discipline is something I don’t think I have appreciated enough in my life. It is, to be sure, qualitatively different from weekly worship services. It is also qualitatively different from live broadcasts of sports competitions, like football games. While I appreciate worship services far more deeply than sports competitions, I do acknowledge that events which technically occur only once—here, now, for those of us present—but which are ritually repeated at set times are, much like live artistic performances, somehow necessary to the human experience. These things give meaningful shape to our experience of time and space, and the most meaningful of these rituals take narrative form.

One of the great lies told about worship services is that it’s the same old crap every Sunday. In one sense, that’s true. Liturgies are cyclical, and they draw upon the same source material week after week, year after year, century after century. Yet. With each week, year, century, millennium, this circumscribed time with its own circumscribed set of conventions is made new by the fact that those present—here, now—are never the same. We are always older. Always slightly different. Always experiencing this same time in a new way, filtered by our passage through time. We die. Others take our place. They are not us, but we are them. We are made new by our participation in the ritual, by experiencing collectively a totally unique event that nevertheless replicates a set structure at periodic intervals throughout our lives. The narrative structure of these rituals is what gives narrative structure to our own lives.

Like any conventions, though, the governance of our life-narrative is not totally beholden to dogmatic minutiae. There is room for improvisation and surprise. These are also necessary. There is a certain delight, or perhaps catharsis, that can only be had by bonding together with others in the surprises that unfold themselves within the conventions of ritual. That’s why it’s healthy when someone farts loudly in church. That’s why it’s shocking when a pro ballplayer suffers a career-ending injury on the field. That’s why we know when a stand-up comic tells us the truth. Are these things always delightful? Cathartic? Perhaps there are better words. Joy and awe. Rituals are not meant to be dry, empty obligations, but celebrations of being alive, and they are meant to inspire gratitude that we are alive to recognize meaning in this moment: here, now, together.

Rituals build communities, and communities thrive on ritual. That is true for individuals, families, villages, nations. It’s true that my wife and I simply don’t have the wherewithal at present to be active in the online Critter community. For now, though, we have made a commitment of time and treasure to experience Critical Role as it streams each week. It is something we cannot pilfer or reproduce without losing something of its meaning. In finally subscribing to one of our favorite shows, we have finally begun to participate, however marginally, in a ritual that makes the lives of thousands, una communitas sine finibus, just that much more vibrant.☕

__________

[1] I don’t count Doctor Who, which we typically get from Amazon the day after each episode airs. That’s pretty close, but not really the same thing as watching it as it’s broadcast.

[2] I’ve written very critically about utopia in the past. I’ve changed my previous position on utopianism about 165 degrees. Someday, perhaps, I may elaborate. Suffice it to say that I think utopian hope and utopian process are necessary components of any thriving community. I agree with Ernst Bloch that anti-utopianism tends to stifle positive social change; I disagree with any utopian theorist who views the shoring up of inherited traditions as inherently regressive, weak utopianism or as anti-utopian.


Kosmas Aitolos: “love of God and love of brethren”



Anime listicle: the utopian workplace comedy

As I’ve written previously, I don’t have the emotional patience to watch anything that isn’t fairly light and diverting, so even my anime diet is suffering reduced portions of the action and science fiction shows I used to consume in large gulps. Though there haven’t been any new shows like this in a few seasons, it occurred to me a while back that there’s a certain type of comedy that I find particularly appealing. Broadly speaking, it’s a slice of life show, but one that centers on the workplace as the unifying institution in the characters’ lives. There is almost always some sort of blossoming romance threading through the series, and there is almost always a self-conscious absurdity to the show—usually in the form of a character’s simply implausible eccentricity, or perhaps a supernatural element. In any case, the most important thing about all these shows is a certain vibe that persists in greater or lesser degrees.

To be honest, I’m not entirely sure how to characterize this vibe. Categorizing it by the shows’ premise—the “workplace comedy”—doesn’t really capture it, just as the “school club comedy” doesn’t really capture the range of comedies anchored in after school clubs. Categorizing it by other (sub)generic classifications—farce, screwball, romantic comedy—doesn’t quite suffice either. Those labels fit, as do adjectives like “zany” or “charming.” They just don’t capture it. In fact, the closest American analogy I can think of to what I dig about these shows is Parks and Recreation, which is certainly a workplace comedy with farcical and romantic elements, and which is frequently both zany and charming. Parks and Rec also grows out of a long tradition of workplace comedies like Taxi, WKRP in Cincinnati, Wings, NewsRadio, and, obviously, The Office. Perhaps the most characteristic thing I can say about the vibe is that it is strangely utopian. Think about it: as much as the characters in these shows annoy the crud out of each other—to the point of being arguably dysfunctional—the truth is that the workplace is what provides these characters what is, as far as the audience can tell, the most important social network in their lives. The sitcom format also requires the minor conflicts of each episode largely to be resolved by the end of each episode, meaning that longstanding conflicts or resentments can be nursed for extended periods of time, but that there’s enough stability and human connection there to patch over those conflicts for at least another week.

Think about M*A*S*H, for instance. This show was entirely about the struggle of its characters to maintain their sanity and basic human decency in the middle of a war. People of good will can disagree over whether it travesties Robert Altman’s original film or if the shift toward dramedy in its later seasons was a bridge too far.[1] At heart, what really makes the show work is the genuine affection that it cultivated in its audience for its characters—characters that started out as caricatures (especially in the film, if I may be so bold), but who discovered and cultivated their shared humanity amidst the most deplorable conditions. In essence, these were all characters stuck in the workplace from hell, but it was either form a passable community or bust. So it was that the 4077th, with all its dysfunction, absurdity, and (debatably) bridge-too-far descents into melodramatic tearjerkery, was a utopian space created anew each week for just over twenty minutes.

The best of these workplace comedies acknowledge that many of the characters (if not most) have other important relationships in their lives, of course. Many supporting characters have significant others that remain with them for most of the series, or they have friends or social lives that are fulfilling in other ways. But these comedies acknowledge the often uncomfortable truth that we spend more time with our coworkers than we do with our families, and that many of our most important relationships—or at least most of the small, daily, mundane activities and events that give shape and definition to our inner lives—are rooted in the workplace. There’s something utopian about seeing a dozen-odd characters forge a long-lasting community over the course of however many weeks we spend with them.

In short, I love workplace comedies when they’re done well, and the anime shows that channel this particular vibe are especially good at plugging into a little something extra that we just can’t get with most live-action shows (although, again, Parks and Rec somehow managed to do it). The following shows are listed to give you a sense of the kind of show I mean. I’ve listed them in descending order, with the most paradigmatic show listed last. If there are any that I have neglected to mention, by all means let me know in the comments. I’m always happy to take recommendations.

Honorable Mention: Monthly Girls’ Nozaki-kun

A technical case could be made that this is mainly a high school club show. Hence the HM, rather than a place on the list. But it absolutely has the vibe of what I’m talking about. All the characters in this show are linked by their relationships with a single character: Nozaki, who happens to be (secretly, natch) a popular shojo mangaka. The big joke of the show is that this artist, whose comics are so in touch with the authentic romantic desires of his female readership, is just some clueless dude who takes inspiration from the dumb stuff that happens to the people he knows in real life—only when he translates his comically dense misunderstanding of the world into shojo tropes, it’s romantic gold. The real heart of the show, though, is the way Nozaki’s social network coalesces into its own pocket universe, one anchored in what amounts to his part-time job.

Honorable Mention: You’re Under Arrest!

I haven’t seen all the shows in this series, but I’m calling it a HM mostly because it doesn’t quite reach the absurd heights of most of the shows elsewhere on the list, and it’s not as straightforwardly a comedy (although it’s frequently quite funny). The premise is simple: the daily adventures of a pair of traffic cops in a Tokyo suburb. As with most of these shows, the premise is a useful anchor point for bouncing a lot of characters off each other and slowly developing their relationships over an extended period of time. It’s wonderful in its various incarnations; I’m just not certain it gives off the vibe I’m talking about.

How editors motivate their talent.

7.) The Comic Artist and His Assistants

For the most part, this is an amalgam of harem and ecchi tropes packed into mini-episodes. You’d be forgiven for thinking, initially, that it’s just another dumb show about a perverted manga artist who somehow manages to find himself in uncomfortable scenarios. It is that, certainly, and if mild fanservice and pantsu humor are your cuppa tea, this is a passable series. What elevates it is that it becomes much more about one of the assistants and the real value she gets out of working for her (pervert) boss than about the titular comic artist himself.

At that hour, this must be powerful magic indeed.

6.) The Devil Is a Part-Timer!

Ranked slightly below the next entry primarily because the workplace aspect of this show is so tertiary to… well, pretty much everything. That said, it’s a fantastic show. The title explains the central joke: when the Devil flees his parallel dimension after a group of heroes defeats him in battle, he winds up in our world. With only minimal reserves of magic left to draw on, the Devil is forced to get a part-time job slinging burgers, and he decides to rebuild his empire on our Earth by working his way up the corporate ladder. A lot of this show is devoted to supernatural battles (all excellently done), but the core emotional trajectory is that of the demon king learning the value of life, work, and friends.

Every joke in this show’s quiver, captured in one image.

5.) I couldn’t become a hero, so I reluctantly decided to get a job.

Nearly contemporary with The Devil Is a Part-Timer!, Yu-sibu is, in most ways, terribly inferior as a show. The jokes are telegraphed and uninspired, the central romance is beat-for-beat predictable, and there’s a ton of gratuitous fanservice. And when I say “gratuitous,” I mean there’s an episode early on that’s barely on the respectable side of tentacle porn. Once the show figures out that it’s a semi-earnest comedy about a commoner teaching a highborn how to value living like normal folk (albeit one that continues with gratuitous, if not-as-rapey, fanservice), it works a lot better. What saved this show for me was, at rock bottom, the workplace vibe. More than most shows on this list, it makes a point of emphasizing the hierarchical structure of the Japanese workplace and the web of mutual obligations that goes with it. For that reason, its high-stakes, action-fantasy climax feels weirdly earned and sincere.

4.) Denki-Gai

While not a masterpiece, Denki-Gai is an almost perfect example of the kind of series I’m talking about. It takes place in a manga shop in Akihabara, so all the clerks are otaku of some variety. Like a lot of school club comedies, it spends perhaps a bit too much time making a spectacle of its characters’ eccentricities and not enough time delving into their lives outside of the shop—there’s a relatively thin supporting cast here that is not institutionally connected with otaku culture—but it’s warm and funny. The focus is on the developing relationships among the core cast of characters, and a lot of wacky situations are contrived in order to make that happen. Much as I hate retail work, this kind of show makes it seem reflexively appealing (and necessary) without losing sight of how hard it often is.

The Boss, obviously.

3.) Servant x Service

Based on a manga by the creator of Working!!, SxS follows the misadventures of a group of civil servants. Of all the shows on this list, it’s probably the most consonant with the feel of similar American sitcoms: a bunch of wacky characters bouncing off one another in the confines of their cubicles, with occasional detours into the lives of patrons or tertiary friends, family, and acquaintances. While not exceptional, it is, like Denki-Gai before it, an almost perfect distillation of the vibe I’m talking about into a single series. Given a sequel, I think it could expand on its core cast’s relationships pretty significantly without losing sight of the dynamics that make it so appealing. Oh, and the boss is either a talking rabbit or he uses a robotic rabbit as his at-work avatar.

2.) Polar Bear’s Cafe

One of the greatest anime series I’ve ever seen, Shirokuma Cafe is not entirely about or set in a workplace, but it overflows with the vibe I’m talking about. Like any sitcom, it has a relatively small core cast, but like great anime comedies dating back to Urusei Yatsura, it expands continuously on its cast in a rather astonishing feat of sustained social worldbuilding. Also like Urusei Yatsura, Shirokuma Cafe has a perspective entirely peculiar to itself: the humor is wacky and deadpan—not unlike Wes Anderson’s adaptation of Fantastic Mr. Fox—but also pretty chill. Every once in a while, it sneaks in just enough snark to leaven the genuinely utopian feel of the rest of the series.

It’s one of those shows that you can easily describe in a single sentence and never quite capture: Humans and talking animals who hang out together at a cafe get into lots of dumb adventures. If that appeals to you, great—go watch the show immediately. But the particular characters in this show each have distinctive personalities, and their relationships really evolve over time. The evolution is slow, and is more of a constant deepening—a strengthening of community by routine—but it’s also peppered with delightful absurdity and eccentric characters whose eccentricity is (thankfully) not photocopied from any number of twee, so-called “indie” films.

The titular cafe and a nearby zoo serve as the institutional loci for the show’s copious network of characters, but the core trio is the lazy Panda, the unctuous Penguin, and the puckish Polar Bear, who holds the entire community together with a mixture of trickster humor and patronly care. There’s truly no end of delights in this motley assemblage of personalities, which range from the bizarrely eccentric to the aggressively normal. The cherry on top is that, by the end of this series, you feel as though all the main characters have truly grown—not just grown, but grown together, with their ad hoc community having been utterly central to their (ever-so-slight) maturation.

Takanashi is, quite sadly, not entirely misunderstood by his contemporaries.

1.) Working!!

If every series were like Working!!, I suppose the original wouldn’t be so special. That said, the anime industry could do worse than have every studio take a crack at making at least one Working!!-esque show. This show tops the list because it is utterly paradigmatic of the kind of show I’m talking about. Based on a four-panel manga by Karino Takatsu (also the creator of Servant x Service, remember?), it draws its drama, such as it is, primarily from romantic comedy subplots (basically, they’re all idiots who don’t know themselves well enough to be honest with the objects of their affection about how they feel), but its real appeal is the obvious pleasure it takes in following the daily absurdities that crop up when a bunch of slightly peculiar people wind up working in the same place. Everyone has his or her quirk, none of which is totally debilitating, but which sets them all up for the kind of codependent niches they can only really find with the particular social set at this particular place. Not that they don’t all have lives outside the workplace—they do, and Working!! does a masterful job layering them into the misadventures of the workplace crew—but our perspective on those lives is always filtered through our judgment of the characters as formed through their interaction with each other at Wagnaria, the family restaurant in Hokkaido where they all work.

So far, there have been three series focused on the original cast, and a new series set somewhere else is apparently on the way. Perhaps the best compliment I can pay to a show like this is that it makes working part-time in the food service industry seem like an innately desirable vocation. Given that working part-time in food service very nearly destroyed my mental health, that’s a testament to just how wonderful this offbeat slice-of-life comedy is. And as a sidenote, the opening themes for each series are about the most devilishly infectious confections you’re likely to hear. If you’re seeking the vibe I’ve struggled to articulate in this list, you should probably check Working!! out as soon as possible.

_____

[1] Personally, I’m a much bigger fan of the show than the film. I’ve been slow to recognize Altman’s genius over the years, but I’ve always liked M*A*S*H. Two factors militate against my preferring the film over the show: 1.) I grew up watching the TV version, whereas I didn’t see the film until early adulthood; nostalgia is a fearsome force when it wishes to be. 2.) As I grow older, I find that I much prefer bleeding-heart sentiment to the arch irony at which Altman excelled. I’m not sure that the characters in the TV show are necessarily more well-rounded than the ones in the film (although I think they are), but the anarchic film tends to use its heroes merely as archetypal tricksters, whereas most of the characters in the TV show are ultimately people. The only inflexibly dimensionless character in the show, Frank Burns, exited stage right just as the show figured out that its characters had to be people, and if the showrunners wanted to have moral monsters in the show, they couldn’t very well afford to have one as one of the regular cast. Rather than humanize Burns, they just wrote him out. Which is kind of a shame, since Larry Linville was brilliant, but also necessary, because it would be inappropriate to reframe the show’s tone around empathy yet retain the one character who couldn’t empathize with anyone, and with whom nobody else could, either.

