The Other Hanoverians


On Simon Dickie’s Cruelty and Laughter




Chances are that you are going to enjoy Simon Dickie’s Cruelty and Laughter quite a bit more than you were meant to, or, perhaps, that you are going to find yourself wanting to like it more than you do. Or both. Liking it to the proper degree, at any rate—and in just the manner that it demands to be liked—is going to prove difficult. Dickie’s subject is eighteenth-century England’s sense of humor—its comic literature, for a start, the books you have probably read (Tom Jones, Roderick Random), alongside a great many others that you almost certainly haven’t (the downmarket imitators of Fielding, Smollett’s pedestrian rivals, the scores of clowning Adventures published at midcentury), and also the jokes that its people cracked even when they weren’t reading and the capers they cut on the streets. One of recent cultural history’s niftier stunts has been to get the eighteenth and nineteenth centuries to trade places—to get Victorian England to swap its received image with its Georgian predecessor, like two schoolkids each hungry for the other’s lunch. It has become possible, indeed, to forget that we once associated the nineteenth century with primness and moral fervor, so often have we been reminded that it was actually full of crossdressers and sadomasochists and ten-year-olds who drank gin. Eighteenth-century studies, in the meantime, having first developed a reputation for pissing boozily into any corner, has since retrieved for its students what we might call the other Hanoverians: polite, sentimental, Richardsonian, proto-evangelical—Victorian, in a word, if that word hadn’t come to mean “secretly pornographic.” But perhaps these revisions have by now gone as far as they were ever going to go.
For Dickie’s is part of a recent group of books—the list includes Jessica Warner’s history of Gin and Debauchery in the Age of Reason (2002) and Vic Gatrell’s City of Laughter (2006)—that mean to reinstate older perceptions by resurrecting the hard-living eighteenth century, an Enlightenment bibulous and syphilitic, less an Age of Johnson than an age of johnson. Cruelty and Laughter is the kind of book you can consult if you want to learn the many nicknames for noses devised by eighteenth-century men and women, always eager to draw attention to a drinking companion’s peculiarities—to turn their fellows into animate caricatures: Saddle Nose, Razor Nose, Ruby Nose, &c. It is an almanac of boisterousness.

 The next thing you need to know about Simon Dickie, then, is that he is daring you to find any of this even the least bit amusing. His list of topics is easily named: a chapter each on joke books; on humor directed against the misshapen and the halt; on humor directed against the poor; on the compulsive malice of Henry Fielding’s humor, which pretends to a benevolence that it cannot put into practice; on rape jokes and the insistent smirking that overran even court transcripts of sexual assault trials; and on the vogue in England in the 1750s for cut-rate picaresque fiction. What really distinguishes Dickie’s work, though, more than its chosen subjects, is the unrelieved contempt with which he treats them. As early as the second page, he calls his materials “abhorrent,” and the rhetorical pelting never lets up from there; the jokes he discusses are variously “awful,” “vicious,” and “ghastly.” “Appalling” is one of his favorite words, as is “nasty.” Dickie’s stance might best be described as a pseudo-Marxist moralism, which finally doesn’t amount to much more than the unedifying insight that rich people in the eighteenth century were unkind. I could put the point in a somewhat fancier way: There are few literary critics now writing who identify more closely with the social historians. Dickie more than once refers to himself as a “historian” and keeps naming the “social historian” as his implied reader. But he is entirely stuck between his literary training and his historian-envy. He despises the archive he has made his own and so cannot even be bothered to pose any of the interesting literary questions about it. The loathing he feels towards his bibliography terminates in an intellectual weariness or indifference towards that writing’s inevitable intricacies. 
Dickie has obligingly read a great many noncanonical novels that you are never going to get to, but working through Cruelty and Laughter, you won’t learn much about them except that first, they existed, and second, you probably won’t like them. The literary historian longs to ask: Did laughter really only come at the expense of the lowest and most vulnerable? Is there really nothing to be said in defense of the carnival and people’s laughter? What about satire or hilarity directed against the great? Does knowing about the culture of cruel laughter change our views on those forms? Was there no affirmative laughter or Shandeism—rehabilitating laughter, that is, or laughter that defied misery—and if there really wasn’t, how did Laurence Sterne manage to convince himself that there was? Even if we agree to discuss malign laughter exclusively, then what do we make of its uneasy compound of delight and disgust—its high-spirited repugnance or mood-lifting hate? Does such laughter develop unwitting investments in the baseness and abnormality that it seems to scorn? How exactly do we know what in such laughter is contempt and what celebration?

 Alternately, we could take Dickie’s commitment to social history at face value and thereby allow a second round of questions to emerge. When we think about European fiction in the several generations before the major innovations of the 1740s, the books that spring to mind are mostly comic: Rabelais, the Spanish picaresque, Cervantes, Swift. If we conclude that this was not just some belated canonization effect—and Dickie gives us good reasons to think that it wasn’t, by suggesting that literary historians if anything downplay the preponderance of comic literature in earlier periods—then the question poses itself: Why was comic fiction once so widely read? What is the relationship between laughter and the formation of the nation-state? Or between laughter and colonization? Or between laughter and early capitalism? Will major social upheavals tend to produce the human anomalies or mock-epic incongruities—the mushroom and mimic men—on which comic fiction thrives? But Dickie shies away from these questions, too. He is not, finally, trained as a historian and will not, as a discourse-minded English professor, allow himself the kind of sophisticated speculation from multiple evidence streams that is the hallmark of good social history. So instead he compiles endless lists of verbal bullying: Eighteenth-century writers made fun of deaf people; they made fun of blind people; they made fun of the crippled, amputees, the pock-marked, and on and on and on. The book is a forceful exercise in anti-patrician counter-repugnance, but one begins to suspect that this is all it is.

  The matter is perhaps more curious than that. Dickie’s single most consequential argument is that the historians of sympathy, sentiment, and moral sense theory have tricked us all into according too much centrality to those topics—that a bourgeois culture of compassion and decency was very long in coming. One does not have to disagree with Dickie on this score to want to point out that Dickie is not, in fact, writing against sympathy. Quite the contrary: He is writing against the historians and critics of sympathy and sentimentalism, but those concepts—and the cultural formations they name—remain entirely uninspected. One expects, indeed, that it has to be that way. For Dickie is himself a sympathetic writer—a practitioner of benevolence and striker of sentimental stands—more perhaps than he is either literary critic or social historian, striving to put back in place a set of mid-nineteenth-century judgments against the vulgarities of the dram shop and the pleasure garden. He objects to jokes as “desympathizing.” “One wonders how anyone could have laughed.” He says things like: I don’t want to sound too Victorian, but Horace Walpole really was kind of an asshole.

  Of course, such judgments are not alien to social history. One can still hear in that last sentiment the ricochet of E. P. Thompson’s writing—the working-class historian’s animosity towards “the creatures of Walpole’s …circle” (that’s Walpole père in Thompson’s case), or his disbelief that the English aristocracy could have ever concluded that it was justified to execute a man for stealing a fish with his face covered. At his best, Dickie not only channels the spirit of Thompson and Hobsbawm and Hill, but also devises inventive ways of cross-breeding their arguments with disability studies and so of extending the concerns of English Marxism beyond field preachers and radical mechanics and towards the ragged and the abject. Foucault closes ranks with the Communist Party Historians Group. The category of the poor laborer merges with the category of the freak. Dickie, who possesses a social historian’s eye for the telling detail, takes as his subject “the anonymous, wretched victims of the consumer society so lavishly evoked by recent historians.” In one eighteenth-century version of charades, party-goers would imitate various trades for their companions to guess: Are you a baker? A tailor? A weaver? Successful imitations would typically hinge on reproducing a given tradesman’s characteristic deformity: his stoop, his squint, his abbreviated life.


And yet even here there are difficulties. Dickie’s emphasis on disability eventually changes the character of the English Marxism he often ventriloquizes, or, if you like, blocks some of its signature arguments. The status of class in Dickie’s argument is finally rather unclear, as it is, of course, in histories of humor more generally. The now orthodox position on rude laughter is Mikhail Bakhtin’s, which holds that low comedy is leveling and liberating—a suspension of the rules, an upending of accustomed social hierarchies, a joyful reduction of the body back to its most widely shared functions. Mardi Gras, if you believe this account, is the one space in otherwise regulated cultures where grotesque bodies are fully welcome, the one space, that is, in which beauty doesn’t move you to the front of the line, the space where half-naked fat men can dance with dwarfish women and find delighted onlookers cheering them on. This, tellingly, is an argument that Dickie doesn’t even consider long enough to dispute. One question we might now ask is: What do we say back to Bakhtin once we realize that the gentry also liked a good fart joke? Such is the importance of Gatrell’s City of Laughter, which reproduces hundreds of comic prints from the late eighteenth century and the Regency, all of them to varying degrees goatish and none of them within the budget of a saddlemaker’s apprentice. This prompts the student of comedy to modify Bakhtin’s case in two ways: In the eighteenth century, carnival was if anything more the property of the great than of the plebes—the low laughter of the high-born—and for some of them it was permanent and hence not just a holiday mood. Scurrility wasn’t so much the overturning of hierarchy as its habitual and sodden mode.
Gatrell is a historian, but philosophically his account presupposes a kind of untutored Nietzscheanism or even a light vitalism: He asks us to think of London’s aristocratic crapulence as a culture without negation, a capacity for taking pleasure in just about anything without having to worry about who sins and who suffers. The visual arts produced a different, more joyous, less alienated city than the Londons one finds in literature, which is condemned to moralism by the simple fact of narrative sequence—compelled, in other words, to care about actions and their consequences. To note the Nietzscheanism in City of Laughter, a book so unbridled one suspects that Gatrell wrote most of it with his pants off, is at the same time to draw attention to the grindingly un-Nietzschean qualities of Dickie’s work. And this is worth dwelling on because the latter has affiliated himself with disability studies, a field which typically positions itself as fully beyond good and evil. Or to be more precise: Disability studies is an unlikely compound of Nietzschean and anti-Nietzschean—Christian and universalist—arguments, but from this synthesis Dickie has stripped away the Nietzscheanism (the cruelty, the laughter), and so fashioned a wholly prayerful version of the disability project, preoccupied with fragility and the beleaguered preeminence of the meek. At the same time, then, that he is injecting a set of Foucauldian concerns into English Marxism, he is terminating the Foucauldian thread in disability studies itself: “Scholars have been far quicker to acknowledge the sexual freedoms of early modern libertinism than the equally important freedoms of violence and destruction.”

  And yet Dickie’s very universalism keeps eating itself. His book’s basic position is that eighteenth-century laughter came mostly at the expense of the poor. Gentlemen chuckled into their cuffs while watching worn-down old women shit into ditches. An instability is then introduced into his argument when he notes, as rigor demands, that the laboring classes often laughed along with their betters. Cheap joke books contained the same malicious jokes as their expensively bound counterparts. A butcher was just as likely as a baronet to mimic a cripple’s limp or lead a blind man smack into some wall. And eventually Dickie pulls the plug on E. P. Thompson altogether: “No one can now overlook the nastiness of early modern plebeian life: the violence and long-held grudges, the insults and catfights in alleyways, the elaborate vengeance for unpaid debts or borrowed goods not returned.” The English Marxism which had seemed to furnish Cruelty and Laughter with its guiding ethos turns out to be one of its sadder casualties. “Cruelties in Common,” he might have called this book, in which the beautiful soul compiles its ever-growing catalog of the eighteenth century’s universal wantonness.

  And yet this moral stand is probably something of an intellectual dead end. That the problems attending rude humor are not simply ethical ones, but are rather formal and rhetorical, is amply demonstrated by Dickie’s own book, which itself falls into nearly all the traps that he has identified in eighteenth-century comedy. The only novel that Dickie discusses at any length is Fielding’s Joseph Andrews (1742), about which he makes two points: first, that Fielding, despite his professed intention to reform humor and elicit from his readers an un-cruel laughter, compulsively reproduces the knockabout of his own earlier stage comedies; and second, that eighteenth-century readers mostly appreciated Fielding’s novel as a bit of silly fun—a farce between covers—and thought of Parson Adams, in particular, not as an amiably eccentric paragon, but as a comedic butt and scapegoat, just another foolish old man to be swatted on the back of the head. We can, on Dickie’s behalf, extrapolate his argument into something of a method: We should be bothered whenever an attack on low comedy replicates what it critiques, and we should take bad readers as authoritative in this regard and so remain vigilant against an amoral audience’s ability to laugh for the wrong reasons. Any “instability of tone,” Dickie often insinuates, is just an unforgivable moral foot-dragging, a reluctance to condemn. I am only demonstrating my fidelity to Dickie’s project, therefore, if I now point out that Cruelty and Laughter extensively reproduces eighteenth-century jest-books in the process of attacking them, and that the book’s jacket promises that its collection of rape jokes and pranks perpetrated upon the sick will be “wildly enjoyable”—“entertaining,” the back cover calls the book, a work of “verve” and “joy.” Dickie himself pauses to explain what eighteenth-century people called it when a person soiled himself: “buttered eggs in the breeches,” they said. 
He also, in that Fielding chapter, tells us to be on our guard against elite figures who unconvincingly perform their solidarity with the eighteenth-century poor. One can learn a lot from Cruelty and Laughter and still wish that it weren’t so haplessly self-hoisting. If you are convinced of Dickie’s argument, then the only consequent way of showing this will be not to read his book.

The Sea Is Not a Place, Part 2


The proposition I would like to consider, in other words, is that novels—all novels: realist, modernist, and otherwise—have a hard time telling stories at scales larger than the nation, and that it is important for us to figure out why this is so. Consider the novel-in-letters. Dickens writes at one point that one must consider “all the work, near and afar,” that goes into making any not-really-solitary human life. Near and afar: Epistolary novels have their own tidy way of collapsing the distinction between the two, since familiarity-at-a-remove is their very métier. The letter is a wondrous device: Compatriots trade stories at a distance; each produces long-distance emotional effects in the other; one telegraphs moral claims upon his fellow’s far-off regard—any such novel obviously brings to the fore the idea that nations are synthesized in acts of reading. The early English novel often borrowed its scenarios from the theater, but if the novel helped readers imagine the nation and the theater mostly did not, then this is simply because characters in a novel don’t all have to be standing in the same room. Even novels with reduced character lists—which is to say, most eighteenth-century novels, since these had not yet worked out how to choreograph the street-filling multitudes of the high Victorian novel—even such novels submit their few personae to what by theatrical standards is a de-centralizing operation, often replicating the intensities of neoclassical drama and peopling themselves with the stock types of the late seventeenth-century stage, but now in centrifugal form. The epistolary novel is the chamber drama in dispersal.

But then how far is afar? If the epistolary form is in part a way of narratively managing distance, then just how far can its techniques be extended? How thin can you stretch a novel before it begins to tear? Saliently: Can there be novels in which characters exchange letters between nations—or were there, in fact, such novels? That, yes, such novels were written and published raises a reader’s hopes; maybe transnational and even transoceanic fiction is viable after all. The bad news, then, must be instantly and soberly delivered: Epistolarity in such novels begins to seem makeshift and erratic. In 1792, Charlotte Smith published a novel called Desmond, whose title character some few hundred pages into the thing leaves for revolutionary France. Now that move is in itself not all that unusual: France is one place that characters in English fiction are routinely allowed to go. And yet even so, when Desmond leaves England, three remarkable things happen all at once: The gap between letters increases at least threefold and often widens rather further than that; the letters themselves get accordingly longer; and the characters repeatedly fret, at the beginnings and ends of chapters, about the difficulties of maintaining a correspondence over very long distances: I haven’t heard from you forever; my apologies for not having written; “the opportunities I have of sending to the post are so few….” A novel committed to epistolary verisimilitude will have to factor in the limitations of the eighteenth-century postal service, absorbing the latter’s inefficiencies into its very form, where they will blossom into sympathetic arrhythmias and sentimental dysfunction.
In Smith’s novel, characters ride beyond the reach of regular mails; letters routinely cross, which means that each party is writing from a position of relative ignorance; traveling friends become moving targets, less interlocutors than the blank site of recently vacated addresses: Fine feeling gets forwarded by cooperative landladies. The letters themselves, written in the company of an impatient carrier, begin to seem hurried—truncated and self-interrupting; or, conversely, letters never sent are bundled together and inflate into soliloquy.

This should allow us to specify the important point: Transnational novels aren’t nearly as good as their sedentary counterparts at the very things at which prose fiction is generally thought to excel. You might, for instance, think that eighteenth-century novels prompt readers to sharpen their sensibilities by inviting them to partake imaginatively in the lives of strangers, but a novel like Desmond shows something rather different—friendships put under pressure by distance, as characters themselves, corresponding only intermittently, are presented with fewer occasions for sympathetic communion: “It is very uneasy to me, my dear Bethel, to be so long without hearing from you”; “time and distance are cruel enemies, even to the ties of blood.” Worse: A letter from another country will, when it finally arrives, still generate sympathy in its receiver, but that person will now have conspicuously fewer options for acting on his sentiments: “If these distressing scenes should become yet more alarming, I shall return to England”—because in another country my sentiments are for naught. In a sentence such as this, we witness sympathy, having flicked to the end of its elastic tether, begin its snapback and homeward journey. The transnational novel has to consider the possibility that sentiment is localizing and nation-bound. The sympathetic novelist stares unhappily at the limits of sympathy. Or there’s this: “Write to me instantly—Yet how shall I put off my determination till I receive your answer?” That dash is a kind of fracture, or perhaps a surveyor’s chain, melancholically measuring distance, at the distal end of which a character is realizing he is going to have to come to a decision by himself, outside of his accustomed community of sentiment, and without the advice that would normally be wired in from the next county.
This isolation is a problem for a sympathetic ethics, to be sure, but it is also a narratological problem, since in that same half-sentence we see one of this novel’s narrative strands achieving its reluctant autonomy; the hero’s story will now have to proceed without imported inputs. The novel’s multiple plots, far from knitting more tightly together in the genre’s accustomed fashion, begin to disarticulate. Epos reverts to episode. When, in a novel-in-letters, the communications begin to arrive infrequently, the back-and-forth that is the hallmark of the form grows muffled, to the point where the narration begins to resemble the running monologue of a plain first-person narrator, and the novel begins shedding its epistolary qualities. If made-up letters turn readers into virtual friends, then long distances tend to force even friends into the position of novel readers. The epistolary novel’s sense of time is accordingly unsettled. The status of events shifts in one of two ways, each of which inhabits a sympathy-defeating temporality: Either a correspondent writes to recount in full events that are no longer ongoing, that have already arrived at their conclusions and so present no possibility for sentimental intervention—or, odder: the correspondent writes to half-tell events that were ongoing when written, but will have been completed by the time the letter is read, their outcomes sealed, but to the addressee unknown: “Perhaps, before you receive this, for it is a long way from hence to England, he will be well—perhaps he may not need your prayers!”; “Before this letter reaches you, however, my fate must probably be decided.” And with sentences like those there collapses the shared time of the nation that novels are thought to generate.
If cross-Channel epistolarity generates formal impasses of this severity, we can begin to understand why the trans-Atlantic novel in letters was left unattempted, since the problems of scale would have become that much more unmanageable. The epistolary novel, it turns out, suffered from the same difficulty as the British war command circa 1780—the insurmountable difficulty, that is, of transoceanic communication, of never being able to respond to events in real time, of only ever knowing what happened in the war six weeks ago, of planning for the future with permanently and incorrigibly obsolete information. Verse epics, willing to scale impossible mountains or to send muses and angels screaming across the sky, don’t have this problem. We are used to thinking of novels as sprawling, encompassing, fully open to the world; and if you don’t like poetry—the way, for instance, Sartre didn’t like poetry—this might be because poems, you think, tend to be miniaturizing, inward-looking, preoccupied with language itself, in a manner that too readily turns its back on the world. But eighteenth-century poems routinely describe oceans and continents and spheres. Worldly novels and unworldly poetry—now consider, please: What if it were actually the other way around?

This brings us back to our question, which we should naggingly repeat: Is it possible to write a novel about the entire world? Where is the novel that evenly divides its attention between the Chicago and Pakistani branches of the same family, without making either of those locations serve as mere backdrop to the other—as interlude or memory hole? Where is the periodical novel in which the Dedlocks divide their time every year between Melbourne and the Greek islands, in which Richard Carstone is still trying to make his nut in London, and Little Jo dies in Kinshasa? Could Gaskell’s North and South be rewritten so that its title refers to hemispheres and not counties? What do we do when we realize that Frank Norris’s Octopus is not just, as the subtitle has it, “a story of California”? If you wanted to sit down to write a planetary novel, what would you take as your model? What kind of novel would get you closest? And what about its techniques and conventions would you nonetheless have to change? What exactly do we take to be the pendular opposite to the domestic novel or literary regionalism? Jane Austen and Sarah Orne Jewett at one end of the geo-fictional spectrum and at the other end…well, what? Is it possible to compose a literary history of novels that were never written?

The problem is more complicated than that, since there are several different ways a book can fail to appear. There are, for a start, entire regions of our collective experience that seem inhospitable to narrative—the most consequential of these would be work, unsettling as it is to realize that the preponderance of contemporary narrative, novelistic or otherwise, takes place at the day’s margins, on weekends or during coffee breaks or after the whistle blows. Where, we might ask, are the great novels of the workplace?—though that question seems less like a judgment on novels than it does on factories or office buildings themselves, those austere and storyless zones—a judgment, I mean, on our lives’ blankest hours, routinized, repetitive, unproductive of incident. In other cases, a book’s non-appearance is a simple matter of literary access—of admittance to literacy and the quantum of leisure that alone makes writing possible—as when one learns, disbelievingly, that we possess no slave narratives in French, not a single one, not from Haiti, not from Guadeloupe, not from Martinique. Some institutions, in other words, don’t produce stories; and others don’t produce storytellers. But neither of those explanations seems to hold for the missing stories of empire and diaspora and global capitalism. So: Can we tell stories about the whole world? And if not, then why not? What’s keeping the novel from pulling this off? It is hard to shake the feeling that the novel should be up to the task. In inquiring about the planetary novel, we are, it’s important to keep in mind, not imposing on the form a reader’s private wish, arbitrarily spoken from outside its pages, alien to its design. We are not asking the novel to mop our floors or press our shirts and then complaining when it doesn’t.
Quite the contrary: George Eliot writes that the novel produces “new consciousness of interdependence” or “fresh threads of connexion.” Goethe writes that “everything depends” on “knowing the connection of parts,” Salman Rushdie that “to understand just one life, you have to swallow the world.” One of the greatest accomplishments of the novel has been to generate on behalf of complex social systems a kind of hypothetical transparency, to allow us counter-factually to inhabit a metropolis the way we think villages and neighborhoods were once inhabited, to reduce the outsized back to the scale of the knowable. You read Balzac—or you watch The Wire—and you think: This is what it would feel like if cities were intelligible, which they’re not. So given that the tendency of the novel—or of one prominent strain of the novel—is already towards diffusion and dilation, and also towards complex causality and action at a distance, one would like to know why, when the novel’s compass was expanding, it stopped where it did. Why wasn’t that process arbitrarily extendable? Why, when characters in Great Expectations travel to Australia and Cairo—or when characters in Balzac’s Black Sheep voyage to Texas and Algeria—do the novels in question not follow them to those places? Dickens famously thought that the word “telescopic” was an insult. Clara Reeve’s Old English Baron, one of the first Gothic novels, from 1778, features one character who has, in fact, had many adventures overseas, but the novel poses as a found manuscript, and it turns out that the un-English pages have all been misplaced. Reeve simply leaves them out, tears them from the book. That names the problem pretty well: Where you expect to find the world outside England there is instead only a gap, a strikethrough, a coffee stain.

If you read a lot of novel criticism, you might want to track this observation back to Edward Said’s “Jane Austen and Empire,” itself twenty years old now and due for reconsideration. That essay was trying to evaluate one simple, easily overlooked literary datum—that Jane Austen over the course of Mansfield Park mentions Antigua some half a dozen times. What, the essay asks, are we to make of this flickering in the Caribbean distance, this dependency glimpsed out of the corner of one’s eye? Said’s answer to that question was notably unsettled. In one sense, all he wanted to do was make sure that no one thought he was doing something willful by introducing the question of empire to the study of literature, and the mere presence of the word “Antigua” in the library’s Austen concordance was all he needed to make his point, since it allowed him to argue, correctly, that the colonies were already inside of English literary history and that insisting on the importance of empire was, in fact, just one more way of being attentive to a great novelist. It turns out that even the most decorous, music-box fictions compulsively record their affiliations with spaces outside themselves. Fair enough, I’d like to say—but we should also note the particular ways in which Said overstates his claims, since by the time the essay has finished, he will have enrolled Austen, alongside Jean Rhys and Joseph Conrad, in the roster of colonial writers, the idea being, I think, that there were in the eighteenth and nineteenth centuries no non-colonial writers. And this bit of luminous hyperbole, unmistakably generative and entirely flattening, is accompanied in Said’s essay by a palpable desire to correct Austen—to improve upon her novel—to bolster her few summary references to plantation slavery and thereby to transform her into the Tolstoy of Atlantic capitalism.
Fredric Jameson has made the brilliant observation that large-format realist novels do not require footnotes to nearly the same degree as other types of literature. The discouraging experience of the undergraduate taking a seminar on Augustan poetry is one of spending a lot of time flipping to the back of the book and still not really understanding who Bolingbroke was. Most literary writing is hard to read without somebody constantly whispering explanations in your ear. But realist novels can get by with far fewer footnotes, not because the world they describe is more like ours, but because they are, as it were, self-footnoting, as though they had pre-emptively absorbed the apparatus of historical explanation and annotation into the fiction itself. It is in this sense a problem that so much of Said’s essay amounts to an extended historical gloss on Mansfield Park, with the critic providing the account of the sugar islands that Austen has in fact withheld. Said, in his own words, has made it his task to “reveal and accentuate the interdependence scarcely mentioned on [the novel’s] brilliant pages”; and in that sentence, he stops functioning as Austen’s commentator and steps forward instead as her collaborator, eagerly offering to draft Mansfield Park’s omitted scènes de la vie coloniale, praising Austen not for the novel she actually published, but for some imaginary other novel that the two of them cooked up together. But one needs, I think, to hold fast to the distinction between a novel in which Antigua is named and one in which it is brought before the mind as a narrative object in its own right. I hope this will begin to make clear the stakes of the project I am outlining here. The near absence of concertedly transoceanic novels is one of our literary history’s oddest lacunae. No less an intelligence than Edward Said was forced to make one up.

I don’t mean, by pointing this out, to admonish Said. That’s not it at all. I merely want to be clear about what he was up to so that we can, if possible, reformulate his point with greater precision. For if we have no choice but to become Austen’s deputies and co-novelists, we will need to know in much more detail what it’s going to take to write that other Mansfield Park. In particular, we will need to know how a novelist like Austen, having once spied the rest of the planet—having, that is, registered the globe as a possible object of narrative concern—nonetheless manages not to tell a story about it. What are the devices that, dyke- and levee-like, prevent the rising ocean from overrunning the novel’s pages? A demonstration is ready-to-hand—here’s a passage from Mansfield Park where you can see Austen deploying some of the canonical novel’s drainage techniques. The novel has just introduced a minor character; his name is William; he’s a sailor; he’s been abroad for several years; and he’s back in England on leave for the first time. This makes him a valuable guest.

William was often called on by his uncle to be the talker. His recitals were amusing in themselves to Sir Thomas, but the chief object in seeking them, was to understand the recitor, to know the young man by his histories; and he listened to his clear, simple, spirited details with full satisfaction—seeing in them, the proof of good principles, professional knowledge, energy, courage, and cheerfulness—every thing that could deserve or promise well. Young as he was, William had already seen a great deal. He had been in the Mediterranean—in the West Indies—in the Mediterranean again—had been often taken on shore by the favour of his Captain, and in the course of seven years had known every variety of danger, which sea and war together could offer. With such means in his power he had a right to be listened to; and though Mrs. Norris could fidget about the room, and disturb every body in quest of two needlefulls of thread or a second hand shift button in the midst of her nephew’s account of a shipwreck or an engagement, every body else was attentive; and even Lady Bertram could not hear of such horrors unmoved….

The first thing that this passage allows us to say is that Austen herself recognizes the possibility of global narration, and that our demand for such a form will seem accordingly less whimsical and arbitrary. Even in Austen’s Northamptonshire, we find ourselves briefly in the presence not just of colonial wealth (Sir Thomas’s repaired finances)—and not just of colonial goods (Indian shawls and tea and spiced punch)—but in the presence, too, of colonial storytelling. The possibility of an entirely different narrative mode opens up in front of us. The second thing we’ll want to say, though, is that William is a rival narrator—not an ally who might piggyback on Austen’s own persona and thereby extend the novel’s geographic reach, but a competitor whose efforts must be parried. A sailor arrives spinning yarns, and the chamber novel registers his coming as an intrusion. Austen, we will note, has dealt with the challenge in much the way you might have expected her to—by absorbing William’s stories into her own apparatus, though to say this is not yet to say enough, since in most other cases that observation would mean that the young sailor’s Stories from the Sea had actually been reproduced or dialogically interpolated. Novels, after all, routinely feature guest narrators who show up for a few chapters to sit in with the band. But that’s not the case here. Almost nothing of the competing narrative has been preserved; Austen has undertaken not just to, say, recontextualize William’s adventures, but to neutralize them, to diminish them back to the mere fact of themselves. The visiting sailor “has a right to be listened to,” but that right will not be honored. The opening sentences are of special interest in this regard: What matters is the recitor, not the recital.
The novel draws attention to Sir Thomas and allows him to model for us a way of listening to global or maritime stories—a mode of listening that purges the planet even from planetary relations, that brackets the world as narrative object and makes it subordinate to the world’s witness, that manages to transmute into a lyric solo the thronging chatter of port cities. That’s all you need to know about one strain of English fiction: that it knowingly makes a reading of the globe secondary to the reading of character.

You could make the same point in terms of genre, since the stories that William recounts have the quality of epic, and we can think of the novel as here reducing this more prestigious competitor form to a few amputated conventions—shipwrecks, battles, horrors. But I think the point is most compelling when made in terms of style. In Austen’s pages, we can see the globe acting upon novelistic style, making itself felt as a distortion in realist prose, a thinning of the form’s usual busy knit. The mark of the globe is a recourse to abstraction where we would otherwise expect specificity: “recitals, histories, an account, details.” Abstraction is the residue of an untold story—quite literally in this case—a history named ideationally qua history but then relegated to the Cone of Silence.

The problem of abstraction is worth thinking about some more. It will help, in this regard, to examine a passage from a novel that Elizabeth Hamilton published in 1800, called Memoirs of Modern Philosophers. It is not a novel that urgently demands to be rediscovered; it is enough, for current purposes, to imagine a gawky, Presbyterian Austen novel in which the scoundrels are all left-wing intellectuals. The book also goes in for a little light Shandeism, teasing readers about the expectations they bring to a novel and so flagging what by 1800 had already come to seem tediously conventional in prose fiction. The novel’s best bit comes in the final chapter, when the narrator announces that she is not after all going to be able to get everyone married before the book ends. And she imagines an irritated reader asking Why not? Sure, one of the novel’s men is so rascally as to be beyond marrying. But

“If Bridgetina can’t have him [the truly vile one],” cries the [reader], “she surely may have Myope at least. His poverty is no obstacle; for what so easy, as to make him have some rich uncle come home from the East-Indies, or to give him a prize in the lottery; or—oh, there are a thousand ways of giving him a fortune in a moment….”

“Giving him a fortune in a moment”: The marvel of this short passage is that it brings to the fore a constellation of problems in the history of the novel that we could cluster under the rubric of Things That Just Happen. The reader is begging the novelist to exercise her emergency powers, and one of the terms that occurs in this context is “fortune,” which is linked to the lottery and thus to chance. There is an entire literary history behind that formulation—the history of a writerly device—since eighteenth-century fiction still routinely chalks events up to fortuna—or, alternately, to providence or accident. In other contexts, the distinctions between those three would need to be teased out, but for our purposes what matters is that all of them are higher order abstractions that introduce terminal gaps—great, shrugging perplexities—into a given novel’s chain of causal explanation. Fortune is the encompassing and vacant pseudo-cause, the mark in human affairs of implacable complexity or genuine randomness, hence the hollow into which narrative bottoms. If an event happened “because of Fortune,” then it just happened.

But then the second possibility that the impatient reader raises is that a character could be summoned in from India. This points to a problem that has never, I think, been sufficiently considered: Characters routinely appear in novels from afar—and they routinely exit novels, as well, as surely as they do stages, and one way to think about this would be to say that novels almost always generate for themselves a kind of offstage, a place from which messengers arrive, where events happen unseen and to which characters can pop out for a cigarette and a costume change. And the passage already makes clear why novelists might avail themselves of this void space; an offstage can solve all manner of different problems that are at once ideological and formal, furnishing spaces—many of them with names like “Egypt” and “St. Kitts”—where novels relieve themselves from the otherwise endless burden of narration, magic boxes from which poor characters emerge rich and nobody needs to know how or why. The offstage is in this sense a spatialization of what had seemed like the intrinsically temporal problem of fortune. It’s not that fortune can’t strike nearby—it’s just that events transpiring in remote places are more likely to appear to the mind as fortune-driven—and the territories involved become, again, places where things just happen—the Fortune Islands or the Archipelago of Accidents. One of the principal islands of the Bahamas is called New Providence.

What we can say now is that events in prose fiction become abstract when they are, as here, declared exempt from the novel’s usual modes of analysis and elucidation—the curiosity about complex action that seems to be part of narrative’s permanent Aristotelian inheritance. And it is this abstraction that will make itself felt at the level of the sentence, in a manner we’ve already seen in Austen. Verbal de-specification is how Fortune and the offstage—these temporal and spatial dullnesses—begin to colonize the very style of prose fiction. Two more sentences from Mansfield Park:

…Sir Thomas was still abroad, and without any near prospect of finishing his business. Unfavorable circumstances had suddenly arisen at a moment when he was beginning to turn all his thoughts towards England; and the very great uncertainty in which everything was then involved determined him on sending home his son, and waiting the final arrangement by himself.

This is as close as we come in Mansfield Park to a direct description of Antigua, and what stands out here, again, is the strange, ruthless weeding of the narrative underbrush: Business?—what business? Circumstances?—what circumstances? Arrangement?—what arrangement? Everything? Sentences like these do, it’s true, push the concept of fortune one degree back towards a differentiated causal sequence, but only one degree, achieving the syntax of narrative while narrating almost nothing. They resemble an author’s reminder to herself to insert event here.

The question, then, is whether the novel could set itself loose from this or that place without degenerating into abstraction in this manner—or more simply whether, in an English novel, Antigua could function as a narrated place and not just as a placeholder. And I think that these questions furnish us with a series of fresh reasons to go back and read widely in the early history of the novel. There are some utterly basic things we still don’t know. What finally accounts for this literary deficiency? Is the problem, as with epistolary fiction, that the novel’s technology gets glitchy when upscaled? Or is the problem, as with Austen, that novelists have, one after another, obstinately discounted openings to global narration even when these have conveniently presented themselves? The epistolary novel might have specifiable limitations, but then why aren’t these lifted by authoritative and disembodied narrators—the narrators that in this of all contexts we are going to have to stop calling “omniscient”? And which features of the realist novel could we nonetheless imagine repurposing to planetary ends? Is it really all that hard to conceive of a multiplot Mansfield Park—an Austen novel reunited with its twelve mislaid Caribbean chapters? But then what are the various ways in which regional and national novels cauterize their edges? How does any given novel constitute its geographical borders? How does it set territorial limits to what it is willing to narrate or how does it mark out a beyond into which it will not follow even major characters? Do maritime novels have distinctive narrative strategies for expanding the realist novel’s scope? Do immigrant novels? And could the realist novel still learn from the genres against which it typically defines itself? Can it learn from science fiction novels, which, after all, have an easier time than most talking about planets? 
Are there things that neoclassical verse epics know how to do that even a Dickensian or Balzacian realism doesn’t manage? And if a novelist tried to import these epic features into a novel already in progress, what would she have to give up? With what exactly would they seem incompatible? If the realist novel is to keep its promises, will it have to cede its very realism? Even the most imperialist of epics allow the casualties of empire to call curses down upon their conquerors. Couldn’t brother William, just once, see St. Helena from the main-mast? Will Fanny Price have to play cards with the sister-nymphs who live at the world’s western edge? Will Sir Thomas be called to account by the gorgon spirit of Africa rising caustic from the drink?


-The map up top was designed by Aaron Straup Cope.

-I don’t mean to suggest that there are no planetary novels, only that we should pause to appreciate how unusual they are—and that we should read them again carefully to make sure that they are really doing what we think they are. Here are the beginnings of a list: Gibson’s Pattern Recognition, any of Pynchon’s long novels, and especially Moby-Dick. Other readers will forward their own candidates.

-Thanks to Christopher Flynn, whose Americans in British Literature helped me extend my thinking on Charlotte Smith.

-Jameson’s comments on the self-footnoting novel can be found in The Political Unconscious.

-For more on how novels deal with Things That Just Happen, see my “Providence in the Early Novel, Or Accident If You Please.”

The Sea is Not a Place; or Putting the World Back into World Literature




If you want to understand some of the last decade’s renewed interest in the category of “world literature”—if, that is, you want to understand the real achievements of the concept as refurbished by Pascale Casanova, Franco Moretti and others, and perhaps also to begin repairing its weaknesses—it will help if you first understand the ways in which Samuel Beckett’s Molloy is most like the Charlie’s Angels movies. One way to get at their resemblance would be to list some of the complaints that viewers have leveled against the latter. It has “no plot,” wrote one critic of the first Angels movie, released in 2001, and, indeed, fails to meet the basic demands of continuity; “it’s difficult to tell how one punch leads to another.”  The San Francisco Chronicle warned that Charlie’s Angels lacked not only clear sequencing, but also characters that one might care about or indeed any discernible individuals at all, though, of course, it fully agreed the movie was fragmented, less a coherent story than “bits of scenes … overly stylized and self-conscious.”  The BBC elaborated on the point: The picture “leaps from one small scene to another,” it said, dispensing in the process with “real drama and proper exchanges.”  In literary history, these deficiencies are known, collectively, as “undermining the edifice of realism” and are the sort of thing that novelists get a lot of credit for attempting.  One student of modernism has written that Beckett, no less than Columbia Pictures, devised a “new set of technical tools that made it possible to escape meaning—which is to say narration, representation, succession, description, setting, even character.” Indictment: Charlie’s Angels “exists in a reality unto itself.”  Tribute: Beckett “created the most independent world conceivable.”  The medium changes, and calumny is transposed into praise.

This will seem like a joke, but we might, in fact, want to take seriously a certain plain, verbal fact, which is that people who don’t like big-budget action movies often describe them—spontaneously, unwittingly—as though they were modernist novels. Perhaps a moment’s reflection will make this less surprising. For what Molloy and Charlie’s Angels share is easily named; it is the aesthetics of abstraction, the pressure exerted upon narrative by de-specification. This, too, comes into focus when refracted through the criticism. Here is Perry Anderson on blockbuster cinema: “The basis for the fortune of Hollywood” has been “narrative and visual schemas stripped to their most abstract, recursive common denominators.”  And here is Terry Eagleton on the literature of the mid-century: “Beckett’s works take a few sparse elements and permutate them with Irish-scholastic ingenuity into slightly altered patterns.”  Recursion, permutation, slight alterations … Samuel Beckett and Hollywood film, these exact contemporaries, these children of the year 1906 … Spotting the two of them together, in tandem, now becomes a minor test, an opportunity to demonstrate one’s intellectual steadfastness: Are you willing to approach the culture industry and the art novel with the same aesthetic priorities? Can you hold the one to the same standards that you hold the other? Devotees of Beckett’s fiction might, of course, still conclude that they dislike Charlie’s Angels, but they aren’t going to be able to dislike it for insufficiently reminding them of Middlemarch.

Indeed, watching Charlie’s Angels with Beckett open on your lap is a chance to remind yourself of the rigorous formalism of much Hollywood film, which after all has its own particular way of “refusing to yield to the usual requirements of legibility.”  What we will want to say back to anyone incapable of appreciating such a radiance is that they don’t really like film qua film, that they bring with them into the movie theater the worn-out expectations generated by older narrative modes, to the point where they can no longer tolerate a cinema set free from extra-cinematic demands, liberated, more than any Iranian neorealism or the interminably filmed conversations of the French New Wave, into color and kinetics and pace. What offends is not the brainlessness of Charlie’s Angels, but its aestheticism, for which that other is code. A movie “without … purpose,” objects Roger Ebert, to which the only answer is: Exactly.

Turning to Beckett, we will want to repay the favor by pointing out the plebeian and atavistic quality of late modernist prose, the way in which it liquidates the conventions of novelistic realism in large part by reactivating the cadences of folklore and myth. Beckett’s was not an uncharted path to abstraction, but precisely an antique and subliterary one: Here’s a story about “two men … one small and one tall. They had left the town,” some town, no particular town.  We could say, more precisely, that Beckett’s prose achieves its high degree of abstraction by deploying at once two literary registers that we typically regard as opposed: folklore, which is Beckett’s debt to an Irish Revival that he officially scorned, but also a minutely interiorized and doubting ego borrowed from lyric poetry—a blocky folklorism, then, that has no need for novelistic particularities, plus a dismal lyricism that blurs whatever few specificities remain. Molloy often reads like myth retold by some tormented prose-sonneteer. “He wore a cocked hat” could be the beginning of a song or a children’s rhyme. But Beckett’s narrators will glaze any such bare fact: “It seemed to me he wore a cocked hat.”  We might, in the same spirit, call to mind Adorno’s observation that European modernism was basically just an extension of nineteenth-century horror fiction—or rather, that it was an unlooked-for recombination of neoclassicism and its Gothic opposite; abstraction made eerie; Palladianism with the lights turned out: Conrad’s ghost ships and vampire derelicts, Eliot’s bridge-crossing zombie-shades, not to mention the too easy instances of the Czech were-roach and the twelve-tone music that survives now almost only on the soundtracks of scary movies.  
To this list we can add Beckett’s writing of the rotting flesh, whose signature tic is to say “death” wherever ordinary English would say “life,” and whose stories center on old men who beat up their even older mothers; on those who live within earshot of abattoirs; on menacing cops and unexplained kidnappings and rectal births. It has taken a sustained effort, of a more or less ideological kind, to get lots of people to agree that this was ever “high culture.” We can praise the Hollywood blockbuster for its euphoric and unweary modernism; or we can conclude that modernist art is less the negation of pop culture than its distension and making-arduous. Either way, it will be hard to escape the impression that modernism, determined to purify itself of mass culture, keeps rediscovering itself in its hated opposite. Charlie’s Angels only had one sequel; Molloy produced two.

We can begin now to say why this pairing should matter to anyone wanting to study something called “world literature.” The problem with conventional accounts of modernism and aestheticism is that they tend to mistake abstraction for autonomy; abstract prose gets to count as self-sufficient, a writing apart from the world, answerable to no agencies or institutions, borrowing elements from empirical reality only to transfigure them, no longer constrained to file reports on the really existing, to serve out its time as the gazetteer of circumstance. If an artwork is any object unshackled from the demands of mere use—a jug too lovely or fragile or pointy-handled to pour from—then the virtue of abstraction will be that it unfits language for the purposes of ordinary communication and so shifts it over to the realm of art. This is what makes abstraction easy to mistake for autonomy or why it is easily misperceived as its vehicle. In Beckett’s prose, then, one finds a more or less strenuous refusal of context:

• “And I, what was I doing there, and why come? … these are things we must not take seriously.”

• “Shall I describe the room? No.”

• “For the particulars, if you are interested in particulars….”

What jumps out in these lines, and the many more like them, is that Beckett cannot, in fact,  quietly bypass readerly expectations; the apparatus of realism has to be acknowledged so that it can be tauntingly canceled by professions of ignorance and amnesia. My mother has died. “I don’t know how.” I used to love a woman. “I’ve forgotten” her name.  The most telling variant of this tic, also utterly commonplace in Beckett, is the withdrawn specification. A concrete detail of a realist kind is offered to the reader as bait and respite and then in the same sentence negated, like so:

• “A little dog followed him, a pomeranian I think, but I don’t think so.”

• “It was a chainless bicycle, with a free-wheel, if such a bicycle exists.”

• “The dog was uniformly yellow, a mongrel I suppose, or a pedigree, I can never tell the difference.”

It is rhetorical tether-snippings such as these that lead some readers to deem Beckett’s writing independent and self-directed, unbeholden to the objects it just barely names—or fails to name—or names multiply—“literature rescued from dependence,” as one admirer has it. A self-sufficient literary language, then—except, of course, it is nothing of the sort. Autonomy, I think, except I don’t think so. When “abstraction” renames itself “autonomy,” the concept gets freighted with political claims that it cannot make good on. A writer’s withdrawal from reference is thought somehow to model or to guarantee or to act as signature for a second withdrawal, a retreat from institutions, as though an art for art’s sake did not in some entirely ordinary way have to be produced and announced to the world and disseminated and exhaustively explained. You can say that all art begins, in a fabulating spirit, by separating itself from reality, and you can praise abstract art for resolutely guarding that partition. Or, if you have come to distrust representations as such because they inevitably convey some ideology or another, you can say that an abstract and experimental writing works to unsettle our relationship to language, making it difficult for us to sink back into our usual lexical stupor, irritating us into inhabiting speech less thoughtlessly. Or you can simply marvel that the abstract artwork is the last thing in the world that isn’t expected to do anything, the only object still exempt from the calculus of efficiency, the only one of us who gets to stay out late because it doesn’t have to work in the morning. Humanity delegates its relinquished autonomy to a special class of objects, so that these can enjoy liberty in its stead. The abstract artwork is, in this sense, a labor-saving device, a metaphysical appliance, freedom’s automatic spray-tube dishwasher.
But having made any of these arguments, what do you then say when you discover that the US government began buying up modern art in the 1940s, that the State Department helped promote abstraction abroad as something like the official aesthetic of the United States, or indeed that many of the journals in which abstraction was argumentatively furthered received funding from the CIA—that the CIA’s first head of counter-intelligence was famous first for founding a quarterly of modernist poetry and that the CIA regularly recruited agents from the Kenyon Review?  Even abstraction has its political uses, chief among them to mime an independence from such use. Autonomous art was nakedly heteronomous—this may be the only paradox of twentieth-century aesthetics that Adorno missed.

Hence Charlie’s Angels. If it is writers like Beckett that you want to understand, then the virtue of talking about commercial film first is that no-one has ever mistaken Hollywood’s motley geometries and dream states for political autonomy. The freedom from reference, which we might also call an indifference to local content, is itself produced by a system and historical occasion—immigrants, in the Californian instance, learning to tell stories to other immigrants, conglomerating and simplifying their inherited narrative forms, which is what lends Hollywood movies the character of a sailor’s yarn, and then streamlining these further once the industry discovers that such reduced forms export especially well, like fortified wines and salted meats, playing with equal facility in nearly any national market or communal VCR, on the simple theory that a viewer in Chongqing is unlikely to commit to a 60-hour film dramatizing the contradictions of US drug policy on the streets of post-industrial Baltimore. The global dominance of Hollywood cinema cannot be separated from the basically Galilean quality of its cinematic space: bodies in motion against green screens, CGI cannonballs dropped from the world’s interchangeable towers.

Once one grants this last point—that abstraction itself has a material underpinning and that it emerges more easily in some historical locations than in others—then the task is simply to extend this insight back to Beckett (and Gombrowicz and Borges and Kobo Abe). This is where Pascale Casanova comes in. Modernism has had its own distinctive patronage institutions, whose needs it roughly serves, and it is the great virtue of Casanova’s World Republic of Letters to help us spot one, alongside the university and the American state, that we might otherwise have missed. The easiest way to come to grips with her argument is to resolve it back into its component parts—to realize, that is, how programmatically Casanova has grafted Immanuel Wallerstein’s world-systems theory onto Pierre Bourdieu’s account of distinction or cultural capital. First Bourdieu: In order for a literary scene to exist, a national language needs to possess nothing so interesting as a rarefied temperament—neither a linguistic cache of ensorceling Indo-European roots nor a secret, primeval resemblance to ancient Greek—but an entirely mundane, nuts-and-bolts literary infrastructure: a leisured elite, schools willing to teach its patricians the skills of higher literacy, a caste of professional writers, bookstores, libraries, publishing houses, state patronage for the arts, and a functioning feuilleton. Any nation with all of the latter will be able to convince itself that it also has the former. Then Wallerstein: Not all nation-states possess these resources to the same degree, and the ones that possess them in superabundance—France, Britain, more recently the US—get to tell the rest of the world what counts as literature.
It’s worse than that: The literary salarymen of the great European metropolises—editors, critics, translators—have always played a unique mediating role in the global literary system, claiming for themselves the authority to choose which of the world’s aspiring novelists will get access to the large and university-educated readerships over which they stand guard, and the first issue to be decided by young writers on the literary periphery—in the Sudan, say, or in Gujarat—is thus whether or not they are going to write in ways designed to appeal to such people. What Casanova has persuasively established is that there are world cities of literature, places, above all Paris, where authors—and not just French ones—are certified as literary. The best thing about her book, in this sense, is that its title is simply wrong, utterly contravened by her own argument, which describes nothing like a “world republic of letters,” with whatever faded egalitarian associations that term still has, but rather a literary world system, neo-colonial in effect if rarely in intention: stratified, full of power imbalances, “a world of rivalry, struggle, and inequality.”

The point that we do not want to overlook is that a certain orthodox conception of High Literature—the aestheticist account of autonomous writing—is made possible only by this empire-not-republic of letters. That point comes in a weak form and a strong, which depends on the weak. The weak version says that all novels, even realist ones, will seem more abstract or aestheticized when lifted out of their various national contexts and read by foreigners who won’t understand their more sectional references—German readers, say, for whom the names of São Paulo neighborhoods are just sounds, so many swayings of the verbal hips. Against the old prejudice that condemns all translations for being dull photostats of their originals, this idea holds that translation is in many cases just the reverse—the key, indeed, to making a work literary, and that a certain loss, a smudging of the detail or declaring-irrelevant of the particularities, is intrinsic to this process. Literary aestheticism is in large part the effect of being republished elsewhere; we call autonomous those works whose dependencies we are unable to spot. To this idea—that a novel is more likely to get treated as literature once it travels—the strong version of the argument adds that the literary world system is designed to reward writers who have, as it were, preemptively de-nationalized, whose writing comes pre-abstracted, obligingly stripped of geographical and historical markers, proper-name-avoidant. Tolstoy positions a character in Смоле́нск, and a Russian reader in the 1870s recognizes a western border town, a fortress defending the route to Moscow, a crossroads-which-is-to-say-battlefield, a place where Napoleon once attacked. Tolstoy’s translator positions that same character in “Smolensk,” and a reader in Minnesota in 1930 thinks … nothing much, probably … that he wishes the book came with a map … that he likes a good Jewish joke. 
Smolensk has become a city I just about recognize as Russian, barely more than a spot-marking X. And then Beckett writes, in Molloy: “I beg your pardon, Sir, this is X, is it not?, X being the name of my town.”  Modernism ratifies the condition of literature in translation, neither presuming local knowledge nor offering to produce it. And “world literature” is the name for a certain tendency towards abstraction within the global literary system, the propensity of works aiming for an international readership to make themselves frictionless. There is to that extent a social history to literary autonomy, a social history, in other words, behind the kinds of writing that feel licensed to dispense with social history.

Such, in a nutshell, is Casanova’s splendid revision of the concept of Weltliteratur, which here stops functioning as the name for an (especially tedious) canon and instead makes its rightful contribution to a materialist history of letters. One marvels, indeed, while reading her book, at the determination-unto-mania with which Casanova transposes into the sphere of literature arguments borrowed from Braudel and dependency theory and the like, casting about for belletristic semi-peripheries, programs of poetic import-substitution, &c., and almost always identifying plausible candidates. It makes a person wonder into how many other non-economic domains world-systems theory could be usefully extended: Is there a cinematic world system? Probably. A musical one? A culinary one? And yet Casanova’s argument is, for all that, rather broken-backed; there is a fracture running through her very great book. Here’s the tricky thing: Casanova helps us see that the world’s publishing centers have had the power to declare writing literary, to consecrate a foreign production as Literature, and she argues that the abstraction characteristic of such writing is produced by the unevenness of the global literary system. Abstract writing—or concrete writing read as abstract—involves a false universalization imposed by the biblio-metropolis. She herself speaks in this regard of the “structural ethnocentrism of the literary world.”

And yet—and here’s the puzzle—Casanova aggressively prefers such abstract and falsely universal writing, routinely declaring international modernism superior to rival literary modes, and expressing a certain pity for the African and Asian writers who don’t get to enjoy its bogus autonomy—“nationalist” writers, these would be, and literary realists: “conservative, traditional—in a word … ignorant.” She begins her book by explaining how a certain illusion of autonomy is produced and concludes it by patly reinstating that illusion. The matter comes to a head when she explains what distinguishes the semi-periphery in her ingenious model. One of Casanova’s advances over postcolonial studies as practiced in the English-speaking countries is that she has salvaged from Wallerstein this exceedingly generative concept, which adds a complexifying third term to the seesawing dichotomies of center/periphery and metropolis/colony. In Casanova, the semi-periphery—that which is neither metropolis nor colony proper—is the domain of the “small languages”—Bulgarian, Romanian, Swedish, and so on—languages, that is, with established print traditions, working presses, national or regional canons, &c., but whose literatures arouse little interest outside their borders and whose native readerships are by global standards so small as to support little professional literary activity. Writers on the semi-periphery thus face a choice, whether as burden or luxury, that genuinely colonized writers do not; the bifurcations in the literary world system crystallize in front of them: Is one to become a national writer or an international one? That choice isn’t fully available on the periphery, at least in the sense that Ngugi was doing something quite drastic in opting for Kikuyu, a language without novels, whereas Josep Pla, in opting for an already belletrified Catalan, was merely clambering on board a regional donnée.

The point that we won’t want to miss is that this geopolitical distinction—national v. international—is, on Casanova’s understanding, pegged to a second, properly stylistic distinction: realist v. modernist. Writers who do not care if foreigners read them write stories about their home countries in an accessibly middling prose. Realist fiction thus becomes the symptom-in-literature of a region’s more general backwardness; it is intrinsically parochial, requiring the specifications that anchor prose to a particular place; and writers who have the option of writing like Beckett and don’t take it stand accused of pursuing a retrograde policy. This is a point Casanova makes repeatedly and in the tones of a Viennese economist instructing protectionist Argentines to stop subsidizing wheat farmers. Such is the uneasy surprise of her book: Its entire conceptual framework is borrowed from the great anti-colonial sociologists, and a reader goes in thinking that she is trying to figure out what literature can contribute towards the liberation of colonized peoples. But it turns out that all she really cares about is the liberation of literature, and that she likes African and Latino writers most when they can serve that other end. It’s like getting to the last page of Wallerstein and finding out that he’d been promoting free markets all along.
Casanova thus reliably inverts the anti-colonial position, championing Caribbean and Arab and Asian writers when they take up European intellectual tools against their own peers, as when she praises the Algerian novelist Rachid Boudjedra for “employing the weapons of writers in the center in order to subvert social and religious proprieties [in North Africa].”  What in the first twenty-five pages she exposes, with great agility, as the “naïve” idea of a “pure, dehistoricized, denationalized, and depoliticized conception of literature,”  she reinstates gullibly in her final paragraphs as a “truly autonomous literary revolution,” commending modernist fiction for generating a second “independent world” to shadow the one we actually live in, which I think anyone would have to admit is a rather peculiar definition of “world literature”: a literature as little as possible about the world.

There is more to be said about this cinching together of nationalism and realism, as about its setting over and against a modernism that gets to count as international, since it turns out that very little about this scheme will survive closer inspection. Casanova’s account starts unraveling, as so often, around the antithesis to which it is tacked: nationalist realism vs. internationalist modernism. We can start shouting out the names of argumentative threads as they come unfastened. There are, by my count, three important points to be made against Casanova:

Realism is every bit as international as modernism, at least in the sense that Casanova means it: a widely diffused set of narrative techniques or formal structures, written on every continent, referring back to the same few models—Scott, Balzac, Flaubert, Tolstoy—and less attentive to local content than you might think. Another way to make this point would be to say, as Franco Moretti has, that the realist novel was a basically imperial northwest-European literature, or that realism was once the name for the encroaching standardization of world fiction, an innovative form, to be sure, but also an inertia, a stable “Anglo-French paradigm … third-person historical novels, and not much else”: Benito Pérez Galdós, Park Kyung-ni, Fenimore Cooper. The insidiously realist novel proved so compelling a form that it convinced writers in southern Europe, Asia and elsewhere to find the most British possible stories to tell about those places or convinced them to trick out French plots with characters bearing assonantly local names. This is the occasion to recall Roberto Schwarz’s great argument that the European novel was not, in its very form, suited to the colonies, but that early Brazilian novelists did not know this. Once a literary critic has separated realist fiction back into its distinct conventions—free-indirect discourse, marriage plots and multi-plots, character sketches, &c.—there is no reason to think of these as any less abstract than the studied imprecisions of late modernism: easy to carry, iterable, geographically indifferent.

Modernism is every bit as national as realism. There is, indeed, an unmistakable nationalism hitching a ride on Casanova’s argument, offering as it does a Third World anti-nationalism which tends nonetheless to endlessly reconfirm the preeminence of the French. This is no mere prejudice on her part: Casanova does provide some rather good reasons for thinking of Paris as the imperial arbiter of the Modern or for thinking that to become a modernist in Scandinavia or Ireland was in some more or less self-conscious way to Gallicize, and her account accordingly assigns a special, diagnostic role to those foreign writers who were upfront about apprenticing to los franceses: Rubén Darío, Georg Brandes, August Strindberg, Beckett himself. It is just that having made this point, she can no longer claim that modernism is, unlike realism, the authentically international position, since its transcontinental abstractions have always carried some secretly national commitments. That of course the same point can be made about an international-but-really-Anglo-French realism only tightens the screw: In addition to there being two international modes of prose fiction, there is also none.

The nation repeats at the level of content. Casanova makes the case for scores and scores of writers that they can’t be read in a narrowly national frame. She asks us to see any national literature as just one more place where international literary rivalries get played out, a perpetual, fraught recombination of foreign elements in which the indigenous contribution often recedes away to nothing: Canadian literature pits Anglophile novelists against Americanized ones. Modern Irish literature, which, from the vantage of 1870, one might have expected to be a running contest between the Anglicizers and the Gaelic nativists, decides instead to remodel itself on French, Russian, and Italian precedents. Casanova has a good time detailing such geo-literary twists and turns and has written perhaps the only literary history that sometimes reminds one of spy fiction: Ibsen “affirmed his determination to introduce realism into the theater and henceforth to use French literary tools for the purpose of devising a distinctively Norwegian style freed from German constraints and control.” And yet this analytical sophistication comes at a certain cost, allowing one to forget that at the straightforward level of setting and character, the modernist novels that Casanova champions are no less nation-bound than the realist ones she finds contemptible. Faulkner, after all, is a regionalist, the cornerstone of Southern literature seminars, a modernist-of-one-county.
Even Beckett’s Molloy grudgingly admits its Irish setting, and not only because the novel shares its name with a Victorian poet who wrote songs with titles like “The Kerry Dance” and “Thady O’Flynn.” If you read carefully, you’ll work out that Beckett has set his story on an island and that there is a sea, tellingly, to the east; you’ll spot the odd local custom or identifying mark: “And da, in my part of the world, means father.” We could grant for the sake of argument that modernism is in literary history the properly international term, and we would still have to conclude that its internationalism is available in its pages only as form, in which case, Casanova, having laid out the distinction between an international modernism and a nationally minded realism, is not actually choosing one side of that antithesis, but rather a particular way of breaching it: the internationalized narrating of the nation. Joyce’s Portrait of the Artist ends when Stephen Dedalus resolves to leave Ireland, which is another way of saying that the novel itself never gets to leave, that it does not follow Stephen, that it is forever stuck in Dublin; it fails to complete the character’s cosmopolitan turn. Casanova’s point would be that Stephen’s cosmopolitanism has actually been present in Portrait all along at the level of technique, the tangible, typographic signs of which are the dashes that Joyce uses instead of quotation marks, which are, of course, not really an innovation, but simply how many continental European writers handle dialogue: Russian, French, Spanish. Cosmopolitanism is available to Joyce as an ethos, as a principle that characters can discourse about; and it is also available to him as a punctuation mark; but it remains oddly absent at the level of content. That is the condition of modernism.

Here, then, is a proposal, and it is the suggestion that actually concerns me: In a tinkering spirit, one has to wonder about the unnamed counterpoint to Casanova’s chosen aesthetic—not a single-nation modernism, which is what she prefers, but a realism of many nations—Joyce’s Portrait, flipped. Ask yourself: If it is literary cosmopolitanism that we are after, why are we settling for Joyce’s Europeanized quotation marks? Why are we stuck extrapolating the politics from a typographic convention? More broadly: Why is the argument about world literature proceeding entirely at the level of form and technique? Don’t you want to read novels whose narrators themselves travel from continent to continent—and not just from the provinces to Paris, or from Sussex to London, or between neighboring countries—but properly global novels? But then where are those titles? How many can you name? One begins to wonder whether the novel, as a form, in any of its modes, can absorb properly global or transcontinental content, since even on Casanova’s own account, this possibility seems entirely foreclosed. It’s the option that doesn’t even come up. Her formalism is to that extent a grave limitation, and one begins to suspect that an internationalism of content would be the utopian term that eludes her rickety conceptual scheme—utopian, that is, simply by virtue of being missing. We are accustomed to thinking of form as sedimented content—the formulation is Adorno’s—and we want to say in this spirit that certain literary techniques carry the globe with them. But then where are the naively planetary novels of which these techniques are the vaporings? Do we have in front of us the strange case of a sediment that precedes the object of which it is the residue? How could a novel make good on Joyce’s Hibernio-Slavic quotation dashes? Is it possible to reconstitute the body from that trace? Could a world literature actually tell stories about the world?

All one needs to know about Franco Moretti, meanwhile, is that he has written a book, The Modern Epic, which is perhaps the most bizarre contribution to literary history in the last generation, a book about “world texts”—“supranational works” of vast “geographical ambition”; of, indeed, “global ambition”—in which he for all intents and purposes identifies no such works. The real head-scratcher in The Modern Epic comes in the closing pages when Moretti confesses that he had meant to write a study of novels that conceptualize time into very long periods—super-historical novels, you might call them—but that he had realized as he wrote that he was interested in geographical expanse instead: spatial immensities rather than chronological ones. And yet none of the works he writes about are geographically expansive, which leaves the reader in the odd position of having a deflationary counter-epiphany. Moretti is surprised to have written the book he did, and the reader is surprised that he didn’t actually write that book. His key titles are two national allegories (One Hundred Years of Solitude and Midnight’s Children); a city novel (Ulysses); Wagner’s Ring cycle, which Moretti himself calls “spatially concentrated,” “a grand world, but one made up of few places”; and Goethe’s Faust, which so defies Moretti’s attempts to classify it as a “world text” that he finally breaks down and concedes that it is “a kind of national saga” after all. Instead of the modern epics that his title promises, Moretti has spread out before us a set of more or less unconvincing proxies: Maybe literary crowds and choruses can produce the effect of the world, by reproducing in prose what the planet feels like. Or maybe multiethnic nations can stand in for the world. Maybe department stores can. Or people walking shop-lined streets. Maybe we can say that an epochal and multi-generational narrative is about the world, provided we agree to read time as though it were space.
But then why would we do that? Any solution this labored obviously discloses the actual problem, which is that extended space does not seem to be directly representable, and Moretti has not paused long enough to ask why. Why should we have to go through the detour of time? Why this nervous list of approximations? What becomes clear is that the one thing that Moretti most wants—the thing, too, that he has confoundingly convinced himself he has identified—is actually missing. So why do the theorists of world literature routinely make a hash of “the international” and “the national”? And do we have any counterproposals to make back to Moretti in a cooperative spirit? Where, finally, are the books he thought he was writing about?




3. Žižek’s Stalling




We would do well to remind ourselves of how the Greimas square works, because knowing the square is going to make it easier to pick out what is least settled in Žižek’s thinking: his uncertainties, his panic. Before you click away to some corner of the Internet that doesn’t involve Lithuanian semiotics, let me observe that there is nothing metaphysical about the Greimas square. It’s just a device for beginning to say in which specific ways a given opposition is likely to turn unstable—which particular terms, in other words, an antithesis will generate but no longer be able fully to encompass. It provides a rough guide to the instability of any conceptual pair you find yourself needing to think about. Perhaps you’re trying to make sense of a story (or a philosophical system or the everyday idiom of a school or social scene), and you’ve noticed that it is fixated upon some opposition. If you now tabulate a Greimas square around the opposition’s two terms, you will have a much clearer idea of how X vs. Y can become unstuck, at which point you can turn back to the narrative (or whatever) and start scanning it for its pressure points. You will have a better chance of naming the passages (or episodes or characters or arguments) that most threaten the narrative’s governing antithesis. The Greimas square will flush out the material that the story has to work hardest to contain.

Here’s the easiest possible example. We compulsively code people and animals according to their genitalia, to the point where some people think that the doors of restaurant bathrooms are the very model and derivation of all two-term thinking. So the Greimas square begins with what first strikes the mind as a fixed opposition:

[Figure: the square’s starting opposition, man vs. woman]

Next comes the bit that has the character of an instruction. For each term x, you think up some adjectives that describe the un-x and then record them in a short list beneath x’s opposite number, like so:

[Figure: the four corners in place, with “mannish” recorded beneath “man” and “effeminate” beneath “woman”]


All of the action happens now, once you have these four corners in place, as you begin to sum each of the vectors in the square, vertically, horizontally, and diagonally: Man plus woman, man plus mannish, man plus effeminate, and so on. If we accelerate to the completed square, it will look like this:

[Figure: the completed square, with the composite terms—“hermaphrodite” at the top, “tomboy” at the bottom—summed along each vector]

We’ll want to note at least three things about this and any other such quadrangle. 1) Its line of central terms, from “hermaphrodite” down through “tomboy,” all name intermediate or mongrel concepts: mules, tangelos, the usual stuff of the dialectic. The Greimas square is an especially efficient way of generating, from out of a system in seeming repose, its agitation—its misfits and unassimilated conceptual grit—though it will at the same time disclose the categories by which the system will move in to denominate its own anomalies. 2) When you sum each side of the square vertically (man + mannish, for instance), the adjectives that reside on the bottom tier will serve as intensifiers, producing purified or pumped-up versions of each of the antithesis’s central poles. Implicit in the Greimas square is thus the neglected insight that positive terms—terms that seem to exist outside of relationship—are as disruptive to a binary order as intermediate ones. 3) Aficionados of Greimas often call the hermaphrodite—the both-and construction that perches on the top of the completed square—the perfected or utopian term. It’s not clear whether we should call this synthesis queer (because its archetype is the androgyne) or un-queer (because its original is marriage). Either way, it is in this utopian term that the system’s initial opposition is overcome, its stalled conflicts and predictable oscillations set to one side, and the gratifying possibility of new historical and narrative material at last glimpsed. The x-plus-y term is usually thought of as the way out of a given semiotic square and into some other parallelogram or lozenge.
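For readers who think better in pseudocode than in quadrangles, the tabulation just described can be sketched in a few lines of Python. This is only a toy rendering of the procedure above: the corner terms and the composites “hermaphrodite” and “tomboy” are the essay’s own examples, while the labels for the two vertical, intensified sums (“he-man,” “ultra-femme”) are hypothetical placeholders of my own, since those positions go unnamed here.

```python
# The initial opposition (S1 vs. S2), plus, beneath each pole, adjectives
# describing the negation of its opposite: the un-woman recorded beneath
# "man", the un-man beneath "woman".
square = {
    "S1": "man",
    "S2": "woman",
    "not-S2": "mannish",      # the un-woman, listed beneath "man"
    "not-S1": "effeminate",   # the un-man, listed beneath "woman"
}

# Summing the vectors yields the composite positions. The both-and term at
# the top is the one Greimasians call perfected or utopian; the vertical
# sums act as intensifiers, purifying each pole.
composites = {
    ("S1", "S2"): "hermaphrodite",    # utopian term: both poles at once
    ("S1", "not-S2"): "he-man",       # purified "man" (label hypothetical)
    ("S2", "not-S1"): "ultra-femme",  # purified "woman" (label hypothetical)
    ("not-S1", "not-S2"): "tomboy",   # neither pole proper: the mongrel term
}

def vectors(square, composites):
    """Spell out each sum as 'term + term -> composite'."""
    return [
        f"{square[a]} + {square[b]} -> {label}"
        for (a, b), label in composites.items()
    ]

for line in vectors(square, composites):
    print(line)
```

The point of the exercise is only that the square is mechanical: feed it any governing antithesis and it will grind out the intermediate and intensified terms whether or not the system under analysis wants them named.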

Knowing even this much about Greimas should allow us to say what makes Žižek’s project in many respects rather unusual. His thinking is manifestly organized around an opposition—the antithesis of law and transgression. That couplet will reappear in scores of his more local arguments. But what he calls upon us to repudiate, after those many arguments have crystallized out into their overriding political claim and program, is the merger of law and transgression in post-Oedipal capitalism’s culture of compulsory mischief, that historically novel system in which authority accrues to the rule-breaker rather than to the bailiff and in which it has become possible—check your own head—to feel guilty about doing what you’re told or to find the superego calling you to account for being insufficiently insubordinate. We can simplify that last sentence: Žižek repudiates the merger, and this is peculiar because it means that on the schedule of concepts generated squarewise by the antithesis law vs. transgression, it is the perfected term—the fusing of obedience and rebellion—that Left Lacanianism recommends we back away from. Žižek is widely regarded as a dialectical thinker, but it has to be said: He takes the synthesis to be the problem, and that isn’t how the dialectic typically works. Žižek means to identify an already existing fusion and then in some not entirely perspicuous sense resolve it back into its component parts, to throw the dialectic in reverse or desublate an established Aufhebung. Anyone running a Greimas square on The Plague of Fantasies or The Ticklish Subject is going to stop short upon finding the utopian term preemptively blocked, displaced by market society’s malign parody of reconciliation. We’ll still want to work up the square, though, because doing so will at least generate the other options, the terms that might be asked to serve as utopia now that synthesis has forfeited the role. 
Some other location on the square is going to have to provide the chute that leads out of its geometry, and we’ll want to know which it is.

So here is what Žižek’s square would look like if we left all its terms in the abstract:


[Figure: the square generated by law vs. transgression, its terms left in the abstract]


At this point, our task is to work out what more specific terms Žižek has inserted into each of these conceptually dictated slots. We need, that is, to determine what kinds of historical substance can be attributed to the square’s otherwise intangible positions. We already know that the perfected term has been captured by the new spirit of capitalism and its “world of ordained transgression.” Change fast … match your brand’s look and feel … constantly innovating … 5 billion emails every month … monitor activity … celebrate creativity and chaos. And any disaffection we feel towards this term can effortlessly be extended to the two just below it, those other, equally inauspicious mediations: lawlike transgressions, the Lacanian name for which is hysteria, and transgressive knuckling to the law, known locally as perversion.

So with the central spindle removed from consideration, a Lacanian politics is going to have to travel the Greimas square’s outer perimeter. Three possibilities end up suggesting themselves:


Three o’clock

Perhaps what we’re looking for is a politics that, in Žižek’s words, “suspends the dimension of the Law” or that affords us “jouissance outside the Law”—a transgressive transgression, then, a mode of waywardness that makes no reference back to the decrees of God or government and so can no longer be called “transgression” or “misconduct” nor even properly “lawlessness.” Žižek’s name for such devilry is “Christianity,” which is going to seem less confusing if we quickly note four things:

1) The philosopher from Catholic Europe doesn’t seem to realize it, but he isn’t talking about Christianity in general so much as about its hyper-Pauline strains—about radical Protestantism, in other words, and especially about the sects that came to the fore around the English Revolution: the Independents, the early Baptists, the Muggletonians. Something about Žižek’s confessional turn would have been more comprehensible if he had subtitled his books “Why the Quaker Legacy Is Worth Fighting For” and “The Perverse Core of Quakerdom.” If his persistent Jesus talk has struck many readers as confusing, this is at least in part because the Christians he is talking about are either dead or living in Pennsylvania college towns. Chances are you haven’t met them.

2) These Christians really did declare an end to the law. Here’s John Milton in Paradise Lost: “And to the cross he”—Jesus—“nails thy enemies: the law that is against thee and the sins of all mankind.” Knowing the historical case is your best chance at guessing the kind of politics that Žižek is trying to resuscitate when he says, in Miltonic accents, that Christ “signals the Law’s demise.” In the seventeenth century, some radical Protestants began selecting their own ministers from out of the nation’s pool of university graduates. They wouldn’t accept appointments from a superimposed hierarchy, but expected, rather, to exercise oversight over their own guardians. Others began raising ministers from out of their own plebeian ranks—lay preachers, then, who kept their day jobs and were granted no special authority over their parishioners. In Bristol, England, there were mixed-race Baptist congregations presided over by women as early as 1650. And then others still took the next consequent step and abolished the position of minister altogether, a feat that once perfected within the church could next be repeated extra-ecclesiastically. Milton held high office, for its duration, in the revolutionary government that beheaded the English king in 1649, which act is what the poet had in mind when he imagined the law being executed in Jesus’ stead, with Christ back on the ground and hammering, a centurion turned against the empire, the crucifix mutating before the reader into Judea’s guillotine.

3) This Christianity depends on a simple shift in grammatical mood. Where most churchgoers will tell you that Christians should love other people, the believers-beyond-the-law will say instead that Christians do love other people. If you are the sort of person who takes care of others without asking for their papers or checking first to see if they are worth your attention, then you are a Christian; and if not, then not. If, that is, you have to think about any of this, if you have to deliberate your way to that position, then you are only revealing your distance from God. There is thus an anti-ethical moment within Christianity itself, for which solidarity is not an obligation, but a kind of moral fact—the most important thing, in one sense, but also just something that people do. Keeping the law would, from this perspective, be a problem, since once you tell yourself that you should be more loving, you have made it clear that you would actually rather be some other way—you would prefer to be unreceptive or perfunctory or bilious—at which point agape can become just another target for your resentment, one more stricture that your authenticity requires you to defy.

4) In Book 8 of Paradise Lost, after the archangel Raphael has finished telling Adam the story of the Creation and the war in heaven and the ostracism of the rebel angels, he pauses to ask if the first man has any questions. And Adam has only a single question—just one thing he wants to ask: Do angels have sex? Raphael replies that they do, except that spirits have no flesh, such that they are constantly passing in and out of each other’s organisms, “obstacle finding none of membrane,” wafted into penetration by each puffing breeze. What Milton brings into view, then, is the possibility of a sex uncarcassed, whole-bodied and resolutely non-genital, still the literary canon’s most compelling image of polymorphous desire, a libido without need of fruitfulness or groin-anchoring. We will read elsewhere in the poem that the angels are “without feminine”—they are all “masculine”—and this will only confirm the point: that the radical Protestant heaven is a place of unrestrained sex between men, or if not men then males, gay sex, you might want to say, except that this sex accords no priority to “joint or limb”—some other gay sex, therefore, the hypothetically unphallic version, gay male sex refashioned on the example of its lesbian feminist antithesis. This is Christianity’s own vision of liberated enjoyment or good obscenity—of pleasure beyond the law—all of it palpable still, if you seek out the right Sunday service, in the quaking and the shaking and the shout music. The overall point is simple: Žižek sometimes seems to think that Jesus is how you become a Lacanian without going into therapy, and he thinks this because there really have existed Christians who believed that the law had been abolished and that moral life was a matter of enjoyment rather than obligation.


Six o’clock

But then you might decide that pursuing an ethical enjoyment doesn’t make any sense. Let’s first put the best face on that position: The doctrine of good obscenity holds that political goods are not sustainable if they are rooted in repression. If, for instance, my fellows and I achieve our solidarity only by discretion and euphemism, then our camaraderie can at any point be blown apart by an eruption of the Real. The alternative would be to absorb trauma and the drive into our position and so to seek gonzo versions of what we’ll have to stop calling values: not freedom, but nasty freedom; not equality, but nasty equality; not justice, but nasty justice. Liberté, égalité, obscénité. But this proposal is hard to carry through consistently. Part of the problem is as it were philological. Another of Lacanianism’s core arguments is that the father is always a sham. That’s the starting point, in fact, from which psychoanalysis leads most directly to an emancipatory politics; it thinks it can show you that paternal authority doesn’t really exist. You probably formed your conception of authority at age two or three, attributing to your father powers that he plainly did not possess. To a toddler, the father is, ludicrously, the Person Who Can Do Anything He Wants—the one who can run faster than you and jump higher and always reach the ice cream, the one who can pull your nose from your face and reattach it at will, the one who can send you to your room and somehow make you stay there. Žižek’s claim is that your relationship to authority has never stopped being childish in this fashion, that even once you grew taller than your dad and began to outrun him and realized that the nose in the old man’s fist was just his own poorly disguised thumb, you transferred your belief in his omnipotence to the father’s sundry proxies: cops, bosses, priests, &c.
What remains as one of childhood’s more damaging legacies is your conviction that there exists somewhere someone who gets to do all the things that you are prevented from doing, someone who possesses the jam that you lack. The grown-up alternative to this view would be, rather than struggling against such people, to stop believing in them, to stop conferring on them a supremacy that they would not have absent your belief. So authority in some sense doesn’t exist but is merely an attribution; all pretended powers are to that extent spurious; and the word “obscene,” in Žižek’s writing, usually refers to the way in which your desire is entwined with such fantasized and illegitimate hierarchy. But when Žižek writes, as he often does, that what we require is, say, an “obscene solidarity,” he can’t mean the word in that sense. He can’t mean a solidarity supercharged by some delusion we hold about our fathers, since the paternal presence, even if a phantom, would so obviously compromise the solidarity it is being asked to underwrite. Worse: We require an obscene politics on simple Enlightenment grounds, so that our practices will not depend on repression and its fragile lies, but then obscenity threatens to reintroduce into those practices distortion and misapprehension at another level. We watch Pentecostals chicken-walk down a church aisle, and we can just about imagine an obscene justice, except that obscenity in the sense that Žižek usually means it would transplant injustice back into the realm of the fair and the due. How, we will need to know, could obscenity serve justice and still be experienced as obscene? Wouldn’t obscenity by definition bring with it excess, inhumanity, compulsion, &c?

Nor is the problem merely lexical. That Žižek continues to use the word “obscene” in these contexts should rankle; it is a persistent if accidental reminder that transgression carries law with it and that devising genuinely liberated versions of the Left’s core positions is going to take more than an act of will. So the next part of the problem is epistemological: Let’s suppose I’m white and I’m close to some guys who aren’t, and I say that, no, really, I can joke with them about how enormous their penises are, because I am thereby acknowledging the history of racist cliché, the sexual panic that was woven into every looped rope, &c. This will be the crucible of our confrerie; my tastelessness will retrieve entire registers of historical experience that tact would just as soon place beyond discussion. And yet it is reasonable to ask: How will my buddies know what construction to put on my jokes? Psychoanalysis hardly suggests that we are transparent to one another, so I shouldn’t, if I’m following Žižek, be able to take my intelligibility as given. How, in other words, would anyone who is not himself a trained Lacanian analyst be able to tell that my joke isn’t a way of pulling racial rank on them? And wouldn’t even my analyst require long acquaintance with me in order to make that determination? So why would any comrade of mine put up with those big-black-dick jokes for the time it took to figure this out? Or maybe I think that my crew should be able to know my mind immediately and on the spot. Maybe there are simple verbal indices that will tell a person what is liberated enjoyment and what is mere hysteria. But then what would those be? Psychoanalysis doesn’t give us any reason to hope that this would be the case, and Žižek never instructs us on how to make the call, and besides, if there were such rhetorical cues—features of syntax or word-choice or inflection—then these would be mimickable by any racist and they would thereby stop functioning as cues. 
So let’s agree that my friends can’t tell my mind. But then I have to wonder, too, whether I can really know that my wisecracks are emancipated and anti-racist rather than obscene. Can I be sure that I understand my speech any better than others do? When did Freudians start believing that people are in control of their own utterances? At this point the epistemological problem reveals itself to have been a properly analytic one all along. For even if I speak my jokes in the spirit of uninhibited fellowship, can I be sure that I’m not also deriving pleasure, repetitively and compulsively, from them? It is a rare joke that tells itself only once.

One way to terminate this train of misgiving would be to give up on the idea of a good obscenity or enjoyment outside the law. Perhaps rather than trying to wrest pleasure free from regulation, we could cancel the law and enjoyment in one, swinging from the Greimas square’s scatty right flank down to its neutered fundament. The neither-nor would replace the both-and as the dialectic’s utopian term, producing not a synthesis but merely an uncharged field, atheticized and disannulled—an antinomianism still, but one that gains the unlawed person no treats or hedonic bonuses, an antinomianism chaste and meager. Perhaps, Žižek writes, we should worry less about “suspending the explicit laws” and worry more about suspending “their implicit spectral obscene supplement.” A good politics wouldn’t produce a different obscenity; it would “simply have none.” The complication that emerges at this point is that Žižek’s name for this position, as rival to an ecstatic Christianity, is also Christianity, which is thereby made to occupy two competing slots on the Greimas square. Maybe Christianity is the religion of love-not-law, featuring a god who pals around with whores and compulsively turns all nonalcoholic liquids into wine. But then maybe radical Protestantism’s love-beyond-the-law will itself no longer feel much like love. The Quakers, after all, haven’t quaked for centuries. They sit in silence in spare rooms and address each other as “friends”—Lenten intimates and un-obscene compeers, forming a horizon of flat amity from which no-one rises to the level of lover.


Nine o’clock

-But then if the idea is now to cancel the obscene supplement, it is enough to consult the Greimas square to know that one further option remains conceptually available. This would be the square’s left extremity: the law that is fully law, law in its positivity, with no furtive link back to disobedience—a position that, though thinkable, is psychoanalytically disallowed, which makes it all the more surprising that Žižek has been willing both to entertain the option as a political goal and to propose a candidate as historical bearer of the project. Here’s the project: “The problem (today, even) is not how we are to supplement Law with true love (the authentic social link), but, on the contrary, how we are to accomplish the Law by getting rid of the pathological stain of love.” “Kant sans Sade,” he sometimes calls this idea, a figure we normally know as “Kant.” And here’s the bearer: “Jews assert the Law without superego.” The origins of Judaism lay in a “liberation from the obscene superego.” Israelite hyper-legalism—the stance of a cartoon Judaism that sticks myopically to the letter of the law while ignoring its spirit—brings into view the law in its pure form, without “the repressed desire to sin.” And with that we have the answer to the acrostic’s last unsolved clue. Once we speak the word “Judaism,” we can re-do Žižek’s square with all its proper names and historical specifications:

[Figure: Žižek’s Greimas square, completed with its proper names and historical specifications]
This schema allows us to see, among other things, that Žižek’s turn to political theology hasn’t, in fact, been all that pious. He has been interested in Christianity and Judaism only because their various sects really have had rather different things to say about a person’s proper relationship to the law and so can, at the price of a certain brusqueness, be asked to stand in for the combinatory’s several positions. But anyone put off by his readings in the history of religion can at any point return to the square in its conceptual purity and ahistorical abstraction; there’s no reason you can’t put forward your own secular version of the square, provided you are willing to propose alternate, irreligious candidates to take the place of Žižek’s godly ones. One word of caution: The most difficult moment in that undertaking is always going to involve the slot that Žižek calls “Jewish,” which has to be occupied by a magical people without drive or libido, utopian Pharisees and virtuosi of repression, the ones who can ignore their desires and never pay the price.

An opportunity now arises: With the completed square in front of us, we can say at last where Žižek’s thinking is most stuck. The square’s simplicity strips away the endless ingenuity of Žižek’s page-long riffs and discloses instead the unsteadiness of his Lacanian structure—the problems it cannot solve and questions it cannot answer. What almost no-one has noticed, for one, is that Žižek has been distancing himself from Christianity over the past decade and shifting over instead to the positions that he calls Jewish. Granted: They’re probably only Jewish within his Lutheran and sock-puppety scheme. The way he uses “Judaism” as a shorthand for various law-loving positions might, indeed, irritate you, but before your annoyance propels you to stop reading, you might want to at least register that this Judaism, if a scarecrow, is a scarecrow that Žižek has begun to identify with. Here, reduced to their tags, are some of the positions he has been arguing of late: that we have to “assert the priority of the Jewish principle of just revenge/punishment”; that we must mount a defense of “rigorous Jewish justice”; that we must retrieve “the most precious and revolutionary aspect of the Jewish legacy”; that we must forgo the banalities of human rights talk for the non-negotiable severities of the Decalogue; and, again and again, that “we are all potentially homo sacer,” which, traced back to its source in Agamben, means “We are all potentially victims of the camps” and is thus an erudite, Romanizing update of an old radical street chant: “Nous sommes tous des Juifs allemands.” You might remember reading in the Book of Acts that the early Christians were communists—“all things were common property to them”; “no one claimed that any of their possessions was their own”—and guessed on those grounds that Žižek’s drift to Leninist militancy was the simple extension of the month he spent reading Paul in the late ‘90s. 
Rhetorically, though, Žižek’s neo-Bolshevism is better understood as a break with his Protestantism, which turns out to have been cranky and fleeting, like those three Dylan albums that no-one ever listens to, though Žižek has never really announced his conversion, and it will take some cross-referencing to establish the point: In one book we read that ancient Judaism was “revolutionary” because it was willing to treat the law as a “pronouncement,” “something externally and violently imposed.” In another we read that the Left must “assume the task of a new ‘ordering’ against the global capitalist disorder”; capitalism is a non-regime of “permanent self-revolutionizing” and pointless innovation, against which we must “shamelessly enforce” a new law. Our communism will, in this sense, come from Sinai and not Galilee. Under the sign of Moses, revolution and counter-revolution become impossible to tell apart.

This particular argument of Žižek’s might, at first, seem a little startling, if only because it has rarely been appreciated how much of his corpus amounts to an ongoing Réflexion sur la question juive. But what is truly confounding is not this or that particular proposal of Žižek’s—not this or any other recommendation as to how we might best tackle the problem of enjoyment—but the sheer number of such recommendations. Žižek has lots of ideas about how we might get enjoyment right, and the effect of this fertile, brainstorming array of possibilities is to rob each individual suggestion of its plausibility and so to make the problem seem insoluble after all. False motion is the sign of the system’s stalling, its unremitting reasking of a question the possible answers to which never seem to stick. Žižek is hardly immune from the kettle logic he is quick to spot in others: —I didn’t break your law. —The law was broken when you gave it to me. —Law? What law? God doesn’t make laws. Pauline Christianity is likely to believe that Jesus “paid the price” or “fulfilled the law,” that he enabled God to show us mercy by suffering in our place, &c. It is in this sense entirely nomian and law-loving, suggesting as it does that it was beyond God’s power simply to repeal the law or to declare it void. Even a radical Protestantism thus preserves something of the law’s structure secreted inside itself, on the theory that Christ’s death keeps in permanent balance the scales of divine justice. Against this a more thorough-going antinomianism can argue that Jesus’ death was so brutal that there is nothing you and your piety can do to make it right again. You cannot say: Oh, I get it—this man was tortured and hung up to die, and I therefore promise not to have sex until I get married. That idea is, in fact, a little nutty, as though your prolonged virginity were in some sense equivalent to torture, as though the one could compensate for the other.
The best thing about Christianity is that it has at its center an act that was entirely cruel, because cruelty breaks the logic of the quid pro quo, which is the logic of the law or the contract or the bargain. But this notion of grace—which is the doctrine of the law’s gratuitousness and self-indicting excess—imposes the burden of endless thanksgiving, the acknowledgment day after day of an unpaid debt, which is to say a peonage: “Then for thy passion—I will do for that—Alas, my God, I know not what.” Calvinism, meanwhile, manages to suspend the law only by positing a sovereign god, a lord and father, whose authority cannot be checked even by his own commands; there is no law, it’s true, but only because God doesn’t have to honor any of the agreements you think you’ve made with Him. A Christian antinomianism, judged within a Lacanian frame, keeps cycling back to law and superego and Big Daddy. Žižek’s point all along has been that the concept of law doesn’t exhaust how the social order keeps a hold on us—that there is always something beyond law—and that this other thing, Enjoyment, necessarily approaches us as non-juridical beings, hence in the mode of bare life. Antinomianism might be the path to emancipation, but it is also the condition of both the sovereign and the homo sacer, those persons outside the law. So Žižek has gotten more hostile to Enjoyment over time. His asceticism has taken over. He has come to think of Judaism as a spiritual practice that can teach us how to follow the law without getting sucked into the obscene supplement. And he thinks the same thing about Kant—that Kant teaches us how to work upon ourselves, in a Lacanian spirit, so that we can identify moral law without Enjoyment getting in the way. And he thinks that Leninism was a Judeo-Kantian politics, before Stalinism took over and brought obscene enjoyment back into Communism. But there is still a very big problem. 
When he is attacking the theory of radical democracy put forth by some rival Lacanians, Žižek says that these others just don’t get it—that the negativity at the heart of democracy generates its own obscene supplement. Democracy prides itself on being ideologically thin, which means conceptually and libidinally thin, minimally mystified. A proper democracy will be entirely procedural or formal; it won’t tell anybody what to think or feel or want. But this means that democracy cedes enjoyment, the libido, &c. to the Right, to which it is then attached in a historically determinate structure: an erotically thin democracy will always go hand in hand with erotically charged challenges from the Right. You can say in advance that they have to go together; you can’t have the first without the second. That’s what he means when he says that the last generation’s new nationalisms and new fundamentalisms have been part and parcel of democracy and the center-Left: Obscene enjoyment “is the obverse, the fantasmatic supplement, of democracy itself.” It is on this point that Žižek is, in fact, closest to Wilhelm Reich. But then one has to wonder: How is his notion of a Judaic communism of the Law exempt from this same critique? What happens to Enjoyment under Judaism or Kantianism or Leninism? We’re really back to basics. Psychoanalysis tells us that the libido never just goes away; you can’t tell it to leave and you can’t tell it to heel. So why would Jews and Communists be exempt from this? Why would they and they alone have beaten Donkey Kong?
If consumer capitalism is the Regime of Obscene Enjoyment that a justice-loving Communism is offering to repress, then won’t capitalism just take on the status of the Real or the drive, especially for the many of us who will possess pre-revolutionary memories of such a thing—won’t a successfully suppressed capitalism just become the market unconscious, the consumer underground, the shop-till-you-drop-and-all-you-can-eat-and-our-doors-never-close? So here are my two questions for Žižek, which I’m hoping someone will put to him the next time he is near a mike: First, are we meant to pursue a politics beyond obscenity or is the idea to make obscenity itself do the work of justice, and if the latter, in what sense would this obscenity still be obscene? Second, and perhaps more pressingly: How do Jews get their kicks? I know, I know: That question lies at the center of Žižek’s entire conked-out system, and it still sounds like a joke.

(My thanks to Jason Josephson, Anita Sokolsky, Ali Mctar, and my fellow readers of Žižek in ENGL 456.)





2. Žižek’s Method


Žižek is above all a Gothic writer, and the admirers who approach him as though he were Louis CK or Reggie Watts are thus falling into a kind of category error. They’ve got the genre wrong, like the people who go to slasher movies and chortle every time the knife comes out. A Gothic writer: It’s not just that Žižek publishes on the kind of accelerated schedule that we more typically associate with pulp fiction or even comic books, though some still-unfriendly readers could probably reconcile themselves to his industrial tempo if they began thinking of The Monstrosity of Christ and First as Tragedy not as free-standing volumes, nor even properly as books, but simply as the latest issues in a long-running title—a single year’s worth of Slavoj Žižek’s Adventures into Weird Worlds. The first-order evidence for Žižek’s Gothicism—the cues and triggers that invite us to read his writing as a kind of Gruselphilosophie—is not hard to find: the frequent encomia to Stephen King, to whom even his beloved Hitchcock is finally assimilated; a tendency to explicate Lacan by summarizing the plots of scary movies; a persistent concern with trauma, cataclysm, and grief. Psychoanalysis’s most fundamental insight, he writes, is that “at any moment, the most common everyday conversation, the most ordinary event can take a dangerous turn, damage can be caused that cannot be undone.” So, yes, Žižek is a magnetic and slobber-voiced goof; he is also the theorist of your life where it is going to be worst, the implacable prognosticator of your distress.

But even once we’ve spotted the jack-o’-lantern that Žižek never takes off his porch, it is going to be hard to know what to do with it or how to reckon its consequences. What, after all, does it mean to say that a given philosopher is a kind of horror writer? You might be wondering, for instance, if there is a philosophical argument attached to all of Žižek’s horror-talk. It would be possible at this point to survey the philosophy canon and compile a list of concepts or excerptable positions establishing European thought’s many different accounts of terror, trepidation, and unease. Indeed, for the philosophy graduate student, the language’s fine discriminations between panic’s various grades and modes come as it were with the names of Great Thinkers already attached: Hobbesian fear, Kierkegaardian dread, Freudian Unheimlichkeit, the angst, anxiety, or anguish of your preferred existentialist. And there is nothing stopping you from reading Žižek in this manner and so walking away with yet another philosopheme, in which case you might decide that Žižek is a fairly conventional theorist of the spooky-sublime, like so: All language involves a doubling; whenever we name something, we fashion a doppelganger for it. I open my mouth, and where before there was one thing, the object, there are now two, the object and its name, and if I’m thinking clearly I need to be able to distinguish rigorously between the word “table” and the touchable, breakable, enduring-decaying, eighteenth-century Connecticut batten door upon which I am now typing. Žižek takes the position that language thus severed from its referents is always on the side of fiction, fantasy, and ideology. You can only be sure that you are in the presence of something real if this kind of doubling hasn’t taken place, if, in other words, the object hasn’t been surrounded by verbal shadows of itself.
If you can talk about something, then it is by definition untrue; it has already been translated into a kind of derealized chatter. And if it’s true, or if it’s Real—because that’s the philosopheme you are about to pocket: the Real—then you can’t talk about it or can’t talk about it lucidly and coherently. But in that case, the only things that get to count as Real are the things that resist being named—those enormities that daunt our congenital glibness—which is to say the worst things: the torsions, the tearings, the ugliest breaks. Nearly everything can get sucked into the order of language, but some few things can’t. What remains is what’s real: the unspeakable.

But perhaps this too-fluid summary is beside the point. For to call Žižek a Gothic writer is finally to say less about the substance of his arguments than about his way of making those arguments—his philosophical style or Darstellung. It is one thing, I mean, to point out that Žižek gives an account of fear, which we could reflect on and debate at the seminar table and then agree with or not. It is another, rather more interesting thing to observe that Žižek is trying to scare you—not just to explain the uncanny to you, but to raise its pimples in your armflesh: “What unites us is that, in contrast to the classic image of proletariat who have ‘nothing to lose but their chains,’ we are in danger of losing everything.” Critical theory, of course, has always been readable as a mode of Gothic writing, just another subgenre of the dark-fantastic, with Freudianism and Foucauldianism assuming their place on the bookshelf alongside vampire novels and chronicles of crewless ghost ships and other such stories of the damned. Marx describes the commodity as “phantom-like” and calls capital a bloodsucker and attributes to it a “werewolf-hot-hunger.” Freud makes of psychoanalysis a sort of ghost story and instructs his followers to conduct therapy as though it were a séance or an exorcism—a making-the-spirits-walk. In German, the other name for the unconscious is not reassuringly distanced and Latinate, but bluntly, forbiddingly vernacular. The Ego, this is to say, does not share our person with the Id—that’s not how Freud puts it. Das Ich is chained to das Es, “the Me” to “the It,” or, if you like, to It. Walter Benjamin, meanwhile, asks us to declare our solidarity with the dead. Adorno requires that you take a shard in the eye. Foucault recasts Left Weberianism as a paranoid thriller, a story about imprisonment and surveillance and the impossibility of outrunning power.
Critical theory, this is all to say, needs to be read not only as a teaching or a storehouse of oppositional arguments, but also as a historically inventive crossbreeding of philosophy and genre fiction. The Frankfurt School Reader is, in that sense, one of the twentieth century’s great horror anthologies. If we now insert Žižek into this philosophical-literary timeline, we should feel less awkward naming some of his writing’s schlockier conventions: his direct emotional appeals to the reader; his sudden juxtapositions of opposed argumentative positions, which recall less the patient extrapolations of the dialectic than they do the jump cuts of summer-camp massacre movies; his pervasive intermingling of high and low, which marks Žižek’s arguments as postmodern productions in their own right, against which the genre experiments of Freud or Benjamin will seem, in retrospect, downright Jamesian and understated and belletristic. Das Ding an sich is just about hearable as the name of a B-movie: The Thing In Itself!

But this isn’t yet to say enough. I want you to agree that the Gothic in Žižek is something more than a reasoned-through philosophical position, offered to the reader to adopt as creed or mantra. But it is also something more than a sinister rhetoric or set of literary conventions—more than a palette of gruesome flourishes borrowed from the horror classics. In Žižek’s writing, the Gothic attains the status of a method. This will need to be explained, but it’s worth it: It is a tenet of Lacanianism that things in the world have trouble cohering or maintaining their integrity; this is true of persons, but it is every bit as true of institutions or, indeed, of entire social fields. One of the great Lacanian pastimes is thus to scan a person or a piece of writing or a historical-political scene for evidence of its (her, his) fragmentation or disintegration. To the bit of Sartrean wisdom that says that all identity is performance, the Lacanians add a qualifier: All identity is failed performance, in which case it is our task to stay on the lookout for a person’s protrusions and tells and prostheses, the incongruous features that seemingly put-together persons have not been able to absorb into their specious unity. In what specifiable ways are you least like you claim to be? Where is your Adam’s apple, because it’s probably not on your neck? Now once you get good at asking such questions of people, the challenge will be to figure out how to ask them again of the systems in which people reside. The Real—whatever lies menacingly outside of discourse—can take several different forms: Most obviously, it can name external trauma: assaults upon your person, the bullet in your belly, your harrowing. But it might also name your own disgusting desires, the ones you are least willing to own. Or it might name the totality (of empire, say, or global capitalism). 
Any concept that we form of the totality is going to be a reification, of course, something theorized, which is to say linguistically devised or even in some sense made up. But the totality-as-such, as distinct from this or that concept of totality, will persist as an unknowable limit to our efforts. It will be, to revise an old phrase, a structure palpable only in its effects, with the key proviso now being that the only effects that matter are the unpleasant ones: a structure palpable only in its humiliations. The world system is the shark in the water. Again, the Real might name a given social order’s fundamental antagonisms—the conflicts that are so basic to a set of institutions that no-one participating in those institutions can stand outside them. Or the Real might name the ungroundedness of those institutions and of our personae, their tenuous anchoring in free choices and changeable practices. So if you want to write political commentary in the style of Žižek, you really only need to do two things: 1) You scan the social scene that interests you in order to identify some absurd element within it, something that by official lights should not be in the room. Political Lacanianism in practice tends to be one big game of “Which one doesn’t belong?” or “One of these things is not like the others.” And 2) You figure out how this incongruity is an index of the Real in any of those varied senses: trauma, the drive, the totality, antagonism, or the void. You describe, in other words, how the Unspeakable is introducing anomalies and distortions into a sphere otherwise governed by speech.

So that’s one version of Žižek’s Gothic method. There are thus three distinct claims we’ll need to be able to tell apart. We can say, first, that Žižek likes to read Gothic fiction and also the eerier reaches of science fiction—and that’s true, though he precisely does not read them the way a literary critic would. It has always been one of the more idiosyncratic features of Žižek’s thought that he is willing to proclaim Pet Sematary a vehicle of genuine analytic insight or to see in horror stories more broadly a spontaneous and vernacular Lacanianism, in much the same way that old-fashioned moral philosophers used to think of Christianity as Kantianism for people without PhDs. To this observation we can easily add a second: that Žižek himself often reads as though he were writing speculative fiction, as in: You are not an upstanding member of society who dreams on occasion that he is a murderer, you are a murderer who dreams every night that he is an upstanding member of society—though keep reading in Žižek and you’ll also find: torture chambers, rape, “strange vibrating noises.” And yet if we’re taking Žižek at his word, then the point is not just to read Gothic novels, nor yet to write them. We must cultivate in ourselves, rather, a determination to read pretty much everything as Gothic. Once we’ve concluded that horror fiction offers a more accurate way of describing the world than do realist novels—that it is the better realism, a literature of the Real—then the only way to defend this insight will be to read the very world as horror show. It will no longer be enough to read Lovecraft and Shirley Jackson. The Gothic hops the border and becomes a hermeneutics rather than a genre. Anything—any poem, painting, person, or polity—will, if snuck up upon from the right angle, disclose to you its bony grimace.

This approach should help us further specify Žižek’s place on the philosophical scene. It is often complained that Hegelian thinkers—Adorno, Wallerstein, Jameson—subdue their interlocutors not by proving their arguments false but precisely by agreeing with them. Going up against a Hegelian, you find yourself less refuted than outflanked—absorbed, reduced, assigned some cramped nook in the dialectical apparatus. That’s a point we can now extend to Žižek, in whose writing the Gothic gets weaponized in precisely this Hegelian way. Horror becomes a device, a move, a way of transforming other people’s arguments. When Žižek engages in polemic with some peer, his usual tack is not to controvert his adversary’s arguments, but rather to improvise an eerie riff upon them, to re-state his opponent’s claims in their most unsettling register. You can call this the dialectic, but you might also call it pestilence. Žižek infects his rivals with Lacan and forces them to speak macabre versions of their core positions: undead Heidegger, undead Badiou, undead Judith Butler.

Three of these fiends we will want to single out:

Žižek summons zombie Deleuze. It is often remarked that critical theory in the new century has taken a vitalist turn. The trials-by-epistemology that were the day-to-day business of the long post-structuralist generation have given way to the endless policing of ontologies. Graduate students accuse each other of possessing the wrong cosmology or of performing their obeisance to the object with insufficient fervor. Deleuze and Guattari can be corrected only by those proposing counter-ontologies. Claims get to be right because Bergson made them. You are scared to admit that you wrote your whole first book without having read Spinoza. Nietzsche is still quotable, but only where he is most ebullient and alpine. You ask which description of the stars, if recited consequently to its last rhyme, will reform the banking system and unmelt the ice caps. Klassenkampf seems less interesting than theomachia. What is less often remarked is that vitalism has only returned to the fore by consenting to a major modification—a fundamental change in its program and priorities—only, that is, by agreeing not to grant precedence to those things we used to call “living.” The achievement of the various neo-vitalisms has been to extend the idiom of the old Lebensphilosophie—its egalitarian cosmos of widely shared powers, its emphasis on mutation and metamorphosis—to entire categories of object that vitalists used to think of themselves as opposing: the inanimate, the inorganic, and the dead.
It is in this sense misleading to call Deleuze a Spinozist without immediately noting that his Spinoza has been routed through La Mettrie and the various Industrial Revolutions and the Futurists, which makes of schizoanalysis less a vitalism than a profound updating of the same, such that it no longer has to exclude the machine—a techno-vitalism, then, for which engines are the better organisms, and which takes as its unnamed material prompts epochal innovations in the history of capitalism itself: the emergence of the late twentieth century’s animate industrialisms, flexible manufacturing and biotech, production producing and production produced.

So that’s one vitalism of the unliving, but there are others. Jane Bennett claims for her ontology the authority of her great lebensphilosophische forebears—Spinoza, Bergson, Hans Driesch, Bakhtin—and yet calls matter “vibrant” rather than “vital,” because she wants her list of things living and lifelike to include national electricity grids and the litter thrown from the windows of passing cars. Bennett is trying to imagine a United States that has become in a few key respects more like Japan—an America in which Midwesterners, possessed by an “irrational love of matter,” hold funeral services for their broken DVD players and pay priests to bounce adzuki beans from off the hoods of newly purchased trucks. The phrase “vibrant matter” might hearken back to William Blake’s infinite-in-everything, but Bennett uses it mostly to refer to the consumables and disposables of advanced capitalist societies: to enchanted rubbish dumps and copper tubing and other such late-industrial yōkai. The task, again, is to figure out how to be a vitalist on a planet without nature—a pantheist of the anaerobic or Spinoza for the Anthropocene. Bennett herself says that what interests her is above all the “variability” and “creativity” of “inorganic matter.”  In that context, the achievement of the adjective “vibrant” is to recall the word “vital” without entailing it: not alive, merely pulsating; not vitalist, but vitalish.

What we can now say about Žižek is that he offers his own, rather different way of dialectically revising the older vitalisms. His point is that most people already happen upon the cosmic life force—in their everyday lives and without special philosophical tutoring—and that such encounters are, on balance, terrifying. The élan vital is not your iPod’s morning workout mix; nor is it some metaphysical energy drink. It is the demiurge that makes of you “a link in the chain you serve against your will”—the formulation is Freud’s—“a mere appendage of your germ-plasm,” not life’s theorist and apostle, but its stooge and discardable instrument. Psychoanalysis is the school that takes as its starting point the repugnance that we properly feel towards life—a vitalism still, but one with all the judgments reversed, a necrovitalism, in which bios takes on the attributes that common sense more typically associates with death, its nullity, above all, and its blind stupidity. One of Žižek’s favorite ways of making this point is by reminding you of how you felt when you first saw Ridley Scott’s Alien—movie of cave-wombs and booby-trapped eggs, of male pregnancies and forced blow jobs, which ends when the undressed woman finally lures from his hidey-hole the giant penis monster, the adult alien with the taut, glossy head of an erection. But we might also think of the matter this way: In the early 1950s, Wilhelm Reich—the magus of western Maine, Paracelsus in a lab coat, the ex-Freudian who thought he could capture the cosmic life force in shoeboxes and telephone booths—organized something he called the Oranur Experiment. Reich had by that point begun styling himself the counter-Einstein, foil and counterweight to the Nobel Laureate of Dead Cities, dedicated to building the nuclear age’s new and sorely needed weapons of life. 
He had to this end procured a single needle of radium; the idea was that he would introduce this shaving of Nagasaki into a room supercharged with élan vital so that he could observe the cosmic forces of death and the cosmic forces of life fighting it out under laboratory conditions. It did not go as he’d planned. Reich panicked when he discovered, not just that the radium was in some sense stronger, but that the radioactivity had contaminated and rendered malevolent the compound’s orgone. The cosmic life force hadn’t been obliterated; it had been turned, made sinister, recruited over to do the work of death. Žižek, we might say, is the theorist of this toxic vitality; the one who thinks that orgone was bad to begin with; the philosopher of rampant and metastatic life.


Žižek summons zombie Levinas. It might be more precise to say that Žižek summons the zombie Other or the Neighbor-Wight. Either way, poring over Žižek’s response to Levinas is your best chance at learning how to replicate his achievement—how, that is, to turn philosophers you dislike into your reanimated thralls. Derrida delivers the funeral oration; Žižek returns with a shovel later that night. The spell you will read from the Lacanian grimoire has three parts:

-First, you seek out the moment in your rival’s system where his thinking is already at its creepiest. Chapbook summaries of Levinas often make him sound like a fairly conventional European moral philosopher, as though he hadn’t done anything more than cut a new path, doddering and roundabout, back to the old Kantian positions about the dignity and autonomy of other people. It is easy, I mean, to make Levinas sound inoffensive and dutiful. The wise man’s hand silently cups the chin of a stranger. It will be important to insist, then, that ethics-as-first-philosophy harbors its proper share of sublimity or even of something akin to dread. We know that Levinas’s first step was to adjust Husserl’s doctrine of intentionality: So consciousness is always consciousness of something—sure enough. And all thinking is directed outward; it cannot not refer—granted. But intention, even as it fans away from me in a wide, curving band, will meet obstacles or opacities, and it is by fixing our attention on these stains in the phenomenological field that Levinas develops what he himself calls “a philosophy of the enigma”—a kind of anti-phenomenology in which thinking begins anew by giving priority to what does not appear and in which it falls to me to sustain and shepherd this strangeness I have just discovered in the Not-I. This is a program whose uncanny and un-Kantian qualities we could restore only if we agreed to set aside Levinas’s own undarnably worn-out language—alterity, the Other, otherness—and to put “the Alien” in its stead: an ethics of the Alien would ask us to look upon the face of the Alien so that we can better understand the tasks of being-for-the-Alien. For current purposes, what we’ll want to keep in mind is that Žižek has no beef with this Levinas. He agrees after a fashion with the doctrines of alterity and can easily translate their claims about the obscurity of other people into Lacanian observations about the modes of appearance of the Real.
But again, it’s not the argument that matters; it’s the method: Žižek has to find at least one point of agreement with Levinas, because that’s how the zombie hex gains access to its mark.

-So that’s the first step. You make a point of agreeing with your rival by finding that one argument of his that is already pretty occult. The next step, then, is to show how he nonetheless runs away from the creepiness he has conjured. Žižek’s complaint against Levinas is easily summarized. He thinks that the ethics of alterity, far from demanding difficult encounters with other people, encourages me to keep my relationship to others within strict bounds—to delimit, attenuate, and finally dull such encounters. Totality and Infinity is the handbook for stage-managing a counterfeit otherness, as a moment’s reflection on two of the words we most associate with Levinas should suffice to show. Who, after all, are the people who routinely allow themselves to be “caressed”? A Levinasian ethics takes as its paradigmatic others people with cheeks at the ready: lovers and children and hospice patients. The attitude it means to cultivate in us is accordingly amorous or avuncular or perhaps candy-striping. The moral person is the one in a position to dandle and cosset. The language of “the Neighbor,” meanwhile, forfeits even the slight provocations of the word “Other,” making strangers proximate again, returning outlanders to their position of adjacency. Neighbors aren’t the ones who draw out of you your hitherto unsuspected capacities for righteousness. They are the ones-to-whom-you-loan-cordless-drills, the ones-who-could-afford-to-buy-on-your-block. Psychoanalysis, then, is where Žižek would have us look for a philosophical program that does not housebreak the Other in this way, though the phenomenologists, if they are to follow him there, will have to agree to reinstate the entire, outmoded metaphysics of appearance v. essence, since those who go into analysis are consenting to set aside public facades and facile self-perceptions and are learning instead to speak the secret language of hidden things. 
The more-than-Levinasian task, at any rate, would be to find a way to live alongside that person, the person whose unspoken desires you would doubtless find ugly. Other people would terrify you if you knew them well—that is the most remorseless, Freudian plain speech—and it is in the dying light of that claim that Levinas’s thinking looks suspiciously like an excuse not to know them. A psychoanalytically robust account of Otherness would therefore have to reintroduce you to the people next door, that “inhuman” family with whom you now share a hedge, where by inhuman Žižek means “irrational, radically evil, capricious, revolting, disgusting.” Can you hew to the ethics of neighborliness even when a vampire buys the bungalow across the street? Are you willing to caress not just an unfamiliar face but a moldering one? Methodologically, the point we will not want to miss is that Levinas now stands accused on his own terms of having replaced the Alien with the Other, of having persuaded you to stuff your ears against your neighbors’ shenanigans, of having evinced once again what he himself once called the “horror of the Other which remains Other.” We put up with other people as long as they put up a face. And here, finally, is the portable technique, which you can bring to bear against any theorist and not just against the radical ethicists: When you read a rival philosopher, you will want to take whatever creepy argument he already proposes and find a way to make it a whole lot creepier. That will be your chance to conduct a kind of body swap, to replace the philosopher with a more consequently unpleasant version of himself.

-So that’s the second step. Step three is: You welcome your rival into the army of the dead, making sure that he realizes that he is just one monster among many such. Here’s where the hoodoo gets tricky. A Levinasian ethics presents itself to us as intimate, a thought nestled between two terms, Me and the Other, where the latter means “the neighbor and his mug at strokable distance.” And yet the term “Other” is incapable of this kind of grazing approach; it is barred in its very constitution from ever rubbing noses with us. For the word indicates no particular second person but only the anonymous and shrouded Autrui. If I speak only of “the Other,” with no further specification, I could be referring to anyone but me. The concept produces no further criterion and calls no-one by name. Behind its sham-individuation there thus lurks the mathematical sublimity of the crowd, impersonal and planet-filling. At this point you have two options: You can decide that the ethics of alterity is ineffectual because self-consuming in this fashion, claiming to preserve the irreducible strangeness of the other while in fact washing such peculiarities away in a bath of equal and undifferentiated otherness. The philosophical system’s organizing term is, as ever, what betrays it. Alternately, you can decide that a Levinasian ethics can survive only by generalizing itself, by accepting its own faceless abstraction as a prompt and so by agreeing to become categorical. If we follow this second route, we will have to say without blushing that Levinas’s thought as it has come down to us was already characterized by a pressure, irregularly heeded, to all-but-universalize. The term Other directs my moral concern recklessly in all directions, sponsoring a universalism to which I am the only exception—a humanism minus one.

But then it should be easy to add the subtracted one back in. It should be easy, I mean, to get the Me to take its place among those many indistinct others and thereby to make Levinas’s universalism complete. It will be enough, in fact, to call to mind the basic dialectical idea that we do not cognize objects singly, but only relationally or in constellations. This means, among many other things, that the Me and the Other strictly imply one another. If my action in the world didn’t reach a certain limit, if I didn’t routinely knock into other objects and persons, if these latter didn’t reliably humble me, then I wouldn’t even have a sense of myself as a Me, which is to say as something that does not, in fact, coincide with the world. But then the Other and the Me are not fixed positions; they are conceptually unstable and even in some sense interchangeable. I can obviously switch places with the other; I am other to the Other, who, in turn, is a Me in her own right. As soon as I concede this, I have discovered my own alien-ness. Second, and as an intensification of these Hegelian reciprocity games, we can collapse the two terms into a single formation: not the Me and the Other, but the Other-Me or the self as foreign element. This can be managed a few different ways. My experience of becoming—of my own changeability—renders me other to myself, reconstructing the ego as watercourse or Heraclitean series. I do not shake the Other’s hand as though I didn’t know what it was like to be a stranger. But we can also travel a more direct psychoanalytic path to the same insight, simply by noting that I am not transparent to myself, not in charge of my own person, that my own desires and motives are basically incomprehensible to me—that, indeed, I am just another dimness or demonic riddle.

And with that, the terms generated by Levinas’s philosophy mutate beyond recognition. This, in case you missed it, is the culminating step in Žižek’s method: If when reading philosopher X, you hold fast to what is most Gothic in X’s thinking—if you generalize its monstrosities and don’t exempt yourself from them, if you promote Unwesen to the position of Wesen—then other core features of X’s system will break and buckle and shift, until it no longer really looks any more like X’s thinking. To stay with Levinas: The ethics of alterity rotates around a single inviolable prohibition—that I not conclude that all egos are more or less the same; that I not propose a theory of subjectivity that would hold equally for all people; that I not stipulate as the precondition of my welcoming another person that he or she be like me. But if the terms “self” and “other” cannot be maintained in their separateness—and they can’t—then this injunction will be lifted, and Žižek can improvise in its stead a paradoxical argument in which alterity becomes the vehicle of our similarity, in which I realize I am like others in their very otherness, in which the Hegelian homecoming comes to pass after all, but on the terrain of alienation and not of the self, in which what establishes our identity is not some human substance, but our inevitable distance from such substance—which distance, we, however, share. There thus arises the possibility that I will identify with the Alien, not in his humanity, but in his very monstrosity, as long as I have come to the conclusion first that the world’s most obviously damaged people only make public the inhumanity that is our common portion and my own clandestine ferment. And out of such acts of identification—and not of pity or tolerance or aid—Žižek would build, in the place of Levinas’s philosophy à deux, a global alien host or legion of the damned. Radicalize what is creepiest in your rival, in other words, and then make it universal. 
This brings us to Episode Number Three, in medias res, as they say: already in progress…


Žižek summons the zombie multitude. I want to point out two more instances of this horror-movie universalism—two more cases, that is, in which Žižek takes one of radical thought’s settled positions and contagiously expands its orbit. What you’ll want to pay attention to is how each position leads to the same conceptual destination, which is the undead horde—Levinas has just led to the horde; and now Rancière will lead to the horde, and then Agamben will, too, like characters in a Lucio Fulci movie getting picked off at twenty-minute intervals. The horde: We’ll want to consider the possibility now that the cadaver-thronged parking lot is a post-political society’s last remaining image of the unmediated collectivity, the term that, having driven from consciousness the gatherings and aggregates posited by classical political philosophy—the assembly, the demos, the populus, the revolutionary crowd—must now be asked to absorb into itself the indispensable political energies we used to expect from these latter. Can we get the walking dead to mill about the barricades?—that is another of Žižek’s driving questions. Will they know to throw rocks?

One path to the horde begins with Rancière’s idea that politics proper belongs to “the part that has no part”—which is the philosopher’s oxymoronic term for the disenfranchised, those who are important to the system’s functioning but who don’t in the usual sense count, who don’t get to take part and who have no party. Rancière’s claim—and sometimes Žižek’s, too—is that only the agitations of such people (refugees, guest workers, the undocumented) so much as deserve to be called “politics,” because it is only at a system’s roiled margins that basic questions about a polity can be raised, questions, that is, about its scope and constitution. Anything that happens in the ordinary course of government takes the state’s functioning for granted and so isn’t really about the polis—is not, in that sense, “political.” On the face of it, this is a terrible idea. Rancière’s position is anti-constitutional and anti-utopian and indeed committed to failure. My actions only get to count as political provided the state does not recognize me, and as soon as I succeed in convincing someone in power to look me in the eye or indeed to act on my behalf, I cede my claim to be a political actor and become just another pawn of policy makers and the police. There is, in this sense, no such thing as getting the state right; every political breakthrough is actually a setback. To frame your program in terms of “the part that has no part” is to show contempt for those parts-with-parts, absolutely any parts, even though some of these portions will be quite meager. This has made Rancière ill-equipped to talk about what we might call the part that has little part: the native-born working classes, the rural poor, the jobless, the ineffectually enfranchised.

So can Rancière’s thinking be Gothically universalized? It is one of the more attractive features of Žižek’s thinking that he corrects Rancière at just this point and in just this fashion, insisting on the instability of the conceptual pair around which the politics of parts usually turns, inclusion-exclusion, as in: Politics is only ever out there; here there is only administration. That last sentence turns out to be untenable, for even the part that has no part is not simply excluded. It is one of radical thought’s lazier habits to treat the word “margins” as though it meant the outside when in fact it means the space just inside the door, the page’s extremity and not the empty air that surrounds the lifted book. More: Even the word “exclusion” never refers to simple separation or distance. You have to have had some contact with a system for me to be able to say that you are excluded from it; the very concept depends on some thread or temporary node of connection. The gauchos of the Uruguayan plains may not be represented in the Danish Folketing, but they aren’t excluded from it either. “Exclusion” contains the idea of “inclusion” within itself and is not the latter’s simple opposite. Genuine apartness would require a different concept. This observation will allow Žižek to fold the old proletariat back into the category of the part that has no part. Working people and refugees are actually in similar positions of inclusion/exclusion: the grinding, mutilating condition of being swept up in a system whose inner workings nonetheless seem closed off and impossible to fathom.

One way to think about what Žižek is doing here would be to say that he is trying, within the terms dictated by contemporary European philosophy, to get us to shake off our gauchiste habit, picked up over the social democrat decades, of seeing European workers as basically First World and coddled and deleteriously white. He wants to help us retrieve “a more radical notion of the proletarian”—where more radical means not “more militant,” at least not in the first instance, but merely “more abject.” If I say now that the doctrine of we-all-are-refugees might hold the key to the emergence of a new proletariat, you might object, mildly, that this new proletariat sounds a lot like the old one—the really old one, the one that didn’t yet drive oversized Buicks, the working class stump-armed and black-lunged and blind. There is something new, however, about Žižek’s version of the wretched ones, which is that he’s pretty sure that they include us, the people who actually read his books, the people who know who Žižek is: the second-year university students, the middle-aged art historians, the underemployed web designers, the gap-year backpackers. “Today, we are all potentially homo sacer”—and then that’s a second, unusually clear instance of his Gothic universalism right there, now keyed to Agamben, who, once whammied, will produce an image of the concentration camp victim as Everyman or bare life as Ordinary Joe. To be a new-model proletarian is simply to know that your life, if not yet ghastly, is nonetheless exposed and insecure—wholly vincible. In place of Hardt and Negri’s squatters and street-partiers and Glo-Stick communards, Žižek means to fill the streets with a multitude less than human. It might take a minute for this idea to sink in. The new proletariat will be built out of homines sacri.  
Žižek’s thrilling and preposterous idea is that having failed to organize fast-food chains or big-box retail, we might yet organize ourselves on the basis of la vita nuda—that the Muselmänner might form a union and yet remain Muselmänner, that those who have lost even the instincts of self-preservation, who have stopped swatting the flies that lay eggs in their open sores, might be made to see the point of collective bargaining.

It has become almost obligatory over the past decade to argue that fear lives on the Right, that terror is a means of social control, that one could defeat Al Qaeda and the Patriot Act at once if only one would resolve to be unafraid, if only we could make ourselves okay with not being safe. It is against the Left machismo of those arguments, so many rehashings of the old Spinozist idea that “fear makes us womanish,” that Žižek’s accomplishment over the last decade can be measured, as he has set about to reclaim terror as one possible platform for emancipation and revolutionary equality, to help us imagine a communism for the screamers and the tearful and the scared. Not that Žižek is offering to make you any less frightened. He will not give you refuge or grab your hand or quietly sing nonsense lyrics into your ear. A politics of militant fear does not begin by offering solace. Quite the contrary: Our task will be to communicate fear and to amplify it. You have a few different options as to how you might go about this. You can issue reasoned admonitions, explain to us soberly about the threats and the thresholds and the no-going-back: two degrees Celsius, go ahead tell us again. Or you can make us feel your own foreboding, as also the grief that is fear’s come-true aftermath: Show us the photographs of Katrina graffiti—“Destroy this memory,” one picture records, in white paint on a flooded brick house, in good, teacherly cursive, no less. But it has been left to Žižek to propose a radically darkened politics, a politics that, no longer content to protest the ongoing catastrophe, has taken the disaster into itself and begun to root for ruin. We are the ones they were supposed to be afraid of. In George Romero’s Land of the Dead, the zombies are for once oddly purposeful, these animate corpses with faces torn into tragic masks, whose first, returning memories are of what it was like once to work and when not working march. You are probably already hurting. 
A just politics is going to hurt a whole lot worse.



Three Essays on Žižek


1. Žižek’s Argument

I’d like to put two questions to Slavoj Žižek, though the second question might turn out to be the first one wearing different-colored leotards. It would help, I think, if I explained first what I take to be Žižek’s core argument—the problem and puzzle driving his theoretical overproduction—both so that he can tell me if I’m wrong and because readers of Žižek are sorely in need of a map. It’s not that he never says what he is after; the problem is, rather, that the centrality of this one issue tends in his writing to get lost amidst the riffs and the endlessly re-explained Lacanianisms and the compulsive recording of everything he’s watched this year on hotel room televisions. It is possible to read an awful lot of Žižek and still not realize that he has a point. Indeed, one sometimes gets the feeling that the only people who understand him less well than his opponents are his enthusiasts.

So here, for easy reference, is his animating claim: that every political formation, in addition to generating the law, generates a particular more or less expected way of violating the law. Any set of prohibitions comes with its own accustomed transgressions, a particular way in which Law-in-the-abstract allows itself to be broken. Different laws produce different lawbreakers or different modes of rebellion. And what keeps us attached to a given political order—what makes us loyal to it—is not the law, but the transgression. We like living in a particular society because of the illicit pleasures that it affords us—because, that is, it grants us a particular set of turn-ons, and it does so not by openly trading in these latter, but precisely by seeming to disallow them. Following the law is one path to subservience; breaking it is a second. Transgression, in fact, produces in us the more powerful political obligation; it is the device by which a governing order takes hold of us for good. And Žižek, by making this argument, is merely tracking back to Freudian ground zero, to the idea that all of our relationships carry a libidinal charge or that desire and satisfaction are permanent features of our psychic lives—ineliminable, not to be overcome. The idea, further, is that law by itself couldn’t possibly work; the law alone can never be lawlike in its effects, for if some authority genuinely denied us all pleasure, we would take measures to abolish it. But authority doesn’t deny us pleasures; it creates new ones and can become, indeed, just another target for our ardor.

Enjoyment, to bottom-line it, is not the heroic alternative to discipline and convention. It is discipline’s sidekick and in some sense the authentically nomian term—the secret bearer of law’s regularities and compulsions. The libido is the vehicle of our subjection and thus the answer to why most of us, even those of us in the habit of striking defiant poses, don’t seek fundamental political changes or seek them only half-heartedly: Change would disrupt whatever erotic bargain we’ve quietly worked out with the prevailing order. Žižek’s way of putting all this is to say that every political system—every code of law or tablet of rules—comes with an “obscene supplement”; he also calls it “the inherent transgression.” And his single greatest talent as an intellectual is to survey some corner of the social scene and find the smudge of obscenity that holds it together, to smoke out its anchoring enjoyment, to help you see how people are getting off on things that they don’t seem to be getting off on.

That’s a pretty Calvinist skill as skills go. And, indeed, it is the asceticism of Žižek’s position, so unlike the prevailing tenor of radical philosophy, that we will want to underscore. In 1934, Wilhelm Reich, having recently fled to Denmark from Berlin, wrote an essay trying to make sense of the epochal victory in Germany of the leather-jacket Right. Why had the German Left failed to stand up to the fascists? How had they ceded so much ground? Reich began that essay by saying that Marxists were going to have to spend less time thinking about structure and system and historical process and more time thinking about “the subjective factor in history”—less time improvising mini-lectures on monopoly capitalism and the pseudo-democratic ruses of the bourgeois state and more time talking to ordinary people about how they feel and what they might do to feel better. The most remarkable section of the essay comes when Reich begins quoting Joseph Goebbels, not in order to document yet another National Socialist inanity, but in order to make clear that the fascists were onto something. Their success meant, by definition, that they had understood something that the Left had failed to grasp: “National Socialism, [Goebbels] said, was not a puritan movement; the people should not be robbed of their joie de vivre; the aim was to achieve more life affirmation and less hypocrisy, more morality and fewer moralistic attitudes.” This is what socialists should have been saying, but perversely weren’t. Shame sits ever on our lips. Reich perceived a basic contradiction in the political constellation of the early 1930s: The fascists successfully appealed to people at the level of pleasure and desire, even while implementing punishment. The socialists, meanwhile, had big plans for emancipating their fellows in several different senses at once, and yet comported themselves according to the petty morality of the well-cushioned parlor.
Fascism, in short, broke through in Germany because it was a lot more fun—it seemed to run on expanded erotic energies—whereas the Left, as ever, preferred to educate its potential comrades in the gross national product of India while asking them pointedly whether they fully understood that children made their shoes. Marxists, Reich concluded, needed to buy some guitars; they would have to write some better tunes.

It is this Reichian program, moreover, this determination to out-merry-make the Right, that Fredric Jameson has been trying to keep alive when arguing that Marxism must continue to strut down “the path of the subject,” that it must learn better ways to stimulate the “desire called Marx” or the “desire called utopia.” “If ideology … is a vision of the future that grips the masses, we have to admit that … no Marxist or Socialist Party anywhere has the slightest conception of what socialism or communism as a social system ought to be or can be expected to look like.” It’s just that Jameson, who was born eight months before Elvis Presley, came of age alongside the rock’n’roll Left that Reich seemed in many respects to have blueprinted, which means that his repeating of Reich’s complaint in the 1970s and ‘80s has to be read as an implicit reckoning with the counterculture’s limitations, an admission that even the newly larkish Left—the Left naked and capering—had been no match for General Electric and the Nixon administration. It’s not that Reich was wrong, and yet the socialist libido was still going to need something more than a Bo Diddley beat — that’s one version of Jameson.

And of course it’s not just Jameson who has been making this case. This is one of the things that makes Žižek so important—that he hasn’t been copycatting the inherited Reichian line, and so offers an alternative to Jameson and Deleuze and the many barrels’ worth of Reich and Marcuse that really existing queer theory has smuggled past its Foucauldian sentries, an alternative, that is, to the no-longer-new Left’s program for the endless expansion and intensification of sexual life. Žižek is a Freudian, to be sure, and a man of the Left, but he is not a Left Freudian, if we take that term still to refer to one who mistakes his testicles for the working class and who regards the Id as a buddy and a pet and the smothered wellspring of his creativity. So Žižek is not like Jameson and Deleuze, but this observation is itself easily misunderstood. For his version of psychoanalysis does not want you to give up on your unorthodox desires—or at least not on all of them. Quite the contrary. Žižek’s sense is that we almost all engage in unusual behavior—sexual or at least eroticized behavior—to some degree. The problem is that nearly all of that behavior takes place with reference back to authority or to the law. We develop most of our sexual quirks as a way of taking a position with regard to the Master; we carry some notion of authority around in our heads, and the ways in which we like to get off are almost always predicated on what we believe to be true about the people in charge. So Žižek does indeed reject as facile the usual anti-authoritarian thrust of radical psychoanalysis, convinced as it is that we can forthrightly strip down and hump our way to emancipation, but he does so only to reinstate that anti-authoritarianism in another, more difficult place.
Psychoanalysis in this mode doesn’t care what you get up to—it really doesn’t care how you take your pleasures—provided that these make no reference to the Master, provided, that is, that they aren’t even a rebellion against him. And to that extent there is one sense in which Žižek’s Lacanian-Hegelian system, otherwise committed to the ideas of negation and the lack, is fully invested in establishing a positivity or simple fact. Your task is to figure out the peculiar way you happen to desire when authority is entirely removed from the picture, when, that is, you no longer take the Master to be peeping from behind the curtains.

This, then, is the reason to go into analysis: The analyst has to be on the lookout for the one thing you desire—or the one way you desire, the one way you organize your satisfaction—that is not relational, not a position over and against bosses and fathers. Such is the knack that any good analyst has to develop: the ability to discriminate between Master-directed kink and kink that is truly your own. The bargain that analysis will make with you is that any enjoyment that survives the sundering of your psyche from authority is yours to keep. It’s just that most of your libidinal habits are not going to survive that sundering—or will be transformed by it into new ones. Žižek, following Lacan, calls any enjoyment thus liberated a sinthome, which, in the original French, isn’t anything more than an arch misspelling of and murky pun upon the word symptom. The Lacanian point is that the enjoyment that you take home with you at the end of a successful course of psychoanalysis is likely to look like and sound like a symptom—fevered, morbid, a “deviation from normal functioning,” the clinicians like to say. But it won’t actually be a symptom, or it will be a symptom with a difference, a symptom that is not a symptom. Analysis, in other words, aims not to cure you or return you to normal functioning, but to help you find your way to a happier disorder. Žižek’s hunch is that most people will leave analysis freakier than when they went into it.

So can we tell the difference between the raunch that unshackles us and the raunch that fixes us in place? This is one of the more pungent questions that a political psychoanalysis prompts us to ask. For Wilhelm Reich was, of course, in one sense absolutely correct. It is not hard to agree that fascism succeeded in large part by devising new gratifications for its adherents. And perhaps it was only predictable that the Western Left would decide to take Reich’s advice and compete on that ground and help build consumer society’s all-singing-all-dancing-24-hour gaudy show. But psychoanalysis allows us to take stock of where we rock’n’rollers remain least at ease—or, indeed, to describe with some precision the new forms of anxiety that have come to the fore in an age of sex-without-taboos. Žižek’s argument is, in this respect, best understood as proposing a new way to periodize recent history—a new way, that is, of identifying the novelty of the present. It bears repeating: If Žižek is right, then in the political organization of enjoyment, obscenity has always played some kind of role. Even public life organized around strong authority figures used to summon the obscene supplement in its support. But we’ll want to at least consider the possibility that in our version of consumer capitalism, the obscene supplement has become primary and so largely supplanted what it had once been asked merely to buoy. The transgression has moved into the position of the master and so instituted a kind of authoritative obscenity. This marks a comprehensive change in what we might call the regime of enjoyment. Again: What keeps you attached to a society is the forms of deviant pleasure that it winks at.
In nearly every social order that has ever existed, there has been law: state law or generally recognized prohibitions, and some people get off on breaking the law, while other people get off on the law itself, get off on enforcing it, get off on playing the cop or exasperated schoolmarm. What sets the present apart is that the prohibitions have to some considerable extent faded, which has produced a system of transgression without law or perhaps even transgression as law—what Žižek calls “the world of ordained transgression”—a society of compulsory pleasure in which you are perpetually enjoined to blow your load. You can think of this, if you like, as the flip side to another of Reich’s signature arguments. Sex-pol claimed that if you raised children in a sexually liberated way, refusing to drum inhibition into them, then they would not be willing later in life to go along with authority, because they would not be in the habit of giving up what was important to their happiness. They would be able to resist the call to renunciation, and if authority threatened their enjoyment directly, they would mutiny. Libidinally unpoliced children would become anti-authoritarian adults. The simple corollary of this argument is a catastrophe that Reich never even paused to consider—the plausibility of which advanced capitalism endlessly demonstrates—which is that if authority doesn’t threaten such people’s enjoyment, they will never rebel. If the social order gives people abundant opportunities to get off, it can abuse and exploit them in every other way.

Anyone trying to make sense of Žižek, then, will want to start tracking the ways in which ascetic and anti-ascetic arguments are knotted together in his work. He routinely speaks of “obscene enjoyment” or sometimes just of “obscenity,” and this in tones that we typically associate with anti-pornography campaigners. It’s just that what this version of psychoanalysis considers obscene is not sex, but the conjunction of sex and authority. An obscene pleasure is not one in which I gnash a ball gag or show too much areola, but one in which I imagine, however inarticulately, that I am serving the Master or emulating him or, indeed, defying him. To practice an anti-obscenity would therefore mean to devise a sexuality rigorously beyond the law. Whether or not it might also mean to devise a law beyond sexuality—a law unstained by pleasure—is one of the great open questions in Žižek’s work. You can, at any rate, accentuate this argument’s anti-asceticism, if you care to, since one of the conundrums most driving Žižek’s work is whether or not the sinthome can be turned into a politics. There is no question that Lacanianism can underwrite political positions or attitudes; it can underwrite a disconcertingly wide variety of them, in fact. The question is, rather, whether it can also produce a genuinely political practice. Could ordinary people learn en masse how to sever their desire from authority? Could we agree collectively not to fuck the police?—because if we can’t, then Lacanianism would seem condemned to remain a therapy and not a politics, to be undertaken in near isolation by the unhappy and the kithless, and producing little more than a libidinal aristocracy, the few upon whom liberated enjoyment has been bestowed, the Jedi of the sinthome, an order increasingly restricted to France and Argentina and the university neighborhoods of Buffalo, NY. Can the sinthome be mass-produced?—that’s the properly hedonist version of Žižek’s project.

But then you can also, if you wish, lift out of Žižek’s arguments their fully anti-hedonist strains. Because when he tries to imagine this Lacanian politics, the models he turns to are notably austere: Kantianism, Christianity, Leninism. He says admiringly that poor teenagers with almost nothing to their name can still have discipline, an almost literal self-possession, a martial bearing and a karate chop. That most of us have met no such teenagers—that fifteen-year-olds tend, indeed, to be bywords not for discipline but for its opposite—suggests only how committed Žižek is to a certain fantasy of restraint and composure and self-command. One easy way to summarize Žižek, then, is to note that he tends to make abstemious proposals to libertine prompts. Liberated desire mutates inchwise into liberation from desire. It is easy for readers to find themselves wrong-footed by this. Chances are that you were first drawn to Žižek for one of two reasons: Maybe he was exactly what you always dreamed an Eastern European intellectual would be—manic, vulgar, flocculent; like a drunken peasant who just happened to be a great philosopher; not merely a Lacanian, but a gypsy-punk Lacanian. Or maybe it was enough that you found him funny, the one critical theorist whose mode of argumentation reliably recalls stand-up comedy, a programmatic tastelessness best watched on YouTube in six-minute bursts. Žižek, of course, doesn’t just retell a lot of inherited anekdoty; his most famous observations themselves have the structure of bits: Have you ever noticed that different countries have different toilets? But then there is much in his thinking that Slavophiles and comedy nerds are required to overlook: that, for instance, he regularly attacks Eastern European intellectuals and artists for playing up the hard-living, balalaika schtick or for cultivating the impression that they write their books in slivovitz instead of ink.
This, he says, is precisely the indecency on which nationalism thrives, and not only in the Balkans. Fans also fail to notice that Žižek’s first book in English already contained an attack on laughter (and the ideology of a liberated laughter)—an attack that he has never backed away from or even, to my knowledge, qualified. Obscenity might be the enemy, but comedy is its sniggering minion. Adorno used to say that anyone committed to the future would have to learn first to be unhappy in the present—that before we would so much as know to be fed up with our own exploitation, we would have to be “sated with false pleasures.” There is nothing that Žižek distrusts more than a dirty joke, which means you probably like him for the wrong reasons.


The Time Without Happiness



On Viv Soni’s Mourning Happiness (Ithaca, NY: Cornell University Press, 2010)

Now here’s a book that thinks you don’t even know whether you’re happy. The next time someone asks “Are you happy?”—inauspicious question that this is, prelude to a long talk and a probably sleepless night—the only proper answer you can give will be: “I don’t know” or, better, “It’s not for me to say.” The problem, for Vivasvan Soni, is not epistemological; it’s not that Soni has identified some new skeptical barrier that would prevent us from accurately identifying our own emotions: You don’t know whether you’re angry; you don’t know whether you’re contrite; &c. The issue is that happiness correctly understood is not, in fact, an emotion, not even a complex one, nor indeed any kind of inner state, and Mourning Happiness is the story of how we ever came to suppose that it was. But then this book is not just a history; it is an exorbitant labor of philosophical retrieval, rather, proposing that we return to another, all-but-vanished conception of happiness, which Soni anchors in the command, issued by the semi-mythical Athenian statesman Solon, that we “call no man happy until he is dead.” What this would mean is at least threefold: First, it would mean that happiness requires a difficult judgment; happiness will not wash over me, and I will never read it in the faces of others. Even following our current uses of the word, we accept that a person might be puzzled about her own happiness—unsure whether she is happy—whereas we would not expect her to be unsure, across even moderate durations, whether she is, say, in pain. Second, it would mean that this judgment will have to be spoken by others in my absence, since I will be dead, and that my fellows will thereby take responsibility for my life—for its course and its success. 
“Was he happy?” will inevitably, when spoken in grief, mean “Did we do everything we could to make him happy?” Third and most important: It means that you can tell whether someone was happy only if you take into consideration her entire life; to say that a person was happy is to say that, by some criterion unspecified, she lived her life well, and that life (or fortune or fate or God) did not at any point punish her irreparably. What’s more, if a person’s whole life is at issue, then there is no time of which you can say that her happiness did not matter. But then equally there is no month of which you can say that her happiness briefly ran high. Perfect moments do not enter into it. The day your first child is born will be important, of course, but no more than the reputedly routine Tuesday that precedes it. Soni’s most fundamental contention is that “happiness” used to be ordinary language’s one utopian term, broadcasting, even from its perch in everyday speech, the implacable idea that people deserve to lead good lives, and not just sometimes. The question, then, would have to be how we have ended up, by way of the very same word, with such a meagerness, a mere feeling, which you sometimes experience but mostly don’t—joy tempered with content; a seasoned gaiety; a composite pleasure; a reward for having endured long stretches of boredom and nausea; a treat: the weekend.

The good news, for some, will be that Soni is an eighteenth-centuryist, and that Mourning Happiness is not just another happiness book, not the inescapable extension into its chosen field of one of academia’s more fashionable topics. For Soni’s is far and away the most brilliant reformulation of the question of happiness in recent years, the one book with reference to which all the other professors of happiness—the neo-Aristotelians, the SWB psychologists, sundry other late-model eudaimonologists—will have to frame their positions. Nor does Soni simply summon Augustan and Enlightenment proxies in order to ratify conclusions about happiness formulated in other venues. He draws out the specificities of eighteenth-century thinking and makes them indispensable for any serious consideration of the subject. Indeed, if you are in the habit of scanning the eighteenth-century lists, you would have to go back to Habermas’s public sphere book to find another volume of such reach and accomplishment—a book, I mean, that could carry the century to the rest of the academy, as vindication of the entire field, and that might generationally reframe entire subfields of eighteenth-century studies itself.

The bad news, then, at least for those who are protective of their period, is that Soni has come up with some entirely new reasons for hating the 1700s. If you believe, as he does, that modernity is the Time Without Happiness, then you might have thought the idea would be to recommend the eighteenth century to us as remedy—as the last period in which happiness was a central topic of philosophical and political debate. Our misery might be the occasion for a little Georgian revivalism. But that’s not it at all. Thus Soni: “One aim of this book … is to revise the common misperception that the eighteenth century taught us to think a secular, political conception of happiness, but that the nineteenth century turned its back on this utopian promise. I will argue, instead, that the failure to articulate a viable political conception of happiness is to be located in the eighteenth century itself, in the period’s putatively revolutionary and undeniably modern reinterpretation of happiness. … [H]ow is it that the eighteenth century’s very obsession with happiness culminates in the political obsolescence of the idea?” (3, 2)

That’s the question. Here, then, are the broad outlines of an answer: First, eighteenth-century fiction writers devised a new set of techniques for telling stories about unhappy people, stories in which the wretchedness of some lives serves a visible purpose, such that readers could in good conscience set aside, at least for long sections of a novel, their accustomed sense that people deserved to be happy. When we turn back to the question of happiness towards the end of such a narrative, this simple tear in the classical conception—the permission that eighteenth-century fiction has given us to temporarily stop thinking about happiness—will have left the concept permanently transformed: abstracted out of narrative; shorn in large part of its temporal aspect; given thinglike qualities; filled in with this or that arbitrary content; made deferrable, postponable, and so to some considerable degree optional; reclassified as pay or prize—as desert and, indeed, as dessert. This transformation might, in turn, help account for some of the impasses that scholars have already identified in eighteenth-century sentimentalism: the sentimentalists’ narcissistic regard for their own sensitivity; their persistent mistaking of anguish for virtue; their eagerness to weep, not to set suffering right but to relive it, not to abolish the position of the sufferer but to join her in her abjection. Once conceptions of happiness were made over, conceptions of unhappiness had perforce to mutate in their wake. Kant, meanwhile, took it upon himself to abolish happiness from moral philosophy altogether, but, then, upon reconsidering, scrambled to readmit it, though incompletely, as an awkwardly stitched add-on: happiness in the sheerest beyond, the thing that morality teaches you to expect but that you will never, in this lifetime, reach. The American Revolution, finally, presents a case study in the disappearance of happiness. 
Soni is palpably delighted to find a vigorous philosophy of happiness in much early American political thought—not just in the Declaration of Independence, whose “pursuit of happiness” is, as he remarks, actually rather hedged, but in various pre-revolutionary political tracts and insurgent state constitutions—all of it, however, vanished by 1787 and the drafting of the federal Constitution, which mentions happiness not at all. This last sequence demonstrates in fine grain what is Soni’s central and alarming point: that the late eighteenth century did not produce a politics of happiness. Quite the contrary, it was the period in which the politics of happiness was superseded—precisely a transition moment, in which we find political thinkers talking about happiness for just as long as it takes to privatize it. It’s not that the Americans—and their French allies—hadn’t been trying to establish a politics of happiness; some of them had been. But they were on Soni’s account trying to politicize the wrong happiness, a concept already so damaged as to be ideologically unserviceable. At this juncture, Soni’s thinking yields a series of grandly counter-intuitive and even paradoxical formulations: He finds an anti-utopian program at just the moment when we think the modern Left is coming to be. In Samuel Richardson, he discovers an amoral morality: Even the reader consents, after a fashion, to Pamela’s suffering, since the more she suffers, the more honorable she will seem. Studying Kant, he learns that the labor of philosophical ethics is to limit responsibility—to let you know what you are under no obligation to do—to let you, finally, off the hook. And in the end, the revolution’s happiness project turned out to be just another way of ushering felicity out the door. 
To judge by Soni’s recent essays—post-happiness—he seems to be gearing up to give a similar account of “judgment,” which concept Locke, Hutcheson, and others inherited from antiquity, but then mistranslated so badly that subsequent generations decided they could just as well do without it.[i] Such is the cast of Soni’s thought: We should worry less about the eighteenth century’s oversights than its obsessions, which are a ready guide to its mistakes. The period breaks everything it touches.

• • •

If there is another book that Mourning Happiness resembles, then it is surely Alasdair MacIntyre’s After Virtue (1981), itself old enough now to merit a refurbishing, though MacIntyre is so routinely misread that to note the likeness risks condemning Soni to share in that misunderstanding. Casual readers retrieving his name off of half-remembered undergraduate syllabi typically have MacIntyre pegged for a Catholic reactionary—a postmodern de Maistre or late capitalist Donoso Cortés—and one might well wonder what such a person has to do with the thinker in front of us, whose philosophical mentors have been Derrida and Levinas and, indeed, Jameson. Soni quotes no saints. But MacIntyre began his thinking life as a committed socialist, and initially conceived of After Virtue as supplying orthodox Marxism with the moral philosophy that it had never, to its detriment, gotten around to proposing. Anyone re-reading the book at a thirty years’ remove is in a good position to appreciate just how close its language is to the Frankfurt School and French Maoism and even to a certain post-structuralism. MacIntyre’s still-Marxist Left Weberianism turns out to be really easy to spot: his attack on bureaucracy and rule by experts, his sense that a manipulative social order attempts to legitimize itself by socially performing versions of knowledge that it couldn’t possibly possess. After Virtue reads like a parallel to Habermas’s work in the same period, as in: Horkheimer keeps telling us that ‘instrumental reason’ is a problem and has colonized the lifeworld—so … what was that other kind of reason again? It’s just that where Habermas looked to Kant to equip the Left with a non-instrumental rationality, MacIntyre looked instead to Aristotle, whose philosophy of virtue he wanted us to understand as an anti-capitalist ethos of non-alienated activity and human achievement. Well before 1981, MacIntyre and E. P.
Thompson had joined forces in a fight against the English Althusserians, and one easy way to make sense of After Virtue is as a bid to double down on the young and humanist Marxism that they both preferred. Nor did MacIntyre ever really shed his Marxism, not, at any rate, with an apostate’s venom. As late as 1987, he was arguing that Aristotelianism did in fact survive, here and there, residually, into the twentieth century—as evidenced by Mao’s army in 1940s China. And he ended a 1993 essay reflecting on the collapse of state communism by proclaiming that “The point is … first to understand [our defeat] and then to start out all over again.”[ii]

That is the sense in which Soni deserves to be read as MacIntyre’s successor: He has started out all over again. Their affinities are, at any rate, quickly listed. First, Soni takes over MacIntyre’s basic thesis—that the eighteenth century underwent a catastrophic breakdown of moral reasoning—as well as his basic hunch about where we might look for redress: classical antiquity, one might blurt out, though this is too vague—better to say: the everyday practices of the Athenian polis. Second, Soni shares MacIntyre’s allergy to liberalism. There are moments when Mourning Happiness sounds anti-liberal notes in such a recognizably neo-Aristotelian key that the book seems briefly to be channeling Matthew Arnold: “Freedom must be the freedom to live well, or it is worth nothing” (429). Third, the two share that unspoken debt to the Frankfurt School. If you wanted to claim Soni’s work for the Marxist tradition, it would be enough to recall any of a dozen passages in which Adorno invokes the future’s “promesse du bonheur”; when the critical theorist can no longer bring himself to write the word “communism,” he calls it “happiness” instead. So it is that in one of his late lectures Adorno identifies that “extraordinarily damaging dialectic” by which “in the name of freedom … happiness of every kind falls victim to a kind of taboo and is banished from philosophy.”[iii] Soni’s is the most trenchant account we now possess of that particular banishing, which Adorno himself left largely unexpounded. MacIntyre, finally, begins chapters with cheerful avowals of his own crankiness: “If my extreme position is correct….”[iv] Soni, in a similar mood, calls the demands that happiness makes upon us—the demands, that is, that he wants to make upon us in happiness’s name—“excessive” and “unrelenting” and “mad” (410). Soni is by turns a first-rate intellectual historian, a virtuoso philosophical exegete, and a groundbreaking literary critic.
Yet one of the wonders of reading his work is the creeping realization that beneath a prose this calm and expository, and beneath an argumentative style so entirely deliberate—precise to the point of fussiness—there can lurk an idea reckless and militant: single-minded, obsessive. “Every life that we must judge unhappy is potentially a radical indictment of the world that permits this immeasurable tragedy and injustice” (207). Mourning Happiness means finally to foster in us the intransigent “will to make a world of happiness before another person dies whom we must pronounce unhappy” (410).

MacIntyre has of late allowed his work to be described as a “revolutionary Aristotelianism,” and it’s that last phrase that we might now wish to extend to Soni: Well-being storms the barricades. We will have learned something important about his thought, then, when we dig deeper into Mourning Happiness and discover that it actually contains targeted attacks on both revolution and Aristotle. Of the great national revolutions, Soni concludes that they, no less than early English novels, asked contemporaries to stop making a priority of their happiness. Only for a time, they said; inevitably for too long. Puritans and patriots and Jacobins black and white all put happiness to one side, declaring so many states of emergency and finally substituting for a politics of happiness the politics of legitimacy, in which the only thing left mattering is who has derived authority from whom according to what kind of (more or less fictional) political contract. The problem with Aristotle, meanwhile, is that he thinks that if you size up a high-achieving man at forty—a go-getter at the height of his powers—you are licensed to conclude that he is happy, even if in some few cases the future might overturn that judgment. Soni, in other words, is bothered that Aristotle makes happiness something that one can attribute to a living person and astutely traces this shift back to that philosopher’s distinctive notion of “activity,” which links happiness to actions, most famously contemplation, that are themselves fully achieved, complete unto themselves, not in the service of some other goal. Such a conception distorts all the fundamental tenets of happiness in its pre-Aristotelian rigor: It makes it easy to tell who is happy; it sidelines the community that would otherwise be called upon to take responsibility for my happiness; and it delinks happiness from the narrative of an entire life.
Indeed, the best life on this account would, because organized around self-sufficient activities, be a narratively rather thin one. Aristotle, in other words, seizes upon Solon’s painstaking idea and makes it slack—that’s Soni’s charge, and it constitutes, for anyone already studying happiness, one of the more surprising turns in his book: the moment when we realize that Aristotle was not eudaimonia’s master thinker, but already its betrayer. There is also a bigger point here. What most distinguishes Soni from MacIntyre is a marked Heideggerian strain in his argument—not in the substance of his claims exactly, but in his way of recounting the history of philosophy—in his commitment, that is, to reaching back behind Aristotle and Plato to retrieve for thought a certain pre-philosophical content. One glance at the table of contents will clue you in: Soni’s chapter on Aristotle is subtitled “The First Forgetting.” Like other late-generation Heideggerians, Soni makes his claim on our attention by proposing a new candidate for the title of all-important-thing-that-European-philosophy-has-never-been-able-to-grasp, a fresh contender for the Ever Foreclosed, with happiness now functioning as rival to Sein and altérité and Hoffnung and das Nichtidentische.

So Soni’s resemblance to MacIntyre is no sooner established than it begins to fall apart, and if we agree to call this counter-MacIntyrean strain in Soni’s thinking “anti-philosophical,” we will have captured something consequential about it. The question, if you like, is why one might prefer Solon’s cryptic and unelaborated position to Aristotle’s fluent and philosophically robust one. The answer, of course, is that philosophers are inclined to say entirely too much. Soni wants to allow as much variation as possible with regard to what counts as happiness: across cultures and subcultures, from one individual life to the next, indeed, across the stages and seasons of a single, unsettled human life. Solon doesn’t tell inquirers how to recognize a happy life, only that they will have to wait for one to end before they can even try, and Soni takes this to mean that how we determine happiness is almost entirely up for grabs. The issue is this: If MacIntyre has, in his retirement, felt compelled to append prefaces to new editions of his old books, explaining to readers that he is not, in fact, a conservative, this is at least in part because he sometimes mimics one so convincingly, and nowhere more so than when he explains that for people to be virtuous, they will have to go back to living in homogeneous communities, small-scale collectivities that share a culture and a language, as also modes of deliberation and moral understandings. Soni, on the other hand, comes out against any such “social consensus” on the matter of happiness and claims that a scrupulous understanding of the same won’t generate even “implicit norms”—standards that anyone seeking happiness would have to scramble to meet (80).
He means this for real: One good way to read Mourning Happiness would be to track Soni’s determined efforts, across five hundred pages, to not really say anything about happiness at all; to not fill it in with content; to not tell you what it is or even what anybody in the past used to think it was; to refuse you all possible insight into how to be happy. “Happiness is nothing but the name for what is at stake in existence” (235).

We’ll gain a better sense of what Soni is up to here if we play him off against Kant, who famously argued that the doctrine of happiness—or rather, of “so-called happiness”—is entirely “empirical.”[v] The idea of happiness issues the prudential command that I take care of myself, that I tend to my own well-being, but it understands what this means only in terms of pleasure (or desire or inclination). The philosophers of happiness say they’re going to teach you how to live well but typically end up just scribbling inventories of stuff people like; and it is against these pseudo-ethical grocery lists that Kant will ask us to conceive of morality as pure form. Soni is out to show that Kant is wrong on this score: Happiness is not just the more or less optional content of moral thought; it is itself a kind of form, a way of thinking about the organization of a life. A proper understanding of happiness would have to be as formalist as a theory of duty or the law.

But to say even this may already be to give Soni’s project too philosophical a cast, for what matters to him is not so much how we conceptualize happiness, but how we tell stories about it—our “narrative assumptions about happiness” (180). The distinctions between stories matter here, and some readers, no doubt, will admire Soni chiefly for his startling ability to differentiate between story forms or even to produce, out of the inherited materials of literary history, entirely new genres—new objects of narratological concern. I’m thinking especially of those remarkable pages in which he explains why the eighteenth-century fictions that most resemble tragedies are not, in fact, tragedies: Unlike the authentic article, modern pseudo-tragedies—Soni calls them “trial narratives”—“require us to accept as necessary and even valuable the conditions that produce unhappiness” (201). But more than a vindication of this or that story type, Mourning Happiness is a vindication of narrative tout court as a vehicle of moral and political reasoning. For narrative repeats, as philosophy cannot, the open-ended and detail-oriented indeterminacy of the Solonian command. “[I]f narrative is necessary for the community to be able to relate to the totality of a life, it is also the way to describe the heterogeneity of a life not reducible to a finite set of salient descriptors” (70). Stories furnish examples of happy lives without letting these petrify into norms. In the absence of rules or abstract criteria, they present something like the logic of the concrete, a direct contemplation of some particularized happiness—or of some unknown other’s unrepeatable desolation.

Many readers are going to find this unwillingness to pronounce on the subject of happiness among the book’s most attractive features; it is, at any rate, how Soni smuggles a radical pluralism in past his own anti-liberal strictures. But there is a problem here all the same. In a sentence quoted on the back of the book, one of Soni’s early reviewers has proclaimed his work to be “a major contribution to … ethics.” But Soni doesn’t exactly think of himself as a moral philosopher; he thinks that in some suitably recondite way he is making a contribution to political theory. You can tell this from the volume’s subtitle, which is “Narrative and the Politics of Modernity.” This is yet another way in which he and MacIntyre are alike, since both believe that morality is only worth something when it exceeds itself, when, that is, it becomes real in the world, devising the practices and building the institutions that will give its precepts genuine ethical substance. But Soni also wants happiness to remain indeterminate and rejects as reified any felicity whose content has been even lightly specified; it is here that the book’s many references to Derrida and Levinas do their hardest work—Happiness will never be present to you; happiness demands of us an infinite and impossible responsibility towards the well-being of other people—and it is unclear how a moral imperative this abstract is ever going to attain institutional shape: indeterminacy made real, infinite responsibility with a street address. Within the span of a few pages, Soni insists both that Solonian happiness comes down to “a completely”—and magnificently—“empty question” and that it is “not an abstract philosophical proposition existing in isolation from social life” (177, 179). One might wonder, then: How do you institutionalize a question? More to the point, how do you institutionalize its emptiness?

Another comparison will make the dimensions of the problem clear. In Archaeologies of the Future, Fredric Jameson admits that his formalist approach to utopian writing is actually rather peculiar. It is odd, he says, for a scholar to be studying utopias as a genre, since this seems to cancel out the politics of this most political of literary forms. Jameson is, by his own account, more interested in the conventions that structure utopian writing than he is in the substance of this or that utopian proposal. And yet if utopias are so many attempts to imagine the best possible society, then there is a conceptual pressure, generated from within the form and not present in the same way in any other genre, to have a favorite. This observation, however, allows Jameson to give his argument its splendid twist: To study utopias as a genre, rather than giving readings of isolated utopian texts, is to refuse to choose between them; it is to want all the utopias at once and thereby to imagine a world whose utopian energies are entirely excessive and overflowing. And this, Jameson explains, is indeed a properly political demand: To love any one utopia is to play into the hands of utopia’s liberal enemies, who think of any perfect society as fatally closed or totalitarian. But to love the genre is to re-introduce multiplicity into the utopian equation, and so to ask that we build not a single perfect society, but a network of not-really-rival utopias, a global federation of good places. The reconciled world would resemble the well-stocked bookshelf of a sci-fi buff.

The point is, then, that Soni’s formalist account of happiness is plainly modeled on Jameson’s pan-utopianism and is driven by roughly the same concerns: We don’t need one happiness to share between us; we need all the happinesses. But what is nonetheless missing from Soni’s account is that final step, via which happiness in all its multiplicity would acquire a properly collective and political dimension. He wants to boast that Greek happiness, unlike our more paltry version, “took concrete political form in an institution” (452), but the institution that he submits for our consideration, the funeral oration, is agonizingly slight and not in any of the usual senses a political institution at all. This is puzzling on a few different fronts: Soni’s political program comes down to the claim that the Athenians made a point of talking about their dead, which seems to suggest that we bury our lost friends in complacent silence. At one point he proclaims that “everyone deserves a funeral oration,” which I suppose is meant to be another one of those impossible demands, until one recalls that this is already the practice, and mundanely so, even in advanced capitalist societies: I can count on getting a eulogy when I die and, failing that, a toast (431). Sometimes he even seems to forget that the funeral oration does not, in fact, institutionalize happiness—it makes no one happy—but only the judgment upon happiness, which is something else again. This tentativeness on the subject of politics—or, if you like, the elevation to politics of an argument that really is ethical—introduces into the book a permanent misgiving around its own abstractions: We have to “accept the Solonian idea in its formal rigor and indeterminacy” (110)—but equally we “must not shy away from [the] work of specifying concretely, if always provisionally, how to put a politics of happiness into effect” (471).
It is upon realizing that this extraordinary book has absolutely nothing to say about the latter that Soni’s most pointed indictments of eighteenth-century philosophy begin to sound like self-recrimination: We must hold to account any philosopher who “fails to imagine a concrete and institutionalizable politics of happiness” (465). A reader should feel flummoxed in the face of any philosopher who, like Bentham, makes “many formal pronouncements … about the priority of happiness,” but whose “concept of happiness” is “essentially abstract” (400). A footnote early in Mourning Happiness informs us that Soni has also been writing on literary utopias, and this gives us reason to hope that he might yet produce a utopias volume of his own.[vi] Any such book would surely be a follow-up to the volume at hand, a sequel in which he dissolves his first book’s abstractions and carries out its unfinished business. For utopias will always violate the Solonian injunction, and anyone who loves them will have to make his peace with that. They cannot leave happiness unspecified, any more than you, when looking for the right locality in which to live, hunting for the county or country that will make it possible for you to live a life out to its fulfilled end, can afford to wait for that place’s collapse or inundation before declaring it to be good.


[i] See, for instance, the special issue of The Eighteenth Century: Theory and Interpretation, edited by Soni, on “the crisis of judgment”—51.3 (2010).

[ii] Alasdair MacIntyre, “Practical Rationalities as Forms of Social Structure,” Irish Philosophical Journal 4 (1987): 3-19; “The Theses on Feuerbach: A Road Not Taken,” in Carol C. Gould and Robert S. Cohen, eds., Artifacts, Representations and Social Practice (Dordrecht: Kluwer, 1993): 277-290, quotation 290.

[iii] Theodor Adorno, Problems of Moral Philosophy (delivered 1963, first published in German 1996), translated by Rodney Livingstone (Stanford: Stanford University Press, 2000), 119.

[iv] Alasdair MacIntyre, After Virtue (3rd ed.; 1981) (Notre Dame: University of Notre Dame Press, 2007) 36.

[v] Immanuel Kant, Critique of Practical Reason (1788), translated by Werner Pluhar (Indianapolis: Hackett, 2002), 93, 118.

[vi] Vivasvan Soni, “Modernity and the Fate of Utopian Representation in Wordsworth’s ‘Female Vagrant.’” European Romantic Review 21 (2010): 363-381.

The Revolutionary Energy of the Outmoded




Fredric Jameson does not like predictions. His is an owlish and retrospective Marxism, one that happily foregoes the crystal ball of some former orthodoxy. There is a Hegelian lesson that Jameson’s writing repeatedly attempts to impart, which is that wisdom comes only in the backwards glance, that we glimpse history only in the moment when our plans fail or dialectically backfire, when our actions bump up against the objective, hurtful (but never foreseeable) limits of the historical situation. You can draw up your revolutionary schemes, paint the future as gaily or grimly as you like, but only upon review will it become plain in just what way you have been Reason’s dupe. If this point is unclear, you might consider Jameson’s response to the World Trade Center attacks, which began with the following extraordinary observation: “I have been reluctant to comment on the recent ‘events’ because the event in question, as history, is incomplete and one can even say that it has not yet fully happened. … Historical events … are not punctual, but extend in a before and after of time which only gradually reveal themselves.”[1] I suspect many will find remarkable Jameson’s reluctance here to help shape the public response to September 11th. An event that has not fully happened yet is, after all, an event in which one may yet intercede, an event that one needn’t yet cede to the Right, an event to which one might yet attribute one’s own polemical and political meanings. But Jameson makes a conspicuous display here of spurning what Left criticism generally (and glibly) calls an “intervention”—as though the business of a Marxist criticism were not to intervene, but rather to bide its time, to wait until an event has been thoroughly mediated or disclosed its function, and then to identify, with the serene impotence of hindsight, history’s great game. Any event is, like revolution itself, a leap into the unknown. The owl of Minerva only flies in November.

One might wonder, then, how Jameson feels about his own writing, which has been so accidentally and accurately predictive. How does he feel, for instance, about his landmark postmodernism essay, the one that sometimes goes by the name “Postmodernism and Consumer Society”?[2] That article so neatly anticipated U.S. popular culture in the 1990s that it is hard to shake the feeling that a whole generation of artists—writers, musicians, filmmakers above all—must have mistaken it for a manifesto. (“Pastiche—check. Death of the subject—you bet. Depthlessness and disorientation—where do I sign up?”) As ridiculous as it may sound, the essay, first published in 1983, now reads like an exercise in cultural embryology, discerning the first, fetal traces of an aesthetic mode that would become fully evident only in the years that followed. One wonders, too, if young readers encountering the article for the first time now don’t therefore underestimate its savvy. One wonders if they don’t find it rather trite, since a sharp-eyed exegesis of Body Heat (1981) is really just a workaday description of L.A. Confidential (1997)—a script treatment.

We can be more precise: What have seemed so strangely prophetic about Jameson’s postmodernism argument are, oddly enough, its Benjaminian qualities. Benjamin’s fingerprints seem, in some complicated way, to be all over postmodernism. One might even say that postmodernism in America is a dismal parody of Benjaminian thought. Just cast an eye back over the last ten years, over U.S. pop culture on the cusp of the millennium—postmodernism post-Jameson. Consider, for instance, the apocalypticism that has been among its most persistent trends. The recent fin de siècle has been preoccupied with dire images of a devastated future: we might think here of the full-blown resurgence of millenarian thought and the orchestrated panic surrounding the millennium bug; of X-Files paranoia, which has told us to “fight the future”; of catastrophe movies and the resurgence of film noir and dystopian science fiction. If you were to design a course on popular culture in the 1990s, you would be teaching a survey in doom.

There is much in this culture of disaster that would merit our closest attention—there is, in fact, strangeness aplenty. Consider, for instance, the emergence as a genre of the Christian fundamentalist action thriller, the so-called rapture novel. These novels are basically an exercise in genre splicing; they begin by offering, in what for right-wing Protestantism is a fairly ordinary procedure, prophetic interpretations of world events—the collapse of the Soviet Union, the new Intifada—but they then graft onto these biblical scenarios plots borrowed from Tom Clancy techno-thrillers. The first thing that needs to be noted about rapture novels, then, is that they signal, on the part of U.S. fundamentalism, an unprecedented capitulation to pop culture, which the godly Right had until recently held in well-nigh Adornian contempt. Older forms of Christian mass culture have seized readily on new technologies—radio, say, or cable television—but they have tended to recreate within those media a gospel or revival-show aesthetic. In rapture novels, by contrast, as in the rapture movies that have followed in the novels’ wake, we are able to glimpse the first outlines of a fully commercialized, fully mediatized Christian blockbuster culture. Fundamentalist Christianity gives way at last to commodity aesthetics.

This is not yet to say enough, however, because this rapprochement inevitably holds surprises for secular and Christian audiences alike. The best-selling rapture novel to date is Jerry Jenkins and Timothy LaHaye’s Left Behind, which has served as a kind of template for the entire genre. In the novel’s opening pages, the indisputably authentic Christians are all called up to Christ—they are “raptured.” They literally disappear from earth, leaving their clothes pooled on the ground behind them, pocket change and car keys scattered across the pavement. This scene is the founding convention of the genre, the one event that no rapture novel can do without. And yet this mass vanishing, conventional though it may be, cannot help but have some curious narrative consequences. It means, for a start, that the typical rapture novel is not interested in good Christians. The heroes of these stories, in other words, are not godly people—this is true by definition, because the real Christians have all quit the scene; they have been vacuumed from the novel’s pages. In their absence, the narrative turns its attention to indifferent or not-quite Christians, who can be shown now snapping out of their spiritual ennui, rallying to God, and taking up the fight against the anti-Christ (who, in Left Behind, takes the form of an Eastern European humanitarian whose malign plans include scrapping the world’s nuclear arsenals and feeding malnourished children). Left Behind, I would go so far as to suggest, seems to work on the premise that there is something better—something more significantly Christian—about bad Christians than there is about good ones. This notion has something to do with the role of women in the novel. Left Behind, it turns out, has almost no use for women at all. They all either disappear in the novel’s opening pages or get left behind and metamorphose into the whores of anti-Christ.
It will surprise no one to find a Christian fundamentalist novel portraying women as whores, but the former point is worth dwelling on: Left Behind cannot wait to dispense with even its virtuous women. It may hate the harlots, but it has no use for ordinary church-supper Christians either, imagined here as suburban housewives and their well-behaved young children. Anti-Christ has to be defeated at novel’s end, and for this to happen, the good Christians have to be shown the door, for smiling piety can, in the novel’s terms, sustain no narrative interest; it can enter into no conflicts. Left Behind is premised on the notion that devout Christians are cheek-turning wimps and goody two-shoes, mere women, in which case they won’t be much good in the fight against the liberals and the Jews. What this means is that the protagonists who remain in the novel—the Christian fence-sitters—are all men, and not just any men, but rugged men with rugged, porn-star names: Rayford Steele, Buck Williams, Dirk Burton. Left Behind is a novel, in other words, that envisions the remasculinization of Christianity, that calls upon its readers to imagine a Christianity without women, but with muscle and grit instead, a Christianity that can do more than just bake casseroles for people. And such a project, of course, requires bad Christians so that they may become bad-ass Christians. Perhaps it goes without saying: A Christian action thriller is going to be interested first and foremost in action-thriller Christians.

It is with the film version of Left Behind (2001), however, that things really get curious. The film’s final moments nearly make explicit a feature of the narrative that is half-buried in the novel: The film concludes with a brief sequence that we’ve all seen a dozen times, in a dozen different action movies—the sequence, that is, in which the heroic husband returns home from his adventures to be reunited with his wife and child. Typically, this scene is staged at the front door of the suburban house with the child at the wife’s side; you might think, emblematically, of the final shots of John Woo’s Face/Off (1997), which show FBI Agent Sean Archer (John Travolta) exchanging glances with his wife (Joan Allen) over the threshold as their teenaged daughter hovers in the background. Left Behind, for its part, reproduces that scene almost exactly, almost shot for shot, except, since the women have all evaporated or gone over to anti-Christ, the film has no choice but to stage this familiar ending in an unfamiliar way—between its male heroes, between Rayford Steele, standing in the doorway with his daughter, and a bedraggled Buck Williams, freshly returned from his battles with the Beast. A remasculinized Christianity, then, cannot help but imagine that the perfect Christian family would be—two men. Such, then, is one upshot of fundamentalism’s new openness to pop culture: Christianity uncloseted.

Of course, the borrowings can go in the other direction as well. Secular apocalypse movies can deck themselves out in religious trappings, but when they do so, they risk an ideological incoherence of their own. Think first about conventional, secular catastrophe movies—Armageddon (1998), Deep Impact (1998), Volcano (1997)—so-called apocalypse films that actually make no reference to religion. These tend to be reactionary in rather humdrum and technocratic ways, full of experts and managers deploying the full resources of the nation to fend off a threat defined from the outset as non-ideological. The volcanoes and earthquakes and meteors that loom over such movies are therefore merely more refined versions of the maniacal terrorists and master thieves who normally populate action movies: they are enemies of the state whose challenge to the social order never approaches the level of the political. It is when such secular narratives reintroduce some portion of religious imagery, however, that their political character becomes pronounced. We might think here of The Seventh Sign (1988), which featured Demi Moore, or of the Arnold Schwarzenegger vehicle End of Days (1999). Like Left Behind, these last two films work by combining biblical scenarios and disaster-movie conventions, and the results are similarly confusing. To be more precise, they begin by offering luridly Baroque versions of the Christian apocalypse narrative, but then revert to the secular logic of the disaster movie, as though to say: Catastrophes are destabilizing a merciless world in preparation for Christ’s return—and this must be stopped! In a half-hearted nod to Christian ethics, each of these movies begins by depicting the world of global capitalism as brutal and unjust—the montage of squalor has become something of an apocalypse-movie cliché—before deciding that this world must be preserved at all costs.
The characters in these films, in other words, expend their entire allotment of action-movie ingenuity trying to prevent the second coming of Christ, imagined here as the biggest disaster of all.[3]

This is not to say that contemporary American apocalypses dispense with redemptive imagery altogether, at least of some worldly kind. Carceral dystopias, for instance, films that work by trapping their characters in controlled and constricted spaces, tend to posit some utopian outside to their seemingly total systems: the characters in Dark City (1998) dream of Shell Beach, the fictitious seaside resort that supposedly lies just past their nightmarish noir metropolis, the illusory last stop on a bus line that actually runs nowhere; the man-child of Peter Weir’s Truman Show (1998) dreams, in similar ways, of Fiji, which is a rather more conventional vision of oceanic bliss; and the Horatio-Alger hero of the genetics dystopia Gattaca (1997) follows this particular utopian logic to its furthest end by dreaming of the day he will be made an astronaut, the day he will fly to outer space, which of course is no social order at all, let alone a happier one, but merely an anything-but-here, an any-place-but-this-place, the sheerest beyond. As utopias go, then, these three are remarkably impoverished; they cannot help but seem quaint and nostalgic, strangely dated, like the daydreams of some Cold War eight-year-old, all Coney Island and Polynesian hula-girls and John-Glenn, shoot-the-moon fantasies.

But then it is precisely the old-fashioned quality of these utopias that is most instructive; it is precisely their retrograde quality that demands an explanation. For if on the one hand, U.S. pop culture has seemed preoccupied with the apocalypse, on the other hand it has seemed every bit as obsessed with cheery images from a sanitized past. Apocalypse culture has as its companion the many-faceted retro-craze: vintage clothing; Nick at Nite; the ‘70s vogue; the ‘50s vogue; the ‘40s vogue; the ‘30s vogue; the ‘20s vogue (the ‘60s are largely missing from this tally, for reasons too obvious to enumerate; the ‘60s vogue has been stunted, almost nonexistent, at least within a U.S. framework—retro tops out about 1963 and then gets shifted over to Europe and the mods); the return of surf, lounge-music, and Latin jazz; retro-marketing and retro-design, and especially the Volkswagen Beetle and the PT Cruiser.

Retro, then, deserves careful consideration of its own, as an independent phenomenon alongside the apocalypse. Some careful distinctions will be necessary. Retro takes a hundred different forms; it has the appearance of a single and coherent phenomenon only at a very high level of generality. We could begin, then, by examining the heavily marketed ‘60s and ‘70s retro of mainstream, white youth culture. Here we would want to say, at least on first pass, that the muffled camp of Austin Powers (1997), say—or the mid-‘90s Brady Bunch revival, or Beck’s Midnite Vultures—closely approximates Jameson’s notion of postmodern pastiche: this is retro as blank parody, the affectless recycling of alien styles, worn like so many masks. But that said, we would have to counterpose against these examples the retro-culture of a dozen regional scenes, scattered across the U.S., most of which are retro in orientation, but none of which are exercises in pastiche exactly. Take, for instance, the rockabilly and honky-tonk scene in Chapel Hill, North Carolina: It is impeccably retro in its musical choices and impeccably retro in its fashions, full of redneck hipsters sporting bowling shirts and landing-pad flattops and smart-alecky tattoos. Theirs is a form of retro whose reference points are emphatically local, and in its regionalism, the Chapel Hill scene aspires to a subculture’s subversiveness, a kind of Southern-fried defiance, which stakes its ground in contradistinction to some perceived American mainstream and then gives its rebellion local color, as though to say: “We don’t work in your airless (Yankee) offices. We don’t speak your pinched (Yankee) speech. We don’t belong to your emasculated (Yankee) culture. We are hillbillies and punks in equal proportion.”  Retro, in short, can be placed in the service of a kind of spitfire regionalism, and there is little to be gained by simply conflating this form of retro with the retro-culture marketed nationwide.

In fact, even mainstream ‘70s retro can take on different valences in different hands. To cite just one further example: hip-hop sampling, which builds new tracks out of the recycled fragments of existing recordings, might seem upon first inspection to be the very paradigm of the retro-aesthetic. And yet hip-hop, which has mined the ‘70s funk back-catalog with special diligence, typically forgoes the irony that otherwise accompanies such postmodern borrowings. Indeed, hip-hop sampling generally involves something utterly unlike irony; it is often positioned as a claim to authenticity, an homage to the old school, so that when OutKast, say, channels some vintage P-Funk, that sample is meant to function as a genetic link, a recurring trait or musical cell-form. The sample is meant to serve as a tangible connection back to some originary moment in the history of soul and R&B (or funk and disco).[4]

So differences abound in retro. And yet one is tempted, all the same, to speak of something like an official retro-culture, which takes as its object the 1940s and ‘50s: diners, martinis, “swing” music (which actually refers, not to ‘30s and ‘40s swing, but to post-war jump blues), industrial-age furniture, late-deco appliances, all chrome and geometry. The most important point to be made about this form of retro is that it is an unabashedly nationalist project; it sets out to create a distinctively U.S. idiom, one redolent of Fordist prosperity, an American aesthetic culled from the American century, a version of Yankee high design able to compete, at last, with its vaunted European counterparts. In general, then, we might want to say that retro is the form that national tradition takes in a capitalist culture: Capitalism, having liquidated all customary forms of culture, will sell them back to you at $16 a pop. But then commodification has ever been the fate of national customs, which are all more or less scripted and inauthentic. What is distinctive about retro, then, is the class of objects that it chooses to burnish with the chamois of tradition. There is a remarkable scene near the beginning of Jeunet and Caro’s great retro-film Delicatessen (1991) that is instructive in this regard: Two brothers sit in a basement workshop, handcrafting moo-boxes—those small, drum-shaped toys that, once upended and then set right again, low like sorrowful cows. The brothers grind the ragged edges from the boxes, blow away the shavings as one might dust from a favorite book, rap the work-table with a tuning fork and sing along with the boxes to ensure the perfect pitch of the heifer’s bellow. 
And in that image of their care, their workman’s pride, lies one of retro-culture’s great fantasies: Retro distinguishes itself from the more or less folkish quality of most national traditions in that it elevates to the status of custom the commodities of early mass production—old Coke bottles, vintage automobiles—and it does so by imbuing them with artisanal qualities, so that, in a strange historical inversion, the first industrial assembly lines come to seem the very emblem of craftsmanship. Retro is the process by which mass-produced trinkets can be reinvented as “heritage.”[5]

The apocalypse and the retro-craze—such, then, are the twin poles of postmodernism, at least on Jameson’s account. We are all so accustomed to this twosome that it has become hard to appreciate what an odd juxtaposition it really is. Disco inferno, indeed. This is a pairing, at any rate, that finds a rather precise counterpart in the writings of Walter Benjamin. Each of the moments of our swinging apocalypse can be traced back to Benjaminian impulses, or opens itself, at least, to Benjaminian description. For in what other thinker are we going to find, in a manner that so oddly approximates the culture of American malls and American multiplexes, this combination of millenarian mournfulness and antiquarian devotion? Benjamin’s Collector seems to preside over postmodernism’s thrift-shop aesthetic, just as surely as its apocalyptic imagination is overseen by Benjamin’s Messiah, or at least by his Catastrophic Angel. It would seem, then, that Benjaminians should be right at home in postmodernism, and if this is palpably untrue—if the culture of global capitalism does not after all seem altogether hospitable to communists and the Kabbalah—then this is something we will now have to account for. Why, despite easily demonstrated affinities, does it seem a little silly to describe U.S. postmodernism as Benjaminian?

Jameson’s work is again clarifying. It is not hard to identify the Benjaminian elements in Jameson’s idiom, and especially in his utopian preoccupations, his determination to make of the future an open and exhilarating question. No living critic has done more than Jameson to preserve the will-be’s and the could-be’s in a language that would just as soon dispense altogether with its future tenses and subjunctive moods. And yet a moment’s reflection will show that Jameson is, for all that, the great anti-Benjaminian. It is Jameson who has taught us to experience pop culture’s Benjaminian qualities, not as utopian pledges, but as threats or calamities. Thus Jameson on apocalypse narratives: “It seems to be easier for us today to imagine the thoroughgoing deterioration of the earth and of nature than the breakdown of late capitalism; perhaps that is due to some weakness in our imaginations. I have come to think that the word postmodern ought to be reserved for thoughts of this kind.”[6] It is worth calling attention to the obvious point about these sentences—that Jameson here more or less equates postmodernism and apocalypticism—if only because in his earliest work on the subject, it is not the apocalypse but retro-culture that seems to be postmodernism’s distinguishing and debilitating mark. Again Jameson: “there cannot but be much that is deplorable and reprehensible in a cultural form of image addiction which, by transforming the past into visual mirages, stereotypes, or texts, effectively abolishes any practical sense of the future and of the collective project.”[7]  Jameson, in short, is most sour precisely where Benjamin is most expectant. 
He would have us turn our backs on the most conspicuous features of Benjamin’s work; for late capitalism, it would seem, far from keeping faith with Benjamin, actually robs us of our Benjaminian tools, if only by generalizing them, by transforming them into noncommittal habits or static conventions: the Collector, fifty years on, shows himself to be just another fetishist, and even the Angel of History turns out to be a predictable and anti-utopian figure, unable to so much as train its eyes forward, foreclosing, without reprieve, on the time yet to come. U.S. postmodernism may be a culture that loves to “brush history against the grain,” but only in the way that you might brush back your ironic rockabilly pompadour.



But what if we refused to break with Benjamin in this way? Try this, just as an exercise: Ask yourself what these seemingly disparate trends—apocalypticism and the retro-craze—have to do with one another. Consider in particular that remarkable crop of recent films that actually unite these two trends, films that ask us to imagine an unlivable future, but do so in elegant vintage styles. These include: Ridley Scott’s Blade Runner (1982), the granddaddy of the retro-apocalypses; three oddly upbeat dystopias—Starship Troopers and the aforementioned Gattaca and Dark City—all box-office underachievers from 1997; and, again, the cannibal slapstick Delicatessen. All of these films posit, in their very form, some profound correlation between retro and the apocalypse, but it is hard, on a casual viewing, to see what that correlation could be. Jameson, of course, offers a clear and compelling answer to this question, which is that apocalypticism and the retro-craze are the Janus faces of a culture without history, two eyeless countenances, pressed back to back, facing blankly out over the vistas they cannot survey.[8]

Some of these films, it must be noted, seem to invite a Jamesonian account of themselves. This is true of Blade Runner, for instance, or of The Truman Show—films that offer a vision of retro-as-dystopia, a realm of fabricated memory, in which history gets handed over to corporate administration, in which every madeleine is stamped “Made in Malaysia.” Perhaps it is worth pausing here, however, since we need to be wary of running these two films together. The contrast between them is actually quite revealing. Both Blade Runner and The Truman Show present retro-culture as dystopian, and in order to do this, both rely on some of the basic conventions of science fiction. Think about what makes science fiction distinctive as a mode—think, that is, about what distinguishes it from those genres with which it seems otherwise affiliated, such as the horror movie. Horror movies, especially since the 1970s, have typically worked by introducing some terrifying, unpredictable element into apparently safe and ordinary spaces. Monsters are nearly always intruders—slashers in the suburbs, zombies forcing their way past the barricaded door. But dystopian science fiction is, in this respect, nearly the antithesis of horror. It does not depict a familiar setting into which something frightening then gets inserted. What is frightening in dystopian science fiction is rather the setting itself. Now, this point holds for both Blade Runner and The Truman Show, but it holds in rather different ways. The first observation that needs to be made about The Truman Show is that it is more or less a satire, which is to say that, though it takes retro as its object, it is not itself a retro-film. It portrays a world that has handed itself over entirely to retro, a New Urbanist idyll of gleaming clapboard houses on mixed-use streets; but the film itself is not, by and large, retro in its narrative forms or cinematic techniques. 
Quite the contrary: the film wants to teach its viewers how to read retro in a new way; it wishes, polemically, to loosen the hold of retro upon them. The Truman Show takes a setting that initially seems like some American Eden, and then through the menacing comedy of its mise-en-scène—the falling lights and incomplete sets, the scenery that Truman stumbles upon or that springs disruptively to life—makes this retro-town come slowly to seem ominous. To give the film the cheap Lacanian description it is just begging for: The Truman Show charts the unraveling of the symbolic order. Every klieg light that comes crashing down from the sky is a warning shot fired from the Real. The simpler point, however, is that The Truman Show rests on a deflationary argument about American mass culture—a media-governed retro-culture depicted here as restrictive, counterfeit, and infantilizing—and its form is accordingly rather conventional. It is essentially a cinematic Bildungsroman, which ends once the protagonist steps forward to take full responsibility for his own life, and this, of course, tends to compromise the film’s own Lacanian premise: It suggests that any of us could simply step out of the symbolic order, step boldly out into the Real, if only we could muster sufficient resolve.[9]

Having a compromised and conventional form, however, is not the same thing as having a retro-form. In Blade Runner, by contrast, the setting—a dismal and degenerate Los Angeles—is self-evidently dystopian, but it is itself retro; it is retro as a matter of style or form. The film’s vision of L.A., as has often been observed, is equal parts Metropolis and ‘40s film noir, and the effect of the film is thus rather different from The Truman Show, though it is equally curious: Blade Runner may recycle earlier styles or narrative forms in a manner typical of retro, but the films that it mimics are themselves all more or less dystopian. If Blade Runner is a pastiche, it is a pastiche of other dystopias, and this has the effect of establishing the correlation between retro and the apocalypse in a distinctive way: Blade Runner posits a historical continuum between a bleak past and an equally bleak future, between the corrupt and stratified modernist city (of German expressionism and hardboiled fiction) and the coming reign of corporate capital (envisioned by so much science fiction), between the bad world we’ve survived and the bad world that awaits.

Such, then, are the films that seem ready to make Jameson’s argument for him. But there is good reason, I think, to set Jameson temporarily to one side. For present purposes, it would be more revealing to direct our attention back to Delicatessen, which, of all the retro-apocalypses, is perhaps the most winning and Benjaminian. The question that confronts any viewer of Delicatessen is why this film—which, after all, depicts an utterly dismal world in which men and women are literally butchered for meat—should be so delightful to watch, and not just wry or darkly humorous, but giddy and dithyrambic. I would suggest that the pleasure peculiar to Delicatessen has everything to do with the status of objects in the film—that is, with the extravagant and festive care that Jeunet and Caro bring to the filming of objects, which take on the appearance here of so many found and treasured items. One might call to mind the hand-crank coffee grinder, which doubles as a radio transmitter; or the cherry-red bellboy’s outfit; or simply the splendid opening credits—this slow pan over broken records and torn photographs—in which the picture swings open like a case of curiosities. It is as though the film took as its most pressing task the re-enchantment of the object-world, as though it were going to lift objects to the camera one by one and reattach to them their auras—not their fetishes, now, as happens in most commercial films, with their product placements and designer outfits—but their auras, as though the objects at hand had never passed through a marketplace at all. This is tricky: The objects in Delicatessen are recognizably of the same type as American retro-commodities—an antique wind-up toy, an old gramophone, stand-alone black-and-white television sets. 
At this point, then, the argumentative alternatives become clear: Either we can dismiss Delicatessen as ideologically barren, as just another pretext for retro-consumption, just another flyer for the flea market of postmodernism. Or we can muster a little more patience, tend to the film a little more closely, in which case we might discover in Delicatessen the secret of all retro-culture: its desire, delusional and utopian in equal proportion, for a relationship to objects as something other than commodities.

To follow the latter course is to raise an obvious question: How does the film direct our attention to objects in a new way? How does it reinvigorate our affection for the object world? This is a question, first of all, of the film’s visual style, although it turns out that nothing all that unusual is going on cinematographically: In a manner characteristic of French art-film since the New Wave, Delicatessen keeps the spectator’s eye on its objects simply by cutting to them at every opportunity and thus giving them more screen time than household artifacts typically claim. By the usual standards of analytical editing, in other words—within the familiar breakdown of a scene into detailed views of faces, gestures, and props—the props get a disproportionate number of shots. The objects, like so many Garbos, hog all the close-ups. “By permitting thought to get, as it were, too close to its object,” Adorno once said of Benjamin’s critical method, “the object becomes as foreign as an everyday, familiar thing under a microscope.”[10] Delicatessen works, in these terms, by taking Adorno’s linguistic figure at face value and returning it back to something like its literal meaning, back to the visual. The film permits the camera to get too close to its object. It forces the spectator to scrutinize objects anew simply by bringing them into sustained proximity.

The camerawork, however, is just the start of it, for in addition to the question of cinematic style, there is the related question of form or genre. Delicatessen, it turns out, is playing a crafty game with genre, and it is through this formal frolic that the film most insistently places itself in the service of its objects. For Delicatessen is retro not only in its choice of props—it is, like Blade Runner, formally or generically retro, as well. This point may not be immediately apparent, however, since Delicatessen resurrects a genre largely shunned by recent U.S. film. One occasionally gets the feeling from American cinema that film noir is the only genre ripe for recycling. The 1990s have delivered a whole paddy wagon full of old-fashioned crime stories and heist pics, but where are all the other classic Hollywood genres? Where are the retro-Westerns and the retro war movies? Where are the retro-screwballs?[11] Neo-noir, of course, is relatively easy to pull off—dim the lights and fire a gun and some critic or another will call it noir. Delicatessen, for its part, attempts something altogether more difficult or, at least, sets in motion a less reliable set of cinematic conventions: pratfalls, oversized shoes, madcap chase scenes. Early on, in fact, the film has one of its characters say that, in its post-apocalyptic world, people are so hungry they “would eat their shoes”; and with this one line—an unambiguous reference to the celebrated shoe-eating of Chaplin’s The Gold Rush—it becomes permissible to find references to silent comedy at every turn: in the hero’s suspenders, in the film’s several clownish dances, in the near-demolition of the apartment building in which all the action is set, a demolition that, once read as slapstick, will call to mind Buster Keaton’s wrecking-ball comedy, the crashing houses of Steamboat Bill, Jr. (1928), say. Delicatessen, in sum, is retro-slapstick, and noting as much will allow us to ask a number of valuable questions.

The most compelling of these questions will return us to the matter at hand. We are trying to figure out how Delicatessen gets the viewer to pay attention to its objects, and so the question now must be: What does slapstick have to do with the status of objects in the film? It is hardly intuitive, after all, that slapstick should bring about the redemption of objects, should reattach objects to their auras. A cursory survey of classic slapstick, in fact, might suggest just the opposite—a world, not of enchanted objects, but of aggressive and adversarial ones. Banana peels and cream pies spring mischievously to mind. And yet we need to approach these icons with caution, lest we take a conceptual pratfall of our own; for Delicatessen draws on slapstick in at least two different ways, or rather, it draws on two distinct trends in early American slapstick, and each of these trends grants a different status to its objects. Everything rides on this distinction:

1) When we think of slapstick, we think first of all of roughhouse comedy, of the pie in the face and the kick in the pants, the endless assault on ass and head. Classic slapstick of this kind is what we might call the comedy of Newtonian physics. It is a farce of gravity and force, and as such, it is based on the premise that the object world is fundamentally intransigent, hostile to the human body. In this Krazy-Kat or Keystone world, every brick, every mop is a tightly wound spring of kinetic energy, always ready to uncoil, violently and without motivation.[12] It is worth remarking, then, that Delicatessen contains its share of knockabout: the Rube Goldberg suicide machines, the postman always tumbling down the stairs. In its most familiar moments, Delicatessen, in keeping with its comic predecessors, seems to suggest that the human body is irreparably out of joint with its environment.

A first distinction is necessary here, for though Delicatessen may embrace the sadism of slapstick, it does so with a historical specificity of its own. Classic slapstick typically addresses itself to the place of the body under urban and industrial capitalism; one is pretty much obliged at this point to adduce Chaplin’s Modern Times (1936), with its scenes of working-class mayhem and man-eating machines. Delicatessen, by contrast, contains man-eaters of its own, but they are not metaphorical man-eaters, as Chaplin’s machines are—they are cannibals true and proper, and their presence adds a certain complexity to the question of the film’s genre, for there have appeared so many films about cannibalism over the last twenty years that they virtually constitute a minor genre of their own.[13] One way to describe Delicatessen’s achievement, then, is to say that it splices together classic slapstick with the cannibal film. There will be no way to appreciate what this means, however, until we have determined the status of the cannibal in contemporary cinema. Broadly speaking, images of the cannibal tend to participate in one of two discourses: Historically, they have played a rather repugnant role in the racist repertoire of colonial clichés. Cannibalism is one of the more extreme versions of the imperial Other, the savage who does not respect even the most basic of civilization’s taboos. Increasingly, however, in films such as Eat the Rich (1987) or Dawn of the Dead (1978), cannibalism has become a conventional (and more or less satirical) image of Europeans and Americans themselves—an image, that is, of consumerism gone awry, of a consumerism that has liquidated all ethical boundaries, that has sunk into raw appetite, without restraint.[14] For present purposes, this point is nowhere clearer than in Delicatessen’s final chase scene, in which the cannibalistic tenants of the film’s apartment house gather to hunt down the film’s hero. 
The important point here is that, within the conventions of classic Hollywood comedy, the film makes a conspicuous substitution, for our comic hero is not on the run from some exasperated factory foreman or broad-shouldered cop on the beat, as silent slapstick would have it. He is fleeing, rather, from a consumer mob, E.P. Thompson’s worst nightmare, some degraded, latter-day bread riot. It is important that we appreciate the full ideological force of this switchover: By staffing the old comic scenarios with kannibals instead of kops, the film is able to transform slapstick in concrete and specifiable ways. The cannibals mean that when Delicatessen revives Chaplin-era slapstick, it does so without Chaplin’s factories or Chaplin’s city. This is slapstick for some other, later stage of capitalism—modernist comedy from which modernist industry has disappeared, leaving only consumption in its place.

2) Slapstick, then, announces a pressing political problem, in Delicatessen as in silent comedy. It sounds an alarm on behalf of the besieged human body. Delicatessen’s project, in this sense, is to imagine that problem’s solution, to mount a counterattack, to ward off the principle of slapstick by shielding the human body from its batterings. The deranged, consumption-mad crowd, in this light, is one, decidedly sinister version of the collective, but it finds its counterimage here in a second collective, a radical collective—the vegetarian insurgency that serves as ethico-political anchor to the film. Or to be more precise: The film is a fantasy about the conditions under which an advanced consumer capitalism could be superseded, and in imagining those conditions, it follows two different tracks: One of the film’s subplots follows the efforts of the anti-consumerist underground, the Troglodytes, while a second subplot stages a fairly ordinary romance between the clown-hero and a butcher’s daughter. Delicatessen thus divides its utopian energies between the revolutionary collective, depicted here as some lunatic version of la Résistance, and the heterosexual couple, imagined in impeccably Adornian fashion as the last, desperate repository of human solidarity, the faint afterimage of a non-instrumental relationship in a world otherwise given over to instrumentality.[15]

But this pairing does not exhaust the film’s political imagination, if only because knockabout does not exhaust the possibilities of slapstick. Delicatessen, in fact, is more revealing when it refuses roughhouse and shifts instead into one of slapstick’s other modes. Consider the key scene, early in the film, when the clown-hero, who has been hired as a handyman in the cannibal house, hauls out a bucket of soapy water to wash down the stairwell. The bucket, of course, is another slapstick icon, and anyone already cued in to the film’s generic codes might be able to predict how the scene will play out. Classic slapstick would dictate that the hero’s foot get wedged inside the bucket, that he skid helplessly across the ensuing puddle, that the mop pivot into the air and crack him in the forehead, that he somersault finally down the stairs. The important point, of course, is that no such thing happens. The clown does not get his pummeling. On the contrary, he uses his cleaning bucket to fill the hallway of this drear and half-inhabited house with giant, wobbling soap-bubbles, with which he then dances a kind of shimmy. It is in this moment, when the film pointedly repudiates the comedy of abuse, that the film modulates into a different tradition of screen comedy, what Mark Winokur has called “transformative” or “tramp” comedy.

The hallway scene, in other words, is Chaplin through and through. It is important, then, to specify the basic structure of the typical Chaplin gag—and to specify, in particular, what distinguishes Chaplin from the generalized brutality and bedlam of the Keystone shorts. Chaplin’s bits are so many visual puns: they work by taking an everyday object and finding a new and exotic use for it, turning a roast chicken into a funnel, or a tuba into an umbrella stand, or dinner rolls into two dancing feet.[16] In Delicatessen, such transformative comedy is apparent in the New Year’s Eve noisemaker that the frog-man uses as a tongue, to catch flies; or in the hero’s musical saw, which, in fact, is the very emblem of the film’s many objects—an implement liberated from its pedestrian uses, a tool that yields melody, a dumb commodity suddenly able to speak again, and not just to shill, but to murmur of new possibilities. It is in transformative comedy, then, in the spectacle of objects whose use has been transposed, that slapstick takes on a utopian function. Slapstick becomes, so to speak, its own solution: Knockabout slapstick, in which objects are perpetually in revolt against the human body, finds its redemption in transformative slapstick, in which the human body discovers a new and unexpected affinity with objects. The pleasure that is distinctive of Delicatessen is thus actually some grand comic version of Kant’s aesthetics, of Kant’s Beauty, premised as it is on the dawning and grateful realization that objects are ultimately and against all reasonable expectation suited to human capacities. Delicatessen reimagines the world as a perpetual pas de deux with the inanimate.[17]

Transformative slapstick, this is all to say, functions in Delicatessen as a kind of antidote to cannibalistic forms of consumption. At its most schematic, the film faces its viewers with a choice between two different ways of relating to objects: a cannibalistic relationship, in which the object will be destroyed by the consumer’s unchecked hunger, or a Chaplinesque relationship, in which the object will be kept alive and continually reinvented. And so at a moment when cinematic realism has fallen into a state of utter disrepair, when realism finds it can do nothing but script elegies for the working class—when even fine films like Ken Loach’s Ladybird Ladybird (1994) and Zonca’s Dream Life of Angels (1998) have opted for the funereal, with even the protest drubbed out of them—it falls to Delicatessen’s grotesquerie to fulfill realism’s great utopian function, to keep faith, as Bazin said, with mere things, “to allow them first of all to exist for their own sakes, freely, to love them in their singular individuality.”[18]

It is crucial, however, that we not confine this observation to Delicatessen, because in that film’s endeavor lies the buried aspiration of all retro-culture, even (or especially) at its most fetishistic. If you examine the signs that hang next to the objects at Restoration Hardware and other such retro-marts—these small placards that invent elaborate and fictional histories for the objects stacked there for sale—you will discover a culture recoiling from its commodities in the very act of acquiring them, a culture that thinks it can drag objects back into the magic circle if only it can learn to consume them in the right way. Retro-commodities utterly collapse our usual Benjaminian distinctions between the fetish and the aura, and they do so by taking as their fundamental promise what Benjamin calls  “the revolutionary energies that appear in the ‘outmoded,’” the notion that if you know the history of an item or if you can aestheticize even the most ordinary of objects—a well-wrought dustpan, perhaps, or a chrome toaster—then you are never merely buying an object; you are salvaging it from the sphere of circulation, and perhaps even from the tawdriness of use.[19]

This is not yet to say enough, however, because it is the achievement of Delicatessen to demonstrate that this retro-utopia is unthinkable without the apocalypse. For if the objects in Delicatessen achieve a luminosity that is denied even the most exquisite retro-commodities, then this is only because they occupy a ruined landscape, in which they come to seem singular and irreplaceable. Delicatessen is a film whose characters are forever scavenging for objects, scrapping over parcels that have gone astray, rooting through the trash like so many hobos or German Greens. It is the film’s fundamental premise, then, that in a time of shortage, and in a time of shortage alone, objects will slough off their commodity status. They will crawl out from under the patina of mediocrity that the exchange relationship ordinarily imposes on them. If faced with shortage, each object will come to seem unique again, fully deserving of our attention. There is a startling lesson here for anyone interested in the history of utopian forms: that utopia can require suffering, or at least scarcity, and not abundance; that the classical utopias of plenty—those Big Rock Candy Mountains with their lemonade springs and cigarette trees and smoked hams raining from the sky—are, under late capital, little more than hideous afterimages of the marketplace itself, spilling over with redundant and misdistributed goods, stripped of their revolutionary energy; that a society of consumption must, however paradoxically, find utopia in its antithesis, which is dearth.[20] And so we come round, finally, to my original point: that we must have, alongside Jameson, a second way of positing the identity of retro-culture and the apocalypse, one that will take us straight back to Benjamin: Underlying retro-culture is a vision of a world in which commodity production has come to a halt, in which objects have been handed down, not for our consumption, but for our care.
The apocalypse is retro-culture’s deepest fantasy, its enabling wish.


[1] Jameson’s full comments can be found in the London Review of Books (Volume 23, Number 19, October 4, 2001). See also “Architecture and the Critique of Ideology,” in The Ideologies of Theory, Volume 2: The Syntax of History, pp. 35-60, esp. p. 41: “dialectical interpretation is always retrospective, always tells the necessity of an event, why it had to happen the way it did; and to do that, the event must already have happened, the story must already have come to an end.”

[2] This essay is available in multiple versions. The easiest to come by is perhaps “Postmodernism and Consumer Society,”  in The Cultural Turn (London: Verso, 1998), pp. 1-20; and the most densely argued “The Cultural Logic of Late Capitalism” in Postmodernism, or The Cultural Logic of Late Capitalism (Durham: Duke, 1991), pp. 1-54.

[3] The Seventh Sign, for what it’s worth, draws on at least four different genres: 1) It is, at the most general level, a Christian apocalypse narrative; its nominal subject is the End Time, the series of catastrophes set in motion by God in preparation for His final judgment. 2) But in doing so, it deploys most of the conventions of the occult horror film. Even though the film expressly states that God is responsible for the disasters depicted, it cannot help but stage those disasters as supernatural and scary, in sequences borrowed more or less wholesale from the exorcism and devil-child movies of the 1970s, which is to say that viewers are expected to experience God’s actions as essentially diabolical. The film may adorn itself with Christian trappings, but in a manner typical of the Gothic, it cannot, finally, represent religion as anything but frightening. 3) This last point is clearest in the film’s depiction of Jesus Christ, who actually appears as a character and is almost always filmed in shots lifted from serial-killer films—Jesus stands alone, isolated in ominous medium long-shots, his face half in shadow, lit starkly from the side. Jesus’ menace is also a plot point: Christ, in the film, rents a room from Demi Moore and, in a manner that recalls Pacific Heights (1990) or The Hand That Rocks the Cradle (1992), becomes the intruder in the suburban home, the malevolent force that the white professional family has mistakenly welcomed under its roof. 4) In its final logic, then, the film reveals itself to be just a disaster movie in disguise: The Apocalypse must be scuttled. Christ must be sent back to heaven (and thus evicted from the suburban home). Justice must be averted.

[4] I owe this point to a conversation with Roger Beebe. Even here, though, matters are more complicated than they at first seem. Hip-hop, after all, hardly dispenses with irony and pastiche altogether: Jay-Z has sampled “It’s the Hard-Knock Life” (from Annie) and Missy Elliott has sampled Mozart’s Requiem, but no one is likely to suggest that hip-hop is establishing a genetic link back to the Broadway musical or Viennese classicism.

[5] Of course, as a nationalist project, retro will play out differently in different national contexts. Perhaps a related cinematic example will make this clear. Consider Jeunet’s Fabuleux destin d’Amélie Poulain (2001). At the level of diegesis—as a plain matter of plot and dialogue and character—the film has nothing at all to do with nationalism. On the contrary, it dedicates an entire subplot to undermining the provincialism of one of its characters, Amélie’s father, who resolves at movie’s end to become more cosmopolitan. The entire film is directed towards getting him to leave France. But at the level of form, things look rather different. Formally, the film is retro through and through. It won’t take a cinephile to notice the overt references to Jules et Jim (1962) and Zazie dans le métro (1960), at which point it becomes clear that Amélie is a pastiche of the French New Wave, which is thereby transformed into a historical artifact of its own. Amélie, then, attempts to recreate the nouvelle vague, not with an eye to making it vital again as an aesthetic and political project, but merely to cycle exhaustively through its techniques, its stylistic tics, as though it were compiling some kind of visual compendium. The nationalism that the film’s narrative explicitly rejects thus reappears as a matter of form. Amélie works to draw our attention to the Frenchness of the New Wave, to codify it as a national style, and the presumed occasion for the film is therefore the ongoing battle, in France, over the Americanization of la patrie. Amélie is a bulldozer looking for its McDonald’s.

[6] See Jameson’s “The Antinomies of Postmodernism,” in The Cultural Turn, pp. 50-72, quotation p. 50.

[7] See “The Cultural Logic of Late Capitalism,” in Postmodernism or, The Cultural Logic of Late Capitalism (Durham: Duke, 1991), pp. 1-54, quotation p. 46.

[8] The second quotation cited here goes on to make this point clear: Retro-culture, Jameson continues, “abandon(s) the thinking of future change to fantasies of sheer catastrophe and inexplicable cataclysm, from visions of ‘terrorism’ on the social level to those of cancer on the personal.”

[9] The Truman Show, to be fair, does hedge the matter somewhat. The film’s numerous cutaways to the show’s viewers show a “real world” that is itself populated by TV-thralls, Truman Burbanks of a lower order. So when Truman steps out of his videodrome, we have a choice: We can either conclude, in proper Lacanian fashion, that Truman has simply traded one media-governed pseudo-reality for another. Or we can conclude that the film is asking us to distinguish between those, like Truman, who are able to shrug off their media masters, and those, like his viewers, who aren’t. I take this to be the film’s constitutive hesitation, its undecidable question.

[10] See Adorno’s “Portrait of Walter Benjamin” in Prisms, translated by Samuel and Shierry Weber (Cambridge: MIT, 1981, pp. 227-241), here p. 240.

[11] Examples of these last can be found, but it takes some looking: Paul Verhoeven’s Starship Troopers is a retro World War II movie, more so than Pearl Harbor (2001) or Saving Private Ryan (1998), which aspire to be historical dramas; and the Coen brothers’ Hudsucker Proxy (1994) is unmistakably a retro-screwball (and such a lovely thing that it’s a wonder others haven’t followed its lead). But they are virtually the lone examples of their kinds, singular members of non-existent sets. Neo-noir, by contrast, has become too extensive a genre to list comprehensively.

[12] Perhaps a rare instance of literary slapstick, manifestly modeled on cinematic examples, will drive this point home. The following is from Martin Amis’s Money (London: Penguin, 198?), p. 289: “What is it with me and the inanimate, the touchable world? Struggling to unscrew the filter, I elbowed the milk carton to the floor. Reaching for the mop, I toppled the trashcan. Swivelling to steady the trashcan, I barked my knee against the open fridge door and copped a pickle-jar on my toe, slid in the milk, and found myself on the deck with the trashcan throwing up in my face … Then I go and good with the grinder. I took the lid off too soon, blinding myself and fine-spraying every kitchen cranny.”

[13] See, for instance, Eating Raoul (1982); Parents (1989); The Cook, The Thief, His Wife, and Her Lover (1989); and, in a different mood, The Silence of the Lambs (1991) and Hannibal (2001).

[14] On the cultural uses of cannibalism, see Cannibalism and the Colonial World, edited by Francis Barker, Peter Hulme, and Margaret Iversen (Cambridge: Cambridge, 1998), especially Crystal Bartolovich’s “Consumerism, or the cultural logic of late cannibalism” (pp. 204-237).

[15] For a discussion of Delicatessen that pays closer attention to the film’s narrowly French contexts—its nostalgia for wartime, its debt to French comedies—see Naomi Greene’s Landscapes of Loss: The National Past in Postwar French Cinema (Princeton: Princeton, 1999).

[16] See, respectively, Modern Times (1936); The Pawnshop (1916); The Gold Rush (1925).

[17] There’s a sense in which this operation is at work even in the most vicious knockabout. Even the most paradigmatically abusive comedies—the Keystone shorts, say—are redemptive in that the staging of abuse itself discloses a joyous physical dexterity. The staging of bodies out of synch with the inanimate world relies on bodies that are secretly very much in synch with that world—and this small paradox characterizes the pleasure peculiar to those films.

[18] Bazin, What is Cinema?, translated by Hugh Gray (Berkeley: UCalifornia, 1967); see also Siegfried Kracauer’s Theory of Film: The Redemption of Physical Reality (New York: Oxford, 1965).

[19] See Benjamin’s “Surrealism: The Last Snapshot of the European Intelligentsia,” translated by Edmund Jephcott in the Selected Writings: Volume 2, 1927-1934, edited by Michael Jennings, Howard Eiland, and Gary Smith (Cambridge: Belknap, 1999, pp. 207-221), here p. 210.

[20] Compare Langlé and Vanderburch’s utopia of abundance, as noted by Benjamin himself, in the 1935 Arcades-Project Exposé (in The Arcades Project, translated by Howard Eiland and Kevin McLaughlin—Cambridge: Belknap, 1999, pp. 3-13), here p. 7:

“Yes, when all the world from Paris to China

Pays heed to your doctrine, O divine Saint-Simon,

The glorious Golden Age will be reborn.

Rivers will flow with chocolate and tea,

Sheep roasted whole will frisk on the plain,

And sautéed pike will swim in the Seine.

Fricasseed spinach will grow on the ground,

Garnished with crushed fried croutons;

The trees will bring forth stewed apples,

And farmers will harvest boots and coats.

It will snow wine, it will rain chickens,

And ducks cooked with turnips will fall from the sky.”

(Translation altered)

To the Political Ontologists

The political ontologists have their work cut out for them. Let’s say you believe that the entire world is made out of fire: Your elms and alders are fed by the sky’s titanic cinder; your belly is a metabolic furnace; your lungs draw in the pyric aether; the air that hugs the earth is a slow flame—a blanket of chafing-dish Sterno—shirring exposed bumpers and cast iron fences; water itself is a mingling of fire air with burning air. The cosmos is ablaze. The question is: How are you going to derive a political program from this insight, and in what sense could that program be a politics of fire? How, that is, are you going to get from your ontology to your political proposals? For if fire is not just a political good, but is in fact the very stuff of existence, the world’s primal and universal substance, then it need be neither produced nor safeguarded. No merely human arrangement—no parliament, no international treaty, no tax policy—could dislodge it from its primacy. It will no longer make sense to describe yourself as a partisan of fire, since you cannot be said to defend something that was never in danger, and you cannot be said to promote something that is everywhere already present. Your ontology, in other words, has already precluded the possibility that fire is a choice or that it is available only in certain political frameworks. This is the fate of all political ontologies: The philosophy of all-being ends up canceling the politics to which it is only superficially attached. The –ology swallows its adjective.

The task, then, when reading the radical ontologists—the Spinozists, the Left Heideggerians, the speculative realists—is to figure out how they think they can get politics back into their systems; to determine by which particular awkwardness they will make room for politics amidst the spissitudes of being. In its structure, this problem repeats an old theological question, which the political ontologists have merely dressed in lay clothes—the question, that is, of whether we are needed by God or the gods. If you have given in to the pressure to subscribe to an ontology, then this is the first question you should ask: Whatever is at the center of your ontology—does it need you? Does Becoming need you? Is Being incomplete without you? Has the cosmic fire deputized you? And if you decide that, no, the fire does not need you—if, that is, you resist the temptation to appoint yourself that astounding entity upon which even the Absolute depends—then you will have yourself already concluded that there is nothing exactly to be gained from getting your ontology right, and you will be free to think about other and more interesting things.

If, on the other hand, you are determined to ontologize, and determined additionally that your ontology yield a politics, there are, roughly speaking, three ways you can make this happen.

First, you could determine that even though fire is the primal stuff of the universe, it is nonetheless unevenly distributed across it; or that the cosmos’s seemingly discrete objects embody fire to greater and lesser degrees. The heavy-gauge universalism of your ontology will prevent you from saying outright that water isn’t fire, but you might conclude all the same that it isn’t very good fire. This, in turn, would allow you to start drawing up league tables, the way that eighteenth-century vitalists, convinced that the whole world was alive, nonetheless distinguished between vita maxima and vita minima. And if you possess ontological rankings of this kind, you should be able to set some political priorities on their basis, finding ways to reward the objects (and people? and groups?) that carry their fiery qualities close to the surface, corona-like, and, equally, to punish those objects and people who burn but slowly and in secret. You might even decide that it is your vocation to help the world’s minimally fiery things—trout ponds, shale—become more like its maximally fiery things—volcanoes, oil-drum barbecue pits. The pyro-Hegelian takes it upon himself to convert the world to fire one timber-framed building at a time.

Alternately—and herewith a second possibility—you can proclaim that the cosmos is made of fire, but then attribute to humanity an appalling power not to know this. “Power” is the important word here, since the worry would have to be that human ignorance on this point could become so profound that it would damage or dampen the world-flame itself. Perhaps you have concluded that fire is not like an ordinary object. We know in some approximate and unconsidered way what it is; we are around it every day, walking in its noontide light, enlisting it to pop our corn, conjuring it from our very pockets with a roll of the thumb or knuckly pivot. And yet we don’t really understand the blaze; we certainly do not grasp its primacy or fathom the ways we are called upon to be its Tenders. You might even have discovered that we are the only beings, the only guttering flames in a universe of flame, capable of defying the fire, proofing the world against it, rebuilding the burning earth in gypsum and asbestos, perversely retarding what we have been given to accelerate. This argument expresses clear misgivings about humanity; it doesn’t trust us to keep the fire stoked; and to that extent it partakes of the anti-humanism that is all but obligatory among political ontologists. And yet it shares with humanism the latter’s sense that human beings are singular, a species apart, the only beings in existence capable of living at odds with the cosmos, capable, that is, of some fundamental ontological misalignment, and this to a degree that could actually abrogate an ontology’s most basic guarantees. From a rigorously anti-humanist perspective, this position could easily seem like a lapse—the residue of the very anthropocentrism that one is pledged to overcome—but it is in fact the most obvious opening for an anti-humanist politics (as opposed, say, to an anti-humanist credo), since you really only get a politics once the creedal guarantees have been lifted. 
If human beings are capable of forgetting the fire, someone will have to call to remind them. Someone, indeed, will have to ward off the ontological catastrophe—the impossible-but-somehow-still-really-happening nihilation of the fire—the Dousing.

That said, a non-catastrophic version of this last position is also possible, though its politics will be accordingly duller. Maybe duller is even a good thing. Such, at any rate, is the third pathway to a political ontology: You might consider arguments about being politically germane even if you don’t think that humanity’s metaphysical obtuseness can rend the very tissue of existence. You don’t have to say that we are damaging the cosmic fire; it will be enough to say that we are damaging ourselves, though having said that, you are going to have to stop trying to out-anti-humanize your peers. Your position will now be that not knowing the truth about the fire-world deforms our policies; that if we mistake the cosmos for something other than flame, we are likely to attempt impossible feats—its cooling; its petrification—and will then grow resentful when these inevitably fail. You might, in the same vein, determine that there are entire institutions dedicated to broadcasting the false ontologies that underwrite such doomed projects, doctrines of air and doxologies of stone, and you might think it best if such institutions were dismantled. If it’s politics we’re talking about, you might even have plans for their dismantling. Even so, you will have concluded by this point that the problem is in its essentials one of belief—the problem is simply that some people believe in water—in which case, ontology isn’t actually at issue, since nothing can happen ontologically; the fire will crackle on regardless of what we think of it, indifferent to our denials and our elemental philandering. You have thus gotten the politics you asked for, but only having in a certain sense bracketed the ontology or placed it beyond political review. And your political program will accordingly be rather modest: a new framework of conviction—a clarification—an illumination.

Still, even a modest politics sometimes shows its teeth. William Connolly, in a book published in 2011, says that the world-fire is burning hotter than it has ever burnt; the problem is, though, that some “territories … resist” the flame. What we don’t want to miss is the basically militarized language of that claim: “resisting territories” suggests backwaters full of ontological rednecks; Protestant Austrian provinces; the Pyrenees under Napoleon; Anbar. Connolly’s notion is that these districts will need to be enlightened and perhaps even pacified, whereupon political ontology outs itself as just another program of philosophical modernization, a mopping-up operation, the People of the Fire’s concluding offensive against the People of the Ice. Don’t fight it, Connolly, in this way, too, an irenicist, instructs the existentially retrograde. Let it burn.

The all-important point, then, is that there is absolutely no reason to get hung up on the word “fire,” in the sense that there is no more sophisticated concept you can put in its place that will make these problems go away: not Being, not Becoming, not Contingency, not Life, not Matter, not Living Matter. Go ahead: Choose your ontological term or totem and mad-lib it back into the last six paragraphs. Nothing else about them will change.

• • •

Anyone wanting to read Connolly’s A World of Becoming, or Jane Bennett’s Vibrant Matter, its companion piece from 2010, now has some questions to ask. The two books share a program:

-to survey theories of chaos, complexity; to repeat the pronouncements of Belgian chemists who declare the end of determinism; and then to resurrect under the cover of this new science a much older intellectual program—a variously Aristotelian, Paracelsian, and hermetic strain in early modern natural philosophy, which once posited and will now posit again a living cosmos a-go-go with active forces, a universe whose intricate assemblages of self-organizing systems will frustrate any attempt to reduce them back to a few teachable formulas;

-or, indeed, to trade in “science” altogether in favor of what used to be called “natural history,” the very name of which strips nature of its pretense to permanence and pattern and nameable laws and finds instead a universe existing wholly in time, as fully exposed to contingency, mutation, and the event as any human invention, with alligators and river valleys and planets now occupying the same ontological horizon as two-field crop rotation and the Lombard Leagues;

-to recklessly anthropomorphize this historical cosmos, to the point where that entirely humanist device, which everywhere it looks sees only persons, tips over into its opposite, as humanity begins divesting itself of its specialness, giving away its privileges and distinguishing features one by one, and so produces a cosmos full of more or less human things, active, volatile, underway—a universe enlivened and maybe even cartoonish, precisely animated, staffed by singing toasters and jitterbugging hedge clippers.

I wouldn’t blame anyone for finding this last idea rather winning, though one problem should be noted right away, which is that Connolly, in particular, despite getting a lot of credit for bringing the findings of the natural sciences into political theory—and despite repeating in A World of Becoming his earlier admonition to radical philosophers for failing to keep up with neurobiology and chemistry and such—really only quotes science when it repeats the platitudes of the old humanities. The biologist Stuart Kauffman has, Connolly notes, “identified real creativity” in the history of the cosmos or of nature. Other research has identified “degrees of real agency” in a “variety of natural-social processes.” The last generation of neuroscience has helped specify the “complexity of experience,” the lethal and Leavisite vagueness of which phrase should be enough to put us on our guard. It turns out that the people who will save the world are still the old aesthetes; it’s just that their banalities can now borrow the authority of Nobel Laureates (always, in Connolly, named as such). Of one scientific finding Connolly notes: “Mystics have known this for centuries, but the neuroscience evidence is nice to have too.” That will tell you pretty much everything you need to know about the role of science in the new vitalism, which is that it gets adduced only to ratify already held positions. This is interdisciplinarity as narcissistic mirror.

But we can grant Connolly his fake science—or rather, his fake deployment of real science. The position he and Bennett share—that the cosmos is full of living matter in a constant state of becoming—isn’t wrong just because it’s warmed-over Ovid. What really needs explaining is just which problems the political philosophers think this neuro-metamorphism is going to solve. More to the point, one wonders which problems a vitalist considers still unsolved. If Bennett and Connolly are right, then is there anything left for politics to do? Has Becoming bequeathed us any tasks? Won’t Living Matter get by just fine without us? And if there is no political business yet to be undertaken, then in what conceivable sense is this a political philosophy and not an anti-political one?

The real dilemma is this: There are those three options for getting a politics back into ontology—you can devise an ontological hierarchy; you can combat ontological Vergessenheit; or you can promote ontological enlightenment. Bennett and Connolly don’t like two of these, and the third one—the one they opt for—ends up canceling the ontology they mean to advocate. I’ll explain.

Option #1: Hierarchy could work. Bennett and Connolly could try to distinguish between more and less dynamic patches of the universe—or between more and less animate versions of matter—but they don’t want to do that. The entire point of their philosophical program is a metaphysical leveling; witness that defense of anthropomorphism. Bennett, indeed, uses the word “hierarchical” only as an insult, the way that liberals and anarchists and post-structuralists have long been accustomed to doing. Having only just worked out that all of matter has the characteristics of life, she is not about to proclaim that some life forms are more important than others. Her thinking discloses a problem here, if only because it reminds one of how difficult it has been for the neo-vitalists to figure out when to propose hierarchies and when to level them, since each seems to come with political consequences that most readers will find unpalatable. Bennett herself worries that a philosophy of life might remove certain protections historically afforded humans and thus expose them to “unnecessary suffering.” She positions herself as another trans- or post-humanist, but she doesn’t want to give up on Kant and the never really enforced guarantees of a Kantian humanism; she thinks she can go over to Spinoza and Nietzsche and still arrive at a roughly Left-Kantian endpoint. “Vital materialism would … set up a kind of safety net for those humans who are now … routinely made to suffer.” That idea—which sounds rather like the Heidegger of the “Letter on Humanism”—is, of course, wrong. Bennett is right to fret. A vitalist anti-humanism is indeed rather cavalier about persons, as her immediate predecessors and philosophical mentors make amply clear. The hierarchies it erects are the old ones: Michael Hardt and Toni Negri think it is a good thing that entire populations of peasants and tribals were wiped out because their extermination increased the vital energies of the system as a whole.
And if vitalism’s hierarchies produce “unnecessary suffering,” well, then so do its levelings: Deleuze and Guattari think that French-occupied Africa was an “open social field” where black people showed how sexually liberated they were by fantasizing about “being beaten by a white man.”

Option #2: They could follow the Heideggerian path, which would require them to show that humanity is a species with weird powers—that humans (and humans alone) can fundamentally distort the universe’s most basic feature or hypokeimenon. That would certainly do the political trick. Vitalism would doubtless take on an urgency if it could make the case that human beings were capable of dematerializing vibrant matter—or of making it less vibrant—or of pouring sugar into the gas tank of Becoming. But Bennett and Connolly are not going to follow this path either, for the simple reason that they don’t believe anything of the sort. Their books are designed in large part to attest the opposite—that humanity has no superpowers, no special role to play nor even to refuse to play. Early on, Bennett praises Spinoza for “rejecting the idea that man ‘disturbs rather than follows Nature’s order.’” We’ll want to note that Spinoza’s claim has no normative force; it’s a statement of fact. We don’t need to be talked out of disturbing nature’s order, because we already don’t. The same grammatical mood obtains when Bennett quotes a modern student of Spinoza: “human beings do not form a separate imperium unto themselves.” We “do not”—the claim in its ontological form means could not—stand apart and so await no homecoming or reunion.

Those sentences sound entirely settled, but there are other passages in Vibrant Matter when you can watch in real time as such claims visibly neutralize the political programs they are being called upon to motivate. Here’s Bennett: “My hunch is that the image of dead or thoroughly instrumentalized matter feeds human hubris and our earth-destroying fantasies of conquest and consumption.” On a quick read you might think that this is nothing more than a little junk Heideggerianism—that techno-thinking turns the world into a lumberyard, &c. But on closer inspection, the sentence sounds nothing like Heidegger and is, indeed, entirely puzzling. For if it is “hubris” to think that human beings could “conquer and consume” the world—not hubris to do it, but hubris only to think it, hubris only in the form of “fantasy”—then in what danger is the earth of actually being destroyed? How could mere imagination have world-negating effects and still remain imagination? Bennett’s position seems to be that I have to recognize that consuming the world is impossible, because if I don’t, I might end up consuming the world. Her argument only gains political traction by crediting the fantasy that she is putatively out to dispel. Or there’s this: Bennett doesn’t like it when a philosopher, in this instance Hannah Arendt, “positions human intentionality as the most important of all agential factors, the bearer of an exceptional kind of power.” Her book’s great unanswered question, in this light, is whether she can account for ecological calamity, which is perhaps her central preoccupation, without some notion of human agency as potent and malign, if only in the sense that human beings have the capacity to destroy entire ecosystems and striped bass don’t.
The incoherence that underlies the new vitalism can thus be telegraphed in two complementary questions: If human beings don’t actually possess exceptional power, then why is it important to convince them to adopt a language that attributes to them less of it? But if they do possess such power, then on what grounds do I tell them that their language is wrong?

Option #3: Enlightenment it is, then. What remains, I mean, for both Connolly and Bennett, is the simple idea that most people subscribe to a false ontology and are accordingly in need of re-education. Connolly describes himself and his fellow vitalists as “seers”—he also calls them “those exquisitely sensitive to the world”—and he more than once quotes Nietzsche referring to everyone else, the non-seers, the foggy-eyed, as “apes.” I don’t much like being called an orangutan and know others who will like it even less, but at least this rendering of Bennett/Connolly has the possible merit of making the object-world genuinely autonomous and so getting the cosmos out from under the coercions of thought. Our thinking might affect us, but it cannot affect the universe. But there is a difficulty even here—the most injurious of political ontology’s several problems, I think—which is that via this observation philosophy returns magnetically to its proper object—or non-object—which is thought, and we realize with a start that the only thing that is actually up for grabs in these new realist philosophies of the object is in fact our thinking personhood. This is really quite remarkable. Bennett says that the task facing contemporary philosophy is to “shift from epistemology to ontology,” but she herself undertakes the dead opposite. She has precisely misnamed her procedure: “We are vital materiality,” she writes, “and we are surrounded by it, though we do not always see it that way. The ethical task at hand here is to cultivate the ability to discern nonhuman vitality, to become perceptually open to it.” There is nothing about her ontology that Bennett feels she needs to work out; it is entirely given.
The philosopher’s commission is instead to devise the moralized epistemology that will vindicate this ontology, and which will, in its students, produce “dispositions” or “moods” or, as Connolly has it, a “working upon the self” or the “cultivation of a capacity” or a “sensibility” or maybe even just another intellectual “stance.” Connolly and Bennett have lots of language for describing mindsets and almost no language for describing objects. Their arguments take shape almost entirely on the terrain of Geist. They really just want to get the subjectivity right.

There are various ways one might bring this betrayal of the object into view, in addition to quoting Bennett and Connolly’s plain statements on the matter. Among the great self-defeating deficiencies of these books are the fully pragmatist argumentative procedures adopted by their authors, who adduce no arguments in favor of their chosen ontology. Bennett allows that her position is really just an “experiment” with different ways of “narrating”; an “experiment with an idea”; a “thought experiment,” Connolly says. “What would happen to our thinking about nature if…” The post-structuralism that both philosophers think they’ve put behind them thus survives intact. But such play with discourse is, of course, entirely inconsistent with a robust philosophy of objects, premised as it is on the idea that the object exerts no pressure on the language we use to describe it, which indeed we elect at will. The mind, as convinced of its freedom as it ever was, chooses a philosophical idiom just to see what it can do.

This problem—the problem, I mean of an object-philosophy that can’t stop talking about the subject—then redoubles itself in two ways:

– The problem is redoubled, first, in the blank epiphanies of Bennett’s prose style, and especially when she makes like Novalis on the streets of Baltimore, putting in front of readers an assemblage of objects the author encountered beneath a highway underpass so that we can imagine ourselves beside her watching them pulsate. The problem is that she literally tells us nothing about these items except that she heard them chime. One wants to say that she chose four particular objects—a glove, pollen, a dead rat, and a bottle cap—except that formulation is already misleading, since lacking further description, these four objects really aren’t particular at all. They are sham specificities, for which any other four objects could have served just as well. She could have changed any or all of them—could have improvised any Borgesian quartet—and she would have written that page in exactly the same manner. You can suggest your own, like this:

-a sock, some leaves, a lame squirrel, and a soda can

-a castoff T-shirt, a fallen tree limb, a hungry kitten, and an empty Cheetos bag

-a bowler hat, a beehive, a grimy parasol, and Idi Amin

These aren’t objects; these are slots; and Bennett’s procedure is to that extent entirely abstract. This is what it means to say that materialism, too, is just another philosophy of the subject. It does no more or less than any other intellectual system, maintaining the word “object” only as a vacancy onto which to project its good intentions.

– The problem is redoubled, second, in the nakedly religious idiom in which these two books solemnize their arguments. That idiom, indeed, is really just pragmatism in cassock and cope. The final page of Bennett’s book prints a “Nicene Creed for would-be vital materialists.” Connolly’s book begins by offering its readers “glad tidings.” Nor does the latter build arguments or gather evidence; he “confesses” a “philosophy/faith,” which is also a “faith/conviction,” which is also a “philosophy/creed.” Bennett and Connolly hold vespers for the teeming world. Eager young materialists, turning to these books to help round out their still developing views, must be at least somewhat alarmed to discover that our relationship to matter is actually one of “faith” or “conviction.” A philosophical account of the object is replaced by a pledge—a deferral—a promise, by definition tentative, offered in a mood of expectancy, to take the object on trust. Nor is this in any way a gotcha point. Connolly is completely open about his (Deleuzian) aim “to restore belief in the world.” It’s just that no sooner is this aim uttered than the world undergoes the fate of anything in which we believe, since if you name your belief as belief, then you are conceding that your position is optional and to some considerable degree unfounded and that you do not, in that sense, believe it at all.

It’s not difficult, at any rate, to show that Connolly for one does not believe in his own book. The stated purpose of A World of Becoming is to show us how to “affirm” that condition. That’s really all that’s left for us to do, once one has determined that Becoming will go on becoming even without our help and even if we work against it. Connolly’s writing, it should be said, is generally short on case studies or named examples of emergent conjunctures, leaving readers to guess what exactly they are being asked to affirm. For many chapters on end, one gets the impression that the only important way in which the world is currently becoming is that more people from Somalia are moving to the Netherlands, and that the phrase “people who resist Becoming” is really just Connolly’s idiosyncratically metaphysical synonym for “racists.” But near the end of the book, three concrete examples do appear, all at once—three Acts of Becoming—two completed, one still in train: the 2003 invasion of Iraq; the 2008 financial collapse; and global warming. All three, if regarded from the middle distance, seem to confirm the vitalist position in that they have been transformative and destabilizing and will for the foreseeable future produce unpredictable and ramifying consequences. What is surprising—but then really, no, finally not the least bit surprising—is that Connolly uses a word in regard to these three cases that a Nietzschean committed to boundless affirmation shouldn’t be able to so much as write: “warning.” Melting icecaps are not to be affirmed—that’s Connolly’s own view of the matter. Mass foreclosure is not to be affirmed. Quite the contrary: If you know that the cosmos is capable of shifting suddenly, then you might be able to get the word out. The responsibility borne by philosophers shifts from affirmation to its opposite: Vitalists must caution others about what rushes on. 
The philosopher of Becoming thus asks us to celebrate transformation only until he runs up against the first change he doesn’t like.

This is tough to take in. Lots of things are missing from political ontology: politics, objects, an intelligible metaphilosophy. But surely one had the right to expect from a theorist of systemic and irreversible change, one with politics on his mind, some reminder of the possibility of revolution, some evocation, since evocations remain needful, of the joy of that mutation, the elation reserved for those moments when Event overtakes Circumstance. But in Connolly, where one might have glimpsed the grinning disbelief of experience unaccounted for, one finds only the bombed-out cafés of Diyala, hence fear, hence the old determination to fight the future. The philosopher of fire grabs the extinguisher. The philosopher of water walks in with a mop.

Thanks to Jason Josephson and everyone in the critical theory group at Williams College.

Illegals, Part 4






A new problem: What are we to say about stories that feature both allegorical and literal versions of the same thing, of the same class of object or type of person—about True Blood, for instance, whose vampires code comprehensively as queer even though the show also includes among its characters several mortal, day-walking gays and lesbians? This is a real problem, because the show seems to be drawing a distinction, prompting a rigorous reader, one perhaps suspicious of allegory, to insist that the vampires can’t possibly be in some general way stand-ins for queer folk because the show already possesses these latter, and they are not coterminous with the vampires. Placing an allegorical construct in the same room as its literal equivalent doesn’t, as one might suspect, make the allegory stronger or easier to explicate. Quite the contrary: The allegory and the literal referent are going to be locked in a struggle for the relevant name or meaning, and it’s not entirely clear which is going to have the upper hand in that fight. You might think that the literal term has home turf advantage. If a gay person and a vampire are standing next to each other, and I only get to call one of them “gay person,” I’m going to choose the gay person. That’s what it means to say that the presence of the literal term can prevent the allegory from coalescing, like the trace amounts of yolk that ruin your every attempt at meringue. But then hyperbole is at the heart of allegory—you create an allegorical version of x by exaggerating certain features of x—and in that case, the non-literal construct can easily seem like the better version of the thing, more fully and vividly itself, purer, pushed further away from the imaginary average against which all specific difference is gauged. 
If Dracula and Oscar Wilde double each other, I might decide that it is the vampire who is really queer, whereupon gay and lesbian people will find themselves outflanked, normal by comparison, conceptually maneuvered back into the ranks of dull humanity. The allegory can poach from the literal term its very name.

If we’re going to make sense of this particular deviant variety of allegory, it will help to have the terms provided by an unreformed structuralism, whose core insight was that all stories begin by generating some opposition or another: A and B, cowboy and Indian. The idea, then, is that since most of us experience oppositions as cockeyed and agitating, the business of nearly any story will be to stabilize its antithesis, though there are different ways a movie or novel or folktale might do this: by subordinating one term to another and perhaps by eliminating it altogether (cowboy defeats, expels, or guns down Indian); or, alternately, by fusing the two together into some unforeseen third (cowboy marries Indian). Storytelling can become more complicated, of course, as it begins shading in intermediate steps that already contravene the central opposition (the half-breed, say, or the white Indian) or as it appends secondary oppositions to its core one (cowboy and East Coast railroad interest). But nearly all storytelling is at heart a play with oppositions, and the trick when considering a complicated story is to discern the master antithesis (or small set of antitheses) that underpins its many more local conflicts. The remarkable thing, then, about stories that contain allegorical and literal versions of the same thing is that they sabotage this most basic feature of narrative; they monkeywrench the binary by plugging the same term into each of the opposition’s slots—once nakedly and then again in disguise—and thereby create reflexive stories that are not, however, immediately recognizable as such: cowboy and cowboy, teasingly and with the air of paradox.

That such stories pose special challenges should be clear from Spielberg’s War of the Worlds, released in 2005. Nearly every newspaper and magazine reviewer—and, I suspect, most ordinary fans—thought that the movie was about terrorism or that it was 9/11’s conversion into science fiction: It was “the first serious post-9/11 sci-fi movie,” “a 9/11 allegory,” a reminder that “terrorists can take out a big chunk of the Manhattan skyline,” a surprisingly solemn tour of the nation’s “worst terrorism nightmares.” The New York press took to warning its readers off the movie: “merciless,” they called it, and “shocking”—35mm PTSD. And it is certainly true, as the reviewers all mentioned, that the film is crammed with “allusions” and “parallels” and “references” to 9/11: civil emergency in greater New York, panicked urbanites sprinting down city blocks, overwhelmed beat cops, airplane wreckage, a wall of the missing, and—least generically, most jarringly—a rain of ash.

That War of the Worlds is not about terrorism one knows all the same, because it tells you as much, and in so many words—except, of course, one doesn’t know it; everybody missed it. The movie’s hero has two children, and as they escape from the attack, the younger one screams: “Is it the terrorists?”—and gets no answer. Then a minute or two later the older one repeats the question, more calmly this time: “What is it? The terrorists?” “No,” the father says, “this came from someplace else.” All the more remarkably, the film has already by this point identified that Someplace Else or Other Thing, the thing that isn’t terrorism. Some four minutes into the movie, Tom Cruise’s ex-wife instructs him to stay on top of their teenaged son over the weekend, because he has a research report due “on the French occupation of Algeria.” And there it is: The malicious gag underlying the movie is that the invading Martians give a high-school student all the material he needs to write a really bang-up paper about occupation or that they turn his assignment into a family project: This is the weekend everyone learns about empire.

War of the Worlds was thus a thought experiment or indeed a political education—one specific to the middle years of the Bush era: Can you imagine a force powerful enough to do to the US what the US has done to Iraq? Can you imagine, via analogy and extrapolation, a military wielding technological superiority over the US of a kind that the US currently wields over the world’s other nations? Or as one character says of the invaders: “They defeated the greatest power in the world in a matter of a couple of days. … This isn’t a war any more than there is a war between men and maggots.” What the reviewers inexplicably overlooked was that terrorists do not occupy entire countries. And that’s all you need to bear in mind to realize that Spielberg’s movie was in no sense an homage to 9/11—just the reverse—it was a deliberate and principled insult to the instant sanctity of that day, a way of putting 9/11 back into perspective by staging on the same terrain an event of incomparably greater magnitude, a way, that is, of showing the New Yorkers who were told to skip the movie just how much worse it could have been: Baghdad.

This is the sort of thing that becomes possible when allegory doubles its referent; such doubling is, indeed, one of the only ways that narrative can place the same term on both sides of an opposition; X fights X; the US invades the US; Americans as colonizers, Americans as colonized. This is the structure we’ll need to carry forward with us if we want to make sense now of Attack the Block, which is Super 8’s English twin, the other alien-invasion movie from 2011 that pulls in equal measure from ET and The Goonies: more adventuring tweens, more BMXs, more aliens that seem visible only to the pubescent. But then Attack the Block is also the first movie I’ve named that is openly about race in some entirely literal and earthbound sense. This is first of all a simple matter of casting: Almost none of the movie’s heroes are conventionally, ethnically English; all but one come from African or Caribbean immigrant families. If you haven’t seen the movie, it’s not enough to imagine The Goonies with English accents. You have rather to imagine The Goonies as new-model Cockneys, black and mixed-race and speaking grime patois. But then it’s not just the characters: Attack the Block is also telling a story about race; indeed, it is telling perhaps the most familiar racial story of the last few generations, the one about integration and enfranchisement. All you need to know is the bare outlines of the plot: Once they start fighting the movie’s aliens—and fight they do, to the death; the movie’s resemblance to ET and Super 8 ends there—the boys are transformed from the piece’s villains to its heroes. They begin the movie by mugging a young white nurse, but they end it by saving the day. In other words, it’s not just that Attack the Block is one of the most extensive pieces of black British pop culture yet produced, and in that sense some kind of landmark.
The movie is actually walking you through a reassessment of black Britain and can, to this extent, easily seem like an advance on that recent crop of movies that make the English poor seem like the worst people on earth, though it has to be said that those films’ chosen technique for communicating their sour insight is simply to remake Hollywood movies on English soil: Harry Brown, for instance, which casts Michael Caine as an East End vigilante and aitchless Eastwood—it’s there in the title, if you squint: “brown” = “smudged” or “unclean” = Harry Dirty; and especially the remarkable Eden Lake, which is Deliverance transplanted to a not-so-rural Buckinghamshire, with hoodie-wearing poor kids in the place of Georgia hillbillies: 13-year-old proletarians carving up their betters. These movies and others like them leave the impression that the British working class has simply gone feral—the impression, that is, that class relations in the UK have by this point simply snapped or that basic modes of sociability or decency or respect have disappeared, with dehumanized workers and lumpens stuck living in perpetuity on the far side of their old traditions. To a considerable extent, then, Attack the Block asks to be read as a polemical response to this cinema of broken Britain. The movie begins in the mode of Harry Brown and then simply demands that viewers revise their judgments. The respectable white audience’s designated proxy obtrusively changes her mind. At the beginning of the movie she and an older white neighbor commiserate: “They’re fucking monsters.” But by the end of the movie, she is telling the cops to back off from the bruvs: “I know them. They’re my neighbors.”

One way to summarize Attack the Block, then, would be to say that it is a story of uplift and interracial friendship, in which Britain redefines itself in order to make room for its newest members. Nor is it overreaching to mention Britain in this context; the film has the nation unmistakably on its mind. It is set on Guy Fawkes Day, for one, and so asks to be read as a redo of 1605—England’s second saving!—with West Indian yardies performing the patriotic gallantries once reserved for Protestant knights. More to the point, the movie’s 15-year-old hero, propelled in one scene out of a high window, saves himself by un-metaphorically clinging to Her Majesty’s flag.

The movie, in sum, revises British nationalism by pushing it in a liberal and multiethnic direction, though we will want to note that this observation is dogged by two persistent instabilities.

First: The film’s visuals might be plenty nationalist—all fireworks and Union Jacks—but its dialogue is not. Anything but: The film’s teenagers routinely say that they are fighting only to defend their housing project, their block. Where the movie is John-Bullish, the characters are instead intensely localist: “We wouldn’t have mugged you if we’d known you lived here.” That’s a sentiment available only to someone whose sense of the imagined community stops cold at the corner shop. And to this jingoism of the neighborhood the characters add a working-class or black ethos of self-policing—the code, in the US context, of Stop snitching and jury nullification and Walter Mosley novels: “This is the block. We take care of things our own way.” It might be possible, when trying to make sense of the movie, to simply superimpose these two terms—the nation and the locality—in which case we would conclude that Attack the Block is proposing a council-estate nationalism, a black-white alliance of the distrustful and cop-hating poor. There’s something to this idea, and yet the individual components remain visible and not fully resolved into one another.

Second: The movie does almost nothing to revise one’s perception that its heroes are sadistic predators. It merely concludes that sadistic predators are sometimes useful to have around. The film’s few white men are by contrast all emasculated. “I am registered disabled,” one of them says; “I’m a member of fucking Amnesty!” shouts another; and the movie’s gibe is that these amount to the same thing, just two different routes to castration, physical and ideological—twin softnesses. This will obviously complicate our sense of the movie as liberal, since even as the movie is promoting a kind of racial liberalization, it is deciding that liberal men aren’t good for much, and the burden of British masculinity will thereby pass over to the nation’s young Trinidadians and Congolese, fourteen-year-olds with knives and swords and bats and explosives, a nine-year-old with a handgun, announcing that his new warrior name is “Mayhem.” Attack the Block sometimes gives the impression that it is recruiting the child soldiers of South London.

But then those two instabilities are nothing, mere tremors, compared to the movie’s central and defining instability, the oscillation around which it is constructed. I’ve been describing the role of race in the movie at the literal level, but then there is also an allegorical level, in which everything I’ve just described is taken back. This makes for a vast and, I think, unsolvable puzzle, though in many ways Attack the Block’s racial allegory is unusually bald and not in the least puzzling and amounts to this: The aliens are also black—hairy, subhuman, grinning, and black. I don’t actually want to put too much emphasis on the color in isolation. Racial allegory, after all, is not automatic. Lots of black things are not black. Darth Vader is not black. If all we had to go on were that the creatures are inky-dark, I’d say we could let it slide. But that’s not it: The movie is entirely upfront about how it wants us to understand the aliens’ ebony. The kids stand over the first adult monster they kill, and two of them speak out loud what they see: “Wow, that’s black, that’s too black to see. … That’s the blackest black ever, fam. … That’s blacker than my cousin Femi”—which moniker is Nigerian and usually followed by names like Ogumbanjo and Kuti.

The movie, in other words, openly places the creatures on a spectrum of African-ness. What’s more, it has various ways of expanding on this tactic. Only once does Attack the Block’s dialogue turn openly nationalist, when a gang member sticks up for the home country at the expense of Africa, pouring contempt on a white philanthropist off doing aid work in Ghana: “Why can’t he help the children of Britain? Not exotic enough, is it?” Or there’s this: One teenager warns another that an alien is about to attack by shouting “Gorilla!”—and then that’s another clue. Attack the Block is, at the level of its allegory, an inversion of Rise of the Planet of the Apes, a second film about berserking primates, and with the meanings from that other movie largely intact—the meanings, not the judgments. If Rise stages a latter-day slave rebellion—an insurgency against the mass incarceration of black men, an uprising that is at once prison break and revolution—then Attack the Block stages a related event, a bit of colonial turnabout, but asks us instead to cheer its suppression. Anyone who goes into this movie hoping that the Jamaican newcomers are going to battle the white dragon of the West Saxons or cut down the English aristocracy’s heraldic wyverns is going to have to swallow hard. For Attack the Block offers to enfranchise black Britons only by giving them creatures to kill who are blacker than themselves. A group of mostly black teenagers earns its citizenship by systematically cutting down the new crop of even darker arrivals. Conceptually, this is rather astounding: The film is telling two antithetical stories at once—and not via a multiplot—there is no main plot and contrapuntal subplot; it is telling two contradictory stories, but it only has one plot; the same story, then, but susceptible to two radically opposed constructions: a parable about learning to like black immigrants that is at the same time a fantasy about wiping them out—“Kill ‘em!
Kill all them things!” The creatures in Attack the Block are so very jet that they often blur into the shadows, but the filmmakers, in what must have seemed like an inspired touch, have given them glow-in-the-dark fangs, which means you can only see them when they bare their teeth. There’s an old joke in the American South. It begins: “How do you go coon-hunting at night?”