Review of “Or Orwell: Writing and Democratic Socialism” by Alex Woloch

BY MATTHEW HERBERT

I’m not much on book reviews that say basically I’m reviewing this book so you don’t have to read it.

But: I’m reviewing this book so you don’t have to read it.

In Or Orwell: Writing and Democratic Socialism, published in 2016 by Harvard University Press, Alex Woloch uses the tools of literary theory to dissect and examine George Orwell’s supposedly straightforward writing style. Woloch unearths a range of unexpected caveats and nuances behind Orwell’s famous dictum that good prose should be clear and simple, “like a window pane.”

Rather than shining a light straight through a window pane onto plain truths, Orwell’s prose actually contorts itself around the deeply complicated “sheer activity of writing,” according to Woloch’s analysis. You may have thought Orwell simply faced unpleasant truths head-on, but, beneath the surface of Orwell’s plain prose, Woloch espies opaque-seeming currents and describes them like this: “Conceptualizing exploitation entails persistently converging on the actual experience of singular persons stuck in oppressive structure; the ramifications of the structure (unlike the structure itself) cannot be fully articulated.”

Ahem.

But if you think this is the point where I start to parody Woloch–and more generally literary theory–it is not. Despite my conservative attitude about truth, facts, and rationality, I read Woloch’s newfangled book with an open mind and found a great deal of pleasure in many of his insights. If your favorite paintings were Renoir’s large canvases and an art critic invited you to come close and pore slowly over their details with him, would you hold back because you thought you saw something shifty in the critic’s eyes?

There is a wonderful chapter on Orwell’s first collection of essays published as a book, Inside the Whale. In it, Woloch makes some surprisingly accessible points about Orwell’s use of a “threshold effect” to describe the tension of being simultaneously inside and outside an abstract problem. This threshold is where a critical writer always exists, trying to gain access to something in the world but from the domain of her own private interiority.

Literary theory often seizes on a very close reading of short phrases or even individual words to make a larger claim. (See the entire 36-page first chapter on Orwell’s use of “quite bare” in “A Hanging.”) This maneuver can seem maddeningly trivial, because who cares about a single word being repeated throughout a longish text: that happens all the time. Or it can seem like a facile trick, because any competent wordsmith can combine small units of language to mean almost anything they wish.

But by the end of the chapter on Inside the Whale, I believed that Woloch actually had something interesting to say about the structure of Orwell’s writing. The odd thing about Inside the Whale is that it comprises only three essays (“Charles Dickens,” “Boys’ Weeklies,” and “Inside the Whale,” the last entry an extended consideration of Henry Miller’s Tropic of Cancer), and Orwell takes no pains whatsoever to say why he has grouped them together. Woloch actually does a convincing job citing some of Orwell’s recurring phrasing (yes, very plain and ordinary seeming) to argue that the thing that unifies the three essays is Orwell’s structural invocation of the “social horizon,” the boundary between class differences. Orwell doesn’t just say there are class differences in capitalist England: he uses form to dramatize how he thinks about those differences.

So much for such subtleties. Or Orwell is full of them, but I have a suspicion that most Orwell enthusiasts come to the man for the plain meaning of his texts and not for a paisley pattern of literary theoretical details. Still, if you find yourself, as I do, wanting to take in everything written about Orwell, Or Orwell surrenders up a number of more concrete, perhaps more satisfying observations. (We wanted to get up close to the canvas with that art critic for a reason, right?)

That chapter on Inside the Whale? In it Woloch mentions a letter in which Orwell said he wanted to write more “semi-sociological” essays like the three in that collection. And then, almost as an aside, Woloch suggests that Orwell may have casually invented what we now call cultural studies. And this seems true. All three essays, especially “Boys’ Weeklies,” explore the social meaning of popular texts, keeping their pure literary value as a side issue. The instinct of the literary critic is to dwell on the good, the true, and the beautiful, but here we have Orwell pointing out that it might be more socially revealing to pay attention to the actual texts that people buy and read no matter how commercial, ordinary, crass, or transgressive. I daresay I would not have been invited to a graduate philosophy seminar about “The Simpsons” in 1994 had Orwell not blazed a trail in the direction of cultural studies in 1940.

As a writer, Orwell has been “ritualized” and frozen in place as a kind of “figure through which two warring entities–two modes of reading–seek to obliterate each other,” Woloch claims. Indeed Orwell has become so paradigmatic in the contest between “naïve empiricism” (the idea that the world is knowable through plain old observation) and Critical Theory (the idea that our “observations” of the world are always mediated through cant, self-interest, delusion and, above all, ideology) that Orwell is “not just positioned on one side of the theoretical line, [but] has been invoked to structure the boundary itself.”

I think Orwell would like that: he structures the left-right boundary himself. Why not? He once labeled himself a Tory anarchist, a contradiction in basic attitudes if there ever was one. A committed socialist who wanted to smash and recast the very foundations of society, Orwell also disliked anyone who came across as too out of step with societal norms, notably gays, yogis, vegetarians, and men who wore tight shorts or hiked in groups.

So Orwell himself poses a dilemma about thinking and writing and being in the world. He is a kind of living paradox. In one of his most memorable observations, Orwell says that all socialists (like himself) hope for a world purged of war, famine, dirt, disease, and fear. But then he pauses to ask whether anyone would actually want to live in such a utopia. There would be no struggle, which sanctifies life for the leftist. Woloch answers this question for Orwell by quoting at length from his 1943 essay “Can Socialists Be Happy?”:

[. . .] I suggest that the real objective of Socialism is not happiness. Happiness hitherto has been a by-product, and for all we know it may always remain so. The real objective of Socialism is human brotherhood. This is widely felt to be the case, though it is not usually said, or not said loudly enough. Men use up their lives in heart-breaking political struggles, or get themselves killed in civil wars, or tortured in the secret prisons of the Gestapo, not in order to establish some central-heated, air-conditioned, strip-lighted Paradise, but because they want a world in which human beings love one another instead of swindling and murdering one another.

Amen. If socialists cannot be happy, let us hope they can keep striving to be.


Is the Useless Class Coming Sooner Than We Think?

BY MATTHEW HERBERT

In case you haven’t noticed, there has been a flurry of news reporting recently on the astonishing writing capabilities of new artificial intelligence (AI) applications. I haven’t tested one yet, but you can give one of these apps a writing prompt and some basic parameters, and it will produce a stylish and effective text that meets your specifications to a T.

A trick that no journalist seems able to resist these days is to insert a passage into their article written by the AI they are writing about. And the insertion is seamless; it reads as if the author wrote it herself.

In this fairly typical article from The Atlantic, we learn that lawyers are already using a leading AI app, ChatGPT, to produce legal briefs. And why wouldn’t they?–“ChatGPT passes the torts and evidence sections of the Multistate Bar Examination,” we also learn.

Robo-lawyers, anyone?

But there is much more to consider. If AI can craft an effective legal brief, AI can understand and adjudicate one too. So then we have: Robo-judges, Robo-juries.

Today, I would like to resist my usual urge to plod through a topic and tell you at distressing length how I think Orwell would judge it an offense on the human spirit. Instead I simply ask you: How big a deal do you think this is?


The ability to generate narratives around which groups of humans can be organized and rallied (to do absolutely anything, from underwriting home loans to pursuing happiness to killing millions of other people) is the core skill of being human.

The historian Yuval Noah Harari has been arguing for the last several years that knowledge workers will soon follow the path of blue collar workers in being displaced by machines. The literate class has long been able to avoid thinking about the implications of AI-level automation because there was such a clear difference between what we do–manipulate abstract symbols–and what blue collar workers do–manipulate material things. We may have thought, fleetingly, it was a pity that factory workers were having it tough keeping up with machines, but I doubt many of us actually cared all that much. It wasn’t our problem.

But now it is.

Harari believes we are almost all of us destined to join what he calls the useless class, something completely alien in our history. Humans struggle, strive, create, accomplish. What will we do when machines can (and do) concoct the narratives that goad, instruct and inspire us in our highest ambitions? You might not need to lift boxes or sew zippers into jeans to feel human, but by god you need to do something that serves a higher purpose. What will it be? We can’t all create newer, better AIs–they will do that for themselves soon enough.

So I stop uncharacteristically short and ask for your thoughts. What does it mean that AI can now, instantaneously, write better than 99 percent of us? Is the useless class being created right in front of our noses? Are we in it yet?

Reflections on “Animal Liberation” by Peter Singer

BY MATTHEW HERBERT

Franz Kafka’s short story “In the Penal Colony” is deeply surreal, even by Kafka’s mind-bending standards.

This is the story’s mise en scène: an unnamed military officer in an unnamed tropical colony is demonstrating the operation of a killing machine. He is about to use it to execute a prisoner. The machine does its work over the course of 12 hours, slowly inscribing the text of the transgressed law onto the condemned man’s torso, with ever deeper needle punctures.

Kafka’s machine does a job that many civilized people agree needs to be done–eliminating capital criminals–but why must it be done by means so cruel and bizarre?

With slight adjustments, this was the question I was asking myself as I read in Peter Singer’s classic 1975 book Animal Liberation* how beef cattle are slaughtered. Hanging upside down from a conveyor belt ten feet off the floor, the cattle approach the killing line. They are required by law to be alive during this process. They are supposed to be unconscious, but–mistakes are made, mechanisms falter, especially under the pressure of haste, and–sometimes the cattle are awake and aware. What happens then?

The animal, upside down, with ruptured joints and often a broken leg, twists frantically in pain and terror, so that it must be gripped by the neck or have a clamp inserted in its nostrils to enable the slaughterer to kill the animal with a single stroke, . . . .

The strange thing is, it is laws promoting humane slaughter and food purity that result in this nightmare scene of unrestrained sadism. The Pure Food and Drug Act of 1906 requires that slaughtered animals not fall into the blood of other slaughtered animals; other laws protect the faith-based traditions that say food animals must be “healthy and moving” when killed. Taken together, these laws, Singer points out, turn what is supposed to be an efficient if not quite benign method of slaughter into “a grotesque travesty of any humane intentions that may have once lain behind it.”


There are many philosophers with greater name recognition than Peter Singer–Kant, Plato, Aristotle, Descartes, Hume, Nietzsche. But none of philosophy’s stars ever founded a real-life social movement. Singer has. Go ahead and Google ‘Animal Liberation Front’.

None of philosophy’s pantheon had such immediate and profound life-changing effects on their contemporary world as Singer has. Granted, this is mostly because Singer’s old-fogey predecessors didn’t even try, but that is kind of the point. Singer is alone among his peers in imagining the philosopher’s role to be that of a doer, not just a thinker. (If you want to throw out Marx as a challenge to this claim–don’t even. How many years did he spend in his London living room writing Das Kapital? And how many years did he spend on the barricades? Marx may have written that the job of philosophy is to change the world and not just describe it, but that’s all he did: write.) Singer believes that the purpose of ethics is to reduce harm done to any sentient being, right here and right now. Therefore his measure of success is not book sales, not endowed professorships; it is not debate-winning arguments at conferences; it is his power to cause more and more people to adopt values that demand the reduction of harm to other sentient beings.

By this measure, Singer has been the most successful philosopher in history. Though he would not claim personal credit for the whole scope of animal rights work accomplished since 1975, it is fair to say that the movement Singer created has (a) changed laws in many countries curtailing and regulating scientific experimentation (much of it pointlessly cruel) on animals; (b) created powerful civil society groups that protect animals from harm, including PETA; (c) exposed the massive waste and cruelty of factory farming, which has led to significant, if far from complete, reduction of harm to food animals; (d) inspired the decision by millions to stop regarding animals’ interests as morally negligible, and, more or less concomitant with this, (e) inspired millions to stop consuming animals or animal products; and (f) informed the widespread realization that using animals for food is disastrously inefficient and globally unsustainable. There is simply not enough Earth to support animal farming on a scale that would provide meat to even a fraction of humans who might desire it. Even providing meat to the tiny fraction who can presently afford it in “Western” quantities is scorching the Earth. People should know that they are not just fiddling but feasting as they actively turn our earthly paradise into a hell.

Pardon me. I am breaking one of Singer’s most important rules. The economic and environmental disaster of meat-eating is so pressing that we proponents of animal liberation cannot afford the kind of pious dudgeon I just worked myself into. It puts others off and–the last 15 years of history notwithstanding–the purpose of rational argumentation is not to incapacitate one’s opponents with outrage. It is to make the world a better place by changing minds.

Weird as it seems, Kafka’s story “In the Penal Colony” is actually a straightforward piece of consciousness-raising literature. There’s a reason the action takes place in a colony: in a republic, as Cesare Beccaria first argued in 1764, the state has no right to deprive its citizens of their lives, else it is no republic. Killing by the state may only be done in territories where violent subjugation is the definitive method of governance. (Orwell expresses the sentiment behind this argument simply and powerfully in his 1931 essay “A Hanging.” I recommend it.)

There’s also a reason why the colony and the executioner go unnamed in Kafka’s story. In an “enlightened” society, outrages on morality can only be sustained if they are easily ignored. They must be hidden. They must happen off-stage, done by non-persons in non-places. Before Singer wrote Animal Liberation–in particular the 60-page chapter “Down on the Factory Farm”–citizens of our liberal democracy might have plausibly pleaded ignorance of the mass-scale harms done to our food animals and the environment by factory farming. But now the institutionalized slaughter has been unmasked, and our citizens need not even be literate to know this. Because of the success of Singer’s writing, the practices of factory farming have been reported on by scores of journalists and documentary filmmakers.

One of the purposes of “In the Penal Colony” was to jolt the polite, educated Europeans who read short stories into asking what kinds of moral travesties were being carried out in their “interest” in the real world. (For an idea of the extent of these, see Adam Hochschild’s 1998 book King Leopold’s Ghost: A Story of Greed, Terror and Heroism in Colonial Africa.)

At one time, mass ignorance of factory farming and other instances of cruelty to animals was excusable. It was all so well hidden. But like Kafka, Singer unmasked the cruelty, brought it right before our eyes for inspection.

Orwell told his contemporaries the uncomfortable truth that “ordinary” middle-class English prosperity was rooted in the colonial system of violent subjugation. Only the enslaver’s whip could produce sugar cheaply enough to provide it to all of England’s clerks, shop-keepers and other workers at an affordable price. Singer tells us a similar uncomfortable truth: the ordinary practice of meat-eating depends for its viability on a vast system of cruel tyranny and organized coverup. Make meat less cruel, and you price it out of practically everyone’s reach. The system, with its bizarre killing machines (and countless other outrages), is what turns out the ordinary, plastic-wrapped pound of ground beef that you (may) consider your due as an ordinary consumer in a developed country. But read Animal Liberation and see if it does not upset this view by noting your role in this system–where you stand with respect to the killing line, with its Kafkaesque upside-down conveyor belts of “healthy and moving” beef cattle. A heavily moneyed system of propaganda has been set up to prevent you from knowing these things. But, again, read Singer. See if he doesn’t make you want to rebel. See if he doesn’t help you reject the idea that people with money can tell you what to think.

—–

* I read the 2009 updated edition.

Reflections on “Catch-22”

BY MATTHEW HERBERT

Even if you’ve never read Joseph Heller’s classic anti-war novel Catch-22, you probably know the basic setup. American bomber crews in Italy in World War Two face not just deadly German flak, but the absurd military logic that forces them to keep flying missions even as their commanding officer keeps extending the number required for a complete tour.

The number of missions goes up to 40, then 50, then 60. Any sane person could see where this was going and would try to get out. And there does seem to be a way out. Any crew member could be grounded for reasons of insanity. But:

There was only one catch, and that was Catch-22, which specified that a concern for one’s own safety in the face of dangers that were real and immediate was the process of a rational mind.

The result was a logic-chopping paradox. Any pilot who willingly flew more missions was crazy and didn’t have to; anyone who refused was sane and had to.

I had never read Catch-22 until this month, and I expected it to be a romp. Comedy this madcap, even when dark, usually lopes along. And the machinery that advances the plot–American B-25 bombers and German 88-mm guns–could and did strike with slashing speed. The orgies and drinking sprees that punctuate Catch-22 are the same swirling blurs of frenzy they are in real life, occurring too fast to fix any lasting memory of the events.

So it came as a surprise when I found myself reading slowly and savoring certain passages. Some days I only read 20 pages at a time. I found myself pausing to ruminate on connections to ideas both prophetic and antecedent.

Heller’s most obvious debt is to Lewis Carroll. Page after page, Heller depicts the logic of war and bureaucracy as a baffling hall of mirrors. The feeling of being through the looking glass is evoked by almost every character and plot device. Major Major will only agree to meetings in his office when he is out of his office. Lieutenant Colonel Korn only allows airmen who ask no questions to ask questions. The atheist chaplain’s assistant berates the chaplain for underselling God. Perhaps my favorite is General Peckem’s guidance to his new executive officer: “While none of the work we do is very important, it is important that we do a lot of it.”

Carroll spun a fantasy in which illogic became systematic in a made-up world. Heller showed illogic to be the foundational requirement for making modern war in the actual world.

Indeed, the case is there to be made that Heller normalized the anti-war novel. The Good Soldier Schweik, published in 1923 (1930 in English), was arguably the first in this class, and the Lost Generation produced a string of novels and memoirs throughout the 1920s and ’30s that took the glory out of war and showcased its moral desolation. But after Heller, this view of war would be the only one a serious novel could take. Christopher Hitchens notes that the timing of Catch-22, published in 1961, “had an unusual felicity, helping to curtain-raise what nobody knew would be The Sixties.” True, but this piquant note only catches half of Heller’s significance; his drumbeat message that war is a failure of the human spirit helped create the 1960s; it didn’t just land at their doorstep.

Along these lines, it would be going too far to say that Heller’s friend Kurt Vonnegut would not have been able in 1969 to publish Slaughterhouse-Five, another great anti-war novel, without Catch-22 as prologue. But it seems plausible that a reading public would not have made it through Vonnegut’s bizarre plot devices of time travel and alien abduction had they not been softened up by Heller’s more prosaic absurdities.

A handful of writers have, over long ages, made the argument that war is morally wrong. The case is easily grasped. But it would not be until Thomas Pynchon in 1973 put Dadaist elements and a cockeyed existentialism into the blender of Postmodernism to produce Gravity’s Rainbow that the argument would emerge that war violates something even deeper in the human person than morality–some Kantian substrate of order that creates the very possibility of morality. Gravity’s Rainbow deranges this invisible foundation of the human person.

And Heller preceded Pynchon in this project. As with Vonnegut and Slaughterhouse-Five, I cannot say that Gravity’s Rainbow could not have been written without Catch-22, but clearly Pynchon owes Heller a large debt. The whole idea of oddly-named U.S. servicemen hurrying across war-warped Europe on hectic, outlandish missions of unknown authorship and opaque objectives is to be found first in Catch-22. The mess officer Milo Minderbinder operates a syndicate of black-marketeering air freight services while he himself serves as mayor of several European cities and potentate of several post-Ottoman territories. Is this Heller stretching reality or bending it? The point–which Pynchon took to new heights (or depths)–is that we are not supposed to know.

My U2 Decade (Part One)

BY MATTHEW HERBERT

I recently posted about making the musical playlist for my wake. (I mentioned then, and I repeat now: I am not dying. The playlist thing was just an exercise that I thought might be illuminating and useful. I carried it out in the spirit of Kurt Vonnegut’s old quip, “Should it happen that someday, God forbid, I die, . . . “)

In the course of that exercise, a different theme took shape, and it seemed to call for comment.

Here it is: The soundtrack of my life is U2’s 1991 album Achtung Baby. I hear it inside my head every single day. That must mean something, no? Something that, with effort, can be analyzed and pieced together. It would have to draw heavily on memory. The music inside my head began playing 31 years ago.

It happened like this.

In the fall of 1991, I had recently moved into a studio apartment on a hillside above Heidelberg, Germany. It was my first home on my own. I was discovering how quickly night fell in those months and how long the nights would be. The city lights below started twinkling up at me at four in the afternoon. And the nights lasted forever. If it was foggy the next morning, which it often was, the darkness would not lift until 10 o’clock or so. No matter how sunny one’s natural disposition, the dominant mood was brooding.

I was 25. Like all boys of that age, I had a favorite band, and I thought my favorite band said something about me. So I would play them on my stereo in that apartment above Heidelberg.

My favorite band did say something about me. Up to that point, my temperament had had a religious and puritanical side. I don’t mean that I was either one of those things (although man, did I try). But I did have this keenly moralistic outlook on life, and it had something to do with God, or possibly duty.

I also thought I was cool. I thought having run away from rural Missouri, going to war, and then ending up in Heidelberg had turned me from a bumpkin into a cosmopolitan. The ideas we get. You never really stop being who you are.

Possibly because I was so cool, I also sensed that there was something abroad in the world that needed rebelling against–maybe the unreflective acceptance of the status quo. Like I said, I was 25, and at that age boys don’t really know anything. Vague feelings were all I had to go on.

My favorite band was U2, and they did for me what favorite bands are supposed to do–they helped sharpen my formless, adolescent feelings about the world into more definitive ideas and attitudes.

I didn’t know it at the time, but U2’s first three albums had already mirrored the main narrative of my young life. Boy, released in 1980, showcased a self-interrogating male adolescence in loud, spare punk rock. But its lyrics were earnest, and that’s not very punk rock, is it? Boy posed questions about what it meant to be a morally serious teen. That’s all. It didn’t answer those questions, which is a very good thing because, like I said, boys don’t know anything at that age.

October, released in 1981, narrows down the broader questions of Boy into a single, more focused question. Can the self-serving rebellion of rock-n-roll accommodate a person’s call to serve a higher power? Answering such a call demands quietude, humility, and penitence, none of which is very rock-n-roll. (I would learn years later that U2’s members had nearly broken up over their struggle to answer this question, and October was the expression of this struggle.) October was a good album, but it left me feeling that the band’s central question might need answering, not just recycled into heartfelt new songs.

When U2 gave a definitive answer to the question, in War, their third studio album, it was a knockout blow. They had chosen rock-n-roll and moral seriousness. The answer to their question about faith and art was that there was no way to resolve the dilemma; therefore, there was no need to do so. To be human is to be pulled in different directions. Once you know that, it’s full steam ahead. Just listen to “Sunday Bloody Sunday” and tell me that’s a band that has not made up its mind.

(War, by the way, was not just an artistic triumph in its own right. It also rid me forever of the need I had felt up until then occasionally to listen to Christian rock. Christian rock sucked supremely. I suppose it still does; I haven’t listened to it in years. Lacking the space here for a full disquisition, suffice it to say that Christian rock is bad in every way that music can be bad, and then it goes on to invent new categories of wrongness and malignity. This is what dishonesty does to art. Subordinating any art form to the strictures of a message–whether political, religious, ideological, what have you–is a form of censorship, and it contradicts the imperative that art must be free.)

Now I deliberately pass over the sonic, ethereal beauties of The Unforgettable Fire, released in 1984, and the expansive, epoch-making victory of The Joshua Tree, 1987. They tell their own stories about U2’s growth and evolution, but they did not change my perception of the band’s voice as a sincere and fundamentally hopeful one.

I didn’t know then that life keeps changing even after you feel like you’ve reached firm ground–that things keep happening and happening that turn you into a different person, all the time. Wait, didn’t I say just a few paragraphs ago that I was basically stuck being a bumpkin, that people never really change? I don’t want to bring Aristotle’s theory of change into this, so I’m going to table that problem for now.

It is enough for us to know, as we float back into my small Heidelberg apartment in November 1991, what U2 meant to me at that time. Despite the band’s musical experimentations in the intervening years, their message was still firmly rooted in War–in 1983. At least for me it was.

Then my favorite band disturbed my peace.

I had brought home a CD of Achtung Baby from the PX. I lay on the floor right next to my small stereo. And I listened to the album straight through, track by track. What I heard confused me. The band suddenly had, and exhibited, libido. And doubt. Not teenage doubt about whether you can play rock-n-roll and still go to church, but deeper doubt about whether the floor might fall out from underneath life itself. They were asking whether anything made sense.

The theme of darkness also entered. It pervaded not just the mood and symbols of the album, but it made literal appearances in several of the songs. Before, U2’s songs had countenanced grief, outrages on conscience, and the moral severities of religion, but these trials happened, so to speak, in the clear light of day, and one always got the sense that the protagonist would win. Now Bono sang, “Love is blindness; I don’t want to see; Won’t you wrap the night around me.” He doesn’t want to see? As night pressed in on me, it pressed in on my favorite band, too. And for the first time, they seemed uncertain about whether dawn would break. I wasn’t ready for that.

But the kicker was irony. Up until Achtung Baby, it had been perfectly clear to me that U2 always said what they meant. Now, Bono was singing that he was “ready for the laughing gas;” boasting that his sex appeal was “even better than the real thing.” He dramatized the Last Supper in terms approaching parody. He sang that he was “ready to let go of the steering wheel.” Did he mean any of this? The U2 I knew steered straight and defiantly ahead. What was this chagrined recklessness all about?

These unsettling questions came packaged in a new musical vocabulary as well. That was probably what kept me listening, the thing that kept me open to the band’s new psychological landscape of uncertainty. Achtung Baby is U2’s most industrial-rock album. It is an orchestrated cacophony of metallic clinks, vehicular roars, factory hums, and driving, percussive machine blows singed with feedback. The Edge took a new measure of musical control over the album’s songs, and it seemed he was trying to hit out at the very things Bono was compromising with–fame, grandiosity, moral laxity. Bono was proclaiming a season of dark folly that was alarmingly close to nihilism, and the Edge was leading an insurgency against it. Or so it seemed to me.

I’m probably reading too much into Achtung Baby. I’m only 56. What do boys know at this age? But here goes anyway.

In retrospect, we know that Bono and the band were playacting attitudes that were actually supposed to be the targets of their interrogations. And I literally mean “play-acting.” When U2 toured in support of the album, Bono appeared first as The Fly, a wised-up reincarnation of black-leather Elvis. This was Bono bringing indictments of hypocrisy and cupidity against himself for wanting to be a rock star. Later in the show, he would appear as MacPhisto. MacPhisto was literally the devil in gold lamé, but figuratively an homage to C.S. Lewis’s proposition that Satan is a sly charmer who never attacks us frontally.

I won’t bore you with a song-by-song analysis of what Achtung Baby means. It would only be so much blather. Listen to the tracks instead.

Eventually I did figure out why Achtung Baby became the soundtrack of my life. It was because Western history itself was making a revolutionary turn toward unchallenged freedom in 1991, and U2, acting on some mad, artistic insight, had recorded their masterpiece precisely at the hinge of that colossal turn, in Berlin. I would wake up the day after Christmas 1991, hung over in my little apartment above Heidelberg, and the USSR, the only viable enemy that liberal democracy had, would have ceased to exist. All its people had changed sides. They were with us now. Just a little more than two years before, the people of Berlin had torn down their wall and declared themselves to be one people. Now everyone was doing it.

The biggest part of me wanted to believe that History was at an End. That’s what my favorite books said, books by philosophers and other theoreticians. Once civilized people seized freedom, the books said, they would never go back to authoritarianism. The future was now inevitable and bright. But U2 was there to witness the turn, and they recorded a dark and anguished album that said you could never be sure about life and that things keep happening and happening that change you and you never know how you will turn out. I would not know until years later that, at that most unlikely moment, I was also ready to let go of the steering wheel, just like my band.

In My Time of Dyin’: A Post about Music

BY MATTHEW HERBERT

The words are from the title of a song by Bob Dylan, recorded when he was all of 21 years old. It’s a good song, but really what did he know about dying?

What do I know?

I am much closer than young Robert Zimmerman, who, on the cover of his 1962 debut album, Bob Dylan, looked like he wasn’t even shaving yet. Me, I’m close enough to make certain considerations.

Oh, but before I get into those, this is not an announcement of my imminent demise. I am as unaware today as I was yesterday of things actively trying to kill me.

But I am close enough to understand how time will start fraying soon. I know of the things that will turn time from an airy abstraction into hard reality. The heart, the lungs, the liver; they will all start giving up on the jobs they once did so well, for so long. There’s no way Dylan knew of those things when he was 21. He was using death the way poets and essayists have always used it–as an idea to focus the mind.

This blog has never been autobiographical. I’ve occasionally written about my favorite hobby, running, and I once made a big deal of nearly dying from too much morphine after back surgery. I also wrote a florid and intimate declaration of love for a hill one time. But by temperament, I keep a lofty focus on the Olympian heights–books, ideas, and the legacy of Orwell.

But I recently started to address a problem I hadn’t even known existed. And that problem is inescapably autobiographical: the matter of final arrangements. Oh, not the legal stuff. I’m a chary bureaucrat by training and habit, so I’ve checked all the boxes that one thinks of as “responsible estate planning.” Of course I’ve done that. The only thing that really matters to me is my ability to care for the small group of Earthlings I think of as my own, so I’m not going to allow myself to fail, through mere oversight, in that mission. (You know all those memes that start with “You had one job?”–I will not have them appended to the Facebook announcements of my Departure. I simply won’t.)

It has nagged me for several years, though, that there are other, more personal arrangements to be made. Money isn’t everything, after all.

Recently I posed myself the following question: If music were to be played at my wake, what would it be? And a further question arose: Would anyone even know where to start making such a playlist? Given the general glumness of the circumstances and the pressing need for buying tombstones and whatnot, would anyone feel like taking this job on? I fear it might get half done, if at all.

And this cannot stand.

Anyone of my generation knows how decisively important a soundtrack is. The Breakfast Club simply does not, cannot come to its proper end without the booming forth of “Don’t You (Forget About Me)” by Simple Minds. The song finishes the story. I’m in search of songs that finish my story.

Well, easy, I thought. My life has a soundtrack, and it is U2’s dark, ironic, but still majestic Achtung Baby of 1991. It says everything you need to know about my inner life: used to be religious, now godless, bonded in some amorphous way to Berlin’s swirl of doom, art, redemption, and American guardianship.

But wait. It’s all very well to have U2’s loudest, most desolate and industrial sounds going through your head literally every day of your life, and to know that the songs are you in a way, but Achtung Baby would be an absolute non-starter at a wake. Take three-quarters of an hour, if you can, and listen all the way through to “Acrobat,” the 11th track on the album. You’re feeling drained, forsaken and sonically battered by the time it plays. You need a respite of light and air. Instead, “Acrobat” comes on: a buzzsaw of inchoate anguish and rage. All is darkness and moral wrong, it says. Does it project the mood one wants just after a funeral?

It does not.

And this got me thinking: there is an urge to have the last word at one’s leave-taking, but this kind of thing can be taken too far. Last rituals certainly must take the departed as their subject, but they exist for other people. They must take the audience into equal account.

So, I will make time soon enough to write about U2’s formative power over me. There are questions that need answering about how their darkest songs came to score a bright, breeze-kissed life like mine, unmarked by wracked conscience or hint of woe.

But for now, to the task at hand. This is how I got down to the business of choosing the songs I want to be played at my wake, and how they revealed some telling problems.

Balance, is what I thought. The songs need to strike a balance between what they mean/t for me and what they say to the listener. And I came up with a few promising candidates, but I also came up with even more problem cases. To wit:

“Jokerman,” by Bob Dylan, would be superb, I thought. It showcases Dylan at his poetic best, managing to be wry, wistful, and vaguely accusatory at the same time. The imagery, much of it Biblical, is supreme. The music, nudged along by Mark Knopfler’s understated guitar work, stays in the background, letting Bob spin out a complex warning of apocalyptic menace. “Jokerman” was in.

Why did it beat out other, better known Dylan songs? I think “It’s All Right Ma, I’m Only Bleeding” is Dylan’s greatest poem. It is his apex achievement. But that’s the problem. It’s my wake, and I don’t want people zoning out at it, transfixed by what might be the best song written by a popular musician in the last 100 years. Listen to it on your own time.

Ditto “Hallelujah” by Leonard Cohen, mutatis mutandis. It’s too good. Plus, there’s its unstinting mood of heartbreak, which I presume will be going around freely enough without any prompts from my playlist. I hope for jokes to be told at my wake, sardonic stories to be shared. These things won’t happen if we have Jeff Buckley (doing my favorite version of “Hallelujah”) reminding us how the celestial joy of love is always at risk of being run into the ditch of abject human failure.

I also came to suspect that the effort to avoid the grim or acrimonious note could be taken too far. One of my absolute favorite songs of all time is “Mr. Blue Sky” by Electric Light Orchestra. It is, on grounds both psychological and musicological, the happiest song in the world. And therein lay the problem, as I saw upon reflection. Wouldn’t I come off as trying to tell the audience how to feel, and being pretty heavy-handed at it?

I submit this for your consideration and await your response: Should I omit “Mr. Blue Sky” for being too happy?

This dilemma raised a more general problem. Why not simplify the task and just write out a list of all one’s favorite songs, consigning other criteria to the wind? It’s a tempting schema. But it too sails into choppy waters. “Fat Bottomed Girls” is hands down my favorite song by Queen, because it rocks consummately and it explores a theme that is delightful to me. But do I include it just because of its general excellence? Would I not risk slighting skinny bottomed girls, implying that back in the high tide of life I was indifferent to their presence? Wakes are not the place to feel a small hurt has been done to you, and I refuse to be the cause of even one. I am nothing if not gallant. So “Fat Bottomed Girls,” although a certified sterling favorite, was out.

A few songs were too on the nose, I worried. They seemed to be thrown in because they fit a lax kind of formula. If you hear Jackson Browne’s “Running on Empty” at my wake, you could be forgiven for thinking blandly, “Oh yes, he liked running.” And it’s true, I did like running. But I really like “Running on Empty,” although mostly for its imagery of the road and youth, not because it’s about running. It captures a time of life when the high, white cumulus clouds decorating the skies in one’s 20s start to turn gray and minatory, announcing the coming storm and turbulence of the 30s. Mostly, “Running on Empty” made the cut because of its musical delivery, earnest and bittersweet but not somber.

I also like Bob Seger’s “Against the Wind,” and I enjoy the moment in Forrest Gump when the song is used to conjure the depleted, defiant mindset of the long-distance runner against the backdrop of Monument Valley. It’s a great song about restlessness and fatigue, but it leaves the listener wondering if not knowing where you’re going is an inevitable part of life. Is restlessness a permanent state? “Against the Wind” raises this question but does not answer it. Certain forms of melancholy are bound to present themselves at a wake, but I don’t think I want my celebrants asking themselves whether constant, pointless exertion is the main ingredient in the human condition. Let that thought emerge in its own way. So “Against the Wind” was out.

I definitely wanted a song or two by REM. They have always been one of my favorite bands, and I felt like my playlist would be incomplete without them. They provided the soundtrack to my life in my early 20s, before U2 shattered it and replaced it with Achtung Baby. But here I ran into a variant of the just-list-your-favorite-songs problem. It doesn’t work with bands either, or at least it doesn’t for REM. “It’s the End of the World as We Know It (And I Feel Fine)”? It’s definitely me, but the song is a cockeyed lark. Much as I hope my guests feel free to have zany thoughts, I’m not sure I should make the invitation explicit. “Everybody Hurts”? Lovely song, but please, wouldn’t it be slightly overdoing things at a wake? “Losing My Religion”? This one very well might make the cut, but radio overplay has sapped some of its feeling of originality. Plus people might start doing the choppy-hand dance that Michael Stipe does in the video. Could be weird. I guess it would be okay actually.

Feel free, if the mood takes you (Image: IMVDb)

So what I am left with is a handful of REM songs that strike me as inoffensive to the occasion but so obscure I feel I could mislead the mourners into thinking the songs meant more to me than they did. “Driver 8” fits this description. It’s a quality REM song that I probably listened to hundreds of times in my 20s, but does it signify? In terms of simple musical beauty, “So. Central Rain” is my favorite song by REM, but that plaintive chorus where Michael Stipe says over and over that he’s sorry?–It would almost certainly leave some people wondering whether there was a message there. What would I be doing all that apologizing for? And to whom? People might start puzzling out what the industrial-scale wrong was that I had done and how it had never come to light. But I still love the song.

So I am at a loss REM-wise. I await your suggestions, dear reader.

I think I should close this post by looping back to U2. I can’t just give their whole catalog the boot because their portraiture of my life is too plaintive and morose for a wake, right? They are my band, after all, and there are questions of loyalty at stake. I must find a song or two of theirs that mark my farewell properly. The songs, it turns out, were easier to find than I thought they would be.

“Kite,” from the 2000 album All That You Can’t Leave Behind, is pretty clearly a goodbye to a loved one, but it’s malleable enough that it covers many different kinds of goodbyes. One of a parent’s highest goals in life is emancipation–the moral and practical preparation of a child to stand on their own two feet. It’s a deep paradox, though: if you’ve done it well, you have broken your own heart, let your child go like a kite into the wind. But you have to do it anyway. To leave emancipation undone, or to do it poorly, is to wreck a young life and to risk setting off a broader train of dysfunction. So if it helps to hear Bono put the problem literally, when he sings, “I want you to know that you don’t need me anymore,” you’re welcome. It helped me too.

The second U2 selection was even easier. It was almost perfect for a wake. I couldn’t believe I’d missed it. Also from All That You Can’t Leave Behind, “Stuck in a Moment You Can’t Get Out Of” has a lovely gospel uplift to it. It’s addressed to someone lost, careworn, and temporarily defeated. Bono writes a lot of songs like this. (For a more somber variant not quite wake-appropriate but in every way superb, listen to “Stay [Faraway, So Close!].”) And Bono often tells you there is hope, or maybe something even better, like peace or love or affirmation, on the other side of the troubles. The Edge’s backing vocals at the end of the song–while studio-tuned to artificial perfection: oh, well–serve to complete Bono’s message. If we are to be saved at all, salvation will come through other people. Other people will make us who we are. That’s how we go on, I guess.

And so I close with my actual playlist as it stands, with no further commentary (except to say there is no particular order to the songs–that is a whole other problem). It feels okay to leave it this way. It is not just good manners to resist having the last, overbearing word. It is an unavoidable feature of the wake. The songs will have the last word themselves, and then it is up to other people to go on talking.

Bob Dylan: Jokerman, Like a Rolling Stone, Brownsville Girl

U2: Kite, Stuck in a Moment You Can’t Get Out Of

Jackson Browne: Running on Empty, Late for the Sky

Don Henley: Boys of Summer

Neil Young: My My, Hey Hey (Out of the Blue), Thrasher, Powderfinger, Comes a Time

Bruce Springsteen: Thunder Road

Fleetwood Mac: Dreams

10,000 Maniacs: Like the Weather, Verdi Cries

REM: Driver 8, Losing My Religion, Don’t Go Back to Rockville

Chris Rea: Road to Hell

ELO: Turn to Stone, Mr. Blue Sky (?)

Review of “Grant” by Ron Chernow

BY MATTHEW HERBERT

Ron Chernow, it seems, has never met a cliché he didn’t like.

I open randomly to any of the 960 pages of Chernow’s 2017 biography of Ulysses S. Grant, and the dull, timeworn phrases turn out in squads, companies, and whole regiments. On page 561, we read that President Grant “toiled under heavy burdens,” while his longtime aide John Rawlins “felt duty bound to assist him.” (Rawlins was Grant’s Secretary of War–how else was he supposed to feel?)

We learn of a photograph of Grant taken during the Vicksburg campaign. There is, Chernow tells us, “an indescribable look of suffering” on Grant’s face. How does Chernow limn Grant’s supposedly indescribable pain? The general, he writes, has “sad, woebegone eyes.”

When we learn that President Woodrow Wilson, a native Virginian raised in Georgia, dismissed Grant’s efforts at postwar reconstruction, it is with this lapidary phrasework: Wilson “consigned President Grant to the dustbin of history . . . .”

As the Battle for Chattanooga came to a successful culmination, “Grant hoped Sherman would reap the lion’s share of glory.” We are touched, later, to know that Sherman stood “ramrod straight” at Grant’s funeral.

Grant “had to show the velvet glove and iron fist at once” while dealing with Indians in the West.

Drearily, there is much, much more. I enjoyed Grant for the most part, but Chernow’s lack of literary seriousness became a distraction. Sometimes I couldn’t help tallying the number of clichés and clumsy phrases on a given page.

Which is too bad. Chernow’s massive biography is largely a success. It corrects a number of misconceptions about Grant and reveals little-known details of his life. Overall, Chernow makes a convincing argument that Grant is both greater and more complex than most of us have imagined him to be.

First, there is the matter of Grant’s drinking. While he certainly had a complicated relationship with booze, Grant was no drunk, at least not in the usual sense. The impression one gets from some Civil War histories (Shelby Foote’s magnificent The Civil War: A Narrative comes to mind) is that Grant spent long stretches of time drinking while on duty and even did so while encamped outside Vicksburg before his breakthrough victory there in July 1863. Marshalling a meticulous string of reports–and winnowing out a substantial number of character attacks by Grant’s political foes–Chernow develops a very plausible, and different, profile of Grant’s drinking problem.

Grant was a distinctly episodic drinker who knew he had a problem with alcohol and never indulged when his family was nearby to provide emotional support. The early sprees that formed the foundation of later slurs and innuendoes all took place in the 1850s while Grant was a junior Army officer stationed far away from his young family, in remote Oregon and California. His isolated slips later in life all followed the same pattern: when Grant fell off the wagon, it was always one night, away from home, and not on duty. If any other patterns marked Grant’s drinking, they were these: he managed to maintain temperance for months, even years, at a time, and he came remarkably near to defeating alcoholism entirely. On a two-and-a-half-year world tour after his second term as president–precisely the time to let down his guard and live a little–Grant steadfastly kept his wine glass overturned even as he was celebrated in palaces, ballrooms, and salons everywhere he went.

Grant’s dogged pursuit of sobriety reflected America’s broader struggle to tame its wild side. In 1822, when Grant was born, Americans consumed the equivalent of 90 bottles of whisky each year on average. By the time Grant died in 1885, refusing a brandy-laced dose of morphine because, as he told his doctor, alcohol didn’t “agree with him,” Americans drank less than half the amount they had at their peak earlier in the century.

The military genius Grant showed in the Civil War was so central to wartime victory that it has overshadowed how hard Grant fought as president to defeat the United States’ largest, most lethal terrorist group–the Ku Klux Klan. As someone who has worked with soldiers for most of my life, I know how bitterly they take it when their battlefield sacrifices are compromised by politicians who abandon the aims they fought for. Grant often felt the same way. But he enjoyed a rare historical opportunity: he was a former soldier who found himself empowered to follow through as a politician to try to secure what his troops had bled for.

Grant after a day of hard losses at Cold Harbor, Virginia, in May 1864 (Image: Britannica)

Chernow’s retelling of the founding of the KKK and Grant’s determination to destroy it puts this episode where it needs to be–front and center in the history of Reconstruction. Grant is justly praised for creating the “spirit of Appomattox” when he accepted Robert E. Lee’s surrender in April 1865. Grant’s generous terms–allowing Lee’s men to return home with their horses and his officers with their sidearms–were meant to mark a definitive end, not just to hostilities, but to feelings of hostility. And while many southerners accepted this gesture with dignified thanks, many more did not. Confederate General Nathan Bedford Forrest, as the first Grand Wizard of the KKK, led thousands of former Confederate soldiers in the South on a campaign of killing Republicans and recently emancipated Blacks, smashing voter registration sites, burning churches, and resisting all efforts to implement the constitutional amendments ending slavery and ensuring voting rights in the South (the 13th and 15th).

Grant may have been president, but in 1870, as the KKK launched what Chernow rightly calls “a new civil war by clandestine means,” he reverted to thinking like a general. The KKK’s center of gravity, Grant reasoned, was its ability to intimidate anyone who might testify against it in court. So Grant went all in on destroying this center of gravity. Responding to southern governors’ requests for help, Grant sent federal troops to enforce the Ku Klux Klan Act (actually three “Enforcement Acts”), which empowered the government to jail KKK suspects without habeas corpus rights–critically depriving them of knowledge of witnesses’ identities–and to use federal troops to directly suppress KKK activity, doing the job local sheriffs refused to do. By 1872, Grant had smashed the KKK’s power. Forrest resigned as Grand Wizard and recanted his overtly racist political goals.

Of course even the most naïve student of American history knows that Grant (and the nation) did not succeed in achieving the broader aims of Reconstruction. Indeed Chernow does an admirable job of describing how the decline of Grant’s second term as president was more or less coextensive with the demise of Reconstruction. When Grant left office in 1877, the network of white supremacists that would maintain the racist power order of the South was still alive and well despite the defeat of the KKK. It would go on to create the legal structure of Jim Crow and resist civil rights for another 80 years. (Read Eric Foner’s entire body of work on the massive criminal enterprise that defeated Reconstruction and kept racism alive. If you only have time for one of Foner’s books, make it Reconstruction: America’s Unfinished Revolution, 1863-1877.)

We often hear how personal the Civil War was, dividing brother from brother and father from son. The most luminous thread woven throughout Chernow’s book, retold with fine, stoical understatement that makes up for some of Chernow’s general failures of style, is Grant’s friendship with James Longstreet, who would become an acclaimed Confederate general. Grant and Longstreet became friends at West Point, when each was a boy of 18. Depending on which source you believe, Longstreet was Grant’s best man at his wedding to Julia Dent, or was at least instrumental in pairing the two up. (The Dents were family friends of the Longstreets.)

Longstreet and Grant served together in the Mexican-American War in 1846. The next time they met on a battlefield, at the Battle of the Wilderness in May 1864, Longstreet nearly died, gravely wounded by his own men’s fire while battling Grant’s army. Then, miraculously, Longstreet appears at Appomattox Court House, a senior commander under Lee. He is astonished when Grant treats him as a friend, and Longstreet is instrumental in persuading Lee that Grant will give, and honor, fair terms of surrender. After Grant’s death, Longstreet would call him “the truest as well as the bravest man that ever lived.”

Grant was a hard but idealistic man. He fought the Confederacy with death-dealing determination but then acted magnanimously in victory, hoping mercy would open the door to reconciliation. His genius as a general consisted in an intuitive understanding of a new kind of warfare he was helping to create, which is today called combined arms maneuver warfare. But he was no mere theorist. Grant won, Chernow tells us, because he never let up. His victories were often sealed on the day after a bout of grievous losses. Grant knew the other side would be reeling too, and he judged that that knife-edge moment was the opportunity to win–a victory of the smallest margin would give way to a larger one. And Grant was right. This was the path that he followed to defeat Lee and end the war, which earned him the undeserved label of “butcher.” Grant was not a butcher, but a fierce realist. He knew tomorrow’s peace would come faster the more violence he visited on the enemy today, and that was how he fought.

I have left out a handful of other themes that make Chernow’s book worth reading, especially his description of Grant’s habitual credulity and how it led to a string of corruption allegations. Grant’s surprising ability as a writer late in life becomes less surprising when we learn he wrote as many as 40 detailed orders a day as a general and later wrote all his presidential addresses without the aid of a committee of editors. As Grant was dying of cancer, he finally took on an editor he trusted to help publish his memoirs so Julia would have a pension. The editor was Mark Twain, and Twain, who was not shy about cutting down idols no matter how large, called Grant a “flawless” writer.

A final theme that emerges from Chernow’s biography is how Grant constantly improved himself and constantly reinvented who he was. And he was seemingly unafraid to leave everything of his old self behind in the process. By the time, late in life, Grant had become, in turn, a driven civil rights activist, a calculating politician, a capable economist, an effusive public speaker, and a writer for the ages, he had completely shed his old identity as a warrior. His steadfast refusal to glorify war or to trade on his status as the general who saved the Union was the highest mark of his greatness.

Review of “The Best and the Brightest” by David Halberstam

BY MATTHEW HERBERT

Critics are supposed to criticize, yes?

If you’ve read more than one of my book reviews, you’ve likely noticed a tone of glad, unbroken praise. Once or twice I’ve used these pages to raise a stern, disapproving eyebrow, but it’s mostly sweetness and light I try to spread. I love books, and, life being short, I mostly read books I think will reward me.

This summer I finally got around to reading David Halberstam’s 1972 masterpiece, The Best and the Brightest. It tells how the most privileged, idealistic constellation of political leaders in the history of the United States committed the long series of moral crimes and strategic blunders that made up our “experience” in Vietnam. Americans’ belief that our government would do the right thing and tell us the truth has never recovered. I think it would be hard to call yourself a student of our national history without reading this book. Read it without delay.

But, as I sat outside enjoying the long German summer along with Halberstam’s classic, I couldn’t help feeling that things, as nice as they were, were perhaps going on too long. Halberstam especially.

So, I’m just going to say it: The Best and the Brightest would be a far better book if it were half as long. As is, it clocks in at 665 dense pages. And don’t get me wrong; it is never boring. Halberstam is a masterful writer with an eye for the revealing detail. But there are just so many of them.

As Halberstam puts each of the major Vietnam players and several of the not-so-major Vietnam players under the microscope, one wonders if we need the same cellular level of detail on the whole cast of characters. While it is illuminating to see how the nerdish National Security Advisor McGeorge Bundy followed a path from Groton to Yale to Harvard, evincing “so much intelligence harnessed to so much breeding,” as Halberstam puts it, we get several more pages illustrating just how and why Bundy’s Harvard years were so happy. He reads the Greeks and comes to be tutored by a little-known professor named Henry Kissinger–an important observation, but one that should be made and moved past. We all know who Kissinger will become.

In about half the space he actually takes, Halberstam could have reached his useful, important conclusion: that Bundy, like so many of the men who took us to war in Vietnam, was an example of “a special elite, a certain breed of men whose continuity is among themselves. They are linked to one another rather than to the country.” Likewise revelatory is Defense Secretary Robert McNamara’s background at Ford Motor Company (and, yes, Harvard).

My point in criticizing Halberstam is that his thesis is too important to leave sloshing around in a sea of tidbits and longueurs. The path to Vietnam had two defining characteristics. One: it was forged by a small clique of power elites disconnected from the national will and was therefore undemocratic. Two: it was based on a delusion born of our experience in World War Two, that technical know-how wedded to managerial ability would infallibly deliver war-winning power and insight. The tragic narrative that Halberstam tells with such admirable skill pierces this delusion. America’s best and brightest just knew we were winning in Vietnam. That’s what all the spreadsheets and data points and PSYOPS studies were telling them. But the managerial class, armed with “the slide rules and the computers which said that two plus two equals four,” missed the fact that “the most basic rule of politics is that human beings never react the way you expect them to.” The Vietnamese got a vote in the outcome of the war too, and our best and brightest ignored it, distorted it, and misunderstood it for decades. Until we lost.

So again I say, read Halberstam without delay. Even if you have little interest or education in foreign affairs, it is instructive to see just how damnably wrong we can be about our deepest convictions.

Review of “I Am Dynamite: A Life of Nietzsche” by Sue Prideaux

BY MATTHEW HERBERT

There’s a simple, three-part diagram that everyone learns as the foundation of communications theory. A box or circle marked “sender” occupies the diagram’s left side; a “receiver” sits to the right. In the middle is a big block arrow labeled “message.”

The idea is, the sender can send whatever message he wants–and its intent can be perfectly clear to him–but the interpretation of what he says is ultimately done by someone else. So once a message leaves your mouth, any number of audiences can seize on it and turn it into their own message, starting the cycle anew.

And even though this image implies that messages, once launched, are perpetually in motion–interpreted, reinterpreted, and passed along–some interpretations get locked in. They become part of the record.

Anyone vaguely familiar with the 19th-century German philosopher Friedrich Nietzsche knows that some of his ideas were endorsed by Hitler’s National Socialists. That part of Nietzsche got locked in. The Nazis asserted that, with God and morality killed off by Nietzsche, Germany had a natural right to pursue its will to power and rule mercilessly over Europe’s Untermenschen. Whoever lacked the means to stand up to Nazi strength deserved to be extinguished as a weakling.

Wasn’t that the whole idea of Nietzsche’s best-known book, Beyond Good and Evil? Once you have dispensed with the axiom that there is a higher moral law inscribed somewhere above the human plane, all you are left with to guide human behavior is the set(s) of rules we come up with ourselves. Those rules might not have anything to do with good and evil.

Again, if you are vaguely familiar with Nietzsche, you have likely heard, in response to his unwelcome reputation as a pro-Nazi philosopher, some muffled, unconvincing noises by intellectuals to the effect that the Nazis oversimplified Nietzsche and basically got him all wrong.

Before I go on to praise Sue Prideaux’s extraordinarily good 2018 book I Am Dynamite: A Life of Friedrich Nietzsche for its astringent articulation of an actual, effective defense of Nietzsche, I would like to dispense with another, related injustice against the man. In addition to the handful of critics who occasionally make a specific case against Nietzsche as proto-Nazi, there is a whole, broad phalanx of cultural conservatives standing ready to assert that anyone who declares God dead is clearly playing with fire and deserves no conscientious defense of his principles. The principles of God, country, and decency are sacrosanct and must ipso facto be left unquestioned. Without them, society would break down into a Hieronymus Bosch nightmare.

To which Nietzsche would–and did, in a way–reply: Society already is a cluster of delusions, and if we do not rigorously and passionately apply to them the remedies of irony, solidarity and critical thinking, our collective delusions may in the end be indistinguishable from a nightmare.

Nietzsche did not declare that he had killed God. Rather, that we had. He was God’s coroner, not executioner. Look around you, Nietzsche implores–at the material-economic basis of all our everyday desires, at the inexorable growth of faith in science, at the inattention to serious moral reasoning–and you cannot find a single person living as if they believed with any seriousness in the Abrahamic God or the strictures He dictated. You may mouth the words of faith, but actual, sincere belief evaporated ages ago, leaving behind a mere husk of symbol and ritual that we cannot bring ourselves to abandon.

Prideaux’s title is masterfully chosen. To the uninitiated it provokes a kneejerk response along the (old) lines of, “I am dynamite, huh? Okay, once again we have Nietzsche proclaiming with reckless braggadocio that he blew up the foundations of Western culture. Well, Western culture is still here, isn’t it? The Pope is still on his throne.” But why would Prideaux have written the book if she only wished to pound the same old nail into his coffin?

There is, in fact, no nail, no coffin.

Nietzsche, who gradually lost his mind as he was writing his books, did occasionally give voice to braggadocio, or so it seemed. But his claim to be dynamite was more an expression of existential disquiet over his inability to inhabit the customary role of a writer. He wanted to be what it indeed appeared he was, and what it would have been so comforting to go on posing as–a mere man, a critic with something urgent to say that still somehow fit into the normal range of rational disputation.

But instead, Nietzsche removed a veil of hypocrisy so huge that it exposed all of literate society. He felt the weight of this act, and said of it:

I know my fate. One day there will be associated with my name the recollection of something frightful–of a crisis like no other before on earth, of the profoundest collision of conscience, of a decision evoked against everything that until then had been believed in, demanded, sanctified. I am not a man, I am dynamite.

I imagine this last sentence being spoken with an awful, quiet sense of alienation, not boastful pride.

Nietzsche did what others were unwilling to do. His statement that God is dead, Prideaux summarizes, “had said the unsayable to an age unwilling to go so far as to acknowledge the obvious: that without belief in the divine there was no longer any moral authority for the laws that had persisted throughout the civilization built over the last two thousand years.”

Prideaux understates the case, though. There has never been anything other than human will and wit and creativity behind our laws. By saying the unsayable, Nietzsche not only pulled the rug from under us in the here and now but pointed out that, in hindsight, there had never been a rug. Humanity was, and always had been, alone, without a lawgiver. This yawning abyss of nothingness is the one Nietzsche is famous for peering into.

This much is my interpretation of Nietzsche, straight, with only a supporting word or two from Prideaux. Why read her book, and not Nietzsche himself?

Certainly read Nietzsche, but without some kind of scheme for making sense of him, you will almost surely come to feel lost. Nietzsche must be read in order, at least the first time through, and it is essential to know the circumstances that set the stage for each book. Prideaux charts, with remarkable clarity and a completely engrossing narrative, the path Nietzsche followed from one book to the next. Even if you never go on to read a word of Nietzsche, you will gain from Prideaux’s highly lucid retelling of his life and how it shaped his books.

Second, more than that of almost any other philosopher, Nietzsche’s message to the world was influenced and amplified by the first generation of his admirers and critics. They delivered Nietzsche. Without the now little-known Danish critic Georg Brandes, who brought Nietzsche’s writings onto the international stage, we might not even know of Nietzsche today. Almost all of Nietzsche’s books had been remaindered after selling mere hundreds of copies. Pitifully, Nietzsche took to writing his own reviews, because nobody else noticed him. It took a decade after Brandes brought Nietzsche’s writings to the attention of a network of European scholars for Nietzsche’s book sales to take off. By the time his books became a phenomenon, he was done as a writer, mentally incapacitated, completely unable to craft or deliver his own messages.

The heart of Prideaux’s book is the definitive case it makes that Nietzsche’s officious, anti-Semitic sister Elisabeth was almost entirely responsible for warping Nietzsche’s ideas into pro-Nazi slogans and handing them off to the Third Reich. Ordinarily one must turn to fiction to find a villain who so wholly earns one’s hatred and contempt, but Prideaux’s portrayal of Elisabeth as a fraud, a fabricator, a professional liar, and an unquenchable narcissist delivers the real thing. I counted three places in the text where Prideaux shows that it was precisely Nietzsche’s rejection of Elisabeth’s anti-Semitism that caused him to break from his sister. A “QED” might usefully be inserted after each one.

Finally, Prideaux’s retelling of Nietzsche’s life offers an indirect and highly humanizing glimpse into the question of mental health. Everyone knows Nietzsche “went crazy” toward the end of his life. Remember that conservative crowd I mentioned, always standing ready to warn of the dangers of declaring God dead? There’s another message always ready on their lips: that secular intellectualism is a kind of sickness which, taken too far, leads to psychosis and breakdown. Nietzsche had it coming.

There is no point, of course, in addressing this witless attitude directly. With heart-piercing empathy, Prideaux re-describes Nietzsche’s mental decline without the victim-blaming tropes we still haven’t shaken free of. Nietzsche’s father died of a stroke or aneurysm that was clearly preceded by neurological symptoms–symptoms that would likely be diagnosed today. Nietzsche’s own mental decline followed, as night follows day, a fairly ordinary, possibly treatable concatenation of physical conditions. Today’s doctors might have seen it coming. Drugs might have helped.

Nietzsche called himself the “philosopher of perhaps.” He thought humans should stop seeking certainty, that it was bad for us. He preached amor fati, the assertive act of loving life no matter what fate brings your way. What eventually came Nietzsche’s way, in the decade after he broke down and stopped writing, was the existence of a zoo animal. His sister Elisabeth kept him locked in a house that was destined to become a Nazi shrine. Mute, his eyes vacated of life, he was shown off to visitors, all of whom were assholes. In 1933 Elisabeth traded away Nietzsche’s favorite walking stick to Adolf Hitler, in exchange for the new Führer‘s riding crop.

Nietzsche is not one of my life-altering heroes. But I do feel a duty to recall him as someone who is on my side, who doesn’t belong to the Nazis, jackals and troglodytes of the world. There is no way Nietzsche could have accepted the fate of his last ten years of life, nor the damage done to his ideas by his sister during that time. It is up to us to recall Nietzsche for who he was and to give him a fate worth loving. Prideaux’s book helps immensely in this worthy project.