In My Time of Dyin’: A Post about Music

BY MATTHEW HERBERT

The words are from the title of a song Bob Dylan recorded when he was all of 20 years old. It’s a good song, but really, what did he know about dying?

What do I know?

I am much closer than young Robert Zimmerman, who, on the cover of his 1962 debut album, Bob Dylan, looked like he wasn’t even shaving yet. Me, I’m close enough to make certain considerations.

Oh, but before I get into those, this is not an announcement of my imminent demise. I am as unaware today as I was yesterday of things actively trying to kill me.

But I am close enough to understand how time will start fraying soon. I know of the things that will turn time from an airy abstraction into hard reality. The heart, the lungs, the liver–they will all start giving up on the jobs they once did so well, for so long. There’s no way Dylan knew of those things at that age. He was using death the way poets and essayists have always used it–as an idea to focus the mind.

This blog has never been autobiographical. I’ve occasionally written about my favorite hobby, running, and I once made a big deal of nearly dying from too much morphine after back surgery. I also wrote a florid and intimate declaration of love for a hill one time. But by temperament, I keep a lofty focus on the Olympian heights–books, ideas, and the legacy of Orwell.

But I recently started to address a problem I hadn’t even known existed. And that problem is inescapably autobiographical: the matter of final arrangements. Oh, not the legal stuff. I’m a chary bureaucrat by training and habit, so I’ve checked all the boxes that one thinks of as “responsible estate planning.” Of course I’ve done that. The only thing that really matters to me is my ability to care for the small group of Earthlings I think of as my own, so I’m not going to allow myself to fail, through mere oversight, in that mission. (You know all those memes that start with “You had one job?”–I will not have them appended to the Facebook announcements of my Departure. I simply won’t.)

It has nagged me for several years, though, that there are other, more personal arrangements to be made. Money isn’t everything, after all.

Recently I posed myself the following question: If music were to be played at my wake, what would it be? And a further question arose: Would anyone even know where to start making such a playlist? Given the general glumness of the circumstances and the pressing need for buying tombstones and whatnot, would anyone feel like taking this job on? I fear it might get half done, if at all.

And this cannot stand.

Anyone of my generation knows how decisively important a soundtrack is. The Breakfast Club simply does not, cannot come to its proper end without the booming forth of “Don’t You (Forget About Me)” by Simple Minds. The song finishes the story. I’m in search of songs that finish my story.

Well, easy, I thought. My life has a soundtrack, and it is U2’s dark, ironic, but still majestic Achtung Baby of 1991. It says everything you need to know about my inner life: used to be religious, now godless, bonded in some amorphous way to Berlin’s swirl of doom, art, redemption, and American guardianship.

But wait. It’s all very well to have U2’s loudest, most desolate and industrial sounds going through your head literally every day of your life, and to know that the songs are you in a way, but Achtung Baby would be an absolute non-starter at a wake. Take three-quarters of an hour, if you can, and listen all the way through to “Acrobat,” the 11th track on the album. You’re feeling drained, forsaken, and sonically battered by the time it plays. You need a respite of light and air. Instead, “Acrobat” comes on: a buzzsaw of inchoate anguish and rage. All is darkness and moral wrong, it says. Does it project the mood one wants just after a funeral?

It does not.

And this got me thinking: there is an urge to have the last say at one’s leave-taking, but this kind of thing can be taken too far. Last rituals certainly must take the departed as their subject, but they exist for other people. They must take the audience into equal account.

So, I will make time soon enough to write about U2’s formative power over me. There are questions that need answering about how their darkest songs came to score a bright, breeze-kissed life like mine, unmarked by wracked conscience or hint of woe.

But for now, to the task at hand. This is how I got down to the business of choosing the songs I want to be played at my wake, and how they revealed some telling problems.

Balance, I thought. The songs need to strike a balance between what they mean, and meant, for me and what they say to the listener. And I came up with a few promising candidates, but I also came up with even more problem cases. To wit:

“Jokerman,” by Bob Dylan, would be superb, I thought. It showcases Dylan at his poetic best, managing to be wry, wistful, and vaguely accusatory at the same time. The imagery, much of it Biblical, is supreme. The music, nudged along by Mark Knopfler’s understated guitar work, stays in the background, letting Bob spin out a complex warning of apocalyptic menace. “Jokerman” was in.

Why did it beat out other, better-known Dylan songs? I think “It’s Alright, Ma (I’m Only Bleeding)” is Dylan’s greatest poem. It is his apex achievement. But that’s the problem. It’s my wake, and I don’t want people zoning out at it, transfixed by what might be the best song written by a popular musician in the last 100 years. Listen to it on your own time.

Ditto “Hallelujah” by Leonard Cohen, mutatis mutandis. It’s too good. Plus, there’s its unstinting mood of heartbreak, which I presume will be going around freely enough without any prompts from my playlist. I hope for jokes to be told at my wake, sardonic stories to be shared. These things won’t happen if we have Jeff Buckley (doing my favorite version of “Hallelujah”) reminding us how the celestial joy of love is always at risk of being run into the ditch of abject human failure.

I also came to suspect that the effort to avoid the grim or acrimonious note could be taken too far. One of my absolute favorite songs of all time is “Mr. Blue Sky” by Electric Light Orchestra. It is, on grounds both psychological and musicological, the happiest song in the world. And therein lay the problem, as I saw upon reflection. Wouldn’t I come off as trying to tell the audience how to feel, and being pretty heavy-handed at it?

I submit this for your consideration and await your response: Should I omit “Mr. Blue Sky” for being too happy?

This dilemma raised a more general problem. Why not simplify the task and just write out a list of all one’s favorite songs, consigning other criteria to the wind? It’s a tempting schema. But it too sails into choppy waters. “Fat Bottomed Girls” is hands down my favorite song by Queen, because it rocks consummately and it explores a theme that is delightful to me. But do I include it just because of its general excellence? Would I not risk slighting skinny bottomed girls, implying that back in the high tide of life I was indifferent to their presence? Wakes are not the place to feel a small hurt has been done to you, and I refuse to be the cause of even one. I am nothing if not gallant. So “Fat Bottomed Girls,” although a certified sterling favorite, was out.

A few songs were too on the nose, I worried. They seemed to be thrown in because they fit a lax kind of formula. If you hear Jackson Browne’s “Running on Empty” at my wake, you could be forgiven for thinking blandly, “Oh yes, he liked running.” And it’s true, I did like running. But I really like “Running on Empty,” although mostly for its imagery of the road and youth, not because it’s about running. It captures a time of life when the high, white cumulus clouds decorating the skies in one’s 20s start to turn gray and minatory, announcing the coming storm and turbulence of the 30s. Mostly, “Running on Empty” made the cut because of its musical delivery, earnest and bittersweet but not somber.

I also like Bob Seger’s “Against the Wind,” and I enjoy the moment in Forrest Gump when the song is used to conjure the depleted, defiant mindset of the long-distance runner against the backdrop of Monument Valley. It’s a great song about restlessness and fatigue, but it leaves the listener wondering if not knowing where you’re going is an inevitable part of life. Is restlessness a permanent state? “Against the Wind” raises this question but does not answer it. Certain forms of melancholy are bound to present themselves at a wake, but I don’t think I want my celebrants asking themselves whether constant, pointless exertion is the main ingredient in the human condition. Let that thought emerge in its own way. So “Against the Wind” was out.

I definitely wanted a song or two by REM. They have always been one of my favorite bands, and I felt like my playlist would be incomplete without them. They provided the soundtrack to my life in my early 20s, before U2 shattered it and replaced it with Achtung Baby. But here I ran into a variant of the just-list-your-favorite-songs problem. It doesn’t work with bands either, or at least it doesn’t for REM. “It’s the End of the World as We Know It (And I Feel Fine)”? It’s definitely me, but the song is a cockeyed lark. Much as I hope my guests feel free to have zany thoughts, I’m not sure I should make the invitation explicit. “Everybody Hurts”? Lovely song, but please, wouldn’t it be slightly overdoing things at a wake? “Losing My Religion”? This one very well might make the cut, but radio overplay has sapped some of its feeling of originality. Plus people might start doing the choppy-hand dance that Michael Stipe does in the video. Could be weird. I guess it would be okay actually.

Feel free, if the mood takes you (Image: IMVDb)

So what I am left with is a handful of REM songs that strike me as inoffensive to the occasion but so obscure I feel I could mislead the mourners into thinking the songs meant more to me than they did. “Driver 8” fits this description. It’s a quality REM song that I probably listened to hundreds of times in my 20s, but does it signify? In terms of simple musical beauty, “So. Central Rain” is my favorite song by REM, but that plaintive chorus where Michael Stipe says over and over that he’s sorry?–it would almost certainly leave some people wondering whether there was a message there. What would I be doing all that apologizing for? And to whom? People might start puzzling out what the industrial-scale wrong was that I had done and how it had never come to light. But I still love the song.

So I am at a loss REM-wise. I await your suggestions, dear reader.

I think I should close this post by looping back to U2. I can’t just give their whole catalog the boot because their portraiture of my life is too plaintive and morose for a wake, right? They are my band, after all, and there are questions of loyalty at stake. I must find a song or two of theirs that mark my farewell properly. The songs, it turns out, were easier to find than I thought they would be.

“Kite,” from the 2000 album All That You Can’t Leave Behind, is pretty clearly a goodbye to a loved one, but it’s malleable enough that it covers many different kinds of goodbyes. One of a parent’s highest goals in life is emancipation–the moral and practical preparation of a child to stand on their own two feet. It’s a deep paradox, though: if you’ve done it well, you have broken your own heart, let your child go like a kite into the wind. But you have to do it anyway. To leave emancipation undone, or to do it poorly, is to wreck a young life and to risk setting off a broader train of dysfunction. So if it helps to hear Bono put the problem literally, when he sings, “I want you to know that you don’t need me anymore,” you’re welcome. It helped me too.

The second U2 selection was even easier. It was almost perfect for a wake. I couldn’t believe I’d missed it. Also from All That You Can’t Leave Behind, “Stuck in a Moment You Can’t Get Out Of” has a lovely gospel uplift to it. It’s addressed to someone lost, careworn, and temporarily defeated. Bono writes a lot of songs like this. (For a more somber variant not quite wake-appropriate but in every way superb, listen to “Stay [Faraway, So Close!].”) And Bono often tells you there is hope, or maybe something even better, like peace or love or affirmation, on the other side of the troubles. The Edge’s backing vocals at the end of the song–while studio-tuned to artificial perfection: oh, well–serve to complete Bono’s message. If we are to be saved at all, salvation will come through other people. Other people will make us who we are. That’s how we go on, I guess.

And so I close with my actual playlist as it stands, with no further commentary (except to say there is no particular order to the songs–that is a whole other problem). It feels okay to leave it this way. It is not just good manners to resist having the last, overbearing word. It is an unavoidable feature of the wake. The songs will have the last word themselves, and then it is up to other people to go on talking.

Bob Dylan: Jokerman, Like a Rolling Stone, Brownsville Girl

U2: Kite, Stuck in a Moment You Can’t Get Out Of

Jackson Browne: Running on Empty, Late for the Sky

Don Henley: The Boys of Summer

Neil Young: My My, Hey Hey (Out of the Blue), Thrasher, Powderfinger, Comes a Time

Bruce Springsteen: Thunder Road

Fleetwood Mac: Dreams

10,000 Maniacs: Like the Weather, Verdi Cries

REM: Driver 8, Losing My Religion, Don’t Go Back to Rockville

Chris Rea: The Road to Hell

ELO: Turn to Stone, Mr. Blue Sky (?)


Review of “Grant” by Ron Chernow

BY MATTHEW HERBERT

Ron Chernow, it seems, has never met a cliché he didn’t like.

I open randomly to any of the 960 pages of Chernow’s 2017 biography of Ulysses S. Grant, and the dull, timeworn phrases turn out in squads, companies, and whole regiments. On page 561, we read that President Grant “toiled under heavy burdens,” while his longtime aide John Rawlins “felt duty bound to assist him.” (Rawlins was Grant’s Secretary of War–how else was he supposed to feel?)

We learn of a photograph of Grant taken during the Vicksburg campaign. There is, Chernow tells us, “an indescribable look of suffering” on Grant’s face. How does Chernow limn Grant’s supposedly indescribable pain? The general, he writes, has “sad, woebegone eyes.”

When we learn that President Woodrow Wilson, who was raised in Georgia, dismissed Grant’s efforts at postwar reconstruction, it is with this lapidary phrasework: Wilson “consigned President Grant to the dustbin of history . . . .”

As the Battle for Chattanooga came to a successful culmination, “Grant hoped Sherman would reap the lion’s share of glory.” We are touched, later, to know that Sherman stood “ramrod straight” at Grant’s funeral.

Grant “had to show the velvet glove and iron fist at once” while dealing with Indians in the West.

Drearily, there is much, much more. I enjoyed Grant for the most part, but Chernow’s lack of literary seriousness became a distraction. Sometimes I couldn’t help tallying the number of clichés and clumsy phrases on a given page.

Which is too bad. Chernow’s massive biography is largely a success. It corrects a number of misconceptions about Grant and reveals little-known details of his life. Overall, Chernow makes a convincing argument that Grant is both greater and more complex than most of us have imagined him to be.

First, there is the matter of Grant’s drinking. While he certainly had a complicated relationship with booze, Grant was no drunk, at least not in the usual sense. The impression one gets from some Civil War histories (Shelby Foote’s magnificent The Civil War: A Narrative comes to mind) is that Grant spent long stretches of time drinking while on duty and even did so while encamped outside Vicksburg before his breakthrough victory there in July 1863. Marshalling a meticulous string of reports–and winnowing out a substantial number of character attacks by Grant’s political foes–Chernow develops a very plausible, and different, profile of Grant’s drinking problem.

Grant was a distinctly episodic drinker who knew he had a problem with alcohol and never indulged when his family was nearby to provide emotional support. The early sprees that formed the foundation of later slurs and innuendoes all took place in the 1850s while Grant was a junior Army officer stationed far away from his young family, in remote Oregon and California. His isolated slips later in life all followed the same pattern: when Grant fell off the wagon, it was always one night, away from home, and not on duty. If any other patterns marked Grant’s drinking, they were these: how he managed to maintain temperance for months, even years at a time, and how near he came to defeating alcoholism entirely. On a two-and-a-half-year world tour after his second term as president–precisely the time to let down his guard and live a little–Grant steadfastly kept his wine glass overturned even as he was celebrated in palaces, ballrooms, and salons everywhere he went.

Grant’s dogged pursuit of sobriety reflected America’s broader struggle to tame its wild side. In 1822, when Grant was born, Americans consumed the equivalent of 90 bottles of whisky each year on average. By the time Grant died in 1885–refusing a brandy-laced dose of morphine because, as he told his doctor, alcohol didn’t “agree with him”–Americans drank less than half the amount they had at their peak earlier in the century.

The military genius Grant showed in the Civil War was so central to wartime victory, it has overshadowed how hard Grant fought as president to defeat the United States’ largest, most lethal terrorist group–the Ku Klux Klan. As someone who has worked with soldiers for most of my life, I know how bitterly they take it when their battlefield sacrifices are compromised by politicians who abandon the aims they fought for. Grant often felt the same way. But he enjoyed a rare historical opportunity: he was a former soldier who found himself empowered to follow through as a politician to try to secure what his troops had bled for.

Grant after a day of hard losses at Cold Harbor, Virginia in May, 1864 (Image: Britannica)

Chernow’s retelling of the founding of the KKK and Grant’s determination to destroy it puts this episode where it needs to be–front and center in the history of Reconstruction. Grant is justly praised for creating the “spirit of Appomattox” when he accepted Robert E. Lee’s surrender in April 1865. Grant’s generous terms, allowing Lee’s officers to keep their sidearms and his soldiers to take their horses home, were meant to mark a definitive end, not just to hostilities, but to feelings of hostility. And while many southerners accepted this gesture with dignified thanks, many more did not. Confederate General Nathan Bedford Forrest, as the first Grand Wizard of the KKK, led thousands of former Confederate soldiers in the South on a campaign of killing Republicans and recently emancipated Blacks, smashing voter registration sites, burning churches, and resisting all efforts to implement the constitutional amendments ending slavery and ensuring voting rights in the South (the 13th and 15th).

Grant may have been president, but in 1870, as the KKK launched what Chernow rightly calls “a new civil war by clandestine means,” he reverted to thinking like a general. The KKK’s center of gravity, Grant reasoned, was its ability to intimidate anyone who might testify against them in court. So Grant went all in on destroying this center of gravity. Responding to southern governors’ requests for help, Grant sent federal troops to enforce the Ku Klux Klan Act (actually three “Enforcement Acts”), which empowered the government to jail KKK suspects without habeas corpus rights–critically depriving them of knowledge of witnesses’ identities–and to use federal troops to directly suppress KKK activity, doing the job local sheriffs refused to do. By 1872, Grant had smashed the KKK’s power. Forrest resigned as Grand Wizard and recanted his overtly racist political goals.

Of course even the most naïve student of American history knows that Grant (and the nation) did not succeed in achieving the broader aims of Reconstruction. Indeed Chernow does an admirable job of describing how the decline of Grant’s second term as president was more or less coextensive with the demise of Reconstruction. When Grant left office in 1877, the white supremacist networks that would maintain the racist power order of the South were still alive and well despite the defeat of the KKK. They would go on to create the legal structure of Jim Crow and resist civil rights for another 80 years. (Read Eric Foner’s entire body of work on the massive criminal enterprise that defeated Reconstruction and kept racism alive. If you only have time for one of Foner’s books, make it Reconstruction: America’s Unfinished Revolution, 1863-1877.)

We often hear how personal the Civil War was, dividing brother from brother and father from son. The most luminous thread woven throughout Chernow’s book, retold with fine, stoical understatement that makes up for some of Chernow’s general failures of style, is Grant’s friendship with James Longstreet, who would become an acclaimed Confederate general. Grant and Longstreet became friends at West Point, when each was a boy of 18. Depending on which source you believe, Longstreet was Grant’s best man at his wedding to Julia Dent, or was at least instrumental in pairing the two up. (The Dents were family friends of the Longstreets.)

Longstreet and Grant served together in the Mexican-American War in 1846. The next time they would meet on the battlefield, at the Battle of the Wilderness in May 1864, Longstreet would be gravely wounded fighting against Grant’s army. Then, miraculously, Longstreet appears at Appomattox Court House, a senior commander under Lee. He is astonished when Grant treats him as a friend, and Longstreet is instrumental in persuading Lee that Grant will give, and honor, fair terms of surrender. After Grant’s death, Longstreet would call him “the truest as well as the bravest man that ever lived.”

Grant was a hard but idealistic man. He fought the Confederacy with death-dealing determination but then acted magnanimously in victory, hoping mercy would open the door to reconciliation. His genius as a general consisted in an intuitive understanding of a new kind of warfare he was helping to create, which is today called combined arms maneuver warfare. But he was no mere theorist. Grant won, Chernow tells us, because he never let up. His victories were often sealed on the day after a bout of grievous losses. Grant knew the other side would be reeling too, and he judged that that knife-edge moment was the opportunity to win–a victory of the smallest margin would give way to a larger one. And Grant was right. This was the path that he followed to defeat Lee and end the war, which earned him the undeserved label of “butcher.” Grant was not a butcher, but a fierce realist. He knew tomorrow’s peace would come faster the more violence he visited on the enemy today, and that was how he fought.

I have left out a handful of other themes that make Chernow’s book worth reading, especially his description of Grant’s habitual credulity and how it led to a string of corruption allegations. Grant’s surprising ability as a writer late in life becomes less surprising when we learn he wrote as many as 40 detailed orders a day as a general and later wrote all his presidential addresses without the aid of a committee of editors. As Grant was dying of cancer, he finally took on a publisher he trusted to help bring out his memoirs so Julia would have a pension. That publisher was Mark Twain, and Twain, who was not shy about cutting down idols no matter how large, called Grant a “flawless” writer.

A final theme that emerges from Chernow’s biography is how Grant constantly improved himself and constantly reinvented who he was. And he was seemingly unafraid to leave his old self entirely behind in the process. Late in life, when Grant had become, in turn, a driven civil rights activist, a calculating politician, a capable economist, an effusive public speaker, and a writer for the ages, he had completely shed his old identity as a warrior. His steadfast refusal to glorify war and to trade on his status as the general who saved the Union was the highest mark of his greatness.

Review of “The Best and the Brightest” by David Halberstam

BY MATTHEW HERBERT

Critics are supposed to criticize, yes?

If you’ve read more than one of my book reviews, you’ve likely noticed a tone of glad, unbroken praise. Once or twice I’ve used these pages to cast the stern, disapproving eyebrow, but it’s mostly sweetness and light I try to spread. I love books, and, life being short, I mostly read books I think will reward me.

This summer I finally got around to reading David Halberstam’s 1972 masterpiece, The Best and the Brightest. It tells how the most privileged, idealistic constellation of political leaders in the history of the United States committed the long series of moral crimes and strategic blunders that made up our “experience” in Vietnam. Americans’ belief that our government would do the right thing and tell us the truth has never recovered. I think it would be hard to call yourself a student of our national history without reading this book. Do so without delay.

But, as I sat outside enjoying the long German summer along with Halberstam’s classic, I couldn’t help feeling that things, as nice as they were, were perhaps going on too long. Halberstam especially.

So, I’m just going to say it: The Best and the Brightest would be a far better book if it were half as long. As is, it clocks in at 665 dense pages. And don’t get me wrong; it is never boring. Halberstam is a masterful writer with an eye for the revealing detail. But there are just so many of them.

As Halberstam puts each of the major Vietnam players and several of the not-so-major Vietnam players under the microscope, one wonders if we need the same cellular level of detail on the whole cast of characters. While it is illuminating to see how the nerdish National Security Advisor McGeorge Bundy followed a path from Groton to Yale to Harvard, evincing “so much intelligence harnessed to so much breeding,” as Halberstam puts it, we get several more pages illustrating just how and why Bundy’s Harvard years were so happy. He reads the Greeks; comes to be tutored by a little known professor named Henry Kissinger–an important observation but one that should be made and moved on from. We all know who Kissinger will become.

In about half the space he actually takes, Halberstam could have reached his useful, important conclusion, that Bundy, like so many of the men who took us to war in Vietnam, was an example of “a special elite, a certain breed of men whose continuity is among themselves. They are linked to one another rather than to the country.” Defense Secretary Robert McNamara’s background at Ford Motor Company (and, yes, Harvard) is revelatory.

My point in criticizing Halberstam is that his thesis is too important to leave sloshing around in a sea of tidbits and longueurs. The path to Vietnam had two defining characteristics. One: it was forged by a small clique of power elites disconnected from the national will and was therefore undemocratic. Two: it was based on a delusion born of our experience in World War Two that said technical know-how wedded to managerial ability would infallibly deliver war-winning power and insight. The tragic narrative that Halberstam tells with such admirable skill pierces this delusion. America’s best and brightest just knew we were winning in Vietnam. That’s what all the spreadsheets and data points and PSYOPS studies were telling them. But the managerial class missed, even “when they had brought . . . the slide rules and the computers which said that two plus two equals four,” that the most basic rule of politics is that human beings never react the way you expect them to. The Vietnamese got a vote in the outcome of the war too, and our best and brightest ignored it and distorted it and misunderstood it for decades. Until we lost.

So again I say, read Halberstam without delay. Even if you have little interest or education in foreign affairs, it is instructive to see just how damnably wrong we can be about our deepest convictions.

Review of “I Am Dynamite: A Life of Nietzsche” by Sue Prideaux

BY MATTHEW HERBERT

There’s a simple, three-part diagram that everyone learns as the foundation of communications theory. A box or circle marked “sender” occupies the diagram’s left side; a “receiver” sits to the right. In the middle is a big block arrow labeled “message.”

The idea is, the sender can send whatever message he wants–and its intent can be perfectly clear to him–but the interpretation of what he says is ultimately done by someone else. So once a message leaves your mouth, any number of audiences can seize on it and turn it into their own message, starting the cycle anew.

And even though this image implies that messages, once launched, are perpetually in motion–interpreted, reinterpreted and passed along–some interpretations get locked in. They become part of a record.

Anyone vaguely familiar with the 19th-century German philosopher Friedrich Nietzsche knows that some of his ideas were endorsed by Hitler’s National Socialists. That part of Nietzsche got locked in. The Nazis asserted that, with God and morality killed off by Nietzsche, Germany had a natural right to pursue its will to power and rule mercilessly over Europe’s Untermenschen. Whoever lacked the means to stand up to Nazi strength deserved to be extinguished as weaklings.

Wasn’t that the whole idea of Nietzsche’s best-known book, Beyond Good and Evil? Once you have dispensed with the axiom that there is a higher moral law inscribed somewhere above the human plane, all you are left with to guide human behavior is the set(s) of rules we come up with ourselves. Those rules might not have anything to do with good and evil.

Again, if you are vaguely familiar with Nietzsche, you have likely heard, in response to his unwelcome reputation as a pro-Nazi philosopher, some muffled, unconvincing noises by intellectuals to the effect that the Nazis oversimplified Nietzsche and basically got him all wrong.

Before I go on to praise Sue Prideaux’s extraordinarily good 2018 book I Am Dynamite: A Life of Friedrich Nietzsche for its astringent articulation of an actual, effective defense of Nietzsche, I would like to dispense with another, related injustice against the man. In addition to the handful of critics who occasionally make a specific case against Nietzsche as proto-Nazi, there is a whole, broad phalanx of cultural conservatives standing ready to assert that anyone who declares God dead is clearly playing with fire and deserves no conscientious defense of his principles. The principles of God, country, and decency are sacrosanct and must ipso facto be left unquestioned. Without them, society would break down into a Hieronymus Bosch nightmare.

To which Nietzsche would–and did, in a way–reply: Society already is a cluster of delusions, and if we do not rigorously and passionately apply to them the remedies of irony, solidarity and critical thinking, our collective delusions may in the end be indistinguishable from a nightmare.

Nietzsche did not declare that he had killed God. Rather, that we had. He was God’s coroner, not executioner. Look around you, Nietzsche implores–at the material-economic basis of all our everyday desires, at the inexorable growth of faith in science, at the inattention to serious moral reasoning–and you cannot find a single person living as if they believed with any seriousness in the Abrahamic God or the strictures He dictated. You may mouth the words of faith, but actual, sincere belief evaporated ages ago, leaving behind a mere husk of symbol and ritual that we cannot bring ourselves to abandon.

Prideaux’s title is masterfully chosen. To the uninitiated it provokes a kneejerk response along the (old) lines of, “I am dynamite, huh? Okay, once again we have Nietzsche proclaiming with reckless braggadocio that he blew up the foundations of Western culture. Well, Western culture is still here, isn’t it? The Pope is still on his throne.” But why would Prideaux have written the book if she only wished to pound the same old nail into his coffin?

There is, in fact, no nail, no coffin.

Nietzsche, who gradually lost his mind as he was writing his books, did occasionally give voice to braggadocio, or so it seemed. But his claim to be dynamite was more an expression of existential disquiet over his inability to inhabit the customary role of a writer. He wanted to be what it indeed appeared that he was, and what it would have been so comforting to go on posing as–a mere man, a critic with something urgent to say but which still somehow fit into the normal range of rational disputation.

But instead, Nietzsche removed a veil of hypocrisy so huge that it exposed all of literate society. He felt the weight of this act, and said of it:

I know my fate. One day there will be associated with my name the recollection of something frightful–of a crisis like no other before on earth, of the profoundest collision of conscience, of a decision evoked against everything that until then had been believed in, demanded, sanctified. I am not a man, I am dynamite.

I imagine this last sentence being spoken with an awful, quiet sense of alienation, not boastful pride.

Nietzsche did what others were unwilling to do. His statement that God is dead, Prideaux summarizes, “had said the unsayable to an age unwilling to go so far as to acknowledge the obvious: that without belief in the divine there was no longer any moral authority for the laws that had persisted throughout the civilization built over the last two thousand years.”

Prideaux understates the case, though. There has never been anything other than human will and wit and creativity behind our laws. By saying the unsayable, Nietzsche not only pulled the rug from under us in the here and now, but pointed out in hindsight there had never been a rug. Humanity was and always had been alone without a law giver. This yawning abyss of nothingness is the one Nietzsche is famous for peering into.

This much is my interpretation of Nietzsche, straight, with only a supporting word or two from Prideaux. Why read her book, and not Nietzsche himself?

Certainly read Nietzsche, but without some kind of scheme for making sense of him, you will almost surely come to feel lost. Nietzsche must be read in order, for the first time, and it is essential to know the circumstances that set the stage for each book. Prideaux charts with remarkable clarity and a completely engrossing narrative the path Nietzsche followed from one book to the next. Even if you never go on to read a word of Nietzsche, you will gain from Prideaux’s highly lucid retelling of his life and how it shaped his books.

Second, more than that of almost any other philosopher, Nietzsche’s message to the world was influenced and amplified by the first generation of his admirers and critics. They delivered Nietzsche. Without the (now) little-known Danish critic Georg Brandes, who brought Nietzsche’s writings onto the international stage, we might not even know of Nietzsche today. Almost all of Nietzsche’s books had been remaindered after selling mere hundreds. Pitifully, Nietzsche took to writing his own reviews, because nobody else noticed him. It took a decade after Brandes brought Nietzsche’s writings to the attention of a network of European scholars before Nietzsche’s book sales took off. By the time his books became a phenomenon, he was done as a writer, mentally incapacitated, completely unable to craft or deliver his own messages.

The heart of Prideaux’s book is the definitive case it makes that Nietzsche’s officious, anti-Semitic sister Elisabeth was almost entirely responsible for warping Nietzsche’s ideas into pro-Nazi slogans and handing them off to the Third Reich. Ordinarily one must turn to fiction to find a villain who so wholly earns one’s hatred and contempt, but Prideaux’s portrayal of Elisabeth as a fraud, fabricator, professional liar, and unquenchable narcissist delivers the real thing. I counted three places in the text where Prideaux shows that it was precisely Nietzsche’s rejection of Elisabeth’s anti-Semitism that caused him to break from his sister. A “QED” might usefully be inserted after each one of these.

Finally, Prideaux’s retelling of Nietzsche’s life offers an indirect and highly humanizing glimpse into the question of mental health. Everyone knows Nietzsche “went crazy” toward the end of his life. Remember that conservative crowd I mentioned, always standing ready to warn of the dangers of declaring God dead? There’s another message always ready on their lips: that secular intellectualism is a kind of sickness, which taken too far leads to psychosis and breakdown. Nietzsche had it coming.

There is no point, of course, in addressing this witless attitude directly. With heart-piercing empathy, Prideaux re-describes Nietzsche’s mental decline without the victim-blaming tropes we still haven’t shaken free of. Nietzsche’s father died of a stroke or aneurysm that was clearly preceded by neurological symptoms that would likely be diagnosed today. Nietzsche’s own mental decline followed, as night follows day, a fairly ordinary, possibly treatable concatenation of physical conditions. Today’s doctors might have seen it coming. Drugs might have helped.

Nietzsche called himself the “philosopher of perhaps.” He thought humans should stop seeking certainty, that it was bad for us. He preached amor fati, the assertive act of loving life no matter what fate brings your way. What eventually came Nietzsche’s way, in the decade after he broke down and stopped writing, was the existence of a zoo animal. His sister Elisabeth kept him locked in a house that was destined to become a Nazi shrine. She showed him off–mute, eyes vacated of life–to visitors, all of whom were assholes. In 1933 Elisabeth traded away Nietzsche’s favorite walking stick to Adolf Hitler, in exchange for the new Führer’s riding crop.

Nietzsche is not one of my life-altering heroes. But I do feel a duty to recall him as someone who is on my side, who doesn’t belong to the Nazis, jackals and troglodytes of the world. There is no way Nietzsche could have accepted the fate of his last ten years of life, nor the damage done to his ideas by his sister during that time. It is up to us to recall Nietzsche for who he was and to give him a fate worth loving. Prideaux’s book helps immensely in this worthy project.

Review of “If Then: How the Simulmatics Corporation Invented the Future” by Jill Lepore

BY MATTHEW HERBERT

I have a Jill Lepore problem.

Every time I read a new book by the Harvard historian, I can’t help thinking it’s her best. I just start effusing that she is America’s greatest historian. She writes with such ease and appeal, you can almost forget how erudite she is.

Joe Gould’s Teeth: Lepore’s investigation of whether the longest book ever written actually was written, by a madman wandering the streets of New York?–her best.

The Mansion of Happiness: A History of Life and Death: her exploration of human flourishing as it has been defined on the American scene?–definitely her best.

And her magnum opus, These Truths: A History of the United States?–absolutely, unquestionably her best. You can’t be fully American if you haven’t read it.

Which leads me to her latest (2020) book, If Then: How the Simulmatics Corporation Invented the Future. It’s certainly her best.

In 1959, a company was born, Simulmatics. The name was a mashup of simulation and automatic. The guy who came up with it, Ed Greenfield, was hoping the neologism would take off, like Norbert Wiener’s cybernetics, but it didn’t.

Eventually, the company Simulmatics was no more successful than the name, but it is in the telling of Simulmatics’s ambition, its methods and projects, and the ways it failed that Lepore reveals how this company–already lost to memory–anticipated almost every idea that drives social media, advertising, big data, and political campaigning today. Simulmatics truly did invent the future. And then it disappeared.

The basic idea was simple. “The company proposes to engage principally in estimating probable human behavior by the use of computer technology.”

And engage they did.

During the company’s 11-year life, its scientists–known as the “what-if men”–advised John F. Kennedy’s 1960 presidential campaign, nudging him to advocate for civil rights and lean into his Catholicism; they helped the U.S. Army dream up the “strategic hamlets” program and other data-driven ideas for winning the Vietnam War; they helped the FBI predict race riots in the mid-1960s by combining data about networks of people with external political variables that affected their behavior; they got involved in the invention of the internet, at the Defense Department’s Advanced Research Projects Agency. Always trying to predict, systematically, what people would buy, what they would choose, or do.

All of it depended on one guiding idea, the algorithm. For the Kennedy campaign, Simulmatics sorted voters into 480 categories (for example, white, middle-aged housewife in the Midwest), coded their opinions onto punch cards, and started asking their IBM computers what-if questions, like: if Kennedy coddled white southern conservatives on civil rights, what would other categories of voters do? And that’s how they came up with civil rights as a campaign issue. It wasn’t because advocating for civil rights was the right thing for JFK to do; it was a vote getter.

One of Simulmatics’s scientists called this kind of modeling a “people machine.” It was a way of aggregating data about millions of small, individual choices to simulate the way whole groups of people–maybe even an entire country–would act.
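For the curious, here is a minimal sketch, in Python rather than punch cards, of what that kind of what-if query boils down to. Everything in it is invented for illustration: the four voter types, their shares of the electorate, and the support numbers bear no relation to Simulmatics’s actual 480 categories or data (which is also why the shares below don’t add up to one).

# A hypothetical, much-simplified "people machine" query.
# Each voter type maps to (share of electorate, estimated support if the
# candidate leans into civil rights, estimated support if he hedges).
voter_types = {
    "northern_black_urban":        (0.06, 0.78, 0.55),
    "white_southern_conservative": (0.10, 0.20, 0.35),
    "midwest_white_housewife":     (0.12, 0.48, 0.46),
    "northeast_catholic_labor":    (0.09, 0.63, 0.60),
}

def projected_support(posture):
    """Weight each type's support under the given posture by its share of the electorate."""
    total = 0.0
    for share, lean_in, hedge in voter_types.values():
        total += share * (lean_in if posture == "lean_in" else hedge)
    return total

# The what-if question: which posture on the issue nets more projected votes?
print("lean into civil rights:", round(projected_support("lean_in"), 3))
print("hedge on civil rights: ", round(projected_support("hedge"), 3))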

The vision was grand, Lepore tells us:

The scientists of the Simulmatics Corporation acted on the proposition that if they could collect enough data about enough people and feed it into a machine, everything, one day, might be predictable, and everyone, every human mind, simulated, each act anticipated, automatically, and even driven and directed, by targeted messages as unerring as missiles.

Lepore is the rarest kind of historian. She writes from an unmistakable moral perspective (liberal humanism) but without rendering history as a black-and-white morality tale. Despite the atrocious personal behavior of Simulmatics’s what-if men (they drank like fish, cheated on their wives, neglected their children), and despite the monstrosity of what they wrought–a democracy-killing dystopia–Lepore retains an almost Vonnegut-esque gentleness about their foibles, which were, after all, products of history before they became forces of history.

Looking back, she observes, “[i]t would be easier, more comforting, less unsettling, if the scientists of Simulmatics were villains. But they weren’t. They were midcentury white liberals in an era when white liberals were not expected to understand people who weren’t white or liberal.”

Lepore tells the story of how these flawed men created a system that would become a monster, but she tells it with wit, wryness, and often with tragic beauty. The chapter of If Then entitled “The Things They Carried” is in itself a masterpiece of historical writing, not to be missed. It describes in captivating, exquisite detail how Simulmatics gathered so much data about people, analyzed it so thoroughly, and often accurately predicted their behavior in highly particular circumstances, but still failed to understand what was really happening in the United States. They missed the real world undisclosed by their data–the anguish over Vietnam, the draft, and civil rights: the whole direction the country was going.

Much of the “social science” behind Simulmatics was shoddy–or outright chicanery–and in 1971 the company filed for bankruptcy. “Simulmatics died,” Lepore reports. “The fantasy of predicting human behavior by way of machines did not. Instead, it took new forms, forms that depended on forgetting that Simulmatics had ever existed.”

She goes on:

Simulmatics failed, but not before its scientists built a very early version of the machine in which humanity would in the early twenty-first century find itself trapped, a machine that applies the science of psychological warfare to the affairs of ordinary life, a machine that manipulates opinion, exploits attention, commodifies information, divides voters, fractures communities, alienates individuals, and undermines democracy. . . . Long before the age of quarantine and social distancing, Simulmatics helped atomize the world.

So when the wunderkinds of the early 21st century went looking for the venture capital that would make them into zillionaires, they needed to portray everything they were doing as utterly new, as leaving all of history behind. They needed to mystify. Otherwise they might have to admit that their disruptive “breakthroughs”–big data, social networks, targeted algorithms–were just better recyclings of what had already been done. Moreover, they needed to believe, and probably did believe, that they had escaped the orbit of the old, left history behind for pure data.

“The only thing that matters is the future,” proclaimed Google’s Anthony Levandowski in 2018; “I don’t even know why we study history.” But in fact the innovations of the new tech titans embodied the same old flaws and biases that had pulled an earlier generation of technology prophets crashing down to earth. Facebook, Twitter and Cambridge Analytica are all better at what they do than Simulmatics was, but what they do bears the same human stamp, revives the same mad ambition. They might have known that had they studied history. They might have learned that the abrogation of the past for faith in a transformative future “isn’t an original idea,” as Lepore concludes. “[I]t’s a creaky, bankrupt Cold War idea, an exhausted and discredited idea. The invention of the future has a history, decades old, dilapidated.”

Writer’s Block But Worse

BY MATTHEW HERBERT

I’ve always wondered how writer’s block happens. For me, writing is like sculpting. There’s already something in front of me when I start–a feeling, a problem, a book I want to discuss. All I have to do is start working on the thing that’s there waiting for me. Maybe sculpting is actually too artsy a simile. Maybe it’s more like mowing the lawn.

Don’t get me wrong. That’s not to say I’m much good at writing. It’s just that getting started and keeping a flow of words going is not a huge challenge. Indeed the judiciousness it takes to refrain from writing might actually be harder for me than the showy impulse to write. (Christopher Hitchens’s old line comes to mind: Everyone has a book inside them. For most people that’s where it should stay.)

So it came as a surprise when this blog, which for six years had glided along easily and bloviated endlessly, creaked to an unceremonious halt a couple of months ago. It happened, funnily enough, just as I was starting a series of posts on why I find writing so addictive. What I wanted to say is that I write for the same four reasons that my hero George Orwell did, and those reasons keep showing up in my life in one way or another. Orwell lays them out with characteristic clarity and modesty in his 1946 essay, “Why I Write.”

But I never got to the second reason.

As a mere rhetorical device, I’ll pretend for a moment that my blog actually has a readership. What you might say, if you’re part of that readership, is that, of course I got tired of writing. It’s always Orwell-this and 1984-that. Maybe I should move on to something new?

You probably have a point. There is more to life than Orwell. I could try to see it the way Joan Didion sees it. She’s another hero of mine. Or I could tackle a biggish project I’ve always wanted to do on why Kurt Vonnegut is the definitive liberal of the 20th century. Or how lucky we Americans are to be readers at the exact time that the greatest historian of our, well, whole history, is alive and writing–Jill Lepore.

See?–the lawn is there, just asking to be mowed.

This isn’t writer’s block. The problem is not that I can’t put pen to paper. It’s more insidious than that.

The problem is that I feel no generosity toward half of my imaginary audience. I have no desire to commune with those who are cheering on the murder of liberalism. Further, I’m not sure of what I can do even for my sympathizers. Make them feel obligated to leave behind a like or an amen? Further tax their already etiolated ability to pay attention?

If I were to, say, make a cutting argument that the Supreme Court of our land and the various governments of the states are conniving to return us to the Fugitive Slave Laws days, in which southern reactionaries can demand that we northern liberals arrest and return their chattel to them, what would follow? If I were to further comment that Texas, by empowering private citizens to bring ideologically motivated suits against whomever they wish, is deliberately choosing to be like Romania under its communist dictator Ceaușescu–where people spied and ratted on one another all the time–again, what would follow? My liberal friends would cheer; my authoritarian connections (long unfriended) would smolder in scorn, or–much more likely–just ignore me.

So the problem is Orwell again. He said of his lifetime that it was a thoroughly political era, and all writing was therefore political. And then, faced with the worst political threat of all–religiously fueled, nationalist authoritarianism–he stopped writing and went to war, in Spain. Hitchens observes of Orwell’s choice that there was no point in writing an essay detailing fascism: all you had to do was look at the sadists and bullies arrayed on the other side of the republic to know what you needed to know about their “ideas.”

Dear (imaginary) reader, this is not an announcement that I will follow Orwell in trading in my pen for a rifle. There is no need to report me to the police. I am too wedded to soppy liberal morals about the wrongness of shooting other people. Even deeper, I have some vague but insoluble attachment to Abraham Lincoln’s insistence that we Americans are not enemies but friends, that we cannot be enemies. I would feel like a lesser person if I stopped believing that. I love Abe so much. But the will to keep believing him takes all my effort. It makes it awfully goddamned hard to write anything.

The Four Things I Keep Coming Back To | Part One: Sheer Egoism

BY MATTHEW HERBERT

I am stuck with writing as a way of life, in much the same way I am stuck with running as a way of life. I have to do it pretty routinely to feel fully alive, even though it doesn’t come easily to me and I’m not very good at it.

My natural place in the field of writers, as with runners, is somewhere toward the back of the pack. Every time I write one of these posts, I learn my limitations anew: I sweat out a first draft that I feel okay about. I start knocking the rough edges off it and almost invariably end up changing its direction–without intending to–shifting tone, getting lost in asides and so forth. But usually I’ve put enough work into the second draft that I just accept its shortcomings. I polish it once, twice, three times, hit the publish button, and then I still read things back to myself that sound embarrassingly bad. I even manage to choose quotations from great writers that were trenchant and arresting in their original settings but seem off target in the places I try to use them.

And I ask myself: Since the human mind is more or less a representation machine, shouldn’t the act of writing down sentences that describe the world or express our thoughts come naturally to us? Why is it so hard?

Humans are, to my knowledge, the only species capable of being ridiculous, and this because we are uniquely capable of getting representations of simple, basic things so wrong. If cats could write, one gets the impression they would not produce ugly, awkward, or jarringly stupid sentences about the cat world–how or why would they? They would just, without bother, represent cat-facts as they are, right? Why, poor human that I am, with all my talent for error and solecism, do I insist on multiplying my opportunities for humiliation by trying to write things down? Wouldn’t it be better to take the advice of Wittgenstein and simply pass over in silence the things that resist our ability to express them? And isn’t that set of things pretty much all of life?

But I can’t drop it, this compulsion to write. There is a broad motive that underlies the whole attitude of a writer that I find is best expressed in the life of James Baldwin, and I may, in the coming weeks, write about it, but today I want to focus on a subject that I keep coming back to year after year–the four specific reasons Orwell gave for writing in his 1946 essay “Why I Write.” They are all deceptively simple. And for anyone like me, who feels that writing is an organic part of living, they are much more than answers to the question, Why do I write? They are answers to the question, Why do I try to live the way I do?

When Orwell was asked why he wrote, in 1946, he had published scores of book reviews, dozens of essays, his own regular op-ed column in a national newspaper, and eight books, but he was still poor. Animal Farm, the book that would finally make him an international literary star, had come out in late 1945, but it would be several years before it would earn him much money. (In fact, as with 1984, almost all the royalties would come after his death.) Orwell cared so little about the proceeds that while he was still struggling to find a publisher for Animal Farm, one of the things he did was to make sure a Ukrainian translation was issued, for free. Some Ukrainian dissidents, after reading their free copies, would tell Orwell that his grasp of Soviet repression and intellectual corruption was literally incredible: they could not believe Orwell had not lived in the USSR himself.

Hearing this kind of thing was, it turns out, the first reason Orwell wrote. He didn’t care about the money; he wanted to be heard. Known. Admired. All writers do. This is how he put it:

Putting aside the need to earn a living, I think there are four great motives for writing, at any rate for writing prose. They exist in different degrees in every writer, and in any one writer the proportions will vary from time to time, according to the atmosphere in which he is living. They are:

1. Sheer egoism. Desire to seem clever, to be talked about, to be remembered after death, to get your own back on grownups who snubbed you in childhood, etc etc. It is humbug to pretend that this is not a motive, and a strong one.

The key is in that last sentence. It is humbug–the Victorian word for bullshit–to pretend you are not in the writing game for the ego strokes. This crucial piece of self-knowledge is something every writer must attain if s/he is to move on to serious writing, Orwell believes. All writers are “vain and self-centered,” he writes in the same passage, and they’re better off admitting it. It clears the accounts and puts the writer in an honest frame of mind.

This is one of the things I keep coming back to. Orwell’s admission of egoism is a secular version of the religious consciousness that all humans have sinned and fallen short of the Kingdom of Heaven. Orwell seems to be saying (to me, at any rate) that even writers, who set themselves up in a kind of omniscient position, interrogating politics, art, society, history, and so forth and passing exalted judgments on them, remain ego-driven children at their foundations.

I think it was in this same vein that the novelist Orhan Pamuk described with disarming frankness in 2006 why it was so great to win the Nobel Prize. The Nobel is at the top rank of human achievement. Only our most vaunted, learned thinkers earn it. They are imbued, we believe, with cold, Olympian virtues of aloofness and detachment–virtually a race apart from us. But Pamuk punctures this myth delightfully (and delightedly), saying in his Nobel Banquet speech that he once again felt like a teacher-pleasing child:

Actually the question I’ve heard most often since the news of this prize reached me is: How does it feel to get the Nobel Prize? I say, oh! It feels good. All the grown ups are constantly smiling at me. Suddenly everybody is again gentle, polite and tender. In fact, I almost feel like a prince. I feel like a child. . . . In fact now … come to think of it … That is why I write and why I will continue to write.

You can’t get much more egoistic than a prince, right? Everyone around you thinks you are young, smart, handsome, fit to rule the realm. Orwell was right about this state of the writer’s mind, and he was right to put egoism first in his list. Whatever writers tell you their purpose is, you can be sure their primary motive is to be heard, acknowledged, and valued.

And since writing is a way of life for me, I take Orwell’s words as a caution: In everything I do, even if it seems selfless or noble, there must be a part of me that is putting me first, calling out for praise and recognition. I have come to believe that some people think the point of growing up is to deny the existence of, overcome, or possibly even eliminate, this childish, selfish part of oneself. But this is humbug. The child never goes away, and we should not pretend that virtue demands we negate it.

Two Cheers for “Blab” Books: How Corny Quotation Collections Shaped Two of the Greatest Minds in American History

BY MATTHEW HERBERT

Two of the best books I’ve read over the last year have been Abe: Abraham Lincoln in His Own Age, by David S. Reynolds, and Frederick Douglass: Prophet of Freedom, by David W. Blight.

Both men, Lincoln and Douglass, show that anyone can learn wisdom, gracefulness of expression, and moral courage from salutary books, no matter how humble or, indeed, corny. Even the most plebeian texts imaginable can help us furnish our minds beautifully, as the lives of these two giants show us.

One reason we hail Lincoln as a great democrat is his reputation as a rail-splitter–a frontier working man. But this image had only the tiniest grain of truth to it; he put up one rail fence, in 1830. The idea was seized on by Republican senators for Lincoln’s presidential campaign in 1860, and it took on a life of its own. Lincoln actually spent nearly all his pre-political working life behind a desk or arguing in front of a court. His roots in frontier culture, however, are of course real. As a boy, he attended country schools for a total of one year, on and off. Frontier schools were dodgy affairs back then; teacher certification was not a thing and many “teachers” were outright frauds. There were few books to be had and no standardized curriculum. Mostly the children just listened and repeated back what they heard.

“Much of the school day,” Reynolds reports, “was devoted to individual and group recitation. The idea behind these ‘blab’ or ‘vocal’ schools was that information could best be imprinted on the memory if spoken aloud–a habit that stuck with Lincoln, who later irritated colleagues in his Springfield law office by constantly reading aloud from newspapers or books.”

It is in Lincoln’s early reading habits, not in his over-hyped reputation for manual labor, that his real roots as a democrat begin to reveal themselves. Not only did he form his mind from the crudest intellectual clay, but even more importantly, he became completely receptive to cultural elements in his environment that were as eclectic as the content of his blab books. “[H]is mind was fed early on by all kinds of sources, high and low, sacred and secular,” Reynolds writes. As an adult Abe would speak one moment like a preacher, the next like a barroom raconteur, full of earthy jokes, then quote Shakespeare.

Among the most formative of Lincoln’s school books were “William Scott’s Lessons in Elocution, The Kentucky Preceptor, Noah Webster’s The American Speller, . . . and Lindley Murray’s The English Reader.” Lest we dismiss these eclectic, archly didactic books as mere stage-setters that opened Lincoln’s mind to finer literature later in life, they in fact stayed with him. Lincoln carried ideas and passages from these odd, humble books all his life, developing a great capacity for memorizing texts from them and invoking them later.

Reynolds writes, “Lessons in Elocution included literary passages such as the soliloquy of Hamlet’s uncle on the murder of his brother (‘Oh! My offense is rank; it smells to heaven’), which Lincoln would spontaneously recite during his presidency.” From Aesop’s Fables, Lincoln took with him the image of bundled sticks, the strength of which he invoked “in a political circular . . . encouraging his fellow Whigs to act in unison rather than separately.”

These are just two examples of the scattershot collection of texts that shaped Lincoln’s mind. What mattered about the passages he memorized was not always their inherent greatness. Some were homely and modest, some scandalous, some preachy, and many–about spelling or grammar–destined to be outdated. But in all they reflected an amalgam of American impulses and ideas, a bounty of differing viewpoints that seemed in a way to embody the “multitudes” that captivated Walt Whitman.

The lessons Lincoln took from his school books were simple but powerful. The first was the importance of clarity. Though we recall the language of the 19th century as florid and meandering (just try getting through the longueurs of Melville or Hawthorne), Lincoln led Americans into a new linguistic paradigm of brevity and precision. Say exactly what you mean, was the new injunction. But Lincoln also managed to cultivate a sense of style that gave his words literary power and moral weight. He delivered the most important speech in American history, the Gettysburg Address, in only three minutes–a mere ten sentences that defined a whole new model of political language. To this day we still believe that anyone with something to say should be as clear and brief as possible, but without sacrificing beauty or style.

Lincoln also learned from the blab schools and quotation books that texts are intrinsically worth committing to memory. He didn’t know he would become president when he started memorizing all those lines; they just spoke to him. We can learn from Lincoln that this remains a habit worth emulating. If a passage of a poem, essay, play, or novel speaks to us, we can and should carry that passage with us. Words anchor us to the world, with all its wonders and trials. When we have nothing else, they are there to guide us, as they did Lincoln, during the gravest tests of human wisdom and courage.

Ringing literary allusions do not merely reflect our inner selves, though; they connect us to others. This was a third lesson Lincoln learned from his school books. A good communicator must know his audience if he wants to relate to them. It is a lesson tailor-made for a politician, but it applies to the rest of us too. One of the things we say about experts and academics when they talk is that they are “off in their own world.” How true! They only seem to relate to their own kind. Lincoln’s school book readings taught him there are all kinds of people in the world, and that to be fully human–especially in a democratic society–one must understand them and empathize with them. This starts with speaking their languages.

It was in the winter of 1830 that young Abraham Lincoln discovered a school book called The Columbian Orator. Like the other school books already mentioned, it was a hodgepodge, a collection of texts laid out in no particular order but with the ring of nobility to them. As its name implied, The Columbian Orator was designed to teach effective public speaking. The winter after Lincoln began reading his copy of the book on the Illinois prairie, the young Frederick Bailey, 900 miles to the east, in Baltimore, would acquire his.

We know Bailey today by the name he took after escaping slavery–Frederick Douglass. In a way that mirrors the young Abraham Lincoln’s personal history, Bailey-Douglass was also the product, pedagogically speaking, of whatever school books were to be found in his immediate environment. David W. Blight recalls in Frederick Douglass: Prophet of Freedom that young Bailey acquired his copy of The Columbian Orator from Irish friends of his, schoolboys who carried their books with them when they met him on the streets of Baltimore.

As it turned out, The Columbian Orator did have a guiding theme, chosen by its editor, Caleb Bingham, but it would have been hard to tease it out of the book’s haphazard contents. Blight writes of Bingham’s book,

[Its] eighty-four entries were organized without regard for chronology or topic; such a lack of system was a pedagogical theory of the time designed to hold student interest. It held Frederick Bailey in rapt attention. The selections included prose, verse, plays, and especially political speeches by famous orators from antiquity and the Enlightenment. Cato, Cicero, Demosthenes, Socrates, John Milton, George Washington, Benjamin Franklin, William Pitt, Napoleon, Charles James Fox, and Daniel O’Connell . . . all appear at least once, and some several times. Most of the pieces address themes of nationalism, individual liberty, religious faith, or the value of education.

(Image: Open Library)

Taken as a whole, the book was, if the term is not too disparaging, a ragbag. I mean this word in the same way Orwell did when he used it to describe a genius no less than Shakespeare. What Orwell meant, and what I mean, is that a compelling voice–like the one in The Columbian Orator–can impart great wisdom even if it fails to deliver systematic understanding, minute design, or even erudition. It is in the powerful expression of an idea that the reader (or hearer) can see a life-changing truth as in a flash of lightning or hear a higher call to duty as in a clarion note. It’s the voice that matters.

As Frederick Bailey recited the words of The Columbian Orator to himself, essentially undergoing the same rote exercise in elocution and memorization that Abraham Lincoln did in the frontier schools of Illinois, those words took shape and caught fire. This sort of awakening was exactly what the passages in The Orator were meant to produce. Bingham, the collection’s editor, was a Dartmouth-educated abolitionist. He had chosen the texts for the Orator to showcase the central, founding idea of America–that each individual is sovereign and may not be owned or ruled over by others. Without saying the words “slavery” or “abolition,” Bingham assembled The Columbian Orator to teach the reader that slavery was un-American and indeed was at war with the liberal arc of history. In its pages, American school children, Blight tells us, “would have repeatedly encountered irresistible words such as ‘freedom,’ ‘liberty,’ ‘tyranny,’ and the ‘rights of man.'” It was “a vocabulary of liberation.”

Throughout his life, Douglass would refer to his copy of The Columbian Orator as his “rich treasure” and “noble acquisition.” He carried it with him when he escaped slavery. The Orator’s promotion of American ideas poured “floods of light,” Douglass recalled, “on the nature and character of slavery, . . . penetrat[ing] the secret of all slavery and oppression.” Put simply, America would not have in its cultural possession one of its greatest books, Douglass’s epochal autobiography Narrative of the Life of Frederick Douglass, an American Slave, without young Frederick Bailey’s chance acquisition of The Columbian Orator, that stiff, eclectic, grandstanding collection of liberal ideas. One of our great prophets might not have found his voice. And the chorus that eventually called for America to hold true to its ideal of freedom would have lacked its most plangent, powerful tones.

Review of “Black Earth: The Holocaust as History and Warning” by Timothy Snyder

BY MATTHEW HERBERT

I was going to open by saying what a tragedy it is that we still need books about the Holocaust. It is the quintessential crime against humanity; we are supposed to be past it now. But genocidal war fueled by the Big Lie has made something of a comeback with Putin’s invasion of Ukraine. “Sane,” “rational” people with an “elected” government are waging a brutal, murderous war to destroy, not just a country, but a nation, in Europe. Here we are again.

Still, we must deal narrowly with the events of today, right? How much can a liberal democrat like myself gain by reading a new history of the Holocaust? All it can do, seemingly, is highlight again what has been clearly and repeatedly established as humanity’s worst, most monstrous failure. Isn’t rehashing irremediable atrocities a kind of political pornography? On this reasoning, Lublin, Treblinka, and Auschwitz cannot really tell us much about Bucha, Mariupol, and Kramatorsk.

But maybe the recurring justification for revisiting the Holocaust lies in the audience, not the subject matter. I come from a country, America, where the people think they are naturally too virtuous to commit genocide, and I live in a country, Germany, where atonement for Nazi crimes has become so routine and ubiquitous that it can feel like a hollow ritual.

I am, it turns out, precisely the kind of person for whom Timothy Snyder wrote his 2015 book Black Earth: The Holocaust as History and Warning. As someone who thinks we understand Nazism and its crimes, I tend to believe that unformed sorrow and a stark pledge of “never again” are all we can offer the wounded human race after the Holocaust. It was all so much pure evil. But Snyder insists we still get the political facts of the Holocaust wrong, and thus we risk repeating it.

The actuating force of the Holocaust was not pure, unmotivated evil, according to Snyder. It was the intersection of humiliated nationalism, economic crisis, and conspiratorial racism. At the crossroads where these elements come together, Snyder argues, “few of us would behave well. There is little reason to think that we are ethically superior to the Europeans of the 1930s and 1940s, or for that matter less vulnerable to the kind of ideas that Hitler so successfully promulgated and realized.”

Before I offer my views on how history is repeating itself today, let me lay out Snyder’s main thesis and arguments. The effectiveness of Hitler’s genocide, Snyder writes, can be measured geographically, on the map of Europe. Although westerners typically call to mind the Third Reich’s deportation of Jews from Western Europe, it was in the East that the mass killing was most effective. More than 90 percent of the Jews in Poland, the Baltic countries, and the Nazi-occupied parts of the USSR were murdered, whereas half or more of the Jews in western Europe survived. Ironically (if the word can be used), Germany’s Jews had one of the highest rates of survival.

The East became Ground Zero for Nazi genocide, Snyder argues, because of how thoroughly Germany destroyed the state institutions there. When government authority was swept away by the Wehrmacht’s Blitzkrieg, so was the link to the rule of law. Even Hitler’s re-introduction of the law of the jungle, though, was only an enabling condition for the Holocaust. Had the situation been as simple as Einsatzgruppen rampaging without any laws to constrain them, Snyder argues, far fewer Jews would have died. The Nazis acting alone simply didn’t have the capacity to kill by the millions. What they needed, Snyder argues, was a large cohort of highly motivated local collaborators.

And they got them. This is the heart of Snyder’s argument: the unique tragedy of Eastern Europe during World War Two was the fact of double occupation–military conquest by the Soviets and the Nazis in turn. To signal loyalty to each occupying power, or often just to survive, thousands of eastern and central Europeans actively contributed, albeit in different ways, to the wholesale killing of Jews. When the Soviets occupied eastern Poland in 1939, they divested Jews of their property and businesses, since private ownership was anathema to communism. Non-Jewish Poles moved in and occupied the stolen property. When the Nazis invaded, they co-opted the new Polish property owners into a scheme to kill the former owners, “racializing” what had been a purely political oppression under the Soviets.

In some areas Jews formed partisan groups that proved successful at killing Nazi occupiers. When the Red Army began liberating those areas in 1943, Stalin ordered the partisans killed so they could not claim credit for helping win the war. These patterns repeated themselves in dozens of variations across the East, and the Jews got it coming and going, from the Nazis and the Red Army alike.

The historian’s first task, of course, is to present the facts faithfully. Snyder obviously would not have written Black Earth had he not believed he was unearthing some new, objective evidence about what the Holocaust really was. But it is the moral of his book–the warning–that gives it urgency. If we persist in seeing the Holocaust as an instance of utterly unintelligible evil, he writes, we could blind ourselves to its central mystery–how ordinary people carried it out. How we might carry it out again.

“I am a normal man with normal needs,” says Paul Doll, an imaginary SS death camp commandant in Martin Amis’s 2015 novel The Zone of Interest; “I am completely normal. That is what nobody seems to understand.” The novel’s action is set in 1942 and 1943, as it is becoming evident that Germany is losing the war. But even as the Wehrmacht’s military conquests slow and then go into reverse after Stalingrad, the genocide in the death camps picks up pace. Doll has to clear newly arriving trains every day. “We cannot cope with the numbers,” he complains. It dawns on Doll that the death camps have become Hitler’s main effort. The war for Lebensraum is being lost. So the genocide must be sped up. Amis’s unflinching theme in The Zone of Interest is the examination of each character as someone who is, or once was, normal but is now caught up in the reality-bending circumstances of Hitler’s doomed killing frenzy.

“Under National Socialism,” reflects Amis’s protagonist, “you looked into the mirror and saw yourself. You found yourself out. . . . We all discovered, or helplessly revealed, who we were. Who somebody really was. That was the zone of interest.” And in a way, this is the zone of interest for Snyder as well, to insist on seeing the actors in the Holocaust, major and minor, as normal people. They believed a Big Lie when it was credible, in the 1930s, and then became part of the Big Lie’s monstrous bloody reality even after it passed beyond belief in the 1940s.

In Ukraine we are witnessing a hinge moment in history, where just such a transition is happening in real time. Vladimir Putin’s idea of Russia as victimized, surrounded, and unfairly constrained has been fermenting into a mass Russian movement for decades, and it is now exploding into a justification for genocidal murder. Just as Hitler drew an organic link (where there was none) between a real strategic adversary–Soviet Communism–and a helpless, demonizable people–the Jews–Putin has pulled off the same diabolical maneuver. Germany was by natural right a strong, forthright nation, Hitler said, deserving of whatever wealth, land, and power it could grab. That’s just Realpolitik in its purest form. But the Jews, he claimed, had devised a global conspiracy of liberalism that kept Germany in check. It wasn’t fundamentally the Jews who were an obstacle to Germany’s greatness; it was the alliance they created. The alliance was too big to conquer, so Hitler went after its putative source: Jewishness.

While some of the details differ, this view of geopolitics is far too close to Putin’s to be ignored. The Ukrainians must be subdued, he says, not because they themselves are a threat to Russia’s greatness (they are nothing but homosexuals, leftover Nazis, and drug addicts), but because these weaklings have tricked NATO and the EU into constraining Russia, and in doing so have committed a fatal sin against Russian greatness. Tragically, I think it is plausible that Putin will turn to mass killing as the only achievable war aim left to him once it becomes clear that he, like Hitler, is losing the war.