The Meaning of Haymarket Square: How Marxists Won the Eight-Hour Day for Working Americans

BY MATTHEW HERBERT

There is no “straight” history of the United States. The standard version is the one told from the perspective of organized money; it is not just a plain, objective recounting of facts.

In his 1980 masterpiece, A People’s History of the United States, Howard Zinn informs the reader from the get-go that his perspective will be different. He will tell the story of America from the view of Americans who were on the receiving end of oligarchy–the poor, the oppressed, the disenfranchised, the silenced, excluded and ignored. The ninety-nine percent, if you will.

(And by the way, in 1980, assuming that America was an oligarchy turned out to be a pretty good working hypothesis. In 2014, two political scientists–stout guardians of the status quo if anyone is–concluded that Zinn’s assumption was sound: our country was run by a small group of economic elites with exclusionary access to power.)

I am heartily enjoying re-reading Zinn these last couple of weeks. To me it’s a puzzle that so many “ordinary” Americans seem allergic to Zinn, or any kind of critical retelling of our history. (The 1619 Project wars, anybody?) If we are indeed the home of the brave, why should we fear hearing voices from the past that have been forgotten or marginalized or blotted out? Can our faith in our founding ideals not withstand the testimony of ordinary, powerless people?

I don’t do a lot of flag-waving, but I would only want to belong to a country that keeps digging up its past and trying out new versions of its history. Any other kind of country is not free–and is also deeply uninteresting, which is another kind of problem.

If the authorities get the people to forget all the “irrelevant” facts that have been winnowed away to create the official version of history, eventually the unofficial version will simply die away. Memory and public record are the only things that enable us to think honestly about who we used to be and how we have changed. If the authorities can manipulate those two things sufficiently, they can create a history of themselves that is impervious to examination. Future political scientists will not get to call them oligarchs. We will be forced to believe their version as the only one. It will be as if the state’s antagonists never existed. And weren’t we born from an antagonistic movement? Didn’t Jefferson himself say that a democracy needs rebels?

So with that thought in mind, today I want to showcase Zinn’s recounting of the history of Haymarket Square. As an American living abroad most of the last 30 years, I’ve been quizzed more than once about this event. Almost every developed country east of the Azores celebrates May Day. The date commemorates a victory of labor over capital.

Furthermore, almost every educated European also knows two slightly incongruous things about Haymarket. They know that the events in Chicago are venerated almost exclusively by the political left, and they know that America today is curiously devoid of memories of Haymarket. It’s not just that we have forgotten about it. We don’t seem to want it in our history. We even moved our version of Labor Day to a whole new month to keep it free of socialist taint.

Generally speaking, my interlocutors are not setting me up for a gotcha moment when they ask about Haymarket. They genuinely want to know how such basic information about it could have been purged from the public consciousness in the very country where it happened. It’s puzzling.

Zinn introduces his retelling of Haymarket by recalling a poem of the day, “My Boy.” It goes

I have a little boy at home,
A pretty little son;
I think sometimes the world is mine
In him, my only one . . .

’Ere dawn my labor drives me forth;
’Tis night when I am free;
A stranger am I to my child;
And stranger my child to me.

When we think of the labor movement, we think of strikers demanding two things–higher pay and better conditions. But the subject of this poem doesn’t just want a job that’s better remunerated or safer or easier to do. He wants a life. He wants a day that is not so wholly dictated to him that it forecloses privacy, agency, and the ordinary human bonds of love.

By 1886, labor movements across the country were gaining momentum. The workers had nothing to lose but their chains, to paraphrase a Certain Someone. Zinn recounts that, from the days of Revolutionary America onward, laborers had worked 12- to 16-hour days, and many considered a mere nine-hour shift on Saturdays a godsend. They were paid poverty-level wages across almost all industries.

As the Industrial Revolution gathered force, producers’ need for labor skyrocketed, and by the Civil War, cities across America (mostly in the North) became huge slums of the working poor. Contrary to Horatio Alger myths, there was no way up and out of the slum, and this was by design. The system needed those masses of the powerless, immiserated poor to stay where they were and spend their every waking hour working.

When the American Federation of Labor called for nationwide strikes on 1 May 1886, it had the explicit goal of ending the working person’s entrapment in a workday that permitted no private life, no time to be anything other than a factory hand. One group in Chicago that answered the AFL’s call, and indeed anticipated it, was the Central Labor Union. Led by two Marxists, Albert Parsons and August Spies, the CLU had published a manifesto the previous year. Here is the main part of it:

Be it Resolved, That we urgently need the wage-earning class to arm itself in order to be able to put forth against their exploiters such an argument which alone can be effective: Violence, and further, Be it Resolved, that notwithstanding that we expect very little from the introduction of the eight-hour day, we firmly promise to assist our more backward brethren in this class struggle with all means and power at our disposal, so long as they will continue to show an open and resolute front to our common oppressors, the aristocratic vagabonds and exploiters. Our war cry is “Death to the foes of the human race.”

Even without the advanced state of sleep science today, common sense and normal bodily imperatives tell us we need about eight hours of sleep each night. The oligarchs of 1886 America said that was all a worker needed, period: eight hours of sleep and 16 hours of work. You would have had no use for any life outside the factory and your meager bed. You were a mere extension of the machine you tended.

When the CLU had the temerity to assemble thousands of strikers in Haymarket Square, Chicago on 1 May against this idea, the authorities sent out the police, as usual. Many strikers dispersed under fire; many others were arrested. Spies wrote a fiery pamphlet calling for stiffer resistance, and on 4 May a smaller group of protesters turned out. What they didn’t know was that someone, quite possibly an agent provocateur, was among them, and at the end of the gathering a bomb was thrown at the police, killing seven of them.

Image: History.com

With no physical evidence to identify who threw the bomb, the public prosecutor went after Parsons, Spies and six other CLU leaders. The lack of evidence was no barrier to achieving justice. Zinn recalls, “The evidence against the eight anarchists was their ideas, their literature; none had been at Haymarket that day except Fielden, who was speaking when the bomb exploded. A jury found them guilty, and they were sentenced to death.”

As Christopher Hitchens reminds us about the Catholic Church, it is worth remembering what it was like when it was strong, even though it seems docile today. The Inquisition and the Church’s other atrocities cannot simply be tossed down the memory hole.

I take a similar lesson from Zinn’s history of the contest between labor and capital in 19th-century America. Give capital enough power, and it will deny that you are even a human being. It will find a way to deprive you of a life of your own, and it will pay for “respectable” courts to convict you of thought crime if you demand more. For that is indeed what Parsons and Spies were convicted of. They were hanged in 1887. They were executed for thinking that workers deserved to have their own lives–a third of their day in which they could love, loaf, read, garden, or do whatever made them them. It is vital that we be able to recall a time in our history when this idea was deemed so dangerous that the state dispassionately killed its authors.

Parsons and Spies did not win the eight-hour work day alone, but they did spearhead its victory and made the ultimate sacrifice for it. I quoted their manifesto at length on purpose. It’s a discomfiting document. Its authors are Marxists, and their prose shows it–turgid, militant, straddling a line between peaceful protest and violent rebellion. It is an all-American document. If you feel entitled to your eight-hour workday, as I readily admit I do, take a moment to remember that the bravest, most committed partisans of this privilege–all-American Marxists–were hanged for bringing it to you. That is the meaning of Haymarket Square for me.

Howard Zinn and the Legacy of Orwell

BY MATTHEW HERBERT

For students of Orwell, Howard Zinn’s 1980 masterpiece, A People’s History of the United States, provides an endless source of inspiration and reflection. All history writing, Zinn argues, is a recounting of facts, but the choice of which facts to recount is shaped by ideology. So when you look back at the stories that historians tell–or when you want to tell a new story of your own–you must try to be critical and candid about which ideology is at play.

Orwell wrote that his only talent was a “power of facing unpleasant facts.” And this talent was not merely an idle form of pessimism. Orwell’s most militant attacks on the status quo were calculated, deliberate and, yes, optimistic. He saw around him a society of fundamentally decent people who were nonetheless blind to the mass thievery, subjugation, and brutality of the colonialism that provided their income. Orwell wrote over and over again that the English people’s comfort and gentleness–national virtues which he sincerely admired–were based in a widespread ability not to see what was right in front of their noses. He wanted them to do better.

Orwell was also devastatingly frank about his own ideologies. Near the end of his life he wrote in an essay that since 1936, when he had gone to Spain to fight for the republic, everything he published had been “propaganda” for social democracy. Of course he believed he was writing the truth, but he didn’t believe that any writer could work completely objectively: it was impossible to write about anything of importance, Orwell believed, from an ideologically neutral perspective.

From the moment you crack open A People’s History of the United States, you see Zinn exuding a kind of intellectual courage that would do Orwell proud. The standard view of American history is not factually wrong, Zinn writes, but it dissembles, overshadows or understates so many unpleasant facts that it ends up telling a deeply warped version of how our country came to be. Every single-volume history of the United States we had been handed before 1980 invited its readers to avoid seeing what was right in front of their noses.

All honest historians know, Zinn writes, of Columbus’s cruelty and greed in “dealing” with the native Arawak people of the West Indies. Columbus himself records that one of his first acts was to “take by force” several Arawaks to interrogate them about the location of gold. He also records how he enslaved thousands of Arawaks and sent them back to Spain. Tens of thousands more were to follow. Many were simply massacred. Once Columbus landed on their shores, the Arawaks’ history became one of subjugation, enslavement and death.

The standard histories do not lie about such devastation; one historian even calls it genocide, yet in the same breath tells us Columbus “had his faults” but must be remembered for his seamanship. What the historian accomplishes by this kind of subterfuge is, Zinn writes, worse than lying. The loyal historian

refuses to lie about Columbus. He does not omit the story of mass murder; indeed he describes it with the harshest word one can use: genocide.

But he does something else–he mentions the truth quickly and goes on to other things more important to him. Outright lying or quiet omission takes the risk of discovery which, when made, might arouse the reader to rebel against the writer. To state the facts, however, and then to bury them in a mass of other information is to say to the reader with a certain infectious calm: yes, mass murder took place, but it’s not that important–it should weigh very little in our final judgments; it should affect very little what we do in the world. . . . To emphasize the heroism of Columbus and his successors as navigators and discoverers, and to de-emphasize their genocide, is not a technical necessity but an ideological choice.

To weigh and arrange the facts of history in a certain way is an ideological choice. And just as Orwell tells us plainly that his writings are all “propaganda,” Zinn tells us what his ideological choice is: he will tell the history of America focusing on the lives of the forgotten, subjugated, murdered, silenced and oppressed. Such a history may not give us a perfect picture of who we are or how our country came to be, but it would–and does–go a long way toward correcting a historical narrative shaped only by the ideology of conquest, money, and property.

Orwell’s Attack on Christianity in “1984”

BY MATTHEW HERBERT

By the time Orwell wrote 1984, near the end of his life, he was very much over God. He had been for a long time. He recalls in a 1947 essay, “Till the age of about fourteen I believed in God, and believed that the accounts given of him were true. But I was aware that I did not love him. On the contrary, I hated him, just as I hated Jesus and the Hebrew patriarchs.”

Strong stuff.

Young Eric Blair was rebelling against the most outrageous commandment in Christianity–to love, fear, and worship the very God who had created him sick with sin. It was a crazy-making idea. A sane, whole person cannot love his tormentor, certainly not on command, and young Blair knew it. “[A]t the middle of one’s heart,” he believed, “there seemed to stand an incorruptible inner self,” and this self had to stay sane, even if it meant living with the consciousness that he was a mere mortal animal, mastered in the end by time, fate and chance, not bound for victorious, eternal glory.

Why did Orwell, a committed liberal, do so little to promote the cause of secularism, which seeks to pierce the very first authoritarian code we are taught as children, the code that, in a sense, fathers all the rest? Orwell writes almost nothing in his maturity about the benefits of losing one’s religion. It seems that once he had outgrown the idea of God at the age of 14, he dropped it entirely.

Well, that’s not exactly true. He did write a whole novel about theism and atheism, A Clergyman’s Daughter, but it is remarkably thin stuff, theologically speaking. It pivots on no towering clash of ideas, is harrowed by no Dostoevskyan “furnace of doubt.” It is much more English than that. After Orwell’s heroine, Dorothy, stops believing, he describes her change in meteorological terms:

There was never a moment when the power of worship returned to her. Indeed, the whole concept of worship was meaningless to her now; her faith had vanished, utterly and irrevocably. It is a mysterious thing, the loss of faith–as mysterious as faith itself. Like faith, it is ultimately not rooted in logic; it is a change in the climate of the mind.

Today we might say it was Orwell’s “lived experience” as a liberal democrat that caused his faith to vanish without fuss or ceremony. Arguments and logic played no role in his mental liberation, so he made no point of promoting them. Just give people time to outgrow the myths and superstitions of religion, and they will do so, he seemed to be thinking.

Although this passivity, I believe, characterized Orwell’s abiding attitude most of his non-believing life, I was wrong to think he let A Clergyman’s Daughter express his last word on Christian theism. The closing chapters of 1984 aim a savage blow directly at the core imperatives of Christianity. Furthermore, it is clear that the dying Orwell still hates the very things that outraged him as a school boy: the commandment to love and worship a sadist who claims the power to bend reality itself and who demands the subject, under pain of torture, connive in this self-aggrandizing fraud.

Everyone who has read 1984 recalls the broad outline of what happens to Winston Smith in the Ministry of Love. O’Brien tortures him to the point where he believes–with apparent genuineness–that 2 + 2 = 5.

But I hadn’t noticed until my latest re-reading the specificity with which Orwell attacks certain Christian principles as inherently totalitarian.

Evangelical Christians’ belief in a “young earth” created by God and fully furnished with familiar animals is well known. The most literal interpretation of this dogma is the basis of James Ussher’s famous “calculation” that the earth was created on 23rd October 4004 BCE.

Although I’ve read 1984 a dozen times or more, I’d never paused at the episode where O’Brien propounds Big Brother’s version of the same theory. Winston cannot accept O’Brien’s claim that the party has fully mastered material reality and indeed “make[s] the laws of nature.” Winston objects that the party cannot even achieve military mastery of the whole planet, which is itself “a speck of dust” in the vast darkness of the universe. How could it possibly claim to make the laws of nature?

“Nonsense,” O’Brien replies. “The earth is as old as we are, no older. How could it be older? Nothing exists except through human consciousness.”

When Winston objects that the “rocks are full of bones of extinct animals” that were alive long before humans, O’Brien rebuffs him with words that could come straight from a creationist pamphlet, “Have you seen those bones, Winston? Of course not. Nineteenth century biologists invented them.”

It is not so much the substance as the logic of O’Brien’s argument that indicts Christian dogma. The human mind is a blank slate, according to both Christianity and Big Brother. Agents of good and evil contend to inscribe things on the slate. Inscribe enough things, and a narrative takes shape. If one side gains the power to blot out everything written by its opponent, that side completely and utterly controls the narrative. It might as well control all of reality.

Winston knows this. In his job at the Ministry of Truth, he blots things out for a living, revising old books and newspapers to reflect the ever-changing party line. If the party expels an official, the facts of the official’s life are edited to fit his status as an enemy and traitor. All evidence that he once was good goes down the memory hole. In the torture chambers of the Ministry of Love, O’Brien explains to Winston that revising books and newspapers is just the beginning of what the party can do. Controlling history is good, yes, but controlling all of reality is the party’s real objective.

Furthermore, there is a shortcut to achieving this control. There is no need to expand the Ministry of Truth to be able to revise all the world’s texts. “We control matter,” O’Brien intones, “because we control the mind. Reality is inside the skull. You will learn by degrees, Winston. There is nothing that we could not do.”

Christianity, and indeed most other religions, makes the same naked invitation to power that O’Brien is making. When O’Brien says that scientists “invented” the fossil record, he opens the field for the individual to believe any alternative facts or theories whatsoever. Convince the individual that objective, mind-independent reality has no authority to rule his thought, and you become that authority. This is the power of cults, conspiracy mongers, dictators, and, yes, religions. It puts the faith-bearer at the center of a universe that makes no sense without his consent.

But how do you get a sane, reasonable person to believe the impossible–really believe? Perversely, it all comes back to the “Christian” commandment to love one’s tormentor.


In the interrogation room, Winston is belted to something like a dentist’s chair; electrodes are attached to his body, and they are used to administer whatever strength of shock O’Brien chooses. O’Brien has a pain dial.

Even though Winston knows it will draw another shock, he cannot accept O’Brien’s philosophy that there is no earth without man, no reality without minds to shape it. Addled by prolonged torture, Winston cannot articulate his objection. O’Brien fills it in for him, then goes on to expound the articles of faith Winston is expected, under duress, to accept:

“I told you, Winston,” he said, “that metaphysics is not your strong point. The word you are trying to think of is solipsism. But you are mistaken. This is not solipsism. Collective solipsism, if you like. But that is a different thing; in fact, the opposite thing.” “All this is a digression,” he added in a different tone. “The real power, the power we have to fight for night and day, is not the power over things, but over men.” He paused, and for a moment assumed again his air of a schoolmaster questioning a promising pupil: “How does one man assert his power over another, Winston?”

Winston thought. “By making him suffer,” he said.

“Exactly. By making him suffer. Obedience is not enough. Unless he is suffering, how can you be sure that he is obeying your will and not his own? Power is in inflicting pain and humiliation. Power is in tearing human minds to pieces and putting them together again in new shapes of your own choosing. Do you begin to see, then, what kind of world we are creating?”

The most pitiful scene in the terrifying next-to-last chapter in 1984 is when the emaciated, pain-wracked Winston accepts the comforting embrace of O’Brien, who has turned off the pain machine. In the moment of relief, the only thing that matters to Winston is that O’Brien–his tormentor over so many days that Winston can’t count them–is the author of his release from pain. In a foretaste of 1984‘s last horrible revelation–that Winston comes to love Big Brother, not just obey him–Winston in that moment loves O’Brien.

Any creed that invites us to love the author of our misery because he ipso facto has the power to relieve our misery is totalitarian. It expresses a wish to have our emotions invigilated and commanded by someone else. It tells us we would be better off if someone more powerful than we are were to take control of the seat of our privacy. Put us together again in new shapes of their choosing. Not our will but theirs be done.

Of course I’m not saying that, just because Christianity is a gateway to the darkest of totalitarian attitudes, the faithful must follow that path. All the Christians I know are too decent and polite to believe the worst, privacy-canceling parts of their creed. But those worst parts are still there, an open invitation to power–a hideous strength, as C.S. Lewis might put it. For my part, I still believe, as Orwell did, that at the middle of one’s heart is an incorruptible self that must expose and stand up to such outrages.

“It Was Even Conceivable That They Watched Everybody All the Time”

BY MATTHEW HERBERT

In Orwell’s autobiographical essay about his boarding school days, “Such, Such Were the Joys,” he tells of a formative experience. It’s slightly embarrassing, as many of our childhood memories are.

He’d been sent on an errand in town and made an unauthorized trip to a sweet shop to buy chocolates. The detour to the shop wasn’t the only illicit part of his adventure. The few pennies in his pocket, although his, were supposed to have been “paid in” to the school’s headmaster.

This practice of “paying in” was allegedly for the safekeeping of money, but it was really to deny the poorer boys the freedom to spend as they pleased. When young Eric Blair would ask for any of his funds (once, to buy a model airplane), the headmaster would demur, telling Blair the object of his desire was not the “kind of thing” a boy of his station should be buying.

But buy the chocolates young Blair did. He was taking an enormous risk. At school he had already been beaten violently with a cane for a mere classroom blunder. Blair was now crossing the line into real crime, as he understood it. His pulse quickened. Then, Orwell relates:

As I came out of the shop I saw on the opposite pavement a small sharpfaced man who seemed to be staring very hard at my school cap. Instantly a horrible fear went through me. There could be no doubt as to who the man was. He was a spy placed there by [the headmaster]! I turned away unconcernedly, and then, as though my legs were doing it of their own accord, broke into a clumsy run. But when I got round the next corner I forced myself to walk again, for to run was a sign of guilt, and obviously there would be other spies posted here and there about the town. All that day and the next I waited for the summons to the study, and was surprised when it did not come.

A few sentences later, Orwell releases himself from the grip of this childhood vision of terror. It was a silly thing to believe. Of course, he observes, there were no networks of spies scouring every street for guilty-looking little boys, trying to catch them breaking the rules. But the headmaster had already got inside young Blair’s head and taught him to believe this paranoid fantasy, or at least to suspect it.

Did Orwell ever let go of this suspicion?

In the opening chapter of 1984, Winston Smith dutifully presents himself to the Telescreen when summoned. Later he tries to hide from it. He wants to admire a diary he had recently bought, an item he knows is contraband. He sneaks around a corner, out of view of the Telescreen, and does his best to stay utterly silent as he gazes on the diary’s beautiful, creamy-white pages. He can’t help himself: he writes out the first few words of his rebellion. Did the Telescreen hear the scratch of his pen? He doesn’t know.

In time the reader comes to understand that the agents manning the Telescreen can also monitor people’s heart rate and perspiration. Big Brother can know much more about the people of Oceania than they think they are revealing. Today we would call these kinds of tells “biometric signatures.” The oldest biometric in the book is the fingerprint; it identifies a person uniquely. New sensing and computing technologies have added other signatures to the list: gait, pupil dilation, face and voice recognition, and who knows what else. They can all tell some agent on the other side of the Telescreen who you are as you walk down the street, and probably much more.

Image: CBC News

Furthermore, biometric signatures can be cross-referenced with other data points that we constantly ooze into the information ecosystem, such as phone metadata, online purchases, social media posts, home address, tax- and police records, car registration, and–again–who knows what else. Are you starting to see a theme?

Orwell saw it. Faced with the unknowable power of Big Brother’s surveillance agents, he has Winston reflect, “It was even conceivable that they watched everybody all the time.”

Orwell is right back on that street in the England of his boyhood. His fear of the sharpfaced man was not so silly after all.

The sentence that fixes Winston in this spy-watched world is one I have vastly under-appreciated, and I have spent a decade actively appreciating the sentences in Orwell’s books. In 1984, Winston doesn’t know that Big Brother’s agents are all-seeing. But his reasonable suspicion is enough. He lives his life, Orwell writes, as if they are “watching everybody all the time,” and he bends his thoughts to accommodate this fear.

I was about to close this post by writing that it “hardly needs saying” that we live in a society shaped by the very condition young Eric Blair imagined and about which the mature Orwell wrote. The sharpfaced man really is who we thought he was. We are watched all the time by agents capable of discovering an indefinitely large body of information about us. But as Orwell also wrote, it sometimes takes effort to see what is right in front of our nose. So it needs saying: the power we have given to companies and governments to watch us and learn about us has not just eroded privacy (as we often read it is doing). It has plainly revoked a certain amount of privacy. In so doing, it has changed who we are. We are the kind of people who live with the suspicion that “they watch everybody all the time.” Maybe this is our God now, slightly less than omniscient but still doing the most invasive part of his job.

I have to a great extent fallen offline in the last few months. This was mostly because I was busy. And, no, I am not about to announce my departure from the grid. The world is a social place, and I live in it. Part of who I am is online, is out there in the information ecosystem, vulnerable to all kinds of surveillance. I don’t honestly believe I will ever walk down a street and have my gait, voice, or face linked to my online identity, but it could happen. We’ve created that condition; we’ve allowed it to happen.

If I could think my way into young Eric Blair’s thoughts and rerun that scene outside the sweetshop, this is what I would have him say to the sharpfaced man: “Fuck off, I’m just buying chocolates here.” It might have worked in 1915 England. But it wouldn’t work today. The sharpfaced man is an offscreen nobody, and he already knows how much money we have, what we spend it on, what kind of chocolates we like, and that we’re supposed to be in school, not out on a lark. We’ve already told him all that. It’s conceivable he watches us all the time.

I Do Not Want an Orwell

BY MATTHEW HERBERT

In 1947, as Orwell was writing and rewriting drafts of 1984, he took time to turn out the essay “Such, Such Were the Joys,” an autobiographical reflection on his years in English boarding school. Its descriptions of his school’s filth, snobbery, and intellectual fraudulence were so raw that Orwell held the essay back from publication. He knew it would invite scandal and, probably, recrimination.

The essay did precisely this when it came out after Orwell’s death. Several of Orwell’s peers thought he had exaggerated for literary effect and came to the old school’s defense.

Controversy aside, Orwell’s interpreters generally take “Such, Such Were the Joys” as a rehearsal of the major themes that would appear in his masterpiece, 1984.

This says something about boarding school, doesn’t it?

It’s true, the big ideas of 1984 are all there in outline in Orwell’s essay–the concept of thought crime, the use of violence to cow and control the individual, the material poverty imposed by authoritarian rule, the willful mass forgetting of history, and, overarching all of it, the despot’s blind need to abolish privacy.

I could easily write an essay about these themes, and doing so would please me.

But why? For the life of me I don’t understand how, as a nearly 55-year-old man, I still enjoy the academic exercise of matching up bits of text in one piece of literature to corresponding bits in another piece. The outcome usually manages to be both shallow and pompous. (Here we have Orwell anticipating Big Brother in the person of his former schoolmaster, Sambo.) I suppose it’s like the pleasure of fishing; simply going through the motions is always rejuvenating even if the motions are old and worn. The experience remains deep and vital. It touches something timeless. I do not care that I am such a poor literary critic. It’s the only thing I care to be in my down time.

What remains fascinating about reading “Such, Such Were the Joys” (which I have probably read a dozen times) is that we can see in high definition how Orwell became the author who was about to write 1984, and who consequently was to become one of the towering intellectual figures of the 20th century. Orwell’s school experience crystallized the two big ideas whose opposition framed 1984: on the one hand, a ruling system set up to erase individual conscience and, on the other, a thinking individual who is unable to surrender to the system’s demands.

“Such, Such Were the Joys” is full of reminders that the process of becoming Orwell was not much fun. One bald testament: “But at any rate, this was the great, abiding lesson of my boyhood: that I was in a world where it was not possible for me to be good.”

The narrative opens with a bleak reminiscence of the eight-year-old Blair’s bedwetting, which he had earlier outgrown but which recurred under the stress of school life. Eric Blair, the eight-year-old child, prayed to God for the bedwetting to end, because he found that he had no control over it despite his most desperate efforts. It mortified him, naturally. The schoolmasters told him he was to blame for it, and they beat, harangued and intimidated him. It was, as he said, impossible for him to do the right thing, but he nonetheless felt himself to be in the wrong.

Okay, that was the situation, as Vivian Gornick might put it, but what was the story? What was it like inside young Eric Blair’s head? “All through my boyhood I had the profound conviction that I was no good, that I was wasting my time, wrecking my talents, behaving with monstrous folly and wickedness and ingratitude . . . .”

If we have doubts that the child is the father of the man, Orwell reports that this attitude stayed with him past childhood. He recalls,

The conviction that it was not possible for me to be a success went deep enough to influence my actions till far into adult life. Until I was about thirty, I always planned my life on the assumption that not only was any major undertaking bound to fail, but that I could only expect to live a few years longer.

Despite the pessimism that darkened his emotional life for so long, though, Orwell–or, better to say, Blair–found the very thing in school that steeled him to struggle on toward hope and light and a decent, happy existence. A child cannot go on for years opposing his adult authorities at every turn, and young Blair made the necessary compromises to please and placate his teachers and even seek the warmth of their approval. He occasionally even fawned on them, as prisoners will fawn on their jailers. But, as he said,

all the while, at the middle of one’s heart, there seemed to stand an incorruptible inner self who knew that whatever one did–whether one laughed or snivelled or went into frenzies of gratitude for small favors–one’s only true feeling was hatred.

Why hadn’t young Blair known that there was such a thing as the “middle of one’s heart,” a place where he was his inviolable self? Because, as he recalls, beliefs in early childhood are underwritten by pure authority and not necessarily the light of reason. “A child may be a mass of egoism and rebelliousness,” he writes, “but it has no accumulated experience to give it confidence in its own judgments. On the whole it will accept what it is told, and it will believe in the most fantastic way in the knowledge and power of the adults surrounding it.”

Twice in “Such, Such Were the Joys,” Orwell characterizes the inner life of the young child in boarding school as a mass of “irrational terrors and lunatic misunderstandings.” And adults choose this life for children and administer it, make it a reality. Or at least they used to, before this kind of school experience began to be abolished by law, enlightenment, and common decency.

“Dad, would you ever send me to a boarding school?”

This was the question, posed to me a week ago, which became a topic of conversation and, eventually, the occasion for these present scribblings.

My answer came straight from my heart, and in a way, straight from Orwell. The abiding sin of sending young children off to boarding school is that it removes them–arbitrarily, as they must see it–from the sanctuary of the home, which is a better place for them than any other place on earth. The great contrast between home and early 20th century English boarding school, Orwell wrote in “Such, Such Were the Joys,” is that

Your home might be far from perfect, but at least it was a place ruled by love rather than fear, where you did not have to be perpetually on your guard against the people surrounding you. At eight years old you were suddenly taken out of this warm nest and flung into a world of force and fraud and secrecy, like a goldfish into a tankful of pike.

Orwell became the leading public intellectual of the 20th century because of discoveries forced on him at boarding school. Eventually he would conclude that (1) politics was the only thing that could improve humanity, but (2) all of politics, even the “good” side, is a realm of “force and fraud and secrecy.” It was a lesson that enlightened a half-century of thought about–and struggle for–the survival of democracy. But it cost him his childhood.

So my answer to the question, “Dad, would you send me to boarding school?” was the same answer Orwell gave, a clear and resounding no. Orwell adopted an infant orphaned during World War Two and named him Richard. He took every measure he could to ensure Richard’s childhood would happen in a place ruled by love, not fear.

Orwell with Richard in 1946 or 1947

Orwell probably never had a full grasp of his own approaching greatness; widespread appreciation of his thought was just building as he began actively to die, of tuberculosis, in 1948. But what he did know from early in his career was that he had “a power of facing unpleasant facts.” He must have known, too, that this power sprang from his childhood experience being flung, a small goldfish, into a tankful of pike.

The life Orwell gave young Richard, while he could, was the opposite of this Darwinian nightmare; he kept Richard in a place ruled by love. Orwell wanted an everyman for a son; he did not want another Orwell. And neither would I. The young will find their own trials in life, the things that steel their minds, heighten their senses, and build up their resilience. We need not force these things on them and call it an education.

Treasure That Does Not Rust

BY MATTHEW HERBERT

I admit I’m as much a slave to filthy lucre as the next guy.

In fact, I’ve had my mind on my money and my money on my mind a lot recently. It’s probably because of this that I’ve noticed a different kind of wealth accruing right beneath my nose.

I’ve been keeping track these past few months of certain monetary investments that tend to be measured in years. As in, that was a good year for the S&P 500, or that was a crazy good year for real estate.

The salience of year-long segments–the way we almost automatically value things over that time period–naturally made me think of the pandemic we are emerging from. Okay, it lasted longer than a year, but looking back, I think you could say that the core of the experience, the uncertainty, the anxiety, the not knowing which way was out, lasted a good, solid year. It did for me.

Obviously, the pandemic had unexpected consequences. Too many to count. But here’s one I was totally unprepared for: I actually acquired a kind of wealth during lockdown that I would never trade away for anything. And I didn’t even notice what it consisted of until I started going back to normal life and I had to start letting go of it.

This is what it was: Togetherness. Every member of my family was constantly gathered under one roof in one another’s company. I know what you’re going to say. Of course it sucked, in numerous ways. We had stress, we occasionally had too much of one another, and we grieved in mutually unintelligible ways the things that were missing from life. We had formless feelings of loss.

But me personally, as a mid-sized mammal responsible for the propagation of the species and the provisioning and protection of the home unit? I had a sense of control I will probably never have again, and with it came better, nobler things like intimacy, familiarity and the discharge of the most important duties. I went to “school” five days a week with my son. Who gets to do that?

Now the family is starting to go off and do their own things, as they will. It is what is supposed to happen, of course, but it’s disorienting. I got so used to the feeling that I was protecting them. It was probably mostly an illusion, but it was part of the experience.

People get used to anything, to paraphrase Camus, and I got used to lots of bad things over the last year. But I gained great stores of wealth too. Twenty twenty was a very good year. I acquired riches that won’t rust or be eaten by moths, or be taxed for that matter. I think when I look back on the pandemic I will remember first and foremost the parts I ended up cherishing in a weird way and how I didn’t want to let go of them.

Review of “Black Flags: The Rise of ISIS” by Joby Warrick

BY MATTHEW HERBERT

The most striking achievement of the terrorist organization Islamic State in Iraq and Syria (ISIS) is that, unlike other jihadist groups, ISIS actually ruled a piece of territory, a caliphate meant to unite all Muslims under Koranic law.

Its brutal, iconic founder, Abu Musab al-Zarqawi, took his name, Warrick tells us, from his boyhood home of Zarqa, a “gritty” industrial town in northern Jordan. His formative experience was in Jordan’s largest prison, al-Jafr, in the burning southern desert.

ISIS might never have come into being had Zarqawi not found refuge in 2002 in a highly unusual place, the Zagros mountains of northern Iraq, nestled between two groups of people who had little love for jihadists–the predominantly secular Kurds and the Shi’ite Iranians, just across the border to the east.

ISIS arose in the “Sunni Triangle,” a restive area west of Baghdad, home to Saddam Hussein’s familial tribe; it made its caliphate’s capital in Raqqa, Syria; and in a stunning blow, it conquered Mosul, Iraq, a city of more than a million people, in one day. At its height, ISIS’s caliphate comprised an area larger than Israel and Lebanon combined, as Warrick reports.

In 2014 and 2015, foreign ISIS volunteers streamed into northern Syria through Turkey in such numbers that Turkey’s southern Hatay Province became infamous as an ISIS way station. Camouflage-clad jihadists openly shopped for combat gear in local military surplus stores.

Given this profusion of geography, how, I wonder, did a Pulitzer Prize-winning book about ISIS come to feature only a single, highly impressionistic map? The story of ISIS is its meteoric rise from secretive terrorist group hiding in safe houses to a functioning army capable of openly controlling territory (and eventually losing it–a part of its history that postdates Warrick’s book).

In most ways Black Flags: The Rise of ISIS is highly commendable. In fact, I don’t even mind that it undeservedly won the Pulitzer, because the story Warrick tells is one that needs wider awareness. If the Pulitzer gave it a push, good. After years of thick books and studies on post-9/11 terrorism and endless analyses of our wars in Iraq and Afghanistan, a presumably weary public still needs to know that our entanglement with Sunni jihadism is not over.

Indeed this is the source of my discontent with the lack of detailed maps in Black Flags. The fact that I knew the places Warrick was describing without looking at a map was a constant reminder that his book seemed to have been written for people like me, with enough background to take in the narrative at speed. It should have been pitched to an audience that hadn’t yet come to grips with the ways 9/11, the Iraq war, the Arab Spring, and the Syrian civil war shaped ISIS in all its unique specificity.

Or maybe I’m over-emphasizing this weakness. A friend of mine who fought against ISIS’s forerunner, al-Qaeda in Iraq, read Black Flags recently as someone who knew the objective facts up close but wanted a big-picture interpretation of what it all meant. Experts, soldiers and the general public alike have been left to wonder whether the “global war on terrorism” begun 20 years ago has come to an end. Black Flags keeps that question front and center even though it does not–cannot–answer it.

Another weakness of the book, which my friend drew out, is the strong impression it gives of a journalist who rushed out a gripping story that should have been more deeply reported. Many developments that clearly called out for deep primary sources were instead given thin washes of secondary source material. Indeed many of the pivotal events described in Black Flags are simply recontextualized quotations from the memoirs of, among others, King Abdullah of Jordan, Army General Stanley McChrystal, and former Secretary of State Hillary Clinton.

And even where the experiences described are intimately reported, there is a disappointing poverty of on-the-scene sources. The descriptions of ISIS’s brutal Sharia enforcement regime, with its horrific public punishments including crucifixions, come from a single witness. Warrick also underexploits the vast trove of information that ISIS published about itself. Not that this material should be taken as reliable narration of ISIS’s history, but the fact that Warrick makes only one glancing reference to Dabiq, ostensibly the caliphate’s newspaper of record, betrays a rush to publish. Half the story of ISIS was its own edifice of grand delusion, which deserved a closer look.

All that said, Black Flags is nonetheless a very good book that deserves to be read. It moves fast and keeps the narrative crisp. It’s possible that Warrick judged wisely when he (or his editors) decided to keep it short, at 316 pages. It could have been a better book with 200 more pages, I believe. But what do I know? The more people who read Black Flags the better, and maybe they wouldn’t want to read a “better” book.

Orwell’s Ingenious Indirectness

BY MATTHEW HERBERT

Orwell is renowned for his “plain,” “direct” style of writing, a reputation he occasionally buttressed with acts of self-promotion. In his 1946 essay “Politics and the English Language,” he called for political writing to be fresh, clear and to the point. In another essay, “Why I Write,” he opined that good prose should be transparent, “like a windowpane.”

Good advice if you can take it. Sentence for sentence, Orwell took his own advice pretty well. When it came to broader matters of theme and message, though, his record is mixed.

The picture Orwell paints with the image of a windowpane is one of the writer doing her job by looking the objective truth full in the face and recording what she sees. The plain facts are out there, shining straight in. But the light that shone through Orwell’s windowpane often came through at an angle, and it often refracted into unexpected quadrants and created new hues. Orwell didn’t just write about what was right in front of his nose, despite suggesting pretty forcefully that this is what writers should do.

The fact is, many of Orwell’s best essays are masterpieces of misdirection, especially the ones from early in World War II. They are not at all about what they purport to be about.

Between April 1940 and February 1942, Orwell wrote several pieces with tepid-looking titles, which were nominally about (1) an obscure English aristocrat who mourned the fading power of poetry, (2) an erratic, little-known attack by Tolstoy on Shakespeare, and (3) an appreciation of smutty postcards. These abstract ideas were wafting from the mind of a man who was dying to be in the action–shooting Nazis, filling sandbags, or passing buckets of water hand over hand to douse the flames of the Blitz. “It makes me writhe to be writing book reviews etc. at such a time,” he wrote in his diary in June 1940, “and even angers me that such time-wasting should still be permitted.” But there he was, stuck at his desk: Orwell was in a weird place.

Orwell at work, in a weird place

As early as September 1939, Orwell wanted above all to be useful to English society. He knew that England would soon be fighting an existential war against Nazi Germany. He tried and tried to get into the Army, into the Home Guard, into a government agency of some sort. He wanted to physically resist fascism, as he had done in the Spanish Civil War in 1937. But he was too sick with tuberculosis for the Army and too politically suspect for a government agency. For a full year before he finally became a sergeant in the Home Guard and then a BBC radio commentator, all Orwell could do was sit at home and write. He hated this waste of his time: It makes me writhe. Although he still found thinking about literature intrinsically pleasant, as he told friends, bookish flights of fancy seemed meaningless with the war going on.

But literature was the thing Orwell knew, so, as he waited, for weeks and then months, he stuck to it. Little surprise that when he bent his mind to books, the words that came out of his typewriter were actually about fighting fascism. Literature was just the windowpane through which he saw that fight.

The reason I’m writing this essay is that Orwell’s choice of titles in this period can easily put off a whole class of potential readers, yet the essays behind those unhelpful titles are immensely valuable. People who have only read Animal Farm or 1984 are missing a huge part of the thought of the 20th century’s most important English-speaking polemicist if they don’t read a good-sized sample of these lesser known pieces.

So, hands up–who would like to read Orwell’s April 1940 review of Personal Record 1928-1939 by Julian Green? Julian who? My point exactly. We feel hardly a tickle of interest when Orwell indicates in the first sentence that Mr. Green’s reflections on poetry are lofty but “commonplace.” Green, we learn, has exquisite, “almost effeminate” literary sensitivity; he is old fashioned, a relic of a dying age in which “simply to preserve your aesthetic integrity seemed a sufficient return for living on inherited money.” He’s not only an aristocrat; he’s an aesthete.

But not just any aesthete. Unusually, Green knows he is obsolete as a social type. This is what catches Orwell’s eye.

The world is changing in ways that will render poets and the things they value useless, Green records in his diary. Hitler is not just killing people; he is warping the very possibility of European civilization by destroying the place that beauty and leisure occupy in a developed society. But Green stands up and mounts the only kind of protest he is capable of. Orwell writes of him that he “is far too intelligent to imagine that his way of life or his scheme of values will last forever. . . But what is attractive in this diary is its complete impenitence, its refusal to move with the times. It is the diary of a civilised man who realises that barbarism is bound to triumph, but who is unable to stop being civilised.”

The recalcitrant will to resist mind-killing brute force is the animating spirit of Winston Smith in 1984. Like Julian Green, Smith can see that a system of powers is rising that will crush him, but he cannot help living as a free man until it does so. He is unable to stop being free. That is the human spirit, distilled.

Orwell actually wrote two essays on Tolstoy’s weird attack on Shakespeare, one in May 1941, and a longer, better-known one in March 1947.

The gist of Orwell’s first essay is that Tolstoy called Shakespeare “one of the worst and most contemptible writers the world has ever seen.” His plays were unserious, incoherent plagiarisms of nonsensical tales, railed Tolstoy. The assault was personal. Tolstoy seemed angered by Shakespeare’s continuing popularity. The only way to explain Shakespeare’s appeal, Tolstoy said, was by reference to an arcane conspiracy theory positing that a cabal of 19th century German critics had somehow conned all of literary society into believing that Shakespeare was a genius rather than a fraud.

Orwell actually grants Tolstoy some of his particular criticisms, essentially agreeing that Shakespeare’s mind was “a jumble, a rag bag . . . he had no world view.” And, yes, Shakespeare’s plays were full of extraneous interjections, gaudy bangles and non-sequiturs. But Tolstoy had, according to Orwell, drawn all the wrong conclusions from this mess. Obviously, no small, secretive group of plotters could account for the fact that people still, in 1940, loved hearing Shakespeare. “One must conclude,” wrote Orwell, “that there is something good–something durable–in Shakespeare which millions of ordinary people can appreciate, though Tolstoy happened to be unable to do so.”

In 1984 Winston Smith believes, with a kind of religious faith, that the will of the ordinary people is the only force capable of defeating Big Brother’s totalitarianism. And he believes this despite being part of a very different kind of movement–what he thinks is a secretive resistance group driven by a specialized economic theory. The working class in 1984 evince almost no intellectual spark, but Smith believes in them nonetheless.

Orwell, too, ultimately trusted the masses over the experts to save civilization despite the fact that–like Smith–he advocated for a political view rooted in (leftist) expertise. The masses may be ignorant and even bigoted at their worst, Orwell thought, but their sense of common decency could not be dismissed as mere bourgeois sentimentality. If Britain wished to win the war, the masses’ sense of decency would have to be cultivated, he thought.

Where does Shakespeare fit in again? Just as it was ridiculous to think that ordinary Englishmen would troop in to watch Henry IV year after year because of some machinations of a clutch of scheming critics, it was equally witless to believe that expertise would win out over–or was in some sense better than–what people actually believed. Common decency was durable, and it had to be taken seriously as a political force.

Orwell continued to revere Tolstoy even after eviscerating his strange attack on Shakespeare. He is forever balancing two sides of an argument like that.

In fact, Orwell thought all of life was a balancing act. You would not know this is the main point of his essay “The Art of Donald McGill” until you got pretty close to its end. Indeed, unless you’re into the minutiae of British cultural history you would probably, like me, have no idea who Donald McGill even was. It turns out he drew jokey, semi-pornographic postcards, often featuring women who were plus-sized in all the right places. A short way into the essay you’re thinking it’s going to be about censorship–a wartime concern of Orwell’s–and the conceptual line between smut and erotica. But that’s not it.

As usual, Orwell goes to a much more interesting place than the well-trodden ground. Even the most idealistic moralist, Orwell says, has two sides: the official self (whom Orwell dubs Don Quixote), who always tries to do the right thing, even at risk to his own well-being, and the unofficial self, or Sancho Panza, “a little fat man who sees very clearly the advantages of staying alive with a whole skin.” He occasionally drinks too much and enjoys “soft beds, no work, . . . and women with ‘voluptuous’ figures.” We all have this side to us–the side that would keep, as Orwell did, a few of McGill’s postcards lying around and feel no need to conceal them from polite company.

Bracing stuff, but still Orwell is not quite saying anything new. What he says next, though, is revolutionary–that decent people should be able to admit the existence of their inner Sancho Panza with no loss of moral seriousness. “On the whole, human beings want to be good,” Orwell writes, “but not too good and not quite all the time.” There is a “worldwide conspiracy,” he goes on, to pretend that the unofficial self does not exist. This conspiracy, this determination to lie to oneself about oneself all the time, is what needs to go.

Had Orwell lived till 1990, to see the end of Stalinist communism, the spectacle would not have surprised him. And not because of the economics of globalism or other “inevitable” historical forces, but because Stalinism required deep, wholesale dishonesty from each citizen about what human beings are, and that dishonesty was unsustainable. Decade upon decade real Russian and Czech and Polish people had to pretend to believe in homo sovieticus, a perfectible prototype of human being that always and everywhere put the needs of the state first. But they knew this was a delusion; they were just ordinary people, with foibles and private lives of their own.

What Orwell hints at in “The Art of Donald McGill” is that no system can survive if it forbids privacy, which necessarily entails moral fallibility. In a democracy, the individual must have and safeguard a private self capable of ordering its own scheme of moral priorities. Our ability to distinguish between acceptable lapses and prohibited transgressions without having a code of law imposed on us forcibly from above is the mark of free and equal human beings.

Clearly there is a single theme running through these obscurely titled essays. (I could multiply examples beyond three, but it is better, I think, to leave them for you to discover, if you wish.) Any idealistic conception of what people ought to be–poets, play-goers, moralists and so forth–must be rooted in what people actually are. Even though Orwell was capable of holding very low opinions of human nature–the most formative book he read was Gulliver's Travels–he was at the end of the day an optimist. He believed that people were, by some natural inclination to congregate and cooperate, capable of a minimum level of decency, and that this was enough to carry off a program of liberal democracy. It was an idea that eventually got him out of a weird place.

I Cannot See What I Need to See

BY MATTHEW HERBERT

“There is such a doubt about the continuity of civilisation as can hardly have existed for hundreds of years,” George Orwell wrote in January 1941. The Blitz was raging at the time, but he didn’t mean simply that Britain was at risk of being conquered by Hitler’s bombers.

In 1940 and 1941 Orwell was gripped by the idea that civilization was going through an epochal change that was, if you can imagine it, bigger than the eventual military outcome of the war. He felt that human life was being transformed right under everyone’s noses, and the emerging form of society would be unrecognizable to anyone stuck in the present. “Everything is cracking and collapsing,” he wrote in a book review.

German bomber over London in September 1940

Not everything Orwell predicted about the coming changes came true, at least not precisely, but the things Orwell got right were truly astounding.

He saw that the nature of work was changing fundamentally; managers and technicians would rise to dominate the new economy. Class differences would be eroded by something that didn’t have a name yet but which Orwell noticed and would come to be called mass culture–the fact that everyone, rich, poor and middle class alike, dressed more or less the same, saw the same movies, heard the same radio programs, and increasingly went to the same kinds of schools. Orwell also saw that all big governments, even democratic ones, would have the necessary communications technology to constantly manipulate their citizens’ perceptions, attitudes and even behavior. For centuries, citizens had to be cowed into conformity; now they would be led amicably by the nose.

What I want to do today, though, is not to discuss how right Orwell was about the epochal changes of his time. Rather, I want to channel his mood as he braced for the coming upheavals. Because, increasingly, I sense what Orwell sensed in 1940 and 1941: everything is cracking and collapsing. Things are happening that are transforming society right under our noses, and some of them spring from the same seismic trends Orwell saw coming.

First, the usefulness of billions of human beings is being obliterated before our eyes, as the nature of work evolves under the influence of science and technology even faster than in Orwell's day. As recently as the year of my birth, any human in a developed society could aspire to the kind of working life that would earn him his keep, help society, and possibly even engage his mind. Increasingly, though, the labor outputs of smart machines, foreign wage slaves and a handful of managers are displacing almost every kind of decent, constructive work we might once have expected to do. If you're brainy, algorithms will do your job; if you're brawny, a machine probably already does it. The cascading effects of automation, specialization and offshoring are creating what the historian Yuval Noah Harari calls a rising "useless class." The masses will soon be left without a meaningful contribution to make to society. What will we do?

I am greatly distressed by this trend. As a general rule, I get about as distressed by things as a sack of potatoes sitting slightly upright, sipping a glass of whisky. In a moment I’ll come to the reasons why I believe any thinking person should be terrified by the advent of large-scale human uselessness. But first let me explain its existential horror for me in particular.

The only saving grace we humans have is love. It makes the price of mortality payable. As Orwell put it, “one is prepared in the end to be defeated and broken up by life, which is the inevitable price of fastening one’s love upon other human individuals.” Love might not be all you need, but it is the one thing you need to make life worth living. I don’t mean to get all Aristotelian about this, but if we analyze the concept of love down to its elements, one of these is the capacity to wish another person well. This sounds trivial, but it is not. Just before my father died, several years ago, he said he took comfort in the fact that he could see the fruition of his work as a parent in his children’s ability to go on without him. He thought we would be all right, and he was able to wish us well. He could not have formed these thoughts, I believe, without a reasonably clear idea of what the world would look like in the near future.

To wish one's remaining children well requires an implicit act of imagination. The departing parent must not just have confidence in his children's internal abilities, but must also be able to conjure up a vision of a world in which they have a reasonable chance of using those abilities to flourish. This used to be easy; children did much the same things their parents did, in a world that remained the same. There was never a reason to question what kind of "ecosystem" the next generation of children would continue their life struggles in.

But in 1941, Orwell noticed that children would presently grow up having to develop skills for a fundamentally different world–skills that were nothing like the ones that all Englishmen up to that point had acquired. Of this new competitive ecosystem, Orwell wrote:

It is a civilisation in which children grow up with an intimate knowledge of magnetoes and in complete ignorance of the Bible. To that civilisation belong the people who are most at home in and most definitely of the modern world, the technicians and the higher-paid skilled workers, the airmen and their mechanics, the radio experts, film producers, popular journalists and industrial chemists. They are the indeterminate stratum at which the older class distinctions are beginning to break down.

The children of 1941 would be "technological natives," to adapt a phrase from today. They would be significantly different from their parents, as Orwell foresaw, but not radically so. An England of technocrats and fuzzier class differences could still be recognizably English, if just barely.

The children of today we call digital natives. I won’t attempt to characterize them; I’d just end up sounding uncool. But it is precisely the inadequacy of my descriptive powers that points to the source of my anxiety. Civilization is poised for such a disruption that I cannot even imagine the coming forms of life in which I will eventually try to wish my children well. And this failure of imagination is more or less where my own private anxieties overlap with the larger structural breakdowns that Harari and other thinkers like him foresee.

Much of this has to do with work. The world of work is changing drastically, and I feel about these changes the way Orwell did in 1940. It's not that I'm predicting a dramatic break; it's more that I'm realizing the very things we are doing today constitute that break. The leading waves of the revolution of uselessness (and of a related phenomenon–pointless struggle) are already upon us.

I am no specialist in the economics of labor, so I will limit myself to a few blindingly obvious ways in which work life has changed during living memory and continues to change today. One indisputable fact is that the "managerial revolution" proclaimed by James Burnham in his 1941 book of that name has intensified into a form of technical specialization that now defines the domain of meaningful work. The most remunerative and meaningful jobs today all involve using office technology to solve intellectual or organizational problems. If that kind of work does not appeal to you, you are relegated to a second-class job or worse.

There are increasingly many of these second-class jobs to go around–jobs that are pointless, unpleasant, or poorly paid, and sometimes all three. They used to make up a small penumbra of the labor market; today they make up the bulk of what we politely call the service industry. A handful of highly readable books can give you a useful view of this blighted landscape. Start with David Graeber's 2018 book Bullshit Jobs: A Theory, in which he lays bare the decrepit futility of "lifestyle coach," wedding planner, and all manner of consultantships. These jobs and their ilk involve work that is utterly unneeded by humanity but in whose utility the employee pretends to believe. They are often well paid. This fundamental, widespread dishonesty corrupts the human soul. Bullshit jobs also waste time and resources on a massive scale.

Bullshit jobs are not to be confused with shit jobs, such as dishwasher or highway worker. Shit jobs are highly necessary but poorly paid. Many American workers who used to make their living doing one moderately shitty job must now string together several shit jobs. It hasn't always been this way. If you graduated with a high school diploma, or acquired its rough equivalent in knowledge, between 1776 and 1960, our country held out the prospect of dignified, meaningful work. It might be hard, and occasionally shitty around the edges, but it would enable you to make do on your own terms. Today, if "all" the education you want is a high school diploma, the odds are you will end up with one or more shit jobs for a long period of time, and possibly for all of your working life.

Whoever maintains that any hardworking American can earn a secure living through grit and determination simply does not grasp elementary statistics. There are thousands upon thousands of shit jobs out there. Society does not function unless they get done, and people will be driven by necessity to do them. Increasingly, workers who are not phenomenally lucky (as I have been–more on that in a moment) will be sorted into this sector with the force of gravity.

But you can still do your own thing by taking on a gig job, right? In the age of Uber, Fiverr, and DoorDash, why would any freedom-loving American ever again punch a timecard or go to a meeting? For a withering exposé of how this kind of work deprives ordinary people of rights, power and security, read Sarah Kessler's 2018 book (whose title says it all) Gigged: The End of the Job and the Future of Work.

The prelude to the gig trend was the discovery by employers, in the decades leading up to the 1980s, that they could parse whole jobs into part-time ones and leave out the benefits and much of what used to be called workers' rights. Louis Hyman's 2018 book Temp: How American Work, American Business, and the American Dream Became Temporary describes this descent into planned impoverishment and job insecurity. It is now okay in America for a working mom to have to string together enough part-time work at Target, Taco Bell and Grubhub to feed her kids. Since the job market, like all free markets, is supposed to project a perfect reflection of reality, we are meant to shrug our shoulders and accept that the vandalized, fractured nature of work is "just the way things are." It is not. We created it.

The allure (to the employer) of downgrading all jobs to informal, poorly paid piece work has also created a new opportunity to wring the last ounce of labor value out of the old, even as they are dying. Jessica Bruder's 2017 book Nomadland: Surviving America in the Twenty-First Century describes how senior citizens today must migrate around America's hinterland chasing shit jobs in factory farms and Amazon warehouses because they cannot afford to retire, or even to buy or rent homes in which to retire. Add to the Taco Bell moms of the paragraph above a new sub-class of immiserated wage slaves: older people who are openly being worked to death by the rich in the clear light of day.

This sketch of work life in the 21st century has been a bit longer than I intended, but then again, work is the focal point of our lives. For good or ill, it makes up a huge part of the life-long struggle that gives us meaning and identity. Orwell was obsessed with it because he saw that changing work conditions would transform English society, making it into something wholly new. The puffed-up, useless aristocrats who were regarded as permanent features of society were actually part of a dying breed.

Let me pause here to state the obvious. As my children survey the field of careers they might take up, they will have to contend with the antagonistic structural trends I've been describing: a nightmare job market whose distortions are authored by a tiny elite who present them (disingenuously) as objective reality. If my kids use their brains to get on the upside of these trends, their horizons could open onto relatively comfortable but meaningless work lives, in which most cognitive labor is done by machines and a tiny technological elite. More and more shit jobs will be performed by poor people far away. The American middle class in 2020 could be as done for as Orwell's aristocrats in 1940.

I would know. I’m in the middle of the middle class, and–believe me–I keep my ear to the tracks.

As a humanities major with no saleable work skills, I am also part of a dying breed–among the last of the bureaucrats able to parlay soft credentials into hard, meaningful work and a handsome income. On current trends, though, millions of Americans whose qualifications outmatch mine will soon be relegated to the desolation of gig work, shit jobs, and other, as-yet unimagined forms of wage slavery. A lucky few will find well paid but meaningless bullshit jobs created in the entrepreneurial wake of the audacious scientific-technical elite. You know–project managers.

In his vexation of 1940 and 1941, Orwell mentioned a strange thing several times–his fear that novels would cease to be written, at least until the threat of fascism had been defeated. Indeed he feared the whole human endeavor of creative art was at risk. The Nazis, he believed, sought to kill the very idea of human freedom and equality, which is essential for art, especially literature. But Orwell had confidence that Nazism, as strong as it was, would ultimately be defeated. Art–and humanity–would survive.

It is not so clear that the forces that are draining the possibility of meaning from human life in the 21st century can be defeated by the things that gave Orwell hope–human bravery, selflessness, common sense and solidarity. There are two main reasons to be pessimistic.

One is that the scientific advances enabling our proliferation of technology are increasingly recondite. Common sense cannot come to grips with them, and bravery may be too crude a virtue to stand up to them. The mechanisms that promise to carry technology over the horizon of human intelligibility are genetics, nanotechnology and robotics (GNR). Artificial intelligence weaves through all three disciplines and forms the basis of the last. Machines that learn and act will shape our future. The .01 percent of the world's smartest people will wind them up and see where they go.

For the purposes of this discussion, I will have to be exceedingly brief: GNR technologies will enable machines, some smaller than molecules, to re-assemble the building blocks of physical reality. In “Why the Future Doesn’t Need Us,” a 2000 essay widely read by geeks but, tragically, ignored by everyone else, computer scientist Bill Joy explained why GNR constituted such an unprecedented threat–because it is a suite of technologies designed to act autonomously. It self-replicates. Yes, it will in the strictest sense do what “we” (that .01 percent) tell it to, but given GNR’s complexity, our instructions to it could trigger sequences of events we can neither stop nor control. Let one genetically engineered plant with enhanced photosynthesis slip into the biosphere and it could out-compete (kill) every other plant species on earth. Create the “friendliest” of AIs and it could still be driven by unanticipatable logic chains to exterminate all of humanity or expropriate all our resources, which would amount to the same thing. And so on.

Summarizing this class of threats, Joy sounds more like a tent-revival preacher than a computer scientist:

I think it is no exaggeration to say we are on the cusp of the further perfection of extreme evil, an evil whose possibility spreads well beyond that which weapons of mass destruction bequeathed to the nation-states, on to a surprising and terrible empowerment of extreme individuals.

If Joy's invocation of evil and "extreme individuals" sounds overblown, try this on for size: what if the overlords of the technologies poised to bring GNR threats online are not cold, titanic geniuses but just nihilistic tech bros honing their "hacker ethic" and chasing fat stacks of venture capital? Do you feel better? The clique of technology gurus who are presently assembling all the world's useful artifacts–from emergency rooms to power plants to the locks on our front doors–into an "internet of things" literally do not care whether the future they are inventing connects coherently to a past. This is the second reason I feel pessimistic about the near future. Nazism had a purpose, and Orwell was confident that decent, liberal people would sense its evil and resist it. Other than moving fast and breaking things, today's technologists have no purpose. They just want to invent cool stuff, stuff we want. There may be an enemy lurking in their formless digital fantasyland, but we cannot recognize it.

At the very least, we should face up to how fragile our world has been made by uploading its critical parts online, even as we race ahead toward the datafication of everything. See Nicole Perlroth's brand-new This Is How They Tell Me the World Ends: The Cyberweapons Arms Race for a terrifying primer on this subject.

In If Then: How the Simulmatics Corporation Invented the Future, historian Jill Lepore details the underappreciated rise of data science and the pioneering company that first tried using it to shape human behavior and attitudes. This grand idea was dreamed up by a group of flawed but idealistic East Coast white liberals who wanted to improve American society in the 1960s. But the “best minds” of today who are carrying their project on “are [instead] thinking about how to make people click ads,” as one Facebook employee related in 2011. Celebrating “anarchy as a measure of . . . creativity,” the leading designers of the technosphere, like Mark Zuckerberg, seem to take pride in blanking out the past. But is this the best path to the future?

As incongruent as it sounds, the ingenious elite who are shaping our information ecosystem today are self-consciously anti-intellectual. To a great extent, they scorn the study of anything other than computing and communications technologies. In If Then, Lepore writes,

In twenty-first-century Silicon Valley, the meaninglessness of the past and the uselessness of history became articles of faith, gleefully performed arrogance. "The only thing that matters is the future," said the Google and Uber self-driving car designer Anthony Levandowski in 2018. "I don't even know why we study history. It's entertaining, I guess–the dinosaurs and the Neanderthals and the Industrial Revolution and stuff like that. But what already happened doesn't really matter. You don't need to know history to build on what they made. In technology, all that matters is tomorrow."

Without putting too subtle a Heideggerian point on this statement of anti-human principle, let me sketch the main difference between having a “tomorrow” and having a “future.” Tomorrow is a blank slate, no more than the sun’s rising once more; it can herald anything, including the howling desolation of a civilization that has nullified itself, like the one Levandowski seems to invite. Having a future, though, is inextricably linked to having a past. One of the hopes that kept Orwell going in 1941 as the bombs fell on London was that plain, ordinary patriotism would be put to good use, and the English would summon enough unity to beat Hitler. They would look to their past, flawed and exploitative as it was, and imagine a future worth fighting for.

At the moment, I am too flummoxed to envision a coherent future. Twenty years after his essay, Bill Joy is still right that there are too few people (“extreme individuals”) in charge of it. He may also be right that its ghastly shape might have already been determined in the hidden codes of GNR technologies as they were emerging in the 1990s and 2000s without any regulatory supervision.

In Homo Deus: A Brief History of Tomorrow, Yuval Noah Harari describes the wide-open possibilities of human life in the late 21st century, and he comes back again and again to a central theme–the acceleration of change. Humans of the future will have to be flexible and resilient on an unprecedented scale. They will constantly have to educate and re-educate themselves and train for new jobs. Even the smart, talented and driven may have to plan on two, three or more careers. It sounds exhausting. It sounds disorienting.

Or, if most of their work is done for them, they may have to learn to create meaning from permanent leisure. This sounds demoralizing.

I do not know if evolution has fitted us with the stores of resilience and creativity we will need to cope with such unrelenting change. But try we will. Humans are programmed to struggle. And as Orwell saw, the instinct that drives us on no matter what is the struggle to connect the future to the past. That’s why he believed England would win the war. But I cannot see my version of what Orwell saw. I cannot see the future in which humanity wins the war against inhumanity. It doesn’t mean that future isn’t out there: it just means that I have no way of picturing it. I need to be capable of what my father was at the last, of wishing my children well. But I cannot see their future world, the world in which they will try to thrive. I can only hope they will be able to.