Ten Ways to Lose Your Literature


“I have built a monument more durable than bronze.” — Horace

1. Inconceivable that a man with a disposition like Ben Jonson’s wouldn’t take an offered drink, and so it can be expected that the poet enjoyed a few when he completed his four-hundred-mile, several-months-long walk from London to Edinburgh in the summer of 1618. Mounted as a proto-publicity stunt, Jonson’s walk was a puckish journey between the island’s two kingdoms; once the Poet Laureate was in view of the stolid, black-stoned Edinburgh Castle upon its craggy green hill, he’d be hosted by his Scottish equivalent, William Drummond of Hawthornden.

At a dinner on September 26th, the bill for victuals was an astonishing (for the time) £221 6s 4d. Imagine the verse composed at that dinner, the lines jotted onto scraps or belted out in revelry, the lost writing which constitutes the far larger portion of any author’s engagement with words. Also his engagement with ale and sack, for it was Jonson’s loose tongue that led to a discussion about a different type of literature lost to posterity, when he confessed that one of their poetic colleagues, a Reader in Divinity at Lincoln’s Inn named Dr. John Donne, had decades before attempted a scurrilous mock-epic entitled Metempsychosis; or, The Progress of the Soule.

Drawing its title from a Pythagorean doctrine that’s roughly similar to reincarnation, Donne’s Metempsychosis is a poem wherein he sings the “progress of a deathless soul, / Whom Fate, which God made, but doth not control, / Placed in most shapes.” He abandoned the project some 520 lines later. Drummond recorded that Jonson had said that the “conceit of Donne’s transformation… was that he sought the soul of the Apple which Eve pulled, and thereafter made it the soul of a Bitch, then of a she-wolf, and so of a woman.” Written before Donne converted to Protestantism, Metempsychosis was a parody of the Reformation, whereby the “soul” of the forbidden fruit would migrate through various personages in history, contaminating them with its malevolence. Jonson told Drummond that “His general purpose was to have brought in all the bodies of the Heretics from the soul of Cain,” perhaps moving through other villains from Haman to Judas.

As it was, Donne mostly charted the apple’s soul through various animal permutations, ending with Cain rather than starting with him. It seems that Donne feared his own mind, and the inexorable logic that drove his poem to an unacceptable conclusion. Donne “wrote but one sheet, and now since he was made a Doctor he repented highly and seeketh to destroy all his poems.” Jonson thought that Donne’s intent was to have the final manifestation of that evil spirit appear in the guise of John Calvin. Others have identified an even more politically perilous coda to Metempsychosis, in which the at-the-time staunchly Catholic Donne imagined the “great soul which here amongst us now/Doth dwell, and moves that hand, and tongue, and brow… (For ’tis the crown, and last strain of my song),” and have assumed that the poet was speaking of Elizabeth I.

Biographer John Stubbs notes in John Donne: The Reformed Soul that Metempsychosis was “a politically directed piece of writing,” which is the biggest reason why none of you will ever read the entire poem. Donne himself wrote to a friend that there must be “an assurance upon the religion of your friendship that no copy shall be taken for any respect.” Metempsychosis is one way of losing your words. Literature as fragment, literature as rough draft, literature as the discarded. The history of writing is also the shadow history of the abandoned, a timeline of false starts and aborted attempts. What Donne wrote of Metempsychosis is, even in its stunted form, the longest poem which the lyricist ever penned, and yet it’s a literary homunculus, never brought to fruition. Nor was it ever burnt upon the pyres of his critics, because it was never completed.

2. This is a syllabus of all which you shall never read: Jane Austen’s Sanditon, which exists only as an unfinished fragment written in 1817, the year its author died of Addison’s disease, and which promised to tell the narrative of Sir Edward Denham, whose “great object in life was to be seductive.” John Milton’s planned epic about King Arthur, the Arthuriad. Over two-thirds of the work of Aristotle, with all that survives composed of lecture notes. A thundering abolitionist speech delivered by the former congressman Abraham Lincoln on May 29, 1856, where one observer said that he spoke “like a giant inspired.” Isle of the Cross by Herman Melville, which told the tale of romance between the Nantucket maiden Agatha Hatch Robertson and a shipwrecked sailor and crypto-bigamist. Melville explained in a letter to Nathaniel Hawthorne that Isle of the Cross concerned “the great patience & endurance, & resignedness of the women of the island in submitting so uncomplainingly to the long, long absences of their sailor husbands” (the book was rejected by publishers, and no record of it survives).

A series of versified narratives of Aesop penned by Socrates, the philosopher famed for despising writing. Hypocritical though Socrates may have been, he inspired Plato to burn all of his own poems, and to argue for the banning of such trifles in The Republic. Sacred scripture has its absences as well, for the Bible references any number of ostensibly divine books which are to be found nowhere today: The Book of the Covenant mentioned in Exodus 24:7, The Book of the Wars of the Lord cited in Numbers 21:14, the Acts of Uzziah invoked as an authority at 2 Chronicles 26:22, and the evocatively titled Sayings of the Seers, which is called upon at 2 Chronicles 33:19. Stuart Kelly, author of The Book of Lost Books: An Incomplete History of All the Great Books You’ll Never Read, notes of the Bible that it is “A sepulcher of possible authors, a catafalque of contradictory texts.” Scripture is a “library, but one in ruins.” Scholars know that there were, for example, a Gospel of Perfection and, even more arrestingly, a Gospel of Eve, neither of which made the final cut, and neither of whose whereabouts are known today.

Of the sacred books of the oracular Roman Sibyls, not a single hexameter survives. Concerning sublime Sappho, she who was celebrated as the tenth muse, only one complete lyric endures. Nor are Socrates and Aristotle the only Greek philosophers for whom posterity records virtually none of their original writings; the great Cynic Diogenes of Sinope, who lived in a jar, masturbated in the Agora, and told Alexander the Great that the only thing the ruler of the world could do for him was get out of his light, supposedly wrote several manifestos, none of which still exists.

In 1922 a suitcase filled with Ernest Hemingway’s papers, including the draft of a World War I novel, was stolen from Paris’s Gare de Lyon railway station. A couple of decades later, the Marxist theorist Walter Benjamin fled towards the Spanish border after the Nazis invaded France, with a briefcase containing a manuscript. He’d commit suicide in Portbou, Spain, in fear that the SS was following; when his confidant Hannah Arendt arrived in America, she tried to find Benjamin’s book, but the draft is seemingly lost forever. Lord Byron’s manuscripts weren’t misplaced, but were burned at the urging of his Scottish publisher John Murray, who was horrified by the wanton sexuality in the famed rake’s autobiography, not least of which may have been a confession of an incestuous relationship with his half-sister. The nineteenth-century critic William Gifford said that the poet’s recollections were “fit only for a brothel and would have damned Lord Byron to everlasting infamy.”

Franz Kafka desired that the entirety of his unpublished corpus be destroyed by his friend Max Brod, writing that “whatever you happen to find, in the way of notebooks, manuscripts, letters, my own and other people’s, sketches and so on, is to be burned unread to the last page.” Unfortunately for Kafka, but to literature’s benefit, Brod turned out to be a terrible friend. Of that which survives, however, much was incomplete, including Kafka’s novel Amerika, with its infamous description of the Statue of Liberty holding a sword aloft over New York Harbor. A very different type of burning was that of the literary theorist Mikhail Bakhtin, who was holed up in a Moscow apartment when the Soviet Union was invaded by the Nazis. Short of paper, Bakhtin was forced to use his manuscript as cigarette paper. The book he was working on, most of which went up in tobacco smoke, was a study of the German novel.

Of the ninety plays which Euripides wrote, only eighteen survive. Aeschylus also wrote some ninety tragedies, but only six of his can be performed today. Sophocles has seven plays which are extant, though we know that he penned an astounding 123 dramas; from his play The Loves of Achilles only one haunting line survives — “Love feels like the ice held in the hand by children.” William Shakespeare was the author of a Love’s Labour’s Won and, with his later collaborator John Fletcher, a lost play based on Miguel de Cervantes’s novel Don Quixote entitled The History of Cardenio, an irresistible phantom text. Hamlet, it has been convincingly argued, was based on an earlier play most likely written by Thomas Kyd, which scholars have given the clinical name of the Ur-Hamlet, though no traces of that original are available. Our friend Jonson also has a significant number of lost works, not least of which was 1597’s The Isle of Dogs, which he co-wrote with Thomas Nashe and which was briefly performed at the Swan Theater on Bankside. Like Donne’s poem, the subject matter of The Isle of Dogs was potentially treasonous, with the Privy Council ruling that it contained “very seditious and slanderous matter,” banning the play, and briefly jailing Jonson (Nashe fled the city).

When it comes to such forgotten, hidden, and destroyed texts, Kelly argues that a “lost book is susceptible to a degree of wish fulfillment. The lost book… becomes infinitely more alluring simply because it can be perfect only in the imagination.” Hidden words have a literary sublimity because they are hidden; their lacunae function as theme. Mysteriousness is the operative mood of lost literature; whether these works have been victims of water or fire, negligence or malfeasance, history or entropy, what unites them is their unknowability. They are collectively the great unsolved of literature. There’s a bit of Metempsychosis about it, with a more benign lost soul connecting a varied counter-canon from Aristotle to Byron to Austen to Hemingway. Pythagoras, who believed that all souls and ideas were united by an unseen divine filament which replicated throughout eternity and infinity, would have some insight into the matter. Sadly, none of Pythagoras’s writings happen to survive.

3. The claim that Hernán Cortés was welcomed by Montezuma into Tenochtitlan—that city of verandas, squares, canals, and temples—as if he were the feather-plumed Quetzalcoatl owes much to the accounts gathered by Bernardino de Sahagún. The Franciscan friar assembled a group of Nahuatl speakers to preserve what remained of Aztec culture, including folklore, philosophy, religion, history, and linguistics. This sliver would be preserved in the Florentine Codex, named after the Italian city where it would one day be housed. The Nahuatl authors attempted to resurrect the world of their parents, when Tenochtitlan was larger and more resplendent than any European city, a collection of cobalt-colored palaces, observatories, and libraries such that the conquistador Bernal Díaz del Castillo recalled “seeing things never before heard of, never before seen.” Miguel León-Portilla translates much of the Florentine Codex in The Broken Spears: The Aztec Account of the Conquest of Mexico, an evocation of that which “had been stormed and destroyed, and a great host of people killed and plundered… causing terror wherever they went, until the news of the destruction spread through the whole land.”

Cortés’s conquest took two years and was completed by 1521. Eight years later, the Spanish inquisitor Juan de Zumárraga, fabled in his native land as a witch-hunter, arrived and assembled a massive auto-da-fé of Aztec and Mayan books—with the Mestizo historian Juan Bautista Pomar noting that such treasures “were burned in the royal houses of Nezahualpiltzintli, in a large depository which was the general archive.” If Cortés was guilty of killing thousands of Aztecs, ultimately millions in the pandemics he spread, then Zumárraga was a murderer of memory. One assaulted the body and the other the mind, but the intent was the same — the extinction of a people. Lucien X. Polastron writes in Books on Fire: The Tumultuous Story of the World’s Great Libraries that the “conquistador was there to kill and capture, the cleric to erase; the bishop fulfilled his mission while satisfying his conscious desire to destroy the pride and memory of the native people.” The Jesuit José de Acosta mourned that “We’ve lost many memories of ancient and secret things, that could have been of great utility. This derives from a foolish zeal,” it being left to those like Sahagún to try to redeem the Spanish from this holocaust they’d unleashed.

In A Universal History of the Destruction of Books: From Ancient Sumer to Modern-day Iraq, Fernando Báez argues that “books are not destroyed as physical objects but as links to memory… There is no identity without memory. If we do not remember what we are, we don’t know what we are.” Zumárraga’s atrocity is only one of many examples, alongside the destruction of the famed library at Alexandria and the Dissolution of the Monasteries begun in 1536, when the English King Henry VIII immolated Roman Catholic books in a campaign of terror which destroyed almost the entirety of the early medieval English literary heritage, save for a few token works like Beowulf which would later be rediscovered. Paradoxically, burning books is an acknowledgement of the charged power contained therein.

4. Sometime in 1857, an enslaved woman named Hannah Bond escaped from the North Carolina plantation of John Hill Wheeler. Light-skinned enough to pass as white, Bond dressed in drag and boarded a train due north, eventually arriving in upstate New York. There Bond would board with a Craft family, from whom she would take her new surname. Eventually the newly christened Hannah Crafts would pen in careful handwriting a novel entitled The Bondwoman’s Narrative, the title perhaps a pun on its author’s previous slave name. Displaying a prodigious knowledge of the books in Wheeler’s library, which Crafts had been able to read in stolen moments, The Bondwoman’s Narrative is a woven quilt of influences (as all novels are); a palimpsest of things read but not forgotten.

There are scraps of Horace Walpole’s gothic pot-boiler The Castle of Otranto and Charlotte Brontë’s tale of a woman constrained in Jane Eyre; Harriet Beecher Stowe’s sentimentalized slavery in Uncle Tom’s Cabin and the poet Phillis Wheatley’s evocation of bondage; and, more than any other literary influence, that of Charles Dickens’s Bleak House. Drawing from the brutal facts of her own life, The Bondwoman’s Narrative concerns another Hannah’s escape from a plantation. “In presenting this… to a generous public I feel a certain degree of diffidence and self-distrust,” wrote Crafts, “I ask myself for the hundredth time, How will such a literary venture, coming from a sphere so humble be received?” Written sometime between 1855 (possibly while still in North Carolina) and 1861 (during the earliest days of the Civil War), Crafts’s question wouldn’t be answered until 2002, after the manuscript was found squirreled away in a New Jersey attic.

As far as we know, The Bondwoman’s Narrative is the only surviving novel by an enslaved black woman. There was an assemblage of slave narratives, ranging from Olaudah Equiano’s The Interesting Narrative of 1789 to Solomon Northup’s terrifying 1853 captivity narrative Twelve Years a Slave and Frederick Douglass’s autobiographical trilogy. Purchased at auction by the Harvard scholar Henry Louis Gates Jr., who would edit, annotate, and publish the book, Crafts’s rediscovered novel evokes the biblical story of King Josiah restoring Jerusalem’s Temple after finding the Book of Deuteronomy, its purpose to restore a sense of justice. Crafts’s intent was that Americans must “recognize the hand of Providence in giving to the righteous the reward of their works, and to the wicked the fruit of their doings.”

There is no discussing lost literature without consideration of that which is found. Just as all literature is haunted by the potential of oblivion, so all lost books are animated by the redemptive hope of their rediscovery. Crafts’s book is the mark of a soul; evidence of that which is left over after the spirit has told us what it needs to tell us, even if it takes centuries to hear. A miracle in its rediscovery, Crafts’s book is the rare survivor from hell that teaches us how much is lost as humans are lost. What lyrics were written in the minds of those working plantations which we shall never read; what verse was revised in the thoughts of those being marched into the gas chambers? This is among the saddest of all lost literature. Crafts’s rediscovery provides the divine promise of that canon of lost books—that literature may be lost, but maybe only for a time.

5. During the spring of 2007, I read the entirety of the pre-Socratic metaphysician Democritus. The assignment took me forty-five minutes. I did it in a Starbucks on Forbes Avenue in Pittsburgh and my Venti wasn’t even cold by the time I finished. When we consider literature that has been lost, literature which has survived, and literature that has been rediscovered, it must be understood that much is fragmentary — sometimes in the extreme. Democritus, the “laughing philosopher,” the father of science, who first conceived of atoms, endures in about six photocopied pages. “And yet it will be obvious that it is difficult to really know of what sort each thing is,” reads one of Democritus’ surviving fragments, and how correct he was.

Democritus is not unique; most ancient philosophers exist only in quotation, paraphrase, or reputation. No tomes survive of Thales, Heraclitus, Parmenides, Protagoras, or Zeno, only some flotsam and jetsam here and there. As I’ve already mentioned, no words of Pythagoras’s survive, and all of Aristotle is turgid second-hand lecture notes. The classical inheritance of Greece and Rome exists, where it does, in shards. Harvard University Press’s celebrated Loeb Classical Library, which prints translations of Greek and Latin literature, has 542 books in the entire sequence, from Apollonius to Xenophon. More than can be read in a long weekend, no doubt, but easily accessible over the course of a lifetime (or a decade). It’s easy to assume that the papyri of Athens and Rome were kindling for Christians who condemned such pagan heresy, though that’s largely a slander. The reality is more prosaic, albeit perhaps more disturbing in a different way. Moisture did more to betray the classical past than Christianity did, for decay is a patient monarch willing to wilt Plato as much as a grocery list. Something to remember as we have our endless culture wars about what should or shouldn’t be in the canon. What’s remembered simply happens to be what we have.

Fragmentation defines literature — there is a haunting of all which can’t be. Fragments are faint whispers of canons inaccessible. Lacunae are sometimes structured into writing itself, for literature is a graveyard filled with corpses. Sometimes a body is hidden in plain sight – consider Shakespeare’s Sonnet 146. That poem begins “Poor soul, the centre of my sinful earth, / […] these rebel powers that thee array.” Because the meter is so aggressively broken, it’s understood that a compositor’s mistake was responsible for the loss of whatever the poet intended. Jarring to realize that Shakespeare is forever marred with an ellipsis. Don Paterson writes in Reading Shakespeare’s Sonnets: A New Commentary that the aporia in the poem “tends to obsess most commentators,” but that the “poem deserves it; we shouldn’t allow it to be completely ruined by a compositor thinking about his dinner.” Several pages are spent by Paterson trying to use prosody in the service of forensics, with various degrees of plausibility entertained, including Shakespeare having possibly meant to write “fenced by,” “starv’d by,” or “fooled by,” all of which, any good New Critic will tell you, would imply wildly different interpretations.

I’d like to offer an alternative possibility, based not on sober scansion but irresponsible conjecture. Paterson notes that the sonnet is one which “says that the body is a lousy home for the soul, which ends enslaved to its gaudy, pointless, sensual, self-consuming worldliness… it proposes nothing short of renunciation of worldly things, a mortification of the flesh in exchange for the revival and revivification of the spirit.” Maybe, then, the gap is the point, an indication that the matter of the poem can never really intimate the soul of meaning, where the black hole of the typographical mistake is actually a kind of open grave, an absolute zero of meaning that sublimely demonstrates the theme of the sonnet itself. The gulf between the printed word and the meanings which animate it is a medium for sublimity, the entirety of all that we don’t know and can never read as infinite as the universe itself.

In the first century the Roman critic Longinus, whose identity is unknown beyond his name (another way to lose your literature), argued that the “Sublime leads the listeners not to persuasion, but to ecstasy… the Sublime, giving to speech an invincible power and strength, rises above every listener.” Romantic-era critics saw the sublime in the Alps and the Lake District; American transcendentalists saw it in the Berkshires and Adirondacks. For myself, I gather the trembling fear of the sublime when I step into the Boston Public Library at Copley Square, when I pass the fierce Fifth Avenue lions of the New York Public Library, and when I stand underneath the green-patina roof of the Library of Congress. To be surrounded by the enormity of all that has been written which you shall never read both excites and horrifies me – all the more so when you consider all that is lost to us, whether from misplacement, destruction, or having never been written in the first place (the last category the most sublime of all).

Longinus’s “On the Sublime” is also fragmented, struck through with gaps and errors. He tantalizes us at one point with “There is one passage in Herodotus which is generally credited with extraordinary sublimity,” but there is nothing more sublime than a vacuum, for what follows is nothing. Later he promises that “loosely allied to metaphors are comparisons and similes, differing only in this,” but the page is missing. And at one point he claims that Genesis reads “Let there be light, and there was. Let there be earth, and there was,” though it could be entertained that Longinus is simply quoting an alternative version of the Bible which is lost. His essay was built with words from hidden collections, a gesture towards Alberto Manguel’s observation in The Library at Night that the “weight of absence is as much a feature of any library as the constriction of order or space… by its very existence, [it] conjures up its forbidden or forgotten double.”

6. W.H. Auden was the most ruthless of self-editors, as his decades-long war of attrition against his most celebrated lyric, “September 1, 1939,” demonstrates. Originally published in the New Republic on the occasion of Adolf Hitler’s invasion of Poland, “September 1, 1939” is well-remembered and well-loved for a reason. “I sit in one of those dives/On Fifty-second Street,” where Auden hears news of the panzer divisions rushing towards Warsaw and Krakow. Here among mundanity, where “Faces along the bar/Cling to their average day,” Auden invokes feelings all too familiar to us in the contemporary moment (hence the endurance of the poem), these “Waves of anger and fear” felt by the drinkers at the bar; men who feel “Uncertain and afraid/As the clever hopes expire/Of a low dishonest decade.” Nonetheless, Auden maintains that the “lights must never go out, /The music must always play,” in the penultimate stanza including a line that has moved people for eight decades – “We must love one another or die.”

Eighteen years later, Auden would write to a critic that “Between you and me, I loathe that poem.” He saw it as sentimental pablum; most importantly, Auden felt that it was simply a lie. His biographer Edward Mendelson explains in Early Auden, Later Auden: A Critical Biography that “By his own standards, if not those of his readers, these public poems failed.” In later editions he changed the line to “We must love one another and die,” the conjunction giving an entirely different meaning (albeit a literally truer one). The red pen is not easily pulled out once a book is in print, for though he omitted the line from collections released in both 1945 and 1966, it was inevitable that “September 1, 1939” would circulate, even though he wrote that it was “trash which he is ashamed to have written.” This poem is not lost literature, but rather a case of a failed attempt to bury the word. Impossible to imagine that Auden didn’t despair at the simple fact that there had been a time when he could have strangled the poem in the crib, that before “September 1, 1939” was sent out into the world the sovereign power of the strike-through had still been his.

Luckily for us, Auden didn’t do that, but the episode demonstrates that one of the most effective means of losing literature is editing and revising. How many drafts of famous novels and poems exist, revisions innumerable? If there is any modernist credo, it’s one of valorizing the red pen: the perennial workshop injunction to “kill your darlings,” and anecdotes about Hemingway writing a staggering forty-seven endings to A Farewell to Arms (though only one of his novels is in some tossed luggage somewhere). Such is the masochism of contemporary composition advice, whereby if there is one inviolate truism it’s that writing isn’t writing unless it’s rewriting. Vladimir Nabokov, who bragged that “My pencils outlast their erasers”; Truman Capote saying “I believe in the scissors”; and William Strunk and E.B. White’s commandment in The Elements of Style that “A sentence should contain no unnecessary words” (I reject that advice). Lean, muscular, masculine, taut, minimalist prose was the point of writing, and as such loss became an intrinsic part of literature itself.

Hannah Sullivan in her brilliant The Work of Revision examines how the cult of editing emerged, looking at how technology in part facilitated the possibility of multiple drafts. With the introduction of mechanical means of composition (i.e., the typewriter), authors had, for the first time, the ability to relentlessly write and rewrite, and a certain ethos of toughness surrounding the culling of words developed. “Our irrepressible urge to alter ‘the givens’ helped to create Modernism,” argues Sullivan, and “remakes us right to the end.” In some ways, contemporary technology haunts us with the ghosts of exorcised drafts more than mere typewriters ever could. Sullivan at least had a record of typed pages to look back at: drafts with underlined and struck-out passages, a cacophony of insertion carets and transposition marks and the eternal promise of “TK,” rendered in ink and whiteout; a record nonetheless.

With word processing, editing and revision can be instantaneous in a manner that they couldn’t be with a Remington, so that drafts exist layered on top of each other, additions and deletions happening rapidly in real time, with no record of what briefly existed before, like some quantum fluctuation. A final copy is the result of writing, but is not writing itself. It rather represents the aftermath of a struggle between the author and the word, merely the final iteration of something massive and copious, spreading its tendrils unseen backwards into a realm of lost literature. Revision is a rhizomatic thing, each one of the branches of potential writing hidden and holding aloft the tiny plant. A final draft is the corpse left over after the life that is writing has ended.

7. How talented an actor must Edwin Forrest have been that on May 10th, 1849, his fans would be willing to riot in his defense after it was perceived that he was being slighted by the rival thespian William Charles Macready? The two had long traded blows in the press over who was the superior Shakespearean actor, and they each had their own partisans. Where Macready was delicate and refined, Forrest was rough-hewn and rugged; Macready delivered his lines with elegance, Forrest with swagger and punch. Advocates for Macready watched their hero perform Hamlet in the palatial theaters of Broadway; the faction of Forrest was content to drink and brawl in front of the Bowery’s stages. Most importantly, Macready was British and Forrest an American. Shakespeare was thus an issue of patriotic loyalty, with Nigel Cliff writing in The Shakespeare Riots: Revenge, Drama, and Death in Nineteenth-Century America that the Bard was “fought over, in frontier saloons no less than in aristocratic salons, with an almost hysterical passion.”

“Early in life,” Forrest once said, “I took a great deal of exercise and made myself what I am, a Hercules.” The “Bowery Boys” of the Five Points slums were delighted by Forrest’s American self-regard. Which actor you preferred, and whose style of delivery you saw as superior, said much about who you were as a person. Forrest was preferred by the working class, both Know Nothing nativists and Irish immigrants thrilled to him, while the Anglophilic New York aristocracy attended Macready plays in droves. Following three nights of rambunctious, mocking “Bowery Boys” buying out seats at Macready’s title performance in Macbeth (of course) at the Astor Place Opera House, a riot would explode, leaving as many as three dozen people dead and over a hundred injured. It was the worst violence in the city since British prison ships dotted the harbor during the Revolution, and would remain so until the Draft Riots burned through New York during the Civil War. All of it over the power of performances that none of us shall ever see.

No literature is more intrinsic to human experience than performance, and no literature is more perishable. The New York World said that Forrest had “head tones that splintered rafters,” and reviewers noted the distinctive elongated pauses of Macready’s delivery, but the fact is that theirs is an art that will never be accessible to us. Sometimes Macready is configured as a stuffy Laurence Olivier to Forrest’s virile Marlon Brando, but more than likely both would have performed in the rigid, stylized manner that reigned supreme in a theater where there was no technology that could amplify voices and where the idea of the naturalistic method would have seemed bizarre. We can’t really know, though — Forrest died in 1872, followed less than six months later by Macready, both within five years of Thomas Edison’s invention of the phonograph.

We know a tremendous amount about the men, and eventually the women, who first performed some of the most iconic of plays centuries before Forrest and Macready. Even with contemporary accounts, however, we’ll never actually be able to see Richard Burbage, knowing rather the names of the characters he first played — Hamlet, Lear, Othello. Likewise, the Elizabethan comedian Richard Tarlton has had his monologues rendered mute by history. Or the Lord Chamberlain’s Men’s great comedic actor William Kempe, famous for his improvisational shit-talking at jeering audiences, though none of his jibes come down to us, even while it’s believed that he was instrumental in the composition of one of his most famous characters – Dogberry in Much Ado About Nothing. And the incomparable Edward Alleyn, confidant of Christopher Marlowe (and son-in-law of Donne), who was regarded as the greatest actor ever to grace the Rose Theater stage, and who mastered a subtle art so ephemeral that it disappeared the moment the play ended. Of this assemblage, Stanley Wells writes in Shakespeare & Co.: Christopher Marlowe, Thomas Dekker, Ben Jonson, Thomas Middleton, John Fletcher and the Other Players in His Story that they were “leading actors who would have been stars whenever they were born.” Well, maybe. Who is to say?

Part of the glory of the theater is its gossamer transience, the way in which each performance is different, how it can’t be replicated. A script is an inert thing, while the play is the thing forever marked by its own impermanence. In the years after Macready and Forrest died, Edison gave us the illusion of eternity, the idea that voices and images could be preserved. Nothing signaled a greater shift in human consciousness over the past millennium than the myth that, whether somebody is very far away or long dead (not dissimilar states), their identity could be preserved in an eternal present. We can’t watch Forrest or Macready — but we can watch Olivier and Brando. It seems fundamentally different, a way of catching forever the ephemeral nature of performance, of preserving the fleeting. An illusion, though, for decay just takes longer to come. Film must have seemed a type of immortality, but it’s estimated that 75% of silent films are lost forever, and as many as 90% of all films made before 1929. Flammable nitrate and the fickle junking of Hollywood studios proved as final as death, because not only can you never watch Forrest or Macready, you also can’t see Lon Chaney in London After Midnight, Theda Bara in Cleopatra, or Humor Risk — the first film starring the Marx Brothers.

8. “It is one hundred years since our children left,” reads a cryptic, anonymous missive in the town record of the German city of Hamelin from 1384. It is the only evidence for the basis of that disturbing fairy tale about the Pied Piper, he of motley cloth and hypnotic music who drew all of the children of the town away from their parents after he’d already depleted it of vermin. Fairy tales operate by strange dream logic, chthonic echoes from the distant past which exist half-remembered in our culture. Hypotheses have been proffered as to what the story may have been based on, why those children were taken. Explanations include that the youth of the city were victims of mass psychosis in a manner similar to the outbreaks of compulsive dancing which marked the late Middle Ages; it’s been suggested that they were victims of plague, that they’d been sold into slavery, or that they’d been recruited by a roving preacher to join one of the ill-fated “Children’s Crusades” that sent thousands of adolescents off to the Levant. Regardless of the basis for the fairy tale, its story has played out in our culture like an idée fixe, in the nineteenth century appearing not just in the Brothers Grimm’s Children’s and Household Tales, but also in poems by Johann Goethe and Robert Browning, with “Pied Piper” a shorthand for the siren’s manipulative call.

Who then “authored” the original story? Do we credit the Grimms’ reinvention as its origin, do we count the source material from which they drew their inspiration, does each text influenced by the tale stand on its own? Was it whatever forlorn ancestor made that annotation in Hamelin’s ledger? The Pied Piper himself? The nature of a fairy tale is that everyone is its reader but nobody is its author. Jack Zipes writes in Why Fairy Tales Stick: The Evolution and Relevance of a Genre that “We respond to these classical tales almost as if we were born with them, and yet, we know full well that they have been socially produced and induced and continue to be generated.” The Grimms and other Romantic-minded folklorists saw the fairy tale as arising spontaneously from the collective genius of the people, and there is a sense in which these anonymous tales are a collaborative venture of composition which takes place over centuries, millennia even. They are, in a sense, examples of lost literature finding itself, their creators’ anonymity a different form of oblivion.

The fairy tales which we all seem to intuitively know — Cinderella, Beauty and the Beast, Rumpelstiltskin — were collected by linguists like the Brothers Grimm, but it was in the twentieth century that folklorists were actually able to categorize them. Chief among the classification systems developed for fairy tales is the Aarne-Thompson-Uther Index, a complex method of charting the various narrative relationships between disparate stories, with an accompanying numeric mark to distinguish individual narratives. Cinderella, for example, is ATU 510A; Beauty and the Beast is ATU 425C. Scholars were thus able to chart stories to their potential beginnings. The tale of Cinderella finds its earliest iteration in the ancient Greek writings of the geographer Strabo; researchers at Durham University have been able to ascertain that the Beast first met Belle in a version from an astounding four thousand years ago. Jamie Tehrani and Sara Graça da Silva, the folklorists who used phylogenetic means to chart alterations in Indo-European languages so as to estimate the approximate age of various fairy tales, have claimed that The Smith and the Devil (of which the Faust legend is an iteration) may have first been told six millennia ago.

So many variations, so many lost stories, whispered to infants in swaddling clothes over millennia. We can never know what exactly the earliest versions of those stories were like; we’ll never know the names of those who composed them. Fairy tales pull at our soul like the phantom limb of an amputee, a dull ache of people long since gone whose stories we still tell even though we’ve forgotten the creators. Anonymous literature of this sort is the most intimate, told to children before bedtime, repeated to families preparing food around a kitchen table. “I would venture to guess that Anon,” wrote Virginia Woolf in A Room of One’s Own, “who wrote so many poems without signing them, was often a woman.” Such is the lost literature of our mothers, and our grandmothers, of “Anon,” who is the greatest writer who ever lived (or didn’t). Nothing is as intrinsic to our sense of identity as these sorts of stories; when all else is stripped away from us — popular paperbacks, avant-garde experimentation, canonical literature — fairy tales will remain. While our libraries are inscribed with names like “Shakespeare” and “Cervantes,” we’ll never be able to chisel into stone the legion of those who composed Cinderella.

9. Since his brother died, Amadeo García García can only speak his native tongue to another human in his dreams. His language was once used to express love and anger, to console and castigate, to build, to instruct, to preserve; now it is relegated only to nocturnal phantoms. Over the last several decades, fewer and fewer people were able to understand Taushiro, till only García’s immediate family knew the language, and now they’ve all died. A linguistic isolate spoken in the Peruvian Amazon, Taushiro is like all languages in that its syntax and grammar, its morphology and diction, necessarily shape its speakers’ perception of reality.

The journalist Nicholas Casey, who introduced García’s story to the world in a New York Times article, notes that the “entire fate of the Taushiro people now lies with its last speaker, a person who never expected such a burden and has spent much of his life overwhelmed by it.” When it joins that graveyard of discarded languages, alongside Akkadian and Manx, Ainu and Etruscan, what will pass is nothing so dry as a dictionary, but an entire vision of the world. Literature is language and all languages are literature, forged collaboratively in the discourse between people. When the only ones left to talk to are the ghosts of dead loved ones in dreams, it is the coda for an entire universe.

Linguist K. David Harrison explains in When Languages Die: The Extinction of the World’s Languages and the Erosion of Human Knowledge what exactly is at stake. As of the turn of this century, there were 6,912 distinct languages spoken in the world, though the vast majority of them are spoken by exceedingly few people (as with Taushiro and its speakership of one). He explains that 204 of those languages have fewer than ten speakers, and that an additional 344 have no more than a hundred. By the end of this century, the number of spoken languages will be half that previous number, if we’re lucky. Victims of globalization and “development,” these languages constitute an “immense edifice of human knowledge, painstakingly assembled over millennia by countless minds,” which, Harrison warns, “is eroding, vanishing into oblivion.”

García can give us indications of what the stories he heard from his parents were like, of how it feels to speak a language that doesn’t distinguish between numbers, or where diction is whittled down to a pristine simplicity, but we’ll never really know, since none of us can speak Taushiro. It was the anthropologist Edward Sapir and his student Benjamin Whorf who made the fullest argument as to the way that these unique qualities produce thought, whereby language isn’t the result of ideas, but rather ideas are the result of language. Their estimation was that things like tense, person, subject-verb-object order, and so on don’t just convey information—they create it. Whorf was a fire insurance inspector intimately aware of how much of reality depends on the language through which we sieve our experience; it was he who was responsible for the convention of “flammable” things being marked as such, as opposed to the grammatically correct “inflammable,” which he had discovered people took to mean the opposite.

Their Sapir-Whorf Hypothesis is succinctly stated by the first of the two in his 1929 The Status of Linguistics as a Science, when he argued that “Human beings do not live in the objective world alone… but are very much at the mercy of the particular language which has become the medium of expression for their society… The worlds in which different societies live are distinct worlds, not merely the same world with different labels attached.” English is not French is not Greek is not Farsi is not Punjabi. Taushiro is not English. Translation is feeling about in a darkened room and being able to discern the outline of the door, but it doesn’t give one the ability to step through into the other room (only perhaps to hear some muffled conversation with an ear pressed against the wall).

When a tongue has genuinely stopped moving, there is an insurmountable difference separating us from its literature. We’ll never quite get the fear in Elamite accounts of Alexander the Great invading the Achaemenid Empire; nor understand the vanquished pathos of the god Chemosh speaking in his native Moabite; or the longing implicit in the poetry of Andalusian Arabic. Each one of those languages had its own last speaker, as lonely as García, like Lot surveying his destroyed home and thinking he was the last man on Earth, or, as it’s said in Taushiro, “Ine aconahive ite chi yi tua tieya ana na’que I’yo lo’.”

10. A thousand virgin trees have been planted in the Nordmarka forest near Oslo. Just saplings today, the Norwegian spruces are flanked by older birch and fir trees, but the new plantings are marked to be felled in 2114, after they’ve grown for a century. At that point, they’ll be pulped and turned to paper, which will be transported to the Deichman Library, which houses a printing press that will be used to produce the first editions of books compiled over the preceding ten decades and maintained in the sanctum of a wooden space known as the “Silent Room.” This is the Scottish artist Katie Paterson’s Future Library Project, in which a different prominent author will contribute a work every year until the completion date, with the understanding that nobody will be allowed to read their contribution until 2114.

Which means that none of you reading this today will ever be able to parse Margaret Atwood’s novel Scribbler Moon. What its plot is, who the characters she’s created are, or the themes entertained is all a glorious absence, save for that evocative two-word title. Nor can you read David Mitchell’s From Me Flows What You Call Time, whose title makes it sound as if it were an imagined novel from his Cloud Atlas, the author remarking that taking part in the project was a “vote of confidence in the future.” The most recent contribution has come from native son Karl Ove Knausgaard, an untitled work which may or may not contain descriptions of breakfast that go on for pages. Knausgaard said of Paterson’s vision that it’s “such a brilliant idea, I very much like the thought that you will have readers who are still not born — it’s like sending a little ship from our time to them.”

A vote of confidence in the future is a beautiful description of a beautiful project, if an idiosyncratic one. It’s also a definition of literature, for even though the writer must primarily create for herself, literature still must transmit in the connections between minds. Literature is a vote of confidence in the future, in the present, in the past – it’s a vote of confidence in other people. The Future Library Project is in keeping with those theorists who are concerned with “deep time,” with the profoundly long view and arc of human history as it rushes away from us. The Long Now Foundation of San Francisco is one such organization that encourages all of us to think in the sorts of terms that Paterson does, to understand that innumerable civilizations have fallen and so shall ours, but that there is a way in which history ever moves forward.

Stewart Brand writes of a future Library of Alexandria in The Clock of the Long Now: Time and Responsibility, imagining a “10,000-Year Library… [in] a vast underground complex hewn out of rock – preferably a mountain.” The Long Now Foundation is tentatively taking suggestions for what a 10,000-year library might look like, what books should be included, and how we’re to understand the continuity of an institution that would be older than all of recorded human history. “Fantasy immediately calls up a refuge from the present,” Brand writes, “a place of weathered stone walls and labyrinthine stacks of books, at a remote location with far horizons. It is a place for contemplative research and small, immersive conferences on topics of centenary and millennial scope.” Surely he knows that there is something quixotic in this vision, just as Paterson no doubt understands that a century hence it’s quite possible that nobody will be left around to read those books in Oslo.

Literature is forever in the process of being lost, and it’s hubristic to assume that what we read today will be around to be read tomorrow. Nevertheless, that’s the beauty of Paterson’s and Brand’s dreams: they conceive of a way that all which is lost shall someday be found, that all which is feeble can be preserved. Theirs is a struggle of attrition against that most merciless of editors known as entropy. All literature is a similar resistance against time, mortality, finitude, limitation. To write is to commit an act of faith, to pray that what words you’ve assembled shall last longer than you, and that they’ll hopefully be found by at least someone who shall be, however briefly, changed.

Bonus Links: Ten Ways to Live Forever, Ten Ways to Save the World, Ten Ways to Look at the Color Black


Letter from the Capitol


The Confederate battle standard never flew within the Capitol Building — until January 6th, 2021. During the Civil War, that cankered, perfidious, malignant, cancerous cabal of traitors who grandiosely called themselves the “Confederate States of America” made many strategic thrusts northward into the nation’s body, and because of these, for a time, it seemed as if they might be triumphant. General John Hunt Morgan’s 2nd Kentucky Cavalry Regiment raided not just in that unfortunate border state, but in 1863 they pierced into Indiana and Ohio as well. Morgan would finally surrender in Salineville, Ohio, which latitudinally is almost as far north as Connecticut. Even more incongruously, a year later, 21 veterans of Morgan’s Raid crossed over the Canadian border, that land then colonized by a Southern-sympathizing Great Britain, and attacked the sleepy hamlet of St. Albans, Vermont, robbing the bank and forcing the citizens at gunpoint to swear fealty to the Confederacy. The most violent (and most famous) invasion of the north was the traitor Robert E. Lee’s campaign in Pennsylvania, the goal of which was possibly to capture or burn down Philadelphia, but which was stopped at the infamous “High Water Mark” of the Confederacy when Union General George G. Meade turned back the Army of Northern Virginia at Gettysburg—a battle that produced more than 50,000 American casualties in three days. During Lee’s campaign in southern Pennsylvania, free Black women and men had to flee north, as the Confederate raiders would send those they kidnapped into southern bondage.

For sheer absurdity, among the closest positions that the rebels ever held to the national capital was the Marshall House Inn in Alexandria, Virginia, where a Confederate flag was displayed that was so large and so tall that Lincoln could see it from the White House across the Potomac. A few weeks after Ft. Sumter, Union troops occupied the city, marching down red-bricked King Street, where slave markets had sold thousands of human beings less than ten miles from the Capitol Building. When Colonel Elmer Ephraim Ellsworth of the 11th New York Volunteer Infantry Regiment ascended to the roof of the hotel to remove the flag, the proprietor of the Marshall House shot him dead, the first Union casualty of the Civil War. Despite being able to see the warped cross of the Confederate battle standard from the portico of the White House, Lincoln steadfastly refused to move the capital to safer points further north, arguing that the abandonment of Washington would be a capitulation to the seditionists.

“Let us be vigilant,” Lincoln telegraphed to the worried Maryland governor in 1864, “but keep cool. I hope neither Baltimore nor Washington will be sacked.” Not for lack of desire, as that same year Confederate Lieutenant General Jubal Early would attack Ft. Stevens in the Northwest Quadrant of the District of Columbia, in a battle that would cost close to nine hundred casualties. Long had the secessionists dreamed of Washington as the capital of their fake nation. In the decades before the Civil War some imagined a “Golden Circle,” which would be a veritable empire of slavery, with the South Carolina Senator Robert Barnwell Rhett imperially enthusing that “We will expand… over Mexico – over the isles of the sea — over the far-off Southern — until we shall establish a great Confederation,” their twisted nation stretching from Panama to the District of Columbia. Until last week the Confederate flag had never flown within the Capitol.

In the now-infamous photograph, a man casually strolls across the red-and-blue mosaic floor of some antechamber in the Capitol, dressed in jeans and a black hoodie with a tan hunting vest; hoisted over his shoulder is the Confederate flag, its colors matching the tiles. It shouldn’t be lost on anybody that his uniform is the exact same “suspicious” article of clothing which Black teenagers have been shot for wearing, even while this man is able to raid the very seat of government unmolested. Because America is many things, but it is not subtle, the man in the photograph is centered between two gilt-framed oil paintings. One is of Charles Sumner, the Massachusetts Senator and abolitionist nearly caned to death by an opponent on the legislative floor of this very building, and who denounced the “unutterable wrongs and woes of slavery; profoundly believing that, according to the true spirit of the Constitution, and the sentiments of the fathers, it can find no place under our National Government” before Congress in 1852. The other portrait, almost predictably, is of John C. Calhoun, the South Carolina Senator and Vice President under Andrew Jackson, who in 1837 would declaim that the “relation now existing in the slaveholding states… instead of an evil, [is] a good. A positive good,” and would then gush about what a kind and benevolent slave-master he was. It would be hard to stage a more perfect encapsulation of the American dichotomy than our weekend warrior did on Wednesday, the continual pull between those better angels of our nature and the demons of history, who are never quite exorcized and are often in full possession of the body politic. There is a power in that grotesque image, the cosplaying Confederate momentarily anointing himself sovereign as he casually strolls through the chamber. Chillingly strolls, one might say, for all of these terrorists acted with impunity, as if they knew there would be no consequences to their actions. It reminds us that the mantra “This isn’t who we are” is at best maudlin and at worst a complete lie.

The siege against the Capitol on the day that Congress met for the constitutionally mandated and largely pro-forma ritual of officially counting the Electoral College votes to certify Joe Biden and Kamala Harris as the rightful victors of the 2020 presidential race can be examined from many directions, of course. Security experts can parse why there was such a profound failure at ensuring the safety of the session; political scientists can explain how social media algorithms have increasingly radicalized adherents of the far-right; historians can place movements like QAnon and the Proud Boys in a genealogy of American nativism and European fascism. Everyone should be able to say that ultimate responsibility lay with the stochastic terrorism promoted by the lame-duck president and his congressional sycophants in the Sedition Caucus, as well as his media enablers, with whom he is clasped in a toxic symbiotic relationship. All those approaches to analysis are valid, but I choose to look at the day as a literary critic and a resident of Washington D.C., because those things are what I am. Incongruity alone, though, even the uncanny alone, can’t quite provide the full critical lexicon for what we witnessed on our televisions that afternoon, the sense that even more than an inflection point, we were viewers of a cracked apocalypse. How do we make sense of an attempted American putsch, the almost-nightmare of a coup?

Because the cultural idiom of this nation is Hollywood, and our interpretive lens is by necessity that of the movies, I can’t help but feel that much of what we saw seemed prefigured in film. The terrible logic of America is that our deepest nightmares and desires always have a way of enacting themselves, of moving from celluloid to reality. Look at the photograph of Jake Angeli, the self-styled “QAnon Shaman,” shirtless and bedecked in raccoon fur with buffalo horns upon his head (in pantomime of the very people whom this nation enacted genocide upon), his face smeared in the colors of the American flag, standing at the dais of the Senate, and tell me that it doesn’t look like a deleted scene from The Postman. Or examine the photograph of a smiling ginger man in a stocking cap emblazoned with “TRUMP,” casually waving as he jauntily strolls underneath the rotunda past John Trumbull’s massive painting Surrender of General Burgoyne, holding under his arm a pilfered wood podium decorated with a gold federal eagle, his hero’s adage that “when the looting starts, the shooting starts” apparently only to be selectively enforced. It looks like something from the post-apocalyptic movie The Book of Eli.

And then, most chillingly (and disturbingly underreported), there was the painstakingly assembled set of gallows, placed a bit beyond the equestrian monument to Ulysses S. Grant (who with great courage and strength broke the first iteration of the Ku Klux Klan), from which one vigilante hung that most American of symbols, a noose. When remembered in light of the black-clad and masked men photographed with guns and zip-ties, it should make all of us consider just how much more tragic this violation, which was already a grotesque abomination, could have been. Horrifying to recall that the narrative conceit in Margaret Atwood’s The Handmaid’s Tale (and its television adaptation) that allowed for the theocratic dictatorship to ascend to power was the mass murder of a joint session of Congress. Sometimes #Resistance liberals get flak for their fears of fascism, but it would be easier to mock those anxieties if our country didn’t so often look like a science fiction dystopia.

It’s my suspicion that pop culture — that literature — is capable of picking up on some sort of cultural wavelength, those deep historical vibrations that diffuse in circles outward from our present into both past and future. There is something incantatory about those visions generated in word and special-effect, so that the eeriness of seeing marauding fascists overtake the Capitol grounds feels like something we’ve seen before. Think of all the times we’ve watched the monuments of Washington D.C. destroyed on film. Last week — while half paying attention to a block of cheesy apocalypse movies on the Syfy network that were supposed to count down the days left in the year — I saw the U.S.S. John F. Kennedy aircraft carrier pushed into the city by an Atlantic tsunami, where it rolled across the National Mall and crushed the White House, in Roland Emmerich’s godawful 2012. I’ve seen the executive mansion punctuated by bombs and dotted with bullet holes in the spectacularly corny Antoine Fuqua movie Olympus Has Fallen, and according to Thrillist the Capitol itself has been laid waste in no fewer than nine movies, including The Day After Tomorrow, Earth vs. the Flying Saucers, G.I. Joe: Retaliation, Independence Day, Olympus Has Fallen, Superman II, White House Down, and X-Men: Days of Future Past. Probably the impulse to watch this sort of thing is equal parts vicarious thrill and enactment of deep fears. I remember that when I saw Independence Day (also by Emmerich, the Kurosawa of schlock) in 1996, the theater audience erupted into cheers and claps when the impenetrable wall of exploding alien flames incinerated its way across D.C. and shattered the white dome of the Capitol like an egg thrown into a fireplace. Was that applause an expressed opinion about Newt Gingrich? About Bill Clinton? Something darker?

After the terrorist attacks of 9/11, now almost twenty years ago, there was a profoundly shortsighted prediction that the hideous spectacle of Americans seeing the World Trade Center collapse would forever cure us of our strange desire to see our most famous buildings, and the people within them, destroyed. A perusal of the Olympian corpus of the Marvel Cinematic Universe (seemingly the only entertainment which Hollywood bothers to produce anymore) will testify that such an estimation was, to put it lightly, premature. The French philosopher Guy Debord could have told us this in 1967 in his Society of the Spectacle, wherein he noted that “all of life presents itself as an immense accumulation of spectacles. Everything that was directly lived has moved away into a representation,” to which it could be added that the inverse is also accurate — everything that has been represented has seemingly moved into life. Which doesn’t mean that scenes like those which we witnessed on Wednesday aren’t affecting — no, the opposite is true. People reach for the appraisal that “it looks like a movie” not to be dismissive, but rather because cinema is the most powerful mythopoesis that we’re capable of.

What’s needed, of course, is a vocabulary commensurate with what exactly all of us saw: a rhetoric capable of grappling with defilement, with violation, with desecration. But because all we have is movies, that’s what we’re forced to draw upon. They gave us the ability to think about the unthinkable before it happened, the chance to see the unseeable before it was on our newsfeeds. If the vision of the screen is anemic, that’s not necessarily our fault — we measure the room of our horror with the tools which we’ve inherited. Few square miles of our civic architecture are quite so identified with our quasi-sacred sense of American civil religion as the grounds of the U.S. Capitol, and so the spectacle of a clambering rabble (used as a Trojan Horse for God knows what more nefarious group of actors) calls to mind fiction far more than it does anything which has actually happened. That’s the cruelty of our current age — that so frequently our lives resemble the nightmare more than the awakening. The Capitol siege was very much an apocalypse in the original Greek sense of the word, an unveiling, a rupture in normal history, which is why all of this feels so cinematic — though it’s hard to tell if it’s the beginning or the ending of the movie, and what genre exactly we’re in. As Timothy Denevi writes about the assault in LitHub, “What is a culmination, after all, except the moment in which everything that could happen finally does? Where are we supposed to go from there?”

Important to remember that everything which could happen has already happened before, at some point. That’s what the bromide about this not being who we are gets wrong — this is, at least partially, who we’ve always been, albeit not in this exact or particular way. What happened at the eastern edge of the Mall this week has shades of the Wilmington Insurrection of 1898, in which a conspiracy of white supremacists plotted against the Black leadership of the North Carolina city and ushered in Jim Crow at the cost of hundreds of lives (and then untold millions over the next century). The assault on the Capitol has echoes of the Election Riots of 1874, when members of the White League attacked Black voters in Eufaula, Alabama, leaving behind dozens of wounded women and men, and seven corpses. These are two examples of hundreds of similar events that shamefully litter our nation’s history, though most citizens have never heard of them. Hell, most people didn’t know about the Tulsa race massacre of 1921 — still less than a century ago — until HBO’s Watchmen dramatized it. The issue is exactly the same: white supremacists think that only their votes count, and will do anything to enforce that conviction.

That the supporters of the man who currently occupies the Oval Office believe any number of insane and discredited conspiracy theories about election fraud — claims rejected in some sixty lawsuits and a 9-0 Supreme Court decision — in some ways misses the point. Listen to their language — the man who instigated Wednesday’s riot emphasizes that he simply wants to count “legal” votes — and ask yourself what that means, and then realize why the fevered rage of his mob focuses on places like Detroit, Philadelphia, and Atlanta. If the only people who’d been allowed to vote for Trump were white people, then he would have won the election in his claimed landslide — that’s what he and his supporters mean by “legal” votes. The batshit insane theories are just fan fiction to occlude the actual substance of their political belief. Such anti-democratic sentiment is also an American legacy, an American curse. The connection between what happened on Capitol Hill and in Wilmington, Eufaula, and Tulsa; or Fort Bend, Texas in 1888; or Lake City, South Carolina in 1897; or Ocoee, Florida in 1920; or Rosewood, Florida in 1923 (you can look them all up); or thousands of other incidents, may seem tangential. It isn’t.

When I lived in Massachusetts there was a sense of history that hung thick in the air, all of those centuries back to the gloomy Puritans and their gothic inheritance. Historical markers punctuated the streets of Boston and her suburbs, and there was that rightfully proud Yankee ownership of the American Revolution. Our apartment was only a mile or so from the Lexington battle green where the shot heard round the world rang out, and I used to sometimes grab a coffee and read a magazine on one of its benches in what is effectively a pleasant park, battle green thoughts in a green shade. Part of me wanted to describe this part of the country as haunted, and perhaps it is, but its ghosts seem to belong to a distant world, a European world. By contrast, when I moved to Washington D.C., the American specters moved into much clearer focus. If Massachusetts seems defined by the Revolution, then the District of Columbia, and Maryland, and Virginia are indelibly marked by the much more violent, more consequential, more important, and more apocalyptic conflagration of the Civil War. In his classic Love and Death in the American Novel, the critic Leslie Fiedler described the nation as “bewilderingly and embarrassingly, a gothic fiction, nonrealistic and negative, sadist and melodramatic — a literature of darkness and the grotesque in a land of light and affirmation.” Our national story is a Jekyll and Hyde tale about the best and worst aspirations at conflict within the Manichean breast of a nation which fancied itself Paradise but ended up somewhere points further south.

Because I have a suspicion that poetry is capable of telling the future, that everything which can or will happen has already been rendered into verse somewhere (even if obscured), a snatch of verse from a Greek poet accompanied my doomscrolling this week. “Why isn’t anything going on in the senate?” Constantine Cavafy asked in 1898. “Why are the senators sitting there without legislating?” I thought about it when I first heard that the mob was pounding at the Capitol door; it rang in my brain when I saw the photographs of them parading through that marble forest of Statuary Hall, underneath that iron dome painted a pristine white. “Because the barbarians are coming today,” Cavafy answered himself. I thought about it when I looked at the garbage strewn through the halls, the men with their feet up on legislators’ desks, cackling at the coup they’d pulled. “What’s the point of senators making laws now? Once the barbarians are here, they’d do the legislating.” For the moment, it seems that the barbarians have either been pushed back or left of their own accord. In that interim, what will be done to make sure that they don’t return? Because history and poetry have taught us that they always do.

Image credit: Pexels/Harun Tan.

On Dreams and Literature

-

“We are such stuff/As dreams are made on, and our little life/Is rounded with a sleep.” — William Shakespeare, The Tempest (1611)

“A candy-colored clown they call the sandman/tiptoes to my room every night/just to sprinkle stardust and to whisper/’Go to sleep, everything is alright.’” — Roy Orbison, “In Dreams” (1963)

Amongst the green-dappled Malvern Hills, where sunlight spools onto spring leaves like puddles of water in autumn, a peasant named Will is imagined to have fallen asleep on a May day when both the warmth and the light induce dreams. Sometime in the late fourteenth century (as near as we can tell, between 1370 and 1390), the poet William Langland wrote of a character in Piers Plowman who shared his name and happened to fall asleep. “In a summer season,” Langland begins, “when soft was the sun, /I clothed myself in a cloak as I shepherd were… And went wide in the world wonders to hear.” Presentism is a critical vice, a fallacy of misreading yourself into a work, supposedly especially perilous if the work is nearly seven centuries old. Hard not to commit that sin sometimes. “But on a May morning,” Langland writes, and I note his words those seven centuries later on a May afternoon, when the sun is similarly soft, and the inevitable drowsiness of warm contentment takes over my own nodding head and drowsy eyes, so that I can’t help but see myself in the opening stanza of Piers Plowman.

“A marvel befell me of fairy, methought./I was weary with wandering and went me to rest/Under a broad bank by a brook’s side,/And as I lay and leaned over and looked into the water/I fell into a sleep for it sounded so merry.” Good close readers that we are all supposed to be, it’s imperative that we don’t read into the poem things that aren’t actually in it, and yet I can’t help but imagine what that daytime nocturne was like. The soft gurgle of a creek through English fields, the feeling of damp grass underneath dirtied hands, and of scratchy cloak against unwashed skin; the sunlight tanning the backs of his eyelids, that dull, corpuscular red of daytime sleep; the warmth of day’s glow flushing his cheeks, and the almost preternatural quiet save for some bird chirping. The sort of sleep you fall into on a train that rocks you through the sunlight of late afternoon. It sounds nice.

Piers Plowman is of a medieval poetic genre known as the dream allegory, or even more enticingly as the dream vision. Most famous of these is Dante Alighieri’s The Divine Comedy, whose central character (who, as in Piers Plowman, shares the poet’s name) discovers himself in a less pleasant wood than does Will: “When I had journeyed half of our life’s way,/I found myself within a shadowed forest,/for I had lost the path that does not stray.” The Middle Ages didn’t originate the dream vision, but it was the golden age of the form, when poets could express mystical truths in journeys that only happened within heads resting upon rough, straw-stuffed pillows. Langland’s century alone saw Geoffrey Chaucer’s Parliament of Fowls, John Gower’s Vox Clamantis, John Lydgate’s The Temple of Glass, and the anonymously written Pearl (by the same lady or gent who wrote Sir Gawain and the Green Knight). Those are only English examples (or I should say examples by the English; Gower was writing in Latin), for the form was popular in France and Italy as well. A.C. Spearing explains in Medieval Dream-Poetry that while sleeping “we undergo experiences in which we are freed from the constraints of everyday possibility, and which we feel to have some hidden significance,” a sentiment which motivated the poetry of Langland and Dante.

Dante famously claimed that his visions — of perdition, purgatory, and paradise — were not dreams, and yet everything in The Divine Comedy holds to the genre’s conventions. Both Langland and Dante engage the strange logic of the nocturne, the way in which the subconscious seems to rearrange and illuminate reality in a manner that prosaic, overrated wakefulness simply cannot. Dante writes that the “night hides things from us,” but his epic is itself proof that the night can just as often reveal things. Within The Divine Comedy Dante is guided through the nine circles of hell by the Roman poet Virgil, from the antechamber of the inferno wherein dwell the righteous pagans and classical philosophers, down through the frozen environs of the lowest domain whereby Lucifer forever torments and is tormented by that trinity of traitors composed of Cassius, Brutus, and Judas. Along the way Dante is privy to any number of nightmares, from self-disemboweling prophets to lovers forever buffeted around on violent winds (bearing no similarity to a gentle Malvern breeze). In the Purgatorio and Paradiso he is spectator to far more pleasant scenes (though it’s telling that more people have read Inferno, as our nightmares are always easiest to remember), whereby he sees a heaven that’s the “color that paints the morning and evening clouds that face the sun,” almost a description of the peacefulness of accidentally nodding off on an early summer day.

Both The Divine Comedy and Piers Plowman express verities accessed by the mind in repose; Langland’s poem, though it begins not in a dark wood but in a sunny field, embodies mystical apprehensions as surely as Dante’s does. A key difference is that Langland’s allegory is so obvious (as anyone who has seen the medieval play Everyman can attest is true of the period). Characters named after the Seven Deadly Sins, or called Patience, Clergy, and Scripture (and Old Age, Death, and Pestilence) all interact with Will — whose name has its own obvious implications. By contrast, Dante’s characters bear a resemblance to actual people (or they are actual people, from Aristotle in Limbo to Thomas Aquinas in Heaven), even while the events depicted are seemingly more fantastic (though in Piers Plowman Will witnesses both the fall of man and the harrowing of hell). Both are, however, written in the substance of dreams. Forget the didactic obviousness of allegory, the literal cipher that defines that form, and believe that in a field between Worcestershire and Herefordshire Will did plumb the mysteries of eternity while sleeping. What makes the dream vision a chimerical form is that maybe he did. That’s the thing with dreams and their visions; there is no need to suspend disbelief. We’re not in the realm of fantasy or myth, for in dreams order has been abolished, anything is possible, and nothing is prohibited, not even flouting the arid rules of logic.

There is a danger to this, for to dream is to court the absolute when we’re at our most vulnerable, to find eternity in a sleep. Piers Plowman had the taint of heresy about it, as it inspired the revolutionaries of the Peasants’ Revolt of 1381, as well as the adherents of a schismatic group of proto-Protestants known as the Lollards. Arguably the crushing of the rebellion led to an attendant attack by authorities on vernacular literature like Piers Plowman, in part explaining the general dismalness of English literature in the fifteenth century (which, excluding Malory and Skelton, is the worst century of English writing). Scholars have long debated the relationship between Langland and Lollardy, but we’ll let others more informed tease out those connections. The larger point is that dreaming can get you in trouble. That’s because dreaming is the only realm in which we’re simultaneously complete sovereign and lowly subject, the cinema we watch when our eyes are closed. Sleep is a domain that can’t be reached by monarch, tyrant, state, or corporation — it is our realm.

Dreams have a radical import in George Orwell’s dystopian classic 1984. You’ll recall that in that novel the main character, Winston Smith, is a minor bureaucrat in totalitarian Oceania. Every aspect of Smith’s life is carefully controlled; his life is under total surveillance, all speech is regulated (or made redundant by Newspeak), and even the truth is censored, altered, and transformed (which the character himself has a role in). Yet his dreams are the one aspect of his life which the government can’t quite control, for Smith “had dreamed that he was walking through a pitch-dark room. And someone sitting to one side of him had said as he passed: ‘We shall meet in the place where there is no darkness.’” Sleep is an anarchic space where the dreamer is at the whims of something much larger and more powerful than themselves, and by contrast where sometimes the dreamer finds themselves transformed into a god. A warning here, though — when total independence erupts from our skulls into the wider world (for after all, it is common to mutter in one’s sleep) there is the potential that your unconscious can betray you. Smith, after all, is always monitored by his telescreen.

Whether it’s the fourteenth century or the twenty-first, dreaming remains bizarre. Whether we reduce dreams to messages from the gods and the dead, or to repressed memories and neuroses playing in the nursery of our unconscious, or simply to the random electric flickering of neurons, the fact that we spend long stretches of our lives submerged in bizarre parallel dimensions is so odd that I can’t help but wonder why we don’t talk about it more (beyond painful conversations recounting dreams). So strange is it that we spend a third of our lives journeying to fantastic realms where every law of spatiality and temporality and every axiom of identity and principle of logic is flouted, that you’d think we’d conduct ourselves with a bit more humility when dismissing that which seems fantastic in the experience of those from generations past who’ve long since gone to their eternal sleep. Which is just to wonder: when William Langland dreamt, is it possible that he dreamt of me?

Even with the advancements of modern scientific study, the mysteriousness of dreams hasn’t entirely dissipated. If our ancestors saw in dreams portents and prophecies, then this oracular aspect was only extended by Sigmund Freud’s The Interpretation of Dreams. The man who inaugurated the nascent field of psychoanalysis explained dreams as a complex tapestry of wish fulfillment and sublimation, an encoded narrative that mapped onto the patient’s waking life and that could be deciphered by the trained therapist. Freud writes that there “exists a psychological technique by which dreams may be interpreted and that upon the application of this method every dream will show itself to be a senseful psychological structure which may be introduced into an assignable place in the psychic activity of the waking state.” Not so different from Will sleeping in his field. The origin may be different — Langland sees in dreams visions imparted from God, and Freud finds their origin in the holy unconscious — but the idea isn’t dissimilar. Dreaming imparts an ordered and ultimately comprehensible message, even though the imagery may be cryptic.

Freud has been left to us literary critics (who’ve even grown tired of him over the past generation), and science has abandoned terms like id, ego, and superego in favor of neurons and biochemistry, synapses and serotonin. For neurologists, dreaming is a function of the prefrontal cortex powering down during REM sleep, and of the hippocampus severing its waking relationship with the neocortex, allowing for a bit of a free-for-all in the brain. Scientists have discovered much about how and why dreaming happens — what parts of the brain are involved, what cycles of wakefulness and restfulness a person will experience, when dreaming evolved, and what functions (if any) it could possibly serve. Gone are the simple reductionisms of dream interpretation manuals with their categorized entries about your teeth falling out or showing up naked to your high school biology final. Neuroscientists favor a more sober view of dreaming, whereby random bits of imagery and thought thrown out by your groggy, chemically induced brain rearrange themselves into a narrative which isn’t really a narrative. Still, as Andrea Rock notes in The Mind at Night: The New Science of How and Why We Dream, “it’s impossible for scientists to agree on something as seemingly simple as the definition of dreaming.” If we’re such stuff as dreams are made on, the forensics remain inconclusive.

Not that dreaming is exclusively a human activity. Scientists have been able to demonstrate that all mammals have some form of nocturnal hallucination, from gorillas to duck-billed platypuses, dolphins to hamsters. Anyone with a dog has seen their friend fall into a deep reverie; their legs pump as if they’re running, and occasionally they’ll even startle-bark themselves awake. One summer day my wife and I entertained our French bulldog by having her chase a sprinkler’s spray. She flapped her jowly face at the cool gush of water with a happiness that no human is capable of. That evening, while she was asleep, she began to flap her mouth again, finally settling into a deeper reverie where she just smiled. Dreaming may not be an exclusively mammalian affair either — there are indications that both birds and reptiles dream — though it’s harder to study creatures more distant from us. Regardless, the evidence is that animals have been sleeping, perchance to dream, for a very long time. Rock writes that “Because the more common forms of mammals we see today branched off from the monotreme line about 140 million years ago… REM sleep as it exists in most animals also emerged at about the time that split occurred.” We don’t know if dinosaurs dreamt, but something skittering around and out of the way of their feet certainly did.

If animal brains are capable of generating pyrotechnic missives, and if dreaming goes back to the Cretaceous, what then of the future of dreaming? If dogs and donkeys, cats and camels are capable of dreaming, will artificial intelligences dream? This is the question asked by Philip K. Dick’s Do Androids Dream of Electric Sheep?, which was itself the source material for Ridley Scott’s science fiction film classic Blade Runner. Dick was an author as obsessed with illusion and reality, the unattainability of truth, and doubt as any writer since Plato. In his novel’s account of the bounty hunter Rick Deckard’s decommissioning of sentient androids, there is his usual examination of what defines consciousness, and the ways in which its illusions can present realities. “Everything is true… Everything anybody has ever thought,” one character says, a pithy encapsulation of the radical potential of dreams. Dick imagined robots capable of dreaming with such verisimilitude that they misapprehended themselves to be human, but as it turns out our digital tools are already able to slumber in silicon.

Computer scientists at Google have investigated what images are produced by a complex artificial neural network as it “dreams,” allowing the devices to filter the various images they’ve encountered and to recombine, recontextualize, and regenerate new pictures. In The Atlantic, Adrienne LaFrance writes that the “computer-made images feature scrolls of color, swirling lines, stretched faces, floating eyeballs, and uneasy waves of shadow and light. The machines seemed to be hallucinating, and in a way that appeared uncannily human.” Artificial Intelligence has improved to an unsettling degree in just the past decade, even if a constructed mind capable of easily passing the Turing Test has yet to be created; that seems more an issue of time than of possibility. If all of the flotsam and jetsam of the internet could coalesce into a collective consciousness emerging from the digital primordial like some archaic demigod birthing Herself from chaos, what dreams could be generated therein? Or if it’s possible to program a computer, a robot, an android, an automaton to dream, then what oracles of Artificial Intelligence could be birthed? I can’t help but thrill to the idea that we’ll be able to program a desktop version of the Delphic Oracle analyzing its own microchipped dreams. “The electric things have their life too,” Dick wrote.
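For the curious, the trick those Google researchers popularized (it came to be known as DeepDream) amounts, roughly, to running a trained image classifier in reverse: rather than adjusting the network’s weights to match an image, you adjust the image’s pixels to excite the network, so that whatever patterns a layer half-recognizes grow more vivid. A minimal sketch of that gradient-ascent idea in PyTorch might look like the following; the choice of pretrained model, file name, layer index, and step size are all my own illustrative assumptions, not anything from Google’s actual code:

    import torch
    from torchvision import models, transforms
    from PIL import Image

    # A pretrained classifier supplies the learned features that will "dream."
    net = models.vgg16(pretrained=True).features.eval()

    # Turn a photograph into a tensor that gradients can flow back into.
    image = transforms.Compose([transforms.Resize(256), transforms.ToTensor()])(
        Image.open("photo.jpg")).unsqueeze(0).requires_grad_(True)

    LAYER = 20  # an arbitrary mid-level layer whose activations we amplify
    for _ in range(30):
        x = image
        for i, layer in enumerate(net):
            x = layer(x)
            if i == LAYER:
                break
        # Gradient ascent on the pixels: strengthen whatever this layer
        # already half-sees in the photograph.
        x.norm().backward()
        with torch.no_grad():
            image += 0.01 * image.grad / (image.grad.abs().mean() + 1e-8)
            image.grad.zero_()

Run long enough, a loop like this turns clouds into eyeballs and hillsides into dog snouts, which is exactly the hallucinatory quality LaFrance describes.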

We’ve already developed AI capable of generating completely realistic-looking but totally fictional women and men. Software engineer Philip Wang built ThisPersonDoesNotExist.com, a program which does exactly what its name advertises: it gives you a picture of a person who doesn’t exist. Trained on a trove of actual photographs, Wang’s site uses something called a generative adversarial network to create pictures of people who never lived. If you refresh the site, you’ll see that the humans dreamt of by the neural network aren’t cartoons or caricatures, but photorealistic images so accurate that they look like they could be used for a passport. So far I’ve been presented with an attractive butch woman with sparkling brown eyes, a broad smile, and short curly auburn hair; a strong-jawed man in his 30s with an unfortunate bowl cut and a day’s worth of stubble who looks a bit like the swimmer Michael Phelps; and a nerdy-looking Asian man with a pleasant smile and horn-rimmed glasses. Every single person the AI presented looked completely average and real, so that if I encountered them in the grocery store or at Starbucks I wouldn’t think twice, and yet not a single one of them exists. I’d read once (though I can’t remember where) that every invented person we encounter in our dreams has a counterpart in somebody we once met briefly in real life, a waitress or a store clerk whose path we crossed for a few minutes, dredged up from the unconscious and commissioned into our narrative. I now think that all of those people come from ThisPersonDoesNotExist.com.
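The “adversarial” in that name is literal: two networks are trained against each other, a generator that invents faces from random noise and a discriminator that tries to tell the inventions from photographs, each improving through the other’s failures. A toy sketch of that training loop in PyTorch, with tiny stand-in models and fake data in place of the real face-scale architectures (all of it my own illustration, not Wang’s code), might run:

    import torch
    import torch.nn as nn

    # Toy stand-ins: G turns random noise into a flattened "image,"
    # D scores an image's probability of being a genuine photograph.
    G = nn.Sequential(nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, 784), nn.Tanh())
    D = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 1), nn.Sigmoid())
    g_opt = torch.optim.Adam(G.parameters(), lr=2e-4)
    d_opt = torch.optim.Adam(D.parameters(), lr=2e-4)
    bce = nn.BCELoss()

    # Stand-in "photographs" so the sketch runs; a real system trains
    # on a large corpus of actual portraits.
    dataloader = [torch.randn(32, 784) for _ in range(100)]

    for real in dataloader:
        noise = torch.randn(real.size(0), 64)
        fake = G(noise)

        # Teach the critic: photographs score 1, inventions score 0.
        d_loss = bce(D(real), torch.ones(real.size(0), 1)) + \
                 bce(D(fake.detach()), torch.zeros(real.size(0), 1))
        d_opt.zero_grad(); d_loss.backward(); d_opt.step()

        # Teach the forger: it wins when the critic scores its fakes as 1.
        g_loss = bce(D(fake), torch.ones(real.size(0), 1))
        g_opt.zero_grad(); g_loss.backward(); g_opt.step()

Once the forger gets good enough, refreshing the site simply means sampling a new noise vector and letting the generator dream up another face.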

One fictional person who recurs in many of our dreams is “This Man,” a pudgy, unattractive balding man with thick eyebrows and an approachable smile who was the subject of Italian marketer Andrea Natella’s now-defunct website “Ever Dream This Man?” According to Natella, scores of people had dreams about the man (occasionally nightmares) across all continents and in dozens of countries. Blake Butler, writing in Vice, explains that “His presence seems both menacing and foreboding at the same time, unclear in purpose, but haunting to those in whom he does appear.” This Man doesn’t particularly look like any famous figure, nor is he so generic that his presence can be dismissed as mere coincidence. There’s a spooky resonance to this guy, who looks like he manages a diner at First Avenue and 65th, emerging simultaneously in thousands of peoples’ dreams (his cameo is far less creepy once you’re aware of the website). Multiple hypotheses were proffered, ranging from This Man being the product of the collective unconscious as described by Carl Jung to Him being a manifestation of God appearing to people from Seattle to Shanghai (my preferred theory). As it turns out, he was simply the result of a viral marketing campaign.

Meme campaigns aside, the sheer weirdness of dreams can’t quite exorcize them of a supernatural import — we’re all looking for portents, predictions, and prophecies. Being submerged into what’s effectively another universe can’t help but alter our sense of reality, or at least make us question what exactly that word means. For years now I’ve had dreams that take place in the same recurring location — a detailed, complex, baroque alternate version of my hometown of Pittsburgh. This parallel-universe Pittsburgh roughly maps onto the actual place, though it appears much larger and there are notable differences. Downtown, for example, is a network of towering, interconnected skyscrapers all accessible from within one another (there’s a good bookstore there); a portion of Squirrel Hill is given over to a Wild West experience set. It’s not that I have the same dreams about this place; it’s that the place is the same, regardless of what happens to me in those dreams when I’m there. So much so that I experience the uncanny feeling of not dreaming, but rather of sliding into some other dimension. An eerie feeling comes to me from a life closer than my own breath, existing somewhere in the space between atoms, and yet totally invisible to my conscious eye.

Such is the realm of seers and shamans, poets and prophets, as well as no doubt yourself — the dream realm is accessible to everyone, its internal messages arriving from a universe hidden within, where muse and oracle dwell inside your own skull. Long have serendipitous missives arisen from our slumber, even while we debate their ultimate origin. The social activist Julia Ward Howe wrote “Battle Hymn of the Republic” when staying at Washington D.C.’s Willard Hotel in 1861, the “dirtiest, dustiest filthiest place I ever saw.” While “in a half dreaming state” she heard a group of Union soldiers marching down Pennsylvania Avenue singing “John Brown’s Body,” and based on that song Howe composed her own hymn while in a reverie. Howe’s dreaming was in keeping with a melancholic era enraptured by spiritualism and occultism, for, as she recalled, “attacks of versification had visited me in the night.” The apocalyptic Civil War altered peoples’ dreams, it would seem. Jonathan White explores the sleep-world of nineteenth-century Americans in his unusual and exhaustive study Midnight in America: Darkness, Sleep, and Dreams During the Civil War, arguing that peoples’ “dream reports were often remarkably raw and unfiltered… vividly bringing to life the horrors of the conflict; for others, nighttime was an escape from the hard realities of life and death in wartime.”

Every era imparts its own images, symbols, and themes into dreams, so that collective analysis can tell us about the concerns of any given age. White writes that during the Civil War people used dreams to relive “distant memories or horrific experiences in battle, longing for a return to peace and life as they had known it before the war, kissing loved ones [they] had not seen for years, communing with the dead, traveling to faraway places they wished they could see in real life,” which, even if the particulars differ, is not so distant from our current reposes. One of the most famous of Civil War dreamers was Abraham Lincoln, whose own morbid visions were in keeping with slumber’s prophetic purposes. Only days before his assassination, Lincoln recounted to his bodyguard that he’d had an eerily realistic dream in which he wandered from room to room in the White House. “I heard subdued sobs,” Lincoln said, as “if a number of people were weeping.” The president was disturbed by the sound of mourning, “so mysterious and so shocking,” until he arrived in the East Room. “Before me was a catafalque, on which rested a corpse wrapped in funeral vestments,” the body being that of Lincoln himself. Such dreams are significant — as the disquieting quarantine visions people have had over the past two months attest. We should listen — they have something to tell us.

Within literature dreams always seem to have something to say, a realm of the fantastic visited in books as diverse as L. Frank Baum’s The Wizard of Oz, Charles Dickens’ A Christmas Carol, Neil Gaiman’s Sandman, and Lewis Carroll’s Alice in Wonderland. The dream kingdom is a place where the laws of physics are muted, where logic and reason no longer hold domain, and the wild kings of absurdity are allowed to reign triumphant. Those are works in which characters like Dorothy, Ebenezer Scrooge, Morpheus, and Alice are subsumed into a fantastical dream realm, but there are plenty of books with more prosaic dream sequences, from Mr. Lockwood’s harrowing nightmare in Emily Brontë’s Wuthering Heights to Raskolnikov’s violent childhood dreams in Fyodor Dostoevsky’s Crime and Punishment. “I’ve dreamt in my life dreams that have stayed with me ever after, and changed my ideas,” writes Brontë; “they’ve gone through and through me, like wine through water, and altered the color of my mind.” Then there is the literature that emerges from dreams, the half-remembered snippets and surreal plot lines, the riffs of dialogue and the turns of phrase that are birthed from the baked brain of night. Think of the poppy reveries of Thomas De Quincey’s Confessions of an English Opium-Eater, or of Samuel Taylor Coleridge’s “Kubla Khan,” written in a similar drug haze until the poet was interrupted by that damned person from Porlock.

Spearing writes that “from the time of the Homeric poems down to the modern novel, it is surely true that the scene of the great bulk of Western literature has not been the internal world of the mind, in which dreams transact themselves, but the outer, public world of objective reality,” but this misses an important point. All novels actually occur in the internal world of the mind, no matter how vigorous their subjects may be. I’ll never be able to see the exact same cool colors of Jay Gatsby’s shirts that you envision, nor will I hear the exact timbre of Mr. Darcy’s voice that you imagine, in the same way that no photographs or drawings or paintings can be brought back from the place you go to when you sleep. Dreaming and reading are unified in being activities of fully created, totally self-contained realities. Furthermore, there is a utopian freedom in this, for that closed-off dimension, that pinched-off universe which you travel to in reveries nocturnal or readerly is free of the contagion of the corrupted outside world. There are no pop-up ads in dreams, there are no telemarketers calling you. Even our nightmares are at least our own. Here, as in the novel, the person may be truly free.

Dreaming is the substance of literature. It’s what comes before, during, and after writing and reading, and there can be no fiction or poetry without it. There is no activity in waking life more similar to dreaming than reading (and by proxy writing, which is just self-directed reading). All necessitate the complete creation of a totally constructed universe constrained within your own head and accessible only to the individual. The only difference between reading and dreaming is who directs the story. As in a book, so in our slumber: the world which is entered is one that is singular to the dreamer or reader. What you see when you close your eyes is forever foreign to me, just as I may never enter the exact same story-world that you do when you crack open a novel. “Life, what is it but a dream?” Carroll astutely asks.

We spend a third of our day in dream realms, which is why philosophers and poets have always rightly been preoccupied with them. Dreams necessarily make us question that border between waking and sleeping, truth and falsity, reality and illusion. That is the substance of storytelling as well, and that shared aspect between literature and dreaming is just as important as the oddity of existing for a spell in entirely closed off, totally self-invented, and completely free worlds. What unites the illusions of dreams and our complete ownership of them is subjectivity, and that is the charged medium through which literature must forever be conducted. Alfred North Whitehead once claimed that all of philosophy was mere footnotes to Plato — accurate to say that all of philosophy since then has been variations on the theme of kicking the tires of reality and questioning whether this exact moment is lived or dreamt.

The pre-Socratic metaphysician Gorgias was a radical solipsist who thought that all the world was the dream of God and the dreamer was himself. Plato envisioned our waking life as but a pale shadow of a greater world of Forms. René Descartes in Meditations on First Philosophy forged a methodology of radical doubt, whereby he imagined that a malicious demon could potentially deceive him into thinking that the “sky, the air, the earth, colors, shapes, sounds and all external things are merely the delusions of dreams which he has devised to ensnare my judgment. I shall consider myself as not having hands or eyes, or flesh, or blood or senses, but as falsely believing that I have all these things.” So, from the assumption that everything is a dream, Descartes tried to latch onto anything that could be certain. Other than his own mind, he wasn’t able to find much. In dreams there is the beginning of metaphysics, for nothing else compels us to consider that the world which we see is not the world which there is, and yet such philosophical speculation needs no philosophers, since children engage in it from the moment they can first think.

When I was a little kid, I misunderstood that old nursery rhyme “Row Your Boat.” When it claimed that “life is but a dream,” I took that literally to mean that all which we experience is illusion, specter, artifice. In my own abstract way I assumed that, according to the song, all of this which we see, the sun and moon, the trees and flowers, our friends and family, is but a dream. And I wondered what it would be like when I woke up, and to whom I would recount that marvelous dream. “I had the strangest dream last night,” I imagined telling faces unknown with names unconveyed. I assumed the song meant that all of this, for all of us, was a dream — and who is to know what that world might look like when you wake up? Such a theme is explored in pop culture from the cyberpunk dystopia The Matrix to the sitcom finale of Newhart, because this sense of unreality, of dreams impinging on our not-quite-real world, is hard to shake. Writing about a classic metaphysical thought experiment known as the Omphalos Argument (from the Greek for “navel,” as it relates to a question about Eden), the philosopher Bertrand Russell wrote in The Analysis of Mind that “There is no logical impossibility… that the world sprang into being five minutes ago, exactly as it then was, with a population that ‘remembered’ a wholly unreal past.” Perhaps we’ve just dozed off for a few minutes then? Here’s the thing though — even if all of this is a dream, it doesn’t matter. Because in dreams you’re innocent. In dreams you’re free.

Image credit: Pexels/Erik Mclean.

A Year in Reading: Ed Simon

-

So. How are we expected to begin these things? How can I write about reading in this year of all years, this Annus Horribilis of American authoritarianism, American division, American plague? There’s no judgment in that question — it’s genuine. Because not to state the obvious would be callous: at the time of this writing there have been a quarter of a million deaths that were largely preventable, had there only been a modicum of concern from both the government and the collective citizenry.

At the same time, to wallow in all of that misfortune, the pandemic death count rising, the spate of police murders of Black citizens, the brazen incitements to violence from the thankfully defeated president, could just be more fodder for doomscrolling (the term popularized by the journalist Karen K. Ho). No doubt you’re familiar with this activity, for the correct answer to the question of “What did you read this year?” would be “Facebook, Reddit, and Twitter. CNN, The New York Times, and The Washington Post. Comment sections. Comment sections. Comment sections.” If anything quite expressed the emotional tenor of this wicked reality for most of us, it was the feeling of being dead-eyed and exhausted, eyeballs vibrating in their sockets and blood straining in our temples, ensconced in the cold glow of the smartphone screen as we endlessly stared at travesty after travesty. Androids with our Androids.

Being who I am, I’ve got an inclination to write about the triumph of reading, the warmth from pages expressing the ineffable separateness of these people with whom we happen to share the world for a bit. The way in which literature acts as a conduit for connection, the building of worlds with words, kingdoms of interiority claimed through the audacious act of writing, and so on. But do you know what I actually did with most of my free time? Doomscrolling. Just like you. How could it be otherwise? Companion to our worry, companion to our fear, companion to our free minutes. To endlessly scroll through our social media newsfeeds fed that demon of acedia nestled in each individual skull, simultaneously giving us the illusion of control, the strange pleasure of anxiety, and the empty calories that filled our bellies but did nothing to finally satiate our hunger.

Nothing new in this; it is what Daniel Defoe described of 1665 in his novel A Journal of the Plague Year, whereby the “apprehension of the people was likewise strangely increased… addicted to prophecies and astrological conjurations, dreams, and old wives’ tales than ever they were before or since,” something to keep in mind as I endlessly refreshed Nate Silver. It reminded me of the childhood feeling that I used to have after hours of Nintendo: that shaky, bile-stomached emotion that I imagine senior citizens feeding quarters into Atlantic City slot machines must feel. Easier to pretend that this was a type of reading; knowing facts without reflection, horror without wisdom.

Yet I did read books this year. If I’m being honest, I didn’t read terribly widely or terribly deeply, and there is a distinct before and after as regards the plague, but I still forced myself to read, even if it was at a glacial speed compared to normal, even if it was sometimes joyless. I did so because I felt that I had to, in the same way you white-knuckle it through flight turbulence by humming to yourself. I did it because I was scared that if I didn’t, I might forget how. And through that, I still had beautiful moments of reading, incandescent ones, transcendent ones. Books were still able to move me when two thousand people had died, and when two hundred thousand people had. Reading may sometimes feel like a frivolity, but it isn’t. All of that stuff I said earlier, the quasi-mocking tone about how I’m apt to argue that literature is about connection? Well, you knew I was setting that up rhetorically to knock it down. I don’t always feel that sentiment to be true, but you need not feel something to know it’s true (then again, I’ve always been a works-instead-of-faith guy). Don’t fault me for being predictable.

This is the third year I’ve been lucky enough to write one of these features for The Millions, and maybe it’s the English teacher in me, but I always have a need to tie together what I’ve read into some sort of cohesive syllabus. Summers past I used to actually theme my beach reading around subjects; one year I read novels according to the very specific criterion that they had to be about tremendous changes which happened in an instant (Tom Perrotta’s The Leftovers; Kevin Brockmeier’s The Illumination); in another season, all of the works on my docket were contemporary novels of manners (Jeffrey Eugenides’s The Marriage Plot; Dean Bakopoulos’s My American Unhappiness). This season of pandemic, it seemed that the dominant subject of the novels which I read was family.

Almost every novel which I pleasure-read in 2020 examined family in its multitudinous complexity. Happy families and broken families; families of fate and families of choice; tragic families and triumphant families. I couldn’t have known it on New Year’s Day, but there was something appropriate in this, for this year was — in all of its darkness — for many a year of family. In the elemental stillness of quarantine people got to know their families with a new intimacy (for good and bad); some broods found themselves broken, some made new again. Most crucially, and at the risk of being maudlin, the pandemic distilled to an immaculate purity the centrality of family. My family’s own year was divided by the beautiful caesura of welcoming our first child into this world, the miracle of new life deserving of every cliché that can be said about it, a grace and gift that all of the beautiful rhetoric I can muster would scarcely be worthy of.

If novels serve any purpose, it’s to act as engines of empathy (whether or not that makes the world a better place is a question for somebody of a higher pay grade), and so I was able to see a bit of myself in Jonathan Safran Foer’s description of being a new father from his doorstopper of a book Here I Am. Jacob Bloch reminisces on moments with his first son: “the smell of the back of his neck; how to collapse an umbrella stroller with one hand… the transparency of new eyelids… my own inability to forgive myself for the moments I looked away and something utterly inconsequential happened, but happened.” While Jacob and I share a parent’s love and a District of Columbia mailing address, the Blochs of Cleveland Park live in a slightly different universe from my own, though one marked by similarly tumultuous global crises, a throwback to the great male mid-century novelist canon for our century, set against the backdrop of a potentially apocalyptic war in the Middle East.

The Blochs are an unhappy family. Jacob is petty, anxious, and narcissistic; his wife Julia is unfulfilled; his father Irv is opinionated and hypocritical; his grandfather Isaac is a suicidal Holocaust survivor; his children Sam, Max, and Benjy each have their fair share of neuroses for being so young, and his Israeli cousin Tamir is simultaneously boastful and sensitive, flashy and wise. Across the daily travails of the Bloch family, from the threat of a cancelled Bar Mitzvah, the indiscretions and infidelities, and the sufferings of a beloved elderly family dog (which lent itself to one of the most moving scenes I read this year), there is the omnipresent question of Judaism and its relation to Israel, played out in a world where antisemitism is very much not a past phenomenon. Envy has always made it difficult for me to appreciate Foer, but for its occasional indulgences, Here I Am is a novel of profound beauty – especially in its dialogue, though all writers should have some humility. When Jacob gets into a fight with Max about the respective influence of Roth versus Kanye West, his son responds about the former that “First of all, I’ve never even heard of that person.”

From Cleveland Park to Harlem, Imbolo Mbue imagines a very different family experience in Behold the Dreamers, though perhaps not such a very different family (for all parents want what is good for their children). Jende Jonga has overstayed his three-month visa, having brought over from their native Cameroon his wife Neni and their young son. Jende works as a livery driver until his cousin gets him a job as a private chauffeur for Clark Edwards, an investment banker at Lehman Brothers, in 2007. Mbue depicts the ways in which money and legal status affect two radically different groups of people during the last major economic collapse. Fundamentally a novel about the American Dream, which is to say a novel about money and the way it differentiates one man from another, Behold the Dreamers movingly and brilliantly tells the sort of New York story that can be so easy to overlook.

Immigration is at the core of Behold the Dreamers — what it means to forever fear deportation, the sort of hard work that puts a pain in the back and feet that require five Tylenol at a time, the crowding of a one-bedroom Uptown apartment with husband, wife, son, and newborn daughter. So triumphant are the dreams of immigrant aspiration that there is a surreal beauty in a (c. 2008) boast that “He will take us to a restaurant in the Trump Hotel… He will hire Donald Trump himself to cook steak for us,” so that the nativist is made to humbly genuflect before the very sort of people whom he has subsequently tortured. Mbue writes about her characters with such a humane tenderness that even when they’re cruel, or shortsighted, or fearful, there is still a fundamental love which makes their full humanity apparent, so that by the conclusion a reader will even have some sympathy for the investment banker who is implicated in all that went wrong in 2008. With an almost perfect pitch for how people talk to one another, Mbue moves from the kitchens of Harlem where Cameroonians prepare ekwang and ndole, to the gilded living rooms of Park Avenue and the spacious backyards of the Hamptons. “Why did you come to America if your town is so beautiful?” Clark asks his driver. “Jende laughed, a brief uneasy laugh. ‘But sir,’ he said. ‘America is America.’”

Both of these books came to me from the neighborhood mainstay of Capitol Hill Books, across the street from the red-bricked environs of the palatial nineteenth-century Eastern Market. The proprietors of the bookstore had an ingenious concept whereby readers would fill out a form about their reading preferences, and an upper limit on how much money they’d be willing to spend, and then the staff would compile a sealed grab-bag of mystery tomes to be left in front of the store at an agreed-upon time, like some sort of illicit literary handoff. My main method of finding totally new books, not pushed by algorithm or article, was precluded after the libraries closed, and so Capitol Hill Books’ invitation to take a literary leap into the unknown was a welcome diversion. Because the store is an amazing place, only a few blocks from the Library of Congress and the Supreme Court, with creased, underlined paperback volumes crammed into every conceivable inch of the converted townhouse (including the bathroom), and because the coronavirus has demolished the economy and small-business people received little of the relief which they were due from the federal government, I’m going to feature several other independent bookstores in Washington D.C. that deserve your money more than the website named after a South American rainforest. Please consider buying from them; you don’t even have to live in the District (though of course I encourage you to buy from your own local independents — if you’re a fellow Pittsburgher I can attest to the glories of Classic Lines, Amazing Books & Records, and White Whale Bookstore).

Maybe save some of your lucre for the funky, cool Solid State Books on H Street, in the neighborhood variously called NoMa or the Atlas District, depending on which gentrifying real estate agent you talk to. Solid State Books is the type of simultaneously sleek and cozy storefront that calls for you to wander in after a dinner of Ethiopian or Caribbean food, coffee in hand, as you paw through the delicious tables of new novels. It embodies the urbanity of bookstore wandering that’s become all too rare in mid-sized American cities, and though the pandemic makes that singular joy impossible right now, Solid State is available for curbside pickup. Consider purchasing Annie Liontas’s Let Me Explain You or Mary Beth Keane’s Ask Again, Yes, two novels that share with Behold the Dreamers a sense of immigrant possibility (and failure, pain, and tribulation) in the greater New York metro area. If Mbue has a love for the city from Malcolm X Boulevard down to Washington Square Park, then Liontas looks across the Hudson to the great Jersey Purgatory of Meadowlands strip malls, oil refineries, and diners, all the way down I-95 to New York’s greatest suburb of Philadelphia. It’s there that Stavros Stavros Mavrakis owns the Gala Diner, and where, following a series of prophetic intimations concerning his impending death, he sends accusatory emails to his three daughters and his ex-wife. “I, Stavros Stavros, have ask God to erase the mistakes of my life; and God has answer, in a matter of speaking, That it is best to Start Over, which requires foremost that We End All that is Stavros Stavros. No, not with suicide. With Mercy.”

Liontas’s character is King Lear as filtered through Nikos Kazantzakis, and in her main character’s incorrigibility — his yearning, his toxicity, and his potential for grace — she writes a tragi-comic parable about the American Dream. Let Me Explain You is a fractured fairy tale recounted by Stavros Stavros and his broken, suffering, and triumphant daughters Stavroula, Litza, and Ruby. The Gala’s proprietor is one of the most distinctive voices since, well, Jonathan Safran Foer’s Ukrainian narrator Alex in Everything Is Illuminated, and Stavros Stavros’s hilarious and moving exposition marks Liontas as a major talent. Within Let Me Explain You there is an excavation of the layers of pride and woundedness, success and failure, which mark much of the immigrant experience, a digging deep into the strata of its characters’ histories. Liontas goes beyond the smudged and laminated menus of the Gala — the plates of crispy gyro meat smothered in tzatziki; the pork roll, egg, and cheese sandwiches; the disco fries covered in gravy; and the flimsy blue-and-white cups of cheap coffee with their ersatz meander design — to demonstrate that Shakespearean drama can happen even in Camden County.

Keane’s Ask Again, Yes takes place in points farther north, along the section of the Acela corridor immediately above New York, where the upwardly mobile suburbs of Westchester stretch onward from outside the Bronx to leafy Connecticut, in communities like New Rochelle, Scarsdale, and Gillam. The last is where two NYPD rookies — Francis Gleason and Brian Stanhope — who worked the same beat together in the 1970s Death Wish era of urban blight, coincidentally find themselves neighbors, both following a suburban dream of fenced-in lawns, Fourth of July grilling, and strip mall supermarkets. Like both Stavros Stavros and Jende, Francis is also an immigrant, this time from the west of Ireland. “One minute he’d been standing in a bog on the other side of the Atlantic,” Keane writes, “and the next thing he knew he was a cop. In America. In the worst neighborhood of the best known city in the world.”

A reserved man, Francis isn’t particularly fond of Brian’s American volume, or of the latter’s erratic wife Anne Stanhope, who like Francis was also Irish-born. Despite Francis’ reservations about the Stanhopes, their children — young Kate Gleason and Peter Stanhope — develop an intense adolescent romance that spans decades and has combustible implications for the families. The story features a single instance of incredible violence, the trauma of which alters both the Gleasons and the Stanhopes, forcing them to ask how life is lived after such a rupture. Keane’s novel is that rare thing in our contemporary era, in which the culture industry has for too long been obsessed with anti-heroes and gentle nihilism: a narrative of genuine moral significance, just as concerned with redemption as damnation, that takes contrition as seriously as that which gets you to the point where grace is even necessary.

If you still haven’t gotten New York City out of your system, and if pandemic restrictions have you missing colleges and universities (as Zoom instruction is inevitably so much more anemic), then consider picking up a copy of James Gregor’s campus novel Going Dutch from East City Bookshop. A charming Capitol Hill mainstay, half-descended into a basement right on Pennsylvania Avenue, not far from the string of restaurants and shops known as Barracks Row, East City Bookshop has excellent sections of history, politics, and contemporary novels, and is the sort of place where you can get twee mugs produced by the Unemployed Philosophers’ Guild. It’s the sort of bookstore that, if it were in the Village, could predictably be perused by Gregor’s characters Richard and Anne, two New York University comparative literature grad students who enter into a strange psychosexual affair. Both are working on their dissertations in medieval Italian literature, but only Anne can be said to have any preternatural talent in her scholarship, which Richard is more than happy to exploit in his own research. While Richard unsuccessfully flits through Grindr, he and Anne fall closer and closer together, the two eventually agreeing to a relationship that is equal parts sex and plagiarism. “Part of him found her annoying,” Gregor writes of Richard’s feelings towards Anne, “another part was curious to observe her. There was something both needling and captivating about her that he couldn’t explain… emitting waves of musky, indeterminately foreign glamor… [he] found himself strangely excited by her presence in the classroom. It wasn’t attraction exactly, but he felt the blurred outlines of that category.” Anne is a very particular type of paradoxically worldly ingenue, a spinster with an edge, and her relationship with Richard falls deeper and deeper into pathology and the pathetic.

Washington D.C. and Los Angeles are some 2,654 miles apart, but a visit to Dupont Circle’s classic Kramer’s (officially known, because of the coffee bar it features, as Kramer Books and Afterwords) can bestow upon you sunny California in novel form, with three titles that feature the Golden State in all of its seedy resplendence – Tracy Chevalier’s At the Edge of the Orchard, Patrick Coleman’s The Churchgoer, and The Millions’ staff writer Edan Lepucki’s Woman No. 17. District hullabaloo had it that the storied Kramer’s was potentially going to leave its Dupont Circle location, which would have made the neighborhood infinitely poorer, but luckily the owners opted to continue their lease on the storefront where Monica Lewinsky once purchased a copy of Walt Whitman’s Leaves of Grass for Bill Clinton. Once our plague year has ended, shoppers will still be able to stop into the Connecticut Avenue location in this neighborhood of embassies and gay bars, and pick up any of the aforementioned California titles (in the meantime, consider ordering them online).

For pure folkloric Americana, Chevalier’s At the Edge of the Orchard is an equally beautiful and brutal novel, immaculate in its consummate weirdness. Chevalier recounts the tale of Robert Goodenough, son of Ohio apple growers James and Sadie Goodenough, who in the decade before the Civil War searches for tree saplings in northern California on behalf of a British naturalist who sells them to countrymen with the unusual desire to grow sequoias and redwoods on the grounds of English country estates. While traipsing through the hills north of San Francisco, humbled by the forest cathedrals of the redwoods, Robert relives the traumas of the unspeakable domestic violence in the frontier country which left him an orphan. “Though grafted at the same time, they had grown up to be different sizes; it always surprised James that the tree could turn out as varied as his children.” Chevalier’s novel examines the ways that human motivations can be as unpredictable as the route that branching roots might take, pruning back the exigencies of an individual human life to an elemental, almost folkloric essence, and testing the soil of myth and memory to write a luminescent novel that’s part fairy-tale, part parable, part Greek tragedy, and part Western.

A different American myth is explored in Coleman’s The Churchgoer, a brilliant neo-noir that, true to that venerable genre’s greatest convention, places its seedy subject matter of sex and criminality in the estimably pleasant, sunny, forever-75-degrees environs of southern California. Mark Haines is a recovering alcoholic and drug addict, a night watch security guard, a San Diego beach bum, and a former youth pastor who has lost any faith in the God that failed him. He becomes embroiled in the affairs of a mysterious and beautiful young runaway (as one does) named Cindy Liu, a woman who comes from the same world of evangelical platitudes and megachurch hypocrisies as he does, and when she goes missing and his night watch partner is murdered (perhaps connected?) Haines embarks on an investigation every bit worthy of Dashiell Hammett or Raymond Chandler. Reflecting on a former parishioner who may be involved in sundry affairs, Mark notes that “I didn’t like any of this. I didn’t like being questioned… If they wanted to know what he was afraid of when he was seventeen, what he asked for prayers about, how many times a week on average he committed the sin of self-pollution against his better intentions, I could dig all that out from somewhere in my brain… [but] Confession usually pulled up well short of the deeper truth.” The true pleasure of Coleman’s novel isn’t plot (though the speed with which the pages turn would recommend it for that alone), but rather language, which is always true of the best noir books. The Churchgoer tastes like a gulp of cold black coffee at an AA meeting into which a cigarette has been cashed; it sounds like the static of a television left on until 3 a.m. and the hum of a neon light in the window of an Oceanside dive; it feels like insomnia and paranoia.

Lepucki makes great use of the oppressive sunlight of California in her Hitchcockian domestic tragicomedy Woman No. 17. In this, her second novel after the excellent post-apocalyptic California, Lepucki explores the sultry side of the Hollywood Hills, where wealthy writer Lady Daniels hires a college student as a live-in nanny to care for her young son while the former finishes an experimental memoir, made possible by alimony from her still-close film producer ex-husband. “It was summer. The heat had arrived harsh and bright, bleaching the sidewalks and choking the flowers before they had a chance to wilt… I preferred to stay at home: ice cubes in the dog bowl, Riesling in the freezer,” Lady says. Alternating between Lady and S., the art student whom she hires without a proper vetting, Woman No. 17 explores the intersections of obsession and sexuality, transgression and performance, in recounting how S. becomes increasingly unhinged in an “art project” which involves imitating her alcoholic mother and seducing Lady’s mute, adolescent, older son. As At the Edge of the Orchard explores the traumas of family, and The Churchgoer examines what it means both to be rejected by family and to construct a new family of your own volition, so too does Lepucki interrogate the illusions of intimacy and the way in which the mask we choose to wear can quickly become our face.

As the final two novels I’m writing about take as their subject the very soul of the nation, I recommend that you put in an order to buy Nell Zink’s Doxology and Kathleen Alcott’s America Was Hard to Find at the District of Columbia literary institution of Politics and Prose. Perhaps the most foundational of bookstores in the D.C. literary ecosystem, Politics and Prose shares a Cleveland Park setting (or at least half of one) with Zink’s much anticipated novel, while Alcott’s America Was Hard to Find ranges over the entire continent, and the surface of the moon as well. Drawing its title from a poem by the radical priest and anti-Vietnam War activist Father Daniel Berrigan, Alcott’s novel is a bildungsroman for the American century. Audaciously reimagining the last fifty years of history, America Was Hard to Find tells the story of the brief liaison of Air Force pilot Vincent Kahn and bartender Fay Fern, which results in the birth of their illegitimate son Wright. Kahn goes on to become the first man to walk on the moon, and Fay a domestic terrorist in a far-left group similar to the Weather Underground or the Symbionese Liberation Army. It’s easy to imagine the two as proxies for a type of Manichean struggle in the American spirit – the square astronaut and the radical hippie. Yet Alcott is far too brilliant an author to pen simple allegory or didactic parable, for America Was Hard to Find is the sort of novel where mystery and the fundamental unknowability of both the national psyche and those of the people condemned to populate it are expressed in shining prose on every page.

“The moon was everything he had loved about the high desert,” Alcott writes of Kahn’s first sojourn on that celestial body, “where nothing was obscured, available to you as far as you wished to look, but cast in tones that better fit the experience, the grays that ran from sooty to metallic, the pits dark as cellars. Most astonishing was the sky, a black he had never seen before, dynamic and exuberant. With a grin he realized the only apt comparison. It was glossy like a baby girl’s church shoes – like patent leather.”

Alcott’s prose is so lyrical, so gorgeous, that it can be almost excruciating to read (I mean this as a compliment), a work so perfectly poetic that a highlighter would run out of ink before you’re a tenth of the way through the novel. There are scenes of arresting, heartbreaking beauty, none more so than those recounting the doomed life of Wright, a gay man who perishes in our country’s last plague. “There is a kind of understanding that occurs just after,” writes Alcott, “If we are lucky, we catch it at the door on our way out, watch it enter the rooms we have left. It is not always possible to tell the exact moment you have separated from the earth. So much of what we know for certain is irrelevant by the time we know it.”

True to its title, there is something almost sacramental in Zink’s Doxology, with its poignant ruminations on both ecology and aesthetics as told through a generation-spanning story focused on Pam and Daniel Svoboda and their precocious daughter Flora. Pam and Daniel were originally two-thirds of a Lower East Side rock band of the ’80s and ’90s, situated somewhere on the spectrum between post-punk and grunge; the final member of their trio is Joe, a gentle musical genius with undiagnosed Williams Syndrome who was the only one to go on to any type of success before overdosing on September 11, 2001. Split between New York City and the Washington D.C. of Pam’s Fugazi-listening, Adams Morgan-clubbing youth, Doxology is an ultimately uncategorizable book about the connections of family forged in hardship and the transcendent power of creation. Zink’s narration is refreshingly Victorian, having no problem dwelling in exposition and displaying the full omniscience we require of our third-person narrators (though her Author as God has a sense of humor). Daniel “was an eighties hipster. But that can be forgiven, because he was the child of born-again Christian dairy-farm workers from Racine, Wisconsin,” while Joe’s “father was a professor of American history at Columbia, his mother had been a forever-young party girl in permanent overdrive who could drink all night, sing any song and fake the piano accompaniment, and talk to anybody about anything. In 1976 she died.”

Contrary to the order in which I’ve recounted this syllabus, I read Doxology in January, and as with Lauren Groff’s excellent speculative epic Arcadia, Zink’s novel moves into the near future from its 2019 publication date. Recounting the effect that historical events like Desert Storm, 9/11, and the financial collapse of 2008 have on the Svobodas, not to mention the election of Donald J. Trump, Doxology ends in the summer of 2020, a year after it was written and half a year after I read it. Flora lives in Washington, having been effectively raised by her grandparents, and in our infernal year as imagined by Zink she is a wounded environmental activist living in the Trumpian twilight. “On the last Wednesday in July, Washington was bathed in an acrid mist. The roses and marble facades stood sweating in air that stank of uncertainty. It was a smell that ought to be rising from burning trash, not falling from the sky as fawn-colored haze.”

Some sort of ecological catastrophe has befallen the United States – perhaps a meltdown at a nuclear power plant – and the burnt ochre sun struggling through pink overcast skies speaks to the omnipresence of death. The Trump administration, of course, denies any knowledge, telling people that they should simply live their lives, and FOX News runs exposés about noodle thickness rather than the radioactive plume which seems to be spreading over the east coast. Such is the uncanny prescience that only a brilliant writer can impart; I remember finishing Zink’s novel and wondering what awaited us in the months ahead. Unnerving to think of it now, but when I read Doxology I’d yet to have worn a face mask outside, or heard of “social distancing.” I’d yet to feel the itchy anxiety that compels one to continually use hand-sanitizer, or to flinch whenever you hear a cough during the few minutes a day when your dog’s bladder compels you to leave your apartment. When I read Doxology, already fearful for the year ahead, not a single American had yet died of this new disease, and I hadn’t yet heard the word coronavirus.


Who’s Afraid of Theory?

-

In a fit of pique, the editors of the journal Philosophy and Literature ran a “Bad Writing Contest” from 1995 to 1998 to highlight jargony excess among the professoriate. Inaugurated during the seventh inning of the Theory Wars, Philosophy and Literature placed itself firmly amongst the classicists, despairing at the influence of various critical “isms.” For the final year that the contest ran, the “winner” was Judith Butler, then a Berkeley philosophy professor and author of the classic work Gender Trouble: Feminism and the Subversion of Identity. The selection which caused such tsuris was from the journal Diacritics, a labyrinthine sentence where Butler opines that the “move from a structuralist account in which capital is understood to structure social relations in relatively homologous ways to a view of hegemony in which power relations are subject to repetition, convergence, and rearticulation brought the question of temporality into the thinking of structure,” and so on. If the editors’ purpose was to mock Latinate diction, then the “Bad Writing Contest” successfully made Butler the target of sarcastic opprobrium, with editorial pages using the incident as another volley against the “fashionable nonsense” (as Alan Sokal and Jean Bricmont called it) supposedly reigning ascendant from Berkeley to Cambridge.

The Theory Wars, that is the administrative argument over what role various strains of 20th-century continental European thought should play in the research and teaching of the humanities, have never exactly gone away, even while departments shutter and university work is farmed out to poorly-paid contingent faculty. Today you’re just as likely to see aspersions on the use of critical theory appear in fevered, paranoid Internet threads warning about “Cultural Marxism” as you are on the op-ed pages of the Wall Street Journal, even while at many schools literature requirements are being cut, which makes the whole debate feel more like a Civil War reenactment than the Battle of Gettysburg. In another sense, however, Butler’s partisans seem to have very much won the argument of the ’80s and ’90s—as sociologically inflected Theory-terms from “intersectionality” to “privilege” have migrated from Diacritics to Twitter (though often as critical malapropism)—ensuring that this war of attrition isn’t headed to armistice anytime soon.

So, what exactly is “Theory?” For scientists, a “theory” is a model based on empirical observation that is used to make predictions about natural phenomena; for the lay-person a “theory” is a type of educated guess or hypothesis. For practitioners of “critical theory,” the phrase means something a bit different. A critical theorist traffics in interpretation, engaging with culture (from epic poems to comic books) to explain how its social context allows or precludes certain readings, beyond whatever aesthetic affinity the individual may feel. Journalist Stuart Jeffries explains the history (or “genealogy,” as they might say) of one strain of critical theory in his excellent Grand Hotel Abyss: The Lives of the Frankfurt School, describing how a century ago an influential group of German Marxist social scientists, including Theodor Adorno, Max Horkheimer, Walter Benjamin, and Herbert Marcuse, developed a trenchant vocabulary for “what they called the culture industry,” so as to explore “a new relationship between culture and politics.” At the Frankfurt Institute for Social Research, a new critical apparatus was developed for the dizzying complexity of industrial capitalism, and so words like “reify” and “commodity fetish” (as well as that old Hegelian chestnut “dialectical”) became humanistic bywords.

Most of the original members of the Frankfurt School were old-fashioned gentlemen, more at home with Arnold Schoenberg’s 12-tone avant-garde than with Jelly Roll Morton and Bix Beiderbecke, content to read Thomas Mann rather than Action Comics. Several decades later, a different institution, the Centre for Contemporary Cultural Studies at the University of Birmingham in the United Kingdom, would apply critical theory to popular culture. These largely working-class theorists, including Stuart Hall, Paul Gilroy, Dick Hebdige, and Angela McRobbie (with a strong influence from Raymond Williams), would use a vocabulary similar to that developed by the Frankfurt School, but they’d extend the focus of their studies into considerations of comics and punk music, slasher movies and paperback novels, while also bringing issues of race and gender to bear in their writings.

In rejecting the elitism of their predecessors, the Birmingham School democratized critical theory, so that the Slate essay on whiteness in Breaking Bad or the Salon hot take about gender in Game of Thrones can be traced on a direct line back through Birmingham. What these scholars shared with Frankfurt, alongside a largely Marxian sensibility, was a sense that “culture was an important category because it helps us to recognize that one life-practice (like reading) cannot be torn out of a large network constituted by many other life-practices—working, sexual orientation, [or] family life,” as elucidated by Simon During in his introduction to The Cultural Studies Reader. For thinkers like Hall, McRobbie, or Gilroy, placing works within this social context wasn’t necessarily a disparagement, but rather the development of a language commensurate with explaining how those works operate. With this understanding, saying that critical theory disenchants literature would be a bit like saying that astronomical calculations make it impossible to see the beauty in the stars.

A third strain influenced “Theory” as it developed in American universities towards the end of the 20th century, and it’s probably the one most stereotypically associated with pretension and obfuscation. From a different set of intellectual sources, French post-structural and deconstructionist thought developed in the ’60s and ’70s at roughly the same time as the Birmingham School. Sometimes broadly categorized as “postmodernist” thinkers, French theory included writers of varying hermeticism like Jacques Derrida, Michel Foucault, Gilles Deleuze, Jean-François Lyotard, Jacques Lacan, and Jean Baudrillard, who supplied English departments with a Gallic air composed of equal parts black leather and Gauloises smoke. François Cusset provides a helpful primer in French Theory: How Foucault, Derrida, Deleuze, & Co. Transformed the Intellectual Life of the United States, the best single-volume introduction to the subject. He writes that these “ten or twelve more or less contemporaneous writers,” despite their not inconsiderable differences, are united by a “critique of the subject, of representation, and of historical continuity,” with their focus the “critique of ‘critique’ itself, since all of them interrogate in their own way” the very idea of tradition. French theory was the purview of Derridean deconstruction, or of Foucauldian analysis of social power structures, the better to reveal the clenched fist hidden within a velvet glove (and every fist is clenched). For traditionalists the Frankfurt School’s Marxism (arguably never all that Marxist) was bad enough; with French theory there was a strong suspicion of at best relativism, at worst outright nihilism.

Theory has an influence simultaneously more and less enduring than is sometimes assumed. Its critics in the ’80s and ’90s warned that it signaled the dissolution of the Western canon, yet I can assure you from experience that undergraduates never stopped reading Shakespeare, even if a chapter from Foucault’s Discipline and Punish might have made it onto the syllabus (and it bears repeating that, contra his reputation for difficulty, Foucault was a hell of a prose stylist). But if current online imbroglios are any indication, its influence has been wide and unexpected, for as colleges pivot towards a business-centered STEM curriculum, the old fights about critical theory have simply migrated online. Much of the criticism against theory in the first iteration of this dispute was about what such thinkers supposedly said (or what people thought they were saying), but maybe even more vociferous were the claims about how they were saying things. The indictment of theory then becomes not just an issue of metaphysics, but one of style. It’s the claim that nobody can argue with a critical theorist because the writing itself is so impenetrable, opaque, and confusing. It’s the argument that if theory reads like anything, it reads like bullshit.

During the height of these curricular debates there was a cottage industry of books that tackled precisely this scholarly rhetoric, not least of which were conservative screeds like Allan Bloom’s The Closing of the American Mind: How Higher Education Has Failed Democracy and Impoverished the Souls of Today’s Students and E.D. Hirsch Jr.’s The Dictionary of Cultural Literacy. Editors Will H. Corral and Daphne Patai claim in the introduction to their pugnacious Theory’s Empire: An Anthology of Dissent that “Far from responding with reasoned argument to their critics, proponents of Theory, in the past few decades, have managed to adopt just about every defect in writing that George Orwell identified in his 1946 essay ‘Politics and the English Language.’” D.G. Myers in his contribution to the collection (succinctly titled “Bad Writing”) excoriates Butler in particular, writing that the selection mocked by Philosophy and Literature was “something more than ‘ugly’ and ‘stylistically awful’… [as] demanded by the contest’s rules. What Butler’s writing actually expresses is simultaneously a contempt for her readers and an absolute dependence on their good opinion.”

Meanwhile, the poet David Lehman parses Theory’s tendency towards ugly rhetorical self-justification in Signs of the Times: Deconstruction and the Fall of Paul de Man, in which he recounts the sordid affair whereby a confidant of Derrida and esteemed Yale professor was revealed to have written Nazi polemics during the German occupation of his native Belgium. Lehman also provides ample denunciation of Theory’s linguistic excess, writing that for the “users of its arcane terminology it confers elite status… Less a coherent system of beliefs than a way of thinking.” By 1996 even Duke University English professor Frank Lentricchia (in a notoriously Theory-friendly department) would snark, in his Lingua Franca essay “Last Will and Testament of an Ex-Literary Critic” (reprinted in Quick Studies: The Best of Lingua Franca), “Tell me your theory and I’ll tell you in advance what you’ll say about any work of literature, especially those you haven’t read.”

No incident did more to illustrate for the public the apparent vapidity of Theory than the so-called “Sokal Affair” of 1996, when New York University physics professor Alan Sokal wrote a completely meaningless paper composed in a sarcastic pantomime of critical theory-speak, entitled “Transgressing the Boundaries: Toward a Transformative Hermeneutics of Quantum Gravity,” which was accepted for publication in the prestigious (Duke-based) journal Social Text, with his hoax simultaneously revealed by Lingua Franca. Sokal’s paper contains exquisite nonsense such as the claim that “postmodern sciences overthrow the static ontological categories and hierarchies characteristic of modernist science” and that “these homologous features arise in numerous seemingly disparate areas of science, from quantum gravity to chaos theory… In this way, the postmodern sciences appear to be converging on a new epistemological paradigm.” Sokal’s case against Theory is also, fundamentally, about writing. He doesn’t just attack critical theory for what he perceives as its dangerous relativism, but also at the level of composition, writing in Fashionable Nonsense: Postmodern Intellectuals’ Abuse of Science that such discourse, “exemplified by the texts we quote, functions in part as a dead end in which some sectors of the humanities and social sciences have gotten lost.” He brags that “one of us managed, after only three months of study, to master the postmodernist lingo well enough to publish an article in a prestigious journal.” Such has long been the conclusion of many: that Theory is a kind of philosophical Mad Libs disappearing up its own ass, accountable to nobody but itself and the departments that coddle it. Such was the sentiment which inspired the programmers of the Postmodern Essay Generator, which as of 2020 is still algorithmically throwing together random Theory words to create full essays with titles like “Deconstructing Surrealism: Socialism, surrealism and deconstructivist theory” (by P. Hans von Ludwig) and “Social realism and the capitalist paradigm of discourse” (by Agnes O. McElwaine).

Somebody’s thick black glasses would have to be on too tight not to see what’s funny in this, though there’s more than a bit of truth in the defense of Theory that says such denunciations are trite, an instance of anti-intellectualism as much as its opposite. Defenses of Theory in the wake of Sokal’s ruse tended to, not unfairly, query why nobody questions the rarefied and complex language of the sciences but blanches when the humanities have a similarly baroque vocabulary. Status quo objections to that line of thinking tend to emphasize the humanness of the humanities; the logic being that if we’re all able to be moved by literature, we have no need for experts to explain how a work of literature operates (as if being in possession of a heart would make one a cardiologist). Butler, for her part, answered the criticism leveled against her prose style in a (well-written and funny!) New York Times editorial, where she argues, following a line of Adorno’s reasoning, that complex prose is integral to critical theory because it helps to make language strange, and forces us to interrogate that which we take for granted. “No doubt, scholars in the humanities should be able to clarify how their work informs and illuminates everyday life,” Butler admits, “Equally, however, such scholars are obliged to question common sense, interrogate its tacit presumptions and provoke new ways of looking at a familiar world.”

With all of which I heartily agree, but that doesn’t mean that the selection of Butler’s mocked by Philosophy and Literature is any good. It costs me little to admit that the sentence is at best turgid, obtuse, and inelegant, and at worst utterly incomprehensible. It costs me even less to admit that that’s probably because it’s been cherry-picked, stripped of context, and labeled as such so that it maximizes potential negative impressions. One can defend Butler—and Theory—without justifying every bit of rhetorical excess. Because what some critics disparage about Theory—its obscurity, its rarefied difficulty, its multisyllabic technocratic purpleness—is often true. When I arrived in my master’s program, in a department notoriously Theory-friendly, I blanched as much as Allan Bloom would have upon being invited to roadie for the Rolling Stones’ Steel Wheels Tour. For an undergraduate enmeshed in the canon, and still enraptured by that incredibly old-fashioned (but still intoxicating) claim of the Victorian critic Matthew Arnold in Culture and Anarchy that the purpose of education was to experience “the best which has been thought and said,” post-structuralism was a drag. By contrast, many of my colleagues, most of them in fact, loved Theory; they thrilled to its punkish enthusiasms, its irony-laden critiques, its radical suspicion of the best of which has been thought and said. Meanwhile I despaired that there were no deconstructionists in Dead Poets Society.

I can no longer imagine that perspective. It’s not quite that I became a “Theory Head,” as one calls all of those sad young men reading Deleuze and Félix Guattari while smoking American Spirit cigarettes, but I did learn to stop worrying and love Theory (in my own way). What I learned is that Theory begins to make sense once you learn the language (whether it takes you three months or longer), and that it’s innately, abundantly, and estimably useful when you have to actually explain how culture operates, not just whether you happen to like a book or not. A poet can write a blazon for her beloved, but an anatomist is needed to perform the autopsy. Some of this maturity came in realizing that literary criticism has always had its own opacity; that if we reject “binary opposition,” we would have to get rid of “dactylic hexameter” as well. The humanities have always invented new words to describe the things of this world that we experience in culture. That’s precisely the practice attacked by John Martin Ellis, who in his jeremiad Against Deconstruction took on Theory’s predilection towards neologism, opining that “there were plenty of quite acceptable ordinary English words for the status of entrenched ideas and for the process of questioning and undermining them.” All of that difference, all of that hegemony, and so much phallogocentrism… But here’s the thing—sometimes heteroglossia by any other name doesn’t smell as sweet.

There is something anachronistic in proffering a defense of Theory in the third decade of the new millennium; something nostalgic or even retrograde. Who cares anymore? Disciplinary debates make little sense as the discipline itself has imploded, and the anemic cultural studies patois of the Internet hardly seems to warrant the same reflection, either in defense or condemnation. In part, though, I’d suggest that it’s precisely the necessity of these words, and their popularity among those who learned them through cultural osmosis rather than instruction, that necessitates a few statements in their exoneration. All of the previous arguments on their behalf—that the humanities require their own jargon, that this vocabulary provides an analytical nuance that the vernacular doesn’t—strike me as convincing. And the criticism that an elite coterie uses words like “hegemonic” as a shibboleth is also valid, but that’s not an argument to abandon the words—it’s an argument to instruct more people in what they mean.

But I’d like to offer a different claim to utility, and it’s that Theory isn’t just useful, but beautiful. Reading the best of Theory is more like reading poetry than philosophy, and all of those chewy multisyllabic words can be like honey in the mouth. Any student of linguistics or philology—from well before Theory—understands that synonyms are mythic and that an individual word has a connotative life that is rich and unique. Butler defends the Latinate, writing that for a student “words such as ‘hegemony’ appears strange,” but that they may discover that beyond its simpler meaning “it denotes a dominance so entrenched that we take it for granted, and even appear to consent to it—a power that’s strengthened by its invisibility.” Not only that, I’d add that “hegemony,” with its angular consonants hidden like a sharp rock in the middle of a snowball, conveys a sense of power beyond either brute strength or material plenty. Hegemony has something of the mysterious about it, the totalizing, the absolute, the wickedly divine. To simply replace it with the word “power” is to drain it of its impact. I’ve found this with many of those words; that they’re like occult tone poems conveying a hidden and strange knowledge; that they’re able to give texture to a picture that would otherwise be flat. Any true defense of Theory must, I contend, give due deference to the sharp beauty that these sometimes-hermetic words convey.

As a totally unscientific sample, I queried a number of my academic (and recovering academic) colleagues on social media to see what words they would add to a list of favorite terms; the jargon that others might roll their eyes at, or hear as grad school clichés, but that is estimably useful, and dare I say it—beautiful. People’s candidates could be divided in particular ways, including words that remind us of some sort of action, words that draw strength from an implied metaphorical imagery, and words that simply have an aural sense that’s aesthetically pleasing (and these categories are by no means exhaustive or exclusive). For example, Derrida’s concept of “deconstruction,” a type of methodological meta-analysis that reveals internal contradictions within any text, so as to foreground interpretations that might be hidden, was a popular favorite. “Deconstruction” sounds like an inherently practical term, a word that contractors rather than literary critics might use; the prefix connotes ripping things down while the rest of the word gestures towards building them (back?) up. A similar word that several respondents mentioned, albeit one with less of a tangible feel to it, was “dialectics,” which was popularized in the writings of the 19th-century German philosopher Georg Wilhelm Friedrich Hegel, was mediated through Karl Marx, and was then applied to everything by the Frankfurt School. As with many of these terms, “dialectics” has variable meaning depending on who is using it, but it broadly refers to an almost evolutionary process whereby the internal contradictions of a concept are reconciled, propelling thought into the future. For the materialist deployment of the term by Marx and his followers, the actual word has an almost mystical gloss to it, the trochaic rhythm of the word with its up-down-up-down beat evoking the process of thesis-antithesis-synthesis to which the term itself applies. Something about the very sound of “dialectic” evokes both cutting and burying to me, the psychic struggle that the word is supposed to describe.

Then there are the words fueled with metaphorical urgency, short poems in their own right, often appropriated from other disciplines. Foucault used words like “genealogy” or “archeology” when some might think that “history” would be fine, and yet those words do something subtly different than the plodding narrative implied by the more prosaic word. With the former there is a sense of telling a story that connects ideas, trends, and themes within a causal network of familial relations; the latter recalls excavation and the revealing of that which remains hidden (or cursed). Deleuze and Guattari borrowed the term “rhizome” from botany, where it originally described the complex branching of root systems, now reapplied to how non-hierarchical systems of knowledge propagate. “Rhizome” pays homage to something of beauty from a different way of understanding the world—it is not filching, it is honoring. The Italian Marxist Antonio Gramsci similarly borrowed the term “subaltern,” later popularized by Gayatri Chakravorty Spivak, for whom it came to designate communities of colonized people who are simultaneously exoticized and erased by imperial powers. The word itself was a term used for junior officers in the British colonial service. Finally, I’m partial to “interiority” myself, used to denote fictional representations of consciousness or subjectivity. Yet “interiority,” with its evocation of a deep subterranean network or the domestic spaces of a many-roomed mansion, says something about consciousness that the more common word doesn’t quite.

My favorite critical jargon word, however, is “liminal.” All of us who work on academic Grub Street have our foibles, the go-to scholarly tics marking our prose like an oily fingerprint left on Formica. We all know the professor with their favored jargon term (often accompanied by an equivalent hand movement, like an intricate form of Neapolitan), or the faculty member who might be given to yelling out “Hegemonic!” at inopportune times. Thus, I can’t help but sprinkle my own favored term into my writing like paprika in Budapest goulash. My love for the word, used to designate things that are in-between, transitioning, and not quite formed, has less to do with its utility than with the mysterious sense of the sounds that animate it. It’s always been oddly onomatopoeic to me, maybe because it’s a near homophone of “illuminate,” and makes me think of dusk, my favorite time of day. When I hear “liminal” it reminds me of moonbeams and cicadas at sunset; it reminds me that the morning star still endures even at dawn. An affection for the term has only a little to do with what’s useful about it, and everything to do with that connotative ladder that stretches out beyond its three syllables. I suspect that when we love these words, this jargon, it’s an attraction to their magic, the uncanny poetry hidden behind the seemingly technocratic. The best of Theory exists within that liminal space, between criticism and poetry; justifying itself by recourse to the former, but always actually on the side of the latter—even if it doesn’t know it.


On Obscenity and Literature

-

“But implicit in the history of the First Amendment is the rejection of obscenity as utterly without redeeming social importance.” —Associate Justice William J. Brennan Jr., Roth v. United States (1957)

Interviewer: Speaking of blue, you’ve been accused of vulgarity. Mel Brooks: Bullshit! —Playboy (February, 1975)

On a spring evening in 1964 at the Café Au Go Go in Greenwich Village, several undercover officers from the NYPD’s vice squad arrested Lenny Bruce for public obscenity. Both Bruce and the club’s owner Howard Solomon were shouldered out through the crowded club to waiting squad cars, their red and blue lights reflected off of the dirty puddles pooled on the pavement of Bleecker Street. For six months the two men would stand trial, with Bruce’s defense attorney calling on luminaries from James Baldwin to Allen Ginsberg, Norman Mailer to Bob Dylan, to attest to the stand-up’s right to say whatever he wanted in front of a paying audience. “He was a man with an unsettling sense of humor,” write Ronald K.L. Collins and David M. Skover in The Trials of Lenny Bruce: The Fall and Rise of an American Icon. “Uncompromising, uncanny, unforgettable, and unapologetic… His words crossed the law and those in it. He became intolerable to people too powerful to ignore. When it was over, not even the First Amendment saved him.” The three-judge tribunal sentenced Bruce to four months’ punishment in a workhouse. Released on bail, he never served a day of his conviction, overdosing on morphine in his Hollywood Hills bungalow two years later. He wouldn’t receive a posthumous pardon until 2003.

“Perhaps at this point I ought to say a little something about my vocabulary,” Bruce wrote in his (still very funny) How to Talk Dirty and Influence People: An Autobiography. “My conversation, spoken and written, is usually flavored with the jargon of the hipster, the argot of the underworld, and Yiddish.” Alongside jazz, Jewish-American comedy is one of the few uniquely American contributions to world culture, and if that comparison can be drawn further, then Bruce was the equivalent of Dizzy Gillespie or Miles Davis—he was the one who broke it wide open. Moving comedy away from the realm of the Borscht Belt one-liner, Bruce exemplified the emerging paradigm of stand-up as a spoken-word riff of personal reflection and social commentary, while often being incredibly obscene. The Catskills comedian Henny Youngman may have been inescapably Jewish, but Bruce was unabashedly so. And, as he makes clear, his diction proudly drew from the margins, hearing more truth in the dialect of the ethnic Other than in mainstream politeness, more honesty in the junky’s language than in the platitudes of the square, more righteous confrontation in the bohemian’s obscenity than in the pieties of the status quo. Among the comics of that golden age of stand-up, only Richard Pryor was his equal in bravery and genius, and though some of his humor is dated today, books like How to Talk Dirty and Influence People still radiate an excitement that a mere burlesque performer could challenge the hypocrisy and puritanism of a state that would just as soon see James Joyce’s Ulysses and D.H. Lawrence’s Lady Chatterley’s Lover banned and their publishers hauled to jail as actually confront any of the social ills that infected the body politic.

What separates Bruce from any number of subsequent comics is that within his performances there was a fully articulated theory of language. “Take away the right to say the word ‘fuck’ and you take away the right to say ‘fuck the government,’” he is reported to have said, and this is clearly and crucially true. That’s one model of obscenity’s utility: its power to lower the high and to raise the low, with vulgarity afforded an almost apocalyptic power of resistance. There is a naivety, however, that runs through the comedian’s work, which is that Bruce sometimes doesn’t afford language enough power. In one incendiary performance from the early ’60s, Bruce went through a litany of ethnic slurs for Black people, Jews, Italians, Hispanics, Poles, and the Irish, finally arguing that “it’s the suppression of the word that gives it the power, the violence, the viciousness.” He imagines a scenario whereby the president would introduce members of his cabinet by using those particular words, and concludes that following such a moment those slurs wouldn’t “mean anything anymore, then you could never make some six-year-old black kid cry because somebody called him” that word at school. Bruce’s idealism is almost touching—let it not be doubted that he genuinely believed language could work in this way—but it’s also empirically false. Having died a half-century ago, he can’t be faulted for his ignorance on this score, but now that we have a president who basically does what Bruce imagined his hypothetical Commander-in-Chief doing, I think we can emphatically state that the repetition of such ugliness does nothing to dispel its power.

Discussions about obscenity often devolve into this bad-faith dichotomy—the prudish schoolmarms with their red pens painting over anything blue and the brave defenders of free speech pushing the boundaries of acceptable discourse. The former hold that there is a certain power to words that must be tamed, while the latter champion the individual right to say what they want to say. When the issue is phrased in such a stark manner, it occludes a more discomforting reality—maybe words are never simply utterances, maybe words can be dangerous, maybe words can enact evil things, and maybe every person has an ultimate freedom to use those words as they see fit (notably a different claim than that people should be able to use them without repercussion). Bruce’s theory of language is respectably semiotic, a contention about the arbitrary relationship between signifier and signified, whereby that chain of connection can be severed by simple repetition, as when sense flees from a word said over and over again, whether it’s “potato” or “xylophone.” But he was ultimately wrong (as is all of structural and post-structural linguistics)—language is never exactly arbitrary, it’s not really semiotic. We need theurgy to explain how words work, because in an ineffable and numinous way, words are magic. When it comes to obscenity in particular, whether the sexual or the scatological, the racial or the blasphemous, we’re considering a very specific form of that magic, and while Bruce is correct that a prohibition on slurs would render resistance to oppression all the more difficult, he’s disingenuous in not also admitting that obscenity can provide a means of cruelty in its own right. If you couldn’t say obscenities then a certain prominent tweeter of almost inconceivable power and authority couldn’t deploy them almost hourly against whatever target he sees fit. This is not an argument for censorship, mind you, but it is a plea to be honest in our accounting.

Obscenity as social resistance doesn’t have the same cachet it once did, nor is it always interpreted as unassailably progressive (as it was for Bruce and his supporters). In our current season of a supposed Jacobin “cancel culture,” words have been ironically re-enchanted with the spark of danger that was once associated with them. Whether or not those who claim that there is some sort of left McCarthyism policing language are correct, it’s relatively anodyne to acknowledge that right now words are endowed with a significance not seen since Bruce appeared in a Manhattan courtroom. Whatever your own stance on the role that offensiveness plays in civilized society, obscenity can only be theorized through multiple perspectives. Four-letter words inhabit a nexus of society, culture, faith, linguistics, and morality (and the law). A “fuck” is never just a “fuck,” and a shit by any other name wouldn’t smell as pungent. Grammatically, obscenities are often classified as “intensifiers,” that is, placeholders that exist to emphasize the emotionality of a given declaration—think of them as oral exclamation marks. Writing in Holy Sh*t: A Brief History of Swearing, Melissa Mohr explains that vulgarity is frequently “important for the connotation it carries and not for its literal meaning.” Such a distinction came into play in 2003 after the Irish singer Bono of U2 was cited by the Federal Communications Commission when, upon winning a Golden Globe, he exclaimed “fucking brilliant.” The FCC’s Enforcement Bureau initially decided that Bono’s f-bomb wasn’t indecent since its use clearly wasn’t in keeping with the sexual definition of the word, a verdict that was later rescinded higher up within the commission.

“Historically,” Mohr writes, “swearwords have been thought to possess a deeper, more intimate connection to the things they represent than do other words,” and in that regard the pencil-necked nerds at the FCC ironically showed more respect for the dangerous power of fucking than did Bono. If vigor of emotion were all one was looking for in language, any number of milquetoast words would work as well as a vulgarity, and yet obscenity (even if uttered due to a stubbed toe) is clearly doing something a bit more transcendent than more PG terms—for both good and bad. Swearing can’t help but have an incantatory aspect to it; we swear oaths, and we’re specifically forbidden by the Decalogue from taking the Lord’s name in vain. Magnus Ljung includes religious themes in his typology of profanity, offered in Swearing: A Cross-Cultural Linguistic Study, as one of “five major themes that recur in the swearing of the majority of the languages discussed and which are in all likelihood also used in most other languages featuring swearing.” Alongside religious profanity, Ljung recognizes themes according to scatology, sex organs, sexual activities, and family insults. To this, inevitably, must also be added ethnic slurs. Profanity is by definition profane, dealing with the bloody, pussy, jizzy reality of what it means to be alive (and thus the lowering of the sacred into that oozy realm is part of what blasphemously shocks). Obscenity has a quality of the theological about it, even while religious profanities have declined in their ability to shock an increasingly secular society.

Today a word like “bloody” sounds archaic or Anglophilic, and almost wholly inoffensive, even while its (now forgotten) reference to Christ’s wounds would have been scandalous to an audience reared on the King James Bible. This was the problem that confronted television writer David Milch, who created the classic HBO western Deadwood. The resultant drama (with dialogue largely composed in iambic pentameter) was noted as having the most per capita profanity of any show ever to air, but in 1870s Dakota most of those swears would have been religious in nature. Since having Al Swearengen (a perfect name if ever there was one) sound like Yosemite Sam would have dulled the shock of his speech, Milch elected to transform his characters’ language into scatological and ethnic slurs, the latter of which still has the ability to upset an audience in a way that “by Christ’s wounds!” simply doesn’t. When Swearengen argues to A.W. Merrick, who edits Deadwood’s newspaper, that “Just as you owning a print press proves only an interest in the truth, meaning up to a fucking point, slightly more than us others maybe, but short of a fucking anointing or the shouldering of a sacred burden—unless of course the print press was gift of an angel,” he provides a nice synthesis of the blasphemous and the sexual. The majority of the copious swears in Deadwood are of the scatological, sexual, or racial sort, and they hit the eardrum with far more force than denying the divinity of Christ does. When Milch updated the profanity of the 19th century, he knew what would disturb contemporary audiences, and it wasn’t tin-pot sacrilege.

All of which is to say that while obscenity has a social context, with what’s offensive being beholden to the mores of a particular century, the form itself universally involves the transgression of propriety, with the details merely altered to the conventions of a time and place. As an example, watch the 2005 documentary The Aristocrats, from magician Penn Jillette and comedian Paul Provenza, which features dozens of tellings of the almost unspeakably taboo joke of the same name. Long an after-hours joke told by comedians who would try to one-up each other in the degree of profanity offered, the film presents several iconic performers giving variations on the sketch. When I saw the film after it came out, the audience was largely primed for the oftentimes extreme sexual and scatological permutations of the joke, but it was the tellings that involved racial slurs and ethnic stereotypes that stunned the other theatergoers. It’s the pushing of boundaries in and of itself, rather than the subject in question, that designates something as an obscenity. According to Sigmund Freud in his (weirdly funny) The Joke and Its Relation to the Unconscious, vulgar humor serves a potent psychological purpose, allowing people “to enjoy undisguised obscenity” that is normally repressed so as to keep “whole complexes of impulses, together with their derivatives, away from consciousness.” Obscenity thus acts as a civilizational pressure valve for humanity’s chthonic impulses.

That words which are considered obscene are often found in the vocabulary of the marginalized isn’t incidental, and it recommends spicy language as a site of resistance. English swearing draws directly from one such point of contact between our “higher” and our “lower” language. The majority of English swears have a Germanic origin, as opposed to a more genteel Romance origin (whether from French or Latin). In keeping with their Teutonic genesis, they tend to have an abrasive, guttural, jagged quality to their sounds, the better to convey an onomatopoeic quality. Take a look at the list which comprises comedian George Carlin’s 1972 bit “Seven Words You Can Never Say on Television.” Four of them definitely have an Old English etymology, traceable back to the West Germanic dialect of the Angles, Saxons, Frisians, and Jutes who occupied Britain in the later centuries of the first millennium. Three of them – the one that rudely refers to female genitalia, the one that tells you to rudely do something sexual, and the one that tells you to do that thing to your mother – may have Latin or Norman origins, though linguists think they’re just as likely to come from what Medievalists used to call “Anglo-Saxon.” Most of these words had no obscene connotations in their original context; in Old English the word for urine is simply “piss,” and the word for feces is “shit.” Nothing was dirty about either word until the eleventh-century Norman invasion of Britain privileged the French over the English. That stratification, however, gives a certain gutter enchantment to those old prosaic terms, endowing them with the force of a swear. Geoffrey Hughes writes in Swearing: A Social History of Foul Language, Oaths, and Profanities in English that the “Anglo-Saxon element… provides much more emotional force than does the Norman French or the Latin. Copulating pandemonium! conveys none of the emotional charge of the native equivalent fucking hell!” Invasion, oppression, and brutality mark those words which we consider to be profane, but they also give them their filthy enchantments.

What’s clear is that the class connotations of what Bruce called an “argot” can’t be ignored. Swearing is the purview of criminals and travelers, pirates and rebels, highwaymen and drunks. For those lexicographers who assembled lists of English words in the early modern era, swearing, or “canting,” provided an invaluable window into the counter-cultural consciousness. The Irish playwright Richard Head compiled The Canting Academy, or Devil’s Cabinet Opened in 1673, arguably the first full-length English “dictionary,” appearing decades before Dr. Johnson’s staider 1755 A Dictionary of the English Language. Decades before Head’s book, short pamphlets by respectable playwrights from Thomas Dekker to Thomas Middleton similarly illuminated people on the criminal element’s language—other examples, included as appendices within books, such as Thomas Harman’s A Caveat or Warning for Common Cursitors, go back well into the 16th century. Such “canting guides,” exploring the seamy underbelly of the cockney capital, were prurient pamphlets that illustrated the salty diction of thieves and rogues for the entertainment of the respectable classes. One of the most popular examples was the anonymously edited A New Dictionary of the Terms Ancient and Modern of the Canting Crew, first printed in 1698. Within, readers could learn the definitions of insults from “blobber-lipped” to “jobber-not.” Such dictionaries (which included words like “swindler” and “phony,” which still survive today) drew from the English underclass, with a motley vocabulary made up of words from rough-hewn English, Romani, and ultimately Yiddish, among other origins.

A direct line runs between the vibrant, colorful, and earthy diction of canting and cockney rhyming slang, or the endangered dialect of Polari used for decades by gay men in Great Britain, who lived under the constant threat of state punishment. All of these tongues are “obscene,” but that’s a function of their oppositional status to received language. Nothing is “dirty” about them; they are, rather, rebellions against “proper” speech, “dignified” language, “correct” talking, and they challenge that codified violence implied by the mere existence of the King’s English. Their differing purposes, and respective class connotations and authenticity, are illustrated by a joke wherein a hobo asks a nattily dressed businessman for some change. “‘Neither a borrower nor a lender be’—that’s William Shakespeare,” says the businessman. “‘Fuck you’—that’s David Mamet,” responds the panhandler. A bit of a disservice to the Bard, however, who along with Dekker and Middleton could cant with the best of them. Within the folio one will find “bawling, blasphemous, incharitable dog,” “paper fac’d villain,” and “embossed carbuncle,” among other similarly colorful examples.

An entire history could be written about early instances of noted obscenities, which of course necessitates trawling the Oxford English Dictionary for examples of dirty words that appear particularly early. For “shit,” there is a 1585 instance of the word in a Scottish “flyting,” an extemporaneous poetic rhyme-battle held in Middle Scots, which took place between Patrick Hume and Alexander Montgomerie. The greatest example of the form is the 15th-century Flyting of Dunbar and Kennedy, containing the first printed instance of the word “fuck.” In the OED, our good friend the dirty lexicographer Richard Head has the earliest example given in the entry for the word “fuck,” the profanity appearing as a noun in his play Hic et Ubique: or, The Humors of Dublin, wherein a character says “I did creep in… and there I did see [him] putting the great fuck upon my wife.” And the dictionary reflects the etymological ambiguity concerning the faux-francophone/faux-Virgilian word “dildo,” giving earliest attribution to the playwright Robert Greene, who in his 1590 comedy Never Too Late wrote “Dildido dildido, Oh love, oh love, I feel thy rage rumble below and above.” Swearing might be a radical alternative to received language, but it pulses through literature like a counter-history, a shadow realm of the English tongue’s full capabilities. It is a secret language, the twinned double of more respectable letters, and it’s unthinkable to understand Geoffrey Chaucer without his scatological jokes or Shakespeare minus his bawdy insults. After all, literature is just as much Charles Bukowski as T.S. Eliot; it’s William S. Burroughs and not just Ezra Pound.

Sometimes those dichotomies about what language is capable of are reconciled within the greatest of literature. A syllabus of the immaculate obscene would include the Marquis de Sade’s 120 Days of Sodom, Charles Baudelaire’s The Flowers of Evil, Gustave Flaubert’s Madame Bovary, James Joyce’s Ulysses, D.H. Lawrence’s Lady Chatterley’s Lover, Vladimir Nabokov’s Lolita, Henry Miller’s Tropic of Cancer (smuggled out of a Barnes & Noble by yours truly when I was 16), and Irvine Welsh’s Trainspotting. Along with his fellow Scotsman James Kelman, Welsh shows the full potential of obscenity to present an assault on the pieties of the bourgeois, mocking Madison Avenue sophistry when he famously implores the reader to “Choose rotting away, pishing and shiteing yersel in a home, a total fuckin embarrassment tae the selfish, fucked-up brats ye’ve produced. Choose life.” Within the English language, looming above all as the progenitor of literary smut, is the great British author John Cleland, who in 1748 published what is often considered the first pornographic novel in English, Fanny Hill: Or, Memoirs of a Woman of Pleasure, wherein he promised “Truth! stark naked truth, is the word, and I will not so much as take the pains to bestow the strip of a gauze-wrapper on it.” Cleland purposefully wrote Fanny Hill entirely in euphemisms and double entendres, but the lack of dirty words couldn’t conceal the fact that the orgiastic bildungsroman about a young woman of pleasure was seen as unspeakably filthy. The novel has the distinction of being the longest-banned work in U.S. history, first prohibited by the Massachusetts Supreme Judicial Court in 1821, only to be sold legally after the U.S. Supreme Court ruled that its censorship was unconstitutional in 1966. The same year that Bruce was found face-down, naked and dead, in his California bathroom.

A goddamn unequivocal fucking triumph of the human spirit that any fucking wanker can march up into a public library and check out a copy of Fanny Hill. That liberty was hard fought for, and we should look askance at anyone who’d throw it away too cavalierly. But there is also something disingenuous in dismissing all those who suppressed works like Fanny Hill or Ulysses or Lady Chatterley’s Lover as mere prigs and prudes. A work is never censored because it isn’t powerful; it’s attacked precisely because of that coiled, latent energy that exists within words, never more so than in those we’ve labeled forbidden. If the debate over free speech and censorship is drenched in a sticky oil of bad faith, then that slick spills over into all corners. My fellow liberals will mock the conservative perspective that says film or comic books or video games or novels are capable of moving someone to action, sometimes very ugly action—but of course literature is capable of doing this. Why would we read literature otherwise? Why would we create it otherwise? The censor with his black marker in some ways does due service to literature, acknowledging its significance and its uncanny effect. To claim that literature shouldn’t be censored because all literature is safe is not just fallacious, it’s disrespectful. The far more difficult principle is that literature shouldn’t be censored despite the fact that it’s so often dangerous.

Like any grimoire or incantation, obscenity can be used to liberate and to oppress, to free and to enslave, to bring down those in power but also to froth a crowd into the most hideous paroxysms of fascistic violence. So often the moralistic convention holds that “punching down” is never funny, but the dark truth is that it often is. What we do with that reality is the measure of us as people, because obscenity is neither good nor bad, but all power resides in the mouth of whoever wields it. What we think of as profanity is a rupture within language, a dialectic undermining conventional speech, what the Greeks called an aporia, the moment when rhetoric breaks down. Obscenity is when language declares war on itself, often with good cause. Writing in Rabelais and His World, the great Russian critic Mikhail Bakhtin defined what he called the “carnivalesque,” that is, the principle that structured much medieval and Renaissance performance and literature, whereby the “principle of laughter and the carnival spirit on which the grotesque is based destroys…seriousness and all pretense.” Examining the Shrovetide carnivals that inaugurated pre-Reformation Lent, Bakhtin optimistically saw something liberatory in the ribald display of upended hierarchies, where the farting, shitting, pissing, vomiting hilarity of the display rendered authority foolish. “It frees human consciousness,” Bakhtin wrote, “and imagination for new potentialities.”

An uneasy and ambivalent undercurrent threads through Bakhtin’s argument, though. If the carnival allowed for a taste of emancipation, there was also always the possibility that it was just more bread and circuses, a way to safely “rebel” without actually challenging the status quo. How many of our fucks and shits are just that, simply the smearing of feces on our playpen walls? Even worse, what happens when the carnival isn’t organized by plucky peasants to mock the bishops and princes, but when the church and state organize those mocking pageants themselves? Bakhtin didn’t quite anticipate the troll, nor did Bruce for that matter. Gershon Legman writes in the standard text Rationale of the Dirty Joke: An Analysis of Sexual Humor that “Under the mask of humor, our society allows infinite aggressions, by everyone and against everyone. In the culminating laugh of the listener or observer…the teller of the joke betrays his hidden hostility.” Can’t you take a joke? Because I was just joking. Legman’s reading of obscenity is crucial—it’s never just innocent, it’s never just nothing, it’s never just words. And it depends on who is saying them, and to whom they’re being said. Because swearing is so intimately tied to the theological, the use of profanity literally takes on the aura of damnation. It’s not that words aren’t dangerous—they are. But that doesn’t mean we must suture our mouths, even as honesty compels us to admit that danger. What we do with this understanding is the process that we call civilization. Because if Lenny Bruce had one unassailable and self-evident observation, it was that “Life is a four-letter word.” How could it be fucking otherwise?



Ten Ways to Save the World

-

1.In a purple-walled gallery of the Smithsonian American Art Museum, you can visit the shrine constructed by Air Force veteran and janitor James Hampton for Jesus Christ’s return. Entitled “Throne of the Third Heaven of the Nations’ Millennium General Assembly,” the altar and its paraphernalia were constructed to serve as temple objects for the messiah, who, according to Hampton, based on visions he had of Moses in 1931, the Virgin Mary in 1946, and Adam in 1949, shall arrive in Washington, D.C. His father had been a part-time gospel singer and Baptist preacher, and Hampton drew not just from Christianity but also from the Afrocentric folk traditions of his native South Carolina. Decorated with passages from Daniel and Revelation, and assembled in secret over 14 years in a Northwest Washington garage, the shrine is explicated in Hampton’s 100-page manifesto St. James: The Book of the 7 Dispensations (dozens of pages are still in an uncracked code). Claiming to have received a revised version of the Decalogue, Hampton declared himself in his notebook to be “Director, Special Projects for the State of Eternity.” His work is a fugue of word salad, a concerto of pressured speech. A staging ground for the incipient millennium, Hampton’s shrine is a triumph.

As if the bejeweled breastplate of the Urim and the Thummim were constructed not by Levites in ancient Jerusalem, but by a janitor in Mt. Vernon. Exodus and Leviticus give specifications for those liturgical objects of the Jewish Temple—the gilded, otherworldly cherubim huddled over the Ark of the Covenant, their wings touching the hem of infinity; the woven brocade curtain with its many-eyed Seraphim rendered in fabric of red and gold; the massive candelabra of the ritual menorah. The materials with which the Jews built their Temple were cedar and sandstone, gold and precious jewels. When God commanded Hampton to build his new shrine, the materials were light-bulbs and aluminum foil, door frames and chair legs, pop cans and cardboard boxes, all held together with glue and tape. The overall effect is, if lacking in gold and cedar, transcendent nonetheless. Hampton’s construction looks almost Mesoamerican, aluminum foil delicately hammered onto carefully measured cardboard altars, the names of prophets and patriarchs from Ezekiel to Abraham rendered across its surfaces.

Every day Hampton would return from his job at the General Services Administration, where he mopped floors and disinfected counters, and for untold hours he’d assiduously sketch out designs based on his dreams, carefully applying foil to wood and cardboard, constructing crowns from trash he’d collected on U Street. What faith would compel this, what belief to see it finished? Nobody knew he was doing it. Hampton would die of stomach cancer in 1964, never having married, with few friends or family. The shrine would be discovered by a landlord angry about late rent. Soon it would come to the attention of reporters, and then the art mavens who thrilled to the discovery of “outsider” art—that is, work accomplished by the uneducated, the mentally disturbed, the impoverished, the religiously zealous. “Throne of the Third Heaven of the Nations’ Millennium General Assembly” would be purchased and donated to the Smithsonian (in part through the intercession of artist Robert Rauschenberg), where it would be canonized as the Pietà of American visionary art, outsider art’s Victory of Samothrace.

Hampton wasn’t an artist though—he was a prophet. He was Elijah and Elisha awaiting Christ in the desert. Daniel Wojcik writes in Outsider Art: Visionary Worlds and Trauma that “apocalyptic visions often have been expressions of popular religiosity, as a form of vernacular religion, existing at a grassroots level apart from the sanction of religious authority.” In that regard Hampton was like so many prophets before him, just working in toilet paper and beer cans rather than papyrus—he was Mt. Vernon’s Patmos. Asking if Hampton was mentally ill is the wrong question; it’s irrelevant whether he was schizophrenic or bipolar. Etiology only goes so far in deciphering the divine language, and who are we, so sure of ourselves, to say that the voice in a janitor’s head wasn’t that of the Lord? Greg Bottoms writes in Spiritual American Trash: Portraits from the Margins of Art and Faith that Hampton “knew he was chosen, knew he was a saint, knew he had been granted life, this terrible, beautiful life, to serve God.” Who among us can say that he was wrong? In his workshop, Hampton wrote on a piece of paper “Where there is no vision, the people perish.” There are beautiful and terrifying things hidden in garages all across America; there are messiahs innumerable. Hampton’s shrine is strange, but it is oh so resplendent.

2.By the time Brother James Nayler genuflected before George Fox, the founder of the Quaker Society of Friends, his tongue had already been bored with a hot iron poker and the letter “B” (for “Blasphemer”) had been branded onto his forehead by civil authorities. The two had not gotten along in the past, arguing over the theological direction of the Quakers, but by 1659 Nayler was so broken by their mutual enemies that he was forced to drag himself to Fox’s parlor and to beg forgiveness. Three years had changed the preacher’s circumstances, for it was in imitation of the original Palm Sunday that in 1656 Nayler had triumphantly entered the port city of Bristol upon the back of a donkey, the religious significance of the performance inescapable to anyone. A supporter noted in a diary that Nayler’s “name is no more to be called James but Jesus,” while in private writings Fox noted that “James ran out into imaginations… and they raised up a great darkness in the nation.”

In 1656, Nayler was imprisoned, and when Fox visited him in his cell, Fox demanded that Nayler kiss his foot (belying the Quaker reputation for modesty). “It is my foot,” Fox declared, but Nayler refused. The confidence of a man who reenacted Christ’s entrance into Jerusalem. Guided by the Inner Light that Quakers saw as supplanting even the gospels, Nayler thought of his mission in messianic terms, and organized his following to reflect that. Among the “Valiant Sixty,” the itinerant preachers who spread the early Quaker message, Nayler was the most revolutionary, condemning slavery, enclosure, and private property. The tragedy of Nayler is that he happened to not actually be the messiah. Before his death, following an assault by a highwayman in 1660, Nayler would write that his “hope is to outlive all wrath and contention, and to wear out all exaltation and cruelty, or whatever is of a nature contrary to itself.” He was 42, beating Christ by almost a decade.

“Why was so much fuss made?” asks Christopher Hill in his classic The World Turned Upside Down: Radical Ideas During the English Revolution. “There had been earlier Messiahs—William Franklin, Arise Evans who told the Deputy Recorder of London that he was the Lord his God…Gadbury was the Spouse of Christ, Joan Robins and Mary Adams believed they were about to give birth to Jesus Christ.” Hill’s answer to the question of Nayler’s singularity is charitable, writing that none of the others actually seemed dangerous, since they were merely “holy imbeciles.” The 17th century, especially around the time of the English civil wars, was an age of blessed insanity, messiahs proliferating like dandelions after a spring shower. There were John Reeve, Laurence Clarkson, and Lodowicke Muggleton, who took turns arguing over which of them were the two witnesses mentioned in Revelation, and who held that God had absconded from heaven and that the job was now open. Abiezer Coppe, prophet of the denomination known as the Ranters, lived up to that designation in his preaching and writing. One prophet, TheaurauJohn Tany (who designated himself “King of the Jews”), simply declared “What I have written, I have written,” including the radical message that hell was liberated and damnation abolished. Regarding the here and now, Tany had similarly radical prescriptions, including to “feed the hungry, clothe the naked, oppress none, set free them bounden.”

There have been messianic claimants from first-century Judea to contemporary Utah. When St. Peter was still alive there was the Samaritan magician Simon Magus, who used Christianity as magic and could fly, only to be knocked from the sky during a prayer-battle with the apostle. In the third century the Persian prophet Mani founded a religion that fused Christ with the Buddha, had adherents from Gibraltar to Jinjiang, and endured for more than a millennium (with its teachings smuggled into Christianity by former adherent St. Augustine). A little before Mani, a Phrygian prophet named Montanus declared himself an incarnation of the Holy Spirit, along with his consorts Priscilla and Maximilla. Prone to fits of convulsive revelation, Montanus declared “Lo, the man is as a lyre, and I fly over him as a pick.” Most Church Fathers denounced Montanism as rank heresy, but not Tertullian, who despite being the progenitor of Latin theology was never renamed “St. Tertullian” because of those enthusiasms. During the Middle Ages, at a time when stereotype might have it that orthodoxy reigned triumphant, mendicants and messiahs, some whose names aren’t preserved to history and some who amassed thousands of followers, proliferated across Europe. Norman Cohn remarks in The Pursuit of the Millennium that for one eighth-century Gaulish messiah named Aldebert, followers “were convinced that he knew all their sins…and they treasured as miracle-working talismans the nail parings and hair clippings he distributed among them.” Pretty impressive, but none of us get off work for Aldebert’s birthday.

More recently, other messiahs include the 18th-century prophetess and mother of the Shakers Ann Lee, the 20th-century founder of the Korean Unification Church Sun Myung Moon (known for his elaborate mass weddings and for owning the conservative Washington Times), and the French test-car driver Claude Vorilhon, who renamed himself Raël and announced that he was the son of an extraterrestrial named Yahweh (more David Bowie’s The Rise and Fall of Ziggy Stardust and the Spiders from Mars than Paul’s epistles). There are as many messiahs as there are people; there are malicious messiahs and benevolent ones, deluded head-cases and tricky confidence men, visionaries of transcendent bliss and sputtering weirdos. What unites all of them is an observation made by Reeve, that God speaks to them “to the hearing of the ear as a man speaks to a friend.”

3.Hard to identify Elvis Presley’s apotheosis. It could have been the ’68 Comeback Special, Elvis decked in black leather and warbling “One Night” in that snarl-mumble, his wilderness years (precipitated by manager Col. Tom Parker’s disastrous gambit of having the musician join the Army, only to see the industry move on from his rockabilly style) redeemed by resurrection on a Burbank sound stage. Or maybe it was earlier, on The Milton Berle Show in 1956, performing Big Mama Thornton’s hit “Hound Dog” while gyrating on the Hollywood stage, leading a critic for the New York Daily News to opine that Elvis “gave an exhibition that was suggestive and vulgar, tinged with the kind of animalism that should be confined to dives and bordellos.” A fair candidate for that moment of earthly transcendence could be traced back to 1955, when in Sun Records’ dusty Memphis studio Elvis covered Junior Parker’s “Mystery Train,” crooning out in a voice both shaky and confident over his guitar’s nervous warble “Train I ride, sixteen coaches long/Train I ride, sixteen coaches long/Well that long black train, got my baby and gone.” But in my estimation, and at the risk of sacrilege, Elvis’s ascension happened on Aug. 16, 1977, when he died on the toilet in the private bathroom of his tacky and opulent Graceland estate.

The story of Elvis’s death has the feeling of both apocrypha and accuracy, and like any narrative that comes from that borderland country of the mythic, it contains more truth than the simple facts can impart. His expiration is a uniquely American death, but not an American tragedy, for Elvis was able to get just as much out of this country as the country ever got out of him, and that’s ultimately our true national dream. He grabbed the nation by its throat and its crotch, and with pure libidinal fury was able to incarnate himself as the country. All of the accoutrements—the rhinestone jumpsuits, the karate and the Hawaiian schtick, the deep-fried peanut-butter-banana-and-bacon sandwiches, the sheer pill-addicted corpulence—are what make him our messiah. Even the knowing, obvious, and totally mundane observation that he didn’t write his own music misses the point. He wasn’t a creator—he was a conduit. Greil Marcus writes in Mystery Train: Images of America in Rock ‘n’ Roll of the “borders of Elvis Presley’s delight, of his fine young hold on freedom…[in his] touch of fear, of that old weirdness.” That’s the Elvis that saves, the Elvis of “That’s All Right (Mama)” and “Shake, Rattle and Roll,” those strange hillbilly tracks, that weird chimerical sound—complete theophany then and now.

There’s a punchline quality to that contention about white-trash worshipers at the Church of Elvis, all of those sightings in the Weekly World News, the Las Vegas impersonators of various degrees of girth, the appearance of the singer in the burnt pattern of a tortilla. This is supposedly a faith that takes its pilgrimage to Graceland as if it were a zebra-print Golgotha, that visits Tupelo as if it were Nazareth. John Strausbaugh takes an ethnographer’s calipers to Elvism, arguing in E: Reflections on the Birth of the Elvis Faith that Presley has left in his wake a bona fide religion, with its own liturgy, rituals, sacraments, and scripture. “The fact that outsiders can’t take it seriously may turn out to be its strength and its shield,” he writes. “Maybe by the time Elvism is taken seriously it will have quietly grown too large and well established to be crushed.” There are things less worthy of your worship than Elvis Presley. If we were to think of an incarnation of the United States, of a uniquely American messiah, few candidates would be more all-consumingly like the collective nation. In his appetites, his neediness, his yearning, his arrogance, his woundedness, his innocence, his simplicity, his cunning, his coldness, and his warmth, he was the first among Americans. Elvis is somehow both rural and urban, northern and southern, country and rock, male and female, white and black. Our contradictions are reconciled in him. “Elvis lives in us,” Strausbaugh writes. “There is only one King and we know who he is.” We are Elvis and He was us.

4.A hideous slaughter followed those settlers as they drove deep into the continent. On that western desert, where the lurid sun’s bloodletting upon the burnt horizon signaled the end of each scalding day, a medicine man and prophet of the Paiute people had a vision. In a trance, Wodziwob received an oracular missive, that “within a few moons there was to be a great upheaval or earthquake… [t]he whites would be swallowed up, while the Indians would be saved.” Wodziwob would be the John the Baptist to a new movement, for though he would die in 1872, the ritual practice that he taught—the Ghost Dance—would become a rebellion against the genocidal policy of the U.S. Government. For Wodziwob, the Ghost Dance was an affirmation, but it has also been remembered as a doomed moment. “To invoke the Ghost Dance has been to call up an image of indigenous spirituality by turns militant, desperate, and futile,” writes Louis S. Warren in God’s Red Son: The Ghost Dance Religion and the Making of Modern America, “a beautiful dream that died.” But it was a dream that endured.

While in a coma precipitated by scarlet fever, during a solar eclipse, on New Year’s Day of 1889, a Northern Paiute Native American who worked on a Carson City, Nevada, ranch and was known as Jack Wilson by his coworkers and as Wovoka to his own people, fell into a mystical vision not unlike Wodziwob’s. Wovoka met many of his dead family members, he saw the prairie that exists beyond that which we can see, and he held counsel with Jesus Christ. Wovoka was taught the Ghost Dance, and learned what Sioux Chief Lame Deer would preach, that “the people…could dance a new world into being.” When Wovoka returned he could control the weather, he was able to compel hail from the sky, he could form ice with his hands on the most sweltering of days. The Ghost Dance would spread throughout the western United States, embraced by the Paiute, the Dakota, and the Lakota. Lame Deer said that the Ghost Dance would roll up the earth “like a carpet with all the white man’s ugly things—the stinking new animals, sheep and pigs, the fences, the telegraph poles, the mines and factories. Underneath would be the wonderful old-new world.”

A messiah is simultaneously the most conservative and the most radical of figures, preaching a return to a perfected world that never existed but also the overturning of everything of this world, of the jaundiced status quo. The Ghost Dance married the innate strangeness of Christianity to the familiarity of native religion, and like both it provided a blueprint for how to overthrow the fallen things. Like all true religion, the Ghost Dance was incredibly dangerous. That was certainly the view of the U.S. Army and the Bureau of Indian Affairs, which saw an apocalyptic faith as a danger to white settler-colonials and their indomitable, zombie-like push to the Pacific. Manifest Destiny couldn’t abide the hopefulness of a revival as simultaneously joyful and terrifying as the Ghost Dance, and so an inevitable confrontation awaited.

The Miniconjou Lakota people, forced into the Pine Ridge Reservation by 1890, were seen as particularly rebellious, in part because their leader Spotted Elk was an adherent. Using Lakota resistance to disarmament as a pretext, the army opened fire on the gathered Miniconjou, and more than 150 people (mostly women and children) would be slaughtered during the Wounded Knee Massacre. As surely as the Romans threw Christians to the lions and Cossacks rampaged through the Jewish shtetls of eastern Europe, so too were the initiates of the Ghost Dance persecuted, murdered, and martyred by the U.S. Government. Warren writes that the massacre “has come to stand in for the entire history of the religion, as if the hopes of all of its devoted followers began and ended in that fatal ravine.” Wounded Knee was the Calvary of the Ghost Dance faith, but if Calvary has any meaning it’s that crucified messiahs have a tendency not to remain dead. In 1973 a contingent of Oglala Lakota and members of the American Indian Movement occupied Wounded Knee, and the activist Mary Brave Bird defiantly performed the Ghost Dance, again.

5.Rabbi Menachem Mendel Schneerson arrived in the United States via Paris, via Berlin, and ultimately via Kiev. He immigrated to New York on the eve of America’s entry into the Second World War, and in the years that followed six million Jews were immolated in Hitler’s ovens—well over half of all the Jews in the world. Becoming the Lubavitcher Rebbe in 1951, Schneerson was a refugee from a broken Europe that had devoured itself. Schneerson’s denomination of Hasidism had emerged after the vicious Cossack-led pogroms that punctuated life in 17th-century eastern Europe, when many Jews turned towards the sect’s founder, the Baal Shem Tov. His proper name was Rabbi Israel ben Eliezer, and his title (often shortened to “Besht”) meant “Master of the Good Name,” for the Baal Shem Tov incorporated Kabbalah into a pietistic movement that enshrined emotion over reason, feeling over logic, experience over philosophy. David Biale writes in Hasidism: A New History that the Besht espoused “a new method of ecstatic joy and a new social structure,” a fervency that lit a candle against persecution’s darkness.

When Schneerson convened a gathering of Lubavitchers in a Brooklyn synagogue for Purim in 1953, a black cloud enveloped the Soviet Union. Joseph Stalin was beginning to target Jews whom he implicated in the “Doctors’ Plot,” an invented accusation that Jewish physicians were poisoning the Soviet leadership. The state propaganda organ Pravda denounced these supposed members of a “Jewish bourgeois-nationalist organization… The filthy face of this Zionist spy organization, covering up their vicious actions under the mask of charity.” Four gulags were constructed in Siberia, with the understanding that Russian Jews would be deported and perhaps exterminated. Less than a decade after Hitler’s suicide, Schneerson would look out into the congregation of swaying black-hatted Lubavitchers and see a people labeled for extinction.

And so on that evening, Schneerson expounded on the finer points of Talmudic exegesis, on questions of why evil happens in the world, and on what role man and G-d play in containing that wickedness. Witnesses said that the very countenance of the rabbi was transformed as he declared that he would speak the words of the living G-d. Enraptured in contemplation, Schneerson connected the Persian courtier Haman’s war against the Jews to Stalin’s upcoming campaign, he invoked G-d’s justice and mercy, and he implored the divine to intervene and prevent the Soviet dictator from completing that which Hitler had begun. The Rebbe denounced Stalin as the “evil one,” and as he shouted it was said that his face transformed into a “holy fire.”

Two days later Moscow State Radio announced that Stalin had fallen ill and died. The exact moment of his expiration, it was said, was when a group of Lubavitch Jews had prayed that G-d would still the hand of the tyrant and punish his iniquities. Several weeks later, the Soviet leadership would admit that the Doctors’ Plot was a government ruse invented by Stalin, and they exonerated all of those who’d been punished as a result of the baseless accusations. By the waning days of the Soviet Union, the Lubavitcher Rebbe would address crowds gathered in Red Square by telescreen while the Red Army Band performed Hasidic songs. “Was this not the victory of the Messiah over the dark forces of the evil empire, believers asked?” write Samuel Heilman and Menachem Friedman in The Rebbe: The Life and Afterlife of Menachem Mendel Schneerson.

The Mashiach (“anointed one” in Hebrew) is neither the Son of G-d nor the incarnate G-d, and his goal is arguably more that of liberation than salvation (whatever either term means). Just as Christianity has had many pseudo-messiahs, so is Jewish history littered with figures whom some believers saw as the anointed one (Christianity is merely the most successful of these offshoots). During the second-century Bar Kokhba revolt against Rome, the military commander Simon bar Kokhba was lauded as the messiah, even as his defeat led to Jewish exile from the Holy Land. During the 17th century, the Ottoman Jew Sabbatai Zevi amassed a huge following of devotees who believed him the messiah come to subvert and overthrow the strictures of religious law itself. Zevi was defeated by the Ottomans not through crucifixion, but through conversion (which is much more dispiriting). A century later, the Polish libertine Jacob Frank would declare that the French Revolution was the apocalypse, that Christianity and Judaism must be synthesized, and that he was the messiah. Compared to them, the Rebbe was positively orthodox (in all senses of the word). He also never claimed to be the messiah.

What all share is the sense that to exist is to be in exile. That is the fundamental lesson and gift of Judaism, born from the particularities of Jewish suffering. Diaspora is not just a political condition, or a social one; diaspora is an existential state. We are all marooned from our proper divinity, shattered off from G-d’s being—alone, disparate, isolated, alienated, atomized, solipsistic. If there is to be any redemption it’s in suturing up those shards, collecting those bits of light cleaved off from the body of G-d when He dwelled in resplendent fullness before the tragedy of creation. Such is the story of going home but never reaching that destination, yet continuing nevertheless. What gives this suffering such beauty, what redeems the brokenness of G-d, is the sense that it’s that very shattering that imbues all of us with holiness. It is what the German-Jewish philosopher Walter Benjamin describes in On the Concept of History as the sacred reality that holds that “every second was the narrow gate, through which the Messiah could enter.”

6.When the Living God landed at Palisadoes Airport in Kingston, Jamaica, on April 21, 1966, he couldn’t immediately disembark from his Ethiopian Airlines flight from Addis Ababa. More than 100,000 people had gathered at the airport, the air thick with the sticky, sweet smell of ganja, the airstrip so overwhelmed with worshipers come to greet the Conquering Lion of the Tribe of Judah, His Imperial Majesty Haile Selassie I, King of Kings, Lord of Lords, Elect of God, Power of the Trinity, of the House of Solomon, Amhara Branch, noble Ras Tafari Makonnen, that there was a fear the plane itself might tip over. The Ethiopian Emperor, incarnation of Jah and the second coming of Christ, remained in the plane for a few minutes until a local religious leader, the drummer Ras Mortimer Planno, was allowed to organize the emperor’s descent.

Finally, after several tense minutes, the crowd pulled back long enough for the emperor to disembark onto the tarmac, the first time that Selassie had set foot on the fertile soil of Jamaica, a land distant from the Ethiopia he’d ruled over for 36 years (excepting 1936 to 1941, when his home country was occupied by the Italian fascists). Jamaica was where he’d first been acknowledged as the messianic promise of the African diaspora. A year after his visit, while being interviewed by the Canadian Broadcasting Corporation, Selassie was asked what he made of the claims about his status. “I told them clearly that I am a man,” he said, “that I am mortal…and that they should never make a mistake in assuming or pretending that a human being is emanated from a deity.” The thing about being a messiah, though, is that whether or not you are to be adored never depends on the consent of the worshiped.

Syncretic and born from the Caribbean experience, and practiced from Kingston, Jamaica, to Brixton, London, Rastafarianism is a mélange of Christian, Jewish, and uniquely African symbols and beliefs, with its own novel rhetoric concerning oppression and liberation. Popularized throughout the West because of the indelible catchiness of reggae, with its distinctive muted third beat, and the charisma of the musician Bob Marley, the faith’s most famous ambassador, Rastafarianism is sometimes offensively reduced in peoples’ minds to dreadlocks and spliff smoke. Ennis Barrington Edmonds places the faith’s true influence in its proper context, writing in Rastafari: From Outcasts to Culture Bearers that “the movement has spread around the world, especially among oppressed people of African origins… [among those] suffering some form of oppression and marginalization.”

Central to the narrative of Rastafarianism is the reluctant messiah Selassie, a life-long member of the Ethiopian Orthodox Tewahedo Church. Selassie’s reign had been proclaimed a fulfillment of prophecy by the Jamaican Protestant evangelist Leonard Howell, who claimed that the crowning of an independent Black king in an Africa dominated by European colonialism marked the dawn of a messianic dispensation. A disciple of Black nationalist Marcus Garvey, whom he met when both lived in Harlem, Howell read Psalm 68:31’s injunction that “Ethiopia shall soon stretch out her hands unto God” as being fulfilled in Selassie’s coronation. Some sense of this reverence is imparted by a Rastafarian named Reuben in Emily Raboteau’s Searching for Zion: The Quest for Home in the African Diaspora, who explained that “Ethiopia was never conquered by outside forces. Ethiopia was the only independent country on the continent of Africa…a holy place.” Sacred Ethiopia, the land where the Ark of the Covenant was preserved.

That the actual Selassie neither embraced Rastafarianism, nor was particularly benevolent in his own rule, and was indeed deposed by a revolutionary Marxist junta, is of no account. Rather, what threads through Rastafarianism is what Raboteau describes as a “defiant, anticolonialist mind-set, a spirit of protest… and a notion that Africa is the spiritual home to which they are destined to return.” Selassie’s biography bore no similarity to the lives of residents of the Trenchtown slum of Kingston, where the veneration of a distant African king began, but his name served as a rebellion against all agents of Babylon in the hopes of a new Zion. Rastafarianism found in Selassie the messiah who was needed, and in their faith there is a proud way of spiritually repudiating the horrors of the trans-Atlantic slave trade. What their example reminds us of is that a powerful people have no need of a messiah, that he rather always dwells amongst the dispossessed, regardless of what his name is.

7.Among the Persian Sufis there is no blasphemy in staining your prayer rug red with shiraz. Popular throughout Iran and into Central Asia, where the faiths of Zoroaster and Mani had both once been dominant, Sufism drew upon those earlier mystical and poetic traditions and incorporated them into Islam. The faith of the dervishes, the piety of the wali, the Sufi tradition is that of Persian miniatures painted in stunning, colorful detail, of the poetry of Rumi and Hafez. Often shrouded in rough woolen coats and felt caps, the Sufis practice a mystical version of faith that’s not dissimilar to Jewish kabbalah or Christian hermeticism, an inner path that the 20th-century writer Aldous Huxley called the “perennial philosophy.” As with other antinomian faiths, the Sufis often skirt the line between what’s acceptable and what’s forbidden, seeing in heresy intimations of a deep respect for the divine.

A central poetic topos of Sufi practice is what’s called shath, that is, deliberately shocking utterances that exist to shake believers out of pious complacency, to awaken within them that which is subversive about God. One master of the form was Mansour al-Hallaj, born to Persian-speaking parents (with a Zoroastrian grandfather) in the ninth century during the Abbasid Caliphate. Where most Sufi masters were content to keep their secrets preserved for initiates, al-Hallaj crafted a movement democratized for the mass of Muslims, while also generating a specialized language for speaking of esoteric truths, expressed in “antithesis, breaking down language into prepositional units, and paradox,” as the translator Carl W. Ernst observes in Hallaj: Poems of a Sufi Martyr. Al-Hallaj’s knowledge and piety were deep—he had memorized the Koran by the age of 12 and he prostrated himself before a replica of Mecca’s Kaaba in his Baghdad garden—but so was his commitment to the radicalism of shath. When asked where Allah was, he once replied that the Lord was within his turban; on another occasion he answered that question by saying that God was under his cloak. Finally, borrowing one of the 99 names of God, he declared “I am the Truth.”

The generally tolerant Abbasids decided that something should be done about al-Hallaj, and so in 922 he was tied to a post along the Tigris River, repeatedly punched in the face, lashed several times, decapitated, and finally his headless body was hung over the water. His last words were “Akbar al-Hallaj”—“Al-Hallaj is great.” An honorific normally reserved for God, but for this self-declared heretical messiah his name and that of the Lord were synonyms. What’s sacrilegious about this might seem clear, save that al-Hallaj’s Islamic piety was such that he interpreted the claim as the natural culmination of Tawhid, the strictness of Islamic monotheism pushed to its logical conclusion—there is but one God, and everything is God, and we are all in God. Idries Shah explains the tragic failure of interpretation among religious authorities in The Sufis, writing that the “attempt to express a certain relationship in language not prepared for it causes the expression to be misunderstood.” The court, as is obvious, did not agree.

If imagining yourself as the messiah could get you decapitated by the 10th-century Abbasids, then in 20th-century Michigan it only got you institutionalized. Al-Hallaj implied that he was the messiah, but each of the psychiatric patients in Milton Rokeach’s 1964 study The Three Christs of Ypsilanti thought that he alone was the authentic messiah (with much displeasure ensuing when they met). Based on three years of observation at the Ypsilanti State Hospital starting in 1959, Rokeach treated this trinity of paranoid schizophrenics. Initially Rokeach thought that the meeting of the Christs would disabuse them all of their delusions, as if the law of logical non-contradiction might mean anything to a psychotic. But the messiahs were steadfast in their faith—each was singular and the others were imposters. Then Rokeach and his graduate students introduced fake messages from other divine beings, a gambit that the psychiatrist apologized for two decades later and that would most definitely land him before an ethics board today. Finally Rokeach granted them the right to their insanities, each of the Christs of Ypsilanti continuing in their merry madness. “It’s only when a man doesn’t feel that he’s a man,” Rokeach concludes, “that he has to be a god.”

Maybe. Or maybe the true madness of the Michigan messiahs was that each thought himself the singular God. They weren’t in error in believing that each of them was the messiah; they were in error in denying that truth in their fellow patients. Al-Hallaj would have understood, declaring before his executioner that “all that matters for the ecstatic is that the Unique shall reduce him to Unity.” The Christs may have benefited more from Sufi treatment than from psychotherapy. Clyde Benson, Joseph Cassel, and Leon Gabor all thought themselves to be God, but al-Hallaj knew that he was (and that You reading this are as well). Those three men may have been crazy, but al-Hallaj was a master of what the Buddhist teacher Wes Nisker calls “crazy wisdom.” In his guidebook The Essential Crazy Wisdom, Nisker celebrates the sacraments of “clowns, jesters, tricksters, and holy fools,” who understand that “we live in a world of many illusions, that the emperor has no clothes, and that much of human belief and behavior is ritualized nonsense.” The initiate in crazy wisdom, whether gnostic saint or Kabbalist rabbi, Sufi master or Zen monk, prods at false piety to reveal deeper truths underneath. “I saw my Lord with the eye of the heart,” al-Hallaj wrote in one poem. “I asked, ‘Who are You?’/He replied, ‘You.’”

8.“Bob” is the least likely-looking messiah. With his generic handsomeness, his executive haircut dyed black and tightly parted on the left, the avuncular pipe that jauntily sticks out of his tight smile, “Bob” looks like a stock image of a 1950s paterfamilias (his name is always spelled with quotation marks). Like a clip-art version of Mad Men’s Don Draper, or Utah Sen. Mitt Romney. “Bob” is also not real (which may or may not distinguish him from other messiahs), but rather the central figure in the parody Church of the SubGenius. Supposedly a traveling salesman, J.R. “Bob” Dobbs had a vision of JHVH-1 (the central God in the church) in a homemade television set, and he then went on the road to evangelize. Founded by countercultural slacker heroes Ivan Stang and Philo Drummond in 1979 (though each claims that “Bob” was the actual founder), the Church of the SubGenius is a veritable font of crazy wisdom, promoting the anarchist practice of “culture jamming” and parody in the promulgation of a faith where it’s not exactly clear what’s serious and what isn’t.

“Bob” preaches a doctrine of resistance against JHVH-1 (or Jehovah 1), the demiurge who seems like a cross between Yahweh and a Lovecraftian elder god. JHVH-1 intended for “Bob” to encourage a pragmatic, utilitarian message about the benefits of a work ethic, but contra his square appearance, the messiah preferred to advocate that his followers pursue a quality known as slack. Never clearly defined (though its connotations are obvious), slack is to the Church of the SubGenius what the Tao is to Taoism or the Word is to Christianity, both the font of all reality and that which gives life meaning. Converts to the faith include luminaries like the underground cartoonist Robert Crumb, Pee-wee Herman Show creator Paul Reubens, Talking Heads founder David Byrne, and of course Devo’s Mark Mothersbaugh. If there is any commandment that most resonates with the emotional tenor of the church, it’s “Bob’s” holy injunction: “Fuck ‘em if they can’t take a joke.”

The Church of the SubGenius is oftentimes compared to another parody religion that finds its origins in a similar countercultural milieu, though it first appeared almost two decades before, in 1963, under the ominous name of Discordianism. Drawing from Hellenic paganism, Discordianism holds as one of its central axioms in the Principia Discordia (written by founders Greg Hill and Kerry Wendell Thornley under the pseudonyms Malaclypse the Younger and Omar Khayyam Ravenhurst) that the “Aneristic Principle is that of apparent order; the Eristic Principle is that of apparent disorder. Both order and disorder are man made concepts and are artificial divisions of pure chaos, which is a level deeper than is the level of distinction making.” Where other mythological systems see chaos as being tamed and subdued during ages primeval, the pious Discordian understands that disorder and disharmony remain the motivating structure of reality. To that end, the satirical elements—its faux scripture, its faux mythology, and its faux hierarchy—are paradoxically faithful enactments of its central metaphysics.

For those of a conspiratorial bent, it’s worth noting that Thornley first conceived of the movement after leaving the Marine Corps, where he had been an associate of Lee Harvey Oswald. It sounds a little like one of the baroque plots in Robert Anton Wilson and Robert Shea’s The Illuminatus! Trilogy. A compendium of occult and conspiratorial lore whose narrative complexity recalls James Joyce or Thomas Pynchon, The Illuminatus! Trilogy was an attempt to produce for Discordianism what Dante crafted for Catholicism or John Milton for Protestantism: a work of literature commensurate with theology. Stang and Drummond, it should be said, were avid readers of The Illuminatus! Trilogy. “There are periods of history when the visions of madmen and dope fiends are a better guide to reality than the common-sense interpretation of data available to the so-called normal mind,” writes Wilson. “This is one such period, if you haven’t noticed already.” And how.

Inventing religions and messiahs wasn’t merely an activity for 20th-century pot smokers. The fear of uncovering a Christ who isn’t actually there can be traced as early as the 10th century, when the Iranian warlord Abu Tahir al-Jannabi wrote about a supposed tract that referenced the “three imposters,” an atheistic denunciation of Moses, Jesus, and Muhammad. This equal-opportunity apostasy, attacking all three children of Abraham, haunted monotheism over the subsequent millennium, as the infernal manuscript was attributed to several different figures. In the 13th century, Pope Gregory IX said that the Holy Roman Emperor Frederick II had authored such a work (the latter denied it). Within Giovanni Boccaccio’s 14th-century The Decameron there is reference to the “three imposters,” and in the 17th century, Sir Thomas Browne credited the Italian Protestant refugee Bernardino Ochino with having composed a manifesto against the major monotheistic faiths.

What’s telling is that everyone feared the specter of atheism, but no actual text existed. They were scared of a possibility without an actuality, terrified of a dead God who was still alive. It wouldn’t be until the 18th century that writing would actually be supplied, in the form of the anonymous French pamphlet of 1719, the Treatise of the Three Imposters. That work, going through several different editions over the next century, drew on the naturalistic philosophy of Benedict Spinoza and Thomas Hobbes to argue against the supernatural status of religion. Its author, possibly the bibliographer Prosper Marchand, argued that the “attributes of the Deity are so far beyond the grasp of limited reason, that man must become a God himself before he can comprehend them.” One imagines that the prophets of the Church of the SubGenius and Discordianism, inventors of gods and messiahs aplenty, would concur. “Just because some jackass is an atheist doesn’t mean that his prophets and gods are any less false,” preaches “Bob” in The Book of the SubGenius.

9.The glistening promise of white-flecked SPAM coated in greasy aspic as it slips out from its corrugated blue can, plopping onto a metal plate with a satisfying thud. A pack of Lucky Strikes, with its red circle in a field of white, crinkle of foil framing a raggedly opened end, sprinkle of loose tobacco at the bottom as a last cigarette is fingered out. Heinz Baked Beans, sweet in their tomato gravy, the yellow label with its picture of a keystone slick with the juice from within. Coca-Cola—of course Coca-Cola—its ornate calligraphy on a cherry red can, the saccharine nose pinch within. Products of American capitalism, the greatest and most all-encompassing faith of the modern world, left behind on the Vanuatuan island of Tanna by American servicemen during the Second World War.

More than 1,000 miles northeast of Australia, Tanna was home to airstrips and naval bases, and for the local Melanesians, the 300,000 GIs who passed through the archipelago indelibly marked their lives. Certainly the first time most had seen the descent of airplanes from the blue of the sky, the first time most had seen jangling Jeeps careening over the paths of Tanna’s rainforests, the first time most had seen naval destroyers on the pristine Pacific horizon. Building upon a previous cult that had caused trouble for the jointly administered colonial British-French Condominium, the Melanesians claimed that the precious cargo of the Americans could be accessed through the intercession of a messiah known as John Frum—believed to have possibly been a serviceman who’d introduced himself as “John from Georgia.”

Today the John Frum cult still exists in Tanna. A typical service can include the raising of the flags of the United States, the Marine Corps, and the state of Georgia, while shirtless youths with “USA” painted on their chests march with faux rifles made out of sticks (other “cargo cults” are more Anglophilic, with one worshiping Prince Philip). Anthropologists first noted the emergence of the John Frum religion in the immediate aftermath of the Americans’ departure, with Melanesians apparently constructing landing strips and air traffic control towers from bamboo, speaking into left-over tin cans as if they were radio controls, all to attract back the quasi-divine Americans and their precious “cargo.” The anthropologist Holger Jebens, in After the Cult: Perceptions of Other and Self in West New Britain (Papua New Guinea), describes “cargo cults” as having as their central goal the acquisition of “industrially manufactured Western goods brought by ship or aeroplane, which, from the Melanesian point of view, are likely to have represented a materialization of the superior and initially secret power of the whites.” In this interpretation, the recreated ritual objects molded from bamboo and leaves are offerings to John Frum so that he will return from the heavenly realm of America bearing precious cargo.

If all this sounds sort of dodgy, then you’ve reason to feel uncomfortable. More recently, some anthropologists have questioned the utility of the phrase “cargo cult,” and the interpretation of the function of those practices. Much of the previous model, mired in the discipline’s own racist origins, posits the Melanesians and their beliefs as “primitive,” with all of the attendant connotations of that word. In the introduction to Beyond Primitivism: Indigenous Religious Traditions and Modernity, Jacob K. Olupona writes that some scholars have defined the relationship between ourselves and the Vanuatuans by “challenging the notion that there is a fundamental difference between modernity and indigenous beliefs.” It’s easy for an anthropologist espying the bamboo landing field to assume that what was being enacted was a type of magical conspicuous consumption, a yearning on the part of the Melanesians to take part in our own self-evidently superior culture. The only thing that’s actually self-evident in such a view, however, is a parochial and supremacist positioning that gives little credit to unique religious practices. Better to borrow the idea of allegory in interpreting the “cargo cults,” considering both how that way of thinking may shape the symbolism of their rituals and what those practices could reflect about our own culture.

Peter Worsley writes in The Trumpet Shall Sound: A Study of “Cargo” Cults in Melanesia that “a very large part of world history could be subsumed under the rubric of religious heresies, enthusiastic creeds and utopias,” and this seems accurate. So much of the prurient focus on the John Frum religion is preoccupied with the materialism of the faith, and consequently there is a judgment of it as being superficial. Easy for those of us who’ve never been hungry to look down on praying for food, easy for those with access to medicine to pretend that materialism is a vice. Mock praying for SPAM at your own peril; compared to salvation, I at least know what the former is. One of Christ’s most famous miracles, after all, was the multiplication of the loaves and fishes, a prime instance of generating cargo. John Frum, whether he is real or not, is a radical figure, for while a cursory glance at his cult might suggest that what he promises is capitalism, it’s actually the exact opposite. For those who pray to John Frum are not asking for work, or labor, or a Protestant work ethic, but rather delivery from bondage; they are asking to be shepherded into a post-scarcity world. John Frum is not intended to deliver us to capitalism, but rather to deliver us from it. John Frum is not an American, but he is from that more perfect America that exists only in the Melanesian spirit.

10.On Easter of 1300, within the red Romanesque walls of the Cistercian monastery of Chiaravalle, a group of Umiliati Sisters were convened by Maifreda da Pirovano at the grave of the Milanese noblewoman Guglielma. Having died two decades before, the mysterious Guglielma was possibly the daughter of King Premysl Otakar I of Bohemia, having come to Lombardy with her son following her husband’s death. Wandering the Italian countryside as a beguine, Guglielma preached an idiosyncratic gospel, claiming that she was an incarnation of the Holy Spirit, and that her passing would inaugurate the third historical dispensation, destroying the patriarchal Roman Catholic Church in favor of a final covenant to be administered through women. If Christ was the new Adam come to overturn the Fall, then his bride was Guglielma, the new Eve, who rectified the inequities of the old order and whose saving grace came for humanity, but particularly for women.

Now a gathering of nuns convened at her unassuming shrine, in robes of ashen gray and scapulars of white. There Maifreda would perform a Mass, transubstantiating the wafer and wine into the body and blood of Christ. The Guglielmites would elect Maifreda the first Pope of their Church. Five hundred years after the supposed election of the apocryphal Pope Joan, this obscure order of women praying to a Milanese aristocrat would confirm Maifreda as the Holy Mother of their faith. That same year the Inquisition would try some 30 Guglielmites, sending three to the stake—including la Papessa. “The Spirit blows where it will and you hear the sound of it,” preached Maifreda, “but you know not whence it comes or whither it goes.”

Maifreda was to Guglielma as Paul was to Christ: apostle, theologian, defender, founder. She was the great explicator of the “true God and true human in the female sex…Our Lady is the Holy Spirit.” Strongly influenced by a heretical strain of Franciscans indebted to the Calabrian mystic Joachim of Fiore, Maifreda held that covenantal history was divided tripartite, with the first era of Law and God the Father, the second of Grace and Christ the Son, and the third and coming age of Love and the Daughter known as the Holy Spirit. Barbara Newman writes in From Virile Woman to WomanChrist: Studies in Medieval Religion and Literature that after Guglielma’s “ascension the Holy Spirit would found a new Church, superseding the corrupt institution in Rome.” For Guglielma’s followers, drawn initially from the aristocrats of Milan but increasingly popular among more lowly women, these doctrines allowed for self-definition and resistance against both Church and society. Guglielma was a messiah and she arrived for women, and her prophet was Maifreda.

Writing of Guglielma, Newman says that “According to one witness… she had come in the form of a woman because if she had been male, she would have been killed like Christ, and the whole world would have perished.” In a manner she was killed, some 20 years after her death, when her bones were disinterred from Chiaravalle and placed on the pyre where Maifreda would be burnt alongside two of her followers. Pope Boniface VIII would not abide another claimant to the papal throne—especially from a woman. But even while Maifreda would be immolated, the woman to whom she gave the full measure of sacred devotion would endure, albeit at the margins. Within a century Guglielma would be repurposed into St. Guglielma, a pious woman who suffered under the false accusation of heresy, and who was noted as particularly helpful in interceding against migraines. But her subversive import wasn’t entirely dampened over the generations. When the Renaissance painter Bonifacio Bembo was commissioned to paint an altarpiece around 1445 in honor of the Council of Florence (an unsuccessful attempt at rapprochement between the Catholic and Orthodox Churches), he depicted God crowning Christ and the Holy Spirit. Christ appears as can be expected, but the Holy Spirit has Guglielma’s face.

Maifreda survived in her own hidden way as well, and also through the helpful intercession of Bembo. The altar that he crafted had been commissioned by members of the powerful Visconti family, leaders in Milan’s anti-papal Ghibelline party, and they also requested that the artist produce 15 decks of Tarot cards. The so-called Visconti-Sforza deck, the oldest surviving example of the form, doesn’t exist in any complete set, having been broken up and distributed to various museums. Several of these cards would enter the collection of the American banker J.P. Morgan, where they’d be stored in his Italianate mansion on Madison Avenue. A visitor can see cards from the Visconti-Sforza deck that include the fool in his jester’s cap and mottled pants, the skeletal visage of death, and most mysterious of all, la Papessa—the female Pope. Bembo depicts a regal woman, in ash-gray robes and white scapular, the papal tiara upon her head. In the 1960s, the scholar Gertrude Moakley observed that the female pope’s distinctive dress indicates her order: she was an Umiliati. Maifreda herself was a first cousin of a Visconti, the family preserving the memory of the female pope in Tarot. On Madison Avenue you can see a messiah who for centuries was shuffled between any number of other figures, never sure of when she might be dealt again. Messiahs are like that, often hidden—and frequently resurrected.

Bonus Links: “Ten Ways to Live Forever,” “Ten Ways to Change Your God,” “Ten Ways to Look at the Color Black”

Image Credit: Wikipedia.

A Fraternity of Dreamers

-

“There is no syllable one can speak that is not filled with tenderness and terror, that is not, in one of those languages, the mighty name of a god.” —Jorge Luis Borges, “The Library of Babel” (1941)

“Witness Mr. Henry Bemis, a charter member in the fraternity of dreamers. A bookish little man whose passion is the printed page…He’ll have a world all to himself…without anyone.” —Rod Serling, “Time Enough at Last,” The Twilight Zone (1959)

When entering a huge library—whether its rows of books are organized under a triumphant dome, or they’re encased within some sort of vaguely Scandinavian structure that’s all glass and light, or they simply line dusty back corridors—I must confess that I’m often overwhelmed by a massive surge of anxiety. One must be clear about the nature of this fear—it’s not from some innate dislike of libraries, the opposite actually. The nature of my trepidation is very exact, though as far as I know there’s no English word for it (it seems like some sort of sentiment that the Germans might have an untranslatable phrase for). This fear concerns the manner in which the enormity of a library’s collection forces me to confront the sheer magnitude of all that I don’t know, all that I will never know, all that I can never know. When walking into the red-brick modernist hangar of the British Library, which houses all of those brittle books within a futuristic glass cube that looks like a robot’s heart, or the neo-classical Library of Congress with its green patina roof, or Pittsburgh’s large granite Carnegie Library main branch, smoked dark with decades of mill exhaust and guarded by a bronze statue of William Shakespeare, my existential angst is the same. If I start to roughly estimate the number of books per row, the number of rows per room, the number of rooms per floor, that angst can become severe. The symptom can even present in smaller libraries; I’ve felt it alike in the small-town library of Washington, Penn., on Lincoln Avenue and in the single room of the Southeast Library of Washington D.C. on Pennsylvania Avenue. Intrinsic to my fear are those intimations of mortality whereby even a comparatively small collection must make me confront the fact that in a limited and hopefully not-too-short life I will never be able to read even a substantial fraction of that which has been written. All those novels, poems, and plays; all those sentiments, thoughts, emotions, dreams, wishes, aspirations, desires, and connections—completely inaccessible because of the sheer fact of finitude.

Another clarification is in order—my fear isn’t the same as worrying that I’ll be found out for having never read any number of classical or canonical books (or those of the pop, paperback variety either). There’s a scene in David Lodge’s classic and delicious campus satire Changing Places: A Tale of Two Campuses in which a group of academics play a particularly cruel game, as academics are apt to do, that asks participants to name a venerable book they’re expected to have read but have never opened. Higher point-values are awarded the more canonical a text is; what the neophytes don’t understand is that the trick is to mention something standard enough that they can still get the points for having not read it (like Laurence Sterne’s Tristram Shandy) but not so standard that they’ll look like an idiot for having never read it. One character—a recently hired English professor—is foolish enough to admit that he skipped Hamlet in high school. The other academics are stunned into silence. His character is later denied tenure. So, at the risk of making the same error, I’ll lay it out and admit to any number of books that the rest of you have probably read, but that I only have a glancing Wikipedia familiarity with: Marcel Proust’s Remembrance of Things Past, James Joyce’s Finnegans Wake, Don DeLillo’s White Noise, David Foster Wallace’s Infinite Jest. I’ve never read Harper Lee’s To Kill a Mockingbird, which is ridiculous and embarrassing, and I feel bad about it. I’ve also never read Jonathan Franzen’s The Corrections, though I don’t feel bad about that (I am, however, a bit sheepish that I’ve not read the vast bulk of J.K. Rowling’s Harry Potter books). Some of those previously mentioned books I want to read, others I don’t; concerning the latter category, some of those titles make me feel bad about my resistance to them, others I haven’t thought twice about (I’ll let you guess individual titles’ statuses).

I offer this contrition only as a means of demonstrating that my aforementioned fear goes beyond simple imposter syndrome. There are any number of reasons why we wish we’d read certain things, and why we feel an attendant moroseness for not having done so—the social stigma of admitting such things, a feeling of not being educated enough or worldly enough, the simple fact that there might be stuff that we’d like to read, but inclination, will power, or simply time has gotten in the way. The anxiety that libraries can sometimes give me is of a wholly more cosmic nature, for something ineffable affects my sense of self when I realize that the majority of human interaction, expression, and creativity shall forever be unavailable to me. Not only is it impossible for me to read the entirety of literature, it’s impossible to approach even a fraction of it—a fraction of a fraction of it. Several blocks from where I now write is the Library of Congress, the largest collection in the world, which according to its website contains 38 million books (that’s excluding other printed material from posters to pamphlets). If somebody read a book a day—setting aside, of course, the length of any given book—it would take about 104,109 years and change to read everything within that venerable institution (ignoring the fact that about half-a-million to a million new titles are published every year in English alone, and that I was also too unconcerned to factor in leap years).

If you’re budgeting your time, may I suggest the British Library, which, though it has a much larger collection of other textual ephemera, has a more manageable 13,950,000 books, which would take you a breezy 38,219 years to get through. If you’re of a totalizing personality, then according to a 2010 study by Google engineers estimating the number of books ever written, you’ll have to wade through 129 million volumes of varying quality. That would take you about 353,425 years to read. Of course this ignores all of that which has been written but not bound within a book—all of the jottings, the graffiti, the listings, the diaries, the text messages, the letters, and the aborted novels for which the authors have wisely or unwisely hit “Delete.” Were some hearty and voracious reader to consume just one percent of one percent of one percent of all that’s ever been written, they’d be the single most well-read individual to ever live. When we reach the sheer scale of how much human beings have expressed, have written, we enter the realm of metaphors that call for comparisons to grains of sand on the beach or stars in our galaxy. We depart the realm of literary criticism and enter that of cosmology. No wonder we require curated reading lists.
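
The arithmetic behind these figures is simple enough to sketch in a few lines of Python; here is a minimal back-of-the-envelope version, assuming the collection sizes cited above and the same leap-year-ignoring pace of one book per day:

    # Years needed to read a collection at one book per day (365 days/year).
    collections = {
        "Library of Congress": 38_000_000,
        "British Library": 13_950_000,
        "Google's 2010 estimate of all books": 129_000_000,
    }
    for name, books in collections.items():
        print(f"{name}: about {books / 365:,.1f} years")
    # Library of Congress: about 104,109.6 years
    # British Library: about 38,219.2 years
    # Google's 2010 estimate of all books: about 353,424.7 years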

For myself, there’s an unhealthy compulsion towards completism in the attendant tsuris over all that I’ll never be able to read. Perhaps there is something stereotypically masculine in the desire to conquer all of those unread worlds, something toxic in that need. After all, in those moments of readerly ennui there’s little desire for the experience, little need for quality, only that desire to cross titles off of some imagined list. Assume it were even possible to read all that has been thought and said, whether sweetness and light or bile and heft, and consider what purpose that accomplishment would even have. Vaguely nihilistic the endeavor would be, reminding me of that old apocryphal story about the conqueror, recounted by everyone from the 16th-century theologian John Calvin to Hans Gruber in Die Hard, that “Alexander the Great…wept, as well indeed he might, because there were no more worlds to conquer,” as the version of that anecdote is written in Washington Irving’s 1835 collection Salmagundi: Or, The Whim-whams and Opinions of Launcelot Langstaff, Esq. and Others. Poor Alexander of Macedon, son of Philip, tutored by Aristotle, and witness to bejeweled Indian war-elephants bathing themselves on the banks of the Indus and the lapis lazuli encrusted Hanging Gardens of Babylon, the gleaming white pyramid of Cheops and the massive gates of Persepolis. Alexander’s map of the world was dyed red as his complete possession—he’d conquered everything that there was to be conquered. And so, following the poisoning of his lover Hephaestion, he holed up in Nebuchadnezzar’s Babylonian palace, and he binged for days. Then he died. An irony though, for Alexander hadn’t conquered, or even been to, all the corners of the world. He’d never sat on black sand beaches in Hokkaido with the Ainu, he’d never drunk ox-blood with the Masai or hunted the giant moa with the Maori, nor had he been on a walkabout in the Dreamtime with the Anangu Pitjantjatjara or stood atop Ohio’s Great Serpent Mound or seen the grimacing stone heads of the Olmec. What myopia, what arrogance, what hubris—not to conquer the world, but to think that you had. Humility is warranted whether you’re before the World or the Library.

Alexander’s name is forever associated not just with martial ambitions, but with voluminous reading lists and never-ending syllabi as well, due to the library in the Egyptian city to which he gave his name, what historian Roy MacLeod describes in The Library of Alexandria: Centre of Learning in the Ancient World as “unprecedented in kingly purpose, certainly unique in scope and scale…destined to be far more ambitious [an] undertaking than a mere repository of scrolls.” The celebrated Library of Alexandria, the contents of which are famously lost to history, supposedly confiscated every book from each ship that came past the lighthouse of the city, had its scribes make a copy of the original, and then returned the counterfeit to the owners. This bit of bibliophilic chicanery was instrumental to the mission of the institution—the Library of Alexandria wasn’t just a repository of legal and religious documents, nor even a collection of foundational national literary works, but supposedly an assembly that in its totality would match all of the knowledge in the world, whether from Greece and Egypt, Persia and India. Matthew Battles writes in Library: An Unquiet History that Alexandria was “the first library with universal aspirations; with its community of scholars, it became a prototype of the university of the modern era.” Alexander’s library yearned for completism as much as its namesake had yearned to control all parts of the world; the academy signified a new, quixotic emotion—the desire to read, know, and understand everything. By virtue of the world being so much smaller at the time (at least as far as any of the librarians working there knew), such an aspiration was even theoretically possible.

“The library of Alexandria was comprehensive, embracing books of all sort from everywhere, and it was public, open to anyone with fitting scholarly or literary qualifications,” writes Lionel Casson in Libraries in the Ancient World. The structure overseen by the Ptolemaic Dynasty, the heirs of Alexander’s general Ptolemy, was much more of a wonder of the ancient world than the Lighthouse in the city’s harbor. Within its walls, whose appearance is unclear to us, Aristophanes of Byzantium was the first critic to divide poetry into lines, 70 Jews convened by Ptolemy II translated the Hebrew Torah into the Greek Septuagint, and the geographer Eratosthenes correctly calculated the circumference of the Earth. Part of the allure of Alexandria, especially to any bibliophile in this fraternity of dreamers, is the fact that the vast bulk of what was kept there is entirely lost to history. Her card catalogue may have included lost classical works like Aristotle’s second book of the Poetics, on comedy (a plot point in Umberto Eco’s medieval noir The Name of the Rose), Protagoras’s essay “On the Gods,” the prophetic books of the Sibyllines, Cato the Elder’s seven-book history of Rome, the tragedies of the rhetorician Cicero, and even the comic mock-epic Margites supposedly written by Homer.

More than the specter of all that has been lost, Alexandria has become synonymous with the folly of anti-intellectualism, as its destruction (variously, and often erroneously, attributed to Romans, Christians, and Muslims) is a handy and dramatic narrative to illustrate the eclipse of antiquity. Let’s keep some perspective though—let’s crunch some numbers again. According to Robin Lane Fox in The Classical World: An Epic History from Homer to Hadrian the “biggest library…was said to have grown to nearly 500,000 volumes.” Certainly not a collection to scoff at, but Alexander’s library, which drew from the farthest occident to the farthest orient, held only a sixth of all the books in the Library of Congress’s Asian Collection; Harvard University’s Widener Library has 15 times as many books (and that’s not including the entire system); the National Library of Iran, housed not far from where Alexander himself died, has 30 times the volumes of that ancient collection. The number of books held by the Library of Alexandria would have been perfectly respectable in the collection of a small midwestern liberal arts college. By contrast, according to physicist Barak Shoshany in a Quora answer, if the 5 zettabytes of the Internet were to be printed, then the resultant stack of books would have to fit on a shelf “4×10^11 km or about 0.04 light years” thick, the last volume floating somewhere near the Oort Cloud. Substantially larger shelves would be needed, it goes without saying, than whatever was kept in the storerooms of Alexandria with that cool Mediterranean breeze curling the edges of those papyri.
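
Shoshany’s shelf length is easy to verify; a one-line sanity check in Python, assuming the standard figure of roughly 9.46 trillion kilometers per light year:

    # A 4 x 10^11 km shelf of books, expressed in light years.
    KM_PER_LIGHT_YEAR = 9.4607e12
    print(4e11 / KM_PER_LIGHT_YEAR)  # ~0.042, i.e. about 0.04 light years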

To read all of those scrolls, codices, and papyri at Alexandria would take our intrepid ideal reader a measly 1,370 years. More conservative historians estimate that the Library of Alexandria may have housed only 40,000 books—if that is the case, then it would take a little more than a century to read (if you’re still breezing through a book a day). That’s theoretically within the lifetime of someone gifted with just a bit of longevity. All of this numeric stuff misses the point, though. It’s all just baseball card collecting, because what the Library of Alexandria represented—accurately or not—was the dream that it might actually be possible to know everything worth knowing. But since the emergence of modernity some half-millennium ago, and the subsequent fracturing of disciplines into ever more finely tuned fields of study, it’s been demonstrated just how much of a fantasy that goal is. There’s a certain disposition that’s the intellectual equivalent of Alexander, and our culture has long celebrated that personality type—the Renaissance Man (and it always seems gendered thus). Just as there was always another land to be conquered over the next mountain range, pushing through the Kush and the Himalayan foothills, so too does the Renaissance Man have some new type of knowledge to master, say geophysics or the conjugation of Akkadian verbs. Nobody except for Internet cranks or precocious and delusional autodidacts actually believes in complete mastery of all fields of knowledge anymore; by contrast, for all that’s negative about graduate education, one clear and unironic benefit is that it taught me the immensity and totality of all of the things that I don’t know.
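
The same book-a-day arithmetic from earlier, applied to the two ancient estimates cited above:

    # Years to read the Library of Alexandria at one scroll per day.
    for scrolls in (500_000, 40_000):
        print(f"{scrolls:,} scrolls: about {scrolls / 365:,.0f} years")
    # 500,000 scrolls: about 1,370 years
    # 40,000 scrolls: about 110 years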

Alexandria’s destruction speaks to an unmistakable romance about that which we’ll never be able to read, but it also says something practical about a subset of universal completism—our ability to read everything that has survived from a given historical period. By definition it’s impossible to actually read all of classical literature, since the bulk of it is no longer available, but to read all of the Greek and Roman writing which survives—that is doable. It’s been estimated that less than one percent of classical literature has survived to the modern day, with Western cultural history sometimes reading as a story in which luck and monks, in equal measure, preserved that inheritance. It would certainly be possible for any literate individual to read all of Aristophanes’s plays, all of Plato’s dialogues, all of Juvenal’s satires. Harvard University Press’s venerable Loeb Classical Library, preserving Greek and Latin literature in their distinctive minimalist green and red covered volumes, currently has 530 titles available for purchase. Though it doesn’t encompass all that survives from the classical period, it comes close. An intrepid and dogged reader would be able to get through them, realistically, in a few years (comprehension is another matter).

If you need to budget your time, all of Anglo-Saxon writing that survives—that which didn’t get sewn into the back-binding of some inherited English psalm book or end up as kindling in the 16th century when Henry VIII dissolved the monasteries—is contained across some four major poetic manuscripts, though around 100 more general manuscripts endure. The time period that I’m a specialist in, which now goes by the inelegant name of the “early modern period” but which everybody else calls the “Renaissance,” is arguably the first era for which no scholar would be capable of reading every primary source that endures. Thanks to its relative proximity to our own time, and the preponderance of titles gestated by the printing press, it would be impossible for anyone to read everything produced in those centuries. For every William Shakespeare play, there are hundreds of yellowing political pamphlets about groups with names like “Muggletonians;” for every John Milton poem, a multitude of religious sermons on subjects like double predestination. You have to be judicious in what you choose to read, since one day you’ll be dead. This reality should be instrumental in any culture wars détente—canons exist as a function of pragmatism.

The canon thus functions as a kind of short-cut to completism (if you want to read through all of the Penguin Classics editions, with their iconic black covers and little avian symbol, that’s a meager 1,800 titles to get through). Alexandria’s delusion about gathering all of that which has been written, and perhaps educating oneself from that corpus, drips down through the history of Western civilization. We’ve had no shortage of Renaissance Men who, even if they hadn’t read every book ever written, perhaps at least roughly knew where all of them could be found in the card catalogue. Aristotle was a polymath who not only knew the location of those works, but credibly wrote many of them (albeit all that remains are student lecture notes), arguably the founder of fields as diverse as literary criticism and dentistry. In the Renaissance, when one would assume the attendant Renaissance Man was most celebrated, there was a preponderance of those for whom it was claimed that they had mastered all disciplines that could be mastered (and were familiar with the attendant literature review). Leonardo da Vinci, Blaise Pascal, Athanasius Kircher, Isaac Newton, and Gottfried Wilhelm Leibniz have all been configured as Renaissance Men, their writings respectively encompassing not just art, mathematics, theology, physics, and philosophy, but also aeronautics, gambling, sinology, occultism, and diplomacy as well.

Stateside, both Benjamin Franklin and Thomas Jefferson (a printer and a book collector) are classified as such, and more recently figures as varied as Nikola Tesla and Noam Chomsky are sometimes understood as transdisciplinary polymaths (albeit ones for whom it would be impossible to have read all that can be read, even if it appears as such). It’s hard to disentangle the canonization of such figures from the social impulse to be “well read,” but in the more intangible and metaphysical sense, beyond wanting to seem smart because you want to seem smart, the icon of the Renaissance Man can’t help but appeal to that completism, that desire for immortality that is prodded by the anxiety that libraries inculcate in me. My patron saint of polymaths is the 17th-century German Jesuit Kircher, beloved by fabulists from Eco to Jorge Luis Borges, for his writings that encompassed everything from mathematics to hieroglyphic translation. Paula Findlen writes in the introduction to her anthology Athanasius Kircher: The Last Man Who Knew Everything that he was the “greatest polymath of an encyclopedic age,” yet when his rival Renaissance Man Leibniz first dipped into the voluminous mass of Kirchermania he remarked that the priest “understands nothing.”

Really, though, that’s true of all people. I’ve got a PhD and I can’t tell you how lightbulbs work (unless they fit into Puritan theology somehow). Kircher’s translations of Egyptian were almost completely and utterly incorrect. As was much of what else he wrote on, from mineralogy to Chinese history. He may have had an all-encompassing understanding of all human knowledge during that time period, but Kircher wasn’t right very often. That’s alright; the same criticism could be leveled at his interlocutor Leibniz. Same as it ever was, and applicable to all of us. We’re not so innocent anymore; the death of the Renaissance Man is like the death of God (or of a god). The sheer amount of that which is written, the sheer number of disciplines that exist to explain every facet of existence, should disabuse us of the idea that there’s any way to be well-educated beyond the most perfunctory meaning of that phrase. In that gulf between our desire to know and the poverty of our actual understanding are any number of mythic figures who somehow close that gap; troubled figures from Icarus to Dr. Faustus who demonstrate the hubris of wishing to read every book, to understand every idea. A term should exist for the anxiety that those examples embody, the quaking fear before the enormity of all that we don’t know. Perhaps the readerly dilemma, or even textual anxiety.

A full accounting of the nature of this emotion compels me to admit that it goes beyond simply fearing that I’ll never be able to read all of the books in existence, or in some ideal library, or, if I’m being honest, even in my own library. The will towards completism alone is not the only attribute of textual anxiety, for a not dissimilar queasiness can accompany related (though less grandiose) activities than the desire to read all books that were ever written. To wit—sometimes I’ll look at the ever-expanding pile of books that I’m to read, including volumes that I must read (for reviews or articles) and those that I want to read, and I’m paralyzed by that ever-growing paper cairn. Such debilitation isn’t helpful; the procrastinator’s curse is a personality defect that’s the equivalent of emotional quicksand. Against this foolish inclination towards completism—desiring everything and thus acquiring nothing—I sometimes use Francis Bacon’s claim from his essays of 1625 that “Some books are to be tasted, others to be swallowed, and some few to be chewed and digested” as a type of mantra against textual anxiety, and it mostly works. Perhaps I should learn to cross-stitch it as a prayer to display amongst my books, which, even if I haven’t read them all, have at least been opened (mostly).

But textual anxiety manifests itself in a far weirder way, one that I think gets to the core of what makes the emotion so disquieting. When I’m reading some book that I happen to be enjoying, some random novel picked up from the library or purchased at an airport to pass the time, but not the works that I perennially turn toward—Walt Whitman and John Milton, John Donne and Emily Dickinson—I’m sometimes struck with a profound melancholy born from the fact that I shall never read these sentences again. Like meeting a friendly stranger who somehow indelibly marks your life by the tangible reality of their being, but who will return to anonymity. Then it occurs to me that even those things I do read again and again, Leaves of Grass and Paradise Lost, I will one day also read for the last time. What such textual anxiety trades in, like all things of humanity, is my fear of finality, of extinction, of death. That’s at the center of this, isn’t it? The simultaneous fear of there being no more worlds to conquer and the fear that the world never can be conquered. Such consummation, the obsession with having it all, evidences a rather immature disposition.

It’s that Alexandrian imperative, but if there is somebody wiser and better to emulate it’s the old cynic Diogenes of Sinope, the philosophical vagabond who spent his days living in an Athenian pot. Diogenes Laertius reports in Lives of the Eminent Philosophers that “When [Diogenes] was sunning himself…Alexander came and stood over him and said: ‘Ask me for anything you want.’ To which he replied, ‘Stand out of my light.’” And so, the man with everything was devoid of things to give to the man with nothing. There’s something indicative in that when it comes to this fear that there are things you’ll never be able to read, things you’ll never be able to know. The point, Diogenes seems to be saying, is to enjoy the goddamn light. Everything in that. I recall once admitting my fear about all that I don’t know, all of those books that I’ll never open, to a far wiser former professor of mine. This was long before I understood that an education is knowing what you don’t know, understanding that there are things you will never know, and worshiping at the altar of your own sublime ignorance. When I explained this anxiety of all of these rows and rows of books to never be opened, she was confused. She said, “Don’t you see? That just means that you’ll never run out of things to read.” Real joy, it would seem, comes in the agency to choose, for if you were able to somehow give your attention equally to everything then you’d suffer the omniscient imprisonment that only God is cursed with. The rest of us are blessed with the endless, regenerative, confusing, glorious, hidden multiplicity of experience in discrete portions. Laertius writes, “Alexander is reported to have said, ‘Had I not been Alexander, I should have liked to be Diogenes.’”

Image Credit: Wikipedia.

Stories in Formaldehyde: The Strange Pleasures of Taxonomizing Plot

-

Somewhere within the storerooms of London’s staid, gray-faced Tate Gallery (for it’s currently no longer on exhibit) is an 1834 painting by J.M.W. Turner entitled “The Golden Bough.” Rendered in that painter’s characteristic sfumato of smeared light and smoky color, Turner’s composition depicts a scene from Virgil’s epic Aeneid wherein the hero is commanded by that seven-centuries-old prophetic crone, the Sibyl of Cumae, to make an offering of a golden bough from a sacred tree growing upon the shores of crystalline blue Lake Avernus to the goddess Proserpina, if he wishes to descend to Hades and see the shadow of his departed father. “Obscure they went through dreary shades, that led/Along the waste dominions of the dead,” translated John Dryden in 1697, using his favored totemistic Augustan rhyming couplets, as Aeneas descends further into the Underworld, its entrance a few miles west of Naples. As imagined by Turner, the area around the volcanic lake is pleasant, if sinister; bucolic, if eerie; pastoral, if unsettling. A dapple of light marks the portal whereby pilgrims journey into perdition; in the distance tall, slender trees topped with a cap of branches jut up throughout the landscape. A columned temple is nestled within the scrubby hills overlooking the field. The Sibyl stands with a scythe so that the vegetable sacrifice can be harvested, postlapsarian snakes slither throughout, and the Fates revel in mummery near hell’s doorway. Rather than severe tones of blood red and sulfurous black, earthy red and cadaverous green, Turner opted to depict Avernus in soft blues and grays, and the result is all the more disquieting. Here, the viewer might think, is what the passage between life and death must look like—muted, temperate, serene, barely even noticeable in the transition from one to the next.

As with the best of Turner’s paintings, with his eye for color the visual equivalent of perfect pitch, it is the texture of hues that renders, if not some didactic message about his subject, a general emotional sense, a sentiment hard to describe, registering at a pitch that can barely be heard and yet alters one’s feelings in the moment. Such was the sense conveyed by the Scottish folklorist James George Frazer, who borrowed the artist’s title for his landmark 1890 study The Golden Bough: A Study in Comparative Religion, describing on his first page how the painting is “suffused with the golden glow of imagination in which the divine mind of Turner steeped and transfigured even the fairest natural landscape.” This scene, Frazer enthuses, “is a dream-like vision of the little woodland…[where] Dian herself might still linger by this lonely shore, still haunt these woodlands wild.” An influential remnant of a supremely Victorian enthusiasm for providing quasi-scientific gloss to the categorization of mythology, Frazer’s study provided a taxonomy of classical myth so as to find certain similarities, the better to provide a grand, unified theory of ancient religion (or what Edward Casaubon in George Eliot’s Middlemarch, written two decades before, might call The Key to All Mythologies). First viewing Turner’s canvas, the rationalist Frazer was moved by the painting’s mysteriousness, the way in which the pool-blue sky and the shining hellmouth trade in nothing as literal as mere symbolism, but wherein the textured physicality—the roughness of the hill and the ominous haze of the clouds, dusk’s implied screaming cicadas and the cool of the evening—conveys an ineffable feeling. Despite pretensions to a more logical analysis, Frazer intimates the numinous (for how couldn’t he?). “Who does not know Turner’s picture of the Golden Bough?” he writes.

His argument in The Golden Bough was that religions originated as primitive fertility cults, dedicated to the idea of sacrifice and resurrection, and that from this fundamentally magical worldview would evolve more sophisticated religions, to finally be supplanted by secular science. The other argument of The Golden Bough is implicit in the book’s very existence—that structure can be ascertained within the messy morass of disparate myths. To make this argument he drew from sources as diverse as Virgil and the Nootka people of British Columbia, classifying, categorizing, and organizing data as surely as a biologist preserving specimens in a jar of formaldehyde. And like Charles Darwin measuring finch beaks, or Thomas Huxley pinning butterflies to wood blocks, Frazer believed that diversity was a mask for similarity.

As reductionist as his arguments are, and as disputed as his conclusions may be, Frazer’s influence was outsized among anthropologists, folklorists, writers, and especially literary critics, who thrilled to the idea that some sort of unity could be found in the chaotic variety of narratives that constitute world mythology. “I am a plain practical man,” Frazer writes, “not one of your theorists and splitters of hairs and choppers of logic,” and while it’s true that The Golden Bough evidences a more imaginative disposition, it still takes part in that old quixotic desire to find some Grand Unified Theory of Narrative. While Frazer’s beat was myth, he was still a reporter of stories, and percolating like a counter-rhythm within discussions of narrative is that old desire, the yearning to find the exact number of plots that it is possible to tell. Frazer, for all that was innovative about his thought, was neither the first nor the last to treat stories like animals in a genus, narratives as if creatures in a phylum.

That grand tradition claims there are only 36 stories that can be told, or seven, or four. Maybe there is really only one tale, the story of wanting something and not getting it, which is after all the contour of this story itself—the strange endurance of the sentiment that all narrative can be easily classified into a circumscribed, finite, and relatively small number of possibilities. While I’ve got my skepticism about such an endeavor—seeing those suggested systems as erasing the particularity of stories, of occluding what makes them unique by mutilating them to fit some Procrustean bed—I’d be remiss not to confess that I also find these theories immensely pleasing. There is something to be said for the cool rectilinear logic that claims any story, from Middlemarch to Fifty Shades of Grey, Citizen Kane to Gremlins 2, can be stripped down to its raw schematics and analyzed as one of a set of fundamental, universal, eternal plots that existed before Gilgamesh’s cuneiform was wedged into wet clay.

Christopher Booker claims in The Seven Basic Plots: Why We Tell Stories that “wherever men and women have told stories, all over the world, the stories emerging to their imaginations have tended to take shape in remarkably similar ways,” differences in culture, language, or faith be damned. With some shading, Booker uses the archetypal psychoanalysis of Carl Jung to claim that every single narrative, whether in epic or novel, film or comic, can be slotted into 1) overcoming the monster (Beowulf, George Lucas’s Star Wars), 2) rags to riches (Charlotte Brontë’s Jane Eyre, Horatio Alger stories), 3) the quest (Homer’s The Odyssey, Steven Spielberg’s Raiders of the Lost Ark), 4) voyage and return (The Ramayana, J.R.R. Tolkien’s The Hobbit), 5) comedy (William Shakespeare’s Twelfth Night, the Coen Brothers’ The Big Lebowski), 6) tragedy (Leo Tolstoy’s Anna Karenina, Arthur Penn’s Bonnie and Clyde) or 7) rebirth (Charles Dickens’s A Christmas Carol, Harold Ramis’s Groundhog Day).

That all of these parenthetically referenced works are, of course, astoundingly different from each other in character, setting, and most of all language, is irrelevant to Booker’s theory. While allowing for more subtlety than my potted overview suggests, Booker still concludes that “there are indeed a small number of plots which are so fundamental to the way we tell stories that it is virtually impossible for any storyteller ever entirely to break away from them.” Such a claim is necessary to Booker’s contention that these narratives are deeply nestled in our collective unconscious, a repository of themes, symbols, and archetypes that are “our basic genetic inheritance,” which he then proffers as an explanation for why humans tell stories at all.

The Seven Basic Plots, published in 2004 after 34 years of labor, is the sort of critical work that doesn’t appear much anymore. Audacious to the point of impudence, ambitious to the level of crack-pottery, Booker’s theory seems more at home in a seminar held by Frazer than in contemporary English departments more apt to discuss gender, race, and class in Jane Austen’s Pride and Prejudice than the Orphic themes of rebirth as manifested in that same novel. Being the sort of writer who both denied anthropogenic climate change and defended asbestos (for real), Booker had the conservative’s permanent sense of paranoid grievance concerning the treatment of his perspectives. So, let me be clear—contra Booker’s own sentiments, I don’t think that the theories in The Seven Basic Plots are ignored by literary critics because of some sort of politically correct conspiracy of silence; I think that they’re ignored because they’re not actually terribly correct or useful. When figuring out the genealogical lineage of several different species of Galapagos Island finches, similarity becomes a coherent arbiter; however, difference is more important when thinking through what makes exemplary literature exemplary. Genre, and by proxy plot, is frequently more an issue of marketing than anything else. That’s not to say that questions of genre have no place in literary criticism, but they are normally the least interesting (“What makes this gothic novel gothic?”). No stranger to such thinking himself, author Kurt Vonnegut may have solved the enigma with the most basic of monomyths elucidated—“man falls into hole, man gets out of hole.”

Booker isn’t after marketing, however; he’s after the key to all mythologies. Like Frazer before him, Booker is neither the first nor the last critic enraptured by the idea of a Periodic Table of Plots, one capable of explaining both Fyodor Dostoevsky’s Crime and Punishment and Weekend at Bernie’s. If you wish to blame somebody for this line of thinking, as with most disciplines of human endeavor from ethics to dentistry, look to Aristotle as the culprit. The philosopher’s “four conflicts”—man against himself, man against man, man against nature, and man against the gods—have long been a convenient means of categorizing plots. The allure of there being a limited number of plots is that it makes both reading and writing theoretically easier. The denizens of high-culture literary criticism have embraced the concept periodically, as surely as those producing paperbacks promising that a hit book can be easily plotted out from a limited tool kit. Georges Polti, of Providence, Rhode Island, and later Paris, France, wrote The Thirty-Six Dramatic Situations in 1895, claiming that all stories could be categorized in that number of scenarios, including plots of “Crime pursued by vengeance” and “Murderous adultery.” “Thirty-six situations only!” Polti enthuses. “There is to me, something tantalizing about the assertion.” Polti’s book has long been popular as a sort of lo-fi randomizer for generating stories, and its legacy lives on in works like Ronald B. Tobias’s 20 Master Plots and How to Build Them and Victoria Lynn Schmidt’s A Writer’s Guide to Characterization: Archetypes, Heroic Journeys, and Other Elements of Dynamic Character Development.

There is also a less pulpy, tonier history surrounding the thinking that everything can be boiled down to a handful of elemental plots. My attitude toward such thinking was a bit glib earlier, as there is something to be said for its utility, and indeed entire academic disciplines have grown from that assumption. Folklorists use a classification system called the “Aarne-Thompson-Uther Index,” where a multitude of plot-types are given numbers (“Cinderella” is 510A, for example), which can be useful to trace the ways in which stories have evolved and altered over both distance and time. Unlike Polti’s 36 plots, Tobias’s 20, or Booker’s seven, Stith Thompson’s Motif-Index of Folk-Literature runs to six volumes of folk tales, fairy tales, legends, and myths, but the basic idea is the same: plots exist in a finite number (including “Transformation: man to animal” and “Magic strength resides in hair”). As with the system of classification invented by Francis James Child in The English and Scottish Popular Ballads, or the Roud Folk Song Index, the Aarne-Thompson-Uther Index is more than just a bit of shell collecting, but rather a system of categorization that helps folklorists make sense of the diversity of oral literature, with scholar Alan Dundes enthusing that the system was among the “most valuable tools in the professional folklorist’s arsenal of aids for analysis.” Morphological approaches define the discipline known as “narrative theory,” which draws from a similar theoretical inclination as that of the ATU Index. All of these methodologies share a commitment to understanding literature less through issues of grammar, syntax, and diction, and more in terms of plot and story. For those who read with an eye towards narrative, there is frequently an inclination, sentiment, or hunch that all stories and novels, films and television shows, epics and lyrics, comics and plays, can have their fat, gristle, and tallow boiled away to leave just the broth and a plot that’s as clean as a bone.

Such a faith was popular among the Russian Formalists, sometimes incongruously known as the Prague School (after where many of them, as Soviet exiles, happened to settle), including Roman Jakobson, Viktor Shklovsky, and Vladimir Propp, the last of whom wrote Morphology of the Folktale, reducing those stories to a narrative abstraction that literally looks like mathematics. A similar movement was that of French structuralism, as exemplified by its founder the linguist Ferdinand de Saussure, and as later practiced by the anthropologist Claude Levi-Strauss and the literary critic Roland Barthes. In the Anglophone world, with the exception of some departments in thrall to narratology, literary criticism has often focused on the evisceration of a text with the scalpel of close reading rather than the measurement of plot with the calipers of taxonomy. Arguably that’s led to the American critical predilection towards “literary” fiction over genre fiction, the rejection of science fiction, fantasy, horror, and romance as unserious in favor of all of those beautifully crafted stories in The New Yorker where the climax is the main character looking out the window, sighing, and taking a sip of coffee, while realizing that she was never happy, not really.

There are exceptions to the critical valorization of language over plot, however, none more so than the once mighty but now passé writings of Canadian theorist Northrop Frye. Few scholars in the English-speaking world were more responsible for that once enthusiastic embrace of taxonomic criticism than this United Church of Canada minister and professor at Toronto’s Victoria College. Frye was enraptured by the psychoanalyst Carl Jung’s theories of how fundamental archetypes structure our collective unconscious, and he believed that a similar approach could be applied to narrative, that a limited number of plots structured our way of thinking and approaching stories. In works like Fearful Symmetry on William Blake, and his all-encompassing Anatomy of Criticism, Frye elucidated a complex, baroque, and elegant system of categorizing stories, the better to interpret them properly. “What if criticism is a science as well as an art?” Frye asked, wishing to approach literature like a taxonomist, as if novels were a multitude of plants and animals just awaiting Linnaean classification. For those who read individual poems or novels as exemplary texts, explaining what makes them work, Frye would say that they’re missing the totality of what literature is. “Criticism seems to be badly in need of a coordinating principle,” he writes, “a central hypothesis which, like the theory of evolution in biology, will see the phenomena it deals with as part of a whole.”

Frye argued that this was to be accomplished by identifying that which is universal in narrative, where works could be rendered down from their unique flesh into their skeletons, which we would then find to be myths and archetypes. From this anodyne observation, Frye spun out a complex classification system for all Western literature, one where he identifies the exact archetypes that define poetry and prose, where he flings about terms like “centripetal” and “centrifugal” to interpret individual texts, and where phrases like the “kerygmatic mode” are casually used. Anatomy of Criticism is true to its title; Frye carves up the cadaver of literature and arrives at an admittedly intoxicating theory of everything. “Physics is an organized body of knowledge about nature, and a student of it says that he is learning physics, not nature,” Frye writes. “Art, like nature, has to be distinguished from the systematic study of it, which is criticism.” In Frye’s physics, there are five “modes” of literature, including the mythic, romantic, high mimetic, low mimetic, and ironic; these are then cross-listed with tragic, comic, and thematic forms; what are then derived are genres with names like the dionysian, the elegiac, the aristophanic, and so on. Later in the book he supplies a complex theory of symbolism, a methodology concerning imagery based on the Platonic Great Chain of Being, and a thorough taxonomy of genre. In what’s always struck me as one of the odder (if ingenious) parts of Anatomy of Criticism, Frye ties genres specifically to certain seasons, so that comedy is a spring form, romance belongs to the summer, autumn is a time of tragedy, and winter births irony. How one reads books from those tropical places where the seasons neatly divide between rainy and dry speaks to a particular chauvinism on the Canadian’s part.

For most viewers of public television, however, their introduction to the “There-are-only-so-many-stories” conceit wasn’t Frye, but rather a Sarah Lawrence College professor who was the titular subject of journalist Bill Moyers’s 1988 PBS documentary Joseph Campbell and the Power of Myth. Drawing largely from his 1949 study The Hero with a Thousand Faces, Campbell became the unlikely star of the series that promulgated his theory of the “monomyth,” the idea that a single story threads through world mythology, often focused on what he termed “the hero’s journey.” Viewers were drawn to Campbell’s airy insights about the relationship between Akkadian mythology and Star Wars (a film which George Lucas admitted was heavily influenced by the folklorist’s ideas), and his vaguely countercultural pronouncement that one should “Follow your bliss!,” despite his own right-wing politics (which according to some critics could run the gamut from polite Reaganism to fascist sympathizing). Both Frye and Campbell exhibited a wide learning, but arguably only the former’s was particularly deep. With an aura of crunchy tweediness, Campbell seemed like the sort of professor who would talk to students about the Rubaiyat of Omar Khayyam in an office which smells of patchouli, a threadbare oriental rug on the dusty floor, knick-knacks assembled while studying in India and Japan, and a collapsing bookshelf jammed with underlined paperback copies of Friedrich Nietzsche and Arthur Schopenhauer above his desk. Campbell, in short, looked like what we expect a liberal arts teacher to look like, and for some of his critics (like Dundes, who called him a “non-expert” and an “amateur”) that gave him an unearned authority.

But what an authority he constructed, the hero with only one theory to explain everything! Drawing from Jung, Frazer, and all the rest of the usual suspects, Campbell argued in his most famous book that broad archetypes structure all narrative, wherein a “hero ventures forth from the world of common day into a region of supernatural wonder: fabulous forces are there encountered and a decisive victory is won: the hero comes back from this mysterious adventure with the power to bestow boons on his fellow man.” Whether Luke Skywalker venturing out from Tatooine or Gilgamesh leaving Uruk, the song remains the same, Campbell says. Gathering material from the ancient Near East and Bronze Age Ireland, the India of the Mahabharata and Hollywood screenplays, Campbell claimed that his monomyth was the skeleton key to all narrative, a story whose parsing could furthermore lead to understanding, wisdom, and self-fulfillment among those who are hip to its intricacies. The Hero with a Thousand Faces naturally flattered the pretensions of some artists and writers, what with its implication that they were conduits connected directly to the collective unconscious. Much as with Freud and the legions of literary critics who applied his theories to novels and film, if Campbell works well in interpreting lots of movies, it’s because those directors (from Lucas to Stanley Kubrick) happened to be reading him. The monomyth can begin to feel like the critical equivalent of the intelligent design advocate who knows God exists, because why else would we have been given noses on which to so conveniently hold our glasses?

Campbell’s politics, and indeed those of his theory, are ambivalent. His comparative approach superficially seems like the pluralistic, multicultural, ecumenical perspective of the Sarah Lawrence professor that he was, but at the same time the flattening of all stories into this one monomyth does profound violence to the particularity of myths innumerable. There is a direct line between Campbell and the mythos-laden mantras of poet Robert Bly and his Iron John: A Book About Men, the tome that launched a thousand drum circles of suburban dads trying to engage their naturalistic masculinity in vaguely homoerotic forest rituals, or of Canadian psychologist/alt-right apologist Jordan Peterson, who functions as basically a Dollar Store version of the earlier folklorist. Because myths are so seemingly elemental, mysterious telegrams from the ancient past, whose logic seems imprinted into our unconscious, it’s hard not to see the attraction of a Campbell. And yet whenever someone starts talking about “mythos” it can inevitably start to feel like you’re in the presence of a weirdo who practices “rune magik,” unironically wonders if they’re an Übermensch, and has an uncomfortably racist Google search history. We think of the myth as the purview of the hippie, but it’s just as often the province of the jackbooted authoritarian, for Campbell’s writings fit comfortably with a particularly reactionary view of life, which should fit uncomfortably with the rest of us. “Marx teaches us to blame society for our frailties, Freud teaches us to blame our parents,” Campbell wrote in the posthumously published Pathways to Bliss, but the “only place to look for blame is within: you didn’t have the guts to bring up your full moon and live the life that was your potential.” Yeah, that’s exactly it. People can’t afford healthcare or get a job because they didn’t bring up their full moon…

The problem is that if you take Campbell too seriously then everything begins to look like it was written by Campbell. To wit, the monomyth is supposed to go through successive stages, from the hero’s origin in an ordinary world where he receives a “call to adventure,” to being assisted by a mentor who leads him through a “guarded threshold” where he is tested on a “road of trials,” to finally facing his ultimate ordeal. After achieving success, the hero returns to the ordinary world wiser and better, improving the lives of others through the rewards that have been bestowed upon him. The itinerary is more complex than this in The Hero with a Thousand Faces, but this should be enough to convey that Campbell’s schema is general enough that it can be applied to anything, yet particular enough that it gives the illusion of rigor. Think of Jesus Christ, called to be the messiah and assisted by John the Baptist, tempted by Satan in the desert, and after coming into Jerusalem facing torture at the hands of the Romans, before his crucifixion and harrowing of hell, only to be resurrected with the promise of universal human salvation. Now, think of Jeff Lebowski, called to be the Dude and assisted by Walter Sobchak, tempted by Jackie Treehorn, battling the nihilists, only to return in time for the bowling finals. Other than speaking deep into the souls of millions of people, it should be uncontroversial to say that the gospels and the Coen Brothers’ The Big Lebowski are the same story only in the most glaring of superficial ways, and yet the quasi-conspiratorial theory of the monomyth promises secret knowledge that says that they are.

But here’s the thing—stories aren’t hydrogen, plots aren’t oxygen, narratives aren’t carbon. You can’t reduce the infinity of human experience into a Periodic Table, except in the most perfunctory of ways. To pretend that the tools of classification are the same as the insights of interpretation is to grind the Himalayas into Iowa; it’s to cut so much from the bone that the only meal you’re left with is that of a skeleton. When all things are reduced to monomyth, the enthusiast can’t recognize the exemplary, the unique, the individual, the subjective, the idiosyncratic, because some individual plot doesn’t have a magical wizard shepherding the hero to the underworld, or whatever. It’s to deny the possibility of some new story, of some innovation in narrative; it’s to spurn the Holy Grail of uniqueness. Still, some sympathy must be offered as to why these models appeal to us, as to how archetypal literary criticism appeals to our inner stamp collectors. With apologies to Voltaire, if narrative didn’t exist it would be necessary to invent it—and everything else too. The reasons why archetypal criticism is so appealing are legion—these systems impose a unity on chaos, provide a useful measure of how narratives work, and give the initiate the sense that they have knowledge applicable to everything from The Odyssey to Transformers.

But a type of critical madness lies in the idolatry of confusing methodological models for the particularity of actual stories. Booker writes of stories that are “Rags to Riches,” but that reductionism is an anemic replacement for inhabiting Pip’s mind when he pines for Estella in Charles Dickens’s Great Expectations; he classifies Bram Stoker’s Dracula as being about “Overcoming the Monster,” but that simplification comes at the expense of that purple masterpiece’s paranoia, its horror, its hunger, its sexiness. There are no stories except in the details. To forget that narratives are infinite is a slur against them; it’s the blasphemy of pretending that every person is the same as every other. For in a warped way, there is but one monomyth, but it’s not what the stamp collectors say it is. In all of their variety, diversity, and multiplicity, every tale is a creation myth because every tale is created. From the raw material of life is generated something new, and in that regard we’re not all living variations of the same story, we’re all living within the same story.

Bonus Links: “The Purpose of Plot: An Argument with Myself,” “The Million Basic Plots,” “On Not Going Out of the House: Thoughts About Plotlessness”

Image Credit: Wikimedia Commons.

Letter from Wartime

-

“This is the flower of the partisan,/o bella ciao, bella ciao, bella ciao ciao ciao,/this is the flower of the partisan/who died for freedom.” —Italian Partisan Song, “Bella Ciao”

“Heard about Houston? Heard about Detroit?/Heard about Pittsburgh, PA?/You oughta know not to stand by the window/Somebody see you up there.” —Talking Heads, “Life During Wartime”

In the hours before Hurricane Sandy slammed into the northeastern United States, my apartment in Bethlehem (Pennsylvania), which was 100 miles and a few hours from the Atlantic, was permeated by the unmistakable smell of the shore. Stolid son of the Alleghenies that I am, I’d never experienced the full onslaught of a hurricane before. This almost miasmic odor I associated with vacation—a fragrance inextricably connected to the Jersey boardwalk and Massachusetts beaches, of salt-water taffy and lobster rolls—suddenly filling my living room, whose window looked out on a hulking, rusting former steel mill, felt borderline apocalyptic. As is the nature of things apocalyptic, it’s the incongruity that is alarming. So it was for some frightened 17th-century peasant reading a pamphlet foretelling doom because of the appearance of a mysterious comet in the heavens or the birth of a two-headed calf. The unexpected, the unusual, the unforeseen act as harbingers.

A landlocked home smelling like the beach is perhaps not as dramatic as those earlier examples, of course, and yet as with a sun-shower or the appearance of frost in May, there is a certain surrealism in things being turned upside down. That disruption in the nature of things makes it feel like worse disorder is coming. As it did, certainly, in those hours before climate-change-conjured Sandy knocked out transformers, their explosions lighting up the horizon an oozing green all through the night, the winds howling past my building on its hill overlooking the river; the power was ultimately out for more than a week, and roads were made impassable by the felled centuries-old oaks and maples that dotted the Lehigh Valley. It’s the eerie stillness in the air before the storm arrived that impressed itself upon me (so much so that this isn’t the first time I’ve written about it), those last few moments of normalcy before the world ended, when you could tell it was coming and there was nothing to do but charge your phone and reinforce your windows to withstand the impact from all of the debris soon to be buffeted about. Can you smell the roiling, stormy, boiling sea in the air right now?
“If destruction be our lot,” state representative Abraham Lincoln told a crowd gathered at the Young Men’s Lyceum of Springfield, Ill., in the winter of 1838, “we must ourselves be its author and finisher. As a nation of freemen, we must live through all time, or die by suicide.” Historical parallels outlive their critical utility; some of us have made a cottage industry out of comparing whatever is in our newsfeeds to the Peasants’ Rebellion or the English civil wars. In the realm of emotion, however, in psychological reality, is the autumn of 2020 what it felt like to learn that Polish defenses had been overrun by the Nazi blitzkrieg? To apprehend the dull shake of those guns of August a generation before? To read news that Ft. Sumter had fallen? As Franco’s war in Spain was to the world war, as Bleeding Kansas was to the Civil War, are we merely in the antechamber to a room that contains far worse horrors? Ultimately no year is like any but itself, and we’re already cursed enough to live during these months of pandemic and militia, of incipient authoritarianism contrasted with the uncertain hope for renewal. On the ground it can’t help but feel like one of those earlier moments, so that we’re forced to fiddle about with the inexact tools of historical comparison, of metaphor and analogy. Something of what Lincoln said, more than something, seems applicable now. “Suicide” might not be the right word, though, unless we think of the national body politic as a single organism in and of itself. Certainly there are connotations of self-betrayal, but it’s more accurate to see this season of national immolation as what it is—a third of the country targeting another third while the final third stays non-committal about the stand it will take when everything finally starts to fall apart.
We shouldn’t misread Lincoln’s choice of word as indicating an equivalence of sides; in this split in the national psyche there is the malignant and the non-malignant, and it’s moral cowardice to conflate the two. On one side we have a groundswell movement on behalf of civil and human rights, a progressive populism that compels the nation to stand up for its always unrealized and endlessly deferred ideals; on the other we have the specter of authoritarianism, of totalitarianism, of fascism. This is not an issue of suicide, it’s one of an ongoing attempted homicide, and if you’re ever to look into mirrors without shrinking away for the rest of your life—even if the bad guys should win (as they might)—then choose your side accordingly. And understand that you don’t even have to like your allies, much less love them, to know that they’re better than the worst people in the room. If you bemoan “cancel culture” and “social justice warriors” but not the extrajudicial kidnapping of activists by paramilitaries, then you are at best a hypocrite and a fool, and at worst a bad-faith actor justifying the worst of the U.S. government. If your concern is with the rhetorical excesses of a few college kids on Twitter, but you’re silent about the growing fascist cult currently in control of the federal executive, the federal judiciary, half of the federal legislature, and a majority of state governments (not to speak of the awesome power of the military), then you’ve already voted with your words. If you’re disturbed by property destruction, but not by the vigilante murder of protestors, then you’ve already made your decision. We all have to imagine that speaking out might still mean something; we have to pretend that voting might make a difference; we all have to live with ourselves as citizens and human beings. What I’m writing about is something different, however. What I’m writing about is what it feels like to be living through the blood-red dusk of a nation.
When the Romans left Britain, the withdrawal was so sudden and surprising that we still have record of the shock amongst the locals over the retraction of the empire from their frosted shores. The medieval British monk Gildas the Wise, and after him the Venerable Bede, record that in the years immediately following this abandonment an appeal was sent to the capital for assistance. “The barbarians drive us to the sea,” wrote the Britons’ leaders; “the sea drives us to the barbarians; between these two means of death, we are either killed or drowned.” Under the protection of the imperial hegemon, the British Celts had built an advanced civilization. Aqueducts brought water into the towns and cities; concrete roads lined paths through the countryside. One imagines that the mail arrived on time. In a shockingly brief period, however, all of that was abandoned, the empire retracting into itself and leaving those for whom it was responsible at the mercy of those who wished to pick apart its bones. Three centuries later, the inhabitants of England no longer even remembered Rome; an anonymous Anglo-Saxon poet writes of a ruined settlement that “This masonry is wondrous; fates broke it/courtyard pavements were smashed… Roofs are fallen, ruinous towers,/the frosty gate with frost on cement is ravaged,/chipped roofs are torn, fallen,/undermined by old age.” Have you seen American infrastructure lately? By the eighth century, that silent scop singing his song of misinterpreted past glories can’t even imagine by what technology a city like Londinium was made possible. He writes that “the work of giants is decaying,” because surely men couldn’t have moved stones that large into place.
Because historical parallel is such a fickle science, an individual of very different political inclinations than my own might be apt to misunderstand my purposes. They may see some sort of nativist warning in my allegory about Picts and Scots pushing beyond Hadrian’s great, big beautiful wall. Such a reading is woefully incorrect, for the barbarians that I identify are not some mythic subaltern beyond the frontier, but rather the conspiratorially minded fanatics now amassing at the polls, the decadent parsers of tweets who believe in satanic cabals, and the personality cultists who’ve all but abandoned a belief in democracy. As the Greek poet Constantine Cavafy wrote, “Why isn’t anything going on in the senate?/Why are the senators sitting there without legislating?/Because the barbarians are coming today.” We’re beyond the point of disagreeing without being disagreeable; the era of going high when they go low is as chimerical as it ever was. There is something different in the United States today, and I know that you feel it; something noxious, toxic, sick, diseased, and most of all decadent. The wealthiest nation on Earth with such iniquity, where pandemic burnt—still burns—through the population while the gameshow-host emperor froths his supporters into bouts of political necromancy. There is no legislation today because it increasingly feels like this is not a nation of laws, but something lower and uglier.
When I say that there is a decadence, I mean it in the fullest sense of that word. Not in the way that some reactionaries mean, always with their bad-faith interpretations; nor exactly in the manner that my fellow leftists often mean, enraptured as they are by that ghost called “materialism.” Rather, I mean a fallenness of spirit, a casual cruelty that, if I were a praying man, I’d identify as almost devilish. Perhaps there are satanic cabals after all, just not where the letter-people think (I suspect the call is actually coming from within the White House). Since the republic was founded, we’ve fancied ourselves Rome, always fearing the Caesar who never seems to finally cross the Potomac. That’s the thing with self-fulfilling prophecies. Now the denizens of the fading order of Pax Americana seem every bit as incredulous at collapse as those poor Britons a millennium-and-a-half ago. Writing in The Irish Times, the great critic Fintan O’Toole notes that “Over more than two centuries, the United States has stirred a very wide range of feelings in the rest of the world: love and hatred, fear and hope, envy and contempt, awe and anger. But there is one emotion that has never been directed towards the U.S. until now: pity.” I can genuinely say that I appreciate his sentiment.
When I lived in Europe, I couldn’t help but feel that there was, ironically, something younger about my friends—I imagine the feeling would be compounded today. The irony comes from the traditional stereotype of “The American,” this rustic, well-meaning hayseed, this big, bountiful, beautiful soul traipsing on his errand into the wilderness. If America was a land without history, then the Old World was supposedly death-haunted, all those Roman ruins testament to the brutality that marked that continent, not least of all in the last century. Such was the public relations that marked this hemisphere from its supposed discovery onward—but how easily we forget the blood that purchased this place, a land which was never virginal, but that was raped from the beginning. I envy Europeans. I envy their social democracy and their welfare states, their economic safety nets and their sense of communal goodwill (no matter how frayed or occasionally hypocritical). Every European I met, the English and Scots, the French and Italians, seemed more carefree, more youthful. They seemed to have the optimism that Americans are rumored to have, but of which there is no remaining evidence as the third decade of this millennium begins. During the early days of the pestilence the Italians were locked inside all of those beautiful old stone buildings of theirs. Now they’re sitting outside in cafes and trattorias, going to movies and concerts. We’re of course doing those things too, but the difference is that we have more than 200,000 dead and counting, and from the top on down it seems like few care. A French friend of mine once asked how Americans are able to go to the grocery store, the theater, the public park, without fear of getting shot. In the end, America will get you, whether by bullet or microbe. As a nation of freemen, we’re a traumatized people…

One of the few outsiders to really get our number was D.H. Lawrence, who in his Studies in Classic American Literature noted that “The essential American soul is hard, isolate, stoic, and a killer. It has never yet melted.” How could it be otherwise, in a nation built on stolen land by stolen people? America’s story is a gothic tale, a house built on a Native American burial ground. The legacies of bloodshed, of assault, of exploitation, of oppression that mark this forge of modernity ensure that it’s hard to be otherwise, even if we’re never allowed to admit such unpatriotic things. In that sense I wonder if it wasn’t inevitable that we’d eventually be led—against the wishes of the majority—by this fool who promises to steal an election while accusing his adversary of the same, who will no doubt refuse to concede even when it becomes clear that he’s lost. We’re continually told by nice, liberal, and morally correct commentators that this is not who we are, but the American president is a philandering, sociopathic carnival barker who sells bullshit to people who can’t be so brain-dead as to not know that it’s bullshit, all because they hate people who look different from them more than they love their own children. He’s Elmer Gantry, Harold Hill, “Buzz” Windrip. He’s the unholy union of P.T. Barnum and Andrew Jackson. What could be more American?
Of course our saving grace has always been that we’re a covenantal nation, defined by supposed adherence to an abstract set of universal values. Not a land for anything as mundane as blood and soil (even though those ghouls at Charlottesville spread their terror for exactly that reason). There was something scriptural in the idealism of John Winthrop maintaining in 1630 that national sustenance lay in “our community as members of the same body,” of Lincoln in 1863 providing encomium for “government of the people, by the people, for the people,” and of Barack Obama in 2004 declaring the American mantra to be one of “Hope in the face of difficulty, hope in the face of uncertainty, the audacity of hope.” That old saw about life, liberty, and the pursuit of happiness. No nation since that of the ancient Hebrews was so fully founded upon an idea—an idea that is by definition so utopian and so completely unattainable that to be a satisfied American is to make your peace with heartbreak, or else to see yourself become either delusional or cold and cruel.
There is an idea of America and the reality of the United States, and all of our greatest literature, rhetoric, and philosophy lives in the infinite gap between the two, our letters always being an appraisal of the extent of our disappointment. “The promises made in the Declaration of Independence and the Constitution,” writes critic Greil Marcus in The Shape of Things to Come: Prophecy and the American Voice, “were so great that their betrayal was part of the promise.” Thus the greatest of American political modes from the Puritans to Obama would be the jeremiad. Thus our most native literature, be it Mark Twain’s The Adventures of Huckleberry Finn or Ralph Ellison’s Invisible Man, charts the exigencies of a dream deferred. All of American literature is a tragedy. What we’re living through now isn’t a tragedy, however—it’s a horror novel. Only the most naïve of fools could be unaware of the strain of malignancy that runs through our country’s narrative—all of the hypocrisies, half-truths, and horrors that have defined us from the moment when the word “America” was first printed on Martin Waldseemüller and Mathias Ringmann’s map of the world in 1507. In Stephen Vincent Benét’s classic short story “The Devil and Daniel Webster,” Old Scratch himself says that “When the first wrong was done to the first Indian, I was there. When the first slaver put out for the Congo, I stood on her deck…I am merely an honest American like yourself—and of the best descent.” What would Eden be, after all, without the serpent? The thing about devils is that they imply there must be angels; if you can find proof of hell, that indicates that there might be a heaven, somewhere. That’s the corollary to the failed covenant: even with all of the hypocrisy, half-truth, and horror, there is that creed—unfulfilled, but still stated. Freedom of expression. Equal opportunity. The commonwealth of all people. Do I write jeremiads myself? Very well then.


I only do so to remind us that the confidence-man huckster (who as I write this is only a few miles down Pennsylvania Avenue, undoubtedly conspiring on what nightmares he’ll unleash upon his fellow citizens when he doesn’t get his way) is an American, if a cankered one. Take solace, though, because America isn’t just Stephen Miller, but Harriet Tubman and John Brown also; it’s not only Steve Bannon, but Frederick Douglass and Elizabeth Cady Stanton; more than Donald Trump, it’s also Eugene Debs and Dorothy Day, James Baldwin and Emma Goldman, Harvey Milk and Ruth Bader Ginsburg. Such a litany of secular saints is of course inconsistent, contradictory, and, I’ll unabashedly confess, a bit maudlin. But that’s okay—we need not all agree, we need not all be saints, to still be on the side of the angels in any such Manichean struggle. More than just angels can fight demons; the only thing required is the ability to properly name the latter. Because if American history is anything, if the American idea is anything, it’s a contradictory story, a dialectical struggle that goes back through the mystic chords of memory, a phrase which I once read somewhere. The contradictions of American culture once again threaten to split the whole thing apart. Make your plans accordingly, because the battle always continues.
For such is the great moral struggle of this century. It is against neofascism and its handmaiden, a cultish and twisted civil religion. It requires the breaking of this fractured American fever dream, for which a vaccine is far from assured. Right now it seems like our choices are authoritarianism or apocalypse, though perhaps there are always reasons to hope for more. What’s coming, I can’t be sure of, but that lyric of the great prophet Leonard Cohen, “I’ve seen the future, brother/It is murder,” echoes in my numbed brain. Whether we can stand athwart history and yell “Stop!,” whether there is the possibility of effecting genuine change, whether we can still salvage a country of decency, justice, and freedom—I’m unsure. What I do know is that whether or not any of those things can happen, we must live our political lives with a categorical imperative that acts as if they can. Not least so that we’re able to live with ourselves, alone in the rooms of our minds. Live with at least some convictions; live spiritually like the men remembered in poet Genevieve Taggard’s lyric in honor of the veterans of the Abraham Lincoln Brigade, those Americans (mostly socialists, communists, and anarchists) who went to Spain to fight the fascists in the years before the Second World War. “They were human. Say it all; it is true. Now say/When the eminent, the great, the easy, the old,/And the men on the make/Were busy bickering and selling,/Betraying, conniving, transacting, splitting hairs,/Writing bad articles, signing bad papers,/Passing bad bills,/Bribing, blackmailing,/Whimpering, meaching, garroting, – they/Knew and acted.”
Bonus Links:
—“Letter from the Other Shore”
—“Letter from the Pestilence”
—“Steal This Meme: Beyond Truth and Lies”
—“On Pandemic and Literature”
Image Credit: SnappyGoat.