Can the average humanities professor be blamed if she rises in the morning, checks the headlines, shivers, looks in the mirror, and beholds a countenance of righteous and powerless innocence? Whatever has happened politically to the United States, it’s happened in stark opposition to the values so many philosophers and English professors, historians and art historians, creative writers and interdisciplinary scholars of race, class, and gender hold dear.
We are, after all, the ones to include diverse voices on the syllabus, use inclusive language in the classroom, teach stories of minority triumph, and, in our conference papers, articles, and monographs, lay bare the ideological mechanisms that move the cranks and offices of a neoliberal economy. Since the Reagan era our classrooms have mustered their might against thoughtless bigotry, taught critical thinking, framed the plight and extolled the humanity of the disadvantaged, and denounced all patriotism that curdles into chauvinism.
We’ve published books like Henry Louis Gates Jr.’s The Signifying Monkey, Gayatri Chakravorty Spivak’s A Critique of Postcolonial Reason, Judith Butler’s Gender Trouble, and Kwame Anthony Appiah’s The Ethics of Identity — treatises that marshal humane nuance against prejudice, essentialism, propaganda, and demagogic charisma.
We’ve cast out Bill O’Reilly, Glenn Beck, Sean Hannity, and Steve Bannon, but also Allan Bloom, Jordan Peterson, Richard J. Herrnstein, and Charles Murray. Our manner has been academic, but our matter has been political, and we have fought hard. So how have we ended up in these ominous political straits?
The easy answer is frightening enough: We don’t really matter. The hard one chills the blood: We are, in fact, part of the problem.
How has this sorry reality come to pass, across the humanities, and as if despite them? I can only tell a story of my own field and await the rain of stones. Three generations ago, literature professors exchanged a rigorously defined sphere of expertise, to which they could speak with authority, for a much wider field to which they could speak with virtually no power at all. Where they had once refused to allow politics to corrupt a human activity that transcends it, they now reduced the literary to the political. The change was sharp. From World War I until the 1960s, their forerunners had theorized literature as a distinct practice, a fine art, a realm of its own. Whether in the scholarship of the Russian Formalists, in T.S. Eliot’s archconservative essays, or in such midcentury monuments as Erich Auerbach’s Mimesis (1946), René Wellek and Austin Warren’s Theory of Literature (1948), and Northrop Frye’s Anatomy of Criticism (1957), literature was considered autonomous.
Then, starting in the 1970s, autonomy became a custom honored only in the breach. Terry Eagleton and Fredric Jameson were first among countless equals who argued that pure art was pure politics. In 1985, Jane Tompkins laid out what many scholars increasingly believed about the whole field — that “works that have attained the status of classic, and are therefore believed to embody universal values, are in fact embodying only the interests of whatever parties or factions are responsible for maintaining them in their preeminent position.” Porous boundaries, fluid categories, and demoted reputations redefined classic texts.
Beauty became ideology; poetry, a trick of power, no more essentially valuable than other such tricks — sitcoms, campaign slogans, magazine ads — and no less subject to critique. The focus of the discipline shifted toward the local, the little, the recent, and the demotic. “I find no contradiction in my writing about Henry James, bodybuilding, heavy metal, religion, and psychoanalytic theory,” Marcia Ian stated in PMLA in 1997. In Classics and Trash: Traditions and Taboos in High Literature and Popular Modern Genres (1990), Harriet Hawkins argued that much pop culture “has in practice … been a great deal more democratic and far less elitist, even as it has often been demonstrably less sexist than the academically closeted critical tradition.” Within the bosky purlieus of a declining humanism, everything had become fair game for study: Madonna and Lost, Harry Potter and Mad Men.
The demographic exclusivity of the midcentury canon sanctified the insurrection. Who didn’t feel righteous tossing Hawthorne on the bonfire? So many dead white men became so much majestic smoke. But now, decades later, the flames have dwindled to coals that warm the fingers of fewer and fewer majors. The midcentury ideal — of literature as an aesthetically and philosophically complex activity, and of criticism as its engaged and admiring decoding — is gone. In its place stands the idea that our capacity to shape our protean selves is the capacity most worth exercising, the thing to be defended at all costs, and the good that a literary inclination best serves.
Democratizing the canon did not have to mean abdicating authority over it, but this was how it played out. In PMLA in 1997 Lily Phillips celebrated a new dispensation in which “the interpreter is not automatically placed above either producers of texts or participants in events but is acknowledged as another subject involved in a cultural practice, with just as much or as little agency.” This new dispensation — cultural studies — “emerged forcefully because the awareness of positionality, context, and difference is endemic to this historical period.”
Having eaten the tail of the canonical beast they rode on, scholars devoured their own coccyges. To profess the humanities was to clarify one’s situatedness, one’s limited but crucial perspective, one’s opinion and its contingent grounds. Yet if “opinion is always contingent,” Louis Menand asked laconically, “why should we subsidize professionals to produce it?”
By the 1990s, many scholars equated expertise with power and power with oppression and malicious advantage. The humane gesture was not to fight on behalf of the humanities — not to seek standing — but rather to demonstrate that literary studies no longer posed a threat. Unmaking itself as a discipline, it could subtract at least one instance of ideological violence from the nation and world.
If the political events of 2016 proved anything, it’s that our interventions have been toothless. The utopian clap in the cloistered air of the professional conference loses all thunder on a city street. Literature professors have affected America more by sleeping in its downtown hotels and eating in its fast-food restaurants than by telling one another where real prospects for freedom lay. Ten thousand political radicals, in town for the weekend, spend money no differently than ten thousand insurance agents.
Now that we have a culture of higher education in which business studies dominate; now that we face legislatures blind to the value of the liberal arts; now that we behold in the toxic briskness of the four-hour news cycle a president and party that share our disregard for expertise while making a travesty of our aversion to power, the consequences of our disavowal of expertise are becoming clear. The liquidation of literary authority partakes of a climate in which all expertise has been liquidated. In such a climate, nothing stands against demagoguery. What could?
That English departments have contributed to this state of affairs is ironic to say the least. A lifetime ago, literary studies was conceived precisely in opposition to the specter of demagogues. The field was funded and justified on the presumption of its value as a bulwark against propaganda and political charisma. Our predecessors feared more or less exactly what we now face. The discipline we’ve deconstructed was their answer to it.
Consider their historical situation, in which potent new media were fast eclipsing old. The first radio broadcasting station opened in fall 1920 in East Pittsburgh. Two years later, consumers spent $60 million on radio sets, parts, and accessories. In 1920, movies were silent. A decade later, they talked. Instantaneous sound and light reached vast audiences between the World Wars and played an instrumental role in launching the second one. Authoritarians right and left stirred ears, eyes, and hearts across unprecedented distances. The unruly masses, mobilized by facile slogans, looked everywhere poised to undermine free institutions.
The screens outshone and the speakers drowned out the fainter, truer, deeper power of difficult texts and images. By the mid-1930s, F. Scott Fitzgerald believed that the novel, “the strongest and supplest medium for conveying thought and emotion,” was taking a back seat to “mechanical and communal art that, whether in the hands of Hollywood merchants or Russian idealists, was capable of reflecting only the tritest thought, the most obvious emotion.” What lay ahead were the gulags of Siberia and the kickshaws of Madison Avenue.
Long before Hitler invaded Poland, many Americans recognized and feared the thrust of the new media. And without those fears, we would not have English departments as we know them. The New Criticism and kindred postwar approaches received huge financial sanction for their purported capacity to counter propagandistic simplicity. The Rockefeller Foundation gave thousands to support Princeton University’s Gauss Seminars in Criticism (starting in 1952), the Kenyon, Sewanee, Hudson, and Partisan Reviews (starting in 1944), and the School of English at Kenyon College under Lionel Trilling, F.O. Matthiessen, and John Crowe Ransom (starting in 1947). It also underwrote the independent scholars creating the intellectual architecture of the moment (including, in 1945, René Wellek and Austin Warren, writing Theory of Literature).
But why literature? Why were philanthropic foundations with close ties to Washington bankrolling English professors? The rationale went like this: Modern industrial conditions deprived life of meaning and divested it of stability. Rote work left the mind empty. New forms of culture, shallow and mesmerizing, rushed to fill the void. In the Soviet Union and Nazi Germany, this meant dangerous ideologies. In the United States, as Fitzgerald saw early on, it meant commercial garbage. Advertising was preferable to totalitarianism, but barely. “Before man can transcend himself,” Bernard Rosenberg opined in 1957, “he is being dehumanized.” “This breeds anxiety,” he continued, “and the vicious circle begins anew, for as we are objects of manipulation, our anxiety is exploitable.”
Anxious minds entertained cheap appeals, and not just fascist ones. “Hollywood represents totalitarianism,” Hortense Powdermaker argued, where “the concept of man as a passive creature to be manipulated extends to those who work for the studios, to personal and social relationships, to the audiences in the theaters and to the characters in the movies.” Theodor Adorno considered TV just as bad as film, aimed as it was “at producing, or at least reproducing, the very smugness, intellectual passivity, and gullibility that seems to fit in with totalitarian creeds, even if the explicit surface messages of the shows may be anti-totalitarian.”
Such prognostications infused the atmosphere in which the postwar humanities were conceived and born. The GI Bill did not just express gratitude to American veterans; it made timely provision against potential calamity. For demobilized American soldiers, humanistic study on a college campus would provide an alternative to radical and reactionary politics. It would equip them with common experiences, intellectual touchstones, ideas and debates that mitigated extreme partisanship, or worse. Literature professors, for once, shared real power with those who presided over the reshaping of the national mind. The anti-propagandistic crusades of the 1930s heralded the English departments of the 1950s.
To combat ideology, literature had to have properties of its own, beyond and above politics. Modernism had intensified these properties. What guaranteed the purity of a text was in large part its sophistication. No dictator could find in it a resource for brainwashers. Difficulty, ambiguity, tension, paradox, and irony, the whole pantheon of New Critical values, signified, quite proudly, political uselessness. Such qualities established literature as transcendent content within immanent forms. This classroom commitment to eternity lasted roughly from 1945 to 1967. It fused together expertise, seriousness, difficulty, aesthetic hierarchy, and political innocence. It also tied these attributes mostly to poems and novels by straight white men. This, though, was a tragic contingency and not the essential truth of the field.
“Content is to be dissolved so completely into form,” Clement Greenberg announced in “Avant-Garde and Kitsch,” “that the work of art or literature cannot be reduced in whole or in part to anything not itself.” The danger of avant-garde works, “from the point of view of fascists and Stalinists, is not that they are too critical, but that they are too ‘innocent,’ that it is too difficult to inject effective propaganda into them, that kitsch is more pliable to this end.” Kitsch sustained despots; difficult art, their enemies.
Perhaps the most strident and earnest advocate of the value of high culture was Dwight Macdonald. In the Kennedy years he invited his reader to imagine somebody from the 16th century reading a copy of Time or The New York Times. That reader would spend a day or two mastering the issue, “because he would be accustomed to take the time to think and even to feel about what he read.” Macdonald’s contemporaries instead had “developed of necessity a rapid, purely rational, classifying habit of mind, something like the operations of a Mark IV calculating machine.” There was no longer time “to bring the slow, cumbersome depths into play.”
Macdonald wrote at dusk, breathing life into dying claims. The left soon abandoned its high cultural pretenses, helped along by the countercultural artists, writers, and musicians who retained the formal complexity while ditching the seriousness. Menand judges that “1962 was virtually the last year when a spirited defense of traditional cultural values by a liberal thinker could have carried much credibility.” Sgt. Pepper’s Lonely Hearts Club Band, Bonnie and Clyde, The Spy Who Came in From the Cold, All in the Family, Motown, Blonde on Blonde, Portnoy’s Complaint, Hair, Rolling Stone, and the work of Andy Warhol constitute Menand’s short list of distinction-effacing triumphs.
Meanwhile, the United States was stumbling catastrophically into Vietnam, a war that rendered it untenable to insist that academic expertise delivered you from ideology, as research universities were the beating heart of the Cold War. “Ambivalence,” so highly prized in poetry classrooms, mattered not at all to physicists perfecting warheads and political scientists heralding the fall of Communist dominoes. It positively discouraged student protesters, who rejected it wholesale. Literature classrooms became political — or, depending on your perspective, admitted to the politics that had always undergirded them. In the decades that followed, the task of criticism became the subversion of the monoliths complicit in injustice: the canon, confining gender roles, racial stereotypes, phallogocentrism, and logocentrism.
Were the midcentury academics, so spooked by film and radio, total fools? Instead of answering that question for the billionth time, let’s consider an analogous case. In 1990, the internet did not exist — not what we mean by the internet. By 1999 it represented the golden goose of the American economy. As late as Clinton’s first inaugural, college students still wrote letters to their friends at other colleges. A decade later, grandparents emailed grandchildren. And between the bursting of the tech bubble in 2000 and the election of 2016, few saw in cyberspace the seeds of disaster. We were far too clever to get suckered — especially those of us professionally dedicated to teaching and writing suspiciously. Blanket cynicism meant blanket immunity to infectious falsehoods. David Foster Wallace’s greatest gift to American readers might have been his warning that cynicism posed a larger threat than the naïveté it delivered us from. Yet even to his acolytes Wallace was an endearing canary in a chimerical coal mine. He neurotically sweated stuff that wasn’t really all that sinister.
It was natural to scoff at yesteryear’s Dwight Macdonalds and bask in the new media. It was positively Straussian to resuscitate distinctions between high and low, better and worse, more or less conducive to democracy. Rather than a danger, literature professors found in cyberspace a bountiful trough. It fed them new content that would, by the cleverness of the discipline, trick neither them nor the students in whom they inculcated habits of vigilant incredulity.
In 1997 David Glover embraced the glut: “Because the technologies of textuality and representation have long since outstripped any solely literary determination, reading can no longer be imagined as a singular encounter between subject and text but must instead be reconceived as a historically variable bundle of norms, codes, capacities, and techniques whose precise configuration at any time … remains a topic for detailed examination.” Twenty years later one finds articles like Karen Bourrier’s on “Victorian Memes” in Victorian Studies, which surveys references to Victorian authors on Twitter and argues that “these sentimental and didactic tweets often sound more ‘Victorian’ than realist novels themselves.” The midcentury notion of literature as antidote to propaganda gave way to the 21st-century embrace of cyberspace as the happy playground of the times. “By broadening the field of available texts with which we can work,” Erik Yuan-Jyan Tsao observed, “cultural studies has contributed much to the survival of literary study.”
Scholarly thralldom to the internet ranges from casual tweeting to the digitization of the campus library to a full professionalization of what the internet makes possible: machine reading, distant reading, the noticing of patterns across wide swaths of texts. Meanwhile, English majors are replaced by communications majors. Those resisting such trends invoke technocratic grimness, the sad soullessness of remote criticism, or the inherent political conservatism of the digital humanities. But few, until recently, thought to fear the political consequences of cyberspace’s dispersal of real community.
The propagandistic nightmare of 1939 was metastatic unity, but the propagandistic specter today, just as grave, is the arrogant and ubiquitous hunch that an individual mind can overthrow the collective lie. The humanities were once upon a time a laboratory for experiments in shared interpretation. They have become, like politics — and, in fact, as politics — aggressively individualistic and resolutely anti-historical.
This goes even for those who write in dedication to progressive solidarity. A recent text embodies the disease we left-leaning English professors share with our political enemies. Joseph North’s Literary Criticism: A Concise Political History (Harvard University Press, 2017) makes good on the adjectives in its subtitle. In a stock gesture, North excoriates the postwar establishment. He blames the New Critics and their Anglo counterparts for destroying the leftist potential of literary criticism. North believes that the formative work of I.A. Richards in the 1920s laid a foundation that the left never afterward developed. Instead, John Crowe Ransom, F.R. Leavis, and friends built on that foundation a reactionary edifice. They “remade and institutionalized it as a thoroughly idealist practice, based in a neo-Kantian aesthetics of disinterest and transcendent value, directed toward religious cultural conservatism.”
North has nothing to say about Nazi Germany, the Soviet Union under Stalin, the New Critics’ reconciliation with the postwar welfare state, or their great fear of mobs duped by autocrats. Instead he charges with villainy the moderates who wished to help prevent another world war. The figures North discusses would not recognize themselves in the portrait. North draws no distinction between the Southern Agrarians of the 1920s and the men they later became, the New Critics of the 1940s and 1950s. As Robert H. Brinkmeyer, Jr. has made clear in The Fourth Ghost: White Southern Writers and European Fascism, 1930-1950 (LSU Press, 2009), the war chastened and transformed this group from apologists for slavery to centrists for republicanism. But North treats as homogeneous the most volatile 20 years of American political history, ignoring the headlines and election cycles that shaped the minds he castigates.
North declares unabashedly his disregard for the past. He is writing a “strategic” history. He “reflects on the past less for the sake of seeing ‘the full picture’ and more for the sake of discovering its main lines of force; and not even for the sake of discovering all the forces that were relevant at the time, but instead limiting oneself to those lines of force that still seem to condition what occurs today.” Astonishing or not, this accords with the norms of our moment. The past is not an intricate reality deserving meticulous fidelity. It is a source to be used selectively in service of the Editorial Now.
I don’t doubt that North considers his work a far cry from the shamelessness and ruthlessness of the demagogic media. I do as well, because his political goals resemble my own. But on what grounds can we insist on the distinction? How does a “strategic history” that carefully selects its “lines of force” differ from a Fox News story that makes the tiny something huge and the huge thing tiny? A scholar who denies the claims of history because they complicate the argument he wishes to make partakes in the forms of denial that are at the heart of our crisis.
For going on 50 years, professors in the humanities have striven to play a political role in the American project. Almost without exception, this has involved attacking the establishment. As harmful as institutionalized power can be, as imperfect as even the most just foundations inevitably appear, they are, as it turns out, all we’ve got. Never has a citizen been so grateful for institutions — for functioning courts, for a professionalized FBI, for a factually painstaking CBO or GAO — as since November 2016.
Even the most devoted relativist cannot behold Fox News or Breitbart and not regard these media outlets as propagandistic in the most flagrant sense. Eisenhower would have balked. Promoting conspiracy theories, granting vile charisma a national platform, amplifying peccadillos into crimes and reducing crimes to peccadillos, they embody everything that literary studies was meant, once, to defend against — not by talking politics, but by exercising modes of expression slow enough to inoculate against such flimsy thinking. Yet the editorial logic of right-wing media resembles closely the default position of many recent books and dissertations in literary studies: The true story is always the oppositional story, the cry from outside. The righteous are those who sift the shadows of the monolith to undermine it in defense of some notion of freedom.
In the second decade of the 21st century, the longstanding professorial disinclination to distinguish better from worse does not inspire confidence. The danger of being too exclusive, which the canon once was, pales before the danger of refusing to judge. In 2011, Louis Menand observed that “few people think that … in the matter of what kind of art people enjoy or admire, the fate of the republic is somehow at stake.” The Apprentice signifies once and for all the hubris of such blitheness.
The only way to battle, with real hope of meaningful reform, against populist nationalism is to affirm alternative forms of commonality. Far and away the greatest challenge for scholars in the 21st century will be figuring out how to do this work and also how to reclaim the influence that they voluntarily, and with the best of intentions, ceded. That influence will depend on expertise, and that expertise, for anybody in the humanities, will derive from a profound and generous understanding of the past.
Seventy years ago, the architects of a new world order got much wrong, for which the last two generations have relentlessly taken them to task. But they also tried to moor the American future in the best of our national traditions. They preserved and expanded the welfare state, scoured the 19th century for a democratic canon, balanced global power to forestall another World War, advanced the ethical vocabulary in which the movements for the wide expansion of civil rights unfolded, and sustained enough of a civic consensus to shame a criminal president into resigning. These accomplishments now appear distant dreams.
For the republic to survive, higher education must emphasize similarity as well as difference, continuity as well as rupture, collective sustenance as well as individualistic emancipation, you as well as me. It must do this without tipping into the old, real, omnipresent dangers of prejudice and bigotry. Liberal academics used to aim to thread that needle. They have long since given up but must try again. The central values of liberal arts education as presently conceived — creativity and critical thinking, originality and individuality — are all sail and no ballast. They might be the qualities of a good tech-sector job applicant or reality-show contestant, but we’re in mortal need of good citizens.