In After Virtue, the philosopher Alasdair MacIntyre suggests that, when a culture is in good shape, its condition will have much to do with the robust arguments conducted under its auspices about the good or the virtuous. An emphasis, there, on argument, that has always seemed to me compelling. I find it impossible to imagine a social order I would want to live in that isn’t built around a continuing, even interminable series of arguments. No doubt Yeats was right to warn, a century or so ago, that often in the modern world the best lack all conviction — you know how it goes — while only the worst are full of passionate intensity, so that the arguments, such as they are, will much of the time seem anything but edifying. And yet we do argue, and we want to be able to continue doing so as if things mattered — as if the tools at our disposal were sufficient to allow us to know, more or less, what we’re arguing about.
A friend of mine told me, not long ago, that passionate intensity was overrated, and conviction too. I knew what she meant. People with convictions are much of the time tedious. Moreover, they are intent upon achieving a grand consensus and, not incidentally, bringing everyone else to their knees. They want the rest of us to feel free to express ourselves, as they like to say, but only on the condition that we find them and their convictions irresistible and keep our mouths shut when we don’t. No surprise that passionate intensity seems often to belong most insistently to the commissars of correctness and their inflamed camp followers, who have as little use for real argument as they have for genuine difference or diversity.
Of course, there are arguments and arguments. Reasonable people even argue over whether there is any special virtue in being reasonable, contending, with considerable justification, that reason is only reason and can satisfy only the rational side of our nature. Others — shall we call them postmodernists? — contend that all values are “constructions,” and that disputes about ideas are inevitably hopeless and much of the time incoherent, in the sense that people can grasp only what their own culture, or tribe, or faction, allows them to grasp. Meanwhile, legions of the newly enlightened tell themselves that they must strive to eliminate strain and conflict, so that argument itself will seem more and more to be a sign that things are not as they were meant to be, and mere ideas will remind us all of the sorry fact that we have not yet arrived at the one true idea that will banish all the others.
It is always tempting to say that this is not a good time for ideas. Though people hold them or dismiss them, promote them or disparage them, ideas often seem unstable. Often we think we are debating an idea only to discover that it no longer means what we thought it meant. We proclaim our affection for equality, autonomy, liberation, authenticity only to find that the meanings of those words and the concepts they name have changed into something unrecognizable. Those of us who have long been wary of big ideas, ideas that mobilize infatuates, find that even modest ideas are routinely appropriated for purposes that can seem astonishing. This is a time when students and their mentors at major universities declare themselves endangered by the “unsafe and hostile” environment created by a professor — call her Laura Kipnis if you like — who had the nerve to publish a so-called offensive essay. Thought you understood terms like “unsafe,” “hostile,” and “endangered” and knew more or less what diversity of outlook or opinion might entail in an academic environment? Think again.
The very notion of diversity is now increasingly understood to refer to anything but differences of outlook, which we are urged — by the newly enlightened and militant — not to protect but to suppress and eliminate so that no delicate sensibility need be challenged or unsettled. A Jewish proverb says, “Don’t wish too hard, or you’ll get what you want.” So you want to make things safe enough to protect yourself and others not only from shock and awe but also from potentially disturbing thoughts and ambivalences? Don’t be surprised if you end up with more than you ever bargained for. Mae West liked to say, “Too much of a good thing can be wonderful,” and who wouldn’t say amen to that? But too much safe and secure and beneficent will never add up to anything good.
Ideas have always been in flux. The standard, unfavorable sense conveyed by the word “prejudice” was consistently challenged, over centuries, by thinkers like Edmund Burke and, later, T.S. Eliot, who saw in prejudice the goal you hoped to arrive at if you were to have a foundation for your thoughts and any hope of conducting a serious argument. MacIntyre changed the way we think about “identity” by asserting that rebellion against one’s own inherited identity is often a powerful way of expressing it. Herbert Marcuse, by no means alone in this, stirred a generation of radicals to consider whether tolerance might itself be an instrument or symptom of repression, thereby converting the benign idea celebrated by John Stuart Mill and other liberal thinkers into something else.
And yet, in spite of this long history of instability in the domain of ideas, it is now harder than ever to argue about ideas without first ascertaining that you and your antagonist share even rudimentary assumptions about what exactly is intended when a concept is invoked. Is judgment an exercise of discrimination or, as Montaigne had it, “an expression of habit”? Is “the other” to be understood as external to oneself or as a part of oneself? Is perfectibility to be understood as a delusion or, as Rousseau contended, that which principally distinguishes us from animals? When we say “love,” are we speaking of “the marriage of true minds” or of Jacques Lacan’s idea that “love is giving something you haven’t got to someone who doesn’t exist”?
Anyone can deplore the evolution or disappearance of particular ideas. Each of us knows how to posture and preen on behalf of ideas that seemed secure and are now regarded as problematic, or naïve, or irrelevant. What ever happened, we ask, to the idea of colorblindness we used to think we aspired to in a multiracial universe and now find too quaint even to mention? Can it be that disinterestedness, once a noble ideal, has become nothing more than the mask that power wears? Is sympathy the good, plain thing we used to call fellow-feeling, or has it become one of those coercive fanaticisms wielded by people who never saw a moral high ground they didn’t like?
And what ever happened to banality, now that artworks heavily invested in the elementary or the obvious are hung with pride in major museums and bought up at high prices by influential collectors? Can it be that, in spite of the debates that periodically erupt around Hannah Arendt’s concept of “the banality of evil,” the word “banality” itself has become largely meaningless among younger writers and intellectuals? Not so easy, is it, to note the progress of such an idea, once as compelling to poets as to moral philosophers, without seeming to posture and deplore, to wax nostalgic for a time when you could say “banal” with some assurance that others would understand what you were going on about. We need terms like “banality” for the sharp, precise antagonisms they convey.
To be sure, many ideas that once had much to recommend them deserve in time to disappear. “The sublime,” perhaps, or “blaming the victim,” or “authenticity” when it refers to characteristics that are thought to be fixed and irrevocable, as in “authentically Jewish” or “authentically black.” About such things people can, quite clearly, disagree, and I have no program or formula to offer that will settle disputes built around things of this kind, though I can, really I can, smile for days on end when I learn, from a delirious “manifesto,” that males, authentic males, are “fatally engaged with violence, annihilation and extinction,” all as a result of their obsession with achieving and sustaining erection. So much for “authentic,” for ideas that may well deserve, as I say, to disappear.
The more serious issue has to do with the fact that a great many good and vitally important ideas are apt to be misused and are not, as a rule, honorably employed even by persons who should know better. Consider the ostensibly sophisticated handling of the idea that goes by the name of “privilege.” In current debates, it is often coupled with the word “white” so as to name a condition, a fact, that cannot be disputed, contradicted, or wished away by generous sentiments or liberal platitudes. An idea, white privilege, that is intended, when invoked, to banish any potentially troubling questions. Does the idea describe what is relevant or essential in any encounter between a white person and a nonwhite person, no matter who those persons are? Is an encounter between a black professor and a white student inevitably an exercise in race relations? Between a white professor and a black student? Is white privilege an important factor when the white persons in a given encounter are themselves poor and clearly victims of systemic injustices that have thwarted them in spite of their whiteness?
My point in raising these questions is not to answer them but to suggest that even legitimate ideas like “privilege” are often invoked as conversation stoppers, as blunt instruments wielded so as to inhibit real talk and real thinking, along the lines laid out by Kwame Anthony Appiah in a Salmagundi essay, where he speaks of the “racial etiquette” that has made it hard to talk about ideas like privilege and systemic racism.
“My impression,” Appiah writes, “is that many well-intentioned, non-racist academics … who are not black, feel that in criticizing” certain ideas “they risk … exposing themselves, dare I suggest, to the risk of being accused” of “contributing to racism.” This is not, he goes on, “to put it mildly, in the best interests of learning,” and “it is also often condescending,” reflecting as it does “a refusal to think seriously about racism” and to make it possible “to distinguish racism from other things.” And thus do we find that ideas — even ideas like “racism,” “privilege,” “microaggressions” — can come to lead lives unimagined by their progenitors. “Stalking a lost deed,” said Milan Kundera 40 years ago, of those Czech intellectuals who found, to their horror, that they had given their allegiance to platinum-plated “progressive” ideas that would soon be used to betray them or silence them.
We have long supposed that so-called liberal societies are worth defending precisely because they are committed to pluralism and the clash of ideas. And yet on several fronts our liberal societies are advancing toward what a number of thinkers, from Isaiah Berlin to John Gray, call “missionary regimes” promoting what they take to be “advanced values.” These values are informed by ideas whose status is — or is felt to be — all but unimpeachable. One such idea is rooted in the belief that, if you show people the error embedded in their ways of thinking, you will turn them around and save them from themselves. Explain to the religious that their faith is based on illusion, and before you know it their faith will disappear. That sort of thing. Faith itself thus becomes an idea you can refute and displace with a better idea.
Nor are advanced thinkers who operate in this way to be found only on the political or cultural left. In France, to argue for divestment from Israeli enterprises, as a protest against Israeli actions in Gaza, is now regarded not merely as a terrible idea, which perhaps it is, but as a form of incitement to violence and thus a criminal offense. No doubt, in the wake of the recent terrorist attacks in Paris, further restrictions on political discourse will be introduced, always with the standard, high-minded appeals to solidarité and liberté. So much for openness and the clash of ideas in a Western-enlightenment capital, where the going consensus has it that, if you teach people not to have bad thoughts, you will save them from error, that is, from drawing unwanted conclusions about the things they see, and that you will thereby offer them an ambivalence-free life.
To those who spend much of their time in academic settings, the phenomenon I am associating with missionary regimes will be instantly recognizable. More and more in such settings, the learning agenda is controlled by cadres of so-called human-relations or human-resources professionals and their academic enablers, who, as the Yale English professor David Bromwich has described them, regard “learning as a form of social adjustment,” and believe that it is their business to promote “adherence to accepted community values.” Ideas thus are esteemed only insofar as they ordain a safe and accredited direction that we can learn, all of us, to follow. Dialogue is encouraged so long as it is rooted in approved suppositions and clearly headed where we must all want it to go. The atmosphere has about it, as Bromwich sharply observes, the qualities of “a laboratory that knows how to monitor everything, and how to create nothing” and “a church held together by the hunt for heresies.”
But then the life of ideas is also increasingly compromised in precincts beyond the academy. Why should this be so? Clearly it is not sufficient to point to the instability of ideas. In many respects, that is the least of the problems we confront, and may not be very much of a problem at all. In fact, the problems to which I’ve alluded here have principally to do with something else entirely. Call it, if you will, the consequence of a failure, widely shared in our culture, to relate to ideas as if they might be radically incommensurable. Meaning what exactly? Simply that many ideas or values cannot be plausibly or usefully compared. Why not? Because the backgrounds presupposed by different ideas, the history and logic informing them, may well be too different for them to be comparable. In accepting incommensurability between one idea and another, we do not declare that we cannot or will not choose between them, only that neither reason nor logic will certify the correctness of the choice we make.
The idea of affirmative action seems to me compelling, even indispensable, though I know it to be a violation of other ideas I honor, such as the idea that, in making assessments and commitments, it is essential not to assign primary or even secondary importance to race or gender or ethnicity. The criterion of compensation, or restitution, that informs the idea of affirmative action is essentially distinct from and not comparable with the ideas that inform a principled resistance to affirmative action. There is no coherent political or ethical system that will allow me to assert with confidence the irresistible superiority of affirmative action as an idea.
Just so, the idea informing “choice,” or “a woman’s right to choose,” or “abortion rights,” is not commensurable with a religiously inspired opposition to abortion, which regards terminating a pregnancy as a sin. Proponents of both views, each rooted in fully elaborated ideas, often pretend that the one is clearly superior to the other, when in truth there is no coherent way to compare them. Again, we make our choice, but we deceive ourselves if we suppose that reason is sufficient to validate that choice. It isn’t reason that informs Antigone’s decision to bury her traitorous brother. In deciding to defy the law, and her uncle the king, Antigone does not compare the one idea of rightful obligation (to the law) with the other (to a brother). Antigone chooses as she must, given who she is. But the essentially incommensurable nature of the ideas at issue cannot be denied or resolved. The tragedy lies in the fact that both conceptions of obligation are legitimate and compelling.
We are not, most of us, tragic characters — I myself try very hard not to be one — and yet we can surely accept that in weighing ideas we will often be dealing with incompatibilities and incomparables, and proceeding without the help of a reliable, overarching standard. Liberal societies have, in general, tended to accept what John Gray calls the “agonistic character” of the freedoms we enjoy, and there is a sense in which mundane formulas like “agreeing to disagree” or “respecting difference” presuppose at least a vague appreciation of incommensurability. Not long ago you could quote John Stuart Mill on the “bad and immoral” persons who stigmatize those who hold unpopular opinions, and in quoting him you would know at least that Mill had helped to create the consensus on which modern liberal societies were built.
But that foundation is today not at all secure, and what Mill had to say about English society in 1859, when he published On Liberty, seems now extraordinarily pertinent. “In the present age,” he wrote, “— which has been described as ‘destitute of faith, but terrified at skepticism’ — … the claims of an opinion to be protected from public attack are rested not so much on its truth as on its importance to society.” To Mill’s observation we might add that, for a great many writers, intellectuals, and academics, the agonistic character of liberal culture no longer seems at all an attractive proposition.
Our educated classes regard the university chiefly as an instrument of our collective purpose and an efficient engine for transmitting anxiety about ideas felt to be dangerous or out of bounds. Bizarre that a culture officially committed to diversity and openness should be essentially conformist, and that the hostility to the clash of incommensurable ideas and even to elementary difference should be promoted with the sort of clear conscience that can belong only to people who don’t know what they’re doing.
My old friend Irving Howe once wrote that “every current of the Zeitgeist … every assumption of contemporary American life favors the safe and comforting patterns of middlebrow feeling.” We don’t use expressions like “middlebrow” any more, but the words “safe and comforting patterns of middlebrow feeling” do accurately identify a good deal of what we contend with. Don’t care for “middlebrow” as a handy term of derogation? Bracket it if you like, but also consider that the lock-step march of the new commissars setting up to take control of our cultural institutions, from the universities to the mainstream media, has much to do with creating what Howe called “safe and comforting patterns” of feeling. The favored ideas are informed by a determination not to offend, not to disorient, not to stir discomfort. The idea of learning as adjustment is rooted in hostility to friction or dispute. What Kwame Anthony Appiah described as conversational “etiquette” is designed to ensure that conversation remain safe and refuse to stray in the direction of ideas that could conceivably cause anyone to feel fatally set apart from others in one’s cohort. The desire to make nice is, so I believe, a staple feature of what Howe calls “middlebrow” feeling.
Inevitably the kind of thing I’m after here will take many forms, and in proposing analogies or historical antecedents I would never want to suggest that one expression of a political tendency or a cultural formation is equivalent to any other. And yet I have no reluctance to invoke a work that meant a great deal to American intellectuals of my generation, not because it addressed “middlebrow” feeling but because it alerted us to a phenomenon oddly renascent in our culture. I’m speaking of Czeslaw Milosz’s The Captive Mind, a work in which he anatomized the accommodation of Polish intellectuals of the early 1950s to varieties of authoritarianism. Central to Milosz’s portraits of typical Polish figures was their desire for what he called “a feeling of belonging,” a feeling not always easy to achieve, and thus accomplished, in Milosz’s telling, by a ready resort to the “Pill of Murti-Bing,” designed to relieve people of doubt and anxiety.
To be sure, many of his contemporaries resisted Milosz’s characterization of them, and many of those now teaching in the American academy will likewise resist my efforts to associate what is going on today with the tendency memorably analyzed by Milosz. They will resist, most especially, the suggestion that they have need of such a “Pill,” and they will deny that they have mastered a way of adapting to the ideological demands of the moment, contending that they have miraculously preserved, as Tony Judt wrote in 2010, “somewhere within themselves the autonomy of a free thinker — or at any rate a thinker who has freely chosen to subordinate himself to the ideas and dictates of others,” to persons who pass in this society for the indisputably “enlightened.” Judt also noted that students and others in the United States will find themselves “mystified” by the notion that anyone would simply capitulate, blithely buy into “any idea, much less a repressive one.” And yet, in a culture far removed from early-1950s Poland, the drift of portions of the intelligentsia into the fond embrace of safe and reassuring ideological postures, including an intolerance of ideas and persons felt to be “divisive,” is an unmistakable feature of the present moment.
Susan Sontag was powerfully attracted to the transgressive, to waywardness and incompatibilities, to what Philip Rieff called “the instabilities that are the modern condition.” Susan was by no means an easy person to be close to (we had a long and difficult friendship), but she was an exemplary intellectual and, for me, an odd and improbable authority figure. How so? In her attraction to writers and thinkers who “create themselves,” who are “inexorable” in their willingness to think against the grain of received or officially accredited ideas. When I interviewed her for the first time, in 1975, she had just engaged in an extended debate with the poet Adrienne Rich, who had argued that, in criticizing the filmmaker Leni Riefenstahl, Susan had “let down the good [feminist] cause.” Susan’s response? “Party lines,” she argued, “make for intellectual monotony and bad prose,” and result in “demands for intellectual simplicity.”
I’m happy to end there, with the sense, embodied in all of Susan’s work, that the life of ideas is at best strenuous and incompatible with the rage for “simplicity” and “party lines,” and that an academic establishment committed to “accepted community values” will never find a way to honor the transgressive, the inexorable, or the instabilities that are at the heart of the modern condition.
Robert Boyers is the editor of Salmagundi, director of the New York State Summer Writers Institute, and a professor of English at Skidmore College. His most recent book is The Fate of Ideas: Seductions, Betrayals, Appraisals (Columbia University Press). A version of this essay was delivered as the keynote lecture at a conference on “little magazines & The Conversation of Culture in America” at the New School in November.