The second act of this drama is also predictable. After the University of Chicago legal scholar Brian Leiter leaked the letter on his personal blog, Judith Butler wrote a letter to The Chronicle apologizing for signing it. She recognized that the signatories “ought to have been more fully informed of the situation” before seeking to influence NYU’s decision. Similarly, after widespread online censure of the Harvard letter, 34 of its signatories published a retraction. “We were lacking full information about the case,” they said, remarking that they did not understand the extent of the impact their action would have on their own students.
What I find most interesting about reactions to the Harvard case beyond academe is commentators’ surprise that scholars of renown — “among the most accomplished and admired intellectuals in the world,” according to the New York Times columnist Michelle Goldberg — would demonstrate such poor judgment. Some of this is, to coin a German term, mere Efeuschadenfreude — the pleasure of seeing something bad happen in the Ivy League. But focusing on Harvard obscures the painful fact that all academics, famous or not, are capable of errors of judgment, especially when they instinctively fall in line with the opinions of their peers. A lengthy education, it turns out, is no guarantee of rigorous, independent thinking — to say nothing of ethical behavior. The spectacle of celebrated scholars taking underinformed stances in public is an argument for more humility throughout academe.
Acting foolishly now and then is human nature. The problem is that we scholars tend to assume we are immune from errors of thought. North American academic culture (especially in the humanities) thrives on revolutionary rhetoric: Its stars are those who can think in new ways, intellectual radicals ready to overthrow past orthodoxies. It would seem to follow that the more successful a scholar becomes, the more capable they are of breaking away from the herd.
The opposite is true. Years of apprenticeship and evaluation on the way to a tenured position, along with the extended emotional adolescence this experience fosters, render academics docile. We learn to anticipate the reactions our superiors will have to our work, our teaching, even to the way we spend our free time. We learn to ward off potential criticism, especially from those whose scholarly work intersects with our own, since it is their opinions that will count the most for continued employment. We rebel only in ways we know will be palatable to a substantial cohort of our peers. True iconoclasts have a tendency of finding their way to the door.
The most powerful factor, however, is social media. Scholars who might have presented themselves to their field only a few times a year at conferences now spend enormous amounts of energy crafting and maintaining public profiles, some of which are visible to the entire world. The marriage of social media and university life is not always a happy one. Academe used to have spaces that were primarily devoted to experimentation: The classroom and the conference allowed people to try out new ideas, face opposition (friendly or not), and sharpen arguments accordingly. Now, both classroom instruction and conference presentations can be made available to the general public in real time. Even when well-intended, this immediate translation to the public sphere means that scholars are justifiably anxious about presenting work that is vulnerable to attack.
The situation after publication is not much better. Social media can be used to spread news about a new article or book, but also invites immediate commentary. I have seen scholars tearing apart a colleague’s book online right after publication, before they could have had time even to read the entire work — thus shaping the field’s perception of it long before anyone else had a chance to make up their own mind. Even the harshest reviews used to take a year or two to go to press.
The awareness that any intellectual position can be, within minutes, reduced to a flat caricature and widely denounced leads to a reasonable unwillingness to express independent thought. Disputes that might previously have taken place within an academic field, or inside one’s own university, now have the potential to be publicized on a large scale.
On the positive side, this means that questionable or arguably unethical actions can come to light more quickly, as in the case of clubby letters in support of powerful professors accused of abuse. But less powerful people are, if anything, more likely to be cowed by the threat of social opprobrium, since they will not have the same networks of support to fall back on. Their livelihoods may be at stake; their mental well-being definitely is. Psychological research shows the close relationship between social exclusion and physical pain, but we can find an especially vivid, and famous, example in the very early period of the modern university in Paris, in the life of another teacher who ignored professional boundaries. The 12th-century philosopher Peter Abelard was castrated by his wife Heloise’s family after he seemed to abandon her, a humiliating, traumatic attack he described in his History of My Calamities. But a few years later, when he was forced to burn his own book at the Council of Soissons, he decided that being mutilated was easier to bear than cancellation, medieval-style: “I mourned the harm to my reputation much more than the harm to my body.”
Fallacy 1: People are who they say they are. They are not. Anecdotally, it is a good idea to assume the opposite could be true, especially when an individual makes a name for themselves as an intrepid fighter for virtue. People who build a moral platform are sometimes trying to deflect from their own lapses. A politician on a crusade against the misuse of campaign funds will often turn out to have used their own campaign funds in some sordid manner. Likewise, the scholar who throws around accusations of intellectual theft willy-nilly may turn out to have copied their prose from Wikipedia. The examples are many.
Fallacy 2: Your own interactions with a person are representative of everyone’s experience of them. In fact, we tend to be both hyperaware of the power others have over us, and unaware of the power we have over others. This is why even scholars at the top of their field — protected by tenure, powerful university lawyers, and networks of influence — can imagine themselves victims of witch hunts. We do not know how someone behaves toward those less powerful than they are unless we catch them unawares, or unless their targets trust us enough to tell us. They usually don’t.
Fallacy 3: Someone’s line of research is indicative of their ethics. It can be — there are scholars whose behavior toward others is in harmony with their stated politics — but this is hardly a given.
Fallacy 4: Brilliance is equal to excellence in all respects. It is not.
Fallacy 5: Someone’s social identity is an index of their goodness, so, for example, a man can abuse graduate students but a woman cannot. This myth is particularly painful for victims of people who seem like they could not possibly be offenders.
It should be clear at this point that my observations cut across methodological and political lines. Nor do I exclude myself from criticism. My logic suggests that I am as susceptible to groupthink as anyone else; indeed, some of my worst decisions have come from letting others do my thinking for me. This is not a condemnation, but an appeal that we try to be as careful in our professional lives as we are in our scholarly work. For years I have seen humanists defend their fields by claiming that we teach “critical thinking.” The cultivation of rational citizens, capable of independently questioning and interpreting the world around them, is supposed to be one of the reasons our research and teaching are so valuable. But what is our work really worth if, after years of teaching our students to think critically, we cannot do it ourselves?