I was roaming the book exhibit at the Modern Language Association convention when I saw a very famous person. There are only a few super-famous people in academe—people whom the literary scholar Jeffrey Williams has called academostars. This person is known from coast to coast and continent to continent. The very famous person was talking with an editor of a well-known academic newspaper. The VFP and I chatted, and I found myself excited by the reflected glory of her willingness to converse, even warmly, with me. When we parted, I pointed out to the editor that although this person was enormously famous, she had not written even one book that was a free-standing monograph rather than a collection of essays.
Academic fame is an even stranger goddess than her nonacademic counterpart. In the world of films or novels, your fame is fleeting—you’re often only as good as your last production. Films that splashed across marquees in the summer are all but forgotten when the snows fall. And as for books, secondhand bookstores and Web sites are swollen with works that were once the rage and now are obscure. Fame, like the success that D.H. Lawrence famously called the bitch goddess, is in those worlds really bitchy. But in academe you need to have written only one major book or article, and you’ll be remembered until you die.
Once your idea is accepted and becomes famous, it has amazing durability. In addition, once you’ve made your mark, it is very hard to erase it. You may write a lot of other books on different topics, but you’ll be remembered only for your original mark. Fame is very economical and can spare only one tomb per name in the academic pantheon.
We could, in fact, invent a new parlor game: Think of a famous person, and the assembled players have to shout out the concept associated with that person. Try Foucault—you’ll no doubt respond “power.” Said, Orientalism; Bourdieu, habitus; Baudrillard, simulacrum; Derrida, deconstruction; Watson and Crick, the double helix; C. Wright Mills, the power elite; Gramsci, hegemony; Elaine Scarry, pain; Donna Haraway, cyborgs; Judith Butler, performativity. Hours of fun to be had over nachos and salsa. Those scholars may write legions of books, but the subsequent work won’t erase the initial flourish of their signature concept. If these people were actors, we’d say they were typecast.
It is the nature of academic fame that it is faddish and cultish—faddish in the sense that, by definition, it is based on the newness of a scholar’s ideas. Readers demand the shock of an innovative insight. But it is only a matter of time before a concept slides into familiarity. The new idea becomes institutionalized. Therefore the successful outcome of any famous concept will be its general acceptance, to the point that it becomes, paradoxically, commonplace. We then find it difficult to understand how the concept could have had the éclat that it did originally.
Why is it that most scholars become famous for only one idea, and that this idea is destined to become ordinary? The fame of a specific idea depends, again paradoxically, on its general application across fields. The broader the sowing of the idea, the greater the fame of the originator. When we think, for example, of Derrida or Foucault, we think of so many fields and specializations that have adapted and used their work.
But in appealing to a broad range of scholars, the key insight has to translate into a generic form of academic lingua franca, and when those ideas filter down to graduate students, who themselves must master a lexicon of such ideas, there isn’t room for much qualification or variation in the basic idea. Graduate students in some sense, then, are fame bestowers—certainly they buy the most big-idea books. They need to learn and master a wide range of thinkers, and so the economy of the one-idea-per-scholar paradigm reigns.
Although academic fame is durable, you can lose it if your idea becomes antiquated. Fame’s mental handprints in the pavement of academe’s Grauman’s Chinese Theatre are set in concrete, but not in stone. That’s because no matter how famous you get, the zeitgeist shifts every 30 or 40 years or so. You might have been the most famous semiotician of the 1970s yet be unknown in 2008. So while your fame can endure, it can endure only as long as the zeitgeist that gave it meaning. The names of Greimas and Genette, which caused scholars to tremble with intellectual fear and excitement in the late 60s, now produce a tremor of a yawn. Northrop Frye’s Anatomy of Criticism was requisite reading for every graduate student in the 60s and 70s; now it’s virtually unknown and certainly unusable as a guide to criticism today. Wayne Booth’s The Rhetoric of Fiction was a must-read for every English major during that same period and is now largely forgotten. The early feminists’ insights might have made a splash in the late 60s and early 70s, but now they are considered essentialist and passé.
But my general point is that academic fame is not easily lost, compared with other kinds of fame. It is hypothetically possible to lose that fame, but the remarkable thing is that in all the examples I can think of—and I encourage you to join me in this thought experiment—I cannot come up with a single famous academic who became infamous solely through his or her writing or speaking. And I can think of only one or two who actually lost an appointment in that way. (I discuss them later.)
There is, then, wide latitude in academe for various kinds of transgressions, and one can attribute that to a strong allegiance to the sacred idea of academic freedom. Sexual scandal, for example, which routinely fells American politicians, rarely, if ever, knocks off any academic laurel wreaths. In fact, as in Hollywood, sexual gossip, innuendo, and even obsession may augment an academic’s celebrity. Sexual harassment should lead to the loss of academic fame, but I can come up with no example of its having done so.
One would assume that commenting controversially on politics would precipitate a rapid fall from grace, but that is apparently not the case. Both the philosopher Martin Heidegger and the literary critic Paul de Man have been accused of anti-Semitism, yet their reputations have remained relatively untarnished.
Heidegger was a member of the Nazi Party who became rector of the University of Freiburg in 1933. His writings at the time indicate strong support of the Nazi cause, and a contemporary photograph shows him in full Nazi regalia. He later denied that he had been a fervent Nazi and said he had joined the party only to save his university. Subsequent documentation proves otherwise, and, as many have noted, he never apologized for his involvement in Nazism. Yet his ideas are still widely studied and taught.
Paul de Man, a Yale professor and inaugurator of deconstruction in the United States, was found after his death to have written over a hundred articles for the Belgian collaborationist newspaper Le Soir. Both de Man and Heidegger have been the focus of vigorous arguments for and against ignoring politically reprehensible actions in favor of a body of work. De Man’s reputation has perhaps suffered a bit, but he has had many heavy-hitting apologists, including Jacques Derrida and J. Hillis Miller. In some sense, there has been both scandal and spin, and the reputations of both de Man and Heidegger depend on the effectiveness of the spin over the scandal. Still, it has been hard to strip fame from those two scholars and drape them with notoriety, or even infamy, because their ideas endure.
I’d like to look at two contemporary examples as well: Edward Said and Peter Singer.
Said was a controversial but highly regarded scholar whose work on Orientalism made him famous. (I was a graduate student of his, as well as a friend and colleague.) His early book Beginnings earned him a reputation within the field of literary theory, but the jump to “famous in all fields” occurred when his work impinged on the public sphere. Because of Said’s writing and speaking out on the question of Palestine and his involvement with the Palestine Liberation Organization, he was denounced by Zionist and other pro-Israeli organizations, but that antagonism may actually have served to gain him more fame than notoriety. He was a regular face on public television, often interviewed on public radio, in demand internationally as a speaker, president of the Modern Language Association, winner of many awards, and so on.
But when a photograph of Said throwing a stone from the Lebanese border into Israel appeared in the news media around the world, there was a strong move to oust him from his tenured position at Columbia University. Right-wing pro-Israeli groups referred to Said as a professor of terror, and he was accused of throwing stones at Israeli border guards. But according to Said, he had taken a popular bus tour with his son, Wadie, and one of the stops was this particular location, at which it was routine for bus riders to symbolically throw stones. His son had challenged Said, who was very competitive in sports, to see who could throw his stone farther.
It is significant that it takes something amounting to a physical act to threaten fame. One earns academic fame through writing and speaking, but infamy is more about the deed than the word. The move from language to objects, from thought to action, shifts the grounds by which one is judged in academe. We might call this the “sticks and stones” rule. All of the negative publicity, by the way, did not remove Said from the academic pantheon.
Another example of the sticks-and-stones rule might be observed in the career of Peter Singer. This Australian utilitarian philosopher has had two major, related tracks in his career: He is an animal-rights theorist, and he is also a theorist who advocates the removal of life support from severely disabled people. Those two positions might seem contradictory, and I can’t resolve them for you easily here, except by saying that utilitarian arguments favor vegetarianism as a way to reduce suffering overall, and removing people from costly life support is likewise claimed to serve the greater good. While many in the disability community and religious organizations denounced Singer’s draconian stance toward end-of-life issues and toward cognitively disabled persons, his notoriety hardened into infamy when he violated his own rules to keep his severely disabled mother alive in a plush nursing home. That shifted theory to practice, words to action. Yet despite his infamy, he remains famous. Your notoriety can feed your fame.
There seems to be only one way that words can have the force of actions, and that is in the realm of identity politics. Attacks or slurs on people based on their identity can harm the attacker’s reputation. Lawrence Summers, for example, as president of Harvard University, suggested that women might be less successful than men in math and science careers partly because of innate differences between the sexes. Reaction to those comments and to his general behavior led him to resign the presidency. But he still holds an illustrious appointment on Harvard’s faculty and was named by President Obama to head the National Economic Council, so his fall from Harvard’s presidency was hardly ruinous.
Ward Churchill was not famous initially, but he became infamous by writing, in an essay shortly after September 11, 2001, that U.S. foreign policy had caused the terrorist attacks, and that the people working in the World Trade Center were “little Eichmanns” because of their involvement in global capitalism. He was ultimately dismissed from the University of Colorado at Boulder. The notoriety his words provoked led to scrutiny of his academic work, and research irregularities were found. Interestingly, the only way he could be fired was on the basis of acts of plagiarism and the like. A subsequent lawsuit ended in his favor, with a jury finding that Churchill had been wrongfully dismissed, although it awarded him only one dollar in damages. So far the university has not rehired him, though his fame remains intact.
Plagiarism seems to be one of the likeliest ways for a famous academic to become infamous. Yet even there, so much depends on the effectiveness of spin over scandal. The historian and former Harvard professor Doris Kearns Goodwin was accused of many instances of plagiarism; virtually verbatim sentences and paragraphs had been lifted from other sources. She denied deliberately plagiarizing, claiming that her assistants were to blame for sloppy note taking. Far from becoming infamous, Goodwin was a ubiquitous talking head during the 2008 election frenzy.
Nor have other famous academic offenders—historians all—suffered substantially. Michael Bellesiles, a former professor of history at Emory University, wrote a book on the early history of guns in America and was accused of falsifying information and sources. He eventually resigned, and the Bancroft Prize that Columbia University had awarded his book was rescinded. Joseph Ellis, a historian of the early American republic at Mount Holyoke College and a winner of the National Book Award and the Pulitzer Prize, turned out to have lied about his first-person accounts of serving in Vietnam and about other aspects of his life and work. He received a one-year suspension, during which he began work on a biography of George Washington that was very well reviewed and made it to No. 6 on The New York Times best-seller list. So much for scandal.
Falsification of academic work can lead to its own kind of fame. Alan Sokal, not particularly famous outside his field (he is a physicist and mathematician), published a deliberately bogus essay in Social Text that affected to endorse the social construction of scientific facts, suggesting that even gravity and the speed of light could be seen as social constructs. No one realized the essay was a hoax until Sokal revealed it himself. But rather than being excoriated, he was acclaimed as a fact-buster and quack-revealer. And, of course, he was never penalized or reprimanded by his university.
Finally comes the very special fame of my colleague at the University of Illinois at Chicago, Bill Ayers, an education theorist. His long-ago notoriety as a cofounder of the Weather Underground in the 60s flashed forward during the 2008 presidential campaign, when his acquaintance with Barack Obama briefly made him America’s best-known domestic terrorist. His academic career and standing remain intact, and his appearances on national news programs just confirm his newfound, although perhaps momentary, fame.
So why is academic fame so resilient? Why is it so hard to lose? Probably because academic knowledge is Talmudic: To know something, you need to know what came before it. Every study has to present a review of the literature. Academic knowledge always relies on a receding chain of authorization, with each reference referring back in time and logical development to a previous concept and the person linked to it. Thus, ultimately, the history of famous ideas is a history of famous people. Once installed in the chain of reasoning and citation, famous people and concepts are hard to remove from the collective memory.
Films can go out of fashion and seem dated, but even antiquated academic ideas remain part of the history of ideas. Our very notion of knowledge always includes ideas we know to be untrue. We don’t accept much of Locke’s account of the mind or Augustine’s cosmology, but we still study them because, like Everest, they are there. They are part of the terrain of the history of philosophy. To understand Nietzsche, you need to read Kant, and to understand Kant, you need to read Descartes, and so on. Nothing gets lost, and so it is virtually impossible to lose fame in academe, even if you are infamous for a time.
With the advent of Google and other Internet resources, the eternalization of citations and references takes on new proportions. You can find anyone on the Web, where no one ever goes away. Even if you get fired by your university for plagiarism, you’ll never get fired from Google. And Google not only acts like a large collective unconscious that never forgets, but also feeds a fame now gauged by the number of hits a name produces.
In the end, academic fame may turn out to provide what the ancients were looking for in architecture and statuary: a kind of immortality. Ozymandias, who thought his eternal reputation was assured by his huge statue, didn’t get more than a pair of big feet in the desert. Yeats may have derided bronzes as “self-born mockers of man’s enterprise,” but academic fame seems to have taken on a hermetically sealed luster that may not dim for some time to come.
Lennard J. Davis is a professor of English, disability and human development, and medical education at the University of Illinois at Chicago. He is the author, most recently, of Obsession: A History (University of Chicago Press, 2008) and Go Ask Your Father: One Man’s Obsession With Finding His Origins Through DNA Testing, published this month by Random House.