Do you believe “science is real,” as some yard signs insist? The slogan is not about reality but about society — it means “Scientists are trustworthy.” As a Chronicle reader, you’re unlikely to be in the camp of the extreme skeptics. You probably don’t think, for instance, that scientists are using the Covid-19 vaccine to malevolently insert a digital tracking chip into your bloodstream, as one of the more florid conspiracy theories of the moment has it. (Twelve percent of Americans believe this, apparently.) But outside of such fringe commitments, the spectrum of plausibly “reality based” attitudes toward our expert classes is very wide, including everything from implicit trust in the reliability of “Dr. Fauci” (credential firmly in place) to anxieties about ramped-up government surveillance and border control.
The pandemic has rendered the question of the authority of the medical sciences hyper-salient, but scientists are hardly the only experts to have come under increased scrutiny in recent years. The collapse of the U.S. war in Afghanistan reignited debates around foreign policy and military expertise that have been simmering at least since the Vietnam War. A growing divide between rich and poor has caused many to wonder whether economists know as much as they think they do. And a series of high-profile disputes over American history, spurred by The New York Times’s “1619 Project,” has raised essential questions about the nature of historical knowledge. All of these issues have confounded predictable distinctions between left and right.
The academy is expertise’s natural abode — its incubator, gatekeeper, protector, and even home to some of its sharpest critics. At a time when the authority of the university is under attack — from the right, for whom trust in academe has steeply declined, and from the left, increasingly hostile toward the rubrics of meritocracy so central to academic life — renewed attention to expertise has never seemed so urgent. We asked nine scholars and writers from across disciplines to answer the question: What will the vicissitudes of the last couple of years mean for the future of “expertise,” broadly conceived? Here’s what they told us.
Beware those suddenly endowed with politically convenient insight.
Once that change has overtaken a scholar, the habit of conscientious restraint is loosened for the sake of a cause more urgent than accuracy. Credulous listeners may come to believe there is a lock to which the scholar has been vouchsafed a special key. “This reminds me of that” becomes a dominant mode of assertive inquiry. So many reminders are now vivid in the world around us, so many equivalences, so many parallels crowd the horizon of current events. And an anxious audience is eager to accept the explanations only the scholar can give.
So a history professor outguesses the FBI and, equipped as a historian, discerns in the violence of January 6 a new “Reichstag Moment”; while a philosophy professor uncovers the germ of fascism in the fact, apparent to a philosopher, that fascists (unlike other politicians and power brokers) are “intentionally mythical.” Academics of this stamp have opted out of the morale of scholarship to take up the role of prophet and preacher. They feel that their scholarship has fortified them with a rare understanding. They have come into possession of a penetrating night vision for facts and analogies that are convenient for their party position.
In the summer of 2020, over a thousand health-care workers signed a public letter declaring that the mass protests against racism — though done in defiance of Covid restrictions — were actually beneficial to public health. “White supremacy,” they said, “is a lethal public health issue that predates and contributes to Covid-19”; accordingly, they concluded, “we support” the mass protests “as vital to the national public health.” This seemed, on the face of it, a confusion of categories as strange as the notion that misogyny is a greater danger to women than heart disease. The public letter was clearly meant to promote a moral purpose. These demonstrations, the signers were saying, deserved the support of good people, unlike the merely recalcitrant protests against Covid restrictions elsewhere. In the short run, the signatures-in-solidarity may have given heart to the protests. In the long run, their effect was to raise a suspicion that medical professionals were no longer giving medical advice.
As director of the National Institute of Allergy and Infectious Diseases, Anthony S. Fauci wields control over an annual budget of $6 billion — a power impossible to gauge in ordinary institutional terms. His performance over the last 20 months, therefore, was bound to be closely watched. What soon became unmistakable was his insistence on his posture as an expert, and — notwithstanding a calm, reassuring, and kindly demeanor — the essential immodesty of his presentation. He changed his official directives many times: on the advisability of wearing masks at all; on the superiority of two or three masks; on the necessity of children staying home from school, and the reasonableness of sending them to school; on the certainty of a natural origin of Covid, and the possibility of a laboratory origin; on the contagiousness of Covid in small or large outdoor gatherings.
Some of these shifts were inescapable, given the uncertain and changing information about the virus itself; but what never changed was the rigorous authority claimed by the expert. Much of the advice Fauci was disseminating was not primarily medical. It was civic or social in nature. He never drew that line, however, and seldom took the occasion to say “We don’t know.” This misuse, and overuse, of expertise, and the confusion of the expert claim with personal charm and authority, bears some of the responsibility for the quasi-political conflict the U.S. is now witnessing on the general advisability of vaccinations.
Daniel Ellsberg, in his memoir Secrets, offered an acute observation on the seductiveness of the expert posture to the expert. When, during the Vietnam War, Ellsberg worked for the assistant secretary of defense John McNaughton, crossing his desk every day were classified messages and intelligence reports no ordinary American was privy to. He knew, of course, the incompleteness, the speculative pretense that underwrote many of these documents, and in some cases the outright concealment and falsification; but there was a sense of privilege, too — a certain protective sensibility that came with knowing the secrets.
Such knowledge can make one feel entitled to wave away outside challenges in spite of one’s own doubts. After all, Ellsberg had seen things that those others — the outraged citizen, the journalist with a good nose — simply couldn’t have detected. Like the immodest institutional authority and, in a way, like the scholar who turns into a popular-wisdom source, the national defense expert is apt to become a committed person far in advance of the evidence. Dick Cheney, Paul Wolfowitz, Douglas Feith, Stephen Hadley, David Petraeus, and a host of high-ranking advisers had digested a great deal of expert knowledge before they took the United States into Afghanistan and its many sequels.
David Bromwich is a professor of English at Yale.
It’s the Politics, Dummy
The corruption of science and expertise.
How did we get to this point? The pandemic capped a disquieting sequence that started with post-9/11 conspiracy theories and proceeded with panicked conversations about “post-truth.” The extent of the damage required a search for causes, and possibly for culprits. An early salvo was fired in 2004 by Bruno Latour, when he staged a pro forma penance for having deconstructed scientific facts in his youth (without inhaling), only to accuse those who see “capitalism” or “imperialism” as the driving forces behind science of having laid the groundwork for contemporary conspiracy theories. Had he acknowledged that he was rehashing Karl Popper’s 50-year-old philippic against the likes of Karl Mannheim, the argument might have lost its edge.
Others sought the roots of the problem in postmodernism, the eternal turd in the Enlightenment punchbowl: Facts had become representations; right, might; knowledge, power; science, an instrument of discipline. With his impeccably glabrous head that gave him the physique du rôle, Michel Foucault was often cast as the James Bond villain — laughing ominously as a shackled and screaming Truth is lowered into the corrosive bath of Discourse — with Jacques Derrida a disheveled sidekick. Never mind that a much more radical deconstruction of scientific facts had already been offered in the early 1930s by a real scientist, the Polish chemist Ludwik Fleck, at a time when Foucault was still catching his first glimpse of pastoral power as an altar boy and Derrida had just transitioned to solid foods: The humanities had to be purged of this scourge once and for all.
The “crisis of expertise” narrative is premised on the questionable assumption that there once existed a technocratic golden age that saw benevolent men in gray suits and thick-framed glasses rule by universal epistemic consensus. This age never was. Even modernization theory, arguably the most ambitious social engineering project of the 20th century, had from the outset a checkered history and ended in failure. Ask any technical adviser recently disgorged by a C-17 Globemaster on the tarmac of an Uzbek airport, their eyes puffy from the sleepless flight from Kabul and their laptop bloated with obsolete nation-building templates and backfired counterinsurgency plans: Grand schemes to improve society from above, as James Scott has argued, usually encounter resistance. Sometimes, resistance can be a factor of both moral and scientific progress, as when the activist group ACT UP fought the medical establishment in the late 1980s and successfully pushed for more funding for HIV research and wider access to experimental treatments for AIDS victims.
But the “crisis of expertise” is only part of the story. One must also consider the idolatry that surrounds apolitical expertise in some quarters. Take Anthony S. Fauci. During the pandemic, he became a national hero among American liberals not because of what he did — which didn’t amount to much — but because he seemed to be a paragon of truth and rationality in the dizzying multiverse of Trumpworld.
Fauci’s popularity was boosted by political factors. If the #resistance was prompt to co-opt him as one of its mascots, along with Robert S. Mueller III and James B. Comey, it is largely because the heroes of this geriatric sequel to Marvel’s Avengers: Prime derived their superpowers from two sources, science and the law, which work best when what’s political about them is kept out of sight.
Which brings us to the crux of the matter: The “crisis of expertise” cannot be separated from a distinctly political agenda — the return to neoliberal technocratic governance after its national-populist variant — that relies on expertise to eschew politics altogether. Such a strategy reflects the inherent tendency of neoliberalism to insulate rule making from democratic politics, something best done by enshrining political choices in regulatory frameworks better left to specialists. The result of this escamotage is a deeply anti-political form of politics. Because it presents itself as nonpartisan expertise, you won’t find its program in manifestos but, most of the time, in the popular-science aisle of your local Barnes & Noble.
The genre has indeed perpetuated just-so stories about a science free from the vicissitudes of politics and finance, at a time when universities were restructured under neoliberal injunctions and the production of knowledge subjected to the arbitrage of a marketplace of ideas. These sanitized narratives are instrumental in peddling neoliberal common sense refurbished as everyman’s science. The result has been the corruption of science and expertise. When the authors of Freakonomics promise to reveal “the hidden side of everything,” who needs conspiracy theories?
The crisis of expertise is endogenous to political crisis. When the prevailing social and economic arrangements work for the majority, people do not care much about the political role of experts. But when humanity teeters on the brink of environmental collapse, is devastated by a pandemic and torn apart by economic inequalities, the technocratic curtailing of politics inevitably hits a wall. If the only response to the pent-up need for a wide-ranging revision of these unsustainable arrangements comes across as “Science for Dummies,” a backlash is inevitable. And sure enough, such backlashes usually end up fueling the wrong kind of politics.
Nicolas Guilhot is a professor of intellectual history at the European University Institute.
Devils Without Details
Against the seductions of the grand sweep.
At some point, my mother and I clashed over what this meant. If I was “serious,” she said, then I needed to learn music theory. I was hungry to be on stage moving people with my depth of feeling, while she was buying flashcards and workbooks and quizzing me on harmonic structures. None of it seemed relevant to anything I cared about, and I rebelled, as kids do.
That’s now one of the great regrets of my life. But my mother’s insistence on the small details that buttress a big impression — the memorization, the fine-tuning — guides a great deal of the sort of scholarship I admire.
It is easy, given the vogue for postures of generosity these days, to forget that expertise, too, should be a generous disposition. While the “thinker” follows conceptual strands across big, interlinked questions, the “expert” gives due attention to all the nuts and bolts that hold an answer together. A good scholar of course does both, but finding the right balance is a never-ending task.
And it is easy to get lost in the weeds of consecration: Who really “counts” as an expert, and on what? This is a fruitless mission in most of the humanities, since there are close to no limits to how minutely we can define a field of knowledge. But humanists need what I call “dispositional expertise,” by which I mean they need to be attracted to understanding how broad conclusions of different kinds can come to seem tenable up close. This entails learning lots of things that you’ll never “use,” because knowing what you need to know often depends on learning many things you don’t.
Humanities expertise by even this expanded definition is probably imperiled in the modern American research university. I say “probably” because I have no systematic way of determining whether this is or is not the case; because I see things, as we all do, from my own intersection of fields; and because I am open to being convinced I am wrong.
What I do have is a sense that a lot more people, in many more disciplines than was true even five years ago, are invested in a vocabulary that implies a commitment to social justice: race, empire, decolonization. Too often, such terms are disassociated from insistence on expertise; they are seen as devils without details. It is one thing to think about “empire” in a Hardt-and-Negri sort of way, as a conceptual abstraction, and quite another to get deep into the guts of one — the vast, complicated archives, the technical minutiae of power, dispossession, accommodation, and collaboration, the warp and weft of history.
One bit of evidence for the unfortunately widespread perception that expertise and social engagement are somehow in conflict is the number of colleagues, from various schools, whom I have heard suggest that doing away with language requirements might be one way to be more inclusive. The problems with this line of argument are many (most people in the world are neither white nor monolingual), but the most alarming idea latent here is that it is possible to skip straight to political breadth by sacrificing a commitment to scholarly depth. This serves minoritized traditions least of all.
Jeanne-Marie Jackson is an associate professor of English at the Johns Hopkins University and an Andrew Carnegie fellow.
Identity Shapes Expertise
Ignoring this fact perpetuates injustice.
In 1984 the legal theorist Richard Delgado set out to find the leading experts in civil-rights law. He assembled the top 20 law review articles on the topic using a seemingly objective standard: those published in leading journals and those that received the greatest number of citations. But he discovered a curious fact: All of the articles that fit those criteria were written by white men.
There was an “elaborate minuet,” as Delgado called it, of exclusively internal debate within this cohort over the best means to advance racial justice. Many of their arguments were strong, but how could it be, Delgado wondered, that the ways in which unequal treatment can cause a person to suffer a “withered self-concept” would be best represented in the work of white authors citing other white authors rather than in work by people of color exploring the effects of racist societies? When he asked the author of one article why he didn’t cite the work of theorists such as Kenneth Clark, the author explained that he selected the source he cited because it was “so elegant.”
The problem, Delgado argued, was that ruling out considerations of identity in the development of intellectual work diminished the quality of that work, but in a way that current thinking about expertise could never reveal. Judging the top articles by the number of their citations may seem like a neutral standard, but it in fact perpetuates injustice while concealing that injustice.
Sonia Sotomayor was pilloried during her confirmation process for giving voice to the idea that a person’s background and experience can affect their legal judgments in ways that our justice system should acknowledge. This is a pretty mundane claim: If I want to learn what it means to live as an undocumented person in America, I need to listen to undocumented people. Similarly, it may well be useful to have a Latina from a disadvantaged background on the Supreme Court. That is a legitimate aspect of one’s expertise. But Sotomayor’s claim was treated as unspeakable in polite society because it challenged shibboleths about the neutrality of expertise. It is these shibboleths that rationalize white male upper-class rule.
Our understanding of expertise is foreshortened if we assume that it is something any smart person can achieve equally about any conceivable topic. Rationality is not simply the execution of a function — it involves content, and it is this content that helps to inform judgments. Let’s begin to recognize our diverse experiences as aspects of the expertise we bring.
Linda Martín Alcoff is a professor of philosophy at Hunter College and the Graduate Center of the City University of New York.
On Tap but Not on Top
Experts are essential — but must know their place.
Governments all over the world have taken cover behind the slogan that they are simply “following the science.” However, science is silent on questions of policy that politicians must resolve. Lockdown or no lockdown? When to reopen schools? Which groups, if any, should be eligible for booster shots? It is absurd to suggest that these are questions to which virology and epidemiology provide answers. Individual scientists have their views, like the rest of us.
And like the rest of us, scientists’ views about policy questions reflect their values, which are not certified by “the science” or by their professional expertise. Science does not prioritize or decide between different values, such as safety and liberty, in the way that governments must. When political leaders defend their priorities by claiming that they are only doing what the science tells them to do, they are bluffing.
To the extent that the pandemic has made vivid to the public the distinction between questions of fact and of value in public policy, that is a good thing. The last 18 months have helped the public to understand, if they did not already, that there is no such thing as “the science,” singular. There are sciences, plural, and there are scientists who don’t necessarily agree with one another. What scientists say today they might revise tomorrow. No statement is immune to revision.
This welcome shift toward nuance has, however, produced an overreaction among some sections of the public. The conclusion they have drawn from the fallibility of science is that we should be suspicious or even dismissive of what scientists are saying — even when they are all saying the same thing. In its most extreme form, this line of thinking leads some people to embrace outlandish conspiracy theories. When challenged, they insist that they are exercising their God-given right to think for themselves. While thinking for oneself is no bad thing in principle, it has proved lethal for those who dismissed the threat of Covid-19 only to contract and succumb to the illness. It is inadvisable to think for oneself if doing so means contradicting the views of those who are in a much better position to know the relevant facts.
Writing in 1927, the American philosopher John Dewey argued that experts are inevitably so far removed from common interests as to become “a class with private interests and private knowledge.” A government that overvalues expertise, he wrote, risks becoming “an oligarchy managed in the interests of the few.” This might be an exaggeration, but Dewey was right to draw attention to the dangers of excessive deference to experts. A mature and balanced view of expertise recognizes its limits.
The challenge for the public has been to avoid the twin extremes of excessive deference and excessive skepticism toward experts. Despite all the hype about the death of expertise, most people are not “anti-expert.” They still seek the services of an oncologist when they think they have cancer and the services of an electrician when their lights don’t work. At the same time, they grasp that the pandemic has thrown up questions of policy and value that are for all of us to answer. The fact that liberals and conservatives argue about the appropriateness of lockdowns is not proof of scientific ignorance. It is evidence that members of the public with different political allegiances can recognize a question of value when they see one.
Quassim Cassam is a professor of philosophy at the University of Warwick.
We need processes that check experts’ bad behavior.
These “noble lies” showed that flawed expert communication can do serious damage to public trust. One lesson here is that experts should avoid noble lies — all lies, really. Instead, they should share clearly and openly the uncertainty and limitations of their knowledge.
Yet behavioral prescriptions may not be enough to solve the problem. If public trust in scientific expertise depended on experts’ never behaving or communicating badly, it would be a fragile trust. The more important question is whether advisory processes themselves can be structured differently — whether they can more effectively expose and contain individual missteps, allowing the public to reflect more critically on expert statements.
Jeremy Bentham argued that trust is appropriately placed in institutions that systematize an attitude of healthy distrust. He believed that carefully designed institutions would facilitate good judgments about which particular people or claims could be trusted. The advantage of this approach is that it recognizes there will be good and bad behavior within any system — so rather than premising trust on the behavior of individual actors, it aims to build trust in the system itself.
The question, then, is what institutional structures could earn public trust based on an attitude of healthy distrust. The standard answer is that the internal gatekeeping mechanisms of science are designed to do just this: Peer review ensures the reliability of scientific claims, credentialing systems weed out the unqualified, conflict-of-interest protocols screen for unethical behavior. It has long been maintained that these scientific structures are — and should be — the guarantor of public trust in expertise. When that trust declines, it is natural to find fault with the workings of these systems.
If the pandemic has taught us one thing on this subject, it is that we cannot expect the efficacy of internal scientific gatekeeping mechanisms alone to sustain public trust in expertise. These systems may be robust enough to establish the reliability of scientific claims generally, but when scientific findings are the basis of policies that affect people’s lives, establishing trust requires additional kinds of scrutiny. The public needs to know how its interests and values are being represented in research and advisory processes, how concepts such as health, efficacy, protection, or precaution are defined, and what alternative conceptions may be possible.
In both the herd-immunity and masking examples above, experts made assumptions about the public interest that were controversial and only belatedly disclosed. For experts to earn and keep public trust, we need public debate on points like these, ideally through institutions that could test expert claims against the public’s values and priorities.
What might these institutions look like? Scientists in Britain set up an independent scientific advisory group — Indie SAGE — to weigh in on pandemic-response measures and challenge official advice. The group publicized all its recommendations, broadcast its meetings on YouTube, and fostered public conversation about scientific knowledge. Some expert bodies also sought direct public participation: The Centers for Disease Control and Prevention’s Advisory Committee on Immunization Practices has always been open to the public, and during the pandemic it was responsive to public criticism of its vaccine-distribution plan.
We could also design new institutions that bring together scientists, policy makers, and citizens at earlier stages of the research and modeling process. We might broadcast public debates between scientists making the case for and against a proposed policy, for example, or design a “science court” where citizens could directly question competing expert claims. Such institutions would cultivate the healthy distrust that makes sound judgments — and public trust in those judgments — possible.
Zeynep Pamuk is an assistant professor of political science at the University of California at San Diego and author of Politics and Expertise: How to Use Science in a Democratic Society (Princeton University Press).
Trust must come from how we build science — not just how we communicate it.
Rudd has a Ph.D. in economics from Princeton University and a multi-decade career in some of the most powerful economic institutions in the world, including the U.S. Treasury Department and the Federal Reserve Board, where he now works. If there is such a thing as expertise in the study of economics, Rudd seems like the kind of person who would have it. But the introductory section of the paper suggests that many economists “organize their thinking about real-world economic phenomena” around ideas that have “little empirical foundation” and that are “seriously deficient on theoretical grounds” to boot. Statements like these could undermine the very idea of economists as experts.
No class of experts has had more global political power in the past few decades than U.S. economists. Seventy countries — more than a third of the world — were politically and economically restructured by the International Monetary Fund’s “structural adjustment programs” in the 1980s. The so-called Chicago Boys restructured Chilean society with the blessing of their intellectual father, Milton Friedman; Jeffrey Sachs and his boys administered “shock therapy” to post-Soviet Russia. The emerging economist-influenced common sense that guided their hands was cooked up in Washington, D.C., and accordingly labeled the “Washington Consensus.” When experts abuse their expertise, distrust ensues even after time passes and conditions have changed — what the philosopher Naomi Scheman calls the Tuskegee effect, after the infamous medical experiments.
It’s important to underscore that the political distrust here can be well earned. If these kinds of suspicions tempt you to wash your hands of swaths of supposed experts and expertise entirely, you’re not alone. You’re joined by iconoclasts and orthodoxy-condemners of all varieties, a loose category that on its own would seem to include the stiff opposition to non-Islamic rule in northern Nigeria (the Hausa phrase “Boko Haram” could be translated as “Western education is forbidden”), the mounting insurgency against critical race theory in the United States, and climate denialists of the old-school corporate and new-school far right varieties.
The fact that the motley crew of objectors to the current schemes of intellectual authority includes anti-vaxxers just as surely as it includes anti-imperialists shouldn’t confuse us into thinking that there’s some secret underlying agreement or political coherence to this chaotic list of characters. It’s precisely the lack of any such coherence that should prompt us to look beyond the well-earned distrust of status quo authorities for a sensible 21st-century political outlook.
Rather than abandoning expertise, then, we should ask: What expertise could we build that would be worth the trouble? The philosophers Kyle Whyte and Robert Crease remind us what expertise is really about: trust, and the ways we can live when we’ve allocated it well. They recount a case worth revisiting: the development of the Nunavik Research Center. The Inuit of Nunavik funded the hiring of laboratory scientists and established a health board to identify scientific topics of public concern and submit them to the scientists for study. The work is monitored by local elders, and the results are translated and shared across the local population. Distrust was neither eliminated nor ignored, but managed through careful institutional design that combined “Western” science with local indigenous sovereignty.
The philosopher Gabriele Contessa reminds us that it is possible to manage knowledge production well at scale if we take a thoroughly social approach to the problem. In the 1970s, scientists sounded the alarm about chlorofluorocarbons, or CFCs, a class of chemical compounds then used as refrigerants and in packing materials. The chemicals were found to pollute the atmosphere and deplete the ozone layer that protects the earth from harmful radiation. In response, 27 countries in 1987 signed the Montreal protocol, a set of comprehensive regulations to restrict production of CFCs.
But this was just the beginning. In 1990, Elizabeth Cook, a researcher affiliated with Friends of the Earth, explained how environmental-justice groups responded to the Montreal Protocol. Organizations from West Germany and Belgium to Hong Kong and Malaysia challenged governmental and industry inaction on the scientists’ reports, launching boycotts, distributing pamphlets, and taking out attack ads against recalcitrant companies. In the U.S., they successfully pushed for manufacturers to immediately switch to known CFC alternatives, and to commit to searching for completely “ozone-safe” alternatives. By 1995, CFCs had gone entirely out of production. The 1980s’ balance of power between state governments, companies, activists, and the broader public made global progress on CFCs possible, and we will need to change that balance in 2021 if we aim to make similar strides.
So while we may not be able to eliminate the sources of distrust in expertise, their consequences needn’t be as hazardous as they have been during the pandemic. On the model of the Nunavik center and the transnational campaigns against CFCs, we can reform our knowledge-producing institutions — not just in how they communicate scientific knowledge, but in how they form it in the first place.
Olúfẹ́mi O. Táíwò is an assistant professor of philosophy at Georgetown.
Profsplaining and Other Disorders
What to do about the QAnon shaman and his cronies.
“Skepticism about experts” is running high these days, and nowhere more so than among the anxious, the angry, and the alienated: QAnon cultists, lockdown protesters flying the Gadsden flag of “personal liberty,” and the roughly one-fifth of Americans who, according to a recent study, self-identify as anti-vaxxers at least some of the time. An undisguised contempt for expertise and a flagrant disregard for facts often go hand in glove: purveyors of “alternative facts” like Donald J. Trump, Alex Jones of InfoWars, Steve Bannon, and Capitol Hill trolls like Lauren Boebert and Marjorie Taylor Greene pour scorn on experts.
Anti-vaxxers like Robert F. Kennedy Jr. and Mark Crispin Miller, however, are all about facts and analysis. Kennedy’s status as American royalty has made him a social-media superspreader of vaccine misinformation; Miller, a New York University professor of media studies, believes Covid vaccines are part of a monstrous plot by billionaire eugenicists like Bill Gates and Ted Turner to exterminate the “unfit” and reduce the rest of us to “neofeudal” servitude. Both wade into the manufactured controversy over vaccines and public-health measures with footnotes rampant, well-armed by an obsessive review of the medical literature.
Kennedy and Miller remind us that a rejection of expert consensus isn’t always synonymous with the Trumpian article of faith that “truth isn’t truth,” as Rudy Giuliani memorably put it. Both men claim to follow the facts. It’s just that their facts — cherry-picked or taken out of context from legitimate sources or quoted from fraudulent ones — defy what Michel Foucault called the “regime of truth,” the dominant discourses and adjudicating bodies that determine what, in a given society, is true or false. They respect expertise, just not official expertise.
In that light, Miller and Kennedy have a lot in common with a conspiracist and college dropout like Jacob Chansley. Chansley is the so-called QAnon shaman, last seen storming the Capitol in a horned fur headdress, brandishing a spear, on January 6. Asked by Britain’s Channel 4 News, “At what point in your life did you stop listening to the mainstream narrative?” he replied, “When I realized that doing my own research brought me more information than listening to the news ever could. Once I stopped allowing the news to make up my mind or my narrative for me, I grew exponentially.” Like many in the QAnon cult, Chansley thinks of himself as a “researcher,” part of an interpretive community engaged in a kind of paranoid hermeneutics that scans the white noise of media overload for coded messages about a Satanic cabal of pedophiles that controls the so-called deep state.
From an academic perspective, combating the eroding faith in expertise and the growing belief, in some quarters, that we’re entitled not just to our own opinions, but to our own facts, is epistemologically tricky. As Elise Wang, a scholar of conspiracy theories, points out, reflexive appeals to the importance of “media literacy” as an inoculation against declining confidence in expertise misunderstand the cultural logic behind the rising tide of skepticism. (The Rand report urges an emphasis, in college curricula, on media literacy.) Wang puts her finger on a paradox that makes it difficult for academics to challenge the rogue epistemologies that characterize our times. “Unfortunately, the core tenets of media literacy — don’t believe everything you read, do the research yourself, think for yourself — are also the watchwords of conspiracy theorists,” she notes, in her TEDxDuke talk on the subject.
Circling the wagons around the Enlightenment virtues of skeptical inquiry, evidence-based argument, and disinterested science isn’t a winning strategy in a post-truth moment. Not only do we not have the specialized expertise, in many instances, that would enable us to sort truth from falsehood, but our ideological polarization makes us more susceptible than ever to confirmation bias. Skepticism toward experts and media outlets perceived as mouthpieces for our ideological enemies is often a marker of identity. So, too, is the embrace — sometimes sincere, sometimes ironic (to troll our opponents) — of narratives that fly in the face of the facts but confirm our worldviews.
Worse yet, bad actors like Trump and Bannon weaponize our ideological tribalism and epistemological uncertainty by “flooding the zone with shit,” as Bannon pungently put it. They inundate the news media and thus the public with “an avalanche of competing stories” so disorienting, both in their competing truth claims and in their sheer volume, that they produce what Sean Illing, writing in Vox, called “a certain nihilism in which people are so skeptical about the possibility of finding the truth that they give up the search.” Or, torn between the warring propositions of the X-Files paradox — “Trust No One,” but rest assured “The Truth Is Out There” — they renounce what remains of their faith in the expert elite and embark, like Kennedy, Miller, Chansley, and countless other Truthers, on their own investigations.
What too many academics fail to realize is that the truth they seek isn’t so much empirical truth — the truth of hard facts — as it is the truth of cultural narratives, which infuse our lives with meaning. Facts matter — desperately, in the middle of a plague that has killed over 4.55 million people worldwide. But they become meaningful only when they’re inlaid in the mosaic of narrative. “We seek patterns,” says Wang, “and the more out of control we feel in our personal lives and our work and our world, the more we seek patterns. Stories are how we unite; they’re what get us up in the morning. People don’t believe conspiracy theories because they’re irrational or uneducated or they just don’t have the right information. Far above truth, people seek meaning.”
Sternly instructing the masses that they’ve got their facts wrong — “profsplaining,” let’s call it — is only going to play into popular perceptions of academics as ivory-tower elitists defending their cultural authority against the unlettered rabble. In the public arena, academics, especially those in the hard sciences, need to learn to convey the facts and their analyses not just accurately but meaningfully. In short, they need to learn to tell better stories. Democracy, not to mention our species’ survival, hangs in the balance.
Mark Dery is a cultural critic and the author of many books, most recently Born to Be Posthumous: The Eccentric Life and Mysterious Genius of Edward Gorey (Little, Brown).
Expertise After Covid
Knowledge requires infrastructure. It is always social. And it needs defending.
Expertise requires infrastructure. It is always social. And both facts become particularly visible in moments of crisis. These are a few takeaways from two books I’ve been reading during the pandemic, one new, one old.
The first is Let the Record Show, Sarah Schulman’s extraordinary new history of the AIDS Coalition to Unleash Power, better known as ACT UP. Schulman documents the remarkable citizen science taken on by the “Treatment and Data Committee” within ACT UP. Drawing on paradigms that came out of the women’s health movements, as well as disability-rights activism and Black Power, the members of this group insisted that the people who were most affected by HIV/AIDS should get to make knowledge about it — and demonstrated that fights over definitions are not simply semantics: They make a real difference to what a disease is and how an epidemic moves. For instance, getting the CDC to recognize that cervical cancer and yeast infections could be symptoms of AIDS was essential to getting people experimental drugs, or Medicaid and Social Security benefits, that they desperately needed. Otherwise, as one slogan Schulman cites went, “Women don’t get AIDS, we just die from it.”
The second book I have been reading is older: The Authoritarian Personality, from 1950. In particular, I have been rereading the chapters by Theodor Adorno, in which he attempts to articulate a vision of what critical social science should look like. Adorno has often been portrayed, not least by himself, as a grumpy European adrift among American empiricists, impatient to be done with thinking about survey design and replicability so that he can get back to his Schoenberg and Beckett. But in fact, to revisit Adorno’s empirical writings from the 1940s and 1950s is to find texts deeply engaged with questions about gathering and interpreting data.
Another thing that people forget, or that I had forgotten, about the authoritarian personality is that Adorno does not think it exists, quite. That is, he does not think that there are people who have authoritarian personalities. The point of the famous F Scale for diagnosing latent fascism is not to separate potential fascists from everyone else, and then get rid of them, like a spam filter. Rather, it is to identify social and psychological processes that tend to make people susceptible to fascist propaganda. Adorno regards the high scorer “syndrome” as a useful abstraction. Developing the idea of a “syndrome,” he also embraces a normative commitment: Social science must dedicate itself to curing “social disease.” The kind of knowledge it produces is meaningless without such a commitment — even if the nature of the disease must remain open to contestation. For Adorno, the processes predisposing some Americans to fascism were not foreign; they came from within American capitalism. They were, to use today’s cliché, not bugs but features.
Both Schulman and Adorno have special resonance during a pandemic that has demonstrated repeatedly just how socially situated science is, and how inseparably this novel virus is intertwined with social pathologies that pre-exist it and shape its evolution — with inequality, incarceration, lack of access to health care and child care, all raced and gendered and classed as ever.
Both books are, finally, also stories about building the institutions and relations that are capable of creating and sustaining new kinds of expertise, when they are urgently needed.
In Schulman, this aspect of building infrastructure is explicit. Let the Record Show is full of surprisingly engrossing accounts of love, infighting, jealousy, grudging respect, and heroism among different ACT UP members. Who had nursing training, could change a bedpan, knew how to put in an IV? Whose job gave them access to a Xerox machine? Being friends with CDC officials from Harvard or Yale undergrad, as the most famous members were, was one kind of resource among others.
In The Authoritarian Personality, the institutional story remains in the background. Adorno did this work after losing his job on the Princeton Radio Project, because the Rockefeller Foundation was not interested in funding Marxists. So the first study of this kind that the Institute for Social Research produced was funded by unions and modeled on workers’ inquiry. The psychologizing frame that the study uses was not only a compromise among the various authors: It was also a strategic attempt by refugees to be able to keep working in a new country.
Taken together, these two books suggest something important about expertise after Covid. The pandemic has demonstrated the crucial importance of many of the kinds of expertise housed in universities — from the expertise of biologists who understand how mRNA vaccines direct our cells to produce the proteins that trigger immunity, to that of sociologists who anticipate what factors might make a person able to access vaccines, to that of philosophers, and literary and media scholars, who can parse the conspiracy theories inspiring numerous people to refuse them. With variants multiplying in undervaccinated populations, the second and third kinds of expertise are just as essential as the first. Yet the pandemic has accelerated the decline of the conditions that made the work of many kinds of experts possible.
Humanistic and social-scientific knowledge is essential knowledge, in other words, and no knowledge can be entirely neutral — if only because knowledge-making depends on an infrastructure that needs to be maintained, and sometimes fought for. These books also demonstrate that sustaining such infrastructure will probably require organizing outside of what are currently considered expert spaces — with adjuncts, cafeteria workers, bearers of student debt.
Writing this has, surprisingly, put me in a good, galvanized mood. If expertise after Covid will have to fight for itself, let’s get to it.
Moira Weigel is an assistant professor of communication studies at Northeastern University.