It’s wreaking havoc in universities and jeopardizing the progress of research
By Steven Pinker | February 13, 2018
The waging of a “war on science” by right-wing know-nothings has become part of the conventional wisdom of the intelligentsia. Even some Republican stalwarts have come to disparage the GOP as “the party of stupid.” Republican legislators have engaged in spectacles of inanity, such as when Sen. James Inhofe, chair of the Committee on Environment and Public Works, brought a snowball to the Senate floor in 2015 to dispute the fact of global warming, and when Rep. Lamar Smith, chair of the House Committee on Science, Space, and Technology, pulled quotes out of context from peer-reviewed grants of the National Science Foundation so he could mock them (for example, “How does the federal government justify spending over $220,000 to study animal photos in National Geographic?”).
Yet a contempt for science is neither new, lowbrow, nor confined to the political right. In his famous 1959 lecture “The Two Cultures and the Scientific Revolution,” C.P. Snow commented on the disdain for science among educated Britons and called for a greater integration of science into intellectual life. In response to this overture, the literary critic F.R. Leavis wrote a rebuttal in 1962 that was so vituperative The Spectator had to ask Snow to promise not to sue for libel if they published the work.
The highbrow war on science continues to this day, with flak not just from fossil-fuel-funded politicians and religious fundamentalists but also from our most adored intellectuals and in our most august institutions of higher learning. Magazines that are ostensibly dedicated to ideas confine themselves to those arising in politics and the arts, with scant attention to new ideas emerging from science, with the exception of politicized issues like climate change (and regular attacks on a sin called “scientism”). Just as pernicious is the treatment of science in the liberal-arts curricula of many universities. Students can graduate with only a trifling exposure to science, and what they do learn is often designed to poison them against it.
The most frequently assigned book on science in universities (aside from a popular biology textbook) is Thomas Kuhn’s The Structure of Scientific Revolutions. That 1962 classic is commonly interpreted as showing that science does not converge on the truth but merely busies itself with solving puzzles before lurching to some new paradigm that renders its previous theories obsolete, indeed unintelligible. Though Kuhn himself disavowed that nihilist interpretation, it has become the conventional wisdom among many intellectuals. A critic from a major magazine once explained to me that the art world no longer considers whether works of art are “beautiful” for the same reason that scientists no longer consider whether theories are “true.” He seemed genuinely surprised when I corrected him.
The historian of science David Wootton has remarked on the mores of his own field: “In the years since Snow’s lecture the two-cultures problem has deepened; history of science, far from serving as a bridge between the arts and sciences, nowadays offers the scientists a picture of themselves that most of them cannot recognize.” That is because many historians of science consider it naïve to treat science as the pursuit of true explanations of the world. The result is like a report of a basketball game by a dance critic who is not allowed to say that the players are trying to throw the ball through the hoop.
Many scholars in “science studies” devote their careers to recondite analyses of how the whole institution is just a pretext for oppression. An example is a 2016 article on the world’s most pressing challenge, titled “Glaciers, Gender, and Science: A Feminist Glaciology Framework for Global Environmental Change Research,” which sought to generate a “robust analysis of gender, power, and epistemologies in dynamic social-ecological systems, thereby leading to more just and equitable science and human-ice interactions.”
More insidious than the ferreting out of ever more cryptic forms of racism and sexism is a demonization campaign that impugns science (together with the rest of the Enlightenment) for crimes that are as old as civilization, including racism, slavery, conquest, and genocide.
This was a major theme of the Critical Theory of the Frankfurt School, the quasi-Marxist movement originated by Theodor Adorno and Max Horkheimer, who proclaimed that “the fully enlightened earth radiates disaster triumphant.” It also figures in the works of postmodernist theorists such as Michel Foucault, who argued that the Holocaust was the inevitable culmination of a “bio-politics” that began with the Enlightenment, when science and rational governance exerted increasing power over people’s lives. In a similar vein, the sociologist Zygmunt Bauman blamed the Holocaust on the Enlightenment ideal to “remake the society, force it to conform to an overall, scientifically conceived plan.”
In this twisted narrative, the Nazis themselves are somehow let off the hook (“It’s modernity’s fault!”). Though Critical Theory and postmodernism avoid “scientistic” methods such as quantification and systematic chronology, the facts suggest that they have the history backwards. Genocide and autocracy were ubiquitous in premodern times, and they decreased, not increased, as science and liberal Enlightenment values became increasingly influential after World War II.
To be sure, science has often been pressed into the support of deplorable political movements. It is essential, of course, to understand that history, and legitimate to pass judgment on scientists, just like any historical figures, for their roles in it. Yet the qualities that we prize in humanities scholars — context, nuance, historical depth — often leave them when the opportunity arises to prosecute a campaign against their academic rivals. Science is commonly blamed for intellectual movements that had a pseudoscientific patina, though the historical roots of those movements ran deep and wide.
“Scientific racism,” the theory that races fall into a hierarchy of mental sophistication with Northern Europeans at the top, is a prime example. It was popular in the decades flanking the turn of the 20th century, apparently supported by craniometry and mental testing, before being discredited in the middle of the 20th century by better science and by the horrors of Nazism. Yet to pin ideological racism on science, in particular on the theory of evolution, is bad intellectual history. Racist beliefs have been omnipresent across history and regions of the world. Slavery has been practiced by every major civilization and was commonly rationalized by the belief that enslaved peoples were inherently suited to servitude, often by God’s design. Statements from ancient Greek and medieval Arab writers about the biological inferiority of Africans would curdle your blood, and Cicero’s opinion of Britons was not much more charitable.
More to the point, the intellectualized racism that infected the West in the 19th century was the brainchild not of science but of the humanities: history, philology, classics, and mythology. In 1853, Arthur de Gobineau, a fiction writer and amateur historian, published his cockamamie theory that a race of virile white men, the Aryans, spilled out of an ancient homeland and spread a heroic warrior civilization across Eurasia, diverging into the Persians, Hittites, Homeric Greeks, and Vedic Hindus, and later into the Vikings, Goths, and other Germanic tribes. (The speck of reality in this story is that these tribes spoke languages that fell into a single family, Indo-European.) Everything went downhill when the Aryans interbred with inferior conquered peoples, diluting their greatness and causing them to degenerate into the effete, decadent, soulless, bourgeois, commercial cultures that the Romantics were always whingeing about. It was a small step to fuse this fairy tale with German Romantic nationalism and anti-Semitism: The Teutonic Volk were the heirs of the Aryans, the Jews a mongrel race of Asiatics. Gobineau’s ideas were eaten up by Richard Wagner (whose operas were held to be re-creations of the original Aryan myths) and by Wagner’s son-in-law Houston Stewart Chamberlain (a philosopher who wrote that Jews polluted Teutonic civilization with capitalism, liberal humanism, and sterile science). From them the ideas reached Hitler, who called Chamberlain his “spiritual father.”
Science played little role in this chain of influence. Pointedly, Gobineau, Chamberlain, and Hitler rejected Darwin’s theory of evolution, particularly the idea that all humans had gradually evolved from apes, which was incompatible with their Romantic theory of race and with the older folk and religious notions from which it had emerged. According to these widespread beliefs, races were separate species; they were fitted to civilizations with different levels of sophistication; and they would degenerate if they mixed. Darwin argued that humans are closely related members of a single species with a common ancestry, that all peoples have “savage” origins, that the mental capacities of all races are virtually the same, and that the races blend into one another with no harm from interbreeding. The University of Chicago historian Robert Richards, who traced Hitler’s influences, ended his book titled Was Hitler a Darwinian? (a common claim among creationists) with “The only reasonable answer to the question ... is a very loud and unequivocal No.”
I mention the limited role of science in so-called scientific racism not to absolve the scientists (many of whom were indeed active or complicit) but because the movement deserves a deeper and more contextualized understanding than its current role as anti-science propaganda. Misunderstandings of Darwin gave scientific racism a boost, but it sprang from the religious, artistic, intellectual, and political beliefs of its era. If we think scientific racism is not just unfashionable but mistaken, it is because of the better historical and scientific understanding we enjoy today.
Recriminations over the nature of science are by no means a relic of the “science wars” of the 1980s and 1990s — when scientists and humanities scholars clashed over the nature of scientific truth — but continue to shape the role of science in universities. When Harvard reformed its general-education requirement in 2006-7, the preliminary report of the task force introduced the teaching of science without any mention of its place in human knowledge: “Science and technology directly affect our students in many ways, both positive and negative: they have led to life-saving medicines, the internet, more efficient energy storage, and digital entertainment; they also have shepherded nuclear weapons, biological warfare agents, electronic eavesdropping, and damage to the environment.”
Well, yes, and I suppose one could say that architecture has produced both museums and gas chambers, and that classical music both stimulates economic activity and inspired the Nazis. But this strange equivocation between the utilitarian and the nefarious was not applied to other disciplines, and the statement gave no indication that we might have good reasons to prefer understanding and know-how to ignorance and superstition.
Does the demonization of science in the liberal arts matter? It does, for a number of reasons. Though many talented students hurtle along pre-med or engineering tracks from the day they set foot on campus, many others are unsure of what they want to do with their lives and take their cues from professors and advisers. What happens to those who are taught that science is just another narrative like religion and myth, that it lurches from revolution to revolution without making progress, and that it is a rationalization of racism, sexism, and genocide? I’ve seen the answer: Some of them figure, “If that’s what science is, I might as well make money!” Four years later, their brainpower is applied to thinking up algorithms that allow hedge funds to act on financial information a few milliseconds faster, rather than to finding new treatments for Alzheimer’s disease or technologies for carbon capture and storage.
The stigmatization of science is also jeopardizing the progress of science itself. Today anyone who wants to do research on human beings, even an interview on political opinions or a questionnaire about irregular verbs, must prove to a committee that he or she is not Josef Mengele. Though research subjects obviously must be protected from exploitation and harm, the institutional-review bureaucracy has swollen far beyond this mission. Its critics have pointed out that it has become a menace to free speech, a weapon that fanatics can use to shut up people whose opinions they don’t like, and a red-tape dispenser that bogs down research while failing to protect, and sometimes harming, patients and research subjects. Jonathan Moss, a medical researcher who had developed a new class of drugs and was drafted into chairing the research-review board at the University of Chicago, said in a convocation address, “I ask you to consider three medical miracles we take for granted: X-rays, cardiac catheterization, and general anesthesia. I contend all three would be stillborn if we tried to deliver them in 2005.” The same observation has been made about insulin, burn treatments, and other lifesavers.
The hobbling of research is not just a symptom of bureaucratic mission creep. It is actually rationalized by many bioethicists. These theoreticians think up reasons that informed and consenting adults should be forbidden to take part in treatments that help them and others while harming no one. They use nebulous rubrics like “dignity,” “sacredness,” and “social justice.” They try to sow panic about advances in biomedical research with far-fetched analogies to nuclear weapons and Nazi atrocities, science-fiction dystopias like Brave New World and Gattaca, and freak-show scenarios like armies of cloned Hitlers, people selling their eyeballs on eBay, and warehouses of zombies to supply people with spare organs. The University of Oxford philosopher Julian Savulescu has exposed the low standards of reasoning behind these arguments and has pointed out why “bioethical” obstructionism can be unethical: “To delay by 1 year the development of a treatment that cures a lethal disease that kills 100,000 people per year is to be responsible for the deaths of those 100,000 people, even if you never see them.”
Ultimately the greatest payoff of instilling an appreciation of science is for everyone to think more scientifically. Cognitive psychologists have shown that humans are vulnerable to crippling biases and fallacies. Movements that aim to work around those biases and to spread scientific sophistication — data journalism, Bayesian forecasting, evidence-based medicine and policy, real-time violence monitoring, effective altruism — have a vast potential to enhance human welfare. But an appreciation of their value has been slow to penetrate the culture.
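To see how small a step Bayesian reasoning is, and how badly unaided intuition handles it, consider the classic base-rate problem. The sketch below, in Python with hypothetical numbers of my own choosing, works out why a positive result from a fairly accurate test for a rare condition is still probably a false alarm:

    # Bayes' rule with hypothetical numbers: a 90%-sensitive test for a
    # condition with a 1% base rate, and a 10% false-positive rate.
    prior = 0.01           # P(condition)
    sensitivity = 0.90     # P(positive | condition)
    false_positive = 0.10  # P(positive | no condition)

    # P(positive) = true positives + false positives
    p_positive = sensitivity * prior + false_positive * (1 - prior)

    # P(condition | positive) = P(positive | condition) * P(condition) / P(positive)
    posterior = sensitivity * prior / p_positive
    print(f"P(condition | positive) = {posterior:.2f}")  # about 0.08

Most people, experts included, guess a number closer to 0.9 than to 0.08, which is exactly the kind of bias the movements above are designed to correct.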
I asked my doctor whether the nutritional supplement he had recommended for my knee pain would really be effective. He replied, “Some of my patients say it works for them.” A business-school colleague shared this assessment of the corporate world: “I have observed many smart people who have little idea of how to logically think through a problem, who infer causation from a correlation, and who use anecdotes as evidence far beyond the predictability warranted.” A colleague who uses quantitative tools to study war, peace, and human security describes the United Nations as an “evidence-free zone”:
The higher reaches of the UN are not unlike anti-science humanities programs. Most people at the top are lawyers and liberal-arts graduates. The only parts of the Secretariat that have anything resembling a research culture have little prestige or influence. Few of the top officials in the UN understood qualifying statements as basic as “on average” and “other things being equal.” So if we were talking about risk probabilities for conflict onsets, you could be sure that Sir Archibald Prendergast III or some other luminary would offer a dismissive, “It’s not like that in Burkina Faso.”
Resisters to scientific thinking often object that some things just can’t be quantified. Yet unless they are willing to speak only of issues that are black or white and to forswear using the words more, less, better, and worse (and, for that matter, the suffix -er), they are making claims that are inherently quantitative. If they veto the possibility of putting numbers to those claims, they are saying, “Trust my intuition.” But if there’s one thing we know about cognition, it’s that people (including experts) are arrogantly overconfident about their intuition.
In 1954, Paul Meehl stunned his fellow psychologists by showing that simple actuarial formulas outperform expert judgment in predicting psychiatric classifications, suicide attempts, school and job performance, lies, crime, medical diagnoses, and pretty much any other outcome in which accuracy can be judged at all. His conclusion about the superiority of statistical to intuitive judgment is now recognized as one of the most robust findings in the history of psychology.
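The logic of Meehl’s result is easy to demonstrate. The simulation below is a toy, with synthetic data and a caricatured “expert” (it is not Meehl’s data or method), but it captures why a fixed formula that weighs the same cues the same way every time beats a judge who weighs them inconsistently:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Synthetic world: an outcome driven by two measurable predictors plus noise.
    x1, x2 = rng.normal(size=(2, n))
    outcome = (0.6 * x1 + 0.4 * x2 + rng.normal(size=n)) > 0

    # Actuarial formula: the same weighted sum of the cues, every time.
    formula = (0.6 * x1 + 0.4 * x2) > 0

    # Caricatured expert: sees the same cues but weighs them differently
    # from case to case, and adds some judgment noise.
    w = rng.uniform(0, 1, size=n)
    expert = (w * x1 + (1 - w) * x2 + rng.normal(scale=0.5, size=n)) > 0

    print("formula accuracy:", (formula == outcome).mean())
    print("expert accuracy: ", (expert == outcome).mean())

Run it and the formula wins by several points of accuracy; the consistency, not any hidden sophistication, is doing the work.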
Data, of course, cannot solve problems by themselves. All the money in the world could not pay for randomized controlled trials to settle every question that occurs to us. Human beings will always be in the loop to decide which data to gather and how to analyze and interpret them. The first attempts to quantify a concept are always crude, and even the best ones allow probabilistic rather than perfect understanding. Nonetheless, social scientists have laid out criteria for evaluating and improving measurements, and the critical comparison is not whether a measure is perfect but whether it is better than the judgment of an expert, critic, interviewer, clinician, judge, or maven. That turns out to be a low bar.
Because the cultures of politics and journalism are largely innocent of the scientific mind-set, questions with major consequences for life and death are answered by methods that we know lead to error, such as anecdotes, headlines, rhetoric, and what engineers call HiPPO (highest-paid person’s opinion). Many dangerous misconceptions arise from this statistical obtuseness. People think that crime and war are spinning out of control, though homicides and battle deaths are going down, not up. They think that Islamist terrorism is a major risk to life and limb, though the danger is less than that from wasps and bees. They think that ISIS threatens the existence or survival of the United States, though terrorist movements rarely achieve any of their strategic aims.
The dataphobic mind-set (“It’s not like that in Burkina Faso”) can lead to real tragedy. Many political commentators can recall a failure of peacekeeping forces (such as in Bosnia in 1995) and conclude that they are a waste of money and manpower. But when a peacekeeping force is successful, nothing photogenic happens, and it fails to make the news. In her book Does Peacekeeping Work? (Princeton University Press, 2008), the Columbia University political scientist Virginia Page Fortna addressed the question in her title with the methods of science rather than headlines, and found that the answer is “a clear and resounding yes.” Knowing the results of these analyses could make the difference between an international organization’s helping to bring peace to a country and letting it fester in civil war.
Take another life-or-death political question. Do campaigns of nonviolent resistance work? Many people believe that Gandhi and King just got lucky: Their movements tugged at the heartstrings of enlightened democracies at opportune moments, but everywhere else, oppressed people need violence to get out from under a dictator’s boot. The political scientists Erica Chenoweth and Maria J. Stephan assembled a data set of political-resistance movements across the world between 1900 and 2006 and discovered that three-quarters of the nonviolent resistance movements succeeded, compared with only a third of the violent ones. Gandhi and King were right, but without data, you would never know it.
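The arithmetic behind that finding is nothing exotic; it is a comparison of two proportions. A minimal sketch follows, in which the roughly three-quarters versus one-third split comes from Chenoweth and Stephan’s reported result, while the raw counts are illustrative placeholders, not their actual data set:

    # Success rates of resistance campaigns, 1900-2006. The proportions
    # follow Chenoweth and Stephan's finding; the counts are hypothetical
    # placeholders, not their data.
    campaigns = {
        "nonviolent": {"succeeded": 75, "failed": 25},
        "violent": {"succeeded": 33, "failed": 67},
    }

    for kind, tally in campaigns.items():
        total = tally["succeeded"] + tally["failed"]
        print(f"{kind}: {tally['succeeded'] / total:.0%} success rate")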
Though the urge to join a violent insurgent or terrorist group may owe more to male bonding than to just-war theory, most of the combatants probably believe that if they want to bring about a better world, they have no choice but to kill people. Would anything change if everyone knew that violent strategies were not just immoral but ineffectual? It’s not that I think we should airdrop crates of Chenoweth and Stephan’s book into conflict zones. But leaders of radical groups are often highly educated, and even the cannon fodder often have had some college and absorb the conventional wisdom about the need for revolutionary violence. What would happen over the long run if a standard college curriculum devoted less attention to the writings of Karl Marx and Frantz Fanon and more to quantitative analyses of political violence?
One of the greatest potential contributions of modern science may be a deeper integration with the humanities. By all accounts, the humanities are in trouble. University programs are downsizing; the next generation of scholars is un- or underemployed; morale is sinking; students are staying away.
No thinking person should be indifferent to our society’s disinvestment in the humanities. A society without historical scholarship is like a person without memory: deluded, confused, easily exploited. Philosophy grows out of the recognition that clarity and logic don’t come easily to us, and that we’re better off when our thinking is refined and deepened. The arts are one of the things that make life worth living, enriching human experience with beauty and insight. Criticism is itself an art that magnifies the appreciation and enjoyment of great works. Knowledge in these domains is hard won and needs constant enriching and updating as the times change.
Diagnoses of the malaise of the humanities rightly point to anti-intellectual trends in our culture and to the commercialization of universities. But an honest appraisal would have to acknowledge that some of the damage is self-inflicted. The humanities have yet to recover from the disaster of postmodernism, with its defiant obscurantism, self-refuting relativism, and suffocating political correctness. Many of its luminaries — Nietzsche, Heidegger, Foucault, Lacan, Derrida, the Critical Theorists — are morose cultural pessimists who declare that modernity is odious, all statements are paradoxical, works of art are tools of oppression, liberal democracy is the same as fascism, and Western civilization is circling the drain.
With such a cheery view of the world, it’s not surprising that the humanities often have trouble defining a progressive agenda for their own enterprise. Several college presidents and provosts have lamented to me that when a scientist comes into their office, it’s to announce some exciting new research opportunity and demand the resources to pursue it. When a humanities scholar drops by, it’s to plead for respect for the way things have always been done.
To be sure, there is no replacement for the close reading, thick description, and deep immersion that erudite scholars can apply to individual works. But must these be the only paths to understanding? A consilience with science offers the humanities many possibilities for new insight. Art, culture, and society are products of human brains. They originate in our faculties of perception, thought, and emotion, and they accumulate and spread through the epidemiological dynamics by which one person affects others. Shouldn’t we be curious to understand these connections by tearing down academic silos and mining the sciences for insights about human nature that could illuminate culture and society? Both sides would win. The humanities would enjoy more of the explanatory depth of the sciences, as well as a forward-looking agenda that could attract ambitious young talent (not to mention appeal to deans and donors). The sciences could challenge their theories with the natural experiments and ecologically valid phenomena that have been so richly characterized by humanities scholars.
In some fields, this consilience is a fait accompli. Archaeology has grown from a branch of art history to a high-tech science. The philosophy of mind shades into mathematical logic, computer science, cognitive science, and neuroscience. Linguistics combines philological scholarship on the history of words and grammatical constructions with laboratory studies of speech, mathematical models of grammar, and the computerized analysis of large corpora of writing and conversation.
Comparable opportunities beckon in political theory, the visual arts, musicology, and literature, deepening John Dryden’s insight that a work of fiction is “a just and lively image of human nature, representing its passions and humours, and the changes of fortune to which it is subject, for the delight and instruction of mankind.” And though many concerns in the humanities are best appreciated with traditional narrative criticism, some raise empirical questions that can be informed by data. The advent of data science applied to books, periodicals, correspondence, and musical scores has inaugurated the digital humanities, whose potential is limited only by the imagination.
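As a taste of what such analyses look like, here is a minimal sketch with a made-up three-sentence “corpus” standing in for the millions of digitized books a real digital-humanities project would draw on:

    from collections import Counter

    # A made-up miniature corpus: (decade, text) pairs standing in for
    # millions of digitized books.
    corpus = [
        (1800, "the sublime storm stirred the poet to sublime terror"),
        (1900, "the machine hummed and the machine ruled the factory"),
        (2000, "the network carried the data and the data fed the network"),
    ]

    def relative_frequency(word, text):
        tokens = text.lower().split()
        return Counter(tokens)[word] / len(tokens)

    for decade, text in corpus:
        print(decade, round(relative_frequency("machine", text), 3))

Scaled up from toy sentences to real corpora, counts like these are what let scholars trace the rise and fall of words, themes, and styles across centuries.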
The promise of a unification of knowledge can be fulfilled only if knowledge flows in all directions. Some of the scholars who have recoiled from scientists’ forays into explaining art are correct that these explanations have been, by their standards, shallow and simplistic. All the more reason for them to reach out and combine their erudition about individual works and genres with scientific insight into human emotions and aesthetic responses. Better still, universities could train a new generation of scholars who are fluent in each of the two cultures.
Although in my experience many artists and humanities scholars are receptive to insights from science, the policemen of highbrow culture proclaim that they may not indulge such curiosity. In a dismissive review in The New Yorker of a book by the literary scholar Jonathan Gottschall on the evolution of the narrative instinct, Adam Gopnik writes, “The interesting questions about stories ... are not about what makes a taste for them ‘universal,’ but what makes the good ones so different from the dull ones. ... This is a case, as with women’s fashion, where the subtle, ‘surface’ differences are actually the whole of the subject.” But in appreciating literature, must connoisseurship really be the whole of the subject? An inquisitive spirit might also be curious about the recurring ways in which minds separated by culture and era deal with the timeless conundrums of human existence.
In 1782, Thomas Paine extolled the cosmopolitan virtues of science:
Science, the partisan of no country, but the beneficent patroness of all, has liberally opened a temple where all may meet. Her influence on the mind, like the sun on the chilled earth, has long been preparing it for higher cultivation and further improvement. The philosopher of one country sees not an enemy in the philosopher of another: he takes his seat in the temple of science, and asks not who sits beside him.
What he wrote about the physical landscape applies as well to the landscape of knowledge. In this and other ways, the spirit of science is the spirit of the Enlightenment.
Steven Pinker is a professor of psychology at Harvard University, and author, most recently, of Enlightenment Now: The Case for Reason, Science, Humanism, and Progress (Viking), from which this essay is adapted.