To judge by how we treat the concept of truth these days, one might think we just don’t care anymore. Politicians pronounce that global warming is a hoax. An alarming number of middle-class parents have stopped giving their children routine vaccinations, on the basis of discredited research. Meanwhile many commentators in the media, and even some in our universities, have all but abandoned their responsibility to set the record straight. (It doesn’t help when scientists occasionally have to retract their own work.)
Humans have always held some wrongheaded beliefs that were later subject to correction by reason and evidence. But we have reached a watershed moment, when the enterprise of basing our beliefs on fact rather than intuition is truly in peril.
It’s not just the garden-variety ignorance that periodically shows up in public-opinion polls and makes us cringe or laugh. A 2009 survey by the California Academy of Sciences found that only 53 percent of American adults knew how long it takes for Earth to revolve around the sun. Only 59 percent knew that the earliest humans did not live at the same time as the dinosaurs.
As egregious as that sort of thing is, it is not the kind of ignorance that should most concern us. There is simple ignorance, and there is willful ignorance: simple ignorance coupled with the decision to remain ignorant. The latter typically occurs when someone is firmly committed to an ideology that proclaims it has all the answers, even on empirical questions that scientific investigation has long since settled. More than mere scientific illiteracy, this sort of obstinacy reflects a dangerous contempt for the methods that customarily lead us to recognize the truth. And once we are on that road, it is a short hop to disrespecting truth.
It is sad that the modern attack on truth started in the academy, in the humanities, where the stakes of holding that there are multiple ways to read a text, or that one cannot understand a book without taking account of the political beliefs of its author, may initially have seemed low.
That disrespect, however, has metastasized into outrageous claims about the natural sciences.
Anyone who has been paying attention to the fault lines of academic debate for the past 20 years already knows that the “science wars” were fought by natural scientists (and their defenders in the philosophy of science) on the one side and literary critics and cultural-studies folks on the other. The latter argued that even in the natural realm, truth is relative, and there is no such thing as objectivity. The skirmishes blew up in the well-known “Sokal affair” in 1996, in which a prominent physicist created a scientifically absurd postmodernist paper and was able to get it published in a leading cultural-studies journal. The ridicule that followed may have seemed to settle the matter once and for all.
But then a funny thing happened: While many natural scientists declared the battle won and headed back to their labs, some left-wing postmodernist criticisms of truth began to be picked up by right-wing ideologues who were looking for respectable cover for their denial of climate change, evolution, and other scientifically accepted conclusions. Alan Sokal said he had hoped to shake up academic progressives, but suddenly one found hard-right conservatives sounding like Continental intellectuals. And that caused discombobulation on the left.
“Was I wrong to participate in the invention of this field known as science studies?” Bruno Latour, one of the founders of the field that contextualizes science, famously asked. “Is it enough to say that we did not really mean what we said? Why does it burn my tongue to say that global warming is a fact whether you like it or not? Why can’t I simply say that the argument is closed for good?”
“But now the climate-change deniers and the young-Earth creationists are coming after the natural scientists,” the literary critic Michael Bérubé noted, “… and they’re using some of the very arguments developed by an academic left that thought it was speaking only to people of like mind.”
That is the price one pays for playing with ideas as if doing so has no consequences, imagining that they will be used only for the political purposes one intended. Instead, the entire edifice of science is now under attack. And it’s the poor and disenfranchised, to whom the left pays homage, who will probably bear the brunt of disbelief in climate change.
Of course, some folks were hard at work trying to dispute inconvenient scientific facts long before conservatives began to borrow postmodernist rhetoric. In Merchants of Doubt (Bloomsbury Press, 2010), the historians Naomi Oreskes and Erik M. Conway have shown how the strategy behind denying climate change and evolution can be traced all the way back to the big tobacco companies, which recognized early on that even the most well-documented scientific claims (for instance, that smoking causes cancer) could be eroded by skillfully lobbying the government, bullying the news media, and pursuing a public-relations campaign. Sadly, that strategy has largely worked, and today we find it employed by the Discovery Institute, the Seattle organization advocating that “intelligent-design theory” be taught in the public schools as balance for the “holes” in evolutionary theory, and by the Heartland Institute, which bills itself as “the world’s most prominent think tank promoting skepticism about man-made climate change.”
What do such academically suspect centers have to offer by way of peer-reviewed, scientifically reputable evidence? Almost nothing. But that is not the point. The strategy of willful ignorance is not to fight theory with theory and statistic with statistic. It is instead to say, “I refuse to believe this,” and then filibuster in the court of public opinion. It is not crackpot theories that are doing us in. It is the spread of the tactics of those who disrespect truth.
Remember the great dialogue Euthyphro, in which Socrates, soon facing trial for impiety and corrupting the youth, admonishes a callow young fellow for professing to know what “piety” is? Socrates demonstrates again and again that Euthyphro has no idea what he is talking about when he argues that it would be pious for him to prosecute his own father for murder on the basis of some pretty shoddy evidence, and he shows that Euthyphro cannot even define the meaning of the word. Socrates is adept at questioning and at verbal humiliation, his standard method throughout the dialogues, but not because he knows the answers. When challenged, Socrates always demurs. He has no wisdom, he says, but is only a kind of “midwife” who can help others to seek it. Even though the goal of philosophy is to find the truth, Socrates customarily professes ignorance.
Plato here teaches a central lesson about the philosopher’s search for knowledge, which has ramifications for any quest for true belief. The real enemy is not ignorance, doubt, or even disbelief. It is false knowledge. When we profess to know something even in the face of absent or contradictory evidence, that is when we stop looking for the truth. If we are ignorant, perhaps we will be motivated to learn. If we are skeptical, we can continue to search for answers. If we disbelieve, maybe others can convince us. And perhaps even if we are honestly wrong, and put forward a proposition that is open to refutation, we may learn something when our earlier belief is overthrown.
But when we choose to insulate ourselves from new ideas or evidence because we think that we already know what is true, that is when we are most likely to believe a falsehood. It is not mere disbelief that explains why truth is so often disrespected. It is one’s attitude.
In a recent paper, “Why Do Humans Reason?,” Hugo Mercier and Dan Sperber, both of them philosophers and cognitive scientists, argue that the point of human reason is not and never has been to lead to truth, but is rather to win arguments. If that is correct, the discovery of truth is only a byproduct.
That humans often reason poorly is beyond dispute. The psychological literature is replete with examples of mistakes like “confirmation bias” (seeking out only information that confirms our preconceptions) and “hindsight bias” (relying on current knowledge to assume that something was predictable all along). The work goes back to the 1970s and ’80s, with Daniel Kahneman and Amos Tversky’s groundbreaking research on irrationality in how people weigh risks and losses, which helped establish the field of behavioral economics and undermine the reigning idea of rational choice in economics. Kahneman, a psychologist who won the Nobel Memorial Prize in Economic Sciences, updated his work in Thinking, Fast and Slow (Farrar, Straus and Giroux, 2011).
The fundamental question that motivates Mercier and Sperber’s analysis is this: Why would being a persuasive speaker be valuable to humans as they evolved? Here the authors tell a story about the importance of argumentation to the evolution of communication. In a group setting, where people were not already inclined to trust one another, they would need some way of evaluating claims. That’s where arguments come in. A bare assertion does not overcome what Mercier, Sperber, and their colleagues have called “epistemic vigilance,” our guard against being deceived or manipulated. If you present other people with the reasons for your belief, however, you have given them the means to evaluate the truth of your claim, and, if you turn out to be right, a reason to extend more trust to you in the future. Thus, according to Mercier and Sperber, providing arguments for our beliefs improves the quality and reliability of the information shared in human communication.
The philosopher Andy Norman and others have criticized this theory by pointing out that it relies far too heavily on the idea that rhetorical skills are valuable within an evolutionary context, irrespective of the truth of the beliefs being advocated. What if the reasons for your beliefs are not true? In a response to Mercier and Sperber, the psychologist Robert J. Sternberg pointed out that while reason and argument are closely related, “persuasive reasoning that is not veridical can be fatal to the individual and to the propagation of his or her genes, as well as to the human species as a whole.”
We are faced with the prospect of a significant change in the temperature of our planet if we continue to extract and burn all of the fossil fuels at our disposal. Suddenly the stakes of the long-standing problem of human irrationality seem enormous. But if the seeds of disrespecting truth were planted so long ago, why are they now growing with such force?
One likely candidate is the Internet. It facilitates not only the spread of truth but also the proliferation of crackpots, ideologues, and those with an ax to grind. With the removal of editorial gatekeepers who can vet information, outright lies can survive on the Internet. Worse, those who embrace willful ignorance are now much more likely to find an electronic home where their marginal views are reinforced.
An obvious solution might be to turn to journalists, who are supposed to embrace a standard of objectivity and source-checking that would be more likely to support true beliefs. Yet, at least in part as a result of the competition that has been enabled by the Internet, we now find that even some mainstream journalists and news media are dangerously complicit in the follies of those who seek to disrespect truth. There have always been accusations of bias in the media, but today we have Fox News on the right and MSNBC on the left (along with a smattering of partisan radio talk-show hosts like Rush Limbaugh), who engage in overt advocacy for their ideological views.
Yet those are not the kinds of journalists we should be so worried about, for they are known to be biased. Another tendency is perhaps even more damaging to the idea that journalism is meant to safeguard truth. Call it “objectivity bias.” Sensitive to criticism that they, too, are partisan, many news sites try to demonstrate that they are fair and balanced by presenting “both” sides of any issue deemed “controversial” — even when there really aren’t two credible sides. That isn’t objectivity. And the consequence is public confusion over whether an issue — in the case of climate change or childhood vaccination, a scientific issue — has actually been settled.
To fight back, we should remember the basic principles of evidence-based belief and true skepticism that got us out of the Dark Ages. Although behavioral economists, among other scholars, have amply shown that human reason is not perfect, that is no excuse for lazy thinking. Even if our brains are not wired to search for truth, we can still pursue a path that might lead to better answers than those supplied by what Kahneman calls the “fast” mode of our thinking. Truth may not be automatic, but it is still an option. Socrates taught us as much long before we knew anything about cognitive science: Good reasoning is a skill that can be learned.
We are no more a slave to nature in reasoning than we are in morality. Few people would argue that we are genetically programmed to be moral. We may be hard-wired to do things that increase the survival value of our genes, like killing our rivals when no one is looking, but we do not do them, because they are unethical. If we can make such a choice in morals, why not also with reason?
The choosing is what makes us human. It’s not our imperfect brains, but the power to decide for ourselves how we will live our lives, that should give us hope. Respecting truth is a choice.
Lee McIntyre is a research fellow at the Center for Philosophy and History of Science at Boston University. His book Respecting Truth: Willful Ignorance in the Internet Age will be published this month by Routledge.