Last week, my husband and I got into an after-dinner argument over nuclear power with a good friend who was visiting from Moscow—the kind of heated argument friends get into all the time. After almost two hours of loud give and take, we gave up and went to bed.
In light of recent discussions I’ve been reading about “the argumentative theory of reasoning,” I’ve been re-examining what happened that night. Each of us remained rational—offering strong evidence for our positions and staying within the confines of logic. But were any of us really trying to discover the truth? Or were we each simply trying to win the argument—using reasoning skills as weapons with which to clobber our opponents?
In a jointly authored paper published this past June (“Why Do Humans Reason? Arguments for an Argumentative Theory”), Hugo Mercier (a postdoctoral fellow at the University of Pennsylvania) and Dan Sperber (a French research scholar in the social and cognitive sciences with several university affiliations) propose that human beings use reason not to discover the truth, but in order to argue. Mercier and Sperber also cite studies showing that individuals argue better in a group than when they’re alone (arguing with themselves, I presume—something most of us call “thinking”).
Almost all classical philosophy—and nowadays, the “critical thinking” we in higher education tout so automatically—rests on the unexamined idea that reasoning is about individuals scrutinizing their beliefs in order to discover the truth. The “argumentative theory of reasoning” upends this philosophical cart by proposing that reasoning is actually a tool to help with arguing—that arguing, not truth, is the point of it all. The theory jibes nicely with evolutionary psychology, which sees reasoning as yet another step in the development of human communication that enhanced the social side of human existence and put the human competitive urge to good use.
Critics have attacked Mercier and Sperber, charging them with offering the theory of the month (in glomming onto the trendy evolutionary psychology) and with being caught in the problem of self-reference (are the authors after the truth about thought, or simply trying to win an argument?).
Yet the argumentative theory of reasoning (also known as the “social brain hypothesis”) can’t be dismissed so readily. In proposing that reasoning works best not when people think things through for themselves, on their own, but when they reason together in the context of a group, Mercier and Sperber are in step with current research on the subject. As the authors point out, reasoning, even if flawed in any given instance, works in an overall way to improve the reliability of what speakers say, and what listeners hear. It helps groups as a whole decide what is the best course of action. (As a political aside, the theory suggests democracy is superior to all other forms of government.)
When teamed up with evolutionary psychology, the argumentative theory of reasoning might just become the next popular movement in higher education (replacing boring, bland mission-speak about critical thinking with boring, bland mission-speak about group arguments). If evolutionary psychologists are right that reasoning evolved not so we can discover truth, but simply so we can argue with others, we’re staring at a good explanation for the curious fact that reasoning in groups is generally superior to the reasoning of an “average individual”—and is frequently better than the reasoning of the best individual within that group. Or consider the fascinating problem of the “confirmation bias”—the tendency we all have to accept arguments that support our point of view and to resist those that challenge it. The confirmation bias is said to be at its most powerful when we reason alone.
The argumentative theory of reasoning offers a strong explanation for why democratic deliberations work poorly, and lack any creativity, within closed groups. When everyone in a given group, at the start, thinks the same way, and there are no opposing ideas to argue with, people don’t really deliberate. Instead, they simply reinforce one another’s confirmation bias. A group of likeminded people ends up as a unified, polarized group. For true deliberation, the confirmation bias of people within a group needs to be countered by opposite opinions. (Mercier admits that the current Congress presents a puzzling exception to the theory, suggesting this comes from the fact that it started off too polarized to ever achieve any kind of rational consensus.)
Let’s think what this might mean for education. Gone would be the idea of the student seated at a desk in the library, alone with his thoughts and his freshly learned critical thinking skills. Mercier and Sperber would argue that when students reason on their own, they’re actually going to end up using reason for purposes for which it isn’t designed. All our fussing over teaching students to “think critically” would be beside the point, because it isn’t reasoning that needs changing. Reasoning is doing just fine—doing what the authors say it’s “supposed to do”—simply because its nature is that it helps people win arguments. The authors say that instead of trying to change the way people reason, “interventions based on the environment—institutional in particular—are much more likely to succeed.” In other words, we need to create more situations where students are compelled to argue with people who disagree with them. That, according to the authors, “should produce very good results without having [to have reasoning itself] be reformed.”
If the argumentative theory of reasoning takes hold, it will destroy more than critical thinking. It’ll take down the whole edifice of Western thought—including Aristotle’s ideas about man possessing a rational principle, and Descartes’ famous “Cogito ergo sum,” which flattened that principle with the cudgel of experience. The new purpose of reasoning will not be to discover truth, but to argue in groups.
What a depressing advance in the history of thought.