"Humility" isn’t a word that most academics — or Americans — identify with. Indeed, if there is a single attitude most closely associated with our culture, it’s the opposite of humility. The defining trait of the age seems to be arrogance — in particular, the kind of arrogance personified by our tweeter in chief; the arrogance of thinking that you know it all and that you don’t need to improve because you are just so great already.
But our culture’s infatuation with this kind of arrogance doesn’t come out of the blue. Trump is a symptom and not the cause of a larger trend, one that rewards dogmatic certainty and punishes those who acknowledge the possible limitations of their own point of view. Liberal white male professors like myself are hardly immune. And part of the academic culture we’ve helped to create — including the rise of aggressive "no platforming" tactics to prevent conservatives from speaking on some campuses — has only fed into the perception that academics are no more willing to engage in dialogue and debate than Trump supporters.
Fueling this trend of know-it-all arrogance is the oft-cited polarization of the American people, encouraged by our use of technology. The internet didn’t create this polarization, but it does speed it up. That’s partly because the analytics that drive the internet don’t just get us more information; they get us more of the information we want.
Everything from the ads we read to the political news in our Facebook feed is tailored to our preferences. That’s incredibly useful for buying shoes and finding good restaurants. It is easier than ever to get and share information, but the information we get often reflects ourselves as much as it does anything else. Less noticed is that this has an effect not only on how we regard others, but on how we regard ourselves.
The result is a peculiar kind of ignorance: an inflated sense of how much we individually know. That ignorance is partly due to the fact that human beings aren’t isolated knowing machines. We live in an economy of knowledge that distributes cognitive and epistemic labor among specialists. That’s a good thing — no one person can know everything, or even very much. But put all the doctors, scientists, mechanics, and plumbers together, and we collectively know quite a bit.
Yet this often means we blur the line between what’s inside our heads and what’s not. Some philosophers have argued that this blurring is actually justified because knowing itself is often an extended process, distributed in space. When I know something because of your expert testimony — say, that my car’s alternator is broken — what I know is partly in your head and partly in mine. If that’s right, then living in a knowledge economy literally increases my knowledge because knowing is not just an individual phenomenon.
Suppose this extended, distributed picture of knowledge is right. Add the personalized internet, with its carefully curated social-media feeds and individualized search results, and you get not one knowledge economy, but many different ones, each bounded by different assumptions of which sources you can trust and what counts as evidence and what doesn’t. The result is not only an explosion of overconfidence in what you individually understand but an active encouragement of epistemic arrogance. The Internet of Us becomes one big reinforcement mechanism, getting us all the information we are already biased to believe, and encouraging us to regard those in other bubbles as misinformed miscreants. We know it all — the internet tells us so.
Ideology plays a significant role here. We know people disagree with us on a range of issues, from climate change to taxes to vaccines. Indeed, we disagree on so much that it can seem, as one political commentator recently put it, that there are no facts anymore. That’s a way of expressing a seductive line of thought: There just is no way of escaping your perspective or biases. Every time you try to get outside of your own perspective, you just get more information filtered through your own perspective. As a consequence, objective truth is just irrelevant — either we’ll never know it or it doesn’t exist in the first place.
This is an old philosophical idea. The Greek philosopher Protagoras expressed it by saying "man is the measure of all things." That can seem liberating — we all get to invent our own truth! And it has certainly had its fair share of contemporary supporters. Academe, in particular, has been complicit in devaluing objective truth and in the subsequent rise of intellectual arrogance. The postmodernist generation of humanists (and I am one of them) grew up in the ’80s and ’90s distrusting metanarratives and the very idea of objectivity. But while these movements rightly made us aware of how the implicit lines of institutional, gendered, and racial power affect what passes for truth in a society, they were sometimes taken further to encourage a complete — and often incoherent — rejection of the idea that anything is true (except, apparently, that rejection itself).
Skepticism about truth is really more self-rationalization than good philosophy. It protects our biases and discourages us from trying to see ourselves as who we really are. More than that, a rejection of objective truth invites despotism simply because it collapses truth into whatever those in power allow to pass for truth in your bubble. And once that is accepted, then the very idea of speaking truth to power becomes moot. You can’t speak truth to power when power speaks truth by definition.
Our cultural embrace of epistemic or intellectual arrogance is the result of a toxic mix of technology, psychology, and ideology. To combat it, we have to reconnect with some basic values, including ones that philosophers have long thought were essential both to serious intellectual endeavors and to politics.
One of those ideas, as I just noted, is belief in objective truth. But another, less-noted concept is intellectual humility. By intellectual humility, I refer to a cluster of attitudes that we can take toward ourselves — recognizing our own fallibility, realizing that we don’t really know as much as we think, and owning our limitations and biases.
But being intellectually humble also means taking an active stance. It means seeing your worldview as open to improvement by the evidence and experience of other people. Being open to improvement is more than just being open to change. And it isn’t just a matter of self-improvement — using your genius to know even more. It is a matter of seeing your view as capable of improvement because of what others contribute.
Intellectual humility is not the same as skepticism. Improving your knowledge must start from a basis of rational conviction. That conviction allows you to know when to stop inquiring, when to realize that you know enough — that the earth really is round, the climate is warming, the Holocaust happened, and so on. That, of course, is tricky, and many a mistake in science and politics has been made because someone stopped inquiring before they should have. Hence the emphasis on evidence; being intellectually humble requires being responsive to the actual evidence, not to flights of fancy or conspiracy theories.
In a democracy, intellectual humility as I’ve defined it is most important for those in power, be it political power or a more diffuse but wide-ranging cultural power. That’s partly what makes institutions that encourage and protect rational dissent — like a free press and academic freedom — of such crucial importance. It is not just, as John Stuart Mill argued, that free inquiry is apt to see truth win out in the end — an overly optimistic view, I’ve always thought — but the fact that researchers can pursue lines of inquiry even if they make those in power uncomfortable. Such institutions, at their best, encourage the pursuit of truth via evidence — and as such, they have the potential to remind us that power, and our own bubbles, are not the measure of all things.
Yet institutional protections themselves are not quite enough. We need to incorporate intellectual humility — what John Dewey called the "scientific attitude" — as a cultural norm. "Merely legal guarantees of the civil liberties of free belief, free expression, free assembly are of little avail," Dewey noted, "if in daily life freedom of communication, the give and take of ideas, facts, experiences, is choked by mutual suspicion, by abuse, by fear and hatred."
Dewey knew that democracies can’t function if their citizens don’t have conviction — an apathetic electorate is no electorate at all. But our democracy also can’t function if we don’t seek, at least some of the time, to inhabit a common space where we can listen to each other and trade reasons back and forth. And that’s one reason that teaching our students the value of empathy, of reasons and dialogue, and of the nature of evidence itself is crucial — in fact, now more than ever. Encouraging evidential epistemologies helps combat intellectual arrogance.
Overcoming toxic arrogance is not easy, and our present political moment is not making it any easier. But if we want to live in a tolerant society where we are not only open-minded but willing to learn from others, we need to balance humility and conviction. We can start by looking past ourselves — and admitting that we don’t know it all.
Michael Patrick Lynch is professor of philosophy at the University of Connecticut and author, most recently, of The Internet of Us: Knowing More and Understanding Less in the Age of Big Data. He is the director of the Humanities Institute and the principal investigator of Humility & Conviction in Public Life, an applied research project aimed at revitalizing our fractured public discourse. Follow him on Twitter @Plural_truth.