“No one ever went broke underestimating the intelligence of the American public,” said H.L. Mencken in the era of Babbitt and the Scopes “monkey” trial. Several generations later, one might speculate that no publisher has ever lost money with a book accusing Americans — particularly young ones — of being stupid.
The most influential book in that genre is surely Richard Hofstadter’s Anti-Intellectualism in American Life (1963), in which he argues that the American dislike for educational elitism derives from a number of interlocking cultural legacies, including religious fundamentalism, populism, the privileging of “common sense” over esoteric knowledge, the pragmatic values of business and science, and the cult of the self-made man. With some cyclical variation, Americans tend to distrust, resent, and even feel moral revulsion toward “intellectuals.”
As an English professor, I can attest to the power of that element in American culture, as can just about anyone in any academic field without direct, practical applications. When a stranger asks me what I do, I usually just say, “I’m a teacher.” The unfortunate follow-up remarks — typically complaints about political bias in the classroom, or sham apologies for their own poor grammar meant to imply that I am a snob — make me wish I had said, “I sell hydraulic couplers,” an answer more likely to produce hums of respectful incomprehension.
If the situation was bad in Hofstadter’s time, it’s grown steadily worse over the past 40 years. The anti-intellectual legacy he described has often been used by the political right — since at least the McCarthy era — to label any complication of the usual pieties of patriotism, religion, and capitalism as subversive, dangerous, and un-American. And, one might add, the left has its own mirror-image dogmas.
Now, in the post-9/11 era, American anti-intellectualism has grown more powerful, pervasive, and dangerous than at any time in our history, and we have a duty — particularly as educators — to foster intelligence as a moral obligation.
Or at least that is the urgent selling point of a cartload of books published in the past several months.
For academics on the political left, the last eight years represent the sleep of reason producing the monsters of our time: suburban McMansions, gas-guzzling Hummers, pop evangelicalism, the triple-bacon cheeseburger, Are You Smarter Than a Fifth-Grader?, creation science, waterboarding, environmental apocalypse, Miley Cyrus, and the Iraq War — all presided over by that twice-elected, self-satisfied, inarticulate avatar of American incuriosity and hubris: he who shall not be named.
The Anti-Intellectual Presidency: The Decline of Presidential Rhetoric From George Washington to George W. Bush (2008), by Elvin T. Lim, examines speeches and public papers — noting shortened sentences, simplified diction, the proliferation of platitudes — to show a pattern of increased pandering to the lowest common intellectual denominator, combined with a mockery of complexity and analysis.
Just How Stupid Are We?: Facing the Truth About the American Voter (2008), by Richard Shenkman, argues that the dumbing down of our political culture is linked to the decline of organized labor and local party politics, which kept members informed on matters of substance. Building on arguments put forward in books such as What’s the Matter With Kansas?: How Conservatives Won the Heart of America (2004), by Thomas Frank, Shenkman shows how the political right has been able to don the populist mantle even as it pursues policies that thwart the economic and social interests of the average voter.
Meanwhile, the political left cannot very well argue that those average Americans are in some way responsible for their own exploitation because they are too shallow and misinformed — too stupid — to recognize their own interests; making that case would insult the very voters the left hopes to persuade. One of Shenkman’s solutions is to require voters to pass a civics exam.
Former Vice President Al Gore obviously has a dog in this hunt, and his book The Assault on Reason (2007) argues that the fundamental principles of American freedom — descended from the Enlightenment — are being corrupted by the politics of fear, the abuse of faith, the power of an increasingly centralized media culture, and the degradation of political checks and balances in favor of an imperial presidency.
The results of that perfect storm include the aftermath of Hurricane Katrina, the continuing threat of global warming, the squandering of respect and sympathy for the United States after 9/11, and the nation’s dependence on foreign oil.
Most notably, Gore argues that the democratization of information and the community-building power of the Internet can play important roles in the creation of a “well-connected” citizenry and the restoration of a rational democracy.
Nevertheless, several books — with an emphasis on education and the young — argue that it is precisely the point-and-click culture of the Internet that is damaging our intelligence and our civic culture.
Always On: Language in an Online and Mobile World (2008), by Naomi S. Baron, shows how the proliferation of electronic communication has impaired students’ ability to write formal prose; moreover, it discourages direct communication, leading to isolation, self-absorption, and damaged relationships.
Worst of all, the prevalence of multitasking — of always being partly distracted, doing several things at once — has diminished the quality of our thought, reflection, and self-expression, and even, surprisingly, our productivity. Baron’s solution is to turn off the distractions and focus on the task and people at hand.
Her conclusions are largely affirmed by Nicholas Carr’s cover story in the July/August 2008 issue of The Atlantic, “Is Google Making Us Stoopid?,” which prompted a recent dialogue in The Chronicle (“Your Brain on Google,” July 11).
Carr, author of The Big Switch: Rewiring the World, From Edison to Google (2008), argues that daily use of the Internet may be rewiring our brains for skimming rather than for the sustained concentration that is required for reading books, listening to lectures, and writing long essays. Obviously, such rewiring is going to have the biggest impact on the rising generation appearing in our college classrooms: the “digital natives.”
The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future (2008), by Mark Bauerlein, provides alarming statistical support for the suspicion — widespread among professors (including me) — that young Americans are arriving at college with diminished verbal skills, an impaired work ethic, an inability to concentrate, and a lack of knowledge even as more and more money is spent on education.
It seems that our students are dumb and ignorant, but their self-esteem is high, so they are impervious or hostile to criticism. Approaching his subject from the right, Bauerlein mentions the usual suspects — popular culture, pandering by educators, the culture war, etc. — but also reserves special attention for digital technologies, which, for all their promise, have only immersed students more deeply in the peer obsessions of entertainment and fashion rather than encouraging more mature and sustained thought about politics, history, science, and the arts.
For Bauerlein, the future of American democracy “looks dim” unless we can counter the youth culture with respect for the knowledge of those over 30.
The most wide-ranging cultural study — extending Hofstadter’s analysis up to the present — is Susan Jacoby’s The Age of American Unreason (2008), in which she argues that American anti-intellectualism has reached unprecedented heights thanks to the converging influences of junk science, fundamentalism, celebrity-obsessed media, identity politics, urban-gang culture, political correctness, declining academic standards, moral relativism, political pandering, and the weakening of investigative journalism, among other factors.
Jacoby also supports the view that technology has damaged our ability to focus and think deeply. Her vision of the future is a nation that is unprepared for the global challenges we face.
As someone involved in education, I take the concerns of all of those writers quite seriously: The abilities and attitudes of students affect my life on a daily basis. It is my job, as I see it, to combat ignorance and foster the skills and knowledge needed to produce intelligent, ethical, and productive citizens. I see too many students who are:
- Primarily focused on their own emotions — on the primacy of their “feelings” — rather than on analysis supported by evidence.
- Uncertain what constitutes reliable evidence, and thus inclined to use the most easily found sources uncritically.
- Convinced that no opinion is worth more than another: All views are equal.
- Uncertain about academic honesty and what constitutes plagiarism. (I recently had a student defend herself by claiming that her paper was more than 50 percent original, so she should receive that much credit, at least.)
- Unable to follow or make a sustained argument.
- Uncertain about spelling and punctuation (and skeptical that such skills matter).
- Hostile to anything that is not directly relevant to their career goals, which are themselves only vaguely understood.
- Increasingly interested in the social and the athletic above the academic, while “needing” to receive very high grades.
- Not really embarrassed by their lack of knowledge and skills.
- Certain that any academic failure is the fault of the professor rather than the student.
About half of the concerns I’ve listed — punctuation, plagiarism, argumentation, evaluation of evidence — can be effectively addressed in the classroom. But the other half are increasingly difficult to address without considerable institutional support: small classes, high standards, and full-time faculty members who are backed by the administration.
More than anything else, I see the group of books I’ve listed here as supporting the redirection of resources into the classroom, rather than into amenities and administrative bloat. We need to reverse the customer-service mentality that goes hand in hand with the transformation of most college teaching into a part-time, transient occupation and with the absence of any reliable assessment of course outcomes besides student evaluations.
On the other hand, I am not so pessimistic about the abilities of the “digital natives.” Different generations have different ways of knowing — different configurations of multiple intelligences. Pick your era and your subject: How many of us know anything about farming anymore or how to read the changing of the seasons? How many of us know how to repair an automobile or make a cake from scratch?
Of course, we lament that the skills we have acquired at great pains can become lost to the next generation, but we can hardly reverse all of it. And it may be that the young are better adapted to what is coming than we are.
We can be student-centered and respond to their ways of viewing the world, but at the same time it seems reasonable to expect that students also become faculty-centered. Students must learn, as we do, to speak across generational lines and gradually abandon the notion of a world constructed purely around them.
While I share many of these authors’ concerns about the pathologies nurtured by new technologies, I have to agree with Gore’s position — that technology must play a prominent role in this continuing intergenerational negotiation. There are, undoubtedly, major changes taking place in the culture and psychology of the young, with serious consequences for everyone. And there are many steps that individual educators can take to deal with those changes.
But that’s a subject for next month’s column.