Over the next 10 years, scientific experts will be dealing with "extreme weather." No one knows how weird and dangerous it will get.
Moscow already faces Bahrain-like temperatures. Downpours swamp a fifth of Pakistan. President Mohamed Nasheed, of the Maldives, worries enough about future sea levels to hold a cabinet meeting underwater in scuba gear. (Don't miss this on YouTube!)
Parallel thinking should apply to a phenomenon of greater concern to readers here: "extreme academe." Think of it as the hysterical upgrading of ugly visions of the future already found in polite critiques of higher ed.
Back in 2003, for instance, former Harvard President Derek Bok, in Universities in the Marketplace: The Commercialization of Higher Education (Princeton University Press), drilled home the problem capsulized in his subtitle by noting that throughout the 1980s, deans and professors brought him "one proposition after another to exchange some piece or product of Harvard for money—often, quite substantial sums of money."
Though hardly the first to notice the trend—Stanley Aronowitz, in The Knowledge Factory: Dismantling the Corporate University and Creating True Higher Learning (Beacon Press, 2001), produced one prior cri de coeur—Bok, as the highest of high mandarins of academe, legitimized the insight. Now a healthy genre tracks this particular slide toward extreme academe, marked by such fine indictments as Jennifer Washburn's University, Inc.: The Corporate Corruption of Higher Education (Basic Books, 2006), and Frank Donoghue's The Last Professors: The Corporate University and the Fate of the Humanities (Fordham University Press, 2008). By last year's Wannabe U: Inside the Corporate University (University of Chicago Press, 2009), the downward spiral was such a cliché that sociologist Gaye Tuchman could mine it for laughs as well as an aperçu, with her semidisguised state-university president who's always declaring, "This is a university in transformation."
Other recent scrutinizers of academe perceive related threats. Mary Burgan, a former general secretary of the American Association of University Professors, marquees her main fear in her title: Whatever Happened to the Faculty? (Johns Hopkins University Press, 2006). Harvard English professor and New Yorker staff writer Louis Menand, in The Marketplace of Ideas (W.W. Norton, 2010), sees a partly antiquated 19th-century university system trying to solve 21st-century problems, such as how one adapts "the lecture monologue" to "a generation of students who are accustomed to dealing with multiple information streams in short bursts." Amanda Goodall, in Socrates in the Boardroom: Why Research Universities Should Be Led by Top Scholars (Princeton University Press, 2009), warns that managerial empty suits will destroy the great American university.
Extreme academe, as a vision, ups the ante of such concerns. It adds flash and cynicism to mere trepidation. According to it, college students in 2020 will use plastic cards to open the glass security doors installed at each entrance to campus. On special occasions, the sole tenured faculty member at every institution will be wheeled out, like the stuffed remains of Jeremy Bentham at University College London, for receptions.
Plagiarism, having evolved, with the help of Stanley Fish, from mortal academic sin to mere "breach of disciplinary decorum," will be an elective track, on a par with fiction and poetry, within the creative-writing major. Several great research universities will be led by former Big Ten football coaches. Indeed, by 2020, President Bok's nightmare of the future, shared in his commencement address to Harvard's Class of 1988, may be the standard scenario across the land: corporate logos on syllabi and course materials, ads in the classroom (and presumably above the urinals), commercials during class time, and auctions to the highest bidders of "the last one hundred places" in every entering class.
My own peculiar worry about Academe 2020, offered with less than 20/20 foresight, may seem less catastrophic: the death of the book as object of study, the disappearance of "whole" books as assigned reading. Does that count as a preposterous figment of extreme academe, or is it closer than we think?
I don't mean the already overwrought debate over the crisis of the book as codex—the daily New York Times announcement that electronic readers stand primed to eliminate paper books. (This shift, of course, plays into the problem, since any shrewd publishing type can see how the paper book's demise might make it easier to digitally trim, abridge, and repackage texts in more "appealing" forms than their benighted authors envisaged.) The issue isn't the decline in book sales, though it, too, remains an element of the big picture. I am talking about the growing feeling among humanities professors—intuitive and anecdotal, shared over lunch like an embarrassing tale about a colleague—that for too many of today's undergraduates, reading a whole book, from A to Z, feels like a marathon unfairly imposed on a jogger.
To be fair, their elders increasingly encourage the thought that whole books lack the coolness of whole grains. Three years ago, Weidenfeld & Nicolson launched its Compact Editions series of classics such as Vanity Fair and Moby-Dick. The publisher explained that they'd been "sympathetically edited so that most of them are under 400 pages," but that the cuts "in no way detract from the spirit of the original." Surgery simply rendered such classics less "elitist." Dripping drollery in The Times of London, critic Richard Morrison opined that truth in advertising behooved the publisher to adjust titles as well, perhaps to Vanity Off-Peak Fare, and Mini-Dick.
Any wonder that last year, two cheeky University of Chicago undergrads with literary parents—Alexander Aciman and Emmett Rensin—published Twitterature (Penguin), boiling down classics of world lit to 140-character bone? Here's their speed-read version of The Epic of Gilgamesh: "@UrukRockCity—Great. That's it. I'm leaving Uruk. My best friend in the world is dead, all because the gods couldn't handle our bromance."
The signs of readerly surrender pop up everywhere. Princeton student Isia Jasiewicz, reviewing a book for Newsweek this summer as an intern, admits in her last paragraph that she bothered to read only the first 10 pages. Linda Nilson, director of the Office of Teaching Effectiveness at Clemson University, posts a piece titled "Getting Students to Do the Reading" on the Web site of the National Education Association, advising: "Look for readings with graphics and pictures that reinforce the text, and pare down the required pages to the essentials. The less reading assigned, the more likely students will do it."
Destructive cultural trends lurk behind the decline of readerly ambition and student stamina. One is the expanding cultural bias in all writerly media toward clipped, hit-friendly brevity—no longer the soul of wit, but metric-driven pith in lieu of wit. Everywhere they turn, but particularly in mainstream, sophisticated venues—where middle-aged fogies desperately seek to stay ahead of the tech curve—young people hear, through the apotheosis of tweets, blog posts, Facebook updates, and sound bites as the core of communication, that short is always smarter and better than long, even though most everyone knows it's usually dumber and worse.
Another cultural trend propelling the possible death of the whole book as assigned reading is the pressurized hawking of interactivity, brought to us by the same media panderers to limited attention spans. It's no longer acceptable for A to listen to B for more than a few minutes before A gets his or her right to respond. High culture, for sure, also bears high responsibility for this, ranging back to Foucault's and Barthes's assaults on the "author," Eco's celebration of the "open work," and a score of other late-20th-century academic authorities questioning why creators of texts should determine where they begin or end as well as what they mean. On street level, we end up with commercial gambits such as Compact Editions. On syllabus level, we await the next generation of professors who will assign just part of Anna Karenina, or the best stretches of Great Expectations, all the while wondering why anyone ever wrote a book longer than John Stuart Mill's On Liberty.
A useful text with which to muse on this subject is Robert Darnton's The Case for Books: Past, Present, and Future (PublicAffairs, 2009). In it, the onetime newspaper reporter, distinguished scholar of the Enlightenment and the history of the book, and director of Harvard's libraries moves between explaining and worrying over Google Book Search, and considering how the book's present predicament looks in historical perspective. Many of his observations give pause.
Darnton notices what many other professors also see in young people: "A generation 'born digital' is 'always on,' conversing everywhere on cellphones, tapping out instant messages, and networking in actual or virtual realities. The younger people you pass on the street or sit next to on a bus are simultaneously there and not there. They shake their shoulders and tap their feet to music audible only to them inside the cocoon of their digital systems. They seem to be wired differently from their elders, whose orientation to machines comes from another zone of the unconscious."
Many college-age sorts study their phones, put them away to try to focus on something else—the passing scenery outside the Amtrak train, a magazine, the old-fashioned book they've brought along—then yank the phones back out three or four minutes later and start tapping away again. Reading a book, however, requires concentration, endurance, the ability to disconnect from other connections. You have to be there rather than not there. Hyperwired young people may be making it to age 17 without ever acquiring that ability in the first place.
Darnton recognizes that the authority of books—those objects to which, NEA studies and other data tell us, the young are not connecting—"derives from a great deal more than the technology that went into them." It comes from the years of research put into them, of revising and recasting sentences, of organizing paragraphs and chapters, of taking the time and space to set out one's evidence and counterevidence, the opinions of others, the context of one's subject, its upshot. Little of that can be done by the essay, let alone the post or tweet.
Darnton's musings intrigue because while few equal him as a lover of traditional books and their importance, he also betrays signs of "silicon syndrome" (compare "Stockholm syndrome"), a susceptibility to mounting assumptions that surround him. Darnton the Head Librarian sounds open to elevating every slight communication to a datum of significant cultural importance. "We are also experimenting," he writes of himself and his Harvard colleagues, "with plans to archive the millions of messages exchanged within the university by e-mail." Leaving aside the legalities, does anyone want to guess how the wheat and chaff divide there?
"Perhaps we suffer," he writes, "from too narrow a notion of publication, something we associate exclusively with professionals who produce journals and books."
Au contraire, the problem of the moment is that we suffer from too broad a notion of publication, applying the concept to every transient expression. The world and scholarship survived centuries—millennia—of not cataloging every comment made by people to one another. Yes, it's a shame we've lost the offhand remarks of Voltaire, what Shakespeare said to friends, and almost everything that might count as an e-mail in ancient Greece and Rome. A shame, too, that we don't have video of the Crucifixion, stills of the Flood, and things like that.
But are we worse for not having archived the ephemera of mankind, for having devoted libraries and syllabi to books—the weightiest, most important, most enduring forms of communication? The old criterion of librarianship and pedagogy was right: Save and study the substantive, don't worry about the insignificant. What will be the impact on future professors, wondering whether to assign whole books to future students, if libraries, of all institutions, start to see the book as merely primus inter pares among acts of communication? It is not a first among equals, because other forms of communication do not equal its weight, its power, its thoroughness.
Yes, we know—what is a book, after all? Anything an editor at a publishing house agrees to put between two covers, or zap to a Kindle/Sony Reader/Nook? Isn't it often truly (when the cachet of the word is put aside) just a collection of short pieces stitched together, or a rush job, rather than a sustained, coherent text of 250 to 1,000 pages?
And who says that teaching whole books as whole books makes good sense anyway? Is every word of Freud's The Interpretation of Dreams, or Darwin's The Origin of Species, really necessary to understand those books? Doesn't Tolstoy run on at times?
Reasonable issues, all. But whatever clever eristic moves you make, there's a problem on the horizon—extreme academe is heading our way. Will professors hold the line? Will they insist that the most distracted generation in history rise to the challenge of reading books, or will future faculty members replace the book with the chapter? Maybe extreme weather and extreme academe will come together. As oceans rise, temperatures soar, electrical grids fail, and smartphones no longer charge, Generation Text may rediscover the real thing.