Last year, a former Princeton University president, William G. Bowen, delivered the Tanner Lectures at Stanford, continuing a long tradition of college leaders' using the top floors of the ivory tower to speak difficult truths about academe.
When the dot-com craze was sweeping the nation, back in 2000, Bowen—an author in the 1960s of the original "cost disease" diagnosis of labor-intensive industries—kept his eyes on the evidence. He didn't yet see reason to believe that colleges could use technology to save money. But another decade of progress changed his mind. "I am today a convert," he said. (The lectures were published this year by Princeton University Press as Higher Education in the Digital Age.) Bowen's randomized-trial research suggests that "online learning, in many of its manifestations, can lead to at least comparable learning outcomes relative to face-to-face instruction at a lower cost."
With MOOC mania showing no signs of abating, such strong conclusions from an esteemed scholar have drawn notice.
Less attention has been paid to what Bowen said next. Increasing labor productivity is one thing in theory, something else in practice. Less face-to-face instruction can mean fewer faces on the faculty side. It also means routinizing aspects of the student learning experience and thus reducing the day-to-day academic discretion of individual professors.
Bowen was a university president long enough to know how faculty are likely to respond, and what mechanisms of structural power and appeals to tradition will be employed to divert productivity gains to "gild the educational/research lily." "I wonder," he said, "if the particular models of what is often called 'shared governance' that have been developed over the last century are well suited to the digital world."
Nor does Bowen believe that this is an issue of academic freedom: Freedom of expression "should not imply unilateral control over methods of teaching. There is nothing in the basic documents explaining academic freedom to suggest that such control is included. It is not."
These are the battle lines of early-21st-century higher education. Or at least they will be if the professoriate chooses to fight for traditional totems and privileges.
That would be a mistake. Shared governance, tenure, and academic freedom in the classroom are at once indefensible and not worth the trouble. There is a better future for academic labor, one that employs the same technologies that seem so threatening today.
Shared governance has always been, as the historian Laurence Veysey wrote in 1965, "a useful device whereby administrative leaders could sound out opinion, detect discontent so as to better cope with it, and further a posture of official solidarity by giving everyone parliamentary 'rights.'" University employers think of their workers as workers, treat them as such, and try to hire them for what the market will bear. That's why colleges and universities employ a much larger percentage of adjunct professors than in decades past. Governance is for the governors; it can't really be shared.
Tenure, meanwhile, is one of the worst deals in all of labor. The best scholars don't need tenure, because they attract the money and prestige that universities crave. A few worthy souls use tenure to speak truth to administrative power, but for every one of those, 100 stay quiet. For the rest, tenure is a ball and chain. Professors give up hard cash for job security that ties them to a particular institution—and thus leaves them subject to administrative caprice—for life. It's a bad bargain that explains much of the atmosphere of grievance that hangs over academic culture.
Tenure is most important as a symbol of professional status. But anything can be a symbol. The academy should replace tenure with a gold medallion that means the same thing and stop trading away money and job flexibility for "security" that tastes like ashes as time goes by.
Total academic freedom in the classroom is just ridiculous. If an organization is focused on doing something, like education, it ought to have theories and standards and practices about that something. As the University of Chicago's longtime president, Robert Maynard Hutchins, said in another lecture-series-turned-book-length provocation, the system of letting students choose among classes taught by wholly autonomous professors "denies there is content to education, so the organization of the modern university denies that there is rationality in the higher learning. The free elective system as applied to professors means that they can follow their own bents, gratify their own curiosity, and offer credits in the results."
Hutchins was concerned with curriculum back in 1935, but the same laissez-faire approach applied, then as now, to the practice of teaching. Student evaluations provide little useful information other than an inverse correlation to academic rigor. Teaching varies as professors see fit, which is widely, and so learning varies as well, and far too much.
If professors gather ranks to defend these old and decidedly mixed privileges, they will lose. Bowen is right in the particulars, and the force of that logic combined with the relentless economic reality of technological improvement will win out in the end.
Imagine, instead, that college professors took a different approach, treating the rise of information technology as an opportunity for liberation. When the only way to run a university was to gather all the books and smart people on a fenced-in plot of land, whoever controlled access to the gates was in charge. But we don't live in that world anymore.
While academic communities have stayed stuck in whatever locality their founders happened to have settled in long ago, the demography and vitality of the nation have changed profoundly. The creative classes have flocked to walkable urban communities that are alive with intellect and culture and largely rid of crime. In other words: Have you been to Brooklyn lately? It may have its share of annoying hipsters, but it's one big college town, in the best sense of the term.
Except there are no new colleges there, or in dozens of similar places nationwide. There are new companies and revitalized neighborhoods, tech start-ups, even manufacturing—but in a borough of 2.6 million people, no exciting new liberal-arts colleges, no modern, cutting-edge research universities, no new kinds of higher-education organization.
There could be. The same technological tools that are making academic labor more productive are eliminating the need for top-heavy academic administration. Professors—the good ones, anyway—have the expertise and teaching skills that students need. They can cut out the middlemen and thrive on the flip side of labor productivity. Not fewer educators, but more and better education for more people.
To succeed, these new organizations would need to have a coherent theory of themselves and a specific educational focus. They couldn't be all things to all people, because that way lies bureaucratic bloat. The people who work there would be at-will employees—as almost all professionals are now—required to do a great job teaching. Some of the professors would live elsewhere, and so would some of the students, in the kind of hybrid terrestrial and virtual communities that increasingly characterize modern life. Teaching would no longer be the handmaiden of research. The grotesqueries of intercollegiate athletics would be gone.
These new colleges would be built where people want to live, and taught the way people want and need to learn. The long cold war between administration and professoriate would fall to history, where it belongs.