Earlier this year, I resigned from my position as an associate professor of computer science at the University of New Mexico; in July, I started as a software engineer at Google. Countless people, from my friends to my (former) dean, have asked, Why? Why give up an excellent—some say "cushy"—tenured faculty position for the grind of corporate life?
It's a good question. Tenure represents the ultimate in intellectual freedom; my colleagues in my department were talented, friendly, and incredibly innovative; and I was privileged to work with some excellent students, a number of whom would have fit in just fine at powerhouse institutions like the Massachusetts Institute of Technology or Purdue University.
Honestly, the reasons are myriad and complex, and some of them are purely personal. But I wanted to lay out the ones that speak to larger trends at the university, in New Mexico, in academe, and in the United States in general. I haven't made this move lightly, and I think it's an important cautionary note to sound: The factors that have made academe less appealing to me will also affect other professors. I'm concerned that the United States—one of the innovation powerhouses of the world—will hurt its own future considerably if we continue to make educational professions unappealing.
Making a difference. Ultimately, I got into academic science to make a positive contribution in the world. My goal hasn't changed, but, for some of the reasons I outline below, it has become harder to achieve over time. Google is a strong example of an organization that actually is using advanced computer science to make a real, positive difference in the world. While it's also difficult to make an impact working at an immense company like Google, in the current climate it seems I have a better chance here than in academe.
Work-life imbalance. Immense amounts have been written about the imbalance issue, and I won't try to reprise the arguments here. Suffice it to say that the professorial life can be grueling, if you try to do the job well, and being posttenure doesn't make it easier. This is a widespread problem in academe, and my university was no different. As of my departure, the University of New Mexico had still not approved a unified parental or family-leave policy for faculty, let alone established consistent policies and support for work-life balance.
Centralization of authority and decrease of autonomy. In my time at the university, I served under four presidents, three provosts, and two deans. The consistent pattern across those management changes was centralization of control and resources, and increased pressure on departments and faculty members. Those trends gradually, but quite noticeably, produced implicit and explicit attacks on faculty autonomy (less money under faculty control and more uncertainty). In turn, I (and many others) feel that those attacks subvert both the teaching and research missions of the university.
Budget climate. A near-decade of two simultaneous foreign wars, topped off by the most brutal recession in two generations, has left federal and state budgets reeling. A poisonous political climate and a Congressional meltdown have destroyed any chance of coherent, reasoned budget planning.
In the face of such pressures, we have seen at least seven years of flat or declining support for federal science programs while state legislatures have slashed educational spending across the country. Together, those forces are crunching universities, which has led to additional pressure on faculty members. Professors are being pushed ever harder to win ever higher levels of federal research money precisely at a time when that money is ever tougher to come by.
Such trends hurt the university by putting the teaching mission at odds with the research mission and subjugating both to the quest for the elusive dollar. A recent policy change in my old university's engineering school, for example, uses teaching load as a punishment to goad professors into chasing grant money. (Indeed, the policy measures research success only as a function of dollars brought in. Strangely, research productivity doesn't enter the picture, let alone creativity.)
Hyperspecialization, insularity, and narrowness of vision. The economic pressures have also turned into intellectual pressures. When humans feel panicked, we tend to become more conservative and risk-averse—we go with the sure thing, rather than the gamble.
The problem is that creativity is all about exploratory risk. The goal is to find new things—to go beyond the state of the art and to discover or create things that the world has never seen. It's a contradiction to simultaneously forge into the unknown and to insist on a sure bet. Traditionally, U.S. universities have provided a safe home for that kind of exploration, and federal, state, and corporate dollars have supported it (incidentally, buying advanced research far more cheaply than it would cost in either industry or government, and insulating those entities from the risk). The combination has yielded amazing dividends, paying off at many, many times the level of investment.
In the current climate, however, all of those entities, as well as scientists themselves, are leaning away from exploratory research and insisting on sure bets. Most of the money goes to ideas and techniques (and researchers) that have proven profitable in the past, while it's harder and harder to get ideas outside of the mainstream to be accepted by peer review, supported by the university, or financed by grant agencies. The result is increasingly narrow vision in a variety of scientific fields and an intolerance of creative exploration. (My colleague Kiri Wagstaff, of NASA's Jet Propulsion Laboratory, has written an excellent analysis of one facet of this problem within our own field of machine learning.)
Poor incentives. The "publish or perish" and "procure funding or perish" pressures discourage exploration outside one's own specialty. It's hard to do exploratory or interdisciplinary research when it is unlikely to yield either novel publications or new grant money (let alone, say, help students complete their degrees) in your own field.
But many things that are socially important to do don't necessarily require novel research in all of the participating fields, so there's a strong disincentive to work on those projects. As just one example from my own experience: My research team was asked to help on a medical-school project that would actually help save babies' lives. But the statistical techniques needed for the project were already established, so there was nothing precisely publishable for my graduate students working on it and no good basis for new grant proposals from the work. Any time spent on it would delay students' progress toward their Ph.D.'s with nothing to show for it on their CVs. When you can't get credit for helping to save babies' lives, then you know there's something seriously wrong in the incentive system.
Mass production of education. There's been a lot of excitement in the media about Stanford University's 100,000-student courses in computer science, MIT's open-sourced classes, and other efforts at mass distance education. In some ways, those efforts really are thrilling—they offer the first truly deep structural change in how we do education in perhaps a thousand years. They offer democratization—opening up access to world-class education to people from all over the globe of diverse economic and social backgrounds. How many Srinivasa Ramanujans might we enable, if only we could get high-quality education to more people? But I have to sound three notes of caution.
First, I worry that mass production in this case will have the same effect that it has had on manufacturing for over two centuries: Administrators and regents, eager to save money, will push for ever larger remote classes and fewer faculty members to teach them. Are we approaching a day in which there is only one professor of computer science for the whole country?
Second, I suspect that the "winners win" cycle will distort academe the same way that it has industry and society. When freed of constraints of distance and tuition, why wouldn't every student choose a Stanford or MIT education over, say, the University of New Mexico? How long before we see the AT&T, Microsoft, or Google of academe? How long before 1 percent of the universities and professors garner 99 percent of the students and money?
Third, and finally, this trend threatens to kill some of what is most valuable about the academic experience—to both students and teachers. At the most fundamental level, education happens between individuals—a personal connection, however long or short, between mentor and student.
Whether it's answering a question raised in class, taking 20 minutes to work through a tricky idea during office hours, or spending years of close collaboration in a Ph.D.-mentorship relationship, the human connection matters to both sides. It resonates at levels far deeper than the mere conveyance of information—it teaches us how to be social together and sets role models of what it is to perform in a field, to think rigorously, to be professional, and to be intellectually mature. I am terribly afraid that our efforts to democratize the process will kill the human connection and sterilize one of the most joyful facets of this thousand-year-old institution.
Salaries. It has always been the case that academics are paid less—substantially so—than their comparable colleagues in industry. (That is especially true in highly competitive fields, such as science, technology, engineering, and math as well as various health fields, law, and certain other disciplines.)
Traditionally, universities compensate for the disparity with broad intellectual freedom, a flexible schedule, and the joy of mentoring new generations of students. But all of the trends I have outlined above have cut into those compensations, leaving faculty members underpaid, but with little to show for it. As one of my colleagues remarked when I announced my departure, "We're being paid partly in cool. If you take away the cool parts of the job, you might as well go make more money elsewhere."
Anti-intellectualism, anti-education, and attacks on science and academe. There is a terrifying trend in this country right now of attacking academe, specifically, and free thought and intellectualism, generally. Free thought is painted as subversive, dangerous, elitist, and (strangely) conspiratorial. ("That word. I do not think it means what you think it means.")
Universities are accused of inefficiency and professors of becoming deadwood after tenure or of somehow "subverting the youth." (Socrates's accusers made a similar claim before they poisoned one of the great thinkers of the human race.) Politicians attack science to score points with angry voters, religious fundamentalists, and corporate sponsors. Some elements of those feelings have always floated through the American psyche, but in recent years they have risen to the level of a festering wound in the zeitgeist of the country.
Perhaps those who sling accusations at education have forgotten that the United States helped reshape millennia of social and economic inequity by spreading public education in the 19th century? Or that education has underlain the majority of the things that have made this country great—fields in which we have led the world? Art, music, literature, political philosophy, architecture, engineering, science, mathematics, medicine, and many others? That the largest economy in the world rests on (educated) innovation, and that the most powerful military in human history is enabled by the technological and engineering fruits of the educational system? That the very bones of the United States—the Constitution we claim to hold so dear—were crafted by highly educated political idealists of the Enlightenment, who firmly believed that freedom and a more just society are possible only through the actions of an enlightened and educated population of voters?
Frankly, it's sickening, not to mention dangerous. If the haters, fearers, and political opportunists have their way, they will gut one of the greatest institutions in human history and, in the process, will cut the throat of this country, draining its lifeblood of future creativity. Other countries will be happy to fill the gap, I'm sure.
There are other factors behind my decision, of course. Any life change is too complex to express in a short essay. Those are the major ones, though.
Nor am I necessarily done with academe forever. I'm going to give the industry track a try for a while, but I could well find myself back in higher education in the future. There are certainly many things I still find beautiful and joyful about faculty work. In the interim, I will look for other ways to contribute to society, other ways to help educate the future, and other ways to change the world.