The following is by Paul Dicken, an adjunct lecturer in philosophy at the University of New South Wales, in Australia.
--------------------------------------------------------------------------------------------------------
When a new government came to power in Australia last year, it promised a shake-up of academic funding. In predictable political rhetoric, officials said that taxpayers’ money would no longer be squandered on “wasteful” research, but would instead be channeled into more deserving ventures—meaning, of course, those areas of science and engineering most closely related to supporting industry and the economy.
As an illustration of just how badly resources had been frittered away under the previous administration, officials delivered a public shaming of some of the more egregious offenders in academic research: an investigation into the meaning of “I” in 18th- and 19th-century existentialism; the role of sexuality in Islamic “representations” of reproductive health in Egypt; and a project to understand how public artwork could help people adapt to the challenges of climate change.
I know nothing about these particular efforts, and have no idea as to their academic merit. But I do know of a number of projects that would perfectly fit the bill.
My own background is in philosophy, and I have spent a great deal of time teaching introductory logic. In my previous university, there was an entire department whose fundamental methodological assumptions precisely exemplified the kind of blatant fallacy that would fail one of my freshman courses. I even had one colleague—an eminent professor in his field—whose work I would bring into my class to discuss as an example of poor logical reasoning. I have colleagues in history who despair about entire subdisciplines encrusted around events well documented not to have happened. A friend who works in African studies was told, without a trace of irony, that the genocide in Rwanda was caused by an inability to conceptualize the “Other.”
The simple truth is that the humanities are awash in wasteful research, something that we as a community seem reluctant to discuss. One difficulty is the way in which increased specialization has eroded cross-disciplinary discussion. If you really think that the best way to prevent generations of intertribal violence and political disenfranchisement spilling over into machetes at dawn is for everyone to just read more Hannah Arendt, chances are that you work in a department where everyone else thinks the same.
More to the point, your articles will only ever be peer-reviewed and published by those who share your fundamental assumptions, and your grant applications will be devolved to specialist panels of like-minded colleagues who know your work best. Even the recent trend toward interdisciplinary studies has by now become fully estranged from its ecumenical origins, and crystallized into its own legion of self-contained fiefdoms, complete with their own journals and impenetrable academic jargon.
That is why we in the humanities find ourselves constantly on the defensive, writing agonized opinion pieces about the value of our own research; that is why we are embarrassed at parties when asked what we do, and end up mumbling something about being a “teacher.” That is not to say that those working outside of academe fail to appreciate how good research can require lengthy germination before yielding fruit; nor do they dismiss the value of intellectual inquiry pursued as an end in itself. But there remains a pervasive suspicion that those working in the humanities are somehow running a scam—a suspicion that we as academics have done absolutely nothing to address.
The response from Australian academics to government threats was the familiar litany of hand-wringing and finger-pointing. I went to several well-catered conferences in plush auditoriums, and absorbed a number of PowerPoint presentations on why the speaker’s own field of research was actually far more important than most people realized. There were plentiful asides on Prime Minister Tony Abbott’s obvious anti-intellectualism, and one or two more-troubling comments about how the general public just didn’t appreciate what we were doing. At no point did anyone entertain the suggestion that maybe not all areas of the humanities are equally valuable, or that some measure of internal policing might be in order.
However, as the government introduced its new budget this month, it is the sciences that have come to occupy center stage. The Commonwealth Scientific and Industrial Research Organisation, roughly equivalent to the National Science Foundation, is bracing itself for a 20-percent cut, with additional cuts to other science agencies (though medical research will actually get increased support). These austerity measures have attracted far more public interest than the cuts to the humanities and social sciences. The entire issue of how best to allocate research funds can now be safely dismissed as the malicious mismanagement of a reactionary right-wing government, and any difficult questions that might have been raised about the quality of research projects drowned out by our liberal outrage.
All of which is a shame. I actually quite liked the idea of Mr. Abbott sitting at his desk behind a mountain of funding proposals, a rubber stamp in one hand and a cold one in the other. Not that I think my own research would have met his approval; I can hear him scoffing now. But the whole process might just have motivated us to take a serious look at how we manage ourselves in the humanities, and to think about our own standards of quality control. The government may well not be the best judge of academic research in the humanities—but somebody’s got to do it.