At a faculty lunch several years ago, I was introduced to a suave young man who described his major research interest as “French Marxism.” The claim — framed by his waxed mustache and beret — suggested that the adjective “French” here functioned analogously to the way it might in phrases like “luxury watch” or “craft beer,” rather than, say, “sentential logic” or “quantum physics.” (And we’re obviously talking about French merlot with French fouée, not French fries with French dressing.)
It’s hard to countenance now, but at the time he was widely thought to be impossibly cool. He had a European motorbike, a desultory manner, and hair that violated physics. He was the kind of person who had become tired of a research area or theorist at the exact moment you’d discovered it; and he would unfailingly point you in the direction of thinkers whose names you neither knew nor could spell. He was so far ahead of the theoretical curve that one might have been forgiven for thinking that he created it — extruding it from the exhaust of his Vespa as he made his way around campus.
It’s unlikely, of course, that this is how he’d like to be remembered. Those of us who work in higher education consider ourselves above anything as ostensibly “cheap” and trivial as the whims of fashion. Our labor, including our research and contributions to university governance, is a serious endeavor marked by painfully obvious similarities to, say, those solemn 13th-century monks grinding out transcriptions of Aristotle’s Poetics at the University of Paris. Unlike the latest industry “disrupted” by M.B.A.-bait airport bestsellers — the endlessly merging and acquiring tech companies or radically evolving media conglomerates — our sector deliberates over suitably weighty affairs with numerous nods to venerable tradition. Behold our integrity to those who doubt it: We hath Latin mottos, Greek fraternities, and convocations that resembleth wizard conventions. (Not that we aren’t amenable to change: The maces carried by presidents and chancellors, for instance, are now purely symbolic; we have opted for other, more effective weapons, like restructures.) But by and large, we believe ourselves to be beyond the ephemeral. As every freshman course in “critical thinking” reminds us, the dull, unhappy burden of the rational mind is to follow the evidence where it leads, not the bandwagon.
And yet not. While we do understand this as an ideal, most of us know — at least during broken sleep or after the fourth beer — that ideals are unreliable witnesses. In fact, it may well be the university’s self-serious insistence on being above the whims of fashion that makes it so vulnerable to it. Like anti-vaxxers, we become all the more susceptible to something precisely because we think we’re not.
And we are not just talking here of our “French Marxist.” Our gowns and hats may remain reasonably consistent, but on the academic side of things, fashionableness reaches into laboratories, course outlines, and practices of library acquisition. We all know this, regardless of the discipline: Certain paradigms, theorists, and problems are in, and some are out, and failing to notice this and follow it can be a fast track to professional irrelevance. Science and philosophy are particularly brutal in this respect: Even at their most charitable, these are disciplines sometimes apt to narrate their own histories — even very recent histories — as stories of venerable idiots doing the best with what they had.
I have spoken to mathematicians, biologists, psychiatrists, and economists, and — when they are sure nobody else is listening — they confess that this is often the case in their disciplines: The movements of their field are like the movements of a mob, perhaps just as unpredictable over the long term and equally difficult to resist. And like mobs, fashions are invariably neither hard to detect nor particularly complex. For instance, one cool move in the humanities a few years ago was to accuse an interlocutor of reductionism (along with naïveté, one of the mortal sins of sophisticated thought) by a simple but devastating act of pluralization — so where they said “discourse,” you would object “discourses!”; where they said “feminism,” you said “feminisms!”; where they said “knowledge,” you said “knowledges!”
One’s arguments, too, only stood via a kind of postmodern patronage. You sent out a theorist, or an army of theorists, to do your bidding and/or intimidate the enemy. I recall graduate conferences where the substance of certain heated debates consisted of little more than a furious exchange of proper nouns — something like this:
Grad Student A: “You say early Derrida, but also late Derrida and Badiou.”
Grad Student B: “Derrida and Badiou, late and early, but mostly early Derrida.”
Grad Student C: [Interjection] “Arendt! Arendt?”
Grad Student A: “Arendt? Arendt! But not so Arendt. Perhaps Foucault. Perhaps more Foucault and Butler.”
Grad Student C: “Foucault?!? No — Arendt is already Foucault, was always already Foucault.”
[Pause.]
Aged Faculty Member: “Gloria Steinem?”
[Silence. Few know who Steinem is; those who do know say nothing. The aged faculty member will be lunching alone today.]
(Apart from technology, fashion is one of the means by which young people reserve the right to torment the old.)
Your choice of theorist was to be German, French, or Italian, not Spanish, Iranian, or Turkish. (Spanish, Turkish, or Iranian novels were great, though. If you wanted to stay with English you needed to look at either Dallas, soft porn, or the oeuvre of Roger Hargreaves.) You needed to be familiar enough with the language of your favored theorists to be able to say “world-historical import,” “discursive formation,” and “being-toward-death,” but incapable of “My name is Simone,” “I’d like a cheese sandwich,” or “Which way to the Louvre?”
But to say that “fashion” influences us might seem to offer us little — even if true, it’s not particularly helpful. Maybe we can be clearer by saying that academics need to balance two opposing imperatives: the implicit demand to follow a herd and the requirement to appear trailblazing. Like all moderns, we disdain slavish imitation at the same time as desiring the security of the crowd. Fashion exists, if nothing else, to allow for precisely that possibility; it permits us to speak out of both sides of that consummately modern mouth.
In this context, one version of a good article — one that has a good chance of getting published — is one that implicitly spouts an orthodoxy at the same time as screaming about something minor. You agree, for instance, with everything Foucault says, except for the fact that he continually ignores Brazil, or the periodic table, or your supervisor’s criminally unsung trilogy. It’s a sure-fire formula by which much of the paper writes itself. All disciplines are, to a greater or lesser extent, faddish, even if any particular fad is later shown to be inadequate or myopic, or perhaps — as my undergraduate students might put it — just really lame.
This is not to say that the way fashion operates within the university is identical to any form outside of it. Unlike the fickle — and, from the outside, reassuringly absurd — shifts seen on the catwalk, fashion inside the university involves more than a change in aesthetic allegiance: It invariably invokes images of rationality and progress. (Of course, the mere fact that rationality and progress are invoked doesn’t mean they manifest themselves, any more than invoking a dead aunt will result in her attending Thanksgiving.)
And this isn’t just the case in the humanities and social sciences. Even in the sciences, big discoveries can go almost undetected, and insignificant ones can get enormous traction. The Harvard biologist Ernst Mayr dedicated an anxious editorial in a 1963 issue of Science to the subject, which he saw then as a growing problem. From 1903 to 1906, around 120 scientists published several hundred articles on N-rays, a newly discovered form of radiation with a number of surprising properties, the most surprising of which was that it didn’t actually exist. Like exobiology, or like those companies in the tech boom that attracted enormous investment without actually producing anything at all, N-ray studies became an extraordinarily exciting field without a research object.
Mayr lamented the tyranny of fashion in scientific research and the epistemological libido for designating classical areas “passé.” While, he said, the pursuit of new discoveries is often very productive, one problem is that the older areas which are abandoned for the new are almost never exhausted when this happens: “When talent is diverted from [these older areas], science suffers an irreparable loss of know-how in the form of specialized information and methodology.” Mayr’s observation draws attention to the fact that fashion is never merely additive; it demands we abandon certain things, and those abandonments can be, at the very least, premature.
If anything, the hold of fads and fashion in knowledge has increased since Mayr’s day. In the contemporary academy, there is enormous, ongoing pressure to produce research quickly, to churn out publications and win grants at a rate set by an institution’s — and a government’s — prestige-boosting or pecuniary objectives. Fashion fits most neatly into a “publish or perish” world. As researchers like Eric Abrahamson point out, the demand for industrial rates of research output undoubtedly encourages researchers to choose short-term projects with fast turnarounds, much as governments produce policies in line with election cycles; you maximize gross gains and minimize risk. We’re now obliged to innovate according to a strict schedule. (Indeed, some journals actually advertise their review-and-publication turnaround time as one of their features.) Fashion allows us to lower risk and then distribute it across a discipline. That doesn’t mean the theses we’re pursuing are necessarily worth pursuing. But if we’re going to charge full-pace at a dead end, we might as well all do it together, right?
And any discussion of charging full-pace at a dead end leads us naturally and necessarily to university management. Just as professors are culpable for fashionable wrong turns, administrators have their own rich history of chasing trendy, meaningless causes. Who can forget Stanford University’s fling with MOOCs, the brief rise of the micro-Master’s degree, or the University of Texas at Austin’s infamous “Project 2021”? The susceptibility to fashionableness is revealed by a single oft-heard campus word: “innovation.” It’s a word we need, at the very least, to be wary of; it may one day be proved that even uttering it shuts down the parts of the brain responsible for impulse control and rational deliberation.
The valorization of “innovation” is not a mere fashion in the sense of a voluntary turning of an intellectual tide; it is a structural necessity of the modern academy. With every second person at the university now a “manager” of some sort, the issue is always how managers are to spend their long days. (Academics are fast becoming a minority in universities: The number of administrators has increased by more than 100 percent in the last 30 years, vastly outpacing the growth in the number of either students or faculty.) And they are employed to do a job; managers gonna manage, after all. What else to do but change stuff? No university manager ever got a good performance review for telling a supervisor, “Well, I carried out an evaluation of the structure of the degree, and it seems to be working really well; we don’t need to change anything at this stage.” Regardless of whether they’re correct, that person’s career is going precisely nowhere.
This isn’t, of course, to defend some pure ideal of a university, which never existed in any case; nor is the snark above directed at all forms of innovation — but we need innovation of the right kind. Only in the modern world has change been seen as a good in itself. (“Innovation,” we should remind ourselves, was used mostly as a pejorative until around the 18th century.)
In some senses all of this — the evanescence of both managerial and theoretical fashions — undoubtedly represents the alignment of the university sector with a far broader and more established imperative of contemporary capitalist economies: planned obsolescence. The origins of the idea go back at least to the real-estate broker Bernard London’s 1932 pamphlet “Ending the Depression Through Planned Obsolescence.” The heart of London’s suggestion was to have government impose legal obsolescence on commodities after they’d been on the market for a set period of time, in order to stimulate consumption. As practiced in the automotive industry, planned obsolescence accepted that cosmetic redesign was always going to be easier — and more salable — than genuine redesign. Unlike the commodities over which it had dominion, London’s idea itself has proved remarkably durable, even when it travels under different names.
A marketing company in Sydney recently persuaded one of my alma maters, the University of New South Wales, to adopt as its motto “Never Stand Still.” Nothing to this point has captured the absurdity and directionlessness of ceaseless change in the academy quite like this slogan — that is, nothing has ever been quite so embarrassingly honest. “Never Stand Still” could easily be the tagline of Sisyphus, the unfortunate mythic figure sentenced to unending and pointless labor. (“University of New South Wales: Keep Rolling That Boulder” might also have worked.)
At one level then, fashions in “innovation” are less threatening than they might otherwise be, in the sense that the changes they bring are rarely particularly deep. And in any case, given the unrestrained yearning for restructures in the sector, everything will be different in three years anyway. Anyone who has served as faculty long enough realizes that what we call innovation is often just “retro” without the ironic self-awareness, or perhaps any self-awareness at all: new letterheads, breathless memos to staff, and bullet-pointed series of Core Values, Strategic Objectives, and Key Initiatives.
Trends in fashion need to be simple: They must contain buzzwords, slogans, and bullet lists; if nothing else they need to be widely communicable. (As the sociologist Diana Crane pointed out many years ago, the spread of an idea is limited by the cognitive reorganization required to take it on.) Consultants get brought in and — after long analysis and phrases like “world class,” “strategic manifestations,” “desired positioning,” “brand architecture,” and “executive alignment” — we are given the emperor’s new restructure.
Fashion represents a clear and present danger in academia, but it is not all bad news all the time. There can be good, productive fashions. And the very communicability of an academic fashion means that it might also stand to be communicated outside the academy.
We are frequently told that today’s world is one of fragmentation — that the great promises of the internet and other forms of communication have not been realized; that the rate of the expansion of knowledge means that your colleague in the office or lab next door may be incapable not only of understanding your research field but even of pronouncing it. Fashion qua fashion may be oblivious to reason, but it is not oblivious to communities that take themselves to be founded upon it. Besides this, there is something to be said for any force that encourages us to train our minds on similar ideas, issues, and problems. Fashion specifies areas in which collective intelligence can be pitted against common problems, opposing the centrifugal pull of increasing specialization.
But there is a darker side to all of this. We should adopt an antitrust-like wariness of any single idea becoming a monopoly. Fashions shouldn’t shun genuine difference; we should recall that many of the greatest discoveries have come not from networking and “connectivity” but from isolation. More dangerously, just as in theory, fashions in management allow institutions to screen themselves from their own behavior by always being able to create a regressive past against which they can contrast themselves: “Yes, we see those problems, but that was then and this is now. We have a completely new approach, as you can see by virtue of these truly excellent new buzzwords.”
And fashions often function not only to hide a lack of change but also to distract us from slower, longer-lasting changes. One of these is the creeping progress of the notion that education is justified only by its vocational “usefulness.” Another is the psychological undermining of all opposition to change, whereby every resistance to institutional transformation can be framed in terms of an incapacity for individual adaptation. And so principled objection to institutional reform is now often cast in terms of personal “anxiety” or one’s being “risk averse.”
“Change managers,” though, conveniently tend to outsource their risk in the course of their management. The changes (very often called “tough decisions”) they implement unsurprisingly don’t tend to call for fewer change managers. Maybe that fashion, though, will eventually be seen for what it is — and there are any number of people working in the university who would welcome such a genuinely radical innovation.