The following is a guest post by Thomas Docherty, a professor of English and comparative literature at the University of Warwick.
Often after someone dies who had significant influence over our lives, there is an argument over his or her legacy. In the case of Margaret Thatcher, rarely has the debate been more divisive—and higher education is not immune. Among university colleagues there is argument not just about what her legacy actually is, but about whether professors need to do more to actively reject its influence, which continues to help guide education policy today.
Admirers will point to the obvious physical manifestation of her legacy: the University of Buckingham, Britain’s first private university, which Thatcher formally opened in 1976 and where she served as chancellor from 1992 to 1998. Buckingham’s current vice chancellor, Terence Kealey, praises Thatcher for introducing the sector “to greater accountability and to market forces.”
Indeed, higher education was where, as prime minister, Thatcher made an early push toward privatization. In 1981, two years into her premiership, Thatcher cut government funds for universities by nearly 20 percent. Thus began that journey, once described by the French philosopher Paul Valéry as a crisis of the spirit in which Knowledge, reduced to a market commodity, becomes subservient to Commerce. For some, this is a positive aspect of the legacy; but for many in higher education, this is precisely the inheritance that is to be rejected.
The commercialization of the sector has been much extended since; and, at the time of Thatcher’s death, it is probably the key area of contention in debates about the place and idea of the university in our times, in Britain and beyond. In England, it has led to higher costs for students and to a push to value academic research almost entirely by its economic benefit.
The Thatcher “revolution”—that assertion of the primacy of market solutions for every problem, accompanied by a preference for private gains over public goods—depended upon a weakening of popular confidence and trust in the public sphere and its servants. Indeed, one part of the legacy is an enduring suspicion of anything that is proposed as a public good, a suspicion encouraged by a culture that, grounded in generalized resentment against perceived “privilege,” belittles intellectual activity as worthless, unless it serves private economic gain.
Consequently, anything that is not connected to personal profit is suspect. If the university is to survive, it has to justify its existence in primarily—even solely—narrow financial and economic terms as the generator of private gains for individuals. Other qualities, like dissent, critique, thinking, and the spirit of pure exploration, have now become unnatural, an abnormal deviation from a somehow “naturalized” norm of the pursuit of individual wealth. The university, as a site for dissent and for critical thinking, has had to struggle ever since.
What she bequeaths is a legacy of social distrust, especially of anything that is supported through general taxation, a distrust of anything that is commonly shared, for, as she famously said in an interview in Woman’s Own magazine in 1987, “there is no such thing as society. There are individual men and women, and there are families.”
Here, though, is a paradox. In this condition, universities are to be placed under more or less constant surveillance, and the surveillance is to be carried out in the name of serving the individual and taxpayer. However, in practice, this has meant that the sector has had to serve the ideological preferences of successive governments; and since those governments have themselves been in thrall to a world of commerce, the university sector has been put in the hands of that world.
With privatization, Thatcherism reduced freedom to freedom of choice, thus making the ethical, juridical, and political demands for freedom nothing more than a consumerist, market-driven idea. She denied the existence of society, reducing it instead to an atomized aggregate of individuals, concerned only for themselves and their families. This is obviously bad news for the social sciences; but it is equally bad for laboratory sciences concerned with environmental issues, or for arts that look to bring diverse individuals into cultural conversations of various kinds. It is also bad for anything that speaks of collaboration rather than competition, construed in crude quasi-Darwinist terms as a struggle for survival and primacy. The legacy here is that if you are poor, it is because you deserve to be so; if your university or discipline is not popular with students, then it is your own fault. It may explain why Buckingham’s disciplinary spread is rather narrower than that of most universities. The legacy is a narrowing of thought itself.
Perhaps, in the end, we need the argument about the legacy: Dialogue and debate might help rehabilitate the primary purposes of the sector. In that case, although in life she was utterly divisive, her passing might allow for a better future.