At the age of 80, the composer Ned Rorem had a complaint: “I’ve read everything.” He could probably lay better claim than most to this alarming condition. Rorem is 98 now; one hopes he has found something new to read in the meantime. Those who haven’t read everything might turn to Emily Temple’s 2017 Literary Hub article “How Many Books Will You Read Before You Die?” which estimates your remaining book consumption based on age, sex, and reading speed. For instance, a 35-year-old male can expect to live to 82, in which case he will read 564 more books if he is an “average reader,” 2,350 if he is a “voracious reader,” and 3,670 if he is a “super reader.”
Once you’ve calculated your bibliophilic memento mori, you’ll need to figure out what to read next. We’ve asked nine scholars from across disciplines to select the “best” scholarly book published last year. Our definition of “best” was expansive: It could mean “most excellent,” of course, but also most provocative, most interesting, most creative, most influential, most delightfully insane. Here’s what they told us.
Eliot, the Anglophone 20th century’s most eminent literary intellectual, was always both vatic and uncontainable. Over the past decades, he has become an emblem of that white, male, conservative tradition rejected by the emancipated, less Eurocentric humanities we have — or hope to have — today.
But the current literary humanities are also, at least on their margins, committed to the search for difference and otherness, to breaking conventions and boundaries. That, today, is what Eliot can provide. He tests our complacencies and pieties as directly as anyone I can think of.
For instance, in his later work Eliot argued that no purely secular culture could survive. You need to have religion behind the humanities if they are to retain coherence and purpose. It’s an easy thought to dismiss, of course, and it may well be wrong.
Currently the old European humanities — the humanities that reach back at least to Cicero — are both secular and dissolving. Is their dissolution completely independent of their secularity? Perhaps not. That, for me, is an exciting and fundamental question.
Another instance: Eliot came to argue that the best society was a hierarchical one in which social status was inherited. I find this idea genuinely shocking, as I think most of us do. It is difficult to imagine anyone arguing for it, even in the 1940s and 1950s.
Haven’t we heard recently that “meritocracy” is failing us? Haven’t we also heard the rise of populist conservatism attributed to the chasm between a moneyed or credentialed elite and everybody else?
Eliot’s response would have been: A society in which status is independent of merit, and which shares a “whole way of life” (to use his own phrase) across its various divisions and hierarchies, avoids these problems. In such a society, politics’ reach is limited.
Eliot is making arguments most of us no longer want to hear. They are utopian and probably just wrong. But to keep them alive as a horizon for our thinking is not only salutary but exciting.
Simon During is a professor of English at the University of Melbourne.
A Relational View of Science
So it was with great joy that I discovered and consumed Carlo Rovelli’s latest offering, Helgoland: Making Sense of the Quantum Revolution (Riverhead Books). A bit less poetic but no less elegant than his bestseller Seven Brief Lessons on Physics, Helgoland hooked me so hard I read the entire book in one sitting. And then twice more.
Rovelli offers a tour through the shaky and convoluted terrain of quantum phenomena, which he argues are fundamentally “relational.” Nature, he writes, is not a collection of solitary, permanent objects and other substances; rather, Nature is fundamentally made of relations between quantities. A quantity conveys information (has meaning) only in relation to another quantity, never by itself.
Helgoland delivers its first gut punch early on, when Rovelli frames a fundamental question for quantum physics: How can we understand the quantity of energy that we observe as “an electron, its movement and other properties”? Strangely, its location can be predicted only for the moments we observe it. He writes, to my immense gratitude, “What does Nature care whether there is anyone to observe or not?” This paradox of quantum superposition (which also underlies Schrödinger’s cat) is resolved, we learn, by the realization that “an observer” need not be a human observer. The observer can also be another quantity. An electron’s position or velocity exists only relative to some other quantity of energy. When an electron is not interacting with anything, it has no physical properties. Rovelli explains other quantum mysteries via the same scientific punchline: A quantity has no inherent meaning — it is only meaningful in relation to something else.
A second revelation in the book was even more personal. My research program as a neuroscientist and psychological scientist — constructionism — is based on the idea that physical signals, like the raising of an eyebrow, a rise in blood pressure, or an increase of cortisol, have no inherent psychological meaning. These signals (a.k.a. quantities) are meaningful only in relation to one another and to other signals, including the electrical and chemical signals in a brain. Your brain doesn’t observe and detect — it constructs thoughts, feelings, perceptions, and other mental events as it makes meaning from sense data. This meaning-making process is a whole-brain event; it is not localized to particular neurons. The meaning of any firing neuron is always in relation to other physical signals, including other neural firing. Constructionism is fundamentally a relational perspective, and so, ironically, I found a kindred scientific spirit in a book about a topic that I have struggled my entire adult life to understand.
Physics, writes Rovelli, “is always a first-person description of reality,” not a “description of things in the third person.” Later, he continues, “There is no way of seeing reality that is not dependent on a perspective — no point of view that is absolute and universal.” The same is true for every other science. And, of course, for book choices.
Lisa Feldman Barrett is a professor of psychology at Northeastern University and the author of How Emotions Are Made: The Secret Life of the Brain and Seven and a Half Lessons About the Brain.
Biography as Dantesque Ascension
In Burning Man: The Trials of D.H. Lawrence (Farrar, Straus & Giroux), Frances Wilson turns to D.H. Lawrence and discovers an unexpected vein of autofiction, almost religious in its intensity. Lawrence did more than blur the boundary between life and literature. Virtually everything he wrote, Wilson claims, was a confession of spiritual desolation and a labor of personal ascent. He lived his life on the model of Dante’s Divine Comedy — journeying from the Inferno (and coal mines) of England to the Purgatory of Italy to the Paradise of New Mexico — and wrote accordingly. With each move, he rose to higher ground, seeking not only relief from the tuberculosis he had suffered since the age of 16 but also a shedding of self and overcoming of soul.
Some critics have balked at Wilson’s use of The Divine Comedy as a structuring device, but they miss its Keatsian power, how it captures Lawrence’s experiments in life and art as the single pilgrimage of a suffering penitent. Lawrence believed that “one has to be so terribly religious to be an artist.” All of his writing is a testament to what it was like to be D.H. Lawrence, an experience he likened to “my dear Saint Lawrence on his gridiron, when he said ‘Turn me over, brothers, I am done enough on this side.’”
The Divine Comedy, Wilson writes, is “a first-rate travel book,” and Lawrence expressed some of his greatest genius in his travel writing. More than a description of place and depiction of its mood, the best travel writing documents the voyage within, how one’s self is altered and elevated by changes in the land without. Lawrence, Wilson notes, “perfected the art of interior and exterior movement; he discovered in geographical extremes” — whether at Taos or Sicily — “a way to symbolize his own poles of being.”
Wilson says that Lawrence was “a catcher of quirks” because “the right quirks can take us straight to the mystery of character.” Wilson is also a catcher of quirks, and like Lawrence, she knows how to use them.
Corey Robin is a professor of political science at Brooklyn College. He is the author of The Enigma of Clarence Thomas, The Reactionary Mind: Conservatism From Edmund Burke to Donald Trump, and Fear: The History of a Political Idea.
You might think, as many philosophers have, that the key issues here are about equality of resources over a lifetime — what Juliana Bidadanure calls the “complete lives” view. Younger people will typically have less capital than older ones, for example, because older ones have had more time for accumulation. It’s also true, of course, that today’s students leave college with more debt than earlier generations. They have a harder time finding secure employment. These are significant kinds of intergenerational inequality. And indeed, Bidadanure explores policies aimed at undoing that injustice.
But Justice Across Ages also explores crucial issues of generational inequality that aren’t visible from this lifetime perspective. If young and old are to treat each other as equals, for example, we must ensure equal political participation not just over the lifespan but across all adult age cohorts. We shouldn’t segregate the old in separate social worlds. Legislatures shouldn’t be composed mostly of older people. Bidadanure, who runs the Stanford Basic Income Lab, examines current policy debates — about youth unemployment and job insecurity, access to college, job guarantees for the young, and basic income itself — in the light of the intersecting concerns for equality over our lifetimes and equality between age cohorts in the present. In this lively, clarifying book, she shows how philosophical thinking, informed by a profound grasp of today’s economic realities, can define a vision of a social world that takes seriously an ideal of intergenerational equality.
Kwame Anthony Appiah is a professor of philosophy and law at New York University.
The Living Earth
Amitav Ghosh diagnoses an outlook that continues to dominate our present, shaping how the powerful respond to our collective crisis. But he also guides us to sources for recovery from the enormous weight of this past: in the richness and resilience of Indigenous peoples’ ways of being in and understanding the nonhuman world, and in the possibility of new kinds of storytelling that give life to all beings on an earth that is itself understood as alive. Such storytelling, Ghosh says, would allow us to see nutmegs and opium and fossil fuels not as inert commodities but as possessed of history-making powers of their own.
Throughout, Ghosh brings to bear his prodigious skills as both a novelist and an anthropologist, while incorporating insights from an astonishing array of other disciplines — literary criticism, environmental science, botany, history, economics, and more — the kind of omnihumanism necessary to confronting an omnicidal vision. This capacious approach is a model of the cooperative mindset that Ghosh urges us to recover.
Set against the backdrop of the current migrant crisis, pandemic, and Black Lives Matter movement — all deeply entangled with the climate crisis whose history Ghosh traces — The Nutmeg’s Curse is scholarly but also poetic and personal. It teems with life, myth, and haunting trans-species encounters, but also stories from Ghosh’s own life: We follow him rushing, always, toward the crowd, toward the storm, carried forth by his familiar curiosity, humility, wonder, and empathy.
For all its beauty, the book’s aim is practical: to show us the irrepressible human capacity to regenerate modes of cooperation with one another and the world we inhabit. Reminding us of the cultural forces that colluded in disseminating a toxic ideology of morbid individualism that is far from “natural” to the human personality, Ghosh helps us understand that solving the climate crisis depends in the first order on writers and artists shifting human consciousness so that we again recognize our mutual dependence on one another and all beings. The Nutmeg’s Curse calls on all of us to widen our sense of what it is to be human and to commune with life and the land — and to recognize the force of the omnicidal outlook we are up against, and the sources of its perpetuation.
Priya Satia is a professor of history at Stanford.
Thinking, Not Fighting
I’m hardly the worst offender. I was once talking with a professor about some subtlety of linguistic theory, having what I thought was a friendly conversation. When I told him he had made a good point, he smiled, leaned back, and crowed, “Looks like I win.” This style of discourse is at its worst on social media, where the rule is fawning adoration of the in-group and savage ridicule of the out-group.
Julia Galef appreciates the allure of this soldier mindset: “We use motivated reasoning not because we don’t know any better, but because we’re trying to protect things that are vitally important to us — our ability to feel good about our lives and ourselves, our motivation to try hard things and stick with them, our ability to look good and persuade, and our acceptance in our communities.” But she argues we can do better. We can adopt a scout mindset. We can work to see the world as it is, not how we want it to be.
Galef does several things in this book. She makes the case for the scout mindset, arguing that it’s not just better for society, it’s better for individuals — it makes us more intellectually nimble, more self-aware, better able to rationally plan our futures. It can even make us more attractive social partners. Galef gives concrete advice on how to better think about risk and uncertainty, and how to escape our echo chambers. She ends by advising us to “hold our identity lightly,” trying not to let the groups we identify with have too much sway on our sense of self.
Some of these “You Are Doing Things Wrong” books can be rough reading, but not this one. Galef tells some great stories, she is open to objections and qualifications, and she is honest and often funny about her own tendencies toward motivated reasoning. And she holds her own identity lightly, just as one would expect from a good scout.
Paul Bloom is a professor of psychology at the University of Toronto.
Christy Thornton offers a master class in how to write intellectual history. Her rigorous research undermines stereotypes and misconceptions about Latin American countries’ role in global economic governance. It is both enlightening and satisfying to read her account of how Mexican diplomats countered the dismissive and often racist opinions of their American and European counterparts, while working alongside (and sparring against) other thinkers of the so-called Third World.
I taught the book to a class of students in an evening master’s program in international affairs. Thornton visited to discuss the book with us. My students had never thought of countries like Mexico as important players in the international arena — they found Revolution in Development the most illuminating reading of the semester. It will be a book of record for years to come.
Ignacio M. Sánchez Prado is a professor of Spanish and Latin American studies at Washington University in St. Louis.
A Tower of Song
Take the state of Robert Johnson. The formal peg for the chapter “Revisionist Bluesology and Tangled Intellectual History” is Elijah Wald’s 2004 Escaping the Delta: Robert Johnson and the Invention of the Blues. That was an intentionally controversial book that took on, and meant to take down, almost all previous writers who had addressed themselves to the music and the mystery of the 1930s blues singer. But Eric Weisbard’s subject is less any particular book or commentary than the concept of a particular musician’s work as a field for inexhaustible argument and treasure-hunting. In an uncharacteristically long four pages, he deploys Wald’s book to bring in more than 40 writers — from John Hammond in New Masses in 1937 to Kimberly Mack’s Fictional Blues, which, carrying a publication date of December 18, 2020, is really a book from the same year as Weisbard’s own. The result is revelatory. I know something about Robert Johnson, but I hadn’t read a third of what Weisbard brings in.
Embarrassment might be a common response to the book: There is so much here that the most sophisticated readers might find their first response a sense of shock over their own ignorance. But the prose is so alive to its subjects (“the death projected onto him long before his stardom was harvested” refers to Elvis Presley, but it could fix a hundred names here) that soon enough a sense of pleasure will take over. Songbooks is a great reference book, but before and after that it’s a funhouse.
Greil Marcus is a music critic.
The Crisis Concept
Wellmon and Reitter’s thesis is that while the crisis of the humanities feels like a novel threat born of recent changes, it is, in fact, as old as the modern humanities themselves. We modern humanists have always understood ourselves as in crisis, because we have needed to make the case for the humanities as a distinctive discourse and practice as a response to the forces of modernity — industrialization and capitalism, on the one hand, and technological advances and the rise of the natural sciences on the other — especially as those forces have manifested themselves within the modern research university.
In the book’s careful genealogy, the modern humanities are not seen as continuous with the traditional liberal arts or the Renaissance studia humanitatis. Instead, their roots are the modern research university as developed in 19th-century Germany. If we want to understand what passes for the humanities in our universities today, Wellmon and Reitter argue, we must know this history. (I confess I did not know much of it.) Such self-knowledge might allow us to defend ourselves without the stain of nostalgia, and to develop a more deeply informed sense of what ought to be restored in our disciplines and why.
Wellmon and Reitter say that crisis talk ought to be abandoned because it has led the humanities to overpromise what they can and should do for students and society. The humanities should not teach values, or pontificate about the meaning of life, or engage in the moral or political formation of students. Professors are scholars (i.e., disciplinary experts), not prophets or priests. On this point, Max Weber is their hero. If only we would embrace the Weberian ideal articulated in “Scholarship as Vocation,” we would help our students take responsibility for the freedom they have to choose their own values and ultimate commitments.
As valuable as this history is, the uncomfortable fact remains that the humanities are failing to thrive. Even in the best colleges, our condition is moribund (consider, for example, that less than 8 percent of incoming freshmen at Harvard College intend to major in the humanities). If this isn’t a crisis, I do not know what is.
The answer to this crisis may not be to better defend the modern humanities against the economic and ideological pressures working against them, but to rethink university education on a much larger scale. I am not at all convinced that doubling down on disciplinary expertise is the way forward for the humanities. The fact is that the humanities are thriving in those institutions that do not shy away from the higher aspirations of cultivating wisdom and the pursuit of truth and the good in their students. Perhaps these schools are not offering the “modern humanities” at all. But what seems to be a stubborn empirical fact is this: Young people are not drawn to literature, philosophy, and art in the mode of scholarly expertise.
This book has convinced me that “the modern humanities” are tied to the institutional structures of the modern research university and that those structures are deeply questionable as a model of education. Perhaps, then, we should be thinking not about how to reconceive the “modern humanities” outside the discourse of crisis, but about what might replace them in light of a larger re-evaluation of what a university education is for.
Jennifer A. Frey is an associate professor of philosophy at the University of South Carolina.