The Chronicle Review

What the Digital Humanities Can’t Do

By Kathryn Conrad September 8, 2014
Illustration by Andrea Levy for The Chronicle Review

A popular Tumblr site for graduate students in the humanities last year was MLAJobs, which satirized postings in the Modern Language Association’s job list. One faux job ad captured one of the many frustrations faced by those in the humanities:

“Digital Humanities: Asst prof in American or British literature, 16th-20th century, with interest in the problematic of digital humanities. We prefer candidates who can tell us, What is digital humanities? Some familiarity with MSWord expected. Send vita and letters to Cornell University, Ithaca NY 14853.”

The problem, of course, is not that no one knows what the “digital humanities” are. Many scholars work in what might be described as digital humanities and have done so since long before the buzz term appeared. And I share their enthusiasm for the many ways in which digital projects can make our objects of study more accessible or open new ways of seeing and understanding them.

The problem is that “digital humanities” now appears to trump plain ol’ “humanities,” particularly among those who hold the purse strings. I fear that what now matters, what legitimizes the humanities in the eyes of many “stakeholders,” is that modifier: digital. Too many of us, beaten down by the relentless insistence on the supposed practicality of STEM degrees—and, thus, in an argumentative leap, the greater value of the STEM fields—are willing to accept this state of affairs. But if we do, we put at risk much more than job lines or funding. We enable the wave of utopianism that the digital pioneer Jaron Lanier has described and criticized in You Are Not a Gadget (Knopf, 2010).

In the minds of digital enthusiasts, the application of new technologies necessarily extends, rather than narrows, our capacities and vision. Perhaps we are all in the process of plugging into a larger meta-consciousness as we connect to one another through the web. Or, at a less heady level, maybe we just need more data to understand our relationship to one another and the world around us. The big-data approach, inspired by corporate data-mining and finding its way into humanities scholarship, presumes the latter. But Lanier questions the idea that “quantity not only turns into quality at some extreme of scale, but also does so according to principles we already understand. … A trope from the early days of computer science comes to mind: garbage in, garbage out.”

The computer doesn’t make meaning. We do. And most of us in the humanities are not sophisticated computer engineers; we require assistance to understand and use the algorithms needed to get the next “level of description,” to use the language of complex-systems analysis. When we pass the buck to programmers, the algorithms, and, in turn, to the models they generate, we cede a major part of the meaning-making process. If we wade into the digital humanities, we need to understand, and continue to question, the digital part as well as the humanities part. We can’t allow interpretation to be hidden in a mathematical black box. We need to remember that these digital methods are based, at least initially, on human inputs.

Even our wisest scholars of digital humanities, while continuing to argue for the importance of humanities scholarship, are at risk of obscuring that point. N. Katherine Hayles, in her article “Cognition Everywhere: The Rise of the Cognitive Nonconscious and the Costs of Consciousness,” speaks of the need for the humanities to recognize “nonconscious cognition”—the interpretive function performed by nonconscious entities and systems—in order to avoid the “isolation of the humanities from the sciences and engineering” and to participate in collaborative intellectual work. Only by acknowledging that “interpretation is … pervasive in natural and built environments,” she argues, can “the humanities … make important contributions to such fields as architecture, electrical and mechanical engineering, computer science, industrial design, and many other fields.”

But many of the forms of nonconscious cognition that Hayles identifies, like stock-trading algorithms and networked smart devices, are part of a human-built environment. They do not spring fully formed from the head of Zeus. What she acknowledges as “the sophisticated methods that the humanities have developed for analyzing different kinds of interpretations and their ecological relationships with each other” need to be at the front end of our development of these forms of “nonconscious cognition,” such as shaping the interpretive assumptions that feed into the algorithms of many digital-humanities projects, rather than meekly following behind them.

The risk I see to the humanities is not in using algorithms or, indeed, in any individual digital-humanities projects. The risk is, to quote Lanier, in “digital reification,” in “lock-in,” which “removes ideas that do not fit into the winning digital representation scheme.” If it doesn’t fit, we ignore it, or change the definitions (of texts, of musical notes, of the humanities, of consciousness) to fit the scheme. We dumb down the object of representation in order to make our design look better; and, in turn, the object of representation changes to match the design.

Not only do texts change when digitally filtered, but we change when we observe and interpret them. As Lanier, Hayles, Sherry Turkle, and others have noted, mainstream cybernetics presumes that the human mind works in the same way that a computer does, and that “information is more essential than matter or energy,” as Hayles puts it. Turkle, in Alone Together: Why We Expect More From Technology and Less From Each Other (Basic Books, 2011), has suggested that we are in a “robotic moment,” allowing our technologies to shape our notion of the human rather than the other way around. Her research points out that our sense of the human is changing, and not necessarily for the better. We should perhaps ask ourselves the humanities questions that are being raised by neuroscientists: What happens to our consciousness when we let ourselves be raised by robots and screens, however benign? How do those mirror neurons fire when presented with animated emojis instead of human faces? Are we increasing human diversity, or are we constraining it?


As Lanier quips, “People degrade themselves in order to make machines seem smart all the time.” We become “restricted in practice,” he says, “to what can be represented in a computer.” Our complex, ambiguous selves are at risk from this digital pressure toward conformity. To accept this state of affairs is a kind of madness, a madness of the sort that G.K. Chesterton described when he pointed out that “a small circle is quite as infinite as a large circle; but, though it is quite as infinite, it is not so large. In the same way the insane explanation is quite as complete as the sane one, but it is not so large.”

The humanities, like humans ourselves, are large. But we reduce ourselves more and more often to our tools, describing ourselves as computational systems, soft machines, wired meat. Those are only metaphors, but metaphors that profoundly shape us. In How We Became Posthuman (University of Chicago Press, 1999), Hayles challenged us to uncover the “profound normative and moral implications” of our technologies and examine how those technologies affect how we think and live.

The answer isn’t simply for humanities scholars to learn how to program or to work with programmers. The answer is also for programmers to heed the humanities in thinking through the implications of their decisions, to make those decisions visible, and for all of us to recognize that digital technologies are only some of the tools at our disposal. We needn’t be limited to them—or by them. The humanities aren’t “extra,” nor can they be subsumed into a more scientific or technological worldview. The “soft” perspective of the humanities, just like the human itself, cannot be adequately represented or processed by the digital. It is no failure to say so, any more than it is the saxophone’s failure that its music cannot be satisfactorily represented by a limited and limiting computerized version of an instrumental “note,” to use one of Lanier’s examples.

As we look to the future, then, humanities scholars need to think about how best to make use of our technologies without trying to emulate them. Digital technologies have excellent applications but aren’t a good fit for all projects, nor are the projects that do fit necessarily better than the ones that don’t. We must resist the temptation to jam our square pegs into round holes and to treat the products of digital-humanities scholarship as more valuable simply because they are digital.


We must also recognize, as Hayles has pointed out, that humanities scholars have leaned heavily on human exceptionalism as the basis for our approach, and that digital scholarship might mitigate that sense of species self-centeredness. But if that sense of exceptionalism falls away as we continue to explore our place in complex world systems, that should be neither the end of the human nor the end of the humanities. We needn’t fall into misanthropic despair, like some gaggle of Gullivers. Rather, a new humility with respect to our human limitations can help us to develop not only a wiser humanities but also wiser science and wiser technology, built on the realization that what we are capable of knowing, even with the help of our tools, is not all there is. That wiser scholarship might help us to understand our failures as well as our successes, to recognize our limits and ill-founded assumptions.

Read that as failure if you must. I prefer to think of it as an opportunity for wonder. And wonder, as Plato’s Socrates suggests, is the beginning of wisdom.
