If you peeked into my study a few months ago, you would have seen me facing a desk full of books and a blank computer screen. I was working on my newest project -- a social and cultural history of obsession. I had been lucky enough to land a Guggenheim fellowship and a year’s research leave from my university to pursue the project. So where was I? The topic was exciting, and some publishers had expressed interest in it. I had done a fair amount of research, gone to archives in the United States and abroad, bought a pile of books, and borrowed even more from the library. I’d read a lot, thought a great deal, taken a lot of notes. I looked like an academic hard at work.
But I wasn’t, quite. I was lacking something that seems to have nothing to do with assiduousness -- creative inspiration. I was stuck. You know the feeling: the overwhelming desire to avoid one’s desk; the shortened attention span that leaps upon any excuse to flee to another task; the muted sense of panic that undermines any attempt to find a way out of an intellectual dead end. I realized that when novelists and artists appear on talk shows, they usually are asked about their creative processes, but when academics bring out a new book, they are expected to talk only about the content of their work, not how they arrived at it. But creativity is just as important in academic writing as in any other -- ideas and notes are just the mud in the mortar.
The truth is that every rational work of scholarship is based on fairly irrational moments of inspiration. Or, as Einstein once said, “Imagination is more important than knowledge.” You can amass all the bits of information you like, but it takes a bolt of intellectual lightning to make scholarship come to life. I remember being in the intellectual doldrums when I was writing my book Enforcing Normalcy. I was trying to describe the way deafness was perceived in the 18th century. My research was interesting, but it seemed to amount to nothing more than a long list of other scholars’ insights. A friend came over for dinner and happened to ask about my parents’ deafness, and how my mother had become deaf. As I told the story of her childhood meningitis, it hit me that the interest in and influence of deafness in Europe had exploded in the 18th century. I previously had been headed in the predictable direction of thinking that because the number of schools for the deaf had increased, Europe had taken more of an interest in deafness, but now I realized that Europe had done more -- it had, in a sense, become culturally deaf through a new emphasis on nonverbal ways of communicating, most notably writing over speaking. I had found my thesis and was able to steam ahead with my work.
In a way, the need for that imaginative leap is what makes academic writing so difficult. Novelists, although they may do research, can just record what they imagine (granted, good novelists structure and rework their material meticulously). Scholars, on the other hand, must examine, categorize, and organize the world as it already exists. And then, although they may not know it, they have to use their imaginations to transmute mere facts into theses that make people stop and take notice.
That’s true in the sciences as well as the humanities. One of the most famous examples of scholarly inspiration concerns the German chemist F.A. Kekulé. One night in 1865, he went to sleep after trying to work out the structural diagram of the benzene molecule. Up to that point, chemists had drawn the atoms of organic molecules in linear chains, but the data for benzene didn’t seem to fit any such arrangement. During the night, he dreamed of a snake that whirled into a circle and then bit its own tail. In the morning, Kekulé suddenly realized that the benzene molecule had to be arranged in a ring. Organic chemistry took off from that intuitive leap.
Professionals who are paid to be rational don’t necessarily like to admit that the darker, irrational part of themselves is where their creativity may lie. It’s something we university types don’t talk about, because one can’t quantify inspiration in familiar ways. Yet how do we choose our subjects if not from the deep well of our unconscious? I don’t believe my choice of obsession as a scholarly topic was random.
The old mind-body problem rears its bifurcated head here. We academics tend to have delusions of grandeur about our intelligence, our wit, our capacity for cold, hard logic, while we objectify creativity, talking about “flow,” “energy,” and “juices.” Caught between a hard mental rock and a touchy-feely soft place, we exclude the “I” and the personal stories surrounding that “I” from what we deem scholarly. Even with the recent spate of confessional academic works by scholars such as Jane Tompkins and Eve Sedgwick, we tend to think of “true” scholarship as objective and impersonal. But the most meaningful scholarly work usually comes from the deep, flowing well of one’s own obsessions and interests. Edward Said wrote about exile, oppression, and otherness from his own experience of being an Arab in a non-Arab world. Rosemarie Garland-Thomson, who writes about disabilities and freak shows, knows what it is like to be stared at by so-called normal people because of her physical impairment. Walter Kendrick, George Rousseau, and Helen Lefkowitz Horowitz have branched off from their own sexual interests to write about sexuality, including pornography and pedophilia, in other eras.
Freud wasn’t so far off in describing how we sublimate our darker impulses and brighten them up with the spick-and-span of scholarly organization. In Civilization and Its Discontents he tells us, “Sublimation ... is a particularly striking feature of cultural development, which makes it possible for the higher mental activities -- scientific, artistic, and ideological -- to play such a significant role in civilized life.” But too often the price of our objectivity is our connection to our subjective experience. Indeed, we’ve learned too well how to run our unique, personal voices through the artificial filter of acceptable academese so that our words read like a cross between the pronouncements of the mighty Oz and Linda Wertheimer. It’s sad that creative writers learn to find their voices while academics learn to lose them.
The fact is, it’s probably emotions rather than ideas that lead to intellectual insight. Perhaps the proof is that such insights so often arrive at fairly unintellectual moments. For me, inspiration tends to come when I’ve spent the morning stuck on an issue and then go running. Friends have told me that their ideas come when they are swimming, doing hobbies, foaming a latte, or soaping up in the shower. The key, of course, is not to wait for inspiration, but to invest in the hard intellectual work that evokes it, and then to relax enough to allow the subconscious to do its part. You have to be steeped in your project. It has to be rolling around your consciousness day and night. That is when the intuitive process seems to happen -- when emotion combines with thought, activity, and dreaming. Antonio Damasio has written several books about how emotion helps us to reason. His research on brain-damaged patients shows that people with injuries to the regions of the brain that govern emotion ultimately cope far worse with practical decisions than people with damage to the regions associated with reasoning.
The rise of the irrational into consciousness can’t be rushed. Once, when I was sitting at my computer looking at the blank screen, I had the feeling that I was about to experience a conceptual sneeze. I could not have described rationally to anyone what I felt. I knew that stuff -- ideas, research, new connections -- was percolating and trying to come to the surface. But I couldn’t grapple with it all directly. I had done my homework. I felt that all I could do was hang over the precipice, so to speak, and wait.
For many of us scholars, that is a painful moment, if we’re even able to experience it, because we are trained to run with the hares of reason and not with the hounds of the irrational. We don’t have permission to tell our friends and colleagues, “I’m not there yet. I’m still hovering somewhere near my insight.” We lie, we bluff, we say, “The work is going well. I’m working on Chapter Two.” We don’t ever say, “I have no idea what happens after Chapter Two.” Or, “I’m deeply worried that Chapter Two doesn’t make any sense.” Or, “I thought I knew what my project was in Chapter One, but now it’s unraveling as I move forward.”
Uncertainty happens when our project shifts and gets more complicated, when a new insight has to be woven back into our assumptions. While such readjustment is usually a good thing, it can be disconcerting. When we were writing our dissertations under the stern eye of our advisers, that supervision kept us steady on some course of rationality. But when we are on our own, as most of us are after our first years in academe, we’ve got to stare down the continually evolving and devolving black hole of our intellectual projects. The chaos revealed there can make one long for the old days of intellectual security, when we were safe and sound under the benign or even sadistic surveillance of our mentors.
Scholarly creativity involves tolerating the pain of uncertainty or lack of resolution. Letting the irrational work in mysterious ways requires, strangely enough, a highly rational facility. Your first recourse is to panic, followed by the impulse to throw yourself into an existential pit of scholarly despair -- to imagine that all your work and time will be, or has been, lost in the chaos of muddled ideas. But the ability to wait, to let one’s conscious ideas percolate in the bubbling unconscious until the moment of imaginative synthesis, is a tough but necessary part of the creative process. Trusting in the irrational isn’t for sissies.
Making room for the irrational, putting yourself into your own work, and surviving uncertainty allow one’s creative juices to flow. But there is more to face -- the irrational boogeyman of postmodernity, the imps of deconstruction. Many of us know and even accept the work of postmodern deconstructionists like Jacques Derrida, who argue that the unity of texts is a myth devised to make us believe that logic works, that syntax yields meaning, and that we can create overarching, explanatory theories. Derrida’s point is that every work has at its center a black hole (he calls it an aporia), a kind of vortex of illogic and dissolution pulling the work apart. The writer attempts to combat the deconstructive force by creating the illusion that everything pulls together into a seamless, uncontradictory whole argument that rises above its existence as mere human gasp or utterance.
My own writing experience has made me a believer in that insight of Derrida’s. You, too, might have known with fearful certainty that at the center of your work is a basic irrationality that you’re merely covering up in your daily attempt to put words on a page. You give examples, cite previously cited scholars, produce counterexamples, and liberally use the words “obviously” and “clearly,” along with the phrases “so we see” and “it is incontrovertible that ....” In the long run, most of us, even deconstructionists, get over the haunting aporia and proceed. We act as if the problem is solved simply by putting “proper words in proper places,” as Jonathan Swift put it. But, always, the intensely irrational aporia lurks.
I can report that in the course of this year, on more than one occasion, I felt as if I didn’t have a clue where I was heading. But each time, I eventually had a breakthrough, the course of my project changed direction, and I was able to move on to the next insight. A week after my session of staring at a blank computer screen -- this was in the dead of winter -- I took a walk down to the banks of the Hudson River and watched the ice floes. When I looked quickly, all I saw was a lumpy, white expanse seemingly locked into place. But closer observation revealed a less static picture: distinct patterns composing knowable areas that suddenly broke up, shifted, and formed new configurations. All along, the apparently frozen-solid jigsaw puzzle was in constant flux.
In a moment of irrational creativity, I realized that this winterscape was a perfect metaphor for what was going on in my study. Ideas form little floes, locked together for an instant or a while, and then rearrange themselves into new configurations without losing their substance. Allowing for creativity involves having the patience to wait until the ideas coalesce into the right relationship, and then taking a mental photograph of the configuration at that moment. There is a Zen to this, because the ideas will continue to shift into new patterns. Like a surfer catching a wave, one must observe the patterns and rhythms of thought with both alertness and relaxation in order to home in on a particular insight at the moment it crystallizes into clarity.
I’d like to report that I went home with my insight from the frozen river and started writing, and that my book is now finished. It isn’t; I’m in the middle of writing it. But living under house arrest with my project, getting to watch my own creativity in slo-mo, has taught me how to live on the edge of irrationality -- and the wisdom of leaping into it.
Lennard J. Davis is a professor of English and of disability and human development, and director of the Center on Biocultures, at the University of Illinois at Chicago. His most recent book is Bending Over Backwards: Disability, Dismodernism, and Other Difficult Positions (New York University Press, 2002).
The Chronicle Review, Volume 50, Issue 14, Page B10. http://chronicle.com