“When I was cutting myself,” Natalie told me in a matter-of-fact way, “I wore big bracelets, and because I like to sew, I had lots of fabric in my room which I tied in bands around my wrists.” Over a lengthy lunch at the Moosewood Restaurant, in Ithaca, N.Y., the articulate 18-year-old from an affluent suburb spoke confidently about plans to enter Boston University and study child psychology, all the while narrating her personal history as a “cutter.”
Natalie seemed to be at ease talking with me, a college professor whom she knew only casually, about the ways in which she used safety pins, razors, and eventually scissors to pierce her own skin. That confirmed what I had gleaned from teaching and advising over the past decade: Cutting, in the form of repetitively slicing or puncturing one’s own flesh (especially the arms and thighs) without intending suicide, is no longer alien behavior among college students.
In fact, some students speak openly of their own cutting behavior and that of others, including celebrities like Angelina Jolie, Johnny Depp, and the late Princess Diana. A 1995 report in the Journal of the American Academy of Child and Adolescent Psychiatry indicates that cutting exists primarily among “popular” high-school girls who perform well academically. Many observers suggest that cutting and eating disorders often exist in tandem, casting both as diseases of the rich and pampered.
Although cutting is still characterized as a symptom and not an independent diagnosis in the fourth edition of the Diagnostic and Statistical Manual of Mental Disorders, the bible of mental-health clinicians, self-injurious behavior, or SIB, appears to be on the rise among students at American colleges and universities. Two new studies, led by Janis L. Whitlock, a developmental psychologist at Cornell University, document that growth and provide a richer picture of such behavior.
The first study, conducted and written with John J. Eckenrode, a professor of human development at Cornell, and Daniel Silverman, chief medical officer and head of health services at Princeton University, randomly surveyed 3,069 undergraduate and graduate students at the two institutions. Published in June in the journal Pediatrics, it is an important bellwether: the first epidemiological study to give us a sense of what is actually happening on campuses. Whitlock’s data showed that 17 percent of the students had purposely tried to injure themselves by cutting or burning, and that almost 75 percent of those had done so multiple times. Both men and women seemed to be self-injuring, although women outnumbered men. When Whitlock controlled for parental-education levels, she found no correlation between social class and SIB.
In another study, published a month earlier in Developmental Psychology, Whitlock and her co-authors, Eckenrode and Jane Levine Powers, of the Family Life Development Center at Cornell, demonstrated that cutters, once a highly isolated group, now participate in a virtual community, incorporating more than 400 Internet message boards to share thoughts and experiences. The Internet provides support — sometimes to stop the behavior, sometimes to continue it. (Although the study doesn’t mention it, there is also a secretive culture of girls in the know who signal one another with orange plastic bracelets when they are actively cutting and white ones when they are trying to stop.) Whitlock’s pioneering work raises larger questions about what she calls the “social contagion” effect that has played a role in the spread of eating disorders among young women during the past 30 years.
In both self-injurious behavior and eating disorders, communicability is a social rather than a biomedical process. Neither cutting nor eating disorders involves a micro-organism, yet both seem to spread, as psychogenic illnesses often do, in response to cultural stimuli. On the contemporary college campus, cutters — like bulimics and anorexics — can usually find a group of people who understand their behavior. If not, they can turn to Web sites for people with SIB or eating disorders — a new opportunity that raises a host of questions for those in public health who are concerned about the process of current and future “psychic epidemics.”
Along with their laptops, cellphones, and iPods, students now arrive on campuses with an awareness of self-injurious behavior and a greater tolerance for it than in previous generations. Yet for psychological-services professionals, residence-hall staff members, deans, and faculty members, the addition of cutting to the contemporary repertoire of collegiate psychiatric disorders constitutes a significant clinical challenge.
Whitlock, who sees the behavior as an emerging public-health issue, is committed to providing mental-health professionals with strategies for intervention in this largely underrecognized disorder. As a social historian, I have somewhat different concerns: What are the origins of self-injurious behavior? Why do we see so much of it now on campuses? And how does it reflect American culture?
Medical literature is rich with cases of mentally ill people who enucleated their eyeballs, cut off limbs and sex organs, and ground glass between their teeth. According to Armando R. Favazza’s wide-ranging 1987 survey and analysis in Bodies Under Siege: Self-Mutilation and Body Modification in Culture and Psychiatry, most forms of self-mutilation represent an attempt at self-healing. Although it’s often hard to tell what exactly was wrong with such people without the benefit of modern case records, it is clear that the meaning of the behavior, as well as the diagnosis, changes over time along with medicine, psychiatry, and culture. Psychogenic illness in one culture may be characterized by nausea, in another by headaches and dizziness, and in still others by dancing, trances, appetite manipulation, or cutting.
In looking for the antecedents of contemporary campus cutters, I first searched the historic record for repetitive female cutters who were not suicidal. I found a telling early example in the case of Helen Miller, a 30-year-old “German Jewess” who was a patient of Walter Channing, a 19th-century asylum doctor (not to be confused with the famous Boston obstetrician of the same name). In 1875, after a history of both illness and crime in her 20s, Helen began to “cut up” — her term — when she re-entered the New York State Asylum for Insane Criminals, in Auburn, where she had already spent two years for burglary. For stealing a stuffed canary and a microscope lens, she was sent back to the asylum, where she became depressed and began to use sharp pieces of glass to periodically slash her wrists and arms. The details of her self-injury were of enormous interest to Channing, who counted at least 150 foreign objects that were removed from Helen’s body, including pieces of glass, wooden splinters, needles, pins, shoe nails, and a piece of tin. Although she sometimes cut her arms almost to the bone, Channing claimed that Helen felt no pain when she cut, and that she “apparently experienced actual erotic pleasure” from the medical probings that followed those disturbing incidents of self-mutilation.
Channing’s understanding of cutting reflected Victorian ideas about sex differences and medical reports in the American Journal of Insanity, the most up-to-date periodical in his field in the late 19th century. Apparently, at the nearby Utica Lunatic Asylum, a single female had an astounding 300 needles removed from her body. Such women were regarded as “hysterical,” and Victorian medicine cast their cutting behavior as a deliberate, typically female strategy for attention. Channing wrote of Helen Miller: “The wounds were made as lacerated [deep] as possible, the garments were covered unnecessarily with blood and a time of day was chosen when help was sure to be at hand. Everything was done to produce as much effect as possible.”
Many doctors, in many locations other than upstate New York, reported cases of “needle girls.” It is not surprising that sewing needles were a popular instrument of self-mutilation in a society where sewing was a regular domestic activity. Ernest Hart, a physician at a hospital in London, reported the case of E.G., a “young girl of good appearance and superior education” who “willfully introduced” and broke needles into her flesh, particularly her fingers. Hart, like Channing, was suspicious of his patient: “The records of surgery include many remarkable cases in which patients suffering under a morbid and commonly hysterical condition have inflicted various injuries on themselves, with the view of exciting the sympathy of friends, and of deceiving their surgeons.”
By the early 20th century, medical reports dropped the ghoulish focus on the number of cuts and moved away from the idea of an innate female propensity for attention-seeking behaviors. (Most of the cutters in the clinical literature were still women and girls, however, and that remains true today.) In 1913, in the work of L.E. Emerson, an influential Boston psychologist on the staff of the Psychopathic Institute at Boston State Hospital, a female cutter emerged for the first time as someone responding to a complex mix of unfortunate life events and circumstances, including what we now call “sexual abuse.” Emerson’s presentation of the case of Miss A., a 23-year-old who cut herself on the breast and legs as well as arms, was revolutionary for its detailed attention to the family environment.
Emerson — one of the first Americans to practice psychoanalysis — chronicled the sad story of Miss A., who, beginning at age 8, was “masturbated” on an almost daily basis by an uncle who lived in her home and also attempted to rape her. In her adolescence, she became “abnormally stout,” failed to menstruate regularly, and developed intense headaches. At 20, after a frightening sexual assault by her cousin, she discovered accidentally that self-mutilation eased her headache pain, thus beginning a pattern of repetitive self-injury which she found soothing. “Before I cut myself I had what I called a crazy headache,” she explained, “and I thought that the cutting of my wrist, and letting the blood flow had cured it.” Unlike earlier reports, Emerson gave Miss A. a great deal of voice, and, in doing so, contributed to the understanding that cutting could bring people relief, that it might be a way to cope with emotional pain.
Because the skin is the canvas for self-mutilators, cutters apparently ended up in the offices of skin doctors as well as psychiatrists. However, when they presented their symptoms to dermatologists, they usually lied about the etiology of their festering lesions to avoid the stigma associated with the behavior. For dermatologists, it was a clinical challenge to be able to recognize the handiwork of cutters, burners, and hair pullers. In 1929, at the moment when phonograph needles joined sewing needles as an instrument of self-injury, two Philadelphia dermatologists, John Stokes and Vaughn Garner, explained their diagnostic technique: “The bizarre and at times fantastic shape both of the individual lesion and of the arrangement of the group of lesions is, then, a suspicion-arouser of the first order.” Apparently, self-inflicted lesions tended to have regular and angular outlines rather than rounded ones. Dermatologists classified Jazz Age cutters under many rubrics: neurotic excoriations, dermatitis factitia, dermatitis artefactus, feigned eruptions, and hysterical dermatoses.
By the 1930s, psychiatry was able to offer a new understanding of repetitive self-injury. Karl Menninger, the Kansas-born psychiatrist who argued that the mentally ill were only slightly different from other people, suggested that even the most severe acts of self-mutilation were really just points on a spectrum. All of us, he asserted, participate in some self-injurious behaviors — nail-biting and picking pimples are familiar examples. But Menninger distinguished between psychotics and neurotics in terms of their patterns of self-mutilation. Psychotic patients, he asserted, make no effort at concealment, and neurotics rarely mutilate “irrevocably.”
Although he had worked with college students in a counseling center at Washburn College (now Washburn University) in Topeka, Menninger’s 1935 address to the American Psychiatric Association on the subject of self-mutilation did not mention cutting on campuses. Either collegiate cutters did not exist then, or they were hidden and covert. Menninger certainly never envisioned that cutting would be talked about so openly by “normal” individuals like Natalie, my young confidante.
In my college generation in the 1960s, we spoke in hushed tones of girls who had experienced “nervous breakdowns,” but we were not in the business of sharing and comparing mental diagnoses the way students do today. I had never even heard of anorexia nervosa, and cutting seemed “sick” enough to preclude someone who did it from ever attending college, let alone living in the room next door and talking about it. Many of us read Joanne Greenberg’s 1964 I Never Promised You a Rose Garden, an emotional blockbuster based on the author’s experience at Chestnut Lodge, a private psychiatric hospital in Maryland, where, I learned later, Greenberg was treated by the well-known psychoanalyst Frieda Fromm-Reichmann. In that powerful illness narrative, published under a pseudonym, the adolescent protagonist repeatedly burned herself and cut her wrists — behaviors that seemed wilder, crazier, and more exotic than they do now.
Cutters receiving treatment in that era were almost always patients in closed adolescent units in hospitals, there because they had been diagnosed as schizophrenic or with borderline states. And when one patient began to cut, others often followed. Some psychiatric facilities even witnessed “epidemics,” which they linked to the well-known adolescent propensity for copycat behavior, especially among those who were mentally ill.
On the basis of those reports, some proposed a “wrist-cutting/slashing syndrome” (also known as “delicate cutting syndrome”) involving young women who cut their arms and wrists repeatedly but without suicidal intent. Yet despite a broad consensus that the syndrome does exist — that it is more an impulse disorder than an attempt at suicide — the effort to get a distinct diagnostic category accepted in the third or fourth edition of the DSM has not yet succeeded.
Today we use a bland generic term — “self-injurious behavior” — but we should probe, more than ever, the troubling “contagion” issue raised by Whitlock’s studies of the generative, social, and electronic environment that surrounds contemporary cutting. Psychic epidemics simply will not be as local as in the past — when they were confined to one school, convent, or town — because of the ways that our young people communicate and use the Internet to explore themselves and their world.
A historical perspective also sheds light on how colleges have changed in their capacity — and willingness — to deal with self-injurious behavior. They began to recognize that students might have mental problems only in the early 20th century. According to Heather Munro Prescott, a historian at Central Connecticut State University, in 1910 Princeton was the first collegiate institution to offer psychological services. In 1921 Dartmouth put a psychiatrist on staff because its president, Ernest Martin Hopkins, believed that colleges should “stabilize minds” and make them “healthful” in addition to sharpening intellects. After World War II, mental-health services expanded to assist in the adjustment of veterans who flooded college campuses under the GI Bill of Rights.
In the 1960s, clinicians in campus psychological services confronted a rising tide of problematic social behavior — notably, increased sexual activity and substance abuse. Many students regarded psychotherapists as suspicious adults in those days, perhaps even stool pigeons for university interests and odious in loco parentis policies. Yet by the 1980s and 1990s, although in loco parentis was a curiosity of the past, the commitment to caring for the mental health of students had actually escalated. Students and their parents expected colleges to have accessible, sophisticated mental-health services just as they had libraries, career counseling, and physical education. In part, that reflected a decline in the social stigma associated with mental illness. Many people who rejected authority and expertise in their youth had turned, as adults, to psychotherapists and other mental-health professionals to help cope with issues like divorce, child custody, and “coming out.” So, too, many of their children had experience with therapists as they grew up, a growing number with eating disorders.
A conspicuous increase in anorexics and bulimics during those years forced health and psychological services to confront problematic eating behavior — to work cooperatively with residence-hall staff to both identify and prevent it. The National Eating Disorders Association also began to sponsor lectures, films, and workshops to educate the public about the dangers of those widely discussed disorders that were becoming popular fare in media like People magazine. Such efforts have been relatively successful in raising awareness of the dangers of anorexia nervosa and bulimia as well as the need for therapeutic intervention, but the number of eating-disordered collegians has not notably decreased.
Today college health professionals confirm that there are more students in need of psychological services than ever before, in part because of the Zoloft, Paxil, and Xanax carried in many backpacks. Contemporary psychopharmacology, combined with consumer advertising of psychotropic drugs, makes it possible for young people with emotional problems to attend college, maintain their equilibrium, and do their academic work. In 2000, 80 percent of a national sample of counseling-center directors reported increases in students with severe psychological problems.
If cutting is one indication of a more troubled collegiate environment, what might account for the prevalence of that particular symptom now? Is there something about cutting one’s own skin that is related to the ideas and values of contemporary Americans?
At the moment, transforming and enhancing our bodies is a major American cultural imperative supported by both commerce and medicine. “Body modification” businesses operate in most cities and towns; teenage girls are sporting belly-button jewelry and eyebrow rings. The growing popularity of such “body projects” — ranging from garden-variety piercing and tattooing to bizarre forms of cosmetic surgery — suggests a new mind-set about the malleability of the body as well as its ability to withstand violation and penetration. For many in the younger generation, not just those who think of themselves as “Goth,” the body is a critical message board, a way to convey information about the self. In fact, in some niches, arms and thighs that bear scabs can be badges of honor. Heavy-metal and punk bands valorize cutting, but even reasonable adults have gained some vicarious experience with the behavior in films like Fatal Attraction (1987), American Beauty (1999), and Thirteen (2003).
Surely the violent nature of our society provides a hospitable climate for self-mutilation. Countless social-science studies have shown that our children grow up accustomed to violence of all kinds, and that it has harmful, desensitizing effects. As children mature, their imaginations are shaped by cultural scripts picked up from brutal television shows, video games, and movies as well as sadistic real-life crime and gruesome war reportage. By the time they are adolescents, when cutting usually begins, some are intrigued — rather than revolted — by the thought of putting razor or scissor to flesh.
At Hobart and William Smith Colleges, in Geneva, N.Y., Debra DeMeis, dean of William Smith College, and a staff of experienced psychotherapists spoke with me about what they had seen on campuses over the past five years. “Cutting is not as pathological as it once was,” explained Bonnie Lambourn-Kavcic, a clinical psychologist at the counseling center. Part of its new “normality” is the growing understanding among clinicians that repetitive self-injury is not usually suicidal and that it can be a short-term response to unhappiness, stress, and depersonalization.
Because cutting is often preceded by some physical tension, like a headache or a sinking feeling, campus clinicians generally work to help the self-injurer break the connection between that somatic trigger and picking up a razor. In therapy, cutters not only receive drugs but are also counseled on ways to become less impulsive and to find alternative means of self-soothing.
Although some clinicians have posited that cutting is a response to the stress of college life — new people, less structure, anxiety about intellectual performance — DeMeis was certain from her experience that some cutters come to campus with the behavior already established. She suggested that families desperate to market their children to college recruiters are not likely to disclose such behaviors. Unlike learning disabilities or attention-deficit hyperactivity disorder, which can dictate special educational privileges, SIB carries no such advantage.
In fact, from the perspective of college health services, this latest “psychic epidemic” carries special problems. Mental-health professionals need to assist people who engage in self-injurious behavior, but they also need to protect other students whose social relationships and academic performance can be affected by the high drama of a behavior that still raises the fear of suicide. With young women, in particular, friends get deeply involved; they often become frightened, and many take on caretaking roles far beyond their experience and responsibility.
Despite the greater tolerance for mental-health problems today, “psych services” confront some inherent conflicts about how to handle cutters. Although most cutters will stay in college and see therapists, some may be asked to leave when their behavior becomes flamboyant (such as leaving blood repeatedly in a communal bathroom) or troublesome enough to negatively affect the daily lives of friends. In the words of DeMeis, an academic dean as well as a psychologist: “What may be therapeutically good for the cutter may not be best for the institution, or for other students.”
Clearly, this is new terrain for campus clinicians. Will cutting escalate and plateau the way eating disorders have? Will medicine be able to come to grips quickly with the complicated mix of biology, individual psychology, and culture that generates this behavior?
In the end, Menninger was probably right. Mental disorders really are only points on a spectrum, and those who are troubled almost always choose symptoms that borrow from the cultural material of the times in which they live. Ideally, more self-injurious behavior can be averted by wise multidisciplinary collaboration about how to treat the cutters in our midst, most of whom, according to Whitlock’s study, still do not seek medical or mental-health attention. Despite the new tolerance and the cultural chatter, many cutters remain secretive and feel considerable shame about their behavior.
With a population highly vulnerable to peer suggestion, as college students are, we need to develop a sensitive, nonpunitive response that does not condone or excite others to do the same. And whether cutting behavior begins in high school or college, whether it involves girls or boys, it should provoke parents, mental-health professionals, and educators to consider the ways in which this longstanding symptom mirrors our violent, troubled times.
Joan Jacobs Brumberg is a professor of history, human development, and women’s studies at Cornell University.
http://chronicle.com Section: The Chronicle Review Volume 53, Issue 16, Page B6