Numbers suggest certainty, and when it comes to campus crime, everybody wants answers. That’s what the Clery Act set out to reveal: How many rapes, burglaries, assaults? When? Where? Enacted nearly two decades ago, the federal law requires colleges to send the government lengthy reports each year, detailing their policies and tallying their total crimes.
But do statistics keep students safe? As campus security has become a national fixation, some scholars of the Clery Act — and officials who must comply with it daily — challenge the wisdom of producing time- and labor-intensive reports of dubious value. And they puzzle over a paradox: The law requires them to publish the numbers, but students and their families don’t seem to read them.
As a consumer-protection statute, the Clery Act has faltered. Its crime-reporting requirements were intended to inform families, to change students’ behavior, and to keep them safe. But research on the use of the reports shows that very few students look — and even those who seek solid information are out of luck. Did the University of New Orleans really have just one aggravated assault in 2007, while nearby Tulane University, with slightly fewer students, had 24? Did the University of California at Davis have almost as many forcible-sex offenses as the system’s nine other campuses combined? Did New York University really see burglaries jump 771 percent in a single year?
Concerns that colleges would misinterpret the law’s regulations or misrepresent their crime figures arose early on, and they linger. Among the 6,600 institutions subject to the Clery Act — anywhere a student can use federal financial aid — interpretations of how to prepare the reports and what to include vary substantially.
And yet the numbers, neither reliable nor comparable, are cropping up in more places.
The U.S. Department of Education now publishes the crime statistics on a user-friendly Web site (http://ope.ed.gov/security) and tells prospective applicants to use College Navigator, its online-search tool, to compare institutions side by side (four arsons at Amherst, zero at Williams). Other new programs designed to promote accountability encourage similar comparisons of campus-crime figures (“Get the facts for a smart college choice!”). Reader’s Digest joined in last year, ranking colleges on safety according to their crime rates.
Colleges spend a good deal of money putting out crime reports of questionable value. Exactly how much the process costs, including the share of staff salaries it consumes, is so hard to calculate that even those who work on the reports cannot venture an estimate. The Clery Act’s complexity requires perpetual training, especially as amendments continue to change colleges’ reporting requirements. Dolores A. Stafford, chief of police at George Washington University and a national expert on the Clery Act, fields five or six calls a week from colleges that need help figuring out what to report. When she travels to institutions for audits, she always turns up errors in the way they are counting crimes. “Not once,” she says, “have I found a campus that’s completely in compliance.”
The law has improved campuses’ discussions of safety, several college police chiefs say, leading to bigger security budgets and more education programs. But with the reports, the law has done something else, says Steve Bowser, director of public safety at Spelman College: “It has created a bureaucracy.”
Behind High Numbers
It has also produced deceptive data. Take the University of California system. In 2007 the Davis campus listed 62 forcible-sex offenses. The system’s nine other campuses reported between zero and 22. San Diego — with about 27,000 students, compared with Davis’s 30,000 — listed just two.
To the prospective student, Davis looks dicey. In fact, it may actually be safer. Sexual assault is a historically underreported crime, and outreach, which Davis does a lot of, encourages more victims to come forward.
Davis’s 30-year-old Campus Violence Prevention Program has benefited from three long-serving staff members, strong support from the university’s chancellor, and multiple grants from the U.S. Department of Justice’s Office on Violence Against Women. Last year the program reached more than 22,000 students with services in education, prevention, and support.
“The students know where to go, and the people they go to know where to send them,” says Jennifer Beeman, director of the program, which operates out of the campus police department. “If we’re doing our job, the numbers are going to be higher.”
When crime statistics come out each year, Ms. Beeman looks at those of other institutions, but she doesn’t compare them with hers. “I don’t think that’s fair,” she says. She will, however, check other campuses that have received violence-prevention grants from the Justice Department, like Ohio State University and the University of Minnesota-Twin Cities. There, as at Davis, she tends to see relatively high numbers for sexual assault.
That’s the context she offers parents at freshman orientation when she shows them Davis’s annual security report. “Initially parents look at our numbers and think, ‘Oh, dear Lord,’” Ms. Beeman says. “We really try and frame it for them.”
The Clery Act requires colleges to list any crime reported to “campus security authorities,” a designation that applies to pretty much anyone who supervises students. In practice, most colleges focus on law-enforcement officials, but Davis tries to reach a broader group. Last year the university’s compliance team, led by Ms. Beeman, sent about 900 e-mail messages to staff members, linking to a PowerPoint presentation on their responsibilities under the law. About half responded, and Davis added any crimes they reported to the tally. The law also requires colleges to solicit data from local police departments, and last year Davis contacted 26 of them. A law clerk synced those statistics with the campus totals.
Many colleges don’t have the resources or the incentive to be that meticulous. In 2000 the Education Department investigated the whole University of California system, following articles in The Sacramento Bee and a complaint from Security on Campus, the Clery Act’s watchdog group. The department issued no fines but determined that the system had underreported crimes. Over the 18 years of the law, only five institutions have been fined, up to $27,500 for each violation.
After the investigation, Davis redoubled its efforts toward full disclosure. But campus-safety experts say the university’s approach is unusual. They suspect that many institutions simply import numbers from the “Clery-compliant” software programs sold to police departments and send off their reports.
Such different methods hardly allow for apples-to-apples comparisons, says Ms. Beeman.
“I’d be really leery to report my police statistics and call it a day,” she says. “If one campus is doing that, and another campus is sending over 800 people a thing saying, ‘Have you heard anything?’ ... then you’ve got apples and — I don’t even know. Apples and pork.”
Legacy of Confusion
Reliable information was the goal of the Clery Act. It bears the name of Jeanne Ann Clery, who, as a freshman at Lehigh University in 1986, was raped and murdered in her dormitory room. Her parents, Constance B. Clery and the late Howard K. Clery Jr., discovered that Lehigh had not publicly disclosed several previous crimes on its campus, and they appealed to state and federal lawmakers.
In 1990, in a wave of consumer-protection legislation, President George H.W. Bush signed the Crime Awareness and Campus Security Act. The Clerys set up Security on Campus to monitor colleges’ compliance.
The annual security reports, just one requirement under the law, bestowed a legacy of confusion. A federal report in 1997 by what was then called the General Accounting Office found “considerable variation in colleges’ practices for deciding which incidents to include in their reports and what categories to use in classifying certain crimes.” Campus officials expressed frustration over a lack of adequate guidance. Finally, in 2005, the Education Department published a Clery Act handbook — after Security on Campus successfully lobbied Congress for a $750,000 appropriation for the project.
The handbook weighs in at 216 pages but hardly resolves all the sticking points. Campus police officers routinely struggle with the crime classifications, which follow the FBI’s uniform crime-reporting definitions. Nuances of a crime’s location are especially confounding, experts say.
In their annual reports, colleges must list crimes that occurred on their campuses; in their residence halls; in “noncampus” buildings, fraternity houses, and other facilities owned independently; and on nearby public property. That last category is especially tricky. It generally includes adjacent public sidewalks and streets, except when a campus is surrounded by a wall. If a campus borders an ungated park, however, crimes that occur up to one mile into that park should be reported. But “public property” does not generally include, for example, the parking lot of an apartment complex where many students live or the strip of bars where they hang out.
Amid criticism, New York University announced in November that it would revise its 2007 report to clarify the number of crimes in its residence halls. With many of them scattered around the city, the university considered only three to be on its campus. But after the student newspaper described that practice, NYU said it would reissue its report with a new category for off-campus dormitories. A university spokesman said the adjustment would make the statistics “more meaningful.”
Location is far from the only hang-up that plagues the reporting process. Many colleges struggle with the distinctions among burglary, robbery, and larceny (the last of which is not included in the reports). Campus officials also often confuse which crimes are counted by victim and which by incident, says Ms. Stafford, of George Washington University. Aggravated assaults, for example, get counted by victim, but robberies — even of five people by one perpetrator — should be counted by incident.
Advice from the Education Department doesn’t always help. Several years ago its auditors mistakenly told Virginia Tech that if crimes came to light well after they had occurred, they should be counted not when they were reported but in the year they actually happened. The agency corrected its error, but not before many institutions scrambled to adjust their numbers, which they then had to change back.
The quest for consistency continues, in training sessions conducted by Security on Campus, the International Association of Campus Law Enforcement Administrators, and a company called Westat, which manages Clery Act compliance for the Education Department. From August until December 2008, as colleges prepared and revised their reports, Westat’s help desk received 5,696 calls and 984 e-mail messages about how to comply with the Clery Act.
Most inconsistencies in crime statistics come from good-faith mistakes, campus-crime experts say, but some colleges may manipulate their numbers. “You always have to think about, if the institutions are creating the stats, you can make the statistics say what you want them to say,” says Steven J. Healy, chief of police at Princeton University and a former president of the campus-law-enforcement group.
One police chief contacted by The Chronicle mentioned pressure from top administrators to bring down high numbers. Another said he had heard of doctored totals, particularly of alcohol-related offenses, that looked suspiciously low.
Sexual-assault statistics present a particular dilemma, says Douglas F. Tuttle, an instructor and former public-safety director at the University of Delaware who presents frequently on the Clery Act. Even when students don’t alert the campus police to a rape, they usually tell someone. Mr. Tuttle would not want that person to look for a statistic and not find it.
“If there was ever a year that we had no forcible-sex offenses on a campus,” he says, “I’d probably put one.”
Straight Zeros
Westat is now perusing security reports for 2007. When its contractors spot something unusual or suspicious, they contact the college in question. Any murders or nonforcible-sex offenses, such as statutory rape and incest, trigger calls. So do notable rises or drops in any category over previous years.
David Bergeron, director of policy and budget development in the Education Department’s Office of Postsecondary Education, has been known to pick up the phone himself. “People don’t generally like it when I call,” he says.
Once he saw an institution report 999 hate crimes. As it turned out, he says, “somebody thought it was a universal federal code for missing data.”
Another common mistake is conflating violations of state law and institutional policy on alcohol and weapons. If an incident is both a campus offense and a crime, it should be counted only as the latter. But the distinction requires constant monitoring, says Mr. Tuttle: “Someone has to say, ‘Wait a minute, it’s not illegal to have swords, it’s just a violation of our dorm rules, so we shouldn’t count that one.’”
For the first time this year, Westat double-checked all institutions that reported straight zeros: about 3,000, or 45 percent of them. “That’s a huge callback,” says Mr. Bergeron. But even if the contractor couldn’t get to everyone, he says, “it just seemed like maybe a good thing to do.”
Westat inspects colleges’ crime statistics as much as possible before publishing them, but the Education Department still includes a disclaimer on its Web site. “The crime data reported by the institutions have not been subjected to independent verification by the U.S. Department of Education,” it says. “Therefore, the department cannot vouch for the accuracy of the data reported here.”
Mr. Bergeron acknowledges certain factors that complicate comparisons between institutions. In some states, liquor laws are relatively strict. In others, campus police officers have jurisdiction over wider geographic areas beyond the campus. Still, he says, “the data are very consistent.”
‘Colossal Waste’
Consistent or not, security reports can baffle a layperson. They’re hard to grasp without intimate knowledge of the law and its regulations. The intended audience for the reports — prospective students and their families — lacks that context.
“They’ll assume that a campus that has 50 burglaries and 10 sexual assaults is more dangerous than a campus that has 20 burglaries and two sexual assaults,” says Ms. Stafford. “And that very well may not be the case.”
The reports include total crimes, not rates. Higher numbers may be a result of more-regular police patrols. And, as at Davis, creating an environment where students feel comfortable reporting crimes also tends to raise the tallies. Counting correctly, Ms. Stafford says, does that, too.
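A rough, back-of-the-envelope rate calculation using the 2007 figures shows the difference: Davis’s 62 forcible-sex offenses among 29,796 students work out to about 2.1 per 1,000 students, while San Diego’s 2 among 27,020 come to less than 0.1 per 1,000. By Ms. Beeman’s account, a gap that wide says more about reporting climate than about danger.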
In 2004, for example, New York University reported 14 burglaries. The following year: 122. Someone who looks at the Education Department’s site, which shows three years of data, may wonder about the apparent crime spree.
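(That is the 771-percent jump cited earlier: the one-year increase of 108 burglaries, divided by the 2004 base of 14, comes to roughly 7.7, or 771 percent.)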
In NYU’s own report on its Web site, there is an explanation. But the footnote might not mean much to a student or parent. “Increase of burglaries attributed to reclassification of certain larcenies based on information ascertained at Iaclea national conference in 2005,” it says.
That’s assuming people will read the report at all. A few years ago, when the law required colleges to print and distribute their annual crime statistics, Ms. Stafford would make a point to visit freshman dormitories on the day of delivery. “There were trash cans full of them,” she says. “It was a colossal waste of money.”
Beginning in 2000, new regulations allowed institutions to publish their reports online, but some colleges still print booklets.
There is scant research on who reads the reports in either format. In 2001, Steven M. Janosik, an associate professor of educational leadership and policy studies at Virginia Tech, polled nearly 4,000 students at 300 colleges. About one in five had read his or her college’s annual security report. But more than nine in 10 said the data had not influenced their decisions of where to enroll.
“These statistical reports aren’t very helpful in the ways they were intended,” Mr. Janosik says. At the same time, he says, new amendments and regulations make them more complex and cumbersome for colleges each year.
“We’d be spending our time better if we worried less about the minutiae and more about the programs and services that have a chance of changing student behavior,” Mr. Janosik says. “I would rather have a police officer doing an educational program, making rounds of the campus, trying to create safer environments, than I would have that same individual behind a desk flipping through reports and tallying frequencies.”
A Flawed System
No one argues with the spirit of the Clery Act: that students and parents should know what crimes occur on college campuses. Administrators applaud the attention it has brought to campus safety and point to the benefits of a requirement that colleges issue “timely warnings” of immediate threats.
Before the law was passed, top officials generally shunned talk of crime statistics, says Mr. Tuttle, of Delaware. Now, with those numbers public, colleges are more likely to propose solutions, he says: “Having something positive to say” about what your campus is doing to minimize crime “becomes a selling point.”
S. Daniel Carter, director of public policy for Security on Campus, defends the law’s role even as he acknowledges its accumulated bulk.
“The Clerys never set out to create a government bureaucracy. They set out to protect kids,” he says. And yet “when the federal government gets involved, layers of complexity get added.”
Changes in the law in the past two decades have sought to make the information the Education Department publishes more uniform across institutions. Those amendments have left colleges trying to keep up — not lobbying for an overhaul of the Clery Act.
“People are resigned to the fact that this is what we do,” says Mr. Healy, of Princeton, pointing out that cities have to file similar annual reports. “There’s desire for the law to be simplified,” he says, but “there’s not a lot of desire out there for new amendments.”
Beyond inconsistencies, the reporting process is based on a system that criminologists find flawed, says John J. Sloan III, an associate professor of criminal justice at the University of Alabama at Birmingham.
Local and national crime statistics, if reported solely by law-enforcement agencies, are inaccurate, says Mr. Sloan, who is writing a book about campus crime. What produces better data, he says, is “triangulation,” or taking into account surveys of victims and criminals as well.
Mr. Sloan calls the Clery Act’s current reporting requirements “symbolic politics.” What may lead to safer campuses, he says, is a list of minimum security requirements — such as blue-light telephones and security cameras — with “real teeth” when colleges do not comply.
That’s just one fix, he says, for a system that revolves around complex annual reports, a huge investment with little return. “The focus,” Mr. Sloan says, “ought to shift from producing a bunch of statistics that are basically useless.”
FORCIBLE-SEX OFFENSES AT U. OF CALIFORNIA CAMPUSES, 2007

Campus           Enrollment   Offenses
Berkeley             34,940         11
Davis                29,796         62
Irvine               26,483          6
Los Angeles          37,476         22
Merced                1,871          0
Riverside            17,187         12
San Diego            27,020          2
San Francisco         2,999         10
Santa Barbara        21,410          8
Santa Cruz           15,825          3

NOTE: The totals combine the Clery Act’s on-campus, noncampus, and public-property reporting categories for the universities’ main campuses only.
SOURCE: U.S. Department of Education