In the Wild West of college admissions, there is no Data Sheriff.
The latest reminder arrived on Monday when Claremont McKenna College announced that a senior administrator had resigned after admitting to falsely reporting SAT statistics since 2005. In an e-mail to the campus, Pamela B. Gann, the college’s president, said an internal review found that scores for each fall’s freshman class had been “generally inflated by an average of 10-20 points each.” The apparent perpetrator was Richard C. Vos, long the college’s dean of admissions and financial aid.
The announcement has shaken those who work on both sides of the admissions process. In the span of 24 hours, Mr. Vos, described by several colleagues as an engaging and thoughtful dean, has become a symbol of the pressures that come with top-level admissions jobs. As one mid-career dean said on Tuesday, “I just keep thinking about how much pressure an experienced and mature admissions professional must be under to do whatever he did.”
The falsification of SAT scores at Claremont McKenna reveals an uncomfortable truth: the numbers to which applicants and parents (not to mention presidents and trustees) often attribute great power and meaning are, like most things, subject to manipulation. For better or worse, the admissions industry generally operates on an honor system that governs the flow of “self-reported” data submitted to the federal government, accreditors, bond-rating agencies, and publications such as U.S. News & World Report. The Common Data Set has standardized the information colleges provide to various outlets, but admissions officials say interpretations of even the most basic information vary from campus to campus.
Does this mean all colleges fudge their numbers? Of course not. Could this happen anywhere? Probably.
“The long and the short of it is that there are ‘gentlemen’s rules’ about what you report and to whom,” Charles A. Deacon, Georgetown University’s dean of admissions, wrote in an e-mail on Tuesday.
At Georgetown, Melissa A. Costanzi, the university’s senior associate director of admissions, oversees a daily flood of data. An applicant takes the SAT, tells the College Board to report his score to Georgetown, and ultimately an electronic file containing that score makes its way to the admissions office. Ms. Costanzi then loads each score into the office’s database, which matches it to the appropriate applicant’s file.
Months later, once Georgetown has its final list of enrolled freshmen, Ms. Costanzi creates a data “snapshot” that contains information about the entire class. She works with the university’s office of institutional research to make sure there are no discrepancies or missing data, and the university then uses the same data set to send information to the federal Integrated Postsecondary Education Data System, U.S. News, and other outlets.
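In spirit, that pipeline resembles the sketch below. This is a hypothetical rendering, not Georgetown’s actual system; the identifiers, field names, and scores are all invented. Score records arrive, get joined to applicant files on a shared ID, and one frozen snapshot of the class feeds every outside report.

```python
# A hypothetical sketch of the flow Ms. Costanzi describes: incoming score
# records are matched to applicant files, and one frozen "snapshot" of the
# enrolled class feeds every outside report. All names and numbers invented.
applicants = {
    "cb-1001": {"name": "Applicant A", "sat": None},
    "cb-1002": {"name": "Applicant B", "sat": None},
}

incoming_scores = [
    {"id": "cb-1001", "sat": 1380},
    {"id": "cb-1002", "sat": 1450},
]

for record in incoming_scores:
    # Match each College Board record to the applicant's file by ID.
    applicants[record["id"]]["sat"] = record["sat"]

# One snapshot, checked against institutional research, goes to IPEDS,
# U.S. News, and everyone else, so all outlets see the same numbers.
snapshot = [dict(a) for a in applicants.values()]
print(snapshot)
```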
“We want to present ourselves in the right way, to be as transparent as possible,” Ms. Costanzi said. “How easy would it be to go into that data file and just change the scores if you wanted to? It wouldn’t be hard to do, but it would be a slimy thing to do.”
An institution’s very structure may make it more or less difficult to inflate the numbers, admissions experts said on Tuesday. At some small colleges, for instance, one person in the admissions office oversees all data and signs off on the numbers. On other campuses, the responsibility for maintaining admissions statistics is divided between two offices, or rests outside the admissions office altogether.
At DePaul University, for instance, the office of institutional research maintains admissions data and handles reporting. “Those people are reluctant to round up to the nearest hundredth of a point,” Jon Boeckenstedt, associate vice president for enrollment policy and planning at DePaul, said of the university’s institutional researchers.
Like many admissions officials I interviewed on Tuesday, Mr. Boeckenstedt suspected that misreporting is neither rare nor rampant, but somewhere in between. “The fact that students and parents like to make strong distinctions between insignificant differences is part of the problem here,” he said. “I’m not blaming the victims, but there’s a reason why people believe there’s a strategic advantage in doing this.”
Mr. Boeckenstedt also described the many other ways that colleges have manipulated data to pump up their test-score averages. Over the years, some institutions have not included the scores of athletes and low-scoring applicants admitted through special programs, for instance. This practice gave rise to the term “NIPS,” for “not in profile students.”
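To see how much such an exclusion can move a published average, consider a minimal sketch (all students and scores here are invented for illustration):

```python
# Hypothetical illustration of the "NIPS" practice: excluding "not in
# profile" students before averaging inflates the published figure.
# All students and scores are invented.
enrolled = [
    {"name": "A", "sat": 1450, "nips": False},
    {"name": "B", "sat": 1380, "nips": False},
    {"name": "C", "sat": 1100, "nips": True},   # special-program admit
    {"name": "D", "sat": 1050, "nips": True},   # recruited athlete
]

def mean_sat(students):
    return sum(s["sat"] for s in students) / len(students)

print(f"All enrolled:  {mean_sat(enrolled):.0f}")                                # 1245
print(f"Profile only:  {mean_sat([s for s in enrolled if not s['nips']]):.0f}")  # 1415
```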
Although lying to Moody’s or any other credit-rating agency could land a college in hot water, falsifying test scores doesn’t carry the risk of criminal charges. “Asking colleges to report their own SAT scores is like asking automakers to report their own gas mileage,” Mr. Boeckenstedt said. “A lot of them are going to use the most favorable numbers.”
Outright fabrications are one thing; varied interpretations among earnest admissions officials are another. In an era when the College Board’s Score Choice program allows applicants to choose which test scores colleges see, admissions officials must make many choices about how to present enrollment statistics to the public. Some of those choices are not black-and-white. Should a college “superscore,” adding an applicant’s highest SAT math score to his or her highest critical-reading score, even though those tests were taken on different days? Such questions arise in many admissions offices, and practices vary from college to college.
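The arithmetic of superscoring is simple enough to sketch. In this invented example, a student’s best sections come from different sittings:

```python
# Hypothetical example of superscoring: take the best math and the best
# critical-reading result across separate sittings. Dates and scores invented.
sittings = [
    {"date": "2011-05", "math": 650, "critical_reading": 700},
    {"date": "2011-10", "math": 710, "critical_reading": 660},
]

superscore = (max(s["math"] for s in sittings)
              + max(s["critical_reading"] for s in sittings))

print(superscore)  # 1410, versus 1370 for the student's best single sitting
```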
“When we talk about admissions statistics, we’re so used to hearing stories of the elasticity of admissions data,” said David A. Hawkins, director of public policy and research for the National Association for College Admission Counseling. “Generally speaking, the effort to put your best foot forward with your data happens all the time. How far is too far? There are many details and there are often gray areas.”
Robert G. Springall agrees. Mr. Springall, dean of admissions at Bucknell University, said he got chills when he first heard about the inflated scores at Claremont McKenna. Less than two weeks ago, he had dined with Mr. Vos, the former admissions dean at Claremont, at an event in Tampa. The two talked shop, discussing recruitment, the economy’s effect on admissions, and how to talk to parents about what happens in admissions committees.
Reflecting on the falsified SAT scores, Mr. Springall said there’s a difference between lying about data and wrestling with difficult questions about how to present data. The first is wrong, but the second is ubiquitous. “What do we count and what don’t we count?” Mr. Springall said. “This is something we all discuss on our campuses.”
For example, Mr. Springall and his staff have recently discussed the increasing number of applicants who submit both ACT and SAT scores. In the old days, applicants took one or the other, but so far 1,383 of this year’s 8,191 applicants have sent scores from both exams. “Which score,” Mr. Springall said, “is the student score of record? Do we report both? Do we use the better score? What data point’s in the student’s best interest? Which one’s in the institution’s best interest?”
Bucknell’s policy is to use the highest score and discard the other. So the SAT scores of an enrolled student who scored higher on the ACT are not included in the calculation of the incoming class’s SAT scores. But there’s no regulation preventing Bucknell from changing this policy tomorrow.
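Here is a sketch of that bookkeeping, using a stand-in ACT-to-SAT conversion (a real office would use a published concordance table, which is not reproduced here; all numbers are invented):

```python
# Hypothetical sketch: under a "highest score wins" policy, a student's SAT
# is excluded from the class SAT average whenever the ACT is the score of
# record. The act_to_sat mapping here is invented for illustration.
def act_to_sat(act):
    return act * 45  # stand-in conversion; real offices use a concordance table

enrolled = [
    {"sat": 1400, "act": None},  # SAT only: SAT is the score of record
    {"sat": 1300, "act": 32},    # ACT equivalent ~1440: ACT is score of record
    {"sat": 1350, "act": 28},    # ACT equivalent ~1260: SAT is score of record
]

sat_of_record = [s["sat"] for s in enrolled
                 if s["act"] is None or s["sat"] >= act_to_sat(s["act"])]
print(sat_of_record)                            # [1400, 1350]; the 1300 is discarded
print(sum(sat_of_record) / len(sat_of_record))  # 1375.0
```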
It’s just one reminder that there’s more than one way to create a profile. A quick check of a few Web sites reveals that some colleges publish online profiles that include the median scores for admitted students. Others, like Bucknell, publish median scores only for the students who end up enrolling. “We would look 30-40-50 points better, in terms of the SAT, if we did it the other way,” Mr. Springall said. “In part, that would be a marketing effort.”
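The gap Mr. Springall describes falls out of simple arithmetic: the admitted pool includes high scorers who ultimately enroll elsewhere. In this invented example the difference is 40 points:

```python
# Hypothetical illustration: the median SAT of admitted students typically
# exceeds the median of those who actually enroll, since top admits often
# choose other colleges. All numbers invented.
import statistics

admitted = [{"sat": 1500, "enrolled": False},
            {"sat": 1460, "enrolled": False},
            {"sat": 1400, "enrolled": True},
            {"sat": 1360, "enrolled": True},
            {"sat": 1320, "enrolled": True}]

print(statistics.median(s["sat"] for s in admitted))                   # 1400
print(statistics.median(s["sat"] for s in admitted if s["enrolled"]))  # 1360
```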
The relentless demands of marketing and the allure of institutional metrics may or may not explain why an admissions official at Claremont McKenna air-brushed the college’s SAT scores for seven years. Either way, Mark C. Moody, co-director of college counseling at Colorado Academy, believes the news sends a frustrating signal to the many audiences watching colleges.
“On the other side of the desk, in the trenches with the kids applying to college, we work really hard to be honest and to help kids present themselves authentically,” he wrote in an e-mail Tuesday. “We try to understand the colleges’ decision-making (as reflected in part by ‘the numbers’) so we can help students honestly self-assess and advise them well. We also maintain the belief that the system is fair and honest, and try to head off the cynicism that our families could easily adopt by viewing college admission as a rigged game of numbers and self-interest.”