George Washington University’s admissions office misreported data on the class rank of incoming students for more than a decade, the institution announced on Thursday. The errors, as described by university officials, resulted from a combination of actual class-rank data and faulty “estimates.”
Previously, George Washington reported—on its Web site and to U.S. News & World Report—that 78 percent of the members of last fall’s freshman class had graduated in the top 10 percent of their high-school class. The correct figure, university officials said on Thursday, was 58 percent.
That number, however, refers only to the proportion of students for whom the admissions office had obtained actual class-rank data—just 38 percent of last fall’s freshman class. “This is not a complete number on any admission office’s Web site,” Forrest Maltzman, senior vice provost for academic affairs and planning, said of class rank.
Nonetheless, Mr. Maltzman described the misreported data as resulting from a series of embarrassing mistakes by employees in the admissions office over time. He vowed that the university would ensure that such mistakes do not happen again. “We’ve made a very bad error that I’m not very happy about,” he said.
Class rank is a traditional measure of a student’s academic achievement compared with that of his or her peers. Although many high schools have stopped ranking students according to their grade-point averages, colleges often tout the percentage of incoming students who ranked in the top portion of their high-school classes. U.S. News & World Report’s annual college guide rewards colleges for enrolling large proportions of students who graduate in the top 10 percent.
In short, class rank, like many measures of selectivity, is a flawed but powerful indicator of quality and prestige among selective institutions and students who apply to them. George Washington’s announcement provides yet another reminder that the metrics of merit are also slippery—and subject to manipulation.
Officials at George Washington said they had discovered problems with the class-rank data during an internal review of admissions statistics last summer. That’s when the university moved undergraduate-enrollment functions under the purview of the provost’s office.
Mr. Maltzman, who has been overseeing admissions operations as the university searches for its first enrollment chief, said he was surprised when he saw that the admissions office had apparently obtained class-rank data for more than half of the students in the Class of 2015. Given that as many as two-thirds of high schools do not report class rank, he had expected the figure to be 30 to 40 percent.
Moreover, the percentage of students reported as having graduated in the top 10 percent of their high-school classes surprised him, especially when compared with data reported by a peer institution. So Mr. Maltzman inquired about the numbers. “They looked high,” he said.
George Washington officials said they later discovered that the admissions office had been estimating the class rank for high-performing students whom they “assumed” were in the top 10 percent of their classes, based on their grade-point averages and standardized-test scores. “That inflated the number,” said Mr. Maltzman. “And this grew over time. As fewer and fewer students had any class rank, that was making the estimation portion of it a bigger factor. And over a long period of time, students got better, and we ended up with more in our top category.”
Mr. Maltzman and a university spokeswoman declined to provide specifics about how many admissions officials might have known about problems with the class-rank data or participated in the estimates. An auditing firm hired by the university has determined that the misreporting was not done “with malice,” according to officials. “I don’t think they realized the extent to which it was distorting the numbers,” Mr. Maltzman said.
Nevertheless, replacing hard-and-fast numbers with mere estimates involves a conscious choice, and, it’s fair to assume, an intent to polish the truth. The motivations for such polishing may have had little to do with the U.S. News rankings, which, despite rumors to the contrary, a college cannot easily scale by manipulating just a single category of admissions data. (Class rank accounts for 40 percent of a college’s “selectivity” rating, which, in turn, accounts for just 15 percent of a college’s overall ranking; in other words, class rank drives only about 6 percent of the total score.) Artificially inflating your college’s ranking is possible, but it’s not easy.
In any case, a world without U.S. News would still be a world in which admissions officials often feel pressure to make their institution—or their office—look better. If nothing else, George Washington’s class-rank estimates affirm the “elasticity of admissions data,” and Thursday’s announcement was just the latest reminder that admissions statistics are only as reliable as the people who report them.
In January, a senior enrollment official at Claremont McKenna College resigned after admitting to falsely reporting SAT statistics since 2005. In August, Emory University announced that employees in its admissions and institutional-research offices had intentionally misreported admissions data for more than a decade.
At George Washington, the auditing firm found no other problems with the admissions statistics, including application rates and ACT/SAT scores, according to the university. Officials said the university would conduct regular audits of such data in the future.
On Thursday a spokeswoman for George Washington declined to say whether any employee had been fired as a result of the university’s findings—only that those responsible for the class-rank estimates were no longer responsible for reporting admissions data. Previously, the admissions office recorded enrollment data in the university’s system; now, the office of academic planning and assessment will report and verify those data.
“It makes sense to me to actually have somebody who’s not evaluated on the quality of students characterizing those students,” Mr. Maltzman said. “It creates another check.”