College Completion

For-Profit Colleges Compute Their Own Graduation Rates

March 02, 2012

For-profit colleges are some of the biggest critics of the federal graduation rate, arguing that it gives an inaccurate image of their institutions. They point out that the official calculation doesn't take into account the vast majority of the students who attend their institutions, most of whom are neither "first-time" nor "full-time."

So major for-profit institutions, including the University of Phoenix and American Public, DeVry, and Kaplan Universities, now compute and publicize their own alternative graduation or "completion rates," saying that these better reflect the nature of their student bodies and their institutional missions.


Kaplan, for example, highlights its "risk-adjusted graduation rate," designed to show how it does in graduating students who come from poor families, work, have children, or are affected by other factors that make it tougher to complete college. American Public publishes a "completion rate" that counts students only after they've taken a few courses and allows up to 10 years to finish a degree—a better way, officials say, to adjust for its many military students, who often attend part time.

But with each college company using different criteria, their efforts end up clouding the picture further, rather than clarifying anything.

"If everybody has the right to start the clock whenever they want and count the students they want, this doesn't work," says Mark Schneider, vice president at the American Institutes for Research and a former U.S. commissioner for education statistics, who studies graduation rates. "Each one going off on their own clearly is not going to be helpful."

The use of these self-developed rates can also be deceptive, say some critics, especially when colleges publish them alongside other graduation statistics that are not comparable.

"It's selective transparency," says David Waddington, an assistant professor of education at Concordia University, in Canada, who has taken the University of Phoenix to task for the self-developed "completion rate" featured in the 2010 edition of its Annual Academic Report, which it voluntarily began to publish four years ago. In that report and a subsequent one, Phoenix published its rate, which includes students who came to the university having already earned credits toward their degree, alongside national averages, which count only students starting out with no credits at all.

Putting that completion rate next to the national figures "is not really a fair comparison," says Mr. Waddington. The completion rate (which Phoenix now calls a "modified graduation rate") reflects better on Phoenix because it is more representative of its student body, university leaders say. Mr. Waddington says it's "not fully honest" to run that alongside figures that are not comparable.

"They publicize the numbers that are more favorable to them, not the numbers that are getting everybody up in arms," says Mr. Waddington. The University of Phoenix Online, the company's biggest "campus," has an official six-year bachelor's graduation rate of 5.1 percent, but that measures fewer than 1 percent of its more than 253,000 students.

The architect of the academic report, E. Adam Honea, the university's former provost and now senior vice president for academic research, says there was no intent to deceive in juxtaposing the two rates. "I'm always clear that they're different," said Mr. Honea.

Mr. Honea says the academic report doesn't include the official graduation rate because the university doesn't have a single consolidated official rate; under the federal rules, each of the dozens of its campuses has its own. He also said the academic reports highlight the good and the bad, noting that the publications show that the university's own six-year "modified graduation rate" has declined from the 38 percent reported in 2008 to 31 percent in 2011. The decline, he said, coincides with the period when the university was more aggressively recruiting students.

The federal government contributes to the confusion. For-profits and other colleges with programs covered by the new "gainful employment" regulation are required to compute and publish an "on time completion rate," which many consider misleading and mostly meaningless.

For this rate, colleges determine how many students have completed each degree each year and then report the proportion of those graduates who completed within the normal period of time, which is four years for a bachelor's and two for an associate or a master's degree. So if five students graduated with bachelor's degrees between June 2010 and July 2011, and they all managed to do it in just four years, the college's "on time completion rate" would be 100 percent, even if 95 others who started the program four years earlier all dropped out.
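The arithmetic above can be made concrete with a small sketch. The function and figures below are illustrative, not any college's actual reporting code; the point is that the denominator contains only graduates, so dropouts never appear in the rate.

```python
# Hypothetical illustration of the "on time completion rate" described above.
# The denominator is graduates only -- students who dropped out are invisible.

def on_time_completion_rate(graduates, normal_years):
    """Share of graduates who finished within the normal time for the degree."""
    on_time = sum(1 for years_taken in graduates if years_taken <= normal_years)
    return on_time / len(graduates)

# Five bachelor's graduates, each finishing in exactly four years:
grads = [4, 4, 4, 4, 4]
print(f"{on_time_completion_rate(grads, normal_years=4):.0%}")  # 100%
# The 95 classmates who started four years earlier and dropped out
# never enter the calculation at all.
```

As the example shows, a program could lose 95 of 100 entering students and still report a 100 percent "on time completion rate."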

Who Gets Counted?

Phoenix counts nearly all degree-seeking students in its modified rate, but excludes students who don't first earn three credits, the equivalent of completing one course.

For its homegrown calculation, American Public doesn't begin to count students until they've completed the equivalent of three classes with a 2.0 GPA (or two classes at the graduate level with a 3.0 GPA). Jennifer Stephens Helm, the university's vice president for institutional research and assessment, said in a statement that the company chose that approach so its measure would exclude students who take just a couple of classes as they attend multiple colleges.
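American Public's counting rule can be sketched as a simple filter. The thresholds come from the article; the function itself is a hypothetical illustration, not the university's actual methodology.

```python
# Sketch of the cohort rule described above: a student is counted only after
# completing three courses with a 2.0 GPA (or, at the graduate level, two
# courses with a 3.0 GPA). Thresholds are from the article; code is illustrative.

def counted_in_cohort(courses_completed, gpa, graduate_level=False):
    """Return True if a student enters the completion-rate cohort."""
    if graduate_level:
        return courses_completed >= 2 and gpa >= 3.0
    return courses_completed >= 3 and gpa >= 2.0

print(counted_in_cohort(1, 3.5))                        # False: too few courses
print(counted_in_cohort(3, 2.1))                        # True
print(counted_in_cohort(2, 3.2, graduate_level=True))   # True
```

Students who sample only a course or two, the pattern the university attributes to those attending multiple colleges, never enter the denominator under this rule.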

American Public's "completion rate" also allows students up to 10 years to complete a bachelor's degree and up to seven years for an associate or a master's degree. That window is longer than the one in the official rate, which is based on 150 percent of normal time for a degree (hence three years for an associate degree and six for a bachelor's). It's also longer than an alternative approach now being promoted by an organization called Transparency by Design, which calls for institutions to report completions at both 150 percent and 200 percent of normal time. By its own measure, American Public's bachelor's-degree completion rate has ranged from 60 percent to 45 percent.
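The competing windows are all simple multiples or fixed spans of normal time. A brief sketch, using the article's figures (normal time of four years for a bachelor's, two for an associate or a master's):

```python
# Illustrative comparison of completion windows mentioned above.
# Normal-time values are from the article; the code is a sketch, not policy.

NORMAL_YEARS = {"associate": 2, "bachelor": 4, "master": 2}

def window(degree, multiplier):
    """Years allowed when the window is a multiple of normal time."""
    return NORMAL_YEARS[degree] * multiplier

# Official federal rate: 150 percent of normal time.
print(window("associate", 1.5))  # 3.0
print(window("bachelor", 1.5))   # 6.0

# Transparency by Design adds a 200-percent figure as well.
print(window("bachelor", 2.0))   # 8.0

# American Public instead uses fixed spans: 10 years for a bachelor's,
# 7 for an associate or a master's.
APU_WINDOW = {"associate": 7, "bachelor": 10, "master": 7}
```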

Kaplan University has self-reported an overall graduation rate of 27 percent for students counted in the official formula, but according to its most recent Academic Report, for 2009-10, it believes "a better way to measure our graduation rate" is to count only students who have completed 18 credits, the equivalent of one semester, because those are the students "who have demonstrated they can do college-level work." By that measure, it claims a graduation rate of 51 percent.

When it comes to graduating first-time, full-time students with some of the seven nationally recognized "risk factors," Kaplan claims its graduation rate for students with two or more factors is higher than the national average for a comparable group of students.

DeVry follows the simplest approach, with few exclusions. Along with its official graduation rate, it publishes a "graduation rate with transfers" that includes all full-time students.

Yet even that approach can be misleading, notes Mr. Schneider, the scholar of graduation rates, because it makes no distinction between students coming in with a couple of credits and those who may already be halfway to a degree. Including students with prior credits in the cohort measured for six-year graduation skews the calculation.

He and others say it would be helpful to for-profit colleges and other institutions serving nontraditional students to develop a graduation-rate calculation that could take transfer students into account, but only if the formula could also adjust for how many credits the transfer students start with.

A few years ago, Mr. Schneider and others proposed such a formula, in conjunction with the Nexus Research & Policy Center, an organization begun with backing from the founder of the University of Phoenix. "We tried to figure out a set of measures that would work," recalls Mr. Schneider. The group then sent the "cookbook" out to several for-profit and nonprofit colleges for feedback. The proposal created a proportional measure for handling transfer students' credits.

The idea attracted some interest, he says, but it never took off.